Release Notes for version 0.9.1

Product release notes detail every single modification made in the release. Find out what changed with Visual SEO Studio version 0.9.1.

Visual SEO Studio

Published: Monday, September 7, 2015

This is a minor feature release. It also takes several steps to improve overall usability and user experience.

What follows are the fully detailed Release Notes; for an easier-to-read, non-exhaustive overview with pretty screenshots, please read: Visual SEO Studio 0.9.1 'Zêna' is out!

New Features:

  • The crawler can now optionally cross the HTTP/HTTPS boundary within a site, i.e. it can treat the two versions as part of the same site, and the links pointing to the (supposedly) wrong version as internal.
    This makes it easier to spot wrong internal links, for example after a site migration.
    The new "cross protocol" crawl option defaults to true (thus changing the traditional behaviour), as it makes potential problems easier to spot.
  • The new "Show in code" menu option - available from "Page Links", "Page H1-H6 headings" and "Validation" windows - causes the referred HTML (or XML) element to be highlighted in the "Content" (source code) window.
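The cross-protocol link classification described above could be sketched as follows. This is a minimal illustration, not the actual Visual SEO Studio code; the `is_internal_link` helper and its parameters are hypothetical:

```python
from urllib.parse import urlsplit

def is_internal_link(link_url: str, site_url: str, cross_protocol: bool = True) -> bool:
    """Decide whether a link belongs to the crawled site.

    With cross_protocol=True (mirroring the new default), the http://
    and https:// versions of the same host are treated as one site, so
    links pointing to the "wrong" protocol are classified as internal
    and can be audited, e.g. after an HTTPS migration.
    """
    link, site = urlsplit(link_url), urlsplit(site_url)
    if link.netloc.lower() != site.netloc.lower():
        return False  # different (sub)domain: always external
    if cross_protocol:
        # same host, any mix of http/https counts as internal
        return link.scheme in ("http", "https") and site.scheme in ("http", "https")
    return link.scheme == site.scheme
```

For example, after migrating a site to HTTPS, a leftover `http://example.com/page` link would still be reported as internal, which is exactly what makes the wrong-protocol links easy to find.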

User Experience / Usability:

  • Setup UI: the Visual SEO Studio setup is now translated into all supported languages.
    If supported, the OS language is used; otherwise English is used instead. Users can switch to their preferred language.
  • Option "Open last used project" now defaults to true (only works for new installations, though).
  • Crawl option "Use HTTP compression" moved to "Advanced Settings".
  • When asked to convert a project, the user is now warned that the operation might take several minutes.
  • Session window, crawl options details panel: for the "Forced Courtesy-Delay" entry it is now specified that the value is in seconds.
  • Session window, crawl options details panel: for the "Max download size" entry it is now specified that the value is in KB.
  • Tree view top nodes now also include the protocol (e.g. http://) in addition to the (sub)domain name.

Fixes:

  • Fixed an error in handling multiple wildcard * characters in robots.txt rules, which in some cases caused the spider not to match a matching path.
    Thanks to the German user who so precisely described the issue.
  • Typos fixed (all languages).
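The kind of matching involved in the robots.txt fix can be sketched as below. This is an illustrative implementation of wildcard rule matching (each `*` matches any character sequence, a trailing `$` anchors the rule to the end of the path), not the spider's actual code; the `robots_rule_matches` helper is hypothetical:

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    """Check whether a robots.txt Allow/Disallow rule matches a URL path.

    Translating the whole rule into a regular expression at once, rather
    than splitting on the first wildcard, handles rules that contain
    multiple '*' characters correctly.
    """
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # escape literal characters, then turn each escaped '*' into '.*'
    pattern = re.escape(rule).replace(r"\*", ".*")
    pattern = "^" + pattern + ("$" if anchored else "")
    return re.match(pattern, path) is not None
```

For instance, a rule with two wildcards such as `/*/private*` matches `/en/private/doc`, which a matcher that only handles the first `*` could miss.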