Release Notes for Visual SEO Studio 0.9.0 'Sasquatch'

Published: Sunday, August 30, 2015

This is a performance and stability release. A brand new engine dramatically reduces the software's memory usage and permits crawling and managing much larger data sets. Performance has been tuned all over the application, and the program is overall faster and more responsive, even when dealing with larger crawls.
The new architecture is an important milestone, setting the foundations for new features to come.

What follows are the fully detailed release notes; for an easier-to-read, non-exhaustive overview with pretty screenshots, please read: Visual SEO Studio 0.9.0 'Sasquatch'!

New Features:

  • A new engine dramatically decreases memory consumption, making it possible to crawl much larger sites.
    The maximum number of crawlable URLs in the Crawl options is now set to 150K thanks to the new architecture (which would allow much more; we will loosen the limit over time in order to better understand and work on all possible bottlenecks).

User Experience / Usability:

  • The progress bar appearing when opening a new tab now distinguishes three separate phases:
    Reading, Processing and Displaying data (and explicit text details what the shown percentage refers to).
    This improves the user experience when dealing with large data sets and long-running tasks.
  • When loading large data sets from disk and processing them, the progress bar values (now split into three phases) match the actual progress of the task much more precisely.
  • The reports progress bar text now has higher contrast and is thus more readable.
  • Crawl and Index Views no longer freeze the UI when loading a large data set, and remain responsive.
  • Improved UI responsiveness of the Table View with large data sets.
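The phased progress bar described above can be sketched as follows. This is a hypothetical illustration, not Visual SEO Studio's actual code: each phase reports its own local 0-100 progress, and a small mapper combines it with the phase label so the UI can show both an accurate overall percentage and explicit text for the current phase.

```python
# Hypothetical sketch of three-phase progress reporting; all names are
# illustrative assumptions, not the application's real API.

PHASES = ("Reading", "Processing", "Displaying")

def report(phase_index, phase_percent):
    """Map a phase-local progress value to (label, overall_percent)."""
    label = PHASES[phase_index]
    # Each phase occupies an equal third of the overall bar.
    overall = (phase_index * 100 + phase_percent) / len(PHASES)
    return label, round(overall)

# e.g. halfway through the "Processing" phase maps to 50% overall
assert report(1, 50) == ("Processing", 50)
```

In a real implementation the three phases would likely get unequal weights measured from typical task durations, but the idea is the same: the shown percentage always refers to a named phase.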


  • During the processing phase of reports that need to process pages' HTML content (GA Suggestions and custom filters), memory usage is continuously auto-tuned according to the available memory, and gradually released to the OS as soon as it is no longer needed.
    This maximizes speed for large page sets without exhausting the computer's memory.
  • When crawling a web site, the currently fetched URL node is no longer guaranteed to be kept visible. The user experience does not suffer (some users even prefer it).
  • Micro-events (HTTP issues and non-crawled items to be shown in the UI) are now packaged in batches, making the processing phases after opening tabs much faster.
  • When closing a tab during a processing phase, the (possibly long-running) task is now actually aborted, saving resources.
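The micro-event batching mentioned above can be sketched like this. It is a minimal illustration under assumed names (not the application's real API): instead of pushing each HTTP issue or non-crawled item to the UI individually, events accumulate in a buffer that is flushed as one batch, turning many cross-thread UI updates into a few.

```python
# Illustrative micro-event batcher; class and parameter names are assumptions.

class EventBatcher:
    def __init__(self, flush, batch_size=256):
        self.flush = flush            # callback receiving a list of events
        self.batch_size = batch_size
        self._pending = []

    def add(self, event):
        self._pending.append(event)
        if len(self._pending) >= self.batch_size:
            self.drain()

    def drain(self):
        # Flush any remaining events as a final (possibly partial) batch.
        if self._pending:
            self.flush(self._pending)
            self._pending = []

batches = []
b = EventBatcher(batches.append, batch_size=3)
for i in range(7):
    b.add(i)
b.drain()
# batches is now [[0, 1, 2], [3, 4, 5], [6]]
```

A real UI would flush on a timer as well as on size, so a slow trickle of events still reaches the screen promptly.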


  • "Google Webmaster Tools" authentication options renamed to "Google Search Console" after Google re-branded the service.
  • Setup UI: better wording for close button.
  • Setup: in case of installation failure, the user can choose to send an installation log to
  • Crawl options: the default number of crawlable URLs is now set to 75K (half of the new raised limit), now that the new engine permits crawling larger data sets.
  • Minor improvements in supporting Windows 10.


  • Better robustness against Out-of-Memory issues during the reports processing phase when very little memory is available.
    The new engine actually makes this scenario much less likely in the first place.
  • Improved robustness in case tabs are closed during non-idle phases.


  • Fixed a real-world crash occurring when, at application start-up, users clicked where a side panel link would appear. This long-standing issue affected several users.
  • Fixed a real-world crash (one anonymous user affected) that occurred during validation of an invalid XML document.
  • Fixed a real-world crash (one anonymous user affected): searching for a text value in a grid no longer selects the cell containing the text if that cell is invisible; the cell is now skipped. Previously it was selected, causing the program to crash. A cell can be invisible because the user chose to hide its column.
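The grid-search fix described in the last item can be sketched as follows. This is a hedged illustration with assumed names, not the actual fix: the search simply skips cells belonging to hidden columns instead of selecting them, since selecting an invisible cell was what triggered the crash.

```python
# Illustrative grid search that skips cells in hidden columns;
# function and parameter names are assumptions for the sketch.

def find_visible_cell(rows, hidden_columns, needle):
    """Return (row, col) of the first visible cell containing needle, or None."""
    for r, row in enumerate(rows):
        for c, value in enumerate(row):
            if c in hidden_columns:
                continue  # cell is invisible: skip it rather than select it
            if needle in str(value):
                return (r, c)
    return None

rows = [["home", "200"], ["about", "404"]]
# With column 1 hidden, the "404" match there must be skipped, not selected.
assert find_visible_cell(rows, hidden_columns={1}, needle="404") is None
assert find_visible_cell(rows, hidden_columns=set(), needle="404") == (1, 1)
```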