A new Feature Release of the increasingly popular SEO software is out:
Visual SEO Studio 0.8.12 code-named "Bot Swarm".
There is a lot of work under the hood, which required extensive testing before publishing. Luckily enough, I found no cheap flights to warmer places during the winter holidays, and rainy days helped me stay focused on my favorite tool.

Crawl Performance Improved

I worked hard to squeeze out more performance by completely decoupling data storage and user interface updates from the crawl process.
The overall result is nearly double the crawl speed in the best-case scenario of a single, very responsive web site, without sacrificing crawl path visualization or overloading the web server.
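
For the curious, the underlying idea is a classic producer/consumer decoupling: the spider enqueues fetched pages and moves on immediately, while a background worker takes care of persisting them. Here is a minimal sketch of the pattern in Python; all names are illustrative, and the actual Visual SEO Studio internals surely differ:

```python
import queue
import threading

storage_queue = queue.Queue()
SENTINEL = None  # marks the end of the crawl

def fetch(url):
    # Stand-in for the real HTTP fetch (network-bound work).
    return (url, "<html>...</html>")

def save(page):
    # Stand-in for the real persistence layer (disk-bound work).
    pass

def crawler(urls):
    """Producer: fetches pages and enqueues them without waiting on disk I/O."""
    for url in urls:
        storage_queue.put(fetch(url))

def storage_worker():
    """Consumer: persists crawl data on a background thread."""
    while True:
        page = storage_queue.get()
        if page is SENTINEL:
            break
        save(page)

writer = threading.Thread(target=storage_worker)
writer.start()
crawler(["https://example.com/", "https://example.com/about"])
storage_queue.put(SENTINEL)  # tell the writer we are done
writer.join()
```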

Improved crawl performance in Visual SEO Studio 0.8.12

New Feature: Multi Site Parallel Crawling

Sometimes SEO professionals need to quickly crawl several sites at the same time, for competitor analysis or to check prospective customers' sites.
The new asynchronous storage queue permits saving crawl data at almost no additional time cost during the crawl process.

This made it possible to ship a new feature unique to Visual SEO Studio: parallel crawling.

Multi Site Parallel Crawling options dialog

Visual SEO Studio has always been able to perform parallel crawling (and all concurrency issues were fixed in previous releases); now it is easier to launch several spiders at the same time.
There is still room for improvement in optimizing access to the storage queue, so expect parallel spidering to become even faster in the future.
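
Conceptually, the multi-site feature just runs one spider per site concurrently, all feeding the same storage queue. Extending the hypothetical sketch above (again, my own illustration, not the actual implementation):

```python
writer = threading.Thread(target=storage_worker)  # same consumer as above
writer.start()

sites = [
    ["https://site-a.example/", "https://site-a.example/contact"],
    ["https://site-b.example/", "https://site-b.example/blog"],
]

# One spider per site; all share the single storage queue, so
# persistence costs are not multiplied by the number of sites.
spiders = [threading.Thread(target=crawler, args=(urls,)) for urls in sites]
for spider in spiders:
    spider.start()
for spider in spiders:
    spider.join()

storage_queue.put(SENTINEL)  # every spider has finished: stop the writer
writer.join()
```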

Improved Usability

There are many usability fixes to discover in this release. Most notably, it is now straightforward to close tabs without having to resort to the far-right 'X' icon.

Tab closure helpers improve usability

A user contacted me asking how to close tabs, as he wasn't sure whether the far-right 'X' would close only the active tab or all of them.
I believe details are important. If you have to use a program daily (or even just once, for that matter), it shouldn't make you think; everything has to be clear. So one user in doubt was one user too many.
I'm glad I addressed it, as I now find myself using the new feature as the normal way to close tabs!

Conclusions, and What's Next

There are several other changes in the 0.8.12 release. For a detailed and boring list, please see the complete Release Notes.

With every release announcement on this blog, I add a list of work-in-progress features I plan to release in the near future. The list grows over time, and the schedule is often rearranged and priorities reviewed.

Here is what you are most likely to see in the next release, hopefully a month or less from now:

  • Crawl URLs by Sitemap
  • Support for Russian language
  • Support for Polish language
  • Basic integration with Yandex.Webmaster

Other than the above, here are the features which could see the light soon (if not in the next release):

  • Optionally limit crawl to a sub-directory
  • Override robots.txt Disallow directives for Administered Sites
  • Show nodes of URLs skipped during the crawl process (e.g. link depth exceeded, blocked by robots.txt)
  • Visualize incoming internal links for each page (an often requested feature)
  • A major review of the way crawl data is handled, no longer holding the (compressed) HTML in memory. I have a working proof of concept that doesn't sacrifice reporting speed, but I need to see if I can achieve it without losing backward compatibility with previously crawled data (with thousands of users, it's better to avoid unnecessary data format changes). A minimal sketch of the idea follows this list.
    This would permit smoothly crawling a much, much larger number of URLs, in the order of hundreds of thousands depending on the available RAM, without incurring out-of-memory crashes.
  • Robustness improvements: a guard against out-of-memory crashes, to permit finalizing crawl data before memory is exhausted during very long crawls of huge web sites.
  • …all other previously announced features in "What's next" sections of official release announcements
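
As promised, here is the gist of the crawl data review mentioned in the list: compressed HTML lives on disk, and only a small URL index stays in memory. Everything here (class name, file layout) is my own illustration, not the planned storage format:

```python
import zlib

class PageStore:
    """Illustrative only: compressed HTML is kept on disk, not in RAM."""

    def __init__(self, path):
        self._file = open(path, "wb+")
        self._index = {}  # url -> (offset, length): tiny in-memory footprint

    def add(self, url, html):
        blob = zlib.compress(html.encode("utf-8"))
        offset = self._file.seek(0, 2)  # append at the end of the data file
        self._file.write(blob)
        self._index[url] = (offset, len(blob))

    def get(self, url):
        offset, length = self._index[url]
        self._file.seek(offset)
        return zlib.decompress(self._file.read(length)).decode("utf-8")

store = PageStore("crawl.dat")
store.add("https://example.com/", "<html><body>Hello</body></html>")
print(store.get("https://example.com/"))  # round-trips the stored page
```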

Time to test the new Multi Site Parallel Crawling feature now!