Lately I've been pushing software updates more often than usual, and today I'm happy to announce a new 0.8.18 version of my favourite SEO Audit tool. I'll also present a few features introduced in the previous versions, which I never officially announced on this blog.
Screenshot Taker revamped
During the last few months I received a lot of feedback on the product, and the screenshot taker revealed itself as the weakest part. Not only was it responsible for crashes, but in some instances users were unable to save their screenshots because the browser didn't fire the "document.load" event. Furthermore, given the way the tool estimates the layout width, adaptive layouts didn't render at their best.
Now not only have all the known crash conditions been fixed, but users can force the page to stop loading, reload it, and choose a preferred screen resolution width to start with.
The new Screenshot Taker dialog
The window now also sports the URL the screenshot was taken from.
Another little usability helper: when the URL you want a screenshot of redirects to another URL, the new URL is shown in place of the original one (though the image remains associated with the original URL) and is highlighted in blue.
A new crawl option
Several users kept asking to be able to override robots.txt Disallow directives for the sites they administer. The request made perfect sense, and I'm glad to present the new option:
The new crawl options to ignore robots.txt Disallow directives
Once you select the "Ignore robots.txt Disallow directives" option, you can specify an optional path to avoid anyway. This lets you deal with so-called spider traps set on the sites you administer.
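To illustrate (site paths here are invented for the example, not taken from any real site), suppose a site you administer has a robots.txt like this:

```
User-agent: *
Disallow: /admin/
Disallow: /calendar/
```

where /calendar/ is a spider trap generating endless "next month" links. With the "Ignore robots.txt Disallow directives" option enabled, the crawler would also fetch /admin/ and /calendar/; entering /calendar/ as the path to avoid anyway keeps the crawler clear of the trap while still auditing the rest of the disallowed area.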
Full support of robots.txt extensions
Pigafetta, the Visual SEO Studio crawler, already supported some extensions to the original robots.txt syntax (e.g. crawl-delay), but lacked some others which are increasingly used and supported by the major search engines. This release adds full support for Allow directives and wildcard path matching.
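For readers curious about the semantics, here is a minimal sketch of the matching rules as the major search engines commonly implement them (this is my own illustrative Python, not the actual Pigafetta code): "*" matches any sequence of characters, "$" anchors a rule to the end of the URL path, and when both an Allow and a Disallow rule match, the most specific (longest) rule wins, with Allow winning ties:

```python
import re

def rule_matches(rule_path, url_path):
    # Translate robots.txt wildcards into a regex:
    # '*' matches any character sequence, a trailing '$'
    # anchors the rule to the end of the URL path.
    pattern = re.escape(rule_path).replace(r'\*', '.*')
    if pattern.endswith(r'\$'):
        pattern = pattern[:-2] + '$'
    # robots.txt rules are implicitly anchored at the start of the path.
    return re.match(pattern, url_path) is not None

def is_allowed(rules, url_path):
    # rules: list of ('allow' | 'disallow', path) tuples.
    # The longest matching rule wins; on equal length, Allow wins.
    best = None  # (rule length, is_allow)
    for kind, path in rules:
        if path and rule_matches(path, url_path):
            candidate = (len(path), kind == 'allow')
            if best is None or candidate > best:
                best = candidate
    # No matching rule means the URL is crawlable.
    return True if best is None else best[1]
```

For example, with the rules `Disallow: /private/` and `Allow: /private/public*`, a URL under /private/public would be crawlable while the rest of /private/ stays blocked.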
For a complete description of the robots.txt protocol support, see the Pigafetta crawler page.
Pretty Charts and Pie Icons
Introduced in release 0.8.17 in HTML/URL/GA Suggestions, Pie Icons enhance usability by giving an at-a-glance idea of how widespread an issue is - and, thanks to proper colouring, how much it matters.
Pie icons are available as a dedicated column in the summary grid, and in all report tabs.
A pie chart also helps give a better idea of how contents break down. Expect more charts all over the tool in the near future.
GWT tells you how many URLs are blocked by the robots.txt file; Visual SEO Studio lists them all.
Non-crawlable items pane in Visual SEO Studio
Introduced with version 0.8.15, the feature opens the door to other features that will see the light in the near future, so stay tuned!
Conclusion, and What's Next
During the last two months I published new versions more frequently, about two releases per month (against the usual one per month). I tested my ability to keep up with a tighter release cycle, in particular to push important fixes faster, and I'm quite satisfied with the result. What I've not been able to do is give the new features proper coverage with dedicated blog posts. With this post I've taken the chance to spend a few words highlighting at least the most notable new features of the previous versions.
There are several other improvements worth mentioning, such as improved memory usage detection, which further increased the program's robustness against out-of-memory exceptions during large crawls, and the improved organization and usability of the User Interface (e.g. Colours Legend, Session Pane, Start URL evidenced in session headers, etc.).
The current release also has much more in it. For a detailed list of the changes, please read the full release notes.
For the immediate future, I'm working to close all issues reported by users, but you can also expect a cool new major feature I've been working on for some time :)
Happy crawling with Visual SEO Studio!