On April 6, 2013 I published the new 0.8.5 Beta release of Visual SEO Studio.
As a complement to the full release notes, starting with the previous release I have begun highlighting the major improvements and changes in a blog post.

Go International: full support for Unicode characters encoded in URLs

The program has always correctly displayed Unicode characters, but what if they are URL-encoded in the URL?

Decoding of Unicode URL paths is fully supported in Visual SEO Studio

While in several Western countries it is considered a best practice to limit URL characters to a restricted set (e.g. avoiding accented vowels in the URL), in many other cultures the rule does not apply: non-ASCII characters in URLs are common practice.

Showing the original encoded URLs in most views would make them very hard to use.
This release makes a huge step towards the users who have to deal with Cyrillic, Arabic, Hebrew, Persian, Japanese, Tamil, etc. in their URLs.

More than half of Visual SEO Studio users are from countries where it's very likely to find URL-encoded characters in the address bar. I felt I couldn't ignore their needs.

The feature is unique to Visual SEO Studio; at least, it is not present in the best-known competitor products.

Tabular View face-off - Visual SEO Studio
Which view gives you more insights?

URL-decoding support has been added throughout the whole application: all URL-related properties (Path, Location, Canonical, Referrer, HREFs...) are shown in both the original and the URL-decoded version, with the URL-decoded (and human-readable) one preferred in all the views where the user needs to quickly spot what the pages are about.

Unicode characters are transparently shown in modern browsers' address bars, in the SERPs... URLs are for humans too; it's about time SEO tools adapted.
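For readers curious about what the decoding involves: percent-encoded UTF-8 bytes in a URL path can be turned back into readable Unicode text. Here is a minimal sketch in Python (the example URL is hypothetical, and Visual SEO Studio itself is of course not implemented this way):

```python
from urllib.parse import unquote

# A hypothetical Russian-language URL, as it travels "on the wire"
encoded = "http://example.com/%D0%BD%D0%BE%D0%B2%D0%BE%D1%81%D1%82%D0%B8"

# unquote() decodes the %XX escapes as UTF-8 bytes back into Unicode text
decoded = unquote(encoded)
print(decoded)  # http://example.com/новости  ("news" in Russian)
```

This is the human-readable form the tool now prefers in its views, while still keeping the original encoded string available.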

Usability: a "sincere" progress bar

The progress bar shown during crawling was misleading: it used as its maximum the maximum number of URLs set in the crawl options (default is 500), even if the site had - say - 50 pages.
This version fixes the issue: the progress bar now adjusts as the length of the URL queue changes, because the total number of pages is not known in advance. The few users who tried a private beta found it a much better solution.

Crawling progress bar improved
Crawling progress bar now reflecting queue length, and showing instant values

The feedback improved too: the progress bar now shows a text with the current value, the maximum, and the percentage.
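The idea behind the "sincere" progress bar can be sketched like this: since the real total is unknown, the maximum is recomputed from what has been visited plus what is currently queued, so it grows as the crawler discovers new URLs. A minimal illustration in Python (the function name and text format are my own, not the actual implementation):

```python
def progress_text(visited: int, queued: int) -> str:
    """Progress over a total that is only known so far.

    The maximum (visited + queued) grows as the crawler discovers
    new URLs, so the bar adjusts instead of assuming a fixed cap.
    """
    total = visited + queued
    pct = (100 * visited // total) if total else 0
    return f"{visited} / {total} ({pct}%)"

print(progress_text(25, 75))  # 25 / 100 (25%)
print(progress_text(50, 0))   # crawl finished: 50 / 50 (100%)
```

Note the bar always ends at 100% when the queue empties, whatever the site's size turns out to be.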

Usability: better showing applied crawl-delay

Visual SEO Studio's crawl speed is pretty good: although not optimized yet, when crawling at full speed I can usually squeeze out 5-15 pages per second, comparable to what many competitor products achieve.
But full-speed crawling is allowed only for sites listed in the "Administered Sites" group; otherwise the crawl-delay directive in the robots.txt file is respected, or a courtesy delay of 10 seconds is applied (quick takeaway: make sure to add your sites to the Administered list!).

The problem is, many people do not realize they can crawl their own sites at full speed by overriding the applied crawl-delay, nor that the courtesy delay is 10s.
Two distinct, well-known SEO professionals I highly respect took the time to test the tool, and discarded it at first, thinking the crawl speed was too slow. Those are brilliant guys; there is a clear usability issue there.
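The delay policy described above can be summarized as: honor the robots.txt crawl-delay when present, fall back to the 10-second courtesy delay otherwise, and drop to zero for administered sites. A rough illustration using Python's standard-library robots.txt parser (the policy values come from this post; the code is my own sketch, not the product's implementation):

```python
from urllib.robotparser import RobotFileParser

COURTESY_DELAY = 10.0  # default courtesy delay, per the policy above

def effective_delay(robots_lines, user_agent, is_administered):
    """Delay in seconds between requests for a given site."""
    if is_administered:
        return 0.0  # administered sites may be crawled at full speed
    rp = RobotFileParser()
    rp.modified()  # make sure the parser considers the file as read
    rp.parse(robots_lines)
    delay = rp.crawl_delay(user_agent)
    return float(delay) if delay is not None else COURTESY_DELAY

robots = ["User-agent: *", "Crawl-delay: 3"]
print(effective_delay(robots, "MyCrawler", is_administered=False))  # 3.0
print(effective_delay([], "MyCrawler", is_administered=False))      # 10.0
print(effective_delay(robots, "MyCrawler", is_administered=True))   # 0.0
```

The usability problem, then, was not the policy itself but that neither the fallback value nor the override was visible to the user.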

Even though 0.8.5 development was already frozen for the testing phase, I made an exception and made a quick minor improvement:
now when no "override crawl-delay" option is selected, the default crawl-delay is shown as 10.0s (instead of 1.0s), to avoid creating false expectations. When the option is selected, the delay is automatically set to 0.0s, the most common choice.

Applied crawl-delay values made clearer, and overridden smarter

It still is not perfect, but it should help. I will keep working on both real and perceived crawling speed in the near future.

Usability: URLs breakdown in summary tabs, and better grouping

HTML and URL Suggestions summary tabs now show a breakdown of visited items:

Breakdown of visited URLs
Break-down of visited URLs in HTML and URL Suggestions summary tabs

It makes much clearer how percentages are calculated in the reports.
The feature stems from a chat I had with a user who tested a private beta: he had no quick way to know how many noindex pages he had, and I thought that was a job for the tool!

What we were actually discussing were his impressions of another improvement:
in previous versions of Visual SEO Studio, the HTML Suggestions listed "Duplicated title tags" and "Duplicated meta-description tags" without taking into account that non-canonical URLs should be skipped, as they do not cause duplicate content issues.
Version 0.8.5 now shows two distinct reports: one with the usual algorithm, the other taking into account only the canonical version of the pages.

Duplicate title tags counts are better computed

The new version's totals better resemble those reported by Google Webmaster Tools. Please note that - provided you performed a full crawl - Visual SEO Studio's reports should be considered more precise, as GWT is always several days out of date.
The old reports are kept for comparison for a while, but might be dropped soon (and the new reports will take over the old names); I'm waiting for user feedback on this.

Other changes

When preparing the Release Notes (please refer to them for a detailed, full list of the changes) I was myself impressed by the sheer amount of small usability improvements, small fixes, performance gains...

One thing I put a lot of effort into has been tracking down the most critical known crashing issues. If you read the Release Notes, you've probably seen I fixed several. Some never actually occurred to any user; others were reported via crash reports. Fixing crashes is always prioritized in my book, but some took time and several occurrences before I could fix them. A special thanks goes to the users who reported crashes, providing details and an e-mail address where I could ask for further information.
Today only two critical issues remain that I couldn't reproduce and fix; they affected about 1% of the user base (1% too many!) at application start-up; they are normally solved by launching the program again or restarting the PC.

If you run into a crash, please provide a valid e-mail address if you can. It is not mandatory, of course, but you would help me better locate the problem and speed up the fix, and I could provide you with a private beta or a workaround before the official fix is released.

What's next

There are several features queuing up for the next release. I can't promise everything will make it in, but here is a wish-list:

  • A custom filter to extract pages matching a Boolean chain of advanced search operators (e.g. "contains", "not contains", "contains more than once", etc.)
  • Pre-packaged search filters for the most common actions
  • Ability to crawl a sitemap or a plain URL list, even of URLs from different sites
  • Much better memory management, to greatly reduce RAM usage
  • The ability to export an XML Sitemap from any table listing a set of pages
  • Reviewing the usability of the Create XML Sitemap feature (it already is appreciated, but can be improved)
  • ...and many other usability improvements and fixes

So, did you try the latest Visual SEO Studio version?
Then I'd love to hear your impressions and feedback!

Comments are open on linked Google+ page.