Another month has passed, and I have kept working hard on our favorite SEO Audit Tool. After so many nights spent on it, it's a pleasure to announce Visual SEO Studio 0.8.7

Troubleshoot Analytics accounts in no time

This month the cherry on the cake is a brand new feature: "GA Suggestions":

GA Suggestions reports suite

Ever had to figure out whether Google Analytics® tracking code is installed on every page, installed correctly, and so on?
If so, in the future you will probably save precious hours of work, greatly reducing the chances that an installation error slips through:
Just launch the report suite against your crawled data, and all the most common issues with mis-configured accounts will be checked in no time.

What it checks:

  • Pages missing GA tracking code
    Besides losing important information about your user behavior, if some pages lack the code your Analytics reports could show more visits than you actually have.
  • Pages with repeated GA tracking code
    Is your Bounce Rate 3% or lower, close to zero, and are you proud of it? You'd better not be: a Bounce Rate lower than 20-40% is not normally possible; what probably happens is that you are invoking the code twice and all your data is skewed!
  • Multiple GA codes
    Gone through a migration and changed account number? Find out whether you cleaned everything up properly.
  • Pages with old Synchronous script
    If your tracking code is detected as synchronous, it's about time to give it a refresh: the recommended asynchronous code can help your pages load much faster!
  • Pages with much older Urchin code
    How long has it been since you last touched it? You are missing out on a good bunch of powerful free reporting features.
  • ...and more to come!

Please note the feature is probably unique to Visual SEO Studio; at least, it is not present in the best-known competitor products.

There are two things to keep in mind:
First, the report examines only the crawled pages: private sections are not crawled and cannot be inspected. Also, a few pages might exceed the maximum download length and be truncated; if the tracking code was in the truncated part, they might be reported as missing it.
Second, false positives are possible: since the whole page HTML is inspected, a page describing Analytics code might be reported as having multiple tracking codes. Inspecting script content only is of course possible, but it would impose a performance overhead for a very uncommon case.
GA Suggestions are meant as hints of possible problems: always double check!
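
To give an idea of what such checks involve, here is a minimal Python sketch of the kind of pattern matching a GA audit could perform. The regular expressions and function below are my own illustration, not Visual SEO Studio's actual detection logic, and they only cover the classic ga.js/urchin.js snippets of the era:

```python
import re

# Illustrative patterns for the classic Google Analytics snippets.
ASYNC_RE = re.compile(r"_gaq\.push\(\s*\[\s*['\"]_setAccount['\"]", re.I)
SYNC_RE = re.compile(r"_gat\._getTracker\(", re.I)
URCHIN_RE = re.compile(r"urchinTracker\(|urchin\.js", re.I)
UA_RE = re.compile(r"UA-\d{4,10}-\d{1,4}")

def check_page(html: str) -> dict:
    """Classify GA installation issues on a single crawled page."""
    accounts = set(UA_RE.findall(html))       # distinct UA-xxxx-y codes
    async_hits = len(ASYNC_RE.findall(html))  # asynchronous snippet count
    sync_hits = len(SYNC_RE.findall(html))    # old synchronous snippet count
    urchin = bool(URCHIN_RE.search(html))     # much older Urchin code
    return {
        "missing_code": not (async_hits or sync_hits or urchin),
        "repeated_code": (async_hits + sync_hits) > 1,
        "multiple_accounts": len(accounts) > 1,
        "old_sync_script": sync_hits > 0,
        "urchin_code": urchin,
    }
```

Running `check_page` over every crawled page and aggregating the flags would yield report lists similar to the ones described above.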

Custom Filters: the power of a new SEO query engine! (experimental)

Every time I release a new Visual SEO Studio version, it's always painful to decide what can make it by the release date, and what will have to wait some more. Delaying the date is sometimes possible, but users are waiting for features and fixes. The general rule of thumb is: if a new feature is mature enough, it is enabled and made available; otherwise it waits for the next release.

This time I had to face a dilemma: the query engine was ready, stable, and fast enough; even if not 100% complete, it was surely usable, yet the user interface still had a few issues.
Considering it would surely benefit many users already, I decided to publish it anyway. So ladies and gentlemen, I'm proud to present the Custom Filters!

Custom Filters, an advanced query engine

Custom Filters is an SEO-oriented query engine conceived to perform any kind of customized filtering over crawled page data via a simple user interface.
I designed it to be at once powerful, intuitive, and easy to use. I believe I largely succeeded.
Operands are common SEO entities (e.g. title, path, html...), operators are presented according to the selected operand, and it's crystal clear whether the rows are combined with a logical AND or a logical OR.
The result set opens in a Tabular View, where it can be further rearranged, sorted, exported to native Excel format...
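
For illustration only, a filter of this kind could be modeled roughly as follows; the operand names, operators, and page layout here are hypothetical, not the tool's real implementation:

```python
# Each filter row pairs an operand (an SEO entity such as title or path)
# with an operator and a value; rows combine with a logical AND or OR.
OPERATORS = {
    "contains": lambda field, value: value in field,
    "equals": lambda field, value: field == value,
    "longer than": lambda field, value: len(field) > value,
}

def run_filter(pages, rows, combine="AND"):
    """Return the pages matching all rows (AND) or any row (OR)."""
    agg = all if combine == "AND" else any
    return [
        page for page in pages
        if agg(OPERATORS[op](page[operand], value)
               for operand, op, value in rows)
    ]

pages = [
    {"title": "Home", "path": "/"},
    {"title": "Very long product title here", "path": "/products"},
]
rows = [("title", "longer than", 10), ("path", "contains", "product")]
print(run_filter(pages, rows, "AND"))  # matches the second page only
```

The appeal of such a design is that each row stays independently readable, which is what makes the AND/OR combination "crystal clear" in the UI.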

Expect this feature to grow richer in the near future.
At the moment the Load and Save filter options are disabled (the operands might change slightly, and I prefer not to have to convert filters generated with an experimental version), and there are a few user interface issues I will have to fix. Give it a try: you will not be disappointed by the power it unleashes!

A new "URL suggestion": Too Many Tokens

This was suggested by a user, who taught me something I didn't know: there is evidence that only up to 16 words are taken into account in a link anchor text (source: updated on April 2013).
What does this have to do with URLs? It turns out anchor texts are often made of full URLs, which are tokenized into single words up to the above-mentioned limit (the sources, in German, are dated 2009, and based on an old version of the previously cited article, which detected the limit as 8 words, now superseded).

So the reasoning is: if you want to squeeze the most out of your URL in terms of anchor text juice, keep it within the first 16 tokens.

The Google Webmaster Tools report "Links to your site" > "Anchor text" gives us a clue about how URLs are tokenized; pure anchor texts do not exceed 16 words (alt attribute texts can be listed with a higher word count):

Tokenization of URL anchor texts in Google Webmaster Tools

It took me some time to make sense of the original set of articles in German (I don't speak German, and had to use Google translation, mainly into English; I really hope I got their sense right), then find up-to-date research, and then experiment a little myself as best I could.

I don't know how critical the information is, as it really depends on how many "followed URL-links" you have; nevertheless I found the research intriguing.

I introduced a few simplifications, so as not to raise an alarm flag for elements of the URL which normally do not carry much juice: I take the full URL (scheme/protocol included), but ignore the trailing file extension and the query-string parameters. Feedback is welcome.
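
As a rough sketch of the token-counting rule described above (the actual tokenization used by the tool may differ; everything here is my own approximation):

```python
import re
from urllib.parse import urlsplit

MAX_TOKENS = 16  # observed anchor-text word limit

def url_tokens(url: str) -> list:
    """Split a URL into word tokens, keeping scheme and host but
    ignoring the trailing file extension and the query string."""
    parts = urlsplit(url)                 # drops nothing yet, just parses
    path = re.sub(r"\.\w+$", "", parts.path)  # strip e.g. ".html"
    text = f"{parts.scheme} {parts.netloc} {path}"  # query string ignored
    return [t for t in re.split(r"[^0-9A-Za-z]+", text) if t]

url = "https://www.example.com/blog/my-seo-friendly-post.html?utm_source=x"
tokens = url_tokens(url)
print(len(tokens), tokens)  # 9 tokens: well within the 16-token budget
```

A "Too Many Tokens" check would then simply flag URLs where `len(url_tokens(url)) > MAX_TOKENS`.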

One thing is clear: I really have to put in place the integrated help system.

Usability: Progress Bar for long background tasks

When performing a long task - where long means "potentially more than one second" - Visual SEO Studio runs the task in a background thread, to avoid freezing the User Interface. So far so good: the UI stays responsive. But when a task took one minute or more (e.g. when loading and processing huge site data), users started to wonder whether the program was really working, and how long it would take.
Visual feedback is important; that's why this new version now shows progress bars when processing long tasks:

Long tasks give advancement feedback with a progress bar

The background task is usually split in two: loading the data - which can be instantaneous if it is already held by the memory manager, or can take up to a minute or more for very large datasets and slow computers - and processing it. The latter step is usually very fast, and rarely takes more than a second or two.
At present the progress bar doesn't distinguish between the two steps, but it will in future releases.
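
The pattern is a classic one; here is a minimal Python sketch of a worker thread reporting progress (names and structure are illustrative only, not taken from Visual SEO Studio's code):

```python
import threading
import time

def process_pages(pages, report):
    """Long-running task: call report(done, total) after each item,
    so the UI thread can update a progress bar."""
    total = len(pages)
    for done, page in enumerate(pages, start=1):
        time.sleep(0.01)          # stand-in for loading/processing work
        report(done, total)

progress = []
worker = threading.Thread(
    target=process_pages,
    args=(range(5), lambda done, total: progress.append((done, total))),
)
worker.start()
worker.join()        # a real UI would poll/repaint instead of blocking
print(progress[-1])  # (5, 5): the bar would now show 100%
```

Distinguishing the loading and processing steps, as planned, would just mean reporting them as separate phases through the same callback.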

A long standing crash fixed

Remember I told you about the long tasks performed in a background thread?
Well, it turned out I'm not that good at diagnosing crashes occurring during that phase: I can tell more or less what occurs, but not where.
One crash occurred while opening the Create Sitemap form, in the rare case when none of the crawled data was exportable to an XML Sitemap (e.g. when all pages were noindex).
The issue affected very few users, and I'm glad to say it's now gone.
Being able to fix a crash as soon as possible is vital, so in the future I will improve the diagnostics for those asynchronous tasks as well.

About the development process

As I said, being able to quickly publish a fix is extremely important. So far, I've been quite good at it, but with the user base increasing day by day, and new features being published, the chance of crashing bugs slipping through all the test nets increases.
Until now I've adopted a so-called "single branch" development model, where new features, bug fixes and experimental changes all share the same work space. It permitted me to speed up development in the initial stages, but now it would make it harder to quickly release an urgent bug fix, or a personalized private beta release (each time I had to disable a bunch of not-yet-ready features and test for regressions).

It's time to slow down a little in order to go faster afterward: I will soon adopt a two-branch development model, where all changes are first performed in a development branch, and only a selection of mature and well tested changes is pushed into the pre-production branch.

Conclusions, and What's Next

The 0.8.7 release gives increased inspection power to skilled SEOs; it also contains many more changes - for a detailed list of the changes, please refer to the full release notes.

What's coming next?
As usual, several features are queuing up for the next releases. I can't promise everything will make it, but here is a wish-list:

  • Refine the Custom Filters feature
  • Language localization: Visual SEO Studio will be translated into other languages; the first non-English one will be Italian ('cause I happen to speak it)
  • Usability: introduction of "usage paths", to guide the user through the most common tasks
  • Help system: it's a huge work, and has to be started
  • Ability to crawl a sitemap or a plain URL list, even of URLs from different sites
  • The ability to export XML Sitemap from any table listing a set of pages
  • Reviewing the usability of the Create XML Sitemap feature (it already is appreciated, but can be improved)
  • Basic integration with Yandex Webmaster Tools APIs
  • ...and many other usability improvements and fixes

So, have you tried the latest Visual SEO Studio version?
Then I'd love to hear your impressions and feedback!