According to Homer, Teukros fought in the Trojan War alongside his half-brother Ajax (Public Domain image, statue of Teukros in Pontevedra)
Crawl Ajax-based sites
Speaks Polish
Available-Memory checkpoints
SERP snippet preview updated
'Truncated titles' report updated
Usability improvements
Conclusions, and What's Next

 

Crawl Ajax-based sites

Ever had a client with a Wix site? Maybe a low-budget one?
Auditing Ajax-based sites is no longer a pain, thanks to Visual SEO Studio 'Teukros'.

droneripreseaeree.com, an Ajax site fully crawled by Visual SEO Studio

Ajax sites supporting the specification for crawlable Ajax (e.g. Wix sites) can now be visited like any other site.
The integration is seamless: pretty URLs with Ajax hash-bang fragments (#!) are reported as part of the URL path throughout the program.
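
For the curious, the crawlable Ajax scheme works roughly like this: the crawler rewrites the '#!' pretty URL into an '_escaped_fragment_' request, fetches the HTML snapshot the server returns, and reports the result under the pretty URL. Below is a minimal Python sketch of that URL mapping, just to illustrate the idea (it is not Visual SEO Studio's actual code, and the example URL is made up):

    from urllib.parse import quote, unquote

    def to_escaped_fragment(pretty_url: str) -> str:
        """Rewrite a '#!' pretty URL into the '?_escaped_fragment_=' form a crawler requests."""
        if "#!" not in pretty_url:
            return pretty_url  # not a crawlable-Ajax URL, fetch it as-is
        base, fragment = pretty_url.split("#!", 1)
        separator = "&" if "?" in base else "?"
        return f"{base}{separator}_escaped_fragment_={quote(fragment, safe='')}"

    def to_pretty_url(ugly_url: str) -> str:
        """Map the fetched URL back to the pretty form used for reporting."""
        if "_escaped_fragment_=" not in ugly_url:
            return ugly_url
        base, fragment = ugly_url.split("_escaped_fragment_=", 1)
        return f"{base.rstrip('?&')}#!{unquote(fragment)}"

    # Example (made-up URL):
    # to_escaped_fragment("http://example.com/#!/products/drone")
    #   -> "http://example.com/?_escaped_fragment_=%2Fproducts%2Fdrone"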

Please note this unique feature makes Visual SEO Studio one of the very few SEO tools, if not the only one, able to audit an Ajax-based web site.

A special thanks to my distant cousin who, after showing me his Wix-based Action Drone web site, made me realize I had to make the software able to crawl all that stuff.

Visual SEO Studio Speaks Polish

As previously announced, Visual SEO Studio now speaks Polish:

Visual SEO Studio with Polish user interface

With hundreds of Polish-speaking users, I felt a Polish translation was an important "nice to have". You can imagine my surprise when Mariusz Kołacz out of the blue volunteered for the task. Mariusz is a well-known, skilled SEO and software developer from Poland, a good match for the ideal translator profile for Visual SEO Studio.
The translation has been in the works for some time, delayed by our other commitments, and you can imagine how I felt having this little jewel almost complete but still needing some last-minute honing. Last-minute help came from my polyglot friend Łukasz Janowski, who also speaks IT and some SEO.

I'm now proud to say: "Visual SEO Studio mówi po polsku!".

Available-Memory checkpoints during crawling

I like to consider it a measure of the success Visual SEO Studio is experiencing:
an increasing number of users try to crawl very large sites until their PC exhausts the available free memory. The result is an "out-of-memory" crash.

Now, during spidering, the crawler does its best to check in advance whether there is enough free memory to continue the crawl session. If it detects there is not enough, it stops the crawl session, compacts the memory, and invites the user to close other programs.

Crawl session is stopped if not enough memory is detected to continue

The checkpoints are expected to greatly reduce the chance of an out-of-memory error occurring during crawling, but one is still theoretically possible.
Also, inspecting a crawl session that completed because of insufficient memory could itself lead to an out-of-memory crash. I've tuned memory consumption as best I could, and will improve it further in the future.
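
To give an idea of the mechanism, here is a minimal Python sketch of such a checkpoint inside a crawl loop. It is not Visual SEO Studio's actual code: the threshold, the 'psutil' dependency and the helper names are my own assumptions for illustration.

    import gc
    import psutil  # third-party library used here to read the free system memory

    MIN_FREE_BYTES = 512 * 1024 * 1024  # illustrative threshold: stop below ~512 MB free

    def enough_memory_to_continue() -> bool:
        """Checkpoint: is there enough free memory to keep crawling?"""
        return psutil.virtual_memory().available >= MIN_FREE_BYTES

    def crawl(frontier, fetch_page):
        """Crawl loop with periodic available-memory checkpoints."""
        visited = []
        while frontier:
            if not enough_memory_to_continue():
                gc.collect()  # try to reclaim and compact memory before giving up
                if not enough_memory_to_continue():
                    print("Crawl session stopped: not enough free memory. "
                          "Please close other programs and resume.")
                    break
            url = frontier.pop()
            visited.append(fetch_page(url))  # fetch_page is a placeholder callback
        return visited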

I spent a good part of a week testing the feature, trying to reach the edge.
I assure you it's a daunting task: your PC becomes unresponsive, slooooow, and it starts producing so many errors you didn't even imagine were possible. That is the scenario I've been working in.

SERP snippet preview updated

In March 2014 Google changed the layout of its SERP pages.
Title links now use a bigger font, and the way long titles are truncated has changed as well: truncation used to preserve entire words, while now it occurs at the character level.
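
To picture the difference, here is a rough Python sketch of character-level truncation against a pixel budget. The per-character widths are crude approximations of mine, not Google's real font metrics; only the 512px budget comes from the observed limit discussed below.

    # Rough sketch: truncate a title at the character level against a pixel budget.
    AVG_CHAR_PX = {"i": 5, "j": 5, "l": 5, "m": 15, "w": 14}  # a few sample widths (approximate)
    DEFAULT_CHAR_PX = 9   # fallback width for any other character (approximate)
    ELLIPSIS_PX = 12      # room reserved for the trailing "..." (approximate)

    def truncate_title(title: str, budget_px: int = 512) -> str:
        """Cut the title at the last character that still fits, then append an ellipsis."""
        used = 0
        for index, char in enumerate(title):
            used += AVG_CHAR_PX.get(char.lower(), DEFAULT_CHAR_PX)
            if used > budget_px - ELLIPSIS_PX:
                return title[:index].rstrip() + "..."  # character-level cut: words may break
        return title  # the whole title fits, no truncation needed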

SERP snippets emulate updated Google layout

Visual SEO Studio is up to date with the new layout and offers a correct simulation of Google's SERP snippets.

'Truncated titles' report updated

For all you meticulous SEOs and webmasters who polished your title tags to fit the 512px limit and avoid truncation, there is bad news and good news:

  • The bad news is you have to do it all over again:
    Google increased the font size and changed the truncation rule.
  • The good news is that Visual SEO Studio is the only SEO tool able to report all truncated titles together and show you a SERP snippet preview, up to date with the latest Google layout changes.

So just crawl your site, launch the "HTML Suggestions" report, click the "Truncated titles" tab, and save tons of time. And yes, you can export the page set to .xlsx (Excel native format, compatible with OpenOffice as well).

Usability improvements

The most immediate one is the use of a dedicated icon for robots.txt files in Crawl and Index View.

Worth noting: in tool windows (e.g. HTTP Issues, Find Results, ...) the logic used for the status of robots.txt file nodes (OK/error/warning) has changed.
'Not found' is now considered just a warning, not an error (a missing robots.txt file is perfectly valid), while a redirection is now considered an error instead of a warning (a redirected robots.txt file is invalid, even if tolerated by some search engines).
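
As a sketch of that classification logic (in Python, with a status code and a redirect flag standing in for the real fetch result; the function is mine, for illustration only):

    def classify_robots_txt(status_code: int, was_redirected: bool) -> str:
        """Classify a robots.txt fetch as ok/warning/error, per the logic described above."""
        if was_redirected:
            return "error"    # a redirected robots.txt is invalid, even if some engines tolerate it
        if status_code == 404:
            return "warning"  # a missing robots.txt is perfectly valid, just worth noticing
        if 200 <= status_code < 300:
            return "ok"
        return "error"        # any other failure (e.g. a 5xx response) is a real error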

There are several other minor improvements in this release, among them:

  • An expiration date is now shown for Administered Sites.
  • Page links now also show the content of the title attribute. It does not carry SEO value - to the best of my knowledge - but users might want to inspect it for usability issues, or to check it for keyword stuffing.

Conclusions, and What's Next

This version opens a new frontier in on-site SEO auditing:
taking Wix alone, there are 34 million Wix sites out there, and only Visual SEO Studio to help you audit them! These are huge opportunities an SEO professional can now tackle.

Furthermore, it is an important milestone for the Polish-speaking market, and a more stable release for all users.

It also brings several small improvements; for a detailed list of the changes, please read the full release notes.

The next immediate step will be a new feature:
crawling sitemaps (normal, nested via a sitemap index, and also read from the robots.txt Sitemap directive). The feature is half done, but I still have to make it robust against big data sets, compression, validation, etc.
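
As a rough sketch of what the feature involves, discovering sitemaps from robots.txt and walking nested sitemap indexes could look like the Python below. It is deliberately simplified (no compression, validation or large-file handling, which is exactly the hard part), and the function names are mine:

    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def sitemaps_from_robots(robots_url: str) -> list:
        """Collect the 'Sitemap:' directives listed in a robots.txt file."""
        with urllib.request.urlopen(robots_url) as response:
            lines = response.read().decode("utf-8", errors="replace").splitlines()
        return [line.split(":", 1)[1].strip()
                for line in lines if line.lower().startswith("sitemap:")]

    def urls_in_sitemap(sitemap_url: str) -> list:
        """Return page URLs, recursing into child sitemaps when an index sitemap is found."""
        with urllib.request.urlopen(sitemap_url) as response:
            root = ET.fromstring(response.read())
        if root.tag.endswith("sitemapindex"):  # index sitemap: recurse into its children
            urls = []
            for loc in root.findall("sm:sitemap/sm:loc", NS):
                urls.extend(urls_in_sitemap(loc.text.strip()))
            return urls
        return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]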

Then the next priorities will be:

  • Being able to smoothly upgrade the database schema.
  • Dramatically reducing memory consumption.
    My goal is to be able to smoothly crawl 500K to 1 million URLs without issues. Yes, it is theoretically feasible.

OK, enough reading. You surely have plenty of work to do, with Ajax sites to audit and titles to review.
So do download the latest Visual SEO Studio release, and unleash 'Teukros'!