Temptation, by William-Adolphe Bouguereau (derived from public domain artwork)

30 days Pro Trial
Two new crawl options
More power to "Crawl URL list"
Spanish translation revamped
Conclusions, and what’s next


Free users can try the Professional Edition for 30 days

There's a lot of cool stuff in the Professional Edition to help you save time and do your SEO work better; we realized many users didn't know about its features, so we decided to let them test the full, unlimited edition for 30 days, free of charge.

The Free Trial registration window

After the Trial period, users can decide whether to upgrade to the paid Professional Edition, or continue using the free Community Edition as before. Sounds like a good deal!

Two new crawl options

Visual SEO Studio is very strict about respecting the standard Robots Exclusion Protocol. That's a good thing; there are cases, however, where being stricter than the most commonly used search engine makes it harder to inspect other issues before fixing the REP violations.
Based on users' feedback, we decided to let users override the standard behaviour to match what googlebot would do.

Take the standard-compliant behaviour when the robots.txt file returns a 401 "Unauthorized" or a 403 "Forbidden" HTTP status code: Googlebot doesn't comply with the standard, and interprets all 4xx codes as "full allow".
According to the standard REP, for robots.txt only the 200 "OK", 404 "Not Found" and 410 "Gone" status codes should be interpreted that way.

Fred's opinion on googlebot's respect for REP

While we believe Google's behaviour is not correct, and we also have to take other search engines into account, we need to enable our users to inspect a web site as Google does, even in such uncommon cases. So we now permit users to override the standard behaviour.

the new crawl options

There is another case where googlebot goes against the standard: when a request for the robots.txt file redirects to the root address (i.e. the website home page). This scenario is more common: many CMSs redirect any resource not found to the / root address instead of returning a 404 "Not Found" status code as they should, robots.txt included. Here we don't blame googlebot much for its choice, as it is unlikely the webmaster's intent was to prevent search engines from crawling and indexing their content.
It is correct for an SEO tool like Visual SEO Studio to report the issue; at the same time, we understood we needed to permit overriding the standard behaviour to let our users inspect content like the major search engine would.
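The two interpretation modes described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Visual SEO Studio's actual code: the function and return-value names are made up, and the 5xx handling is an assumption of the sketch.

```python
# Hypothetical sketch of the two robots.txt interpretation modes discussed
# above; names are illustrative, not Visual SEO Studio's actual code.

def robots_txt_policy(status, redirected_to_root=False, googlebot_like=False):
    """Return how a crawler should treat a site given its robots.txt fetch.

    "parse"         -> read and apply the robots.txt rules
    "full_allow"    -> crawl as if no robots.txt existed
    "full_disallow" -> do not crawl
    "report_issue"  -> strict mode flags the misconfiguration to the user
    """
    if redirected_to_root:
        # Many CMSs redirect any missing resource (robots.txt included) to
        # the "/" root instead of returning 404; googlebot-like mode treats
        # this as if no robots.txt existed.
        return "full_allow" if googlebot_like else "report_issue"
    if status == 200:
        return "parse"
    if status in (404, 410):
        # Standard REP: a missing robots.txt imposes no restrictions.
        return "full_allow"
    if 400 <= status < 500:
        # 401/403 (and other 4xx): the standard restricts access, while
        # googlebot interprets all 4xx codes as "full allow".
        return "full_allow" if googlebot_like else "full_disallow"
    # 5xx and anything else: play it safe (an assumption of this sketch).
    return "full_disallow"
```

For example, `robots_txt_policy(403)` yields `"full_disallow"` in strict mode, while `robots_txt_policy(403, googlebot_like=True)` yields `"full_allow"`.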

More power to "Crawl URL list"

We are really proud of the Crawl URL list feature. Our users love it. It saves tons of time when merging different URL lists from several backlink provider sources.
Now it is even more powerful!

The improvements in Crawl URL List

After listening to our users, we added some crawl options here too: those that make the most sense in this context.
We also gave users an immediate way to see the distinct domains and sub-domains in the merged URL list (yes, you can export all the lists!).
And we added support for one more backlink intelligence provider among the supported CSV formats.
New suggestions are already pouring in from our user base, so expect more improvements in the future!

Spanish translation revamped

Even if we were told the Spanish translation was mostly OK, we knew it was not up to par with the quality we want to offer. We had been looking for a maintainer for some time, and we finally found Javier Marcilla, a talented web marketer and SEO who gladly took up the task. Javier is not the kind to waste time, and he has already submitted a huge number of corrections to the translation. Spanish-speaking users will certainly appreciate his effort!

Conclusions, and what's next

There's much more in this release; for the complete (and boring) list, see the Release Notes. There's also a lot of stuff we are still working on.
Development will stop only for a few days next week: we will be at the Web Marketing Festival in Rimini, 8-9 July 2016. Federico will give a speech about Crawl Budget Optimization, and the rest of the team will be at the sponsor's stand and attend the tracks. It will be a great opportunity to meet our users and exchange ideas. If you want to join us at WMF16, you are still in time to use the entrance discount coupon for the event: wmf16spons
After the beaches of Rimini, full steam ahead developing new features, with renewed enthusiasm.

Now, are you using the free Community Edition?
Register for free for the 30-day Trial period and unleash the power of the Professional Edition!


Comments are open on the linked Google+ page.