The new version of the SEO tool sports a whole set of features to help webmasters working with case-insensitive web servers (e.g. IIS), along with other valuable improvements.
Ladies and gentlemen, let me guide you through the most important of them.
SEO case-sensitivity issues easier to deal with
We'll start this tour from the main entrance, the "Crawl a Site..." command.
You can see there is a new crawl option: "Ignore upper/lower case differences in URLs" to treat URLs as case-insensitive.
A new crawl option in Visual SEO Studio
The option is not ticked by default, as URLs are case-sensitive by specification. When the option is not used, Visual SEO Studio's crawler makes it easy to spot case-sensitivity issues caused by bad internal linking on a case-insensitive web server (e.g. MS IIS):
Crawl and Index of a site with URL case-sensitivity issues
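To see why case matters, consider that by specification two URLs differing only in path casing are distinct resources, yet a case-insensitive server serves the same content for both. A minimal sketch (not the tool's actual code) of the normalization involved:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_casing(url: str) -> str:
    """Lower-case the path of a URL, emulating how a case-insensitive
    server (e.g. IIS) resolves it. Scheme and host are case-insensitive
    by specification; path and query are not."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path.lower(), parts.query, parts.fragment))

# Per RFC 3986 these are two distinct URLs...
a = "http://example.com/Products/Page.aspx"
b = "http://example.com/products/page.aspx"
assert a != b
# ...but a case-insensitive server returns the same page for both,
# which search engines see as duplicate content.
assert normalize_casing(a) == normalize_casing(b)
```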
There are special cases though when an SEO professional might want to ignore casing to focus on other issues.
Suppose she has already assessed internal duplicate-content issues caused by badly cased internal links on an IIS web server. She can now concentrate on other SEO tasks until the internal links are taken care of.
detail of the new crawl option, ticked
The same site crawled with the new option
Now please gather around, folks, and let's have a look at two new reports in the 'URL Suggestions' area, dedicated to potential case-sensitivity issues:
Two new reports in URL Suggestions
"URLs differing in casing only" helps better spotting internal duplication issues due to wrongly cased URLs in internal links in case of case-insensitive web server
"Duplicate canonical tags (case-insensitive)" takes care of reporting similar issues with badly formed canonical link tags.
Hint: the "Duplicate title tags (canonical)" report in "HTML Suggestions" is another important tool to locate case-sensitivity issues in URLs. The new reports are more focused on the exact potential cause, though.
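The logic behind the case-insensitive canonical report can be sketched as a simple grouping step. This is an illustrative sketch, not the tool's implementation; the function name and data shape are hypothetical:

```python
from collections import defaultdict

def duplicate_canonicals_case_insensitive(canonicals):
    """Find canonical link tags that differ only in casing.
    `canonicals` maps page URL -> canonical tag value found on the page."""
    groups = defaultdict(list)
    for page, canonical in canonicals.items():
        groups[canonical.lower()].append(page)
    # Keep only groups where more than one distinct spelling appears:
    # those canonicals conflict with each other on a case-insensitive server.
    return {key: pages for key, pages in groups.items()
            if len({canonicals[p] for p in pages}) > 1}

pages = {
    "/a": "http://example.com/Shop/Item1",
    "/b": "http://example.com/shop/item1",
    "/c": "http://example.com/other",
}
dupes = duplicate_canonicals_case_insensitive(pages)
assert set(dupes["http://example.com/shop/item1"]) == {"/a", "/b"}
```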
Based on users' feedback, the 'Canonical' column has been added to both the "Duplicate canonical tags" and "Duplicate canonical tags (case-insensitive)" reports. This lets you see whether a case-sensitivity issue is already dealt with via a canonical URL tag.
Now, please follow me to the next new feature...
Full support of all possible HTTP and robots meta directives
The program now fully supports the X-Robots-Tag HTTP header, both for the generic and for bot-specific user-agents.
It also supports bot-specific meta tag directives (e.g. <meta name="botname" content="noindex, nofollow" />).
Properties window, new fields dedicated to HTTP and Meta robots directives
Since "index" and "follow" directives can arrive via both generic and bot-specific meta directives and HTTP headers, the page Properties window provides two new boolean fields, "Index" and "Follow", showing the combined effect of all the different directives.
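Combining directives from several sources can be sketched as follows. The tool's exact precedence rules aren't documented here; this sketch assumes the most restrictive directive wins, which is how major search engines resolve conflicting robots directives. Function and parameter names are hypothetical:

```python
def effective_directives(meta_generic="", meta_bot="",
                         xrobots_generic="", xrobots_bot=""):
    """Combine robots directives from generic/bot-specific meta tags
    and X-Robots-Tag HTTP headers into Index/Follow booleans.
    Assumption: any applicable 'noindex'/'nofollow'/'none' wins."""
    tokens = set()
    for source in (meta_generic, meta_bot, xrobots_generic, xrobots_bot):
        tokens.update(t.strip().lower() for t in source.split(",") if t.strip())
    index = "noindex" not in tokens and "none" not in tokens
    follow = "nofollow" not in tokens and "none" not in tokens
    return index, follow

# A generic meta robots "index, follow" combined with an
# X-Robots-Tag header "noindex" yields Index=False, Follow=True.
assert effective_directives(meta_generic="index, follow",
                            xrobots_generic="noindex") == (False, True)
```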
The feature is reflected in all parts of the software. Please have a look at the Custom Filters section:
The Custom Filters SEO-oriented query engine now uses the same combined logic when querying, saving users tons of time via the brand new "RobotsDirective" operator.
Custom Filters new operator, "RobotsDirective"
There's no need to express the full (no)index/(no)follow logic by combining all the field values with multiple operators: the new RobotsDirective operator takes care of it for you.
Note: the new operator replaces the old one, which only took the generic meta robots tag into account.
Customizable URL Parameters and extensions
Let's now move on to the Preferences window.
Users can now add their own set of URL parameters to be ignored during a site exploration. Parameters are organized in three groups ("Session IDs", "Tracking Tags" and "Miscellaneous") just for clarity.
The newly organized URL Parameters option window, now customizable
Similarly, the set of file extensions to be ignored can also be extended with your own.
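Ignoring URL parameters during a crawl essentially means stripping them before comparing URLs, so that addresses differing only in session IDs or tracking tags collapse to one crawl target. A minimal sketch under that assumption; the parameter sets below are illustrative examples, not the tool's built-in lists:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameter sets, mirroring the three groups in the UI
SESSION_IDS = {"sessionid", "sid", "phpsessid"}
TRACKING_TAGS = {"utm_source", "utm_medium", "utm_campaign"}
MISCELLANEOUS = {"ref"}
IGNORED = SESSION_IDS | TRACKING_TAGS | MISCELLANEOUS

def strip_ignored_params(url: str) -> str:
    """Remove ignored query parameters so URLs that differ only in
    session IDs or tracking tags are treated as the same page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in IGNORED]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

assert (strip_ignored_params("http://example.com/p?id=7&utm_source=nl&sid=abc")
        == "http://example.com/p?id=7")
```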
There are several minor improvements as well. For example, Page Links now also shows the value of the "target" attribute, when present.
Please folks, come a little closer and observe this detail:
The Session window shows the new crawl option value, the case-sensitivity of the visited URLs.
Furthermore, now that the crawler supports bot-specific HTTP and meta robots directives, the user-agent they apply to is shown as well.
The crawl options shown in Session window
Support for multiple .NET Framework versions has been improved.
Visual SEO Studio can now run on any .NET Framework version from 3.5 upward. It previously required .NET 3.5 to be installed and enabled, which prevented some users from installing it with ease.
A new Setup wizard, redone from scratch, permits complete support of all .NET Framework versions (3.5 or greater) and better complies with EU and Italian law when asking for EULA acceptance.
Furthermore, a good bunch of bug fixes enriches the 'iOnic' release. For the full (and boring) list, please consult the official Release Notes.
Conclusions, and What's Next
I hope you enjoyed this tour as much as I did being your guide.
After a period of reorganization, program development is now proceeding at full speed, with a tighter release cycle scheduled, so expect more news soon.
In the meantime, all those IIS-based web sites with case-sensitivity issues are waiting for you to fix them, and now you have the proper tool to deal with them in no time.
So stop procrastinating: fire up the new Visual SEO Studio 'iOnic' and hunt them all!
Comments are open on the linked Google+ page.