I started working on the new features months ago, and routinely had to delay their release due to other priorities. Having to sacrifice new features is always painful, so I'm very glad to finally announce that 'Piri Reis' is out!
Crawl XML Sitemaps, Index Sitemaps and robots.txt
Other programs are able to crawl sitemap files, but Visual SEO Studio takes it to another level: not only can it crawl normal XML Sitemap files, it can also recursively explore Index Sitemap files, and even robots.txt files listing sitemaps via the "Sitemap:" directive.
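To illustrate the idea (this is a minimal sketch, not Visual SEO Studio's actual implementation, and all function names are hypothetical): sitemap URLs can be discovered from the "Sitemap:" lines of a robots.txt file, and an Index Sitemap is expanded recursively because its `<sitemap><loc>` entries point to further sitemap files rather than to pages.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemaps_from_robots(robots_txt):
    """Extract sitemap URLs listed with the 'Sitemap:' directive in robots.txt."""
    urls = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            urls.append(value.strip())
    return urls

def expand_sitemap(xml_text, fetch):
    """Recursively expand a sitemap: an index sitemap (<sitemapindex>)
    references further sitemap files via <loc>, while a normal sitemap
    (<urlset>) lists page URLs.  `fetch` maps a URL to its XML text."""
    root = ET.fromstring(xml_text)
    page_urls = []
    if root.tag == SITEMAP_NS + "sitemapindex":
        for loc in root.iter(SITEMAP_NS + "loc"):
            page_urls.extend(expand_sitemap(fetch(loc.text.strip()), fetch))
    else:  # a normal <urlset> sitemap
        for loc in root.iter(SITEMAP_NS + "loc"):
            page_urls.append(loc.text.strip())
    return page_urls
```

The `fetch` callback keeps the sketch free of networking concerns; a real crawler would plug in an HTTP client there.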
The new "Crawl XML Sitemap" dialog
The Crawl View, with its tree view, is particularly well suited to show the nested structure of the new exploration mode.
During Sitemap exploration, a first protocol-specific validation is performed, and URLs not complying with the specifications detailed on sitemap.org are skipped and reported as non-crawlable.
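A few of the checks the sitemap.org protocol mandates can be sketched as follows (a hypothetical helper, not the program's actual code): a listed URL must be an absolute http(s) URL, must not exceed 2,048 characters, and must live at or below the directory containing the sitemap itself.

```python
from urllib.parse import urlsplit

MAX_URL_LENGTH = 2048  # hard limit set by the sitemaps.org protocol

def is_crawlable(url, sitemap_url):
    """Check a <loc> URL against a few sitemaps.org rules: it must be
    an absolute http(s) URL, at most 2048 characters long, and located
    at or below the directory the sitemap itself lives in."""
    if len(url) > MAX_URL_LENGTH:
        return False
    parts = urlsplit(url)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        return False
    # Location rule: a sitemap at /catalog/sitemap.xml may only list
    # URLs whose path starts with /catalog/
    base = sitemap_url.rsplit("/", 1)[0] + "/"
    return url.startswith(base)
```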
A damn-fast and comprehensive Sitemap Validation Tool
Building a complete XML Sitemap validation tool able to validate in the blink of an eye has been technically challenging. The end result is a little jewel.
You can imagine the satisfaction the moment it was able to perform, in no time, a complete validation not only against all the sitemap schemas in use, but also against a rich set of validation rules derived from the official protocol specifications detailed on the sitemap.org consortium web site.
Validation is performed for:
- normal sitemap schema
- index sitemap schema
- image sitemap schema
- video sitemap schema
- mobile sitemap schema
- news sitemap schema
- alternate/hreflang schema
- XML Sitemap protocol rules for normal sitemaps
- XML Sitemap protocol rules for index sitemaps
- encoding, HTTP response codes, ...
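To give a flavor of the per-entry protocol rules such a validator enforces (a sketch under my own assumptions, not the tool's real code): every `<url>` entry requires a `<loc>`, and an optional `<lastmod>` must hold a W3C Datetime value.

```python
import re
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
# W3C Datetime: YYYY-MM-DD, optionally followed by a time with timezone
W3C_DATETIME = re.compile(
    r"^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}(:\d{2}(\.\d+)?)?(Z|[+-]\d{2}:\d{2}))?$")

def validate_urlset(xml_text):
    """Return a list of human-readable rule violations for a normal sitemap."""
    errors = []
    root = ET.fromstring(xml_text)
    for i, url in enumerate(root.iter(NS + "url"), start=1):
        loc = url.find(NS + "loc")
        if loc is None or not (loc.text or "").strip():
            errors.append(f"<url> entry {i}: missing required <loc>")
        lastmod = url.find(NS + "lastmod")
        if lastmod is not None and not W3C_DATETIME.match((lastmod.text or "").strip()):
            errors.append(f"<url> entry {i}: <lastmod> is not a W3C Datetime")
    return errors
```

Collecting violations instead of failing at the first one is what lets a validator report everything wrong with a sitemap in a single pass.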
Like all grids within Visual SEO Studio, all data can be copied or exported to Excel or CSV format (right-click the grid header to access the export command), and all cell contents can be copied using the CTRL+C keyboard combination.
Changes to project data format handled with one click
For the first time in many releases, I changed the format used to store the project files holding all crawl data.
With thousands of users already having crawled millions of URLs, existing projects saved with previous versions of Visual SEO Studio had to remain usable.
Old crawl data is converted with one click
The program now recognizes older projects and offers to convert them to the new version. It creates a backup copy first, then updates the old data. I tested it thoroughly, and it works fast and smoothly.
Conclusions, and What's Next
The new version is an important step in the product evolution. The new crawl features and the new db format lay the foundations to further improvements I'm really excited to deliver in the near future.
The current release has much more in it. For a detailed list of the changes, please read the full release notes.
The new sitemap crawl features still need some improvements: RSS, Atom and plain-text sitemaps are not supported yet, and compressed sitemaps are supported only if the web server provides a proper HTTP header declaring the compression type.
Some crawl options already available in the normal crawl mode will be added for the sitemap exploration too.
Sitemaps can also be huge, and the maximum URL limits could be hit. These issues will be addressed in the near future.
For the short term, a new version of the product will come soon, mainly addressing overall program usability.
OK, enough talking. Time to take the map and navigate your content with Visual SEO Studio!
Comments are open on the linked Google+ page.