Crawl XML Sitemaps and robots.txt

Auditing XML Sitemaps couldn't be any easier: all it takes is entering the URL of a normal Sitemap, an Index Sitemap, or even a robots.txt file!

Crawling XML Sitemap URLs is a common task for SEO and e-commerce professionals. Visual SEO Studio gives you all the power you need to perform it and spot at a glance any potential issue with your sitemaps.

Sitemaps can be crawled recursively, and are presented nested within the intuitive user interface. Not only can you crawl normal or index sitemaps: the program goes a step further and even lets you crawl all the XML Sitemaps listed within a robots.txt file via the Sitemap: directive.

Screenshot: three-level crawling of XML Sitemaps: robots.txt, index and normal sitemaps
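
For the curious, here is a minimal sketch, in Python with only the standard library, of the kind of three-level discovery described above: collecting Sitemap: directives from robots.txt, then recursing through index Sitemaps down to the page URLs. The example.com URL is a placeholder, and this is an illustration only, not the program's actual crawler:

    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    def sitemaps_from_robots(robots_url):
        """Collect every 'Sitemap:' directive listed in a robots.txt file."""
        text = fetch(robots_url).decode("utf-8", errors="replace")
        return [line.split(":", 1)[1].strip()
                for line in text.splitlines()
                if line.lower().startswith("sitemap:")]

    def urls_from_sitemap(sitemap_url):
        """Recursively expand index Sitemaps into the page URLs they list."""
        root = ET.fromstring(fetch(sitemap_url))
        if root.tag.endswith("sitemapindex"):   # index Sitemap: recurse
            urls = []
            for loc in root.findall("sm:sitemap/sm:loc", NS):
                urls.extend(urls_from_sitemap(loc.text.strip()))
            return urls
        return [loc.text.strip()                # normal Sitemap: leaf URLs
                for loc in root.findall("sm:url/sm:loc", NS)]

    # Example usage (hypothetical domain):
    # for sitemap in sitemaps_from_robots("https://example.com/robots.txt"):
    #     print(urls_from_sitemap(sitemap))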

"Crawl Sitemaps" main features:

  • Crawls all URLs listed in a normal XML Sitemap file
  • ...or all URLs listed in the sitemaps referenced by an Index Sitemap file
  • ...or takes an even further step and crawls all sitemaps listed within a robots.txt file
  • Visually reports all non-crawlable items (blocked by robots.txt, out of sitemap scope, ...); see the sketch after this list
  • Gives you all the reporting power provided for normal site explorations
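
As a small illustration of the robots.txt check behind that "non-crawlable" reporting, a crawler can test each Sitemap URL with Python's standard urllib.robotparser. The URLs below are placeholders, and the sketch only shows the general idea, not the program's internals:

    from urllib import robotparser

    rp = robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()  # download and parse the robots.txt rules

    sitemap_urls = [
        "https://example.com/allowed-page/",
        "https://example.com/private/blocked-page/",
    ]

    for url in sitemap_urls:
        if not rp.can_fetch("*", url):
            # such URLs would be reported as non-crawlable
            print("Blocked by robots.txt:", url)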

...oh, and yes, sitemaps are fully validated too:

Screenshot: the Validator tool applied to an XML Sitemap

We believe Visual SEO Studio has the fastest and most comprehensive XML Sitemap validator on the market: a real hidden gem!
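
To give an idea of what such validation involves, here is a minimal sketch of a few checks drawn from the sitemaps.org protocol (well-formed XML, the correct urlset namespace, the 50,000-URL limit, valid loc values). It is an illustration only, not the program's validator:

    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def validate_sitemap(xml_bytes):
        """Return a list of validation errors (an empty list means valid)."""
        try:
            root = ET.fromstring(xml_bytes)
        except ET.ParseError as e:
            return ["Not well-formed XML: %s" % e]
        errors = []
        if root.tag != "{%s}urlset" % SITEMAP_NS:
            errors.append("Root element must be <urlset> in the sitemaps.org namespace")
            return errors
        locs = root.findall("{%s}url/{%s}loc" % (SITEMAP_NS, SITEMAP_NS))
        if len(locs) > 50000:
            errors.append("Too many URLs: %d (the protocol allows at most 50,000)" % len(locs))
        for loc in locs:
            url = (loc.text or "").strip()
            if urlparse(url).scheme not in ("http", "https"):
                errors.append("Invalid <loc> value: %r" % url)
        return errors

    # Example usage with an inline one-URL sitemap:
    sample = (b'<?xml version="1.0"?>'
              b'<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
              b'<url><loc>https://example.com/</loc></url></urlset>')
    print(validate_sitemap(sample))  # -> []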

Note:
"Crawl Sitemaps" is available only in the Professional Edition. You can evaluate it for free for 15 days by registering the Trial version.

See also: