New crawl option to ease auditing redirections
Custom Filters, added new queryable property
Conclusions, and what’s next
Released a couple of days ago, "SEO Swallow" adds minor improvements that make bulk-checking the correctness of HTTP redirects much easier. A typical scenario is when you need to audit a site migration, be it a change of domain name or protocol.
For the specific case of HTTP to HTTPS migrations, we also made it easier to detect mixed content.
After the new changes, we decided to provide our users with a thorough tutorial explaining how to use our favourite SEO tool to audit a migration to HTTPS: Auditing a HTTP to HTTPS migration
New crawl option to ease auditing redirections
Visual SEO Studio has always been very strict in following the Robots Exclusion Protocol (REP). To fully comply with it, a 30x redirection on a robots.txt file should be interpreted as a "full disallow", preventing any attempt to request pages within the web property controlled by the robots.txt file (remember, each subdomain is controlled by its own robots.txt).
Despite what the REP says, a 301 redirection on a robots.txt file is a terribly common case in a site migration: webmasters usually write a single redirect rule for the whole site content. Search engines know it and are tolerant.
Visual SEO Studio is now tolerant too by default (a minor breaking change from the past) and gives its users the chance to control the behaviour with a new crawl option:
The new crawl option for 30x redirect on robots.txt
The new option "For robots.txt file, treat a redirection to [other domain]/robots.txt as 'full allow'" is available both in "Crawl Site" and "Crawl URL List".
The new option defaults to true, as it better matches the most common usage scenario (even though the REP rule would dictate the contrary).
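The decision logic behind the option can be sketched as follows. This is a minimal illustration, not Visual SEO Studio's actual implementation: the function name, parameters, and return values are all assumptions made for the example.

```python
from urllib.parse import urlsplit

def robots_policy(origin_url, status, redirect_target=None,
                  treat_redirect_as_full_allow=True):
    """Decide the effective robots.txt policy for a host whose
    /robots.txt request returned `status` (hypothetical helper).

    Returns "rules", "full allow", or "full disallow".
    """
    if status == 200:
        return "rules"                      # parse the fetched directives
    if 300 <= status < 400 and redirect_target:
        origin_host = urlsplit(origin_url).hostname
        target = urlsplit(redirect_target)
        # A redirect to [other domain]/robots.txt: a strict REP reading
        # means "full disallow"; the new default option instead mirrors
        # search engines' tolerance and treats it as "full allow".
        if target.hostname != origin_host and target.path == "/robots.txt":
            return ("full allow" if treat_redirect_as_full_allow
                    else "full disallow")
        return "full disallow"              # strict REP fallback
    if status == 404:
        return "full allow"                 # a missing robots.txt permits all
    return "full disallow"                  # e.g. 5xx: err on the safe side

# A 301 from the old domain to the new one, as seen in a site migration:
print(robots_policy("https://old.example/robots.txt", 301,
                    "https://new.example/robots.txt"))
```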
Custom Filters, added new queryable property
Our users love the power of Custom Filters, yet on a regular basis someone points out we forgot something, and (s)he usually is darn right.
We forgot to permit filtering over the entire URL (we only permitted filtering over the URL path).
Sorry guys/gals, now the missing bit is no longer missing:
Custom Filters: filtering a set of pages over their URL initial part
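To see why filtering on the entire URL matters, here is a small sketch (the URLs are made up for illustration): the path alone cannot distinguish scheme or host, while the full URL can.

```python
from urllib.parse import urlsplit

# Hypothetical crawled URLs, illustration only.
urls = [
    "https://example.com/blog/post-1",
    "https://shop.example.com/blog/post-2",
    "http://example.com/blog/post-3",
]

# Filtering on the URL *path* alone matches all three pages,
# because scheme and host are not part of the path:
by_path = [u for u in urls if urlsplit(u).path.startswith("/blog/")]

# Filtering on the *entire* URL can pin down scheme and host too,
# e.g. only the HTTPS pages on the main domain:
by_full_url = [u for u in urls if u.startswith("https://example.com/")]

print(len(by_path), len(by_full_url))
```

During an HTTP to HTTPS migration this is exactly what you need, e.g. to isolate pages still served over the old protocol.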
Conclusions, and what’s next
SEO Swallow is a major help for auditing site migrations, yet it can be improved. We received precious suggestions from our beta testers, and they were added to the list.
There are several other changes and improvements in SEO Swallow; for a complete list, please check the Release Notes.
What's next? There is a lot of good stuff in the works; on this occasion we want to reassure our users of one thing: yes, we are working on a Mac port. We are not giving a release date at this stage.
Now, don't let that migration audit wait any longer, launch Visual SEO Studio "SEO Swallow"!