Frequently Asked Questions

Got any questions about Visual SEO Studio? See if this list can satisfy your curiosity.

  1. Licenses, costs, and Editions
  2. Crawl issues
  3. Various

Licenses, costs, and Editions

Is Visual SEO Studio a Free product?

Visual SEO Studio comes in two flavours:

  • Community Edition
    It's Free. Free as in "free beer". No charge. Gratis. No matter whether you use it for personal or professional use. Enjoy it.

  • Professional Edition
    You can get the paid version by installing the Community Edition and selecting the "Get the Professional Edition" menu option to purchase a licence.
    You can evaluate it for free for 15 days by registering the Trial version.

Does Visual SEO Studio "Community Edition", as a Free product, have limitations?

Yes. You can find more on the Editions comparison page.
The Free version will always be worth using for non-professional (and often, even professional) use.

How can I evaluate the 15-day Trial version?

Easy: download the Community Edition; when you launch it for the first time, a window will prompt you to register for the Trial. In case you closed it, you can always invoke it from the Help menu with the "Register the Professional Edition trial" option.
All it takes is your e-mail address; you will receive an Activation Code to enter at the end of the registration. As soon as you complete the registration, the installed product will switch to the Professional Edition for a period of 15 days: no limitations and all features enabled.
The Trial evaluation period is available only once.
Please see the page How to activate the 15-day Trial version for further details.

How can I purchase the Professional Edition license?

This is detailed in the page How to Purchase a Licence.
Available payment methods are: Credit Card, PayPal, and Wire Transfer.
See also the page Pricing.

How much does a Visual SEO Studio license cost?

Please check our Pricing page. The final price may vary slightly depending on the taxation applied.
VAT applies to B2C customers in the European Union and to Italian B2B customers, at the rate in force in the client's country of residence. In all other cases, only a €2 stamp duty is applied.

Can I have a discount if I purchase more licenses?

Yes. Please contact us for details.

Can I use a Visual SEO Studio license on more than one PC?

No. The Visual SEO Studio license model is "per-seat": a license is valid on a single PC only.
It is bound to the specific PC's "hardware signature", so it cannot work on another machine.

Can I move my paid license to another PC?

Yes, just contact us and it can usually be solved in a matter of minutes.
You'll be asked:

  • to demonstrate that you are the actual license holder or authorized by them (you might find it weird, but we have to verify it, and it is not always immediate, as the person contacting us might be different from the holder, use a different e-mail address, etc.)
  • possibly the original license serial number (a.k.a. activation code). That would speed up the process, but if you can't find it, we can look it up.
  • the "hardware signature" of the new PC you want to move the paid license to. You can copy-and-paste it from within Visual SEO Studio (Help menu, "About..." window).

The product started in Community mode, and my activation code no longer works

If the license is still valid (check that it has not expired!), it is possible that the hardware signature of your PC has changed because of a recent configuration change. It could be that you replaced your hard disk, renamed the PC, or made similar changes. Some major Windows 10 upgrades have also been reported to change the hardware signature.
If that's your case, please contact us. We will move the license to the new configuration.

How to contact you for license issues?

The simplest way is by using the built-in "Send us a message" feature accessible from within Visual SEO Studio (Help menu, "Send us a message" option), so we will automatically also know your current hardware signature.
Please don't forget to supply an e-mail address: if your configuration has changed, we might not be able to deduce the license holder. Alternatively, please provide all the details needed to identify you, and explain your case.

Window 'Send the author a message'
Please remember to provide an e-mail address so we can reply to you.

We will reply to you via e-mail as soon as possible.

Crawl issues

The program crawls this site very slowly, with 2s delay between every call. Can I speed it up?

Check the "Crawl-Delay" value reported in the Crawl Progress right panel. If it is greater than zero, it is highlighted in red; in this case it could be 2s.
The robots.txt file of the site you are crawling sets a Crawl-Delay directive to prevent agents from overloading the server resources; Visual SEO Studio honours it up to a maximum of two seconds (a limit considered highly conservative today).
For verified sites - where, since you have demonstrated administrative permissions, you can basically do whatever you want - you can override it by setting a courtesy delay of 0.0s in the crawl options, "Crawl speed" tab.
See also the page Managing the Verified Sites list.
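As a sketch of how such a directive is read, Python's standard urllib.robotparser module can parse a robots.txt file and report both the crawl delay and the Disallow rules. The robots.txt content below is a hypothetical example, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content with a Crawl-delay directive.
robots_txt = """\
User-agent: *
Crawl-delay: 2
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# crawl_delay() returns the delay (in seconds) the crawler is asked to respect.
print(parser.crawl_delay("Pigafetta"))  # 2

# can_fetch() tells whether a given user-agent may request a given URL.
print(parser.can_fetch("Pigafetta", "https://example.com/private/page"))  # False
```

A crawler honouring this file would wait two seconds between requests, exactly the delay Visual SEO Studio caps its own politeness at.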

The program does not crawl a site.

There are several possible explanations, each with its own solution:

  1. No resources were crawled, not even the site robots.txt file
    Most likely there was a network error while attempting to crawl the site. It could be a DNS error, a firewall, or a proxy issue. Check the details in the bottom Output panel. You can configure your proxy settings from the Tools -> Preferences menu item.
    Check whether you can browse the site with your preferred browser. If the browser can't either, it could be an issue with the web site: a temporary glitch, or something more serious to address with your site administrators.
  2. Only the robots.txt file has been crawled, but it reports an error
    Visual SEO Studio fully respects the Robots Exclusion Protocol; in order to do so, it first has to download the site robots.txt file to learn what limitations the site administrators asked crawlers to comply with. If it cannot read the file, it conservatively treats it as a "do not crawl" instruction (note that a missing robots.txt file is not considered a block). Check the details in the bottom Output panel.
  3. Only the robots.txt file has been crawled, with no errors
    In this case it could be that the site administrators prevented access to the spider with a Disallow directive in the robots.txt file (note: the default Visual SEO Studio user-agent is "Pigafetta"). Check the details in the bottom Output panel.
    If you are a site owner and you have verified the site, you can ignore or override robots.txt directives, change the user-agent, and basically do whatever you want in your own property.
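For reference, a robots.txt that blocks the "Pigafetta" user-agent while leaving other crawlers unrestricted could look like this (hypothetical example):

```
# Block only Visual SEO Studio's default user-agent
User-agent: Pigafetta
Disallow: /

# All other crawlers: no restrictions
User-agent: *
Disallow:
```

If you find a rule like the first one on a site you own, verifying the site in Visual SEO Studio lets you override it without editing the file.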

The program only crawls the first page of a site.

The most likely reason is that the web page content has been truncated before any link definition in the HTML.
This can happen in web pages full of on-page CSS and scripts; WordPress plugins are common offenders. To check whether that is your case, select the page node and look at the Properties right panel. Is the "Truncated" property true? Then look at the Content panel: you will likely see that the page HTML header is huge and the truncation occurs before the body definition.
To work around the issue, raise the "Maximum download size per URL (KB)" option and crawl the website again.

After crawling a site for a while the crawler gets HTTP 403 responses.

This is quite uncommon, as Visual SEO Studio is probably the most polite SEO spider on earth: not only does it fully respect the Robots Exclusion Protocol, it also has an adaptive engine that continually monitors the web server response time to avoid overloading it. Nevertheless, site administrators could have set up restrictive policies that recognize it as a potential resource waste and block it after a while.
To work around the issue, set a proper "courtesy delay" between each HTTP request (note that no parallel HTTP calls will be made; only the processing phase will be concurrent).

My site has XXX pages, but the program only finds YYY

Crawl options can significantly affect what the spider can discover.
The most likely cause is the maximum crawl depth set: highly paginated content would not be discovered. Try setting the crawl depth option to the maximum allowed (you might wonder why a maximum crawl depth exists at all; without it, the program would have no defence against involuntary "spider traps", like an infinite calendar).

There may be other causes that impede the discovery and/or crawling of site URLs. For example, if some pages exceed the maximum download size, their content will be truncated; any link defined in the truncated part will not be seen, and consequently not crawled.

Other pages may only be linked from within pages blocked by robots.txt; or from private pages. The spider needs to find links in order to follow them, and can only see them in pages it is allowed to visit.

My site has only XXX pages, but the program says it has many more

Most likely the web site has some internal duplication issues. A typical case where the number of URLs can be four times the expected number is when the site responds to both the http:// and https:// URL versions, and to both the www. and non-www. URL versions. This can be caused by an HTTP/HTTPS migration where no 301 redirects were put in place, internal links were not all fixed, and no canonical URL was specified.

So, even if your perception is that you have only XXX pages, since search engines - and thus also Visual SEO Studio - consider different URLs as different pages, the actual number of pages as seen by a search engine is much bigger. Google may of course recognize the internal duplication, but do not assume it will pick the version you prefer. Search Google for your site pages using the site: operator to see which URLs it is picking (most other search engines also recognize the site: operator).
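One common fix for the http/https and www/non-www duplication, assuming an Apache server with mod_rewrite enabled, is a single 301 redirect that collapses the four URL variants into one. This is only a sketch with a hypothetical host name; adapt it to your own domain and server:

```
RewriteEngine On
# Redirect any non-HTTPS or non-www request to the canonical https://www. host
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301,L]
```

With this in place, crawlers and search engines consolidate signals on a single URL version instead of four.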

Other reasons could be that the same pages are reachable through different URLs. A typical case is the "faceted navigation" you can find in many e-commerce sites, where a product page can appear under multiple categories or multiple search filters, and the filters are part of the URL. Such pages are seen by the search engine as internal duplicate content.
The solution here is first using a "canonical URL" meta tag to mark each product page with its preferred URL. Once a page is "canonicalized", only the version with the canonical URL will be indexed by the search engine, even if more are crawled. Visual SEO Studio, like the search engine spiders, will see and visit all page versions, but will not report them as a duplicate content issue in the HTML Suggestions report, and will mark the non-canonical page versions in light green in its Views to help you recognize them.
Faceted navigation can also be an issue in terms of "crawl budget" consumption, where the search engine spider repeatedly visits the same logical content, wasting time and resources instead of prioritizing visits to the pages you care more about. To fix that, make sure the spider finds unique crawl paths by using a clear internal navigation, a clean link structure, and by blocking undesired crawl paths with Robots Exclusion Protocol rules (e.g. robots.txt Disallow directives, nofollow attributes, the nofollow robots meta tag...).

For web sites hosted on IIS web servers (or other web servers with a case-insensitive file system), differences in character casing in the URL are ignored by the web server (against the official URL specifications), and internal links with the wrong casing lead to the discovery of a "new" duplicate page instead of a broken link.
In such cases, the cure is locating all occurrences with the proper report in HTML Suggestions in Visual SEO Studio, fixing the wrong internal links, and using a proper canonical URL meta tag.

Other internal content duplication issues can be caused by superfluous URL query parameters. This again can happen with faceted navigation, but also in several other cases.
The easiest way to deal with them is tagging each page with the proper canonical URL meta tag.
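A canonical URL is declared with a link element in the page head. For example, every URL variant of a product page (with or without query parameters) could carry the same tag pointing at the preferred version (the URL below is a hypothetical example):

```html
<!-- Inside <head>: the preferred URL for this page, query parameters stripped -->
<link rel="canonical" href="https://www.example.com/products/blue-widget" />
```

Search engines then consolidate indexing signals on the declared URL, even when several variants are crawled.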


Can I report bugs and make feature requests? How?

Of course! Bug reports and feature requests will always be welcome.

Visual SEO Studio has an automated crash reporting system to handle unexpected run-time exceptions.
We strongly encourage adding an indication of how to replicate the issue, as what is automatically gathered only helps us understand the point in the program code where the crash occurred, not its root cause. For example, if the crash occurred while analyzing a web page, we have no indication of its URL. If you think it is relevant to locating the root cause and you can, please specify it.

Users can also manually report other types of bugs, or make feature requests. The quickest way is by using the window provided within the program itself (see the Help menu, "Send the author a message..."), or this site's contact form.

Do you plan a Mac version? And a Linux version?

A Mac version is currently downloadable as a free Beta. It is stable enough and almost feature-complete.
At this stage, we prefer not to make promises about the final release date.
A Linux version would be a byproduct of the Mac port; at the moment we do not plan a commercial Linux release, as it will depend on market demand.

What is the company behind Visual SEO Studio?

Following the wide adoption of Visual SEO Studio by its user base, in June 2015 we founded aStonish Studio srl, a software house dedicated to developing, distributing, marketing, and supporting the SEO tool.