Release Notes for version 0.9.2.68

The product Release Notes detail every single change made in the release. Find out what changed with Visual SEO Studio version 0.9.2.68

Visual SEO Studio 0.9.2.68

Published: Tuesday, 15 September 2015

This is a usability and feature release, with great attention to user experience.

What follows are the fully detailed Release Notes; for an easier-to-read, non-exhaustive overview with pretty screenshots, please read: Visual SEO Studio 0.9.2 'Clouseau'

New Features:

  • HTML validation
    The "Validation" window shows also all HTML parsing errors found.
    This is not a full W3C HTML validation, it's a super-fast, local - no need to call a web service - validation of only those errors which could prevent content being correctly understood by humans and search engines. HTML parsing errors are localized. Note: The "Validation window was formerly named "XML Sitemap Validation", it has now been renamed and sports a new icon (and yes, it still does XML Sitemap validation, and is the most complete and fast tool on the market for that!).
  • Custom Filters: you can now query pages on the number of HTML parse errors found in them. Powerful!
  • Non-crawled items, when blocked by robots.txt, now show in a new column "robots.txt blocking directive" the blocking Disallow directive and its line number. Clicking on the content cell opens the blocking robots.txt in Content view and highlights the cited Disallow directive.
    The same command is also available via the context menu entry "Show blocking robots.txt" when right-clicking anywhere on the row.
  • Page Links: a new column, "robots.txt blocking directive", shows for internal links the robots.txt Disallow directive blocking the URL pointed to by the link, and its line number (see the second sketch after this list). Clicking on the content cell opens the blocking robots.txt in Content view and highlights the cited Disallow directive.
    The same command is also available via the context menu entry "Show blocking robots.txt" when right-clicking anywhere on the row.
  • "Show blocking robots.txt" context menu command, available from "Page Links" and "Non-crawled items" windows, selects the robots.txt node, shows in the "Content" page the robots.txt content, and highlights the actual robots.txt Disallow directive responsible of the blockage.
  • "Show in DOM" context menu command.
    Available from the "Page Links", "H1-H6 page headers" and "Validation" windows; when fired, it highlights the corresponding HTML/XML element in the DOM window.
  • Column order and visibility changes are now persisted. Every time a grid is loaded it will remember the customization. At the moment the feature is available for the Table View grid only.
    From the Options dialog window the user can restore the grid layout to its default setting.
  • "Show in code" context menu command available also for any node (elements, comments, text...) in the DOM view.

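The "robots.txt blocking directive" columns pair a blocked URL with the Disallow line responsible for it and its line number. The product's own matching logic is not documented here; the following is only a rough sketch of the idea, using a naive prefix match on the URL path and ignoring the Allow rules, wildcards and user-agent grouping that real robots.txt handling must take into account.

    # Rough sketch only (not the product's logic): find the first Disallow rule
    # in a robots.txt that blocks a given URL, and report its 1-based line number.
    from urllib.parse import urlparse

    def blocking_disallow(robots_txt, url):
        path = urlparse(url).path or "/"
        for lineno, raw in enumerate(robots_txt.splitlines(), start=1):
            line = raw.split("#", 1)[0].strip()        # strip comments
            if not line.lower().startswith("disallow:"):
                continue
            rule = line.split(":", 1)[1].strip()
            if rule and path.startswith(rule):         # naive prefix match
                return lineno, raw.strip()
        return None

    robots = "User-agent: *\nDisallow: /private/\nDisallow: /tmp/\n"
    print(blocking_disallow(robots, "https://example.com/private/page.html"))
    # -> (2, 'Disallow: /private/')
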
User Experience / Usability:

  • "Line" and "Position" columns in "Page Links", "H1-H6 headers" and "Validation" windows are now clickable links, to make crystal clear the feature is more than that.
    They'll pop up a context menu with only the two applicable options: "Show in code" and "Show in DOM".
  • Crawling: when site exploration was prevented by an unexpected HTTP status code returned for robots.txt, the crawler reported the misleading reason "Blocked by robots.txt". It now explains the exact reason, along with the status code encountered (in the Output window).
  • Crawling: when an actual robots.txt directive prevents the spider from crawling a site, the related robots.txt line is cited in the Output window.
  • Find result window cells can now be selected (and copied) individually; previously the entire row was selected.
  • Non-crawled items window cells can now be selected (and copied) individually; previously the entire row was selected.
  • Content window: the colouring scheme was changed to get rid of an ugly bright green for script tags and content.
  • Non-crawled items window now disambiguates the resource protocol.

Performances:

  • During a crawl, using the "Page Links" window to inspect the robots.txt impact on links now has a much smaller memory footprint.
  • Page Links: inspecting the robots.txt impact on links is much faster in the case of links pointing to many (sub)domains.

Various:

  • Install process: Visual SEO Studio will be launched in the language selected during the setup process (if the user keeps the default option to launch the program at the end of installation).
  • Setup UI: Back button on the Options form has the arrow image again.

Fixes:

  • Fixed a theoretical crash condition (no user ever experienced it) when inspecting the DOM (GA Suggestion or some Custom Filters) of pages in large data sets.
  • robots.txt viewer didn't disambiguate between the HTTP and HTTPS versions.
  • DOM window: comments were shown with duplicated opening and closing markup.
  • Crawl Session window: the User-Agent used wasn't shown in the crawl options summary.
  • Previously non-localized message now translated.
  • Typos fixed (various languages).