Manual: Performance Suggestions

The feature "Performance Suggestions" of Visual SEO Studio, documented in detail.

Performance Suggestions

Performance Suggestions is a set of reports based on Steve Souders' research on web performance and later derived works.
Each report measures a specific dimension for all the web pages of a crawl session, and compares the values against limits based on averages measured across the whole Web.

You can learn more about Performance Suggestions in this page: FAQs: Performance Suggestions.
Note: the Images Inspector tool also provides valuable information about image sizes.

Summary

The Summary tab sheet gives you an overall perspective of the reports available in Performance Suggestions.
You can quickly select it anytime, even when it is not visible, by clicking on the Show Summary link.

Reports table columns

Description

The descriptive name of the report. The text is an active link that once clicked will select the tab sheet containing the related report.

Pages

The number of pages affected by the issue detected by the report, i.e. the number of pages listed in the report.

Tot Pages

The total number of pages taken into account when computing the reports. This number is the same for all the listed reports.

Percentage

The percentage of pages detected by the specific report, computed as the ratio between the previous two values.

Keep in mind that some reports make a binary decision about whether to catalog a page as affected by an issue or not, using a fixed (configurable) threshold and checking whether the specific measured dimension exceeds it.
In such cases the percentage says nothing about how the dimension is distributed. Take for example the report "Page size (KB)": it will tell you - say - all pages having an HTML size greater than 50 KB, and from their number you have a percentage. But how many of them are 500 KB? How many are excessively heavy?
To give you a much better idea of the distribution, every report computed using a threshold comes with a dedicated Histogram view. You can visualize it in the bottom pane.

Pie chart icon

A mini pie chart visually reporting two important pieces of information:

  • The percentage shown in the previous column, represented by the colored pie slice.
  • The alert level of the issue investigated by the report, indicated by the color of the slice:
    • Red: the issue investigated has to be considered an Error.
    • Yellow: the issue investigated has to be considered a Warning.
      Please note that a warning is not a "light error", but something the program cannot, at this stage, determine to be either an actual error or an intended behavior.
    • Azure: the report is just Informational.

As previously said for the Percentage column, reports computed using a threshold are better evaluated by also looking at the Histogram bottom pane to understand how the measured dimension is distributed.

Page size (KB)

Entry type: Warning

By page size we mean here the weight in KB of the HTML file only.
Page size is not normally a big issue in terms of performance, as the HTML part is usually only a fraction of the overall page weight (think about the added weight of image files, for example). Nevertheless, everything adds up, and in big, highly trafficked web sites reducing page size could lead to important savings in bandwidth and improvements in server responsiveness.
Not only that: more often than not, a high page size indicates errors in the server setup where tons of CSS and JavaScript code are added to the HTML head section, in such cases really impairing web page performance.

This report lists all pages exceeding a given page size threshold, sorted in descending order.
Column "KB" shows for each page the actual value of page size.

The threshold used at the time this guide was written is "Greater than 50 KB". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

Before striving to get the lightest HTML pages possible, let's recap a basic concept: web performance optimization is made of trade-offs, and HTML page weight is often the middle ground.

  • Do you want to optimize for first-time view?
    First-time view is the first time a browser visits your site. There is nothing in the browser cache, so it will have to download all resources.
    Like in life, "first impressions matter", and you want new visitors to visualize your page as quickly as possible.
    You avoid using shared CSS and JavaScript files and add all that load within the page, then you embed images, and in the end you have greatly reduced the number of HTTP requests (which are costly).
    Now you have a big HTML page, but once it is downloaded the browser knows everything it needs to display it.
  • Do you want to optimize for repeated view?
    If your website users on average visit several pages, you may want them to experience a faster navigation by leveraging the browser cache more.
    You use shared CSS and JavaScript files as much as possible and avoid embedded images. Now you have a light HTML page relying on separate shared resource files the browser has to download in order to know how to display it. The first time it will have to download all shared resources with costly separate HTTP requests, but after that it will find them all in the browser cache, and for the next pages it will only have to download the light HTML part.
  • Do you want to optimize for both first-time and repeated views?
    Then you use trade-offs. You use shared CSS and scripts, but also add to the page the CSS rules that permit the browser to display as soon as possible the content "above the fold" (i.e. in the visible part of the browser window) and whatever it takes to let the user interact with the page sooner. The trick here is increasing the "perceived speed". A minimal sketch of this approach follows the list.
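
For illustration only (the file names, selectors and rules here are placeholders, not something the program generates), a page applying this trade-off could look like:

<head>
  <style>
    /* critical CSS: only the rules needed to render the content "above the fold" */
    body { margin: 0; font-family: sans-serif; }
    .site-header { height: 60px; }
  </style>
  <!-- the full shared stylesheet, cached and reused across pages -->
  <link rel="stylesheet" href="/css/site.css" />
</head>
<body>
  ...page content...
  <!-- shared scripts moved to the end and deferred, so they do not block the first rendering -->
  <script src="/js/site.js" defer></script>
</body>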

How can you fix the issue on the reported pages:

The first thing to do is to assess how big the problem is, and whether you can live with it if pages are not that heavy after all.
As a rule of thumb: an HTML size of 25 KB or less is good, 50 KB is highly acceptable, and up to 80-100 KB is still within global statistics variance.
Without forgetting that some first-time view optimization techniques can make the HTML page slightly heavier, try to understand what makes them heavy:

  • In WP web sites, a common cause is an excess of on-page CSS and script blocks. Check the page source code to understand if that is the case (see the example after this list). If so, the most likely culprit is a plugin. Make sure the plugin(s) is/are actually used and useful, and well configured.
  • Other times the heavy pages are not that heavy, and the problem is just caused by the theme (the page template). If that is the case, the Histogram view in the bottom panel should show the increased page size is distributed among all pages. Of course you may want to optimize that too, but keep in mind priorities first: no point squeezing a few bytes in the HTML, and then having heavy loads in - say - images.
  • Another common problem we see in WP sites: the owner installed a "performance optimization" plugin which, in the attempt to save HTTP requests, stuffed everything within the page source, but went too far off the mark.
    Such tools have to be tuned carefully, always measuring the outcome before using them in production! (Other times we have seen them attempting to decrease HTTP requests by merging all shared CSS and scripts, but stupidly invoking them with new query parameters each time, so that the browser cache would never help.)
  • For old ASP.NET WebForms sites, a common culprit is a useless ViewState. Refer to the dedicated report for more insights about it.
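
As an illustration of the kind of bloat to look for when checking the source code (the id values are made up for this example), a bloated head section typically contains several large inline blocks like these:

<head>
  ...
  <style id="some-plugin-inline-css">
    /* thousands of lines of CSS injected on every page by a plugin or page builder */
  </style>
  <script id="some-plugin-inline-js">
    /* large configuration objects and script code inlined on every page */
  </script>
  ...
</head>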

In all cases, check the HTML source code!
You are supposed to be able to read it, knowing at least the basics of HTML, CSS and JavaScript.

Download Time (ms)

Entry type: Warning

High values for all pages may indicate performance problems of the web server hosting the website. High values for single pages likely indicate content that is too heavy.

Consider a page download time along with the value of the page size: a high download time with a high page size indicates a page too heavy, a high download time with a low page size indicates performance problems on the server side.

This report lists all pages exceeding a given download time threshold, sorted in descending order.
Column "Milliseconds" shows for each page the actual value of download time.

The threshold used at the time this guide was written is "Greater than 1000 ms". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

First, ensure the high download time is not caused by a slow Internet connection used while crawling the website with Visual SEO Studio.
Check the Histogram view in the bottom panel to see how it is distributed: whether it is spread among all pages, or there are single outliers.

If the problem is in the hosting web server, consider upgrading to a higher plan, or changing hosting altogether.
You should also have the web server nearer to your users: for example, if the majority of your visitors were based in Europe, it would make little sense to have the web server in the US.

If the problem lies in the page size, refer to the solutions offered for the Page size report.

TTFB (ms)

Entry type: Warning

"Time To First Byte" is the time elapsed between the HTTP call to the web server and the first received by the spider.
High values for all pages may indicate performance problems of the web server hosting the website.

This report lists all pages exceeding a given TTFB threshold, sorted in descending order.
Column "Milliseconds" shows for each page the actual value of TTFB.

The threshold used at the time this guide was written is "Greater than 200 ms". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

First, ensure the high TTFB is not caused by a slow Internet connection used while crawling the website with Visual SEO Studio.

High TTFB values distributed among all pages (check the Histogram view in the bottom panel to see how they are distributed) may indicate performance problems of the web server hosting the website.

If the problem is in the hosting web server, consider upgrading to a higher plan, or changing hosting altogether.
You should also have the web server nearer to your users: for example, if the majority of your visitors were based in Europe, it would make little sense to have the web server in the US.

If you have full control over your web server back-end and an internal development team, consult with your team to analyze and fix what delays building the web pages. In our experience with internal dev teams and custom CMSs, these problems were often caused by excessive pressure on the database server, without any kind of application caching system in use.

Other times a high TTFB in production servers is simply due to high website traffic using too many resources, so that new HTTP requests are kept "on hold". Again, consider upgrading your back-end server.

Meta charset (non UTF-8)

Entry type: Warning

In order to read a text, like the web page HTML source code, a computer needs to know the text encoding used to write it.
Using the wrong encoding would result in not being able to correctly read characters not based on the Western alphabet (non-ASCII characters); you can notice it when you see web pages rendered in browsers with weird characters, or little square symbols.
In order to know which text encoding to use, browsers and spiders can rely on three different pieces of information, or fall back to a guess:

  • Reading from the Content-Type HTTP header the charset optional part, e.g.:
    Content-Type: text/html; charset=utf-8
  • Reading it from the BOM (Byte Order Mark)
    The BOM is an optional sequence of bytes at the beginning of Unicode text streams signalling which Unicode character encoding is used. Of course, it only works for Unicode texts, and only when specified.
    Notice that not all web clients are able to correctly read BOM values (even though it does not break any Unicode specification). Visual SEO Studio can read it correctly.
  • Reading the charset from the HTML meta tags, if specified.
    HTML historically used the http-equiv meta tag to specify an equivalent of the HTTP Content-Type header, e.g.:
    <META http-equiv="Content-Type" content="text/html; charset=EUC-JP">
    HTML5 introduced a new charset attribute:
    <meta charset="utf-8" />
  • Making an educated guess when no information is available, and hoping for the best.
    This means using the most used encoding, UTF-8.

Among all possible sources, the third one, reading the charset from the HTML meta tags, is a little like a "chicken and egg" problem:
in order to read the encoding from the meta tag you need to read the text, and in order to read the text you need to know the encoding!
Luckily, HTML meta tags only use ASCII characters, so the problem is solved this way:

  • Assume the encoding is UTF-8 (i.e. using the "best guess" option);
  • Read the used charset from the meta tags, if specified;
  • If the charset read from the meta tags is UTF-8 (or if it is not specified), keep the decoded text;
  • If the charset read from the meta tags is not UTF-8, discard the decoded text and read it again with the correct charset.

You can now understand that when the used charset is specified using a meta tag and it is not UTF-8, the browser is paying a performance penalty because it will have to discard the text decoded so far and restart reading the HTML source code.

Our recommendation is to always use UTF-8, and to specify it in the Content-Type HTTP header.

This report lists all pages using a non UTF-8 meta charset (when no charset is specified in the HTTP header or BOM).

How can you fix the issue on the reported pages:

Use the Content-Type HTTP header to specify a charset (please refer to your CMS / web server documentation to know how to do it) and then remove the meta tag.
If you can, set your backend system to use UTF-8 encoding.
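
As a schematic recap of the recommended setup (the header value is the same one shown earlier on this page), the encoding is declared by the server, and the HTML head no longer needs a charset meta tag:

HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8

<!DOCTYPE html>
<html>
<head>
  <!-- no http-equiv / charset meta tag needed: the encoding comes from the HTTP header -->
  <title>Example page</title>
</head>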

ViewState size (bytes)

Entry type: Warning

You will only care about looking at this report when dealing with web pages built with the old Microsoft "ASP.NET WebForms" technology.
Back at the time, MS engineers had the terrible idea of hiding the stateless nature of the Web from an army of VB programmers who had no clue what the Web really was, to enable them to build web applications similarly to the way they were used to.
To achieve this they placed within the pages generated with their IDE a hidden field which looks like this:

<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="/wEPDwULL ...a potentially gigantic text value... dtBPL7w=" />

One of the many problems was that this hidden field was, more often than not, populated with a huge encoded text even for pages that did not need it at all, like most of the public pages of a web site.
The result was, in many cases, pages several MB heavy with useless data.

While several ViewState viewers to examine single pages soon appeared on the Internet, what was really missing before Visual SEO Studio implemented it was a tool able to analyze the ViewState of all the public pages of a website.
So far we believe it to still be the only tool able to analyze ViewState usage site-wide.

You can use the ViewState viewer in the bottom panel to get better insights into the ViewState of the selected web page.

This report lists all pages exceeding a given ViewState size threshold, sorted in descending order.
Column "Bytes" shows for each page the actual value of ViewState size.

The threshold used at the time this guide was written is "Greater than 1 KB". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

What to do once the tool has helped you point out the parts of the web page needlessly using the ViewState?
In ASP.NET WebForms each single server-side control supports the boolean attribute EnableViewState, which unfortunately defaults to true.
In our experience, in public pages the most common culprits for bloating the ViewState are server-side generated tables.
When you understand that a single control does not really need the ViewState for the webpage to work, set the attribute - or ask your developer to do it - to false:

<asp:Table runat="server" EnableViewState="false" ...

In most cases you will be able to set EnableViewState="false" at page level:

<%@ Page EnableViewState = "false" ...

Just ensure with your developer you are not going to affect the correct functioning of the site when suppressing ViewState at page level.
Please refer to MS documentation for all about the ViewState and the EnableViewState attribute.

You can learn more about the ViewState viewer reading the blog post we wrote when we first published the feature.

Page images

Entry type: Warning

Images are usually the "heavy luggage" in a web page's total weight. The Images Inspector tool provides valuable information about image sizes; here we will focus on their number.

Every image - added to the web page with a code snippet like <img src="...the image file URL..." /> - represents an additional HTTP request (besides the bytes to download), and the fewer HTTP requests your page requires to render, the quicker it will load.

There is no strict rule on how many images a web page should have at most. It really depends on the website type: the site of a photographer would likely have more images than - say - the site of a copywriter. As a general rule, you should customize the used threshold, gearing it to your website niche.

This report lists all pages exceeding a given number of page images threshold, sorted in descending order.
Column "Count" shows for each page the actual number of page images.

The threshold used at the time this guide was written is "Greater than 5". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Ensure the images make sense as part of the actual content. If you used the "Save images" crawl option, you'd be able to see previews of the images to check what they are.

If you see you are using decorative images like dots, arrows, and so on, replace them with Unicode characters styled with CSS. Other decorative images can be grouped with the "sprites" technique to reduce the number of files to download.
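
For example (the class name and character are chosen just for illustration), a small arrow image used as decoration can be replaced by a Unicode character styled with CSS, removing one HTTP request:

<!-- before: one extra HTTP request for a purely decorative image -->
<img src="/img/arrow-right.png" alt="" /> Read more

<!-- after: a Unicode character styled with CSS, no extra request -->
<style>
  .read-more::before { content: "→"; margin-right: 4px; }
</style>
<span class="read-more">Read more</span>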

If you need all those images, optimize their size and weight (use the Images Inspector to analyze their weight and pixel dimensions), and when they are not all "above the fold" (i.e. in the visible part of the browser window) use lazy loading:

<img src="image.png" loading="lazy" alt="…" width="200" height="200" />

We also suggest adopting a policy on the maximum number of content images per page, tailored to your site niche and style, and adhering to it.

Nested tables

Entry type: Error

In a not so distant past, when there was no valid alternative because of the infancy of CSS, nested HTML tables were often used to implement page layouts.
Other than not being used for their semantic purpose (HTML tables should be used to represent data tables), nested tables make the web page render slower because the browser has to recursively re-compute page element positions and sizes when building the web page graphically. The more the tables are nested, the worse the rendering performance is.

This report lists all pages exceeding a given number of nested tables threshold, sorted in descending order.
Column "Count" shows for each page the actual number of nested tables.

The threshold used at the time this guide was written is "Greater than 0". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

If your page template uses nested tables to achieve a desired layout, it is high time you rebuild it from scratch with a modern, responsive, mobile-first approach, using CSS grids.

If the nested tables are within the single pages, check why they are used. Most likely, they are used for layout reasons and should be replaced with CSS styling.
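
A minimal sketch of a two-column layout built with CSS grid (class names are placeholders), replacing what once would have required nested tables:

<style>
  .layout { display: grid; grid-template-columns: 1fr 3fr; gap: 16px; }
</style>
<div class="layout">
  <nav>...sidebar...</nav>
  <main>...main content...</main>
</div>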

DOM depth

Entry type: Warning

When a browser reads a web page HTML source code, it builds out of it a DOM (Document Object Model), a tree structure it keeps in memory to represent the whole HTML document.

The more complex the DOM is, i.e. the bigger the number of tree node elements and the deeper the tree structure is, the slower the web page will be to display and react to users' and scripts' actions, and the more computer memory it will consume.
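
A frequent cause of needless depth is layers of wrapper elements that add nothing to layout or styling, as in this made-up example:

<!-- every wrapper adds one more level to the DOM tree -->
<div class="outer">
  <div class="inner">
    <div class="wrapper">
      <p>Actual content</p>
    </div>
  </div>
</div>

<!-- often the same result can be obtained with a much flatter structure -->
<div class="content">
  <p>Actual content</p>
</div>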

This report lists all pages exceeding a given DOM depth threshold, sorted in descending order.
Column "Count" shows for each page the actual value of DOM depth.

The threshold used at the time this guide was written is "Greater than 20". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Check if the complex DOM structure is caused by the theme (the page template). If that is the case, the Histogram view in the bottom panel should show that the high DOM depth is distributed across all pages. If it is, consider switching to a lighter template.

If the high DOM depth is present only in single pages, check if you can simplify them. Also use other metrics to assess the priority of the fix: page rendering time, page traffic, etc.

DOM elements

Entry type: Warning

When a browser reads a web page HTML source code, it builds out of it a DOM (Document Object Model), a tree structure it keeps in memory to represent the whole HTML document.

The more complex the DOM is, i.e. the bigger the number of tree node elements and the deeper the tree structure is, the slower the web page will be to display and react to users' and scripts' actions, and the more computer memory it will consume.

This report lists all pages exceeding a given number of DOM elements threshold, sorted in descending order.
Column "Count" shows for each page the actual number of DOM elements.

The threshold used at the time this guide was written is "Greater than 500". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Check if the complex DOM structure is caused by the theme (the page template). If that is the case, the Histogram view in the bottom panel should show that the high number of DOM elements is distributed across all pages. If it is, consider switching to a lighter template.

If the high number of DOM elements is present only in single pages, check if you can simplify them. Also use other metrics to assess the priority of the fix: page rendering time, page traffic, etc.

Scripts

Entry type: Warning

Scripts are common culprits in cases of web performance problems.

An example of a script tag in HTML syntax, in the case of a script defined within the web page, is:

<script>
...(the script body)...
</script>

In the case of a script defined in a shared file:

<script src="...(a relative or absolute URL)..."></script>

You should keep down the number of shared scripts to reduce the number of HTTP requests, and avoid as much as possible blocking scripts, i.e. scripts that force the browser to block page rendering. This is normally achieved by using the defer or async attributes, or by moving their definition right before the body tag closure, after all visual HTML elements are defined.
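
As a sketch (the file names are placeholders), several shared scripts can be merged into a single bundle served from a stable URL, turning many HTTP requests into one:

<!-- before: three separate HTTP requests -->
<script src="/js/menu.js"></script>
<script src="/js/slider.js"></script>
<script src="/js/forms.js"></script>

<!-- after: one request for a merged, cacheable bundle, deferred so it does not block rendering -->
<script src="/js/bundle.js" defer></script>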

This report lists all pages exceeding a given number of scripts threshold, sorted in descending order.
Column "Count" shows for each page the actual number of scripts.

The threshold used at the time this guide was written is "Greater than 4". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Use the "Page CSS/JS files" bottom panel to see what the scripts are. You might find that some are not really used, and should be removed.

If they are shared among all pages, you could merge them together to reduce the number of costly HTTP requests.
The CMS you use might offer ways to automate it; for example, for WP you can find several plugins able to merge your script files. Use them with caution, because if not properly tuned they could even hurt performance: configure them with care and test before publishing on the production web site. Then measure again!

Blocking scripts

Entry type: Warning

Blocking scripts - evidenced in the Page CSS/JS files bottom panel with the snail symbol - are scripts that force the browser to block page rendering.

When the browser encounters a <script> block, it has no way to know in advance whether the script contains only variable and function definitions (which would not pose a problem) or direct commands to be executed that would alter the HTML document - things like the old document.write(...) - so it has to stop HTML parsing, and download and execute the script before resuming parsing.
This blocks the browser from displaying a partial result, and also forces it to give precedence to the script when downloading resources in parallel, with the effect of increasing page load time.

To prevent the problem of blocking scripts, the historical, and still valid, solution was to put all scripts in the HTML after the definition of all visual elements, right before the </body> tag closure (moving them after the body end tag could work in some browsers, but they are supposed to stop parsing content after it).
Then HTML5 introduced the optional async and defer attributes to let developers declare that the script can be executed asynchronously, or deferred until the page has been parsed.

A script blocks page rendering when it does not use the async or defer attributes (which only make sense when the src attribute is used), and when it is not defined after all visual HTML elements.

This report lists all pages exceeding a given number of blocking scripts threshold, sorted in descending order.
Column "Count" shows for each page the actual number of blocking scripts.

The threshold used at the time this guide was written is "Greater than 1". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

Note: by default the program allows one blocking script, because a common Google Analytics setup code is actually a "blocking script" - albeit a very short and almost instantaneous one to execute - which creates an external script DOM element with the async attribute.

How can you fix the issue on the reported pages:

After having removed the unused scripts, if the ones left are "blocking", try making them "non-blocking".

First thing: move all the scripts right before the </body> end tag. This will also work for older browsers that do not support defer and async, and has more or less the same effect as the defer attribute: the scripts are executed after the web page has been built.
Declaration order matters: if scriptB depends on scriptA, then scriptA must come first.

When you have no control over the script position, but you can add the defer attribute, do it. Execution order will be preserved, and execution will start after the web page has been parsed.

If you know the script has no special dependencies, you can add the async attribute to it, to let it load asynchronously in parallel with other resources.
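
To recap with a short example (file names are illustrative): the first script has no dependencies and can be fully asynchronous, while the other two depend on each other and use defer so that their relative order is preserved:

<head>
  <!-- independent script: downloaded in parallel, executed as soon as it is available -->
  <script src="/js/analytics.js" async></script>
  <!-- dependent scripts: executed in declaration order, after the document has been parsed -->
  <script src="/js/scriptA.js" defer></script>
  <script src="/js/scriptB.js" defer></script>
</head>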

Duplicate script URLs

Entry type: Error

Invoking the same shared script more than once could seem a trivial error, as one might think that, since the browser has already downloaded the script, it is already in the browser cache and thus there would not be a performance hit... wrong!
It can pose serious web performance problems: scripts do not only define JavaScript functions (defining a function twice would not be an issue), they can also contain direct commands to be executed on the document DOM. Take for example an instruction to be executed after the document load event: it would be run as many times as the shared script appears in the HTML source code.
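
A minimal illustration (the file name is invented): if the shared script registers a handler on the document load event, including it twice makes that handler run twice:

<!-- shared.js contains, among other things: -->
<!-- document.addEventListener("DOMContentLoaded", function () { initPage(); }); -->
<script src="/js/shared.js"></script>
...
<script src="/js/shared.js"></script> <!-- duplicate inclusion: initPage() will run twice -->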

This report lists all pages exceeding a given number of duplicate script URLs threshold, sorted in descending order.
Column "Count" shows for each page the actual number of duplicate script URLs.

The threshold used at the time this guide was written is "Greater than 0". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Remove the duplicate script declarations, and test the correctness of the behavior before pushing the change to production.

CSS styles

Entry type: Warning

While not as much as scripts, styles too can impact web performance.

An example of a style tag in HTML syntax, in the case of a style defined within the web page, is:

<style>
...(the style body)...
</style>

In the case of a style defined in a shared file:

<link href="...(a relative or absolute URL)..." rel="stylesheet" />

You should keep the number of shared style files to a bare minimum in order to reduce the number of HTTP requests.

Styles can impact web performance especially when they are blocking styles, i.e. styles defined within the body section instead of the HTML head. Blocking styles force the browser to block page rendering in order to re-evaluate the rendering rules, potentially at the cost of restarting the painting of the web page.

This report lists all pages exceeding a given number of style blocks threshold, sorted in descending order.
Column "Count" shows for each page the actual number of style blocks.

The threshold used at the time this guide was written is "Greater than 4". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Remove all style declarations that are not actually needed, and merge shared CSS styles to reduce the number of HTTP requests.
Depending on the CMS you use, you might find tools to automate it. For example, for WP you can find several plugins able to merge shared CSS styles; do not trust their default behavior: test them before pushing the changes into production, and ensure that the merged result can be cached by the browser: it must have a fixed URL.

Blocking styles

Entry type: Warning

Blocking styles - evidenced by the snail symbol - are styles defined within the body section instead of the HTML head. They force the browser to block page rendering in order to re-evaluate the rendering rules, potentially at the cost of restarting the painting of the web page.

This report lists all pages exceeding a given number of blocking style blocks threshold, sorted in descending order.
Column "Count" shows for each page the actual number of blocking style blocks.

The threshold used at the time this guide was written is "Greater than 0". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

Move all style blocks within the HTML head section.
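
In short (the selector is shown just as an example), a style block found inside the body should be moved into the head:

<head>
  ...
  <style>
    .product-box { border: 1px solid #ccc; }
  </style>
</head>
<body>
  <!-- no <style> blocks here -->
  ...
</body>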

Duplicate style URLs

Entry type: Error

Unlike duplicate script URLs, duplicate style URLs do not pose many problems in modern browsers, because they are downloaded only once (if they share the very exact same URL), and their rules are evaluated only once.
Older browsers could behave less efficiently, though.

This report lists all pages exceeding a given number of duplicate style URLs threshold, sorted in descending order.
Column "Count" shows for each page the actual number of duplicate style URLs.

The threshold used at the time this guide was written is "Greater than 0". Like many others, it can be customized in the program Options, reachable via the program main menu Tools -> Preferences... entry.
Over time, with changes in the SEO world, we update the default thresholds to keep you up-to-date with the ever evolving market.

How can you fix the issue on the reported pages:

We recommend, if possible, locating all duplicate shared styles and invoking them only once in the head section of the HTML source code, to avoid potential performance problems with older browsers.