The purpose of page health indicators is to show whether anything is measurably wrong, such as a broken link or a high rate of negative feedback. Think of them as analytics smoke alarms that monitor for signs that something is not working as it should.
Each chart is colored to indicate its status: green means everything is OK, yellow means that some issues were detected, and red means something needs improvement.
We determine whether an indicator needs attention by comparing your page against other pages of the same content type.
This table gives a month-by-month assessment of your page’s health as being “OK,” having “some issues,” or “needs improvement.”
By clicking the “month” or “assessment” headers at the top of the columns, you can sort results by month or assessment to track changes over time.
The "page health overall" assessment is derived from the following metrics:
# of “nos” per 1000 sessions
The ratio of “no” feedback submissions for every 1,000 visits. This refers to feedback from the “Did you find what you were looking for on this webpage” survey found at the bottom of most Mass.gov pages.
Eject rate
The percent of visits in which people used search, the main navigation, or clicked the Mass.gov logo to leave the page, instead of engaging with the content on it.
Rate of traffic to children (Applies to “parent” pages such as Binders, Curated Lists, Organization, Service and Topic pages)
The percent of visits in which visitors click a link on your content instead of ejecting or leaving without clicking anything.
Readability
How easy your content is to read for the average person, measured by grade level based on the Flesch-Kincaid formula.
Broken links
The number of links that don’t work.
The number of negative feedback submissions for every 1,000 sessions that include your page.
Why “nos per 1000” and not just “nos”?
When measuring negative feedback, it’s important to account for the size of your audience. For example, imagine 50 respondents answered “no” to the “Did you find what you were looking for on this webpage?” survey at the bottom of your page. If you recorded 10,000 sessions over the last full month (in Google Analytics, a session is a group of user interactions), that’s not too bad. If you recorded 100 sessions over the last full month and had 50 “nos,” that’s a much bigger concern.
If your content receives a large number of “nos,” it’s worth reviewing the verbatim comments people leave on your page. This can help you spot trends and address issues. You can review verbatim comments either by clicking the “Feedback” tab at the top right of your page when logged in to the CMS, or by using the Feedback Manager tool for a deeper dive into your organization’s content.
If a page has fewer than 500 sessions, “no” responses per 1,000 sessions won’t be counted. In that case, you’ll see “null” under # of “nos” per 1000 sessions.
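The arithmetic behind this rate can be sketched in a few lines. The function below is purely illustrative (the name and shape are ours, not part of the report); the 500-session floor matches the rule described above:

```python
def nos_per_1000(no_count, sessions, min_sessions=500):
    """Rate of "no" feedback per 1,000 sessions.

    Illustrative sketch only. Returns None (shown as "null" in the
    report) when a page has too few sessions for the rate to mean much.
    """
    if sessions < min_sessions:
        return None
    return no_count / sessions * 1000

# 50 "nos" across 10,000 sessions is 5 per 1,000: not too bad.
print(nos_per_1000(50, 10_000))  # 5.0
# The same 50 "nos" across 600 sessions is a much bigger concern.
print(nos_per_1000(50, 600))     # roughly 83.3
# Below 500 sessions, the report shows "null" instead of a rate.
print(nos_per_1000(50, 100))     # None
```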
The eject rate measures the percentage of visits in which people escape your page, presumably because it doesn’t include what they thought they’d find there. In an effort to find what they need, they use the search bar, the main navigation tabs, or even click the Mass.gov logo as a last resort.
A high eject rate could indicate that your content is incomplete or that people are being directed to it from other pages when they shouldn’t be. Consulting the “Previous URL” chart in the “Audience” report could reveal clues about the path your visitors have taken to get to your page, and what they were hoping to find there.
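In plain terms, the eject rate is a simple percentage of sessions. A minimal sketch (the function name is our own, not the report’s):

```python
def eject_rate(ejecting_sessions, total_sessions):
    """Percent of sessions that left via search, the main navigation,
    or the Mass.gov logo instead of engaging with the page's content.
    Illustrative sketch only."""
    if total_sessions == 0:
        return 0.0
    return 100 * ejecting_sessions / total_sessions

# 120 ejecting sessions out of 1,000 total is a 12% eject rate.
print(eject_rate(120, 1_000))  # 12.0
```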
This applies only to parent pages, that is, content whose primary purpose is to send users to more granular pages. An example would be a broad Service page with links to more specific Information Details and How-to “children” pages.
An effective parent page will make visitors aware of relevant child pages and provide easy access to them. A low rate of traffic to children might indicate that links need to be better organized on the parent page, that the links to child pages could be more descriptive, or that the link text for those child pages could be clearer.
Parent page types: Binders, Curated Lists, Organization pages, Service pages, and Topic pages.
Readability measures how easy it is for someone to comprehend your content. The grade level shown is based on the Flesch-Kincaid formula.
The target readability for Mass.gov content is Grade 6, but we know this is harder to achieve for some content than for others. You’re the best judge of whether your content can serve its purpose at a higher grade level, for example, technical information written for an expert audience.
Learn how to check your content’s readability using the Siteimprove quality assurance tool.
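For a feel for how a grade level is computed, here is a rough sketch of the standard Flesch-Kincaid grade formula. The syllable counter below is a naive vowel-group heuristic, so its results will differ from tools like Siteimprove:

```python
import re

def flesch_kincaid_grade(text):
    """Approximate Flesch-Kincaid grade level.

    Standard formula:
        0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    The syllable count here is a rough vowel-group heuristic, so this
    is an illustration, not a substitute for a real readability tool.
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(
        max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words
    )
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59
```

Short words and short sentences push the grade down; long, polysyllabic sentences push it up.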
This indicator shows the number of broken links on your page: if there are none, you’ll see a 0; if there are 10, you’ll see a 10. The goal is always zero.
Sometimes links are added to content incorrectly. Other times, destinations change and older URLs no longer work. Checking Siteimprove regularly can uncover all types of broken links, including links to documents.
Content types are designed differently, and this tends to shape user behavior.
For example, eject rates across Topic pages are generally much higher than across Information Details pages. One explanation for this is that users who are browsing Topic pages are beginning with less knowledge of what they’re looking for. If they knew, they’d probably have used a search engine to find exactly the page they need.
Page health indicators bake this variability into their assessment of your content.
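To make the idea concrete, here is one purely illustrative way a within-type comparison could work: ranking a page’s metric only against pages of the same content type. This is not the report’s documented algorithm, just a sketch of the concept:

```python
def percentile_within_type(pages, metric):
    """Rank each page's metric against pages of the SAME content type,
    so a Topic page is judged against other Topic pages rather than
    against Information Details pages. Illustrative sketch only; the
    page health report's actual method is not documented here."""
    by_type = {}
    for page in pages:
        by_type.setdefault(page["type"], []).append(page[metric])
    result = {}
    for page in pages:
        peers = by_type[page["type"]]
        # Percent of same-type peers at or below this page's value.
        rank = sum(1 for v in peers if v <= page[metric])
        result[page["name"]] = 100 * rank / len(peers)
    return result
```

Under this kind of scheme, a 40% eject rate might rank as healthy for a Topic page yet poorly for an Information Details page.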