All Mass.gov authors & editors have access to embedded web analytics dashboards that provide data about content. These dashboards provide basic reporting capabilities and help you identify when things are going wrong. They’re split into 3 reports (or tabs):
Page health: Metrics that indicate if something is going wrong.**
Audience: Metrics that describe the size and basic makeup of people who view your content.
Visitor interactions: Metrics that describe what visitors do/click on your content.
There is also a date selector at the top of the dashboard that affects all charts across all reports.
When using these dashboards, keep in mind that web analytics measures only behavior, not a visitor’s holistic experience. These dashboards can reveal patterns in what people tend to do, but not why they do it.
**The Promotional Page dashboard also includes configurable key performance indicators that measure if your content is successful.
The page health report consists of metrics that describe user behavior indicative of a frustrating experience. This tab provides insight about what’s going wrong with your content and areas that require your attention.
It includes the following metrics, along with what counts as a “good” score for each:
Nos per 1000 sessions: The ratio of “no” feedback submissions for every 1000 visits. A “good” score is 2.0 to 5.0 (varies by content type).
Eject rate: The percent of visits in which visitors used search, the main navigation, or clicked the Mass.gov logo instead of engaging with content on the page. A “good” score is 3% to 10% (varies by content type).
Traffic to children (only applies to service and organization pages): The percent of visits in which visitors click a link on your content instead of ejecting or leaving without clicking anything. A “good” score is 40% to 60% (varies by content type).
Readability: How easy your content is to read for the average person (measured by grade level). Aim for 6th to 8th grade.
Broken links: The number of links that don’t work.
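The ratio-style metrics above are simple arithmetic on raw counts. As a rough illustration only (the dashboard’s exact formulas aren’t documented here, and all numbers below are made up), here is how such metrics could be computed and checked against the “good” ranges. The readability function uses the Flesch–Kincaid grade-level formula, which is one common grade-level measure; the dashboard may use a different one.

```python
def nos_per_1000_sessions(no_submissions, sessions):
    """Ratio of "no" feedback submissions per 1,000 visits."""
    return 1000 * no_submissions / sessions

def eject_rate(eject_sessions, sessions):
    """Percent of visits that used search, the main nav, or the logo
    instead of engaging with the page's content."""
    return 100 * eject_sessions / sessions

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid grade level from raw text counts (one common
    readability formula; not necessarily the dashboard's)."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Made-up monthly counts for a hypothetical page:
sessions = 12_000
print(nos_per_1000_sessions(48, sessions))   # 4.0 -- inside the 2.0-5.0 range
print(round(eject_rate(840, sessions), 1))   # 7.0 -- inside the 3%-10% range
```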
Please note that, except on Promotional Pages, dashboards do not attempt to measure whether pages are “performing well.” This is because what indicates success varies greatly from page to page. Instead, we look for trends in user behavior that suggest people are having a difficult time finding or getting what they’re looking for.
Why do page health scores vary by content type?
Content types are designed differently, and these design differences tend to shape user behavior. For example, eject rates across topic pages are generally much higher than across information detail pages. One plausible explanation for this is that users who are browsing topic pages are beginning with less knowledge of what they’re looking for. If they knew, they’d probably have used a search engine to find exactly the page they need.
Page health indicators bake this variability into their assessment of your content. To understand how your content is doing relative to others of its content type, compare your scores against the “good” ranges listed above, which vary by content type.
The audience report describes how visitors find your content, as well as the volume of visitors. Note that this report is entirely descriptive & includes no judgment about what each metric means for your content’s performance.
Pageviews: The number of times the page was loaded.
Sessions: The number of sessions a page appeared in. Corresponds with “unique pageviews” in Google Analytics.
Sources of traffic: How visitors who eventually land on your page come to Mass.gov (e.g., through a search engine, social media, etc.).
Previous pages: The Mass.gov pages visitors visit before this one. Corresponds with “previous page path” in Google Analytics.
Sessions by device type: Segments your visitors by the device type they use. Corresponds with “device category” in Google Analytics.
The visitor interaction report focuses on how visitors engage with your content. Note that this report is entirely descriptive & includes no judgment about what each metric means for your content’s performance.
Internal link clicks: Which links to other Mass.gov pages people click on your content. Corresponds with setting the current page to “previous page path” in Google Analytics.
Outbound and download clicks: Which links to downloads or other websites people click on your content. Corresponds with outbound link and download event tracking in Google Analytics.
Yes feedback submissions: Number of “yes” feedback submissions. Data is from Formstack.
No feedback submissions: Number of “no” feedback submissions. Data is from Formstack.
Using the dashboard: Basics
The date filter
Activate the date filter by clicking on the black box under the heading “Time range.”
The “default” menu offers a number of preset ranges to choose from. All of these are relative to today, e.g. “Last week” means from 7 days ago to today. You can also set specific start and end dates using the “custom” menu.
All charts on the dashboard except for page health charts show data by day. Page health charts require at least a month’s worth of data for an accurate calculation.
Exporting data from a chart
You may export data from any chart on the dashboard. To do this, click the 3 vertical dots in the upper right and click “Export CSV.”
Currently, you can’t export the entire dashboard to a CSV. However, we are able to provide this kind of report if you submit a ServiceNow request.
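Once exported, a chart’s CSV can be opened in a spreadsheet or analyzed with a short script. A minimal sketch follows; the column names (“date”, “pageviews”) and values are assumptions for illustration, so check the header row of your actual export before adapting it.

```python
import csv
import io

# Stand-in for an exported chart CSV; a real export's columns may differ.
export = io.StringIO(
    "date,pageviews\n"
    "2024-01-01,310\n"
    "2024-01-02,275\n"
    "2024-01-03,342\n"
)

# DictReader maps each row to its column names from the header line.
rows = list(csv.DictReader(export))
total = sum(int(r["pageviews"]) for r in rows)
print(total)  # 927
```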
Using this data to make content changes
To identify actionable improvements you could make based on these dashboards, we recommend beginning with the following questions:
Who is my audience?
Why are they coming to my content?
The answers to these questions may provide valuable context for the data you see on the web analytics dashboards, especially if you are able to combine them with information about your agency’s business processes. Here are some example scenarios:
Scenario 1: Many more pageviews than critical downloads
The main reason people come to your content is to download a form, which they must then fill out and mail to your agency. The eject rate is quite high – more than 10%. In addition, the number of form downloads is only half the number of pageviews.
A plausible interpretation of this might be that people are able to get to the page, but are having trouble finding the form. You could link it in additional places, change the link text to something more explicit, or change the page’s content to clarify that to obtain the service, visitors should fill out the form and mail it in.
Another possibility is that visitors are missing a step in your agency’s business process and aren’t actually ready to download and fill out the form yet. To address this, you could add content to other, related pages that clarifies what they need to do before they can download the form.
How would you know if your changes worked? The number of downloads per pageview goes up.
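Under this scenario’s assumptions, that check is a simple ratio of downloads to pageviews. A sketch with made-up numbers (half the pageviews result in a download before the change, as in the scenario):

```python
def downloads_per_pageview(downloads, pageviews):
    """Fraction of pageviews that result in a form download."""
    return downloads / pageviews

# Hypothetical counts before and after the content changes:
before = downloads_per_pageview(5_000, 10_000)
after = downloads_per_pageview(7_000, 10_500)
print(before)           # 0.5 -- only half of pageviews lead to a download
print(round(after, 2))  # 0.67 -- the changes appear to have helped
```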
Scenario 2: Too many negative feedback submissions
The nos per 1000 sessions metric is higher than you’d like to see. You also notice that an unusual number of people are visiting your content on mobile devices – almost 60%. You hypothesize that your content is too long when viewed on mobile, and this prevents visitors from finding critical information.
To respond to this, you could try editing out as many unnecessary words as possible or splitting the content into different pages for different segments of your audience. You also try to make the content on other, related pages more concise, since people who visit this page are likely to visit those, too.
How would you know if your changes worked? The number of nos per 1000 sessions goes down in the months after you make your changes.
Scenario 3: Content for technical experts is not reaching its audience
Your readability score is a little higher than Mass Digital recommends – grade 10 – but you know your content is intended for technical experts, so you’re not too concerned. You are concerned, however, that according to the “Sources of traffic” chart, only a very small percentage of traffic is coming from search.
You revise your content to use more of the language your audience uses, which you learn about from the Feedback Manager, and also from an agency program manager who frequently communicates with your customers.
How would you know if your changes worked? The share of traffic from “organic” (search engines) increases, and possibly your total traffic, too.