What Google’s Chrome User Experience (CrUX) Report is, and why you should care.
We have written about how Core Web Vitals (CWV) can impact your end-users and how improving CWV can drive better business KPIs, such as higher conversion rates. But there are more reasons to prioritize CWV, like SEO improvements. Google, in particular, likes high-performing websites. Google rewards websites that demonstrate good CWV optimization with higher search rankings, resulting in more organic traffic. Good CWV has also been shown to yield higher Quality Scores in Google Ads, improving your ad quality and reducing your ad spend.
How do we know what Google considers “good” CWV? Google has published guidelines for each of the CWV metrics. Real-user monitoring (RUM) data from an analytics provider like Blue Triangle will let you see whether you meet the guidelines, but Google does not have access to that same RUM data. Instead, Chrome User Experience (CrUX) Report data is what Google sees when evaluating web performance, so let’s find out what the CrUX Report is and how you can leverage it.
What is the CrUX Report?
CrUX Report data is accessible via Google’s BigQuery project, Google’s CrUX Dashboard, and Google’s CrUX API. It is also part of Google’s PageSpeed Insights. Each tool shows the data differently because of differences in time periods and data filtering. The CrUX Dashboard, for instance, reports data once per month, while CrUX Report data is available daily through the API. Understanding what makes up the CrUX Report will help you know what you’re looking at.
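To make the API route concrete, here is a minimal sketch of building a CrUX API `queryRecord` request and reading the 75th-percentile value out of a response. Assumptions: `API_KEY` is a placeholder you would replace with your own key, and `sample_response` is a trimmed, made-up example that follows the documented response shape (the p75 values are illustrative, not real measurements).

```python
import json

# CrUX API queryRecord endpoint; API_KEY is a placeholder, not a real key.
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=API_KEY"

def build_query(origin, form_factor=None):
    """Build a CrUX API request body for a site origin."""
    body = {"origin": origin}
    if form_factor:
        body["formFactor"] = form_factor  # "PHONE", "DESKTOP", or "TABLET"
    return body

# Trimmed, made-up example following the documented response shape.
sample_response = {
    "record": {
        "key": {"origin": "https://example.com"},
        "metrics": {
            "largest_contentful_paint": {"percentiles": {"p75": 2400}},
            "cumulative_layout_shift": {"percentiles": {"p75": "0.08"}},
        },
    }
}

def p75(response, metric):
    """Pull the 75th-percentile value for one metric out of a response."""
    return response["record"]["metrics"][metric]["percentiles"]["p75"]

# The JSON payload you would POST to CRUX_ENDPOINT:
payload = json.dumps(build_query("https://example.com", form_factor="PHONE"))
lcp_p75 = p75(sample_response, "largest_contentful_paint")
```

In a real script you would POST `payload` to `CRUX_ENDPOINT` and parse the live response the same way.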
It’s real-user monitoring data.
The Chrome User Experience Report is a subset of real-user monitoring data. The RUM data used to create the CrUX Report is collected directly from the Google Chrome browser, not from a normal JS tag.
When you monitor the performance of real users on your site, some of those users are using Chrome browsers, and some of the Chrome users opt in to the data collection required to build the CrUX Report.
For a Chrome user’s data to be reported to Google, they need to:
- Have opted in to syncing browsing history
- Have usage statistic reporting enabled
- Have not set up a Sync passphrase
Even with all these conditions, millions of data points from real users are reported from Chrome to Google daily.
It’s aggregated data.
Each CrUX Report data point collected on any day represents an aggregate look at a trailing 28-day period. The aggregation for performance is reported at the 75th percentile. For example, a CrUX Report data point created on March 29 would include data from March 1 through March 28 and give you the 75th percentile of performance for all users in the data set.
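The arithmetic behind that aggregation can be sketched in a few lines. This is illustrative only: the LCP samples (in milliseconds) are made up, and the nearest-rank percentile shown here is just one common way to compute a 75th percentile, not necessarily Google's exact method.

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile: the ceil(pct% * n)-th smallest sample."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Made-up LCP samples (ms) standing in for 28 days of real-user data.
lcp_samples_ms = [1200, 1500, 1800, 2100, 2200, 2600, 2900, 3400]

# 75% of samples are at or below this value.
lcp_p75 = percentile(lcp_samples_ms, 75)  # 2600
```

Reporting the 75th percentile rather than the mean means one very slow outlier visit cannot dominate the reported number, but a site still has to serve the large majority of its visitors quickly to score well.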
When you report on percentiles, you report on distribution. When there are not enough data points to get an accurate view of a distribution, a percentile may not be available. This means that CrUX Report data might not exist for URLs or site origins that don’t have many visitors.
The availability of CrUX Report data can also vary over time. Google does not disclose the threshold that separates sufficient from insufficient data for reporting.
It has dimensions.
CrUX Report data can be filtered by different dimensions, which include:
- Device Type: defined by User-Agent (phone, tablet, or desktop)
- Country: inferred by ISP (USA, CA, UK, etc.)
- Effective Connection Type: defined by the Network Information API (3g, 4g, offline, etc.)
These filters affect the values of the reported performance metrics, including Core Web Vitals, which are:
- Largest Contentful Paint (LCP)
- Cumulative Layout Shift (CLS)
- First Input Delay (FID)
We’ve covered the CWV metrics before if you’re interested in reading more about them.
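To see why dimension filters matter, here is an illustrative sketch of the same metric rolled up under different device-type filters. The device labels and LCP samples (ms) are made up for demonstration; the point is that each filtered slice produces its own 75th percentile.

```python
import math

# Made-up real-user samples tagged by device type.
records = [
    {"device": "phone", "lcp_ms": 3200},
    {"device": "phone", "lcp_ms": 2800},
    {"device": "desktop", "lcp_ms": 1400},
    {"device": "desktop", "lcp_ms": 1700},
    {"device": "phone", "lcp_ms": 3500},
    {"device": "desktop", "lcp_ms": 1200},
]

def p75_by_device(records):
    """Group samples by device type, then take each group's 75th percentile."""
    groups = {}
    for r in records:
        groups.setdefault(r["device"], []).append(r["lcp_ms"])
    result = {}
    for device, values in groups.items():
        ordered = sorted(values)
        result[device] = ordered[math.ceil(0.75 * len(ordered)) - 1]
    return result

print(p75_by_device(records))  # {'phone': 3500, 'desktop': 1700}
```

In this toy data, filtering to phones alone reports a much slower p75 than filtering to desktops, which is exactly why comparisons should hold the filter constant.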
It measures different populations.
By nature, real-user data reflects whoever visits your site, and different people visit different sites. If two identical sites were marketed to different users, their aggregate performance data would still look different. This is due to differences between populations.
Population differences can be summarized as variations in:
- Number of people in the population
- Population sample size
- Device type and its hardware
- Software being used
- Network speed
- Geographic location
When comparing any datasets, whether between different origins or different URLs, you should keep filters as consistent as possible. For instance, compare mobile users to other mobile users, not to desktop users.
It also means that comparing datasets can be fraught with unknowns. If a site tends to attract users who opt out of data collection in Chrome, the average user may not be well represented in CrUX Report data.
It’s different from the Blue Triangle RUM data.
Blue Triangle collects 100% of your real-user traffic when our tag is active on your site. CrUX data is a subset of that real-user data, but it is collected directly from Chrome browsers rather than via a JavaScript tag. While you can see the performance of Safari users with Blue Triangle, for instance, CrUX data only covers a subset of Chrome users. As such, CrUX data does not capture the entire picture of all your real-user website visitors.
Why you should care.
In 2020, we published extensive guidance on how web performance can impact your SEO and even looked at Google’s journey with performance-based ranking, mobile-first indexing, and Core Web Vitals as the standard.
A newer development is the added benefit of better ad placement, and possibly lower overall ad spend, thanks to better Quality Scores in Google Ads. The Quality Score is determined partially by the landing page experience for your ad campaigns. Let’s look at how you can use the CrUX Report to improve both search rankings and Quality Scores.
CrUX data gives insight into how Google sees your site.
In May 2020, Google announced that Core Web Vitals would be used in search result rankings as part of a broader measure of page experience. The page experience ranking update went into full effect in June 2021.
The CrUX Report contains a glimpse of the data that Google’s search algorithm has access to. When you look at CrUX, you get a window into how the ranking algorithm views your site and the other sites in your industry.
This also gives you an advantage for Google Ads. The Quality Score, which is meant to be a diagnostic tool that shows you how you stack up against other similar ads, uses landing page experience to evaluate an ad’s effectiveness. Landing page experience is partially evaluated on reliability. Google recommends using the RUM data available in PageSpeed Insights, which we now know is the CrUX Report, to assess loading speed and page experience.
CrUX allows you to compare your site to others.
For Google, which prizes efficient web design, the transparency of its CrUX API is most likely meant to be used as a motivator. After all, they are the ones who manage web crawlers that encounter a vast number of websites built at different times with competing standards. The more consistency they can get between the most popular or relevant sites online, the easier it is for them to function as a service.
Because the CrUX API is publicly accessible, you can access CrUX Report data for origins and URLs beyond your own, including those that compete with you directly. This means you can see how you stack up against your competition. While the caveats about comparing datasets raised earlier in this post still stand, seeing how your performance trends relate to your biggest competitors’ is invaluable. And, best of all, it’s RUM data, unlocking new ways to utilize these insights.
Blue Triangle has recently built a report that lets you track multiple URLs across your site and your competitors’ sites and see the CrUX data associated with each. By storing CrUX API data over time, you can build a powerful historical view of how you compare to the rest of your industry.
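The idea of storing CrUX data over time can be sketched very simply: pull each origin's p75 from the API on a schedule, append it to a history, and compare trends. This is a hypothetical sketch, not Blue Triangle's implementation; the origins and values below are placeholders, not real measurements.

```python
import datetime

history = {}  # origin -> list of (date, lcp_p75_ms) snapshots

def record_p75(origin, date, lcp_p75_ms):
    """Append one day's CrUX p75 snapshot for an origin."""
    history.setdefault(origin, []).append((date, lcp_p75_ms))

# One day's placeholder snapshots, as if pulled from the CrUX API.
snapshot_day = datetime.date(2022, 3, 29)
record_p75("https://your-site.example", snapshot_day, 2400)
record_p75("https://competitor.example", snapshot_day, 2900)

# Which origin has the fastest LCP p75 in its latest snapshot?
fastest = min(history, key=lambda origin: history[origin][-1][1])
print(fastest)  # https://your-site.example
```

Run daily, each origin's list becomes a time series you can chart against your competitors.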
We’ll write another post about that in the future, so subscribe below to watch for more!