This article includes full instructions to set up my (Tony McCreath) Core Web Vitals report using your users' Core Web Vitals experience on your website. Setup is via Google Analytics 4 (gtag or GTM), BigQuery and Google Data Studio.
Table of contents
- They stole my thunder
- What is this Core Web Vitals thing all about?
- Adding the code to your site (for GA4 using gtag)
- Adding the code to your site (for GA4 using GTM)
- Adding GA4 definitions (optional)
- Connect to BigQuery
- Create a BigQuery materialised table
- Schedule updates of the materialised table (optional)
- Create a Data Source in Data Studio
- Make your own copy of the Core Web Vitals report
- Using the reports
- Feedback
They stole my thunder
At Google I/O 2021 the web.dev team made a Core Web Vitals presentation explaining how to Measure and debug performance with Google Analytics 4 and BigQuery. Basically, how to set up cool Core Web Vitals reports using your own website's visitor data. This was backed up with a technical article explaining how they did it.

I was stunned. I’d spent the last few days working on the same thing for a presentation that I’m going to make at the DeepSEOcon conference later this year. They’d stolen my thunder 😲
I paced around trying to work out what to do next.

Eventually it became clear. web.dev are the rulers of Web Vitals and the code I use to track those metrics. I had to bring my work into line so that it worked with their solution, i.e. make it so people can use my report on the same setup. Time to get to work.

web.dev’s article details the technical work involved. This article provides the specific steps you can follow to set it up if you are using the standard gtag or GTM implementation for GA4.
What is this Core Web Vitals thing all about?

Core Web Vitals are part of the upcoming page experience ranking signals from Google. They measure real users' experiences as they view pages of a website via the Chrome browser (CrUX). Website owners want to give these users a good experience to benefit from the upcoming ranking factor.
Tools like Lighthouse and the Chrome performance report provide lab data, i.e. the developer's own experience. This is good for detailed analysis and testing but does not accurately reflect the experience of your real visitors, who will have different network connections, locations and devices. Real users also interact with the pages, unlike these testing tools. Interaction is required to calculate FID and affects CLS.

CrUX data gives you a real user view of the web vitals (field data) and can be accessed via tools like PageSpeed Insights. However, this data is highly aggregated, so you only get a high-level view of your performance. In many cases you will not get page level data and will have to work from origin (domain) level data. The results are also an average over the last 28 days, meaning there is a big lag between making a change and fully seeing its effect in the report. Pages need to be visited by a threshold of opted-in Chrome users in the 28 days to be included in the report.

The Google Search Console has a Core Web Vitals report that also uses the CrUX data, which means it suffers from the same limitations, such as the 28-day lag. When there is not enough data for individual URLs, Search Console reports on aggregated sets of URLs or even just the whole site. This means you often don't see data at the page level, and it is why people often see sets of URLs change from good to bad at the same time. The reported URLs are often samples from these sets.
It's also worth noting that the Core Web Vitals report is based on CrUX data and not the Google search index. It has nothing to do with the page being indexed for search. It's not uncommon for low-traffic sites to see no URLs in this report due to a lack of CrUX data, and drastic changes in URL counts can happen. This does not mean those URLs are no longer performing in search.

Another option is for website owners to directly gather their own Core Web Vitals data from their own site visitors. The advantage is that this includes data at the page view level for all visitors that can be tracked. This granular data makes it far easier to quickly spot where issues are and how improvements are affecting your metrics.
This article details how you can set this up on your own website to generate reports in Google Data Studio that help you fix your Core Web Vitals issues. Let’s get started…
Adding the code to your site (for GA4 using gtag)
This solution assumes you have already implemented your GA4 tracking via gtag. See the next section if you use GTM. For BigCommerce stores using our Tag Rocket app, this is already done for you (and more) when you enable the GA4 tag.
web.dev have a web-vitals GitHub repository that provides a lot of detail on code you can use to track the Core Web Vitals metrics. They have also documented some extra code to help you Debug Web Vitals in the field. We will be using all that code, with some additions from me.

You need to place the following code on all pages. It sends all your Web Vitals events to your GA4 property. The code can go anywhere; I suggest placing it just after your base GA4 code.
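The embedded snippet isn't reproduced in this version of the article. As a minimal sketch of what it does, based on web.dev's published GA4 example and assuming the v2 web-vitals API (the full version also derives the debug_* and metric_rating parameters):

<script type="module">
  // Sketch only: load the web-vitals library (v2 API) and send each
  // metric to GA4. The original code also computes the debug_* and
  // metric_rating parameters, which are omitted here.
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals@2?module';

  function sendToGoogleAnalytics({name, delta, value, id}) {
    gtag('event', name, {
      value: delta,         // GA4 sums event values, so deltas add up to the final value
      metric_id: id,        // used to group events from the same page view
      metric_value: value,  // the current value of the metric
      metric_delta: delta,  // the change since the metric was last reported
    });
  }

  getCLS(sendToGoogleAnalytics);
  getFID(sendToGoogleAnalytics);
  getLCP(sendToGoogleAnalytics);
</script>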
We’ve found that tracking page types can be very useful in segmenting the data and narrowing down which parts of the website are having trouble.

I also think gathering the effective connection type of a user (e.g. 4g) can be of value.
To enable these you need to alter your core gtag to send the 'page_type' and 'effective_connection_type' parameters, as shown below. Don't forget to replace 'PAGE TYPE NAME' with your code to dynamically get the page type name. How you set page_type is down to your CMS and how you want to segment your pages.
<script>
  function getEffectiveConnectionType() {
    var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
    if (connection && connection.effectiveType) return connection.effectiveType;
    return 'unknown';
  }
</script>
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async="async" src="https://www.googletagmanager.com/gtag/js?id=G-0BQR1PRHYJ"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-0BQR1PRHYJ', {
    page_type: 'PAGE TYPE NAME',
    effective_connection_type: getEffectiveConnectionType()
  });
</script>
Tracking page types can be useful for reporting on a lot of things. GA4 also has reports that support the ‘content_group’ parameter which I currently set to the same value.
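As a quick sketch, reusing the page_type value for content_group would look like this in the config call (with the same placeholder as before):

gtag('config', 'G-0BQR1PRHYJ', {
  page_type: 'PAGE TYPE NAME',      // placeholder, as above
  content_group: 'PAGE TYPE NAME',  // same value, for GA4's content group reports
  effective_connection_type: getEffectiveConnectionType()
});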
Once you have added that, you will be sending the same Web Vitals events and parameters as used by the solution web.dev provided. This includes the standard metric parameters (metric_id, metric_value, metric_delta), the debug parameters (debug_target, debug_event, debug_timing, event_time), the rating for the metric (metric_rating), plus the optional page_type parameter.
Adding the code to your site (for GA4 using GTM)
Simo Ahava has written a great article (as usual) on using GTM to send CWV events to GA4, which uses a Custom Template he developed. Unfortunately it only covers the basic metrics due to a limitation of custom templates, and it uses a different set of GA4 parameters to those used by web.dev and me. This section provides an alternative GTM solution that supports the debug info and uses parameters compatible with the web.dev report and mine.
I’ll assume you are familiar with GTM and have added the GA4 Configuration tag already.
Create a Custom HTML tag for all pages that contains the following code. It's very similar to the gtag one, but it sends the Core Web Vitals data to the dataLayer, and the JavaScript is downgraded to work with GTM. I've followed Simo's dataLayer event naming convention so that our solutions should be compatible.
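The tag code isn't reproduced in this version of the article. A minimal sketch, assuming Simo's webVitalsMeasurement convention and the IIFE build of the v2 web-vitals library (the debug_* logic is again omitted):

<script>
  (function() {
    // Load the web-vitals IIFE build, which exposes a global webVitals object.
    var script = document.createElement('script');
    script.src = 'https://unpkg.com/web-vitals@2/dist/web-vitals.iife.js';
    script.onload = function() {
      function sendToDataLayer(metric) {
        window.dataLayer = window.dataLayer || [];
        window.dataLayer.push({
          event: 'coreWebVitals',
          webVitalsMeasurement: {
            name: metric.name,
            id: metric.id,
            value: metric.value,
            delta: metric.delta,
            // Rounded values are kept for compatibility with Simo's template;
            // CLS is scaled up so rounding keeps some precision.
            valueRounded: Math.round(metric.name === 'CLS' ? metric.value * 1000 : metric.value),
            deltaRounded: Math.round(metric.name === 'CLS' ? metric.delta * 1000 : metric.delta)
          }
        });
      }
      webVitals.getCLS(sendToDataLayer);
      webVitals.getFID(sendToDataLayer);
      webVitals.getLCP(sendToDataLayer);
    };
    document.head.appendChild(script);
  })();
</script>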
You also want to add a tag sequence to this tag to make sure the GA4 Configuration tag fires before this one. We don't want to send events to GA4 before it exists.

We now need to define all the user-defined variables that we send in the dataLayer. Again, I've followed Simo's lead on the naming convention. Note that my report does not use the rounded values or the rating, but I've kept them for compatibility with other solutions.
Variable name | Data Layer Variable Name |
---|---|
DLV – webVitalsMeasurement.name | webVitalsMeasurement.name |
DLV – webVitalsMeasurement.id | webVitalsMeasurement.id |
DLV – webVitalsMeasurement.value | webVitalsMeasurement.value |
DLV – webVitalsMeasurement.delta | webVitalsMeasurement.delta |
DLV – webVitalsMeasurement.valueRounded | webVitalsMeasurement.valueRounded |
DLV – webVitalsMeasurement.deltaRounded | webVitalsMeasurement.deltaRounded |
DLV – webVitalsMeasurement.debugTarget | webVitalsMeasurement.debugTarget |
DLV – webVitalsMeasurement.debugEvent | webVitalsMeasurement.debugEvent |
DLV – webVitalsMeasurement.debugTiming | webVitalsMeasurement.debugTiming |
DLV – webVitalsMeasurement.eventTime | webVitalsMeasurement.eventTime |
DLV – webVitalsMeasurement.rating | webVitalsMeasurement.rating |
We’re getting there. Isn’t GTM meant to make things easy?
Time to create a trigger for the coreWebVitals event. Like this…

We’re now on to the GA4 event tag itself. Here we use our new trigger and add all the parameters we want to send.
First set the Event name to {{DLV – webVitalsMeasurement.name}}
The following table shows what to send so it works with the web.dev report and mine.
Parameter name | Value |
---|---|
metric_name | {{DLV – webVitalsMeasurement.name}} |
metric_id | {{DLV – webVitalsMeasurement.id}} |
metric_value | {{DLV – webVitalsMeasurement.value}} |
value | {{DLV – webVitalsMeasurement.delta}} |
metric_delta | {{DLV – webVitalsMeasurement.delta}} |
debug_target | {{DLV – webVitalsMeasurement.debugTarget}} |
debug_event | {{DLV – webVitalsMeasurement.debugEvent}} |
debug_timing | {{DLV – webVitalsMeasurement.debugTiming}} |
event_time | {{DLV – webVitalsMeasurement.eventTime}} |
metric_rating | {{DLV – webVitalsMeasurement.rating}} |

As with the gtag implementation, we recommend sending a page_type parameter to GA4 so that you can segment your reports. How you determine the value of page_type is down to you. You will then have to send it in the dataLayer, create a variable for it, and add it to your GA4 Configuration tag's fields.
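For example, a sketch of the dataLayer push (the 'pageType' key name is my assumption; match it to whatever your Data Layer Variable reads):

<script>
  // Push the page type before the GTM container snippet runs.
  // 'PAGE TYPE NAME' is a placeholder, as in the gtag section.
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ pageType: 'PAGE TYPE NAME' });
</script>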

Setting up the effective connection type requires the following Custom JavaScript variable, called 'Effective connection type', to be created:
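The variable code isn't reproduced in this version of the article, but based on the getEffectiveConnectionType helper from the gtag section it should look like this (GTM Custom JavaScript variables are anonymous functions):

function() {
  // Mirrors the getEffectiveConnectionType helper from the gtag section.
  var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
  if (connection && connection.effectiveType) return connection.effectiveType;
  return 'unknown';
}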
You can then add it as a field called effective_connection_type in your GA4 Configuration tag.

I'll leave it up to you how you test and finally publish it. Simo's article has a good section on testing.
Adding GA4 definitions (optional)
This solution does not need you to add these parameters as definitions in the GA4 admin. However, defining them lets you directly report on them in GA4 and when directly connecting to Data Studio. It also gives me an opportunity to briefly explain what they do.
Dimension Name | Scope | Description | Event Parameter |
---|---|---|---|
Web Vitals metric ID | Event | This is used to group web vitals that happen in the same page view | metric_id |
Debug target | Event | This identifies the selector path to the element that contributed most to the metric | debug_target |
Debug timing | Event | For FID events it indicates whether the input happened before ('pre_dcl') or after ('post_dcl') the content was loaded | debug_timing |
Event time | Event | The time when the Web Vitals event happened | event_time |
Web Vitals rating | Event | 'good', 'ni' (needs improvement) or 'poor'. Based on the thresholds set by web.dev. Used in the LCP, CLS and FID events | metric_rating |
Page type | Event | Used to identify the type of page (e.g. template name) for segmenting page based reports | page_type |
Effective connection type | Event | Uses the Network Information API to get the effective connection speed of the user. 4g, 3g, 2g, slow-2g and unknown | effective_connection_type |
Metric Name | Scope | Description | Event Parameter | Unit of measure |
---|---|---|---|---|
Web Vitals value | Event | The value of a web vital. Used in the LCP, FID and CLS events. | metric_value | Standard |
Web Vitals delta | Event | The difference since the last report for this web vital. ‘value’ is also set to the delta | metric_delta | Standard |
Connect to BigQuery
web.dev and I went the route of using BigQuery for our reports. The reasons are the limitations of directly connecting GA4 to Data Studio:
- The mapping of custom GA4 properties to Data Studio fields is not reliable. Changing the data source causes the fields to shuffle around in the report. Hopefully they will fix this at some point.
- Data Studio has no mechanism to group the Web Vitals events by page view to determine the final value for a Web Vital. We use BigQuery to pre-group the data for us.
First you want to set up a Google Cloud account and create a project to connect GA4 to.
Then go into the GA4 admin, select 'BigQuery Linking' and click the Link button. You will then be able to select your new project and complete the linking process. You need to enable the Daily export.

Once complete it will take about 24 hours before your first GA4 export table is created.
You can check by selecting the project in your Google Cloud account and then opening BigQuery from the side bar. You should see the project listed. Once the first export is complete you will be able to expand it to see the exported tables. This is what mine looked like once the table was created.

Time for a break. See you tomorrow…

Create a BigQuery materialised table
Morning.
After the first events table is exported we can move to the next step. We need to convert the data so that it can be easily used by Data Studio. As mentioned before, the main task is to work out the final Web Vitals scores for each page view so that Data Studio can deal with them.
We perform this conversion with a SQL query that creates a new table containing the data we need. The query we use is based on the one documented by web.dev, with a few extras added so that it can support my report.
If you've already created the materialised table as per web.dev's instructions, you will need to add the 'Tony's additions' sections and alter the 'Tony's modification' lines in that SQL for my report to work.
Otherwise, click the 'compose new query' button and add the following SQL to the editor. You will need to edit all occurrences of your_project.analytics_123456789 to use your project and dataset.
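The full query isn't reproduced in this version of the article; use the one from the web.dev article with my additions. As a minimal sketch of the core idea, it materialises the final value per metric per page view, something like this (the real query selects many more columns, such as page, device, geo, debug_target and page_type):

-- Sketch only, assuming the standard GA4 BigQuery export schema.
CREATE OR REPLACE TABLE `your_project.analytics_123456789.web_vitals_summary` AS
SELECT
  metric_id,
  event_name AS metric_name,
  -- Each event reports the running value, so the largest one is the final value.
  MAX(metric_value) AS metric_value
FROM (
  SELECT
    event_name,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'metric_id') AS metric_id,
    (SELECT COALESCE(value.double_value, CAST(value.int_value AS FLOAT64))
       FROM UNNEST(event_params) WHERE key = 'metric_value') AS metric_value
  FROM `your_project.analytics_123456789.events_*`
  WHERE event_name IN ('LCP', 'FID', 'CLS')
)
GROUP BY metric_id, metric_name;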
Once you have edited it you can run it for the first time. If it does not work, you may not have got your table names right.
When it works it will create a table called ‘web_vitals_summary’ in the same place as your GA4 export. We’re going to reference that later.

Save the query as ‘Web Vitals Summary’ so you can re-run it later. You can find saved queries via an option in the footer.
Creating this materialised table not only makes it easier to create reports, it may also save you money. Queries cost money once you've used up your free monthly allowance of 1 terabyte. This table reduces the size of the queries you make to BigQuery when viewing reports, making them faster and cheaper.
Schedule updates of the materialised table (optional)
At the moment you would need to periodically re-run the query to get the latest data into your report. You may want to leave this as a manual process if you are hitting the free monthly allowance. If not, you can set the query up to be scheduled every day.
To do that, when editing the query you will see a Schedule option at the top. Create one and name it 'Web Vitals Summary'. You may have to enable the API, refresh the page, log in, and maybe go back to the saved query to get it to work (this one is a bit flaky). You can find scheduled queries from the expandable side bar.
GA4 does not seem to be consistent in the time it does its daily export, so you may have to live with a day or two of lag in the report (or run the query manually as needed).
Create a Data Source in Data Studio
Data Studio uses data sources to provide the data for reports. The next step is to create a data source based on our new BigQuery materialised table.
Open our Core Web Vitals BigQuery data source and click on the copy icon. Then select BigQuery, click Authorize, find your 'web_vitals_summary' table, and click Reconnect.

You may want to rename the source by clicking on its name at the top right. Otherwise, we’re all done here.
From v2.0 onwards the report uses the functions provided by this data source. This is so we can create data sources using different connectors. To create a new data source using a different connector you would need to implement these functions (using the exact same field ids).
Make your own copy of the Core Web Vitals report
Almost there. We just need to create the report using your data source.
Open the Core Web Vitals report and click ‘Use Template’ at the top right. Then select your data source and click ‘Copy Report’. You may need to do a refresh to get your new data source showing.
Rename the report (top right) to whatever you want to call it. Then switch to view mode for a better experience.
Using the reports
Over time more daily data will be imported from GA4 making the time based reports great for tracking progress.
Browse through the different pages in the report via the left side menu.
The dropdown filters at the top affect the whole report. Why not drill down to a specific location, browser, page type, or even whether the user was engaged.

Try clicking on table rows. Many will further filter down the data for that page.

Page URLs and the PSI (PageSpeed Insights) columns are links.
I think there may be some interesting data to come from the distribution reports with regard to engagement. I’m already seeing longer LCP values causing a reduction in engagement with mobile users.

This is your own copy of the report. Feel free to edit it to fit your needs, e.g. changing the above chart to a 100% stacked chart will probably work well once there is enough data. If you improve it or add something great…
Feedback
Please send me any feedback you have. Ideas, issues or just to show off your report scores. My @TonyMcCreath Twitter account is a good place to do that. I’ll be waiting.

Hi
Thanks for sharing this complete guide for checking Core Web Vitals. I think there is a mismatch in GTM, in the parameters table.
value : {{DLV – webVitalsMeasurement.delta}}
Hi Amin, could you clarify what this mismatch is? My solution does not use the delta, so it would not have been tested!
Thanks for the writeup! This topic is going to be very important in the next several months. I hope you continue to share more on GA4, CWV, and BigQuery!
I will be publishing some more reports in September after my talk at DeepSEOcon
Awesome article! Would this also be possible using Universal Analytics?
It is possible; in fact the web-vitals script guide includes examples for GA. I chose GA4 because it has a richer event model that makes it easier for me to send many different bits of data in one event.
Hello, Tony.
Using Simo's template and running the queries from the web.dev article won't work, right?
This is the GA4 event https://ibb.co/Z1BpVnf
This is what I have in GA4 https://ibb.co/jWBLvYw
This is in GCP – BigQuery https://ibb.co/3TMnVp0
With this query
SELECT * FROM `my_project_id.analytics_XXXXX.events_*`
WHERE event_name IN ('LCP', 'FID', 'CLS')
I got this https://ibb.co/XkzTM4v
And using this query https://web.dev/vitals-ga4/#lcp-fid-and-cls-at-the-75percent-percentile-(p75)-across-the-whole-site
I got this https://ibb.co/d7M69zs – no FID, no LCP, just 1 CLS
Seems that I’m not getting correct data… Any idea how to fix this?
Thanks!
I think you need to change the parameter names to match up. You're using web_vitals_measurement_value while the web.dev query (and my queries) use metric_value. The web.dev query also requires a metric_id parameter.
Hi,
I used Simo's template lately and am trying to use your solution (especially because of the nice Data Studio presentation and the GA4 possibilities).
BUT – your dataLayer pushes work only when I have two tags active: your Custom HTML tag and Simo's Core Web Vitals template. When I pause Simo's tag, the coreWebVitals events are not visible in the dataLayer and I get this error in my console: "Uncaught ReferenceError: webVitals is not defined at HTMLScriptElement.a.onload (:1:109)"
Should I keep both tags, or will that negatively affect the data, e.g. cause duplication?
It sounds like my Core Web Vitals tag is not running or something is trying to use it before it loads the script that creates the webVitals object.
I don’t think having both tags running is a good idea. As in the previous comment, you could make Simo’s template work by altering a few parameter names.
Would delayed firing of the script and GA4 tag (e.g. due to cookie acceptance dialogs) affect the Web Vitals reported?
I would presume so. You would lose data for people who did not accept.
I think the script will still work once it runs, and gather the same results.
You could speed things up by preloading any scripts ready for their use on acceptance.