Douglas Armstrong from Midwood Guitar Studio contacted us in June about improving the site’s performance. Their theme developer is Damien from Above & Beyond, who recommended us.

We don’t usually take on development work, but I had worked with Damien before and knew he was keen on improving his themes, so we decided to take it on as a pet project with the idea of writing this case study.

Google focuses on the Core Web Vitals when measuring a user’s experience of a web page. The data comes from real users and is available to the public via the Chrome User Experience Report (CrUX). The Core Web Vitals are a subset of the broader Web Vitals that Google considers worth monitoring.

Our first step was to install Tag Rocket, which gathers Web Vitals from the real users that visit the site. This data is far more fine-grained than the data provided by CrUX, enabling us to discover where the issues are and to measure progress rapidly (CrUX is aggregated over 28 days).

They also gave us access to their Store Admin, Google Analytics, and Google Search Console.

I created a local development copy of their theme and added it to GitHub so we could track the changes I made.

Initial Status

The PageSpeed Insights tool includes a page’s CrUX data at the top. This is a good way to see the current state while we wait for the detailed Tag Rocket data to populate.

Results are split between mobile and desktop as the experience for each can be very different. For mobile, most web vitals were in the needs improvement area.

Initial CrUX Core Web Vitals scores for mobile

While on desktop, Cumulative Layout Shift (CLS) was well in the red area.

Initial CrUX Core Web Vitals scores for desktop

The initial data from Tag Rocket…

Summary of the initial Core Web Vitals

Phase one – Cumulative Layout Shift (CLS)

Tag Rocket provides a report at the page-type level. Many issues relate to specific page types as they share common templates. In this case, the three most visited page types (category, product, home) were failing CLS, so I decided to focus on that first. I also noticed that desktop users had worse CLS scores, just like the CrUX report showed.

CLS Needs Improvement for the main page types

CLS is about unexpected movement as the page loads or while the user is not interacting with it. This desktop category page shows how much the layout shifts as elements load, like styles, fonts and components/widgets.

Watching Layout Shifts as a page loads

I made several CSS fixes to reduce the layout shifts. These fixes had to be tested at several device widths as the design is responsive and has several transitions as the width reduces.

  • Hid the initial carousel content (brown banner bar) until the carousel was built
  • Added an aspect ratio to the logo so the browser could calculate its height
  • Pre-hid a widget that was initially visible and later hidden
  • Minimised shifts in the top-right menu as icons and elements were added
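A sketch of the kind of CSS involved — the class names and dimensions here are hypothetical stand-ins, not the theme’s real selectors:

```html
<style>
  /* Give the logo an aspect ratio so the browser can reserve its
     height before the image file arrives */
  .header-logo img {
    aspect-ratio: 300 / 80; /* intrinsic width / height (illustrative) */
    width: 100%;
    height: auto;
  }

  /* Hide the carousel's raw content until the carousel script marks
     it as built, so half-rendered slides never push content around */
  .hero-carousel:not(.is-built) {
    visibility: hidden;
  }

  /* Pre-hide a widget that a script would otherwise briefly show and
     then hide, shifting everything below it */
  .promo-widget {
    display: none;
  }
</style>
```

Using `visibility: hidden` rather than `display: none` for the carousel keeps its space reserved, which is usually what you want for CLS.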

I added a font preload to give it a chance to load before the page is displayed.
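A font preload looks like this — the font path is illustrative, not the store’s actual font file:

```html
<!-- Fetch the web font early so it can arrive before first render.
     crossorigin is required for font preloads, even same-origin ones. -->
<link rel="preload"
      href="/assets/fonts/brand-font.woff2"
      as="font"
      type="font/woff2"
      crossorigin>
```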

Another tricky one: at smaller widths, the banner was dynamically altered to use a mobile menu. This caused things like the brown carousel to suddenly appear, pushing everything down. It took a combination of CSS and code changes to fix that. It’s not easy when code starts moving things around after the page has loaded!

All the above fixes would have affected every page on the site, so they would be a big win if they worked.

I then examined a product page. It had an issue with the product thumbnail system, which also changed depending on width. Some CSS to define the sizes of the thumbnails helped there. The page also contained some widgets that shifted things around as they loaded; I pre-allocated space for them to stop that from happening.
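The thumbnail and widget fixes can be sketched like this, with hypothetical selectors and sizes:

```html
<style>
  /* Fix the thumbnail dimensions so the gallery doesn't reflow as
     each image loads (sizes are illustrative, set per breakpoint) */
  .product-thumbnails img {
    width: 75px;
    height: 75px;
    object-fit: contain;
  }

  /* Reserve space for a late-loading widget (e.g. a reviews badge)
     so it doesn't push content down when its script injects it */
  .reviews-widget {
    min-height: 24px;
  }
</style>
```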

Moving on to the home page, it had a few sliders that shifted things around as they were built. Some CSS got them looking right before the slider scripts ran.

I went live with the changes and tested a few pages with PageSpeed Insights. This product page showed the most improvement…

Product page

The home page and category pages scored green for desktop and mobile. The next day Tag Rocket started showing real user data for the changes.

CLS status over time

And after 28 days, we would see it reflected in PageSpeed Insights and the Google Search Console reports.

Phase two – Largest Contentful Paint (LCP)

This is how Google measures the loading speed of a page. It is typically the time it takes for the largest element, usually the main image, to load and show fully.

I audited the usual suspect pages using the Chrome developer tools (the Performance tab is particularly useful) and made these changes:

  • Requested that some large images (1.6MB) be reduced in byte size
  • Preloaded the main images (already done in most cases)
  • Removed JavaScript-based lazy loading from the main images, including the first four images on a category page
  • Removed lazy loading from smaller images that were above the fold
  • Stopped resources from blocking rendering where possible
  • Moved the remaining blocking resources as late as possible and added preloads
  • Ensured the web font preload was working
  • Removed duplicate scripts
  • Hard-coded some elements that were being dynamically added by a page builder
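The main-image changes above amount to markup like this — the image path and dimensions are illustrative:

```html
<!-- Preload the hero image so the browser starts fetching it
     as soon as it sees the HTML -->
<link rel="preload" as="image" href="/images/hero-guitar.jpg">

<!-- The LCP image itself: no JavaScript lazy loading, explicit
     dimensions, loaded eagerly and flagged as high priority -->
<img src="/images/hero-guitar.jpg"
     width="1200" height="600"
     alt="Featured guitar"
     loading="eager"
     fetchpriority="high">
```

`fetchpriority="high"` is a progressive enhancement; browsers that don’t support it simply ignore the attribute.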

The home page had some tricky code that determined the aspect ratio of the carousel, altered it accordingly, and hid the carousel until the aspect ratio was worked out. This code loaded a different version of the carousel image to get the aspect ratio, then displayed the carousel, which would load the real image. This caused a significant delay in showing the main image.

Every page had a carousel/slider, and the home page had several. These were programmed to start rotating as soon as possible, taking over the browser’s main thread. There is no value in animating a carousel that has no content yet, so I put in a mechanism to delay its running. That way, the browser can focus on rendering the page.
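One way to sketch that delay mechanism — `initCarousel` stands in for the theme’s real carousel start-up code, which is not shown here:

```html
<script>
  // Defer carousel start-up until the browser is idle, so page
  // rendering isn't competing with the rotation code.
  function startCarouselWhenIdle(initCarousel) {
    if ('requestIdleCallback' in window) {
      // Run when the main thread is free, but within 3s regardless
      requestIdleCallback(initCarousel, { timeout: 3000 });
    } else {
      // Fallback for browsers without requestIdleCallback
      setTimeout(initCarousel, 1000);
    }
  }
</script>
```

Calling this once per carousel also naturally splits the set-up work into separate short tasks instead of one long one.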

Phase three – First Input Delay (FID)

Input delays happen when the computer is busy processing other things and takes time to respond to user input. Long JavaScript tasks can stop the browser from responding. Mobile devices often have less processing power and struggle more with sites that run a lot of JavaScript.

The previously mentioned building of the carousels was causing a long task during which the browser could not respond. Delaying each carousel’s code also split the work into short tasks that would not cause any responsiveness issues.

There was also a very CPU-intensive widget on all pages. It was causing the home page to fail FID, so I suggested that removing it would help. By this point, Tag Rocket was recording a sea of green:

Good Core Web Vitals by page type

Phase four – waiting

Tag Rocket indicated things were good, but we had to wait 28 days for Google to tell us if they were happy. This is what Google Search Console was reporting at the start. Only a few pages passed:

Google Search Console Initial Core Web Vitals

As we waited, I made several refinements based on clues given by Tag Rocket.

Ten days later, desktop went almost all green:

Google Search Console Desktop Core Web Vitals moving to good

I investigated the pages that failed and found they had extra banners added that were not sized, so when the banner images loaded, the page content got pushed down.
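The fix for an unsized banner is to give the image explicit dimensions so the browser reserves the space before the file loads — filename and sizes here are illustrative:

```html
<img src="/images/sale-banner.jpg"
     width="1200" height="150"
     alt="Holiday sale banner">
```

Modern browsers use the `width`/`height` attributes to compute the aspect ratio, so the reserved box scales correctly even when CSS makes the image responsive.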

Another week and mobile mostly went green as well:

Google Search Console Mobile Core Web Vitals moving to good

It looked like mobile LCP was borderline and easy for pages to flip.

At this point, I got permission to remove a jQuery script. The problem with jQuery is that other scripts depend on it, so in most cases it has to load in a blocking manner. I’d determined it was never used! Gone.

Another bonus with Tag Rocket is that it monitors for JavaScript and console errors, so we would know if altering or removing a script broke something.

And for desktop, we gained the good URL status for 98.4% of pages.

98.4% Google Page Experience

And a month later, we lost it 🙁

Good to Needs Improvement
Loss of Google Page Experience

Mobile was slightly failing LCP, while desktop was slightly failing CLS. Tag Rocket indicated it could be the product page at issue:

Core Web Vitals by page type

Some work with the designer further refined the optimisations, and we got 100% green.

Google Search Console Core web Vitals 100% good

At this point, it seemed that everything that could be done at the theme level had been done, and that left us borderline passing. The BigCommerce CMS has a slow initial HTML response time (TTFB), making it an uphill battle to pass LCP. So what could we do next?

CDN HTML Caching with Zycada

I suggested a few possible solutions to the client, and they decided to go with Zycada. I’ll try to explain what it does. When a user visits a page, Zycada predicts which page they will go to next and preloads that page’s HTML into its CDN. If the prediction is right, the user receives the next page’s HTML in under 100ms instead of the usual 700ms+ from BigCommerce. The system takes a week or so of learning before the caching mechanism is enabled.
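Zycada’s prediction logic is proprietary, but the general idea resembles speculative prefetching you can sketch in plain HTML — the category URL here is purely illustrative:

```html
<!-- Hint the browser to fetch the likely next page's HTML ahead of
     time, so navigation to it feels near-instant. Zycada does the
     equivalent at the CDN layer, driven by learned navigation patterns. -->
<link rel="prefetch" href="/categories/electric-guitars" as="document">
```

The crucial difference is that a CDN-side cache like Zycada’s benefits the first navigation too, whereas a browser prefetch only helps the user who issued it.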

This report gives you a good clue about when they enabled the system:

TTFB changes when enabling Zycada

We’ve recently created a new and more accurate report system. This is the report that I think best shows the effect of Zycada (as it’s new, we don’t have a before shot of this one):

TTFB Distribution v Bounce Rate using Zycada on BigCommerce

The big green bar on the left is when users get pages from Zycada’s cache with sub 100ms responses. I’d say that any response time better than around 600ms has been improved by Zycada.

The pie charts give an idea of the percentage of page views with a TTFB under a specific time. It’s nice to see green close to the 75th percentile, meaning almost three out of four page views have a TTFB considered good. And the 100ms slice of the pie gets around a third of page views.

If you look at the bottom of the charts, you will see bounce rates. Note how low they are when TTFB is small and how high they climb when it is large. About a third of page views are under 100ms and have close to a zero percent bounce rate! Zycada is stopping people from leaving the site.

LCP is the speed metric we really want to know about. Here is its distribution:

LCP for a BigCommerce store using Zycada

This time the pie chart shows the green bar going past the 75th percentile mark. This indicates that the pages pass Google’s requirements. Desktop is doing great, with a third of page views being under half a second for LCP. For mobile, the distribution is shifted a bit, and LCP passing is still borderline. As TTFB does not show this effect, I assume the delays relate to loading further resources like the main image.

Since then, we’ve had another temporary hiccup with mobile LCP, confirming it is still very much on the borderline of 2.5 seconds.

Mobile LCP fail and recovery

I suspect people on slower networks are pulling down the score. A hard one to solve. This is what 3G users see:

Core Web Vitals for 3G users


Getting a BigCommerce store to pass the Core Web Vitals with some HTML and CSS skills is definitely possible. LCP is the hardest metric to fix, even when using an HTML CDN to massively speed up the experience for many users.