I’ve spent the last few months researching and testing how to speed up a BigCommerce store. It’s still a work in progress and contains some hacks. We can use PageSpeed Insights to show the results of my efforts for both mobile and desktop devices…

At the moment only the category pages have been fully optimised. There’s still quite a bit of work needed to make it robust and safe to use. Think of it as a prototype of what can be done. You can see all my changes on my fork of Cornerstone in the speed2 branch.

The Life Of A Page Load

When a user clicks on a web page link or enters it into the address bar, the browser goes through a sequence of tasks and milestones to eventually display the page. Here I’ll talk about the most significant milestones and ways to speed each one up so the page loads faster.

Just to scare you from the start, here’s a W3C graph showing the timing attributes as a user navigates to a page. I’ll be talking about many of these steps with ideas on how to speed them up.

W3C Navigation Timings

BigCommerce Cornerstone 6.2.0 Performance

All my modifications are based on Cornerstone 6.2.0. As a baseline, here are some results for a demo product page using the base theme.

Not the worst scores in the world, but both devices are failing Largest Contentful Paint (LCP), which is what Google uses to determine if a page provides a good experience, something that is rewarded with a ranking boost.

And scores typically go downward as widgets and features are added to pages.

The main issues reported are mostly beyond our level of control:

  • Reduce unused JavaScript (several js files added by the theme)
  • Reduce unused CSS (mainly the theme’s bundled CSS file)
  • Eliminate render-blocking resources (CSS files and web fonts)
  • Reduce initial server response time (that’s BigCommerce’s servers)

These lab tests can vary by a lot, making it quite hard to be conclusive about any results. Over time I should have more real world data to confirm if these changes improved the page experience for most users.

Let’s see how we can improve things.

Making Connections

Before a browser can talk to a new domain/hostname it has to make a few requests to set up a connection to it: a DNS lookup to find out what server the domain is hosted on (its IP address), then a TCP connection to that server on the correct port, and finally a handshake to make that connection secure.

This has to be done with every new domain used during the loading of a page. And it can add 100s of milliseconds to the delay in loading the first resource from each new domain discovered.

You can see the time used up by connections in the previous WebPageTest waterfall diagram as the green, orange and purple bars before some requests.

For a BigCommerce page WebPageTest indicated it took about 175ms to make the initial connection to the page’s domain.

These connections have to be made if you want to access a domain, and there is nothing that can be done about the delay when loading the initial HTML for a page. But it is possible to get further connections set up in advance of a domain being needed, by using some resource hints at the top of the HTML. Here’s what BigCommerce do by default.

<link rel="dns-prefetch preconnect" href="https://cdn11.bigcommerce.com/s-65027" crossorigin>
<link rel="dns-prefetch preconnect" href="https://fonts.googleapis.com" crossorigin>
<link rel="dns-prefetch preconnect" href="https://fonts.gstatic.com" crossorigin>

There are a few mistakes here.

  1. For compatibility with Safari the dns-prefetch should not be done in the same tag as the preconnect
  2. I don’t believe the folder in the first connection has any meaning. Preconnect expects an origin (scheme and domain), not a path
  3. There is no value in doing this for resources that are requested instantly (the first two)

Best practice is to do the preconnect and the dns-prefetch in separate link tags, in that order; the dns-prefetch is mainly there as a fallback for older browsers. In some cases, like the font domains, you will also need to add crossorigin. Test both ways, as the hint only helps when it matches how the resource will actually be requested.

Here is what I think BigCommerce should do:

<link rel="preconnect" href="https://fonts.gstatic.com"  crossorigin="anonymous">
<link rel="dns-prefetch" href="https://fonts.gstatic.com"  crossorigin="anonymous">

Pre-connection like this is of most value when the browser does not instantly know that a request to a domain will be needed. The browser can get the setting up of the connection out of the way so that the request goes out faster when needed.

These are also browser hints. This means the browser will decide if and when it will do the preconnect.

Lighthouse recommends only doing a few preconnect tags and this preconnect article also warns about overuse.

WebPageTest is good at highlighting time used up by making connections. You can see here how the connection to fonts.gstatic.com was made early so that fonts were requested as soon as they were known about. But for the other pre-connects BigCommerce makes, it made no difference.

Our Tag Rocket app can add a lot of delayed requests to different domains. As these requests are made late in the process I decided to test preconnecting them all, adding a little more than a few at the end of the head section. This lets the critical resources go first in the queue, with these low-priority connections at the end.

You can see how some connections were made early, but I did not see any noticeable change in the total timings of the page load. This may be because WebPageTest emulates a slower device where the network and CPU are busy, and therefore it has little time to do any pre-connecting:

I suspect this would have more benefit on faster devices and connections.

Consider doing the preconnect/dns-prefetch for any resources on different domains that are requested later on, especially if they are related to how the page looks.
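As a rough sketch, hints for a couple of late-loaded third-party domains could sit at the end of the head section. The hostnames here are placeholders, and crossorigin is only needed when the later request is a CORS one (like a font):

<link rel="preconnect" href="https://pixel.example.com">
<link rel="dns-prefetch" href="https://pixel.example.com">
<link rel="preconnect" href="https://reviews.example.com">
<link rel="dns-prefetch" href="https://reviews.example.com">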

Time To First Byte (TTFB)

When a user wants to see a page, the first thing the browser does (after making the connection) is request the HTML for that page. The HTML is the scaffolding required to know how to build the page. It contains the basic layout and references to the resources (images, scripts, style sheets) required to complete the page. Once the browser has made the HTML request it can’t do anything until the HTML is returned, as it knows nothing about the page. The time it takes from the request to the first part of the response is TTFB.

Browsers are smart: as they start receiving the HTML they can start processing it. They don’t need all the HTML to get the ball rolling. For example, URL-based resources referenced at the top of the HTML can be requested well before the whole of the HTML is received. So TTFB is the point where things start getting done.

Unfortunately, many CMS systems tend to have a poor TTFB. Often the big time consumer is the website’s server building the HTML. These days CMS platforms have quite a complex set of code to build the HTML, and this complexity means they have to build all of it before they can start sending it back to the browser. That makes TTFB not much earlier than the point where the response has been fully sent.

For BigCommerce TTFB with WebPageTest is over 500ms. Then the content downloads almost instantly. Add the connection time and we see it takes over a second before any HTML is sent to the browser.

The point I’m making here is that things could be sped up if TTFB was smaller, so that the browser can start processing the page earlier. Especially if it can start requesting resources.

And this is where section.io come in. Their BigCommerce TTFB solution can massively reduce the TTFB by almost instantly returning a cached version of the top part of the HTML from their CDN. A point in the HTML is chosen as a marker for where the caching should end. Anything above it is returned very quickly to the browser, while the rest is returned at the usual speed. This means the browser can process what is in that cached part very early on.

To take advantage of this you want to include as many of the required URL resources in that top section as possible, so the browser starts loading them early: any CSS, font and script files. And that’s what I did – Chrome’s Network report with no throttling is a great way to show what it does:

You can see that it has loaded all the CSS and JS files well before the HTML has been completed. If I show the same thing on a store without section.io you see how these critical resources don’t even start to load until after the HTML is completed.

I’ve a more detailed article on how to best set up section.io for BigCommerce.

If it is not possible to move the resource into the cached area, you can add preloads…

Preloading Resources

The preload tag tells the browser to load a resource now, in anticipation that it will be wanted later. Its main value is when the browser will not know at the time of the preload that the resource will be needed. This means the resource is available earlier and the page loads quicker. It is especially valuable if you have the above TTFB solution, where you can put preloads in the cached section and get resources loading very early on.

Here are some of the reasons to add preloads (example tags are sketched after the list):

  • Preload the main image for a page to reduce LCP
  • Preload web fonts to avoid flicker when they are shown
  • Preload important scripts placed in the footer (especially if they are blocking)
  • Preload scripts that affect the display
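A sketch of what those tags might look like (the file paths are placeholders; note that font preloads always need crossorigin, even for same-origin fonts):

<link rel="preload" href="/images/hero-product.jpg" as="image">
<link rel="preload" href="/assets/fonts/brand.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/assets/js/theme-bundle.main.js" as="script">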

I also fixed an issue where important images were being lazy-loaded. Read more about that in my BigCommerce LCP quick fixes article.

The result of all this was a far earlier start to loading the essential resources. CSS, fonts and the main image were loaded from the TTFB moment, which means we have now reduced LCP (green dotted line) by over 600ms. In fact LCP happens almost as soon as the HTML is completed. The only delay is the CPU having to put the page together.

Stop Render Blocking

A script tag without async or defer applied to it is a render blocking resource. BigCommerce typically includes a few in the footer:

<script src="https://cdn11.bigcommerce.com/***/theme-bundle.main.js"> </script>
<script type="text/javascript" src="https://cdn11.bigcommerce.com/shared/js/csrf-protection-header-***.js"></script>

A render blocking script means it has to be loaded and executed before the browser can continue to put together the rest of the HTML for the page. Render blocking can cause big delays in the processing of subsequent requests, which means it can hurt the most when in the head section.

I managed to remove all render blocking scripts.

  • preloaded and async deferred the fonts script
  • inlined the fonts CSS
  • inlined critical CSS and applied a trick to make the main CSS preload and non-blocking (sketched after this list)
  • preloaded and added the main script via JavaScript so it did not block
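The CSS trick is a variation of the widely used preload-then-apply pattern; here is a minimal sketch with a placeholder stylesheet URL:

<link rel="preload" href="/assets/css/theme.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/assets/css/theme.css"></noscript>

The onload handler turns the preloaded file into a live stylesheet as soon as it arrives, and the noscript fallback covers visitors with JavaScript disabled.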

The inlined critical CSS is only applied to the category pages at the moment. This actually introduced a new problem. Rendering was done so early that on slow connections the fonts had not loaded yet, and that caused an annoying flicker. To solve that I have the page hidden until the fonts have loaded. I need to work on a more robust way to do this as a failure could mean the page never shows!
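A minimal sketch of the hide-until-fonts-ready idea, with a timeout as a safety net so the page can never stay hidden for long (the class name is made up):

<style>.fonts-loading body { visibility: hidden; }</style>
<script>
  // Hide the page while the fonts load, but never for more than two seconds.
  document.documentElement.classList.add('fonts-loading');
  function reveal() {
    document.documentElement.classList.remove('fonts-loading');
  }
  if (document.fonts && document.fonts.ready) {
    // In practice you may need to kick off the font loads first (e.g. with
    // document.fonts.load) so this promise does not resolve too early.
    document.fonts.ready.then(reveal, reveal);
  } else {
    reveal();
  }
  setTimeout(reveal, 2000);
</script>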

If you apply async or defer to a script it is no longer render blocking. It loads in the background. With async it executes as soon as it has loaded. With defer it executes just before the DCL event.

DOM Content Loaded (DCL)

The DOM (Document Object Model) is a computer’s representation of the HTML for a page. It uses a tree-like structure that matches how HTML elements contain each other. The DOM can be manipulated by scripts to change the page.

Once the browser has processed all the HTML and render blocking resources to create the DOM it is ready to fire the DOMContentLoaded event. Just before it does that it will execute any deferred scripts that have already been loaded.

This is an important moment as it means scripts can now safely play with the DOM and alter the page. For example a review widget may add review stars and reviews to certain parts of a page.

Scripts that alter the page typically wait on the DOMContentLoaded event before they do their thing. So if the event is delayed, these widgets are slow to show up on the page.

We want to make DCL happen as quickly as possible. Avoid slow render blocking files. Use preconnect and preload to get resources before they are asked for and needed. Make scripts async so that they don’t block the event.

Scripts that only use defer still delay DCL. This is because they have to be loaded so that they can execute just before DCL. If it’s safe to do so you can also add async to the tag. This means it will not block if the script has not been loaded in time. async behaviour trumps defer if a browser supports both.
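For example, a non-critical script that is safe to run whenever it arrives can carry both attributes; browsers that understand async will not hold up DCL for it, and older ones fall back to defer (the file name is a placeholder):

<script src="/assets/js/non-critical-widget.js" async defer></script>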

Adding async to a script can sometimes be tricky. It loads and runs in the background, so nothing can be dependent on it. For example, users of jQuery have to have it loaded and run before their code runs. And unfortunately I often see a BigCommerce site that is render blocking with multiple copies of jQuery. A tricky one to solve.

Many tag providers use a special code pattern to let them async the main script while letting people queue up their commands for when it is loaded. It’s so frequently used that our TagRocket app has a special function to do it for us. You can even specify a function to run when the script has loaded.
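A typical version of that pattern, sketched here with made-up names (myTag, myTagQueue and the script URL are illustrations, not the real Tag Rocket API), creates a stub that queues commands until the real library loads and processes them:

<script>
  // Stub: queue up commands until the real library arrives and takes over.
  window.myTagQueue = window.myTagQueue || [];
  window.myTag = window.myTag || function () {
    window.myTagQueue.push(arguments);
  };
  myTag('pageView', { page: location.pathname });
</script>
<script src="https://tags.example.com/tag.js" async></script>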

If you do add defer or async to a script tag, make sure you test things to make sure it still works.

Our tagging system gets going at DCL. All the important stuff should have been done, and now it’s time for the background tasks like tracking.

The next milestone is when the page is classed as fully loaded…

OnLoad

And we get to the final event of a page loading. The OnLoad event.

This basically happens when the browser has run out of things to do. All known resources are loaded and run. This is when the spinny load thing stops spinning and the user can assume the page is ready.

Often this is delayed by the lower priority tasks like setting up pixels and sending tracking data to the various channels. preconnect and preload can help, but don’t let these low priority resources get in the way of important stuff.

Another possible delay to onload is if you have a lot of images. Which can be solved by…

Lazy Loading Images

Sometimes a page can be long and contain lots of images. By default they all have to be loaded before we cross the finish line and fire OnLoad.

Modern browsers now natively support lazy loading via the loading attribute. The browser can decide which of the lazy-load images it needs now, and leave the rest for later.

<img loading="lazy" src="image.jpg"/>

The BigCommerce Cornerstone theme currently uses a script (lazysizes) to support lazy loading, as the built-in method is new and not yet supported on Safari.

I created a highly modified version of BigCommerce’s responsive image code (responsive-img.html):

  • Switch to native lazy loading if the browser supports it (see the sketch after this list)
  • Only load lazysizes if it is required
  • Support for LQIP images even when using native lazy loading
  • More control over how responsive images can be added
  • Support for the upcoming importance attribute
  • Backwards compatible
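The feature detection behind the first two points can be sketched like this. The lazysizes path and the data-src markup are assumptions rather than the actual template code, and the script needs to run after the images are in the DOM (for example near the end of the body):

<script>
  if ('loading' in HTMLImageElement.prototype) {
    // Native lazy loading: mark the images and copy the real URL into src
    // so the browser takes over. Assumes markup like <img class="lazyload" data-src="...">.
    document.querySelectorAll('img[data-src]').forEach(function (img) {
      img.setAttribute('loading', 'lazy');
      img.src = img.getAttribute('data-src');
    });
  } else {
    // No native support: load lazysizes, which looks for class="lazyload" and data-src.
    var script = document.createElement('script');
    script.src = '/assets/js/lazysizes.min.js';
    script.async = true;
    document.head.appendChild(script);
  }
</script>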

I need to review all images using the code to make sure they are using optimal sizes and are not lazy-loading prominent images.

Note that using lazysizes-based lazy-loading solutions on images that are at the top of the page can negatively affect your LCP. From what I’ve seen, native lazy-loading does not have that issue, especially if you preload the important image as well. So the switch to native where possible means most users will not have an LCP problem if they accidentally lazy-load an important image.

Conclusion

Here’s a waterfall of the full load process on my machine with no throttling. The result in this case (fast connection) is that the critical path to LCP is BigCommerce’s server response time for the HTML, plus a bit more for the CPU to build the page. DCL and Load are delayed by running the theme’s main JavaScript file. And we are fully loaded within one and a half seconds.

If I throttle my connection to emulate a fast 3G connection we still get the benefits of all the early loading, but we see that the fonts are loaded after the page is drawn, introducing the issue of font flicker. And now the OnLoad event is delayed by loading the main CSS file and images.

For Largest Contentful Paint, it went from 3.9 to 2.3 seconds on mobile and 1.3 to 0.6 on desktop. That should get everyone in Google’s good books.

And the user gets a very smooth transition without any layout shifts or font jitters. The page literally just shows up with everything in place, even on slower connections.

You can play with the test page but remember it is a test site and I will be actively working on other things that will affect the PageSpeed Insights scores, like adding lots of scripts for our Tag Rocket app.

Hopefully some of these ideas make it into Cornerstone so that BigCommerce can potentially become the fastest ecommerce platform in the world.