If you’ve plugged your URL into Google’s PageSpeed Insights within the last month, you’ll have noticed that it looks a little different. Where you used to receive a single optimization score, results are now broken down by platform and split into two scores, “Page Speed” and “Optimization.”
The changes were made as a result of the new Speed Update launched July 9, 2018. Now, instead of relying on lab data, Google uses field data to measure site speed. By extracting information from the Chrome User Experience Report (CrUX) database, Google can discern how quickly your site loads for real-world users.
That means that even if your website is lightning-fast on your end, visitors with older smartphones might experience delays — which could impact your speed score, and possibly your website’s ranking. If you haven’t already, it’s time to double down on speed optimization.
I am going to break down Google’s PageSpeed Insights rules, list the best-practice advice for each, and then dive into some advanced steps you can take to optimize your site speed even more.
1. Avoid landing page redirects
Why it matters. Redirects delay page rendering and slow down your mobile site experience. Each redirect adds an extra Hypertext Transfer Protocol (HTTP) request-response roundtrip and sometimes adds numerous additional roundtrips to also perform the Domain Name System (DNS) lookup, Transmission Control Protocol (TCP) handshake and Transport Layer Security (TLS) negotiation.
What Google recommends. Create a responsive website with no more than one redirect from a given URL to the final landing page.
Advanced recommendations. Try to avoid redirects altogether. However, if you need to use redirects, choose the type of redirect based on your need:
- 301 versus 302 redirects. Use permanent redirects (301) when you delete old content and redirect to new content, or when you don’t have an alternate page to redirect users to. Use temporary redirects (302) when making short-term changes, such as limited time offers, or when redirecting users to device-specific URLs. Don’t worry; you won’t lose link equity either way!
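At the protocol level, the difference between the two is just the status line of the response; a sketch (the URLs are placeholders):

```http
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-offer

HTTP/1.1 302 Found
Location: https://m.example.com/current-offer
```

The first response tells browsers and search engines that the old URL has been permanently replaced; the second signals a temporary detour, such as a limited-time offer or a device-specific page.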
2. Enable compression
Why it matters. Reducing the size of your content shortens the time it takes to download the resource, reduces data usage for the client and improves your pages’ time to render.
What Google recommends. Gzip all compressible content. You can find sample configuration files for most servers through the HTML5 Boilerplate project.
- Prioritize removing unnecessary data. Compression is great, but the best-optimized resource is a resource not sent. Review your site resources periodically and eliminate unnecessary data before compression to guarantee the best results.
- Consider alternatives to Gzip encoding. If you want to use a tool other than Gzip, Brotli is a lossless compression algorithm that combines a modern variant of the LZ77 algorithm, Huffman coding and second-order context modeling. It’s supported by all modern browsers and has a compression ratio comparable to the best general-purpose compression methods currently available. Brotli compresses slowly at its highest settings but decompresses quickly, so you should pre-compress static assets with Brotli at the highest level (keeping Gzip versions as a fallback) and compress dynamic HTML with Brotli at level 1–4.
- Use different compression techniques for different resources. Compression can be applied to HTML code, as well as various digital assets that your page requires, but you’ll need to apply different techniques and algorithms to your web fonts, images, CSS and so on to achieve the best result. For example, if you’re using HTTP/2, then using HPACK compression for HTTP response headers will reduce unnecessary overhead.
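Content encoding is negotiated per request: the browser lists the algorithms it accepts, and the server labels the response with the one it used. A sketch of the exchange (the host and path are placeholders):

```http
GET /css/main.css HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip, deflate, br

HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: br
Vary: Accept-Encoding
```

The `Vary: Accept-Encoding` header tells intermediate caches to store separate copies per encoding, so clients that can’t decompress Brotli never receive it.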
3. Improve server response time
Why it matters. Fast server response times are a necessity; 53 percent of mobile visitors will abandon a page that doesn’t load within three seconds.
High-quality website development is essential if you want to avoid central processing unit (CPU) starvation, slow application logic, slow database queries, slow routing, slow frameworks and slow libraries.
What Google recommends. Server response time should always be below 200ms.
- Measure server response time and collect real user measurements (RUM). Use a tool like WebPageTest.org, Pingdom, GTmetrix or Chrome DevTools to pinpoint existing performance issues and figure out what’s slowing down your content delivery process. Remember, even if your tests show a fast server response time, real users may still experience your site as slow; to improve their experience, aim for:
- A fast first meaningful paint.
- A low SpeedIndex value.
- A short time to interactive (TTI).
- Optimize for user experience. While configuring your server:
- Use HTTP/2 for a performance boost (and make sure your CDN supports HTTP/2 as well).
- Enable online certificate status protocol (OCSP) stapling on your server to speed up TLS handshakes.
- Support both IPv6 and IPv4. IPv6’s neighbor discovery (NDP) and route optimization can make websites 10–15 percent faster.
- Add resource hints to warm up the connection and speed up delivery: dns-prefetch (for faster DNS lookups), preconnect, prefetch and preload.
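All four hint types are plain `<link>` elements placed in the document head; a sketch (hostnames and paths are placeholders):

```html
<head>
  <!-- Resolve a third-party hostname early -->
  <link rel="dns-prefetch" href="//fonts.example.com">
  <!-- Open a full connection (DNS + TCP + TLS) ahead of time -->
  <link rel="preconnect" href="https://cdn.example.com" crossorigin>
  <!-- Fetch a resource this page will definitely need, at high priority -->
  <link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>
  <!-- Fetch a resource the next navigation is likely to need, at low priority -->
  <link rel="prefetch" href="/js/checkout.js">
</head>
```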
4. Leverage browser caching
Why it matters. When fetching resources over the network, every additional roundtrip between the client and server adds delay and increases data costs for your visitors. You can mitigate this slow and expensive process by implementing a caching policy that helps the client figure out if and when it can reuse responses it has received in the past.
What Google recommends. Explicit caching policies that answer:
- Whether a resource can be cached.
- Who can cache it.
- How long it will be cached.
- How it can be efficiently revalidated (if applicable) when the caching policy expires.
Google recommends a minimum cache time of one week and up to one year for static assets.
- Use Cache-Control to eliminate network latency and avoid data charges. Cache-Control directives let you control how (e.g., “no-cache” and “no-store”) and for how long (e.g., “max-age,” “max-stale” and “min-fresh”) the browser can cache a response without needing to communicate with the server.
- Use ETags to enable efficient revalidation. Entity tag (ETag) HTTP headers communicate a validation token that prevents data from being transferred if a resource hasn’t changed since the last time it was requested. This improves the efficiency of resource update checks.
- Consult Google’s recommendations for optimal Cache-Control policy. Google has created a checklist and a flowchart that will help you cache as many responses as possible for the longest possible period and provide validation tokens for each response:
The rule of thumb is that mutable (i.e., likely to change) resources should be cached for a very short time, whereas immutable (i.e., static) resources should be cached indefinitely to avoid revalidation.
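Putting the caching advice together, a first response can carry a long-lived Cache-Control header plus an ETag, and a later conditional request can revalidate the cached copy without re-downloading it; a sketch (the path and token value are illustrative):

```http
HTTP/1.1 200 OK
Cache-Control: public, max-age=31536000
ETag: "5d8c72a5edda8d6a"

GET /img/logo.png HTTP/1.1
If-None-Match: "5d8c72a5edda8d6a"

HTTP/1.1 304 Not Modified
```

The empty-bodied `304 Not Modified` costs one roundtrip but transfers no data, which is exactly the efficient revalidation the checklist asks for.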
5. Minify resources
Why it matters. Minification eliminates redundant data from the resources delivered to your visitors, and it can have a drastic impact on your overall site speed and performance.
What Google recommends. Remove redundant data from your web assets (e.g., comments and extra whitespace in HTML code, repeated styles in CSS or unnecessary image metadata).
6. Optimize images
Why it matters. Images account for an average of 60 percent of your web page size, and large images can slow your site to a crawl. Optimizing images helps by reducing file size without significantly impacting visual quality.
What Google recommends. Make sure your website and images are responsive. Use relative sizes for images, use the picture element when you want to specify different images depending on device characteristics, and use a srcset attribute and the x descriptor in the img element to inform browsers when to use specific images.
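In markup, those two mechanisms look roughly like this (file names are placeholders):

```html
<!-- srcset with x descriptors: the browser picks the right pixel density -->
<img src="photo-1x.jpg"
     srcset="photo-1x.jpg 1x, photo-2x.jpg 2x"
     alt="Product photo">

<!-- picture: serve a different image below a given viewport width -->
<picture>
  <source media="(max-width: 600px)" srcset="hero-narrow.jpg">
  <img src="hero-wide.jpg" alt="Hero image">
</picture>
```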
Advanced recommendations. Follow this checklist of the most common optimization techniques:
- Eliminate unnecessary image resources.
- Leverage CSS3 to replace images.
- Use web fonts instead of encoding text in images.
- Use vector formats where possible.
- Minify and compress scalable vector graphics (SVG) assets to reduce their size.
- Pick the best raster format. Start by selecting the right universal format (GIF, PNG or JPEG), but also consider serving WebP and JPEG extended range (JPEG XR) assets to modern clients that support them.
- Experiment with optimal quality settings. Remember that there is no single best format or “quality setting” for all images: each combination of particular compressor and image contents produces a unique output.
- Resize on the server and serve images scaled to their display size.
- Remove metadata.
- Enhance img tags with a srcset parameter for high dots per inch (DPI) devices.
- Use the picture element to specify different images depending on device characteristics, like device size, device resolution, orientation and more.
- Use image spriting techniques carefully. With HTTP/2, it may be best to load individual images.
- Consider lazy loading for non-critical images.
- Cache your image assets.
- Automate your image optimization process.
When it comes to image optimization, there’s no single “best” way to do it. Many techniques can reduce the size of an image, but finding the optimal settings for your images will require careful consideration of format capabilities, the content of encoded data, quality, pixel dimensions and more. For more tips, visit Google’s guide to Optimizing Images.
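As a concrete example of the lazy-loading item in the checklist above, modern browsers support a native `loading` attribute that defers the request until the image approaches the viewport; older browsers simply ignore it and load the image normally (the file name is a placeholder):

```html
<img src="gallery-photo.jpg" loading="lazy"
     width="800" height="450" alt="Gallery photo">
```

Setting explicit width and height also reserves space in the layout, so the late-arriving image doesn’t shift the content around it.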
7. Optimize CSS delivery
Why it matters. Browsers typically follow these five steps when rendering a page:
- Process HTML markup and build the document object model (DOM) tree.
- Process CSS markup and build the CSS object model (CSSOM) tree.
- Combine the DOM and CSSOM into a render tree.
- Run layout on the render tree to compute the geometry of each node.
- Paint the individual nodes to the screen.
In other words, a browser needs to process CSS before it can render the page. When your CSS is bloated with render-blocking external stylesheets, this process often requires multiple roundtrips, which delay the time to first render.
What Google recommends. Inlining small CSS directly into the HTML document to eliminate small external CSS resources.
- Avoid inlining large CSS files. While inlining small CSS can speed up the time it takes for a browser to render the page, inlining large CSS files will increase the size of your above-the-fold CSS and will actually slow down render time.
- Avoid inlining CSS attributes. Similarly, inlining CSS attributes on HTML elements often results in unnecessary code duplication, and it’s blocked by default with a Content Security Policy.
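A minimal sketch of the recommended pattern: inline only the small set of rules needed for the first paint, and keep everything else in an external stylesheet (the class name and path are placeholders):

```html
<head>
  <!-- Small critical CSS inlined: no extra request blocks the first render -->
  <style>
    .hero { margin: 0 auto; max-width: 40em; font-family: sans-serif; }
  </style>
  <!-- The full stylesheet for everything below the fold -->
  <link rel="stylesheet" href="/css/site.css">
</head>
```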
8. Prioritize visible content
Why it matters. If your above-the-fold content exceeds the initial congestion window (typically 14.6kB compressed), loading it will require multiple roundtrips between your server and your visitor’s browser. This can cause high latencies and significant delays to page loading, especially for mobile users.
What Google recommends. Reducing the size of above-the-fold content to no more than 14kB (compressed).
- Limit the size of the data required to render above-the-fold content. If you’ve been following along, you should already be using resource minification, image optimization, compression and all the other tips and tricks to reduce the size of your above-the-fold content.
- Organize your HTML markup to render above-the-fold content immediately. Changing your HTML markup structure can greatly expedite the rate at which your above-the-fold content loads and renders, but what you change will vary from page to page. For example, you may need to split your CSS into two parts: an inline part responsible for styling the above-the-fold portion of the content and a deferred stylesheet for the remaining part. Or you may need to change the order of what loads on your page first (e.g., main content before widgets).
- Inline critical scripts and defer non-critical scripts. Scripts that are necessary for rendering page content should be inlined to avoid extra network requests. These should be as small as possible in order to execute quickly and deliver good performance. Non-critical scripts should be made asynchronous and deferred until after the first render. Just remember that asynchronous scripts are not guaranteed to execute in a specified order.
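The three strategies above map onto markup as follows (paths are placeholders); note that `defer` preserves document order, while `async` scripts run in arrival order:

```html
<!-- Critical for rendering: inlined, no extra network request -->
<script>
  document.documentElement.className = 'js';
</script>

<!-- Non-critical, order matters: downloads in parallel, runs after parsing -->
<script src="/js/ui-widgets.js" defer></script>

<!-- Non-critical, order-independent: runs as soon as it arrives -->
<script src="/js/analytics.js" async></script>
```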
Conclusion: Testing the results of the speed update
To find out what impact the Speed Update actually has on search engine results page (SERP) positions, the SEO PowerSuite (my company) team and I conducted two experiments: one before and one immediately after Google rolled out the update.
We discovered even before the update that the correlation between a mobile site’s position in the SERPs and its average optimization score was already extremely high (0.97), but that a site’s first contentful paint (FCP) and DOM content loaded (DCL) metrics (now displayed on PageSpeed Insights beneath your Page Speed score) had little to no bearing on its position.
We didn’t notice any significant changes one week after the update, which is understandable: it takes time for the update to take full effect. The correlation between optimization score and position in mobile SERPs remains high, while the correlation between FCP/DCL and position remains low.
Within the past three months, the optimization scores of sites ranking within the top 30 positions of mobile SERPs have increased by an average of 0.83 points, which we believe reflects an industry-wide rise in the quality of websites.
What this tells us is that the standards for what constitutes a fast, optimized site are increasing — and you can’t afford to become complacent. Improving speed, like SEO as a whole, is a process, and if you don’t keep tweaking and improving, you risk being left behind.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.