How to Score 100 on Google PageSpeed Insights: More Lessons Learned

Last year I took a very deep dive into improving page speed for a WordPress site. I wrote a massive (20,000+ word) page speed optimization guide to share what I learned, and if you haven’t already read it, I recommend doing so before reading this article (which is also huge, at 10,000+ words).

My original goal was to make that post an “everything you need to know” resource, and I think it pretty well accomplishes that. Since then, however, I have learned some new tips and tricks, changed my theme, switched all my sites to new hosting providers, switched my primary optimization plugin, and achieved perfect 100 scores for 14 of my sites on Google’s PageSpeed test. I have also focused on improving results for tests that aren’t explicitly covered by Google’s tool but do appear on other popular testing sites. Here is a summary of scores for this site (testing a special page with no ads).

So, prepare yourself for another long read as I share what I have learned since then.

Stages of Page Speed Optimization

This is not scientific, but it seems to me that people sort of progress in stages in their site performance optimization efforts, somewhat as follows:

  1. Worrying only about Google’s test and picking the low-hanging fruit (compression, browser caching, page caching).
  2. Still focusing primarily on Google’s test, but using extra tools and techniques to squeeze out more performance.
  3. Starting to pay attention to multiple testing sites, but only for the “big picture.”
  4. Starting to notice the more detailed warnings spelled out by the various testing sites.

Finally, another key change point is whether you focus your optimization efforts on all pages or just the home page. That shift could happen at any stage, but a lot of people (myself included) tend to focus on just one page.

[Image: Google PageSpeed Insights - mobile test results]

New Theme

When I first tackled page speed optimization, I was using a modified off-the-shelf theme.

The thing about using an off-the-shelf theme, whether paid or free, is that it is designed for flexibility and wide applicability. It will have features you probably won’t need for your site, but that others might. I already discussed how you can use enqueue and dequeue functions to rid your theme of unnecessary scripts and improve performance, but depending on the theme, it may also be making database queries on every page load just to check whether features are enabled. These aren’t necessarily heavy queries, but run multiple times per page load, they add up.
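For example, a dequeue might look like this; the 'unused-feature' handle is hypothetical (a tool like Query Monitor, mentioned later, will show you the real handles your site registers):

    // Remove a script and stylesheet that the theme registers but this site never uses.
    // 'unused-feature' is a placeholder handle; substitute the real handle names.
    add_action( 'wp_enqueue_scripts', function () {
        wp_dequeue_script( 'unused-feature' );
        wp_dequeue_style( 'unused-feature' );
    }, 100 );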

My sites have fairly low demands for theme functionality so I started thinking that the cleanest and fastest solution would be a custom theme. But, where to start? I did a Google search for fast themes and found the useful article, “85 Free, Alluring and Lightweight WordPress Themes.” After perusing those themes, I found myself drawn to the First theme, which is based on the Underscores developer theme.

Customizing the First theme to create my own custom theme was actually quite a bit of work so I don’t necessarily recommend it. If you do create your own, however, another (admittedly small) speed improvement will result from being able to hard-code your preferences rather than relying on multiple theme option checks to see if a feature is enabled or not.

Regardless of your theme choice, one thing worth noting, which I neglected to mention before, is the difference between presentation and functionality. I discussed using your child theme’s functions.php file for your various optimization code. That works, but best practice is to use a theme only to control presentation (the look and feel) of a site. For the kind of back-end changes that improve performance without altering the site’s appearance, you really should use a custom plugin. Doing so is not especially difficult, but it is beyond the scope of this article. If you don’t want to be bothered and you know you don’t plan to change your theme (which is true for me and my custom theme), no problem. But if you ever do change to a different theme, you will have to port all those page speed modifications to it, and by that time you may have forgotten the how, what, and why of those changes (hint: always generously comment your coding work).
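If you do decide to go the custom-plugin route, the skeleton is tiny. Something like the following (the file and plugin names are just examples), dropped into wp-content/plugins/ and activated, is enough to start moving optimization code out of functions.php:

    <?php
    /*
    Plugin Name: Site Performance Tweaks
    Description: Back-end performance customizations kept separate from the theme.
    Version: 1.0
    */

    // Example: dequeue unneeded scripts here instead of in the theme's functions.php.
    add_action( 'wp_enqueue_scripts', function () {
        // wp_dequeue_script( 'some-unused-handle' ); // placeholder handle
    }, 100 );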

[Image: Google PageSpeed Insights - desktop test results]

New Web Hosting

I have previously demonstrated that it is possible to achieve a perfect page speed score with a basic shared server running regular Apache. I stand by that, but with the caveat that the quality of your shared server and the load it faces from your hosting neighbors will dictate whether you can pass the all-important server response speed test.

For a variety of reasons, I recently decided to choose new hosting providers for all my websites, using hosting plans customized for WordPress sites. A lot of these provide server-based optimizations, including caching, that result in faster sites and, usually, better server response times (or the First Byte Time test at WebPageTest). Still, I notice that the response time limit is pretty strict (I believe it is 0.2s) for Google’s testing and I often have to re-test my sites to get it to pass. I find it especially likely to produce a warning the first time I run the test, even when I know a cached copy of the page exists. I don’t really know what is happening here but if anyone else has any insight, please share.

[Image: Pingdom page speed test results]

Caching

My caching setup has completely changed since my previous article. At that time I had no server caching and I was using the Falcon cache option that was available with the popular Wordfence security plugin. Falcon was an underappreciated caching solution—simple and effective, but the Wordfence folks decided to kill it so I had to find a new plugin. I have previously considered W3 Total Cache, WP Fastest Cache, Gator Cache, and Cachify but ultimately I settled on the popular WP Super Cache, which I think provides a nice balance of features, performance and usability.

For a while I was using WP Super Cache on all my sites, but when I recently migrated some of my sites to Siteground I abandoned it in favor of their proprietary SG SuperCacher, which so far has worked pretty well.

Finally, I should say that all of my new webhosts (Siteground, TMDHosting, 1&1) have server caching solutions configured (typically Memcached, OpCache and/or nginx caching). As a general rule, you will get more performance improvement with server-based caching than with page caching via a plugin, though, naturally, it is best to use both.

Optimization Plugins and Tools

Autoptimize (AO) vs Speed Booster Pack

In my previous article I mentioned that there are multiple routes to effective performance improvement and that just because one or more plugins work effectively for someone doesn’t mean they will for you as well. You need to experiment. In the past I used and recommended Speed Booster Pack. I still think it is a great option, and it seems the developer has been more active in updating it recently.

One of the reasons I abandoned Speed Booster Pack is simply because I decided to add a lot of its functionality to my custom theme. But, as I warned earlier, that is generally not a good idea (use themes for presentation and plugins for functionality). The other reason I stopped using it is that I decided to give the popular Autoptimize plugin another try.

Autoptimize and Speed Booster Pack aren’t really interchangeable as they don’t do the same things, though there is some overlap in features. Generally speaking, AO is more limited in what it does, but IMHO it does those things better.

I doubt you will go wrong choosing either Speed Booster Pack or Autoptimize, and using them together probably makes a lot of sense, or at least is worth trying. Though I haven’t tried the combination myself, I would suggest using Autoptimize for combining and minifying CSS, JS and HTML, and using Speed Booster Pack for lazy loading and the other features Autoptimize doesn’t include.

[Image: PageSpeedGrader test results]

Autoptimize

So, now I am using Autoptimize and I like it. If you read my previous article, you know that I couldn’t quite get to a 100 score using it, mostly because the consolidated CSS file it creates causes render blocking warnings. However, Autoptimize has steadily improved (the author is very proactive about its development) and there is now a solution to that issue (keep reading).

If you are not interested in AO, skip this entire section, but if you are, I will share my experiences and tips for using it effectively.

First off, what does Autoptimize actually do? From the plugin FAQ page:

It concatenates all scripts and styles, minifies and compresses them, adds expires headers, caches them, and moves styles to the page head, and scripts (optionally) to the footer. It also minifies the HTML code itself, making your page really lightweight.

and

Autoptimize is not a simple “fix my Pagespeed problems” plugin; it “only” aggregates & minifies (local) JS & CSS and allows for some nice extra’s as removing Google Fonts and deferring the loading of the CSS. As such Autoptimize will allow you to improve your performance (load time measured in seconds) and will probably also help you tackle some specific Pagespeed warnings. If you want to improve further, you will probably also have to look into e.g. page caching, image optimization and your webserver configuration, which will improve real performance.

If you do decide to try Autoptimize, I recommend first reading “So how does Autoptimize work anyway?,” the FAQ and “How to make Autoptimize (even) faster.” When you are ready to configure AO, consider the following things.

[Image: GTmetrix test results]

Minification

In my previous article I wrote that the HTML minify feature in Autoptimize didn’t prevent a warning from Google’s test but that has now changed. It is still not perfect as I get the tiniest warning (99/100 for that specific sub-test) on GTmetrix, though that doesn’t prevent a perfect overall page speed score.

Note: For what it is worth, according to Frank Goossen, Autoptimize, WP Minify, BWP Minify, and W3 Total Cache all use the Minify code from Steve Clay (mrclay), which combines, minifies, and serves CSS or JavaScript files.

Note that some plugins add JavaScript and/or CSS directly to your page rather than referencing an external file. Typically, these same plugins don’t bother minifying the code they add. This combination can cause a problem if you choose not to aggregate inline JS or inline CSS in the Autoptimize options, as that code will just be left as originally added. If there is enough of it, you could find yourself with a PageSpeed minify HTML warning.

Aggregate Inline CSS/JS?

As I just mentioned, some plugins inject CSS and/or JavaScript directly into your page’s HTML. If you enable the aggregate inline CSS and JavaScript options in AO, you will most likely pass all minify page speed tests. But, you also expose yourself to a significant risk, which is why these options are not enabled by default. Again, quoting from the FAQ:

Before Autoptimize 2.0.0, inline code was always optimized with all CSS pushed in the head-section and all JS at the end with a defer-flag. This often caused 2 problems; the priority of inline CSS got lost and inline JS could contain page- or request-specific code which broke Autoptimize’s caching mechanism leading to too many cached files and the minification running over and over. This is why as from AO 2.0 by default inline code is not optimized (except for those upgrading from previous versions). Additionally, to avoid inline JS breaking because jQuery is not available, js/jquery/jquery.js is excluded by default.

As with everything, the answer is not to “set it and forget it” but rather to test different configurations. It may be that your particular mix of theme and plugins doesn’t add much (or any) inline CSS and JS, so you neither need to aggregate nor need to worry about the default jQuery exclusion. Or perhaps there is inline code, but aggregating it doesn’t affect your site’s functionality. Or perhaps you can get away with aggregating one and not the other (CSS/JS). This is where an XAMPP development site setup can really be helpful.

[Image: PageSpeedtest test results]

Cache Size Warning

Autoptimize caches the aggregated files it creates. You can manually purge these files but there is no automated system for doing so. If you have configured the plugin properly, pages on your site should always end up using the same AO cached files and thus the overall size of the cache should not get out of hand.

On the other hand, if you don’t get the configuration right, it is possible for AO to end up creating new files for every page or even for the same page each time it is viewed. How can that be, you might wonder? Well, it usually comes down to the previous discussion of inline CSS and JS. As I mentioned, some plugins (and, possibly even some themes) inject CSS or JS to achieve some functionality. Often the reason they do this (rather than just including an external file) is because something about that code needs to be customized to the circumstances encountered.

For example, I use the WordPress Popular Posts plugin, which offers a data sampling feature that allows the plugin to track only a subset of your traffic and report on the tendencies detected in that sample set. This is very useful for high traffic sites to reduce server load that the plugin might otherwise cause. The problem with this, in terms of Autoptimize, is that the plugin uses a random number to customize its sampling code and this leads to pages having slight JS code differences each time they are loaded. Since Autoptimize names the files it creates based on a calculation of the code in the page (JS code for JS files, CSS code for CSS files, not the entire HTML), even a single letter’s difference in that code will lead to a new file being created. You can imagine how very quickly the number of files created (and, thus, in the AO cache) will balloon.

Once the AO cache size surpasses a certain limit (I think it is 500MB) the normally green circle in your WordPress admin header bar will start to flash red. If you ever get to this stage, you almost certainly need to do one of two things. The simplest is to disable the aggregate inline option (either for CSS, JS or both, but JS is the most likely). The better solution is to find the offending code and exclude it from Autoptimize. How can you do this?

Plugin author Frank Goossen offers a method, but essentially you want to compare the aggregated JS that Autoptimize generates for two different posts:

  1. View the source code for two different posts and search each for .js files. You should see one Autoptimize-generated file (something like autoptimize_0b24c156e7b70633ae3a0ac2049078a0.js) on each page. If the file name is the same on both pages, that is a very good sign. If not, there are probably some differences that need to be investigated.
  2. Open each .js file in an editor that has a compare function (I use Notepad++). Because the files are long single lines, first do a find and replace on each semicolon (;), replacing it with a semicolon followed by a line feed (;\n); you could use the } symbol instead of a semicolon. How to do this depends on your text editor; in Notepad++, use the Replace function and make sure the Extended (\n, \r, \t, \0, \x…) option is selected in the Search Mode box.
  3. Run your compare function (found under the Plugins menu in Notepad++) and scroll through the comparison until you find the differences. Hopefully, something obvious will jump out at you as causing the difference (a post_id, a token, etc.).
  4. Find some string in that code that identifies the offending script (unique to it but common to both files) and add it to the exclusion setting in AO. Save, and repeat the whole process until the AO external JS file is the same for both pages.
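For what it’s worth, if you would rather manage such exclusions in code than in the settings field, Autoptimize also exposes filters for this. A sketch (double-check the filter name against your version of the plugin; the wpp.min.js string is just an example):

    // Append a script to Autoptimize's JS exclusion list programmatically.
    add_filter( 'autoptimize_filter_js_exclude', function ( $exclude ) {
        return $exclude . ', wpp.min.js'; // example string matched against script URLs
    } );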

[Image: Varvy page speed test results]

jQuery Dependencies

Autoptimize excludes the jQuery script by default, which means you are going to get a render blocking warning if you use the default AO setup. Why does it do this?

Again, we have inline code to thank. When plugins and themes correctly enqueue scripts that rely on jQuery (very many do), there is a way to specify that dependency and AO respects this in its aggregation algorithm. So, if your theme and all your plugins use only external scripts, AO will aggregate these in a way that jQuery comes first and thus your site will probably function without any problems.
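For reference, declaring that dependency when enqueuing a script looks something like this (the handle and file path are illustrative):

    // Enqueue a script and declare its dependency on jQuery so WordPress (and thus
    // Autoptimize's aggregation) loads jQuery before it.
    add_action( 'wp_enqueue_scripts', function () {
        wp_enqueue_script(
            'my-feature',                                         // illustrative handle
            get_stylesheet_directory_uri() . '/js/my-feature.js', // illustrative path
            array( 'jquery' ),                                    // the dependency declaration
            '1.0',
            true                                                  // load in the footer
        );
    } );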

But, what if some JS code is inserted manually into your page by a plugin and that code also relies on jQuery? If you have now deferred the loading of all external JS files, including jQuery files, when your page loads, it is going to try to run the inline JS code before it has a chance to load jQuery. Whatever functionality the inline code was trying to achieve will fail. One solution to this is to use the aggregate inline JS option, but keep in mind my earlier discussion of cache size.

Again, you really should test before enabling AO on a live site. When it comes to JavaScript issues, it is a good idea to use the Developer Tools feature in Chrome (other browsers have similar tools), which will quickly show you any JS warnings or errors, usually the result of something like jQuery not being loaded early enough.

Render Blocking Warning with Autoptimize

The way Autoptimize handles JavaScript code by default means you shouldn’t get any JS render blocking warning on your pagespeed test. CSS is another story. Of course, this isn’t a situation unique to AO; it is very difficult to beat the render blocking blues when it comes to CSS. Still, there are three ways around this.

  1. Inline all of your CSS. This is not generally recommended because, for most themes, that will be a lot of code. Usually, inlining all that code will improve your page speed score, though if doing so makes your page large enough, you may get penalized for “roundtrip times.” Regardless of the effect on your score, inlining all CSS may negatively affect real-world performance. Why? Because every visit to a page on your site will require downloading the entire CSS again, whereas an external CSS file can be cached by the browser so it only needs to be downloaded once. I actually do inline all my CSS, but that’s because I custom designed my theme to have a small amount of CSS (see the sketch after this list for one way to do it). Frank Goossen offers more information on inlining as well.
  2. Inline and defer. If you don’t want to inline all CSS, AO offers the option to just inline the critical CSS needed to render “above the fold” content and defer loading the rest as a single optimized CSS file. Unfortunately, determining what CSS is needed for above the fold content is not straightforward since it depends on where the fold is, which in turn depends on screen size (size of mobile screen, mobile vs. desktop, etc.). Things are further complicated by the fact that different pages on your site quite likely have different above the fold CSS requirements (home page vs. search results vs. category listings vs. individual posts). Still, if you want to give it a go, Chris Hooper offers an excellent Autoptimize inline defer CSS tutorial which highlights Jonas Ohlsson’s useful Critical Path CSS Generator tool. The AO FAQ also links to other potentially useful tools. Do note that you probably should strip out your render-blocking inline CSS from your core style.css file, but if you decide to stop using AO or this AO feature, you will then have to remember to add back what you took out. Since you are deferring the loading of that “duplicate” CSS, it might not be worth the effort (again, test to see for yourself). Finally, the Above The Fold Optimization plugin (see below) might be a useful way to generate your site’s critical CSS or serve as a complete alternative to Autoptimize for handling CSS.
  3. Defer only. Instead of trying to figure out which CSS is critical for above the fold rendering, you can use the AO inline and defer option but just leave the inline box empty. This is a bit of a cheat to appease the PageSpeed god, but be aware it will lead to a flash of unstyled content (FOUC) because your site will load text before any CSS, which is why you probably shouldn’t do this.
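As mentioned in option 1, here is a rough sketch of inlining a theme’s (small) stylesheet yourself. The stylesheet handle is hypothetical; use whatever handle your theme actually registers, and only consider this if your total CSS really is small:

    // Stop loading the theme stylesheet as a render-blocking file...
    add_action( 'wp_enqueue_scripts', function () {
        wp_dequeue_style( 'my-theme-style' );    // hypothetical handle
        wp_deregister_style( 'my-theme-style' );
    }, 20 );

    // ...and print its contents inline in the <head> instead.
    add_action( 'wp_head', function () {
        $css = file_get_contents( get_stylesheet_directory() . '/style.css' );
        if ( false !== $css ) {
            echo '<style>' . $css . '</style>';
        }
    } );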

Note: Dave Mankoff provides the clearest description I have read about the prioritize visible content rule:

The easiest way to think about this rule is this: Find all of the HTML elements (div, img, form, input, span, table, iframe etc.) that are visible above the fold when your page is fully loaded. Make sure that your inlined CSS specifies the necessary information to figure out the size of those elements. This might include widths, heights, font sizes, font weights, floats, displays, etc. This includes content as well – if you have a p tag that starts empty and then gets filled in via AJAX, that causes a problem because it will cause the size of the p tag to change.

Generate data: URIs for images?

One great Autoptimize feature is the ability to replace small images with Base64 equivalents (i.e., the generate data: URIs for images AO option). This prevents sprite warnings in PageSpeed tests; simple lazy loading also prevents them (unless the images are above the fold). But while lazy loading only helps you get a higher score, Base64 encoding actually reduces the number of HTTP requests and thus improves real speed. If you cannot use this feature, or if you have images that are not being encoded by Autoptimize, you can create the data URIs yourself. One good free online tool is Base 64 Encoder.

This can also help when image dimensions aren’t specified. For example, I use a 1×1 transparent gif in my lazy load code, but it isn’t helpful to set the 1×1 dimensions on it because it is just a placeholder for the much larger image to be loaded. Not specifying the width and height, however, produces a page speed warning. Instead, I just replaced the reference to the image with its Base64 equivalent (which, for reference, looks like data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7) and the warning disappeared.
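If you want to generate a data: URI yourself in code rather than with an online encoder, a couple of lines of PHP will do it (the image path here is hypothetical):

    // Turn a small image file into a data: URI that can be embedded directly in markup.
    $path = get_stylesheet_directory() . '/images/placeholder.gif'; // hypothetical file
    $uri  = 'data:' . mime_content_type( $path ) . ';base64,'
          . base64_encode( file_get_contents( $path ) );
    echo '<img src="' . $uri . '" width="1" height="1" alt="" />';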

404 Pages

A minor issue with Autoptimize is that it can lead to a lot of 404 errors: when the AO cache is purged, copies of your pages stored elsewhere (a page cache, a CDN, or visitors’ browsers) may still reference aggregated files that no longer exist. You can, however, serve a 410 (gone) response for those requests instead of a 404. From this interesting look at 404 and 410:

4xx status codes mean that there was probably an error in the request, which prevented the server from being able to process it. Specifically, a 410 error means ‘gone.’ In Google’s terms, ‘the server returns this response when the requested resource has been permanently removed. It is similar to a 404 (Not found) code, but is sometimes used in the place of a 404 for resources that used to exist but no longer do.’ […] According to Google, when a page issues a 404 header response, it may sometimes still revisit the page to make sure it’s really invalid. Using the 410 response code will at least ensure that Google won’t ever go back to that page again and get to the more important pages on the site, helping overall crawlability.
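One way to serve that 410 for requests to old Autoptimize files is a small hook; this is just a sketch (an .htaccess rule can accomplish the same thing) and it assumes the default wp-content/cache/autoptimize location:

    // Return "410 Gone" rather than "404 Not Found" for stale Autoptimize cache files.
    add_action( 'template_redirect', function () {
        if ( is_404() && false !== strpos( $_SERVER['REQUEST_URI'], '/wp-content/cache/autoptimize/' ) ) {
            status_header( 410 );
            nocache_headers();
            exit;
        }
    } );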

Above The Fold Optimization

Above The Fold Optimization is a plugin I recently found that seems quite promising, but I haven’t played with it yet. The plugin includes a lot of potentially useful advanced optimization features, but the features of most interest I think are the JS and CSS load optimization options and the critical CSS tools. The Quality Test feature seems quite useful for development as it will show you a side-by-side comparison of how your site looks with critical CSS vs. full CSS. Finally, the Google font optimization feature seems promising as well.

Do note that “this plugin is not a simple ‘on/off’ plugin. It is a tool for optimization professionals and advanced WordPress users.” And, as the plugin author further writes in one of the support threads, “regrettably we didn’t find time yet to invest in making it easy. The current plugin originally was quickly setup for (custom) optimization projects, it was never intended as a public plugin. Our engineers are currently focusing on even more advanced optimization technologies. When the technologies are perfect and shaped into a good standard for long term usage, then it will be converted to a easy to use plugin with documentation.”

Fast Velocity Minify

Fast Velocity Minify is a plugin I wasn’t aware of when I wrote my original optimization article (it might not have existed back then). I have not tried it, but it seems to be quite similar to Autoptimize. I am happy with Autoptimize so I will stick with it, but if you are still considering different optimization plugins, it might be worth a try.

Complete Analytics Optimization Suite

I have talked about localizing external scripts and that is most helpfully applied to Google Analytics. Since my initial work to do it myself, I have now discovered the Complete Analytics Optimization Suite plugin by Daan van den Bergh. This plugin enables you to completely optimize the usage of Google Analytics by locally hosting the GA JavaScript file and keeping it updated using wp_cron(). You can also easily anonymize the IP address of your visitors, set an Adjusted Bounce Rate and choose whether to load the tracking code in the header or footer.

PageSpeed Ninja

PageSpeed Ninja is a new plugin I just discovered that seems quite promising, though it has yet to become popular and I haven’t tried it myself. It seems to be similar to Speed Booster Pack in that it offers lots of common optimizations (lazy loading, minification, leverage browser caching, GZIP compression, JS/CSS combining, defer loading, etc.). It also claims to be able to auto-generate above-the-fold critical CSS which, if true, is a very cool thing. The admin panel UI also seems quite slick. I would love to get some feedback if anyone gives it a try.

PHPWee

I had been using PHPWee, which is a nice open source script, but I find that the Autoptimize minification, even for HTML, works quite well, so I no longer need PHPWee. If you choose not to use Autoptimize, or if it still produces minify warnings (which could easily happen if you choose not to aggregate inline code), then read my previous article for how to use PHPWee (and read the comments section for a common problem with it).

WP Scripts & Styles Optimizer

In my previous article I referenced code and options to configure the inclusion or exclusion of scripts, as well as the re-ordering of them. At that time I wasn’t aware of a new, potentially useful plugin called WP Scripts & Styles Optimizer. It is a relatively new plugin and doesn’t have many users, so I am not sure how well it will be supported going forward, but if you are not comfortable working with PHP code it might be a great option. Even if you don’t use it to make changes, it is a pretty good way to see what scripts are being used by your site setup.

Beware Social Sharing Plugins

Frank Goossen discussed how Social sharing can significantly slow down your website in a post back in 2012. I am sure the various plugins he considered have changed since then and many more were not tested, but the concept remains true today, which is that sharing widgets “slow webpage loading and rendering down as they almost invariably come with ‘3rd party tracking’ for behavioral marketing purposes.” So, ultimately, you have to decide how important having social sharing buttons is to you. Assuming you want them, there are options that won’t kill your page speed, but you will have to seek them out and they likely won’t be the plugins that look the best or have all the bells and whistles. For what it’s worth, I have long used Simple Share Buttons Adder, but I don’t recommend it since it was acquired by ShareThis and subsequently they added a lot of tracking and other scripts that make it incredibly bloated (read the countless bad reviews since the acquisition). I actually dequeue all those annoying new scripts so I am still using it, but you should probably just look elsewhere.

Testing Tools

Core Testing Tools

I previously listed the tools I like but since then I have added a few new ones. I think most of us tend to work through a progression on the page speed optimization learning curve and the best way to do that is to really get into the test result details provided by the various testing tools. Naturally, there is a lot of overlap but each of the tools I rely on provides at least one test or feature that distinguishes it from the rest. If you are a real page speed warrior, maximizing your scores for all of the tests will test your mettle. For convenience, the full list of my preferred tools is:

  • Google PageSpeed Insights is the reason you are probably reading this. In his article, “Faster Sites: Beyond PageSpeed Insights,” Benjamin Estes gives a skeptical reading of each of the PageSpeed Insights components as well as providing a good high-level look at what affects page speed.
  • Mobile-Friendly Test analyzes exactly how mobile-friendly your site is, and focuses on elements beyond speed as well.
  • The Google PageSpeed Insights Extension for Chrome is great for several reasons. First, you can use it to test a development site, especially one not accessible on the Internet. Second, you don’t have to visit the official Google test page, which can be slow and requires 30 seconds between tests. It also makes testing many pages much easier than doing so via the Google test site. Finally, it gives a nice layout of results with detailed breakdowns of all the issues covered (some of which are hidden when viewing results on the public Google test page).
  • Pingdom. This tool is very popular and lets you test from three different geographic locations. It also always seems to generate warnings that other tests don’t and I generally like its one-page display of results (each test result gets a color, letter grade and score out of 100), as well as its report on what percentage of sites yours is faster than. One thing I don’t like about it is the lack of detail on any individual warning.
  • GTmetrix. The big selling point for this one is that it performs two tests: page speed and YSlow. YSlow is almost always going to give you a worse performance score because it tests some things most other tests don’t, especially the use of a CDN. You cannot specify a geographical region, but it will report the region that was used as well as the browser. I also like the presentation of results for this test site with each individual result being given a score out of 100 and color-coded and sorted to easily spot the biggest problem areas. Click any recommendation line and you will get more details as well as a link to an article explaining the test.
  • WebPageTest is another good site. One thing that sets it apart is that your site gets tested three times instead of just once. You can also choose the geographical location, browser type and even the connection type/speed. The basic results are simply displayed as a letter grade for six different test categories, including use of a CDN (either a check or an X for that one). Detailed results are also available, but you have to click through to see those on separate pages. FYI, Moz has a useful guide to Web performance using WebPageTest.
  • Varvy offers two testing tools: Page speed optimization and SEO (which also includes a page speed component). I don’t think the page speed test actually provides anything of value that the other tests don’t, but the SEO test (which is the test available on the home page) is good for testing compliance with a lot of Google guidelines (some speed related, some SEO). The site generally provides excellent articles explaining each test.
  • PageSpeedGrader. This is a new addition to my arsenal and if I were to drop one in the future, this would be it. It’s a pretty simple test that shows you an overall grade out of 100 as well as more typical results with only the recommendations for improvement shown on the core results page (there are tabs for successful tests, a timeline, and requests made in order to load the page).
  • DareBoost. In the comments on my previous article, Damien from DareBoost recommended his tool. He claims it offers some checks other tools don’t, including a speed index, duplicate script detection, performance best practices in your jQuery code, performance timings (I’m not sure what he means by that), SEO, compliance, security, etc. The concern I expressed was that his tool required registration to use; it seems you can now run five tests without registering, though registration is still necessary for some features, and you have to share via social media to get permission to download a PDF of the generated report. Personally, I don’t find the test adds much value in terms of speed issues, but the very extensive list of specific SEO, security, accessibility and quality issues definitely makes it worth adding to your suite of testing tools.
  • Lighthouse is a tool I only found since I started writing this article and it is focused on Progressive Web Apps (PWA), but it seems quite useful for non-PWA sites as well. You can run it against any web page, live or development, public or requiring authentication. It has audits for Progressive Web App, Performance, Accessibility, and Best Practices. There are multiple ways you can run Lighthouse, but the simplest is to use the Developer Tools feature in Chrome (just choose the Audits option).

Note: When you start doing more complicated optimization work, check the list of requested files (e.g., in Pingdom) and make sure all scripts have been loaded without errors.

[Image: Chrome PageSpeed extension]
[Image: Chrome PageSpeed extension results]

Server Response Codes and Headers

When you visit a website, you see the content on the page, but your browser also sees extra information declared in HTTP header fields. Mostly these are set by your server, but you can sometimes set additional ones via an .htaccess file or with PHP code, though that gets into fairly advanced territory. Generally, I wouldn’t worry much about changing your site’s headers, but it is instructive to know what they are, and there are some useful tools for that, including REDbot, HTTP Status Code Checker, and HTTP / HTTPS Header Check, with REDbot being the most useful. REDbot interacts with the resource at the provided URL and, although it is not an HTTP conformance tester, it can find a number of common HTTP problems, including invalid syntax in headers, ill-formed messages (e.g., bad chunking, incorrect content-length), incorrect GZIP encoding, and missing headers. It will also suggest improvements and tell you how well your resource supports a number of HTTP features.

Other Specialized Tools

There are some other specialized tools that can be helpful as you slog your way through your efforts, including:

  • The WAVE Web Accessibility Tool is a cool online testing tool to see how well your site can be used by those with physical impairments.
  • MX Toolbox is a good tool to test lots of different things, including your DNS and mail server setup, presence on spam blacklists, etc.
  • The Check GZIP Compression tool is useful if you get a compress resources warning.
  • Query Monitor is an incredibly useful debugging plugin for anyone developing with WordPress. It has some advanced features not available in other debugging plugins, but I find it most useful for seeing which scripts (both JS and CSS) are being used on any given page. It is also useful for seeing what kinds of database queries are being made. This is not a plugin you will need very often, but when you do it is a real gem.
  • Learn to love Chrome’s Developer Tools. I already mentioned how useful it is for finding potential JavaScript problems. The Inspect Element feature (which is also available in FF) is great for working on CSS issues. There is also a network section that will show you where your site is using time/resources and thus point to possible ways you can improve. And, if you use the recommended PageSpeed or Lighthouse extensions, they are accessed from a Developer Tools menu.
  • The HTTP/2 and SPDY indicator extension for Chrome shows a lightning icon in the address bar that’s blue if the page is HTTP/2 enabled, green if the page is SPDY enabled, and gray if neither is available. If enabled, you can click on the icon to get a variety of detailed information about the connections.

Miscellaneous Testing Tips

Testing Without Ads

In my previous article I mentioned the challenges Google Adsense presents. If you are using it, you will want a way to test your site with it disabled. Fortunately, some ad plugins provide an easy way to not display ads on a per-page basis or based on some filtering criteria. I use and highly recommend Ad Inserter, which offers a variety of filter options to accomplish this. With Ad Inserter installed, there are two ideas worth considering:

  1. Create a page and/or post to use just for testing purposes. For my sites, I created a page called noads and filled it with Lorem Ipsum text. A test page is also very helpful if you have multiple websites you are trying to optimize, as you can compare apples to apples.
  2. Use the query string filter to block ads on any existing page based on passing a certain query string. So, for example, you can filter out a query string of ‘noads’ (e.g., domain.com/?noads=true). This is a great idea because it allows you to easily test any page on your site. There are two potential downsides, however. First, the presence of query strings is actually a warning on some of the testing sites (especially Pingdom). But, it isn’t a factor for Google’s test, so that is nice. Second, your caching solution may not support caching URLs with query strings or may have the option disabled. If the latter, enable it, at least while you test.

Caching

If you are making iterative changes in an attempt to improve your pagespeed, some of the test sites may not notice those changes due to caching. For plugin-based caching, you can purge the cache, either just for the page you are testing or for the entire site, depending on your caching plugin. If you have server-based caching, you may have no recourse but to wait until a later time/date to re-test or to test a different page. If you are using Cloudflare, don’t forget to purge its cache and put it into development mode.

Various Optimization Tips

Here are some random things I have learned since I wrote the first article. They are not in any particular order.

Images and the Prioritize Visible Content (PVC) Warning

In my latest round of optimization work and testing, a few of my sites started getting prioritize visible content warnings. In my case, these warnings were only showing up on the desktop score. It took me a long time to figure out what was going on (methodically testing various page elements until the warning went away). So, what was the problem? Images above the fold—in my case, my logo.

Now, the problem was not that my logo was unoptimized (there is a separate warning for that). I could not find much information about this issue online, but I did find a Google forum thread hinting that the problem is related to the dimensions of the image, not the file size. I confirmed this on all my sites simply by experimenting with smaller width and height values for my logo (not actually replacing the logo, just scaling it down). I couldn’t find any pattern or formula for knowing when an image will trigger the PVC warning and when it won’t, but if you are mysteriously getting the warning, try two things. First, make sure you are specifying a width and height for any above the fold images; if you aren’t, that might be enough to clear the warning. If that doesn’t help, remove any above the fold images to see if they are the problem. If so, start playing with different, smaller dimensions until you get a clean test result.

Also on the topic of images, the folks behind my image optimization plugin of choice, Ewww Image Optimizer, offer a few useful tips on dealing with images and Pagespeed, including: stripping out metadata, configuring the plugin to optimize images that aren’t in your uploads directory (e.g., images from a plugin), and dealing with the WebPageTest image requirement.

I also recently read an article hawking the service Cloudinary (note that there is a free plan as well). While that is a sponsored post for a commercial product, the content is actually very informative and covers several ideas about how images affect site performance that I hadn’t considered before.

Comments and Gravatars

In my previous article I talked about the page speed hit you take by using gravatar avatars in your comments section, because their browser caching is set to only 5 minutes. I neglected to mention another problem they represent: they require the visitor’s browser to make multiple HTTP requests to the external server that hosts the gravatar images. The more commenters that have a registered gravatar, the bigger the hit.

The way I see it, there are three ways to address the gravatar performance hit:

  1. The simplest solution is to simply choose not to display avatars at all (this is done in the Settings > Discussion section of your WP admin panel).
  2. You can choose to use localized copies of your commenters’ gravatars. I previously supplied code for localizing your own avatar, but at that time I didn’t know how to handle all your commenters. Recently, I have learned that some plugins exist to do just that. Perhaps the simplest is WP User Avatar, which allows you to use one default local avatar for everyone, ignoring any personalized gravatar a commenter might have (a small filter can accomplish the same thing; see the sketch after this list). Of course, this doesn’t give much of a personalized feel to your comments, so a more interesting plugin is FV Gravatar Cache, which downloads local copies of gravatars for all your commenters and then replaces the HTML references in your comments section with those. Harrys Gravatar Cache and Optimize Gravatar Avatar seem to do more or less the same as FV Gravatar Cache, though last I checked only FV supports retina display versions.
  3. My preferred method is to lazy load comments. This solution can help with both readability and speed performance, regardless of how you handle gravatars. I haven’t done extensive research, but the Lazy Load for Comments plugin seems to fit the bill. Alternatively, you could get lazy loading plus many other useful features by replacing your default WP comments setup with the wpDiscuz plugin, though I haven’t personally given that a try. Ideally, you might use lazy loading with localized gravatars but as of this writing my two preferred plugins don’t work together, though I have proposed an easy fix for that which will hopefully find its way into the next version of the FV Gravatar Cache plugin.
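As mentioned in option 2, if all you want is one local image for every commenter, a small filter can do the job without a plugin (the image path is hypothetical):

    // Serve one locally hosted avatar for all commenters instead of fetching gravatars.
    add_filter( 'get_avatar', function ( $avatar, $id_or_email, $size ) {
        $url = get_stylesheet_directory_uri() . '/images/default-avatar.png'; // hypothetical image
        return '<img src="' . esc_url( $url ) . '" class="avatar photo" width="'
            . (int) $size . '" height="' . (int) $size . '" alt="" />';
    }, 10, 3 );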

Finally, note that there are several show/hide post comments plugins that might sound like they would help with page speed, but mostly these just toggle the CSS to hide and then display the comments, which are loaded as normal. So, all the text and gravatars are still there—you just can’t see them until you toggle them on.

Serve Content from a Cookieless Domain

If your site sets a cookie, every HTTP request to that domain includes it. This is unnecessary when requesting static content like images, JavaScript and CSS, and thus some page speed tests will give you a warning or penalize your score. The easiest fix is to use a CDN, or you could even create a separate domain or subdomain to serve static files without cookies, but if you are using HTTP/2 there really is no point in worrying about this.

FontAwesome (FA)

A good tip regarding graphics is to make heavy use of FontAwesome. For many web-related graphics needs, there is probably already an FA icon you can use. It is scalable, you can change its color, and because it is a font rather than an image, it is much faster and less resource intensive to use. The only caveat is that you will probably get hit with some warning if you load FA from a third party server. Instead, try to localize FontAwesome to your own child theme.
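Localizing it is mostly a matter of copying the files into your child theme and enqueuing the local copy; the path below assumes a fonts/font-awesome directory in the child theme:

    // Load Font Awesome from the child theme rather than a third-party CDN.
    add_action( 'wp_enqueue_scripts', function () {
        wp_enqueue_style(
            'font-awesome',
            get_stylesheet_directory_uri() . '/fonts/font-awesome/css/font-awesome.min.css',
            array(),
            '4.7.0'
        );
    } );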

Google Analytics

Any time you call an external script you run the risk of triggering page speed warnings, often due to headers set by the third party server, which you cannot control. That’s why I recommend the advanced technique of localizing external scripts when possible. I have been successfully localizing the Google Analytics script for years but recently I noticed that Google now recommends using their new Global Site Tag tracking code so I have now switched to using that. Since the new tracking code still relies on the analytics.js file I previously localized, that localization didn’t need to change, but in addition, you will now need to localize a copy of the gtag file found at https://www.googletagmanager.com/gtag/js?id=UA-XXXXXX-Y (replace the X’s and Y with whatever your UA ID is).
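A minimal sketch of keeping such a local copy fresh with WP-Cron follows; the hook name, schedule, and file location are illustrative, and you would then point your tracking snippet at the local file:

    // Refresh the locally hosted gtag.js once a day via WP-Cron.
    add_action( 'refresh_local_gtag', function () {
        $response = wp_remote_get( 'https://www.googletagmanager.com/gtag/js?id=UA-XXXXXX-Y' );
        if ( ! is_wp_error( $response ) && 200 === wp_remote_retrieve_response_code( $response ) ) {
            file_put_contents( get_stylesheet_directory() . '/js/gtag.js', wp_remote_retrieve_body( $response ) );
        }
    } );

    if ( ! wp_next_scheduled( 'refresh_local_gtag' ) ) {
        wp_schedule_event( time(), 'daily', 'refresh_local_gtag' );
    }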

If you decide to switch to the new gtag GA tracking code, there is one big page speed issue to be aware of. One thing Global Site Tag supports is Remarketing and Advertising Reporting Features, which provide demographic information on your site’s visitors. This feature is not enabled by default but if you decide to enable it (in the Tracking Code -> Data Collection section of GA), you will get a Minimize Redirects warning from Pingdom.

As far as I can tell, there is no way to keep this feature and eliminate the warning. So, your choices are: accept a redirect chain penalty (Pingdom only) or stop hosting the GA script locally and accept a Leverage Browser Caching error (most testing sites). This gets back to the question of how much you are willing to compromise for the sake of a “score.”

Support for the If-Modified-Since header?

I quickly mentioned the idea of server headers above. The reason they came to my attention was because some of my sites were failing Varvy’s If Modified Since test. This is not a test I have seen the other major pagespeed testing sites include and therefore it probably won’t affect your scores. However, that doesn’t mean it isn’t important.

The basic idea of the If Modified Since header is that your server should send a “304 Not Modified” response if the page has not been modified since it was last requested. This saves bandwidth and processing on both the server and the client, since only headers are sent and received rather than the entire page being served again.

Usually, a well configured server will already support the If Modified Since header and you won’t need to do anything. But if your server doesn’t, you can generally add either a Last-Modified or an ETag header yourself. This is something you might expect WordPress to add by default, but it doesn’t. You can, however, add either or both of those headers with either a plugin or some custom code.

If you are using WP Super Cache, there is an advanced setting for 304 Not Modified browser caching that should do the job without the need for any extra plugin or code. If not, the Add Headers plugin will most likely take care of this, but, unfortunately, that plugin has been officially abandoned. It still works and probably will for a long time, but there is no guarantee of that. There is an alternative, the SEO Header Easy plugin, but it is two years old, has no support threads, and has only about 400 installs.

If you want to manually add the Last-Modified header in your own customized plugin or your theme, there is some useful sample code at StackExchange. If you go that route, use the previously recommended REDbot, or alternatively last-modified.com, which will issue the specialized request needed to test whether your server sends the proper 304 response header.
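For the curious, a bare-bones sketch of the idea (not the StackExchange code) might look like this; it only covers single posts and pages:

    // Send Last-Modified for singular content and answer a matching
    // If-Modified-Since request with "304 Not Modified".
    add_action( 'template_redirect', function () {
        if ( ! is_singular() ) {
            return;
        }
        $mtime = (int) get_post_modified_time( 'U', true ); // GMT timestamp of last change
        $since = isset( $_SERVER['HTTP_IF_MODIFIED_SINCE'] )
            ? strtotime( $_SERVER['HTTP_IF_MODIFIED_SINCE'] ) : false;
        if ( $since && $since >= $mtime ) {
            status_header( 304 );
            exit;
        }
        header( 'Last-Modified: ' . gmdate( 'D, d M Y H:i:s', $mtime ) . ' GMT' );
    } );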

Image Optimization

I previously discussed some useful tools for compressing your images. Since then I have come across two new tools I like: Compressor.io and Page Weight.

The former is a simple online tool: just drag in any image you want to compress and wait for the result. It’s not an option for bulk compression, but it does a nice job. One warning: if you try to compress an image more than once, Compressor.io seems to work but will actually just give you back the first, cached, optimization. One way around this is to make a copy of the image, rename it, and submit that.

I am actually new to Page Weight, having learned about it recently from Noupe. The tool is provided by imgIX and is meant to drive sales of its online cloud image optimization service, but it is free and the results for my limited testing were impressive. All you do is paste a URL into the site and it will analyze the overall weight of the page and what percentage is due to images. If any of those images can be effectively compressed, it will show you that, and you can even download a compressed copy of the worst offender.

HTTPS (SSL) and HTTP/2

I mentioned HTTP/2 in my previous article but I was not using it because it only works if you use SSL/HTTPS. It also must be supported by your web host. One of the primary reasons I migrated all my sites to new hosting providers was to get access to the free Let’s Encrypt SSL certificates, which I am now using on all my websites. And, all my new hosting providers do support HTTP/2 as well.

If you read a bit about HTTP/2 you will come across the argument that combining the disparate CSS and JavaScript files won’t be effective with HTTP/2 in use. The idea is that it is better to download multiple smaller files in parallel than just one larger file. Also, since the aggregation is done for each page, some reusable code (like jQuery) will end up being bundled into many different per-page aggregate files, thus reducing the ability of a client browser to simply cache that reusable code. That is inefficient.

The Autoptimize FAQ references three articles which indicate that it is premature to rely on HTTP/2 over the aggregating files technique (all three are worth a read). Two of those articles were written in late 2015, so I don’t know if things have changed since then. The third article was written in early 2017, but its test cases included HTTP/1 and HTTP/2 with no optimization; aggregation and minification; and aggregation, minification and a CDN. Since the core argument is to NOT aggregate if you are using HTTP/2, I would have liked to see a test case comparing HTTP/1 and HTTP/2 with just minification (no aggregation).

A List Apart also points out that a not-insignificant number of people still use browsers that don’t support HTTP/2, though you can use your analytics reports to see if that is true in your case or not. I imagine there are a lot of articles on the topic, but ultimately I like the conclusion offered on the AO FAQ page:

Configure, test, reconfigure, retest, tweak and look what works best in your context. Maybe it’s just HTTP/2, maybe it’s HTTP/2 + aggregation and minification, maybe it’s HTTP/2 + minification.

On the same topic, I have seen multiple discussions of the tradeoff between SSL overhead (it takes a bit of negotiation time for a client browser to verify the SSL certificate for your site, thus negatively impacting your page speed) and the boost from the various HTTP/2 technology advantages. I suppose that is an interesting theoretical topic, but since the main value in switching to HTTPS is security and, possibly, increased SEO performance, I think the argument is largely irrelevant.

Yet another interesting question is whether it is still recommended to use a CDN with HTTP/2. As David Attard writes:

You’ve probably seen over and over again how one of the main ways to improve the performance of a site is to implement a CDN (Content Delivery Network). But why should a CDN still be required if we are now using HTTP/2? There is still going to be a need for a CDN, even with HTTP/2 in place. The reason is that besides a CDN improving performance from an infrastructure point of view (more powerful servers to handle the load of traffic), a CDN actually reduces the distance that the heaviest resources of your website need to travel.

Finally, I have read that Pingdom and Google’s PageSpeed Insights perform their tests without HTTP/2 support. I don’t know if or when that might change. GTmetrix, WebPageTest and DareBoost do support HTTP/2, and I am not sure about PageSpeedGrader or Varvy. The point is one I have made multiple times already: do you care more about a score or about the experience of your site’s visitors? In theory you should be able to worry about both, but in this case, if you focus only on Google’s score you will probably be driven to optimize for HTTP/1, which may not be what is best for your site.

Cloudflare

I have been on the fence about Cloudflare for a long time. It’s definitely a great, free service you should consider but there are a few things to think about first.

  1. Features. Even free accounts offer some great features, the two most notable being a Content Delivery Network (CDN) and a traffic filtering firewall. The first can really improve your site performance across the globe and the second can help reduce your server load and prevent malicious attacks. There are also other performance enhancement features, like minification and Rocket Loader, though be careful with those, especially if you are already using one or more page speed optimization plugins or tools.
  2. Speed. The CDN functionality of Cloudflare can enhance your site performance, especially for people visiting your site from a location geographically distant from your server. But there is also a speed penalty to pay, since Cloudflare acts as a reverse proxy, essentially a middle man between visitors to your site and your site’s server. Typically, this performance hit is most noticeable in the server response time (TTFB), which is something your page speed score will probably get penalized for. To be fair, this issue is somewhat hotly debated, and Cloudflare offers a compelling reason why you should stop worrying about Time To First Byte (TTFB). One of the commenters on that article points out that the issue is important for SEO, so I am not sure which consideration matters more. I should also point out that Cloudflare offers a feature called Railgun, which can significantly alleviate this connection speed issue, but Railgun is a premium feature not available on the free plans. However, one of my new hosts, Siteground, actually offers the Railgun feature for free with their plans, which is fantastic. They are the only host I know of that does that.
  3. SSL. In the past, Cloudflare’s free account did not offer the Full and Full Strict SSL modes, but now they do, which is great. Full mode is for self-signed certificates, which are not recommended (but are better than nothing). Full Strict is for proper certificates, including the free ones from Let’s Encrypt. So, in terms of free functionality, things are looking good for using Let’s Encrypt with a free Cloudflare plan. It should be mentioned, however, that the free plan does not support a custom SSL certificate. What that means is that when someone visits your site, the certificate they will see in their browser is the shared UniversalSSL certificate from Cloudflare, not your SSL certificate. And that certificate is shared among dozens or even hundreds of different websites. The connection is secure but anyone who inspects your SSL certificate will see all those shared domain names that have no relation to your site. If you are already using a free Cloudflare account with SSL, you can check this yourself in the browser or you can use the SSL Checker tool. Some people consider this a big deal and others, including myself, don’t.

If you choose to use Cloudflare, consider using Siteground for your hosting. I already mentioned that it is the only host I am aware of that offers the premium Railgun feature for free. Besides that, they make setup and management incredibly easy. With other hosts, you typically need to visit the Cloudflare site and use the setup tool to detect your server’s DNS settings. Then you need to go to your domain name registrar and manually change from your current name servers to Cloudflare’s. Siteground has partnered with Cloudflare to make these steps unnecessary. Everything can be done in a couple of clicks from your Siteground control panel and no name server change is required at all. This means it is very easy to turn it on and off if you want to test its effectiveness. You can also change the core settings, purge the cache and put your site in development mode from your control panel.

Code Snippets

I provided some useful code examples in my previous article, and here are a few more that may be of interest. These types of code snippets typically go in your child theme’s functions.php file, but I recently came across a nifty plugin called, appropriately, Code Snippets, that will let you add and organize your various snippets without having to change your theme. I haven’t used it myself but it could be a nice way to keep all your various modifications organized and help separate out functionality (plugin) from appearance (theme).

Remove Emoji Support

WordPress automatically converts emoji text to the corresponding graphic. It might seem insignificant, but it does cause performance issues. Disable emoji support by adding the following lines to your child theme’s functions.php file:
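    // Remove the emoji detection script and styles WordPress adds to every page load.
    remove_action( 'wp_head', 'print_emoji_detection_script', 7 );
    remove_action( 'admin_print_scripts', 'print_emoji_detection_script' );
    remove_action( 'wp_print_styles', 'print_emoji_styles' );
    remove_action( 'admin_print_styles', 'print_emoji_styles' );
    remove_filter( 'the_content_feed', 'wp_staticize_emoji' );
    remove_filter( 'comment_text_rss', 'wp_staticize_emoji' );
    remove_filter( 'wp_mail', 'wp_staticize_emoji_for_email' );

    // Also remove the TinyMCE emoji plugin in the editor.
    add_filter( 'tiny_mce_plugins', function ( $plugins ) {
        return is_array( $plugins ) ? array_diff( $plugins, array( 'wpemoji' ) ) : array();
    } );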

[credit: Thomas Vanhoutte]

Hide / Show Comments

Earlier I mentioned the idea of lazy loading comments and said that the various plugins that toggle the display of comments on a page don’t really help with page speed. Still, if you are interested in such functionality, even if only for presentation purposes, here is some code I adapted from Benedict Eastaugh’s hidden_comment_form.php script. You could place this in a custom plugin, as he does, or in your functions.php file, but I just added it directly to my comments template:
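In outline:

    <?php /* Sketch of the approach described below; adapt the ids and markup to your own comments.php. */ ?>
    <span id="cf_toggle"><button type="button">View All Comments</button></span>

    <div id="have-comments" style="display: none;">
        <?php wp_list_comments(); ?>
    </div>

    <script>
    document.getElementById( 'cf_toggle' ).addEventListener( 'click', function () {
        var comments = document.getElementById( 'have-comments' );
        comments.style.display = ( 'none' === comments.style.display ) ? '' : 'none';
    } );
    </script>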

There are a couple of things to point out. First, I am hiding a <div> element with id of have-comments. I added that element to my comments.php form to surround just the existing comments I wanted to hide. You can use an existing <div> id that your theme uses or add one like I did. You can also choose to include the comment form in the hidden <div> or leave it visible at all times (as I do).

My toggle button has “View All Comments” as the text and is placed in a <span> with id cf_toggle. You will want to add styling for that <span> in your style.css file and you may prefer to make it a <div> instead. I use a <span> because I also include the number of existing comments in <H2> tags and I would like to keep that and the toggle button inline. Change anything and everything to suit your tastes.

Page Speed Monitor Script

Most of the core page speed testing sites I use are free but also offer paid plans to regularly monitor and test your site. I’m too cheap for that, but I started wondering whether there might be a script available to do my own regular testing. I found some pagespeed.php code, which I modified slightly to test more than one site and to email the results. I then set up a cron job to run it once a week. Here is the code:
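In outline (this sketch assumes the current v5 PageSpeed Insights API endpoint, your own API key and your own list of URLs):

    <?php
    // Outline of a weekly page speed monitor: query the PageSpeed Insights API for a
    // list of URLs and email the scores. Uses the v5 API endpoint; supply your own key.
    $api_key = 'YOUR_API_KEY';
    $urls    = array( 'https://example.com/', 'https://example.net/' );
    $report  = '';

    foreach ( $urls as $url ) {
        foreach ( array( 'mobile', 'desktop' ) as $strategy ) {
            $endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed'
                      . '?url=' . urlencode( $url ) . '&strategy=' . $strategy . '&key=' . $api_key;
            $result   = json_decode( file_get_contents( $endpoint ), true );
            $score    = isset( $result['lighthouseResult']['categories']['performance']['score'] )
                ? round( 100 * $result['lighthouseResult']['categories']['performance']['score'] )
                : 'n/a';
            $report  .= sprintf( "%s (%s): %s\n", $url, $strategy, $score );
        }
    }

    mail( 'you@example.com', 'Weekly page speed report', $report );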

More Resources
