> 100/100. Suggestions Summary: "Congratulations! No issues found."

It all started as a little (but quite fierce) competition between Jake Cobley and me.

The goal was simple: who could achieve the elusive 100/100 on Google PageSpeed Insights?

As it turned out, through some collaboration, we both did.

Can I get a woop woop?

It has also been a very useful learning exercise.

But how?

Platform Choice

Firstly, the targets were our blogs, so the question from the outset was what software to use. WordPress is the obvious option and has some great features, but there are many layers to negotiate before a page can be sent to the user, which in my experience only hurts speed.

I know you can add various layers of caching, but it was equally important that, un-cached, the site was still razor fast.

Did I mention this was a competition? :)


Enter Jekyll.

Jekyll is a static site generator that builds your site from Markdown files. That's right: no LAMP stack to negotiate. Just pure, raw HTML files flown straight in on the speedy express. This fit the bill nicely, as I didn't need features such as post scheduling and WYSIWYG editing that WordPress and other platforms offer. I just wanted to get the content of my blog in front of the user as quickly as possible. (Actually, I mainly use it for note taking as an aid to memory. If others find it useful then great :D)

So I have a super-speedy platform to use, but I cannot simply rely on Jekyll to do everything for me.



HTML Compression

For this I used a plugin called jekyll-press, which compresses the HTML at compile time by removing any unnecessary white space (the stuff you don't care about in production).
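Enabling the plugin is a small `_config.yml` change. As a rough sketch (assuming the gem is already installed; the key name depends on your Jekyll version):

```yaml
# _config.yml -- register the jekyll-press plugin
# use "plugins:" on Jekyll 3.5+, "gems:" on older versions
plugins:
  - jekyll-press
```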


CSS Compression

Jekyll supports Sass out of the box and compiles it for you on each build. It is quite a trivial task to tell Jekyll to compress the output as well.

In my _config.yml:

    sass:
      style: compressed

In fact, any parameter you can pass to the Sass compiler can be added under the sass: key.
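For example, a slightly fuller sketch (the sass_dir line here is my illustrative addition; _sass is Jekyll's default anyway):

```yaml
sass:
  sass_dir: _sass      # where your partials live
  style: compressed    # minify the generated CSS
```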

(Stop) Render Blocking

Next, it was important not to render-block; that is, not to stop the browser from showing content to the user because it is waiting on further resources, like stylesheets and JavaScript, to load.


CSS

My CSS files (SCSS), as it turned out, are small enough to inline in the <head> section. The main bonus of this is that no extra request is required to get the styles; they arrive with the initial request for the HTML document. Win!

To actually achieve this, though, the main CSS file has to be placed within the _includes folder (with its front matter removed) and then parsed manually using the built-in SCSS compiler. Here is the snippet, located in the <head> section, required to do this.

{.% capture include_to_scssify %.}
  {.% include main.scss %.}
{.% endcapture %.}
{.{ include_to_scssify | scssify }.}

Note: to prevent Jekyll parsing the above, I have inserted dots. {.%, for example, needs the dot removed to work. The same goes for %.}, {.{ and }.}


JavaScript

The only JS my blog currently contains is Google Analytics. This presents a small problem: Google themselves tell you not to download the JS file (assuming you want the latest updates immediately), yet they only tell the browser to cache the file for 2 hours. That negatively marks you on their own PageSpeed tester :(

Let me re-iterate: this was a competition to achieve 100/100, not 99/100.

The only thing for it was to download it and load it locally.

Of course I will update it every now and then.

This is also placed just before the </body> tag and loaded “async”.

    <script>
    window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
    ga('create', '[redacted]', 'auto');
    ga('send', 'pageview');
    </script>
    <script async src='/js/analytics.js'></script>

Note: I also wrap the analytics code in an if jekyll.environment == "production" check. Don't want those stats skewed by local development now.
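Using the same dot convention as earlier (remove the dots to make it work), the wrapper looks roughly like this:

```liquid
{.% if jekyll.environment == "production" %.}
  <!-- analytics snippet goes here -->
{.% endif %.}
```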

Any other JavaScript should also be placed just before the </body>.


Browser Caching

Why should the browser make more requests than it needs to? Tell it to cache your CSS, JS and image files for a long period of time, and only fetch a fresh copy of those files when the cache expires.

In my .htaccess file I have:

    <IfModule mod_expires.c>
        ExpiresActive On

        <FilesMatch \.(css|js|jpg)$>
            ExpiresDefault "access plus 1 year"
        </FilesMatch>
    </IfModule>

If you have any other file types to cache, just add them to (css|js|jpg).
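For instance, a broader pattern covering other common static assets (png, gif, svg and woff are my additions here, not part of my original setup) would be:

```apache
<FilesMatch \.(css|js|jpg|png|gif|svg|woff)$>
    ExpiresDefault "access plus 1 year"
</FilesMatch>
```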

Character Sets

Specifying a character set early enables the browser to start parsing the HTML and scripts immediately. Without it, the browser is forced to try to work out which character set to use (ref).

You can also do this in your .htaccess file.

Add the following:

    AddDefaultCharset UTF-8

When you inspect the response headers of the document now, you will see Content-Type: text/html; charset=UTF-8
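If you can't touch the server config, the standard HTML equivalent (harmless to include as well) is a meta tag right at the top of <head>:

```html
<meta charset="utf-8">
```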


ETags

“The problem with ETags is that they are constructed to be unique to a specific resource on a specific server. For busy sites with multiple servers, ETags can cause identical resources to not be cached, degrading performance.” - www.websiteoptimization.com

I have turned these off in my .htaccess by adding the following:

    FileETag none
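A companion tweak I've seen recommended (my addition, not part of my original setup) is to strip the header itself too, since FileETag none only stops Apache generating new ones:

```apache
# requires mod_headers; removes any ETag another module still sets
Header unset ETag
FileETag none
```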

How Am I Looking?

Take a look

As it stands, the total download size (of the homepage) is 14.4kb and the TTFB is <100ms. Not too shabby :) In total there are 3 requests; 2 of those are Google Analytics doing its thing, but that is fine by me! Total page load time is under ~300ms.



CloudFlare

What is CloudFlare? Go see

It has a whole raft of features, most of which I don't use or need. The ones I do use include its CDN, free SSL and site downtime protection via their "Always Online" feature for when my host has outages.


www. or not?

It is important to choose whether to run with or without www. at the start of your address. Which you pick doesn't matter; what is important is that you choose one or the other and stick to it. The reason is that each is seen as a different website, and thus duplicate content.

I have gone with www. See how to make sure it is enforced by reading my always www article.


HTTPS

Google favours websites that run entirely under https.

Here is how I do that.

In my .htaccess:

    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP:X-Forwarded-Proto} !=https
        RewriteCond %{HTTPS} !=on
        RewriteCond %{REQUEST_METHOD} !=POST
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
    </IfModule>

Thanks to a technique mentioned to me by Aaron Brady, you can reduce the number of 301s that occur when the same user comes back to your website on a non-secure link, by telling the browser that your website should always be loaded over https.

Add the following to your .htaccess file:

    Header always set Strict-Transport-Security "max-age=86400"

Now the browser will change to the secure version without having to go back to the server.
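The max-age is in seconds, so 86400 is one day (60 × 60 × 24). Once you are confident everything works over https, a longer value is the usual advice; this variant (the year-long age and includeSubDomains are my additions, not my current setup) covers subdomains too:

```apache
# one year, applied to subdomains as well
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```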

  • jekyll
  • pagespeed
  • google
