Getting an all-A score on Webpagetest.org

  • April 30, 2014

Good performance of my web applications is very important to me. How fast a webpage is actually served and rendered really matters for user satisfaction. And there are many things you can adjust to achieve a better time.

But how do you measure these values? There are several online services which allow you to evaluate your current webpage in terms of different metrics. One service I like to use as a good start for basic content delivery is Webpagetest. It analyses various factors of how your webpage is actually served for the first or a second time. This is very important because caching on the client side can help to minimize the load for the user as well as the load on your own servers.

It took quite some time to satisfy all suggestions without removing content or elements from the website. I just want to write about the biggest challenges I faced while improving my sites.

The Minified Content

It is important to keep the number of requests from the browser to your server as low as possible. Most browsers only allow a limited number of simultaneous requests to the same host, which means that some files are only loaded after others have finished. I used to compress the content only for production builds and serve uncompressed files during development. But this caused a lot of unexpected behavior when the code reached the production environment. And one thing you want to avoid at all costs is unexpected factors after deploying to your production servers.

Minify CSS

You can combine all your CSS files into one main CSS file and compress that file as well by removing all line breaks and comments. Because I use LESS CSS in my projects, it is very easy to create a minified CSS file by adding a command line option.

lessc  -x --clean-css style.less style.css  

Minify JS

All JavaScript files can be combined into one file as well. I just use a simple foreach loop.

$content = '';  
foreach ($scripts as $script) {  
    $content .= file_get_contents($script);  
}  
file_put_contents('script.js', $content);  

This script.js file is then compressed using the Closure Compiler online service. A simple cURL call sends the uncompressed file and receives a compressed version. You can change the compilation_level depending on the structure of your JavaScript code. A very high compression level requires changes to the JavaScript code you have written.

 $ch = curl_init();  
 $curlConfig = array(  
     CURLOPT_URL            => 'http://closure-compiler.appspot.com/compile',  
     CURLOPT_POST           => true,  
     CURLOPT_RETURNTRANSFER => true,  
     CURLOPT_HTTPHEADER     => array('Content-Type: application/x-www-form-urlencoded; charset=UTF-8'),  
 );  
 curl_setopt_array($ch, $curlConfig);  

 curl_setopt($ch, CURLOPT_POSTFIELDS,  
     'output_info=compiled_code'  
     . '&output_format=text'  
     . '&compilation_level=SIMPLE_OPTIMIZATIONS'  
     . '&js_code=' . urlencode(file_get_contents('script.js')));  

 $compress_code = curl_exec($ch);  
 curl_close($ch);  



Should I also include third party libraries like jquery.js or bootstrap.js, or use a CDN for those? I include all other JavaScript files as well, so there is really only one single file. Because there are already a lot of different jQuery versions and short cache times in most browsers, the chance that your user already has a jquery.js file in his cache is not that high (Source: Statichtml). Most people say that a Google-hosted file is served by a large CDN network with fast delivery times. But they don't mention that you can serve your own compressed file via CDN as well. I will get to this point later.

Minify Images

It seems to be very important to use progressively saved JPG images. You probably already take care of this when you export files from Photoshop or other graphic tools. But what about the files your server generates? When using the GD library in PHP, you simply add this one line to your image conversion (only for JPG files), before calling imagejpeg():

    imageinterlace($image, 1);  

The created image will be saved as progressive JPG and everything is alright.
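As a self-contained sketch, the whole conversion looks like this (assuming the GD extension is available; "progressive.jpg" is just an example file name):

```php
<?php
// Create a small demo image and save it as a progressive JPEG.
$image = imagecreatetruecolor(64, 64);
$red   = imagecolorallocate($image, 200, 50, 50);
imagefilledrectangle($image, 0, 0, 63, 63, $red);

imageinterlace($image, 1);                // the important line: mark as progressive
imagejpeg($image, 'progressive.jpg', 85); // quality 85, written to disk
imagedestroy($image);
```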

Minify HTML output

Is it important that your HTML code is well indented and easy to read? Most developer tools display the indentation correctly no matter what the code says. So you can use an output buffer and remove all unwanted stuff (like comments and line breaks) from your HTML before sending it to the browser. Most frameworks already come with an output buffer.

I use this function (Source: Stackoverflow):

 function sanitize_output($buffer) {  
     $search = array(  
         '/\>[^\S ]+/s',  // strip whitespaces after tags, except space  
         '/[^\S ]+\</s',  // strip whitespaces before tags, except space  
         // '/(\s)+/s'    // shorten multiple whitespace sequences <-- breaks it  
     );  
     $replace = array('>', '<');  
     $buffer = preg_replace($search, $replace, $buffer);  
     return $buffer;  
 }  
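To apply it, register the function as an output-buffer callback at the very top of the page. A minimal sketch (the function is repeated so the example runs on its own; the echoed markup is just a demo):

```php
<?php
function sanitize_output($buffer) {
    $search  = array('/\>[^\S ]+/s', '/[^\S ]+\</s');
    $replace = array('>', '<');
    return preg_replace($search, $replace, $buffer);
}

// Everything echoed after this point is filtered through
// sanitize_output before it reaches the browser.
ob_start('sanitize_output');

echo "<ul>\n\t<li>Home</li>\n\t<li>Blog</li>\n</ul>\n";

ob_end_flush(); // prints: <ul><li>Home</li><li>Blog</li></ul>
```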


You can deliver all your static files using a CDN. I use Amazon CloudFront as an alias for my domain.

A path to your script before (example.com stands in for your own domain)

    http://www.example.com/js/script.min.js  

A path with CloudFront

    http://d1234abcd5678.cloudfront.net/js/script.min.js  

A path with CloudFront and a CNAME domain (no HTTPS possible, but good for all public pages)

    http://static.example.com/js/script.min.js  

But how do you tell the system that there have been changes to a file? You can log into your AWS CloudFront account and set up an "Invalidation" request for your files. Better is to attach a version number to each file. This version number should be part of the filename and not be added via '?version=1234'.

A path with CloudFront and a version number

    http://static.example.com/js/script.min.1234.js  

You can change your .htaccess file to still serve the script.min.js file by adding this line

# Create a rule for static content to be versioned by filename, not by query param  
# Strip out the version number before serving the file  
RewriteRule ^(.*)\.[\d]+\.(css|js|ico|eot|svg|woff|ttf|gif|png|jpg)$ $1.$2 [L]  

When you host your application in a cloud, some providers add the deployment version to the environment variables. I just use this one (it is the GIT commit number) as the version number. So with every new deployment to the production environment, all file paths are updated to a new version at the same time.
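As a sketch, such a version suffix can be attached with a small helper. DEPLOY_VERSION is a hypothetical variable name here; real providers use their own names for the deployment id:

```php
<?php
// Append the deployment version to a static file path, so
// "/js/script.min.js" becomes "/js/script.min.1234.js".
// DEPLOY_VERSION is a hypothetical environment variable name.
function versioned_path($path) {
    $version = getenv('DEPLOY_VERSION') ?: '0';
    return preg_replace('/\.(css|js)$/', '.' . $version . '.$1', $path);
}

putenv('DEPLOY_VERSION=1234'); // normally set by the cloud provider
echo versioned_path('/js/script.min.js'); // prints: /js/script.min.1234.js
```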

The JavaScript defer loading

This is something I just had to get used to. The idea is to not execute any JavaScript until the page has finished rendering. I load all JS files after the page has rendered (read it on Stackoverflow) and then execute all JavaScript functions I need. This means I just created a small function buffer and add everything to it.

The loader itself looks like this:

    window.onload = function () {  
        var element = document.createElement('script');  
        element.src = 'script.min.js';  
        document.body.appendChild(element);  
    };  

The basic code

var run_functions = [];  
function run(b) { run_functions.push(b); }  
function execute() {  
    for (var c in run_functions) {  
        if ("function" == typeof run_functions[c]) {  
            run_functions[c]();  
        }  
    }  
    run_functions = [];  
}  

The .htaccess

You need to set correct expire headers for all static and non-static files. A good source for advice is AskApache. This is what I am using at the moment.

<ifModule mod_expires.c>  
      ExpiresActive On  
      ExpiresDefault "access plus 1 seconds"  
      ExpiresByType text/html "access plus 1 seconds"  
      ExpiresByType text/php "access plus 1 seconds"  
      ExpiresByType text/css "access plus 31536000 seconds"  
      ExpiresByType text/javascript "access plus 31536000 seconds"  
      ExpiresByType application/javascript "access plus 31536000 seconds"  
      ExpiresByType application/x-javascript "access plus 31536000 seconds"  
      ExpiresByType application/x-font-woff "access plus 31536000 seconds"  
      ExpiresByType image/ico "access plus 31536000 seconds"  
      ExpiresByType image/gif "access plus 31536000 seconds"  
      ExpiresByType image/jpg "access plus 31536000 seconds"  
      ExpiresByType image/jpeg "access plus 31536000 seconds"  
      ExpiresByType image/png "access plus 31536000 seconds"  
      ExpiresByType image/bmp "access plus 31536000 seconds"  
      ExpiresByType image/x-icon "access plus 31536000 seconds"  
      ExpiresByType image/svg+xml "access plus 31536000 seconds"  
</ifModule>  

The third party components

One thing you just cannot really control is the third party components on your website. This can be a Facebook Like button, a Twitter feed or the Google Analytics code. They decide how to deliver their files and when those expire. I could not find a good solution to this problem. You can see that the "Progressive JPG" test completely fails on this site because of the Twitter widget. This is even worse when it comes to W3C validation.