Nerd alert: This post is only for crazy website performance freaks. Proceed at your own risk.

The holy grail for us crazy website performance freaks is to reach a perfect score of 100/100 in Google Page Speed without sacrificing important features of the website we’re building. One of those important features is Google Analytics. Gotta have Google Analytics, right?!

Let’s say that you’ve optimized your website to the perfect score of 100/100 and now decide to add Google Analytics. Too bad, your score is now 98/100. That’s because the ga.js JavaScript file loaded from Google’s servers doesn’t have a far-future expires HTTP header. To get the perfect score back, we need to fix this problem.

Getting back to 100/100

Here’s a solution that I use on one of my websites. It involves the addition of a single .ashx file to your web project. It’s isolated, safe to use and works.

The role of the .ashx file is to act as a proxy to the Google Analytics ga.js script file by downloading its content and serving it with a sufficient expires header. It caches the script file on the server, so it doesn’t have to download the ga.js file every time a visitor hits your website.

Step 1: Add an empty .ashx file (Generic Handler) to the root of your project and call it ga.ashx.

Step 2: Copy this code into the .ashx file you created in step 1.
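The original code block isn't shown here, so below is a sketch of what such a handler might look like. It is an assumption reconstructed from the description above and the two cache lines quoted in the comments below; the WebClient download and the exact Google URL are illustrative, not the original code.

```csharp
<%@ WebHandler Language="C#" Class="GaHandler" %>

using System;
using System.Net;
using System.Web;

// Sketch of a proxy handler for ga.js (details assumed, not the original code).
public class GaHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Fetch the script from Google. With the output-cache settings below,
        // ASP.NET serves the cached response, so this download only happens
        // when the cached copy has expired.
        string script;
        using (var client = new WebClient())
        {
            script = client.DownloadString("http://www.google-analytics.com/ga.js");
        }

        context.Response.ContentType = "text/javascript";

        // Cache on the server until the expiration set below...
        context.Response.Cache.SetValidUntilExpires(true);
        // ...and send a far-future Expires header (30 days) to the client.
        context.Response.Cache.SetExpires(DateTime.Now.AddDays(30));
        context.Response.Cache.SetCacheability(HttpCacheability.Public);

        context.Response.Write(script);
    }

    public bool IsReusable { get { return true; } }
}
```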

Step 3: Modify the Google Analytics tracking script on your page to look like this:

    <script>
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-12345678-9']);
      _gaq.push(['_trackPageview']);
    </script>
    <script src="/ga.ashx" async="async" defer="defer"></script>

Voila! That’s it. You now have the perfect score back.

Optional step 4: Don’t like the .ashx extension? Then change it to .js by adding this to the web.config:

    <rewrite>
      <rules>
        <rule name="analytics">
          <match url="^ga\.js$" />
          <action type="Rewrite" url="ga.ashx" />
        </rule>
      </rules>
    </rewrite>
You need to add it to the <system.webServer> section of web.config (the rule requires the IIS URL Rewrite module). Remember to run the AppPool in Integrated Pipeline Mode for the <system.webServer> section to kick in.

Then just change the script tag to this:

    <script src="/ga.js" async="async" defer="defer"></script>

Wait, is this a good idea?

I’ll let you be the judge of that. What I can tell you is that this exact code is running on one of my websites and it reports data to Google Analytics just fine.

Does it work? Yes
Is it cool? Totally
Should I do it? N/A


Comment by Steve Syfuhs

I may just be confused, but I can't figure out how that caches on the server. Does WebClient cache requested content?

It looks like anytime the ashx is called it will always go off to Google to get the script.

Comment by Mads Kristensen

These 2 lines in the ashx file are the ones caching the response on both the server and client side for 30 days:

    context.Response.Cache.SetValidUntilExpires(true); // Caches on the server based on the next line
    context.Response.Cache.SetExpires(DateTime.Now.AddDays(30)); // Sends the Expires header to the client

Comment by Will

I'm as obsessive about page speed scores as the next fool, but this feels ill-advised to me. You've just built a caching proxy but deliberately broken it. It's reasonable to think that ga.js is 'in sync' with systems at Google, and by maintaining a much longer lifetime you are breaking their expectations about the field lifetime of each version of the JS code. Things could easily break whenever GA issues some kind of upgrade, and just because you haven't seen a problem yet is no indication that you won't see one tomorrow.

Reducing people's trust in client-side caching even further can hardly be a good thing for the web.


Comment by Mads Kristensen

I agree, and that's why I explicitly said that I couldn't recommend it:

[quote]Should I do it? [b]N/A[/b][/quote]

However, when Google updates their script they are not going to drop support for the previous version immediately. That would be more than foolish, especially given that they cache the script for 12 hours themselves. It could mean that [b]new features[/b] provided by GA would not be adopted as quickly with this approach.

Remember the Yahoo research indicating that 40%-60% of [b]return visitors[/b] don't have a primed cache for longer than a week.

Comment by Will

Fair enough - I didn't understand the 'N/A'. Perhaps the 'it works' and 'it's totally cool' claims blinded me?

But (in)famous Yahoo research or not, you're going to have a hell of a job to convince me that the solution to browsers having 'brokenly short' caching is to build lots of proxies with 'brokenly long' caching to compensate... ;-)


Comment by Ruben

Well, this is quite advanced for me as I'm still learning basic analytics, but it will be useful in the future for sure. Thanks!

Comment by Daniel

Hi Mads, thanks for the insight.

Sorry for going off-topic. Please consider looking into the inconsistency I found in Zen Coding using the Web Essentials 2012 plugin in VS2012:

1- html>body>table>tr*8>td*3>lorem9 creates a table with 8 rows and three columns, each cell filled with Lorem text.

2- html>body>table>tr*8>td*3>lorem30 creates the same table, but this time the last column of each row is empty.

3- html>body>table>tr*8>td*3>lorem90 this time only the first column has text!

4- html>body>table>tr*8>td*3>lorem900 now the text is back in all three columns.

Please fix these issues.


Comment by Stian

I agree with Will. And my firewall disallows outbound traffic on port 80. I could of course open it for Google's IPs, but I don't see the point. It's a paradox, though, that Google gives you a lower score for using their own script.