HTML5 and CSS3 introduce some new file types that enable us to create even better websites. We can now embed video, audio and custom fonts natively in any web page. Some of these file types are relatively new and not supported by the IIS web server by default, such as .m4v, .webm and .woff.

When a request is made to IIS for one of these unsupported file types, we are met with the following error message:

HTTP Error 404.3 - Not Found

The page you are requesting cannot be served because of the extension configuration. If the page is a script, add a handler. If the file should be downloaded, add a MIME map.

The problem is that IIS doesn’t know how to serve these new files unless we tell it how. This is easily done in the web.config’s <system.webServer> section by adding the following snippet:

<staticContent>
    <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
    <mimeMap fileExtension=".m4v" mimeType="video/m4v" />
    <mimeMap fileExtension=".ogg" mimeType="video/ogg" />
    <mimeMap fileExtension=".ogv" mimeType="video/ogg" />
    <mimeMap fileExtension=".webm" mimeType="video/webm" />

    <mimeMap fileExtension=".oga" mimeType="audio/ogg" />
    <mimeMap fileExtension=".spx" mimeType="audio/ogg" />

    <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
    <mimeMap fileExtension=".svgz" mimeType="image/svg+xml" />

    <remove fileExtension=".eot" />
    <mimeMap fileExtension=".eot" mimeType="application/vnd.ms-fontobject" />
    <mimeMap fileExtension=".otf" mimeType="font/otf" />
    <mimeMap fileExtension=".woff" mimeType="font/x-woff" />
</staticContent>

The above snippet includes support for most video, audio and font file types used by HTML5 and CSS3.
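
With the MIME maps in place, IIS can serve the files referenced by standard HTML5 video markup and CSS @font-face rules. For example (the file paths here are made up for the illustration):

<video controls>
    <source src="/media/intro.mp4" type="video/mp4" />
    <source src="/media/intro.webm" type="video/webm" />
    <source src="/media/intro.ogv" type="video/ogg" />
</video>

…and in CSS:

@font-face {
    font-family: 'CustomFont';
    src: url('/fonts/customfont.woff') format('woff');
}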

Last time we took a look at the performance impact of using tabs vs. spaces in HTML files. One question that arose was whether or not it’s worthwhile to minify HTML when also using GZip. Let’s run the experiment.

I’ve collected a few real-world HTML files and done some minification and GZipping on them. Here’s the result:

Website       File size   Minified   GZip     GZip & minified   Savings
amazon.com    218,642     197,032    53,793   49,008            9%
cnn.com       130,014     121,534    27,114   25,392            6%
twitter.com   53,465      46,608     12,262   11,416            7%
xbox.com      38,888      24,139     8,075    6,795             16%

All sizes are in bytes. “Savings” is the additional reduction from minification on top of GZip alone.

These pages already minify various sections, but none of them minifies the whole document. Had they used no minification at all, the savings would have been even higher than the table shows.

So, according to the results, minification provides an additional 6-16% reduction in file size even with GZip enabled.

16% is a rather large saving on top of regular GZip, so the data suggests that we must use both GZip AND minification.

Remember, this is a rather small experiment with only 4 real-world websites. It would be interesting to expand the experiment for more accurate statistics.
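
If you want to reproduce or extend the numbers, the GZipped sizes can be measured with the framework’s GZipStream. Here’s a minimal sketch (the file name is a placeholder for any saved HTML document):

using System;
using System.IO;
using System.IO.Compression;

class GZipMeasure
{
    // Returns the GZip-compressed size of a file in bytes.
    static long GZippedSize(string path)
    {
        byte[] raw = File.ReadAllBytes(path);
        using (var output = new MemoryStream())
        {
            // leaveOpen: true so output.Length can be read after the
            // GZipStream has been flushed and disposed.
            using (var gzip = new GZipStream(output, CompressionMode.Compress, true))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return output.Length;
        }
    }

    static void Main()
    {
        string file = "amazon.html"; // placeholder file name
        Console.WriteLine("Raw: {0:N0} bytes, GZipped: {1:N0} bytes",
            new FileInfo(file).Length, GZippedSize(file));
    }
}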

For HTML minification in ASP.NET, I like to use WebMarkupMin or Meleze.Web.

Are you using tabs or spaces to indent your markup? Does it matter for performance which one you choose? Let’s run an experiment.

Consider a page that generates a list of 50 items:

<ul>
    @for (int i = 0; i < 50; i++)
    {
        <li>The count is @i</li>
    }
</ul>

The code generates a long list of <li> elements and keeps their indentation using the editor’s settings for tabs, spaces and tab size. The default in many editors is spaces with a tab size of 4.

With spaces it looks like this:

[Image: the generated list indented with four spaces per level]

and with tabs it looks like this:

[Image: the same list indented with a single tab per level]

It’s clear that the indentation takes up 4 characters when spaces are used, and only a single character when using tabs. If we then compare the total file size of the two variations, here’s what we get:

                 Tabs         Spaces       Saving
Raw file size    1403 bytes   1703 bytes   300 bytes / 18%

Using tabs saves close to 18% of the file size over spaces.

This is, however, not a true picture of a web page. All modern web servers use compression in the form of GZip or Deflate before serving HTML to the browser. So let’s look at the numbers after GZip:

                   Tabs         Spaces       Saving
Raw file size      1403 bytes   1703 bytes   300 bytes / 18%
Raw file GZipped   327 bytes    332 bytes    5 bytes / 1.5%

With GZip enabled, the saving from using tabs over spaces is just 1.5%. It’s still a saving, and it counts.

Yet again, this is not the complete story, because some web developers make sure to minify their HTML by removing redundant whitespace, unneeded quotation marks etc. Normally this is done as a build step or at runtime.
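
To get a feel for what minification does, here’s a deliberately naive sketch that only collapses whitespace between tags. Real minifiers (such as WebMarkupMin, mentioned previously) also handle <pre>, <textarea>, inline scripts and attribute quoting correctly:

using System.Text.RegularExpressions;

static string NaiveMinify(string html)
{
    // Collapse every whitespace run between a closing '>' and an opening '<'.
    return Regex.Replace(html.Trim(), @">\s+<", "><");
}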

So let’s minify the HTML and see what results it produces:

                     Tabs         Spaces       Saving
Raw file size        1403 bytes   1703 bytes   300 bytes / 18%
Raw file GZipped     327 bytes    332 bytes    5 bytes / 1.5%
Raw file minified    1199 bytes   1199 bytes   0 bytes / 0%
Minified & GZipped   312 bytes    312 bytes    0 bytes / 0%

When minified, it doesn’t matter whether tabs or spaces are used, since they are all stripped away.

Conclusion

Depending on the capabilities of your server, build setup, runtime etc., here’s a little chart of what to do based on the above findings:

                                 Use tabs or spaces?
I can minify                     Doesn’t matter at all
I can GZip but not minify        Doesn’t matter much (tabs give a small benefit)
I can neither GZip nor minify    Tabs

Keep in mind that this is a controlled experiment, so your mileage will vary.

If you want your entire team to consistently use either tabs or spaces, take a look at .editorconfig. There’s a plugin available for practically all editors.
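
As a minimal example, an .editorconfig that enforces tabs for every file could look like this:

root = true

[*]
indent_style = tab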

In the next segment, we look at the effects of GZipping vs. minifying HTML files.

So you have a website filled with images, CSS and JavaScript files, and you realize that you haven’t bothered optimizing the images or minifying the CSS and JavaScript files. Or maybe you have, but your users can upload their own files, and those don’t get optimized or minified.

What’s the easiest way to go about that? Well, you could use tools like Web Essentials and PngGauntlet to help out, but that doesn’t solve the issue with user-uploaded files. You’d probably have to modify your website to include *.min.js files, commit them to source control, change your project file and so on.

It would be much nicer if we didn’t have to worry about any of this and didn’t have to make any modifications to our website. It would be much nicer if it just happened automatically.

With Azure Websites that is now possible. A web application hosted on Azure Websites no longer has to bother with these types of optimizations.

It doesn’t matter if your website is running ASP.NET, PHP, Node.js or plain static HTML; it works for them all.

All that’s needed is to install a NuGet package and publish the website to Azure. Here’s a video demonstrating how to add automatic image optimization.

The video shows how simple it really is to optimize images. To optimize CSS and JavaScript files, we can do the exact same thing, but with a different NuGet package.

Here’s what we need:

  1. Install the NuGet package: Azure Image Optimizer
  2. Install the NuGet package: Azure Minifier (console commands for both are shown below)
  3. A web application hosted on Azure Websites
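
Installing the two packages from the Package Manager Console looks like this (assuming their NuGet IDs are AzureImageOptimizer and AzureMinifier):

PM> Install-Package AzureImageOptimizer
PM> Install-Package AzureMinifier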

How it works

Both the Image Optimizer and the CSS/JavaScript Minifier work the same way.

When they are installed and you publish to Azure Websites, an MSBuild trick makes sure the WebJobs are published with your web application. As soon as that is done, Azure recognizes the WebJobs and starts them up.
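
Continuous WebJobs live in a conventional folder under App_Data, so the published site ends up with a layout roughly like this (the WebJob folder names here are illustrative):

wwwroot/
    App_Data/
        jobs/
            continuous/
                ImageOptimizer/   <- the optimizer WebJob binaries
                TextMinifier/     <- the minifier WebJob binaries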

The first time they start up, it can take a while for them to finish the first pass of optimizations if you have a lot of files. You might even see the WebJobs restarting in the Azure portal. That’s OK; they start up again immediately and continue where they left off.

The Image Optimizer supports .png, .gif and .jpg files, and the Minifier supports .js and .css files.

Server Explorer in Visual Studio shows us the WebJobs along with a log file and a cache file.

[Image: Server Explorer showing the WebJobs with their .csv log and .xml cache files]

The log file is written to every time a file is optimized. You can open it by double-clicking the .csv file directly in Server Explorer. The cool thing about using a .csv file is that it can be opened in Excel, so you can easily do further calculations on the data.

The cache file (.xml) contains a list of files and their MD5 hash values. This ensures that the same files aren’t optimized over and over again each time you publish or restart the WebJob.
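
Conceptually, the hash check works something like this sketch (an illustration of the idea, not the WebJob’s actual code):

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

public static class OptimizationCache
{
    // Computes the MD5 hash of a file's contents as a hex string.
    static string HashFile(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
        {
            return BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "");
        }
    }

    // Returns true if the file is new or has changed since it was last optimized.
    public static bool NeedsOptimization(string path, IDictionary<string, string> knownHashes)
    {
        string hash = HashFile(path);
        string previous;
        if (knownHashes.TryGetValue(path, out previous) && previous == hash)
            return false; // unchanged since the last optimization pass

        knownHashes[path] = hash;
        return true;
    }
}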

If you have enabled Streaming Logs, then you can see the optimizations happen in real time directly within Visual Studio’s Output Window as well.

Open Source

As always, we keep our source code on GitHub and of course accept pull requests.

Both Sayed and I have wanted to add these features for a long time, but it wasn’t possible before Microsoft introduced Azure WebJobs, because the optimizers require continuously running background tasks to work reliably and in a way that scales.

The demo website used in the video is also open source and is great for playing around with these two optimizers yourself.

Happy optimizing!

Optimizing for website performance includes setting long expiration dates on our static resources, such as images, stylesheets and JavaScript files. Doing so tells the browser to cache our files so it doesn’t have to request them every time the user loads a page. This is one of the most important things to do when optimizing websites.

In ASP.NET on IIS7+ it’s really easy. Just add this chunk of XML to the web.config’s <system.webServer> element:

<staticContent>
  <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
</staticContent>

The above code tells the browsers to automatically cache all static resources for 365 days. That’s good and you should do this right now.
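
Concretely, IIS will then send a response header like this with every static file (365 days expressed in seconds):

Cache-Control: max-age=31536000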

The issue becomes clear the first time you make a change to a static file. How is the browser going to know that the file changed, so it can download the latest version? The answer is that it can’t. It will keep serving the same cached version for the next 365 days, regardless of any changes you make.

Fingerprinting

The good news is that it’s fairly trivial to make a change to our code that changes the URL pointing to the static files, thereby tricking the browser into believing it’s a brand new resource that needs to be downloaded.

Here’s a little class that I use on several websites. It adds a fingerprint, or timestamp, to the URL of a static file.

using System;
using System.IO;
using System.Web;
using System.Web.Caching;
using System.Web.Hosting;

public class Fingerprint
{
    public static string Tag(string rootRelativePath)
    {
        if (HttpRuntime.Cache[rootRelativePath] == null)
        {
            string absolute = HostingEnvironment.MapPath("~" + rootRelativePath);

            // Use the file's last write time as the fingerprint value.
            DateTime date = File.GetLastWriteTime(absolute);
            int index = rootRelativePath.LastIndexOf('/');

            // Insert a "/v-<ticks>" segment right before the file name.
            string result = rootRelativePath.Insert(index, "/v-" + date.Ticks);

            // Cache the fingerprinted path with a dependency on the file,
            // so the entry is evicted and recalculated when the file changes.
            HttpRuntime.Cache.Insert(rootRelativePath, result, new CacheDependency(absolute));
        }

        return HttpRuntime.Cache[rootRelativePath] as string;
    }
}

All you need to do in order to use this class is to modify the references to the static files.

Modify references

Here’s what it looks like in Razor for the stylesheet reference:

<link rel="stylesheet" href="@Fingerprint.Tag("/content/site.css")" />

…and in WebForms:

<link rel="stylesheet" href="<%= Fingerprint.Tag("/content/site.css") %>" />

The result of using the Fingerprint.Tag method will in this case be:

<link rel="stylesheet" href="/content/v-634933238684083941/site.css" />

Since the URL now contains a reference to a non-existent folder (v-634933238684083941), we need to make the web server pretend it exists. We do that with URL rewriting.

URL rewrite

By adding this snippet of XML to the web.config’s <system.webServer> section, we instruct IIS 7+ to intercept all URLs with a folder name matching “v-[numbers]” and rewrite the URL to the original file path.

<rewrite>
  <rules>
    <rule name="fingerprint">
      <match url="([\S]+)(/v-[0-9]+/)([\S]+)" />
      <action type="Rewrite" url="{R:1}/{R:3}" />
    </rule>
  </rules>
</rewrite>

You can use this technique for all your JavaScript and image files as well.
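
In Razor that looks just like the stylesheet reference (the paths here are examples):

<script src="@Fingerprint.Tag("/scripts/site.js")"></script>
<img src="@Fingerprint.Tag("/images/logo.png")" alt="logo" />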

The beauty is that every time you change one of the referenced static files, the fingerprint changes as well. This creates a brand new URL, so the browsers will download the updated files.

FYI, you need to run the AppPool in Integrated Pipeline mode for the <system.webServer> section to have any effect.