
A few days ago, Google released their Closure Compiler project for optimizing JavaScript. Here’s what they write about the Closure Compiler:

The Closure Compiler is a tool for making JavaScript download and run faster. It is a true compiler for JavaScript. Instead of compiling from a source language to machine code, it compiles from JavaScript to better JavaScript. It parses your JavaScript, analyzes it, removes dead code and rewrites and minimizes what's left.

The interesting part of the Closure Compiler is that it not only removes whitespace, it also rewrites your JavaScript code to make it smaller and optimizes it for better performance. My tests show that it can reduce JavaScript files by about 60%, and that's before HTTP compression! Considering how much JavaScript a modern website uses, this is nothing less than amazing and highly useful.

The Closure Compiler comes in two flavors: a Java-based command-line tool and a RESTful API. I've been playing around with the API, and it works great and is very fast.

The code

The C# class I've written takes a JavaScript file, passes it through the API, and returns the compressed JavaScript as a string. The class contains one public and one private method and is only 47 lines of code, including 16 lines of comments.

public string Compress(string file)
{
  string source = File.ReadAllText(file);
  XmlDocument xml = CallApi(source);
  return xml.SelectSingleNode("//compiledCode").InnerText;
}

private static XmlDocument CallApi(string source)
{
  using (WebClient client = new WebClient())
  {
    client.Headers.Add("content-type", "application/x-www-form-urlencoded");
    string data = string.Format(PostData, HttpUtility.UrlEncode(source));
    string result = client.UploadString(ApiEndpoint, data);

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(result);
    return doc;
  }
}
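The PostData and ApiEndpoint members referenced above are just string constants on the class (part of the 47 lines but left out here). Based on the parameters the Closure Compiler API documents, they would look something like this sketch; double-check the exact parameter names against Google's documentation:

// Assumed values based on the public API documentation - verify before use.
private const string ApiEndpoint = "http://closure-compiler.appspot.com/compile";
private const string PostData =
  "js_code={0}&compilation_level=SIMPLE_OPTIMIZATIONS" +
  "&output_format=xml&output_info=compiled_code";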

How to use it

You can use the class to do various cool things. You can write an MSBuild or NAnt script that automatically compresses your JavaScript files as part of a continuous integration process or, as I prefer, write an HTTP handler that does the same at runtime (see the sketch further down). Remember to output cache the compressed result. Here's an example of using the class from ASP.NET:

GoogleClosure gc = new GoogleClosure();
string script = gc.Compress(Server.MapPath("~/script.js"));

Remember that the class doesn't do any exception handling, so you might want to stick that in yourself.
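A minimal sketch of such an HTTP handler follows below. The compress.ashx?file=~/script.js convention is my own assumption, and I've used the application cache (rather than the OutputCache directive) as one way to avoid hitting the API on every request. It still needs the exception handling and input validation mentioned above.

using System.Web;
using System.Web.Caching;

public class ClosureHandler : IHttpHandler
{
  public bool IsReusable
  {
    get { return true; }
  }

  public void ProcessRequest(HttpContext context)
  {
    // Assumed convention: compress.ashx?file=~/script.js
    // In production, validate that the file is a .js file inside the website.
    string file = context.Server.MapPath(context.Request.QueryString["file"]);

    // Cache the compressed script so the API is only called when the file changes.
    string script = context.Cache[file] as string;

    if (script == null)
    {
      GoogleClosure gc = new GoogleClosure();
      script = gc.Compress(file);
      context.Cache.Insert(file, script, new CacheDependency(file));
    }

    context.Response.ContentType = "text/javascript";
    context.Response.Write(script);
  }
}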

Download

GoogleClosure.zip (905 bytes)


Recently I had to use iframes on a website conforming to XHTML 1.0 Strict. As you might know, the XHTML 1.0 Strict doctype doesn't allow the use of iframes. The XHTML 1.0 Transitional doctype, on the other hand, does allow iframes, but I don't like to use that doctype. The reason is, as the name implies, that it's a doctype meant to ease the transition from HTML to XHTML, a sort of temporary solution.

When building new websites I like to use a strict doctype because it doesn't allow many of the styling and behavioral tags that are much better placed in stylesheets and JavaScript.

What's needed is a doctype that conforms to XHTML 1.0 Strict but also allows iframes.

A solution

What I came up with was very simple, but may be considered a hack by some. Basically, I took the document type definition (DTD) of XHTML 1.0 Strict and added support for iframes. I looked at how iframes are supported in the Transitional DTD and copied that into the Strict DTD. The Transitional version allows attributes like width and height that the Strict DTD doesn't, so I removed those and then added support for the allowtransparency attribute.
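Roughly speaking, the additions look something like the sketch below. This is only an outline from the Transitional declarations with the changes described above, not a copy of the published DTD, so use the real file linked further down.

<!-- iframe also has to be added to the inline content model (the %special; entity) -->

<!ELEMENT iframe %Flow;>
<!ATTLIST iframe
  %coreattrs;
  longdesc          %URI;          #IMPLIED
  name              NMTOKEN        #IMPLIED
  src               %URI;          #IMPLIED
  frameborder       (1|0)          "1"
  marginwidth       %Pixels;       #IMPLIED
  marginheight      %Pixels;       #IMPLIED
  scrolling         (yes|no|auto)  "auto"
  allowtransparency (true|false)   #IMPLIED
  >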

So to make iframes work on your own invalid XHTML 1.0 Strict page, just replace the doctype at the top of your pages with this new one:

<!DOCTYPE html SYSTEM "http://madskristensen.net/custom/xhtml1-iframe.dtd">

I suggest you download and host the DTD on your own server instead of using mine in case I forget to pay my hosting fee.

Check out the demo of XHTML 1.0 Strict with Iframe

A hack?

Some might say it's a hack because by using this DTD the page is no longer XHTML 1.0 Strict. That is correct. It is something new and different: a doctype identical to XHTML 1.0 Strict except that it also supports iframes. So it's XHTML 1.0 Strict with Iframe.

XHTML has built-in support for custom DTDs, so this is completely supported and valid XHTML. If you don't like using doctypes other than the few main ones created by the W3C, then I have to ask why. What do they give you, your users, or the quality of the page that this new one doesn't?

In my book, it comes down to using a doctype that is based on known standards (XHTML 1.0 Strict in this case) so the markup still makes sense to other developers when they read it. It's also important that the DTD is strict (yep, this DTD is still strict), but most of all it's important that the markup conforms correctly to the DTD so the entire page is valid. Remember, when using custom DTDs your page is still valid XHTML.

Note

I tried using XHTML 1.1 modules to build the DTD, but it never worked out for me. I got to the point where the iframe tag was valid but not allowed inside any other tags, including body. I couldn't seem to find a way to get full support for it. If you know how, please let me know.


In the past few days, I've worked on finding a way to do static code analysis on JavaScript files. The reason is that I want to apply some sort of binary and source code checking like FxCop and StyleCop provide for C#.

There are some tools for linting JavaScript, like JavaScript Lint, but linting only checks syntax, not implementation. For that, I found the JScript compiler built into the .NET Framework to be just what I wanted. It compiles JavaScript and reports any errors it finds.

To test it out, I wrote a simple C# class that takes an array of JavaScript files to compile. I then called the class from a unit test, so I can make the test fail if the compiler finds any errors in the script files. The class contains a single public method called Compile, and here is a simplified example of how to use it from any unit testing framework. You can download the class at the bottom of this post.

[Test]
public void JavascriptTest()
{
  string[] javascriptFiles = Directory.GetFiles(@"D:\Website", "*.js",
    SearchOption.AllDirectories);

  using (JavaScriptCompiler compiler = new JavaScriptCompiler())
  {
    compiler.Compile(javascriptFiles);
    Assert.IsFalse(compiler.HasErrors, compiler.ErrorMessage);
  }
}
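The class itself is in the download below, but the idea is to hand the files to the JScript CodeDOM provider and collect whatever errors it reports. Here is a rough sketch of how such a class could be put together; the downloadable version may differ in the details.

using System;
using System.CodeDom.Compiler;
using System.Text;
using Microsoft.JScript;

public class JavaScriptCompiler : IDisposable
{
  private readonly JScriptCodeProvider provider = new JScriptCodeProvider();

  public bool HasErrors { get; private set; }
  public string ErrorMessage { get; private set; }

  public void Compile(params string[] files)
  {
    CompilerParameters options = new CompilerParameters();
    options.GenerateInMemory = true;

    StringBuilder errors = new StringBuilder();

    foreach (string file in files)
    {
      // Compile each file and collect the errors (ignoring warnings).
      CompilerResults results = provider.CompileAssemblyFromFile(options, file);

      foreach (CompilerError error in results.Errors)
      {
        if (!error.IsWarning)
        {
          HasErrors = true;
          errors.AppendLine(file + " (" + error.Line + "): " + error.ErrorText);
        }
      }
    }

    ErrorMessage = errors.ToString();
  }

  public void Dispose()
  {
    provider.Dispose();
  }
}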

What's nice is that by doing compile-time checking of JavaScript, I get that little extra safety net that's so hard to get when developing JavaScript-heavy websites. The Microsoft JScript compiler isn't perfect, so I still recommend using a linting tool as well; the two approaches cover different scenarios. I hope to have a very simple example of using a linting tool soon.

Download

Remember to add a reference to Microsoft.JScript in your Visual Studio project before using this class.

JavaScriptCompiler.cs (2.48 KB)


Internet Explorer 8 introduced a new mechanism for ensuring backwards compatibility with websites built for IE7, so "the web" didn't break with IE8's more standards-compliant rendering. You can tell IE8 to render your website as IE7 and thereby avoid having to fix potential problems with markup or stylesheets. You can do that in two different ways:

Using a meta-tag:
<meta http-equiv="X-UA-Compatible" content="IE=7" />

or this HTTP header:
X-UA-Compatible: IE=7

This puts IE8 into IE7 rendering mode. You can read more about how and why this was done and made into a standard at A List Apart.

The bonus feature

When IE8 renders a page, it looks for the meta-tag or HTTP header to determine whether to render in regular standards mode or in IE7 standards mode. So you would think that if you don't add the meta-tag or HTTP header, IE8 will just automatically render in IE8 standards mode, right?

According to this flow diagram of IE8 rendering, that isn't the case. If you don't specify a meta-tag or HTTP header, IE8 goes through a lot of checks to determine how to render your webpage. You can very easily avoid the overhead and uncertainty by explicitly specifying IE8 rendering mode. The diagram tells us to use the meta-tag to specify this directly:

<meta http-equiv="X-UA-Compatible" content="IE=8" />

This meta-tag tells IE8 to skip directly to the DOCTYPE check by bypassing all other checks. If you can't add the meta-tag but can add an HTTP header, use this:

X-UA-Compatible: IE=8

The diagram tells us that the meta-tag is preferable over the HTTP header.
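If you can't touch the markup but do control the ASP.NET code, one way to emit the header (my own assumption, not something from the diagram) is to append it from a page or base page:

// Emits the header for this page; an HttpModule or an IIS custom header
// would be a better fit if it should cover the whole site.
protected void Page_Load(object sender, EventArgs e)
{
  Response.AppendHeader("X-UA-Compatible", "IE=8");
}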

Always target the latest IE browser

Setting the X-UA-Compatible meta-tag/header to IE=8 only targets IE8 and no other browser. But what happens when IE9 ships? Microsoft has been clever enough to let you target the latest IE browser no matter what the version might be. You can set X-UA-Compatible to IE=Edge and it will take effect in IE8 and all future IE versions. Keep in mind that upcoming beta releases and internal builds might not render correctly, so use IE=Edge at your own risk.

Button disappears

Another nice thing about explicitly specifying IE8 rendering mode is that the Compatibility View button disappears from the toolbar. Removing that choice signals to visitors that what they are seeing is actually how you meant your webpage to look.

This is one of those little features that gives you a little extra control without compromising anything. I see no reason not to use this on any IE8 standards compliant website today.


Last week a colleague and I gave a talk about scalable architecture. Where my colleague talked about databases and application-layer scaling, I talked about scaling websites. More precisely, we talked about the upcoming ZYB/Vodafone project.

Since there’s still a lot of secrecy about the project, we managed to keep the concepts general. General or not, I’d like to share some thoughts on a different way of scaling websites.

Load balancing

Larger websites are often hosted on multiple web servers behind a load balancer that distributes requests evenly among the servers. This is an old technique for scaling out websites and has been the de facto scaling mechanism for years. It's good, it works, and it's cheap. It's cheap because web servers often don't have to be the biggest machines, in contrast to, say, database servers.

So, a load balanced web server setup provides good and cheap scaling possibilities.

Reversed load balancing

Any website, load balanced or not, can also use the vast untapped resources in its visitors' browsers. Think about it. Quad-core CPUs and 4 GB of memory are almost standard today, even on laptops. Why not utilize the machine power behind the browser to do some of the scaling for us?

Traditionally, this is done using browser plug-ins like applets, Flash and Silverlight, but many more sites use JavaScript. Modern browsers process JavaScript quickly and efficiently, which makes it possible to use JavaScript for scaling purposes.

To utilize the browser's memory, we can cache data in JavaScript and eliminate chatty communication with the web server. An example would be to load all the data needed behind the scenes after the page has loaded and store it in JavaScript variables. To utilize the CPU, we can do calculations, dynamic rendering and other logic in JavaScript as well.

By pushing some of the load to the browser we are able to scale even more than just using regular load balancing.

It’s not for everyone

There are some problems with this approach that make it a bad choice for some websites. If enough of your visitors are using old browsers like IE6, they will get a worse experience because JavaScript runs too slowly. There's also the case where a website simply doesn't have any data to cache, like a personal website.

For other types of websites it makes perfect sense. If your visitors have modern browsers and your website is heavily data driven, then it's a possible candidate. The tests we have done at ZYB show huge benefits from loading data behind the scenes: both performance and scalability improve significantly. The load on the web servers dropped drastically with this technique. I hope to be able to show you some real numbers later.