In the first part of the checklist, we looked at creating high quality websites from a client perspective and the tools that help us do that. In this part we look at the (free) tools that will help us build high quality into the server side of the website.

Code quality

Treat compiler warnings as errors

When you compile your solution in Visual Studio, it will by default allow compiler warnings. A compiler warning occurs when there is a problem with the code, but nothing that will result in severe errors. Such a warning could be a variable that is declared but never used. These warnings should always be treated as errors, because ignoring them lets bad code slip through. Keyvan has written a post about how to treat compiler warnings as errors.
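As a sketch of what this looks like in practice (assuming a standard C# project file), the same setting Visual Studio exposes on the project's Build tab can be set directly in the .csproj:

```xml
<!-- In the .csproj file, inside each <PropertyGroup> (Debug and Release) -->
<PropertyGroup>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <!-- Alternatively, promote only selected warnings to errors: -->
  <!-- <WarningsAsErrors>0168;0219</WarningsAsErrors> -->
</PropertyGroup>
```

Setting it in the project file has the advantage that it follows the project into source control, so every developer on the team compiles with the same rules.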


StyleCop

The StyleCop Visual Studio add-in analyses your C# code and validates it against a large set of rules. The purpose of the tool is to force you to build maintainable, well documented code using consistent syntax and naming conventions. I’ve found that most of the rules are about maintainability and consistency. After using StyleCop on my latest project, I will never build a C# project without it again.
Some of the rules might seem strange at first glance, but when you give them a closer look you’ll find that they actually make a lot of sense.


FxCop

This tool should be familiar to most .NET developers by now. It has existed for a long time and is now on version 1.36. FxCop doesn’t analyze your C# code but the compiled MSIL code, so it can be used with any .NET language. Some of the rules are the same as in StyleCop, but it also helps you write more robust methods that result in fewer errors.

If you use StyleCop and do proper unit testing, you might not need FxCop, but it’s always a good idea to run it on your assemblies, just in case. Here's a guide to using FxCop in website projects. If you own a Visual Studio Team Edition, you already have FxCop built in.


Anti-Cross site Scripting (XSS) Library

The Anti-XSS library by Microsoft is not just a fancy way to HTML encode text strings entered by users. It uses white-listing, which is much more secure than simply trusting any input and then HTML encoding it in the response. It works with JavaScript, HTML elements and even HTML attributes.

Code Analysis Tool .NET (CAT.NET)

When your website relies on cookies, URL parameters or forms, it is open to attacks. That’s because all three are very easy for hackers, and even robots, to forge and manipulate. By using the CAT.NET add-in for Visual Studio, you can easily analyze the places in your mark-up and code-behind that are vulnerable to those kinds of attacks. CAT.NET analyzes your code and tells you exactly what the problem is. It’s easy to use and understand, and it lets you build more secure websites.


Lock down your ASP.NET application

With IIS 7 it is now easier than ever to customize the inner workings of ASP.NET applications using only the web.config. It is possible to remove all features but the ones the specific application uses. In other words, we are able to lock down our applications and only turn on the features we need. The reason for doing this is to reduce the attack surface of the application and to stay in total control all the way from IIS into the ASP.NET application.

The attack surface is reduced when we turn off unneeded features, since there are fewer ways to access the application. From a security perspective this is desirable. Because we turn off features, we also know exactly what our application can and cannot do. This gives us control and also removes the overhead of those unneeded features.

The features we can control in the web.config come in the form of modules and handlers. In the <system.webServer> config section below, you’ll see a totally locked down application. All default managed modules have been removed and only two handlers remain. The two handlers let you serve .aspx pages and static files such as images and stylesheets.

 <system.webServer>
  <modules>
   <remove name="Profile" />
   <remove name="Session" />
   <remove name="RoleManager" />
   <remove name="FormsAuthentication" />
   <remove name="WindowsAuthentication" />
   <remove name="DefaultAuthentication" />
   <remove name="AnonymousIdentification" />
   <remove name="OutputCache" />
   <remove name="UrlAuthorization" />
   <remove name="FileAuthorization" />
   <remove name="UrlMappingsModule" />
  </modules>
  <handlers>
   <clear />
   <add name="PageHandlerFactory" path="*.aspx" verb="*" type="System.Web.UI.PageHandlerFactory" />
   <!-- Add custom handlers here -->
   <add name="StaticFile" path="*" verb="*" modules="StaticFileModule,DefaultDocumentModule,DirectoryListingModule" resourceType="Either" requireAccess="Read" />
  </handlers>
 </system.webServer>

If you want to register your own handlers, remember to add them above the StaticFile handler. To re-enable a module such as the Session module, just delete the line <remove name="Session" /> and it will automatically be loaded again. Use the IIS Manager to see all the handlers and modules that are available.


A couple of weeks ago I released an online TV guide for Danish viewers called ifjernsyn.dk. The goal was to make a very simple overview that could easily be accessed from a mobile phone and customized by any visitor without any login. The purpose was to always know what’s on the air right now and what programs will shortly follow – and of course to keep it simple.

Since the release, some people have asked me about how I did some of the things and one of the most frequently asked questions was how to find movie posters for all the movies. Apparently, people are interested in finding movie posters for their own pet projects involving their own media collection or even a media center plug-in.

The code

With only the name of a movie, the code will search the Yahoo image search API and return a thumbnail of the poster. The API returns an XML document with both the thumbnail and the full image, so to get the full image you should just change the XPath navigation.

private const string LINK = "http://search.yahooapis.com/ImageSearchService/V1/imageSearch?appid=YahooDemo&query={0} movie&results=1";

public static string FindMoviePoster(string title)
{
  // Requires: using System.Web; using System.Xml.XPath;
  string url = string.Format(LINK, HttpUtility.UrlEncode(title));

  // XPathDocument downloads and parses the XML response from the URL
  XPathDocument xd = new XPathDocument(url);
  XPathNavigator navigator = xd.CreateNavigator();

  navigator.MoveToFirstChild(); // <ResultSet>
  navigator.MoveToFirstChild(); // <Result>
  navigator.MoveToFirstChild(); // first child of <Result>

  do
  {
    if (navigator.LocalName == "Thumbnail")
    {
      navigator.MoveToFirstChild(); // <Url> holds the thumbnail address
      return navigator.Value;
    }
  } while (navigator.MoveToNext());

  return null;
}


The implementation

To use the method above in your own web page, simply pass a movie title to the method and the image URL is returned. It could look like this:

string posterUrl = FindMoviePoster("independence day");

if (!string.IsNullOrEmpty(posterUrl))
{
  imgPoster.ImageUrl = posterUrl;
}


The reason for using the Yahoo API is that it provides the thumbnails as well as the full images.


We web developers are proud people who want to build the best websites every single time. To help us identify areas that could be improved, I’ve compiled a list of various automated and manual checks that can help us make better websites. My professional pride urges me to do the entire list, but I must admit that I sometimes skip some of the checks due to time constraints. I did just finish a project where I did all the checks, and that felt very satisfying.


Mark-up validation

Choose the right DOCTYPE and make sure that the mark-up you write conforms to it. Whether you choose HTML or XHTML, always go for a strict DOCTYPE, since it forces all browsers to render in standards-compliant mode. Otherwise, you might end up with different renderings. Here is a list of DOCTYPEs to choose from. Remember that real men use XHTML 1.0 Strict or XHTML 1.1.
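For reference, this is what the two strict DOCTYPE declarations look like; the declaration must be the very first thing in the document:

```html
<!-- XHTML 1.0 Strict -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

<!-- HTML 4.01 Strict -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
```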

I use the Total Validator or the W3C validator to validate the mark-up against the DOCTYPE.

Stylesheet validation

It’s always a good idea to validate your stylesheets against the W3C standards. You will probably find a lot of small things that are easy to fix. This also ensures better browser compatibility, though it is not as important as mark-up validation. In some cases you deliberately put hacks in your stylesheets to cope with browser differences, and those might break the validation. The W3C CSS validator works for me.

RSS and ATOM feed validation

There are literally hundreds of feed aggregators on the market today and that doesn’t include the custom aggregation done by blog engines, content management systems etc. To ensure they can all read your RSS and ATOM feeds, make sure to validate them against the standard.

Accessibility validation

Make sure your mark-up is structured in a way that is readable for everyone. There are two widely used standards today: Section 508, used in the United States, and the WAI guidelines, which are more comprehensive. It is fairly easy to make your site comply with the rules of Section 508, so you should do that as a minimum.

The WAI guidelines are split into three levels: A, AA and AAA, where A is the simplest and AAA the most comprehensive. If your site validates against Section 508, you should easily be able to validate against WAI-A. In Europe, all government websites must conform to WAI-AA. If you are hardcore and stubborn, go for a WAI-AAA valid website. For a regular website this is not easy, but it can be done.

You can perform the validation using the Total Validator or Cynthia Says.



Use YSlow and Fiddler

Before finishing your website, check the YSlow grade and aim for as high a grade as possible. Also use Fiddler to analyze the requests and responses made by Internet Explorer. Using those two tools can make your site twice as fast with just minor tweaks.

Use general website analysis

It is a little redundant to use the Website Analyzer after you’ve optimized the page using YSlow and Fiddler, but it almost always gives you inspiration for further performance optimizations.

No errors

Broken links check

This one requires no explanation. Run an automated check for broken links before releasing.

Browser compatibility

Even though you have produced valid mark-up and CSS, there is always a need to go through all the pages of the website to make sure they look good in Internet Explorer 6, Firefox 2, Opera 9, Safari 2.5, Chrome and newer versions thereof. The number of browsers you need to check varies with the type of website and the target group. This check is unfortunately manual and can take a loooong time.

Add meaning to content


You don’t need to know an awful lot about search engine optimization to get the basics right. I suggest you at least install the senSEO add-on for Firefox. It provides a lot of useful tips and tricks by analyzing your mark-up and suggesting improvements.

Semantic Extractor

The Semantic Extractor can help you find inconsistencies in the structure of the different tags on your website. It lets you see how the search engine crawlers understand your site.

P3P Policy

If you set cookies, you need a P3P policy HTTP header as a minimum. The header briefly describes how the website behaves with regard to privacy. You might also want to add a more complete privacy statement in a certain XML format. The benefit is that your website will receive elevated privileges under stricter security settings in Internet Explorer. Learn more about P3P. Remember also to validate your P3P policy.
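As an illustration, a compact P3P policy is just an HTTP response header. The CP tokens below are placeholder examples only — compose your own set from the P3P specification so that it reflects what your site actually does:

```http
P3P: policyref="/w3c/p3p.xml", CP="CAO PSA OUR"
```

In ASP.NET you can emit this header with Response.AddHeader, or configure it once for the whole site in IIS so you don’t have to repeat it on every page.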


PICS label

The PICS label is also an HTTP header, but it describes the type of content on your website. It is used by various tools built to protect kids on the web, such as NetNanny. Learn more about PICS.

Use semantic mark-up where it makes sense

If you are listing events or contact information, wrap them in meaningful mark-up such as microformats. This adds extra meaning to your content and opens the door to data portability. Here is an introduction to semantic mark-up.
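As a small example, contact information marked up with the hCard microformat is just ordinary mark-up using agreed-upon class names (the name and number below are made up):

```html
<div class="vcard">
  <span class="fn">John Doe</span>
  <span class="org">Example Corp</span>
  <a class="url" href="http://example.com/">example.com</a>
  <span class="tel">+45 12 34 56 78</span>
</div>
```

Tools that understand hCard can extract this as a contact card automatically, without you having to publish the data in a second format.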

Go mobile

This is relevant if your website targets mobile clients such as PDAs and smartphones.

MobileOK Checker

Run your website through this online tool and it will give you a lot of good feedback on your code. The W3C made it to promote a set of best practices for mobile web applications.


XHTML Basic 1.1

This DOCTYPE can be difficult to code against if your website targets both mobile and richer browsers. However, if you are creating a mobile-only website, then this DOCTYPE is for you. It’s basically a stripped-down version of the XHTML 1.1 DOCTYPE with some mobile-specific enhancements.
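For reference, this is the XHTML Basic 1.1 DOCTYPE declaration:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML Basic 1.1//EN"
  "http://www.w3.org/TR/xhtml-basic/xhtml-basic11.dtd">
```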

Input mode

One of the enhancements of the XHTML Basic 1.1 DOCTYPE is the inputmode attribute on textarea and text input fields. It allows you to specify the type of input that is best suited for the input field. It could be digits, Latin lowercase letters or Greek letters. Devices that understand this attribute will then adjust the input mode.
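A sketch of how the attribute could be used — the token values follow the XHTML Basic 1.1 specification, and devices that don’t recognize the attribute simply ignore it:

```html
<!-- A numeric field: compliant devices switch to digit input -->
<input type="text" name="zip" inputmode="digits" />

<!-- Lowercase Latin letters, e.g. for a user name -->
<input type="text" name="user" inputmode="latin lowerCase" />
```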



MVP award

When I was eating my breakfast at work Monday morning, the Microsoft developer evangelist Daniel called me on my cell phone. “Have you checked your mail?” he asked, and I could hear in his voice that he was smiling more than usual. “Yes, of course,” I replied, “Why?” “You won the 2009 MVP award in ASP.NET.”

“SAY WHAT, for reals homes? That's awesome, but I didn't get the e-mail.” He assured me I had won the award anyway, we hung up, and I was really happy about winning the MVP. However, I did have a problem with the missing e-mail. What if Daniel was misinformed, or some other mix-up had gotten me overly excited on a false premise? I would be devastated if that were the case, since Daniel had got me all worked up. Well, Daniel reassured me a couple of times during the day about the validity of his claim, and that made me relax a bit.

Still, without that e-mail I wouldn’t know for sure…

Later that day I found the mail in Outlook’s spam filter, so now it's official.