ASP.NET Site whitehouse.gov Reviewed

So I was over on Slashdot last night looking for something interesting to read and ran across this tidbit about the new whitehouse.gov site that runs on ASP.NET. Honestly I think the only reason this got mentioned on Slashdot is that yesterday was Inauguration Day. Any other day and it probably would have fallen through the cracks. Anyway, I decided to take a look at the pointers that the author brought up to see if there was something I could learn. Most of the improvements I already knew about, but there were a couple that were new to me. Two that I’d like to consider implementing in my sites are:

  • Remove the X-AspNet-Version header (saves about 30 bytes per request)
  • Use minified jQuery served from Google’s servers (lower latency and better performance)
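For anyone wanting to try these two changes, here’s roughly what they look like. The header is turned off in web.config (the surrounding config layout will depend on your app), and the jQuery reference just points at Google’s copy of the version you already use:

```xml
<!-- web.config: stop ASP.NET from emitting the X-AspNet-Version header -->
<configuration>
  <system.web>
    <httpRuntime enableVersionHeader="false" />
  </system.web>
</configuration>
```

```html
<!-- Reference the minified jQuery build from Google's CDN instead of a local copy -->
<script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
```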

The one issue with using jQuery from Google is that you’re tied to whatever versions they host. If a newer version comes out that you want to use but Google doesn’t offer it yet, you’re out of luck. Luckily I use jQuery 1.2.6, so this isn’t an issue for me.

As for the author’s review, I thought it was pretty good. A real-world example is always a great illustration of what to do and what not to do. I’d have liked to see him suggest a fix for at least one of the issues he found, such as the ViewState issue; that would have been useful for other developers making the same mistake.

ViewState is still necessary for an ASP.NET application; however, you don’t have to pass the entire ViewState back to the client. It’s a waste of bandwidth (and can cause some nasty Base64 exceptions). One solution I use is to store the ViewState on the file system and pass only a file name back to the client for reference later on. That takes up a lot less bandwidth than a potentially huge ViewState. Other options are storing it in memory, in the database, or by some other means. We clean out our old ViewStates every 3-4 hours, as most of them aren’t needed after that point (I’m thinking this might make a good article in the future).
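A minimal sketch of the file-system approach looks something like the following, using ASP.NET’s PageStatePersister extension point. The class name, hidden-field name, and storage folder here are my own illustration, not anything from the article, and a real version would need the periodic cleanup job plus error handling for missing or expired files:

```csharp
using System;
using System.IO;
using System.Web;
using System.Web.UI;

// Stores serialized ViewState/ControlState on disk and sends the client
// only a short key instead of the full Base64 blob.
public class FileViewStatePersister : PageStatePersister
{
    private const string KeyField = "__VIEWSTATE_KEY";
    private static readonly string StateDir =
        HttpContext.Current.Server.MapPath("~/App_Data/ViewState");

    public FileViewStatePersister(Page page) : base(page) { }

    public override void Save()
    {
        // Serialize the state to a file named by a fresh GUID...
        string key = Guid.NewGuid().ToString("N");
        string state = StateFormatter.Serialize(new Pair(ViewState, ControlState));
        File.WriteAllText(Path.Combine(StateDir, key), state);

        // ...and emit only the key as a hidden form field.
        Page.ClientScript.RegisterHiddenField(KeyField, key);
    }

    public override void Load()
    {
        // Look the key up in the posted form and rehydrate the state from disk.
        string key = Page.Request.Form[KeyField];
        string state = File.ReadAllText(Path.Combine(StateDir, key));
        Pair pair = (Pair)StateFormatter.Deserialize(state);
        ViewState = pair.First;
        ControlState = pair.Second;
    }
}

// In a page (or, better, a shared base page class), swap in the persister:
public partial class MyPage : Page
{
    protected override PageStatePersister PageStatePersister
    {
        get { return new FileViewStatePersister(this); }
    }
}
```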

Another thing that kind of irked me was the author’s complaint about the site’s use of meta keywords. Uhm, yeah, meta keywords might not be as relevant for Search Engine Optimization anymore, but it’s still not a bad thing to have them in there. Just keep them to 200-300 characters. Nothing too crazy.
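For what it’s worth, the tag itself costs almost nothing to include (the keywords below are just placeholders, not what the site uses):

```html
<meta name="keywords" content="White House, President, government" />
```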

One last thing he pointed out that I just didn’t agree with was knocking the server for running IIS 6.0. Now, correct me if I’m wrong, but IIS 7.0 is only available on Windows Server 2008, right? Well, IT departments have budgets, so maybe a Windows Server 2008 license, or the hardware to install it on, wasn’t available. I know in my case the budget doesn’t always allow for the latest and greatest, even if I want to use it. So knocking the development team for using IIS 6.0 seems a little over the top if you ask me.

This entire site could definitely use some improvements, which the article nicely points out. To go one step further, I suggest any web developer install YSlow, an extension for Firebug in the Firefox browser. It came in handy for me when I was trying to optimize my sites. The whitehouse.gov site’s YSlow score is a 51 (an F), which is horrible. When I started using YSlow, I noticed some of my sites had a similar score, which appalled me, so I went to fixing things pronto. By implementing the changes YSlow suggested where I could, I got us up to a 78, which is a high C. Some changes you can’t make (like moving JavaScript to the bottom of pages or using a Content Delivery Network) due to how your application works, and changing them just for a minor improvement is more trouble than it’s worth. Still, there isn’t any excuse for an ASP.NET site to score so low. Those folks over at whitehouse.gov definitely need to clean things up!
