Plip's Weblog

Phil Winstanley - British .NET chap based in Lancashire. Enjoys tea and tech. Working for Microsoft.

Pimp my ASP.NET web application - Part One

Over the coming posts in this series we will explore some of the ways in which I've learnt to "Pimp up my ASP.NET applications". This won't be a technical journey, though; rather, it'll be much more about the method of my madness, explaining why I do things the way I do. In this post, we'll focus on performance.


Part One - Performance

In the past I've worked with people who took performance very seriously. So seriously that they would weigh up memory allocation when choosing variable types, and the different ways in which reference and value types behave. I have only one way to describe those developers: they are clearly clinically insane.

"There is more to life than increasing its speed"
-- Mahatma Gandhi (1869 - 1948)

Application performance can make or break a user's opinion, so it's vital to get it right. But there is a fine balance; like drugs, alcohol and sex, knowing when to stop is of vital importance.


Measure & Benchmark

Whilst it's very tempting to jump straight into fixing an application with performance issues, a little patience and work now will make the effort later on much more rewarding.

"Measure not the work until the day's out and the labor done."
-- Elizabeth Barrett Browning (1806 - 1861)

Until you've taken the time to measure your current performance, you can't tell whether the changes you make are actually making any real difference.

Let's take the page rendered at the URL as an example. Here's its load time for me (after having previously visited it): -

Once you have some numbers, you can actively begin to reduce them. I suppose what I'm trying to say is that there's no point doing performance work if you can't prove it's made a difference.
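The pattern is simply "time it, change it, time it again". Here's a minimal sketch of a reusable timing harness; the Thread.Sleep stands in for whatever you're actually measuring, such as a request made against your application's URL:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class BenchmarkDemo
{
    // Times an arbitrary action and returns the elapsed milliseconds.
    static long TimeAction(Action action)
    {
        Stopwatch watch = Stopwatch.StartNew();
        action();
        watch.Stop();
        return watch.ElapsedMilliseconds;
    }

    static void Main()
    {
        // Stand-in for a real page request; in practice you'd time an
        // HttpWebRequest against your own application instead.
        long elapsed = TimeAction(() => Thread.Sleep(50));
        Console.WriteLine("Took {0} ms", elapsed);
    }
}
```

For real pages you'd record the elapsed time over several runs (or let a tool like Application Center Test do it for you), since single measurements are noisy.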

Rob Howard wrote a fantastic piece for MSDN on the Application Center Test tool "Tools of the Trade: Application Center Test". It's stuffed full of useful information.


It's faster that way ... 

As a rule of thumb, I try not to optimise an application until it's finished. Granted, I usually know the patterns to follow which will make the code run in a timely fashion, but I don't build performance in. There's a good reason for this: performance is only one aspect you must face when building your applications; maintainability and security are of equal importance, and often, as soon as code is optimised for performance, its maintenance becomes much more difficult.

Let's have a quick look at the following example of some C# which was optimised by the developer whilst writing it, rather than afterwards, which was rather unwise: -

while (reader.Read())
{
    User U = new User();

    U.Id = reader.GetGuid(0);
    U.LocalTime = DateTime.Now.AddHours(reader.GetInt32(10));
    U.Referral = reader.GetString(1);
    U.IP = reader.GetString(2);
    U.AccessSpeed = reader.GetInt32(3);
}



Yes, it is marginally faster to access columns in a reader by their indexes rather than by the string names of the columns. It is, however, a great deal more difficult to read, not to mention the consequences should someone change the order of the columns coming out of the query. (Combine this with a SELECT * and you're really in trouble!)
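For comparison, here's a sketch of the more readable approach: look each column up by name once via GetOrdinal, then reuse the resulting index inside the loop. The DataTable is only there so the snippet is self-contained; against a real database you'd have a SqlDataReader instead, but the calls are the same IDataReader members:

```csharp
using System;
using System.Data;

class ReaderDemo
{
    static void Main()
    {
        // Stand-in for a real query result; DataTableReader implements
        // the same IDataReader interface as SqlDataReader.
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(Guid));
        table.Columns.Add("Referral", typeof(string));
        table.Columns.Add("IP", typeof(string));
        table.Rows.Add(Guid.NewGuid(), "google", "127.0.0.1");

        using (IDataReader reader = table.CreateDataReader())
        {
            // Resolve each column name to an index once, outside the loop.
            int idCol = reader.GetOrdinal("Id");
            int referralCol = reader.GetOrdinal("Referral");
            int ipCol = reader.GetOrdinal("IP");

            while (reader.Read())
            {
                Guid id = reader.GetGuid(idCol);
                string referral = reader.GetString(referralCol);
                string ip = reader.GetString(ipCol);
                Console.WriteLine("{0} {1} {2}", id, referral, ip);
            }
        }
    }
}
```

You keep almost all of the speed of index access, the code still reads like English, and a reordered SELECT no longer silently puts the wrong data in the wrong property.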

An argument I see a lot from developers, especially in startups, is "it won't scale!". Sometimes it's better to write the code now, accept that it won't scale, and force a rewrite later on when it's more financially viable, than to build scalability in from the beginning. It may make us feel a bit dirty, but it is the reality of the situation.

So my message on this is clear, don't optimise your code until you need to; understand the return on investment of your time and the associated business requirements. Use your time wisely.



Compression

HTML, as I'm sure you are more than aware, can be a tad on the wordy side at times. A simple way of increasing the performance of your applications is simply to reduce the amount of HTML. This can be something simple, like changing the names of your CSS classes so that they're much shorter, or more complex, such as transmitting your HTML zipped up to clients.

"MacDonald has the gift of compressing the largest amount of words into the smallest amount of thoughts."
-- Sir Winston Churchill (1874 - 1965) 

Compression might sound to you like a bit of a lame duck, however I can assure you it is most certainly not. A little bit of compression can turn your family saloon into a sports car.


Some argue that compression only benefits those of your users who are on slow connections (or who happen to be downloading 20 torrents at the time), but I'd argue there are other fantastic benefits to using compression with your applications. The cost of bandwidth is still high compared with the cost of other services, and every byte you save through compression is a byte you don't have to pay for.
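To get a feel for the numbers, here's a small self-contained sketch that gzips a chunk of repetitive markup with GZipStream. The HTML here is invented for illustration, but real pages, being full of repeated tags and class names, shrink similarly:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class CompressionDemo
{
    // Gzips a string and returns the compressed bytes.
    public static byte[] GzipCompress(string text)
    {
        byte[] raw = Encoding.UTF8.GetBytes(text);
        using (MemoryStream output = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            // ToArray is still valid after the GZipStream has flushed and closed.
            return output.ToArray();
        }
    }

    static void Main()
    {
        // Repetitive markup, much like real HTML, compresses extremely well.
        StringBuilder html = new StringBuilder();
        for (int i = 0; i < 200; i++)
            html.Append("<div class=\"newsItemContainer\">Hello world</div>\n");

        byte[] raw = Encoding.UTF8.GetBytes(html.ToString());
        byte[] zipped = GzipCompress(html.ToString());
        Console.WriteLine("Raw: {0} bytes, gzipped: {1} bytes", raw.Length, zipped.Length);
    }
}
```

Typical HTML compresses down to a fraction of its original size, which is exactly why the server-side tools below earn their keep.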

Products I've been exposed to from a company called Port80 Software, including httpZip and ZipEnable, are extremely effective at reducing your payload overhead (the amount your web server is sending to users) without a significant loss in performance.

Incidentally, whilst we're talking about compression: when you deploy your ASP.NET applications, be sure to flick the compilation mode to release. ASP.NET behaves differently and compresses certain things automatically when in release mode (as well as being faster anyway!).
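The switch lives in web.config; a minimal fragment using the standard ASP.NET compilation element:

```xml
<configuration>
  <system.web>
    <!-- debug="false" is the release setting: faster compiled output,
         timeouts enforced, and script/WebResource responses cacheable. -->
    <compilation debug="false" />
  </system.web>
</configuration>
```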


HTML, JavaScript and CSS Optimisation

Inspecting the payload your server is sending out can be quite tricky. Luckily for the likes of you and me, the boffins at Microsoft created Fiddler, and Simtec created HttpWatch (which I use every single day; it's probably the best developer tool I've got in my arsenal, and coincidentally the tool you'll see screen shots from throughout this article).

Spend time to ensure your start page loads fast. First impressions do count. 

CSS optimisation seems like a very easy and sensible thing to do. Here's a tool which is online and looks pretty interesting. Here are the results I received when running one of my CSS files through the CSS Optimizer.


A tool I've not used, but which I know others swear by, is JSMin, which strips out white space and performs other optimisations on JavaScript. It would be reasonably easy to hook up a little MSBuild task to run over the output folder of your web application after a deployment with a Visual Studio Web Deployment Project. (That would be rather cool, don't you think?)
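As a sketch of that idea, something along these lines could live in the project file. The jsmin.exe path and the $(OutputFolder) property are assumptions here for illustration, not part of any shipped build script:

```xml
<!-- Hypothetical target: assumes jsmin.exe is on the PATH and that
     $(OutputFolder) points at the deployed web application. -->
<Target Name="MinifyJavaScript">
  <ItemGroup>
    <ScriptFiles Include="$(OutputFolder)\**\*.js" />
  </ItemGroup>
  <!-- JSMin reads stdin and writes stdout, so redirect per file. -->
  <Exec Command="jsmin.exe &lt; &quot;%(ScriptFiles.FullPath)&quot; &gt; &quot;%(ScriptFiles.FullPath).min&quot;" />
</Target>
```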



Caching

Why bother writing it properly when you can just cache the data? I think it's a very valuable philosophy. Obviously this can't be followed 100% of the time; however, as a general rule, I think it's a very effective way of getting round the daily constraints placed upon us by deadlines and other artificial barriers.

"Caching can be a good way to get "good enough" performance without requiring a lot of time and analysis. Again, memory is cheap, so if you can get the performance you need by caching the output for 30 seconds instead of spending a day or a week trying to optimize your code or database, do the caching solution (assuming 30-second old data is ok) and move on."


ASP.NET has an absolutely fantastic caching system which can be used to cache not only in-memory representations of data but also the resultant HTML. With dependencies and fragment caching you can very quickly make a slow application look rapid.
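For instance, the rendered HTML of a page (or of a user control, which is how fragment caching works) can be kept with a single directive; the parameter name here is just for illustration:

```aspx
<%-- Keep the rendered HTML for 60 seconds, storing one copy per
     distinct value of the "id" query string / form parameter. --%>
<%@ OutputCache Duration="60" VaryByParam="id" %>
```

Put the directive on a page to cache the whole response, or on an .ascx user control to cache just that fragment while the rest of the page stays dynamic.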

Also, to be clear, caching isn't just about System.Web.HttpContext.Current.Cache; it's also about the way in which browsers treat the HTML and assets of your web applications.

On the first load of the page, the status code for all the items is 200, which means each has been received from the web server (successfully, as it happens).

On any subsequent load, if the files are held in the local browser cache then that's where they're loaded from, as indicated here by a status code of 304 (Not Modified). Note the main page is still loaded from the web server; that's because it's dynamic content and should always come from the server.
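On the server side you can encourage those 304s (and outright local cache hits) by setting cache headers on responses that rarely change. A sketch, assuming this runs inside an ASP.NET page or handler:

```csharp
// Tell browsers and proxies they may keep this response for a day, so
// repeat visits are served from the local cache or answered with a
// cheap 304 rather than the full payload.
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddDays(1));
Response.Cache.SetLastModified(File.GetLastWriteTimeUtc(Request.PhysicalPath));
```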



Database Optimisation

I'm not a database administrator, nor do I want to be (open-toed sandals just aren't me), so I'll leave the SQL Server optimisation to those who know it and instead leave you with a rather fun cartoon from





What do you think? I'd love to know how you approach optimisation and performance issues.

Also, I'd love some brownie points for not using the word "Performant" in this piece...








Chris Hardy (ChrisNTR) said:

YSlow for Firebug (a Firefox extension) is also a great tool for checking site performance, and it even provides helpful hints.

Something I use for PHP is compressing the CSS and JS into one file, thus reducing the number of HTTP requests and increasing performance. It would be good to know if there's something similar in ASP.NET.

# February 9, 2008 7:37 PM

Dave said:

I agree with you about the use of GetInt32/GetString/etc. on a reader; I think only under the highest stress will the perf difference show, and to be quite honest I don't think most people (myself included) really know how to do performance testing to that level. However, if you do want the fastest possible access, declare some variables and use GetOrdinal, eg:

int _IPColumn = reader.GetOrdinal("IP");


U.IP = reader.GetString(_IPColumn);

# February 10, 2008 5:16 AM

barryd said:

You lose the "performant" brownie points by putting the image of you pimping, complete with fur coat and hat, in my head.

I agree with Dave, the GetOrdinal call is the way I go if I can.

It's funny how Expression will optimise and remove spaces and comments when it publishes, but VS2008 won't.

# February 10, 2008 10:20 AM

Chris G said:

Brill Phil, keep 'em coming.

On your accessing fields by strings point - I have used enums to help keep some kind of readability whilst optimising performance.


# February 11, 2008 6:33 AM

