IIS 7 Compression. Good? Bad? How much?

If you haven't properly utilized compression in IIS, you're missing out on a lot!  Compression is a trade-off of CPU for Bandwidth.  With the expense of bandwidth and relative abundance of CPU, it's an easy trade-off.  Yet, are you sure that your server is tuned optimally?  I wasn't, which is why I finally sat down to find out for sure.  I'll share the findings here.

A few years ago I wrote about compression on IIS 6.  With IIS 6, the Microsoft defaults were a long way from the optimum settings, and a number of changes were necessary before IIS compression worked well.  My goal here is to dig deep into IIS 7 compression, find out the impact that the various compression levels have, and see how much adjusting is needed to finely tune a Windows web server.

Note: If you don't care about all the details, jump right down to the conclusion.  I've made sure to put the key review information there.

What's my purpose here?

To find out the bandwidth savings for the different compression levels, contrast them with the performance impact on the system, and come up with a recommended configuration.

Understanding IIS 7 and its Differences from IIS 6

IIS 6 allowed compression per extension type (e.g. aspx, html, txt) and allowed it to be turned on and off per site, folder or file.  Making changes wasn't easy, but it was possible with a bit of scripting or by editing the metabase file directly.

IIS 7 changes this somewhat.  Instead of compression per extension, it's per mime type.   Additionally, it's easier to enable or disable compression per server/site/folder/file in IIS 7.  It's configurable from IIS Manager, Appcmd.exe, web.config and all programming APIs.

Additionally, IIS 7 gives the ability to have compression automatically stopped when the CPU on the server is above a set threshold.  This means that when CPU is freely available, compression will be applied, but when CPU isn't available, it is temporarily disabled so that it won't overburden the server.  Thanks IIS team!  These settings are easily configurable, which I'll cover more in a future blog post.

Both IIS 6 and IIS 7 allow 11 levels of compression (actually 'Off' and 10 levels of 'On').  The goal for this post is to compare the various levels and see the impact of each.
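
IIS's gzip levels behave much like the 0-9 compresslevel knob that zlib exposes, so a quick sketch in Python (an analogy, not a measurement of IIS itself) shows why the per-level gains taper off:

```python
import zlib

# Roughly 4kb of repetitive markup, standing in for a typical HTML page.
sample = b"<p>The quick brown fox jumps over the lazy dog.</p>\n" * 80

# Compress at every level; level 0 just stores the bytes with a small
# wrapper, while 1-9 trade increasing CPU effort for smaller output.
sizes = {level: len(zlib.compress(sample, level)) for level in range(10)}

for level in range(10):
    print(level, sizes[level])
```

On repetitive input like this, most of the savings arrive by the low-to-middle levels, which mirrors the shape of the IIS results that follow.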

Objectives and Rules

The first thing I had to do was determine what tests I wanted to run, and how I could achieve them.  I also wanted to ensure that I could clearly see the differences between the various levels.  Here are some key objectives that I set for myself:

  1. Various file sizes: All tests had to be applied to various sizes of files.  I initially tested using the default IIS 7 homepage (689 bytes), a 4kb file, a 28kb file and a 516kb file.  I figured that would give a good range of common file sizes.

    As it turned out, part way through I discovered that performance is severely affected on files of about 200kb and greater, so I did another series of tests on 100kb, 200kb, 400kb and 800kb files.
     
  2. Test compression only:  It was important to me that compression was the only factor and that other variables didn't cloud the picture.  For that reason, all of my test pages generate almost no CPU on their own.  Almost all of the CPU increases in my tests are from compression only.
     
  3. Compression ratio:  I used Port 80 Software's free online web tool to get the compression amount.
     
  4. Stress Testing:  I used Microsoft's free WCAT tool.  For the heavy load testing I used two WCAT client servers so that I could push the IIS server to its limit.  One client couldn't quite bring the IIS server to 100% CPU.
     
  5. Baseline Transactions per Second: After testing for a while, I realized that the most useful data came from a fixed trans/sec goal.  Otherwise, if I tested with a virtually instant page and thousands of transactions per second, the compression resource usage is unrealistic and the data confusing.  Very few real-life IIS servers handle thousands of very fast pages per second.  So, to make the numbers more useful, I added a 125ms delay to each page so that my server could serve up 120 trans/sec with 0% CPU when compression is turned off.
     
  6. Compression at all CPU Levels: I had to turn off the CPU roll-off so that all pages are compressed, even at 100% CPU.  IIS 7's default setting of CPU roll-off is great, but it gets in the way of this testing.
     
  7. Solid Computer: Since compression is CPU bound, it's important that the server isn't a dinosaur.  The test server is a Dell M600 with a Quad Core Intel Xeon processor (E5405 @ 2.00GHz).  The server only has 1GB of RAM, but physical memory was never used up during these tests, so it had all the memory that it needed.  The disks are SAS, 10K RPM, in a RAID 1 configuration.

I'll include the raw data at the bottom of the post for anyone that is interested.

Let's take a look at the results.

Compression Levels

IIS 7 Compression Ratio

As shown in the graph here, the largest increase was between compression levels 3 and 4.  After that, the compression improvements were very gradual.  If you check out the raw data below, though, every level offered at least some incremental benefit.

Time to First Byte

Here's a fun test I've always wanted to do, but never got around to it.  I was curious how quickly a compressed vs. a non-compressed page would load under minimal server load.  In other words, if there is plenty of CPU to spare, does a compressed page load at near line-speed?  While both Time to First Byte (TTFB)--when the page starts to load--and Time to Last Byte (TTLB)--when the page finishes loading--are valuable, I felt that the TTFB gave the picture that I needed.

IIS 7 Compression - TTFB

As you can see, all but the 516kb file came in at about 0 milliseconds, even compressed.  Note that the 28kb and 4kb lines are hidden behind the 689-byte line.  Even the 516kb file came in at under 80ms.  What's 80ms?  That's very low unless you have a really busy site.  As far as I'm concerned from this data, even a 516kb file loads at line speed as long as the server isn't bogged down.

Let's think about the 80ms a bit more.  Dynamic pages, database calls and other factors play a much bigger role.  Plus, the <80ms saves over 300kb on the page transfer, which is a savings of 2.34 seconds on a 1Mbps line.  (Calculation: a 1Mbps line transfers 1024 Kbps / 8 = 128 KB per second; 300 KB / 128 KB/s = 2.34 seconds.)
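
The back-of-envelope math can be reproduced in a couple of lines:

```python
# Savings from compressing the 516kb page: roughly 300 KB fewer to transfer.
savings_kb = 300
line_kb_per_sec = 1024 / 8   # a 1Mbps line moves 128 KB per second

seconds_saved = savings_kb / line_kb_per_sec
print(round(seconds_saved, 2))   # -> 2.34
```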

All of the smaller pages load almost instantly, so the impression that the end user gets from a non-pegged server is going to be better with compression on.  Note that my testing doesn't take Internet latency and limits into account.  So, as long as the server can compress it at near line speed, it's going to benefit the site visitor all the more.

CPU Perspective

The following two charts are from a second round of testing that I did, so the file sizes are different than in the previous test.  In my first round of testing I found that the file size makes a huge difference.  With anything under 100kb, the compression overhead is almost non-existent.  However, once you have file sizes of a couple hundred kb or greater, the CPU overhead is significant.

I ran the tests on 100kb, 200kb, 400kb and 800kb files.  To put the size in perspective, I spent a good 10 minutes getting content from various sites to come up with 800kb of text.  I used pages like this from Scott Guthrie's blog, and it still took multiple pages to collect 800kb of text.  That particular page only has 145kb of text on it, but it's 440kb in size because of the HTML mark-up.  So, a 400kb file isn't overly common (not many people have as many blog comments as Scott Gu), but it's certainly possible.

To obtain a good CPU chart, I had to use some trickery.  I wanted to create a test where I could hit the server as hard as a heavily utilized server is hit, but somehow keep non-compression related CPU to 0.  I achieved this by using System.Threading.Thread.Sleep(125) in the pages.  This sets a 125ms delay that doesn't use any CPU.  Then I set the WCAT thread count to 30 per server (2 WCAT client computers).  This became my baseline since, with these settings, the IIS server handled 120 Transactions/second.  From looking at some busy production servers here at ORCS Web, I concluded that that was a reasonable level for a busy server.  The CPU load without compression was nearly zero, so I was pleased with this test case. 
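
The actual test pages were ASP.NET, but the delay trick is language-agnostic; here is an analogous sketch as a minimal Python WSGI app (an illustration, not the harness used in these tests):

```python
import time

def delay_page(environ, start_response):
    """Equivalent of the Thread.Sleep(125) test page: hold the request
    for 125ms without doing any work, so that any CPU the load test
    measures comes from compression rather than page generation."""
    time.sleep(0.125)  # blocks the worker without burning CPU
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>test page</body></html>"]
```

Served behind a compressing front end, a page like this keeps the non-compression CPU cost at effectively zero.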

IIS 7 Compression - CPU Usage

I found this the most interesting.  Notice that each file size hit a sweet spot at a different compression level.  For example, if you figure that your files are mostly 200kb and smaller, you may want to use Compression Level 4, which uses almost no CPU for a 200kb file. 

This is on a Quad Core server, targeting 120 transactions/second.  As you can see, if you plan to have that level of traffic on your server, and your file sizes are into the hundreds of kilobytes, you may want to watch the compression utilization on the server.  It may be using more resources than you realize.  I don't think most people have much to worry about though, as the average page size is less than 100kb, and the load on the average server is often much less than 120 transactions/second.

I'll mention more in the conclusion, but it's worth briefly mentioning now that you should consider having different compression levels for static and dynamic content.  Static content is compressed and cached, so it's only compressed once until the next time the file is changed.  For that reason, you probably don't care too much about the CPU overhead and can go with a setting of 9. 

On the other hand, dynamic pages are compressed every time (for the most part, although further details on cached dynamic pages can be found here).  So, the setting for the dynamic compression level is much more important to understand.

This also lets us realize that it would be foolish to turn on compression just for the fun of it.  Formats that don't compress much (or are already compressed), like JPGs, EXEs, ZIPs and the like, are often large, and the CPU overhead of trying to compress them further could be substantial.  These aren't compressed by default in IIS 7.
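
A quick sketch of why (Python; the random bytes stand in for an already-compressed JPG or ZIP payload):

```python
import gzip
import os

text = b"<p>Plenty of repeated markup compresses very well indeed.</p>" * 50
packed = os.urandom(len(text))  # stand-in for already-compressed content

# Text shrinks dramatically; random (already-dense) data doesn't shrink
# at all -- gzip's framing actually makes it slightly larger.
print(len(text), len(gzip.compress(text)))
print(len(packed), len(gzip.compress(packed)))
```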

Transactions per Second

IIS 7 Compression - Transactions per second

Just as a reminder, my goal was to start with 120 transactions/sec.  The server could handle much more, but this was controlled so that 120 was considered the base.  That was achieved with <1% CPU when non-compressed.

Notice that 100kb and 200kb files can still be served up at nearly the same rate.  Once you get to 800kb file sizes though, the server spends massive computing power to compress each page.  Part of that drop isn't compression related: notice that even with compression turned off, IIS could only serve up about 80 transactions/sec using the same WCAT settings.  However, the transactions/sec drops off considerably with each compression level.

Conclusion

So, what is my recommendation?  Your mileage will vary, depending on the types of files that you serve up and how active your involvement in the server is.

One great feature that IIS 7 offers is the CPU roll-off.  When CPU gets beyond a certain level, IIS will stop compressing pages, and when it drops below a different level, it will start up again.  This is controlled by the staticCompression and dynamicCompression Enable/DisableCpuUsage attributes (e.g. dynamicCompressionDisableCpuUsage and dynamicCompressionEnableCpuUsage).  That in itself offers a large safety net, protecting you from any surprises.
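
For illustration, the thresholds live in the httpCompression section and can be set with Appcmd.exe like so (the numbers here are examples, not the shipped defaults):

```shell
REM Pause dynamic compression above 90% CPU, resume below 50% (example values).
C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:httpCompression -dynamicCompressionDisableCpuUsage:90 -dynamicCompressionEnableCpuUsage:50
```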

Based on the data collected, the sweet spot seems to be compression level 4.  It's between 3 and 4 that the compression benefit jumps, but it's between 4 and 5 that the resource usage jumps, making 4 a nice balance between 'almost full compression' and 'not quite as bad on resource usage'.

Since static files are only compressed once until they are changed again, it's safe to leave them at the default level of 7, or even move it all the way to 9 if you want to compress every last bit out of it.  Unless you have thousands of files that aren't individually called very often, I recommend the higher the better.

For dynamic content, there is a lot to consider.  If you have a server that isn't CPU heavy, and you actively administer the server, then crank up the level as far as it will go.  If you are worried that you'll forget about compression in the future when the server gets busier, and you want a safe setting that you can set and forget, then leave it at the default of 0, or move to 4.

Make sure that you don't compress non-compressible large files like JPG, GIF, EXE, ZIP.  Their native format already compresses them, and the extra attempts to compress them will use up valuable system resources, for little or no benefit.

Microsoft's default of 0 for dynamic and 7 for static is safe.  Not only is it safe, it is aggressive enough to give you 'most' of the benefit of compression with minimal system resource overhead.  Don't forget, though, that dynamic compression is *not* enabled by default.

My recommendation is, first and foremost, to make sure that you haven't forgotten to enable dynamic compression.  In almost all cases it's well worth it, unless bandwidth is free for you and you run your servers very hot (on CPU).  Since bandwidth is so much more expensive than CPU, moving forward I'll be suggesting 4 for dynamic and 9 for static to get the best balance of compression and system utilization.  At this setting, I can set and forget for the most part, although when I run into a situation when a server runs hot, I'll be sure to experiment with compression turned off to see what impact compression has in that situation.

Disclaimer: I ran these tests this last week and haven't fully burned in or tested these settings in production over time, so it's possible that my recommendation will change.  Use the data above and my recommendations at your own risk.  That said, I feel comfortable with my recommendation for myself, if that means anything.  Additionally, at ORCS Web, we've run both static and dynamic at level 9 for years and have never identified compression as the culprit for heavy CPU on production servers.  My test load of 120 transactions/sec is probably more than most production servers handle, so even 9 for both static and dynamic could be a safe setting in many situations.

The How

Here are some AppCmd.exe commands that you can use to make the changes, or to add to your build scripts.  Just paste them into the command prompt and you're good to go.  Watch for line breaks.

Enable Dynamic Compression, and ensure Static compression at the server level:

C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:urlCompression -doStaticCompression:true -doDynamicCompression:true

Alternately, apply for just a single site (make sure to update the site name):

C:\Windows\System32\Inetsrv\Appcmd.exe set config "Site Name" -section:urlCompression -doStaticCompression:true -doDynamicCompression:true

To set the compression level, run the following (this can only be applied at the server level):

C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:httpCompression -[name='gzip'].staticCompressionLevel:9 -[name='gzip'].dynamicCompressionLevel:4

IIS 7 only sets GZip by default. If you use Deflate, run the previous command for Deflate too.
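
That should just be a matter of swapping the scheme name in the previous command (assuming a Deflate scheme has been added to httpCompression):

```shell
C:\Windows\System32\Inetsrv\Appcmd.exe set config -section:httpCompression -[name='deflate'].staticCompressionLevel:9 -[name='deflate'].dynamicCompressionLevel:4
```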

Note that when changing the compression level, an IISReset is required for it to take effect.

Data

In case the raw data interests you, I've provided it here.

Compression ratio (compressed size in bytes, and percent saved)

Level   4kb Text        28kb Text       689-byte Text   516kb Text
        Size      %     Size      %     Size     %      Size       %
Off     3904      0%    28274     0%    689      0%     528834     0%
0       1713     57%    13878    51%    594     14%     238630    55%
1       1636     59%    12825    55%    588     15%     221751    59%
2       1597     60%    12342    57%    587     15%     213669    60%
3       1592     60%    11988    58%    587     15%     206795    61%
4       1434     64%    11400    60%    457     34%     190362    65%
5       1428     64%    11228    61%    457     34%     184649    66%
6       1428     64%    11158    61%    457     34%     181087    66%
7       1428     64%    11151    61%    457     34%     180620    66%
8       1428     64%    11150    61%    457     34%     180485    66%
9       1428     64%    11150    61%    457     34%     180481    66%

CPU Usage per Compression Level

Level   100kb          200kb          400kb          800kb
        CPU   T/sec    CPU   T/sec    CPU   T/sec    CPU   T/sec
Off     1%    120      1%    119      3%    103      6%    81
0       1%    125      1%    115      11%   117      65%   83
1       1%    112      1%    113      42%   103      59%   73
2       1%    126      1%    123      48%   103      65%   76
3       1%    122      2%    125      44%   107      79%   75
4       1%    120      2%    122      43%   95       79%   62
5       1%    120      44%   111      68%   79       90%   47
6       6%    117      47%   111      83%   75       95%   33
7       27%   105      57%   117      83%   70       98%   31
8       45%   105      60%   120      84%   66       98%   29
9       45%   90       60%   90       84%   66       98%   29


Time to First Byte (TTFB) - minimal load.  In milliseconds.

Level   4kb   28kb   689 bytes   516kb
Off     0     0      0           0
0       0     0      0           15
1       0     0      0           24
2       0     0      0           26
3       0     0      0           33
4       0     0      0           40
5       0     0      0           57
6       0     0      0           78
7       0     0      0           79
8       0     0      0           78
9       0     0      0           78

43 Comments

  • That was a great article on IIS compression. I didn't know about the different compression levels, in IIS7 it just gives me the option to enable or disable it, I'll have to dig deeper into the configuration files.

    I take it images such as jpeg/png/gif will count as static files and thus be compressed once then cached? What if you're serving a jpeg image dynamically in code (using GDI+ in .NET for instance, say for a CAPTCHA image), that would get compressed for each hit would it not? Is it possible to enable GZIP globally then disable on a page by page basis?

  • Great article, really detailed; by far the most detailed analysis of compression I've read.

    I didn't know about the CPU roll-off with that I can include compression with confidence.

    Did you notice any change in the amount of memory usage in these tests? Is there an increased footprint there?

    It is surprising how many big corporate sites out there don't use compression at all. Hosters are more keen on compression than some of the corporate sites/infrastructures, which I think sometimes fear that adding compression is an unnecessary complication (what-if scenarios like: CPU is freakishly high and compression is just making it worse; they would rather just pay the excess bandwidth cost, if any; etc.). For hosters, bandwidth is much more of an important consideration.

    Then again, I see many IIS defaults just left on as no-one changed them....

  • Hi Dominic,

    The compression levels are still not exposed in IIS Manager, so it does take some tweaking behind the scenes to get this right. IIS Manager exposes the ability to enable or disable at every object level, which is the key thing unless you want to do fine tuning of the advanced settings.

    Yes, you can set it at the global level and then turn it off for anything that you want.

    You're right that if a file is static, it will compress it and cache the compressed page. That will happen for extensions like jpg, jpeg or gif. However, if you dynamically serve the images through a .aspx page, then it will be considered dynamic because of the PageHandlerFactory assignment in applicationHost.config.

    For dynamic images from code, just make sure that your code outputs the mime type and you'll be set. That way IIS will treat it as an image and not try to compress it further, even if it's a .aspx extension. That's probably the best advantage of using mime types instead of extension types.

  • Hi Rovastar,

    I didn't watch for memory usage specifically, and to really test I would have had to use a large diversity of files rather than the single file testing I was doing. I'm pretty sure that memory isn't an issue at all though, unless someone is compressing *massive* files (i.e. hundreds of MB).

    When static files are compressed, they are saved to disk (the location is configurable), and when files are dynamically compressed, the work is done almost immediately, and I doubt any memory footprint from the compression is left behind. While I didn't test this or find out from the IIS team, I think it's a safe assumption.

    You're right that many people probably don't realize how beneficial compression is, and how easy it is to set. As a web host, bandwidth is a large expense, both for us and for our customers, so the more bandwidth we can save, the better for everyone.

  • Awesome testing. As always, the best advice is to enable cache on the static resources and keep the dynamic page content small too.

  • Hi Dave,

    Thanks for mentioning that. It is true that there are some issues with compression. However, they are few and far between now. Some builds of IE5 had some issues with JavaScript, and the links you provided were all for IE 6 SP1, pre-hotfix. Those builds aren't generally in operation today.

    To get around those early issues at ORCS Web, we didn't have compression enabled by default for .js files on the IIS 6 boxes. With that configuration, we really don't get reports of issues with compression at all. When issues do occur, it's for a custom media or image app served through a .aspx page. For example, if someone writes their own media player within an .aspx page, we just need to disable compression for their media player page.

    We've had compression enabled on IIS7 since the beta days and I'm not aware of any complaints at all.

    Do you still get reports of it happening nowadays, or is that an issue of the past?

  • Thank you very much for doing all of the work so that we don't have to! This is much more helpful than guessing the correct level.

  • The compression issue with hardware compression and ~3% of IE users mentioned above may relate only to chunk-encoded pages. If they are doing compression at the load balancer, they should be able to control that.

    What do you think explains the sharp rise in CPU usage in the 100kb and 200kb tests? They go from ~1%, which seems almost impossible, to 44%+ between levels, but not at the same level. Could it be kernel memory constraints or something else coming into play?

  • The last time we had compression on for IE7 was July. We tried turning it on only for IE7 users, not IE6. Still the calls came in.

    We do our compression on the load balancer, not the webservers.
    We were originally doing compression on a Cisco CSS but replaced it with a NetScaler. We bought the NetScaler because it was much easier to configure compression based on the user agent, with the hope that at least compression could be on for IE7 users.

    We had high hopes for IE7, but alas, still about 3% of users were getting the random page hangs, so we had to shut it off.

    I should note that our site is secure (HTTPS), so maybe that's why I see more issues with page hangs than people who are strictly running non-secure.

  • Hi Keith,

    That's a good question. There is a lot to know about compression and the various algorithms, and I won't claim to know the details on the algorithms themselves.

    My assumption isn't that kernel memory is coming into play in terms of CPU utilization, but you never know. The steps are significant, as you pointed out.

    I did see one interesting piece of information that WenJun Zhang from Microsoft posted here: http://www.eggheadcafe.com/software/aspnet/32004920/dynamic-compression-iis-7.aspx. He mentioned that levels 0 to 3 use one algorithm, levels 4 to 9 use another, and level 10 uses yet another. So, that probably accounts for the large jumps between levels too.

  • Hi Dave,

    That's good to know for sure, although I haven't seen the same. I've worked with servers serving up a large, diverse range of pages and sites, to a very diverse audience, and the compression complaints are almost nil. Most issues are all-or-nothing type issues on certain types of pages.

    (I assume when you say IE6/7, you meant IIS6/7.)

    I wonder if the hangs could be occurring from something related to your environment or specific site? If you're talking 3% affected (that's a lot!), I can't imagine that being from compression, unless compression somehow compounds the situation.

    It's good to be aware of the potential risk with compression though, as there is always some risk when something new is brought into the equation.

  • Hello friends,

    I'm using Vista and have installed IIS 7.0. I want to know how I can change the compression level, as HttpZip and ZipEnable don't work on Vista.

    Please let me know ASAP, as I am going to have this as a subtopic of my research.

    Thanks

  • Hi Madristaa. Do a search on this page for "The How". It's close to the end. In there I post the command that will allow you to change the compression level. I believe that's what you're looking for.

  • Yeah, I was looking for that code. But I have created a site using IIS Manager. How do I enable dynamic compression for that particular site? I used your code, but it was not able to enable dynamic compression for that particular site.

    Thanks in advance.

  • Ok, got ya. It's the second code example that you need then. Use the following to enable dynamic and static compression for a particular site. If you don't want to enable static compression, just remove that parameter. Just replace "site name" with your site name.

    C:\Windows\System32\Inetsrv\Appcmd.exe set config "site name" -section:urlCompression -doStaticCompression:true -doDynamicCompression:true

  • Thanks for your reply.

    How can I measure the compression ratio? You used Port 80 Software's tool, but it doesn't work on Windows Vista. I am doing the testing on localhost, i.e. on my laptop. Which open source software will help me find the compression ratio for the different compression levels?

  • Madristaa,

    If Port 80 Software's tool isn't reporting it compressed for Vista, something on Vista probably isn't configured correctly. Port 80 Software's tool is web based and should work as long as your site is exposed to the Internet.

    To check it with a local tool, use Firebug in Firefox. That's easy to do; if you check the response headers, you should see "Content-Encoding: gzip" if it's compressed. Also, the file size will be different.

    You can use Fiddler2 instead if you want: http://www.fiddler2.com/fiddler2/. That works with IE.

  • Hi Scott, Can PDF files be compressed using GZIP settings?

  • Hi Tanima,

    Yes and no. PDFs don't support gzip compression on the fly, and they don't announce that they do, so compression won't be requested even if it's enabled for the PDF mime type (or extension in IIS 6).

    What you can do is compress your PDFs in advance using a PDF compression tool (start with a Google search for 'compress pdf tool'). That is faster anyway, since it's done once and then available for all downloads.

  • Hi Scott,
    Can the compression level be applied at the site level? If not, what settings can be applied at the site level?
    (when we host our web application on a server that may contain other applications also)

    Please clarify.

  • Hi Chandru. Compression can be enabled or disabled at the site level, for dynamic or static. However, the levels can only be set at the server level.

    Here's what the httpCompression section looks like in applicationHost.config:



    Notice that it's set to AppHostOnly, which means that it can't be set at the site level.

    As for what you can do, here's the schema for urlcompression:

    So basically you can enable or disable static or dynamic compression and you can set dynamicCompressionBeforeCache. Other than that, compression changes are meant to be performed at the server level.

  • Hi slCoder,

    That should be all that you need, so you're on the right track. Check out this excellent article (http://dotnetslackers.com/articles/iis/Making-the-most-out-of-IIS-compression-Part-1-IIS-7-configuration.aspx) by Matt Perdeck and see if it uncovers anything needed in your environment.

  • Hello, sorry to ask, but it's stated here that in IIS 7 compression can be turned off per mime type. I have been searching a lot but I cannot find how. I want to turn off compression for the .htc mime type.

    Thanks for your time.

  • Hi Bob,

    The mimeMap mimeType for .htc is text/x-component. That is included in the text/* static and dynamic types.

    The only way to remove it is to remove the text/* entry (from applicationHost.config) and specifically add back the text/ types that you need. I tested removing just the one type, and it doesn't work, because it attempts to remove a non-existent entry rather than just the one mime type.

    Well, another option is to change the mime type for htc. That will also do the trick.

  • A very well written article. I knew how to turn it on and off on IIS, but was unaware of the various levels. Thanks to your research and labwork, I don't have to dig so deep into validating this myself.

  • RE: config -section:httpCompression -[name='gzip'].staticCompressionLevel:9 -[name='gzip'].dynamicCompressionLevel:4

    How is this set in the Appcmd.exe?

  • Tina, I'm not sure I understand. The example you have looks like a schema definition from somewhere, but it's not the Appcmd.exe usage that I mention above. Appcmd.exe is a tool used to make IIS changes from the command line.

  • Hi, we're having problems with compression on one server. I've searched for two entire days and still couldn't get it working. Maybe you have an idea what I am missing, as you seem to be an expert on this subject. On another site the same application compresses as expected and I cannot really see differences.

    Problem: Some files compress, others don't. Always the same files don't compress. I cannot see any difference between the compressed and non-compressed ones. Both have same mime-type, similar sizes, same request client headers, etc. The problem happens for both static and dynamic content. Some static+dynamic requests do get compressed though.

    Configuration is a two-server IIS7.5 with an unknown load-balancer, but the same problem also happens when accessing one IIS server directly (kind of tricky to test due to DNS names etc.). We are using Windows Authentication, which actually makes three requests for each resource (first two return with 401). I'm only talking about the successful third request for each resource here. I have HttpWatch to sniff requests.

    What I checked so far:
    - Server Role, Role Services, static+dynamic content compression is installed
    - IIS root server compression settings:
    -- static+dynamic enabled
    -- static size is set to default 2700 (the files that don't compress are bigger)
    -- size limit is 100MB (folder is empty; only one directory with application name there)
    - IIS site compression settings: enabled for static, dynamic
    - appcmd for httpCompression (general) shows:
    -- mime types are correctly listed for both static + dynamic. Files that compress and those that don't compress have same mime type.
    -- scheme gzip is set to level 9/4 (static/dynamic)
    - appcmd for urlCompression (general) shows:
    -- enabled for both static+dynamic
    - In applicationHost.config there are no settings for CPU configuration (server is generally on 0% CPU load).
    - client requests always send Accept-Encoding correctly.
    - I've set system.webServer/serverRuntime enabled="true" frequentHitThreshold="1" with appcmd, but without any improvement.
    - The application's web.config doesn't disable anything. I checked various compression settings in web.config first.
    - Yes, I restarted the webserver.

    I'm lost. Anything else I could check?

  • Hi Eric,

    It looks like you've covered all bases. Nothing obvious stands out to me.

    What's the pattern on the differences? Can you rename the file and have it compress? Is the file the difference, or the filename, or the folder?

    Also, have you tried Failed Request Tracing (FRT)? That may be the best way to see what's happening. It can turn up some great clues. Let me know if you would like more explanation on how to do that (although I get the impression from your post that you would have no problem figuring it out).

  • Awesome post... saves me bandwidth costs with my hosting provider and makes my site a lot faster over the wire, as I have CPU to spare on my server. Thanks!

  • I think what you do is use string theory to compress the data into another dimension. Since the data is uncompressed in another dimension and doesn't exist in your dimension, it's zero bytes. But there is a problem: how do you get the data back when you need it? You need to store something about where it was stored and in what dimension. So what you do is store it one dimension over, in the same spot on the same hard drive. Then you can always retrieve it. The problem is that inevitably some other dimension will try to use your HD to store some sort of useless information on it. Therefore you need to make sure that the HD is properly shielded from inter-dimensional data hijacking. There you go: zero-byte compression. So simple.

  • Joisy,

    Good point. It's important to watch for data compressed into another dimension. I'll need to consider that for next time I do further compression testing.

  • Hi,

    What about image compression? Did you do similar tests at various image file sizes? Any recommendations, or pros and cons, for a photo site that serves up high-res images in the 500KB to 800KB file size range?

    I take it that image compression by itself doesn't reduce file size by much, since images are binary files? Or has that changed with some new compression technology inside IIS 7?

    Thank you
    -Shiva

  • Hi Shiva,

    You're right that images won't tend to compress any further. They aren't compressed by IIS by default and aren't worth handling in IIS either. Since they are static, the best bet is to optimize them at the best quality/size setting as they are created. Then they will be served up directly, and IIS doesn't need to do anything with the images.

  • No point in trying to compress PDFs, they are already zipped.

  • Hi Scott,

    Great article. Wondering if you have any comment on using hardware network load balancers like F5, Citrix and A10 for HTTP compression. Are there performance benefits to compressing on the load balancer vs. using the IIS 7 settings?

    Thanks.

  • Hi James,

    They all offer good solutions. It really comes down to where you want the processing to occur. In some situations you may want to distribute them across the web servers. Other times you want to off-load the web servers.

    And sometimes the network devices need to perform extra processing on the data and have to decompress and recompress it anyway, in which case it's best not to compress on the web servers. So it really depends, but all of the options can work well.

  • I have tried your steps, but I still don't see files being compressed in the following IIS path:
    c:\inetpub\temp\IIS Temporary Compressed Files\mySite

    Even using Fiddler, I'm getting all the responses with no compression. I added all the mime types and I can see them using the appcmd tool, especially text/javascript, but no compression is happening on my Win 2008 R2 with SP 2010.

    I appreciate your help in advance. Any idea why this would happen?

    thanks a lot.

  • Only static files are compressed to that path. Dynamic files are compressed on the fly each time.

    With Fiddler, if you initiate a request with the Composer, you need to specify that your 'browser' supports compression. Add this header:

    Accept-Encoding: gzip,deflate

    Can you confirm that dynamic and static compression is enabled for the site using IIS Manager/Compression?

    Does it work on some file types, or is it not working for all content?
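
    If you want a second opinion outside of Fiddler, you can also hit the server with a tool like curl (the URL below is just a placeholder) and check the response headers:

    curl -s -H "Accept-Encoding: gzip,deflate" -D - -o NUL http://localhost/styles/site.css

    If compression kicked in for that request, you'll see Content-Encoding: gzip among the headers that come back.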

  • Can JSON files be compressed too?

  • Hi Fidel Silva,

    Yes, they fall into the same category. JSON files can have any file extension, but just make sure that the mime type you're serving them with (it should be application/json) is added to IIS 7's compression list.
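
    For example, assuming an elevated command prompt, adding application/json to the dynamic compression list looks like this (add it to staticTypes as well if the JSON is served from static files):

    appcmd.exe set config -section:system.webServer/httpCompression /+"dynamicTypes.[mimeType='application/json',enabled='True']" /commit:apphost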

  • Hi, you mention in the article that "Additionally, it's easier to enable or disable compression per server/site/folder/file in IIS 7." I have only found how to set compression settings via config sections with appcmd.exe. Can you give an example of how you would set a certain file's compression settings using appcmd?

    thanks!

  • Hi Richard,

    Sure, here are a few examples of how to use AppCmd to set compression at different levels:

    -Server Level-
    appcmd.exe set config -section:system.webServer/urlCompression /doStaticCompression:"False" /doDynamicCompression:"False" /commit:apphost

    -Site Level-
    appcmd.exe set config "Site_Name" -section:system.webServer/urlCompression /doStaticCompression:"False" /doDynamicCompression:"False"

    -File Level-
    appcmd.exe set config "Site_Name/Page1.aspx" -section:system.webServer/urlCompression /doStaticCompression:"False" /doDynamicCompression:"False" /commit:"MACHINE/WEBROOT/APPHOST/Site_Name"

    -Folder Level-
    appcmd.exe set config "Site_Name/ContentFolder" -section:system.webServer/urlCompression /doStaticCompression:"False" /doDynamicCompression:"False"
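
    To read back the effective setting at any of those levels, the same section can be listed, e.g.:

    appcmd.exe list config "Site_Name/Page1.aspx" -section:system.webServer/urlCompression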

Comments have been disabled for this content.