IIS Compression in IIS6.0

Hold on to your hats, folks. If you don't have compression installed on your web server, either IIS Compression or a 3rd-party product, and you're running IIS 6.0 and paying for bandwidth, you're missing out on something good.

In the days of IIS 5 and earlier, the compression built into IIS had various issues and was really not worth implementing. To enable compression you needed to go with a 3rd-party solution like www.port80software.com or www.xcompress.com. This has all changed in IIS6! At www.orcsweb.com we've been running IIS6.0 compression on some servers for a number of months with few issues, just huge performance and bandwidth benefits. Expect upwards of 4 times the compression, which translates directly into bandwidth savings. It also means faster page loads for the end user. The only time we had to disable it was for a custom audio application for one of our clients that didn't work with compression. I'll mention at the end how to disable compression for an individual site.
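If you want a feel for that savings before touching the server, you can gzip a chunk of sample HTML at the command line yourself. This little sketch uses the standard gzip tool (not IIS) on some repetitive table markup:

```shell
# Build ~9 KB of repetitive HTML table rows, the kind of markup
# that compresses very well, then compare raw vs. gzipped sizes.
html=$(for i in $(seq 1 200); do echo "<tr><td>row $i</td><td>some cell data</td></tr>"; done)
orig=$(printf '%s' "$html" | wc -c)
gz=$(printf '%s' "$html" | gzip -9 | wc -c)
echo "original: $orig bytes, gzipped: $gz bytes"
```

Real pages vary with how repetitive the markup is, but this is the ballpark behind the "upwards of 4 times" figure.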

First, for those unfamiliar with IIS6.0 and how to get it: IIS6.0, short for Internet Information Services 6.0, is the web server software that comes with Windows Server 2003. So if you run IIS on W2K3 then you have IIS6.0. If you're wondering whether you can get it apart from W2K3, sorry, the answer is no.

One of the issues that still exists today with compression in IIS is that there isn't a nice interface to manage it; it's not as straightforward as other features of IIS. No need to worry though, I'll explain the ins and outs below of how to implement this properly. I set out to implement IIS Compression a number of months ago and had a hard time finding good information about it. I did find one great post here: http://dotnetguy.techieswithcats.com/archives/003475.shtml. I've since jumped into this subject in more depth and have two things to add to Brad Wilson's article: one, an iisreset is required, as I'll mention below; and two, there is another setting that is required for dynamic compression to be practical. The other link worth bookmarking is: http://www.microsoft.com/technet/treeview/default.asp?url=/technet/prodtechnol/windowsserver2003/proddocs/standard/ref_prog_iaorefcompschs.asp. (I won't make any promises that this link will always work; Microsoft seems to change their document links all the time.)

To make it easier, I'll include everything needed to properly enable IIS Compression below, even though some of it repeats what Brad Wilson said.

First, before anything else, back up the metabase. This is done by right-clicking on the server in the IIS snap-in and selecting All Tasks -> Backup/Restore Configuration. The rest is straightforward.

Create Compression Folder (optional)

The first thing I do is create a folder on the D drive where the static file compression will be cached. You can call it anything you want, or leave the default of "%windir%\IIS Temporary Compressed Files" if that works for you. The IUSR_{machinename} account will need write permission to the folder. If you use custom anonymous users, make sure to assign the proper user. IIS will still work even if the permissions are wrong, but compression won't. Once it's running, it's worth double-checking Event Viewer to see if any errors are keeping IIS Compression from working.
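For example, granting that write access from the command line might look like the following. This is just a sketch: the folder name is an example, and IUSR_SERVERNAME stands in for your machine's actual anonymous account name.

```shell
:: Give the anonymous IIS account Change (read/write) access to the
:: compression cache folder. Substitute your own folder and account name.
cacls "D:\IISCompressedFiles" /E /G IUSR_SERVERNAME:C
```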

Enable Compression in IIS

- From the IIS snap-in, right-click on the Web Sites node and click on Properties
- Select the Service tab - Enable Compress application files
- Enable Compress static files
- Change Temporary Directory to the folder that you created above, or leave it at its default
- Set the max size of the temp folder to something that the hard drive can handle, e.g. 1000.
- Save and close the Web Site Properties dialog

Note: The temporary compression directory is only used for static files. Dynamic pages aren't saved to disk and are recompressed every time, so there is some CPU overhead on every page request for dynamic content.

Now for the metabase changes

Now we move away from the IIS snap-in GUI and have to get our hands dirty. (well, as dirty as they can get when dealing with computer software)

Here is where the IIS team either wanted to make things a bit difficult, or they didn't get the changes done in time for the final release of IIS6. Actually, it's the latter; I've heard rumor that they will be improving the GUI over time.

Note: If you want to save yourself the hassle of understanding all of this, purchase ZipEnable from Port80 Software: http://www.port80software.com/products/zipenable/. It's a tool that gives you full control down to the folder and file level and embeds itself into the IIS MMC snap-in, making things much easier. I haven't tried it out so I can't attest to it myself, but Port80 Software is a company that Microsoft has recommended for years for HTTP compression.

There are a couple of ways to do this. One is to edit the metabase directly using Notepad; the other is using adsutil.vbs, usually found in your C:\Inetpub\AdminScripts folder. I'll explain the direct-edit method because I find it easier to picture and understand what is happening than with a command-line tool.

- Ensure that you have Direct Edit enabled. In IIS Manager, right-click on the servername (top level in the left-hand pane). Check "Enable Direct Metabase Edit". Apply.
- Open the metabase located at C:\Windows\system32\inetsrv\metabase.xml in Notepad
- Search for <IIsCompressionScheme 
- There should be two of them, one for deflate and one for gzip.  Basically they are two means of compression that IIS supports.
- The first thing to do is add aspx, asmx, php, and any other extensions you need to the list of extensions in HcScriptFileExtensions. Make sure to follow the existing format carefully; an extra space will keep this from working correctly. Do this for both deflate and gzip.
- Now for the other thing commonly missed. HcDynamicCompressionLevel has a default value of 0, which means that even if you did everything else right, compression for dynamic content runs at the lowest level. The valid range is 0 to 10. I had the opportunity of receiving an internal testing summary from Chris Adams at Microsoft comparing compression level vs. CPU usage: the CPU needed for levels 0-9 is fairly low, but level 10 goes through the roof, while the compression at level 9 is nearly as good as at level 10. I write all this to say that I recommend level 9, so make sure to change HcDynamicCompressionLevel to 9. Do this for both deflate and gzip.

- Just one thing left. Two of these settings require the World Wide Web Publishing Service (WWW service) to be restarted: enabling compression, and HcDynamicCompressionLevel. Even with its shortcomings, I simply do an iisreset from the command prompt, but you can restart the service however you prefer.
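For reference, if you prefer adsutil.vbs over editing the XML by hand, the equivalent commands should look something like the ones below. I haven't scripted it this way myself, so treat the exact paths and the extension list as a sketch, and remember the restart afterwards.

```shell
cscript C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/Filters/Compression/gzip/HcScriptFileExtensions asp dll exe aspx asmx php
cscript C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/Filters/Compression/deflate/HcScriptFileExtensions asp dll exe aspx asmx php
cscript C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/Filters/Compression/gzip/HcDynamicCompressionLevel 9
cscript C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/Filters/Compression/deflate/HcDynamicCompressionLevel 9
iisreset
```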

That's it, folks. I didn't promise it would be easy, but hopefully my steps were straightforward enough to keep this from being too difficult.

I should mention that it is possible to disable or enable compression at the site or sub-folder level. This time I'll be lazy and give you the adsutil.vbs way to do it, though it can also be done by editing the metabase directly in Notepad if you prefer. From the command prompt, enter the following two commands, being sure to replace site# with the siteID that you are changing:

cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoStaticCompression False
cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoDynamicCompression False

Note: A reader, David Waters, emailed me after noticing a mistake in Microsoft's documentation on adding custom extensions and asked if I would point it out here. The example adds quotes which shouldn't be there. For example, the following:
    cscript adsutil.vbs SET W3SVC/Filters/Compression/Deflate/HcFileExtensions "htm html txt newext"
should instead be
    cscript adsutil.vbs SET W3SVC/Filters/Compression/Deflate/HcFileExtensions htm html txt newext
Just remove the quotes and it will run as it is supposed to.

Follow-up blog: http://weblogs.asp.net/owscott/archive/2004/01/16/59594.aspx

79 Comments

  • Thanks cathal. That was the tool that I had seen so I updated the blog to mention it.

  • I understand it can compress webservice xml calls, but what if your client is a Delphi application? Do you know what I can use to decode that on the fly?



    Thx.

  • Good question Mehul. Normally the web browser is the client for much of the compression on the web. I'm sure it's possible to create your own client that handles compressed content; maybe there is an API in IE that you can tap into (that's just a guess), but I really couldn't say how to develop your own.

  • This is exhaustive. But I am wondering whether IIS HTTP compression is better than third-party compression components like Xceed or SharpZipLib? Any metrics available?

  • ::but what if your client is a delphi

    ::application?



    Depends whether the developer of the client application "just got HTTP working" or was smart enough to get HTTP working according to spec - if he followed the specs, he will either get uncompressed content OR will be able to uncompress it himself.



    ::any metrics available?



    CAN there actually be a difference?



    I mean, the compression algorithm is predetermined, right? So the results should be similar.

  • IIS compression only zips content if the request carries an Accept-Encoding header that includes "gzip". You can do a search to find the exact format, but the header will be sent (or should be sent) only if the client can handle the compression. If the header is not sent, IIS will not compress. Very simple and elegant.

  • Sudhaker. Good question. I haven't reviewed various different 3rd party software compression programs except XCompress. Even Port80 Software now says that the native IIS Compression is faster and more efficient than their own which is why they offer the tool mentioned above. XCompress offers an easier interface out of the box and they claim to have more capability with the lesser used browsers. We've been running this on high traffic sites for some time and not had reports of issues though. Microsoft seems to do that, their first couple releases of a technology tend to be weak but at some point they step up to the plate and bump out their competition. Third party software will have to offer some large feature advantages if they want to keep offering compression in IIS6+.

  • Good point Darrell. If the client doesn't support compression then it just sends the content uncompressed. That is another one of XCompress's claimed advantages. They claim (and I have no doubt they are correct) that their compression logic can work with more browsers, and even with some browsers that request compression but don't handle it correctly. For example, there is one build of IE 5.0 that doesn't handle compression correctly; if someone has that browser they will run into issues. XCompress overcomes the weakness of the browser and makes sure it works. But . . . that is a rare case and something that I personally wouldn't worry about. If someone has that particular build of IE 5.0, it's probably for testing purposes.

  • Since I use PHP on my website, should I add 'php' in addition to 'aspx' and 'asmx' to the "HcScriptFileExtensions" list in the metabase?

  • Thanks all for the comments. Great links from everyone.



    Caleb, I've found the CPU overhead to be minimal but noticeable. My guess is less than 5% extra CPU usage.



    Bryce, sure thing. I should have thought to mention it so I just updated the blog to have php in the list.

  • I've got a question about this concerning Exchange 2003 OWA. I've gone and enabled compression via your instructions above, but I've "heard" (one person heard it from another...) that Exchange likes to turn off compression because it wants you to use the built-in compression in OWA. This built-in compression can be accessed from the System Manager by going to the HTTP protocol under the server serving OWA, then the Exchange virtual server properties, and then the Settings tab. My questions are: is what I "heard" true, and how in the world can I check whether compression is enabled/disabled/screwed up by Exchange? Must I use a 3rd party tool to check on the status?



    Thanks

  • Rebelpeon,



    I can't speak for running OWA and IIS6 Compression together, but if you want to turn off just a particular site or subfolder, use the adsutil.vbs example at the bottom of this blog. Or, better yet, try out httpZip: http://www.port80software.com/products/httpzip/ - I just got a copy and plan to try it out as soon as I get a couple of minutes.



    To test, here are some good links:

    http://www.port80software.com/products/httpzip/

    http://xcompress.com



    Both of them will tell you if a particular page has compression enabled or not.

  • Mehul H., I see you are in search of the same thing. I would like to know how to do compression to a .NET WinForms application using IIS 6 compression. Any help would be greatly appreciated.

  • What about application servers that service local and remote users? Can you enable compression based on where the user connects from (compress WAN users, do not compress LAN users)?

  • Nate, not from within IIS but you can do it at the code level for dynamic pages.

  • Hi,

    I have enabled the compression, but how do I know it works? I am looking for something that can show me the page size is now 23K, and when I turn it off I can see the page size is 64K, or something like that. Can anyone help please?



    gscloete@yahoo.com

    Thanks

    Gert
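    One quick way to compare sizes yourself is with a command-line HTTP client such as curl: request the same page with and without advertising gzip support and count the bytes. The URL below is just a placeholder for your own page.

```shell
# Bytes on the wire without compression:
curl -s http://www.example.com/default.aspx | wc -c

# Bytes on the wire when the client says it accepts gzip
# (the body arrives compressed; we just count raw bytes):
curl -s -H 'Accept-Encoding: gzip' http://www.example.com/default.aspx | wc -c
```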





  • jlb0001, I like the name :)



    There isn't a way natively within IIS6 to do this but 3rd party software can help. I pinged Tad Fleshman from Port 80 Software regarding this. Basically with IIS5, their product httpZip deals with this for you. Their product for IIS6, zipEnable, is an extension for IIS6 Compression rather than a full-blown filter. They are working on a solution for this. I've included Tad's reply below:

    ---------

    We are, in fact, aware of these browser bugs that cause problems with compression. Since our httpZip product is a full-blown filter it comes preconfigured with browser/MIME type exclusions that address these incompatibilities. Because zipEnable does not currently include a filter component we are working to address this exact issue with a light filter to give IIS 6 users a way to use these exclusions. So, a fix is on its way and it will be included in our next release.

    ---------

  • Situation: I have a web application that creates reports in .pdf format, and compression is enabled in IIS 6. Since IE acts on the Content-Type header before the Content-Encoding header, it is tossing a bunch of compressed nonsense to Acrobat Reader. The solution would be to disable compression on just that (.aspx) page, and nothing else. I have several avenues I have been pursuing: 1. Getting IIS to process the file with an extension other than .aspx, then handle it with the metabase. 2. Preventing compression from happening via some code in the page itself. 3. Re-doing the application to call a .pdf file instead of dynamically creating the file with an aspx page.

    So far.. 1. No luck, 2. No idea, 3. Just don't want to go there.



  • Nate, for #1 did you add .pdf as a file extension to your script mappings? This will allow aspnet_isapi.dll to process all PDF files. Now, if you have real PDF files elsewhere, how about using a new extension? Try PDFX or something like that. Set it so that it doesn't get compressed but does get handled by aspnet_isapi.dll.



    Note: You can turn off compression at the file level if you want.



    I don't have any ideas for your #2 suggestion either. IIS Compression can't be disabled from within the page itself.

  • Have you tried TurboIIS 2003? They made it work with IIS 6, and it compresses much better than the built-in compression of IIS 6.

  • Margherita, thanks for the feedback on TurboIIS. I haven't run any benchmark testing of TurboIIS vs. IIS6 native compression, so I would be interested in seeing actual performance numbers if you have them. TurboIIS offers white-space removal and other features that I personally believe should be done in advance, not at run-time; the overhead of doing this on every page request is substantial. Beyond that, I would need to see tests showing what benefits TurboIIS offers over IIS Compression, and by how much. I'm not saying that it won't, I would just need to see the tests.

  • David, sorry I didn't reply sooner, but I see you have this working now. If you happen to come by this blog again, can you let us know what the issue was? I would like to keep a list of common gotchas.

  • How do you disable compression on a certain file? I added DoDynamicCompress=0 to /LM/W3SVC/1960170990/root/MyLEED/getPDF.asp/

    and it still gzips that file when I request it. What am I missing? Thanks.

  • I misspelled that on my last post. I added DoDynamicCompression as a DWORD and set it to 0.

  • Steve, DWORD in the IIS metabase? That is new to me.



    Here is what you should do:



    First, to create a node for the page in the metabase, right-click on the file and go to Properties. Make a change of some sort, apply, then change it back.



    Then, from the command prompt enter:



    cscript C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/{siteID}/Root/{subfolder}/{page.asp}/DoDynamicCompression False



  • That worked, thanks. I don't know why that worked but the way I was doing it didn't. I was using Metabase Explorer, and doing it the cscript way put in the same value that I had entered with Metabase Explorer. Thanks again.

  • Great, thanks for confirming that it's working. I've only used metabase explorer briefly in the IIS5 days so I can't offer any advice on why it didn't work but I'm pleased to hear you got it working now anyway.

  • Is there a limit to the file size that IIS 6 will compress? My web app can generate 11 megs and up of HTML data to send to the browser for a single page (I know, but that's what my client wanted). I have compression enabled, and the smaller pages on the site are compressing according to Port80, but it seems that there is some threshold in IIS where, if the file is over X size, it doesn't bother compressing and just sends it. Am I right? Is there a way to get around this?

  • I am running Windows 2003 Server and IIS 6.0. However, I cannot find the Service tab when right-clicking on Properties of my web site; all the other tabs are there (e.g. Web Site, ISAPI Filters, Home Directory, etc.) but no Service tab. Is there something I need to turn on to get this to appear?



    I have checked two of our servers with Windows Server 2003 installed and this tab is not there on any of these servers.



    All help greatly appreciated.



    Kevin

  • Kevin,

    The tab for services is up a level. It's at the "Web Sites" node and not on the web site itself. In other words, the parent node called "Web Sites" is the one to right-click on and select.

  • It was easy to understand, but can you tell me how to disable compression for the whole w3svc and enable it only for a particular virtual directory?

  • Nick, It should be something like:

    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/root/DoStaticCompression False

    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/root/DoDynamicCompression False

    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoStaticCompression True

    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/site#/root/DoDynamicCompression True



    You can edit the metabase.xml file directly (more is explained above, in the blog) instead of using adsutil.vbs. Whichever you prefer.



    Or, use HTTPZip 1.1 which is well worth the price if you are doing anything out of the ordinary.

  • Thanks

    it worked

  • i would like to ask you about Out Of Memory Exception

    there is one app which gives this error

    only when aspwp reaches like 800mb

    i know

  • Nick. That's quite a bit off topic. Feel free to email me offlist at scott (at) orcsweb.com. I'll let you piece that email address together so the spammer's spiders don't figure it out automatically.

  • We have our .html extensions mapped to the ASP.NET ISAPI filter because we use a custom .NET page filter for authentication and URL mapping. Currently I can't get these .html pages to be compressed with IIS 6.0's built-in compression. I have been successful with Blowery Web's custom .NET filter. Anyone else have this issue?

  • I figured it out. I needed to configure the html extension as a dynamically compressed extension; I was treating it as a static page. Once the .html extension is mapped to ASP.NET, .html files are considered dynamic pages! The setting should look like this:



    HcScriptFileExtensions="asp
    dll
    exe
    html"

  • Vermin, you are correct that if it's processed by aspnet_isapi.dll or asp.dll it's considered dynamic and needs to be placed in the other category. Thanks for the update.

  • Laurent, I don't know the answer to that off the top of my head. My suggestion is to do a search on SendChunked and AllowWriteStreamBuffering. It sounds like these two values are configurable and reversing either of them might solve it.

  • Does the assembled wisdom have an opinion on the ability to HTTP compress .PDF files from IIS 5.0 specifically using dynamic content compression?



    This seems to work generally for us until a firewall product is introduced with a specific configuration that does some playing with TCP sequencing; then .PDF and .TIFF files fail to transfer correctly while other file types work. The failure appears as if the file transfer completes fully but the file then does not display in the browser or get saved to disk. No error message.



    Should I be looking to IIS 6.0 or third party products to test?

  • Hi,

    I have enable compression on my site. I am running IIS 6 on Windows 2003.



    My site has a heavy load, and sometimes I get blank compressed pages when the source page is not blank.



    any idea why I am getting blank compressed pages?

  • Arnoldo, this could be from a number of things, although I must say that IIS6 compression works great in most cases. If you view source on the final page, is it empty as well? In other words, is the compression displaying blank or changed pages? Is it consistent on the same page all the time? Maybe you should disable compression on just the one file or folder, either as a test or permanently.

  • I have enabled compression on my Windows 2003 server. I want a list of browsers, with version numbers, that support the compression feature.



  • Hi Scott



    I am getting blank compressed pages when the page itself is not blank. I have a farm of 4 Windows 2003 servers, and it happens only on two of them. Also, sometimes they merge the content of pages (i.e. the default page gets the content of another). I don't have other software on the servers. My web content is centralized on a NAS and changes every 15 minutes. I have static and dynamic compression enabled, but 95% of my web content is static and I have detected the problem only on static content. At the beginning I had this problem on all the servers, but I found an article that recommended increasing the values of I/O Buffer Size and Compression Buffer Size; this recommendation only worked on two of the 4 servers.

    Thanks for your help

  • It is very tricky!
    It is easy to set up and make it work. But when I turned it off and turned it back on, it didn't work. When I checked metabase.xml, I found both HcDoDynamicCompression and HcDoOnDemandCompression were FALSE even though I had enabled compression on the IIS service. So I set them to TRUE, and it worked again.

    What is the difference between dynamic and on-demand? And how do I set compression on one virtual site?

  • When you change metabase.xml, make sure you reset IIS and also delete all offline files in your IE cache. Then the change will take effect.

  • This is a great blog, thanx.

    This is working great for compressing the HTTP response content back from server to client.

    I would like to enable gzip compression for the request (POST) content, but the server doesn't appear to support it; I keep getting 500 errors.

    Does anyone know if IIS supports gzip compression in the request (POST) content? If so, how do I enable it? Is there a special request header I need to send the server in addition to the gzip-compressed request (POST) body?

  • I am having a problem rendering a pdf report when compression is enabled. It displays a blank page every time. If I disable compression it works fine. I tried adding pdf to the script mappings but it didn't work. My other pages are compressed OK, as I verified using Fiddler.

  • This worked perfect first time. Thanks for posting this.

    250k pages now compress down to 35k; well worth it when users are overseas.

    Good looking blog too.

  • For x64 versions the gzip dll path is now C:\windows\SysWOW64\inetsrv\gzip.dll

  • Has anybody actually been able to turn off compression at the file level? Setting DoDynamicCompression to FALSE in the metabase on files (or folders) didn't work for me; it only worked at the global level (HcDoDynamicCompression).



    This is to prevent the dynamic pdf outputting bug mentioned years ago in this article's comments. Anything I could have missed? Thanks for any help...

  • Shaun,

    The trick is to get an entry for the file into the metabase so that the command has something to act on. From IIS Manager, make a slight change and change it back again; this creates an entry in the metabase for that file. For example, turn off Log visits, apply, then turn it back on and apply again. Then it should work. i.e.
    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/1/root/App/output_pdf.aspx/DoStaticCompression False
    cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/1/root/App/output_pdf.aspx/DoDynamicCompression False

  • With IIS 6.0 compression enabled, can I use it to compress .js and/or .css files? I tried adding "js" to HcFileExtensions and HcScriptFileExtensions in the metabase file (separately and together), but I see the entries are removed after doing an iisreset... has anyone compressed JavaScript or CSS files? Is it even possible without third-party plugins?

  • Scott, thanks for the great article. I have one question, though: how should I proceed if I need to update any static files (css or js, for example)?

    IIS Compression 'caches' them and their changes won't show up. How can I force IIS to update the compressed version? Does it have something to do with HcCacheControlHeader?

    Thanks again for helping!

  • I can't seem to get the compression to work. I followed your directions exactly and I am out of ideas. I need help!

  • I've got an .aspx page that does the following:

    It pulls a byte[] from a database, response.clear(), sets response.contenttype, response.binarywrite() to the browser, then a response.end(),

    On IE 7, works great. On IE6, totally hosed.

    Windows 2003 server, IIS 6. Compression on, it's broken, compression off, works every time.

    PDF files work pretty consistently, but .rtf, .tif, etc. don't.

    Any clues?

  • This article was great on explaining the server side. However, if you want to use web services with anything but a web browser, here is some code that will help:

    I found some code online that, with some bug fixes, works well as an HttpWebResponse decompressor:

    using System.Net;
    using System.IO;
    using System.IO.Compression;

    namespace ClientsDataType
    {
        public class HttpWebResponseDecompressed : WebResponse
        {
            private HttpWebResponse response;

            public HttpWebResponseDecompressed(WebResponse wResponse)
            {
                response = (HttpWebResponse)wResponse;
            }

            public override void Close()
            {
                response.Close();
            }

            public override Stream GetResponseStream()
            {
                Stream compressedStream = null;
                if (response.ContentEncoding == "gzip")
                {
                    compressedStream = new GZipStream(response.GetResponseStream(), CompressionMode.Decompress, false);
                }
                else if (response.ContentEncoding == "deflate")
                {
                    compressedStream = new DeflateStream(response.GetResponseStream(), CompressionMode.Decompress, false);
                }

                if (compressedStream != null)
                {
                    // Decompress
                    MemoryStream decompressedStream = new MemoryStream();
                    int size = 2048;
                    byte[] writeData = new byte[2048];
                    while (size > 0)
                    {
                        size = compressedStream.Read(writeData, 0, size);
                        decompressedStream.Write(writeData, 0, size);
                    }
                    decompressedStream.Seek(0, SeekOrigin.Begin);
                    compressedStream.Close();
                    return decompressedStream;
                }
                else
                    return response.GetResponseStream();
            }

            public override long ContentLength
            {
                get { return response.ContentLength; }
            }

            public override string ContentType
            {
                get { return response.ContentType; }
            }

            public override System.Net.WebHeaderCollection Headers
            {
                get { return response.Headers; }
            }

            public override System.Uri ResponseUri
            {
                get { return response.ResponseUri; }
            }
        }
    }

    Now that you have a class that will decompress your web response, all you need is to override a couple of methods in your proxy to make it work. (I wish Microsoft would update the proxy creation process to support compression, but right now it doesn't.)

    Namespace ChatImplementation
        Partial Public Class ChatImplementation
            Protected Overrides Function GetWebRequest(ByVal uri As System.Uri) As System.Net.WebRequest
                Dim request As System.Net.WebRequest = MyBase.GetWebRequest(uri)
                request.Headers.Add("Accept-Encoding", "gzip, deflate")
                Return request
            End Function

            Protected Overrides Function GetWebResponse(ByVal request As System.Net.WebRequest) As System.Net.WebResponse
                Dim response As HttpWebResponseDecompressed = New _
                    HttpWebResponseDecompressed(MyBase.GetWebResponse(request))
                Return response
            End Function

            Protected Overrides Function GetWebResponse(ByVal request As System.Net.WebRequest, ByVal result As System.IAsyncResult) As System.Net.WebResponse
                Dim response As HttpWebResponseDecompressed = New _
                    HttpWebResponseDecompressed(MyBase.GetWebResponse(request, result))
                Return response
            End Function
        End Class
    End Namespace

    But the important thing is the use of partial classes. This means even if we update our web proxy, it doesn't blow away this code every time.

    I got both the sync and async code working, the latter of which is much more useful.

    The code is pretty straightforward on how it works, but reply to this post if you have any questions.

    Good luck.



  • We are working in WSS3 and MOSS 2007 environment.

    We are trying to figure out how to display large pdf files in the browser as fast as possible. I've been researching page-at-a-time or byte serving. Does anyone know how IIS and SharePoint work with this? Also, do you have any good articles or information I can read?

    Thanks in advance, Andy

  • Does anyone know why all my .aspx pages show a size of "-1" when gzip is enabled in IIS6? Any tips appreciated. Thanks.

  • Can I compress 'css' and 'js' files?

    If so, is it a matter of simply adding them to HcScriptFileExtensions the same as the 'asp' files?

    Please advise.

    Many Thanks
    jonathan.smith@southampton-city.ac.uk

    Great Tutorial....

  • Hi Jonathan,

    You got it. Add them to the static section since (unless you're doing something unusual) they are static files. When I wrote the article there was a bug in IE 5.5 with compressed js files, so I didn't add them at the time. We're obviously years past that now, and css and js should be compressed.
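    For reference, the static and dynamic extension lists live in the IIsCompressionScheme entries in metabase.xml. The fragment below is a trimmed, illustrative sketch (the attribute values and location paths vary by install, and a matching entry exists for the deflate scheme); css and js go into HcFileExtensions alongside the other static types. Note that the values are newline-separated, and stray tabs in the list are a common cause of silent failure.

    ```xml
    <IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/gzip"
        HcDoDynamicCompression="TRUE"
        HcDoStaticCompression="TRUE"
        HcFileExtensions="htm
            html
            txt
            css
            js"
        HcScriptFileExtensions="asp
            dll
            exe"
    />
    ```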

  • Hi. My metabase.xml file does not contain <IIsCompressionScheme. I'm definitely using IIS 6.0. Any ideas what might cause this?

    Thanks.

  • Hmm... I've done above and did an IIS Reset and then visited my pages, but I don't see any new files appearing in my 'WINDOWS\IIS Temporary Compress Files' directory. Any ideas? Thanks.

  • Dustin, did you complete the step "Enable Compression in IIS"? It's also possible that after enabling it, IIS hasn't written the changes to disk yet. If you check now, that section should be there as long as compression has been enabled.

    Bill, the temp compress files are only for static content like .txt files. Dynamic pages are compressed each time, and never saved to the temp compress folder. Also, if permissions aren't correct on that folder, it will fail gracefully, so it's possible that IIS can't write to that folder if something is off with the permissions.

  • I'm running IIS6. Reading thru all the posts I also added css and js to the static section of the metabase.xml file. Had to stop IIS Admin service before I could save the xml file (and restarted HTTPS, IIS Admin and WWW afterwards). CSS, JS and HTML compressed files show up in the (default) IIS Temp Compressed folder. ASP files do not (as explained elsewhere in this excellent blog). Interestingly enough port80 thinks my ASP files are uncompressed, but I'm presuming they are compressed (given the static compression works, and I've ASP in the dynamic section). I'd sure like to find a way of actually determining if my ASP files are being compressed. Anyone know of a way? Again, excellent blog and great skin/layout :)

  • Just a note for those of you who use webfonts - don't forget to add eot, svg, ttf, woff to your static entry in metabase.

  • @OWScott. Many thanks. Have turned on direct edit mode. But of course I must still do iisreset for any changes in the metabase file to take hold.

    Am using FF9.0.1 and Firebug to see request and response headers. Request is showing "Accept-Encoding: gzip, deflate" but responses are not showing the required "Content-Encoding: gzip" entry (showing the file was delivered via compression). After running a set of ADSUtil.vbs scripts and doing iisreset, the required Content-Encoding: gzip showed up in Firebug - hurray! And now Port80 is reporting compressed files too - sweet :)

    I did a CTRL-F5 on my site to force files to be pulled from the server and not my local browser cache (so I got 200 OK responses instead of 304 Not Modified).

    Of course my goal is to get as many assets as possible served from the user's browser cache on subsequent visits. For this I'm using "Response.Expires = -1" at the top of my ASP files. For HTML files I'll have to do some research to find out what caching settings to use (pragma cache etc., I guess).

    An excellent discussion thread. Thanks very much to all those involved.

    I'm presuming there is no need to add GIF, JPG, PNG into the static section because we'd achieve very little in the way of performance when compressing binary files? But I have added PDF into the static section but have not tested the performance of compressing PDFs yet - has anyone else?

    And regarding adding the font files into the compression list - these files may be compressed/binary already as I'm getting a response header of "Content-Type: application/octet-stream" for my TTF font file.

  • Hi Mark,

    Glad you were able to get it working. It might have been something with the spacing or tabs. I know that people have run into that, and it's nearly impossible to tell that it's wrong.

    Also, thanks for the link for Caching. I assume that you have that working now too.

    For compression on binary files, you're right that it usually doesn't help. It costs more in CPU overhead than it gains. For PDFs, it's up to Adobe Reader whether it will accept and handle IIS compression; in this case it's not the web browser reading the file, so it follows different rules.

    It's not fresh in my memory right now, but I believe Adobe Reader doesn't announce that it handles HTTP compression, so even if you turn it on, the PDF won't be sent compressed. There is usually compression already built into the document itself via Adobe Reader's built-in compression, so IIS compression would gain little anyway.
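    The "binary files don't benefit" point is easy to demonstrate. A quick sketch (pure Python, no IIS involved): gzipping repetitive text shrinks it dramatically, while gzipping incompressible data actually grows slightly, because the gzip header and trailer add overhead that the deflate stage can't recover.

    ```python
    import gzip
    import os

    text = b"<html><body>Hello world</body></html>" * 200  # repetitive, like HTML
    binary = os.urandom(len(text))                          # incompressible, like JPG/ZIP/PDF internals

    print(len(text), len(gzip.compress(text)))      # text shrinks dramatically
    print(len(binary), len(gzip.compress(binary)))  # binary stays the same size or grows a bit
    ```

    This is why the static extension list is typically limited to text formats (htm, html, txt, css, js) and images are left out.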

  • I migrated my webservice from IIS 6 to IIS 7.5. Now the client which calls my webservice returns the error: "white space required". What is the problem? Is there any setting or configuration in IIS 7.5 which I may have missed? How can I resolve that issue? Please help.

  • Hi Kashif,

    It sounds like the syntax of the webservice response isn't correct. I would try calling it directly and seeing what it looks like; maybe the formatting is incorrect. You replied in an IIS 6 compression thread, so I'm not sure if you're leaning towards compression being the cause. You could always turn off IIS compression as a test. But my main recommendation is to hit the web service manually (sometimes that's easier said than done) and see what the results are. The problem will hopefully stand out to you when you see the actual response.

  • Hi All

    I have been trying to enable HTTP compression on my site using gzip. I did everything as stated, but my site is not compressed. I ran ETW tracing on the server (Windows Server 2003) and got the following errors in the log file:

    IISCompression, STATIC_COMPRESSION_START, 0x00002278, 130022029405239326, 1470, 840, {00000000-0000-0000-04c9-0260000000fa}, 0, 0
    IISCompression, STATIC_COMPRESSION_NOT_SUCCESS, 0x00002278, 130022029405239326, 1470, 840, {00000000-0000-0000-04c9-0260000000fa}, "NO_ACCEPT_ENCODING", 0, 0

    IISCompression, DYNAMIC_COMPRESSION_START, 0x00002278, 130022029431957563, 1470, 840, {00000000-0000-0000-08c9-0260000000fa}, 0, 0
    IISCompression, DYNAMIC_COMPRESSION_NOT_SUCCESS, 0x00002278, 130022029431957563, 1470, 840, {00000000-0000-0000-08c9-0260000000fa}, "NOT_SUCCESS_STATUS", 0, 0

    IISCompression, DYNAMIC_COMPRESSION_START, 0x00001A94, 130022029437269961, 3870, 8460, {00000000-0000-0000-0ac9-0260000000fa}, 0, 0
    IISCompression, DYNAMIC_COMPRESSION_NOT_SUCCESS, 0x00001A94, 130022029437269961, 3870, 8460, {00000000-0000-0000-0ac9-0260000000fa}, "NO_ACCEPT_ENCODING", 0, 0


    It is stating that NO_ACCEPT_ENCODING but when i check my site in the fiddler i get the header

    Accept: image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/x-shockwave-flash, application/msword, application/x-ms-application, application/x-ms-xbap, application/vnd.ms-xpsdocument, application/xaml+xml, application/vnd.ms-excel, application/vnd.ms-powerpoint, */*
    Accept-Encoding: gzip, deflate
    Accept-Language: en-us

    So I am confused: in the client headers I can see Accept-Encoding: gzip, deflate, but I don't know why IIS is not compressing and is logging the no-accept-encoding error.

    Please help me as it is an urgent issue. Please let me know if you need more details.

  • @tosundar, I wonder if it's the settings in the config that may be off so that neither of the encoding types work. In other words, maybe the encoding issue is server-side rather than client-side.

    A common issue that is hard to troubleshoot with the config is to have tabs instead of spaces. What I would recommend doing is starting fresh, including entering the config all over again too, and being extra mindful of the spaces and tabs in the config. If you can revert to a default config then when you add new lines of mime types, make sure that the format is exactly the same as the previous line.

  • Hi OWScott

    Thanks for the reply

    Found one more thing: HTTP compression is working when browsing from the server where the site is deployed, but when we access the site from outside the server it is not compressed. Could this be something to do with a proxy setting? Any ideas, please?

  • Hi tosundar40,

    Sorry for the delay. That does sound like a proxy that is preventing the web server from realizing that your web client supports compression.

    A good way to troubleshoot that is with Fiddler. Turn it on and watch the request and response in the working situation, and then do it again in the failing situation across the web. My guess is that you'll see that the Accept-Encoding header is different between the two. Normally it should include gzip, deflate, and maybe sdch, but your proxy may be stripping that off.

    Or, if your site is behind a load balancer working as a reverse proxy then it may uncompress it but not be set to re-compress it again on the way through.

  • Thank you, thank you! Been trying to do this for years, and finally managed to do it thanks to your article.

    One small thing...you have to 'Enable Direct Metabase Edit' in IIS > {servername} (local computer) > Properties to save the changes to Metabase.xml.

    Thanks again.

  • Hi Paul,

    Thanks. Interesting, I was almost positive that I had that in the article, but you're right, it's not there. I did have to repair this page once after some updates so maybe it got dropped. I just added that now because it is an essential step.

  • Hi OWScott,

    Thanks for this nice article.

    I am also trying to implement it. Currently I am testing this on my local server and the site is not hosted on the internet, so I can't use the various sites that tell you whether your compression is working or not.

    Can you also guide us on how to test the compression?

  • Hi Yogendra,

    That's a good question. One good tool to test with is Fiddler. If you download and install it on your computer, it's a great tool for this type of test.

    Simply open it and then visit your site in your web browser. Then in Fiddler, look for the request to your page. In the Inspectors tab (open by default), in the bottom section (the response), view the Raw tab. If the page is compressed then it should look like a bunch of garbage. You can also check the response headers to see if Content-Encoding: gzip (or deflate, etc.) is present.
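    If you'd rather check programmatically than eyeball garbage in Fiddler, one approach (illustrated here with Python and a locally gzipped body, since it doesn't require a live server) is to look for the gzip magic number 0x1f 0x8b at the start of the raw response body:

    ```python
    import gzip

    def looks_gzipped(raw: bytes) -> bool:
        # Every gzip stream starts with the two magic bytes 0x1f 0x8b.
        return raw[:2] == b"\x1f\x8b"

    compressed = gzip.compress(b"<html>hello</html>")
    print(looks_gzipped(compressed))             # True
    print(looks_gzipped(b"<html>hello</html>"))  # False
    ```

    Note this only works on the raw body; most HTTP client libraries decompress transparently before handing you the bytes, which is exactly why a proxy-level tool like Fiddler is handy here.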
