January 2008 - Posts - Jon Galloway


Adding simple trigger-based auditing to your SQL Server database

How do you track changes to data in your database? There are a variety of supported auditing methods for SQL Server, including comprehensive C2 security auditing, but what do you do if you're solving a business problem rather than a security problem, and you're interested in tracking the following kinds of information:

  • What data has been updated recently
  • Which tables have not been updated recently
  • Who modified the price of Steeleye Stout to $20 / unit, and when did they do it?
  • What was the unit price for Steeleye Stout before Jon monkeyed with it?

There are a number of ways to design this into your solution from the start, for example:

  • The application is designed so that all changes are logged
  • All data changes go through a data access layer which logs all changes
  • The database is constructed in such a way that logging information is included in each table, perhaps set via a trigger

What if we're not starting from scratch?

But what do you do if you need to add lightweight auditing to an existing solution, in which data can be modified via a variety of direct access methods? When I ran into that challenge, I decided to use Nigel Rivett's SQL Server Auditing triggers. I read about some concern with the performance impact, but this database wasn't forecasted to have a high update rate. Nigel's script works by adding a trigger for INSERT, UPDATE, and DELETE on a single table. The trigger catches data changes, then saves out the information (such as table name, the primary key values, the column name that was altered, and the before and after values for that column) to an Audit table.

I needed to track every table in the database, though, and I expected the database schema to continue to change. I was able to generalize the solution a bit, because the database convention didn't use any compound primary keys. I created the script listed below, which loops through all tables in the database - with the exception of the Audit table, of course, since auditing changes to the audit table is both unnecessary and recursive. I'm also skipping sysdiagrams; you can add any other tables you don't want to track to that list as well.

The nice thing about the script I'm including below is that you can run it after making some schema changes and it will make sure that all newly added tables are included in the change tracking / audit, too.

Here's an example of what you'd see in the audit table for an Update followed by an Insert. Notice that the Update shows type U and a single column updated, while the Insert (type I) shows all columns added, one on each row:

Sample Audit Data

While this information is pretty unstructured, it's not difficult to run some useful reports. For instance, we can easily find things like

  • which tables were updated recently
  • which tables have not been updated in the past year
  • which tables have never been updated
  • all changes made by a specific user in a time period
  • most active tables in a time period
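For example, here are a few sketches of the kinds of queries involved. The column names (TableName, FieldName, UserName, UpdateDate, and so on) follow Nigel Rivett's audit table, so adjust them to match your actual schema:

```sql
-- Which tables were updated in the last 30 days (and when)?
SELECT TableName, MAX(UpdateDate) AS LastChanged
FROM Audit
GROUP BY TableName
HAVING MAX(UpdateDate) > DATEADD(day, -30, GETDATE())

-- All changes made by a specific user in a time period
SELECT TableName, FieldName, OldValue, NewValue, UpdateDate
FROM Audit
WHERE UserName = 'DOMAIN\jon'
  AND UpdateDate BETWEEN '20080101' AND '20080201'
ORDER BY UpdateDate

-- Most active tables over the past month
SELECT TableName, COUNT(*) AS ChangeCount
FROM Audit
WHERE UpdateDate > DATEADD(month, -1, GETDATE())
GROUP BY TableName
ORDER BY ChangeCount DESC
```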

While it's not as easy, it's possible to backtrack from the current state to determine the state of a row in a table at a certain point in time. It's generally possible to dig out the state of an entire table at a point in time, but a change table isn't a good fit for temporal data tracking - the right solution there is to start adding Modified By and Modified On columns to the required tables.
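As a sketch of that kind of backtracking - again assuming Nigel Rivett's audit columns, with a hypothetical Products row and made-up PK format - the value a column held at a given date is the OldValue of the earliest audit entry recorded after that date:

```sql
-- What was UnitPrice as of Jan 1, 2008?
-- The OldValue of the first change made after that date is the answer
-- (if no such row exists, the current value was already in effect).
SELECT TOP 1 OldValue
FROM Audit
WHERE TableName = 'Products'
  AND PK = '<ProductId=35>'       -- hypothetical PK format
  AND FieldName = 'UnitPrice'
  AND UpdateDate >= '20080101'
ORDER BY UpdateDate ASC
```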

Note that we're only tracking data changes here. If you'd like to track schema changes, take a look at SQL Server 2005's DDL triggers.

Enough talking, give us the script!

Sure. I'll repeat that there are some disclaimers to the approach - the performance impact, the fact that it only tracks changes to tables with a primary key, etc. If you want to know more about the trigger itself, I'd recommend starting with Nigel's article. However, it worked great for our project.
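To give a feel for the shape of it, the generalization is essentially a cursor loop over the user tables, skipping Audit and sysdiagrams. This is a simplified sketch, not the full script - the actual CREATE TRIGGER body comes from Nigel's template, and the trigger naming is my own convention:

```sql
-- Simplified sketch: attach an audit trigger to every user table
-- except Audit and sysdiagrams
DECLARE @TableName sysname

DECLARE TableCursor CURSOR FOR
    SELECT name FROM sysobjects
    WHERE type = 'U'
      AND name NOT IN ('Audit', 'sysdiagrams')

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Drop and re-create the trigger so schema changes are picked up
    -- each time the script is re-run
    EXEC('IF OBJECT_ID(''tr_' + @TableName + '_Audit'') IS NOT NULL ' +
         'DROP TRIGGER tr_' + @TableName + '_Audit')
    -- ...EXEC a CREATE TRIGGER statement here, built from Nigel''s
    -- INSERT/UPDATE/DELETE audit trigger template for @TableName...
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor
```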

Posted by Jon Galloway | 124 comment(s)

ASP.NET Menu and SiteMap Security Trimming (plus a trick for when your menu and security don't match up)


ASP.NET 2.0 introduced a pretty solid menu control which is integrated with a configuration-driven sitemap. The cool part is that the menu can be hooked in with your security roles, so you don't have to worry about hiding or showing menu options based on the user - the menu options are automatically kept in sync with what the user is allowed to see. We'll talk about how to set this up, using an example from a website I worked on recently.

If you're familiar with ASP.NET sitemaps and menus, skip to the end to read a trick for working around cases when you want to do something more complex, such as having a page that's accessible to a user role but doesn't show up in the menu.

The Video.Show site was originally planned to have only one class of user, with no user restrictions other than requiring a quick registration before commenting on videos or uploading videos. With that being the case, we just included a static menu in the masterpage, defined as <asp:MenuItem> elements. As we were gearing up for the first beta release, it became obvious that our user model was too simple. Hosted installations would probably want to allow users to register and begin commenting right away, but not give all these users upload rights. That implied four classes of user: unauthenticated, commenter, uploader, and administrator (implied by the requirement to manage multiple user classes). That meant role management and new menu items to be kept in sync - the right time to move to a sitemap-driven menu with security trimming.

Step One - Define The Sitemap

I'm using a static sitemap defined in a Web.sitemap file, and it's especially simple since there's no hierarchy involved. This uses the default XmlSiteMapProvider; there are other sitemap providers available on the internets, such as a SQL Sitemap Provider for database-driven site structure, or you can implement your own provider if you've got a custom situation.

<?xml version="1.0" encoding="utf-8" ?>
<siteMap xmlns="http://schemas.microsoft.com/AspNet/SiteMap-File-1.0" >
  <siteMapNode roles="*">
    <siteMapNode title="Home" url="~/Default.aspx" />
    <siteMapNode title="Videos" url="~/Tags.aspx" />
    <siteMapNode title="Members" url="~/MemberList.aspx" />
    <siteMapNode title="My Page" url="~/MyPage.aspx" />
    <siteMapNode title="My Recent Views" url="~/RecentViews.aspx" />
    <siteMapNode title="Upload a Video" url="~/Upload.aspx" />
    <siteMapNode title="Administer Users" url="~/AdministerUsers.aspx" />
  </siteMapNode>
</siteMap>

Note that the IntelliSense for a .sitemap file is misleading:

Sitemap Intellisense

While the XSD for .sitemap files (from which the IntelliSense is derived) includes a "securityTrimmingEnabled" attribute, it's incorrect. It's the result of an old VS 2005 bug that's still around. That value should be set in web.config instead; we'll take care of that next.

Step Two - Declare The Sitemap in web.config

A few things to notice here:

  • I define the provider type as System.Web.XmlSiteMapProvider
  • I specify the siteMapFile as the Web.sitemap file we've just discussed
  • I set securityTrimmingEnabled="true"
<siteMap enabled="true">
  <providers>
    <add siteMapFile="Web.sitemap" name="AspNetXmlSiteMapProvider" type="System.Web.XmlSiteMapProvider" securityTrimmingEnabled="true"/>
  </providers>
</siteMap>
This is really just overriding the default sitemap settings from %windir%\Microsoft.NET\Framework\v2.0.50727\CONFIG\web.config, which also uses the name AspNetXmlSiteMapProvider, but which doesn't include security trimming.

Step Three - Set required roles for the pages

This section of the web.config looks long, but you'll see it's very repetitive. MSDN's information on setting up authorization rules is pretty well written, so take a look there if you'd like more info. The high points:

  • Rules are processed top to bottom. For example, in the Upload.aspx case, a user in the Uploader role is allowed right off the bat; everyone else is denied.
  • Pages which are displayed to all authenticated users just need to deny unauthenticated users, like this: <deny users="?" />
  • There's no wildcard for roles, so you can't say something like <allow roles="*">.
  • Role-based permissions are configured by default in machine.config (using both AspNetSqlRoleProvider and AspNetWindowsTokenRoleProvider). The SQL Role Provider assumes a database connection string named LocalSqlServer, so if your profile information is stored somewhere else you'll need to make sure the roleManager is configured correctly.
<location path="Upload.aspx">
  <system.web><authorization>
    <allow roles="Uploader"/>
    <deny users="*" />
  </authorization></system.web>
</location>
<location path="Profile.aspx">
  <system.web><authorization><deny users="?" /></authorization></system.web>
</location>
<location path="MyPage.aspx">
  <system.web><authorization><deny users="?" /></authorization></system.web>
</location>
<location path="RecentViews.aspx">
  <system.web><authorization><deny users="?" /></authorization></system.web>
</location>
<location path="AdministerUsers.aspx">
  <system.web><authorization>
    <allow roles="Administrator"/>
    <deny users="*"/>
  </authorization></system.web>
</location>

Step Four - Add A Sitemap Data Source and a Menu to your Master Page

<asp:SiteMapDataSource runat="server" ID="siteMapDataSource" ShowStartingNode="false" />
<asp:Menu runat="server" ID="MainMenu" Orientation="Horizontal" DataSourceID="siteMapDataSource" />
You'll probably want to style the menu, too. I'm a fan of the CSS Friendly Control Adapters, which change the HTML output to use clean unordered lists. Without the control adapter, the Menu control outputs nested tables manipulated by JavaScript. Here's what the above menu looks like for a user who's logged in but isn't in the Administrator or Uploader roles:
 Video.Show Menu

The Payoff - Everything is Managed In One Place

That may seem like a lot to configure, and you might be wondering if it isn't easier to just write your own code to handle access and menu management.

First off, the above actually goes pretty quickly - hopefully this post or others I've linked to make it a little faster.

Secondly, the real payoff is that you've now got a reliable, maintainable solution to controlling page access, and it's all automatically kept in sync. Let's say we want to add a new page that's only visible to users with Uploader rights - maybe a page (MyVideos.aspx) where they can edit or delete videos they've previously uploaded. I'd only need to add one page to the sitemap file, set the access rule in web.config to allow Uploaders and deny everyone else, and the page will only show up in the menu when an Uploader has logged in. This is a good application of the Don't Repeat Yourself philosophy. We don't have one set of logic determining what pages users are allowed to view and another set which determines what pages they should see in the menu; these are both the same list and should be handled that way.
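For the hypothetical MyVideos.aspx example, those two entries would look something like this sketch, following the same patterns as the earlier steps:

```
<!-- In Web.sitemap -->
<siteMapNode title="My Videos" url="~/MyVideos.aspx" />

<!-- In web.config -->
<location path="MyVideos.aspx">
  <system.web>
    <authorization>
      <allow roles="Uploader"/>
      <deny users="*" />
    </authorization>
  </system.web>
</location>
```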

Tip - Use a Url Mapping to alias pages when your access and menu visibility are more complex

I wanted to point out one other tip that came in handy here. Before we realized the need for different user types, we had one page called Member.aspx, which served two purposes. If the querystring contained another user's userid, it would show their public profile and a list of their videos. We also repurposed it as My Page, which you got by navigating to the page as a logged-in user without a querystring.

When we hooked up the menu and page access, we had a problem. We only wanted to show My Page in the menu when a user was logged in, but we needed the Member.aspx page to be viewable by anonymous users, because it was used for public user profiles, too. The simple solution was to set up a Url Mapping which created a virtual MyPage.aspx (mapped to Member.aspx). Now we could set different access rights to MyPage.aspx and Member.aspx, as shown in the Step Three code sample - Member.aspx is public, and MyPage.aspx requires authentication. Here's how the Url Mapping was set up:

<urlMappings enabled="true">
  <add url="~/MyPage.aspx" mappedUrl="~/Member.aspx"/>
</urlMappings>

The Man Who Knew Too Much?

I've been thinking about an odd problem, and what can be done about it. I've found that more active participation in a group can lead to more information, but that new information can actually stifle further participation. Here are some of the problems I'm thinking of, as well as some possible solutions.

Being the Dumbest Person In The Room

The best way to learn is to "aspire to be the dumbest person in the room." Being the smartest person in the room is comfortable - you can feel smug and important as you deign to dole out information. It's also the surest way to avoid learning anything. Surrounding yourself with people who are better informed than you are is a great way to keep learning, but it can take away your confidence in what you've got to say.

Wrong solutions:

  • Keep quiet for fear of saying something stupid
  • Keep quiet because all those other brilliant people will probably say it soon, and better

Better solutions:


  • Don't write for your own ego, write to share information
  • Accept that publishing anything on the internet is one of the best ways to invite constructive criticism for what you think you know

Too Much Information

As you improve your learning style and professional relationships, it's easy to get snowed under in the wash of information. It's hard to talk about anything, because more new things keep happening, and people keep talking about the things you're crafting opinions on.

I don't think bullet points are going to do this subject justice. I'm scheming up a series on information management.

Assuming Everyone Else Already Knows What You Know

As you surround yourself with smarter people and plug in to good sources of timely information, it's easy to convince yourself that everything you've got to say has already been said, or is common knowledge.

Wrong solutions:

  • Keep quiet to avoid insulting anyone's intelligence
  • Veer to the other extreme and parrot common news (which is why you probably won't hear product release news on my blog)

Better solutions:


  • So what if others have said it - say it differently, or quote them and add some commentary. It's better to sing off key than not to sing at all.
  • Find a variety of outlets. For me, Twitter has been a great place to blurt thoughts or interesting links without unnecessary word-crafting or worrying if it's old news.

Privileged Information

I spent the week at a Silverlight 2.0 TAP (early adopter) class in Redmond. It was an incredible week, packed with briefings and labs and contacts for more information if needed. I know much more about Silverlight than I did six months ago, but now that I'm under NDA as part of the Silverlight 2.0 Beta - and as a participant in Vertigo's NDA as well - it's a lot more difficult to speak publicly about what I know and am working on. Heck, I've re-read these few sentences a few times just to be sure I'm not leaking anything.

Wrong solutions:

  • Annoying comments like "Oh, boy, I know some great new secrets. Sorry, can't tell you! Bye!"
  • Clamming up totally because it's easier to keep quiet than to keep secrets straight
  • Talking about stuff you're not supposed to (of course)

Better solutions:


  • Deal with the inconvenience of having to censor yourself and write about what is currently public. Scott Guthrie's blog is an inspirational example of providing helpful information without giving away stuff at the wrong time.
  • Write draft posts or notes now for publication when it's okay.

I'm Too Busy Now, I'll Blog It Later...

It's easy to get so caught up in your work that you feel you can't take a second to talk about the challenges you're facing, and how you're solving them.

Why "I'll blog it later..." is the wrong solution:

  • The current challenge will be replaced with another one. If you don't write about what you're doing as you're doing it, the intentions are liable to stay nothing more than intentions.
  • You'll lose the context. It's hard to tell a good story or cover the technical details after the fact. It's best to write about it when it's fresh.
  • Blog commenters will likely give you better solutions. Nothing worse than posting about some heroics you pulled off only to hear that they were totally unnecessary.

Better solutions:


  • If you're totally unable to compose a post, at least write up a draft with some code snippets. Posting clients like Windows Live Writer make it easy to write up notes in a draft and save them for when you get to them.
  • Artificial deadlines for blogging to compete with real-life deadlines. That's part of the reason why I'm making an effort to post three times a week. Otherwise, ever-present deadlines can keep me from doing what's important in the long term.

What do you think? Did I miss anything? Any better solutions?

Posted by Jon Galloway | 1 comment(s)

Writing a custom ASP.NET Profile class

We made heavy use of the ASP.NET membership and profile system for Video.Show (a Silverlight 1.0 video community website system, available on CodePlex). In addition to storing basic profile information, we created a custom profile with some additional fields. It's a really easy way to add some additional personalization to your site without having to add a bunch of tables to your database.

This is really simple if you're using a Website Project - you can just add additional properties to the profile section of your web.config, and a custom profile class is generated on the fly when you rebuild your application. That makes things ridiculously easy. First, we'd define the property in web.config:

<!-- In web.config -->
<profile>
  <properties>
    <add name="FavoritePasta" />
  </properties>
</profile>
Then you can refer to the Profile.FavoritePasta profile setting anywhere in your web application, and it's automatically mapped to the current user:
Profile.FavoritePasta = "Pumpkin Ravioli";
And you can access the data just as you would a session property:
<span id="user-favorite-pasta"><%= Profile.FavoritePasta %></span>

Not so fast, I'm using a Web Application Project

Yeah, here's the catch. If you're using the Web Application Project model, the custom build handling for the profile doesn't kick in, so those custom properties you've lovingly crafted in your web.config aren't going to be compiled into a custom profile class.

There's a Visual Studio 2005 add-in called WebProfile that reads your custom profile and creates a custom class for you. That's handy, but I passed on it. For one thing, I haven't heard that there's a VS 2008 version of it. Additionally, I don't like requiring a custom add-in just to get my code to work whenever I add a new profile property - especially when I'm working on a project that's going to be distributed on CodePlex.

Fortunately, it's not very hard to implement a custom profile. First, we'll write a class that inherits from System.Web.Profile.ProfileBase. I added a few static accessors, too:

using System.Web.Profile;
using System.Web.Security;

namespace VideoShow
{
    public class UserProfile : ProfileBase
    {
        public static UserProfile GetUserProfile(string username)
        {
            return Create(username) as UserProfile;
        }

        public static UserProfile GetUserProfile()
        {
            return Create(Membership.GetUser().UserName) as UserProfile;
        }

        [SettingsAllowAnonymous(false)]
        public string Description
        {
            get { return base["Description"] as string; }
            set { base["Description"] = value; }
        }

        [SettingsAllowAnonymous(false)]
        public string Location
        {
            get { return base["Location"] as string; }
            set { base["Location"] = value; }
        }

        [SettingsAllowAnonymous(false)]
        public string FavoriteMovie
        {
            get { return base["FavoriteMovie"] as string; }
            set { base["FavoriteMovie"] = value; }
        }
    }
}

Now we need to hook that up in the profile section of web.config - notice that I've included inherits="VideoShow.UserProfile" in the profile declaration:
<profile inherits="VideoShow.UserProfile">
  <providers>
    <clear />
    <add name="AspNetSqlProfileProvider" type="System.Web.Profile.SqlProfileProvider" connectionStringName="VideoShowConnectionString"/>
  </providers>
</profile>
With that done, I can grab an instance of the custom profile class and set a property:
//Write to a user profile from a textbox value
UserProfile profile = UserProfile.GetUserProfile(currentUser.UserName);
profile.FavoriteMovie = FavoriteMovie.Text;

Part of the reason for the static accessors is to allow displaying profile information for users other than the current user - for instance, a public profile page which displays information about other users in the system.

// Read profile information for a different user
UserProfile profile = UserProfile.GetUserProfile(displayUser.UserName);

And of course, I can still databind to it as well:

<span id="user-favorite-movie"><%= VideoShow.UserProfile.GetUserProfile().FavoriteMovie %></span>

A few disclaimers:

  • This isn't news, it's been out since ASP.NET 2.0 shipped. Still, it's pretty handy to know about, and if you're like me you may have forgotten or never really dug into some of the ASP.NET 2.0 goodies.
  • This isn't the ultimate solution in terms of entity modeling. Custom profile information is stored in two columns in the aspnet_Profile table (delimited strings in one column, another column for binary serialized objects). That means that the only real way to read or write custom property values is via the profile API. That's not a real problem unless you need to query or join on information stored in a custom profile setting.

Further information:

Profiles in ASP.NET (K. Scott Allen)

Essential ASP.NET 2.0, Chapter 5 (Fritz Onion)

Posted by Jon Galloway | 68 comment(s)

Registry setting keeps Windows from wigging out when you open lots of IE7 tabs

Summary

Opening too many tabs in Internet Explorer 7 can cause the Windows shell to switch to "Evil Mode". Fortunately, there's a registry setting that fixes the problem by increasing the Windows desktop heap size.

The Problem

Internet Explorer starts to go nuts when you open a lot of tabs. Jeff Atwood wrote that he started to see problems when he had 45 IE tabs open:

When researching blog posts, I tend to open a lot of browser windows and tabs. At least twice per week, I have so many browsers and tabs open that I run into some internal limitation of the browser and I can't open any more. My system gets a little wonky in this state, too: right-clicking no longer shows a menu, and I'm prevented from launching other applications. But if I close a few errant browser windows or tabs, everything goes back to normal.

I've hit this problem regularly in both Windows XP and Vista - when I open a lot of tabs in IE7, weird things start happening. While IE and File Explorer are no longer integrated beyond sharing some base DLLs, the problems caused by opening too many IE tabs will affect File Explorer and the rest of the Windows shell.

Some of the problems I'd see:

  • The context menus in File Explorer are missing a lot of options or don't display at all
  • Programs will fail to open when you double-click on a shortcut
  • The Start Menu doesn't open
  • General malaise

Closing tabs or windows seemed to help the problem, but sometimes the only solution was a reboot. That was especially annoying for me, as I tend to leave my computers running for months and have been known to have multitudes of tabs open for weeks on end.


Fortunately, a comment on Jeff's blog points to a solution in Ed Bott's blog: increase the size of the desktop heap. Like a lot of software problems, this one is hard to figure out until you know what's causing it. Once you know the cause, it all kind of makes sense and it's easy to find out more. Sure enough, Kevin Dente wrote about this exact problem under Windows XP almost 4 years ago:

According to a somewhat dated but still relevant MS KB article, “this static value is used to prevent ill-behaved applications from consuming too many resources”. Well, apparently IE meets the “ill-behaved” criteria, because it seemed to cause Windows to bump into this limit, and Windows wasn’t handling it very gracefully. Anyway, to make a long story longer, when I bumped up the desktop heap size (from its default of 3MB up to 8MB), bingo, all of the problems magically disappeared. Whew, what a relief.

To make this change, navigate regedit to HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\SubSystems. The “Windows” value contains a big honkin’ string, and one part of it is “SharedSection=xxxx,yyyy,zzzz”. The second number (yyyy) is the one that you want to increase.
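For example, on the XP machine Kevin describes (a 3MB default bumped to 8MB), the relevant fragment of the "Windows" value would change like this - the numbers are in KB, the surrounding string is much longer, and the exact defaults vary by Windows version, so treat this as an illustration rather than a recipe:

```
Before:  ...SharedSection=1024,3072,512...
After:   ...SharedSection=1024,8192,512...
```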

I usually provide the text for a REG file whenever I recommend a registry change, but because this is smack dab in the middle of a long, complex string, it's one you'll have to take care of manually. The good news is that I've tested this for months on three Windows machines (two Vista, one XP) and it's worked great.

If you're interested in understanding exactly what the Shared Section of the Desktop Heap has to do with IE tabs and the Windows Wonkiness (a technical term), there's more information on the NT Debugging blog.

Three posts a week - my new year's resolution

I'll pass my five-year blogging anniversary this year. I've written over 520 posts. But I'd like to be more consistent. So this year, I'm resolving to post three times a week. I'll give myself an exemption when I'm on vacation, which explains why I'm posting this in the second week of the new year.

I've decided that I'd like to break my content up by day of the week:

  • Monday - Coding tip - Should be a straightforward tip developing for the ASP.NET platform (including SQL Server, LINQ, Silverlight, etc.). This week's example: Large File Uploads in ASP.NET.
  • Wednesday - General Software Development topics - Thoughts on the profession of software development, also including more general software topics which might be a little outside the .NET development box. This week's example was a serendipitous entry on Inkscape's support for XAML.
  • Friday - Free play. All skate. I write what I want. Rob and others have told me that I'm more formal than I need to be on my blog, and I think I buy it. Don't worry - I won't be talking about my cats (I don't have any). Today's is an example - oh, no, I've gone recursive. Well, the idea is that Friday posts may be thoughts on what I've learned that week, lists of things I found interesting, or other geeky topics that haven't quite fit into my narrow blog view to date. I've been using Twitter a lot over the past year (over 1000 updates), which has caused me to think there's a place somewhere between Twitter blurts and in-depth posts. I end up self-censoring to the point that I have literally hundreds of saved blog posts just waiting for my drive to crash and put them out of their misery. Unleash the hounds!

Three posts a week sounds reasonable to me. Jeff made a good case for his commitment to 5 posts a week. 3 posts a week is 60% of that, which is a solid D grade in anyone's book. Any extra posts might just keep me out of summer school.

Are you with me? Three posts a week, who's with me? Anyone?

Posted by Jon Galloway | 3 comment(s)

Inkscape to support XAML export

Today, Adam Kinney gave me the tip-off to some cool news: Inkscape is adding XAML export.

Great, what's Inkscape? Glad you asked! Inkscape is an open source vector graphics editor, like Adobe Illustrator. Rather than drawing in pixels (like you'd do in Photoshop, Paint.NET, etc.), you're drawing in vectors. Inkscape is a little unique in that it uses SVG (scalable vector graphics) as its base format. SVG is a W3C standard, with pretty good support in most non-IE browsers (check out SVG Tetris in Firefox 1.5 or higher). It's unfortunate that IE doesn't (and likely never will) support SVG, since browser support for standards based vector-based graphics and text could do so much for the web.

Although the drawing primitives in XAML are very similar to SVG, up until now there hasn't been a really solid way of developing assets in SVG and moving them to XAML.

Right now, there are only two real ways to do any serious design in XAML:

  1. Buy Expression Blend and Expression Design ($599)
  2. Buy Adobe Illustrator ($599) and use the XAML export plugin

Now that Inkscape is picking up XAML export, there's a third way that doesn't start with any buying.

Why this is important

Lots of reasons:

  • It’s nice to be able to say that you can create full featured Silverlight apps without having to buy a product, in the same way that a free csc.exe and the Visual Studio Express products really round out the .NET platform. Obviously, the Expression products are compelling for XAML creation, but it’s nice for the Silverlight platform to show that it’s not directly tied to a product.
  • Since Inkscape is cross-platform (running on Windows, Linux, and Mac OSX), this opens up cross-platform development quite a bit. Now you could create Silverlight apps on OSX with Inkscape and TextMate (or whatever it is they code in these days), then host the app on Linux. I'm guessing that a small minority will actually use Inkscape to do Silverlight / Moonlight development without running Windows, but the fact that they could suddenly makes the Silverlight platform feel a lot friendlier.
  • Inkscape has some really cool features that aren't in the Expression products yet. For example, Inkscape has pretty nice bitmap tracing, and the tile cloning is very powerful. Example:


Let's try it out!

Tim Heuer says the XAML export is already in the Inkscape nightly builds, so let's grab it and call his bluff. The Inkscape nightly builds are available as a 30MB 7z archive, so you'll need 7-zip to extract it. 7-zip is a great program with significantly better compression than the standard zip format, so it's worth grabbing if you don't already have it.

I kept my test pretty simple - a solid background, a bunch of translucent stars, and some text that I'd converted to paths.


Next, I save the drawing, selecting XAML as the output format.


Now I've got a XAML file, which can be opened directly in IE7.


I won't bore you with the XAML itself - since the text was converted to paths, it's really verbose. But it's valid XAML, and no different from what I'd expect to see if I'd created it in Expression. No, this isn't being hosted in Silverlight, but I'd fully expect that it should work without any problems - maybe if I get time I'll give that a shot.


This is very alpha right now. I've run the Inkscape nightlies many times over the past two years and they've generally been pretty solid, but it wasn't hard to get the XAML export to crash. To say that another way, it took several tries to come up with a simple design that wouldn't cause a crash when I exported it. For this example, I kept with very simple primitives and converted my text objects to paths. However, it's definitely working, and as Tim says, they're hoping to have a stable build by the Southern California Linux Expo in February.

Combined with the Moonlight plugin, it should be possible to design, develop, host, and view a Silverlight application on Mac or Linux without ever firing up Windows. Not that you should, necessarily, but you could. Neat.

Large file uploads in ASP.NET

Uploading files via the FileUpload control gets tricky with big files. The default maximum file size is 4MB - this is done to prevent denial of service attacks in which an attacker submits one or more huge files which overwhelm server resources. If a user uploads a file larger than 4MB, they'll get an error message: "Maximum request length exceeded."

Increasing the Maximum Upload Size

The 4MB default is set in machine.config, but you can override it in your web.config. For instance, to expand the upload limit to 20MB, you'd do this:

  <httpRuntime executionTimeout="240" maxRequestLength="20480" />

Since the maximum request size limit is there to protect your site, it's best to expand the file-size limit for specific directories rather than your entire application. That's possible since the web.config allows for cascading overrides. You can add a web.config file to your folder which just contains the above, or you can use the <location> tag in your main web.config to achieve the same effect:
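The first option - a minimal web.config dropped into the upload folder itself - could look like this sketch:

```
<?xml version="1.0"?>
<configuration>
  <system.web>
    <httpRuntime executionTimeout="240" maxRequestLength="20480" />
  </system.web>
</configuration>
```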

<location path="Upload">
        <system.web>
                <httpRuntime executionTimeout="240" maxRequestLength="20480" />
        </system.web>
</location>

What Happens When I Upload A File That's Too Big?

While expanding the upload restriction is a start, it's not a full solution for large file uploads. Milan explains one of the biggest problems with large file uploads in The Dark Side Of File Uploads:

It gets really interesting if someone uploads a file that is too large. Regardless of what your maxRequestLength setting mandates, IIS has to guzzle it, and then ASP.NET checks its size against your size limit. At this point it throws an exception.

As Milan explains, you can trap the exception, but it's trickier than you'd expect. He talks about overriding Page.OnError and checking for HTTP error code 400 when the error is an HttpException, which as he says is less than ideal.
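A minimal sketch of that idea, using Application_Error in Global.asax rather than overriding Page.OnError (the UploadTooLarge.aspx redirect target is hypothetical, and in practice the connection may already be in a bad state by this point, which is part of why Milan calls it less than ideal):

```csharp
// Global.asax.cs - a sketch of trapping the "request too large" error.
// The ~/UploadTooLarge.aspx redirect target is hypothetical.
void Application_Error(object sender, EventArgs e)
{
    HttpException ex = Server.GetLastError() as HttpException;

    // ASP.NET surfaces "Maximum request length exceeded" as an
    // HttpException reporting HTTP error code 400.
    if (ex != null && ex.GetHttpCode() == 400 &&
        ex.Message.Contains("Maximum request length exceeded"))
    {
        Server.ClearError();
        Response.Redirect("~/UploadTooLarge.aspx");
    }
}
```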

At Least Give Me A Warning

If we've got a set limit on file upload sizes, we should at least tell our users what it is. Since this is a configurable value that we may change later, the best approach is to have our file size warning read directly from the web.config setting. We can do that by pulling back the httpRuntime section as an HttpRuntimeSection object, which isn't too hard:
// Requires using System.Web.Configuration;
System.Configuration.Configuration config = WebConfigurationManager.OpenWebConfiguration("~");
HttpRuntimeSection section = config.GetSection("system.web/httpRuntime") as HttpRuntimeSection;
// MaxRequestLength is expressed in KB, so divide by 1024 to get MB
double maxFileSize = Math.Round(section.MaxRequestLength / 1024.0, 1);
FileSizeLimit.Text = string.Format("Make sure your file is under {0:0.#} MB.", maxFileSize);

A Real Solution: an HttpModule to Handle File Uploads

There are better solutions to handling large file uploads in ASP.NET. A custom HttpModule or HttpHandler can provide a better user experience by displaying upload progress and allowing you to handle a file size problem in a more controlled fashion. Here's a summary from a cursory search:

One of the longest running threads on the ASP.NET Forums (going back 5 years): HttpHandler or HttpModule for file upload, large files, progress indicator?
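To give a rough idea of the HttpModule half of that discussion, a module can inspect the Content-Length request header at BeginRequest and short-circuit obviously oversized posts before ASP.NET's own size check throws. This is only a sketch - the 20MB limit and TooLarge.aspx page are hypothetical, and IIS still has to receive the request body either way:

```csharp
using System;
using System.Web;

// A sketch of an IHttpModule that rejects oversized uploads early.
// The 20 MB limit and the ~/TooLarge.aspx page are hypothetical.
public class UploadSizeModule : IHttpModule
{
    private const int MaxContentLength = 20 * 1024 * 1024; // 20 MB

    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // ContentLength reflects the Content-Length request header,
        // so we can bail out before the body is parsed.
        if (app.Request.ContentLength > MaxContentLength)
        {
            app.Response.Redirect("~/TooLarge.aspx", true);
        }
    }

    public void Dispose() { }
}
```

The module would be registered under the <httpModules> section of web.config.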

A Better Solution: a RIA upload component

The project which caused me to look into this, Video.Show, was based on ASP.NET and Silverlight 1.0, so we weren't able to take advantage of RIA platforms which would support more advanced upload handling. In most cases, though, I'd recommend replacing the FileUpload component with a Silverlight or Flash based file upload control. In addition to a better upload experience, these controls generally look better than the generic button displayed for the <input type="file"> element which is rendered by the FileUpload control. The input / file element doesn't allow for CSS formatting, although smart CSS hackers always seem to find a way around these things.

Although there doesn't seem to be a project / component to support Silverlight uploads yet, Liquid Boy has a nice sample of a Silverlight 1.1 (oops, 2.0) based upload control. I've heard good things about SWFUpload, a Flash and JavaScript based upload system. Developers are responsible for handling a few JavaScript events as well as accepting the file on the server. That can be done pretty easily, as you can see from this ASP.NET sample implementation. Here's a screenshot from one of the SWFUpload online demos:
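The server-side half of an SWFUpload setup is just a standard ASP.NET upload page. Here's a sketch, assuming SWFUpload's default field name of "Filedata" and a hypothetical ~/Uploads target folder:

```csharp
// upload.aspx.cs - a sketch of the page SWFUpload posts files to.
// SWFUpload posts the file under the field name "Filedata" by default;
// the ~/Uploads folder is hypothetical.
protected void Page_Load(object sender, EventArgs e)
{
    HttpPostedFile file = Request.Files["Filedata"];

    if (file != null && file.ContentLength > 0)
    {
        // Strip any client-supplied path before saving
        string fileName = System.IO.Path.GetFileName(file.FileName);
        file.SaveAs(Server.MapPath("~/Uploads/" + fileName));
    }
}
```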


What About IIS7?

Oh, and there's one more thing to worry about. IIS7 has built-in request scanning which imposes an upload file cap, defaulting to 30MB. Again, this is a good feature, but it gets in the way if you're looking to upload files larger than 30MB. Steve Schofield posted about how to change this from the command line:

appcmd set config "My Site/MyApp" -section:requestFiltering -requestLimits.maxAllowedContentLength:104857600 -commitpath:apphost
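The same limit can also be set declaratively in web.config under the <system.webServer> section; 104857600 bytes is 100MB, matching the command above:

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes; 104857600 = 100 MB -->
      <requestLimits maxAllowedContentLength="104857600" />
    </requestFiltering>
  </security>
</system.webServer>
```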

Why is this such a pain?

Browsers, and HTML in general, were never designed to handle large uploads gracefully. Jeff Atwood and I discussed this back in September, and he summed up the issue pretty well in his post asking Why Are Web Uploads So Painful? It's disappointing that browser standards have failed us to the point that we need to use browser extension technologies like Flash and Silverlight as a band-aid here. Browsers should support uploads over both HTTP and FTP (it is the file transfer protocol, after all), and should manage the upload in a sidebar or toolbar that allows us to continue browsing without breaking the upload. The Firefox Universal Upload add-on is an example of how this should work out of the box.
Historical trivia: In ASP.NET 1.0 and 1.1, the entire file was loaded into memory before being written to disk. There were improvements in ASP.NET 2.0 to stream the file to disk during the upload process.