Newbies to .NET often criticize its memory management and its garbage collector,
but the criticisms are typically based on a lack of understanding and nothing more.
For instance, many have observed that trivial .NET applications allocate about 20MB,
but the incorrect assumption that often follows is that the .NET runtime needs it.
Instead, we know that allocating memory is time-consuming, so it's generally better,
when there is lots of memory, to just go ahead and allocate a big chunk all at once.
This means that our .NET applications can actually perform better in many cases,
since they don't have to constantly allocate and deallocate memory as it's needed.
Similarly, the garbage collector is often unfairly criticized by .NET newbies,
because the implication of GC is that memory is not released as quickly as possible.
While this is true, it's generally an acceptable trade-off when memory is plentiful,
since garbage collection frees us from having to worry about memory ourselves.
That's right, we know that GC means we easily end up with more reliable systems,
with fewer memory leaks, without investing tons of time managing memory manually.
So what's the point of this post? Sadly, not all systems are such simple cases.
I've been working with a large .NET WinForms application that uses lots of very
large objects (datasets actually), which means that memory is often running low.
To make matters worse, this application is supposed to run in a Citrix environment,
which means there will be multiple instances of this application at the same time.
Our Citrix servers have 4GB of RAM and dual processors, which sounds like a lot,
but that memory and horsepower has to be shared among many concurrent users here.
The existing Access-based applications we are replacing ran in this environment,
and it was common to have up to 40 users working on the same Citrix box.
It's easy to assume, as have the business people, that since .NET is new and better,
then we can surely get 40 users on the same Citrix box with .NET applications too.
Well, it isn't going to happen -- at this point we'll be happy if we can get 20,
and prior to finding the bug in the XmlSerializer we wondered if 10 was possible.
Don't get me wrong, there are issues beyond just .NET here, like the rationale
for working with such large datasets, but then Access made it "seem" easy to do.
So what have I learned so far? First, .NET global memory performance counters
do NOT work -- they simply report the last sample from a single .NET application.
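If you want to verify this yourself, here is a minimal C# sketch that compares the
_Global_ instance against a specific process instance of the standard "# Bytes in
all Heaps" counter -- the category and counter names are the documented ones, but
take the _Global_ behavior as my observation, not gospel:

    using System;
    using System.Diagnostics;

    class CounterCheck
    {
        static void Main()
        {
            // Instance names match the process name, e.g. "MyApp" for MyApp.exe.
            string instance = Process.GetCurrentProcess().ProcessName;

            using (PerformanceCounter all = new PerformanceCounter(
                       ".NET CLR Memory", "# Bytes in all Heaps", "_Global_"))
            using (PerformanceCounter mine = new PerformanceCounter(
                       ".NET CLR Memory", "# Bytes in all Heaps", instance))
            {
                // Compare what _Global_ claims against one known process.
                Console.WriteLine("_Global_ heap bytes: {0:N0}", all.NextValue());
                Console.WriteLine("{0} heap bytes: {1:N0}", instance, mine.NextValue());
            }
        }
    }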
Next, contrary to what I've often read and been told, by many .NET experts too,
setting objects to null (or Nothing in VB) can make a huge difference after all!
Note, you can download a simple application I've created to see this for yourself
-- it tracks single or multiple processes, with or without the set-to-null cleanup.
My sample creates an approximately 100MB large object (a dataset or an arraylist),
but the process's private bytes quickly level off at 200MB, and even 300MB at times.
On the other hand, setting my objects to null keeps the private bytes steady at
100MB, although there are certainly still the expected momentary spikes to 200MB.
This may not matter for small footprint apps, but it can be quite critical for
large applications, let alone cases where there are multiple such applications.
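To make this concrete, here is a stripped-down sketch along the lines of my sample,
assuming an arraylist of roughly 100MB held in a field -- the real download does
more, like tracking multiple processes and using datasets:

    using System;
    using System.Collections;
    using System.Diagnostics;

    class NullCleanupDemo
    {
        // Held in a field, like the datasets in our application, so the
        // reference stays alive until it is explicitly cleared.
        static ArrayList big;

        static void Main()
        {
            for (int pass = 0; pass < 10; pass++)
            {
                big = new ArrayList();
                for (int i = 0; i < 1000; i++)
                {
                    big.Add(new byte[100 * 1024]); // roughly 100MB in total
                }

                Console.WriteLine("Pass {0}: private bytes = {1:N0}",
                    pass, Process.GetCurrentProcess().PrivateMemorySize64);

                // The cleanup in question: comment this line out and private
                // bytes level off around 200-300MB; leave it in and they hold
                // near 100MB, plus the expected momentary spikes.
                big = null;
            }
        }
    }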
My sample also calls the Clear method, and Dispose when it's defined, but separate
tests showed that they did not actually make a difference -- only setting to null did.
That said, a colleague has convinced me that not calling Dispose when it's defined
is just asking for trouble; after all, why would it be defined if not for a good reason?
A look at the source code can even prove some Dispose methods do nothing at all,
but that's an internal implementation detail that you shouldn't rely on.
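For what it's worth, the easy way to honor that advice in C# is the using statement,
which calls Dispose deterministically even when exceptions occur. Note that
LoadOrders below is just a hypothetical stand-in for whatever fills the dataset:

    using System.Data;

    class DisposeExample
    {
        static void ProcessOrders()
        {
            // using guarantees Dispose runs when the block exits, without
            // assuming anything about what Dispose does internally.
            using (DataSet orders = LoadOrders())
            {
                // ... work with the dataset ...
            }
        }

        static DataSet LoadOrders()
        {
            return new DataSet("Orders"); // placeholder for real data access
        }
    }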
OK, so what else have I learned? .NET relies on a low memory notification event,
which occurs when there is 32MB of RAM left on systems with 4GB of RAM or less.
I need to be very careful here, since I still don't fully understand this myself,
but it seems that memory management and garbage collection are different things.
Garbage collection occurs rather frequently, which reduces the committed bytes,
but that does NOT mean the memory is given back to the OS for other processes.
Instead, the reserved memory associated with the process stays with the process,
which is why my earlier example often saw private bytes rise even to 300MB.
Apparently, the reserved memory is only given back to the OS to use for other
processes when the overall available system memory drops to this 32MB threshold.
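You can see this distinction with a small sketch -- my assumption being that
comparing GC.GetTotalMemory against the process private bytes is a fair way to
show it -- where the GC heap figure drops after a collection while the private
bytes often do not:

    using System;
    using System.Diagnostics;

    class HeapVsPrivateBytes
    {
        static void Main()
        {
            byte[][] chunks = new byte[1000][];
            for (int i = 0; i < chunks.Length; i++)
            {
                chunks[i] = new byte[100 * 1024]; // roughly 100MB in total
            }
            Report("after allocating");

            chunks = null;
            GC.Collect();
            GC.WaitForPendingFinalizers();

            // The GC heap figure drops here, but the private bytes may not --
            // the freed memory stays reserved for this process.
            Report("after collecting");
        }

        static void Report(string label)
        {
            Console.WriteLine("{0}: GC heap = {1:N0}, private bytes = {2:N0}",
                label, GC.GetTotalMemory(false),
                Process.GetCurrentProcess().PrivateMemorySize64);
        }
    }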
For a single process, especially one that also sets its large objects to null,
this isn't really an issue -- there is lots of memory and no competition for it.
But with multiple processes, each consuming large objects, this can be an issue!
That's right, imagine my Citrix case with just a dozen users, which isn't many.
A couple of processes do large operations, and even after their garbage collection,
they continue to tie up large amounts of memory reserved for their future use.
A little later a couple of other processes begin large operations, and push the
overall available memory below 32MB, at which point the notification event occurs.
The problem is that the new operations can't wait, so paging quickly increases,
and with multiple processes the paging can begin to overwhelm everything else.
So in my opinion the 32MB threshold is simply too late for systems like this!
It can probably be argued that this is an OS problem, and not a .NET problem,
but I think they are related since garbage collection must first free memory.
I ran just three instances of my simple application on my own 512MB computer,
and when the objects were not set to null it became swamped almost immediately.
Things got drastically better when I changed to setting my objects to null,
but adding another process or two still swamps the system very quickly anyhow.
True, I won't be running multiple users on my personal computer anytime soon,
but I should be able to run multiple applications on it at the same time, right?
My conclusion is that .NET makes it very difficult to run multiple applications
that handle large objects, whether on Citrix or just your own 512MB computer.
The garbage collector that is great for smaller and/or single applications is
just too lazy when combined with this 32MB low memory notification threshold.
Again, there's no doubt that my particular scenario should have been designed
differently, with some type of limits to avoid so many large objects in memory.
But there's also no doubt that this system would support more users if it were
NOT saddled with this lazy garbage collector, as our older systems were not.
Oh well, I've gone on long enough, and probably said some foolish things too.
My frustrations are not so much about .NET, since I think it's great for the proper
scenarios, which are most cases -- my frustrations are that these types of issues
are simply not documented, and there is too much incorrect or misleading data.
This should not stop you from creating most of your applications with .NET still,
but I would very much like some “real” advice on the larger scenarios like this.