*raises right hand*

In case you haven't seen it yet, Jan Gray invites us all to take the Managed Code Challenge. Here's an excerpt:

What Is the Secret to Writing Faster Managed Code?

Just because you can get more done with less effort is not a license to abdicate your responsibility to code wisely. First, you must admit it to yourself: "I'm a newbie." You're a newbie. I'm a newbie too. We're all babes in managed code land. We're all still learning the ropes—including what things cost.

When it comes to the rich and convenient .NET Framework, it's like we're kids in the candy store. "Wow, I don't have to do all that tedious strncpy stuff, I can just '+' strings together! Wow, I can load a megabyte of XML in a couple of lines of code! Whoo-hoo!"

It's all so easy. So easy, indeed. So easy to burn megabytes of RAM parsing XML infosets just to pull a few elements out of them. In C or C++ it was so painful you'd think twice, maybe you'd build a state machine on some SAX-like API. With the .NET Framework, you just load the whole infoset in one gulp. Maybe you even do it over and over. Then maybe your application doesn't seem so fast anymore. Maybe it has a working set of many megabytes. Maybe you should have thought twice about what those easy methods cost...

Unfortunately, in my opinion, the current .NET Framework documentation does not adequately detail the performance implications of Framework types and methods—it doesn't even specify which methods might create new objects. Performance modeling is not an easy subject to cover or document; but still, the "not knowing" makes it that much harder for us to make informed decisions.

Since we're all newbies here, and since we don't know what anything costs, and since the costs are not clearly documented, what are we to do?

Measure it. The secret is to measure it and to be vigilant. We're all going to have to get into the habit of measuring the cost of things. If we go to the trouble of measuring what things cost, then we won't be the ones inadvertently calling a whizzy new method that costs ten times what we assumed it costs.
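To put Jan's XML point in concrete terms, here's a rough C# sketch (the file path and the "total" element are made up for illustration): the first method pays to turn the whole infoset into objects just to pull out one value, while the second streams past everything it doesn't need.

    using System.Xml;

    class XmlCostDemo
    {
        // The easy way: pull the entire infoset into memory, then grab one value.
        static string ReadTotalTheEasyWay(string path)
        {
            XmlDocument doc = new XmlDocument();
            doc.Load(path);                                   // the whole file becomes objects
            XmlNode node = doc.SelectSingleNode("//total");
            return node == null ? null : node.InnerText;
        }

        // The cheaper way: stream through the file and stop at the element we need.
        static string ReadTotalTheCheapWay(string path)
        {
            XmlTextReader reader = new XmlTextReader(path);
            try
            {
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element && reader.Name == "total")
                    {
                        return reader.ReadString();           // just this element's text
                    }
                }
                return null;
            }
            finally
            {
                reader.Close();
            }
        }
    }

Neither version is "wrong"; the point is knowing which one you're paying for before you call it in a loop.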

Good comments. I really appreciate it when someone brings info like this to life. Too often we code just to solve the problem at hand, without considering what things like unused variables and intensive string concatenation do to a system. We take advantage of the fact that a gallon of gas is more expensive than a megabyte of RAM (and people wonder why I like to work from home), and we write sloppy code. I've been guilty of it... we all have.
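Since I brought up string concatenation, here's a quick sketch of the difference (the rows array is just a stand-in): building a big string with '+' copies everything already built on every pass, while StringBuilder keeps appending into one buffer.

    using System.Text;

    class ConcatDemo
    {
        // Builds the output with '+': every += allocates a new string and copies
        // everything built so far, so the cost grows roughly with the square of the size.
        static string BuildWithPlus(string[] rows)
        {
            string result = "";
            foreach (string row in rows)
            {
                result += row + "\r\n";
            }
            return result;
        }

        // Builds the same output with StringBuilder: appends go into one buffer,
        // and only the final ToString() produces a new string.
        static string BuildWithStringBuilder(string[] rows)
        {
            StringBuilder sb = new StringBuilder();
            foreach (string row in rows)
            {
                sb.Append(row);
                sb.Append("\r\n");
            }
            return sb.ToString();
        }
    }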

One of the easiest ways to combat this is a printed code review. You catch things on paper that you might miss on the screen. Remember high school? Well, some of you may remember it better than others ;). Did you just write essays and hand them in without revisions? Were there times when you had your mom or someone else go over them to catch your mistakes? Why not apply those same rules to your code? Go ahead, burn through 40 sheets of paper. Your code will be cleaner, faster, and more consistent for it. You never know, you may eliminate 20 unnecessary variable declarations, like I did the other day reviewing GenX.NET 3.0. You may think that's nothing, but think what happens with 10 concurrent users... 25... 100... 25,000... you get the idea.

After I'm done with my second editing sweep, I intend to do a third pass, using Jan's information as a guide. The code already runs 60% faster. Maybe I can tweak a few more percentage points out of it. Maybe you should do the same with the code you should be writing while you're reading this ;).
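And if you want to know whether a pass like that actually buys you anything, take Jan's advice and measure it before and after. Here's a crude timing harness; GenerateSpreadsheet() is just a hypothetical stand-in for whatever you're tuning, and Environment.TickCount only has millisecond resolution, so loop enough times to get a readable number.

    using System;

    class MeasureIt
    {
        static void Main()
        {
            TimeIt("baseline", 100);
        }

        // Times a code path by running it in a loop and reporting elapsed milliseconds.
        static void TimeIt(string label, int iterations)
        {
            int start = Environment.TickCount;
            for (int i = 0; i < iterations; i++)
            {
                GenerateSpreadsheet();            // the code path under test
            }
            int elapsed = Environment.TickCount - start;
            Console.WriteLine("{0}: {1} ms over {2} runs", label, elapsed, iterations);
        }

        // Hypothetical stand-in for the real work being measured.
        static void GenerateSpreadsheet()
        {
        }
    }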
