I think I may have been a bit harsh yesterday in my review of Practical Guidelines and Best Practices for Microsoft Visual Basic and Visual C# Developers. Although I stand by my statements, I wanted to expand on and clarify them based on some of the feedback I received (especially a response from one of the authors).
First and foremost, there are lots of good tips in this book on a whole variety of things, from remoting to threading. In fact, there are many more good tips than bad. But when you title your book "Best Practices," you really can't afford a single bad paragraph (let alone the number I found). With countless books to choose from, readers will look to a "Best Practices" book as the standard for development. Therefore, such a book must be held to a higher standard than others.
One of the first things I touched on was credibility (of which I said the authors had none). Now, I didn't intend for this to be demeaning or insulting; it's simply a statement of fact. There are few (if any) among us who have the credibility to make statements of fact without any backing. Just as your professors would say, you need to reference and back up your facts. Yes, "MSDN Regional Director" is a prestigious title, but it is nowhere near "Turing Award Recipient." And from reading some of their books, even those folks qualify their statements of fact.
In a number of cases, the authors did provide a reason for the "magic number" they chose. For example (and I do not have the book in front of me), they said not to have more than sixty-four local variables per method. The reason they gave was that the JIT compiler has to fall back to a less efficient method of allocating memory. Fair enough, but that is the wrong primary reason to give for not having more than sixty-four local variables per method.
Let’s think about that for a minute. Do we really want a developer to think, "Gee, I’d love to add a sixty-fifth variable to my method, but that’ll just kill JIT performance"? What the authors should have said is that managing sixty-four variables in a single method makes the code hard to fricken’ follow. Harder to follow means more bugs. Had the authors bothered to look at nearly forty years of computer science research, they would have discovered that the trade-off between maintainability, bugs, and speed is the subject of many, many studies.
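The maintainability point is easy to demonstrate. Here is a deliberately tiny, hypothetical sketch (the class and method names are mine, not from the book, and it's in Java rather than VB or C# purely for illustration) of the same calculation written monolithically and then split into small helpers. At realistic scale, the refactored form is the one your maintainer can actually follow:

```java
// Hypothetical illustration: the argument is readability, not JIT behavior.
// A method that juggles many locals is hard to follow; extracting small,
// well-named helpers means each method holds only the locals it needs.
public class InvoiceTotals {

    // Monolithic style (abbreviated here -- imagine sixty-four of these).
    static double totalMonolithic(double price, int qty,
                                  double taxRate, double discount) {
        double subtotal = price * qty;
        double discounted = subtotal - (subtotal * discount);
        double tax = discounted * taxRate;
        double total = discounted + tax;
        return total;
    }

    // Refactored: each step gets a name, and no method tracks more
    // than two or three values at once.
    static double subtotal(double price, int qty)        { return price * qty; }
    static double applyDiscount(double amt, double disc) { return amt - (amt * disc); }
    static double applyTax(double amt, double rate)      { return amt + (amt * rate); }

    static double totalRefactored(double price, int qty,
                                  double taxRate, double discount) {
        return applyTax(applyDiscount(subtotal(price, qty), discount), taxRate);
    }

    public static void main(String[] args) {
        double a = totalMonolithic(10.0, 5, 0.08, 0.10);
        double b = totalRefactored(10.0, 5, 0.08, 0.10);
        System.out.println(a == b); // prints true: identical arithmetic, easier to read
    }
}
```

Both versions perform the same operations in the same order and return the same result; the only thing that changed is how much a reader has to hold in their head at once.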
Francesco Balena’s response to my criticism of his focus on speed was:
"[M]ost of the techniques you consider as questionable can make your code run faster by at least 50%, or more. If the offending statement appears in a tight loop they can save you a significant amount of time, not just a few CPU cycles. In a server-side component this sort of optimization makes the difference and can positively affect scalability - I am surprised you missed the point."
I respectfully disagree, Francesco. You missed my point. The book is targeted not towards the gurus and experts, but towards beginner- and intermediate-level developers. For them I will quote the two rules of optimization from M.A. Jackson (Principles of Program Design, 1975):
Rule 1. Don't do it.
Rule 2. (for experts only) Don't do it yet.
Catch the date on there? Even in 1975 experts understood the problems with the authors’ line of thinking. Believe me when I say your customer would prefer code that works to code that saves forty nanoseconds out of three milliseconds. Yes, tuning and optimization are important, but they are not an appropriate theme for a book with this target audience.
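Jackson's rules have aged well in part because modern JIT compilers already perform the classic micro-optimizations for you. As a hedged illustration (again in Java with names of my own invention, not from the book): strength reduction turns a multiply by a power of two into a shift automatically, so the "clever" hand-written version buys nothing and costs clarity:

```java
// Hypothetical illustration: hand micro-optimization vs. clear code.
public class ClearFirst {

    // The version a beginner might write after reading too many speed tips.
    static int doubleItClever(int x) {
        return x << 1; // "shifts are faster than multiplies"
    }

    // The clear version: says what it means. A JIT compiler performs
    // strength reduction and emits the same shift anyway.
    static int doubleItClear(int x) {
        return x * 2;
    }

    public static void main(String[] args) {
        System.out.println(doubleItClever(21) == doubleItClear(21)); // prints true
    }
}
```

The two compile to equivalent machine code, so the only lasting effect of the "optimization" is that the next reader has to stop and decode it.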
This is exactly what I was thinking when I described the book as "dangerous." Under a different title, I’d give the book a "C" rating and say it’s "okay." But as a "Best Practices" book, it will encourage developers to ask "is it good for the JIT?" instead of "is it good for my successor?" I’ve seen code like that. I’ve seen the cost of code like that. Believe me when I say it’s not pretty.