What is code quality?

Recently I had the chance to examine, in detail, each of the code analysis rules built into Visual Studio 2008. These are, almost exactly, the same rules you will get in FxCop 1.36. In fact, as you consider working MS code analysis into your development lifecycle, you have three options:

  • Use the built-in code analysis in Visual Studio on the development machines
  • Use FxCop as part of the continuous integration/build process
  • Use both

Using the built-in analysis is a bit less flexible than FxCop. The extensibility that makes FxCop so useful is not officially supported in the built-in tool. However, creating custom rules and adding them to the rules that VS makes available through the IDE seems to work just fine; it's just "not supported". It seems likely that it will be in a future version. The built-in tool is configured by per-project settings in VS. So, while you could define a corporate project template in VS easily enough, the built-in solution becomes more of a suggestion, and each project can drift from the starting point. To define a corporate standard that can be enforced in the check-in/build process, the FxCop stand-alone executable seems like it would better fit the bill. I am actually thinking that both, applied with a good measure of understanding, patience and determination, could really help a shop kick up the quality of its code over time.
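For the build-process route, the stand-alone FxCopCmd.exe can be pointed at the compiled assemblies and the results fed back into the build. A rough sketch of what that invocation might look like; the project file and assembly names here are made up for illustration:

    FxCopCmd.exe /file:MyCompany.Core.dll /project:CorporateRules.FxCop /out:FxCopReport.xml /summary /console

The .FxCop project file is where the agreed-upon rule selections would live, checked into source control, so every build runs against the same standard instead of whatever a developer's local project settings happen to say.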

These rules are all based on the Framework Design Guidelines published by MS and now available in book form (that's not a pay-per-click link, go ahead and click it all day if you want). As such you will find that the rules are definitely slanted toward code that is meant to be consumed by other developers. In that role, applying naming conventions and staying consistent with the patterns the .NET Framework has established for event handlers, exception handling, etc. will provide a familiar experience to the .NET developers who are going to use our code. It also raises the bar on the core pieces of our application that will be re-used throughout the project and, perhaps, even across teams. Realistically though, no team is going to apply all of these rules; even Microsoft realizes that. Even within a team, I can see how different subsets of these rules will be applied based on the nature of the project. I think the real benefit comes from caring about code quality enough to bake it into your development lifecycle. Each shop is going to come up with its own flavor of standards and its own recipe for how to suggest/enforce them.
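To make that "familiar experience" point concrete, here is a small sketch of the event pattern the guideline rules nudge you toward (the OrderProcessor class and its members are invented for illustration):

    using System;

    public class OrderShippedEventArgs : EventArgs
    {
        public OrderShippedEventArgs(int orderId)
        {
            OrderId = orderId;
        }

        public int OrderId { get; private set; }
    }

    public class OrderProcessor
    {
        // EventHandler<TEventArgs> rather than a custom delegate type
        public event EventHandler<OrderShippedEventArgs> OrderShipped;

        // Protected virtual On... method so derived classes can extend the behavior
        protected virtual void OnOrderShipped(OrderShippedEventArgs e)
        {
            EventHandler<OrderShippedEventArgs> handler = OrderShipped;
            if (handler != null)
            {
                handler(this, e);
            }
        }

        public void Ship(int orderId)
        {
            // ... do the actual work, then raise the event ...
            OnOrderShipped(new OrderShippedEventArgs(orderId));
        }
    }

Nothing clever, but because it follows the shape used throughout the framework, anyone consuming the class already knows how to subscribe to it and how to derive from it.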

As I went through the rules I was impressed with the detail. I was happy to see some pet peeves listed, such as CA2001 Avoid Calling Problematic Methods, which screens for code that calls, among other methods, GC.Collect(). It is always nice to know when a dev has slipped that one in to "fix" a problem he was having with memory consumption. You have also got to love some of the rule names, such as my favorite: CA1505 Avoid unmaintainable code. Gee, I never would have thought of that! :-) Maybe we should have one called "Avoid buggy code". Kidding aside, the rule overlaps with the Visual Studio Code Metrics tool: it is based on the structural complexity of the code as indicated by cyclomatic complexity, lines of code and so on. So the rule is valid; I just love the name.
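For reference, this is the sort of thing CA2001 is there to catch. The class and method below are made up, but the GC.Collect() call is the real offender the rule looks for:

    using System;

    public class BatchJob
    {
        public void ProcessBatch()
        {
            // ... allocate and chew through a large object graph ...

            // CA2001 flags this call: forcing a collection usually papers over the
            // real problem, which tends to be something holding a reference it shouldn't.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
    }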

There are also rules that are based on idiosyncrasies of the CLR that I was unaware of and that proved to be an education. One example:

CA1809  Avoid excessive locals 

       A common performance optimization is to store a value in a processor register instead of memory, which is referred to as enregistering the value. The common language runtime considers up to 64 local variables for enregistration. Variables that are not enregistered are placed on the stack and must be moved to a register before manipulation. To allow the possibility that all local variables get enregistered, limit the number of local variables to 64.  

Now, my first thought was "who needs 64 locals in a method?!" Still, I had not heard of enregistering prior to this. It may never affect a single line of my code, but I still love to know what is going on underneath the code that I write. A little googling led me to some good articles that mention the subject:

.NET Performance - The Crib Sheet

Writing High-Performance Managed Applications
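Not that I plan on counting registers, but the practical fix for a rule like CA1809 is the usual advice anyway: break the giant method up. A hedged sketch of that shape (the class, method and helper names are all invented):

    // Instead of one sprawling body that accumulates dozens of intermediate
    // locals (and can creep past the 64 the JIT will consider for registers),
    // extract the pieces so each method keeps a small, register-friendly set.
    public static class ReadingStats
    {
        public static double Summarize(double[] readings)
        {
            double mean = Mean(readings);
            double range = Range(readings);
            return mean + range / 2;
        }

        private static double Mean(double[] readings)
        {
            double sum = 0;
            foreach (double r in readings)
            {
                sum += r;
            }
            return sum / readings.Length;
        }

        private static double Range(double[] readings)
        {
            double min = double.MaxValue;
            double max = double.MinValue;
            foreach (double r in readings)
            {
                if (r < min) { min = r; }
                if (r > max) { max = r; }
            }
            return max - min;
        }
    }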

CA1809 is just one example among many. Reviewing the rules in detail proved much more enlightening than I had expected, and I would encourage it as a project worthy of some time investment. I hope to blog about some of the other rules that I found interesting.

 

© Copyright 2009 - Andreas Zenker
