My current project (which I am very excited about!) is building an internet-facing ASP.NET application for a high-profile function. It involves building business objects that map to the database and the general functions of the system, and then surfacing those objects in the UI. And now the fun part ... the entire application has been built using TestDrivenDevelopment (TDD)!
The business layer is being developed test-first using NUnit, and likewise the ASP.NET UI using NUnitAsp. The rest of the architecture:
- SourceGear Vault for Source Control
- Visual Studio .NET 2003
- DAL - Thycotic.Data
- Business objects - hand coded (with the help of a custom rolled generator) and using NullableTypes for exposed properties
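As a sketch of what an exposed property can look like (the class and property names here are illustrative, not the project's actual code), a business object wraps a database-nullable column in a NullableTypes struct rather than a plain int:

```csharp
using NullableTypes;

// Illustrative business object - not the project's actual code.
public class Customer
{
    private NullableInt32 _creditLimit = NullableInt32.Null;

    // Exposing NullableInt32 instead of int lets a database NULL
    // round-trip through the business layer without magic values.
    public NullableInt32 CreditLimit
    {
        get { return _creditLimit; }
        set { _creditLimit = value; }
    }
}
```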
- NAnt script for pulling out of Vault, compiling, running NUnit unit tests and pushing successful builds to integration server (running every 30 mins on a scheduled task - CruiseControl.NET doesn't support Vault yet ...)
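A trimmed sketch of what such a NAnt build file can look like (the target names, paths and Vault command line are illustrative assumptions, not our actual script):

```xml
<project name="myapp" default="test">
  <!-- Get the latest source. The Vault command-line client is
       driven via <exec> here; the arguments are illustrative. -->
  <target name="get">
    <exec program="vault.exe"
          commandline="GET $/myapp -destpath c:\builds\myapp" />
  </target>

  <!-- Compile the business layer and its unit tests. -->
  <target name="build" depends="get">
    <csc target="library" output="build\MyApp.dll" debug="true">
      <sources>
        <includes name="src\**\*.cs" />
      </sources>
    </csc>
  </target>

  <!-- Run the NUnit tests; a failing test fails the build,
       so only green builds reach the integration server. -->
  <target name="test" depends="build">
    <nunit2>
      <test assemblyname="build\MyApp.dll" />
    </nunit2>
  </target>
</project>
```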
What is the problem? TDD is new to the other developers and management. This means that occasionally there is a tendency to skip test driving and just add a feature (a null check, a private method, etc.) without a failing test. If we were always pair programming this would be less of an issue, but our deadline is too tight to lose the estimated 15% additional time required to pair program. We did pair program on the really critical areas of the system - base objects, establishing our data access pattern, exception management and security. However, the areas of code that missed TDD are a great risk as they stand the best chance of containing bugs and swallowing developer time (and this has already happened on a few occasions). In true XP style we need an automated tool to help us catch any lack of coverage ...
- Integrates with our NAnt process (copy the source tree, instrument it, run the unit tests, generate a coverage report)
- Provides a metric that we can track, making management happy by giving them numbers that confirm the TDD process (see the example report on the NCover website)
- Points us towards the code with the least coverage (and greatest risk) so we can either delete it (dead code) or write a test for it after the fact (not the nicest ... but better than no test!)
- Runs automatically and provides continuous feedback with no intervention required once configured
I am very impressed by NCover. We have looked at the code (which compiled and passed all its unit tests right after the download!) and it is really neat. Well done to the developers! It would be nice to know more about the coverage strategy (at the moment it uses a set of regular expressions to find branches in the C# code) ... after looking at the instrumented code, it may need a little tweaking to catch every possible execution path, but it is an amazing and very welcome product.
I will be speaking at Microsoft Developer Days 2004 back in Pittsburgh on March 9th. It looks to be a very exciting event with a strong emphasis on security. I will be presenting “Threats and Threat Modeling - Understanding Web Application Threats and Vulnerabilities” on the Web Development Track.
There will be great speakers from the local area and an opportunity to get the latest message from Microsoft as well as interact with developers in your community. Maybe we can even get in a plug for the Pittsburgh .NET User Group?
Sign up now!
The Refactoring folks talk about CodeSmell and CodeDeodorant. The concept is that a code smell is when there is a feeling that something *could* be wrong with a piece of code. CodeDeodorant is when an attempt is made to cover up the smell by adding whitespace (for “clarity”) or detailed comments to explain the code. These are usually signs that the code needs improving ...
The arrival of automatic documentation tools like Javadoc and NDoc often triggers an impulse to generate reams of HTML documentation for your API. However, documentation comments suffer from the same drawbacks as comments in the code:
- They are not executed and therefore could contain errors or could be out of date.
- They are also a burden and time drain to keep updating as your API evolves.
If you are following TestDrivenDevelopment (TDD) then you already have a complete set of unit tests that exercise all the features of your code. What better examples could you provide for your API than a full set of passing unit tests?
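For example, a passing NUnit test reads like executable documentation (this StringHelper class and its test are hypothetical, just to illustrate the point):

```csharp
using System;
using NUnit.Framework;

// Hypothetical API under test.
public class StringHelper
{
    public static string Reverse(string input)
    {
        char[] chars = input.ToCharArray();
        Array.Reverse(chars);
        return new string(chars);
    }
}

// The test documents both the signature and the expected behavior,
// and unlike an XML comment it fails loudly if the API changes.
[TestFixture]
public class StringHelperTests
{
    [Test]
    public void ReverseReturnsCharactersInOppositeOrder()
    {
        Assert.AreEqual("cba", StringHelper.Reverse("abc"));
    }
}
```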
In conclusion, TDD makes XML documentation comments like hairspray. Nice to look at, a pain to modify and certainly not essential! :-)
Disclaimer: If your customer agrees that a documented API brings business value or your API will be published to developers without a source code release, then by all means generate documentation.
Thanks to Jason Alexander from the nGallery team for helping to formulate this idea.
I presented “Remote Scripting in .NET” for the first time last Wednesday (1/28/2004). The event was held at Microsoft's offices in Findlay, Ohio. There were about 30 people, including quite a few Microsoft employees. The presentation digs deep into how Remote Scripting works and gets rather technical in places - this knowledge is not necessary to use Remote Scripting, but it is often useful to really understand what is happening. It also looks at how the landscape has changed with .NET and Microsoft phasing out support for their JVM. I was worried that I might have lost the audience in places, but was very pleasantly surprised by the number of probing and interesting questions I received.
There was a certain feel of Remote Scripting vs. Web Services, which I should have anticipated but didn't. They are different tools and both have their strengths ... as with all technology decisions - it depends on your requirements!
Presentation slides and demo code are available for download.
The demo code is interesting in that it implements a simple remote call to look up a book title by ISBN from clientside script in a webpage using:
1) Microsoft Remote Scripting client (applet) with ASP
2) msrsclient client with ASP
3) Microsoft Remote Scripting client (applet) with ASP.NET
4) msrsclient client with ASP.NET
5) IE behavior client with ASP.NET Web Service.
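As a flavor of option 5, the IE WebService behavior lets client-side script call an .asmx service in a few lines. This fragment is a sketch only - the element ids, service URL and method name are my own assumptions, not the actual demo code:

```html
<!-- The webservice.htc behavior must be deployed alongside the page. -->
<div id="service" style="behavior:url(webservice.htc)"></div>

<script language="javascript">
function init()
{
    // Attach the WSDL and give the service a friendly name.
    service.useService("BookLookup.asmx?WSDL", "Books");
}

function lookupTitle(isbn)
{
    // Asynchronous call; onResult fires when the response arrives.
    service.Books.callService(onResult, "GetTitle", isbn);
}

function onResult(result)
{
    if (!result.error)
        alert(result.value);
}
</script>
```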
Some of the interesting questions that came up:
Q - What about security?
A - You can use Remote Scripting over HTTPS/SSL.
Q - What about using Remote Scripting from a Windows Forms application?
A - Hmmm ... a web service would certainly be a much easier fit, since you would have to write a WebRequest-based client or do something really hokey with a hidden Browser control!
Q - Which is more scalable and which yields better performance?
A - I plan to run some benchmarks. I suspect that Remote Scripting is lighter weight due to using a less verbose protocol although it is based on System.Web.UI.Page which may have more overhead than a custom HTTP handler as used by Web Services.
Many thanks to Greg Huber for carting me around during my stay in Toledo last week. It was a fun time and I even got to visit Tony Packo's - a Toledo tradition.
The TDD session at NWNUG went well. There were some interesting questions and also some interesting possible solutions suggested for the “PropertyComparer” demo - including using the Queue class - but luckily we fell back on DoTheSimplestThing and just used an ArrayList.
David Claydon made an interesting observation while chatting after the presentation - we had implemented a reasonably complex sorting feature using Reflection in .NET and did not use the debugger once thanks to the TDD style of development.
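A minimal sketch of the idea (my own reconstruction, not the demo code): an IComparer that uses Reflection to sort an ArrayList on a named property.

```csharp
using System;
using System.Collections;
using System.Reflection;

// Sample element type - the demo's actual classes were different.
public class Book
{
    private string _title;
    public Book(string title) { _title = title; }
    public string Title { get { return _title; } }
}

// Compares two objects on the value of a named public property.
public class PropertyComparer : IComparer
{
    private string _propertyName;

    public PropertyComparer(string propertyName)
    {
        _propertyName = propertyName;
    }

    public int Compare(object x, object y)
    {
        // Look up the property on the element type and compare values.
        PropertyInfo property = x.GetType().GetProperty(_propertyName);
        IComparable left = (IComparable) property.GetValue(x, null);
        return left.CompareTo(property.GetValue(y, null));
    }
}
```

An ArrayList of Book objects can then be sorted with `list.Sort(new PropertyComparer("Title"));` - DoTheSimplestThing indeed.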