TDD: Who is testing the tester?

I posted earlier about the book "Test Driven Development in Microsoft .NET", and I liked that book very much, so it's not the book itself that bothers me. I've been using JUnit and NUnit in smaller and larger projects, but I've never used them the way you often see described in books written by the TDD gurus, and the way it is described in this book. I can't help it, but it feels kind of stupid to write a test as the very first thing you do, just to see it fail to compile, write just enough code to make the test compile and then watch the test fail, then add "dummy code" so that the first test passes but the rest fail... you know the drill :)
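For reference, here is a minimal sketch of that drill in JUnit; the Calculator class and its add method are made up purely for illustration:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Step 1: write the test first. Calculator doesn't exist yet, so this won't even compile.
public class CalculatorTest {

    @Test
    public void addReturnsTheSumOfTwoNumbers() {
        Calculator calc = new Calculator();
        assertEquals(5, calc.add(2, 3));
    }
}

// Step 2: write just enough code to make the test compile.
// Step 3: "fake it" with a hard-coded return so this one test passes...
// Step 4: ...and only generalize once a second test (say add(1, 1)) forces you to.
// In a real project Calculator would of course live in the production code, not here.
class Calculator {
    int add(int a, int b) {
        return 5; // dummy value that satisfies the first test only
    }
}
```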

Then there is this thing about testing every class in every layer of a system or application. Writing that many tests manually would probably make me go nuts, and the quality of the test methods would most certainly drop after a while. The value of unit testing is only as good as your test code, right? I've seen some tools that try to generate test code for you automatically, but I've not been that impressed.

The way I've been using JUnit is that I've created tests against the facade layer or (web) service layer and then added them as a target in my ANT build scripts. That has worked out pretty well. If someone on the team changes something in the database or some other layer, the tests fail and you (most of the time) get a pretty good stack trace showing where the problem lies. Even so, writing really good tests for that layer alone (with scripts for setting up database tables and populating them with test data) can take quite some time. Of course, it depends on what kind of system you're building and whether the service/facade interface takes complex parameters that make it hard to test properly, but imagine writing tests for the service layer, business layer, utilities and data access layer too. You would probably spend more time writing test code than anything else :)
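To show what I mean by testing at the facade level, here is a rough sketch in JUnit; CustomerFacade, Customer and the setup details are hypothetical placeholders, the point is just that a single test drives every layer underneath it:

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Sketch of a facade-level test. CustomerFacade stands in for whatever facade or
// service class the application actually exposes; in a real project the call would
// go through the business layer and data access layer all the way to the database.
public class CustomerFacadeTest {

    private CustomerFacade facade;

    @Before
    public void setUp() {
        // This is where the scripts would run that create the test tables and load
        // known test data (for example from an ANT target or a small SQL runner).
        facade = new CustomerFacade();
    }

    @Test
    public void findCustomerGoesThroughEveryLayer() {
        // If someone changes the schema, the DAL or the business layer and breaks
        // this call, the test fails and the stack trace points at the guilty layer.
        assertEquals("Acme Corp", facade.findCustomer(42).getName());
    }

    @After
    public void tearDown() {
        // ...and this is where the test tables would be dropped so every run starts clean.
    }

    // Minimal placeholders so the sketch compiles on its own; a real facade would
    // delegate to the layers below instead of returning a canned customer.
    static class CustomerFacade {
        Customer findCustomer(int id) { return new Customer("Acme Corp"); }
    }

    static class Customer {
        private final String name;
        Customer(String name) { this.name = name; }
        String getName() { return name; }
    }
}
```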

Some would probably say TDD, XP and refactoring are something invented by a few smart guys so they would get longer contracts and get to spend more time coding and less time documenting ;) I just wish some of my customers had believed in those methods back when I was a consultant :p

Personally, I'm going to continue to use the parts of TDD, XP and refactoring that I like, because I think these methods (not that refactoring is a development method, but anyway) have some really good bits in them. Besides, I love reading the books about them.

7 Comments

  • Personally, I would never hire a developer with this sort of attitude.

  • You mean a developer who uses these methods just to get longer contracts, more programming and less documenting? I'm not saying that's the reason these methods were invented, and I don't think that's the case :)



    I do believe, with my limited knowledge of these methods, that you might get to spend more time developing the system, but the result may actually be better: the customer gets what he really wants and the quality of the code (and the system) is higher. But since I've never run a full-fledged XP or TDD project, I can't say. Not looking forward to writing a zillion test methods manually though :D

  • I have to agree with your comments. I can see how writing tests the way the book explains can be very powerful and give you 100% tested code, but I have worked on projects where the amount of test code far exceeded the amount of production code, and then you have the pain of dealing with tests so fine-grained that you can't change any of your code without having to fix hundreds of tests.



    I believe the amount of testing should be based on the complexity of the code you are writing. If you are writing a complex algorithm, then sure, write lots of tests for it; otherwise keep your tests short and sweet, so you can refactor later without fear of a 5-minute code change turning into an 8-hour test-fixing exercise.

  • I think the point of XP and TDD is to make them fit your particular needs and environment. That is what sets these practices apart from something like CMM. And I think this attitude demonstrates that understanding.



    100% test coverage is impossible; the best we can do is use our best understanding to strengthen the weakest parts of a system with testing. Kind of like the way a civil engineer might reinforce the stressed points of a structure: you wouldn't waste time and money reinforcing what you don't need to!

  • I guess I use TDD pretty much the same way you do, and I like using it that way. One thing differs though: I write my tests against the business layer. The reason for this is that I might expose the same business functionality through more than one facade/Web Service. I would only write tests for the facade/Web Service layer if I'm doing some kind of transformation or anything else special in that layer. Do you see any particular advantages with testing at the facade/Web Service layer? When I think about it, it feels very good to be able to say that you test from the outside in a service-oriented architecture. I might reconsider how I do this. :-)

  • Personally, I find that people who claim "I would never hire XYZ" as their only response to an article are almost always not the people who are in a position to hire anyone.

  • Like Russell, I did go down the path of testing every method, branch, etc., and one simple but significant change (say in the eventing model) forces you to go back and change hundreds of tests. I did stumble across a solution that gets you code coverage where you want it and leaves the rest a bit more flexible: Just-In-Time testing, which I found on Len Holgate's blog. www.lenholgate.com/archives/000359.html



    Essentially you go ahead and set up your framework for testing any class you might want to, but if the solution is apparent without testing, don't bother. When you hit a bug because you're not smarter than the code, you can easily go back to your framework and write a test for that case. Now I have confidence that my code is being tested in the right places, and that I'm not wasting countless hours over a simple architecture change.
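As a rough sketch of that just-in-time idea (OrderCalculator and the "negative discount" bug are invented purely for illustration): the fixture for the class exists up front, and a test is only added once a real bug shows it is needed.

```java
import org.junit.Test;
import static org.junit.Assert.assertTrue;

// Sketch of just-in-time testing: the test fixture for the class exists up front,
// but individual tests are only added when the code proves it needs them.
public class OrderCalculatorTest {

    @Test
    public void discountIsNeverNegative() {
        // Added the day a bug report showed orders with zero items producing a
        // negative discount; the fix stays pinned down by this test from now on.
        OrderCalculator calc = new OrderCalculator();
        assertTrue("discount must never be negative", calc.discountFor(0) >= 0.0);
    }

    // Minimal placeholder so the sketch compiles; the real class would live in
    // production code, not in the test file.
    static class OrderCalculator {
        double discountFor(int itemCount) {
            return itemCount >= 10 ? 0.10 : 0.0; // fixed version: never negative
        }
    }
}
```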
