When TDD Goes Bad #1.2

So, #1.1 was all about the "business" - the people defining requirements, and how that process can cause issues.  #1.2 is just a short entry about the underlying statement I was trying to make in the original post:

Anti-pattern: Inexperienced or misguided developers keep on testing past the point where it's realistic or financially sensible to do so, believing that this is what TDD is and that they're "adding value".  As we all know, that's not what TDD is.  It's something we should try to avoid through coaching, mentoring, and working closely with development teams, helping to give them ways of judging the value of any piece of work.

It's all too easy to work with a group of savvy technical people who get TDD, and not be able to see how it could go wrong.  If you try to scale this across an enterprise where some people just don't get it, and the support isn't there to keep things moving in the right direction, things can, and do, go wrong.
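To make the anti-pattern concrete, here's a minimal, hypothetical sketch (the `Order` class and its discount rule are invented for illustration, not taken from any real codebase): one test exercises trivial code where no defect can realistically hide, while the other pins down an actual business rule.

```python
class Order:
    """Hypothetical order with a made-up discount rule, purely for illustration."""

    def __init__(self):
        self.lines = []

    def add_line(self, price, quantity):
        self.lines.append(price * quantity)

    def total(self):
        # Business rule worth testing: 10% discount once the subtotal exceeds 100.
        subtotal = sum(self.lines)
        return subtotal * 0.9 if subtotal > 100 else subtotal


def test_new_order_has_empty_lines():
    # Low-value test: it verifies a one-line constructor, effectively
    # testing the language runtime rather than any behaviour we wrote.
    assert Order().lines == []


def test_discount_applies_over_threshold():
    # Higher-value test: a real defect could hide in the discount logic,
    # so this test earns its keep.
    order = Order()
    order.add_line(60, 2)  # subtotal 120, discounted to 108
    assert order.total() == 108
```

The judgement being asked for above is exactly the ability to tell these two apart, and to stop writing the first kind once the second kind is covered.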


  • OK, now I get your point, and I completely agree! =)

  • I don't see why people have a problem with James's comments. The situation he is describing is no different from those in more traditional environments that I'm sure we've all experienced, e.g. analysis paralysis and testing to the nth degree at the end of a waterfall lifecycle.

    As was stated in comments on the last post, it's knowing when to stop. Anything can be taken to extremes and just because we all love TDD doesn't mean that it can't be misused/misunderstood.

    This, to me, is where a developer having some knowledge of traditional testing techniques and practices comes into play. A good tester knows when to stop and when diminishing returns kick in. Being able to assess this is not necessarily a skill a developer has. We've all seen code vastly over-engineered for the job at hand because the developer wanted to make it "perfect", "bullet-proof" and "flexible". And yes, I know XP aims to put a stop to that, but you (hopefully) get my point.
