When TDD Goes Bad #2

I've been debating whether to post another entry on this for the last couple of weeks due to another bad smell I've spotted around Agile practices.  However, it's my blog, and I think it's worth saying.  But first, the bad smell...

Zealotry.  Many advocates of Agile practices have had to fight their corner hard against supporters of traditional methodologies.  Having had several such confrontations myself, it does become tiring, and you do begin to expect fairly incongruent arguments against it.  There is a danger, though, that the habit of dealing with arguments against Agile that don't stack up will lead to a blinkered view where it's assumed that any negative statement about the principles and approach should be dismissed.  No methodology is perfect - different circumstances call for different approaches, etc.  Also, just because a methodology, when applied correctly, is appropriate and should result in success, it doesn't mean that it will be applied correctly and appropriately.  People can, and will, get things wrong - especially when lacking experience.

So, TDD Gone Bad #2...  Mock-oriented development.  Mocks are a good thing where you really want to extract a dependency and ensure you're testing the right thing.  But there are a couple of issues that can arise:

  1. Poor integration tests, as everything is being tested in isolation - we can end up with a system where the constituent parts are clean, isolated, well tested, and known to be correct.  But how they fit together is a greyer (or even blacker) area unless mocking is accompanied with a complement of integration tests. Dave describes this well in his blog, so there's no need for me to go on about it here: http://www.twelve71.org/blogs/dave/archives/000616.html
  2. Mock-oriented design.  Whilst mocks are good at removing dependencies, their introduction regularly alters the design of the system.  I've seen numerous occasions where the introduction of mocks has added a large amount of complexity to an otherwise simple design.  This complexity leads to higher implementation costs, a higher cognitive load on the developers working on the system, and higher maintenance costs (as there's more code to maintain).  All of which go against the principle of "the simplest thing".  The irony is that the introduction of mocking can, sometimes, make completing a system far more time consuming due to the different levels of granularity, and the additional code required to implement interfaces, etc.  Again, that's not to say mocks aren't very useful things, just that they're a tool to be used where appropriate, not a pattern to base the foundations of your entire implementation on...
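To make point 2 concrete, here's a minimal sketch (the class and method names are mine, not from any real system): a direct design that builds its own collaborator, and the mock-driven variant that adds an injection seam purely so a test double can be substituted.

```python
from unittest.mock import Mock

class TaxCalculator:
    RATE = 0.175

    def tax_for(self, amount):
        return amount * self.RATE

class Invoice:
    """Direct design: builds its own collaborator - nothing to mock."""
    def total(self, net):
        return net + TaxCalculator().tax_for(net)

class InjectableInvoice:
    """Mock-driven variant: the collaborator is injected so a test
    double can stand in - extra indirection carried by the design."""
    def __init__(self, calculator):
        self.calculator = calculator  # anything with a tax_for() method

    def total(self, net):
        return net + self.calculator.tax_for(net)

# Testing in isolation via a mock:
calc = Mock()
calc.tax_for.return_value = 17.5
assert InjectableInvoice(calc).total(100.0) == 117.5
calc.tax_for.assert_called_once_with(100.0)
```

The mock buys isolation, but every caller now has to know how to construct and wire an `InjectableInvoice` - that's the cost side of the trade-off.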

Whenever mocking is used, the value that the mock gives versus the cost it will introduce over the lifetime of the system should be measured (or at least estimated/considered).


  • I have read your TDD gone bad posts a couple of times now, and have let it sink in. The more I think of it, I have to admit you do have a point. I also read the article "Goal 1 - Mission 0", which I think touches the same issue.

    I have fallen into the trap of thinking I have to test everything. And because I want every test to be isolated, that means a lot of mocking.

    What I discovered was that my tests dig too deep into the implementation and no longer test just external behaviour. That means my tests are coupled to the implementation, which is not how it should be.

    Ron Jeffries' latest bowling game articles, where he reuses the very same tests to explore different implementations and designs, show very well how it should be done.

    So a simple test on the tests: If I can't use them to implement a totally different design, then they are too coupled to the implementation.

    Perhaps this is a testing smell or anti-pattern?
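Thomas's "test on the tests" can be sketched like this (hypothetical functions, loosely echoing the bowling example): one behavioural test, expressed purely in inputs and outputs, that passes unchanged against two entirely different implementations.

```python
def score_iterative(rolls):
    """Open-frames-only scoring: just sum the pins, with a loop."""
    total = 0
    for pins in rolls:
        total += pins
    return total

def score_recursive(rolls):
    """Same behaviour, completely different design."""
    if not rolls:
        return 0
    return rolls[0] + score_recursive(rolls[1:])

def passes_behavioural_tests(score):
    # Only inputs and outputs - nothing about how score() works inside.
    return score([]) == 0 and score([1, 4, 4, 5, 6, 4]) == 24

# The same tests validate both designs, so they aren't coupled to
# either implementation:
assert passes_behavioural_tests(score_iterative)
assert passes_behavioural_tests(score_recursive)
```

If the tests had asserted on internal calls or intermediate state, swapping the implementation would have broken them - which is exactly the smell being described.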

  • Thomas - totally agree. One of the most disheartening (TDD) things I've ever seen was when a design had evolved that was clearly not fit for purpose given the latest set of requirements. Unfortunately, a change in design could not be made because of the cost of reimplementing tests against the new design. Rather than testing the business requirements, it was the implementation detail that was being tested. I see misuse of mocking as contributing to that greatly, as it lets you delve much deeper into the system, rather than testing inputs + outputs.

    When it comes to mocking, I generally work on the principle that if you have end to end ownership and visibility of the system, that mocking *may* not be necessary at all. It really comes into its own where external integration is required, or the output from a system isn't measurable.

  • The increased dialog on the subject is great, b/c it brings to light all the positives and negatives that are really there. The crux of this discussion is that it's important to not be blindly religious to any approach, or we fall victim to misapplying something when it is not valuable. A hammer is an excellent tool when we want to drive a nail in that probably doesn't need to be extracted. If it does need to be extracted, a drill and a screw would work better. Neither the hammer nor the drill is better, they just fit better for the situation.

    TDD is an excellent new tool and we should be really examining where we can use it. The fear of misapplying this tool should not stop us from learning and improving. JS Greenwood is correct in that people will do it incorrectly, but fear of failure should never hold us back from progress.

    I say let's keep the dialog going, and find a way to make this new tool work.

  • James,

    It would be great to see some examples - ideally code - of designs with and without mocking to illustrate the issues you describe. I'm mainly interested in the Mock Oriented Design paragraph - code samples to illustrate that would be very helpful.



  • Rob,

    Sorry for the delay; I'd not noticed the comment on this post following all the XPSS shenanigans. I'll try and find a good real-world example as soon as I get an opportunity - I'll have to be careful which (prior) employer I pick a non-sensitive one from, mind... :)

    However, a general thing that I've spotted that really irks me is the increased usage of interfaces when mocking is being done based on them (rather than inheritance). As far as I'm concerned, interfaces should be saved for where you definitely _do_ want to be able to replace one implementation with another. If they are only defined and implemented to support mocking, then I feel that the simplicity of the design has very much been compromised for the tests. Not least because there are then security implications of having anything supporting that interface being "pluggable", along with cost & maintenance issues around now having to test against the interface, and so on.
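A small sketch of that interface point (hypothetical names; Python's `abc` standing in for a .Net interface): the direct design is one concrete class, while the mock-driven design carries an abstract interface whose only second implementation is the test double.

```python
from abc import ABC, abstractmethod

class SmtpMailer:
    """Direct design: one concrete class, used wherever mail is sent."""
    def send(self, to, body):
        return f"sent to {to}"  # imagine real SMTP delivery here

class Mailer(ABC):
    """Mock-driven design: an interface that exists only so a test
    double can implement it - the running system has one real
    implementation, so the "pluggability" serves the tests."""
    @abstractmethod
    def send(self, to, body):
        ...

class SmtpMailerImpl(Mailer):
    def send(self, to, body):
        return f"sent to {to}"

class RecordingMailer(Mailer):
    """Test double - the interface's only other implementation."""
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))

mailer = RecordingMailer()
mailer.send("rob@example.com", "hello")
assert mailer.sent == [("rob@example.com", "hello")]
```

Everything below the first class is extra surface area - definition, implementation, and anything that can now be plugged in - that exists only to make the mock possible.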

  • While I'm fairly new to the mocking game, I believe that there's a commercially available mock framework (POCMock) that allows you to mock .Net classes, static methods, etc. without having to alter your design to accommodate your unit tests.

    I strive to write grey box tests that take into account my intimate knowledge of the code I'm testing / designing, but test inputs/outputs for verification rather than inspecting the guts of the objects.


  • >> a general thing that I've spotted that really irks me
    >> is the increased usage of interfaces,

    I have to agree with JSG on this statement. The design on our current project definitely has an increased number of interfaces. In addition, there are a number of factories that instantiate the correct instances (MOCK or real). On the other hand, we are absolutely assured of the interaction between the system and each of the interface implementations.

    MOCKing is similar to design patterns though. Just be smart with them.
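The factory arrangement described in the comment above might look something like this sketch (all names and the `USE_MOCKS` switch are hypothetical): a factory decides whether callers get the real implementation or the mock.

```python
import os

class RealPaymentGateway:
    def charge(self, amount):
        raise RuntimeError("would contact the real payment provider")

class MockPaymentGateway:
    def __init__(self):
        self.charges = []

    def charge(self, amount):
        self.charges.append(amount)
        return "APPROVED"

def payment_gateway_factory(use_mock=None):
    """Return the mock during test runs, the real gateway otherwise."""
    if use_mock is None:
        use_mock = os.environ.get("USE_MOCKS") == "1"
    return MockPaymentGateway() if use_mock else RealPaymentGateway()

gateway = payment_gateway_factory(use_mock=True)
assert gateway.charge(9.99) == "APPROVED"
```

The factory is more code to maintain, but it centralises the mock-or-real decision in one place instead of scattering it through every caller.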
