Unit Testing, Agile Development, Leadership & .NET - By Roy Osherove
It's mostly a question of effective education style, assuming that the industry wants more people to learn testing -- maybe many don't, so that they can make more money being "specialists".
BTW, I meant to say that I totally agree with this article, Roy, and on the Twitter convo that birthed this post.
Obviously from our twitter conversations, I agree with you.
As an evangelist working for Microsoft, there's another interesting corollary here. I have been hearing from our Regional Directors and other community influencers (some MVPs, etc.) that there is actually still a large number of "mainstream" .NET developers out there who don't understand Generics. If you don't understand Generics, I'd imagine it would be next to impossible to truly "grok" some of the SOLID principles and some of the things we are required to do with Generics just to get around the statically-typed system we deal with.
For the sake of adoption of both SOLID and Unit Testing, I think decoupling the two is a very good idea. And in the end, I believe it will help the adoption of both of them, instead of holding developers back.
I may be in the minority here, but I would much rather be faced with maintaining a crappy-designed system that is loaded with good unit tests, than a really good system that has zero unit tests.
Of course, both would be ideal, but we have to do some work and mentoring to be able to get there. And that just isn't possible if people get discouraged in the first week. People often learn by doing, and if they're having to "sit on the sidelines" to learn a bunch of stuff before they can really contribute, you are going to have much larger problems to deal with in the future.
Thank you again, Roy, for a very well-thought-out post. I really hope that we can achieve what you map out here. This coupling has bothered me for some time now, though I've been more narrowly focused on TDD as a stand-in for the SOLID-coupled camp. Unfortunately, most sources you find when searching for information on "unit testing" are (right now) going to be those with this tight coupling. Changing that will be... difficult...
"...just get people to start testing today and learn SOLID later..."
That sounds like you want to sell me something :) and that magical "just" keyword is not just (he-he) a coincidence.
I agree with the idea of the post, but there is an unfortunate and very real problem with it. Unit testing poorly written/designed code is really tough. I almost gave up on the idea years ago because it was so hard to write tests for my poorly designed applications. I've spent the last couple of years learning about better design practices and now I find TDD much easier.
Overall I like the idea, I just think it's going to be hard to teach.
If someone is so poor as a developer that they cannot understand and apply SOLID (5 simple rules which should take no competent developer more than a couple of weeks to grasp) ... why do you think they are competent enough to write even a simple unit test?
As an example ... our current project has a class for ProductCode ... It requires no mocking to test, requires no application of SOLID to write or test, and has 70-odd unit tests around it ... To write those tests well took far more experience and time than would ever be required to learn and apply SOLID.
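To make that concrete for readers who haven't seen such a class: a simple value object can be covered by plain NUnit-style tests with zero mocking. This is only a sketch; the ProductCode shown here and its validation rules are invented for illustration, not the actual class from that project:

```csharp
using System;
using System.Text.RegularExpressions;
using NUnit.Framework;

// Invented example: a small value object with its own validation rules.
public class ProductCode
{
    public string Value { get; private set; }

    public ProductCode(string value)
    {
        // Assumed format for illustration: two letters, a dash, four digits.
        if (value == null || !Regex.IsMatch(value, @"^[A-Z]{2}-\d{4}$"))
            throw new ArgumentException("Invalid product code: " + value);
        Value = value;
    }

    public string Category
    {
        get { return Value.Substring(0, 2); }
    }
}

[TestFixture]
public class ProductCodeTests
{
    [Test]
    public void ValidCode_ExposesCategoryPrefix()
    {
        Assert.AreEqual("AB", new ProductCode("AB-1234").Category);
    }

    [Test]
    public void MalformedCode_Throws()
    {
        Assert.Throws<ArgumentException>(() => new ProductCode("1234-AB"));
    }
}
```

No interfaces, no doubles, no SOLID needed; the hard part is picking the 70-odd cases worth testing, not the mechanics.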
I think the part that may be the issue here is that good design patterns are trivial to learn ... applying them well is harder, but still not rocket science ... TDD in any meaningful way is considerably harder than both, long before you get to mocking or SOLID.
>>>I may be in the minority here, but I would much rather be faced with maintaining a crappy-designed system that is loaded with good unit tests, than a really good system that has zero unit tests. <<<<
I would rather have the opposite. A well written system has far less need of unit tests ...
- unit tests are there to document (well written code needs little documentation)
- unit tests are there to verify expected behaviour (well written interfaces, good use of ISP, good use of short methods and clear SRP/SoC mean that observation is usually sufficient for most code)
- unit tests are there to prevent future development breaking previously working code (well written code will isolate changes, and significantly reduce the impact upon other code areas)
You argue that TDD could be better off alone, without SOLID, and that people could learn SOLID later.
But since SOLID is a prerequisite of good TDD, why would you want to turn this upside down?
To make $$$?
I would much rather people started learning SOLID and did no TDD.
So it's SOLID then TDD, not TDD (without the design aspect) and then SOLID.
Pingback from Gabriele Lana » La legge implicita del test unitario
Pingback from Dew Drop - September 27, 2008 | Alvin Ashcraft's Morning Dew
Ultimately, on the purely theoretical side, I agree that TDD and SOLID design have no reason to be coupled. If they assist each other, that's awesome, but there's no need to make one a prerequisite for the other.
However, looking at the bigger picture, there are other factors to consider. To me, it seems that software engineering is in some sort of trouble, mainly because a lot of those who work in the field are insufficiently educated and unskilled in their craft, thus dragging the entire industry down.
The high demand for software, and the "programming is easy for anyone" approach cultivated by many for the past couple of decades, have made people believe they can practice software engineering without being proficient in it, and have contributed to software engineering becoming an industry filled with mediocre workers and mediocre quality products.
Making TDD easier on the programmer who doesn't employ SOLID will cause greater integration of TDD principles in the industry, but at the same time it will legitimize building software that is badly designed, as long as it's tested.
The current coupling of TDD and SOLID presents a sort of healthy ultimatum to the uneducated programmer: either learn how to design well and gain the ability to test your code, or stay uneducated. Simply put: if you want to be any good, you have to learn how to practice your craft well.
This choice might leave some in the warm embrace of amateurism, true; but it will cause others, who wish to become better at what they do, to learn how to design software well AND test it.
With TDD simplified and the barrier for entry lowered, a middle class will be created: that of programmers who know how to test their code, but don't know how to design it. Some will go on to learn SOLID, but many, I fear, will be comfortable staying at that level - which would be a loss for the industry as I see it.
Simplifying TDD creates a sort of "magic bullet" that gleams at the uneducated programmer, hinting: "there's no need to know how to do things right". I feel that in the current state of the industry, this is not a good idea.
I hold good design as more valuable than test coverage (this might be the only point where we differ, and the heart of this comment). I'd rather "win over" those who would take on SOLID for the sake of TDD, than "give up" those who would forsake SOLID because they have a simplified way to do TDD.
I hate to again be the Negative Nancy in the comments, especially when so many commenters are simply gushing over the post :), but I have to say that I'm with Casey and Morten on this.
Some things take time. Some things have prerequisites. If you can't isolate dependencies, you aren't doing unit testing. If you can't do unit testing, you aren't doing TDD. Unless Roy comes clean and starts advocating changing entirely to dynamic languages/platforms, SOLID is going to remain a prerequisite for isolating dependencies. As such, the ordering of the subject continually under discussion (which still misses the real point, and I'll get to that in a moment) is still and always going to be:
1. SOLID.
2. Unit testing with hand-crafted Doubles.
3. TDD.
With the introduction of mocking frameworks occurring either after 2. or after 3. depending on the people, the project, etc.
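For readers who haven't seen step 2 in practice, a hand-crafted double is nothing more than a class you write yourself against an interface, with no framework involved. A minimal sketch (all names here are invented for illustration):

```csharp
using System;

// Invented interface: the dependency we want to isolate.
public interface IMailSender
{
    void Send(string to, string body);
}

// Hand-crafted double: a fake written by hand, no mocking framework.
// It records the call instead of sending real mail.
public class FakeMailSender : IMailSender
{
    public string LastRecipient;
    public string LastBody;

    public void Send(string to, string body)
    {
        LastRecipient = to;
        LastBody = body;
    }
}

public class PasswordReset
{
    private readonly IMailSender _mail;

    public PasswordReset(IMailSender mail) { _mail = mail; }

    public void Reset(string userEmail)
    {
        // A real implementation would generate a token, etc.
        _mail.Send(userEmail, "Click here to reset your password.");
    }
}
```

A test then just news up the fake, passes it in, and asserts on LastRecipient; once that idea is comfortable, a mocking framework is merely automation of the same pattern.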
Now, to get to the elephant in the room that has been missed by just about every single post across the blogosphere that I've seen crop up due to this ongoing thread of discussion. At the end of the day, nothing I've written above is even relevant to the core problem. And nothing that Roy, or anyone else, has written about during these last few days is relevant to the core problem either. That's right. You've ALL missed the key point. And here it is:
There are two and only two reasons the TDD adoption rate is low.
1. There are people unwilling to improve the skills of their trade.
2. There is nowhere near enough focus on first convincing people of the value of test automation.
1. is a show-stopper, as there are lots of things to learn, lots of pre-requisites that create dependencies between the things that need to be learned, and it takes a willingness to make an effort to learn over time. Don't blame schedules. Don't blame management. Blame these people. Such avoidance of skills development is simply irresponsible and unprofessional, and we need to stop making excuses for them.
Even once #1 isn't an issue for a given person, you still have to convince them of the value of automated testing, and I have seen not a single mention of this in recent posts revolving around this subject.
In my experience, you can't "solve" #1. You can't change people. Not to this degree or in this manner, at least. You can either remove such people from the situation or remove yourself from it, and don't convince yourself that you can force them into it via standards and defined processes. Even if they play along, they'll just write shitty tests all day long in order to collect their paycheque, and you don't want those kinds of tests (or the kind of code they are testing) in your codebase.
As for #2, I have found that if you can convince people of the value of test automation, and admittedly it usually takes a bit of faith at the onset, some time to build up some automation, and finally demonstration of the resulting real value, as long as the people you are trying to convince aren't a member of the #1 group, they'll come around to ALL of it. The value of test automation. SOLID. Unit testing. Mocking. TDD. BDD. Beyond.
Start by admitting that #1 is a problem that you cannot solve. I know it sounds defeatist, and perhaps in a way it is, but it is also simply a pragmatic realization of the practical realities of the situation.
Then move onto #2. Lead by example. Put some test automation in place. Demonstrate its value over time. Do this and people will buy in. It will still take them time to learn the various mutually-dependent, mutually-beneficial, network-effect-generating skills involved, but they will buy in, and given time they will get there.
Can we please just get back to building great software and helping our peers develop their skills while doing it?
I think most of the people that are arguing for SOLID first (and foremost) and then everything else are missing one important point: not all of us poor developer souls are computer science (or related) graduates.
The industry is thriving with professionals who were lured into the field by the irresistible attraction of long hours in front of the keyboard, bringing to life something you designed (sorry Roy, beat you to the world's worst analogy).
Didn't the success of Ruby and similar dynamic languages teach the industry that it's not all about UML, refactorings, SOLID and all the OOD jargon which, though it makes for great interview show-off material, does not impress management, who talk about delivery and care less about maintainability, QA, CI, etc.?
Of course developers should care, and that's why Agile and TDD are causing a lot of stir and curiosity, but speaking from experience, the price of entry is so high that most just stay watching from the fence.
I am all for tools and frameworks that make all this attainable for us mere mortals, and I am sure that most would pick up SOLID and OOD as an evolution of already attained understanding and skills.
"It allows developers in a non-agile environment to do little, incremental steps on the road to a better way of working, without needing to chow down the whole meal in one big bite."
This is not simply 'hitting the nail' on the head; this is *sledgehammering the railroad spike* on the head. Excellent post, Roy. It seems that people don't understand that you don't want the software engineering discipline to become diluted by dimwits - on the contrary, you want to lead people down the right path, and you're tossing around some great ideas about how to do it.
Pingback from The reactionary voice of TDD | Steve Freeman
I've responded at www.m3p.co.uk/.../the-reactionary-voice-of-tdd but I did want to address a couple of points here.
- the people who developed TDD in Smalltalk came out of a very strong design culture. Just because Smalltalk is infinitely flexible didn't mean that people exploited every corner; the community had learned not to. Furthermore, Smalltalk has some key constraints (everything is an object) that make certain kinds of bad design very difficult.
- static typing is not a prerequisite for interaction testing. It makes certain checking easier but its implementation /much/ harder. Tim Mackinnon's Smalltalk mocking library is about 7 small classes.
- we're not talking about especially "great" design, just basic concepts that other people discovered up to 20 years ago.
Pingback from Arjan`s World » LINKBLOG for September 27, 2008
"I'd rather "win over" those who would take on SOLID for the sake of TDD"
This is exactly where the adoption rate is nearing zero. From my experience in the Enterprise (not the starship, silly, the corporation type) this is roughly what happens:
1. dev is concerned about quality issues
2. dev gets boss permission to look into automated testing
3. dev delves into TDD, some design concerns and prepares a presentation
4. dev enthusiastically presents his findings to the team. Team members rant about having to redesign. Team members rant about this going to take too much time. One of the experienced guys throws out an estimate that this is going to prolong the next feature's development by 200%. Boss wraps up the meeting thanking dev for the excellent research, and says something about following up on it once things settle down in the project
5. dev goes back to his cubicle and resumes churning code
So what happened, in the bottom line? Nothing changed. Nothing was gained. The individuals involved in this sort of story (and I've witnessed it unfold first hand at least twice) may all be excellent engineers with proven experience and overall nice guys. What happened? The barrier to entry was too high. People had to meet deadlines, and having to learn perfect SOLID and (taking from Roy's last post) the difference between a stub and a mock had them push the need for change down the to-do list, right after cleaning their keyboards.
I, on the other hand, would prefer lowering this barrier to entry and introducing people to the world of TDD and automated testing. Worst case? Code is poorly designed but is more stable and less buggy when it hits the market (or the organization). Likely case? After falling for TDD (like many people do), these likeable nice-guy engineers understand that their tests are sometimes fragile. Someone who just finished his CS degree will point out a high degree of coupling in the codebase. People may now approach healthy redesign with the safety net of regression tests and the associated confidence.
Bottom line - this is not a fight between the light side of design purity and the dark side of making tests so easy to write that they corrupt the product and the people writing it (how's that for a poor analogy?). I believe good design will follow good practice, and easy TDD will help people learn.
Pingback from Husbanding willpower | Steve Freeman
Your enthusiastic enterprise dev picked the wrong target; he was so concerned about this new fad of TDD that he forgot it pays more to refactor to a better code base in the first place.
No matter how magical TypeMock can be in some circumstances, badly designed code will still be hard to test, not just because you used a static, for example (which could be refactored to a testable approach in minutes by giving it a facade), but because it will be a tangled ball of mess that nobody will truly understand and be able to decipher.
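A sketch of the facade refactoring mentioned above, using DateTime.Now as the static in question (the IClock name and the surrounding classes are invented for illustration):

```csharp
using System;

// Legacy code calls the static DateTime.Now directly.
// Step 1: wrap the static behind a small facade interface.
public interface IClock
{
    DateTime Now { get; }
}

// Production implementation simply delegates to the static.
public class SystemClock : IClock
{
    public DateTime Now { get { return DateTime.Now; } }
}

// Step 2: have the class under test take the facade instead.
public class Invoice
{
    private readonly IClock _clock;

    public Invoice(IClock clock) { _clock = clock; }

    public bool IsOverdue(DateTime dueDate)
    {
        return _clock.Now > dueDate;
    }
}

// In a test, substitute a hand-written clock pinned to a known time.
public class FixedClock : IClock
{
    private readonly DateTime _now;
    public FixedClock(DateTime now) { _now = now; }
    public DateTime Now { get { return _now; } }
}
```

That is minutes of work for one static call; the point above stands that no facade rescues a genuinely tangled codebase.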
Forget TDD ... these people need to start learning good coding practices first ... then they need to learn to write unit tests (POUT) ... then they can learn TDD a *whole* long time after.
You do not start TDD on a legacy code base for legacy code unless you are going to refactor, otherwise you are further concreting your poor codebase against refactoring and improvement later.
It seems to me that you are suggesting that the only barrier to effective unit testing is that the code doesn't follow SOLID principles.
It rather seems to me that if you cannot get at least the 'S' followed, you don't actually have UNITS in the first place, and so testing the app is going to necessitate spinning the whole thing up just to get 'testable' code to run. This pattern will result in extremely SLOW unit tests, which in turn will stymie adoption of unit tests even on non-SOLID code.
I actually applaud your identification of the challenges to TDD adoption and don't consider your opinion unduly influenced by your Typemock affiliation, but let's not conflate TDD adoption with simple unit-test adoption. The barriers to BOTH are somewhat similar but also quite different in most cases.
BTW, speaking as someone who has been in software development for 15 years and never once took a course in anything remotely like computer science, software engineering, or anything else technology-related, this whole notion that only CS grads can somehow grasp why SOLID is an important set of principles is complete malarkey; SOLID = common sense, if anyone with half a brain would think about it for a second. Those who say "SOLID is too hard to comprehend" must never have been on a project that suffered from any of the long-term problems the SOLID principles are designed to mitigate.
Like anything else in life, once you have been burned once or twice, SMART people start to look around and ask "isn't there a better way?" and dumb people just do the same thing over and over again hoping for a different result. Tooling and techniques will never turn the latter into the former, no matter how 'approachable' the tool might make any one technique.
Pingback from Reflective Perspective - Chris Alcock » The Morning Brew # 189
It's really a question of how to be Agile about "how to be Agile".
It's about how to capitalize on the TDD benefits incrementally, as opposed to some sort of "waterfall"-like training program that is supposed to change the mindset of every team member in one big leap without any stepping stones in between.
I think you are right, Roy. Personally, I think code-smell-driven training is the best. It was the approach taken by Bob Martin in his book. If you have a certain smell, chances are that you are violating a principle.
Smell type A is usually the symptom of violation A or C. A and C each have an associated list of patterns that may be investigated.
That's what did it for me. Do TDD and investigate smells. Smells in the tests and smells in the codebase.
After a couple of weeks of crappy Internet connections, I finally managed to connect via my cell phone.
What's a unit test anyway?
Is it any class that happens to have a TestFixture attribute on it?
What value would that provide?
Wow, Udi is now trolling. What's next? JDN commenting in full agreement? :-)
On one of my recent projects, we were doing exactly what Roy is proposing. Every developer was creating classes that had the TestFixture attribute on them, with loads of methods that had the Test attribute on them. (We were told to write unit tests for the code we were developing, so that is what we did. And let me add, this was a team of 20-odd contractors, on very competitive daily rates, including me.)
The application here was a classic example of the anemic design model. Let me spell it out for everyone and say that where we ended up was very painful and quite useless. We had truckloads of tests that were written with no understanding of SOLID. It meant:
1. More often than not, only the developer who authored them understood them.
2. The tests had so many dependencies that quite often we ended up ignoring them.
3. The slightest refactoring and so many would break, and we would have no clue why, so we ended up ignoring most of them.
4. Running the entire test suite would take us somewhere between 1 and 2 hours.
5. Looking at a test, no one could tell what it was intended to test; the tests didn't remotely serve to document the code.
6. Over time we discovered there were even truckloads of bugs in these tests.
The only real value we got out of these tests was that when someone checked in code, eventually, a couple of hours later, the build would break or fail. From there on, let me add, finding what went wrong, and often why a test was failing, was no easier than debugging the application itself. 99% of the time I found myself debugging the tests with breakpoints to understand what the hell was happening.
Does this really help? Is that what you want, Roy, when you said, 'I'd rather have a poorly designed app with tests than an app designed on SOLID principles with no tests'?
You see, a dev who doesn't understand SOLID principles will write exactly the tests I have mentioned above. And exactly the same result will be achieved.
you should just stop talking. yer posts are freaking irritating and for the noobs.
Pingback from Software-Engineering » COEN 285 - Software Engineering
I think it's a bit strange that you recommend buying your book, because it's not even finished and has been in the works for a long time now, and because there are several things in the book that contradict what you are saying here on your blog. Here are some quotes:
"there are always those who feel that opening up the design to be more testable is a bad thing because it hurts the "object Oriented" principles the design is based on. I can wholeheartedly say to those people now "Don't be silly"." (page 53)
About Typemock: "Some people claim it can even be too powerful (I'm one of them)" (page 84)
Karsten: indeed, when I started writing this book, about two(!) years ago, I had different opinions on Typemock and testability (or rather, fewer dilemmas about these issues).
Those comments were changed to reflect my current thinking in the latest book cycle.
The good news is the book is just about done now.
regarding the comment on page 53 - I still hold that SOLID is a good way to design (on top of OOP) so "Don't be silly" still holds.
Also, suggesting my book was a bit tongue-in-cheek, since it was in the middle of me saying how people can profit off the situation.
There. I explained the joke. It's dead. happy?
My point is that you can write quality tests regardless of your design (though they can be a bit longer with tough designs).
Just like you can write really crappy tests even if you have a very good design to begin with.
Taking your numbered points in order:
1. Test readability can be achieved with simple guidelines.
2. Ignoring what, the tests or the dependencies?
3. Writing brittle, overspecified tests can be done in good designs as well (if you overuse mocks instead of stubs, for example).
4. So maybe these were integration tests to begin with, not unit tests? That would explain a lot of the previous points.
5. Again, test readability, which can be done badly in good designs as well.
6. Avoiding logic in your tests is a key guideline that can be learned, and it has nothing to do with good or bad design.
In fact, all the issues you've mentioned can be caught and handled well without teaching the developers what SOLID means.
That's my point.
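To illustrate the overspecification point with a hand-written fake (all names here are invented for illustration): the brittle version pins incidental details of the interaction, while the outcome-focused version survives harmless changes regardless of how well or badly the system is designed.

```csharp
using System;
using System.Collections.Generic;

// Invented dependency for the example.
public interface ILogger
{
    void Log(string message);
}

// Hand-written fake that records what was logged.
public class RecordingLogger : ILogger
{
    public List<string> Messages = new List<string>();
    public void Log(string message) { Messages.Add(message); }
}

public class OrderProcessor
{
    private readonly ILogger _log;
    public OrderProcessor(ILogger log) { _log = log; }

    public int Process(int quantity, int unitPrice)
    {
        int total = quantity * unitPrice;
        _log.Log("Processed order, total=" + total);
        return total;
    }
}

// Overspecified (brittle): pins the exact log wording, so any rewording
// of the message breaks the test even though behaviour is unchanged.
//     Assert.AreEqual("Processed order, total=50", logger.Messages[0]);
//
// Outcome-focused: asserts the result and, at most, that something was
// logged at all.
//     Assert.AreEqual(50, processor.Process(5, 10));
//     Assert.AreEqual(1, logger.Messages.Count);
```

Both styles compile and pass today; only the second keeps passing after an innocent refactoring, which is a test-writing skill, not a design skill.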
Udi: What the hell are you talking about..?
>Karsten: indeed, when I started writing this book, about two(!) years ago..
I fully understand that you have changed your mind.
>the good news is the book is just about done now.
That's great; I'm excited to read it, and also excited to see if you will change your examples to use Typemock instead of Rhino Mocks :-) Now don't get me wrong, I think the book (what I've read of it) is good.
>There. I explained the joke. It's dead. happy?
Hmm, what was the joke again? Do you recommend designing for testability?
Actually, I am still using Rhino in my book, because Typemock is still a commercial product.
Pingback from TDD without the design at Mark Needham
Pingback from Unit Testing for Developers and Managers
Thanks for this series of posts. I think you're making a very good, very relevant point.
By the way, Alistair Cockburn posted something very relevant here: www.infoq.com/.../cockburn-testing-guts
Ok. I think I am beginning to see what you are saying, Roy. What I am not sure about now is why this post has triggered the discussion of doing TDD without the SOLID principles.
I see this post as you saying to the average developer..
'Keep developing in your comfort zone; let me show you how to write good tests on top of your code, so you can have good maintainable tests to start with.'
I think this is achievable. But to write tests first and then code that makes the test pass without any SOLID principles is.. P.A.I.N.F.U.L
I agree and I've been telling people this for quite some time now.
Getting your code under test is valuable in its own right. Why wait until you can convince a whole team of people to radically change how they work to apply TDD processes?
The grassroots effort should proceed from conscientious application of SOLID principles, refactoring and Plain Old Unit Testing.
In other words, business as usual... but a little smarter and a little better. Not a sea change but a progression.