Unit Testing, Agile Development, Leadership & .NET - By Roy Osherove
Wonderful article. This could provide a blueprint for lowering barriers to entry and easing the burden on organizations and leagues of jaded and weary developers.
Pingback from Dew Drop - September 20, 2008 | Alvin Ashcraft's Morning Dew
I'm happy more people are getting on-board with this mock overloading. We discussed that exact same point at the first altnetuk conf :)
Pingback from Small Steps Blog » Blog Archive » TDD
I use Unit tests to drive out my design and use Rhino Mocks.
We use contractors a lot and I get a lot of resistance to writing unit tests.
I generally have to retrofit them after they have left.
I find it frustrating, but I do see their argument: they are there, as they see it, to "get the job done".
Unit testing, in my opinion, is just not the norm, and until we lower the bar it just will not be accepted.
I've recently moved to AAA with Rhino.Mocks, but I have found more resistance to the lambda-style Stub than to Playback and Record.
I much prefer stubbing via AAA and very rarely use expectations now.
The syntax is perhaps a bit cryptic for people who might not have used lambdas, or for that matter .NET 3.5, before.
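To make the AAA idea concrete without any framework syntax, here is a minimal hand-rolled sketch in plain Java (the `PriceService`/`Checkout` names are hypothetical, and this is not Rhino.Mocks API — just the same Arrange-Act-Assert shape with a stub substituted for expectations):

```java
// A functional interface lets us create a stub inline with a lambda,
// which is the same role Rhino.Mocks' lambda-style Stub plays.
interface PriceService {
    int priceOf(String sku);
}

// The class under test depends only on the abstraction.
class Checkout {
    private final PriceService prices;
    Checkout(PriceService prices) { this.prices = prices; }
    int total(String... skus) {
        int sum = 0;
        for (String sku : skus) sum += prices.priceOf(sku);
        return sum;
    }
}

public class AaaStubExample {
    public static void main(String[] args) {
        // Arrange: a stub with a canned answer; no expectations are recorded.
        PriceService stub = sku -> 10;
        Checkout checkout = new Checkout(stub);

        // Act
        int total = checkout.total("a", "b", "c");

        // Assert: we check the output, not which calls the stub received.
        if (total != 30) throw new AssertionError("expected 30, got " + total);
        System.out.println("total = " + total);
    }
}
```

The point of the shape is that the test reads top-to-bottom in three steps, with no record/playback mode switch to explain to newcomers.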
While I generally agree with the overall content of your post, I do feel compelled to state my disagreement with the following:
"Design for testability is just one way to get where you want to go. It is not the only way."
People get too hyperfocused on the whole "testability" phrase, forgetting along the way that each and every thing one might label as being solely or even primarily an aspect of improving testability was already long a member of good practices like S.O.L.I.D. There isn't a single thing one does to improve testability that isn't already a well-proven way to improve the overall maintainability of your solution.
Anything you can't do on the premise of improving testability doesn't just show that you have a testability problem. It also shows that you have a systemic problem in the overall maintainability of your solution.
"Sometimes it’s impossible."
To suggest that testability is impossible in your solution is to admit that maintainability is also impossible in your solution.
"To suggest that testability is impossible in your solution is to admit that maintainability is also impossible in your solution."
That is precisely the kind of purist view that drives me crazy.
And, yes: SOLID has been around for a long time, and people have not embraced it for a myriad of reasons, the first of which is the high learning curve.
I agree that testability isn't an end goal, but many in the industry view it as such (you equate it with good design, so achieving one means the other). I don't view it as a bar to anything; it is simply a by-product of following SOLID rules, not the other way around.
Do you need to design for testability in Ruby? Or Smalltalk?
Can you still get a good design in those languages?
First of all I found your post a very interesting / good read.
The claim that some good practices and principles haven't become mainstream because of the learning curve doesn't sound right to me. How can you expect improvement if you are not willing to invest? If you know that doing something a certain way is the proper way of doing it, why then not do it that way, even if it means a personal investment in learning? That is like saying you would stick with technology A because it is what you know and feel comfortable with, not because it is the best answer to the problem. There are so many people who just don't have an interest in learning new things; I think that is the real reason things don't become more accepted. For those people it doesn't matter how low the learning curve is — they won't do it anyway, unless it is enforced upon them.
The example of the external consultants not willing to write proper unit tests because they are there to "get the job done" is weird: if you specify that the job includes proper unit-test coverage, then that is what they should do. Anybody can make software; writing good software that is maintainable and extendable requires you to think, and principles like SOLID make you think about your design. Projects that have good design and are maintainable and extendable are usually also very testable. I am surely not saying that I know exactly how to do these things, but I am learning and improving.
Don’t you think that principles like SOLID and others help a design become more maintainable and extendable? Don't you agree that a maintainable design is usually also extendable, because maintaining an application very often means extending it, or at the very least changing certain behavior of it?
Roy Osherove has suggested a new name for mocks, fakes, stubs or any test double: Isolation. True, the
I am simply finding it really difficult to find developers (full time or contract) with any relevant TDD experience.
Either people do not see the benefit or are unwilling to put the effort into learning it.
I simply cannot justify the time, in terms of money, to get contractors up to speed with unit testing techniques like mocking.
"That is precisely the kind of purist view that drives me crazy."
Purist? Please try to avoid labelling people you don't know. I'm no purist. I'm a pragmatist. That said, my pragmatism says that you can't maintain what you can't test.
"SOLID has been around for a long time, and people have not embraced it for a myriad of reasons. first of which is the high learning curve."
Are you kidding? SOLID simply captures and puts a convenient acronym on techniques that already existed and had been well proven. I have yet to find any sense of surprise in anyone to whom I introduce SOLID. It is common sense to them as soon as they read it. Why? Because they've heard everything in it before. They just haven't seen it collected and distilled as nicely as the SOLID pages have done it. Once they see it all together, the concepts crystallize for them in a new way, even though they knew the concepts individually from their various learnings and experiences in both schooling (which does often cover many of these things) and professional life.
"(you equate it with good design so achieving ones means the other)."
When did I do that? I didn't, so please avoid putting words in my mouth. In fact, the entire point of my original comment is that none of the things you do for testability are things you do solely for testability. The principles people seem overly apt to attaching solely to testability goals are simply principles of good design, principles that also happen to nicely support testability. They are not "equated" in any way.
"Do you need to design for testability in ruby? or SmallTalk?"
You need a testable design no matter the technology base. Again, please don't conflate the issues.
"can you still get a good design in those languages?"
Yes, and you can also still get a bad design.
I hope we are simply talking at cross-purposes and are in fact in violent agreement, because based on your reply I would otherwise have to assume that you completely missed my point and yet completely validated it at the same time. Thanks for playing.
I don't design for testability. I design for design goodness in and of itself, for the myriad mutually-supportive benefits that emerge from it. Testability is but one of them.
I didn't mean to call names; sorry. But I find putting finalistic labels on things to be a bit arrogant (which I seem to be as well sometimes).
When you said "to suggest that testability is impossible in your solution is to admit that maintainability is also impossible in your solution", that is why I said you equate the two. You may not have meant it, but it sure sounds like to you they are equal.
Ruby and Smalltalk are "testable by default" because you can replace anything at runtime without needing a "replaceable" design.
So how much design and testability even matter to each other depends on the technology.
The fact that we are treating testability as a first class citizen in .NET irks me because there are solutions to that.
Oh, and Jeremy:
Sure, people have no problem *hearing* about SOLID and understanding it's always been there, but if everyone were already practicing it, the issues we have today with testability wouldn't exist as much as they do.
Nor would learning how to make code testable, or all the IoC principles, be so hard for so many people.
Nice read. And I agree on the KISS, though I wonder, are you just swapping out some labels, but still pushing the same tech? How does that solve the learning curve?
The biggest obstacle to learning is Yet Another Domain Specific Language.
Personally I've thrown mocks out the window, I'm getting better mileage out of tailoring stubs with embedded assertions.
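A hand-rolled sketch of what such a tailored stub with an embedded assertion might look like (plain Java; the `Mailer`/`Notifier` names are hypothetical) — the stub itself verifies the data it is handed, so no separate mock "verify" step is needed:

```java
// The collaborator to be stubbed out.
interface Mailer {
    void send(String to, String body);
}

// The class under test.
class Notifier {
    private final Mailer mailer;
    Notifier(Mailer mailer) { this.mailer = mailer; }
    void greet(String user) { mailer.send(user, "Welcome, " + user + "!"); }
}

public class EmbeddedAssertStub {
    public static void main(String[] args) {
        // The stub fails the test itself if called with the wrong data,
        // replacing a mocking framework's expectation/verification step.
        Mailer stub = (to, body) -> {
            if (!to.equals("alice")) throw new AssertionError("wrong recipient: " + to);
            if (!body.contains("alice")) throw new AssertionError("wrong body: " + body);
        };
        new Notifier(stub).greet("alice");
        System.out.println("stub assertions passed");
    }
}
```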
Pingback from indomitablehef.com » Blog Archive » Hillbilly Soul-Searching
Roy, congratulations on starting to walk in the right direction (simplification).
But you need to go much, much further.
The whole mocking concept could safely be thrown away for most real projects.
Only the simplest and most efficient auto-testing methods deserve mass use.
It's important to admit that for most software projects even 30% auto-test coverage is "good enough" coverage.
I think this is a great debate. For too long now everyone in the ALT.NET community has been stating and reiterating that TDD, BDD, whatever is just fantastic and we should all do it and anybody who is not, is obviously doing something wrong.
I do practice TDD for the design merits but it has taken me a long time to get up to speed with it. I made countless (and still do) mistakes at the start and the learning curve has been steep.
My first experience was that developers started with the best intentions, but our initial tests were unmaintainable because they were effectively integration tests.
Using a mocking framework helped to write maintainable, fast running tests and I love the test first paradigm.
But I often wonder if I have actually got the return that my outlay warrants.
I use NHibernate and I rarely mock out ISession, ISessionFactory or ICriteria.
If I were after high code coverage, I would have to mock these out for no other reason than to get high code coverage. I use layering to hide this code behind its own abstracted dependency. What if I have a WCF or asmx proxy — should I then stub out each generated method? No.
A side effect of test first is you get unit tests to test your logic. If you are expecting unit tests to catch bugs and keep QA to a minimum then you are approaching the methodology wrong.
I also have a problem with new people coming onto the project who are inexperienced with TDD. They naturally write unmaintainable bad tests.
TDD is a massive outlay, and we should question its return, as I am doing.
Great article. Just this last week I've been trying to bring some other devs onto a project I've been working on. There's heavy use of mocking, DI, and IoC and I get the sense the new guys feel like they're drinking from the fire hose as you put it.
I think there is way too much emphasis put on Expectations in unit tests. The entire point of the mock, stub, test double is to Isolate the stuff I do care to test from the stuff I don't care about.
I had an ah-ha moment about a year ago when I realized how difficult it was to refactor code where my tests were full of Expectations. Tests were failing for the wrong reasons after refactoring, because of the heavy use of Expectations over Stubbing in my test code. From that point on I only use Stubs unless I'm truly testing the interaction of my objects.
It has made tests much more understandable. It has made refactoring much easier. It has isolated the right tests that should fail from the refactoring or changes to my code.
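The difference can be sketched without any mocking framework at all (plain Java; the `Clock`/`Greeter` names are hypothetical): the stub-style test asserts on the result, while the expectation-style test asserts on the calls and therefore breaks under harmless refactoring.

```java
import java.util.ArrayList;
import java.util.List;

// The dependency to be replaced in tests.
interface Clock { int hour(); }

// The class under test.
class Greeter {
    private final Clock clock;
    Greeter(Clock clock) { this.clock = clock; }
    String greeting() { return clock.hour() < 12 ? "Good morning" : "Good afternoon"; }
}

public class StubVsMock {
    public static void main(String[] args) {
        // Stub style: canned value in, assert on the observable output.
        Greeter g = new Greeter(() -> 9);
        if (!g.greeting().equals("Good morning")) throw new AssertionError();

        // Expectation style: a hand-rolled mock that records every call.
        List<String> calls = new ArrayList<>();
        Greeter g2 = new Greeter(() -> { calls.add("hour"); return 14; });
        if (!g2.greeting().equals("Good afternoon")) throw new AssertionError();
        // This interaction assertion couples the test to the implementation:
        // refactor greeting() to call hour() twice, or to cache it, and the
        // test fails even though the observable behavior is unchanged.
        if (calls.size() != 1) throw new AssertionError("expected exactly one hour() call");
        System.out.println("both tests passed");
    }
}
```

The stub-style test above survives any refactoring that preserves `greeting()`'s output; the interaction assertion does not.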
I 100% agree with your ideas, and the direction you are heading.
Pingback from Arjan`s World » LINKBLOG for September 21, 2008
Pingback from Elegant Code » Testify
Recently, I've seen a nice trend out on the blogosphere on going back and revisiting the basics of
Pingback from Let’s go back to the basics of Cohesion and Coupling - taccato! trend tracker, cool hunting, new business ideas
Pingback from Reflective Perspective - Chris Alcock » The Morning Brew #184
Disagree: "Reason #1: Learning curve"
While I think the (perceived) complexity is a significant issue I wouldn't place it at number one. My own experiences learning TDD and teaching it to others suggest that the biggest issue is showing developers that there is enough value in the long run to justify the huge productivity loss in the short term.
Developers who really grok TDD tend to be those who have gravitated towards it because it solves issues they have experienced in the past. The new developers we are trying to teach often haven't lived through enough project pain to really believe spending all those brain cells up front is worth it.
TDD doesn't have to be all or nothing. I think it is a much easier transition for developers if you can introduce a few key concepts each time you see them having issues. Pick the low hanging fruit and if you really add value they should come looking for the rest.
Roy Osherove is one of my favorite .NET personalities.  Having had the chance to spend some time
I train co-op students all the time and they love TDD and get the concept quickly. With continuous integration in place with many developers, the tests that fail will tell the committer that he/she broke some code. Also, writing the tests first makes you think of a classes responsibility (SRP) and behaviour.
Faking? WTF is that? Mock stays.
Pingback from Do the right thing. Assuming you know what that is. - taccato! trend tracker, cool hunting, new business ideas
I think Typemock does wonders for solving the problem. Learning curve with traditional IoC/DI - I don't think so - but I think the fundamental difference is ISV vs. ITS - and like it or not, the majority of your programmers are ITS.
We don't require the amount of flexibility that some of you do, and in fact if we went to the business and said that it's going to take 3 weeks longer to develop a part of the system that is more flexible, they're going to ask us what the flexibility gives us. If we say something like "it will allow members of a certain group to change business rules on the fly", they may say okay. They may say no, because they have a time to market to worry about. But if we answered "to allow us to swap from SQL Server to Oracle" or to move off the AS400, they're going to laugh in our faces. Why? They know that such a change would require millions of dollars in other infrastructure-related changes, and they'd basically be pissing away money on the project that they could be spending elsewhere.
You see, in the ITS world we have what's called a strategic technology plan and enterprise strategic requirements. When we work on new projects we know we're going to use 'XYZ Logger' for logging, we know that we're going to use SQL Server, we know how many users we need to support, we know the expected (and realistic) lifetime of the product we're developing, we know where flexibility is required and, more importantly, where it isn't, and we have a business partner who understands and signs off on these constraints.
So in ITS we don't have to be flexible, and often we'd be causing problems and project delays by trying to build in the flexibility that an ISV might build in because *they need to by their very nature*. Our parameters are known; this doesn't mean we're inflexible, it just means we don't require as much flexibility.
So for many of us, IoC/DI principles, when not required by the project, are just gold plating (or as Fowler would put it, 'Speculative Generality', or as the Agile world would put it, YAGNI). So to put these things in merely for a test framework? We're shooting ourselves in the foot.
This doesn't mean we don't understand the need for testing, and this doesn't mean in the past we haven't put in these unneeded extra moving parts purely for testing's sake; but it has rubbed us the wrong way, because it just isn't pragmatic to design for a unit testing framework. Now, lo and behold, in my world Typemock comes along — and I have the ability to do the tests I want to do without having to add the moving parts and complexity I don't need (because my parameters are known).
I hope that explains it from the other side of the world. I follow and preach that code needs to be tested as much as a true TDD guy, but sometimes the bottom line dictates that things need to get done. Tools such as Typemock allow me to serve both needs; tools like Rhino Mocks force me down a design path that our business just doesn't have the time, nor the need, for.
For the record, when trying to sell purist TDD and the use of DI/IoC so Rhino Mocks could work in places it wouldn't now, I met 100% resistance — including from myself. When I introduced the team to Typemock, things clicked immediately and I had virtually no resistance. I wonder if that's a sign?
Less interaction testing, more state based testing -> have you tried Mockito for Java? (www.mockito.org)
Scottish Developers September Newsletter
Interesting but annoying that you didn't run a spelling check before posting.
Pingback from TDD. Killing the messenger? | Software Testing Blog
Nice article, and judging from the comments you have created a discussion.
I teach objects and TDD as part of my job (team lead/coach) and I wonder what some companies and schools teach when I have to tell someone what the difference is between passing by reference and passing by value. We can’t work for the lowest common denominator of the industry, we must drag the industry kicking and screaming up a notch.
Now, I also, I am afraid to say, dumb things down. I never use scary words like “mock” or “encapsulation” or, God forbid, “polymorphism”. Well, not in the beginning, at least. Because, as you said, you don’t want to put people off. But I’ll be damned if it’s not my intention to pull everyone, and the companies I work with, up to higher levels. And that next level will include discovering interfaces à la TDD, and mocking. So I think we need to start slowly, for sure, but we also need to keep teaching, keep writing, and keep learning. I would be careful about diluting what developers need to do, know and learn — i.e. we should not enable bad behavior.
Good post, thanks, J
Well said Roy!!!
It's what I've been telling people for a while now (www.clariusconsulting.net/.../47152.aspx).
Although I propose we stick to Mock as the name, but with a very broad meaning (including stub and fake), which basically depends on how you use the "mock".
Moving forward, do you think "isolation frameworks" should do away with expectations altogether and just offer AAA style APIs?
In my previous post I started talking in more coherent words about feelings I’ve had lurking in the past
Pingback from September Newsletter « Scottish Developers
Pingback from TechFocus2.0 » So many doors, so many options, what a headache…
The discussion on the future of unit testing for the masses has shifted from the standard “if they are
Pingback from Latest Newsletter « Scottish Developers
Pingback from Eli Lopian’s Blog (TypeMock) » Blog Archive » Future of Unit Testing and Economics
In my experience what is preventing the adoption of TDD is legacy code and frameworks not designed with testability in mind. I really agree that testability is not the same thing as good design!
I have developed a Java framework that enables unit testing of "untestable" code. For example the framework allows mocking static methods, private methods and even constructors. It also allows bypassing encapsulation, static initializers etc.
Have a look at www.powermock.org
Pingback from Mark’s Testblog » Blog Archive » Understanding test doubles - …for these are testing times, indeed.