Joel Spolsky has written an interesting piece about the idea of "eating your own dogfood," which simply means using your own software (the people who buy dogfood, after all, don't eat it themselves). Over the years, I have seldom been in a position to use the software that Thycotic Software Ltd has developed. The software has usually been outside my realm of usefulness - how many of us even know the slightest thing about chromatography, sewer planning or asbestos claims, for that matter?
This all changed when Thycotic decided to productize its internal time tracking and billing system, and myclockwatcher.com was born! We use the system every day as our time tracking system (to track billable consulting hours and product development hours), and now it is also a product for our customers.
Some observations to date:
- Features we need (such as certain reports) are usually also needed by our customers
- Product stability is not negotiable (all hail TDD!)
- Customer requests help us to rethink our own business use of the product
- Using your own product can also be a big selling point to customers
There is further discussion on "eating your own dogfood" here.
This weekend we purchased a condo in the Washington DC area and I passed the Microsoft 70-320 exam ("Developing XML Web Services and Server Components with Microsoft Visual C# .NET and the Microsoft .NET Framework"). The natural question for any techie then would be: which was harder?
Let's weigh up the contenders:
DC House Purchase
- Compete with at least 10 other bidders on any property
- Pay $50k+ more than asking price
- Waive all contingencies in your offer (inspection, financing, appraisal)
- Sweat it out and possibly still not even come close
Microsoft 70-320 Exam
- Learn a lot about COM Interop, COM+ and Enterprise Services
- Delve into security permissions, roles and wonder which of our clients would ever be satisfied with 'off the shelf' security
- Read up on .NET Remoting intricacies that make your brain hurt
- Find out that SOAP extensions are pretty common and are basically required reading for this exam - so much for thinking we were so smart!
I used this book for my preparation and spent some time doing the practice exams that come with the accompanying CD.
The exam was difficult since it tests some things that I just don't use in my day-to-day work - we seldom work with COM or COM+. The questions also seemed considerably harder than those in 70-315, and there were some trick questions. However, the exam writers still need to focus more on testing understanding of concepts and how people apply knowledge, rather than on specifics. I don't see the value in testing whether you know the name of a particular method within a class.
I think the house purchase was harder due to the emotional rollercoaster involved - the outcome affects your life for the next seven years (the US average?) - but the exam was a worthy adversary, especially on the same weekend! :-)
Eric Sink discusses his views on why Team System is priced at its current level, and he may also set you straight on what you do and don't get with MSDN Universal. MSDN Universal is a wonderful product and probably a great way to get pricey (more than $2000) server products into the hands of those who otherwise might not try them. Unfortunately, you will still need a client who can afford the licensing for those server products before you can use them in your applications. This can often be a barrier for consulting shops that would otherwise like to gain real-world experience in technologies such as BizTalk, Microsoft Content Management Server and Microsoft Commerce Server.
In the meantime, small to medium-sized businesses will do what makes sense for their dollars and probably steer clear of products like Team System.
The plus side of this pricing announcement is that our tried and tested tool set including VS.NET, NUnit, TestDriven.NET, NAnt and NCover will continue to be relevant to a very large market in spite of Team System.
Mark Miller talks about a new metric he is calling Maintenance Complexity. The system assigns each operator and construct a point score representing its contribution to complexity. A method is then analyzed, and all the points are added up to yield its Maintenance Complexity score. I like the concept, but tackling a report of complex methods would probably just result in breaking them apart into more methods. Many might argue that this is a Good Thing, although it really depends on how the methods are split.
This would seem to be completely against Kent Beck's Rule 4 for achieving the SimplestCode - "Minimizes number of classes and methods".
It also seems that determining the complexity of a method should take its parameters into account, especially out parameters, since those can indicate greater coupling and more potential side effects in the code in general.
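To make the idea concrete, here is a toy sketch of such a metric - not Mark Miller's actual tool, just an illustration under my own assumptions. The point values in WEIGHTS and the penalty for out parameters are made-up numbers; a real implementation would parse the code properly rather than match tokens with regular expressions.

```python
import re

# Hypothetical point scores per operator/construct (made-up values).
WEIGHTS = {
    "if": 2, "else": 1, "while": 3, "for": 3, "switch": 2,
    "case": 1, "&&": 1, "||": 1, "try": 2, "catch": 2,
}
# My suggested extra penalty per 'out' parameter (coupling/side effects).
OUT_PARAM_PENALTY = 4

def maintenance_complexity(method_source: str) -> int:
    """Sum the point scores of every weighted token in a method's source."""
    score = 0
    for token, points in WEIGHTS.items():
        # Word tokens get word boundaries so 'if' doesn't match inside 'diff'.
        pattern = r"\b" + re.escape(token) + r"\b" if token.isalpha() else re.escape(token)
        score += points * len(re.findall(pattern, method_source))
    # Penalize 'out' parameters, which hint at coupling and side effects.
    score += OUT_PARAM_PENALTY * len(re.findall(r"\bout\s+\w+", method_source))
    return score

csharp = """
void Parse(string s, out int value) {
    if (s != null && s.Length > 0) { value = int.Parse(s); }
    else { value = 0; }
}
"""
# if (2) + else (1) + && (1) + one out parameter (4) = 8
print(maintenance_complexity(csharp))
```

Note how the scoring immediately rewards splitting one big method into several small ones, whether or not the split actually improves the design - which is exactly the concern raised above.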
Automated tools and reports provide a great way to home in on problem areas in code (just look at the marvels a coverage report can do for statement coverage!) - but once you find the problem areas ... how do you fix them? How does the tool or metric naturally select for simpler logic and less complexity?
Mark - Thanks for your thoughts and ideas!
Andrew Duthie (the Mid-Atlantic Microsoft .NET Developer Evangelist) mentions that Code Camp will be coming to the DC area in May. Stay tuned for details of this full-day FREE technology fest with lots of great technical content from the local community. This promises to be a local event that you don't want to miss.
If you have ideas for the Smart Client track or are doing work with Smart Client technology, please drop me an email and tell me what you would like to see.