Wayne Allen's Weblog

pragmatic agility

  • Iteration #2 – Day #-2 Monday 12/30/2002

    Another interesting fact came out of our discussion with our customer. He was feeling nervous about progress because our integration was late and he couldn't determine the true state or quality of the project. We determined that our binary approach to story completion (done or not done) didn't reflect what people needed to know, so we changed our approach to report the number of test cases that pass and fail as a stacked graph with time on the X axis. This should give the stakeholders a better feel for how complete the iteration is.
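
    As a rough sketch of how the daily numbers for that graph could be captured (the CSV file and the counting here are placeholders I'm making up, not our actual tooling; the real counts would come from whatever the test runner reports), something like this appends a dated row to a file that Excel can chart as a stacked area:

        using System;
        using System.IO;

        // Sketch only: append today's passing/failing test counts to a CSV
        // so the history can be charted as a stacked graph over time.
        public class TestCountLog
        {
            public static void Append(string csvPath, int passed, int failed)
            {
                bool isNew = !File.Exists(csvPath);
                using (StreamWriter writer = File.AppendText(csvPath))
                {
                    if (isNew)
                    {
                        writer.WriteLine("Date,Passed,Failed");
                    }
                    writer.WriteLine("{0:yyyy-MM-dd},{1},{2}",
                        DateTime.Today, passed, failed);
                }
            }
        }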

  • Programming Fun

    I was talking to one of our developers last week and he made a comment that didn't strike me until yesterday. He said he was actually having fun again! This made me realize that by placing responsibility for tasks in the correct hands, everyone can better enjoy their jobs. I recall other projects where the development team was given high-level guidelines and left to its own devices. Everything was fine until we finished the "fun" stuff and had to make decisions about features and the like, where we didn't have the slightest clue. The developers were working on whatever they wanted, and not much was getting accomplished.

  • Iteration #1 – Day #10 Friday 12/20/2002

    Today was the last official day of the iteration. We didn't get everything signed off by our customer representative (QA), but we did complete all the technical tasks generated from the story. Additionally, we got a few bonus features as people noticed that an hour or two of extra work would pull things together. I'm not sure this is the best thing to do since that work wasn't on our official list, but as most of the developers know what features are desired and it wasn't blatant gold plating, we can let it slide. Our director and PM decided to slide the end of the iteration to 12/31/2002 to line up with another related project and because of the difficulty of having a two-week iteration over the Christmas holidays.

  • Iteration #1 – Day #9 Thursday 12/19/2002

    Today was a regular working day. We came, we coded, we went home. We did make a couple of design decisions, and the design of a couple of components is changing as we better understand what we are building. I keep waiting for someone to complain about constantly changing interfaces, but people are just stubbing out the return results for now. I should check in to see what the tests look like. Mock objects are something we have on tap for a training brownbag and seem like a technique that could be useful.
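
    Since we haven't had the brownbag yet, here is roughly what I have in mind; the IRateService interface and everything else in this snippet is invented for illustration and isn't from our actual code. The idea is that a hand-rolled mock stands in for the real dependency so the object under test can be exercised in isolation:

        using System;

        // Illustration only: a hand-rolled mock standing in for a real
        // dependency so the object under test can be exercised in isolation.
        public interface IRateService
        {
            decimal GetRate(string productCode);
        }

        // The mock returns a canned value and remembers how it was called.
        public class MockRateService : IRateService
        {
            public string LastProductCode;
            public decimal RateToReturn = 1.0m;

            public decimal GetRate(string productCode)
            {
                LastProductCode = productCode;
                return RateToReturn;
            }
        }

        public class Quote
        {
            private IRateService rates;

            public Quote(IRateService rates) { this.rates = rates; }

            public decimal Price(string productCode, decimal amount)
            {
                return amount * rates.GetRate(productCode);
            }
        }

    A test would then pass the mock in place of the real service, set RateToReturn to a known value, and assert both on the price that comes back and on what the mock recorded.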

  • Iteration #1 – Day #8 Wednesday 12/18/2002

    Yesterday our customer was (rightly) getting nervous about not having seen a build yet, so the team committed to producing a build today and followed through with flying colors. I'm still getting some questions along the lines of "why build when we know all the functionality isn't there yet?" If we can get the build process somewhat more automated, I think these questions will go away. Our customer really likes being able to see incomplete functionality so he can request changes before we've committed a lot of time. QA also likes seeing the product earlier in the cycle, both to confirm their tests are accurate and complete and to give earlier feedback. All in all, this is just the way agile should work; we just need to straighten out some kinks.

  • Iteration #1 – Day #7 Tuesday 12/17/2002

    Brief status: bullpen space is ready, desktops are installed, servers are built. Hooray! A couple of us actually worked in the "pod," as it is called, but no pairing yet. At this point it looks like pairing is not going to happen this iteration.

  • Self configuration of objects

    Mark Strawmyer has a nice article on using .NET attributes to assist in the self-configuration of objects. It's an interesting approach, and one I'm sure I can apply in many areas as I think more about how to leverage attributes. In my current thinking (which revolves around writing testable code) I'm having difficulties with the whole area of configuration. We are trying to write code that can have automated unit and acceptance tests (customer tests), but since the dev, QA, integration and production environments all have different configuration settings, one can't just grab everything out of VSS and run the tests (the goal): the tests will break without the correct configuration files.
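
    To make that concrete, here is a rough sketch of the kind of thing I have in mind; the attribute name, the fields and the Hashtable of settings are all invented for illustration (this is neither Mark's code nor ours). The point is that a test can build the settings table entirely in code, so the tests don't depend on whichever machine's configuration files happen to be present:

        using System;
        using System.Collections;
        using System.Reflection;

        // Illustration only: a custom attribute marks which configuration key
        // a public field should be filled from, and a small helper copies the
        // values in from an environment-specific settings table.
        [AttributeUsage(AttributeTargets.Field)]
        public class ConfigValueAttribute : Attribute
        {
            public readonly string Key;
            public ConfigValueAttribute(string key) { Key = key; }
        }

        public class DatabaseSettings
        {
            [ConfigValue("connectionString")]
            public string ConnectionString;

            [ConfigValue("commandTimeout")]
            public string CommandTimeout;
        }

        public class Configurator
        {
            // In a unit test the Hashtable is built in code; in dev, QA,
            // integration or production it would be loaded from that
            // environment's configuration source.
            public static void Apply(object target, Hashtable settings)
            {
                foreach (FieldInfo field in target.GetType().GetFields())
                {
                    object[] attrs = field.GetCustomAttributes(
                        typeof(ConfigValueAttribute), false);
                    if (attrs.Length == 0) { continue; }

                    string key = ((ConfigValueAttribute) attrs[0]).Key;
                    if (settings.ContainsKey(key))
                    {
                        field.SetValue(target, settings[key]);
                    }
                }
            }
        }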

  • Iteration #1 – Day #4 Thursday 12/12/2002

    Brief status: no bullpen yet, but the hardware arrived; everything should be ready EOD Monday. Today I spent time working out how to share information without resorting to MS Word files in VSS (which never works, since there are too many layers to go through to get to the information).