
Paul Sheriff's Blog for the Real World

This blog shares my tips and tricks garnered over 25+ years in the IT industry


January 2007 - Posts

Not Estimating and Tracking your Projects? Expect Failure

Estimating software development projects is one of the hardest things to do. First, programmers just don’t like doing it. And why should they? Their estimates are usually wrong, and they may feel the heat when those estimates slip. Second, most programmers simply don’t have a method or process for developing an estimate. I will lay out some simple steps to follow to help you on your way to actually “loving” to provide estimates. Yes, loving it. Because if you could be right most of the time, wouldn’t you love it too? If you come up with a process for estimating, you will be on your way to being right a large percentage of the time!

If you ask three programmers for an estimate on the same project, you will get three different answers. Why? Each one used their own background, experience, and checklists to derive an estimate. So what do you do? Leverage those experiences in a structured approach and document it. What we did at my company, PDSA, Inc. (http://www.pdsa.com/), is build and define an estimating model and an estimating process. Remember: you can never improve an estimate if it lives only in someone’s head. But if you build an estimating process and document it, you can measure the process and improve it. Once you have done that, you can begin to produce solid, more predictable estimates.

Step 1: Build an estimating model

What are all the elements of an estimate? Get your teammates together and brainstorm all the elements of a project. There are many books on the market, and I am sure your teammates will have some ideas too. Here are some sample elements: logical database design, physical database design, unit testing, integrated testing, prototyping, requirements meetings, coding, etc. You will likely end up with a list of over a hundred items. Classify them into categories such as requirements, development, testing, customer buyoff, etc.

Step 2: Use the estimating model

Does that sound silly? Why would you go through all the work in Step 1 only to skip it in Step 2? Because your customers or upper management will want a quick answer, and the problem is that they will remember that quick “answer.” Try to resist. Once you start using your estimating model and can show how successful you are, they will trust and believe in you. Now that is something new for IT!

Step 3: Track your work

An estimate is only an estimate unless you have time tracking in place. I don’t mean the kind where I track whether you went to lunch, but the kind of tracking that captures the actual hours spent on every single task in your estimating model. This is important: you have no idea how good your estimate really is unless you track actuals against it. You must do this, and I can’t stress this point enough. By understanding your actuals you can refine your estimating model. Refining your model may mean changing the number of hours you think a task really takes, or adding tasks you forgot back into the model. It is imperative to leverage your lessons learned and improve your model. A better model lets you better predict project results, which means projects come in on time and your customers appreciate it.
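To make this concrete, here is a minimal C# sketch (the task names and hours are made up, just to show the idea) of tracking actual hours against the estimate for each task in the model and rolling that up into a correction factor:

using System;
using System.Collections.Generic;

// One tracked task from the estimating model: estimated vs. actual hours.
class TrackedTask
{
  public string Name;
  public double EstimatedHours;
  public double ActualHours;

  public TrackedTask(string name, double estimated, double actual)
  {
    Name = name;
    EstimatedHours = estimated;
    ActualHours = actual;
  }
}

class EstimateTracking
{
  static void Main()
  {
    List<TrackedTask> tasks = new List<TrackedTask>();
    tasks.Add(new TrackedTask("Logical database design", 16, 20));
    tasks.Add(new TrackedTask("Unit testing", 24, 30));
    tasks.Add(new TrackedTask("Requirements meetings", 8, 8));

    double totalEstimate = 0, totalActual = 0;
    foreach (TrackedTask t in tasks)
    {
      totalEstimate += t.EstimatedHours;
      totalActual += t.ActualHours;
      Console.WriteLine("{0}: estimated {1}h, actual {2}h",
        t.Name, t.EstimatedHours, t.ActualHours);
    }

    // The overall actual/estimate ratio becomes a correction factor
    // to feed back into the estimating model for the next project.
    Console.WriteLine("Actual vs. estimate ratio: {0:F2}", totalActual / totalEstimate);
  }
}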

Once you have estimated and tracked several projects, you will begin building a repository of estimating information. Another benefit is that at times a customer may ask for a quick estimate. You don’t have time to do much, if any, analysis, but they need that magic “number” to put into their budget. You can mine that data to find past projects roughly similar to the new one and provide at least a range of estimates based on some very high-level project criteria. The estimate certainly is not perfect and should come with the usual disclaimers, but at least it is based on real data from your own experience. At PDSA, we have over 9 years of tracked data, and it has been extremely valuable in helping our clients during the initial planning and budgeting phase.
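A quick range lookup over past projects might be as simple as this sketch (hypothetical numbers, just to show the arithmetic):

using System;

class QuickRangeEstimate
{
  static void Main()
  {
    // Actual total hours from past projects judged roughly similar
    // to the new request (hypothetical values).
    double[] similarProjectHours = { 420, 510, 465, 600 };

    double min = double.MaxValue, max = double.MinValue, sum = 0;
    foreach (double hours in similarProjectHours)
    {
      if (hours < min) min = hours;
      if (hours > max) max = hours;
      sum += hours;
    }

    Console.WriteLine("Quick estimate range: {0} - {1} hours (average {2:F0})",
      min, max, sum / similarProjectHours.Length);
  }
}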

Estimating “Rules of thumb”

Estimates must be created using a standard, repeatable process, and your estimating model should produce reasonable estimates. If your model produces estimates that are unreachable, your development team will not work within your process. So your estimates need to be reasonable within the context of your company, skill sets, leadership, and so on. On the other hand, make sure your estimates are a bit of a stretch. Each person should be challenged, and your software development process should be challenged as well. As you challenge your team and processes you will find a wonderful outcome: innovation! You will discover new or different ways to do things better, faster, and cheaper. It may sound like a cliché, but it works.

Summary

Estimating is an important skill that can be learned and improved. It is not a scientific process unless you make it one. Accurate and consistent estimating builds customer loyalty and team confidence. Tracking provides real-time feedback, not only to your team members so they can see exactly where they are, but also to your customers. In October of 2006, I did a Webcast on estimating for my Paul Sheriff's Inner Circle (http://www.paulsheriffinnercircle.com/). On January 26, 2007 we are holding another Webcast on the Top 10 Best Project Management Practices that will also cover estimating and time tracking.

5 Things you did not know about me...

Thanks to Craig Shoemaker for tagging me on the "Five Things"<g>. So, here we go, here are five things you probably didn't know about me:

1. I grew up in Iowa. I love the Midwest and I try to get back there to see friends whenever I can. I try to speak at the Des Moines, Iowa .NET User Group once a year, since that is where I have a lot of friends. I will be speaking there on March 7, 2007, so if you are in the area, stop by! (www.iowadnug.org)

2. I snow ski. I try to go a few times a year to Mammoth with my friend and the VP of my company, Michael. I love being out in the fresh air and getting exercise. I am an intermediate skier. I tried snowboarding once, but I prefer skiing.

3. I play drums in a progressive rock band called Evolve (www.MusicEvolve.com). We are an all-original music band. We have written about 20 songs so far and are working on our first CD, which we hope to release in about a month. I picked up drums in January of 2004 after not having played for 20 years! It took a while to get the chops back, but thanks to some good instructors, I am improving all the time. In fact, I am fortunate to have a local guy who is a Gene Krupa impersonator give me lessons. Check out Randy Caputo if you are interested (www.RandyCaputo.com).

4. I was going to become a theatre major before I discovered computers. While in college I worked in the school theatre doing lighting and stagecraft, and we used an old Apple II+ to run the lights for the productions. I really enjoyed playing with the computer, and that is how I decided I liked computers better than theatre. However, to this day, I still love live theatre and go whenever I have a chance.

5. I have an 8 year old daughter named Maddie. She and I are like two peas in a pod. We love rollerblading together, and in fact, we go twice a week to a local skating rink. We enjoy walking the dog, going to movies, and just laughing and being together. Being recently divorced has cut down a little of our time together, but we still have some great quality time.

Hopefully, this gives you a little insight into my life. I hope you find it interesting.

So, if I were to pick on 3 other people, here is who I would like to know more about.

Paul

Posted: Jan 17 2007, 04:11 PM by psheriff | with no comments
Use Close and Finally

In the last two weeks I have had two different clients complain that there are "memory leaks" in .NET. I tell them, very politely, that most likely it is their code!<g> In both cases it was. The first case involved programmers using a DataReader and not closing it when they were done with it, or putting the Close() method call within their Try block instead of in a Finally. The second case involved the StreamWriter in the System.IO namespace, where once again the file was not always being closed correctly because the programmer did not put the Close in the Finally block. Below is what the original code looked like:

using System.IO;

private void CreateLogFile()
{
  try
  {
    StreamWriter sw =
      new StreamWriter(@"D:\Samples\Test.txt",
      true, System.Text.UTF8Encoding.UTF8);

    sw.WriteLine("This is some text");

    // BUG: if the constructor or WriteLine throws, this line never runs
    // and the file handle is left open.
    sw.Close();
  }
  catch(Exception ex)
  {
    throw ex;  // note: "throw;" alone would preserve the original stack trace
  }
}


You can see in the above code that the sw.Close() method call is within the Try block. If an exception occurs when trying to open or write to the file, the code jumps immediately to the Catch block and the Close never executes. If this method is called many times, this can cause an "apparent" memory leak. With just a little refactoring, the code was fixed to close the file in the Finally block.

private void CreateLogFile()
{
  StreamWriter sw = null;

  try
  {
    sw = new StreamWriter(@"D:\Samples\Test.txt",
      true, System.Text.UTF8Encoding.UTF8);

    sw.WriteLine("This is some text");
  }
  finally
  {
    // Runs whether or not an exception was thrown,
    // so the file handle is always released.
    if (sw != null)
    {
      sw.Close();
    }
  }
}

Notice that there are three things we had to fix up here. First, the declaration of the StreamWriter was moved out of the Try block. If we had not done this, the variable "sw" could not be accessed from within the Finally block, since it would have block-level scope only within the Try portion. Second, we created a Finally block, checked the "sw" variable for null, and closed it if it was not null. Third, since nothing useful was happening in the Catch, we eliminated it altogether.
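As a side note, an equivalent C# idiom (a minimal sketch, not part of the client's original fix) is to wrap the StreamWriter in a using statement, which compiles down to the same Try/Finally pattern and calls Dispose, which closes the file for you:

private void CreateLogFile()
{
  // The using statement expands to a try/finally that disposes (and
  // therefore closes) the StreamWriter, even if WriteLine throws.
  using (StreamWriter sw = new StreamWriter(@"D:\Samples\Test.txt",
    true, System.Text.Encoding.UTF8))
  {
    sw.WriteLine("This is some text");
  }
}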

Watch for these types of scenarios, as they can cause hard-to-find bugs in your code.

Posted: Jan 16 2007, 03:50 PM by psheriff | with 6 comment(s)
Beware of Encoding Types when saving to a File

I learned something today... I was doing some encryption of strings using DPAPI, converting the results to a Base64 string, and everything worked fine when I was encrypting and decrypting. However, when I saved the Base64 string to a file, then re-read the data and tried to decrypt it, it would not work. It took me quite a while to figure out what was going on.

I realized that when saving text using the WriteAllText method on the File class, you need to specify the Encoding type. For example:

File.WriteAllText("C:\Temp\Test.txt", "This is some text", System.Text.Encoding.UTF8)

Now the above example uses just plain text, but when you are doing encryption and decryption and you translate a string to an array of bytes, you use one of the encoding mechanisms to do the conversion. For example:

bytArray = Encoding.UTF8.GetBytes(strValue)

So if you use UTF-8 encoding to translate a string into an array of bytes prior to doing the encryption, you need to make sure you store those characters to a file using the same encoding mechanism.
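Here is a minimal C# sketch of the round trip, assuming DPAPI is accessed through the managed ProtectedData class (the file path and text are just samples): encode the string with UTF-8, encrypt, save the Base64 result with UTF-8, then read it back with the same encoding before decrypting.

using System;
using System.IO;
using System.Security.Cryptography;  // ProtectedData (DPAPI); add a reference to System.Security
using System.Text;

class EncodingRoundTrip
{
  static void Main()
  {
    string original = "Some secret text";
    string path = @"C:\Temp\Test.txt";  // sample path

    // String -> bytes with UTF-8, then encrypt via DPAPI.
    byte[] cipherBytes = ProtectedData.Protect(
      Encoding.UTF8.GetBytes(original), null, DataProtectionScope.CurrentUser);

    // Write the Base64 string using the same UTF-8 encoding...
    File.WriteAllText(path, Convert.ToBase64String(cipherBytes), Encoding.UTF8);

    // ...and read it back with the same encoding before decrypting.
    string base64 = File.ReadAllText(path, Encoding.UTF8);
    byte[] plainBytes = ProtectedData.Unprotect(
      Convert.FromBase64String(base64), null, DataProtectionScope.CurrentUser);

    Console.WriteLine(Encoding.UTF8.GetString(plainBytes));  // "Some secret text"
  }
}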

I hope this helps someone else, so you won't have to go through the same agony I did!

Paul

Posted: Jan 15 2007, 08:12 PM by psheriff | with no comments
Wrap it up!

In our daily programming with .NET, we often find new things to use. In some cases Microsoft tells us there is something new to use. Take the case of moving from .NET 1.1 to .NET 2.0. Remember how in .NET 1.1 you used ConfigurationSettings.AppSettings("MyValue") to retrieve values from your .config files? Then .NET 2.0 came out, and when you attempted to upgrade your project, all those lines of code were marked as Obsolete and a bunch of warnings were generated in your project.

Change is inevitable in this industry, and in life! However, some things like this we can avoid with a little careful planning. I am sure most of you have discovered the benefits of using a data layer: a class that wraps up ADO.NET so you spend less time writing the same ADO.NET code over and over again. The same technique should be used with configuration settings as well.

In .NET 2.0 Microsoft wants you to use the ConfigurationManager class to retrieve application settings and connection strings. But should you? I say NO! Once you start writing this code all over the place, you have locked yourself into that way of doing things; using the ConfigurationManager class directly locks you into keeping your configuration settings only in a .config file.

What would happen if someone needed you to store all your settings in the registry, in an XML file on another server, or in a database table? You would have to find every place where you used the ConfigurationManager class and replace that code. It is better to wrap up the ConfigurationManager class in your own class and simply expose methods to retrieve your various settings. Keep it simple, something like the following:

Public Class AppConfig
  Public Shared Function ConnectString() As String
    Return System.Configuration.ConfigurationManager.ConnectionStrings("SQL").ConnectionString
  End Function

  Public Shared Function DefaultStateCode() As String
    Return System.Configuration.ConfigurationManager.AppSettings("DefaultStateCode")
  End Function
End Class

You would then use this class whenever you wished to retrieve these values, for example:

lblState.Text = AppConfig.DefaultStateCode

If you then need to change the location from which the StateCode is retrieved, you only need to change the Shared Function DefaultStateCode to read the value from the registry, a database table, or wherever.
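For example, here is a hypothetical C# variant of the wrapper (the registry key is made up) showing that only the inside of the method changes when the setting moves to the registry; every caller keeps using AppConfig.DefaultStateCode():

using Microsoft.Win32;

public static class AppConfig
{
  // Callers keep calling AppConfig.DefaultStateCode(); only this body
  // changes when the setting moves from the .config file to the registry.
  public static string DefaultStateCode()
  {
    object value = Registry.GetValue(
      @"HKEY_LOCAL_MACHINE\SOFTWARE\MyCompany\MyApp",  // hypothetical key
      "DefaultStateCode",
      "CA");                                           // fallback default
    return value == null ? "CA" : value.ToString();
  }
}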

So as you are programming your applications, think about wrapping up code that could potentially change in the future.

Have fun in your coding,

Paul

 


Posted: Jan 12 2007, 05:00 PM by psheriff | with 4 comment(s)
dnrTV

On Wednesday, January 3rd, I got together with Carl Franklin and filmed an episode for his Dot Net Rocks TV (www.dnrTV.com). This episode is on, what else, creating your own custom providers. In this show I walk you through creating providers from the ground up. Carl informs me that this episode should air around the first week of February.

Posted: Jan 10 2007, 04:50 PM by psheriff | with no comments
The Provider Model Rocks!

In the last few weeks I have started on a new endeavor. I am rewriting my PDSA Framework from the ground up. I figured that it is time to once again take everything I have learned over the last few years of working with .NET and re-architect things a little better.

In doing so, I am going with a complete provider model for each and every piece of my Framework. I really like this model. In fact, I just wrote a nice little chapter about how to create providers in my "Architecting ASP.NET 2.0 Applications" eBook (www.pdsa.com/eBooks). So far, I have created a Data Provider, a Configuration Management Provider, an Exception Management Provider, and a Cryptography Provider. Yes, I know, I could have just used the Microsoft Enterprise Library, but have you seen the underlying code for that? Wow! Very complicated and way over-engineered. I knew I could make it simpler (and I have).
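For readers who have not seen the pattern, here is a bare-bones C# sketch of the general provider idea (this is not the PDSA Framework code, just an illustration using a hypothetical LogProvider): callers code against an abstract provider, and a manager class decides which concrete provider to hand out.

using System;

// An abstract provider: callers code against this contract only.
public abstract class LogProvider
{
  public abstract void WriteEntry(string message);
}

// One concrete provider; others (database, event log, ...) would follow
// the same shape and be chosen by a configuration setting.
public class TextFileLogProvider : LogProvider
{
  public override void WriteEntry(string message)
  {
    System.IO.File.AppendAllText(@"C:\Temp\App.log", message + Environment.NewLine);
  }
}

public static class LogManager
{
  // In a full provider block this would be read from configuration
  // (e.g., an appSettings key naming the provider type) and created
  // with Activator.CreateInstance.
  private static readonly LogProvider _provider = new TextFileLogProvider();

  public static void Write(string message)
  {
    _provider.WriteEntry(message);
  }
}

Usage is then just LogManager.Write("Application started"), and swapping in a different provider requires no changes to the calling code.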

I will be writing an eBook on each one of these provider model blocks that I have created and releasing them this quarter (Q1 of 2007). Keep a watch on my web site for them.

Posted: Jan 10 2007, 04:45 PM by psheriff | with 2 comment(s)