Memory vs. Recorded Data

Steve McConnell, in Software Estimation: Demystifying the Black Art [McC06], reports that “individual expert judgment is by far the most common approach used in practice.” When we talk about estimation by expert judgment, we’re really talking about comparison with past experiences. You can augment memories of those experiences with recorded data, when that’s available. There are often files of information about past projects that can be mined for data. How similar were the requirements of this old project to the one we’re contemplating? How many people worked on it, in what roles, and for how long?

Inaccurate Memories

Most manuals on estimation warn you that estimating based on remembered data is fraught with inaccuracies. You might remember the estimate for a past project rather than the actual results. Or, due to proximity, you might remember the staffing at the end of the project rather than the actual staffing throughout it. You might remember the project starting from some date other than the actual starting point, perhaps when you joined it or the date it was announced to a wider audience. You might neglect to account for rework done after the project was initially delivered.

Recording Data for Future Estimates

Estimation experts will say that you should record significant data for your projects, and use that data to estimate future ones. As Capers Jones advises in Estimating Software Costs [Jon07], “The best defense against having a cost estimate rejected is to have solid historical data from at least a dozen similar projects.” That’s good advice, but it probably won’t help you immediately. You’re likely facing the need for an estimate before you can organize your data collection program.

If you do have detailed data from past projects, you’re probably working for a company that needs high-precision, high-accuracy data to support pricing fixed-scope, fixed-price bids for large projects. But if you’re in that situation, then you probably have a department full of experts at doing that in your business domain, using a standardized process approved by your company. That same department collects and maintains all that data, and they’ll help you.

For the rest of us, there’s still hope of finding recorded data from past projects. There is almost always some data that was recorded contemporaneously, if you can only find and understand it. Asking around for memories may give you a starting point. Look for accounting data for costs charged to past projects. Look for email conversations about those projects.

Inaccurate Recorded Data

Beware, though, of inaccuracies in the data. Even when recorded at the time, the data that was recorded likely does not tell the whole story. There is much that happens “off the record” due to inattention or embarrassment.

It's Worse Than We Thought

Once again Marion strode into Sidney’s office. "I’ve been digging deeper into the last call center project, and this time I’ve got data. Accounting looked up the numbers for me. There were five programmers working on the last call center project code. They billed 40-hour weeks for 15 months. And there were two testers on that charge code for the last five months of the project."

"Fifteen months?" Sidney looked unhappy. "And you think it’s going to take longer this time?"

"Yes, our rough order of magnitude was two and a quarter times as long, factoring in requirements gathering and the additional metrics requirements. That would be 34 months. I know that’s not what you wanted to hear."

"No, Ryan won’t go for that. He’ll insist on an off-the-shelf solution, not realizing that customizing such a system is a big project in itself. And I don’t like running the risk of not being able to deliver the functionality that’s requested. If we hit a wall with a purchased solution, we likely won’t have any options."

"It’s worse than that." Marion shifted uneasily. "I ran into Blaise who’s still friends with the lead programmer of the old call system. It turns out that the project turned into a death march once the testers started looking at it. That’s not reflected in the accounting numbers because the development team were all on salary, so their timesheets reflected 40 hours no matter how much more they worked in a week. If we planned the new project assuming those were 40-hour weeks, we’d doom it to failure before we started."

There was a long pause. "Are you sure Ryan needs everything in the current system?"
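The rough-order-of-magnitude arithmetic in the story above can be sketched in a few lines. The 15-month recorded duration and the 2.25x scaling factor come from the dialogue; the 55-hour week is a hypothetical figure standing in for the unrecorded overtime the lead programmer described, purely to illustrate how a correction might be applied.

```python
# Rough-order-of-magnitude estimate built on recorded data, then
# corrected for hours that never reached the timesheets.

recorded_months = 15     # billed duration of the old project (from accounting)
scaling_factor = 2.25    # new scope relative to the old project (from the story)

rom_estimate = recorded_months * scaling_factor
print(f"Naive ROM estimate: {rom_estimate:.1f} months")   # 33.8, rounded to ~34

# The accounting data shows 40-hour weeks, but the team reportedly worked
# far more. If actual weeks averaged, say, 55 hours (a made-up figure),
# the recorded data understates real effort by that ratio.
actual_hours_per_week = 55     # hypothetical; not in the accounting data
recorded_hours_per_week = 40

effort_correction = actual_hours_per_week / recorded_hours_per_week
corrected_estimate = rom_estimate * effort_correction
print(f"Overtime-corrected estimate: {corrected_estimate:.1f} months")
```

The point is not the particular numbers but the shape of the calculation: an estimate scaled from recorded data inherits whatever distortions the recording process introduced, so any known distortion needs an explicit correction.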

Collected data often has pernicious errors. Unrecorded overtime is common; programmers are pressured to donate extra time to the project. Even when timesheets aren’t forced by policy to be inaccurate, the allocation to charge codes may not be correct. People often assume that contemporaneous data collection will be accurate. But when data collection is extra work, on top of the work of accomplishing the project itself, it seems like a nuisance to those doing it, so they may not spend much effort keeping it accurate. It’s easier to duplicate a previous timesheet than to consider precisely how we’ve split our time among multiple accounting buckets. The more detailed this task is made, the less accurate it’s likely to be.

The data collection may have other types of omissions, too. Timesheet records may capture only direct, value-adding work, with overhead activities such as meetings charged to a different category. Your upcoming project, though, is sure to have its share of meetings as well.
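When recorded effort omits overhead, a crude way to compensate is to scale the recorded figure up by an assumed overhead fraction before using it. Both numbers below are hypothetical; the 20% overhead share is an assumption you would replace with whatever evidence you can find about how the old data was collected.

```python
# Hypothetical sketch: timesheets captured only direct, value-adding
# work, so the recorded total understates real effort. If an assumed
# 20% of real time went to meetings and other overhead, direct hours
# represent only 80% of the true total.

recorded_direct_hours = 1200   # made-up total from the old timesheets
overhead_fraction = 0.20       # assumed share of time spent on overhead

true_total_hours = recorded_direct_hours / (1 - overhead_fraction)
print(round(true_total_hours))   # 1500
```

As with the overtime correction, the adjustment is only as good as the assumption behind it, but an explicit, documented assumption beats silently treating recorded hours as the whole story.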
