
¹estimate \ˈes-tə-ˌmāt\ vt [L aestimatus, pp. of aestimare to value, estimate]
1 archaic a: ESTEEM b: APPRAISE 2 a: to judge tentatively or approximately the value, worth, or significance of b: to determine roughly the size, extent, or nature of c: to produce a statement of the approximate cost of
                            Merriam-Webster's Dictionary


The Inaccurate Conception
Phillip G. Armour
Communications of the ACM, 2008

"Accurate Estimate" is an Oxymoron                                                  

By its dictionary definition, an estimate is inexact. So pursuing an "accurate estimate" is a straightforward oxymoron, like "British cuisine". But in the business of software, the accurate estimate is the El Dorado of project management. How can we reconcile these contradictions?

Whether Forecast?                                                                           

If the forecast for today says a 40% chance of rain and it doesn't rain, was the forecast "inaccurate"? If it does rain, was it "accurate"? It turns out that whether it rains or not is actually a lousy measure of how accurate the forecast is.

For the same reason: whether a project is "successful" (i.e., under budget and/or schedule) is NOT a measure of how accurate the estimate was.

The reason is that both rain forecasts and project estimates are probabilistic. That means that associated with any result (budget, schedule, staff, scope) is a probability of achieving that budget, schedule, staff, or scope. To pretend otherwise, to assume (or hope) that the forecast is "certain", is both delusional and dishonest.

The Cone of Uncertainty       

Both Barry Boehm [1] and Steve McConnell [2] assert, in the two most important and influential books on software estimation, that the probability of an estimate being on target varies with time (more correctly, it varies with what we know and don't know about the project, which is a related but different issue).

They describe this variation over time, plotted against a generic lifecycle, as the "Cone of Uncertainty".

The Cone of Uncertainty shows that in the early stages of a project, the actual outcome for budget, schedule, etc. could range from 0.25x to 4x of the estimate. That is, all else being equal, the project could come in at four times the estimate, or at one quarter of it.
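As a concrete illustration of what that range means (a minimal sketch: the 0.25x and 4x multipliers are the Cone of Uncertainty bounds above, and the 12 staff-month figure is a made-up example):

    def cone_of_uncertainty_range(point_estimate, low_factor=0.25, high_factor=4.0):
        """Plausible range of actual outcomes around an early-stage point
        estimate, using the Cone of Uncertainty multipliers (0.25x to 4x)."""
        return point_estimate * low_factor, point_estimate * high_factor

    # A hypothetical early-stage estimate of 12 staff-months could actually land
    # anywhere between 3 and 48 staff-months, all else being equal.
    low, high = cone_of_uncertainty_range(12)
    print(f"Early-stage range: {low:.0f} to {high:.0f} staff-months")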

Cumulative Probability Distribution: The S Curve           

The "width" of the Cone of Uncertainty can be described as a probability distribution of the likelihood of attaining or beating a particular value (on the y-axis), against that value (x-axis).  Clearly, if we have a lot of resources available we are unlikely to go over them, simply because there is a lot of them.  If we have few resources, we are obviously more likely to overrun.  There are some interesting mathematical issues here, related to the shape of the curve (probably a cumulative Weibull distribution with a shape factor around 2 and a lambda scale parameter around or below 1, but I digress).

Estimate ≠ Commitment                                               

The main point is that the "estimate" generates the curve, not a point on the curve. The point on the curve at which project management commits resources is decided by the business-focused commitment process.

So an estimate is not an answer set (budget + schedule + staff + scope + ...) unless it is accompanied by a probability. The estimate is not the result; the estimate is the CURVE. This is why "accurate estimate" is a bogus phrase. The point is clear if we produce an estimate that states: "...this project will take somewhere between one week and thirty years..." This is undoubtedly an accurate estimate--after all, every project takes somewhere between a week and three decades. It's accurate, but it's not useful. We don't need an accurate Estimate, we need an accurate Commitment. Failing to understand this is what I call "The Inaccurate Conception".
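One way to picture the distinction: the estimate supplies the whole curve, and the commitment process picks a point on it, that is, a budget with a known probability attached. A sketch that inverts the same hypothetical Weibull S curve used above (the confidence targets are arbitrary examples, not recommendations):

    import math

    def budget_for_confidence(p, shape=2.0, scale=1.0):
        """Invert the cumulative Weibull S curve: return the budget (as a
        multiple of the scale parameter) that carries probability p of being
        attained or beaten."""
        if not 0.0 < p < 1.0:
            raise ValueError("p must be strictly between 0 and 1")
        return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

    # The estimate is the curve; the commitment is a point chosen on it.
    for target in (0.5, 0.8, 0.95):
        print(f"to commit at {target:.0%} confidence, budget about "
              f"{budget_for_confidence(target):.2f} x scale")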

Rein in the Commitment                                               

Whether it rains or doesn't rain is not a measure of the "accuracy" of the rain estimate. A rain forecast is accurate if it rains at the likelihood it was predicted to rain. So if, on 100 days when the forecast predicted a 40% chance of rain, it rained on 40 of them and didn't rain on the other 60, the 40% chance of rain forecast is accurate. Whether it rains or doesn't rain on any particular day is almost irrelevant.
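That notion of accuracy is a calibration check: group days by the probability that was forecast and compare each forecast probability with the observed frequency of rain. A minimal sketch with made-up data:

    from collections import defaultdict

    def calibration(forecasts):
        """forecasts: iterable of (predicted_probability, it_rained) pairs.
        Returns a mapping from each predicted probability to the observed
        frequency of rain on the days carrying that prediction."""
        buckets = defaultdict(list)
        for prob, rained in forecasts:
            buckets[prob].append(rained)
        return {prob: sum(days) / len(days) for prob, days in buckets.items()}

    # Hypothetical record: 100 days forecast at a 40% chance of rain.
    # The forecast is accurate if it rained on about 40 of them, regardless
    # of what happened on any particular day.
    history = [(0.4, True)] * 40 + [(0.4, False)] * 60
    print(calibration(history))   # {0.4: 0.4} -> well calibrated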

Also, if there is a 40% chance of rain and I decide to go out in my $2,000 Armani suit without an umbrella and it rains on me and ruins my suit, it's not the fault of the forecast, it's the fault of my decision.

If executive project management don't know what the probability of a particular estimate is and they commit to it anyway, they are making an important decision with only part of the information they need. If executive project management do know the probability of a commitment, that probability is lower than 50%, and they commit anyway, they shouldn't be surprised if the project fails: the estimate predicted it would fail (that's what a lower-than-50% probability means).

It is neither fair nor, well, accurate to blame a good estimate for a poor commitment.


 

1. Boehm, Barry. Software Engineering Economics. Prentice-Hall, Englewood Cliffs, NJ, 1981.
2. McConnell, Steve. Software Estimation: Demystifying the Black Art. Microsoft Press, 2006.

 

 

 