"Software Estimating: Reflections and Looking Forward" — November 2010
Daniel D. Galorath, Galorath Incorporated — blog: www.galorath.com/wp
Prompt: We are interested in hearing your reflections on the cost estimation field over the years, as well as your thoughts on the state of the industry looking forward.
An Estimate Defined: Many Still Take an Estimate As Exact
An estimate is the most knowledgeable statement you can make at a particular point in time regarding:
- Effort / cost
- Schedule
- Staffing
- Risk
- Reliability
Estimates become more precise as the project progresses.
A WELL-FORMED ESTIMATE IS A DISTRIBUTION
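One way to make "an estimate is a distribution" concrete is a three-point (low / likely / high) estimate sampled with Monte Carlo and reported as percentiles. The following is a minimal sketch, assuming a simple triangular distribution and invented numbers; it is not SEER-SEM's method, only an illustration:

```python
import random

def effort_distribution(low, likely, high, trials=100_000):
    """Sample a three-point effort estimate (person-months) as a
    triangular distribution and return selected percentiles."""
    samples = sorted(random.triangular(low, high, likely) for _ in range(trials))

    def pct(p):
        return samples[int(p / 100 * (trials - 1))]

    return {"P10": pct(10), "P50": pct(50), "P80": pct(80), "P90": pct(90)}

# Hypothetical three-point estimate: 40 / 60 / 110 person-months
print(effort_distribution(40, 60, 110))
```

Quoting the P50 alongside the P80 makes it harder to mistake a single number for an exact commitment, and the spread can be re-estimated (and should narrow) as the project progresses.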
Age-Warped History of Software Estimating
- 1966: SDC software cost model (probably the first software cost model)
- 1976: A manual estimate killed a project
- 1978: Halstead Software Science
- 1979: Reifer / Galorath paper, the genesis of "JPL Softcost"
- ~1980: COCOMO
- ~1982: Reifer, Poor Man's Guide to Estimating Software Costs
- 1983: Softcost use made a management hardware decision viable
- 1984: CEI & Jensen model: more powerful than Softcost
- 1988: SEER-SEM development
- 1988-present: Hundreds of staff-years with constant:
  - Data collection
  - Cost model refinement
  - Other models (e.g., defects, maintenance, monitoring & control)
  - Data-driven models (e.g., ProjectMiner, Metrics & Benchmarking, data-driven analogies)
Poor Estimates: Effects on Projects
Inaccurate estimates have a significant impact on project success:
- Poor implementations
- Critical processes don't scale
- Emergency staffing
- Cost overruns caused by underestimating project needs
Lack of well-defined objectives, requirements, and specifications leads to creeping scope, which results in:
- Forever-changing project goals
- Frustration
- Customer dissatisfaction
- Cost overruns and missed schedules
- Project failures
Incorrect estimates and bad plans are a root cause of subsequent program risk.
Estimating and planning are key to software project success.
Problem: Humans Seem Hardwired to Be Optimists
Source: "Delusions of Success: How Optimism Undermines Executives' Decisions" (HBR)
- People routinely exaggerate benefits and discount costs
- Optimism stems from cognitive biases and organizational pressures
- We exaggerate our talents and degree of control
- We attribute negative consequences to external factors
- Anchoring (relying heavily on one piece of information) magnifies optimism
- Optimism is most pronounced for new initiatives
Solution: Temper with the "outside view"
- Supplement traditional forecasting with statistical analysis of analogous efforts
- Don't remove optimism, but balance optimism and realism
Why should we care: Optimistic estimates come from optimistic people, and it is hard to be realistic.
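The "outside view" can be sketched as reference class forecasting: scale the inside-view estimate by the distribution of actual-to-estimated ratios observed on analogous past efforts. The reference class and dollar figure below are invented for illustration:

```python
# Hypothetical actual/estimated cost ratios from analogous past projects
# (invented numbers; a real reference class would come from an
# organization's own history or an external benchmark database).
reference_class_ratios = [0.95, 1.10, 1.25, 1.30, 1.45, 1.60, 1.80, 2.10]

def outside_view(inside_view_estimate, ratios, percentile=0.8):
    """Scale an optimistic inside-view estimate by the empirical
    distribution of past overrun ratios."""
    ordered = sorted(ratios)
    idx = min(int(percentile * len(ordered)), len(ordered) - 1)
    return inside_view_estimate * ordered[idx]

inside = 1_000_000  # inside-view cost estimate in dollars (hypothetical)
print(f"80th-percentile outside view: ${outside_view(inside, reference_class_ratios):,.0f}")
```

The point is not to replace the inside view but to confront it with how analogous efforts actually turned out, balancing optimism with realism.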
The Chasm: Acceptance of Parametrics
Source: Crossing the Chasm, Geoffrey A. Moore
1. Innovators
   - Pursue new technology aggressively, often for its own sake
   - Technologists or technology enthusiasts
   - Will overlook all kinds of shortfalls in the deliverable
   - Easiest buying population to satisfy: want the truth, access to top technical support, to be first, and low cost (cheap)
2. Early Adopters (Visionaries)
   - Not technologists, but appreciate technology benefits; need more help than innovators
   - In pursuit of major benefits early on; can see the strategic opportunity represented by new technology
   - In search of procedural and benefit breakthroughs that will achieve order-of-magnitude ROI
   - Easy to sell and hard to please
   - Want "productized" technology
   - Always in a rush, but contract closure is next to impossible
   - Expectation management: visionaries will attempt to alter a vendor's priorities
3. Early Majority (mainstream market population)
   - Similar to early adopters but far more practical and pragmatic; averse to risk, want a proven solution
   - Insist on seeing well-established references from other early-majority users (a real Catch-22)
   - Not intimidated by technology, but will not pursue technology for technology's sake (i.e., "no one gets fired for choosing IBM")
4. Late Majority
   - Similar to the early majority, but not confident in their ability to handle a technology product
5. Laggards
   - Want nothing to do with technology and are not worth the trouble to try to convert; tend to fight the use of new technology
I believe parametrics are in the later part of the early adopters, with "data driven" as a current mantra.
Opportunity to Reduce Cost
Parametrics can provide cost reduction insight early, then throughout the project.
[Figure: committed cost vs. opportunity to reduce cost, 0-100%, across the Concept, Design, Test, and Production phases; the time to reduce costs is early in the lifecycle, before most cost is committed.]
Since Costs Are Constantly Talked About, Why Aren't They Understood and Managed?
People don't know how:
- How to produce credible estimates
- How to scope the problem
- How to factor in risk
Engineers sometimes don't care:
- Make it "best"... at any cost
- Since they can't quantify cost, they may ignore it
- Over-optimism
Sometimes people don't want to know the cost:
- Their programs may get killed
- They may not win
- They may lose their job
- They may be proven wrong
Estimation Organizational Maturity: Clarifies Estimation Needs & Goals
Level 0:
- Informal or no estimating
- Manual effort estimating without a process
Level 1:
- Direct task estimation
- Spreadsheets
- Ad hoc process
Level 2:
- Formal sizing (e.g., function points)
- Simple model (size × productivity) or informal SEER use
- Some measurement & analysis
- Informal process
Level 3:
- Formal sizing
- Robust parametric estimation (SEER)
- Estimate-vs.-actual capture
- Rigorous measurement & analysis
- Parametric planning & control; repeatable process
Level 4:
- Formal sizing
- Repeatable process
- Robust parametric estimating (SEER)
- Parametric estimation with tracking & control
- Process improvement via lessons learned
Level 5:
- Continuous process improvement
We're finding most companies are in this range.
Why should we care: Maturity impacts ROI and development decisions; robust processes can improve project success.
10-Step Software Estimation Process: Consistent Processes = Reliable Estimates
1. Establish estimate scope
2. Establish technical baseline, ground rules, and assumptions
3. Collect data
4. Estimate and validate software size
5. Prepare baseline estimates
6. Quantify risks and perform risk analysis
7. Review, verify, and validate the estimate
8. Generate a project plan
9. Document the estimate and lessons learned
10. Track the project throughout development
This software estimation process is described in Software Sizing, Estimation, and Risk Management by Dan Galorath and Michael Evans.
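As one hedged illustration of step 5 (prepare baseline estimates), here is a generic power-law parametric relationship between validated size (step 4) and effort/schedule. The coefficients are placeholders for illustration, not SEER-SEM's or COCOMO's calibrated values:

```python
def parametric_effort(ksloc, a=2.9, b=1.10, effort_multiplier=1.0):
    """Generic power-law effort model: effort (person-months) grows
    faster than linearly with size. Coefficients are illustrative only."""
    return a * (ksloc ** b) * effort_multiplier

def schedule_months(effort_pm, c=2.5, d=0.38):
    """Generic schedule relationship derived from effort;
    again, placeholder coefficients for illustration."""
    return c * (effort_pm ** d)

size_ksloc = 50  # validated size estimate from step 4 (hypothetical)
effort = parametric_effort(size_ksloc)
print(f"Baseline effort: {effort:.0f} PM, schedule: {schedule_months(effort):.1f} months")
```

The baseline produced this way then feeds the risk quantification, review, and tracking steps that follow.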
Manual Estimates: Human Reasons for Error (Metrics Can Help)
- Manual task estimates yield SIGNIFICANT error
- The desire for "credibility" motivates overestimating (padding each task to roughly an 80% probability), so the padded time then gets spent in order to look "reliable"
- A better approach: force 50%-probability task estimates and hold a project-level buffer for overruns
- Technical pride sometimes causes underestimates
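A small Monte Carlo sketch (with invented task distributions) shows why 50%-probability task estimates plus one project-level buffer beat padding every task to 80%: independent task overruns partially cancel, so the portfolio's 80th percentile sits well below the sum of per-task 80th percentiles.

```python
import random

random.seed(1)
TASKS = 20        # hypothetical number of tasks
TRIALS = 20_000

def task_duration():
    # Each task modeled as lognormal: median ~10 days, long right tail
    return random.lognormvariate(2.3, 0.5)   # invented parameters

def percentile(samples, p):
    ordered = sorted(samples)
    return ordered[int(p * (len(ordered) - 1))]

# Independent samples for every task
per_task = [[task_duration() for _ in range(TRIALS)] for _ in range(TASKS)]
sum_of_p80s = sum(percentile(s, 0.80) for s in per_task)
sum_of_p50s = sum(percentile(s, 0.50) for s in per_task)

# Project-level distribution: sum one draw from every task per trial
project_totals = [sum(s[i] for s in per_task) for i in range(TRIALS)]
project_p80 = percentile(project_totals, 0.80)

print(f"Every task padded to P80: {sum_of_p80s:.0f} days")
print(f"Sum of P50s:              {sum_of_p50s:.0f} days")
print(f"Project-level P80:        {project_p80:.0f} days "
      f"(buffer of {project_p80 - sum_of_p50s:.0f} days over the P50 plan)")
```

The buffer needed to reach an 80% project-level confidence is visibly smaller than the total padding hidden inside per-task 80% estimates.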
Sizing Pitfalls: Still a Challenge
- Mistake: Wrong sizing metric chosen for the level of detail desired
  Consequence: Large variance in estimates
- Mistake: Not enough time/effort spent on software sizing in general
  Consequence: Unbelievable estimates; results don't match the program and are too optimistic or pessimistic
- Mistake: No clear definition of size
  Consequence: Inconsistent estimates; results don't pass the sanity check, output is unreliable, and the model gets blamed
- Mistake: Size growth not considered, or size estimates reduced to achieve a desired cost
  Consequence: Inaccurate estimates; results are too optimistic, and programs will overrun cost and schedule estimates
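Because "size growth not considered" feeds directly into a nonlinear effort model, a short sketch of applying a historical growth factor to the baseline size before estimating; the growth factor and coefficients below are illustrative only, not calibrated values:

```python
def parametric_effort(ksloc, a=2.9, b=1.10):
    """Same illustrative power-law effort model as earlier; placeholder coefficients."""
    return a * (ksloc ** b)

baseline_ksloc = 50
growth_factor = 1.4   # hypothetical: size has historically grown ~40% from this phase

effort_no_growth   = parametric_effort(baseline_ksloc)
effort_with_growth = parametric_effort(baseline_ksloc * growth_factor)

print(f"Effort ignoring growth:     {effort_no_growth:.0f} PM")
print(f"Effort with growth applied: {effort_with_growth:.0f} PM "
      f"({effort_with_growth / effort_no_growth - 1:.0%} higher)")
```

Because effort grows faster than linearly with size, a 40% size growth produces more than a 40% effort increase, which is exactly the overrun an unadjusted size estimate hides.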
The Vision of Parametrics Over the Next 20 Years (Originally Drafted 2004)
- Parametrics will be integrated into engineering processes and engineering decision making; for example, the cost of a system derived from simulation models of that system
- Parametrics will be used throughout government and industry
- Parametrics will lose its "magic" reputation:
  - Improved processes will yield better data
  - Augmentation of parametrics with more viewable data will increase believability among engineers and management
  - The naysayers who claim you can make parametric models say anything you want will be replaced with belief
- More dynamic parametrics based on both historical and real-time data
- Parametric models will be available for use as "objects" in financial and engineering analysis
Lessons Learned
- Once you build a new model or methodology, it takes about 3 years before it gets to the chasm; crossing takes a lot longer
- Development is never done; new models, upgrades, and enhancements must occur constantly
- Software development of commercial products continues to become more difficult; nothing is easy
- People will promise much more data than will ever be received
- Data must be qualified as to its quality so that bad data can be segregated
- Models need to handle what people are doing today as well as what they will need next year
- People make estimates; models are tools