Verifying – Evaluating Software Estimates
John A. Long, PRICE Systems
Problem Statement
- Most complex systems in procurement by the DoD are software intensive
- Software is the highest-risk component in DoD programs
- The 1995 Standish survey polled over 800 software developers:
  - 52.7% of all projects were completed but incurred cost and schedule overruns
  - Challenged projects were delivered with only 61% of originally specified functionality
Source: 1995 Standish Survey
Overview
- Types of Estimates
- Evaluation Process
- Metrics
- Measures
- Productivity
- Summary
Types of Software Estimates
- Budgetary
- Proposal
  - Bid to win?
  - Deliver within the proposed schedule?
- Cost to Complete
  - Cost growth
  - Schedule growth
Each type of estimate is reviewed using different techniques and levels of scrutiny.
Evaluation Process
- "Bid to win" considers effort only: does the bottom-line cost meet the expected number?
- Only one parameter is evaluated using this process
- Three key parameters must be evaluated:
  - Price
  - Schedule
  - Quality
Evaluation Process
- Metrics provide insight into:
  - Cost/schedule (performance)
  - Process
  - Quality
- Metric examples:
  - Number of lines of code produced per hour
  - Number of defects per thousand lines of code
- Metrics are derived using software measures
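The two example metrics are simple ratios of raw measures. A minimal Python sketch, using hypothetical function names and values (none of these numbers come from the presentation):

```python
def loc_per_hour(sloc: int, hours: float) -> float:
    """Productivity metric: lines of code produced per labor hour."""
    return sloc / hours

def defects_per_ksloc(defects: int, sloc: int) -> float:
    """Quality metric: defects per thousand lines of code."""
    return defects / (sloc / 1000)

# Hypothetical program: 50,000 SLOC delivered in 25,000 hours, 300 defects.
print(loc_per_hour(50_000, 25_000))    # 2.0 SLOC per hour
print(defects_per_ksloc(300, 50_000))  # 6.0 defects per KSLOC
```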
Evaluation Process
- Software measures are actual data, collected and analyzed to form a metric
- The PRICE S input parameter APPL is a measure
- Metrics can be integrated with a software parametric cost model
- Metrics should be based on the past history of completed programs
Metrics
- The key is to derive metrics from similar software measures:
  - Software size: as size increases, productivity decreases
  - Throughput speed: timing issues and problems reduce productivity
  - Functional difficulty of the code: productivity decreases as functional difficulty increases
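The size effect above is often modeled by letting effort grow faster than linearly with size, so productivity falls on larger programs. The coefficients below are made up for illustration; they are not calibrated values from PRICE S or any other model:

```python
def effort_hours(ksloc: float, a: float = 300.0, b: float = 1.2) -> float:
    """Illustrative effort model: effort = a * size^b, with b > 1
    producing a diseconomy of scale (assumed coefficients)."""
    return a * ksloc ** b

def productivity(ksloc: float) -> float:
    """Resulting productivity in SLOC per hour."""
    return (ksloc * 1000) / effort_hours(ksloc)

# Because b > 1, a 100-KSLOC program shows lower productivity
# than a 10-KSLOC program under the same assumed model.
print(productivity(10) > productivity(100))  # True
```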
Example Productivity Metrics
Source: "Software Productivity Since 1970," by David Longstreet
Metrics Can Aid Evaluation of:
- Staff size
- Software department cost
- Software development schedule
- Program performance versus program plans
- Process improvement
- Risk of achieving program success
Metrics Cannot Estimate Software Size
- SLOC
- Function points
- Objects
Metric Development
- Objective software measures are collected for:
  - Performance (schedule and cost)
  - Product size
  - Product quality
  - Development process
    - Process improvement
    - Technology
  - Customer satisfaction
Metric Development
- Collect objective software measures:
  - SLOC developed
  - Total hours
  - Staff size
  - Number of defects, by phase:
    - Requirements
    - Design
    - Code
    - Test
Metric Development
- Normalize data into meaningful metrics:
  - SLOC per hour
  - Defects per thousand SLOC
- Analyze the results:
  - Why was productivity higher for program XYZ?
  - Expect differences
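The normalize-then-analyze step might look like this in practice. The program names and measures below are hypothetical, chosen only to show how raw data becomes comparable metrics:

```python
# Raw measures collected from two (hypothetical) completed programs.
programs = {
    "ABC": {"sloc": 40_000, "hours": 32_000, "defects": 280},
    "XYZ": {"sloc": 60_000, "hours": 20_000, "defects": 240},
}

# Normalize each program's measures into the two metrics.
metrics = {
    name: {
        "sloc_per_hour": m["sloc"] / m["hours"],
        "defects_per_ksloc": m["defects"] / (m["sloc"] / 1000),
    }
    for name, m in programs.items()
}

for name, m in metrics.items():
    print(name, m["sloc_per_hour"], m["defects_per_ksloc"])
# ABC 1.25 7.0
# XYZ 3.0 4.0
```

Here XYZ shows 3.0 SLOC per hour versus 1.25 for ABC; the analysis step is to ask why (different tooling, more modified code, different phases counted in the measure).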
Reasons for Higher Productivity Measures
- Functional complexity of the code (APPL)
- Process (code generation)
- Modification versus new development
- Different functional costs and development phases included in the software measure
- SLOC versus function points
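One common way to normalize away the modification-versus-new-development difference is to fold both into a single effective size before computing productivity. The 0.4 weight for modified code below is an assumed illustrative adaptation factor, not a standard or PRICE S value:

```python
def effective_sloc(new: int, modified: int, mod_weight: float = 0.4) -> float:
    """Combine new and modified code into one effective size.
    mod_weight is an assumed adaptation factor for illustration only."""
    return new + mod_weight * modified

# Hypothetical program: 10,000 new SLOC plus 50,000 modified SLOC.
print(effective_sloc(10_000, 50_000))  # 30000.0 effective SLOC
```

Dividing hours by effective size rather than total size keeps a mostly-modified program from looking artificially more productive than an all-new one.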
Sample Performance Metric
Source: Software Engineering Baselines, Kaman Sciences Corporation, July 1996
Metric Implementation
- Management buy-in
- Implemented at the correct level to evaluate proposal or product status
- Avoid metrics that do not add value to the evaluation process
- Understandable definitions of each metric
- Must be used to evaluate product status, not individuals
Summary
- Metrics can be used to evaluate the soundness of a software estimate
- Metrics should be derived from past history
- Metrics work very well with parametric cost models