Comparison and Assessment of Cost Models for NASA Flight Projects
Ray Madachy, Barry Boehm, Danni Wu
USC Center for Systems & Software Engineering
21st International Forum on COCOMO and Software Cost Modeling, November 8, 2006

2 Outline  Introduction and background  Model comparison examples  Estimation performance analysis  Conclusions and future work

3 Introduction  This work is sponsored by the NASA Ames project Software Risk Advisory Tools, Cooperative Agreement No. NNA06CB29A  Existing parametric software cost, schedule, and quality models are being assessed and updated for critical NASA flight projects –Includes a comparative survey of their strengths, limitations, and suggested improvements –Transformations are being developed between the models –Accuracies and needs for calibration are being examined with relevant NASA project data  This presentation covers the latest developments in ongoing research at the USC Center for Systems and Software Engineering (USC-CSSE) –Current work builds on previous research with NASA and the FAA

4 Frequently Used Cost/Schedule Models for Critical Flight Software  COCOMO II is a public domain model that USC continually updates; it is implemented in several commercial tools  SEER-SEM and True S are proprietary commercial models with unique features that also share some aspects with COCOMO –They include factors for project type and application domain  All three have been extensively used and tailored for flight project domains

5 Support Acknowledgments  Galorath Inc. (SEER-SEM) –Dan Galorath, Tim Hohmann, Bob Hunt, Karen McRitchie  PRICE Systems (True S) –Arlene Minkiewicz, James Otte, David Seaver  Softstar Systems (COCOMO calibration) –Dan Ligett  Jet Propulsion Laboratory –Jairus Hihn, Sherry Stukes  NASA Software Risk Advisory Tools research team –Mike Lowry, Tim Menzies, Julian Richardson  This study was performed mostly by persons highly familiar with COCOMO but not necessarily with the vendor models. The vendors do not certify or sanction the data or information contained in these charts.

6 Approach  Develop “Rosetta Stone” transformations between the models so COCOMO inputs can be converted into corresponding inputs to the other models, or vice versa –Crosscheck multiple estimation methods –Represent projects in a consistent manner in all models and help understand why estimates may vary –Extensive discussions with model proprietors to clarify definitions  Assess the models against a common database of relevant projects –Using the NASA 94 database of effort, size, and COCOMO cost factors for completed NASA projects (completion dates from the 1970s through the late 1980s) –Additional data as it comes in from NASA or other data collection initiatives  Analysis considerations –Calibration issues –Model deficiencies and extensions –Accuracy with relevant project data  Repeat the analysis with updated calibrations, revised domain settings, improved models, and new data

7 Critical Factor Distributions by Project Type [charts: distributions of the Reliability and Complexity factors by project type]

8 Outline  Introduction and background  Model comparison examples  Estimation performance analysis  Conclusions and future work

9 Cost Model Comparison Attributes  Algorithms  Size definitions –New, reused, modified, COTS –Language adjustments  Cost factors –Exponential, linear  Work breakdown structure (WBS) and labor parameters –Scope of activities and phases covered –Hours per person-month

10 Common Effort Formula

Effort = A * Size^B * Π EM

 Effort is in person-months  A - calibrated constant  B - scale factor  EM - effort multipliers from cost factors [Diagram: size and cost-factor inputs feed the effort equation, with calibrations and decomposition of effort by phase and activity]
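A minimal sketch of this common form in Python. The constant values below are illustrative only (A = 2.94 matches the COCOMO II.2000 calibration, but B is normally derived from scale factors, and each model calibrates differently):

    # Common effort formula: Effort = A * Size^B * product(EMs).
    # Constants are illustrative, not an official calibration.
    from math import prod

    def effort_pm(size_ksloc, A=2.94, B=1.10, effort_multipliers=()):
        """Effort in person-months for a given size in KSLOC."""
        return A * size_ksloc ** B * prod(effort_multipliers)

    # Example: a 50 KSLOC component with two above-nominal cost factors.
    print(round(effort_pm(50, effort_multipliers=(1.26, 1.10)), 1))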

11 Example: Top-Level Rosetta Stone for COCOMO II Factors (1/3)

PRODUCT ATTRIBUTES
 Required Software Reliability → SEER: Specification Level - Reliability; True S: Operating Specification
 Data Base Size → SEER: none; True S: Code Size Non-executable
 Product Complexity → SEER: Complexity (Staffing), Application Class Complexity; True S: Functional Complexity
 Required Reusability → SEER: Reusability Level Required, Software Impacted by Reuse; True S: Design for Reuse
 Documentation Match to Lifecycle Needs → SEER: none; True S: Operating Specification

PLATFORM ATTRIBUTES
 Execution Time Constraint → SEER: Time Constraints; True S: Project Constraints - Communications and Timing
 Main Storage Constraint → SEER: Memory Constraints; True S: Project Constraints - Memory & Performance
 Platform Volatility → SEER: Target System Volatility, Host System Volatility; True S: Hardware Platform Availability (3)

12 Example: Top-Level Rosetta Stone for COCOMO II Factors (2/3)

PERSONNEL ATTRIBUTES
 Analyst Capability → SEER: Analyst Capabilities; True S: Development Team Complexity - Capability of Analysts and Designers
 Programmer Capability → SEER: Programmer Capabilities; True S: Development Team Complexity - Capability of Programmers
 Personnel Continuity → SEER: none; True S: Development Team Complexity - Team Continuity
 Application Experience → SEER: Analyst's Application Experience; True S: Development Team Complexity - Familiarity with Product
 Platform Experience → SEER: Development System Experience, Target System Experience; True S: Development Team Complexity - Familiarity with Platform
 Language and Toolset Experience → SEER: Programmer's Language Experience; True S: Development Team Complexity - Experience with Language

13 Example: Top-Level Rosetta Stone for COCOMO II Factors (3/3)

PROJECT ATTRIBUTES
 Use of Software Tools → SEER: Software Tool Use; True S: Design Code and Test Tools
 Multi-site Development → SEER: Multiple Site Development; True S: Multi Site Development
 Required Development Schedule → SEER: none (2); True S: Start and End Date

1 - The SEER Process Improvement factor rates the impact of improvement, not the CMM level
2 - Schedule constraints are handled differently
3 - A software assembly input factor
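To show how such a Rosetta Stone can be applied mechanically, here is a hypothetical Python sketch built from a few rows of the tables above; the factor names come from the slides, while the data structure and function are invented for illustration:

    # Hypothetical Rosetta Stone lookup for a few COCOMO II factors.
    ROSETTA_STONE = {
        "Required Software Reliability": {
            "SEER": ["Specification Level - Reliability"],
            "True S": ["Operating Specification"],
        },
        "Execution Time Constraint": {
            "SEER": ["Time Constraints"],
            "True S": ["Project Constraints - Communications and Timing"],
        },
        "Use of Software Tools": {
            "SEER": ["Software Tool Use"],
            "True S": ["Design Code and Test Tools"],
        },
    }

    def map_factor(cocomo_factor, target_model):
        """Return the target-model factor name(s), or None if unmapped."""
        entry = ROSETTA_STONE.get(cocomo_factor)
        return entry.get(target_model) if entry else None

    print(map_factor("Use of Software Tools", "SEER"))  # ['Software Tool Use']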

14 Example: Model Size Inputs

New Software
 COCOMO II: New Size
 SEER-SEM: New Size
 True S: New Size, New Size Non-executable

Adapted Software
 COCOMO II: Adapted Size, % Design Modified, % Code Modified, % Integration Required, Assessment and Assimilation, Software Understanding (1), Programmer Unfamiliarity (1)
 SEER-SEM: Pre-exists Size (2), Deleted Size, Redesign Required %, Reimplementation Required %, Retest Required %
 True S: Adapted Size, Adapted Size Non-executable, % of Design Adapted, % of Code Adapted, % of Test Adapted, Reused Size, Reused Size Non-executable, Deleted Size, Code Removal Complexity

1 - Not applicable for reused software
2 - Specified separately for Designed for Reuse and Not Designed for Reuse
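As one concrete example of how these size inputs combine, the sketch below follows the published COCOMO II reuse model, which converts adapted code into equivalent new size through the adaptation adjustment factor; the example input values are made up:

    # COCOMO II equivalent-size computation for adapted software.
    # dm/cm/im: % design modified, % code modified, % integration required.
    # aa: assessment & assimilation (0-8), su: software understanding
    # (10-50), unfm: programmer unfamiliarity (0.0-1.0).
    def equivalent_sloc(adapted_sloc, dm, cm, im, aa=0, su=30, unfm=0.4):
        aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
        if aaf <= 50:
            aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
        else:
            aam = (aa + aaf + su * unfm) / 100
        return adapted_sloc * aam

    # Example: 20,000 adapted SLOC, 10% design and 20% code modified,
    # 30% of the integration effort required.
    print(round(equivalent_sloc(20_000, dm=10, cm=20, im=30)))  # ~4712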

15 Example: SEER Factors with No Direct COCOMO II Mapping PERSONNEL CAPABILITIES AND EXPERIENCE  Practices and Methods Experience DEVELOPMENT SUPPORT ENVIRONMENT  Modern Development Practices  Logon thru Hardcopy Turnaround  Terminal Response Time  Resource Dedication  Resource and Support Location  Process Volatility PRODUCT DEVELOPMENT REQUIREMENTS  Requirements Volatility (Change) 1  Test Level 2  Quality Assurance Level 2  Rehost from Development to Target PRODUCT REUSABILITY  Software Impacted by Reuse DEVELOPMENT ENVIRONMENT COMPLEXITY  Language Type (Complexity)  Host Development System Complexity  Application Class Complexity 3  Process Improvement TARGET ENVIRONMENT  Special Display Requirements  Real Time Code  Security Requirements 1 – COCOMO II uses the Requirements Evolution and Volatility size adjustment factor 2 – Captured in the COCOMO II Required Software Reliability factor 3 – Captured in the COCOMO II Complexity factor

16 Vendor Elaborations of Critical Domain Factors

 Required Software Reliability (COCOMO II) → SEER*: Specification Level – Reliability, Test Level, Quality Assurance Level; True S: Operating Specification Level (platform and environment settings for required reliability, portability, structuring and documentation)
 Product Complexity (COCOMO II) → SEER*: Complexity (Staffing), Language Type (Complexity), Host Development System Complexity, Application Class Complexity; True S: Functional Complexity – Application Type, Language, Language Object-Oriented

* SEER factors are supplemented with, and may be impacted via, knowledge base settings for –Platform –Application –Acquisition method –Development method –Development standard –Class –Component type (COTS only)

17 Example: Required Reusability Mapping (1/2)

 SEER-SEM
–Reusability Level
 XH = Across organization
 VH = Across product line
 H = Across project
 N = No reuse requirements
–Software Impacted by Reuse (% reusable)
 100%
 50%
 25%
 0%

 COCOMO II (cost to develop a software module for subsequent reuse)
–XH = Across multiple product lines
–VH = Across product line
–H = Across program
–N = Across project
–L = None

18 Example: Required Reusability Mapping (2/2)

 SEER-SEM to COCOMO II:
–XH = XH in COCOMO II (100% reuse level = …, 50% = …, 25% = …, 0% = 1.25)
–VH = VH in COCOMO II (100% reuse level = …, 50% = …, 25% = …, 0% = 1.16)
–H = N in COCOMO II
–N = L in COCOMO II
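Since the rating translation itself is just a lookup, here is a minimal sketch of the SEER-to-COCOMO II conversion from this slide (the dictionary form is illustrative; the intermediate multipliers lost in transcription are not reproduced):

    # SEER-SEM Reusability Level -> COCOMO II Required Reusability rating.
    SEER_TO_COCOMO_RUSE = {
        "XH": "XH",  # across organization -> across multiple product lines
        "VH": "VH",  # across product line -> across product line
        "H": "N",    # across project (SEER) -> across project (COCOMO II)
        "N": "L",    # no reuse requirements -> none
    }
    print(SEER_TO_COCOMO_RUSE["H"])  # N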

19 Example: WBS Mapping

20 Example: Model Normalization
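One normalization step that can be stated precisely is reconciling labor-hour definitions before comparing effort: COCOMO II defines a person-month as 152 labor hours, while other tools may be configured differently. A minimal sketch, where the 160-hour figure is a hypothetical setting rather than a documented vendor default:

    # Convert effort between models with different hours per person-month.
    def convert_pm(effort_pm, from_hours_per_pm, to_hours_per_pm):
        total_hours = effort_pm * from_hours_per_pm
        return total_hours / to_hours_per_pm

    # 100 PM at COCOMO II's 152 hours/PM expressed in 160-hour PM.
    print(convert_pm(100, from_hours_per_pm=152, to_hours_per_pm=160))  # 95.0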

21 Outline  Introduction and background  Model comparison examples  Estimation performance analysis  Conclusions and future work

22 Model Analysis Flow [flow diagram of the iterative analysis across COCOMO II, SEER-SEM, and True S; not all steps are performed on iterations 2-n]

23 Performance Measures  For each model, compare actual and estimated effort for n projects in a dataset:

Relative Error (RE) = (Estimated Effort – Actual Effort) / Actual Effort
Magnitude of Relative Error (MRE) = |Estimated Effort – Actual Effort| / Actual Effort
Mean Magnitude of Relative Error (MMRE) = (Σ MRE) / n
Root Mean Square (RMS) = ((1/n) Σ (Estimated Effort – Actual Effort)²)^½
Prediction level PRED(L) = k / n, where k = the number of projects in a set of n projects whose MRE ≤ L
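These measures are straightforward to compute; a minimal Python sketch with made-up effort numbers:

    # Accuracy measures for paired (estimated, actual) effort values.
    from math import sqrt

    def accuracy(estimated, actual, pred_level=0.40):
        n = len(actual)
        mre = [abs(e - a) / a for e, a in zip(estimated, actual)]
        mmre = sum(mre) / n
        rms = sqrt(sum((e - a) ** 2 for e, a in zip(estimated, actual)) / n)
        pred = sum(1 for m in mre if m <= pred_level) / n  # PRED(40) by default
        return mmre, rms, pred

    mmre, rms, pred40 = accuracy([120.0, 80.0, 300.0], [100.0, 90.0, 250.0])
    print(f"MMRE={mmre:.2f} RMS={rms:.1f} PRED(40)={pred40:.2f}")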

24 COCOMO II Performance Examples [charts: effect of calibration on PRED(40) and MMRE]

25 SEER-SEM Performance Examples [charts: effects of progressive adjustments on MMRE and PRED(40)]

26 Model Performance Summaries For Flight Projects

27 Outline  Introduction and background  Model comparison examples  Estimation performance analysis  Conclusions and future work

28 Vendor Concerns  Study limited to a COCOMO viewpoint only  Current Rosetta Stones need review and may be weak translators from the original data  Results not indicative of model performance due to ignored parameters  Risk and uncertainty were excluded by ground rules  Data sanity checking needed

29 Conclusions (1/2)  All cost models (COCOMO II, SEER-SEM, True S) performed well against the NASA database of critical flight software –Calibration and knowledge base settings improved default model performance –Estimation performance varies by domain subset  Complexity and reliability factor distributions characterize the domains as expected  SEER-SEM and True S vendor models provide additional factors beyond COCOMO II –More granular factors for the overall effects captured in the COCOMO II Complexity factor –Additional factors for other aspects, many of which are relevant for NASA projects  Some difficulties mapping inputs between models, but simplifications are possible  Reconciliation of effort WBS is necessary for valid comparison between models

30 Conclusions (2/2)  Models exhibited nearly equivalent performance trends for embedded flight projects within the different subgroups –Initial uncalibrated runs from COCOMO II and SEER-SEM both underestimated the projects by approximately 50% overall –Improvement trends between uncalibrated estimates and those with calibrations or knowledge base refinements were almost identical  SEER experiments illustrated that model performance measures improved markedly when incorporating knowledge base information for the domains –All three models have roughly the same final performance measures, whether for individual flight groups or combined  In practice no one model should be preferred over all others –Use a variety of methods and tools, then investigate why the estimates may vary

31 Future Work  The study has been helpful in reducing sources of misinterpretation across the models, but considerably more should be done* –Developing two-way and/or multiple-way Rosetta Stones –Explicit identification of residual sources of uncertainty across models and their estimates not fully addressable by Rosetta Stones:  Factors unique to some models but not others  Many-to-many factor mappings  Partial factor-to-factor mappings  Similar factors that affect estimates in different ways: linear, multiplicative, exponential, other  Imperfections in data: subjective rating scales, code counting, counting of other size factors, effort/schedule counting, endpoint definitions and interpretations, WBS element definitions and interpretations  Repeating the analysis with improved models, new data, and updated Rosetta Stones –COCOMO II may be revised for critical flight project applications  Improved analysis process –Revise vendor tool usage to set knowledge bases before COCOMO translation parameter setting –Capture estimate inputs in all three model formats; try different translation directionalities  With modern and more comprehensive data, COCOMO II and other models can be further improved and tailored for NASA project usage –Additional data is always welcome

* The study participants welcome sponsorship of further joint efforts to pin down sources of uncertainty and to more explicitly identify the limits to comparing estimates across models

32 Bibliography  Boehm B, Abts C, Brown A, Chulani S, Clark B, Horowitz E, Madachy R, Reifer D, Steece B, Software Cost Estimation with COCOMO II, Prentice-Hall, 2000  Boehm B, Abts C, Chulani S, Software Development Cost Estimation Approaches – A Survey, USC-CSE Technical Report, 2000  Galorath Inc., SEER-SEM User Manual, 2005  Lum K, Powell J, Hihn J, Validation of Spacecraft Software Cost Estimation Models for Flight and Ground Systems, JPL Technical Report, 2001  Madachy R, Boehm B, Wu D, Comparison and Assessment of Cost Models for NASA Flight Projects, USC Center for Systems and Software Engineering Technical Report, 2006  PRICE Systems, True S User Manual, 2005  Reifer D, Boehm B, Chulani S, The Rosetta Stone - Making COCOMO 81 Estimates Work with COCOMO II, Crosstalk, 1999