A Sizing Framework for DoD Software Cost Analysis
Raymond Madachy, NPS; Barry Boehm, Brad Clark and Don Reifer, USC; Wilson Rosa, AFCAA
24th International Forum on COCOMO and Systems/Software Cost Modeling
November 2, 2009

Agenda
– Project Overview (Dr. Wilson Rosa)
– Data Analysis
– Software Sizing
– Conclusions

Project Background
– Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
– Project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School.
– We will publish the AFCAA Software Cost Estimation Metrics Manual to help analysts and decision makers develop accurate, quick, and easy software cost estimates for avionics, space, ground, and shipboard platforms.

Stakeholder Communities
Research is collaborative across heterogeneous stakeholder communities, who have helped refine our data definition framework and domain taxonomy and have provided project data:
– Government agencies
– Tool vendors (SLIM-Estimate™, TruePlanning® by PRICE Systems)
– Industry
– Academia

Research Objectives
– Establish a robust and cost-effective software metrics collection process and knowledge base that supports the data needs of the United States Department of Defense (DoD)
– Enhance the utility of the collected data to program oversight and management
– Support academic and commercial research into improved cost estimation of future DoD software-intensive systems

Software Cost Model Calibration
– Most program offices and support contractors rely heavily on software cost models
– These models may not have been calibrated with the most recent DoD data
– Calibration with recent data (2002-present) will help increase program office estimating accuracy (a sketch of the idea follows)
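
A minimal sketch of what such a recalibration involves, assuming the common COCOMO-style form effort = A * size^B fit by least squares in log space. The project values below are invented purely for illustration; the manual's actual calibration procedure and data are not shown here.

```python
import numpy as np

# Hypothetical (invented) project actuals: size in KSLOC, effort in
# person-months. A real calibration would use recent SRDR data.
size_ksloc = np.array([12.0, 45.0, 90.0, 150.0, 300.0])
effort_pm = np.array([40.0, 180.0, 420.0, 700.0, 1600.0])

# Fit effort = A * size^B by ordinary least squares in log space,
# the usual way COCOMO-family coefficients are recalibrated.
B, log_A = np.polyfit(np.log(size_ksloc), np.log(effort_pm), 1)
A = np.exp(log_A)
print(f"effort ~= {A:.2f} * KSLOC^{B:.2f}")
```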

AFCAA Software Cost Estimation Metrics Manual
Table of Contents
Chapter 1: Software Estimation Principles
Chapter 2: Product Sizing
Chapter 3: Product Growth
Chapter 4: Effective SLOC
Chapter 5: Historical Productivity
Chapter 6: Model Calibration
Chapter 7: Calibrated SLIM-ESTIMATE
Chapter 8: Cost Risk and Uncertainty Metrics
Chapter 9: Data Normalization
Chapter 10: Software Resource Data Report
Chapter 11: Software Maintenance
Chapter 12: Lessons Learned

Manual Special Features
Augment NCCA/AFCAA Software Cost Handbook:
– Default Equivalent Size Inputs (DM, CM, IM, SU, AA, UNFM)
– Productivity Benchmarks by Operating Environment, Application Domain, and Software Size
– Empirical Code, Effort, and Schedule Growth Measures derived from SRDRs
– Empirically Based Cost Risk and Uncertainty Analysis Metrics
– Calibrated SLIM-Estimate™ using the most recent SRDR data
– Mapping between COCOMO, SEER, and True S cost drivers
– Empirical Dataset for COCOMO, True S, and SEER Calibration
– Software Maintenance Parameters

Manual Special Features (Cont.)
– Guidelines for reconciling inconsistent data
– Standard definitions (Application Domain, SLOC, etc.)
– Issues related to incremental development (overlaps, early-increment breakage, integration complexity growth, deleted software, relations to maintenance) and version management (a form of product line development and evolution)
– Impact of next-generation paradigms: Model-Driven Architecture, Net-Centricity, Systems of Systems, etc.

Agenda
– Project Overview (Dr. Wilson Rosa)
– Data Analysis
– Software Sizing
– Conclusions

DoD Empirical Data
Data quality and standardization issues:
– No reporting of Equivalent Size Inputs (CM, DM, IM, SU, AA, UNFM, Type)
– No common SLOC reporting (logical, physical, etc.)
– No standard definitions (Application Domain, Build, Increment, Spiral, …)
– No common effort reporting (analysis, design, code, test, CM, QA, …)
– No common code counting tool
– Product size only reported in lines of code
– No reporting of quality measures (defect density, defect containment, etc.)
Limited empirical research within DoD on other contributors to productivity besides effort and size:
– Operating Environment, Application Domain, and Product Complexity
– Personnel Capability
– Required Reliability
– Quality (defect density, defect containment)
– Integrating code from previous deliveries (Builds, Spirals, Increments, etc.)
– Converting to Equivalent SLOC
Categories like Modified, Reused, Adopted, Managed, and Used add no value unless they translate into single or unique narrow ranges of DM, CM, and IM parameter values. We have seen no empirical evidence that they do.

SRDR Data Source

Data Collection and Analysis
Approach:
– Be sensitive to the application domain
– Embrace the full life cycle and Incremental Commitment Model; be able to collect data by phase, project, and/or build or increment
Items to collect:
– SLOC reporting (logical, physical, NCSS, etc.)
– Requirements Volatility and Reuse: Modified or Adopted using DM, CM, IM; SU, UNFM as appropriate
– Definitions for Application Types, Development Phase, Lifecycle Model, …
– Effort reporting by phase and activity
– Quality measures (defects, MTBF, etc.)

Data Normalization Strategy
Interview program offices and developers to obtain additional information not captured in SRDRs:
– Modification Type: auto-generated, re-hosted, translated, modified
– Source: in-house, third party, prior build, prior spiral, etc.
– Degree of Modification: %DM, %CM, %IM; SU, UNFM as appropriate
– Requirements Volatility: % of ESLOC reworked or deleted due to requirements volatility
– Method: Model-Driven Architecture, object-oriented, traditional
– Cost Model Parameters: True S, SEER, COCOMO

Agenda
– Project Overview (Dr. Wilson Rosa)
– Data Analysis
– Software Sizing
– Conclusions

Size Issues and Definitions
An accurate size estimate is the most important input to parametric cost models. We want consistent size definitions and measurements across different models and programming languages. The sizing chapter addresses these needs:
– Common size measures defined and interpreted for all the models
– Guidelines for estimating software size
– Guidelines to convert size inputs between models so projects can be represented in a consistent manner
Source Lines of Code (SLOC) is used as the common measure:
– Logical source statements consisting of data declarations and executables
– Rules for considering statement type, how produced, origin, build, etc.
– Automated code counting tools adhering to the definition (a toy illustration follows this slide)
– Conversion guidelines for physical statements
Other size units such as requirements, use cases, etc. are also addressed.
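
A minimal sketch of the logical-statement counting idea for a C-like language: count statement terminators after stripping comments and literals. Real counting tools (for example USC's Unified CodeCount) implement far more detailed rules; this is only an illustration, not the manual's tool.

```python
import re

def logical_sloc(source: str) -> int:
    """Rough logical-statement count for C-like code.

    Strips comments and string/character literals, then counts
    statement terminators (';'), so two statements on one physical
    line count as two and a multi-line statement counts as one.
    """
    source = re.sub(r'/\*.*?\*/', '', source, flags=re.DOTALL)  # block comments
    source = re.sub(r'//[^\n]*', '', source)                    # line comments
    source = re.sub(r'"(\\.|[^"\\])*"', '""', source)           # string literals
    source = re.sub(r"'(\\.|[^'\\])*'", "''", source)           # char literals
    return source.count(';')

code = '''
int main(void) {
    int x = 1; int y = 2;  /* two logical statements, one physical line */
    return x + y;
}
'''
print(logical_sloc(code))  # -> 3
```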

Sizing Framework Elements
– Core software size type definitions
  – Standardized data collection definitions: measurements will be invariant across cost models and data collection venues
  – Project data normalized to these definitions, with translation tables for non-compliant data sources
– SLOC definition and inclusion rules
– Equivalent SLOC parameters
– Cost model Rosetta Stone size translations
– Other size unit conversions (e.g. function points, use cases, requirements); an illustrative conversion follows this slide
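
A hedged sketch of one such conversion, function points to SLOC via "backfiring" gearing factors. The factors below are rough industry averages that vary widely by source and language version; they are illustrative assumptions, not values from the manual.

```python
# Approximate SLOC produced per function point, by implementation
# language (illustrative industry averages; real tables vary by source).
SLOC_PER_FP = {"C": 128, "C++": 55, "Java": 53, "Ada": 49}

def fp_to_sloc(function_points: float, language: str) -> float:
    """Convert a function point count to an approximate SLOC size."""
    return function_points * SLOC_PER_FP[language]

print(fp_to_sloc(200, "Java"))  # -> 10600
```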

Core Software Size Types

Equivalent SLOC: A User Perspective*
"Equivalent": a way of accounting for the work done to produce software, relative to the code-counted size of the delivered software.
"Source" lines of code: the number of logical statements prepared by the developer and used to generate the executing code.
– Usual Third Generation Language (C, Java): count logical 3GL statements
– For Model-driven, Very High Level Language, or Macro-based development: count the statements that generate the customary 3GL code
– For maintenance above the 3GL level: count the generator statements
– For maintenance at the 3GL level: count the generated 3GL statements
Two primary effects: Volatility and Reuse
– Volatility: % of ESLOC reworked or deleted due to requirements volatility
– Reuse: either with modification (modified) or without modification (adopted)
* Stutzke, Richard D., Estimating Software-Intensive Systems, Upper Saddle River, NJ: Addison Wesley, 2005

Adapted Software Parameters
For adapted software, apply the parameters:
– DM: % of design modified
– CM: % of code modified
– IM: % of integration effort required compared to integrating new code
– Normal Reuse Adjustment Factor: RAF = 0.4*DM + 0.3*CM + 0.3*IM
Reused software has DM = CM = 0; modified software has CM > 0.
Since data indicates that the RAF factor tends to underestimate modification effort due to added software understanding effects, two other factors are used (see the sketch below):
– Software Understandability (SU): how understandable is the software to be modified?
– Unfamiliarity (UNFM): how unfamiliar with the software to be modified is the person modifying it?
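
A minimal sketch of how these parameters combine into equivalent SLOC, following the COCOMO II reuse model from the references (its AAF is the same linear combination as RAF above). The manual's defaults and exact form may differ, and the sample values are invented.

```python
def equivalent_sloc(adapted_sloc, dm, cm, im, su=0.0, unfm=0.0, aa=0.0):
    """Equivalent new SLOC for adapted code (COCOMO II reuse model).

    dm, cm, im: percentages of design modified, code modified, and
    integration effort required (0-100). su: software understanding
    penalty (0-50). unfm: programmer unfamiliarity (0.0-1.0).
    aa: assessment and assimilation increment (0-8).
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# 10,000 adapted SLOC with 10% of design and 20% of code modified,
# 30% integration effort, moderate understanding penalty/unfamiliarity.
print(equivalent_sloc(10_000, dm=10, cm=20, im=30, su=30, unfm=0.4))  # 2356.0
```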

SLOC Inclusion Rules

Equivalent SLOC Rules
Separate rule tables cover development and maintenance, classifying each source category (New, Reused, Modified, Generated, Converted, COTS, Volatility) as included in or excluded from equivalent size. For generated code the two tables differ:
– Development: include the generator statements; exclude the generated 3GL statements.
– Maintenance: include the generator statements if the generated 3GL statements were not modified in development; include the generated 3GL statements if they were modified in development, and exclude them otherwise.

Cost Model Size Inputs

Sizing Chapter Current Outline

Agenda
– Project Overview (Dr. Wilson Rosa)
– Data Analysis
– Software Sizing
– Conclusions

Concluding Remarks
– Goal is to publish a manual that helps analysts develop quick software estimates using empirical metrics from recent programs
– Additional information is crucial for improving data quality across DoD
– We want your input on Productivity Domains and Data Definitions
– Looking for collaborators
– Looking for peer reviewers
– Need more data

References
United States Department of Defense (DoD), "Instruction, Operation of the Defense Acquisition System", December.
W. Rosa, B. Clark, R. Madachy, D. Reifer, and B. Boehm, "Software Cost Metrics Manual", Proceedings of the 42nd Department of Defense Cost Analysis Symposium, February.
B. Boehm, "Future Challenges for Systems and Software Cost Estimation", Proceedings of the 13th Annual Practical Software and Systems Measurement Users' Group Conference, June.
B. Boehm, C. Abts, W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Upper Saddle River, NJ: Prentice-Hall, 2000.
R. Stutzke, Estimating Software-Intensive Systems, Upper Saddle River, NJ: Addison Wesley, 2005.
R. Madachy and B. Boehm, "Comparative Analysis of COCOMO II, SEER-SEM and True-S Software Cost Models", USC-CSSE Technical Report, University of Southern California Center for Systems and Software Engineering, 2008.