Barry Boehm, USC-CSSE Fall 2011


Future Challenges for Systems and Software Cost Estimation and Measurement
Barry Boehm, USC-CSSE, Fall 2011

Many people have provided us with valuable insights on the challenge of integrating systems and software engineering, especially at the OSD/USC workshops in October 2007 and March 2008. We would particularly like to thank Bruce Amato (OSD), Elliot Axelband (Rand/USC), William Bail (Mitre), J.D. Baker (BAE Systems), Kristen Baldwin (OSD), Kirstie Bellman (Aerospace), Winsor Brown (USC), Jim Cain (BAE Systems), David Castellano (OSD), Clyde Chittister (CMU-SEI), Les DeLong (Aerospace), Chuck Dreissnack (SAIC/MDA), Tom Frazier (IDA), George Friedman (USC), Brian Gallagher (CMU-SEI), Stuart Glickman (Lockheed Martin), Gary Hafen (Lockheed Martin), Dan Ingold (USC), Judy Kerner (Aerospace), Kelly Kim (Boeing), Sue Koolmanojwong (USC), Per Kroll (IBM), DeWitt Latimer (USAF/USC), Rosalind Lewis (Aerospace), Azad Madni (ISTI), Mark Maier (Aerospace), Darrell Maxwell (USN), Ali Nikolai (SAIC), Lee Osterweil (UMass), Karen Owens (Aerospace), Adrian Pitman (Australia DMO), Art Pyster (Stevens), Shawn Rahmani (Boeing), Bob Rassa (Raytheon), Don Reifer (RCI/USC), John Rieff (Raytheon), Stan Rifkin (Master Systems), Wilson Rosa (USAF), Walker Royce (IBM), Kelly Schlegel (Boeing), Tom Schroeder (BAE Systems), David Seaver (Price Systems), Rick Selby (Northrop Grumman), Stan Settles (USC), Neil Siegel (Northrop Grumman), Frank Sisti (Aerospace), Peter Suk (Boeing), Denton Tarbet (Galorath), Rich Turner (Stevens), Gan Wang (BAE Systems), and Marilee Wheaton (Aerospace), for their valuable contributions to the study.
4/27/2017 USC-CSSE

Summary
- Current and future trends create challenges for systems and software data collection and analysis:
  - Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
  - Cost drivers: effects of complexity, volatility, architecture
  - Alternative processes: rapid/agile; systems of systems; evolutionary development
  - Model integration: systems and software; cost, schedule, and quality; costs and benefits
- Updated systems and software data definitions and estimation methods are needed for good management
- Being addressed in the Nov 4-5 workshops

Metrics and “Productivity”
- “Equivalent” size
- Requirements/design/product/value metrics
- Productivity growth phenomena
- Incremental development productivity decline

Size Issues and Definitions
- An accurate size estimate is the most important input to parametric cost models
- Goal: consistent size definitions and measurements across different models and programming languages
- The AFCAA Guide sizing chapter addresses these by:
  - Defining and interpreting common size measures for all the models
  - Providing guidelines for estimating software size
  - Providing guidelines to convert size inputs between models so projects can be represented in a consistent manner
- Using Source Lines of Code (SLOC) as the common measure:
  - Logical source statements consisting of data declarations and executables
  - Rules for considering statement type, how produced, origin, build, etc.
- Providing automated code-counting tools adhering to the definition
- Providing conversion guidelines for physical statements
- Addressing other size units such as requirements, use cases, etc.

Equivalent SLOC – A User Perspective*
- “Equivalent”: a way of accounting for the relative work done to generate software, relative to the code-counted size of the delivered software
- “Source” lines of code: the number of logical statements prepared by the developer and used to generate the executing code
  - Usual third-generation language (C, Java): count logical 3GL statements
  - For model-driven, very-high-level-language, or macro-based development: count the statements that generate the customary 3GL code
  - For maintenance above the 3GL level: count the generator statements
  - For maintenance at the 3GL level: count the generated 3GL statements
- Two primary effects: volatility and reuse
  - Volatility: % of ESLOC reworked or deleted due to requirements volatility
  - Reuse: either with modification (modified) or without modification (adopted)

*Stutzke, Richard D., Estimating Software-Intensive Systems, Upper Saddle River, N.J.: Addison Wesley, 2005
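The two bookkeeping effects above can be sketched as a small roll-up. This is an illustrative simplification, not the full COCOMO II adaptation model: the adoption and modification fractions below are placeholder assumptions. The 240K result matches the Site Defense build arithmetic later in the deck.

```python
def equivalent_sloc(new, adopted, modified, adoption_factor=0.2,
                    modification_factor=0.5, volatility_pct=0.0):
    """Illustrative ESLOC roll-up: new code counts fully; adopted and
    modified code count at a fraction of full development effort; a
    requirements-volatility percentage adds reworked ESLOC on top.
    The default fractions are hypothetical, for illustration only."""
    base = new + adopted * adoption_factor + modified * modification_factor
    return base * (1 + volatility_pct / 100.0)

# Site Defense Build 1 from a later slide: 200K new + 200K reused @ 20%
print(equivalent_sloc(200_000, 200_000, 0))  # 240000.0
```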

“Number of Requirements”
- Early estimation availability at kite level
- Data collection and model calibration at clam level
(Cockburn, Writing Effective Use Cases, 2001)

IBM-UK Expansion Factor Experience

  Level of detail                   Count           Cockburn level
  Business Objectives               5               Cloud
  Business Events/Subsystems        35              Kite
  Use Cases/Components              250             Sea level
  Main Steps/Main Operations        2,000           Fish
  Alt. Steps/Detailed Operations    15,000          Clam
  SLOC*                             1,000K-1,500K   Lava

*(70-100 SLOC/Detailed Operation)
(Hopkins & Jenkins, Eating the IT Elephant, 2008)
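The bottom row's SLOC range follows directly from the footnote's expansion factor:

```python
# 15,000 detailed operations at 70-100 SLOC each (per the footnote)
detailed_operations = 15_000
low, high = 70, 100  # SLOC per detailed operation

print(detailed_operations * low, detailed_operations * high)
# 1050000 1500000 -> the slide's 1,000K-1,500K SLOC range
```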

SLOC/Requirement Data (Selby, 2009)

Estimation Challenges: A Dual Cone of Uncertainty – Need Early Systems Engineering, Evolutionary Development
- One cone: uncertainties in scope, COTS, reuse, services
- The other cone: uncertainties in competition, technology, organizations, mission priorities
- Shorter increments are better: uncertainties in competition and technology evolution, and changes in organizations and mission priorities, can wreak havoc with the best of system development programs. The longer the development cycle, the more likely it is that several of these uncertainties or changes will occur and make the originally defined system obsolete. Planning to develop a system in short increments therefore helps ensure that early, high-priority capabilities can be developed and fielded, and that changes can be more easily accommodated in future increments.

Incremental Development Productivity Decline (IDPD)
Example: Site Defense BMD software – 5 builds, 7 years, $100M; operational and support software
- Build 1 productivity: over 200 LOC/person-month
- Build 5 productivity: under 100 LOC/PM, including Build 1-4 breakage, integration, and rework
- 318% change in requirements across all builds
- IDPD factor = 20% productivity decrease per build
- Similar trends in later unprecedented systems; not unique to DoD (a key source of Windows Vista delays)
- Maintenance covers full non-COTS SLOC, not ESLOC:
  - Build 1: 200 KSLOC new; 200K reused @ 20% = 240K ESLOC
  - Build 2: 400 KSLOC of Build 1 software to maintain and integrate
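A minimal sketch of the IDPD arithmetic, assuming the 20% decrease compounds geometrically from build to build:

```python
def productivity(build, initial=200.0, idpd=0.20):
    """Productivity (SLOC/PM) for a given build, assuming a constant
    fractional productivity decline per build (geometric decay)."""
    return initial * (1.0 - idpd) ** (build - 1)

print(productivity(1))  # 200.0
print(productivity(5))  # ~81.9 -- consistent with "under 100 LOC/PM"
```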

IDPD Cost Drivers: Conservative 4-Increment Example
- Some savings: more experienced personnel (5-20%), depending on personnel turnover rates
- Some increases: code base growth, diseconomies of scale, requirements volatility, user requests
  - Breakage and maintenance of the full code base (20-40%)
  - Diseconomies of scale in development and integration (10-25%)
  - Requirements volatility; user requests (10-25%)
- Best case: 20% more effort (IDPD = 6%)
- Worst case: 85% more effort (IDPD = 23%)
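The best- and worst-case totals are consistent with treating the IDPD figure as a per-increment effort growth rate compounded across the three build-to-build transitions of a four-increment project. That reading is an assumption on our part; the deck does not state the formula:

```python
def cumulative_effort_growth(idpd, increments=4):
    """Effort growth of the final increment relative to the first,
    assuming per-increment effort grows by the IDPD fraction
    (an interpretive assumption, not a formula given in the deck)."""
    return (1.0 + idpd) ** (increments - 1) - 1.0

print(round(cumulative_effort_growth(0.06), 2))  # 0.19 -> "~20% more effort"
print(round(cumulative_effort_growth(0.23), 2))  # 0.86 -> "~85% more effort"
```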

Effects of IDPD on Number of Increments Model relating productivity decline to number of builds needed to reach 8M SLOC Full Operational Capability Assumes Build 1 production of 2M SLOC @ 100 SLOC/PM 20000 PM/ 24 mo. = 833 developers Constant staff size for all builds Analysis varies the productivity decline per build Extremely important to determine the incremental development productivity decline (IDPD) factor per build SLOC 8M 2M 4/27/2017 USC-CSSE

Incremental Development Data Challenges
- Breakage effects on previous increments
  - Modified, added, deleted SLOC: need a code counter with a diff tool
- Accounting for breakage effort
  - Charged to the current increment or to the I&T budget (IDPD)
  - IDPD effects may differ by type of software
  - “Breakage ESLOC” added to the next increment
- Hard to track phase and activity distributions
  - Hard to spread initial requirements and architecture effort
- Size and effort reporting
  - Often reported cumulatively; subtracting the previous increment's size may miss deleted code
- Time-certain development
  - Which features were completed? (Fully? Partly? Deferred?)

“Equivalent SLOC” Paradoxes
- Not a measure of software size
- Not a measure of software effort
- Not a measure of delivered software capability
- A quantity derived from software component sizes and reuse factors that helps estimate effort
- Once a product or increment is developed, its ESLOC loses its identity: its size expands into full SLOC
  - Reuse factors can be applied to this to determine an ESLOC quantity for the next increment, but this has no relation to the product's size

COCOMO II Database Productivity Increases
- Two productivity-increasing trends exist: 1970-1994 and 1995-2009
- 1970-1999 productivity trends are largely explained by cost drivers and scale factors
- Post-2000 productivity trends are not explained by cost drivers and scale factors
(Chart: SLOC per PM by five-year period, 1970-1974 through 2005-2009)

Constant A Decreases Over Post-2000 Period
- Calibrate the constant A while holding B fixed at 0.91
- Constant A is the inverse of adjusted productivity, i.e., productivity adjusted by the scale factors (SFs) and effort multipliers (EMs)
- Constant A decreases over the five-year periods from 1970-1974 through 2005-2009, with a 50% decrease over the post-2000 period
- Productivity is not fully characterized by SFs and EMs: what factors can explain the phenomenon?
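The calibration uses the COCOMO II effort form PM = A · Size^E · ΠEM, with E = B + 0.01 · ΣSF. A minimal sketch of solving for A on one completed project; the project numbers below are hypothetical:

```python
import math

B = 0.91  # exponent base held fixed during calibration, per the slide

def calibrate_A(pm_actual, ksloc, scale_factor_sum, effort_multipliers):
    """Solve PM = A * Size^E * prod(EM) for A, with E = B + 0.01 * sum(SF)."""
    E = B + 0.01 * scale_factor_sum
    return pm_actual / (ksloc ** E * math.prod(effort_multipliers))

# Hypothetical project: 465 PM actuals, 100 KSLOC, all-nominal drivers
# (EM = 1.0), scale factors summing to 18.97 (so E = 1.0997)
A = calibrate_A(465, 100, 18.97, [1.0] * 17)
print(round(A, 2))  # ~2.94
```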

Candidate Explanation Hypotheses
- Productivity has doubled over the last 40 years, but scale factors and effort multipliers did not fully characterize this increase
- Hypotheses/questions for explanation:
  - Is the standard for rating personnel factors being raised, e.g., relative to the “national average”?
  - Was generated code counted as new code (e.g., model-driven development)?
  - Was reused code counted as new code?
  - Are the ranges of some cost drivers not large enough? Improvement in tools (TOOL) contributes only a 20% reduction in effort
  - Are more lightweight projects being reported? (Documentation relative to life-cycle needs)

Summary
- Current and future trends create challenges for systems and software data collection and analysis:
  - Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
  - Cost drivers: effects of complexity, volatility, architecture
  - Alternative processes: rapid/agile; systems of systems; evolutionary development
  - Model integration: systems and software; cost, schedule, and quality; costs and benefits
- Updated systems and software data definitions and estimation methods are needed for good management
- Being addressed in the Nov 4-5 workshops

Cost Driver Rating Scales and Effects
- Application Complexity: Difficulty and Constraints scales
- Architecture, Criticality, and Volatility effects
  - Architecture effects as a function of product size
  - Added effects of criticality and volatility

Candidate AFCAA Difficulty Scale Difficulty would be described in terms of required software reliability, database size, product complexity, integration complexity, information assurance, real-time requirements, different levels of developmental risks, etc.

Candidate AFCAA Constraints Scale Dimensions of constraints include electrical power, computing capacity, storage capacity, repair capability, platform volatility, physical environment accessibility, etc.

Added Cost of Weak Architecting Calibration of COCOMO II Architecture and Risk Resolution factor to 161 project data points

Effect of Size on Software Effort Sweet Spots

Effect of Volatility and Criticality on Sweet Spots

Summary
- Current and future trends create challenges for systems and software data collection and analysis:
  - Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
  - Cost drivers: effects of complexity, volatility, architecture
  - Alternative processes: rapid/agile; systems of systems; evolutionary development
  - Model integration: systems and software; cost, schedule, and quality; costs and benefits
- Updated systems and software data definitions and estimation methods are needed for good management
- Being addressed in the Nov 4-5 workshops

Estimation for Alternative Processes
- Agile methods
  - Planning Poker/Wideband Delphi
  - “Yesterday's weather” adjustment: Agile COCOMO II
- Evolutionary development
  - Schedule/cost/quality as the independent variable
  - Incremental development productivity decline
- Systems of systems
- Hybrid methods

Planning Poker/Wideband Delphi
1. Stakeholders formulate the story to be developed
2. Developers choose and show cards indicating their estimated ideal person-weeks to develop the story (card values: 1, 2, 3, 5, 8, 13, 20, 30, 50, 100)
3. If the card values are about the same, use the median as the estimated effort
4. If the card values vary significantly, discuss why some estimates are high and some low, then re-vote
5. Generally, the values will converge, and the median can be used as the estimated effort
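The steps above can be sketched directly. The "close enough" spread threshold below is an assumption for illustration; the method as stated leaves that judgment to the team:

```python
from statistics import median

CARD_VALUES = [1, 2, 3, 5, 8, 13, 20, 30, 50, 100]

def poker_estimate(votes, spread_ratio=3.0):
    """Return the median if the votes are reasonably close; otherwise
    signal that the team should discuss and re-vote. The spread_ratio
    convergence test is a hypothetical stand-in for team judgment."""
    assert all(v in CARD_VALUES for v in votes)
    if max(votes) <= spread_ratio * min(votes):
        return median(votes)  # converged: use the median
    return None               # too much spread: discuss and re-vote

print(poker_estimate([3, 5, 5, 8]))    # 5.0 -> close enough, median wins
print(poker_estimate([2, 5, 30, 50]))  # None -> re-vote after discussion
```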

Agile COCOMO II
- Adjusts agile “yesterday's weather” estimates
- Agile COCOMO II is a web-based software cost estimation tool that enables you to adjust your estimates by analogy, by identifying the factors that will change and by how much
(Screenshot: tool input form – baseline value in dollars, person-months, dollars or SLOC per function point, etc.; current project size in function points or SLOC; current labor rate; current iteration number)
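A sketch of estimation by analogy in this style, assuming "adjusting" means scaling the baseline by relative size and by the ratios of the cost drivers that changed; the actual tool's formula may differ, and the project numbers are hypothetical:

```python
def analogy_estimate(baseline_effort_pm, size_ratio, factor_changes):
    """Adjust a 'yesterday's weather' baseline: scale by relative size,
    then by the ratio (new/old) of each cost driver that changed.
    factor_changes: dict of name -> (old_multiplier, new_multiplier)."""
    estimate = baseline_effort_pm * size_ratio
    for old, new in factor_changes.values():
        estimate *= new / old
    return estimate

# Hypothetical: last iteration took 10 PM; the next is 20% bigger, and a
# more capable team drops its effort multiplier from 1.0 to 0.85
print(round(analogy_estimate(10, 1.2, {"team": (1.0, 0.85)}), 1))  # 10.2
```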

Incremental Development Forms

Evolutionary Sequential
- Examples: small – agile; large – evolutionary development
- Pros: adaptability to change; rapid fielding
- Cons: easiest-first; late, costly breakage
- Cost estimation: small – planning-poker-type; large – parametric with IDPD

Prespecified Sequential
- Examples: platform base plus PPPIs
- Pros: prespecifiable full-capability requirements
- Cons: emergent requirements or rapid change
- Cost estimation: COINCOMO with no increment overlap

Overlapped Evolutionary
- Examples: product lines with ultrafast change
- Pros: modular product line
- Cons: cross-increment breakage
- Cost estimation: parametric with IDPD and requirements volatility

Rebaselining Evolutionary
- Examples: mainstream product lines; systems of systems
- Pros: high assurance with rapid change
- Cons: highly coupled systems with very rapid change
- Cost estimation: COINCOMO and IDPD for development; COSYSMO for rebaselining

Time-phasing terms: Scoping; Architecting; Developing; Producing; Operating (SADPO)
- Prespecified Sequential: SA; DPO1; DPO2; DPO3; …
- Evolutionary Sequential: SADPO1; SADPO2; SADPO3; …
- Evolutionary Overlapped: SADPO1; SADPO2; SADPO3; …
- Evolutionary Concurrent: SA; D1; PO1 … SA2; D2; PO2 … SA3; D3; PO3 …

Evolutionary Development Implications
- Total Package Procurement doesn't work: requirements and cost can't be determined up front
- Need significant, sustained systems engineering effort
  - Need best-effort up-front architecting for evolution
  - Can't dismiss systems engineers after the Preliminary Design Review
- Feature-set size becomes the dependent variable
  - Add or drop borderline-priority features to meet schedule or cost
  - Implies prioritizing and architecting steps, as in the SAIV process model
  - Safer than trying to maintain a risk reserve

Future DoD Challenges: Systems of Systems
(Diagram: SoS-level spiral phases – Exploration, Valuation, Architecting, Development, Operation – with FCR/DCR/OCR anchor-point milestones and rebaselining/adjustment cycles; LCO-type proposal and feasibility information flows between the SoS level, constituent systems A, B, C, …, x, and candidate suppliers/strategic partners during source selection)

Conceptual SoS SE Effort Profile
- SoS SE activities focus on three somewhat independent activity areas, performed by relatively independent teams
- A given SoS SE team may be responsible for one, two, or all activity areas
- Some SoS programs may have more than one organization performing SoS SE activities

SoS SE Cost Model
- SoSs supported by the cost model:
  - Strategically oriented stakeholders interested in tradeoffs and costs
  - Long-range architectural vision for the SoS
  - Developed and integrated by an SoS SE team
  - System component independence
- Size drivers and cost factors based on product characteristics, processes that impact SoS SE team effort, and SoS SE personnel experience and capabilities
(Diagram: size drivers and cost factors, with calibration, feed the SoS SE effort estimate across three activity areas: planning, requirements management, and architecting; source selection and supplier oversight; SoS integration and testing)

Comparison of SE and SoSE Cost Model Parameters

Size drivers
- COSYSMO: # of system requirements; # of system interfaces; # of operational scenarios; # of algorithms
- COSOSIMO: # of SoS requirements; # of SoS interface protocols; # of constituent systems; # of constituent-system organizations

“Product” characteristics (size/complexity/volatility)
- COSYSMO: requirements understanding; architecture understanding; level of service requirements; # of recursive levels in design; migration complexity; technology risk; #/diversity of platforms/installations; level of documentation
- COSOSIMO: component-system maturity and stability; component-system readiness

Process characteristics
- COSYSMO: process capability; multi-site coordination; tool support
- COSOSIMO: maturity of processes; cost/schedule compatibility; SoS risk resolution

People characteristics
- COSYSMO: stakeholder team cohesion; personnel/team capability; personnel experience/continuity
- COSOSIMO: SoS team capability

Summary
- Current and future trends create challenges for systems and software data collection and analysis:
  - Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
  - Cost drivers: effects of complexity, volatility, architecture
  - Alternative processes: rapid/agile; systems of systems; evolutionary development
  - Model integration: systems and software; cost, schedule, and quality; costs and benefits
- Updated systems and software data definitions and estimation methods are needed for good management
- Being addressed in the Nov 4-5 workshops

Reasoning About the Value of Dependability – iDAVE
- iDAVE: Information Dependability Attribute Value Estimator
- Use the iDAVE model to estimate and track software dependability ROI
  - Helps determine how much dependability is enough
  - Helps analyze and select the most cost-effective combination of software dependability techniques
  - Estimates serve as a basis for tracking performance
- Integrates cost estimation (COCOMO II), quality estimation (COQUALMO), and value estimation relationships

iDAVE Model Framework

Examples of Utility Functions: Response Time
(Figure: value-vs-time curves for four cases – real-time control and event support; mission planning and competitive time-to-market, with a critical region; event prediction, e.g., weather, and software size; data archiving and priced quality of service)

Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
- RELY rated in terms of MTBF (hours)
- For a 100-KSLOC set of features, cost/schedule/RELY form “pick any two” points
- Can “pick all three” with a 77-KSLOC set of features

The SAIV* Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation and core-capability determination: top-priority features achievable within the fixed schedule with 90% confidence
4. Architecting for ease of adding or dropping borderline-priority features, and for accommodating post-IOC directions of growth
5. Incremental development, with the core capability as increment 1
6. Change and progress monitoring and control: add or drop borderline-priority features to meet the schedule

*Schedule As Independent Variable; feature set as the dependent variable. Also works with cost, or schedule/cost/quality, as the independent variable.

How Much Testing is Enough?
- Risk due to low dependability (early startup, commercial, high finance) trades off against risk due to market-share erosion from delayed delivery; the sweet spot minimizes total risk exposure
- COCOMO II added % test time: 12, 22, 34, 54
- COQUALMO P(L): 1.0, .475, .24, .125, .06
- Early startup: .33, .19, .11, .06, .03
- Commercial: .56, .32, .18, .10
- High finance: 3.0, 1.68, .96, .54, .30
- Market risk REm: .008, .027, .09
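The sweet-spot idea is to minimize total risk exposure RE = (cost of added testing) + P(L) · S(L). A sketch using the slide's P(L) column, assuming its first entry corresponds to a zero-added-test baseline; the loss sizes and per-percent test cost below are hypothetical:

```python
def sweet_spot(test_levels, p_loss, size_loss, cost_per_test_pct=0.01):
    """Pick the added-test-time level minimizing total risk exposure:
    RE = testing cost + probability-of-loss * size-of-loss.
    cost_per_test_pct and size_loss are illustrative assumptions."""
    re = [t * cost_per_test_pct + p * size_loss
          for t, p in zip(test_levels, p_loss)]
    best = min(range(len(re)), key=re.__getitem__)
    return test_levels[best]

added_test_pct = [0, 12, 22, 34, 54]       # 0-baseline assumed
p_loss = [1.0, 0.475, 0.24, 0.125, 0.06]   # COQUALMO-style P(L) from the slide

print(sweet_spot(added_test_pct, p_loss, size_loss=1.0))   # 22: modest loss size
print(sweet_spot(added_test_pct, p_loss, size_loss=10.0))  # 54: high-finance-like loss
```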

Related Additional Measurement Challenges
- Tracking progress of rebaselining and V&V teams
  - No global plans; individual changes or software drops
  - Earlier test preparation: surrogates, scenarios, testbeds
- Tracking content of time-certain increments
  - Deferred or partial capabilities; effects across the system
- Trend analysis of emerging risks
  - INCOSE Leading Indicators; SERC Effectiveness Measures
- Contributions to systems effectiveness
  - Measures-of-Effectiveness models and parameters
- Systems-of-systems progress, risk, and change tracking
- Consistent measurement flow-up, flow-down, flow-across

Some Data Definition Topics for Discussion
In the SW Metrics Unification workshop, Nov 4-5
- Ways to treat data elements
  - COTS and other OTS (open source; services; GOTS; reuse; legacy code)
  - Other size units (function points, object points, use case points, etc.)
  - Generated code: counting generator directives
  - Requirements volatility
  - Rolling up CSCIs into systems and systems of systems
- Cost model inputs and outputs (e.g., submitting estimate files)
  - Scope issues
  - Cost drivers, scale factors
  - Reuse parameters: Software Understanding, Programmer Unfamiliarity
  - Phases included: hardware-software integration; systems-of-systems integration; transition; maintenance
  - WBS elements and labor categories included; parallel software WBS
- How to involve various stakeholders: government, industry, commercial cost estimation organizations

Summary
- Current and future trends create challenges for systems and software data collection and analysis:
  - Metrics and “productivity”: “equivalent” size; requirements/design/product/value metrics; productivity growth and decline phenomena
  - Cost drivers: effects of complexity, volatility, architecture
  - Alternative processes: rapid/agile; systems of systems; evolutionary development
  - Model integration: systems and software; cost, schedule, and quality; costs and benefits
- Updated systems and software data definitions and estimation methods are needed for good management
- Being addressed in the Nov 4-5 workshops

References
- Boehm, B., “Some Future Trends and Implications for Systems and Software Engineering Processes,” Systems Engineering 9(1), pp. 1-19, 2006.
- Boehm, B., and Lane, J., “Using the ICM to Integrate System Acquisition, Systems Engineering, and Software Engineering,” CrossTalk, October 2007, pp. 4-9.
- Boehm, B., Brown, A.W., Clark, B., Madachy, R., Reifer, D., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- Dahmann, J., “Systems of Systems Challenges for Systems Engineering,” Systems and Software Technology Conference, June 2007.
- Department of Defense (DoD), Instruction 5000.02, Operation of the Defense Acquisition System, December 2008.
- Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach, 2006.
- Lane, J., and Boehm, B., “Modern Tools to Support DoD Software-Intensive System of Systems Cost Estimation,” DACS State of the Art Report; also Tech Report USC-CSSE-2007-716.
- Lane, J., and Valerdi, R., “Synthesizing System-of-Systems Concepts for Use in Cost Modeling,” Systems Engineering, Vol. 10, No. 4, December 2007.
- Madachy, R., “Cost Model Comparison,” Proceedings, 21st COCOMO/SCM Forum, November 2006, http://csse.usc.edu/events/2006/CIIForum/pages/program.html
- Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute, 2006.
- Reifer, D., “Let the Numbers Do the Talking,” CrossTalk, March 2002, pp. 4-8.
- Stutzke, R., Estimating Software-Intensive Systems, Addison Wesley, 2005.
- Valerdi, R., Systems Engineering Cost Estimation with COSYSMO, Wiley, 2010 (to appear).
- USC-CSSE Tech Reports, http://csse.usc.edu/csse/TECHRPTS/by_author.html

Backup Charts

COSYSMO Operational Concept
Size drivers (adjusted by a volatility factor):
- # Requirements
- # Interfaces
- # Scenarios
- # Algorithms
Effort multipliers:
- Application factors (8 factors)
- Team factors (6 factors)
- Schedule driver
The size drivers and effort multipliers feed COSYSMO, which produces a calibrated systems engineering effort estimate; the WBS is guided by ISO/IEC 15288.
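The operational concept above can be summarized as a formula: effort grows with a weighted sum of the size-driver counts and is scaled by the product of the effort multipliers. A minimal sketch of that form; the constants `A`, `E`, and the weights below are illustrative placeholders, not the published COSYSMO calibration:

```python
# Sketch of the COSYSMO effort form (constants are illustrative,
# not the published calibration values).
def cosysmo_effort(size_counts, weights, effort_multipliers, A=1.0, E=1.06):
    """Estimate systems engineering effort in person-months.

    size_counts: counts per size driver (requirements, interfaces,
                 algorithms, scenarios), already adjusted for volatility.
    weights:     relative weight per size driver.
    effort_multipliers: ratings for the application/team factors.
    """
    size = sum(weights[k] * size_counts[k] for k in size_counts)
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return A * size ** E * em_product

# Hypothetical example: 80 requirements and 12 interfaces, with
# nominal (1.0) effort-multiplier ratings.
effort = cosysmo_effort({"requirements": 80, "interfaces": 12},
                        {"requirements": 1.0, "interfaces": 2.0},
                        [1.0, 1.0])
```

The exponent E > 1 reflects COSYSMO's diseconomy of scale: doubling size more than doubles effort.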

4. Rate Cost Drivers - Application

COSYSMO Change Impact Analysis – I – Added SysE Effort for Going to 3 Versions
- Size: number, complexity, volatility, reuse of system requirements, interfaces, algorithms, scenarios (elements)
  - Versions: add 3-6% per increment for number of elements; add 2-4% per increment for volatility
  - Exercise prep.: add 3-6% per increment for number of elements; add 3-6% per increment for volatility
- Most significant cost drivers (effort multipliers):
  - Migration complexity: 1.10 – 1.20 (versions)
  - Multisite coordination: 1.10 – 1.20 (versions, exercise prep.)
  - Tool support: 0.75 – 0.87 (due to exercise prep.)
  - Architecture complexity: 1.05 – 1.10 (multiple baselines)
  - Requirements understanding: 1.05 – 1.10 for increments 1, 2; 1.0 for increment 3; 0.90 – 0.95 for increment 4

COSYSMO Change Impact Analysis – II – Added SysE Effort for Going to 3 Versions

Cost Element      Incr. 1      Incr. 2      Incr. 3      Incr. 4
Size              1.11 – 1.22  1.22 – 1.44  1.33 – 1.66  1.44 – 1.88
Effort Product    1.00 – 1.52  1.00 – 1.52  0.96 – 1.38  0.86 – 1.31
Effort Range      1.11 – 1.85  1.22 – 2.19  1.27 – 2.29  1.23 – 2.46
Arithmetic Mean   1.48         1.70         1.78         1.84
Geometric Mean    1.43         1.63         1.71         1.74
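The table rows follow arithmetically from the previous chart: size grows 11-22% per increment (summing the four per-increment growth ranges), the effort range is size times the effort-multiplier product, and the two means are taken over each range's endpoints. A sketch that reproduces the table to within its rounding; the effort-product ranges are taken from the table itself:

```python
import math

# Per-increment added size, (low, high) fractions: versions elements
# 3-6%, versions volatility 2-4%, exercise prep elements 3-6%,
# exercise prep volatility 3-6% -- 11% to 22% total per increment.
SIZE_GROWTH = (0.03 + 0.02 + 0.03 + 0.03, 0.06 + 0.04 + 0.06 + 0.06)

# Combined effort-multiplier product (low, high) per increment, from the table.
EFFORT_PRODUCT = {1: (1.00, 1.52), 2: (1.00, 1.52),
                  3: (0.96, 1.38), 4: (0.86, 1.31)}

def effort_range(incr):
    """Relative SysE effort (low, high) for a given increment."""
    size_lo = 1 + incr * SIZE_GROWTH[0]   # e.g. increment 1: 1.11
    size_hi = 1 + incr * SIZE_GROWTH[1]   # e.g. increment 1: 1.22
    em_lo, em_hi = EFFORT_PRODUCT[incr]
    return size_lo * em_lo, size_hi * em_hi

for n in EFFORT_PRODUCT:
    lo, hi = effort_range(n)
    print(f"Incr. {n}: effort {lo:.2f}-{hi:.2f}, "
          f"arith. mean {(lo + hi) / 2:.2f}, geom. mean {math.sqrt(lo * hi):.2f}")
```

The geometric mean sits below the arithmetic mean for every increment, which is the usual choice when averaging multiplicative cost factors.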

COSYSMO Requirements Counting Challenge
- Estimates made in early stages: relatively few high-level design-to requirements
- Calibration performed on completed projects: relatively many low-level test-to requirements
- Need to know expansion factors between levels
  - Best model: Cockburn definition levels (cloud, kite, sea level, fish, clam)
  - Expansion factors vary by application area and size
    - One large company: Magic Number 7
    - Small e-services projects: more like 3:1, with fewer lower levels
- Survey form available to capture your experience
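A toy illustration of why the expansion factor matters when comparing early design-to counts with calibration-data test-to counts; the function and factor values are just the figures quoted above, not a published model:

```python
def expand(high_level_count, levels_down, factor_per_level):
    """Estimate low-level (test-to) requirements from a high-level count,
    assuming a constant expansion factor per Cockburn definition level."""
    return high_level_count * factor_per_level ** levels_down

# One large company's "Magic Number 7", two levels down:
# expand(20, 2, 7) -> 980
# Small e-services projects, more like 3:1, one level down:
# expand(20, 1, 3) -> 60
```

The same 20 high-level requirements can correspond to anywhere from 60 to nearly 1000 test-to requirements, so applying a calibration without the matching expansion factor badly skews the estimate.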

Achieving Agility and High Assurance – I
ICM Stage II: Increment View
[Chart: short, stabilized development of Increment N, from the Increment N baseline through transition/O&M; foreseeable change (plan) and rapid change feed short development increments; high assurance comes from stable development increments; with timeboxed or time-certain development, precise costing is unnecessary and the feature set becomes the dependent variable]

Notes: The ICM is organized to simultaneously address the conflicting challenges of rapid change and high assurance of dependability. It also addresses the need for rapid fielding of incremental capabilities with a minimum of rework. For high assurance, the development of each increment should be short, stable, and provided with a validated baseline architecture, set of requirements, and development plans. The architecture should accommodate any foreseeable changes in the requirements; the next chart shows how the unforeseeable changes are handled.

Evolutionary Concurrent: Incremental Commitment Model
ICM Stage II: More Detailed Increment View
[Chart: an agile rebaselining team handles rapid, unforeseeable change (adapt) and deferrals to produce future increment baselines, while a plan-driven team performs short, stabilized development of Increment N, with foreseeable change planned in, from the Increment N baseline through transition/operations and maintenance; continuous verification and validation (V&V) of Increment N draws on current and future V&V resources, artifacts, and concerns; stable development increments provide high assurance]

Notes: The need to deliver high-assurance incremental capabilities on short fixed schedules means that each increment needs to be kept as stable as possible. This is particularly the case for large, complex systems and systems of systems, in which a high level of rebaselining traffic can easily lead to chaos. In keeping with the use of the spiral model as a risk-driven process model generator, the risks of destabilizing the development process make this portion of the project into a waterfall-like build-to-specification subset of the spiral model activities. The need for high assurance of each increment also makes it cost-effective to invest in a team of appropriately skilled personnel to continuously verify and validate the increment as it is being developed.

However, “deferring the change traffic” does not imply deferring its change impact analysis, change negotiation, and rebaselining until the beginning of the next increment. With a single development team and rapid rates of change, this would require a team optimized to develop to stable plans and specifications to spend much of the next increment’s scarce calendar time performing tasks much better suited to agile teams. The appropriate metaphor for addressing rapid change is not a build-to-specification metaphor or a purchasing-agent metaphor but an adaptive “command-control-intelligence-surveillance-reconnaissance” (C2ISR) metaphor. It involves an agile team performing the first three activities of the C2ISR “Observe, Orient, Decide, Act” (OODA) loop for the next increments, while the plan-driven development team is performing the “Act” activity for the current increment. “Observing” involves monitoring changes in relevant technology and COTS products, in the competitive marketplace, in external interoperating systems, and in the environment; and monitoring progress on the current increment to identify slowdowns and likely scope deferrals. “Orienting” involves performing change impact analyses, risk analyses, and tradeoff analyses to assess candidate rebaselining options for the upcoming increments. “Deciding” involves stakeholder renegotiation of the content of upcoming increments, architecture rebaselining, and the degree of COTS upgrading to be done to prepare for the next increment. It also involves updating the future increments’ Feasibility Rationales to ensure that their renegotiated scopes and solutions can be achieved within their budgets and schedules. A successful rebaseline means that the plan-driven development team can hit the ground running at the beginning of the “Act” phase of developing the next increment, and the agile team can hit the ground running on rebaselining definitions of the increments beyond.

Effect of Unvalidated Requirements – 15-Month Architecture Rework Delay
[Chart: response time (sec, 1-5) versus cost for two architectures, Arch. A (custom, many cache processors) and Arch. B (modified client-server), showing the original spec, the spec after prototyping, and the available budget]