University of Southern California Center for Systems and Software Engineering Future Challenges for Systems and Software Cost Estimation and Measurement.



Presentation transcript:

1 Future Challenges for Systems and Software Cost Estimation and Measurement
Barry Boehm, USC-CSSE
CS 510, Fall 2015

2 Summary
Current and future trends create challenges for systems and software data collection and analysis:
–Metrics and "productivity": "equivalent" size; requirements/design/product/value metrics; productivity growth and decline phenomena
–Cost drivers: effects of complexity, volatility, architecture
–Alternative processes: rapid/agile; systems of systems; evolutionary development
–Model integration: systems and software; cost, schedule, and quality; costs and benefits
Updated systems and software data definitions and estimation methods are needed for good management
–COCOMO III effort

3 Metrics and "Productivity"
"Equivalent" size
Requirements/design/product/value metrics
Productivity growth phenomena
Incremental development productivity decline

4 Size Issues and Definitions
An accurate size estimate is the most important input to parametric cost models.
Consistent size definitions and measurements are desired across different models and programming languages.
The upcoming Software Metrics Guide addresses these:
–Common size measures defined and interpreted for all the models
–Guidelines for estimating software size
–Guidelines to convert size inputs between models so projects can be represented in a consistent manner
Using Source Lines of Code (SLOC) as the common measure:
–Logical source statements consisting of data declarations and executables
–Rules for considering statement type, how produced, origin, build, etc.
–Providing automated code counting tools adhering to the definition
–Providing conversion guidelines for physical statements
Addressing other size units such as requirements, use cases, etc.

5 Equivalent SLOC – A User Perspective*
"Equivalent": a way of accounting for the relative work done to generate software, relative to the code-counted size of the delivered software
"Source" lines of code: the number of logical statements prepared by the developer and used to generate the executing code
–Usual third-generation language (C, Java): count logical 3GL statements
–For model-driven, very-high-level-language, or macro-based development: count the statements that generate customary 3GL code
–For maintenance above the 3GL level: count the generator statements
–For maintenance at the 3GL level: count the generated 3GL statements
Two primary effects: volatility and reuse
–Volatility: % of ESLOC reworked or deleted due to requirements volatility
–Reuse: either with modification (modified) or without modification (adopted)
*Stutzke, Richard D., Estimating Software-Intensive Systems, Upper Saddle River, NJ: Addison Wesley, 2005
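The two effects above can be folded into a single equivalent-size computation. A minimal sketch, assuming a linear adaptation-factor weighting and a volatility surcharge (the full COCOMO II reuse model has more terms; this only illustrates the bookkeeping):

```python
def esloc(new_sloc, adapted_sloc=0, adaptation_factor=0.0, volatility_pct=0.0):
    """Equivalent SLOC: new code counts fully; adapted (reused or modified)
    code is weighted by an adaptation factor reflecting the relative work
    done; requirements volatility adds a percentage of reworked size."""
    base = new_sloc + adapted_sloc * adaptation_factor
    return base * (1.0 + volatility_pct / 100.0)

# Build 1 example from the IDPD slide later in this deck: 200 KSLOC new
# plus 200 KSLOC reused at a 20% adaptation factor gives 240 KSLOC ESLOC.
print(esloc(200_000, 200_000, 0.20))  # 240000.0
```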

6 "Number of Requirements"
–Early estimation availability at kite level
–Data collection and model calibration at clam level
(Cockburn, Writing Effective Use Cases, 2001)

7 IBM-UK Expansion Factor Experience
Business Objectives: 5 (Cloud)
Business Events/Subsystems: 35 (Kite)
Use Cases/Components: 250 (Sea level)
Main Steps/Main Operations: 2,000 (Fish)
Alt. Steps/Detailed Operations: 15,000 (Clam)
SLOC*: 1,000K – 1,500K (Lava)
*(70 – 100 SLOC/Detailed Operation)
(Hopkins & Jenkins, Eating the IT Elephant, 2008)
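Given such ratios, an early high-level count can be expanded down to a SLOC range. A sketch using the numbers on this slide (they are one company's reported experience, not general constants):

```python
# Chained expansion from high-level artifacts down to detailed operations,
# using the IBM-UK counts from this slide (illustrative, not calibrated).
expansion = [
    ("business objectives", 5),
    ("business events/subsystems", 35),
    ("use cases/components", 250),
    ("main steps/main operations", 2000),
    ("alt. steps/detailed operations", 15000),
]

def estimate_sloc(detailed_operations, sloc_per_op_low=70, sloc_per_op_high=100):
    """SLOC range implied by a detailed-operation count at 70-100 SLOC each."""
    return detailed_operations * sloc_per_op_low, detailed_operations * sloc_per_op_high

low, high = estimate_sloc(expansion[-1][1])
print(low, high)  # 1050000 1500000 -- i.e., roughly 1,000K - 1,500K SLOC
```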

8 SLOC/Requirement Data (Selby, 2009)

9 Estimation Challenges: A Dual Cone of Uncertainty
–Need early systems engineering and evolutionary development
Uncertainties in competition, technology, organizations, mission priorities
Uncertainties in scope, COTS, reuse, services

10 Incremental Development Productivity Decline (IDPD)
Example: Site Defense BMD software
–5 builds, 7 years, $100M; operational and support software
–Build 1 productivity: over 200 LOC/person-month
–Build 5 productivity: under 100 LOC/PM, including Build 1-4 breakage, integration, and rework
–318% change in requirements across all builds
IDPD factor = 20% productivity decrease per build
–Similar trends in later unprecedented systems
–Not unique to DoD: a key source of Windows Vista delays
Maintenance of full non-COTS SLOC, not ESLOC
–Build 1: 200 KSLOC new; 200K reused @ 20% = 240K ESLOC
–Build 2: 400 KSLOC of Build 1 software to maintain and integrate

11 IDPD Cost Drivers: Conservative 4-Increment Example
Some savings: more experienced personnel (5-20%), depending on personnel turnover rates
Some increases: code base growth, diseconomies of scale, requirements volatility, user requests
–Breakage, maintenance of full code base (20-40%)
–Diseconomies of scale in development, integration (10-25%)
–Requirements volatility; user requests (10-25%)
Best case: 20% more effort (IDPD = 6%)
Worst case: 85% more effort (IDPD = 23%)
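One reading consistent with the best/worst-case numbers: with a per-build IDPD factor d, build N takes (1 + d)^(N-1) times the effort of build 1 for the same amount of delivered code. A minimal sketch (the compounding interpretation is an assumption, though it reproduces the slide's figures):

```python
def build_effort_multiplier(idpd, build):
    """Effort multiplier for a given build relative to build 1, assuming
    productivity declines by the IDPD fraction with each successive build."""
    return (1 + idpd) ** (build - 1)

# Conservative 4-increment example:
# best case IDPD = 6%  -> build 4 needs ~20% more effort than build 1
# worst case IDPD = 23% -> build 4 needs ~85% more effort than build 1
print(round(build_effort_multiplier(0.06, 4), 2))  # 1.19
print(round(build_effort_multiplier(0.23, 4), 2))  # 1.86
```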

12 Effects of IDPD on Number of Increments
Model relating productivity decline to the number of builds needed to reach 8M SLOC Full Operational Capability
Assumes Build 1 production of 2M SLOC @ 100 SLOC/PM
–20,000 PM / 24 mo. = 833 developers
–Constant staff size for all builds
Analysis varies the productivity decline per build
–Extremely important to determine the incremental development productivity decline (IDPD) factor per build
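Under the stated assumptions (constant staff and build duration, hence constant effort per build, with each build's output shrinking by the IDPD factor), the number of builds needed to reach Full Operational Capability can be sketched as:

```python
def builds_to_reach(target_sloc, build1_sloc, idpd, max_builds=50):
    """Count fixed-effort builds needed to reach target size when each
    build's output is build1_sloc / (1 + idpd)^(build - 1).
    Returns None if the target is unreachable within max_builds."""
    total, build = 0.0, 0
    while total < target_sloc and build < max_builds:
        build += 1
        total += build1_sloc / (1 + idpd) ** (build - 1)
    return build if total >= target_sloc else None

# 8M SLOC FOC with 2M SLOC in build 1, at various IDPD factors:
print(builds_to_reach(8_000_000, 2_000_000, 0.00))  # 4
print(builds_to_reach(8_000_000, 2_000_000, 0.10))  # 5
print(builds_to_reach(8_000_000, 2_000_000, 0.20))  # 7
```

This is why the slide calls determining the IDPD factor "extremely important": small changes in the per-build decline add whole extra builds, and therefore years, before FOC.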

13 Incremental Development Data Challenges
Breakage effects on previous increments
–Modified, added, deleted SLOC: need a code-count tool with diff capability
Accounting for breakage effort
–Charged to the current increment or the I&T budget (IDPD)
–IDPD effects may differ by type of software
–"Breakage ESLOC" added to the next increment
Hard to track phase and activity distributions
–Hard to spread initial requirements and architecture effort
Size and effort reporting
–Often reported cumulatively
–Subtracting the previous increment's size may miss deleted code
Time-certain development
–Which features were completed? (Fully? Partly? Deferred?)

14 "Equivalent SLOC" Paradoxes
Not a measure of software size
Not a measure of software effort
Not a measure of delivered software capability
A quantity derived from software component sizes and reuse factors that helps estimate effort
Once a product or increment is developed, its ESLOC loses its identity
–Its size expands into full SLOC
–Reuse factors can be applied to this to determine an ESLOC quantity for the next increment, but that has no relation to the product's size

15 COCOMO II Database Productivity Increases
Two productivity-increasing trends exist: 1970-1994 and 1995-2009
[Chart: SLOC per PM by five-year period, 1970-1974 through 2005-2009]
1970-1999 productivity trends are largely explained by cost drivers and scale factors
Post-2000 productivity trends are not explained by cost drivers and scale factors

16 Constant A Decreases Over the Post-2000 Period
Calibrate the constant A while holding B fixed at 0.91
Constant A is the inverse of adjusted productivity
–It adjusts productivity with scale factors (SFs) and effort multipliers (EMs)
Constant A decreases over the five-year periods, with a 50% decrease over the post-2000 period
[Chart: calibrated A by five-year period, 1970-1974 through 2005-2009]
Productivity is not fully characterized by SFs and EMs
What factors can explain the phenomenon?
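The calibration step can be sketched as a log-space fit: with the effort model PM = A * Size^B * EM and B held at 0.91, each project yields an implied A, and the calibrated constant is their geometric mean. The project tuples below are hypothetical examples, not values from the COCOMO database:

```python
import math

B = 0.91  # scale exponent held fixed during calibration

def calibrate_A(projects):
    """Calibrate the multiplicative constant A with exponent B fixed:
    for each (ksloc, composite_EM, actual_PM) tuple, the implied
    A_i = PM / (KSLOC^B * EM); return the geometric mean of the A_i
    (i.e., the average in log space)."""
    logs = [math.log(pm / (ksloc ** B * em)) for ksloc, em, pm in projects]
    return math.exp(sum(logs) / len(logs))

# Hypothetical projects: (size in KSLOC, composite effort multiplier, PM).
projects = [(100, 1.0, 200), (50, 1.2, 120), (20, 0.8, 40)]
print(calibrate_A(projects))
```

Repeating this fit over each five-year cohort of projects is what produces the declining sequence of A values the slide describes.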

17 Candidate Explanation Hypotheses
Productivity has doubled over the last 40 years
–But scale factors and effort multipliers did not fully characterize this increase
Hypotheses/questions for explanation:
–Is the standard for rating personnel factors being raised? E.g., relative to the "national average"
–Was generated code counted as new code? E.g., model-driven development
–Was reused code counted as new code?
–Are the ranges of some cost drivers not large enough? Improvement in tools (TOOL) contributes only a 20% reduction in effort
–Are more lightweight projects being reported? Documentation relative to life-cycle needs

18 Summary
Current and future trends create challenges for systems and software data collection and analysis:
–Metrics and "productivity": "equivalent" size; requirements/design/product/value metrics; productivity growth and decline phenomena
–Cost drivers: effects of complexity, volatility, architecture
–Alternative processes: rapid/agile; systems of systems; evolutionary development
–Model integration: systems and software; cost, schedule, and quality; costs and benefits
Updated systems and software data definitions and estimation methods are needed for good management
–COCOMO III effort

19 Cost Driver Rating Scales and Effects
Application complexity
–Difficulty and Constraints scales
Architecture, criticality, and volatility effects
–Architecture effects as a function of product size
–Added effects of criticality and volatility

20 Candidate AFCAA Difficulty Scale
Difficulty would be described in terms of required software reliability, database size, product complexity, integration complexity, information assurance, real-time requirements, different levels of developmental risks, etc.

21 Candidate AFCAA Constraints Scale
Dimensions of constraints include electrical power, computing capacity, storage capacity, repair capability, platform volatility, physical environment accessibility, etc.

22 Added Cost of Weak Architecting
Calibration of the COCOMO II Architecture and Risk Resolution factor to 161 project data points: technical debt

23 Effect of Size on Software Effort Sweet Spots

24 Effect of Volatility and Criticality on Sweet Spots

25 Summary
Current and future trends create challenges for systems and software data collection and analysis:
–Metrics and "productivity": "equivalent" size; requirements/design/product/value metrics; productivity growth and decline phenomena
–Cost drivers: effects of complexity, volatility, architecture
–Alternative processes: rapid/agile; systems of systems; evolutionary development
–Model integration: systems and software; cost, schedule, and quality; costs and benefits
Updated systems and software data definitions and estimation methods are needed for good management
–COCOMO III effort

26 Estimation for Alternative Processes
Agile methods
–Planning Poker/Wideband Delphi
–"Yesterday's weather" adjustment: Agile COCOMO II
Evolutionary development
–Schedule/cost/quality as independent variable
–Incremental development productivity decline
Systems of systems
–Hybrid methods

27 Planning Poker/Wideband Delphi
Stakeholders formulate the story to be developed
Developers choose and show cards indicating their estimated ideal person-weeks to develop the story
–Card values: 1, 2, 3, 5, 8, 13, 20, 30, 50, 100
If card values are about the same, use the median as the estimated effort
If card values vary significantly, discuss why some estimates are high and some low, then re-vote
–Generally, values will converge, and the median can be used as the estimated effort

28 Agile COCOMO II: Adjusting Agile "Yesterday's Weather" Estimates
Agile COCOMO II is a web-based software cost estimation tool that enables you to adjust your estimates by analogy, by identifying the factors that will be changing and by how much.
[Tool input form: estimate cost or effort; analogy parameter (dollars or person-months per function point, per line of code, or ideal person-weeks per iteration); baseline value; current project size (function points or SLOC); current labor rate; current iteration number]

29 Incremental Development Forms
Evolutionary Sequential
–Examples: small: agile; large: evolutionary development
–Pros: adaptability to change; rapid fielding. Cons: easiest-first; late, costly breakage
–Cost estimation: small: planning-poker-type; large: parametric with IDPD
Prespecified Sequential
–Examples: platform base plus PPPIs
–Pros: prespecifiable full-capability requirements. Cons: emergent requirements or rapid change
–Cost estimation: COINCOMO with no increment overlap
Overlapped Evolutionary
–Examples: product lines with ultrafast change
–Pros: modular product line. Cons: cross-increment breakage
–Cost estimation: parametric with IDPD and requirements volatility
Rebaselining Evolutionary
–Examples: mainstream product lines; systems of systems
–Pros: high assurance with rapid change. Cons: highly coupled systems with very rapid change
–Cost estimation: COINCOMO with IDPD for development; COSYSMO for rebaselining
Time-phasing terms: Scoping; Architecting; Developing; Producing; Operating (SADPO)
–Prespecified Sequential: SA; DPO1; DPO2; DPO3; …
–Evolutionary Sequential: SADPO1; SADPO2; SADPO3; …
–Evolutionary Overlapped: SADPO1; SADPO2; SADPO3; …
–Evolutionary Concurrent: SA; D1; PO1 … SA2; D2; PO2 … SA3; D3; PO3 …

30 Evolutionary Development Implications
Total Package Procurement doesn't work
–Can't determine requirements and cost up front
Need significant, sustained systems engineering effort
–Need best-effort up-front architecting for evolution
–Can't dismiss systems engineers after the Preliminary Design Review
Feature set size becomes the dependent variable
–Add or drop borderline-priority features to meet schedule or cost
–Implies prioritizing and architecting steps in the SAIV process model
–Safer than trying to maintain a risk reserve

31 Future DoD Challenges: Systems of Systems

32 Conceptual SoS SE Effort Profile
SoS SE activities focus on three somewhat independent activities, performed by relatively independent teams
A given SoS SE team may be responsible for one, two, or all activity areas
Some SoS programs may have more than one organization performing SoS SE activities

33 SoS SE Cost Model
Size drivers and cost factors feed SoS SE effort across three activity areas: planning, requirements management, and architecting; source selection and supplier oversight; SoS integration and testing
SoSs supported by the cost model:
–Strategically-oriented stakeholders interested in tradeoffs and costs
–Long-range architectural vision for the SoS
–Developed and integrated by an SoS SE team
–System component independence
Size drivers and cost factors
–Based on product characteristics, processes that impact SoS SE team effort, and SoS SE personnel experience and capabilities
–Calibration

34 Comparison of SE and SoSE Cost Model Parameters
Size drivers
–COSYSMO: # of system requirements; # of system interfaces; # of operational scenarios; # of algorithms
–COSOSIMO: # of SoS requirements; # of SoS interface protocols; # of constituent systems; # of constituent-system organizations; # of operational scenarios
"Product" characteristics
–COSYSMO: size/complexity/volatility; requirements understanding; architecture understanding; level of service requirements; # of recursive levels in design; migration complexity; technology risk; #/diversity of platforms/installations; level of documentation
–COSOSIMO: size/complexity/volatility; requirements understanding; architecture understanding; level of service requirements; component system maturity and stability; component system readiness
Process characteristics
–COSYSMO: process capability; multi-site coordination; tool support
–COSOSIMO: maturity of processes; tool support; cost/schedule compatibility; SoS risk resolution
People characteristics
–COSYSMO: stakeholder team cohesion; personnel/team capability; personnel experience/continuity
–COSOSIMO: stakeholder team cohesion; SoS team capability

35 Summary
Current and future trends create challenges for systems and software data collection and analysis:
–Metrics and "productivity": "equivalent" size; requirements/design/product/value metrics; productivity growth and decline phenomena
–Cost drivers: effects of complexity, volatility, architecture
–Alternative processes: rapid/agile; systems of systems; evolutionary development
–Model integration: systems and software; cost, schedule, and quality; costs and benefits
Updated systems and software data definitions and estimation methods are needed for good management
–COCOMO III effort

36 COCOMO III Project Scope and Purpose
COCOMO® III will produce estimates for:
–Effort, schedule, cost, defects
COCOMO® III can be applied at various moments in a project's lifecycle:
–Early estimation, post-architecture estimation, project re-estimation
COCOMO® III's functional vision:
–Single- and multiple-component estimates
–Analysis of alternatives
–Analysis with size, effort, and schedule as independent variables
–Support for different lifecycle processes
–Lifecycle cost estimation
–Legacy system transformation
–Alternative size measures
–Technical debt and its effects on effort and schedule
Project purpose: broaden the audience of COCOMO® and address the scope of modern projects (mobile devices, web/internet, big data, cloud-targeted, and multi-tenant software)
–Improve the accuracy and realism of estimates
–Improve driver definitions; add new and updated software cost drivers and adjust their ratings as needed
–Quality estimation capability
–Point and range estimates based on risk
–Improve the value of COCOMO® in decision-making

37 Historical Overview of the COCOMO Suite of Models
Software cost models: COCOMO 81; COCOMO II (2000); DBA COCOMO (2004); COINCOMO (2004)
Software extensions: COPSEMO (1998); COQUALMO (1998); COPROMO (1998); CORADMO (1999); COCOTS (2000); COPLIMO (2003); iDAVE (2003); Security Extension / Costing Secure Systems (2004)
Other independent estimation models: COSYSMO (2002); COSoSIMO (2004)
Dates indicate when the first paper was published for each model. The legend distinguishes models calibrated with historical project data, models derived from calibrated models, and models calibrated with expert (Delphi) data.

38 COQUALMO
Defect Introduction Model: inputs are the software size estimate plus project, product, platform, and personnel attributes; COCOMO II supplies the software development effort, cost, and schedule estimate
Defect Removal Model: inputs are the defect removal profile levels (automation, reviews, testing)
Outputs: number of residual defects; defect density per unit of size
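The two-stage pipeline can be sketched as: introduce defects in proportion to size, then apply each removal activity to what remains. All rates below are illustrative placeholders, not calibrated COQUALMO values:

```python
def residual_defects(ksloc, intro_rate_per_ksloc, removal_fractions):
    """COQUALMO-style two-stage estimate: a defect-introduction model
    (defects per KSLOC, in practice adjusted by project/product/personnel
    drivers) followed by a defect-removal model (automation, reviews,
    testing), each activity eliminating a fraction of remaining defects."""
    defects = ksloc * intro_rate_per_ksloc
    for frac in removal_fractions:
        defects *= (1 - frac)
    return defects

# Illustrative only: 100 KSLOC, 60 defects introduced per KSLOC, with
# reviews, testing, and automated analysis removing 50%, 40%, and 60%
# of the defects remaining at each stage.
n = residual_defects(100, 60, [0.5, 0.4, 0.6])
print(n, n / 100)  # residual defects, and defect density per KSLOC
```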

39 Non-SLOC Size Measures
Software requirements
Function Points
SNAP Points
Fast Function Points
COSMIC Points
Object/Application Points
–Reports, screens, 3G modules
Feature Points
Use Case Points
Story Points (agile development)
Design modules
RICE Objects
–Reports, Interfaces, Conversions, Enhancements

40 References
Boehm, B., "Some Future Trends and Implications for Systems and Software Engineering Processes", Systems Engineering 9(1), pp. 1-19, 2006.
Boehm, B., and Lane, J., "Using the ICM to Integrate System Acquisition, Systems Engineering, and Software Engineering," CrossTalk, October 2007, pp. 4-9.
Boehm, B., Brown, A.W., Clark, B., Madachy, R., Reifer, D., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
Dahmann, J., "Systems of Systems Challenges for Systems Engineering", Systems and Software Technology Conference, June 2007.
Department of Defense (DoD), Instruction 5000.02, Operation of the Defense Acquisition System, December 2008.
Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach, 2006.
Lane, J., and Boehm, B., "Modern Tools to Support DoD Software-Intensive System of Systems Cost Estimation," DACS State of the Art Report; also Tech Report USC-CSSE-2007-716.
Lane, J., and Valerdi, R., "Synthesizing System-of-Systems Concepts for Use in Cost Modeling," Systems Engineering, Vol. 10, No. 4, December 2007.
Madachy, R., "Cost Model Comparison," Proceedings of the 21st COCOMO/SCM Forum, November 2006, http://csse.usc.edu/events/2006/CIIForum/pages/program.html
Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute, 2006.
Reifer, D., "Let the Numbers Do the Talking," CrossTalk, March 2002, pp. 4-8.
Stutzke, R., Estimating Software-Intensive Systems, Addison Wesley, 2005.
Valerdi, R., Systems Engineering Cost Estimation with COSYSMO, Wiley, 2010 (to appear).
USC-CSSE Tech Reports, http://csse.usc.edu/csse/TECHRPTS/by_author.html

41 Backup Charts

42 COSYSMO Operational Concept
Size drivers: # requirements, # interfaces, # scenarios, # algorithms, plus a volatility factor
Effort multipliers: application factors (8 factors); team factors (6 factors); schedule driver
Size drivers and effort multipliers feed the effort estimate and its calibration; WBS guided by ISO/IEC 15288

43 4. Rate Cost Drivers – Application

44 COSYSMO Change Impact Analysis – I: Added SysE Effort for Going to 3 Versions
Size: number, complexity, volatility, and reuse of system requirements, interfaces, algorithms, scenarios (elements)
–1 → 3 versions: add 3-6% per increment for number of elements; add 2-4% per increment for volatility
–Exercise preparation: add 3-6% per increment for number of elements; add 3-6% per increment for volatility
Most significant cost drivers (effort multipliers):
–Migration complexity: 1.10 – 1.20 (versions)
–Multisite coordination: 1.10 – 1.20 (versions, exercise prep.)
–Tool support: 0.75 – 0.87 (due to exercise prep.)
–Architecture complexity: 1.05 – 1.10 (multiple baselines)
–Requirements understanding: 1.05 – 1.10 for increments 1-2; 1.0 for increment 3; 0.9 – 0.95 for increment 4

45 COSYSMO Change Impact Analysis – II: Added SysE Effort for Going to 3 Versions

Cost Element    | Incr. 1     | Incr. 2     | Incr. 3     | Incr. 4
Size            | 1.11 – 1.22 | 1.22 – 1.44 | 1.33 – 1.66 | 1.44 – 1.88
Effort Product  | 1.00 – 1.52 | 1.00 – 1.52 | 0.96 – 1.38 | 0.86 – 1.31
Effort Range    | 1.11 – 1.85 | 1.22 – 2.19 | 1.27 – 2.29 | 1.23 – 2.46
Arithmetic Mean | 1.48        | 1.70        | 1.78        | 1.84
Geometric Mean  | 1.43        | 1.63        | 1.71        | 1.74
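The rows of this table are related by simple interval arithmetic: the effort range is the size-growth range multiplied endpoint-by-endpoint by the effort-multiplier product range, then summarized by arithmetic and geometric means. A sketch that reproduces the Increment 1 column:

```python
import math

def effort_range(size_lo, size_hi, mult_lo, mult_hi):
    """Added-effort range = size range x effort-multiplier product range,
    plus arithmetic and geometric means of the resulting interval."""
    lo, hi = size_lo * mult_lo, size_hi * mult_hi
    return lo, hi, (lo + hi) / 2, math.sqrt(lo * hi)

# Increment 1 column: size 1.11-1.22, effort product 1.00-1.52.
lo, hi, amean, gmean = effort_range(1.11, 1.22, 1.00, 1.52)
print(round(lo, 2), round(hi, 2), round(amean, 2), round(gmean, 2))
# 1.11 1.85 1.48 1.43
```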

46 COSYSMO Requirements Counting Challenge
Estimates are made in early stages
–Relatively few high-level design-to requirements
Calibration is performed on completed projects
–Relatively many low-level test-to requirements
Need to know expansion factors between levels
–Best model: Cockburn definition levels (cloud, kite, sea level, fish, clam)
Expansion factors vary by application area and size
–One large company: Magic Number 7
–Small e-services projects: more like 3:1, with fewer lower levels
A survey form is available to capture your experience

47 Achieving Agility and High Assurance – I
Using timeboxed or time-certain development: precise costing is unnecessary; the feature set is the dependent variable
[Diagram: short, stabilized development of increment N produces the increment N baseline and its transition/O&M; rapid change is handled in short development increments, high assurance in stable development increments, with foreseeable change addressed by planning]

48 Evolutionary Concurrent: Incremental Commitment Model
[Diagram: agile rebaselining for future increments handles unforeseeable change (adapt) and produces future increment baselines; short, stabilized development of increment N uses current V&V resources, with continuous verification and validation (V&V) of increment N; deferrals, artifacts, and concerns flow between the teams; increment N transitions to operations and maintenance while future V&V resources support later increments; foreseeable change is planned into stable development increments]

49 Effect of Unvalidated Requirements: 15-Month Architecture Rework Delay
[Chart: cost ($50M, $100M) vs. response time (1-5 sec) for Arch. A (custom, many cache processors) and Arch. B (modified client-server), showing the original spec, the spec after prototyping, and the available budget]

50 Reasoning about the Value of Dependability – iDAVE
iDAVE: Information Dependability Attribute Value Estimator
Use the iDAVE model to estimate and track software dependability ROI
–Helps determine how much dependability is enough
–Helps analyze and select the most cost-effective combination of software dependability techniques
–Use estimates as a basis for tracking performance
–Integrates cost estimation (COCOMO II), quality estimation (COQUALMO), and value estimation relationships

51 iDAVE Model Framework

52 Examples of Utility Functions: Response Time
[Charts of value vs. time for: real-time control and event support (with a critical region); mission planning and competitive time-to-market; event prediction (weather) and software size; data archiving; priced quality of service]

53 Tradeoffs Among Cost, Schedule, and Reliability: COCOMO II
Cost/schedule/RELY: "pick any two" points (RELY, MTBF (hours))
For a 100-KSLOC set of features
Can "pick all three" with a 77-KSLOC set of features

54 The SAIV* Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation and core-capability determination
–Top-priority features achievable within the fixed schedule with 90% confidence
4. Architecting for ease of adding or dropping borderline-priority features
–And for accommodating post-IOC directions of growth
5. Incremental development
–Core capability as increment 1
6. Change and progress monitoring and control
–Add or drop borderline-priority features to meet schedule
*Schedule As Independent Variable; feature set as the dependent variable. Also works for cost, or schedule/cost/quality, as the independent variable.

55 How Much Testing is Enough?
Risk due to low dependability (early startup, commercial, high finance) vs. risk due to market share erosion

Added % test time (COCOMO II): 0 | 12 | 22 | 34 | 54
P(L) (COQUALMO):               1.0 | .475 | .24 | .125 | 0.06
S(L), Early Startup:           .33 | .19 | .11 | .06 | .03
S(L), Commercial:              1.0 | .56 | .32 | .18 | .10
S(L), High Finance:            3.0 | 1.68 | .96 | .54 | .30
RE_m (Market Risk):            .008 | .027 | .09 | .30 | 1.0

[Chart annotation: Sweet Spot]

