Competing on Schedule, Cost, and Quality: The Role of Software Models
Barry Boehm, USC
COCOMO/SCM Forum #16, October 24, 2001

Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

Traditional and e-Services Development
Traditional development: standalone systems; stable requirements; requirements determine capabilities; control over evolution; enough time to keep stable; repeatability-oriented processes and maturity models.
e-Services development: everything connected (maybe); rapid requirements change; COTS capabilities determine requirements; no control over COTS evolution; ever-decreasing cycle times; adaptive process models.

USC Model Integration Research: MBASE, CeBASE
[Diagram: four interacting model classes with examples. Success models: Win-Win, business case analysis, results chains, risk, software warranties, correctness, RAD, Six Sigma, stories, award fees, agility, golden rule. Product models: UML, XML, CORBA, COM, architectures, product lines, OO analysis and design, requirements, operational concepts, domain ontologies, COTS, GOTS. Property models: COCOMO II, COCOTS, CORADMO, COQUALMO, system dynamics, metrics and -ilities, simulation and modeling. Process models: waterfall, spiral, RUP, XP, SAIV/CAIV/SCQAIV, risk management, business process reengineering, CMMs, peopleware, IPTs, agile development, groupware, EasyWinWin, Experience Factory, GQM, JAD, QFD.]

Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

Competing on Schedule and Quality: A Risk Analysis Approach
Risk exposure: RE = Prob(Loss) × Size(Loss)
- "Loss" can be financial, reputation, future prospects, ...
For multiple sources of loss: RE = Σ_sources [Prob(Loss) × Size(Loss)]
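As a minimal sketch of the RE computation, the formula above is a probability-weighted sum over loss sources; the loss probabilities and sizes below are illustrative assumptions, not figures from the talk.

```python
# Risk exposure: RE = P(L) * S(L), summed over independent sources of loss.
def risk_exposure(sources):
    """sources: iterable of (prob_of_loss, size_of_loss) pairs."""
    return sum(p * s for p, s in sources)

# Illustrative values only: a 20% chance of a $500K dependability loss
# plus a 35% chance of a $300K market-share loss.
re_total = risk_exposure([(0.20, 500_000), (0.35, 300_000)])
print(f"Total risk exposure: ${re_total:,.0f}")  # $205,000
```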

Example RE Profile: Time to Ship - Loss Due to Unacceptable Dependability
[Chart: RE = P(L) × S(L) vs. time to ship (amount of testing). Shipping early: many defects (high P(L)), critical defects (high S(L)); shipping late: few defects (low P(L)), minor defects (low S(L)).]

Example RE Profile: Time to Ship - Adding Loss Due to Market Share Erosion
[Chart: a second RE curve rises with time to ship: few rivals give low P(L) and weak rivals low S(L); many rivals give high P(L) and strong rivals high S(L).]

Example RE Profile: Time to Ship - Sum of Risk Exposures
[Chart: summing the dependability-loss and market-erosion RE curves yields a U-shaped total with a "sweet spot": the amount of testing that minimizes total risk exposure.]
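A small sketch of how the sweet spot falls out of summing the two curves; the exponential loss functions and their coefficients are assumptions chosen only to reproduce the U-shape described on the slide.

```python
import numpy as np

# Illustrative risk-exposure curves over time-to-ship (months of testing).
# Dependability losses fall with more testing; market-erosion losses rise.
t = np.linspace(1, 12, 100)                # time to ship, months
re_defects = 800 * np.exp(-0.5 * t)        # P(L)*S(L) for defect losses ($K)
re_market = 20 * np.exp(0.3 * t)           # P(L)*S(L) for market-share losses ($K)
re_total = re_defects + re_market

sweet_spot = t[np.argmin(re_total)]
print(f"Sweet spot: ship after ~{sweet_spot:.1f} months of testing")
```

Shifting either curve up (a safety-critical S(L), or a startup's delay losses) moves the minimum, which is the point of the next two slides.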

Comparative RE Profile: Safety-Critical System
[Chart: with a higher S(L) for defects, the total-RE curve shifts so the high-Q sweet spot sits at more testing than the mainstream sweet spot.]

Comparative RE Profile: Internet Startup
[Chart: with a higher S(L) for delays, the low-TTM (time to market) sweet spot sits at less testing than the mainstream sweet spot.]

Conclusions So Far
- It is unwise to try to compete on both cost/schedule and quality, with some exceptions: a major technology or marketplace edge.
- There are no one-size-fits-all cost/schedule/quality strategies.
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough: you are buying information to reduce risk.
- Parameter values are often difficult to determine; some COCOMO II values are discussed next.

Software Development Cost/Quality Tradeoff - COCOMO II Calibration to 161 Projects
[Chart: relative cost/source instruction vs. RELY rating:]
- Very Low: defect risk is slight inconvenience; rough MTBF (mean time between failures) 1 hour; example: startup demo
- Low: low, easily recoverable loss; MTBF 1 day; relative cost 0.92; example: commercial cost leader
- Nominal: moderate, recoverable loss; MTBF 1 month; relative cost 1.0; example: in-house support software
- High: high financial loss; MTBF 2 years; relative cost 1.10; example: commercial quality leader
- Very High: loss of human life; MTBF 100 years; relative cost 1.26; example: safety-critical
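The chart's multipliers translate directly into a lookup table. A minimal sketch; the Very Low value (0.82) is taken from the published COCOMO II.2000 calibration rather than from this slide.

```python
# COCOMO II RELY effort multipliers, per the chart above
# (Very Low = 0.82 is the published COCOMO II.2000 value).
RELY_EM = {"VL": 0.82, "L": 0.92, "N": 1.00, "H": 1.10, "VH": 1.26}

def rely_adjusted_effort(nominal_pm, rating):
    """Scale a nominal effort estimate by the RELY multiplier."""
    return nominal_pm * RELY_EM[rating]

# Moving a 100 person-month in-house project to safety-critical reliability:
print(rely_adjusted_effort(100, "VH"))  # 126.0 PM, a 26% premium
```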

COCOMO II RELY Factor Dispersion
[Chart: dispersion of RELY ratings from Very Low to Very High; t = 2.6, where t > 1.9 is statistically significant.]

COCOMO II RELY Factor Phenomenology
RELY = Very Low:
- Rqts. and product design: little detail; many TBDs; little verification; minimal QA, CM, standards, draft user manual, test plans; minimal PDR
- Detailed design: basic design information; minimal QA, CM, standards, draft user manual, test plans; informal design inspections
- Code and unit test: no test procedures; minimal path test, standards check; minimal QA, CM; minimal I/O and off-nominal tests; minimal user manual
- Integration and test: no test procedures; many requirements untested; minimal QA, CM; minimal stress and off-nominal tests; minimal as-built documentation
RELY = Very High:
- Rqts. and product design: detailed verification, QA, CM, standards, PDR, documentation; IV&V interface; very detailed test plans and procedures
- Detailed design: detailed verification, QA, CM, standards, CDR, documentation; very thorough design inspections; very detailed test plans and procedures; IV&V interface; less rqts. rework
- Code and unit test: detailed test procedures, QA, CM, documentation; very thorough code inspections; very extensive off-nominal tests; IV&V interface; less rqts. and design rework
- Integration and test: very detailed test procedures, QA, CM, documentation; very extensive stress and off-nominal tests; IV&V interface; less rqts., design, and code rework

"Quality is Free": Did Philip Crosby's book get it all wrong?
Investments in dependable systems:
- Cost extra for simple, short-life systems
- Pay off for high-value, long-life systems

Software Life-Cycle Cost vs. Dependability
[Chart: relative cost to develop, and to develop and maintain, vs. COCOMO II RELY rating (Very Low through Very High); development-only cost starts at 0.8 for Very Low. Once maintenance (% Maint) is added, the low-RELY end becomes the most expensive: low dependability is inadvisable for evolving systems.]

Software Ownership Cost vs. Dependability
[Chart: relative cost to develop, maintain, own, and operate vs. COCOMO II RELY rating. With maintenance included, relative life-cycle cost is 2.55 at Very Low and 1.52 at Low. Curves are shown for operational-defect cost = 0 and for operational-defect cost at Nominal dependability equal to the software life-cycle cost.]
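A sketch of the ownership-cost arithmetic behind the chart, using the 2.55 and 1.52 life-cycle figures shown; the relative defect densities are invented placeholders, since the slide does not give them.

```python
# Relative life-cycle costs (develop + maintain) read off the chart, Nominal = 1.0.
lifecycle = {"VL": 2.55, "L": 1.52, "N": 1.00}

# Assumed, for illustration: operational-defect cost at Nominal dependability
# equals the software life-cycle cost (the chart's upper curve), and defect
# cost scales with residual defect density at each rating.
defect_density = {"VL": 4.0, "L": 2.0, "N": 1.0}   # assumed relative densities

for rating in ("VL", "L", "N"):
    total = lifecycle[rating] + 1.0 * defect_density[rating]
    print(rating, round(total, 2))   # VL 6.55, L 3.52, N 2.0
```

The point of the slide survives any reasonable choice of densities: adding operational-defect costs widens the penalty for low-RELY development.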

Conclusions So Far - 2
- Quality is better than free for high-value, long-life systems.
- There is no universal dependability sweet spot: yours will be determined by your value model and the relative contributions of dependability techniques.
- Risk analysis helps answer "How much is enough?"
- COCOMO II provides a schedule-cost-quality tradeoff analysis framework.

Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The COCOMO Suite of Tradeoff Models: COCOMO II, CORADMO, COQUALMO-ODC
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

COCOMO II Book Table of Contents - Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000
1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices: Assumptions, Data Forms, User's Manual, CD Content
CD: video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms

Purpose of COCOMO II: to help people reason about the cost and schedule implications of their software decisions.

Major Decision Situations Helped by COCOMO II
- Software investment decisions: when to develop, reuse, or purchase; what legacy software to modify or phase out
- Setting project budgets and schedules
- Negotiating cost/schedule/performance tradeoffs
- Making software risk management decisions
- Making software improvement decisions: reuse, tools, process maturity, outsourcing

Need to Reengineer COCOMO 81
- New software processes
- New sizing phenomena
- New reuse phenomena
- Need to make decisions based on incomplete information

COCOMO II Model Stages
[Diagram: the Applications Composition, Early Design, and Post-Architecture model stages.]

Early Design and Post-Architecture Model
Effort and schedule are products of environment multipliers and size raised to a process-driven exponent:
PM = A × Size^E × Π_i EM_i, where E = B + 0.01 × Σ_j SF_j; schedule follows as TDEV = C × PM^(D + 0.2 × (E - B))
- Environment (effort multipliers EM): product, platform, people, and project factors
- Size: nonlinear reuse and volatility effects
- Process (scale factors SF): constraint, risk/architecture, team, and maturity factors
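A sketch of the equations above using the published COCOMO II.2000 constants (A = 2.94, B = 0.91, C = 3.67, D = 0.28); the sample scale-factor and multiplier inputs are illustrative nominal-ish ratings, not values from the talk.

```python
from math import prod  # Python 3.8+

# COCOMO II.2000 Post-Architecture constants.
A, B, C, D = 2.94, 0.91, 3.67, 0.28

def cocomo_ii(ksloc, scale_factors, effort_multipliers):
    """Return (effort in person-months, schedule in months).
    scale_factors: the five SF ratings (PREC, FLEX, RESL, TEAM, PMAT);
    effort_multipliers: EM values (RELY, CPLX, ...), 1.0 if nominal."""
    e = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** e * prod(effort_multipliers)
    tdev = C * pm ** (D + 0.2 * (e - B))
    return pm, tdev

# Illustrative: 30 KSLOC, nominal scale factors, RELY = Very High (1.26).
pm, tdev = cocomo_ii(30, [3.72, 3.04, 4.24, 3.29, 4.68], [1.26])
print(f"{pm:.0f} PM, {tdev:.1f} months")  # ~156 PM, ~18.3 months
```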

Nonlinear Reuse Effects
[Chart: relative cost vs. amount of software modified. The usual linear assumption understates cost; data on 2954 NASA modules (Selby, 1988) show steep cost growth for even small amounts of modification.]
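The COCOMO II book captures this nonlinearity in its reuse model; here is a sketch under the book's AAF/AAM formulation, with illustrative default parameter values.

```python
def equivalent_sloc(asloc, dm, cm, im, su=30, unfm=0.4, aa=4):
    """COCOMO II nonlinear reuse model: equivalent new SLOC for adapted code.
    dm/cm/im: % of design/code/integration modified; su: software
    understanding penalty (0-50); unfm: unfamiliarity (0-1);
    aa: assessment and assimilation %. Defaults are illustrative."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im          # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return asloc * aam

# Modifying 10% of the design and 20% of the code of a 20-KSLOC component:
print(equivalent_sloc(20_000, dm=10, cm=20, im=30))  # ~5,512 ESLOC,
# i.e. ~28% of new-build size for an AAF of only 19% - the nonlinearity.
```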

COCOMO II Productivity Ranges
[Bar chart: the productivity range (ratio of maximum to minimum effort impact) of each driver, in decreasing order: Product Complexity (CPLX), Analyst Capability (ACAP), Programmer Capability (PCAP), Time Constraint (TIME), Personnel Continuity (PCON), Required Software Reliability (RELY), Documentation Match to Life Cycle Needs (DOCU), Multi-Site Development (SITE), Applications Experience (AEXP), Platform Volatility (PVOL), Use of Software Tools (TOOL), Storage Constraint (STOR), Process Maturity (PMAT), Language and Tools Experience (LTEX), Required Development Schedule (SCED), Data Base Size (DATA), Platform Experience (PEXP), Architecture and Risk Resolution (RESL), Precedentedness (PREC), Develop for Reuse (RUSE), Team Cohesion (TEAM), and Development Flexibility (FLEX); scale-factor ranges are shown at 10, 100, and 1000 KSLOC.]

COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals, without and with calibration to data source:
             COCOMO81   COCOMOII.1997   COCOMOII.2000
# Projects   63         83              161
Effort       81%        52% / 64%       75% / 80%
Schedule     61%        62% / 72%       81% / 65%
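The accuracy measure quoted here, PRED(0.30), is simple to compute; a sketch with made-up estimate/actual pairs, not the calibration data.

```python
def pred(estimates, actuals, level=0.30):
    """PRED(L): fraction of projects whose estimate falls within
    L (e.g., 30%) of the actual value."""
    hits = sum(abs(e - a) / a <= level for e, a in zip(estimates, actuals))
    return hits / len(actuals)

# Illustrative data: three of four estimates land within 30% of actuals.
print(pred([100, 80, 150, 60], [90, 100, 140, 95]))  # 0.75
```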

COCOMO II RAD Extension (CORADMO)
[Diagram: COCOMO II (all cost drivers except SCED; language level, experience, ...) produces baseline effort and schedule; the COCOMO II phase distributions model (COPSEMO) produces effort and schedule by stage; the RAD extension, driven by RVHL, DPRS, CLAB, RESL, PPOS, and RCAP, produces RAD effort and schedule by phase.]

Effect of RCAP on Cost, Schedule
[Chart: cost and schedule as a function of the RCAP rating.]

Current COQUALMO System
[Diagram: COCOMO II plus COQUALMO. Inputs: software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing). COQUALMO's Defect Introduction Model and Defect Removal Model yield, alongside the COCOMO II software development effort, cost, and schedule estimate, the number of residual defects and the defect density per unit of size.]

Defect Removal Rating Scales (COCOMO II p. 263)
Automated Analysis:
- Very Low: simple compiler syntax checking
- Low: basic compiler capabilities
- Nominal: compiler extension; basic requirements and design consistency
- High: intermediate-level module checking; simple requirements/design
- Very High: more elaborate requirements/design checking; basic distributed processing
- Extra High: formalized specification and verification; advanced distributed processing
Peer Reviews:
- Very Low: no peer review
- Low: ad-hoc informal walk-through
- Nominal: well-defined preparation, review, minimal follow-up
- High: formal review roles, well-trained people, and basic checklists
- Very High: root cause analysis, formal follow-up using historical data
- Extra High: extensive review checklists; statistical control
Execution Testing and Tools:
- Very Low: no testing
- Low: ad-hoc test and debug
- Nominal: basic testing; test criteria based on checklists
- High: well-defined test sequences and a basic test coverage tool system
- Very High: more advanced test tools and preparation; distributed monitoring
- Extra High: highly advanced tools, model-based testing

Defect Removal Estimates - Nominal Defect Introduction Rates
[Chart: delivered defects per KSLOC vs. composite defect removal rating, falling steeply from Very Low to Extra High.]
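A COQUALMO-flavored sketch of how removal profiles drive residual defect density; the introduction rates and removal fractions below are illustrative assumptions, not the calibrated COQUALMO values.

```python
# Residual defects = introduced defects reduced by each removal profile's
# defect-removal fraction (DRF). All numbers here are assumed for illustration.
INTRO_PER_KSLOC = {"rqts": 10, "design": 20, "code": 30}

def residual_defect_density(drfs):
    """drfs: removal fractions for (automated analysis, peer reviews,
    execution testing and tools), each in 0-1."""
    remaining = sum(INTRO_PER_KSLOC.values())  # 60 introduced defects/KSLOC
    for drf in drfs:
        remaining *= (1 - drf)                 # each profile removes a fraction
    return remaining

print(residual_defect_density([0.3, 0.6, 0.7]))  # ~5 delivered defects/KSLOC
```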

COQUALMO-ODC Model Objectives
- Support cost-schedule-quality tradeoff analysis
- Provide a reference for defect monitoring and control
- Evolve to cover all major classes of project, with different defect distributions (e.g., COTS-based)
- Start simple; grow opportunistically

Example of Desired Model - I
[Chart: effort (PM, at $20K/PM) and time (months) vs. the current input, the RELY rating (VL-VH), and the desired inputs, defect density per KSLOC or MTBF (hours), for a given size; details for any given point appear on the next slide.]

Example of Desired Model - II
30 KSLOC; RELY = VH; 75 PM; 12 Mo.; delivered defects = 0.3
Phase:        Rqts.      Design       Code & Unit Test   Integration & Test
Effort (PM), Cost ($K), Schedule (Mo.): ...
Defects in/out/left:
- Total       60/50/10   130/116/24   264/234/30         8/37.7/0.3
- Rqts.       60/50/10   10/16/4      2/5/1              1/2/0
- Design                 120/100/20   10/25/5            2/6.9/0.1
- Timing                 12/6/6       2/4/4              1/4.9/0.1
- Interface              30/25/5      5/9/1              0/1/0
- ...
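The in/out/left bookkeeping in the table is a simple flow: each phase inherits the defects left by the prior phase, adds newly introduced ones, and subtracts those removed. A sketch replaying the requirements-defects row from the table above.

```python
def defect_flow(phases):
    """phases: list of (introduced, removed) per phase.
    Returns (introduced, removed, left) per phase."""
    left = 0
    rows = []
    for introduced, removed in phases:
        left = left + introduced - removed   # carry-over + new - removed
        rows.append((introduced, removed, left))
    return rows

# Rqts. defects: 60 in/50 out (Rqts.), 10/16 (Design), 2/5 (Code), 1/2 (I&T).
for row in defect_flow([(60, 50), (10, 16), (2, 5), (1, 2)]):
    print(row)  # (60, 50, 10), (10, 16, 4), (2, 5, 1), (1, 2, 0)
```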

Current COQUALMO Shortfalls
- One-size-fits-all model: may not fit COTS/Web or embedded applications
- Defect uniformity and independence assumptions: unvalidated hypotheses
- COCOMO II cost-schedule-quality tradeoffs go only to RELY levels, not to delivered defect density or MTBF
- Need for more calibration data: ODC data could extend and strengthen the model

Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The COCOMO Suite of Tradeoff Models: COCOMO II, CORADMO, COQUALMO-ODC
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

The SAIV Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation
4. Architecture and core capabilities determination
5. Incremental development
6. Change and progress monitoring and control

Shared Vision, Expectations Management, and Feature Prioritization
- Use the stakeholder win-win approach.
- Developer win condition: don't overrun the fixed 9-month schedule.
- Clients' win condition: 24 months' worth of features.
- Win-win negotiation: Which features are most critical? COCOMO II: how many features can be built within a 9-month schedule?

COCOMO II Estimate Ranges
[Chart: optimistic-to-pessimistic estimate ranges around the most likely estimate.]
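SAIV's schedule-range estimation step can be sketched by inverting the COCOMO II equations to ask how much software fits a fixed schedule; the exponent and multiplier product below are illustrative assumptions, and the constants are the published COCOMO II.2000 values.

```python
# How many KSLOC fit a fixed 9-month schedule? Invert TDEV = C * PM^(D + 0.2*(E-B))
# for effort, then PM = A * Size^E * (product of EMs) for size.
A, B, C, D = 2.94, 0.91, 3.67, 0.28

def affordable_ksloc(months, e=1.10, em_product=1.0):
    """e (scale exponent) and em_product (product of effort multipliers)
    are assumed values for illustration."""
    pm = (months / C) ** (1 / (D + 0.2 * (e - B)))   # invert the schedule equation
    return (pm / (A * em_product)) ** (1 / e)        # invert the effort equation

print(f"~{affordable_ksloc(9):.1f} KSLOC fits a 9-month schedule")  # ~4.9 KSLOC
```

Running the same inversion with optimistic and pessimistic parameter settings gives the feature-set range to negotiate against the prioritized requirements.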

Software Product Production Function
[Chart: value of the software product to the organization vs. investment, divided into investment, high-payoff, and diminishing-returns segments. Investment: operating system, data management, basic application functions. High payoff: main application functions, humanized I/O. Diminishing returns: secondary application functions, animated graphics, tertiary application functions, natural speech input. Curves show the availability of delivery by time T (90%, 50%) for T = 6 and 12 months.]

Core Capability, Incremental Development, and Coping with Rapid Change
- The core capability is not just the top-priority features: it is a useful end-to-end capability, architected for ease of adding or dropping marginal features.
- Worst case: deliver the core capability in 9 months, with some extra effort.
- Most likely case: finish the core capability in 6-7 months, then add next-priority features.
- Cope with change by monitoring progress and renegotiating plans as appropriate.

SAIV Experience I: USC Digital Library Projects
- Life Cycle Architecture package in a fixed 12 weeks: compatible operational concept, prototypes, requirements, architecture, plans, and feasibility rationale
- Initial Operational Capability in 12 weeks, including a 2-week cold-turkey transition
- Successful on 24 of 26 projects
  - Failure 1: too-ambitious core capability (covering 3 image repositories at once)
  - Failure 2: team disbanded (graduation, summer job pressures)
- 92% success rate vs. the industry's 16% in the Standish Report

Rapid Value(TM) Project Approach (LA SPIN; Copyright © 2001 C-bridge)
[Diagram: a fixed-schedule Define - Design - Develop - Deploy pipeline with "lines of readiness" gates ("Are we ready for the next step?") and iterations focused on scope, listening, and delivery. Define: identify system actors, document business processes, generate use cases, define basic development strategies. Design: object domain modeling; detailed object design and logical data model; object interactions and system services; polish design and build plan. Develop: Build 1, Build 2, stabilization build, release to test. Deploy: beta program, pilot program, production.]

Conclusions: SAIV Critical Success Factors
- Working with stakeholders in advance to achieve a shared product vision and realistic expectations
- Getting clients to develop and maintain prioritized requirements
- Scoping the core capability to fit within the high-payoff segment of the application's production function for the given schedule
- Architecting the system for ease of adding and dropping features
- Disciplined progress monitoring and corrective action to counter schedule threats
The approach also works for Cost as Independent Variable, and for "Cost, Schedule, Quality: Pick All Three" (SCQAIV).

Conclusions
- Future IT systems require new model perspectives: product, process, property, and success models.
- The USC COCOMO and MBASE model families help with tradeoff and decision analysis, integrated product and process development, rapid application development, and process management and improvement.
- The USC-IBM COQUALMO-ODC model is a valuable next step.

Further Information
- V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.
- B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.
- B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.
- R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.
COCOMO II and MBASE items: ... CeBASE items: ...