University of Southern California Center for Software Engineering CSE USC 1 COCOMO Suite Barry Boehm CSCI 510 Fall 2011
University of Southern California Center for Software Engineering CSE USC 2 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details References and further information
University of Southern California Center for Software Engineering CSE USC 3 COCOMO II Overview Software product size estimate Software product, process, computer, and personnel attributes Software reuse, maintenance, and increment parameters Software organization's project data COCOMO Software development and maintenance: Costs (effort) Schedule estimates Distributed by phase, activity, increment COCOMO locally calibrated to organization's data
University of Southern California Center for Software Engineering CSE USC 4 Purpose of COCOMO II To help people reason about the cost and schedule implications of their software decisions –Software investment decisions When to develop, reuse, or purchase What legacy software to modify or phase out –Setting project budgets and schedules –Negotiating cost/schedule/performance tradeoffs –Making software risk management decisions –Making software improvement decisions Reuse, tools, process maturity, outsourcing
University of Southern California Center for Software Engineering CSE USC 5 COCOMO II Model Stages
University of Southern California Center for Software Engineering CSE USC 6 COCOMO II Scope of Outputs Provides the estimated software development effort and schedule for MBASE/RUP –Elaboration –Construction (between the LCO, LCA, and IOC anchor point milestones)
University of Southern California Center for Software Engineering CSE USC 7 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details References and further information
University of Southern California Center for Software Engineering CSE USC 8 Analyze existing literature Step 1 Perform Behavioral analyses Step 2 Identify relative significance Step 3 Perform expert-judgment Delphi assessment, formulate a-priori model Step 4 Gather project data Step 5 Determine Bayesian A-Posteriori model Step 6 Gather more data; refine model Step 7 Concurrency and feedback implied… USC-CSE Modeling Methodology
University of Southern California Center for Software Engineering CSE USC 9 Status of Models
Model | Literature | Behavior | Significant Variables | Delphi | Data, Bayesian | Tool
COCOMO II: **** | >200 | Product
COQUALMO: **** | 6 | Excel
iDAVE: Excel
COPLIMO: Excel
CORADMO: *** | 10 | Excel
COPROMO: **** | Excel
COCOTS: **** | 20 | Excel
COSYSMO: **** | 42 | Excel
COSOSIMO: *** | n/a | Excel
University of Southern California Center for Software Engineering CSE USC 10 General COCOMO Form PM = A * (Size)^B * Π(EM), where Size is ADDITIVE, B is EXPONENTIAL, and EM is MULTIPLICATIVE Where: PM = Person Months A = calibration factor Size = measure(s) of functional size of a software module that has an additive effect on software development effort B = scale factor(s) that have an exponential or nonlinear effect on software development effort EM = effort multipliers that influence software development effort
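For concreteness, a minimal Python sketch of this general form follows; the A, B, and effort-multiplier values in the example are illustrative placeholders, not calibrated COCOMO II settings.

```python
from math import prod

def cocomo_effort(size_ksloc, A, B, effort_multipliers):
    """General COCOMO form: PM = A * Size^B * product(EM).

    size_ksloc         -- additive size measure(s), already summed, in KSLOC
    A                  -- calibration factor
    B                  -- exponential scale factor
    effort_multipliers -- iterable of multiplicative cost-driver ratings
    """
    return A * (size_ksloc ** B) * prod(effort_multipliers)

# Illustrative values only, not a calibrated data set:
pm = cocomo_effort(size_ksloc=100, A=2.94, B=1.10,
                   effort_multipliers=[1.0, 1.17, 0.87])
print(f"Estimated effort: {pm:.1f} person-months")
```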
University of Southern California Center for Software Engineering CSE USC 11 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details References and further information
University of Southern California Center for Software Engineering CSE USC 12 COCOMO Suite: Quantities Estimated
Model | Effort | Effort by Phase | Schedule | Defects | ROI | Improvement Graphs
COCOMO II: XXX
COQUALMO: XXX
iDAVE: X
COPLIMO: XX
CORADMO: XXX
COPROMO: XX X
COCOTS: X
COSYSMO: X
COSOSIMO: X
University of Southern California Center for Software Engineering CSE USC 13 COCOMO Suite: Sizing
Model | SLOC | FP + Lang | Requirements | Interfaces | Scenarios | Algorithms | Components | Complexity | Reuse | Volatility
COCOMO II: Module | XX
CORADMO: XXXX
COQUALMO: XXXX
COSYSMO: XXXXXXX
COSOSIMO: Glue | XXXXXX
COCOTS: Glue | X
University of Southern California Center for Software Engineering CSE USC 14 COCOMO Suite: Phase/Activity Distribution
Model | Inception | Elaboration | Construction | Transition
COCOMO II
COQUALMO
iDAVE
COPLIMO
CORADMO
COPROMO
COCOTS
COSYSMO
COSOSIMO
University of Southern California Center for Software Engineering CSE USC 15 Typical Model Usage
University of Southern California Center for Software Engineering CSE USC 16 High Level Partitioning of Cost Models (figure): software life-cycle activities (Requirements Analysis, Preliminary Design, Detailed Design, Coding, Unit Test, Integration, Software Acceptance Test) partitioned among the models; COSOSIMO covers system-of-systems software architecting and SoS system integration/test, COSYSMO covers system-level activities, and COCOMO II and COCOTS cover software development and integration/test
University of Southern California Center for Software Engineering CSE USC 17 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details References and further information
University of Southern California Center for Software Engineering CSE USC 18 Emerging Extensions COCOMO-Dependent Extensions –COQUALMO: software quality –iDAVE: software dependability –COPLIMO: product line investment –CORADMO: rapid application software development –COPROMO: productivity improvement Emerging Independent Extensions –COCOTS: software commercial off the shelf –COSYSMO: systems engineering –COSOSIMO: systems of systems –Dynamic COCOMO: dynamic vs. static modeling
University of Southern California Center for Software Engineering CSE USC 19 Constructive Quality Model: COQUALMO Predicts the number of residual defects in a software product Enables 'what-if' analyses that demonstrate the impact of –various defect removal techniques –effects of personnel, project, product and platform characteristics on software quality. Provides insights into –Probable ship time –Assessment of payoffs for quality investments –Understanding of interactions amongst quality strategies
University of Southern California Center for Software Engineering CSE USC 20 COQUALMO Operational Concept (diagram): inputs are the software size estimate; software platform, project, product, and personnel attributes; and defect removal profile levels (automation, reviews, testing). COCOMO II feeds the Defect Introduction Model and the Defect Removal Model. Outputs are the software development effort, cost, and schedule estimate; the number of residual defects; and defect density per unit of size.
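A minimal sketch of the operational concept above, assuming illustrative (uncalibrated) defect introduction rates and removal fractions; COQUALMO's actual calibrated values and driver structure are not reproduced here.

```python
def residual_defects(size_ksloc, intro_rates, removal_fractions):
    """Introduce defects by artifact type, then apply the three
    defect-removal profiles (automated analysis, peer reviews,
    execution testing and tools).

    intro_rates       -- dict of defects introduced per KSLOC by artifact type
    removal_fractions -- dict of fraction removed by each removal profile
    """
    introduced = size_ksloc * sum(intro_rates.values())
    remaining = introduced
    for profile, fraction in removal_fractions.items():
        remaining *= (1.0 - fraction)
    return introduced, remaining

# Illustrative, uncalibrated numbers:
introduced, residual = residual_defects(
    size_ksloc=50,
    intro_rates={"requirements": 10, "design": 20, "code": 30},
    removal_fractions={"automated_analysis": 0.4,
                       "peer_reviews": 0.5,
                       "execution_testing": 0.6},
)
print(f"Introduced: {introduced:.0f}, residual: {residual:.0f} "
      f"({residual / 50:.1f} delivered defects/KSLOC)")
```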
University of Southern California Center for Software Engineering CSE USC 21 COQUALMO Defect Removal Rating Scales (Very Low to Extra High)
Automated Analysis: Very Low = simple compiler syntax checking; Low = basic compiler capabilities; Nominal = compiler extensions, basic requirements and design consistency; High = intermediate-level module checking, simple requirements/design; Very High = more elaborate requirements/design checking, basic distributed processing; Extra High = formalized specification and verification, advanced distributed processing
Peer Reviews: Very Low = no peer review; Low = ad-hoc informal walk-through; Nominal = well-defined preparation, review, minimal follow-up; High = formal review roles, well-trained people, and basic checklists; Very High = root cause analysis, formal follow-up, using historical data; Extra High = extensive review checklists, statistical control
Execution Testing and Tools: Very Low = no testing; Low = ad-hoc test and debug; Nominal = basic test, test criteria based on checklists; High = well-defined test sequences and basic test coverage tool system; Very High = more advanced test tools, preparation, distributed monitoring; Extra High = highly advanced tools, model-based test
University of Southern California Center for Software Engineering CSE USC 22 COQUALMO Defect Removal Estimates (chart): delivered defects/KSLOC vs. composite defect removal rating, at nominal defect introduction rates
University of Southern California Center for Software Engineering CSE USC 23 Information Dependability Attribute Value Estimator: iDAVE iDAVE estimates and tracks software dependability Return on Investment (ROI) – Help determine how much dependability is enough – Help analyze and select the most cost-effective combination of software dependability techniques – Use estimates as a basis for tracking performance Based on COCOMO II and COQUALMO cost models and Value Estimating Relationships (VERs) Used to reason about the ROI of software dependability investments Dependability defined as a composite property that integrates such attributes as availability, reliability, safety, security, survivability and maintainability
University of Southern California Center for Software Engineering CSE USC 24 iDAVE Operational Concept
University of Southern California Center for Software Engineering CSE USC 25 Constructive Product Line Investment Model: COPLIMO Supports software product line cost estimation and ROI analysis within the scope of product line life cycle Consists of two components –Product line development cost model –Annualized post-development life cycle extension Based on COCOMO II software cost model –Statistically calibrated to 161 projects, representing 18 diverse organizations
University of Southern California Center for Software Engineering CSE USC 26 COPLIMO Operational Concept COPLIMO For set of products: Average product size (COCOMO II cost drivers) Percent mission- unique, reused-with- modifications, black- box reuse Relative cost of reuse (RCR) and relative cost of writing for reuse (RCWR) factors As functions of # products, # years in life cycle: Non-product line effort Product line investment (effort) Product line savings (ROI)
University of Southern California Center for Software Engineering CSE USC 27 Constructive Rapid Application Development Model: CORADMO Calculates/predicts for smaller, rapid application development projects –Schedule –Personnel –Adjusted effort Allocates effort and schedule to the stages, which are anchored at points in a development life cycle Scope includes inception, elaboration, and construction
University of Southern California Center for Software Engineering CSE USC 28 CORADMO Factors Reuse and Very High Level Languages Development Process Reengineering and Streamlining Collaboration Efficiency Architecture/Risk Resolution Prepositioning Assets RAD Capability and Experience
University of Southern California Center for Software Engineering CSE USC 29 Constructive Productivity Model: COPROMO Determines impact of technology investments on model parameter settings Predicts the most cost effective allocation of investment resources in new technologies intended to improve productivity Uses COCOMO II, COPSEMO, and CORADMO models as assessment framework –Well-calibrated to 161 projects for effort, schedule –Subset of those projects used for current-practice baseline –Extensions for Rapid Application Development formulated
University of Southern California Center for Software Engineering CSE USC 30 Constructive COTS Model: COCOTS Estimates the effort associated with the integration of Commercial-Off-The-Shelf (COTS) software products Scope includes inception, elaboration, and construction Model has four components –Assessment –Tailoring –“Glue” code –System volatility Effort reported by COCOTS is the sum of the efforts from each of the four components Can be used in conjunction with COCOMO II to estimate new software development with COTS integration
University of Southern California Center for Software Engineering CSE USC 31 COCOTS Operational Concept # COTS Classes # Candidates/Class Tailoring Complexity Glue code size & cost drivers COCOMO II application effort (separate from COTS) COTS volatility rework (%) Rework due to COTS requirements changes (%) Rework due to non-COTS requirements changes (%) Effort Assessment COCOTS Tailoring Volatility “Glue” Code
University of Southern California Center for Software Engineering CSE USC 32 COCOMO vs. COCOTS Cost Sources (chart: staffing vs. time)
University of Southern California Center for Software Engineering CSE USC 33 Constructive System Engineering Cost Model: COSYSMO Covers full system engineering lifecycle (maps to ISO/IEC 15288) Estimates standard Systems Engineering WBS tasks (based on EIA/ANSI 632) Developed with USC-CSE Corporate Affiliate sponsorship and INCOSE participation Life cycle stages used in COSYSMO: Conceptualize; Develop; Operational Test & Eval; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle
University of Southern California Center for Software Engineering CSE USC 34 COSYSMO Operational Concept (diagram): size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus 3 volatility factors) and effort multipliers (8 application factors, 6 team factors, schedule driver) feed COSYSMO, which outputs effort; the WBS is guided by EIA/ANSI 632; calibration from historical data
University of Southern California Center for Software Engineering CSE USC 35 COSYSMO Effort Multipliers Application Factors –Requirements understanding –Architecture complexity –Level of service requirements –Migration complexity –Technology Maturity –Documentation Match to Life Cycle Needs –# and Diversity of Installations/Platforms –# of Recursive Levels in the Design Team Factors –Stakeholder team cohesion –Personnel/team capability –Personnel experience/continuity –Process maturity –Multisite coordination –Tool support
University of Southern California Center for Software Engineering CSE USC 36 Constructive System-of-System Cost Model: COSOSIMO Parametric model to estimate the effort associated with the definition and integration of software-intensive “system of systems” components –SoS abstraction –Architecting –Source selection –Systems acquisition –Integration and test –Change management effort Includes at least one size driver and 6 exponential scale factors related to effort Targets input parameters that can be determined in early phases
University of Southern California Center for Software Engineering CSE USC 37 COSOSIMO Operational Concept (diagram): size drivers (interface-related eKSLOC, number of logical interfaces at SoS level, number of operational scenarios, number of components) and exponential scale factors (integration simplicity, integration risk resolution, integration stability, component readiness, integration capability, integration processes) feed COSOSIMO, which outputs SoS definition and integration effort; calibration from historical data
University of Southern California Center for Software Engineering CSE USC 38 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details References and further information
University of Southern California Center for Software Engineering CSE USC 39 Model Unification Main Issues For each individual model as well as the unified model: 1.Objectives & Strategies 2.Inputs/scope of work 3.Output/scope of estimate 4.Assumptions of each model 5.Stakeholders for each model 6.Counting Rules 7.Sponsorship (FCS, Model-Based Acq.) 8.PhD dissertation critical mass 9.Data sources
University of Southern California Center for Software Engineering CSE USC 40 Unification Goals Allow more comprehensive cost exploration with respect to –Development decisions –Investment decisions –Established project budget and schedules –Client negotiations and requested changes –Cost, schedule, performance, and functionality tradeoffs –Risk management decisions –Process improvement decisions Affiliate request: Provide a single unified tool to allow users to –Specify System and software components comprising the software system of interest Composition and characteristics of components –Receive A set of comprehensive outputs for system engineering, software development, and system-of- systems integration Adjusted using the appropriate special-purpose extensions
University of Southern California Center for Software Engineering CSE USC 41 Issue #1: Objectives & Strategies First pass and future enhancements Framework (Goal-Quality-Metric model approach) Restate objectives for existing models –COCOMO II –COCOTS –COSYSMO –COSOSIMO –CORADMO –COQUALMO Develop objectives for unified cost model Operational scenario(s) for each model
University of Southern California Center for Software Engineering CSE USC 42 Issue #2: Inputs/scope of work Need to define on several levels –To determine scope of work to be estimated –To determine system of interest/viewpoint and system component characteristics –To determine specific sub-model inputs Life cycle model Single user interface A single definition for each parameter/driver (e.g., TEAM, PMAT) vs. context-specific definitions for parameters with common names across models Need to determine which “components” can be estimated as relatively independent pieces vs. tightly coupled components
University of Southern California Center for Software Engineering CSE USC 43 Issue #3: Output/scope of estimate Single value for all integrated models (default 152 hours per person- month) –Normalized PM for calibration Backward compatibility to existing models What set of “bins” should be used for initial effort outputs? What additional levels of granularity should be provided? –By phase/stage? –By labor category? –By activities? –Break out by sub-models? –Increments? (i.e., COINCOMO) How will an Integrated Master Schedule be developed? Effort & schedule as a function of risk Projected productivity
University of Southern California Center for Software Engineering CSE USC 44 Issue #4: Assumptions of each model
Model | Life Cycle Stages
COCOMO II
COCOTS
COSYSMO
COSOSIMO
University of Southern California Center for Software Engineering CSE USC 45 Issue #5: Users for each model Acquirers, SW developers, estimators, systems engineers, managers, executives, or accountants who are interested in: –Software development (COCOMO II) –Commercial off the shelf software (COCOTS) –Systems engineering (COSYSMO) –Software quality (COQUALMO) –Software rapid application development (COPSEMO, CORADMO) –Software system of systems integration (COSOSIMO) –ROI/Investment analysis (iDAVE, COPLIMO)
University of Southern California Center for Software Engineering CSE USC 46 Issue #6: Counting Rules & Definitions Inputs –Size drivers (VHLLs, FPs, APs, Use Case Points, KSLOC, REQS, ALG, I/F, SCEN, Components, etc.) –Model inputs (cost drivers, scale factors) Outputs –Effort distributions Phase, activity, or labor categories –Schedule –Defects –$ cost –Risk –Productivity
University of Southern California Center for Software Engineering CSE USC 47 Additional Analysis in Progress Cost Drivers Scale Factors
University of Southern California Center for Software Engineering CSE USC 48 Long Term Vision Unified Interface COSOSIMO COSYSMO COCOMO II/COQUALMO COCOTS COCOMO II extensions: RAD, security; incremental, phase/activity; agile, risk, Monte Carlo; ROI (product line, dependability); maintenance Output Analysis and Report Generation Unified Model
University of Southern California Center for Software Engineering CSE USC 49 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details –COCOTS –COPLIMO –COSYSMO –COSOSIMO References and further information
University of Southern California Center for Software Engineering CSE USC 50 COTS Software Integration Lifecycle 1) Qualify COTS product 2) Perform system requirements 3) Administer COTS software acquisition 4) Prototype the system including COTS software 5) Fully integrate COTS software and interface code 6) Test completed prototype
University of Southern California Center for Software Engineering CSE USC 51 COTS Integration Sources of Effort COTS Assessment (pre- and post- commitment) –Of functionality, performance, interoperability, etc. COTS Tailoring and Tuning –Effects of platform, other COTS products Glue Code Development –Similar to other Cost Xpert estimation Application Volatility Due to COTS –COTS volatility, shortfalls, learning curve Added Application V&V Effort –COTS option and stress testing –Debugging complications, incorrect fixes
University of Southern California Center for Software Engineering CSE USC 52 Traditional vs. COTS Cost Sources Time Staffing 1) COTS Assessment 3) COTS/Application Glue Code Development and Test 2) COTS Tailoring 4) Increased Application Effort due to COTS Volatility LCO/ Reqts. Review Application Code Development LCA/ Design Review IOC/ Beta Test COCOMO II COTS model
University of Southern California Center for Software Engineering CSE USC 53 Current Scope of COTS Model COTS model covers –assessment –tailoring –glue code development and integration –impact of new releases (volatility) It does not cover –cost of re-engineering business processes –vendor management –licenses –training (for COTS integrators or end users) –COTS platform or tool experience or maturity Covered by PLEX, LTEX, PVOL, TOOL environmental factors
University of Southern California Center for Software Engineering CSE USC 54 Assessment Effort Inputs Initial Filtering of COTS products –estimate of the total number of candidate COTS components to be filtered More detailed assessment of specific candidates against attributes that are important –class(es) of COTS components to be assessed –for each class, the number assessed and the attributes considered
University of Southern California Center for Software Engineering CSE USC 55 # COTS Candidates in class filtered Initial Filtering Effort (IFE) = Average Filtering Effort for product class ) ( )( Assessment Submodel Over all classes # COTS Candidates in class detailed assessed Detailed Assessment Effort (DAE) = Average Assessment Effort for product class * ) ( )( Over all classes, by project domain Final Project Assessment Effort (FPAE) = IFE + DAE * Qualified by assessment attributes most associated with that class
University of Southern California Center for Software Engineering CSE USC 56 Assessment Attributes
University of Southern California Center for Software Engineering CSE USC 57 Tailoring Effort Inputs COTS tailoring - activities required to prepare or initialize a component for use in a specific system Tailoring includes –parameter specification –script writing –GUI screen specification –Report specification –Security/Access Protocol initialization and set up For each class of COTS component, –rate the complexity of tailoring for each of the above activities
University of Southern California Center for Software Engineering CSE USC 58 Tailoring Submodel Project Tailoring Effort (PTE) = Σ over all classes, by project domain, of [(# COTS tailored in class) × (average tailoring effort for product class) × TCQ_r,class] where TCQ_r,class = Tailoring Complexity Qualifier, calibrated within a class, for each of five possible ratings from Very Low to Very High, with TCQ_Nominal = 1.0
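A sketch of the tailoring submodel; only TCQ(Nominal) = 1.0 comes from the model, and the other TCQ ratings below are placeholder values standing in for the per-class calibration.

```python
# Hypothetical Tailoring Complexity Qualifiers; only TCQ["N"] = 1.0 is
# specified by the model, the rest would be calibrated within each class.
TCQ = {"VL": 0.7, "L": 0.85, "N": 1.0, "H": 1.2, "VH": 1.5}

def tailoring_effort(classes):
    """COCOTS tailoring submodel (sketch).

    classes -- list of (num_tailored, avg_tailoring_effort, complexity_rating)
               per COTS product class
    """
    return sum(n * avg * TCQ[rating] for n, avg, rating in classes)

pte = tailoring_effort([(3, 0.5, "N"), (1, 2.0, "VH")])
print(f"Tailoring effort: {pte:.2f} PM")
```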
University of Southern California Center for Software Engineering CSE USC 59 Tailoring Complexity Table
University of Southern California Center for Software Engineering CSE USC 60 Glue Code Inputs Definition of glue code: –code needed to facilitate data or information exchange between the COTS component and the system into which it is being integrated –code needed to provide required functionality missing in the COTS component AND which depends on or must interact with the COTS component Estimate of the total delivered lines of glue code Estimate of glue code rework due to COTS volatility or requirements volatility
University of Southern California Center for Software Engineering CSE USC 61 Glue Code Inputs (continued) Integration Personnel –Integrator experience with product (VL - VH) –Integrator personnel capability (VL - VH) –Integrator experience with COTS integration process (L - VH) –Integrator personnel continuity (VL - VH) COTS Component –COTS product maturity (VL - VH) –COTS supplier product extension willingness (L - VH) –COTS product interface complexity (L - VH) –COTS supplier product support (L - VH) –COTS supplier provided training and documentation (VL - VH)
University of Southern California Center for Software Engineering CSE USC 62 Glue Code Inputs (continued) Application/System –Constraints on system/subsystem reliability (L - VH) –Constraints on system/subsystem technical performance (N-VH) –System portability (N - VH) –Application architectural engineering (VL - VH)
University of Southern California Center for Software Engineering CSE USC 63 [(size)(1+breakage)] Total Effort = A B (effort multipliers) Glue Code Submodel A - a linear scaling constant Size - of the glue code in SLOC or FP Breakage - of the glue code due to change in requirements and/or COTS volatility Effort Multipliers - 13 parameters, each with settings ranging VL to VH B - an architectural scale factor with settings VL to VH
University of Southern California Center for Software Engineering CSE USC 64 Glue Code Cost Drivers
University of Southern California Center for Software Engineering CSE USC 65 Volatility Inputs Captures impact of new COTS releases on the custom/new application effort Inputs: –Estimate of new development effort (derived via Cost Xpert - traditional) –Percentage of new development rework due to requirements changes and COTS volatility Note: This submodel is being revised
University of Southern California Center for Software Engineering CSE USC 66 Approximate Model: Detailed Model with Cost Xpert Parameters: BRAK COTS: % application code breakage due to COTS volatility BRAK : % application code breakage otherwise : Cost Xpert scale factor EAF : Effort Adjustment Factor (product of effort multipliers) [ ] BRAK COTS 100 Total Effort = (Application Effort) (EAF) COTS [ ] Total Effort = (Application Effort) ( ) BRAK COTS 1+BRAK (EAF) COTS Volatility Submodel
University of Southern California Center for Software Engineering CSE USC 67 x Total Integration Effort (in Person-Months) = Assessment Effort + Tailoring Effort + Glue Code Effort + Volatility Effort where Assessment Effort = Filtering Effort + Final Selection Effort Total integration Cost = (Total Integration Effort) ($$/Person-Month) Total COTS Integration Cost Estimate
University of Southern California Center for Software Engineering CSE USC 68 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details –COCOTS –COPLIMO –COSYSMO –COSOSIMO References and further information
University of Southern California Center for Software Engineering CSE USC 69 COPLIMO Background Benefits vs. Costs of product line Does product line pay off? Traditional product line cost estimation models mostly underestimate the ROI for product lines by focusing only on development savings –Apply RCWR surcharge to entire product not only to the reused portions –If life cycle costs are considered, high payoff comes from a smaller code base to undergo maintenance COPLIMO life cycle model –Addresses the shortfalls with a representative set of parameters based on experience in aircraft and spacecraft product line domains –Based on COCOMO II parameters calibrated to 161 projects, empirical data on nonlinear reuse effects
University of Southern California Center for Software Engineering CSE USC 70 COPLIMO Model Overview Based on COCOMO II software cost model –Statistically calibrated to 161 projects, representing 18 diverse organizations Based on standard software reuse economic terms –RCWR: Relative Cost of Writing for Reuse –RCR: Relative Cost of Reuse Avoids investment overestimation, savings underestimation –Avoids RCWR for non-reused components –Includes savings from smaller life-cycle code base Provides experience-based default parameter values Simple Excel spreadsheet model –Easy to modify, extend, interoperate
University of Southern California Center for Software Engineering CSE USC 71 COPLIMO - RCWR Development for Reuse (RUSE) – In COCOMO II database, 11 out of 161 projects rated as VH for RUSE, and 1 rated as XH – Productivity Range of RUSE Highest rating / Lowest rating = 1.24/0.95 = 1.31 And two other contributing variables –Required Reliability (RELY): –Degree of Documentation (DOCU):
University of Southern California Center for Software Engineering CSE USC 72 COPLIMO – RCWR (Cont.) Required Reliability (RELY) Constraints: At least Nominal for Nominal and High RUSE ratings, at least High for Very High and Extra High RUSE ratings Degree of Documentation (DOCU) Constraint: No more than one level below the RUSE rating
University of Southern California Center for Software Engineering CSE USC 73 COPLIMO – RCR Reused, or Black Box (unmodified code) RCR model –Assessment and Assimilation (AA) factor Adapted, or White Box (modified code) RCR model –AA –Non-Linear Model (AAM) Figure 1, Nonlinear Reuse Effects (chart): relative cost (AAM) vs. relative modification of size (AAF); best case (AA = 0, SU = 10, UNFM = 0) and worst case (AA = 8, SU = 50, UNFM = 1) curves, compared with Selby data and Selby data summary
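The nonlinearity in Figure 1 reflects the COCOMO II Adaptation Adjustment Multiplier; the sketch below follows that relation as commonly stated, with the AA, SU, UNFM, and modification percentages treated as user-supplied assumptions.

```python
def aam(aa, su, unfm, dm, cm, im):
    """COCOMO II Adaptation Adjustment Multiplier (sketch).

    aa         -- assessment & assimilation increment (0..8)
    su         -- software understanding penalty (10..50), relevant when code is modified
    unfm       -- programmer unfamiliarity with the reused software (0..1)
    dm, cm, im -- percent design, code, and integration modified
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im   # Adaptation Adjustment Factor
    if aaf <= 50:
        return (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    return (aa + aaf + su * unfm) / 100

# Black-box reuse (best case in the figure): no modification, no unfamiliarity
print(aam(aa=0, su=10, unfm=0, dm=0, cm=0, im=0))    # -> 0.0
# Heavily adapted code (worst case): AAM can exceed the cost of new code
print(aam(aa=8, su=50, unfm=1, dm=50, cm=50, im=50))
```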
University of Southern California Center for Software Engineering CSE USC 74 Basic COPLIMO – Development Cost Model (1) Simplifying assumptions about uniformity and stability –Every product roughly the same size (PSIZE) –Roughly the same fractions of product-specific (PFRAC), adapted (AFRAC), and reused (RFRAC) software Inputs and outputs For current set of similar products, As functions of # products, Basic COPLIMO Average product size, productivity Percent product- specific, adapted, reused RCR, RCWR factors Non-product line effort Product line investment, effort Product line savings, ROI
University of Southern California Center for Software Engineering CSE USC 75 Basic COPLIMO – Development Cost Model (2) RCWR: –RCWR = RUSE × DOCU × RELY 1-product development effort: –Non-PL effort for developing N similar products: PM_NR(N) = N · A · (PSIZE)^B · Π(EM), where PSIZE is the general software product size, A and B are the COCOMO II calibration coefficient and scale factor, and Π(EM) is the product of the effort multipliers for the COCOMO II cost drivers –PL effort (the first product): PM_R(1) = PM_NR(1) × [PFRAC + RCWR × (AFRAC + RFRAC)]; the RCR parameters apply to the adapted and reused fractions Note: RCWR is not applied to the non-reused portion, which is where many other models overestimate
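A sketch of these basic development cost relations in Python; the A, B, RCWR component ratings, and fraction values are illustrative assumptions rather than COPLIMO defaults.

```python
from math import prod

def pm_nonreuse(n_products, psize, A, B, effort_multipliers):
    """Effort to build N similar products with no product-line reuse:
    PM_NR(N) = N * A * PSIZE^B * product(EM)."""
    return n_products * A * psize ** B * prod(effort_multipliers)

def pm_first_product(psize, pfrac, afrac, rfrac, rcwr, A, B, effort_multipliers):
    """Effort for the first product-line product:
    PM_R(1) = PM_NR(1) * [PFRAC + RCWR * (AFRAC + RFRAC)].
    RCWR is applied only to the adapted and reused fractions."""
    base = pm_nonreuse(1, psize, A, B, effort_multipliers)
    return base * (pfrac + rcwr * (afrac + rfrac))

# Illustrative, uncalibrated inputs:
A, B, em = 2.94, 1.10, [1.0]
rcwr = 1.24 * 1.0 * 1.10          # RUSE * DOCU * RELY, placeholder ratings
print(pm_nonreuse(4, psize=50, A=A, B=B, effort_multipliers=em))
print(pm_first_product(psize=50, pfrac=0.4, afrac=0.3, rfrac=0.3,
                       rcwr=rcwr, A=A, B=B, effort_multipliers=em))
```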
University of Southern California Center for Software Engineering CSE USC 76
University of Southern California Center for Software Engineering CSE USC 77 Basic COPLIMO – Annualized Life Cycle Cost Model Annual Change Traffic (ACT) –Relative fraction of a product’s software that is modified per year –Simplifying assumption: Constant-ACT Life cycle effort without reuse –N complete products undergo maintenance Life cycle effort with reuse –PFRAC: maintenance for N instances –RFRAC: maintenance for 1 instance –AFRAC: maintenance for 1 instance and N-1 variants
University of Southern California Center for Software Engineering CSE USC 78
University of Southern California Center for Software Engineering CSE USC 79
University of Southern California Center for Software Engineering CSE USC 80 Discussions Software product line payoffs are significant, esp. across the life cycle This does not mean any attempt at product line reuse will generate large savings Challenges: –Technical: Domain engineering and product line architecting –Management and Culture: People unwilling to cooperate “Not invented here” attitudes Success factor: empowered product line manager
University of Southern California Center for Software Engineering CSE USC 81 Conclusions Software product line payoffs are significant, esp. across the life cycle COPLIMO avoids investment overestimation & savings underestimation COPLIMO helps to determine whether and when it pays to launch a product line COPLIMO enables assessment of situation-dependencies, hence leading to better product line decisions Future work: Support for more sensitivity analysis Model refinement and calibration Integration with other COCOMO II family models, such as COCOTS
University of Southern California Center for Software Engineering CSE USC 82 COPLIMO Backup Charts
University of Southern California Center for Software Engineering CSE USC 83 COPLIMO – RCR Reused, or Black Box (unmodified code) RCR model –Assessment and Assimilation (AA) factor Adapted, or White Box (modified code) RCR model –AA –Non-Linear Model
University of Southern California Center for Software Engineering CSE USC 84 Guidelines for Quantifying Adapted Software
University of Southern California Center for Software Engineering CSE USC 85 Basic COPLIMO – Development Cost Model (3) Determining RCR –Equiv. size of product-specific portion: –Equiv. size of reused portion: –Equiv. size of adapted portion: –Total EKSLOC: –Effort: PM_R(N) = N · A · (EKSIZE)^B · Π(EM) –ROI = (PL Effort Savings for K products - PL Reuse Investment) / PL Reuse Investment
University of Southern California Center for Software Engineering CSE USC 86 Basic COPLIMO – Annualized Life Cycle Cost Model (1) Annual Change Traffic (ACT) –Relative fraction of a product's software that is modified per year Life cycle effort without reuse –Annual maintained software –L times maintenance effort Life cycle effort with reuse –Three categories of annual maintenance and AMSIZE
University of Southern California Center for Software Engineering CSE USC 87 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details –COCOTS –COPLIMO –COSYSMO –COSOSIMO References and further information
University of Southern California Center for Software Engineering CSE USC 88 COSYSMO Introduction Covers full system engineering lifecycle (maps to ISO/IEC 15288) Estimates standard Systems Engineering WBS tasks (based on EIA/ANSI 632) Developed with USC-CSE Corporate Affiliate sponsorship and INCOSE participation Life cycle stages used in COSYSMO: Conceptualize; Develop; Operational Test & Eval; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle
University of Southern California Center for Software Engineering CSE USC 89 How is Systems Engineering Defined? EIA/ANSI 632 Processes for Engineering a System: Acquisition and Supply –Supply Process –Acquisition Process Technical Management –Planning Process –Assessment Process –Control Process System Design –Requirements Definition Process –Solution Definition Process Product Realization –Implementation Process –Transition to Use Process Technical Evaluation –Systems Analysis Process –Requirements Validation Process –System Verification Process –End Products Validation Process
University of Southern California Center for Software Engineering CSE USC 90 COSYSMO Operational Concept (diagram): size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus 3 adjustment factors) and effort multipliers (8 application factors, 6 team factors) feed COSYSMO, which outputs effort; calibration from historical data
University of Southern California Center for Software Engineering CSE USC 91 Where: PM NS = effort in Person Months (Nominal Schedule) A = calibration constant derived from historical project data k = {REQ, IF, ALG, SCN} w x = weight for “easy”, “nominal”, or “difficult” size driver = quantity of “k” size driver E = represents diseconomy of scale (currently equals 1) EM = effort multiplier for the j th cost driver. The geometric product results in an overall effort adjustment factor to the nominal effort. Model Form
University of Southern California Center for Software Engineering CSE USC Cost Drivers (Effort Multipliers) 1.Requirements understanding 2.Architecture understanding 3.Level of service requirements 4.Migration complexity 5.Technology Maturity 6.Documentation Match to Life Cycle Needs 7.# and Diversity of Installations/Platforms 8.# of Recursive Levels in the Design Application Factors (8)
University of Southern California Center for Software Engineering CSE USC Cost Drivers (continued) 1.Stakeholder team cohesion 2.Personnel/team capability 3.Personnel experience/continuity 4.Process maturity 5.Multisite coordination 6.Tool support Team Factors (6)
University of Southern California Center for Software Engineering CSE USC 94 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details –COCOTS –COPLIMO –COSYSMO –COSOSIMO References and further information
University of Southern California Center for Software Engineering CSE USC 95 How Much Effort to Integrate a System of Systems? Systems developed by system contractors –Total effort 3000 person-years System of systems integration functions –SoS abstraction, architecting, source selection, systems acquisition, integration, test, change management effort How much to budget for integration? What factors make budget higher or lower? How to develop and validate an estimation model? System of Systems ? person-years (PY) Sensing 500 PY Vehicles 500 PY Common 400 PY Infrastructure 600 PY Command & Control 1000 PY
University of Southern California Center for Software Engineering CSE USC 96 Constructive System-of-System Integration Cost Model (COSOSIMO) Parametric model to estimate the effort associated with the definition and integration of software-intensive “system of systems” components Includes at least one size driver and 6 exponential scale factors related to effort Targets input parameters that can be determined in early phases Goal is to have zero overlap with COCOMO II and COSYSMO
University of Southern California Center for Software Engineering CSE USC 97 Size Drivers Exponential Scale Factors SoS Definition and Integration Effort Calibration Interface-related eKSLOC Number of logical interfaces at SoS level Number of components Number of operational scenarios Integration simplicity Integration risk resolution Integration stability Component readiness Integration capability Integration processes COSOSIMO Operational Concept COSOSIMO Each size driver weighted by Complexity Volatility Degree of COTS/reuse
University of Southern California Center for Software Engineering CSE USC 98 COSOSIMO Model Equations Two-level model that first determines integration effort for the first-level subsystems, then, using subsystem integration effort and SoS characteristics, determines SoS integration effort: Level 1: IPM(S_i) = A_i · [Σ_{j=1..n_i} Size(S_ij)]^B_i Level 0: IPM(SoS) = A_0 · [Σ_{i=1..m} IPM(S_i)]^B_0 (hierarchy diagram: the SoS at Level 0; subsystems S_1 … S_m at Level 1; components S_i1 … S_in beneath each subsystem)
University of Southern California Center for Software Engineering CSE USC 99 COSOSIMO Model Parameters IPM = integration effort in Person Months S_i = the i-th subsystem within the SoS A = constant derived from historical project data Size = determined by computing the weighted average of the size driver(s) n_i = number of Subsystem level 2 components comprising the i-th subsystem m = number of Subsystem level 1 components comprising the SoS B_i = effort exponent for the i-th subsystem based on the subsystem's 6 exponential scale factors; the sum of the scale factors results in an overall exponential effort adjustment factor to the nominal effort B_0 = effort exponent for the SoS based on the SoS' 6 exponential scale factors; the sum of the scale factors results in an overall exponential effort adjustment factor to the nominal effort
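A sketch of the two-level equations using these parameters; all numeric values are illustrative, and the A and B constants would come from calibration and the six exponential scale factors.

```python
def subsystem_ipm(component_sizes, A_i, B_i):
    """Level 1: IPM(S_i) = A_i * [sum of component sizes S_ij, j = 1..n_i]^B_i."""
    return A_i * sum(component_sizes) ** B_i

def sos_ipm(subsystem_ipms, A_0, B_0):
    """Level 0: IPM(SoS) = A_0 * [sum of IPM(S_i), i = 1..m]^B_0."""
    return A_0 * sum(subsystem_ipms) ** B_0

# Illustrative values only:
level1 = [subsystem_ipm([10, 6, 4], A_i=1.0, B_i=1.10),
          subsystem_ipm([8, 12], A_i=1.0, B_i=1.15)]
print(sos_ipm(level1, A_0=1.0, B_0=1.20))
```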
University of Southern California Center for Software Engineering CSE USC 100 Agenda COCOMO II refresher Modeling methodology and model status Suite overview Emerging extensions Model unification Addendum: selected model details –COCOTS –COPLIMO –COSYSMO –COSOSIMO References and further information
University of Southern California Center for Software Engineering CSE USC 101 References Abts, C., Extending the COCOMO II Software Cost Model to Estimate Effort and Schedule for Software Systems Using Commercial-off-the-Shelf (COTS) Software Components: The COCOTS Model, USC PhD dissertation, May 2004. Boehm, B., Abts, C., Brown, W., Chulani, S., Clark, B., Horowitz, E., Madachy, R., Reifer, D., Steece, B., Software Cost Estimation with COCOMO II, Prentice-Hall, 2000. Chulani, S., "Bayesian Analysis of Software Cost and Quality Models," USC PhD dissertation, April. Clark, B., "Early COCOTS," September. Lane, J., "Constructive Cost Model for System-of-System Integration," 3rd ACM-IEEE International Symposium on Empirical Software Engineering, Redondo Beach, CA, August 2004. Valerdi, R., Boehm, B., Reifer, D., "COSYSMO: A Constructive Systems Engineering Cost Model Coming of Age," Proceedings, 13th Annual INCOSE Symposium, Crystal City, VA, July. Boehm, B., Valerdi, R., Lane, J., Brown, W., "COCOMO Suite Methodology and Evolution," CrossTalk, 2005. Yang, Y., Boehm, B., Madachy, R., "COPLIMO: A Product-Line Investment Analysis Model," Proceedings of the Eighteenth International Forum on COCOMO and Software Cost Modeling, USC, Los Angeles, CA, October 2003.
University of Southern California Center for Software Engineering CSE USC 102 Further Information Main COCOMO website at USC: COCOMO information at USC: (213) COCOMO cocomo-