Domain-Driven Software Cost Estimation
Wilson Rosa (Air Force Cost Analysis Agency), Barry Boehm (USC), Brad Clark (USC), Thomas Tan (USC), Ray Madachy (Naval Postgraduate School)
University of Southern California, Center for Systems and Software Engineering
27th International Forum on COCOMO® and Systems/Software Cost Modeling, October 16, 2012

This material is based upon work supported, in whole or in part, by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171. The SERC is a federally funded University Affiliated Research Center (UARC) managed by Stevens Institute of Technology, consisting of a collaborative network of over 20 universities. More information is available at www.SERCuarc.org
Research Objectives
Make collected data useful to oversight and management entities:
– Provide guidance on how to condition data to address challenges
– Segment data into different Application Domains and Operating Environments
– Analyze data for simple Cost Estimating Relationships (CER) and Schedule-Cost Estimating Relationships (SCER) within each domain
– Develop rules-of-thumb for missing data
Data preparation and analysis: data records for one domain are fit to a domain CER/SER of the form Cost (Effort) = a * Size^b and Schedule = a * Size^b * Staff^c.
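For concreteness, a minimal sketch (not from the presentation) of the two generic model forms above as Python functions; the coefficient values passed in the example calls are invented placeholders, not fitted values from the study.

```python
# Generic domain CER/SER forms from this slide.
# Coefficients a, b, c in the example calls are invented placeholders.

def cost_cer(size_kesloc: float, a: float, b: float) -> float:
    """Cost (Effort) = a * Size^b, effort in person-months."""
    return a * size_kesloc ** b

def schedule_ser(size_kesloc: float, staff: float, a: float, b: float, c: float) -> float:
    """Schedule = a * Size^b * Staff^c (c is typically negative)."""
    return a * size_kesloc ** b * staff ** c

print(cost_cer(50.0, a=3.0, b=1.2))                         # effort for a 50-KESLOC build
print(schedule_ser(50.0, staff=10.0, a=4.0, b=0.9, c=-0.7))  # schedule for the same build
```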
Stakeholder Community
Research is collaborative across heterogeneous stakeholder communities, who have helped us refine our data definition framework and taxonomy and have provided data and funding.
The project has evolved into a Joint Government Software Study.
(Slide lists funding sources and data sources.)
Topics
Data Preparation Workflow
– Data Segmentation
Analysis Workflow
Software Productivity Benchmarks
Cost Estimating Relationships
Schedule Estimating Relationships
Conclusion
Future Work
Data Preparation
Current Dataset
Multiple sources, multiple data formats (SRDR, SEER, COCOMO)
SRDR (377 records) + Other (143 records) = 522 total records
The Need for Data Preparation
Issues found in the dataset:
– Inadequate information on modified code (size provided)
– Inadequate information on size change or growth
– Size measured inconsistently
– Inadequate information on average staffing or peak staffing
– Inadequate information on personnel experience
– Inaccurate effort data in multi-build components
– Missing effort data
– Replicated duration (start and end dates) across components
– Inadequate information on schedule compression
– Missing schedule data
– No quality data
Data Preparation Workflow
Start with SRDR submissions, inspect each data point, correct missing or questionable data, determine data quality levels, normalize the data, and segment the data.
Data points with no resolution are excluded from analysis.
Segment Data by Operating Environments (OE)
Segment Data by Productivity Type (PT)
Different productivities have been observed for different software application types. The SRDR dataset was segmented into 14 productivity types to increase the accuracy of estimating cost and schedule:
1. Sensor Control and Signal Processing (SCP)
2. Vehicle Control (VC)
3. Real Time Embedded (RTE)
4. Vehicle Payload (VP)
5. Mission Processing (MP)
6. System Software (SS)
7. Telecommunications (TEL)
8. Process Control (PC)
9. Scientific Systems (SCI)
10. Planning Systems (PLN)
11. Training (TRN)
12. Test Software (TST)
13. Software Tools (TUL)
14. Intelligence & Information Systems (IIS)
Example: Finding Productivity Type
Finding the productivity type (PT) using the Aircraft MIL-STD-881 WBS: the highest-level element represents the environment. In the MAV environment there is the Avionics subsystem with the Fire Control sub-subsystem and its sensor, navigation, air data, display, bombing computer, and safety domains. Each domain has an associated productivity type.

Env (Level 1) | Subsystem (Level 2) | Sub-subsystem (Level 3) | Domain (Level 4) | PT
MAV | Avionics | Fire Control | Search, target, tracking sensors | SCP
 | | | Self-contained navigation | RTE
 | | | Self-contained air data systems | RTE
 | | | Displays, scopes, or sights | RTE
 | | | Bombing computer | MP
 | | | Safety devices | RTE
 | | Data Display and Controls | Multi-function display | RTE
 | | | Control display units | RTE
 | | | Display processors | MP
 | | | On-board mission planning | TRN
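A small illustrative sketch of how a WBS-domain-to-PT mapping like the one in this table could be coded. The dictionary entries mirror the rows above; the function name and the "UNASSIGNED" fallback are our own assumptions.

```python
# Lookup from Level-4 WBS domain name to productivity type (PT),
# using the rows from the example table above.

DOMAIN_TO_PT = {
    "Search, target, tracking sensors": "SCP",
    "Self-contained navigation": "RTE",
    "Self-contained air data systems": "RTE",
    "Displays, scopes, or sights": "RTE",
    "Bombing computer": "MP",
    "Safety devices": "RTE",
    "Multi-function display": "RTE",
    "Control display units": "RTE",
    "Display processors": "MP",
    "On-board mission planning": "TRN",
}

def productivity_type(domain: str) -> str:
    """Return the PT for a WBS domain, or a placeholder if unmapped."""
    return DOMAIN_TO_PT.get(domain, "UNASSIGNED")

print(productivity_type("Bombing computer"))  # -> MP
```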
Operating Environment & Productivity Type
(Matrix of the 14 productivity types as rows against the operating environments GSF, GSM, GVM, GVU, MVM, MVU, AVM, AVU, OVU, SVM, and SVU as columns; individual cell markings are not reproduced here.)
When the dataset is segmented by Productivity Type and Operating Environment, the impact accounted for by many COCOMO II model drivers is considered.
Data Analysis
Analysis Workflow
Prepared, normalized & segmented data → Derive CER model form → Derive final CER & reference data subset → Derive SCER → Publish SCER.
Along the way, publish productivity benchmarks by productivity type & size group, and publish CER results.
CER: Cost Estimating Relationship; PR: Productivity Ratio; SER: Schedule Estimating Relationship; SCER: Schedule Compression / Expansion Relationship
Software Productivity Benchmarks
Productivity-based CER. Software productivity refers to the ability of an organization to generate outputs using the resources it currently has as inputs. Inputs typically include facilities, people, experience, processes, equipment, and tools. Outputs include software applications and the documentation used to describe them.
The metric used to express software productivity is thousands of equivalent source lines of code (ESLOC) per person-month (PM) of effort. While many other measures exist, ESLOC/PM is used because most of the data collected by the Department of Defense (DoD) on past projects is captured with these two measures. While controversy exists over whether ESLOC/PM is a good measure, consistent use of this metric (see Metric Definitions) provides for meaningful comparisons of productivity.
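A minimal sketch of how the benchmark statistics reported on the following slides (mean, standard deviation, and CV of ESLOC/PM) could be computed; the (ESLOC, person-month) records below are invented sample values, not SRDR data.

```python
import statistics

# Hypothetical (ESLOC, person-months) pairs -- illustration only.
records = [
    (120_000, 850),
    (45_000, 400),
    (210_000, 1_900),
    (80_000, 610),
]

productivities = [esloc / pm for esloc, pm in records]  # ESLOC per person-month
mean = statistics.mean(productivities)
stdev = statistics.stdev(productivities)
cv = stdev / mean                                       # coefficient of variation

print(f"mean={mean:.0f} ESLOC/PM  stdev={stdev:.0f}  CV={cv:.0%}")
```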
Software Productivity Benchmarks
Benchmarks by PT, across all operating environments** (Preliminary Results – More Records to be added)

PT | MIN (ESLOC/PM) | MEAN (ESLOC/PM) | MAX (ESLOC/PM) | Obs. | Std. Dev. | CV | KESLOC MIN | KESLOC MAX
SCP | 10 | 50 | 80 | 38 | 19 | 39% | 1 | 162
VP | 28 | 82 | 202 | 16 | 43 | 52% | 5 | 120
RTE | 33 | 136 | 443 | 52 | 73 | 54% | 1 | 167
MP | 34 | 189 | 717 | 47 | 110 | 58% | 1 | 207
SCI | 9 | 221 | 431 | 39 | 119 | 54% | 1 | 171
SYS | 61 | 225 | 421 | 60 | 78 | 35% | 2 | 215
IIS | 169 | 442 | 1039 | 36 | 192 | 43% | 1 | 180

** Operating environments included in the analysis: Ground, Surface Vehicles, Sea Systems, Aircraft, Missile / Ordnance (M/O), Spacecraft
Software Productivity Benchmarks
Benchmarks by PT, Ground System Manned (GSM) only (Preliminary Results – More Records to be added)

PT | OE | MIN (ESLOC/PM) | MEAN (ESLOC/PM) | MAX (ESLOC/PM) | Obs. | Std. Dev. | CV | KESLOC MIN | KESLOC MAX
SCP | GSM | 27 | 56 | 80 | 13 | 17 | 30% | 1 | 76
RTE | GSM | 51 | 129 | 239 | 22 | 46 | 36% | 9 | 89
MP | GSM | 87 | 162 | 243 | 6 | 52 | 32% | 15 | 91
SYS | GSM | 115 | 240 | 421 | 28 | 64 | 26% | 5 | 215
SCI | GSM | 9 | 243 | 410 | 24 | 108 | 44% | 5 | 171
IIS | GSM | 236 | 376 | 581 | 23 | 85 | 23% | 15 | 180

CV: Coefficient of Variation; ESLOC: Equivalent SLOC; KESLOC: Equivalent SLOC in Thousands; MAD: Mean Absolute Deviation; MAX: Maximum; MIN: Minimum; PM: Effort in Person-Months; PT: Productivity Type; OE: Operating Environment
Cost Estimating Relationships
Preliminary Results – More Records to be added
CER Model Forms
Effort = a * Size
Effort = a * Size + b
Effort = a * Size^b + c
Effort = a * ln(Size) + b
Effort = a * Size^b * Duration^c
Effort = a * Size^b * c^(1-n)
(Annotated terms: production cost (cost/unit), scaling factor, % adjustment factor.)
Log-log transform: ln(Effort) = b0 + b1*ln(Size) + b2*ln(c1) + b3*ln(c2) + …
Anti-log transform: Effort = e^b0 * Size^b1 * c1^b2 * c2^b3 * …
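A minimal sketch of the log-log / anti-log procedure named above, assuming a single size driver: fit ln(Effort) = b0 + b1*ln(Size) by ordinary least squares, then back-transform to the multiplicative form Effort = e^b0 * Size^b1. The data points are invented placeholders.

```python
import numpy as np

# Invented (KESLOC, person-months) data points -- placeholders only.
kesloc = np.array([5.0, 12.0, 30.0, 75.0, 160.0])
pm     = np.array([40.0, 110.0, 310.0, 900.0, 2100.0])

# Fit ln(PM) = b0 + b1 * ln(KESLOC) by ordinary least squares.
b1, b0 = np.polyfit(np.log(kesloc), np.log(pm), deg=1)  # slope, intercept

# Anti-log transform back to the multiplicative CER form.
a = np.exp(b0)
print(f"CER: PM = {a:.3f} * KESLOC^{b1:.3f}")

predicted_pm = a * kesloc ** b1  # apply the fitted CER to the size values
```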
Software CERs by Productivity Type (PT)
CERs by PT, across all operating environments** (Preliminary Results – More Records to be added)

PT | Equation Form | Obs. | R2 (adj) | MAD | PRED(30) | KESLOC MIN | KESLOC MAX
IIS | PM = 1.266 * KESLOC^1.179 | 37 | 90% | 35% | 65 | 1 | 180
MP | PM = 3.477 * KESLOC^1.172 | 48 | 88% | 49% | 58 | 1 | 207
RTE | PM = 34.32 + KESLOC^1.515 | 52 | 68% | 61% | 46 | 1 | 167
SCI | PM = 21.09 + KESLOC^1.356 | 39 | 61% | 65% | 18 | 1 | 171
SCP | PM = 74.37 + KESLOC^1.714 | 36 | 67% | 69% | 31 | 1 | 162
SYS | PM = 16.01 + KESLOC^1.369 | 60 | 85% | 37% | 53 | 2 | 215
VP | PM = 3.153 * KESLOC^1.382 | 16 | 86% | 27% | 50 | 5 | 120

** Operating environments included in the analysis: Ground, Surface Vehicles, Sea Systems, Aircraft, Missile / Ordnance (M/O), Spacecraft
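To show how the table is meant to be read, a short sketch applying two of the preliminary CERs above (IIS and RTE) to a hypothetical component; the 50-KESLOC size is invented, and the outputs are effort estimates in person-months.

```python
# Preliminary CER equation forms copied from the table above
# (IIS and RTE, all operating environments).

def pm_iis(kesloc: float) -> float:
    return 1.266 * kesloc ** 1.179   # IIS: PM = 1.266 * KESLOC^1.179

def pm_rte(kesloc: float) -> float:
    return 34.32 + kesloc ** 1.515   # RTE: PM = 34.32 + KESLOC^1.515

size_kesloc = 50.0  # hypothetical component size
print(f"IIS: ~{pm_iis(size_kesloc):.0f} person-months")
print(f"RTE: ~{pm_rte(size_kesloc):.0f} person-months")
```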
Software CERs for Aerial Vehicle Manned (AVM)
CERs by Productivity Type, AVM only (Preliminary Results – More Records to be added)

PT | OE | Equation Form | Obs. | R2 (adj) | MAD | PRED(30) | KESLOC MIN | KESLOC MAX
MP | MAV | PM = 3.098 * KESLOC^1.236 | 31 | 88% | 50% | 59 | 1 | 207
RTE | MAV | PM = 5.611 * KESLOC^1.126 | 9 | 89% | 50% | 33 | 1 | 167
SCP | MAV | PM = 115.8 + KESLOC^1.614 | 8 | 88% | 27% | 62 | 6 | 162

CERs: Cost Estimating Relationships; ESLOC: Equivalent SLOC; KESLOC: Equivalent SLOC in Thousands; MAD: Mean Absolute Deviation; MAX: Maximum; MIN: Minimum; PM: Effort in Person-Months; PRED: Prediction (Level); PT: Productivity Type; OE: Operating Environment
Software CERs for Ground Systems Manned (GSM)
CERs by Productivity Type, GSM only (Preliminary Results – More Records to be added)

PT | OE | Equation Form | Obs. | R2 (adj) | MAD | PRED(30) | KESLOC MIN | KESLOC MAX
IIS | MGS | PM = 30.83 + 1.381 * KESLOC^1.103 | 23 | – | 16% | 91 | 15 | 180
MP | MGS | PM = 3.201 * KESLOC^1.188 | 6 | 86% | 24% | 83 | 15 | 91
RTE | MGS | PM = 84.42 + KESLOC^1.451 | 22 | – | 24% | 73 | 9 | 89
SCI | MGS | PM = 34.26 + KESLOC^1.286 | 24 | – | 37% | 56 | 5 | 171
SCP | MGS | PM = 135.5 + KESLOC^1.597 | 13 | – | 39% | 31 | 1 | 76
SYS | MGS | PM = 20.86 + 2.347 * KESLOC^1.115 | 28 | – | 19% | 82 | 5 | 215

CERs: Cost Estimating Relationships; ESLOC: Equivalent SLOC; KESLOC: Equivalent SLOC in Thousands; MAD: Mean Absolute Deviation; MAX: Maximum; MIN: Minimum; PM: Effort in Person-Months; PT: Productivity Type; OE: Operating Environment
Software CERs for Space Vehicle Unmanned (SVU)
CERs by Productivity Type (PT), SVU only (Preliminary Results – More Records to be added)

PT | OE | Equation Form | Obs. | R2 (adj) | MAD | PRED(30) | KESLOC MIN | KESLOC MAX
VP | SVU | PM = 3.153 * KESLOC^1.382 | 16 | 86% | 27% | 50 | 5 | 120

CERs: Cost Estimating Relationships; ESLOC: Equivalent SLOC; KESLOC: Equivalent SLOC in Thousands; MAD: Mean Absolute Deviation; MAX: Maximum; MIN: Minimum; PM: Effort in Person-Months; PRED: Prediction (Level); PT: Productivity Type; OE: Operating Environment
Schedule Estimating Relationships
Preliminary Results – More Records to be added
Schedule Estimation Relationships (SERs)
SERs by Productivity Type (PT), across operating environments** (Preliminary Results – More Records to be added)

PT | Equation Form | Obs. | R2 (adj) | MAD | PRED(30) | KESLOC MIN | KESLOC MAX
IIS | TDEV = 3.176 * KESLOC^0.7209 / FTE^0.4476 | 35 | 65 | 25 | 68 | 1 | 180
MP | TDEV = 3.945 * KESLOC^0.968 / FTE^0.7505 | 43 | 77 | 39 | 52 | 1 | 207
RTE | TDEV = 11.69 * KESLOC^0.7982 / FTE^0.8256 | 49 | 70 | 36 | 55 | 1 | 167
SYS | TDEV = 5.781 * KESLOC^0.8272 / FTE^0.7682 | 56 | 71 | 27 | 62 | 2 | 215
SCP | TDEV = 34.76 * KESLOC^0.5309 / FTE^0.5799 | 35 | 62 | 26 | 64 | 1 | 165

** Operating environments included in the analysis: Ground, Surface Vehicles, Sea Systems, Aircraft, Missile / Ordnance (M/O), Spacecraft
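A short sketch applying the preliminary IIS SER above to a hypothetical project; the 50 KESLOC size and 10 FTE staffing level are invented inputs, and TDEV is assumed to be in calendar months.

```python
# Preliminary IIS SER copied from the table above:
# TDEV = 3.176 * KESLOC^0.7209 / FTE^0.4476

def tdev_iis(kesloc: float, fte: float) -> float:
    return 3.176 * kesloc ** 0.7209 / fte ** 0.4476

print(f"TDEV ~ {tdev_iis(50.0, 10.0):.1f} months")  # 50 KESLOC, 10 FTE (hypothetical)
```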
Size – People – Schedule Tradeoff
COCOMO 81 vs. New Schedule Equations
Model Comparisons** (Preliminary Results – More Records to be added)

PT | Obs. | New Schedule Equation | COCOMO 81 Equation
IIS | 35 | TDEV = 3.176 * KESLOC^0.7209 * FTE^-0.4476 | TDEV = 2.5 * PM^0.38
MP | 43 | TDEV = 3.945 * KESLOC^0.968 * FTE^-0.7505 | TDEV = 2.5 * PM^0.35
RTE | 49 | TDEV = 11.69 * KESLOC^0.7982 * FTE^-0.8256 | TDEV = 2.5 * PM^0.32
SYS | 56 | TDEV = 5.781 * KESLOC^0.8272 * FTE^-0.7682 | TDEV = 2.5 * PM^0.35
SCP | 35 | TDEV = 34.76 * KESLOC^0.5309 * FTE^-0.5799 | TDEV = 2.5 * PM^0.32

** Operating environments included in the analysis: Ground, Surface Vehicles, Sea Systems, Aircraft, Missile / Ordnance (M/O), Spacecraft
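A sketch of how the two model forms above could be compared for the MP productivity type. The COCOMO 81 form needs effort (PM) as input; feeding it from the preliminary MP CER (PM = 3.477 * KESLOC^1.172, from the earlier CER slide) is our assumption, and the project size and staffing are invented.

```python
# New schedule equation vs. COCOMO 81, MP productivity type.

def tdev_new_mp(kesloc: float, fte: float) -> float:
    return 3.945 * kesloc ** 0.968 * fte ** -0.7505

def tdev_cocomo81_mp(pm: float) -> float:
    return 2.5 * pm ** 0.35

kesloc, fte = 60.0, 12.0              # hypothetical project
pm = 3.477 * kesloc ** 1.172          # effort estimate from the preliminary MP CER (assumption)
print(f"new model:  {tdev_new_mp(kesloc, fte):.1f} months")
print(f"COCOMO 81:  {tdev_cocomo81_mp(pm):.1f} months")
```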
COCOMO 81 vs. New Schedule Equations
Model Comparisons using PRED (30%)** (Preliminary Results – More Records to be added)

PT | Obs. | New Schedule Equations PRED(30) | COCOMO 81 Equations PRED(30)
IIS | 35 | 68 | 28
MP | 43 | 52 | 23
RTE | 49 | 55 | 16
SYS | 56 | 62 | 5
SCP | 35 | 64 | 8

** Operating environments included in the analysis: Ground, Surface Vehicles, Sea Systems, Aircraft, Missile / Ordnance (M/O), Spacecraft
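For reference, a minimal sketch of the accuracy measures behind this comparison, assuming the usual definitions: MAD as the mean absolute relative error and PRED(30) as the fraction of estimates falling within 30% of actuals. The actual/estimate pairs below are invented.

```python
# MAD and PRED(30) computed from actual vs. estimated values.

def relative_errors(actuals, estimates):
    return [abs(est - act) / act for act, est in zip(actuals, estimates)]

def mad(actuals, estimates):
    errs = relative_errors(actuals, estimates)
    return sum(errs) / len(errs)

def pred(actuals, estimates, level=0.30):
    errs = relative_errors(actuals, estimates)
    return sum(1 for e in errs if e <= level) / len(errs)

actual_tdev    = [10.0, 14.0, 22.0, 30.0]   # months, hypothetical
estimated_tdev = [12.0, 13.0, 31.0, 26.0]
print(f"MAD = {mad(actual_tdev, estimated_tdev):.0%}")
print(f"PRED(30) = {pred(actual_tdev, estimated_tdev):.0%}")
```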
Conclusions
Conclusion
Developing CERs and benchmarks by grouping appears to account for some of the variability in estimating relationships.
Grouping software applications by Operating Environment and Productivity Type appears promising, but needs refinement.
The analyses shown in this presentation are preliminary; more data is available for analysis, but it requires preparation first.
Future Work
Productivity benchmarks need to be segregated by size groups.
More data is available to fill in missing cells in the OE-PT table.
Workshop recommendations will be implemented:
– New data grouping strategy
Data repository that provides drill-down to source data:
– Presents the data to the analyst
– If there is a question, it is possible to navigate to the source document, e.g., data collection form, project notes, EVM data, Gantt charts, etc.
Final results will be published online at http://csse.usc.edu/afcaawiki