
1 COSYSMO Workshop Future Directions and Priorities
23rd International Forum on COCOMO and Systems/Software Cost Modeling
Los Angeles, CA, Wed Oct 29 & Thurs Oct 30, 2008
Garry Roedler, Gan Wang, John Gaffney, Jared Fortune, Ricardo Valerdi

2 Agenda
– Context setting
– Discussion on COSYSMO 2.0 improvements
– Recursive levels in the design parameter
– Update on COSYSMO book
– Heuristics

3 Workshop Schedule
8:30 am to 9:00 am: Introductions; Review of COSYSMO workshop in July – Mystic, CT (Garry Roedler, Lockheed Martin); Results of Reuse Survey (Jared Fortune, USC)
9:00 am to 10:00 am: Harmonization of Software and Systems Engineering Cost Estimation (Garry Roedler, Lockheed Martin)
10:00 am to 10:30 am: Break
10:30 am to 11:00 am: Experience with SEEMAP at BAE Systems: Quantitative Risk Modeling Using Monte Carlo / Crystal Ball (Gan Wang, BAE Systems)
11:00 am to 11:30 am: Experience with COSYSMO-R at Lockheed Martin (John Gaffney, Lockheed Martin)
11:30 am to 12:00 pm: Heuristic Risk Assessment (Ray Madachy, Naval Postgraduate School and USC)
12:00 pm to 1:00 pm: Lunch
1:00 pm to 2:00 pm: Best Practice Guidance (Garry Roedler, Lockheed Martin); Model Usage Heuristics (Ricardo Valerdi, MIT)
2:00 pm to 3:00 pm: Working Session on Harmonization of SW & SE Estimation: WBS Approach (Garry Roedler, Lockheed Martin; Gan Wang, BAE Systems)
3:00 pm to 3:30 pm: Break
3:30 pm to 4:00 pm: Discussion on Reuse Framework (Jared Fortune, USC; Ricardo Valerdi, MIT)
4:00 pm to 5:00 pm: Discussion on Recursive Levels Cost Driver (Ricardo Valerdi, MIT)
5:00 pm to 7:00 pm: Reception

4 Context setting

5 How is Systems Engineering Defined?
Acquisition and Supply
– Supply Process
– Acquisition Process
Technical Management
– Planning Process
– Assessment Process
– Control Process
System Design
– Requirements Definition Process
– Solution Definition Process
Product Realization
– Implementation Process
– Transition to Use Process
Technical Evaluation
– Systems Analysis Process
– Requirements Validation Process
– System Verification Process
– End Products Validation Process
Source: EIA/ANSI 632, Processes for Engineering a System, 1999.

6 COSYSMO Origins
Timeline of influences leading to COSYSMO: systems engineering (circa 1950; Warfield 1956), software cost modeling (circa 1980; Boehm 1981), and CMMI* (circa 1990; Humphrey 1989). *Developed at Carnegie Mellon University.
– Warfield, J. N., Systems Engineering, United States Department of Commerce PB111801, 1956.
– Boehm, B. W., Software Engineering Economics, Prentice Hall, 1981.
– Humphrey, W., Managing the Software Process, Addison-Wesley, 1989.

7 COSYSMO Data Sources
Boeing: Integrated Defense Systems (Seal Beach, CA)
Raytheon: Intelligence & Information Systems (Garland, TX)
Northrop Grumman: Mission Systems (Redondo Beach, CA)
Lockheed Martin: Transportation & Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA; Baltimore, MD; Syracuse, NY)
General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
SAIC: Army Transformation (Orlando, FL); Integrated Data Solutions & Analysis (McLean, VA)
L-3 Communications: Greenville, TX

8 Modeling Methodology 3 rounds; > 60 experts 62 data points; 8 organizations

9 COSYSMO Scope
Life cycle phases (per ISO/IEC 15288): Conceptualize, Develop, Operational Test & Evaluation, Transition to Operation, Operate/Maintain/Enhance, Replace or Dismantle.
– Addresses the first four phases of the systems engineering life cycle (per ISO/IEC 15288)
– Considers standard Systems Engineering Work Breakdown Structure tasks (per EIA/ANSI 632)

10 COSYSMO Operational Concept
Inputs:
– Size drivers: # Requirements, # Interfaces, # Scenarios, # Algorithms (+ 3 adjustment factors)
– Effort multipliers: application factors (8), team factors (6)
Output: effort, produced by the calibrated COSYSMO model.

11 COSYSMO Model Form

PM_{NS} = A \cdot \left( \sum_{k} \sum_{x} w_{k,x}\,\Phi_{k,x} \right)^{E} \cdot \prod_{j} EM_{j}

Where:
PM_NS = effort in Person Months (Nominal Schedule)
A = calibration constant derived from historical project data
k = {REQ, IF, ALG, SCN}
x = {easy, nominal, difficult}
w_{k,x} = weight for the "easy", "nominal", or "difficult" level of size driver k
Φ_{k,x} = quantity of size driver k at level x
E = represents diseconomies of scale
EM_j = effort multiplier for the j-th cost driver; the geometric product results in an overall effort adjustment factor to the nominal effort

12 Size Driver Weights

Size Driver                   Easy   Nominal   Difficult
# of System Requirements       0.5     1.00       5.0
# of Interfaces                1.7     4.3        9.8
# of Critical Algorithms       3.4     6.5       18.2
# of Operational Scenarios     9.8    22.8       47.4
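To make the model form and weights concrete, here is a minimal Python sketch of the calculation on slides 11 and 12. The calibration constant A, the exponent E, the example driver counts, and the function name cosysmo_effort are illustrative placeholders introduced here, not the published COSYSMO calibration.

```python
# Minimal sketch of the COSYSMO model form (slide 11) using the size driver
# weights from slide 12. A and E are illustrative placeholders, NOT the
# published calibration; effort multipliers default to nominal (1.0).

WEIGHTS = {                       # (easy, nominal, difficult) weights
    "REQ": (0.5, 1.00, 5.0),      # # of System Requirements
    "IF":  (1.7, 4.3, 9.8),       # # of Interfaces
    "ALG": (3.4, 6.5, 18.2),      # # of Critical Algorithms
    "SCN": (9.8, 22.8, 47.4),     # # of Operational Scenarios
}

def cosysmo_effort(counts, effort_multipliers=(), A=0.25, E=1.06):
    """Return effort in person-months (nominal schedule).

    counts: {driver: (easy_qty, nominal_qty, difficult_qty)}
    effort_multipliers: one multiplier per rated cost driver (1.0 = nominal)
    A, E: placeholder calibration constant and diseconomy-of-scale exponent
    """
    size = sum(w * q
               for k, quantities in counts.items()
               for w, q in zip(WEIGHTS[k], quantities))
    adjustment = 1.0
    for em in effort_multipliers:          # geometric product of the EM_j
        adjustment *= em
    return A * size ** E * adjustment

# Example: 100 nominal requirements, 20 nominal interfaces,
# 5 difficult algorithms, 4 nominal operational scenarios.
pm_ns = cosysmo_effort(
    {"REQ": (0, 100, 0), "IF": (0, 20, 0), "ALG": (0, 0, 5), "SCN": (0, 4, 0)},
    effort_multipliers=[1.32, 0.81],  # e.g., High Technology Risk, High Architecture Understanding
)
print(f"Estimated SE effort: {pm_ns:.1f} person-months")
```

The product over the effort multipliers is what slide 11 calls the overall effort adjustment factor applied to the nominal effort.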

13 Cost Driver Clusters
UNDERSTANDING FACTORS
– Requirements understanding
– Architecture understanding
– Stakeholder team cohesion
– Personnel experience/continuity
COMPLEXITY FACTORS
– Level of service requirements
– Technology risk
– # of recursive levels in the design
– Documentation match to life cycle needs
OPERATIONS FACTORS
– # and diversity of installations/platforms
– Migration complexity
PEOPLE FACTORS
– Personnel/team capability
– Process capability
ENVIRONMENT FACTORS
– Multisite coordination
– Tool support

14 Cost Driver Rating Scales

Cost Driver                                  Very Low   Low   Nominal   High   Very High   Extra High   EMR
Requirements understanding                     1.87     1.37    1.00    0.77     0.60          -        3.12
Architecture understanding                     1.64     1.28    1.00    0.81     0.65          -        2.52
Level of service requirements                  0.62     0.79    1.00    1.36     1.85          -        2.98
Migration complexity                             -        -     1.00    1.25     1.55         1.93      1.93
Technology risk                                0.67     0.82    1.00    1.32     1.75          -        2.61
Documentation                                  0.78     0.88    1.00    1.13     1.28          -        1.64
# and diversity of installations/platforms       -        -     1.00    1.23     1.52         1.87      1.87
# of recursive levels in the design            0.76     0.87    1.00    1.21     1.47          -        1.93
Stakeholder team cohesion                      1.50     1.22    1.00    0.81     0.65          -        2.31
Personnel/team capability                      1.50     1.22    1.00    0.81     0.65          -        2.31
Personnel experience/continuity                1.48     1.22    1.00    0.82     0.67          -        2.21
Process capability                             1.47     1.21    1.00    0.88     0.77         0.68      2.16
Multisite coordination                         1.39     1.18    1.00    0.90     0.80         0.72      1.93
Tool support                                   1.39     1.18    1.00    0.85     0.72          -        1.93
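The EMR column is consistent with taking the ratio of each driver's largest rating to its smallest, i.e. the maximum swing a single cost driver can produce in the estimate. A small sketch, with values copied from the table above:

```python
def emr(ratings):
    """Effort Multiplier Ratio: largest rating divided by smallest,
    i.e. the maximum effort swing one cost driver can cause."""
    return max(ratings) / min(ratings)

# Rows from the table above (missing cells simply omitted).
requirements_understanding = [1.87, 1.37, 1.00, 0.77, 0.60]
migration_complexity = [1.00, 1.25, 1.55, 1.93]

print(round(emr(requirements_understanding), 2))  # 3.12
print(round(emr(migration_complexity), 2))        # 1.93
```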

15 Cost Drivers Ordered by Effort Multiplier Ratio (EMR)

16 Effort Profiling
Profiles systems engineering effort across the ISO/IEC 15288 life cycle stages (Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle) against the EIA/ANSI 632 process groups (Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation).

17 Impact
COSYSMO's academic prototype has led to commercial and proprietary implementations (COSYSMO-R, SECOST, SEEMaP) and has influenced academic curricula, 10 theses, the intelligence community, Sheppard Mullin, LLC, and a policy & contracts model.

18 COSYSMO 2.0 Improvements

19 Recommended Improvements (from user community)
1. Reuse
2. Integration of SwE & SysE estimation
3. Assumption of linearity in COSYSMO cost drivers
4. Effect of cost drivers and scale factors
5. Number of recursive levels of design
6. Risk modeling
7. Establishing best practice guidance
8. Consideration of SoS scope in COSYSMO
9. Estimation in Operation & Maintenance phase
10. Requirements volatility (deferred)

20 1. Reuse
Central question: What is the effect of reuse in estimating systems engineering size/effort?
Hypothesis: A COSYSMO reuse submodel will improve the model's estimation accuracy.
POC: Jared Fortune
References:
– Valerdi, R., Wang, G., Roedler, G., Rieff, J., Fortune, J., "COSYSMO Reuse Extension," 22nd International Forum on COCOMO and Systems/Software Cost Modeling, 2007.

21 2. Integration of SwE & SysE estimation
Central question: What is the overlap between COCOMO II and COSYSMO?
Hypothesis: By identifying the WBS elements in COSYSMO that overlap with the WBS in COCOMO II, systems engineering resource estimation accuracy increases.
POC: Ricardo Valerdi
References:
– Valerdi, R., The Architect and the Builder: Overlaps Between Software and Systems Engineering (working paper).

22 3. Linearity in COSYSMO cost drivers
Central question: How do we characterize the non-linearity of cost drivers across the system life cycle?
Hypothesis: Not all cost drivers have a constant impact on systems engineering effort throughout the life cycle.
POC: Gan Wang
References:
– Wang, G., Valerdi, R., Boehm, B., Shernoff, A., "Proposed Modification to COSYSMO Estimating Relationship," 18th INCOSE Symposium, June 2008.

23 4. Effect of cost drivers and scale factors
Central question: Can some of the cost drivers become scale factors in the cost estimating relationship calibrated with the new data set?
Hypothesis: The current set of size and cost drivers is too sensitive to small variations in rating levels.
POC: Gan Wang
References:
– Wang, G., Valerdi, R., Boehm, B., Shernoff, A., "Proposed Modification to COSYSMO Estimating Relationship," 18th INCOSE Symposium, June 2008.
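For readers new to the distinction this question raises, the sketch below contrasts the two treatments, following the COCOMO II convention of an additive exponent contribution. The symbol s_d and the exact functional form are illustrative, not a proposed COSYSMO 2.0 formulation.

```latex
% Current COSYSMO treatment: cost driver d acts as an effort multiplier,
% scaling effort by the same factor regardless of size S.
PM = A \cdot S^{E} \cdot EM_{d} \cdot \prod_{j \neq d} EM_{j}

% Scale-factor treatment (COCOMO II style): the driver's rating s_d is added
% to the exponent, so its impact grows with size S.
PM = A \cdot S^{E + s_{d}} \cdot \prod_{j \neq d} EM_{j}
```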

24 5. Number of recursive levels of design
Central question: How can the integration complexity of subsystems one layer below the system-of-interest be operationalized?
Hypothesis: The integration complexity of subsystems is a predictor of systems engineering effort.
POC: John Rieff
References:
– Marksteiner, B., "Recursive Levels and COSYSMO," October 2007 (working paper).

25 6. Risk Modeling
Central question: How can risk associated with the COSYSMO estimate be quantified?
Hypothesis: The output generated by COSYSMO can be quantified using probability distributions for better assessment of the likelihood of meeting the estimate.
POC: John Gaffney (developer of COSYSMO-R)
References:
– Valerdi, R., Gaffney, J., "Reducing Risk and Uncertainty in COSYSMO Size and Cost Drivers: Some Techniques for Enhancing Accuracy," 5th Conference on Systems Engineering Research, March 2007, Hoboken, NJ.
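As a generic illustration of this hypothesis (not the COSYSMO-R or SEEMaP method), one can sample uncertain size inputs from assumed distributions and read risk off the resulting effort percentiles. The triangular ranges, weight subset, and calibration constants below are invented placeholders.

```python
# Generic Monte Carlo sketch of the risk-quantification idea: treat the size
# inputs as distributions instead of point values and report percentiles of
# the resulting effort. NOT the COSYSMO-R or SEEMaP method; all numbers here
# are illustrative placeholders.
import random

A, E = 0.25, 1.06          # placeholder calibration constant and exponent
W_REQ_NOM, W_IF_NOM, W_ALG_DIFF = 1.00, 4.3, 18.2   # weights from slide 12

def one_estimate():
    # Triangular(low, high, mode) spreads around the analyst's point counts.
    reqs   = random.triangular(80, 140, 100)   # nominal-difficulty requirements
    ifaces = random.triangular(15, 30, 20)     # nominal-difficulty interfaces
    algs   = random.triangular(3, 8, 5)        # difficult algorithms
    size = W_REQ_NOM * reqs + W_IF_NOM * ifaces + W_ALG_DIFF * algs
    return A * size ** E                       # effort multipliers held at 1.0

samples = sorted(one_estimate() for _ in range(10_000))
p50, p80 = samples[5_000], samples[8_000]
print(f"50th percentile: {p50:.0f} PM, 80th percentile: {p80:.0f} PM")
```

Reporting, say, the 50th and 80th percentiles instead of a single point value is one way to express the likelihood of meeting the estimate.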

26 7. Best practice guidance for use of cost drivers
Central question: How can misuse of the COSYSMO cost drivers be avoided?
Hypothesis: By developing a best practice guide that describes common pitfalls associated with COSYSMO cost drivers, over-estimation can be reduced or avoided.
POC: Garry Roedler
References:
– COSYSMO User Manual

27 8. Consideration of SoS scope in COSYSMO
Central question: How can COSYSMO be updated to address system-of-systems effort estimation?
Hypothesis: To be discussed in joint session.
POC: Jo Ann Lane

28 9. Estimation in Operation & Maintenance phase
Central question: How can we estimate systems engineering effort in the Operate & Maintain phase?
Hypothesis: Covering the Operate & Maintain phase will broaden the model's life cycle coverage.
POC: Ricardo Valerdi

29 10. Requirements volatility
Central question: How do we quantify the effects of requirements volatility on systems engineering effort throughout the life cycle?
Hypothesis: Requirements volatility is a significant factor for predicting systems engineering effort and can serve as a leading indicator of project success.
POC: Ricardo Valerdi
Feb 15, 2007 workshop led by Rick Selby:
– Identified critical success factors in: technical, product, process, people
– http://sunset.usc.edu/events/2007/ARR/presentations/RequirementsVolatilityWorkshopSummaryARR2007.ppt
– Loconsole, A., Borstler, J., "An industrial case study on requirements volatility measures," 12th Asia-Pacific Software Engineering Conference, 2005.

30 Prioritization Exercise
Factors to consider:
– Availability of data
– Impact on total cost of ownership
– Frequency of use
– Compatibility with other models (e.g., COCOMO family, PRICE-H)
– Ability to address future trends (volatility, uncertainty, scalability)
– Factor interactions

31 Recursive Levels in the Design

32 Number of Recursive Levels in the Design The number of levels of design related to the system-of-interest (as defined by ISO/IEC 15288) and the amount of required SE effort for each level.

33 Definition Issues
– Clarification of "one deep": the integration complexity of subsystems one layer below the system-of-interest
– Recursive: of, relating to, or constituting a procedure that can repeat itself indefinitely

34 Possible Interpretations
– The largest number of decomposition levels in any branch of the system's specification tree
– The average number of decomposition levels in the branches of the system's specification tree
– The smallest number of decomposition levels in any branch of the system's specification tree
– The number of levels of the system's Work Breakdown Structure (WBS)
– The number of levels on the system's Bill of Materials (BOM)
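A hypothetical sketch of the first three interpretations above: given a specification tree, represented here as a mapping from each element to its subsystems (the tree and element names are invented for illustration), compute the largest, average, and smallest number of decomposition levels across its branches.

```python
# Compute branch depths of an invented specification tree to illustrate the
# "largest / average / smallest number of decomposition levels" readings.
def branch_depths(tree, node, depth=1):
    children = tree.get(node, [])
    if not children:                       # leaf: this branch ends here
        return [depth]
    depths = []
    for child in children:
        depths.extend(branch_depths(tree, child, depth + 1))
    return depths

spec_tree = {
    "air-vehicle system": ["airframe", "avionics", "propulsion"],
    "avionics": ["flight computer", "sensors"],
    "sensors": ["radar", "EO/IR"],
}

depths = branch_depths(spec_tree, "air-vehicle system")
print("largest:", max(depths))             # deepest branch: 4 levels
print("average:", sum(depths) / len(depths))
print("smallest:", min(depths))            # shallowest branch: 2 levels
```

The three readings can differ substantially for unbalanced trees, which is why pinning down one interpretation matters for rating this cost driver consistently.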

35 Discussion
Form vs. function:
– Form is handled by the # of recursive levels in the design cost driver
– Function is handled by the requirements and interfaces size drivers

36 Update on COSYSMO Book
6 chapters / 300 pages
Foreword by Barry Boehm
Endorsements from:
– Bill Rouse (Georgia Tech)
– Paul Nielsen (SEI)
– Dinesh Verma (Stevens)
– Dan Galorath (Galorath)
– Andy Sage (George Mason)
– Rick Selby (Northrop Grumman/USC)
– Wolt Fabrycky (Virginia Tech)
– Marilee Wheaton (Aerospace Corporation/USC)

37 Objectives
– Quantify systems engineering
– Provide a framework for decision making
– Define opportunities for tailoring & calibration
– Capture lessons learned from development, validation, and implementation
– Complement other estimation methods (heuristics, analogy, expert-based) and models (COCOMO II)
– Cater to the commercial marketplace in support of SEER-SEM, TruePlanning, and SystemStar
– Continue to build a repository of systems engineering data
– Provide a platform for future research

38 First Sentence “COSYSMO is a model to help you reason about the cost and schedule implications of systems engineering decisions you may need to make”.

39 New Table of Contents
1. Scope of COSYSMO
2. Model Definition
3. Model Validation & Verification
4. Model Usage
5. Systems Engineering & Program Management Strategies
6. Evolution of Systems Engineering Cost Estimation
Each chapter is labeled on the slide as drawn from the dissertation, from a published paper, or as new material.

40 Diagram mapping Chapters 1 through 6 onto the COSYSMO operational concept (size drivers, effort multipliers, calibration, and effort).

41 Cost Estimation Heuristics

42 Criteria for Developing Heuristics
1. Agreement among experts that the heuristic is useful and correct
2. The heuristic must stand the test of time
3. The heuristic must be resilient across different scenarios
4. The heuristic must demonstrate value by:
– recurring more than once
– not being considered obvious by everybody, particularly people who are new to the field

43 Model Development Heuristics
Heuristic #1: More parameters increase the explanatory power of the model, but too many parameters make the model too complex to use and difficult to calibrate.
Heuristic #2: Break the problem and analysis into phases over time; the right amount of granularity is important.
Heuristic #3: Let available data drive the application boundaries of the model.

44 Model Development Heuristics
Heuristic #4: Design the rating scale according to the phenomenon being modeled.
Heuristic #5: Some system characteristics are more likely to be cost penalties than cost savings.

45 Model Calibration Heuristics
Heuristic #6: All calibrations are local.
Heuristic #7: Calibrations fix chronic errors in over- or underestimation.
Heuristic #8: Be skeptical of data that you did not collect.
Heuristic #9: For every parameter in the model, 5 data points are required for the calibration.
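Applying Heuristic #9 to COSYSMO itself, under the rough and purely illustrative bookkeeping of one parameter per size driver and per cost driver from the earlier slides:

```latex
% Illustrative bookkeeping: one parameter per size driver and per cost driver.
(4 \text{ size drivers} + 14 \text{ cost drivers}) \times 5 \ \tfrac{\text{data points}}{\text{parameter}} = 90 \text{ data points}
```

Compare with the 62 data points reported on slide 8.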

46 Model Calibration Heuristics
Heuristic #10: Don't do more analysis than the data is worth.
Heuristic #11: You need less data than you think, you have more data than you think.

47 Model Usage Heuristics
Heuristic #12: A model is not reality.
Heuristic #13: All models are wrong, but some of them are useful.
Heuristic #14: Begin with the end in mind.
Heuristic #15: Requirements are king.
Heuristic #16: Not all requirements are created equal.

48 Model Usage Heuristics
Heuristic #17: Reuse is not free.
Heuristic #18: Operational scenarios may come first, but requirements will ultimately describe the system.
Heuristic #19: Don't double dip.
Heuristic #20: Find your sea level.
Heuristic #21: Nominal is the norm.

49 Model Usage Heuristics
Heuristic #22: If you're estimating a large project, personnel capability is Nominal.
Heuristic #23: Most of your off-Nominal cost drivers should match your last project.
Heuristic #24: If you're going to sin, sin consistently.
Heuristic #25: Use a combination of models to estimate total system cost.

50 Model Usage Heuristics
Heuristic #26: Avoid overlap between models.
Heuristic #27: Estimate using multiple methods (analogy, parametric, etc.).

51 Estimation Heuristics
Heuristic #28: Estimate early and often.
Heuristic #29: Experts all disagree forever. Bound the options they are given to evaluate.
Heuristic #30: Models are optimistic.
Heuristic #31: People are generally optimistic.

