Working Group Meeting Ricardo Valerdi Thursday October 27, 2005 Los Angeles, CA 20th International Forum on COCOMO and Software Cost Modeling.



Agenda
8:15 AM   COSYSMO Status
8:45 AM   Major updates
          - COCOMO/COSYSMO overlap
          - COSYSMO book outline
          - Exercise in psychometrics
          - COSYSMO game
          - Systems engineering effort estimation in O&M phase
9:15 AM   SE staffing profile survey
9:45 AM   Break
10:15 AM  Ongoing projects
          - Data availability for COSYSMO (Chris Miller)
          - Schedule implications in COSYSMO (Anthony Peterson)
          - COSYSMOstar (Dan Ligett)
11:45 AM  Lunch

COSYSMO Evolution
Phase 1: Baseline ☺
Phase 2: Prototype ☺
Phase 3: Validate ☺
Phase 4: Institutionalize

Notional Estimation Example
Company "Lockheed Grumman" is developing a system with the following COSYSMO inputs:
Size drivers:
- 100 easy, 50 nominal, 75 difficult requirements
- 2 easy, 3 difficult interfaces
- 4 easy algorithms
- 5 nominal operational scenarios
Effort multipliers:
- High requirements understanding
- High technology risk
- High process capability
Result (after calibration): 36 person-months of systems engineering effort
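As a sketch of how counts like those above roll up into a single size input, each count can be multiplied by a difficulty weight and summed. The weights below are placeholders chosen for illustration, not the published COSYSMO calibration (which assigns different weights per driver type).

```python
# Hypothetical difficulty weights -- the calibrated COSYSMO weights differ
# and vary by size driver; these are for illustration only.
WEIGHTS = {"easy": 0.5, "nominal": 1.0, "difficult": 5.0}

def equivalent_size(counts: dict[str, dict[str, int]]) -> float:
    """Sum weighted counts across all size drivers."""
    return sum(WEIGHTS[level] * n
               for driver in counts.values()
               for level, n in driver.items())

# Counts from the notional "Lockheed Grumman" example.
size = equivalent_size({
    "requirements": {"easy": 100, "nominal": 50, "difficult": 75},
    "interfaces": {"easy": 2, "difficult": 3},
    "algorithms": {"easy": 4},
    "operational_scenarios": {"nominal": 5},
})
print(size)  # -> 498.0 with these placeholder weights
```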

COSYSMO Status
Organization        Data Points in Calibration
BAE Systems         19
General Dynamics    10
Raytheon            10
Lockheed Martin      8
Northrop Grumman     4
SAIC                 2
TOTAL               53
academicCOSYSMO, myCOSYSMO, COSYSMO Risk Add-on, and COSYSMOstar continue to be improved and can be downloaded from

Major Updates
COCOMO II/COSYSMO overlap
- MBASE/RUP phases and ISO/IEC 15288
- Integration/test/requirements activities
Calibration factor clarification:
Hours = 38.55 * (size)^1.06 * product(EM)
PM = (38.55/152) * (size)^1.06 * product(EM) ≈ 0.25 * (size)^1.06 * product(EM)
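The calibration relationship above can be sketched as a small Python function. The coefficient 38.55, the exponent 1.06, and the 152 hours-per-person-month conversion come from the slide; the size and multiplier values in the usage lines are illustrative only, not calibrated driver ratings.

```python
from math import prod

HOURS_PER_PM = 152  # hours per person-month, per the calibration note

def cosysmo_effort_hours(size: float, effort_multipliers: list[float]) -> float:
    """Calibrated effort in hours: 38.55 * size^1.06 * product(EM)."""
    return 38.55 * size ** 1.06 * prod(effort_multipliers)

def cosysmo_effort_pm(size: float, effort_multipliers: list[float]) -> float:
    """Effort in person-months: hours / 152, i.e. ~0.25 * size^1.06 * product(EM)."""
    return cosysmo_effort_hours(size, effort_multipliers) / HOURS_PER_PM

# Illustrative inputs: equivalent size 100, two notional effort multipliers.
pm = cosysmo_effort_pm(100, [1.2, 0.9])
print(round(pm, 1))
```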

COSYSMO Book Outline
1. Scope of COSYSMO
   a) Background on systems engineering
   b) COCOMO II and COSYSMO
   c) Estimation example
2. Model definition
   a) Model form
   b) Size drivers & counting rules
   c) Cost drivers & interpretations
3. Model verification & validation
   a) Statistical tests
   b) Model parsimony
   c) Bayesian approximation
4. Model usage
   a) Experience base
   b) Model tailoring
   c) Institutionalization & training

Exercise in Psychometrics*
Psychometrics is the field of study (connected to psychology and statistics) concerned with the measurement of "psychological" aspects of a person such as knowledge, skills, abilities, or personality.
Source: Guilford JP. Psychometric Methods. 2nd edn. New York: McGraw-Hill.
1. Stakeholder team cohesion
2. Personnel/team capability
3. Personnel experience/continuity
4. Process capability
*Inspired by Clark and subsequently Guilford

Reality Check

Stakeholder team cohesion
Represents a multi-attribute parameter which includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, IPT framework, team dynamics, trust, and amount of change in responsibilities. It further represents the heterogeneity of the stakeholder community: the end users, customers, implementers, and development team.
Culture:
- Very Low: Stakeholders with diverse expertise, task nature, language, culture, infrastructure; highly heterogeneous stakeholder communities
- Low: Heterogeneous stakeholder community; some similarities in language and culture
- Nominal: Shared project culture
- High: Strong team cohesion and project culture; multiple similarities in language and expertise
- Very High: Virtually homogeneous stakeholder communities; institutionalized project culture
Compatibility:
- Very Low: Highly conflicting organizational objectives
- Low: Converging organizational objectives
- Nominal: Compatible organizational objectives
- High: Clear roles & responsibilities
- Very High: Strong mutual advantage to collaboration
Familiarity and trust:
- Very Low: Lack of trust
- Low: Willing to collaborate, little experience
- Nominal: Some familiarity and trust
- High: Extensive successful collaboration
- Very High: Very high level of familiarity and trust

Personnel/team capability
Basic intellectual capability of a systems engineer (compared to the national pool of SEs) to analyze complex problems and synthesize solutions.
- Very Low: 15th percentile
- Low: 35th percentile
- Nominal: 55th percentile
- High: 75th percentile
- Very High: 90th percentile

Personnel experience/continuity
The applicability and consistency of the staff at the initial stage of the project with respect to the domain, customer, user, technology, tools, etc.
Experience:
- Very Low: Less than 2 months
- Low: 1 year of continuous experience, other technical experience in similar job
- Nominal: 3 years of continuous experience
- High: 5 years of continuous experience
- Very High: 10 years of continuous experience
Annual turnover:
- Very Low: 48%
- Low: 24%
- Nominal: 12%
- High: 6%
- Very High: 3%

Process capability
The consistency and effectiveness of the project team at performing SE processes. This may be based on assessment ratings from a published process model (e.g., CMMI, EIA-731, SE-CMM, ISO/IEC 15504). It can also be based on project team behavioral characteristics if no assessment has been performed.
Assessment rating (capability or maturity):
- Very Low: Level 0 (if continuous model)
- Low: Level 1
- Nominal: Level 2
- High: Level 3
- Very High: Level 4
- Extra High: Level 5
Project team behavioral characteristics:
- Very Low: Ad hoc approach to process performance
- Low: Performed SE process; activities driven only by immediate contractual or customer requirements; SE focus limited
- Nominal: Managed SE process; activities driven by customer and stakeholder needs in a suitable manner; SE focus is requirements through design; project-centric approach, not driven by organizational processes
- High: Defined SE process; activities driven by benefit to project; SE focus is through operation; process approach driven by organizational processes tailored for the project
- Very High: Quantitatively managed SE process; activities driven by SE benefit; SE focus on all phases of the life cycle
- Extra High: Optimizing SE process; continuous improvement; activities driven by systems engineering and organizational benefit; SE focus is product life cycle & strategic applications

COSYSMO Game
Purpose: to serve as a training tool for stakeholders to understand the capabilities and limitations of COSYSMO.
Rules: The formal part of the game. A game designer designs the rules of the game directly but designs the player's experience only indirectly. (Source: Rules of Play: Game Design Fundamentals by Salen & Zimmerman, MIT Press.)
Play: Rule-bound and free-form; players are assigned roles and given a project.
Culture: Simulated project which realistically mimics the real world of systems engineering.

COSYSMO Game
Rules
- Facilitator sets the stage for a "case study"
- Deliberately creates a model clash scenario
Play
- Role-based decision making
Culture
- Expectations and politics must be highlighted
- Bring forward the social process of cost estimation

Cast of Characters
- Customer: Develops spec
- Contractor: Evaluates spec
- 3rd party consultant (i.e., SETA or FFRDC): Evaluates spec & bid
- Observers: Don't interfere but keep track of what's happening

Role Playing Activities
Phase 1. Warm up — Facilitator: sets up scenario; makes scenario real; surfaces model clashes; explains roles
Phase 2. Select participants — Participants: roles are assigned
Phase 3. Set the stage — Facilitator & participants: clarify scenario, background, and goal
Phase 4. Prepare the observers — Facilitator & participants: identify what observers are observing; assign observing tasks
Phase 5. Action — Participants: act out scenario; observe flow, behaviors, results, etc.
Source: Models of Teaching, 6th Ed., by Joyce, Weil, Calhoun, 2000.

Role Playing Activities (continued)
Phase 6. Debrief — Participants: present COSYSMO results
Phase 7. Generalization — Facilitator & participants: relate to reality; comment on limitations of the model; summarize general challenges of the process
Source: Models of Teaching, 6th Ed., by Joyce, Weil, Calhoun, 2000.

Operate & Maintain Phase
Current scope (life-cycle diagram): Conceptualize → Develop → Operational Test & Eval → Transition to Operation → Operate, Maintain, or Enhance → Replace or Dismantle
Suggestions:
- Need to clarify the point where M&E begins
- Add a "deleted" category (may be a different weighting) for requirements, interfaces, etc.
- Consider that not all current COSYSMO parameters will apply here, and that the ones that do may have different values
- Need to consider business strategies with respect to OM&E (e.g., problem report-driven, level of effort/priority-driven, requirement changes, upgrade/technology refresh activities, etc.); may require other size drivers
- Need to better define OM&E activities
- Consider frequency of deliveries as a driver

Inconsistent Effort Reporting
Data was adjusted based on the effort distribution across EIA 632 processes and ISO 15288 phases, yielding:
Phase                      % Effort (std. dev.)
Conceptualize              23 (12)
Develop                    36 (16)
Operational Test & Eval    27 (13)
Transition to Operation    14 (9)

Effort Distribution Across EIA 632 Fundamental Processes (N = 18)
EIA 632 Fundamental Process    Average    Standard Deviation
Acquisition & Supply            7%        3.5
Technical Management           17%        4.5
System Design                  30%        6.1
Product Realization            15%        8.7
Technical Evaluation           31%        8.7
Total = 100%
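One way to apply the averages above is to apportion a total systems engineering estimate across the five fundamental processes. The snippet below is a sketch; the 36 person-month total is borrowed from the notional estimation example earlier in the deck.

```python
# Average effort share per EIA 632 fundamental process (from the table above).
EIA632_SHARES = {
    "Acquisition & Supply": 0.07,
    "Technical Management": 0.17,
    "System Design": 0.30,
    "Product Realization": 0.15,
    "Technical Evaluation": 0.31,
}

def allocate_effort(total_pm: float) -> dict[str, float]:
    """Split a total person-month estimate by average process share."""
    return {proc: round(total_pm * share, 1)
            for proc, share in EIA632_SHARES.items()}

alloc = allocate_effort(36)
print(alloc["System Design"])  # 30% of 36 PM -> 10.8
```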

Effort profiling mini-Delphi (diagram): mapping the EIA/ANSI 632 fundamental processes (Acquisition & Supply, Technical Management, System Design, Product Realization, Technical Evaluation) to the ISO/IEC 15288 phases (Conceptualize, Develop, Operational Test & Evaluation, Transition to Operation).

Effort Distribution of EIA 632 Fundamental Processes Across ISO 15288 Phases (N = 15)
In each cell: average (standard deviation); rows sum to 100 (check sum).
Process                  Conceptualize   Develop      Operational Test & Eval   Transition to Operation
Acquisition and Supply   28 (12.3)       51 (18.6)    13 (11.3)                  8 (5.0)
Technical Management     22 (10.0)       38 (9.9)     25 (7.4)                  15 (6.4)
System Design            34 (12.4)       40 (19.4)    17 (9.6)                   9 (6.2)
Product Realization      13 (14.1)       30 (24.3)    32 (16.0)                 25 (20.4)
Technical Evaluation     18 (11.4)       27 (11.0)    40 (17.7)                 15 (8.5)
(The Operate, Maintain, or Enhance and Replace or Dismantle columns appear in the original header but contain no values.)
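The two distribution tables can be chained: effort is first split by process, then each process's share is spread across phases using its row in the table above. As a sketch, the snippet below spreads one process's person-months using the Acquisition and Supply averages; the 2.5 PM input is illustrative.

```python
# Phase split for one process row (Acquisition and Supply averages, table above).
ACQ_SUPPLY_PHASES = {
    "Conceptualize": 0.28,
    "Develop": 0.51,
    "Operational Test & Eval": 0.13,
    "Transition to Operation": 0.08,
}

def phase_effort(process_pm: float, phase_shares: dict[str, float]) -> dict[str, float]:
    """Spread one process's person-months across life-cycle phases."""
    return {phase: round(process_pm * share, 2)
            for phase, share in phase_shares.items()}

# Illustrative: 2.5 PM of Acquisition and Supply effort.
print(phase_effort(2.5, ACQ_SUPPLY_PHASES))
```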

Contact
Ricardo Valerdi
MIT Lean Aerospace Initiative
(617)