Review of Effectiveness Measures Evaluation Task
Dan Ingold, USC
USC-CSSE Annual Research Review
March 16, 2009

EM Task Statement of Work

Develop measures to monitor and predict systems engineering effectiveness for DoD Major Defense Acquisition Programs
– Define SysE effectiveness
– Develop measurement methods for contractors, DoD program managers and PEOs, and oversight organizations
  – For weapons platforms, SoSs, and net-centric services
– Recommend a continuous process improvement approach
– Identify a DoD SysE outreach strategy

Consider the full range of data sources
– Journals, tech reports, organizations (INCOSE, NDIA), DoD studies
– Partial examples cited: GAO, SEI, INCOSE, Stevens/IBM
– GFI: Excel version of the SADB

Deliverables: report and presentation
– Approach, sources, measures, examples, results, recommendations

Target EM Task Benefits for DoD

Identification of the best available EMs for DoD use
– Across 3 domains; 3 review levels; planning and execution

Early warning vs. late discovery of SysE effectiveness problems

Identification of current EM capability gaps
– Recommendations for the most cost-effective enhancements and for research on new EM approaches
– Ways to combine EM strengths and avoid weaknesses

Foundation for continuous improvement of DoD SysE effectiveness measurement
– Knowledge base of evolving EM cost-effectiveness
– Improved data for evaluating SysE ROI

Measuring SysE Effectiveness
– and measuring SysE effectiveness measures

Good SysE correlates with project success
– INCOSE definition of systems engineering: "An interdisciplinary approach and means to enable the realization of successful systems"

Good SysE is not a perfect predictor of project success
– A project does bad SysE, but gets lucky at the last minute, finds a new COTS solution, and produces a great success
– A project does great SysE, but poor managers and developers turn it into a disaster

Goodness of a candidate SysE effectiveness measure (EM)
– Whether it can detect when a project's SysE is leading the project more toward success than toward failure

Heuristic for evaluating a proposed SysE EM
– Role-play as an underbudgeted, short-tenure project manager
– Ask "How little can I do and still get a positive rating on this EM?"

Candidate Measurement Methods

– NRC Pre-Milestone A & Early-Phase SysE top-20 checklist
– Army Probability of Program Success (PoPS) Framework
– INCOSE/LMCO/MIT Leading Indicators
– Stevens Leading Indicators (new; using SADB root causes)
– USC Anchor Point Feasibility Evidence progress
– UAH teaming theories
– NDIA/SEI capability/challenge criteria
– SISAIG Early Warning Indicators / USC Macro Risk Tool

Independent EM Evaluations and Resolution

Evaluating teams: USC, Stevens, FC-MD, UAH

Candidate EM                   Independent evaluations
PoPS Leading Indicators        3
INCOSE LIs                     2
Stevens LIs                    3
SISAIG LIs / Macro Risk        3
NRC Top-20 List                3
SEI CMMI-Based LIs             3
USC AP-Feasibility Evidence    3
UAH Team Effectiveness         3

Candidate EM Coverage Matrix (SERC EM Task Coverage Matrix V1.0)

EMs evaluated for coverage: NRC, Probability of Success, SE Leading Indicators, LIPSF (Stevens), Anchoring SW Process (USC), PSSES (U. of Alabama), SSEE (CMU/SEI), Macro Risk Model/Tool. Each criterion below is scored against each EM using the legend at the end of this slide.

Concept development criteria:
– At least 2 alternatives have been evaluated
– Can an initial capability be achieved within the time that the key program leaders are expected to remain engaged in their current jobs (normally less than 5 years or so after Milestone B)? If this is not possible for a complex major development program, can critical subsystems, or at least a key subset of them, be demonstrated within that time frame?
– Will risky new technology mature before Milestone B? Is there a risk mitigation plan?
– Have external interface complexities been identified and minimized? Is there a plan to mitigate their risks?

KPP and CONOPS criteria:
– At Milestone A, have the KPPs been identified in clear, comprehensive, concise terms that are understandable to the users of the system?
– At Milestone B, are the major system-level requirements (including all KPPs) defined sufficiently to provide a stable basis for the development through IOC?
– Has a CONOPS been developed showing that the system can be operated to handle the expected throughput and meet response-time requirements?

Legend: x = covered by EM; (x) = partially covered (unless stated otherwise)

EM Evaluation

First cut is to complete the "45 x 8" evaluation (45 criteria scored against 8 candidate EMs)

Evaluation identifies key criteria / EMs

Preliminary coverage & commonality evaluation
– Four EMs cover more than 1/2 of the criteria
– The top 20 criteria are each mentioned by at least 5 EMs

Revise results after team evaluations are done
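To make the "45 x 8" bookkeeping concrete, here is a minimal sketch, assuming the coverage matrix is hand-entered as a simple mapping from each EM to the criteria it covers; the criterion numbers and marks shown are illustrative placeholders, not the actual worksheet data. Only the counting logic reflects the evaluation described above: criteria covered per EM drive the EM Coverage percentages, and EMs-per-criterion counts drive the EM Commonality grouping shown in the following slides.

```python
# Minimal sketch of the "45 x 8" tally, assuming a hand-entered coverage map.
# Keys are candidate EMs; values map a criterion number (1-45) to a mark:
# "x" = covered, "(x)" = partially covered, per the coverage-matrix legend.
# The entries below are illustrative placeholders, not the actual worksheet data.

NUM_CRITERIA = 45

coverage = {
    "SEI SSEE":         {5: "x", 6: "x", 7: "(x)", 15: "x"},   # ...36 criteria in the real data
    "NRC":              {1: "x", 2: "(x)", 5: "x", 7: "x"},    # ...30 criteria in the real data
    "USC Anchor Point": {5: "x", 7: "x", 15: "(x)"},           # ...28 criteria in the real data
    # remaining five EMs omitted for brevity
}

def em_coverage(cov):
    """Per EM: number of criteria covered (fully or partially) and percent of the 45."""
    return {em: (len(marks), round(100 * len(marks) / NUM_CRITERIA))
            for em, marks in cov.items()}

def criterion_mentions(cov):
    """Per criterion: how many EMs mention it -- the basis of the commonality grouping."""
    mentions = {}
    for marks in cov.values():
        for criterion in marks:
            mentions[criterion] = mentions.get(criterion, 0) + 1
    return mentions

if __name__ == "__main__":
    print(em_coverage(coverage))         # toy output, e.g. {'SEI SSEE': (4, 9), ...}
    print(criterion_mentions(coverage))  # toy output, e.g. {5: 3, 7: 3, 15: 2, ...}
```

With the full 45 x 8 data filled in, em_coverage would reproduce the Number Covered / Percent Covered figures in the EM Coverage slide, and criterion_mentions would reproduce the # of Mentions groups in the EM Commonality slide.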

New SE Evaluation Forms

EM Coverage

Effectiveness Measure     Number Covered    Percent Covered
SEI SSEE                  36                80%
NRC                       30                67%
USC Anchor Point          28                62%
SISAIG/Macro Risk         27                60%
UAH PSSES                 21                47%
Army PoPS                 19                42%
Stevens LI-PSF            18                40%
INCOSE SE LI              17                38%
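As a quick check of the percentages, assuming the denominator is the 45 criteria of the "45 x 8" evaluation: SEI SSEE covers 36 of 45 criteria, and 36 / 45 ≈ 0.80, i.e., 80%; INCOSE SE LI covers 17 of 45, and 17 / 45 ≈ 0.38, i.e., 38%.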

EM Commonality

# of Mentions    Effectiveness Measures
8                5, 6, 7, 15
7                8, 9, 11, 18, 19
6                3, 4, 24, 25, 39, 42
5                1, 2, 10, 17, 29
4                13, 16, 23, 43, 45

Most Mentioned (8)

EM    Definition
5     KPPs identified in concise, understandable terms
6     KPPs sufficiently stable for development
7     CONOPS shows system handles throughput and response time
15    Key risk drivers (not just technical) are identified

Most Mentioned (7)

EM    Definition
8     Major cost / schedule drivers identified, risk plan in place
9     Cost confidence level accepted by stakeholders
11    Requirements account for likely future mission growth
18    Top-level plan defined for system integration and testing
19    Sufficient experienced and talented program and SE managers identified

Most Mentioned (6)

EM    Definition
3     Risky new technology will mature before MS B, and risk mitigation plan in place
4     External interface complexities identified and minimized, risk mitigation plan in place
24    Process compliance
25    Review of review process
39    Verification / validation and configuration management
42    Program governance process, SE plan well articulated

Most Mentioned (5)

EM    Definition
1     At least two alternatives have been evaluated
2     Initial capability can be achieved within expected duty rotation (< 5 years)
10    Sufficient collection of models to validate CONOPS against KPPs
17    Plan in place at MS A defining MS B activities and performers
29    Level of service has been validated

Test of EM Evaluation Process

New Evaluation Framework

Synthesized from workshop crosswalks

Simpler than SEPP / SISAIG / 45 x 8
– Four major categories
– Four to five elements per category

Provides taxonomy of existing frameworks

Coverage spans previous frameworks

Could be basis for new Macro Risk-like tool

Evaluation Framework Criteria

SE Competencies

Developed by ODNI

Comprehensive survey of core competencies
– 10 candidate work activities
– 173 candidate knowledge, skills & abilities (KSAs)

To our knowledge, not yet validated

Approved for limited release within SERC

SE Competency Sample

Q & A