1 Leading Indicators for Systems Engineering Effectiveness
Presentation for NDIA SE Conference, October 28, 2009
Garry Roedler, Lockheed Martin

2 Growing Interest in SE Effectiveness
Questions about the effectiveness of SE processes and activities are being asked by:
– DoD
– INCOSE
– Others
Key activities and events have stimulated interest:
– DoD SE Revitalization
– AF Workshop on System Robustness
Questions raised included:
– How do we show the value of systems engineering?
– How do you know if a program is doing good systems engineering?
Sessions included SE effectiveness measures and criteria for evaluating the goodness of systems engineering on a program.

3 Background of the Systems Engineering Leading Indicators Project
The "SE Leading Indicators Action Team" was formed in late 2004 under the Lean Aerospace Initiative (LAI) Consortium in support of Air Force SE Revitalization.
The team comprises engineering measurement experts from industry, government, and academia, in a collaborative partnership with INCOSE, PSM, and several others.
– Co-Leads: Garry Roedler, Lockheed Martin & Donna Rhodes, MIT ESD/LAI Research Group
– Leading SE and measurement experts from the collaborative partners volunteered to serve on the team.
The team held periodic meetings and used the ISO/IEC 15939 and PSM Information Model to define the indicators. PSM (Practical Software and Systems Measurement) has developed foundational work on measurement under government funding; this effort uses the formats developed by PSM for documenting the leading indicators.

4 A Collaborative Industry Effort
[Slide shows logos of the collaborating organizations] … and several others

5 Objectives of the Project
1. Gain common understanding of the needs and drivers of this initiative
2. Identify information needs underlying the application of SE effectiveness
– Address SE effectiveness and key systems attributes for systems, SoS, and complex enterprises, such as robustness, flexibility, and architectural integrity
3. Identify a set of leading indicators for SE effectiveness
4. Define and document measurable constructs for the highest priority indicators
– Includes the base and derived measures needed to support each indicator, attributes, and interpretation guidance
5. Identify challenges for implementation of each indicator and recommendations for managing implementation
6. Establish recommendations for piloting and validating the new indicators before broad use

6 SE Leading Indicator Definition
A measure for evaluating the effectiveness of how a specific SE activity is applied on a program, in a manner that provides information about impacts that are likely to affect the system performance objectives.
– An individual measure or collection of measures that are predictive of future system performance
– Predictive information (e.g., a trend) is provided before the performance is adversely impacted
– Measures factors that may impact systems engineering performance, not just the system performance itself
– Aids leadership by providing insight to take actions regarding:
  – Assessment of process effectiveness and impacts
  – Necessary interventions and actions to avoid rework and wasted effort
  – Delivering value to customers and end users
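As a concrete illustration of "predictive information provided before performance is adversely impacted", here is a minimal sketch that projects an in-process trend forward to an upcoming checkpoint. The data, threshold, and names are hypothetical assumptions for illustration; this is not taken from the guide.

```python
# Minimal sketch of a leading indicator: fit a trend to in-process data and
# project it to an upcoming checkpoint, so action can be taken before the
# performance objective is actually missed. All values are hypothetical.

def linear_trend(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

# Weekly requirements growth observed so far (percent over baseline).
weeks = [1, 2, 3, 4, 5, 6]
growth = [1.0, 1.8, 2.9, 4.1, 5.2, 6.5]

slope, intercept = linear_trend(weeks, growth)
review_week = 12      # upcoming technical review (hypothetical checkpoint)
threshold = 10.0      # program-defined acceptable growth (hypothetical)

projected = slope * review_week + intercept
if projected > threshold:
    print(f"Leading indicator alert: projected growth {projected:.1f}% "
          f"exceeds the {threshold:.0f}% threshold at week {review_week}.")
```

Run against this sample data, the projection (about 13%) crosses the threshold well before the review, which is exactly the kind of early, actionable signal the definition above calls for.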

7 Leading Indicators
[Figure (Copyright 2009, YorkMetrics): a fire analogy arranging indicator types along a causes-to-consequences spectrum spanning engineering capability, engineering status, engineering performance, and financial indicators. Fires correspond to being behind schedule and unpredictable; fire alarms to the product not maturing fast enough; smoke detectors to performance not meeting plans; sources of ignition to the need to monitor drivers and triggers.]

8 Interactions Among Factors
[Figure: interaction diagram relating functional size, product size, effort, schedule, product quality, customer satisfaction, process performance, technology effectiveness, and SE technical issues. Adapted from J. McGarry, D. Card, et al., Practical Software Measurement, Addison-Wesley, 2002.]

9 Criteria of Leading Indicators
– Early in activity flow
– In-process data collection
– In time to make decisions (actionable; supports key decisions)
– Objective
– Insight into goals / obstacles
– Able to provide regular feedback
– Can support defined checkpoints (technical reviews, etc.)
– Confidence: quantitative (statistical) or qualitative
– Decision criteria for interpretation can be clearly and objectively defined (thresholds)
– Tailorable or universal
These criteria were used to prioritize candidates for inclusion in the guide.

10 Systems Engineering Leading Indicators
Objective: develop a set of SE leading indicators to assess whether a program is performing SE effectively, and to enhance proactive decision making.
– Thirteen leading indicators defined by SE measurement experts
– Beta guide released December 2005 for validation
– Pilot programs conducted
– Workshops conducted
– Survey conducted: 106 responses; queried the utility of each indicator; no obvious candidates for deletion
– Version 1.0 released in June 2007

11 List of Indicators
The current set has 13 leading indicators:
– Requirements Trends (growth; correct and complete)
– System Definition Change Backlog Trends (cycle time, growth)
– Interface Trends (growth; correct and complete)
– Requirements Validation Rate Trends (at each level of development)
– Requirements Verification Trends (at each level of development)
– Work Product Approval Trends: internal approval (by program review authority) and external approval (by the customer review authority)
– Review Action Closure Trends (plan vs. actual for closure of actions over time)
– Technology Maturity Trends (planned vs. actual over time): new technology (applicability to programs) and older technology (obsolescence)
– Risk Exposure Trends (planned vs. actual over time)
– Risk Handling Trends (plan vs. actual for closure of actions over time)
– SE Staffing and Skills Trends: number of SE staff per staffing plan (level or skill, planned vs. actual)
– Process Compliance Trends
– Technical Measurement Trends: MOEs (or KPPs), MOPs, TPMs, and margins
Several of these are plan-vs.-actual trends; a small worked sketch follows the list.
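A minimal sketch, with hypothetical numbers rather than guide content, of how a plan-vs.-actual indicator such as Review Action Closure Trends could flag a widening gap:

```python
# Hypothetical plan-vs.-actual sketch for the Review Action Closure Trends
# indicator: compare cumulative planned and actual closures by month and
# flag a monotonically widening gap as a leading signal.

planned_closures = [5, 10, 18, 26, 35, 45]   # cumulative plan by month
actual_closures  = [5,  9, 15, 20, 24, 28]   # cumulative actuals by month

gaps = [p - a for p, a in zip(planned_closures, actual_closures)]
widening = all(later >= earlier for earlier, later in zip(gaps, gaps[1:]))

for month, gap in enumerate(gaps, start=1):
    print(f"Month {month}: closure gap = {gap}")

if widening and gaps[-1] > 0:
    print("Trend warning: the plan-vs.-actual gap is widening; review "
          "action closure is falling behind plan.")
```

Here the gap grows from 0 to 17 actions over six months, so the warning fires while there is still schedule left to intervene, which is the point of treating the trend, rather than the month-end count, as the indicator.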

12 Fields of Information Collected for Each Indicator
– Information Need/Category
– Measurable Concept
– Leading Information Description
– Base Measures Specification: base measures description, measurement methods, units of measure
– Entities and Attributes: relevant entities (being measured) and attributes (of the entities)
– Derived Measures Specification: derived measures description, measurement function
– Indicator Specification: indicator description and sample, thresholds and outliers, decision criteria, indicator interpretation
– Additional Information: related SE processes, assumptions, additional analysis guidance, implementation considerations, user of the information, data collection procedure, data analysis procedure
Derived from the measurement guidance of PSM and ISO/IEC 15939 (Measurement Process). A data-model sketch of this construct appears below.
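The following is a minimal data-model sketch of the base measure / derived measure / indicator construct that PSM and ISO/IEC 15939 describe. Class and field names are illustrative assumptions, not the guide's normative schema.

```python
# Illustrative data model (hypothetical names): base measures feed derived
# measures via a measurement function; an indicator adds thresholds and
# decision criteria for interpretation.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class BaseMeasure:
    name: str                        # e.g. "requirements at baseline"
    unit: str                        # unit of measure
    values: List[float] = field(default_factory=list)

@dataclass
class DerivedMeasure:
    name: str
    function: Callable[..., float]   # measurement function over base measures
    inputs: List[BaseMeasure]

    def compute(self, index: int) -> float:
        return self.function(*(m.values[index] for m in self.inputs))

@dataclass
class Indicator:
    name: str
    derived: DerivedMeasure
    threshold: float
    decision_criteria: str           # interpretation guidance for leadership

# Example: requirements growth (%) derived from two base measures.
baseline = BaseMeasure("requirements at baseline", "count", [100, 100, 100])
current  = BaseMeasure("requirements now", "count", [102, 107, 115])
growth = DerivedMeasure("requirements growth %",
                        lambda b, c: 100.0 * (c - b) / b,
                        [baseline, current])
indicator = Indicator("Requirements Trends", growth, threshold=10.0,
                      decision_criteria="investigate scope and stability")

print(indicator.derived.compute(2))  # 15.0 -> above threshold, investigate
```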

13 Guide Contents
1. About This Document
2. Executive Summary: includes Table 1 with an overview of the indicators and a mapping to life cycle phases/stages
3. Leading Indicators Descriptions: includes a brief narrative description of each indicator, a description of the leading information provided, and example graphics
4. Information Measurement Specifications: detailed definitions of each indicator, including all fields of information

14 Example of Section 3 Contents
Graphics are for illustrative purposes only; they may reflect a single aspect of the indicator.

15 Example of Section 4 Contents

16 Example of Section 4 Contents (Cont’d)

17 Systems Engineering Leading Indicators Application to Life Cycle Phases/Stages

18 Indicator's Usefulness for Gaining Insight into the Effectiveness of Systems Engineering (1 of 3)

Indicator | Critical | Very Useful | Somewhat Useful | Limited Usefulness | Not Useful | Usefulness Rating*
Requirements Trends | 24% | 35% | 11% | 3% | 3% | 4.1
[Corresponding rows for System Definition Change Backlog Trends, Interface Trends, Requirements Validation Trends, Requirements Verification Trends, Work Product Approval Trends, Review Action Closure Trends, Risk Exposure Trends, Risk Handling Trends, Technology Maturity Trends, Technical Measurement Trends, Systems Engineering Staffing & Skills Trends, and Process Compliance Trends appear in the original table.]

* Usefulness Rating is defined on the following slide. Percentages shown are based on total survey responses. Not all indicator responses total 100% due to round-off error or because individual surveys did not include responses for every question.

19 Indicator's Usefulness for Gaining Insight into the Effectiveness of Systems Engineering (2 of 3)
Usefulness Ratings were defined via the following guidelines:
– Critical: crucial in determining the effectiveness of Systems Engineering
– Very Useful: frequent insight and/or very useful for determining the effectiveness of Systems Engineering
– Somewhat Useful: occasional insight into the effectiveness of Systems Engineering
– Limited Usefulness: limited insight into the effectiveness of Systems Engineering
– Less than 2.0 = Not Useful: no insight into the effectiveness of Systems Engineering
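The slides do not state how the numeric rating was computed. One plausible reading, assumed here, is a weighted mean of responses on a 5-point scale (Critical = 5 through Not Useful = 1); applied to the published Requirements Trends percentages it gives roughly 4.0, near the reported 4.1, with the gap plausibly explained by responses not captured in the published percentages.

```python
# Hypothetical reconstruction: the slides do not give the rating formula.
# Assume a weighted mean on a 5-point scale over the published percentages.

scale = {"Critical": 5, "Very Useful": 4, "Somewhat Useful": 3,
         "Limited Usefulness": 2, "Not Useful": 1}
responses = {"Critical": 24, "Very Useful": 35, "Somewhat Useful": 11,
             "Limited Usefulness": 3, "Not Useful": 3}  # percentages

total = sum(responses.values())
rating = sum(scale[k] * v for k, v in responses.items()) / total
print(f"Usefulness rating ~ {rating:.1f}")   # ~ 4.0 for Requirements Trends
```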

20 Looking Forward – What Next?

21 Next Steps/Action Items
– SELI Guide revision planned for release in December
– Continue to conduct SELI telecons every 3 weeks
– Contact Howard Schimmoller, Garry Roedler, or Cheryl Jones for information

22 New Indicators
New indicators under consideration:
1. Test Completeness
2. Resource Volatility
3. Defect and Error Trends
4. System Affordability
5. Architecture Trends
6. Algorithm & Scenario Trends
7. Complexity Change Trends
8. Concept Development (may be considered based on needs identified by the UARC EM task)
9. Two other indicators are being contributed for consideration
The revision will include those that have matured by late November.

23 Additional Information on Specific Application and Relationships
1. Cost-effective sets of base measures that support the greatest number of indicators
2. Indicators vs. SE activities of ISO/IEC 15288
3. Application of the SE Leading Indicators for Human System Integration (HSI)
4. Application of the SE Leading Indicators for Understanding Complexity

24 SELI versus SE Activities of ISO/IEC 15288

25 NAVAIR Applied Leading Indicators (ALI) Methodology
– Systematically analyzes multiple data elements for a specific information need to determine mathematically valid relationships with significant correlation; these are then identified as Applied Leading Indicators
– Provides a structured approach for validation of the LIs and for identifying the most useful relationships
– Unanimous agreement to include this in the SELI guide
– NAVAIR (Greg Hein) to summarize the methodology for incorporation into the SELI Guide revision as an appendix; the summary will include links to any supplementary information and guidance
A correlation-screening sketch of the idea appears below.
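The following is a sketch of the screening idea behind ALI, with hypothetical program data and an arbitrary correlation cutoff. The actual NAVAIR methodology, including its validity and significance tests, is what the planned appendix will document.

```python
# Hypothetical ALI-style screening: correlate candidate measures with an
# outcome of interest across past programs, and promote strong correlates
# as candidate Applied Leading Indicators. A real validation would also
# test statistical significance and plausibility of the relationship.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Candidate measures observed across five past programs (hypothetical).
candidates = {
    "requirements growth %": [2, 8, 15, 4, 20],
    "staffing shortfall %":  [0, 5, 12, 2, 18],
    "review actions open":   [30, 28, 31, 29, 30],
}
cost_overrun = [1, 6, 14, 3, 19]   # outcome of interest (hypothetical)

for name, series in candidates.items():
    r = pearson(series, cost_overrun)
    verdict = "candidate ALI" if abs(r) > 0.8 else "weak relationship"
    print(f"{name}: r = {r:+.2f} ({verdict})")
```

With this sample data the first two measures correlate strongly with overrun while the third does not, illustrating how the screening separates useful relationships from noise.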

26 Interaction with SERC SE Effectiveness Measurement Project
The SE Leading Indicators Guide is pointed to from the SERC SE Effectiveness Measurement (EM) project for the quantitative measurement perspective.
SERC EM contribution:
– Short-term:
  – Mapping of the SE Effectiveness Measurement Framework to the SE Leading Indicators (SELI): 51 Criteria => Critical Success Factors => Questions => SELI, where Critical Success Factors serve as Information Needs and Questions serve as Measurable Concepts
  – Mapping of the 51 Criteria to the SELI
  – Review to ensure consistency of concepts and terminology
– Longer-term:
  – Work with OSD to get infrastructure in place to support data collection and analysis
  – Tie to the SRCA DB (TBR); may require government access and analysis
A hypothetical illustration of the mapping chain appears below.
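For concreteness, here is a hypothetical sketch of the mapping chain described above, from a criterion through a critical success factor and question down to indicators. The entries are invented placeholders, not actual content of the 51-criteria framework.

```python
# Hypothetical mapping chain: criterion -> critical success factor
# (information need) -> question (measurable concept) -> SE leading
# indicator(s). All entries are invented for illustration.
mapping = {
    "C-07: requirements are stable and traceable": {
        "critical_success_factor": "Requirements stability",
        "question": "Are requirements converging as reviews approach?",
        "leading_indicators": ["Requirements Trends",
                               "Requirements Validation Rate Trends"],
    },
}

for criterion, chain in mapping.items():
    print(criterion, "->", chain["leading_indicators"])
```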

27 QUESTIONS?