Proposed Approach to SERC EM Task: Assessing SysE Effectiveness in Major Defense Acquisition Programs (MDAPs)
Barry Boehm, USC-CSSE
26 November 2008

Outline
- EM Task Statement of Work
  – Definition of "Systems Engineering Effectiveness"
- EM Task Schedule and Planned Results
- Candidate Measurement Methods, Evaluation Criteria, Evaluation Approach
- Survey and evaluation instruments
- Workshop objectives and approach

EM Task Statement of Work
- Develop measures to monitor and predict systems engineering effectiveness for DoD Major Defense Acquisition Programs
  – Define SysE effectiveness
  – Develop measurement methods for contractors, DoD program managers and PEOs, and oversight organizations, covering weapons platforms, SoSs, and net-centric services
  – Recommend a continuous process improvement approach
  – Identify a DoD SysE outreach strategy
- Consider the full range of data sources
  – Journals, tech reports, organizations (INCOSE, NDIA), DoD studies
  – Partial examples cited: GAO, SEI, INCOSE, Stevens/IBM
- GFI: Excel version of the SADB
- Deliverables: report and presentation
  – Approach, sources, measures, examples, results, recommendations

Measuring SysE Effectiveness - and measuring SysE effectiveness measures
- Good SysE correlates with project success
  – INCOSE definition of systems engineering: "An interdisciplinary approach and means to enable the realization of successful systems"
- Good SysE is not a perfect predictor of project success
  – A project can do bad SysE, get lucky at the last minute by finding a new COTS solution, and produce a great success
  – A project can do great SysE, but poor managers and developers turn it into a disaster
- Goodness of a candidate SysE effectiveness measure (EM): whether it can detect when a project's SysE is leading the project more toward success than toward failure (see the sketch below)
- Heuristic for evaluating a proposed SysE EM
  – Role-play as an underbudgeted, short-tenure project manager
  – Ask "How little can I do and still get a positive rating on this EM?"
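The "goodness" test above can be made concrete by checking a candidate EM retrospectively against completed projects with known outcomes. Below is a minimal sketch of that check, assuming a hypothetical set of historical EM scores and outcomes; neither the data nor the threshold rule comes from the presentation.

```python
# Minimal sketch: check how well a candidate EM separates successful from
# unsuccessful projects on historical data. The project data, the 0-10 EM
# scores, and the threshold rule are hypothetical illustrations; the
# presentation does not define this procedure.

# (em_score, outcome) pairs; outcome True = project judged successful
history = [
    (8.5, True), (7.0, True), (6.5, False), (9.0, True),
    (3.0, False), (4.5, False), (5.5, True), (2.0, False),
]

def discrimination_rate(data, threshold):
    """Fraction of projects classified correctly when EM scores above the
    threshold are predicted to lead to success."""
    correct = sum((score > threshold) == succeeded for score, succeeded in data)
    return correct / len(data)

if __name__ == "__main__":
    for t in (4.0, 5.0, 6.0):
        print(f"threshold {t}: {discrimination_rate(history, t):.0%} classified correctly")
```

The slide's role-play heuristic applies to the same idea: if a minimally diligent project can still score above the threshold, the EM will discriminate poorly no matter where the threshold is set.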

RESL as Proxy for SysE Effectiveness: A COCOMO II Analysis Used on FCS

Added Cost of Minimal Software Systems Engineering

How Much Architecting is Enough?
[Chart: percent of added schedule devoted to rework (COCOMO II RESL factor) and total percent of added schedule, plotted against the percent of project schedule devoted to initial architecture and risk resolution, for projects of roughly 10, 100, and 10,000 KSLOC. Each curve has a sweet spot; rapid change pushes the sweet spot leftward, high assurance pushes it rightward.]
An illustrative calculation of this tradeoff is sketched below.
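The sweet-spot curves reflect a simple tradeoff: total added schedule is the architecting investment plus the rework that investment fails to prevent. The sketch below illustrates the arithmetic with an assumed, uncalibrated rework curve; it is not the COCOMO II RESL calibration behind the chart, but it reproduces the qualitative behavior.

```python
# Illustrative sweet-spot arithmetic: total added schedule is the percent of
# schedule invested in architecture/risk resolution plus the rework that
# investment fails to prevent. The rework curve below is an assumed stand-in,
# not the calibrated COCOMO II RESL relationship behind the chart.
import math

def rework_percent(arch_percent, ksloc):
    """Assumed rework model: bigger systems start with more rework, and added
    architecting effort reduces it exponentially."""
    initial_rework = 20 * math.log10(ksloc)   # e.g. 10 KSLOC -> 20%, 10,000 KSLOC -> 80%
    return initial_rework * math.exp(-0.08 * arch_percent)

def total_added_schedule(arch_percent, ksloc):
    return arch_percent + rework_percent(arch_percent, ksloc)

def sweet_spot(ksloc):
    """Architecting investment (searched in 1% steps) minimizing total added schedule."""
    return min(range(0, 61), key=lambda a: total_added_schedule(a, ksloc))

if __name__ == "__main__":
    for size in (10, 100, 10_000):
        best = sweet_spot(size)
        print(f"{size:>6} KSLOC: sweet spot ~{best}% architecting, "
              f"~{total_added_schedule(best, size):.0f}% total added schedule")
```

With these made-up parameters the sweet spot moves from a few percent of schedule for a 10 KSLOC project to over 20% for a very large one, matching the chart's qualitative message that larger systems justify more up-front architecting.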

EM Task Schedule and Results

Period | Activity | Results
12/8/08 – 1/28/09 | Candidate EM assessments, surveys, interviews, coverage matrix | Initial survey results, candidate EM assessments, coverage matrix
1/29–30/09 | SERC-internal joint workshop with MPT | Progress report on results, gaps, and plans for gap follow-ups
2/1 – 3/27/09 | Sponsor feedback on results and plans; sponsor identification of candidate pilot organizations; execution of plans; suggested EMs for weapons platform (WP) pilots | Identification of WP pilot candidate organizations at contractor, PM/PEO, and oversight levels; updated survey, EM evaluation, and recommended EM results
3/30 – 4/3/09 (week of) | SERC-external joint workshop with MPT, sponsors, collaborators, pilot candidates, and potential EM users | Guidance for refining recommended EMs; candidates for pilot EM evaluations
4/7 – 5/1/09 | Tailor lead EM candidates for WP pilots; SADB-based evaluations of candidate EMs | Refined, tailored EM candidates for WP pilots at contractor, PM/PEO, and oversight levels; pilot evaluation guidelines
5/6–7/09 | Joint workshop with MPT, sponsors, collaborators, pilot candidates, and stakeholders; select WP EM pilots | Selected pilots; guidance for final preparation of EM candidates and evaluation guidelines
5/11 – 7/10/09 | WP EM pilot experiments; analysis and evaluation of guidelines and results; refinement of initial SADB evaluation results based on EM improvements | EM pilot experience database and survey results; refined SADB EM evaluations
7/13 – 8/14/09 | Analyze WP EM pilot and SADB results; prepare draft report on results, conclusions, and recommendations | Draft report on WP and general EM evaluation results, conclusions, and recommendations for usage and research/transition/education initiatives
8/17–18/09 | Workshop on draft report with sponsors, collaborators, WP EM deep-dive evaluators, and stakeholders | Feedback on draft report results, conclusions, and recommendations
8/19 – 9/4/09 | Prepare, present, and deliver final report | Final report on WP and general EM evaluation results, conclusions, and recommendations for usage and research/transition/education initiatives

Outline
- EM Task Statement of Work
  – Definition of "Systems Engineering Effectiveness"
- EM Task Schedule and Planned Results
- Candidate Measurement Methods, Evaluation Criteria, Evaluation Approach
- Survey and evaluation instruments
- Workshop objectives and approach

Candidate Measurement Methods
- NRC Pre-Milestone A & Early-Phase SysE top-20 checklist
- Air Force Probability of Program Success (PoPS) Framework
- INCOSE/LMCO/MIT Leading Indicators
- Stevens Leading Indicators (new; using SADB root causes)
- USC Anchor Point Feasibility Evidence progress
- UAH teaming theories
- NDIA/SEI capability/challenge criteria
- SISAIG Early Warning Indicators / USC Macro Risk Tool

Independent EM Evaluations and Resolution
[Assignment matrix: each candidate EM (PoPS Leading Indicators, INCOSE LIs, Stevens LIs, SISAIG LIs / Macro Risk, NRC Top-20 List, SEI CMMI-Based LIs, USC AP-Feasibility Evidence, UAH Team Effectiveness) is independently evaluated by two to three of the four teams: USC, Stevens, FC-MD, and UAH.]

Extended SEPP Guide EM Evaluation Criteria

Quality Attributes
- Accuracy and Objectivity: Does the EM accurately and objectively identify the degree of SE effectiveness, or is it easy to construct counterexamples in which it would be inaccurate?
- Level of Detail: How well does the EM cover the spectrum between strategic and tactical data collection and analysis?
- Scalability: How well does the EM cover the spectrum between simple weapons platforms and ultra-large systems of systems with deep supplier chains and multiple concurrent and interacting initiatives?
- Ease of Use/Tool Support: Is the EM easy to learn and apply by non-experts, and well supported by tools, or does it require highly specialized knowledge and training?
- Adaptability: Is the EM easy to adapt or extend to apply to different domains, levels of assessment, changing priorities, or special circumstances?
- Maturity: Has the EM been extensively used and improved across a number and variety of applications?

Cost Attributes
- Cost to collect data
- Cost to relate data to effectiveness
- Schedule efficiency
- Key personnel feasibility

Functional Attributes
(See the scoring sketch below.)
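A minimal sketch, assuming illustrative weights and 1-5 ratings that the presentation does not specify, of how criteria like these could be rolled up into a single comparison score per candidate EM (the later discussion slide notes that single-number vs. range assessment is itself an open question):

```python
# Hypothetical weighted-scoring sketch for comparing candidate EMs against
# evaluation criteria like those above. The weights and the 1-5 ratings are
# invented for illustration; the presentation does not prescribe them.

CRITERIA_WEIGHTS = {
    "accuracy_objectivity": 0.25,
    "level_of_detail": 0.10,
    "scalability": 0.15,
    "ease_of_use_tool_support": 0.15,
    "adaptability": 0.10,
    "maturity": 0.10,
    "cost_to_collect_data": 0.15,
}  # weights sum to 1.0

candidate_ratings = {
    "PoPS": {"accuracy_objectivity": 3, "level_of_detail": 4, "scalability": 3,
             "ease_of_use_tool_support": 4, "adaptability": 3, "maturity": 4,
             "cost_to_collect_data": 3},
    "INCOSE LIs": {"accuracy_objectivity": 4, "level_of_detail": 4, "scalability": 3,
                   "ease_of_use_tool_support": 3, "adaptability": 3, "maturity": 3,
                   "cost_to_collect_data": 2},
}

def weighted_score(ratings):
    """Weighted average of 1-5 ratings across the evaluation criteria."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

if __name__ == "__main__":
    for em in sorted(candidate_ratings,
                     key=lambda e: weighted_score(candidate_ratings[e]), reverse=True):
        print(f"{em}: {weighted_score(candidate_ratings[em]):.2f} out of 5")
```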

Criteria validation (EM)

Candidate EM Coverage Matrix (SERC EM Task Coverage Matrix V1.0)
EMs compared: NRC checklist, Probability of Success, SE Leading Indicators, LIPSF (Stevens), Anchoring SW Process (USC), PSSES (U. of Alabama), SSEE (CMU/SEI), Macro Risk Model/Tool. Legend: x = covered by the EM; (x) = partially covered (unless stated otherwise).

Concept Development criteria:
- At least two alternatives have been evaluated.
- Can an initial capability be achieved within the time that the key program leaders are expected to remain engaged in their current jobs (normally less than 5 years or so after Milestone B)? If this is not possible for a complex major development program, can critical subsystems, or at least a key subset of them, be demonstrated within that time frame?
- Will risky new technology mature before Milestone B? Is there a risk mitigation plan?
- Have external interface complexities been identified and minimized? Is there a plan to mitigate their risks?

KPP and CONOPS criteria:
- At Milestone A, have the KPPs been identified in clear, comprehensive, concise terms that are understandable to the users of the system?
- At Milestone B, are the major system-level requirements (including all KPPs) defined sufficiently to provide a stable basis for development through IOC?
- Has a CONOPS been developed showing that the system can be operated to handle the expected throughput and meet response-time requirements?

[The full matrix marks each criterion per EM as covered, partially covered, or not covered.]
(A sketch of a machine-readable form of such a matrix appears below.)
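Keeping a coverage matrix like this in machine-readable form makes it easy to tally per-EM coverage and to spot criteria that no EM covers fully. The sketch below uses a small made-up fragment with placeholder cell values; it does not reproduce the actual assessments.

```python
# Sketch of a machine-readable coverage matrix. Cell values:
# "x" = covered, "(x)" = partially covered, "" = not covered.
# The fragment below is illustrative, not the real matrix contents.

coverage = {
    "At least 2 alternatives evaluated": {
        "NRC": "x", "PoPS": "x", "SE Leading Indicators": "(x)", "Macro Risk": "x",
    },
    "Risky technology matures before Milestone B": {
        "NRC": "x", "PoPS": "x", "SE Leading Indicators": "x", "Macro Risk": "(x)",
    },
    "KPPs clear and understandable at Milestone A": {
        "NRC": "x", "PoPS": "(x)", "SE Leading Indicators": "x", "Macro Risk": "",
    },
}

def coverage_counts(matrix):
    """Count full and partial coverage per EM across all criteria."""
    counts = {}
    for cells in matrix.values():
        for em, mark in cells.items():
            full, partial = counts.get(em, (0, 0))
            counts[em] = (full + (mark == "x"), partial + (mark == "(x)"))
    return counts

def uncovered_criteria(matrix):
    """Criteria that no EM covers fully."""
    return [c for c, cells in matrix.items() if "x" not in cells.values()]

if __name__ == "__main__":
    for em, (full, partial) in coverage_counts(coverage).items():
        print(f"{em}: {full} covered, {partial} partially covered")
    print("Coverage gaps:", uncovered_criteria(coverage))
```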

EM “Survey” Instruments
Two web-/PDF-based “surveys” now in place:
- EM set exposure, familiarity, and use
  – Participants: industry, government, academia
  – Short survey, seeking experience with EM sets
  – Presently distributed only to a limited audience
  – Finding individual metrics used, less so EM sets
- EM evaluation form
  – Participants: Task 1 EM team members
  – Detailed survey; requires in-depth evaluation of EMs
  – Follow-up interviews planned for qualitative assessments
  – Evaluation still in progress; too early to expect results

Exposure to EM Sets
- Only two EM sets used (one by DoD, the other by industry)
- Even in the limited audience, EM sets are not widely used
- Several respondents have no exposure at all
- Some “union of EM sets” might be required

EM Task Schedule and Results (the schedule table shown earlier is repeated here)

Target EM Task Benefits for DoD
- Identification of the best available EMs for DoD use
  – Across 3 domains, 3 review levels, and both planning and execution
- Early warning vs. late discovery of SysE effectiveness problems
- Identification of current EM capability gaps
  – Recommendations for the most cost-effective enhancements and for research on new EM approaches
  – Ways to combine EM strengths and avoid weaknesses
- Foundation for continuous improvement of DoD SysE effectiveness measurement
  – Knowledge base of evolving EM cost-effectiveness
  – Improved data for evaluating SysE ROI

Candidate EM Discussion Issues
- Feedback on evaluation criteria and approach
  – Single-number vs. range assessment
  – Effectiveness may vary by domain and management level
- Experience with candidate EMs
- Additional candidate EMs
  – Dropped: GAO, COSYSMO parameters, NUWC open-systems
- Feedback on EM survey and evaluation instruments
  – Please turn in your survey response
- Feedback on coverage matrix
- Other issues or concerns

Macro Risk Model Interface