INCOSE MBSE Initiative Methodology and Metrics Activity – Breakout Session Outbrief, 31 January 2011 (International Workshop, 28 Jan – 2 Feb 2011, Phoenix, AZ, USA)


Slide 1: INCOSE MBSE Initiative Methodology and Metrics Activity – Breakout Session Outbrief
31 January 2011
International Workshop, 28 Jan – 2 Feb 2011, Phoenix, AZ, USA

Slide 2: Breakout Summary
- Number of participants: ~25-30 (23 on sign-up sheet)
- Reviewed breakout agenda
- Shared standard breakout questions (next slides)
- Outstanding talk and video demo from G. Osvalds on "Using Simulation and Visualization to Support MBSE"
  – Application of Harmony-SE tying in visualization
- Broke into open discussion around the standard questions and general feedback on the topics of methodology and metrics

Slide 3: Standard Breakout Questions (All Teams)
1. What metrics should we use to measure goodness?
2. What kind of data is out there?
3. Who can contribute to webinars?
4. Who can contribute to papers for 2012?
5. What can you contribute to advancing the MBSE roadmap?
6. What standards are relevant to your particular area?

Slide 4: Participant Feedback IW11 [Methodology (1/1)]
- "Make or buy"?
  – Consensus centers around tailoring candidate methodologies
  – Questions emerge such as "How do we tailor a methodology?" and "Where do we start?"
  – Suggest a review of current SE practices, many of which have developed over dozens of years within an organization
  – Try to answer how to meet the spirit of existing/proven techniques in a model-based or model-driven context
- Some methodologies are better suited to certain domains – truth or myth?
- Experiences with hybrid approaches
- In some cases, external constraints may drive the use or limit the selection of methodologies
- Should methodology selection be risk-driven? (Apply a risk-driven approach to selection)

Slide 5: Breakout Agenda for IW11 [Methodologies (2/2)]
- Suggestion for INCOSE to reach out to academia and frame this as a study problem
- How do we create incremental steps toward deploying a methodology, and measure progress along the way?
  – Perception exists that it is "all or nothing"
- Need to answer how you plan to grow a methodology within an organization
  – Requires organizations to identify a Process Owner to provide continuous stewardship of the methodology, not just select a methodology and consider the job done
- Metrics for success: measure actual use of the adopted/tailored methodology(ies) or hybrid throughout the project lifecycle (a minimal sketch of such a usage metric follows this slide)
- Applicable/candidate standards?
  – OMG Software & Systems Process Engineering Metamodel (SPEM) 2.0
  – Tool support: Eclipse EPF Composer / IBM Rational Method Composer (RMC)
- Excellent interest level in contributing to this Activity Team: Rick Steiner, JD Baker, David Long, Chris Hansen, Arno Granados, Channy Laux
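To make the "measure actual use" idea concrete, below is a minimal sketch, not part of the original outbrief, of how a team might track what fraction of work products in each lifecycle phase were produced by following the adopted methodology's prescribed activities. The phase names, sample data, and the `adoption_rate` function are hypothetical illustrations, not an INCOSE or SPEM API.

```python
from collections import Counter

# Hypothetical record of work products per lifecycle phase, tagged with whether
# each was produced by following the adopted/tailored methodology's activities.
# Phase names and data are illustrative only.
work_products = [
    {"phase": "requirements", "followed_methodology": True},
    {"phase": "requirements", "followed_methodology": False},
    {"phase": "architecture",  "followed_methodology": True},
    {"phase": "design",        "followed_methodology": True},
    {"phase": "verification",  "followed_methodology": False},
]

def adoption_rate(products):
    """Return the fraction of work products per phase that followed the methodology."""
    totals, followed = Counter(), Counter()
    for p in products:
        totals[p["phase"]] += 1
        if p["followed_methodology"]:
            followed[p["phase"]] += 1
    return {phase: followed[phase] / totals[phase] for phase in totals}

if __name__ == "__main__":
    for phase, rate in adoption_rate(work_products).items():
        print(f"{phase:>13}: {rate:.0%} of work products followed the adopted methodology")
```

Tracking the metric per phase (rather than one project-wide number) is one way to address the "measure progress along the way" concern raised above.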

Slide 6: Breakout Agenda for IW11 [Metrics (1/3)]
- Measure architecture "goodness"/maturity throughout the project lifecycle
  – How do we know our design is any good? Completeness, elegance, maturity spec?
- Measure/quantify how MBSE helps to facilitate V&V earlier in the project lifecycle
- Measure of risk mitigation/closure
- Measure of the system satisfying quality attributes
- Can we associate cost with the size of the system model and estimate the level of effort to complete? (see the sketch after this slide)
- Measure learning curve, capital investment, lessons learned, obtaining feedback
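As a thought experiment on the cost-versus-model-size question above, here is a minimal sketch assuming a simple COCOMO-style power law; the element categories, weights, exponent, and productivity constant are entirely hypothetical placeholders that would have to be calibrated against real project data before any use.

```python
# Hypothetical model-size-to-effort sketch (COCOMO-style power law).
# Weights, exponent, and productivity constant are illustrative placeholders,
# not calibrated values.
ELEMENT_WEIGHTS = {
    "requirements": 1.0,
    "blocks": 2.0,
    "activities": 1.5,
    "state_machines": 2.5,
    "interfaces": 1.0,
}

def estimated_effort_hours(element_counts, a=4.0, b=1.1):
    """Estimate effort as a * (weighted model size) ** b, in person-hours."""
    size = sum(ELEMENT_WEIGHTS.get(kind, 1.0) * count
               for kind, count in element_counts.items())
    return a * size ** b

if __name__ == "__main__":
    sample_model = {"requirements": 120, "blocks": 45, "activities": 30,
                    "state_machines": 12, "interfaces": 25}
    print(f"Estimated effort: {estimated_effort_hours(sample_model):,.0f} person-hours")
```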

Slide 7: Breakout Agenda for IW11 [Metrics (2/3)]
- Following MBSE adoption, use metrics to monitor progress
  – Measure MBSE progress as it is being applied, following adoption
- Need to identify high-payoff metrics (prioritize); may be driven by stakeholder engagement, i.e., stakeholders with dollars at stake who need to quantify ROI
  – Types of metrics, e.g., tool metrics, process metrics, cost metrics, others?
- Measure adaptability of a particular methodology (related to Usability)
- Measure compression of the timeline to field new systems using MBSE
- Measure learning curve, capital investment, lessons learned, obtaining feedback
- Measure introduction of errors throughout the lifecycle process
- Measure reuse, payoff, and reduced time-to-market of the MBSE paradigm
- Measure cost of processing change requests

Slide 8: Breakout Agenda for IW11 [Metrics (3/3)]
- Measure the amount of work or "design-in-process" to identify areas of priority and focus
  – Objective function to minimize flow rate of design-in-process
- Measure of model complexity (see the sketch after this slide)
  – This is a "DARPA hard" problem
  – A lot of research in this area
- Measure completeness of work products throughout the project lifecycle, with emphasis early in the lifecycle
- Applicable/candidate standards? ISO/IEC 27004, CMU CMMI, IEEE 1024 (software)
- Coordinate with the INCOSE Measurement WG
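To illustrate one very rough way the model-complexity question could be approached, the sketch below, purely a hypothetical illustration and not a method endorsed in the outbrief, treats the system model as a graph of elements and relationships and reports a few simple structural measures. A real complexity metric would need far more nuance; as the slide notes, this is a "DARPA hard" problem.

```python
# Hypothetical structural-complexity sketch: treat the system model as a graph
# of elements (nodes) and relationships (edges) and report simple measures.
# Element names and relationships are illustrative only.
from collections import defaultdict

relationships = [  # (source element, target element)
    ("Vehicle", "Powertrain"), ("Vehicle", "Chassis"),
    ("Powertrain", "Engine"), ("Powertrain", "Transmission"),
    ("Engine", "ECU"), ("ECU", "Transmission"),
]

def complexity_measures(edges):
    """Return simple structural measures of a model graph."""
    nodes = {n for edge in edges for n in edge}
    degree = defaultdict(int)
    for src, dst in edges:
        degree[src] += 1
        degree[dst] += 1
    return {
        "elements": len(nodes),
        "relationships": len(edges),
        "avg_degree": sum(degree.values()) / len(nodes),
        "max_degree": max(degree.values()),
    }

if __name__ == "__main__":
    for name, value in complexity_measures(relationships).items():
        print(f"{name}: {value}")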

Slide 9: Backup

Slide 10: Breakout Agenda for IW11
- Kickoff with brief introductions
- Jeff E. to introduce John W. as the new Activity Lead following IW11
- Guest talk by Gundars Osvalds (Northrop Grumman) entitled "Using Simulation and Visualization to Support MBSE"; will include a video
- Jeff E. to briefly review the content of the Wiki
- Jeff E. to recap IW09 MBSE workshop breakout participant feedback to set the stage for workshop interaction, dialog, and participation
- Seeking active participation from the MBSE Usability team from the morning session
- Jeff E. and John W. to facilitate working dialog and capture notes
  – Need to make time for workshop dialog and solution ideas on metrics (tool metrics, process metrics, other metrics?)
- Jeff E. and John W. to submit breakout notes to Sandy F. and Mark S. for incorporation into the MBSE workshop outbrief

Slide 11: Standard Breakout Questions (First-Cut Answers)
1. What metrics should we use to measure goodness?
   a. Activity level of participation in the WG and contribution of members
   b. Wide dissemination of a Body of Knowledge (BoK) in particular methodologies
   c. Methodologies in practice well documented for the full scale of applicability
2. What kind of data is out there?
   – Metrics: INCOSE Measurement WG
3. Who can contribute to webinars?
4. Who can contribute to papers for 2012?
5. What can you contribute to advancing the MBSE roadmap?
6. What standards are relevant to your particular area?
   – Metrics: ISO/IEC 27004, CMU CMMI, IEEE 1024 (software)
   – Methodology: OMG Unified Process, OMG Software Process Engineering Metamodel (SPEM)

Slide 12: Methodologies Outbrief (IW09) (1/2)
- The most recent participant recommendations come from the MBSE Workshop at INCOSE IW09 held in San Francisco
  – The MBSE workshop at INCOSE IW10 did not formally break out Activity Lead and Challenge Teams
- Participant feedback (morning session)
  – Create a public Wiki site to capture:
    - Best practices and experiences using methodology(ies)
    - A discussion forum for methodology Q&A
    - A forum for methodologists to post latest updates and links to resources
  – Include a comparison chart/table of features for each methodology to identify strengths or "sweet spot" for lifecycle SE functions (e.g., requirements, architecture, design, risk)
  – Provide a tailoring guide to map to standard project phases (what is the coverage of each lifecycle phase? needs evaluation)
  – Evaluate methodologies to determine if certain methods have strengths that should be incorporated into local process models

Slide 13: Methodologies Outbrief (IW09) (2/2)
- Participant feedback (afternoon session)
  – Differentiate work-product-centric methodologies from process-centric methodologies (R. Hodgson)
  – Review the "X-model" (R. Hodgson)
  – Seek process element / process pattern reuse
  – Role of governance
  – Do some methods work better in certain domains?
  – Enterprise modeling and instantiation (R. Griego)
- Response to recommendations:
  – Will need a great deal more participation from practitioners and other interested stakeholders to adopt these recommendations – this is A LOT of work
  – The best near-term opportunity is to stand up a public Wiki site for methodologists to post information about their particular methodologies
  – Possibly continue annual updates of the MBSE Methodology Survey (TBD) – want to get out of the annual "maintenance" business