Slide 1: CMMI SM Appraisal Overview
Southern California SPIN, December 7, 2001
Jane Moon, Raytheon
© 2001 by Carnegie Mellon University, Software Engineering Institute
SM: CMMI, CMM Integration, and SCAMPI are service marks of Carnegie Mellon University.
®: Capability Maturity Model and CMM are registered with the U.S. Patent and Trademark Office.
Slide 2: CMMI Appraisal Method Status
- V1.0 assessment products published October 2000:
  - Assessment Requirements for CMMI (ARC)
  - Standard CMMI Assessment Method for Process Improvement (SCAMPI SM) Method Definition
- Several pilot appraisals performed in 2000 (Phase I) and 2001 (Phase II)
- V1.1 primary objectives:
  - Performance improvements
  - Integrated appraisal method (assessments and evaluations)
  - Detailed method definition
- ARC and SCAMPI Method Definition documents currently in the SEI publication cycle
Slide 3: AMIT Membership
- Jim Armstrong (SPC)
- Rick Barbour (SEI)
- Dan Bennett (USAF STSC)
- Ben Berauer (Raytheon)
- Tom Bernard (USAF ASC/EN)
- Mary Busby (Lockheed Martin)
- Geoff Draper (Harris)
- Bud Glick (Motorola)
- Will Hayes (SEI)
- Rick Hefner (TRW)
- David Kitson (SEI)
- Gene Miluk (SEI)
- Joseph Morin (ISD, Inc.)
- Paul Riviere (U.S. Army CECOM)
- Charlie Ryan (SEI)
Slide 4: Characterizing ARC Appraisal Method Classes
Characteristic (Class A / Class B / Class C):
- Amount of objective evidence gathered: High / Medium / Low
- Ratings generated: Yes / No / No
- Resource needs: High / Medium / Low
- Team size: Large / Medium / Small
- 15504 conformance: Yes (optional) / No / No
- ARC requirements applicable: All (15504 option) / Most / Some
Notes:
- Consider a family of appraisal methods in determining overall appraisal needs.
- Class A methods may not be the most appropriate choice for organizations early in their process improvement cycle.
Slide 5: SCAMPI Assumptions and Design Principles
1. SCAMPI is positioned as a Class A benchmarking method.
2. Goal achievement is a function of the extent to which the corresponding practices are present in the planned and implemented processes of the organization.
3. Practice implementation at the organizational unit level is a function of the degree of practice implementation at the instantiation level (e.g., projects).
4. The aggregate of objective evidence available to the appraisal team is used as the basis for determination of practice implementation.
5. Appraisal teams are obligated to seek and consider objective evidence of multiple types in determining the extent of practice implementation.
Slide 6: What is not in the ARC or MDD?
- Form and content of data collection instruments
- Tool dependencies
- Authorization-related requirements for SCAMPI Lead Appraisers (SM)
- Detailed reporting requirements to the CMMI Steward following completion of an appraisal
- Government acquisition policy-related issues:
  - Deployment issues and content described in FAR, RFP, etc.
  - Reuse of evaluation results (to be addressed by the Evaluation IPT)
  - Source Selection Authority (SSA) interface
Slide 7: SCAMPI MDD
[Flow diagram: sources feeding development of the SCAMPI Method Definition Document (MDD)]
- Sources: SCAMPI MDD v1.0; CMMI requirements (revised) A-Spec; ARC; change requests; performance ideas; best practices; pilot feedback; other appraisal methods (CBA IPI, EIA 731-2, SCE, SDCE, FAM, etc.); DoD SW Eval IPT; Evaluation Requirements Group
- Result: SCAMPI V1.1 method description, with detailed method definition (phases, processes, activities; inputs, outputs, outcomes; options) and implementation guides (v1.1+) for internal process improvement and for supplier selection and monitoring
- Transition
Slide 8: ARC / SCAMPI Improvement Strategy
- Shift appraisal team focus from discovery to verification
  - Leverage pre-onsite analysis of organization model implementation (documentation, mapping, etc.)
  - Integrated data collection and continuous consolidation
  - Prioritize areas for focused investigation based on data collection, analysis, and sufficiency of coverage (i.e., "triage")
- Provide detailed method definition and implementation guidance
  - Support clarity, consistency, repeatability
  - Organize content for efficient usage in the field
Slide 9: Summary of ARC Changes
- Expand ARC to encompass appraisal application modes
- Supplier feedback of evaluation results (preliminary findings, final findings), per Evaluation IPT
- Defined appraisal input (a major portion of the appraisal plan) subject to sponsor approval and change control (per ISO/IEC 15504)
- Greater clarity in Class A, B, C descriptions and their relationships in the overall appraisal strategy
- 15504 conformance is optional for Class A methods; selected 15504-specific requirements may not apply
Slide 10: What are Practice Implementation Indicators?
"The fundamental idea of practice implementation indicators (PIIs) is quite simple and broadly applicable to any practice or activity; it is based on the presumption that the conduct of an activity or the implementation of a practice will result in 'footprints' which are attributable to the activity or practice."
- SCAMPI Method Definition, V1.1
Slide 11: Objective Evidence Data Types
- Direct artifacts: tangible outputs resulting directly from implementation of a practice (e.g., typical work products)
- Indirect artifacts: artifacts that are a consequence of, or indicative of, performing a practice (e.g., meeting minutes, reviews, logs, reports)
- Affirmations: oral or written statements confirming or supporting implementation of the practice (e.g., interviews, questionnaires)
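A minimal Python sketch of how these three evidence types could be represented; the class names and fields are illustrative assumptions, not definitions from the SCAMPI MDD.

```python
# Hypothetical representation of SCAMPI objective-evidence data types (illustrative only).
from dataclasses import dataclass
from enum import Enum

class EvidenceType(Enum):
    DIRECT_ARTIFACT = "direct"      # tangible output of performing the practice
    INDIRECT_ARTIFACT = "indirect"  # consequence of performing it (minutes, logs, reports)
    AFFIRMATION = "affirmation"     # oral or written statement (interview, questionnaire)

@dataclass
class ObjectiveEvidence:
    practice_id: str            # e.g., a specific practice such as "PP SP1.1-1"
    instance: str               # project or other instantiation it was collected from
    evidence_type: EvidenceType
    description: str
    face_to_face: bool = False  # meaningful only for affirmations (F2F coverage rules)
```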
Slide 12: Characterizing Practice Implementation
- Assign characterization values reflecting the extent of practice implementation for each instance:
  - Fully Implemented (FI)
  - Largely Implemented (LI)
  - Partially Implemented (PI)
  - Not Implemented (NI)
- Aggregate practice characterizations to the organizational unit level using defined method aggregation rules
- Iterate and focus revisions to the data collection plan
- Generate findings based on aggregation of weaknesses and strengths
Slide 13: Characterizing Practice Implementation - 2
- Fully Implemented (FI): direct artifacts present and appropriate; supported by indirect artifact and/or affirmation; no weaknesses noted
- Largely Implemented (LI): direct artifacts present and appropriate; supported by indirect artifact and/or affirmation; one or more weaknesses noted
- Partially Implemented (PI): direct artifacts absent or judged inadequate; artifacts or affirmations indicate some aspects of the practice are implemented; one or more weaknesses noted
- Not Implemented (NI): any situation not covered by the above
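These characterization rules can be read as a small decision procedure. The sketch below is one way to express them, assuming the team records a few yes/no judgments per practice instance; the function and parameter names are illustrative, not taken from the MDD.

```python
def characterize_instance(direct_present_and_appropriate: bool,
                          corroborated: bool,           # indirect artifact and/or affirmation
                          weaknesses_noted: bool,
                          some_aspects_implemented: bool) -> str:
    """Return FI, LI, PI, or NI for one practice at one instantiation (e.g., a project)."""
    if direct_present_and_appropriate and corroborated:
        # Direct artifacts present and appropriate, supported by an indirect artifact
        # and/or affirmation: FI if no weaknesses were noted, otherwise LI.
        return "LI" if weaknesses_noted else "FI"
    if some_aspects_implemented and weaknesses_noted:
        # Direct artifacts absent or judged inadequate, but artifacts or affirmations
        # indicate some aspects of the practice are implemented.
        return "PI"
    return "NI"  # any situation not covered above
```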
Slide 14: Example - Indicators of Practice Implementation
PP SP1.1-1: Establish and maintain a top-level work breakdown structure (WBS) for estimating the scope of the project.
- Primary artifact: top-level WBS with revision history; task descriptions; work product descriptions
- Indirect artifact: project estimates aligned with WBS elements
- Indirect artifact: minutes of meetings at which the WBS was generated or used to develop project estimates
- Affirmation: how is the WBS used? how are estimates generated?
Slide 15: Data Collection and Rating Concepts
- Corroboration
  - Must have direct artifacts, combined with either indirect artifact or affirmation
- Coverage
  - Must have sufficient objective evidence for implementation of each practice, for each instance
  - Must have face-to-face (F2F) affirmations (avoid "paper-only appraisals"):
    - At least one instance for each practice ("one column")
    - At least one practice for each instance ("one row")
    - or 50% of practices for each PA goal, for each project, have at least one F2F affirmation data point
Slide 16: Affirmation Coverage Rules - Summary
[Matrix: practices PA.SPx.1-1 through PA.SPx.4-1 (rows) vs. Project-1 through Project-4 (columns)]
1. "One Row, One Column" rule, or
2. "50% rule": more than 50% of PA practices for each goal, for each project, have at least one face-to-face (F2F) affirmation data point
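As a rough illustration of the two alternative coverage rules, the check below assumes `f2f` is the set of (practice, project) pairs holding at least one face-to-face affirmation and `goal_practices` maps each goal to its practices; these names and the exact formulation are assumptions, not the method's own definition.

```python
def coverage_met(f2f, practices, projects, goal_practices):
    """Check F2F affirmation coverage using either the matrix rule or the 50% rule."""
    # Rule 1 ("one row, one column"): every practice is affirmed on at least one
    # project, and every project affirms at least one practice.
    rule_1 = (all(any((p, prj) in f2f for prj in projects) for p in practices) and
              all(any((p, prj) in f2f for p in practices) for prj in projects))
    if rule_1:
        return True
    # Rule 2 ("50% rule"): for each goal and each project, more than 50% of the
    # goal's practices have at least one F2F affirmation data point.
    return all(
        sum((p, prj) in f2f for p in goal_prac) > 0.5 * len(goal_prac)
        for prj in projects
        for goal_prac in goal_practices.values()
    )
```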
Slide 17: Aggregation and Consensus
[Diagram: ratings aggregate upward through four levels, with the required level of consensus ranging from mini-team to full team]
- Practice implementation characterizations (practice instantiation level)
- Practice implementation characterizations (organizational unit level)
- Goal satisfaction ratings
- Capability level and/or maturity level ratings
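To make the roll-up concrete, here is a simplified sketch of how ratings might aggregate from instance-level characterizations up to a maturity level. The specific rules used here (weakest instance wins; a goal is satisfied only when every practice is FI or LI) are simplifying assumptions for illustration; the MDD defines the actual aggregation and rating rules.

```python
ORDER = ["NI", "PI", "LI", "FI"]  # weakest to strongest characterization

def org_unit_characterization(instance_values):
    """Aggregate one practice's instance-level characterizations to the organizational unit."""
    return min(instance_values, key=ORDER.index)  # conservative: weakest instance wins

def goal_satisfied(practice_characterizations):
    """Rate a goal from the org-unit-level characterizations of its practices."""
    return all(c in ("FI", "LI") for c in practice_characterizations)

def maturity_level(goal_satisfaction_by_level):
    """Highest maturity level whose goals, and all lower levels' goals, are satisfied."""
    achieved = 1
    for level in sorted(goal_satisfaction_by_level):  # e.g., {2: [True, ...], 3: [...]}
        if all(goal_satisfaction_by_level[level]):
            achieved = level
        else:
            break
    return achieved
```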
Slide 18: MDD v1.1 Outline
- Front Matter
  - Executive Summary for the Appraisal Sponsor
  - Method Overview, with audience-specific summary of this document
  - Document Overview (Part 1 through Part N: descriptive name and information for each part)
- Primary Reference Material
  - Introductory prose
  - Phase I: Plan and Prepare for Appraisal
  - Phase II: Conduct Appraisal
  - Phase III: Report Results
- Appendices
  - Glossary
  - SCAMPI Appraisal Disclosure Statement (ADS)
  - Role of PIIs in Verifying Practice Implementation
  - ARC/MDD Traceability
  - Focused Investigation Elaboration and Guidance
Slide 19: MDD Structure
- Phases (3)
- Processes (11): entry/exit criteria, inputs, outputs, activities, etc.
- Activities (43), each with:
  - Activity description
  - Required practices
  - Parameters and limits
  - Optional practices
  - Implementation guidance
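One way to picture this structure is as a simple containment hierarchy. The hypothetical data model below mirrors the outline above; the class and field names are assumptions made for illustration, not the document's own schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:                  # 43 activities in total
    description: str
    required_practices: List[str] = field(default_factory=list)
    parameters_and_limits: List[str] = field(default_factory=list)
    optional_practices: List[str] = field(default_factory=list)
    implementation_guidance: str = ""

@dataclass
class Process:                   # 11 processes, each with entry/exit criteria, I/O, activities
    name: str
    entry_criteria: List[str] = field(default_factory=list)
    exit_criteria: List[str] = field(default_factory=list)
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    activities: List[Activity] = field(default_factory=list)

@dataclass
class Phase:                     # 3 phases: Plan and Prepare, Conduct, Report Results
    name: str
    processes: List[Process] = field(default_factory=list)
```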
Slide 20: Potential Future Efficiencies
- Incremental and delta appraisals
- Additional work aids (templates, checklists, "look fors / listen fors")
- Improved instruments and tools
- Statistical sampling
- Leverage and cross-correlate the model's built-in dependencies for improved appraisal data management:
  - Relationships (threads) among practices (GPs, SPs), goals, and PAs (e.g., PP, PMC, IPM)
  - Single work products / indicators that satisfy multiple practices
Slide 21: Appraisal Concept of Operations
[Concept-of-operations diagram relating the organization, the appraisal team, and the CMMI Steward]
- Organization: defined processes; project and organization implementation; model/process mapping; objective evidence and inventory of objective evidence; process improvement; Class B and C appraisals
- Appraisal team: appraisal planning, readiness review, verification, validation, aggregation, focused investigation; verified implementation; strengths and weaknesses; findings and ratings
- CMMI Steward: CMMI Product Suite (model, method, training); transition, deployment, tailoring; appraisal results reported to PAIS; performance data
Slide 22: AMIT Initiatives Investigated
- Integrated Data Collection and Verification Approach
  - Focused investigation: opportunistically use data collection (e.g., questionnaire, objective evidence) to narrow the focus for further investigation and team emphasis
  - Leverage organizational assets reflecting implementation of model practices
  - Greater appraisal team focus on verification rather than discovery
- Incremental Appraisals (deferred post-v1.1)
  - Pre-planned partitioning of appraisal scope across weeks or months
- Delta Appraisals (deferred post-v1.1)
  - Partial re-assessment to focus on weaknesses identified in prior SCAMPI appraisals
Slide 23: Summary - CMMI Appraisal Method Status
- ARC and SCAMPI v1.1 revisions currently in the SEI publication process:
  - Performance improvements
  - Integrated appraisal method (assessments and evaluations)
  - Detailed method definition
- Fundamental SCAMPI concepts:
  - Indicator-driven appraisals
  - Focused investigation (integrated data collection and continuous consolidation)
Slide 24: For More Information…
Contacts:
- Geoff Draper, Harris Corporation, gdraper@harris.com
- David Kitson, Manager, SEI Appraiser Program, dhk@sei.cmu.edu
http://www.sei.cmu.edu/cmmi/