Slide 1: CMMI and Metrics
BCS SPIN SG, 19 February 2008
Clifford Shelley, Oxford Software Engineering Ltd (Software Engineering Services & Consultancy)
9 Spinners Court, 53 West End, Witney, Oxfordshire OX28 1NH. Tel. +44 (0)
© OSEL 2008

Slide 2: Objectives
1. Present the background and history of measurement within the CMMI.
2. Place measurement within the context of the CMMI (or the CMMI within the context of measurement?).
3. Identify common issues and concerns with measurement in the context of the CMMI, and their resolution.
4. Look at some measurement approaches that can be really useful.
(It is assumed here that 'measurement' is equivalent to 'metrics'.)

Slide 3: Background...
The original SW-CMM (a tool to measure software capability):
- Included measurement and analysis as a 'common feature', only briefly described.
- Expected product size estimates in planning.
- Expected measurement of processes (problematic at Level 2, expected at Level 3).
- Included an SPC-like KPA at Level 4.

Slide 4: ...Background...
Carried forward and developed into the CMMI staged representation:
- ML2: Measurement and Analysis is a PA in its own right (NB: includes GQM as a specific practice); PP requires product sizing.
- ML3: OPD SP: process data defined and collected (using MA); IPM SP 1.2: use historical data for estimating.
- ML4: SPC expectations (OPP): control common causes of variation? Exploited by QPM.
- ML5: quantitative process improvement, pervasive measurement; CAR and OID both use measurement as an analysis tool.

Slide 5: ...Background
Continuous representation:
- CL2: all PAs are expected to be monitored and controlled (GP 2.8: measurement is expected).
- CL3: standard process measures [defined and] stored (GP 3.2).
- CL4: quantitative objectives for processes established (GP 4.1); SPC required to stabilize sub-process performance (GP 4.2) by controlling special causes of variation, part of process capability as understood by production engineering (an XmR sketch follows this slide).
- CL5: establish quantitative process improvement (GP 5.1 sub-practice); manage common causes too, to enable process capability.
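GP 4.2's 'stabilize sub-process performance' is conventionally read as control charting. Below is a minimal, hedged sketch of an XmR (individuals and moving range) check in Python; the model prescribes no particular technique, and the data values here are invented.

    # XmR (individuals and moving range) check: the simplest SPC approach to
    # detecting special causes of variation. Data values are invented.
    import numpy as np

    values = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.6, 11.7, 12.2, 16.0, 12.1])
    moving_range = np.abs(np.diff(values))
    mr_bar = moving_range.mean()

    centre = values.mean()
    ucl = centre + 2.66 * mr_bar   # 2.66 is the standard XmR constant (3 / d2, d2 = 1.128)
    lcl = centre - 2.66 * mr_bar

    for i, v in enumerate(values):
        flag = "  <-- special cause?" if not lcl <= v <= ucl else ""
        print(f"point {i}: {v:5.1f}{flag}")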

Slide 6: Measurement within CMMI...
The MA PA is the enabler: its purpose '...is to develop and sustain a measurement capability that is used to support management information needs'. Interpretation?

Slide 7: ...Measurement within CMMI...
MA scope:
- SG 1 Align Measurement and Analysis Activities
  - SP 1.1 Establish Measurement Objectives
  - SP 1.2 Specify Measures (a sketch of a defined measure follows this slide)
  - SP 1.3 Specify Data Collection and Storage Procedures
  - SP 1.4 Specify Analysis Procedures
- SG 2 Provide Measurement Results
  - SP 2.1 Collect Measurement Data (includes verification)
  - SP 2.2 Analyze Measurement Data
  - SP 2.3 Store Data and Results
  - SP 2.4 Communicate Results (to aid decision making)
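In practice, 'specify measures' (SP 1.2) means a definition complete enough that a stranger could collect and analyse the data. A minimal sketch, assuming nothing beyond the practice titles above; the schema and field names are illustrative, not from the model.

    # A sketch of a defined (not merely described) measure, echoing MA SG 1.
    # The CMMI does not prescribe a schema; every field name here is invented.
    from dataclasses import dataclass

    @dataclass
    class MeasureSpec:
        name: str        # identifier, e.g. "defect_density"
        objective: str   # the information need it serves (SP 1.1)
        unit: str        # e.g. "defects per KLOC"
        collection: str  # how, when and by whom data are gathered (SP 1.3)
        analysis: str    # the intended analysis procedure (SP 1.4)

    defect_density = MeasureSpec(
        name="defect_density",
        objective="Track product quality trend across releases",
        unit="defects per KLOC",
        collection="Confirmed defects counted at code review; size from an SLOC counter",
        analysis="Run chart per release; investigate outliers before reporting",
    )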

Slide 8: ...Measurement within CMMI
MA PA applicability for ML2:
- '...support management information needs...': project management (initially).
- '...at multiple levels in the organization...': and process management (GP 2.8), and products (product components provided by suppliers).
Not considered explicitly:
- Testing? Testers tend to be a measurement 'centre of excellence'.
- Development? Developers don't (is design not amenable to measurement?).

Slide 9: Practical concerns 1
- Fear: 'I don't want to be measured.'
- Measurement is abstract and can be difficult (Pfleeger): distinguish between metrics designers and metrics users, and train them accordingly.
- Where does it fit? Tactical capability, organizational infrastructure, or a mix?

Slide 10: Practical concerns 2
- What should we measure? RTFM: SG 1. Ask: what do you* need to know? Why? (A GQM-style sketch follows this slide.)
- Who monitors and controls (measures) processes at ML2, at ML3? Who monitors and controls the MA process itself?
- Metrics repository: central/organization, or local/project?
* 'you', perhaps 'we', but not 'they'
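One way to operationalize 'what do you need to know, and why?' is a GQM-style derivation from goal to questions to metrics. A hedged sketch; the goal, questions and metric names are invented for illustration.

    # GQM-style derivation: start from the information need, not from whatever
    # data happens to be easy to collect. All content here is illustrative.
    gqm = {
        "goal": "Improve estimation accuracy for project planning",
        "questions": {
            "How large is the gap between estimated and actual effort?":
                ["effort_estimate (person-days)", "effort_actual (person-days)"],
            "Is the gap shrinking over successive projects?":
                ["effort variance (%) per project, plotted as a run chart"],
        },
    }

    for question, metrics in gqm["questions"].items():
        print(question)
        for metric in metrics:
            print("  ->", metric)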

Slide 11: Common Circumstances 1
Product sizing is rarely done, or rarely done well:
- There is difficulty identifying entities and their quantitative attributes.
- Fix: analysts extract the identities, counts and attributes of system elements from the developers/estimators. Start with the task-based estimation spreadsheet and work upstream: 'What were they thinking?'
An organizational measurement infrastructure is in place:
- It collects lots of data ('if it moves, measure it') with no traceability back to objectives (SG 1 is missing); the data is orphaned from its rationale and can't be reused.
- Fix: discard collected data that has no definitions or rationale, then reverse engineer objectives (SG 1) for the existing data collection systems. This usually exposes an opportunity to shed data collection activity and reduce costs, though it is rarely taken up.
Organizational measurement data is unverified and of unknown accuracy:
- It is used for admin/billing, is not credible, and is known to be invalid (timesheets).
- It is not used by its collectors: data is for reporting, not for using.
- Fix: verify the data; this presumes SG 1. (A minimal verification sketch follows this slide.)
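The 'verify the data' fix can begin as automated sanity checks at the point of collection (SP 2.1 explicitly includes verification). A minimal sketch; the record fields and thresholds are invented.

    # Sanity-check a timesheet-style record before it enters the repository.
    # Field names and the 24-hour threshold are invented for illustration.
    def verify_effort_record(record: dict) -> list[str]:
        problems = []
        if record.get("hours") is None:
            problems.append("missing hours")
        elif not 0 < record["hours"] <= 24:
            problems.append(f"implausible daily hours: {record['hours']}")
        if not record.get("task_id"):
            problems.append("no task traceability: data orphaned from rationale")
        return problems

    print(verify_effort_record({"hours": 37, "task_id": "T-101"}))
    # ['implausible daily hours: 37']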

Slide 12: Common Circumstances 2
Good measurement:
- Is developed, owned and used locally, within teams.
- Can be undervalued (it seems obvious).
- Makes MA SG 1 implicit.
There is a limited 'information horizon':
- Visibility is limited, and may be better that way.
- Measurement data doesn't travel well.
- 'Drill down' is limited, even if it looks like it isn't.

Slide 13: Good Measurement 1
1. The purpose is clear and understood by collectors, analysts and decision makers.
2. Measures are defined (not just described).
3. Data collectors are users (short feedback loops).
4. Accuracy and validity are known (as a minimal requirement).
5. Collection can stop when it is no longer needed.

Slide 14: Good Measurement 2
1. KISS: minimal arithmetic, especially multiplication and division (this includes percentages); arithmetic obscures much and reveals little.
2. Non-parametric approaches: robust and widely applicable to 'messy' software engineering data, though not the usual statistical approach.
3. Consider 'Exploratory Data Analysis' (EDA): Tukey.
4. Use graphics, but not pie charts: let data show its information content (patterns, trends, outliers): Tufte. (A run-chart sketch follows this slide.)
5. 'GQMG tutu': Goal, Question, Metric, Graphics, guided by Tufte and Tukey; cf. the SEI's GQ(I)M.
6. SPC: later, much later. SPC rule #1: know what you're doing.
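As a small illustration of the EDA-first, non-parametric advice: plot the raw data as a run chart against the median before doing any summary arithmetic. The effort figures are invented.

    # Run chart with a median reference line: the outlier shows up at a glance,
    # with no multiplication, division or distributional assumptions needed.
    import numpy as np
    import matplotlib.pyplot as plt

    review_effort_hours = [3.0, 2.5, 4.0, 3.5, 12.0, 3.0, 2.0, 3.5, 4.5, 3.0]
    median = np.median(review_effort_hours)

    fig, ax = plt.subplots()
    ax.plot(review_effort_hours, marker="o")
    ax.axhline(median, linestyle="--", label=f"median = {median:.1f} h")
    ax.set_xlabel("review number")
    ax.set_ylabel("effort (hours)")
    ax.set_title("Run chart of peer review effort")
    ax.legend()
    plt.show()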

Slide 15: (graphic only; no transcript text)

Slide 16: Oxford Software Engineering Limited, 9 Spinners Court, 53 West End, Witney, Oxfordshire OX28 1NH. Tel. +44 (0)

Slide 17: (graphic only; no transcript text)

Slide 18: Supplementary Material...

Slides 19-24: The measurement model (from Pfleeger 1998)
Built up over six slides, the diagram relates the real world to the mathematical world:

    real world                                      mathematical world
    Empirical relational system  --measurement-->   Formal relational system
    Formal relational system  --mathematics and statistics-->  Results
    Results  --interpretation-->  Relevant empirical information
    Relevant empirical information  -->  decisions and actions

The closing slide upgrades the loop: refined measurement and improved interpretation yield better decisions and actions.

Slide 25: (graphic only; no transcript text)

Slide 26: 'Anscombe's Quartet' (American Statistician, 1973)

Slide 27: Summary statistics (shared, near-identically, by all four of Anscombe's data sets)
N = 11
Mean of x's = 9.0
Mean of y's = 7.5
Regression line: y = 3 + 0.5x
Standard error of estimate of slope = 0.118; t = 4.24
Sum of squares Σ(x − x̄)² = 110.0
Regression sum of squares = 27.5
Residual sum of squares of y = 13.75
Correlation coefficient = 0.82
R² = 0.67
Yet the four data sets look utterly different when plotted.
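These figures can be checked directly: a short numpy sketch using the published values of Anscombe's data set I reproduces the slide's statistics.

    # Reproduce the summary statistics from Anscombe's data set I (1973).
    import numpy as np

    x = np.array([10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5], dtype=float)
    y = np.array([8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68])

    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    print(f"mean x = {x.mean():.1f}, mean y = {y.mean():.2f}")
    print(f"regression: y = {intercept:.2f} + {slope:.3f}x")
    print(f"r = {r:.3f}, R^2 = {r ** 2:.2f}")
    # mean x = 9.0, mean y = 7.50
    # regression: y = 3.00 + 0.500x
    # r = 0.816, R^2 = 0.67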

Slide 28: (graphic only; no transcript text)

Slide 29: (graphic only; no transcript text)

Slide 30: Process Capability: indices for measuring process goodness
Cp = (USL − LSL) / 6σ, equivalently 2T / 6σ
- Cp < 1: the process is incapable.
- Cp > 1: the process is capable (six sigma processes have a Cp of 2).
- Cp does not account for process drift, so...
Cpk = the lesser of (USL − x̄) / 3σ and (x̄ − LSL) / 3σ
(A short implementation follows.)
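The indices translate directly into code. A minimal sketch of the slide's formulas; the example inputs are invented.

    # Cp: potential capability from spec limits and process spread.
    # Cpk: the same, penalized for process drift away from the centre.
    def process_capability(mean: float, sigma: float, lsl: float, usl: float):
        cp = (usl - lsl) / (6 * sigma)
        cpk = min((usl - mean) / (3 * sigma),
                  (mean - lsl) / (3 * sigma))
        return cp, cpk

    cp, cpk = process_capability(mean=10.2, sigma=0.5, lsl=8.0, usl=12.0)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # Cp = 1.33, Cpk = 1.20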

Slide 31: (graphic only; no transcript text)