S T A M © 2000, KPA Ltd. Software Trouble Assessment Matrix
Assessing Software Inspection Processes with STAM*
Ron S. Kenett, KPA Ltd. and Tel Aviv University
*This presentation is extracted from SOFTWARE PROCESS QUALITY: Management and Control by Kenett and Baker, Marcel Dekker Inc. It was first published as "Assessing Software Development and Inspection Errors", Quality Progress, October 1994, with corrections in a subsequent February issue.

S T A M © 2000, KPA Ltd. Presentation agenda
- Software life cycles
- Inspection processes
- Measurement programs
- Assessing software inspection processes

S T A M © 2000, KPA Ltd. Presentation agenda
- Software life cycles
- Inspection processes
- Measurement programs
- Assessing software inspection processes

S T A M © 2000, KPA Ltd. Informal software life cycle
[diagram of work products] Marketing Requirements Spec, System Requirements Spec, System Design Spec, Software Requirements Spec, Software Test Spec, System Integration Spec, System Test Spec, Acceptance Test Spec, Design and Construction Artifacts

S T A M © 2000, KPA Ltd. Web applications life cycle
Design a little... Implement a little... Test a little...

S T A M © 2000, KPA Ltd. Formal software life cycle
[flowchart] Requirements stage: gather initial requirements, clarify requirements for understanding (Program Statement). Analysis: analyze requirements, categorize to expose incomplete areas, and prioritize by importance (Draft Requirements Specification). Definition: (Requirements Specification). Proposal and Project Planning: develop Proposal and Project Plans to fulfill project requirements (Proposal, Project Plans). Design stage (Functional Description, Design) → Code stage (Code and Unit Test, Documentation) → Verification stage (Technical Testing, System Testing) → END. Change to Requirements? Change Control: update all related documents, code, and tests to reflect the change.

S T A M © 2000, KPA Ltd. Software Life Cycle Phases
Requirements Analysis → Top Level Design → Detailed Design → Programming → Unit Tests → System Tests → Acceptance Tests

S T A M © 2000, KPA Ltd. Presentation agenda
- Software life cycles
- Inspection processes
- Measurement programs
- Assessing software inspection processes

S T A M © 2000, KPA Ltd. The software development matrix
[matrix] Rows: Work Products. Columns (Key Activities): Work Product Development Practices, Inspection Practices, Work Product Control Practices.

S T A M © 2000, KPA Ltd. SEI Capability Maturity Model
Maturity Level | Characteristics | Software Inspection Features
Initial | Depends entirely on individuals | None
Repeatable | Policies, procedures, experience base | Writing-Task Rules, QA Policies, Inspection Procedures
Defined | Defined processes, peer reviews | Defect removal, Entry, Exit
Managed | Quantitative goals for product & process | Optimum rates, quality level at exit & entry, data summary, d-base
Optimizing | Entire organization focused on continuous process improvement | Defect Prevention, Process Improvements logging, Owners, Proc. Change Mgt. Team
Based on Paulk et al., "Capability Maturity Model Version 1.1", IEEE Software, July 1993.

S T A M © 2000, KPA Ltd. Presentation agenda Software life cycles Inspection processes Measurement programs Assessing software inspection processes

S T A M © 2000, KPA Ltd. Software Measurement Programs

S T A M © 2000, KPA Ltd. Measurement Program Implementation

S T A M © 2000, KPA Ltd. Measurement Program Implementation: Plan/Evaluate Phase
Reasons for implementation:
- Establish a baseline from which to determine trends
- Quantify how much was delivered in terms the client understands
- Help in estimating and planning projects
- Compare the effectiveness and efficiency of current processes, tools, and techniques
- Identify and proliferate best practices
- Identify and implement changes that will result in productivity, quality, and cost improvements
- Establish an ongoing program for continuous improvement
- Quantitatively prove the success of improvement initiatives
- Establish better communication with customers
- Manage budgets for software development more effectively

S T A M © 2000, KPA Ltd. Measurement Program Implementation: Plan/Evaluate Phase
Questions to help identify goals:
- How fast can we deliver reliable software to our customers? Does it satisfy their requirements?
- Can we efficiently estimate the development cost and schedule? Are the estimates accurate?
- What can we do to improve our systems-development life cycle and shorten the cycle time?
- What is the quality of the software we deliver? Has it improved with the introduction of new tools or techniques?
- How much are we spending to support existing software? Why does one system cost more than another to support?
- Which systems should be re-engineered or replaced? When?
- Should we buy or build new software systems?
- Are we becoming more effective and efficient at software development? Why? Why not?
- How can we better leverage our information technology?
- Has our investment in a particular technology increased our productivity?

S T A M © 2000, KPA Ltd. Measurement Program Implementation: Plan/Evaluate Phase
Identification of sponsors. Identification of roles and responsibilities:
- Who will decide what, how, and when to collect the measurement information?
- Who will be responsible for collecting the measurement information?
- How will the data be collected? What standards (internal or external) will be used?
- At which phases will the data be collected? Where will it be stored?
- Who will ensure consistency of data reporting and collection?
- Who will input and maintain the measurement information?
- Who will report measurement results? When?
- What will be reported to each level of management?
- Who will interpret and apply the measurement results?
- Who is responsible for training?
- Who will maintain an active interest in the measurement program to ensure full usage of the measurement information?
- Who will evaluate measurement results and improve the measurement program?
- Who will ensure adequate funding support?

S T A M © 2000, KPA Ltd. Measurement Program Implementation: Analysis/Implementation/Improve Phases
Analysis Phase:
- Analysis of audience and identification of target metrics
- Definition of software metrics
Implement/Measure Phase:
- Organizing for Just In Time training and education processes
- Reporting and publishing results
Improve Phase:
- Managing expectations
- Managing with metrics

S T A M © 2000, KPA Ltd. Statistics from formal assessments: "the tip of the iceberg"
[chart of CMM maturity-level distribution; source: SEI, 1994; number of organizations: 606]

S T A M © 2000, KPA Ltd. Most organizations are moving towards level 2
INITIAL → REPEATABLE: Requirements Management, Project Planning, Project Tracking & Oversight, Subcontract Management, Quality Assurance, Configuration Management

S T A M © 2000, KPA Ltd. CMM Level 2 Key Process Areas
- Requirements Management
- Software Project Planning
- Software Configuration Management
- Software Quality Assurance
- Software Subcontract Management
- Software Project Tracking and Oversight

S T A M © 2000, KPA Ltd. Software Development Management Dashboard: "it works only for organizations above level 2"
[dashboard with panels keyed to Level 2 KPAs: PP and PTO, RM, CM, PP and PTO, QA]

S T A M © 2000, KPA Ltd. Presentation agenda
- Software life cycles
- Inspection processes
- Measurement programs
- Assessing software inspection processes

S T A M © 2000, KPA Ltd. Software Trouble Assessment Matrix
When were errors detected? Depends on the inspection process efficiency, i.e., how it performs.
When could errors have been detected? Depends on the inspection process effectiveness, i.e., how it was designed.
When were errors created? Depends on the overall performance of the software development process.

S T A M © 2000, KPA Ltd. Software Life Cycle Phases
Requirements Analysis → Top Level Design → Detailed Design → Programming → Unit Tests → System Tests → Acceptance Tests

S T A M © 2000, KPA Ltd. When were errors detected?
Requirements Analysis → Top Level Design → Detailed Design → Programming → Unit Tests → System Tests → Acceptance Tests

S T A M © 2000, KPA Ltd. When were errors detected?
Life Cycle Phase | Number of Errors
Requirements Analysis | 3
Top Level Design | 7
Detailed Design | 2
Programming | 25
Unit Tests | 31
System Tests | 29
Acceptance Tests | 13
Cumulative profile = S1

S T A M © 2000, KPA Ltd. When could errors have been detected?
Life Cycle Phase | Number of Errors
Requirements Analysis | 8
Top Level Design | 14
Detailed Design | 10
Programming | 39
Unit Tests | 8
System Tests | 26
Acceptance Tests | 5
Cumulative profile = S2

S T A M © 2000, KPA Ltd. When were errors created?
Life Cycle Phase | Number of Errors
Requirements Analysis | 34
Top Level Design | 22
Detailed Design | 17
Programming | 27
Unit Tests | 5
System Tests | 5
Acceptance Tests | 0
Cumulative profile = S3

S T A M © 2000, KPA Ltd. S1, S2, S3 cumulative profiles
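The chart above plots the running totals from the three tables. As an illustration (not part of the original deck; variable names are mine), a minimal Python sketch that rebuilds the three cumulative profiles so the chart can be reproduced or checked:

```python
# Minimal sketch: rebuild the S1, S2, S3 cumulative error profiles
# from the three tables above (data taken directly from the slides).
from itertools import accumulate

PHASES = ["Requirements Analysis", "Top Level Design", "Detailed Design",
          "Programming", "Unit Tests", "System Tests", "Acceptance Tests"]

detected   = [3, 7, 2, 25, 31, 29, 13]   # when errors were detected
detectable = [8, 14, 10, 39, 8, 26, 5]   # when they could have been detected
created    = [34, 22, 17, 27, 5, 5, 0]   # when they were created

# Cumulative profiles: running totals of errors over the life-cycle phases.
S1 = list(accumulate(detected))     # [3, 10, 12, 37, 68, 97, 110]
S2 = list(accumulate(detectable))   # [8, 22, 32, 71, 79, 105, 110]
S3 = list(accumulate(created))      # [34, 56, 73, 100, 105, 110, 110]

for phase, a, b, c in zip(PHASES, S1, S2, S3):
    print(f"{phase:21s}  S1={a:3d}  S2={b:3d}  S3={c:3d}")
```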

S T A M © 2000, KPA Ltd. The Software Trouble Assessment Matrix
[matrix: one axis = when errors were created, the other = when errors were detected]

S T A M © 2000, KPA Ltd. Definition of STAM Metrics
Negligence ratio: indicates the number of errors that escaped through the inspection process filters. INSPECTION EFFICIENCY
Evaluation ratio: measures the delay of the inspection process in identifying errors relative to the phase in which they occurred. INSPECTION EFFECTIVENESS
Prevention ratio: an index of how early errors are detected in the development life cycle relative to the total number of reported errors. DEVELOPMENT PROCESS EXECUTION
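Stated as formulas (a restatement in my own notation; the quantities and the computation appear on the next slide):

```latex
% Restatement of the three STAM ratios (notation mine): S_1, S_2, S_3 are the
% areas under the cumulative profiles of errors detected, errors that could
% have been detected, and errors created; N is the total number of reported
% errors; 7 is the number of life-cycle phases.
\[
\text{Negligence} = 100 \cdot \frac{S_2 - S_1}{S_1}, \qquad
\text{Evaluation} = 100 \cdot \frac{S_3 - S_2}{S_2}, \qquad
\text{Prevention} = 100 \cdot \frac{S_1}{7N}
\]
```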

S T A M © 2000, KPA Ltd. Computation of STAM Metrics
Areas under cumulative profiles: S1 = 337, S2 = 427, S3 = 588
Negligence ratio: 100 x (S2 - S1)/S1 = 26.7%
Evaluation ratio: 100 x (S3 - S2)/S2 = 37.7%
Prevention ratio: 100 x S1/(7 x total) = 43.7%
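As a worked check (an illustration, not the deck's own tooling; helper names are mine): taking the "area" under a cumulative profile as the sum of its per-phase cumulative counts reproduces the slide's figures exactly.

```python
# Worked check of the STAM metrics above (assumption: "area" under a
# cumulative profile = sum of its per-phase cumulative counts).
from itertools import accumulate

detected   = [3, 7, 2, 25, 31, 29, 13]
detectable = [8, 14, 10, 39, 8, 26, 5]
created    = [34, 22, 17, 27, 5, 5, 0]

def area(counts):
    """Sum of the running totals over the life-cycle phases."""
    return sum(accumulate(counts))

S1, S2, S3 = area(detected), area(detectable), area(created)   # 337, 427, 588
total  = sum(detected)    # 110 errors reported in each table
phases = len(detected)    # 7 life-cycle phases

negligence = 100 * (S2 - S1) / S1          # 26.7%  inspection efficiency
evaluation = 100 * (S3 - S2) / S2          # 37.7%  inspection effectiveness
prevention = 100 * S1 / (phases * total)   # 43.77% (slide rounds to 43.7%)

print(f"S1={S1} S2={S2} S3={S3}")
print(f"negligence={negligence:.1f}% evaluation={evaluation:.1f}% prevention={prevention:.1f}%")
```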

S T A M © 2000, KPA Ltd. Interpretation of STAM Metrics
1. Errors are detected 27% later than they should have been (i.e., if the inspection processes worked perfectly).
2. The design of the inspection processes implies that errors are detected 38% into the phase following their creation.
3. Ideally, all errors are requirements errors and are detected in the requirements phase. In this example only about 44% of this ideal is materialized (the prevention ratio), implying significant opportunities for improvement.

S T A M © 2000, KPA Ltd. Conclusions
- Inspection processes need to be designed in the context of a software life cycle
- Inspection processes need to be evaluated using quantitative metrics
- STAM metrics provide such an evaluation
- STAM metrics should be integrated in an overall measurement program