How to Measure the Impact of Specific Development Practices on Fielded Defect Density

Presentation transcript:

How to Measure the Impact of Specific Development Practices on Fielded Defect Density

Purpose of Study: Mathematically correlate software development practices to defect density

Why is this important? If we can determine which practices lead to lower defect densities, we can use that knowledge to direct future development efforts toward eliminating the most defects at the least cost.

Dual Purpose: The model also helps predict an organization's ability to deliver products on schedule, or at least with a greater degree of accuracy relative to the schedule. The study suggests a potential misconception within the industry: that the practices that lead to lower defect densities slow a project down and cause delays. In fact, the study shows that the organizations with the lowest defect densities also have the highest probability of delivering the product on time. Additionally, these same organizations are late by a smaller margin when a project does go beyond the targeted delivery date.

History: The USAF Rome Laboratories produced one of the first documents aimed at correlating development practices to defect density, and that document served as a springboard for this study. The author sought to improve upon the initial work by:
 Developing individual weights for each parameter. In the Rome study each parameter was given equal weight, whereas this study aimed to determine the individual weight of each parameter.
 Seeking out parameters that were objective and reliable, i.e., parameters that could be measured repeatedly and consistently across organizations. For example, the author avoided measurements related to developer experience.
 Making the study broader, i.e., applicable to commercial applications.
 Making the model independent of the compiler.
 Incorporating newer technologies into the study, such as object orientation (OO) and incremental life-cycle models.

History Continued: Forty-five organizations were evaluated, but only seventeen organizations' data sets were used in the study, based on the perceived accuracy and completeness of the data. The original parameter list of 27 was expanded to 102 parameters, both to account for newer technologies and data-collection tools and through interviews with the organizations with the highest and lowest defect densities to determine the major differences between them.

Positive Attributes of the Study:
 Only one person evaluated the practices for each organization, making the evaluation process consistent, i.e., the same criteria were applied across the board.
 The author had intimate knowledge of each organization and could therefore distinguish between "actual" and "wish list" practices.
 The author required physical proof for all positively answered questions.
 The author required responses from a wide cross section of each organization to help ensure the accuracy of the reported data: managers, lead engineers, quality engineers, test engineers, seasoned members, new hires, etc.

The Outcome

Results

Score on Study | Classification     | Avg. defect density (defects per Assembler KSLOC) | Avg. probability of late delivery | Avg. margin of error on late deliveries (% of original schedule) | Avg. corrective action releases per year | Avg. SEI CMM level
700+           | Best Practices     | .14 | 30% | 10%  |     |
               | Good Practices     | .48 | 66% | 88%  |     |
               | Moderate Practices | .96 | 82% | 215% | 8.5 | 1
< 100          | Least Practices    |     |     | 138% | 14  | 1

(The remaining cells, including the score thresholds for the middle classifications, are illegible in the transcript.)
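As a worked example of reading the table, a defect density can be multiplied by program size to estimate fielded defects. A minimal sketch, using the densities from the table; the 200 KSLOC program size is a hypothetical assumption, not a figure from the study:

```python
# Hypothetical worked example: expected fielded defects for a given
# program size at the average defect densities reported in the study.
# The 200 KSLOC program size is an assumption for illustration only.

DENSITIES = {  # defects per Assembler KSLOC, from the results table
    "Best Practices": 0.14,
    "Good Practices": 0.48,
    "Moderate Practices": 0.96,
}

def expected_defects(ksloc: float, density: float) -> float:
    """Expected fielded defects = program size (KSLOC) * defect density."""
    return ksloc * density

for label, d in DENSITIES.items():
    print(f"{label}: {expected_defects(200, d):.0f} expected defects in 200 KSLOC")
```

At 200 KSLOC, the gap between "Best" and "Moderate" practices is roughly 28 versus 192 expected fielded defects, which is the cost argument the study is making.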

Common Practices Among Organizations with the Highest Scores and Lowest Defect Densities:
 Software engineering is treated as part of the engineering process, not as an art form.
 Well-rounded practices, as opposed to belief in a "silver bullet" philosophy.
 Formal and informal reviews of the software requirements before proceeding to design and code.
 Testers are involved in the requirements translation process.

Common Practices Among Organizations with the Lowest Scores and Highest Defect Densities:
 Lack of software management.
 The misconception that programmers know what the customer wants better than the customer does.
 An inability to focus on the requirements and use cases with the highest probability of occurrence.
 A complete absence of any requirements definition process.
 Insufficient testing methods.

The Score

Scoring Methodology
 Review practices that had already been mathematically correlated by the USAF Rome Laboratories.
 Study organizations at the top of their industry.
 Study organizations at the bottom of their industry.
 Poll customers on the key factors they felt impacted software reliability, and investigate those factors.

Scoring Methodology Continued
 Select parameters that apply to many organizations, as opposed to a single one.
 Make sure each parameter is measurable.
 Determine whether each individual parameter correlates with the empirical defect densities across the samples.
 Drop parameters that do not correlate, but keep the data in case a parameter correlates at a later time.
 If a parameter does correlate, determine its relative weight; the weight does not necessarily relate directly or linearly to the correlation.

Equation DD = x^ x
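The equation above is garbled in the transcript. As a hedged illustration of the kind of regression it describes, the sketch below fits defect density as an exponential decay in the study score via log-linear least squares; the model family and the (score, density) sample points are assumptions for illustration, not the paper's actual equation or data.

```python
# Illustrative only: fit DD = a * exp(-b * score) by log-linear least
# squares. Neither the data points nor the exponential form are taken
# from the paper; they stand in for the garbled equation above.
import math

scores = [100, 300, 500, 700]      # assumed study scores
dd     = [1.61, 0.96, 0.48, 0.14]  # assumed defect densities

# Linearize: ln(DD) = ln(a) - b * score, then ordinary least squares.
xs, ys = scores, [math.log(d) for d in dd]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = math.exp(my + b * mx)

def predict(score: float) -> float:
    """Predicted defect density for a given study score."""
    return a * math.exp(-b * score)

print(f"DD(score) ~ {a:.3f} * exp(-{b:.5f} * score)")
```

The log-linear trick keeps the fit to closed-form least squares; a direct nonlinear fit would weight the data points differently but follows the same idea of mapping a practice score to a predicted fielded defect density.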

Stronger correlation was expected for:
 Configuration management and source control
 Use of automated tools
 Code reviews
 Implementation of OO technologies

Top Ten Parameters with the Highest Correlations

(The correlation, max points, % of max score, and description columns are illegible in the transcript; the ten parameter keys, in rank order, are: LCP-ST, PM, PM, LCP-ST, LCP-C, LCP-A/R, LCP-RT, CC-FDRS, LCP-LCM, LCP-A/R.)

Top Ten Parameters with the Largest Percentage of Max Score

(The % of max score and correlation-ranking columns are illegible in the transcript; the ten parameter keys, in rank order, are: LCP-LCM, PM, PM, CC-FDRS, LCP-ST, LCP-ST, CC-FDRS, OC, OC, OC, followed by totals for the top ten.)

Percentage of Max Score by Category

Category                                 | % of Max Score | Ranking
Corrective Action (CC)                   |                |
Failure and defect reporting system (CC) |                |
A/R (LCP)                                | 3.56%          | 5
Coding (LCP)                             | 1.97%          | 9
Design (LCP)                             | 4.20%          | 6
Model (LCP)                              | 17.81%         | 2
Miscellaneous (LCP)                      | 0.58%          | 10
Regression Testing (LCP)                 | 0.52%          | 11
System Testing (LCP)                     | 20.86%         | 1
Unit Testing (LCP)                       | 3.21%          | 7
Organizational Commitment                |                |
Product Metrics                          |                |

(The missing percentages and rankings are illegible in the transcript.)