
1 How to Measure the Impact of Specific Development Practices on Fielded Defect Density

2 Purpose of Study: Mathematically correlate software development practices to defect density

3 Why is this important? If we can determine which practices lead to lower defect densities, we can use that knowledge to direct future development efforts, with the goal of eliminating the most defects at the least cost.

4 Dual Purpose: The model also helps predict an organization's ability to deliver products on schedule, or at least closer to schedule. That is, the study suggests there is a potential misconception within the industry: that practices leading to lower defect densities slow a project down and can cause delays. The study shows that the organizations with the lowest defect densities also have the highest probability of delivering on time. Additionally, when these organizations do miss the targeted delivery date, they are late by a smaller margin.

5 History: The USAF Rome Laboratories produced one of the first documents aimed at correlating development practices to defect density. That document served as a springboard for this study. The author sought to improve upon the initial work by:
 Developing individual weights for each parameter. In the Rome study every parameter was given equal weight, whereas this study aimed to determine the weight of each parameter individually.
 Selecting parameters that were objective and reliable, that is, parameters that could be measured repeatedly and consistently across organizations. For example, the author avoided measurements related to developer experience.
 Making the study broader, i.e., applicable to commercial applications.
 Making the results independent of the compiler.
 Incorporating newer technologies, such as object orientation and incremental models, into the study.

6 History Continued: Forty-five organizations were evaluated, but only seventeen organizations' documents were used in the study, based on the perceived accuracy and completeness of their data. The original parameter list contained 27 parameters but was expanded to 102, both because of newer technologies and tools for data collection and through interviews with the organizations with the highest and lowest defect densities to determine the major differences between them.

7 Positive Attributes of the Study:
 Only one person evaluated the practices of each organization, making the evaluation process consistent; i.e., the same criteria were applied across the board.
 The author had intimate knowledge of each organization and could therefore distinguish between "actual" and "wish list" practices.
 The author required physical proof for every positively answered question.
 The author required a wide cross section of responses from each organization to help ensure the accuracy of reported data: managers, lead engineers, quality engineers, test engineers, seasoned members, new hires, etc.

8 The Outcome

9 Results

Score on Study | Classification     | Avg defect density (defects per Assembler KSLOC) | Avg probability of late delivery | Avg margin of error on late deliveries (% of original schedule) | Avg corrective action releases per year | Avg SEI CMM level
700+           | Best Practices     | 0.14 | 30% | 10%  | 3.67 | 2.1
300-699        | Good Practices     | 0.48 | 66% | 88%  | 6    | 1.2
100-299        | Moderate Practices | 0.96 | 82% | 215% | 8.5  | 1
< 100          | Least Practices    | 1.35 | 88% | 138% | 14   | 1

10 Common Practices Amongst organizations with the highest scores and lowest defect densities:
 Software engineering is part of the engineering process and is not considered an art form
 Well-rounded practices, as opposed to belief in a "single bullet" philosophy
 Formal and informal reviews of the software requirements prior to proceeding to design and code
 Testers are involved in the requirements translation process

11 Common Practices Amongst organizations with the lowest scores and highest defect densities:
 Lack of software management
 The misconception that programmers know what the customer wants better than the customer does
 An inability to focus on the requirements and use cases with the highest occurrence probability
 A complete absence of a requirements definition process
 Insufficient testing methods

12 The Score

13 Scoring Methodology
 Review practices that had already been mathematically correlated by USAF Rome Laboratories
 Study organizations that were at the top of their industry
 Study organizations that were at the bottom of their industry
 Poll customers on which key factors they felt impacted software reliability, and investigate those factors

14 Scoring Methodology Continued
 Select parameters that apply to many organizations, as opposed to a single one
 Make sure each parameter is measurable
 Determine whether each individual parameter correlates to the empirical defect densities of the samples
 Drop parameters that do not correlate, but keep the data in case a parameter correlates at a later time
 If a parameter correlates, determine its relative weight, which does not necessarily relate directly or linearly to the correlation (a minimal sketch of this screening step follows below)
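The screening step above amounts to correlating each candidate parameter against the empirical defect densities and dropping weak correlators. Here is a minimal sketch of that idea; the parameter names, sample values, and threshold are illustrative assumptions, not the study's actual data or cutoff:

```python
# Hypothetical sketch of the correlate-then-filter step; the parameter
# names, sample values, and threshold are illustrative, not the study's.
from statistics import correlation  # Pearson's r (Python 3.10+)

def screen_parameters(params: dict[str, list[float]],
                      defect_densities: list[float],
                      threshold: float = 0.5) -> dict[str, float]:
    """Return {parameter: r} for each parameter whose |r| against the
    empirical defect densities meets the threshold."""
    kept = {}
    for name, scores in params.items():
        r = correlation(scores, defect_densities)
        if abs(r) >= threshold:
            kept[name] = r
        # Parameters that do not correlate are dropped from scoring,
        # but their raw data would be retained in case they correlate later.
    return kept

# Made-up points for four organizations, paired with their defect densities:
params = {
    "system_testing": [40, 25, 10, 0],
    "code_reviews":   [10, 10, 5, 10],
}
defect_densities = [0.14, 0.48, 0.96, 1.35]
print(screen_parameters(params, defect_densities))  # keeps only system_testing
```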

15 Equation: DD = 0.000001x^2 - 0.002531x + 1.394135, where x is an organization's score on the study and DD is the predicted defect density in defects per Assembler KSLOC.
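A quick way to sanity-check the fit is to evaluate it at the score bands from the Results slide; a minimal sketch (the helper name is ours, not the study's):

```python
# Evaluate the study's quadratic fit relating study score to fielded
# defect density (defects per Assembler KSLOC). The function name is ours.
def predicted_defect_density(score: float) -> float:
    """DD = 0.000001x^2 - 0.002531x + 1.394135"""
    return 0.000001 * score**2 - 0.002531 * score + 1.394135

for score in (0, 100, 300, 700):
    print(f"score {score:>3}: ~{predicted_defect_density(score):.2f} defects/KSLOC")
# A score of 700 predicts roughly 0.11 defects/KSLOC, consistent with the
# 0.14 average reported for the Best Practices band.
```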

16 Practices for which a stronger correlation was expected:
 Configuration management and source control
 Use of automated tools
 Code reviews
 Implementation of OO technologies

17 Top Ten Parameters with the highest correlations

Correlation | Max Points | % of Max Score | Description
-0.89 | 40   | 5.08  | LCP-ST3
-0.86 | 70   | 8.90  | PM1
-0.86 | 70   | 8.90  | PM2
-0.85 | 40   | 5.08  | LCP-ST9
-0.84 | 10   | 1.27  | LCP-C8
-0.84 | 15   | 1.91  | LCP-A/R5
-0.83 | 1.25 | 0.16  | LCP-RT1
-0.81 | 45   | 5.72  | CC-FDRS5
-0.81 | 140  | 17.80 | LCP-LCM1
-0.76 | 5    | 0.64  | LCP-A/R8
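To illustrate how per-parameter points could roll up into a total study score, here is a hypothetical sketch using the max point values from the table above. The yes/no assessment format is our assumption; the study's actual questionnaire may award partial credit:

```python
# Hypothetical roll-up of a study score: each practiced parameter
# contributes its maximum point value. Keys and point values follow the
# table above; the binary yes/no format is our assumption.
MAX_POINTS = {
    "LCP-ST3": 40, "PM1": 70, "PM2": 70, "LCP-ST9": 40,
    "LCP-C8": 10, "LCP-A/R5": 15, "LCP-RT1": 1.25,
    "CC-FDRS5": 45, "LCP-LCM1": 140, "LCP-A/R8": 5,
}

def study_score(practiced: set[str]) -> float:
    """Sum the points of every parameter the organization practices."""
    return sum(points for key, points in MAX_POINTS.items() if key in practiced)

# An organization practicing only the two PM parameters and LCP-LCM1:
print(study_score({"PM1", "PM2", "LCP-LCM1"}))  # 280
```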

18 Top Ten Parameters with the largest percentage of Max Score

Key      | % of Max Score | Correlation Ranking
LCP-LCM1 | 17.81 | 9
PM1      | 8.9   | 2
PM2      | 8.9   | 3
CC-FDRS5 | 5.72  | 8
LCP-ST3  | 5.09  | 1
LCP-ST9  | 5.09  | 4
CC-FDRS9 | 4.58  | 14
OC2      | 3.56  | 23
OC9      | 3.56  | 53
OC8      | 3.05  | 54

Total of Top 10: 66.26
Total of Top 20: 78.72

19 Percentage of Max Score by Category

Category                                 | % of Max Score | Ranking
Corrective Action (CC)                   | 2.54%  | 8
Failure and defect reporting system (CC) | 13.74% | 4
A/R (LCP)                                | 3.56%  | 5
Coding (LCP)                             | 1.97%  | 9
Design (LCP)                             | 4.20%  | 6
Model (LCP)                              | 17.81% | 2
Miscellaneous (LCP)                      | 0.58%  | 10
Regression Testing (LCP)                 | 0.52%  | 11
System Testing (LCP)                     | 20.86% | 1
Unit Testing (LCP)                       | 3.21%  | 7
Organizational Commitment                | 13.20% | 5
Product Metrics                          | 17.81% | 3

