1 Seven Key Measures for Software Testing
Graham Thomas
RCOG, 15th June 2006
Specialist Interest Group in Software Testing
2 Abstract
Last year I came across the worst measurement example that I had seen in over 20 years' experience in IT. Part of the problem comes from the fact that there isn't a standard set of measures, so should we actually get upset when software testers measure the wrong thing, in the wrong way, and then report it badly? Actually, no! Not until there is a standard definition for software test measurement and reporting.
So there is the challenge for this presentation: to present a standard set of measures, metrics and reports for software testing, so that there can no longer be any excuse.
This presentation proposes seven key measures across the software testing lifecycle, covering planning, risk, test preparation, test execution and defect analysis. It will also identify effective ways to present the seven key measures in the form of a practical model.
3 Agenda
● An example Test Report
● Definition of Measurement and Metric
● Seven Key Measures
● Weekly Reporting
● More measures and metrics
● Tips for success
● Conclusion
4 An example Test Report
● Let's look at an example test report
● To summarise:
  - Poor presentation
  - Unable to quickly, simply and easily get a view of testing
  - Too much information
  - Difficult to compare data
  - Real message obscured
  - Actually unintelligible!
5 Definition: Measurement - Metric
● Measurement: "The act or process of measuring"
● Metric: "A calculated term or enumeration representing some aspect of biological assemblage, function, or other measurable aspect and is a characteristic of the biota that changes in some predictable way with increased human influence"
Ref: Minnesota Pollution Control Agency Biological Monitoring Program Glossary of Terms
6 Testing Progress [chart]
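Only the chart title survives here. As a hedged illustration of how such a progress view can be derived, a minimal Python sketch, assuming a simple in-memory execution log (the planned total, field names and data are illustrative, not from the slide):

    from collections import Counter

    # Illustrative execution log: (week, status) pairs; the planned total is assumed.
    TOTAL_PLANNED = 120
    results = [
        (1, "passed"), (1, "failed"), (2, "passed"), (2, "passed"),
        (3, "failed"), (3, "passed"), (3, "passed"),
    ]

    executed = Counter(week for week, _ in results)
    passed = Counter(week for week, status in results if status == "passed")

    # Report cumulative execution against plan, week by week.
    cum_exec = cum_pass = 0
    for week in sorted(executed):
        cum_exec += executed[week]
        cum_pass += passed[week]
        print(f"Week {week}: executed {cum_exec}/{TOTAL_PLANNED} "
              f"({100 * cum_exec / TOTAL_PLANNED:.0f}%), passed {cum_pass}")

Scripting progress (the next slide) can be tracked the same way, counting scripts written and reviewed instead of tests executed and passed.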
7 Scripting Progress [chart]
8 Risk Profile [chart]
9 Risk Mitigation [chart]
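The two risk slides also survive only as titles. One common way to quantify such views is the likelihood × impact exposure model, with mitigation shown as a reduced residual exposure; the model choice and the data below are assumptions, not taken from the slides:

    # Each risk: (name, likelihood 1-5, impact 1-5, residual likelihood after
    # mitigation). All values are illustrative.
    risks = [
        ("Environment instability", 4, 5, 2),
        ("Late requirements",       3, 4, 3),
        ("Key tester unavailable",  2, 3, 1),
    ]

    # Rank by raw exposure, then show the mitigated (residual) figure alongside.
    for name, likelihood, impact, residual in sorted(
            risks, key=lambda r: r[1] * r[2], reverse=True):
        print(f"{name:26} exposure {likelihood * impact:2}, "
              f"after mitigation {residual * impact:2}")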
10 Fault S-Curve [chart]
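A fault S-curve plots cumulative defects found against time; the curve typically steepens mid-test and flattens as testing converges. A minimal sketch, with illustrative daily counts and a text rendering of the curve:

    # Defects raised each day (illustrative data).
    daily_defects = [1, 3, 6, 9, 11, 8, 5, 3, 1, 0]

    # Accumulate and print a crude horizontal bar per day: the bar lengths
    # trace out the S-shape.
    cumulative = 0
    for day, found in enumerate(daily_defects, start=1):
        cumulative += found
        print(f"Day {day:2}: {'#' * cumulative} ({cumulative})")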
11 Environment Availability [chart]
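Environment availability is commonly reported as the time the environment was usable over the time it was scheduled. A sketch under that assumption, with illustrative figures:

    scheduled_hours = 40.0      # environment booked for testing this week (assumed)
    outages = [2.5, 1.0, 4.0]   # hours lost to outages (illustrative)

    available = scheduled_hours - sum(outages)
    print(f"Availability: {100 * available / scheduled_hours:.1f}%")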
12 Coverage [chart]
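Coverage can mean several things; given the lifecycle the abstract describes, requirements coverage is a plausible reading. A sketch computing the share of requirements with at least one passing test, using an assumed, illustrative mapping:

    # Requirement -> statuses of the tests that cover it (illustrative mapping).
    coverage_map = {
        "REQ-001": ["passed", "passed"],
        "REQ-002": ["failed"],
        "REQ-003": [],              # no test written yet
    }

    covered = sum(1 for tests in coverage_map.values() if "passed" in tests)
    print(f"Requirements covered: {covered}/{len(coverage_map)} "
          f"({100 * covered / len(coverage_map):.0f}%)")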
13 Weekly Report [diagram: Config - Status - Metrics - Analysis - Risk]
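Reading the five words on this slide as the sections of the weekly report, a skeleton generator might look like the sketch below; the section contents are illustrative placeholders, not from the presentation:

    from datetime import date

    # The five section names come from the slide; the contents are invented.
    sections = {
        "Config":   "Release 2.3, test environment ENV-A",
        "Status":   "Execution 62% complete, on plan",
        "Metrics":  "Pass rate 88%, 14 defects open",
        "Analysis": "Failures clustered in payments module",
        "Risk":     "Environment outage risk remains HIGH",
    }

    print(f"Weekly Test Report - w/e {date.today():%d %b %Y}")
    for name, body in sections.items():
        print(f"{name:9}: {body}")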
14 More Measures and Metrics
[Stroop Effect demonstration: colour names and nonsense words printed in mismatched ink colours]
● Use these views to support the view and message:
  - Defects by Type / Severity / Priority
  - Defect Hot Spot Analysis
  - Defect Age Analysis
  - Causal Analysis
● Metrics:
  - Defects / Faults per KLOC / KXLOC
  - Defects per Requirement
  - Days Test Effort per Requirement
  - DDP (Defect Detection Percentage)
● The Stroop Effect
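Two of the listed metrics have widely used definitions: DDP is the defects found in test as a percentage of all defects found in test plus those that escaped to live, and defects per KLOC normalises the defect count by code size in thousands of lines. A sketch with illustrative numbers:

    found_in_test = 180     # defects found during testing (illustrative)
    found_in_live = 20      # escapes found after release (illustrative)
    kloc = 75.0             # system size in thousands of lines of code (assumed)

    # Defect Detection Percentage: share of all known defects caught in test.
    ddp = 100 * found_in_test / (found_in_test + found_in_live)
    print(f"DDP: {ddp:.1f}%")
    print(f"Defects per KLOC: {(found_in_test + found_in_live) / kloc:.2f}")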
15 A Few Tips for Successful Charting
● Easily distinguishable colours
● Consistent look and feel
● If you shade, then light at the top, dark at the bottom
● RED means DANGER!
● Cumulative totals enable trends to be spotted
● Remember it is the content that is important
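Applied to the testing-progress measure, these tips might translate into a chart like the sketch below: distinguishable colours, cumulative series so trends can be spotted, and red reserved for failures (matplotlib is assumed available; the data is illustrative):

    import matplotlib.pyplot as plt

    weeks = [1, 2, 3, 4, 5]
    cum_executed = [10, 25, 45, 70, 90]   # cumulative tests executed
    cum_passed = [8, 20, 38, 60, 80]      # cumulative tests passed
    cum_failed = [2, 5, 7, 10, 10]        # cumulative tests failed

    # One distinguishable colour per series; red kept for the danger signal.
    plt.plot(weeks, cum_executed, color="tab:blue", label="Executed (cumulative)")
    plt.plot(weeks, cum_passed, color="tab:green", label="Passed (cumulative)")
    plt.plot(weeks, cum_failed, color="tab:red", label="Failed (cumulative)")
    plt.xlabel("Week")
    plt.ylabel("Tests")
    plt.title("Testing Progress")
    plt.legend()
    plt.savefig("testing_progress.png")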
16 Conclusions
● There are other things to measure than just defects
● Report trends rather than snapshot measurements
● Limit the amount that you report to increase impact
● Be consistent in your reporting
● Explain what you mean by your measurements - don't assume that others will automatically know
17 Contact Details
Graham Thomas
Independent Consultant
graham@badgerscroft.com
+44 7973 387 853
www.badgerscroft.com