Copyright 1995-2007, Dennis J. Frailey. CSE8314 - Software Measurement and Quality Engineering. CSE8314 M31 - Version 7.09.

Slide 1: SMU CSE 8314 Software Measurement and Quality Engineering
Module 31: Methods of Observation, Part 2 - Significance and Response (Displaying, Analyzing and Interpreting Data)
Slide 2: The Observation Process (*)
Intake -> Meaning -> Significance -> Response
Intake and Meaning: previous module; Significance: this module; Response: other modules
(*) Satir, Virginia et al. (references)
Slide 3: Significance - Methods of Display and Interpretation (*)
(*) Weinberg, Vol. 2, chapters 4-8; Grady, chapters 2 and 12 (references)
Slide 4: Methods of Display and Interpretation
"Diagrams are Nothing; Diagramming is Everything" -- Weinberg, after Eisenhower (*)
(*) Eisenhower, Dwight: "Plans are Nothing, Planning is Everything."
Slide 5: Methods of Diagramming
- The way you present data is tied to the message you want to convey
- Know what you want to communicate before selecting a way of graphing the data
- An important objective of many types of diagrams is to show the relationships between different factors
Slide 6: Methods to be Described
- Fishbone Diagrams
- Scatter Charts
- Histograms
- Pareto Charts
- Run (Trend) Charts
- Control Charts
Slide 7: Fishbone (Ishikawa) Diagrams
[Diagram: causes such as Resources, Design, Workstations, Skills, Motivation, Errors, People, and Complexity feeding into the effect "Project Late"]
See previous module for more discussion
Slide 8: Scatter Charts
Purpose:
- To show correlation (or lack of correlation) between two variables
Method:
- Plot two or more variables on an x-y or scatter chart
Slide 9: Scatter Chart Example
Slide 10: A Form of Linear Correlation
V2 = a * V1 + b
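A linear correlation like V2 = a * V1 + b can be checked numerically as well as visually. The sketch below, with made-up paired measurements (the variable names and data are illustrative, not from the slides), fits the line by least squares and reports the correlation coefficient:

```python
import numpy as np

# Hypothetical paired measurements (e.g., module size vs. defects found)
v1 = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
v2 = np.array([23.0, 41.0, 62.0, 79.0, 102.0])

# Least-squares fit of the linear model V2 = a * V1 + b
a, b = np.polyfit(v1, v2, deg=1)

# Pearson correlation coefficient: how well a line explains the scatter
r = np.corrcoef(v1, v2)[0, 1]

print(f"slope a = {a:.2f}, intercept b = {b:.2f}, r = {r:.3f}")
```

An |r| near 1 suggests a strong linear relationship; an r near 0 suggests none (or a non-linear one, as on the next slides).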
Slide 11: A Non-Linear Correlation
Slide 12: No Apparent Correlation
Slide 13: Further Categorization of Data
Sometimes a scatter chart hides a relationship because the data were not segregated well enough:
- e.g., suppose the last diagram showed fault density vs. weeks late in shipping
- And suppose we had two kinds of projects: those that do inspections and walkthroughs, and those that use only testing to identify defects
- If we further segregate the data, we might see the following
Slide 14: Segregated Scatter Chart
Slide 15: Notes about Correlation
- Correlation does NOT necessarily mean cause and effect
- Correlation can come in many shapes
Slide 16: Histogram
Purpose:
- To show the variance in a collection of data
- Usually the data are expected to cluster about a mean
Method:
- Use a bar or column chart to show data values as a function of some variable
- Can also show frequency of occurrence vs. data value
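The "frequency of occurrence vs. data value" method amounts to binning the data and counting. A minimal sketch, using invented fault-density values and a unit bin width (both assumptions for illustration):

```python
from collections import Counter

# Hypothetical measurements (e.g., fault density per module)
values = [4.1, 4.8, 5.0, 5.2, 5.3, 5.9, 6.1, 6.4, 7.0, 9.8]

# Bin each value into unit-wide buckets and count occurrences per bucket
bin_width = 1.0
counts = Counter(int(v // bin_width) for v in values)

# Crude text histogram: one '#' per occurrence in each bucket
for b in sorted(counts):
    lo = b * bin_width
    print(f"{lo:4.1f}-{lo + bin_width:4.1f} | {'#' * counts[b]}")
```

Even this crude display makes clustering about a mean (or an outlier like 9.8) visible at a glance.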
Slide 17: Histogram Example - Second Data Value vs. Core Index
Slide 18: Histogram Example - Unexpected Variation
Slide 19: Histogram Example - # of Occurrences vs. Core Index
Slide 20: Low Variability about a Mean
Slide 21: High Variability about a Mean
Slide 22: Skewed Data
[Chart annotation: Target Value]
Slide 23: Uncontrolled Data
[Chart annotation: Target Value]
Slide 24: Do Projects with Walkthroughs Ship On Time More Often?
Slide 25: No, but They Have Fewer Defects
Slide 26: Pareto Charts
Purpose:
- To identify the most significant cases
- To highlight where to focus
- To separate the significant few from the trivial many
Method:
- Sort data by vertical-axis ("y") value
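The sort-by-y-value step, plus a cumulative percentage column, is the whole mechanism of a Pareto chart. A sketch with made-up defect-cause counts (the category names and numbers are invented for illustration):

```python
# Hypothetical defect counts by cause category
defects = {"requirements": 42, "design": 17, "coding": 55, "test setup": 8, "docs": 3}

# Pareto ordering: sort categories by count, largest first
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

total = sum(defects.values())
cumulative = 0
for cause, count in ranked:
    cumulative += count
    print(f"{cause:12s} {count:3d}  {100 * cumulative / total:5.1f}%")
```

Reading down the cumulative column shows how quickly the "significant few" categories account for most of the defects.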
Slide 27: Example - Days Lost due to Vacation
Slide 28: Example - No Apparent Discriminator
- Showing only the top of each bar helps show differences
- Sometimes Pareto analysis shows that there is no significant difference
Slide 29: Discussion
The previous charts are good at showing correlations between factors, and sometimes there is a genuine causal relationship between the factors. But the charts do not say much about the meaning of an observation:
- What does it mean to say that there are 5 defects per KLOC in the output?
- Is this good, bad, typical?
For such purposes we need to show reference points in our data.
Slide 30: Reference Point Example
[Chart annotation: Corporate Goal]
Slide 31: Run Charts or Trend Charts
Purpose:
- To show the variance over time
- To show how much variation is normal
- To help understand what constitutes normal variance and what constitutes exceptional data
Method:
- Plot all data using a line chart, then compute and (optionally) plot the average as a separate line
Note that the average is based on current data, not past history.
Slide 32: Run Chart Example - Data Relative to Recent History
Slide 33: Run Chart Example
Slide 34: Run Chart with Moving Average
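The moving average plotted on a run chart is just the mean of the most recent few points at each step. A sketch with invented weekly defect counts and an assumed 4-point window (both illustrative choices, not from the slides):

```python
# Hypothetical weekly defect counts for a run chart
data = [12, 15, 11, 18, 14, 20, 16, 13]

window = 4
# Moving average: mean of up to the last `window` points at each position
# (early positions use however many points are available)
moving_avg = []
for i in range(len(data)):
    recent = data[max(0, i - window + 1): i + 1]
    moving_avg.append(sum(recent) / len(recent))

print([round(m, 2) for m in moving_avg])
```

Plotting `moving_avg` alongside `data` smooths week-to-week noise, making the underlying trend easier to see than the raw line alone.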
Slide 35: Control Charts
Purpose:
- To track performance
- To know when a process or a machine is performing out of its normal range
- To know when to take action
Method:
- Show actual data vs. average and expected variation (control limits)
- Very much like a run chart, but with control limits added and with the average based on prior data rather than current data
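The method above can be sketched directly: compute the baseline average and spread from prior data, derive control limits, and flag current points that fall outside them. The data values are invented, and the conventional 3-sigma limits are an assumption here (the slides do not specify how the limits are set):

```python
import statistics

# Prior-period data establishes the baseline; per the slide, the average
# comes from prior data, not the data currently being judged
prior = [10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0, 10.1]
mean = statistics.mean(prior)
sigma = statistics.stdev(prior)

# Conventional 3-sigma upper and lower control limits
ucl = mean + 3 * sigma
lcl = mean - 3 * sigma

# Current observations are checked against the fixed limits
current = [10.3, 9.7, 11.4, 10.0]
out_of_control = [x for x in current if not (lcl <= x <= ucl)]
print(f"mean={mean:.2f} UCL={ucl:.2f} LCL={lcl:.2f} flagged={out_of_control}")
```

A flagged point is the chart's signal that the process is performing outside its normal range and action may be needed.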
Slide 36: Typical Control Chart
Slide 37: More on Control Charts
This will be addressed in a later module on quantitative process management.
Slide 38: Recommended Reading
Weinberg, vol. 2, Chapter 5 -- Slip Charts (see references)
Slide 39: Data Analysis - Significance
Slide 40: Data Analysis
Improper analysis can lead to wrong conclusions. Proper analysis is very hard; it requires:
- Insight into the problem
- Knowledge about software development
- Knowledge about the application
- Knowledge of the customer situation
- Tracking down the real facts
- Looking at the data in several ways
Telling the difference can be even harder!
Slide 41: Example of Need for Proper Analysis
Slide 42: Naive Conclusion
"Don't inspect designs or code. Wait until code is done, because it is cheaper to find and fix the defects while testing."
Slide 43: Proper Analysis Shows...
Each phase detects different defects; those introduced early may not be detected during the code and test phase.

Defects detected, by phase and defect type:

  Phase | Type X | Type Y | Type Z
  ------+--------+--------+-------
  RA    |   80   |   20   |   10
  PD    |   40   |   30   |   30
  DD    |   30   |   40   |   50
  C&T   |   10   |   40   |   80
  I&T   |    5   |   30   |   45
Slide 44: Proper Analysis Also Shows...
Net cost for post-release defects is higher for those introduced in early stages.
[Chart: Defect Correction Cost by Phase when Defect was Introduced]
Slide 45: An Alternative Way to Evaluate Inspections and Walkthroughs
Track the net gain for inspections and walkthroughs:
- D = defects detected per inspection
- T = time (staff hours) per inspection
- f = time (staff hours) to fix a defect after an inspection
- F = time (staff hours) to find and fix defects after release
  - F_R = F for requirements defects
  - F_D = F for design defects
  - etc. (i.e., the subscript is the phase of origin)
Slide 46: Tracking the Gain Metric
Gain for the inspection = D * (F - f) - T
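The gain metric is simple enough to compute directly. A sketch using the slide's definitions, with invented staff-hour figures for a hypothetical design inspection (the numeric values are assumptions, not data from the slides):

```python
def inspection_gain(d, f_post, f_insp, t):
    """Net staff-hour gain from an inspection, per the slide's metric:
    D defects found * (post-release fix cost F - post-inspection fix cost f),
    minus the T staff hours spent on the inspection itself."""
    return d * (f_post - f_insp) - t

# Hypothetical design inspection: 6 defects found; a design defect costs
# F_D = 20 staff hours to find and fix after release but only f = 2 after
# an inspection; the inspection took T = 12 staff hours.
gain = inspection_gain(d=6, f_post=20.0, f_insp=2.0, t=12.0)
print(gain)  # 6 * (20 - 2) - 12 = 96 staff hours saved
```

A negative result would mean the inspection cost more than it saved, e.g. an inspection that finds no defects yields -T.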
Slide 47: Analyzing the Impact of Defect Prevention Activities
Collect key post-release data:
- Incoming defects
- Repair time & cost per defect (staff hours)
- Calendar time per defect
- Phase during which the defect was introduced
Compare different projects to see the impact.
Slide 48: Comparison of Two Projects

Project 1:

  Phase Introduced | % of Defects | Avg Fix Cost
  -----------------+--------------+-------------
  Req.             |       2      |     600
  Des.             |      20      |     300
  Cod.             |      75      |      80
  Test             |       5      |      46
  Total            |     100      |     125

Project 2:

  Phase Introduced | % of Defects | Avg Fix Cost
  -----------------+--------------+-------------
  Req.             |       5      |     575
  Des.             |      15      |     235
  Cod.             |      65      |     118
  Test             |      15      |      75
  Total            |     100      |     142

Project 1 invested more money in up-front activities and ended up with a significantly lower net cost to fix defects. We also need to understand the total number of defects.
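One way to compare the two projects numerically is an average fix cost per defect, weighted by where defects were introduced. The sketch below uses the per-phase figures from the tables; note that a weighted average recomputed from the rounded percentages need not match the slide's Total row exactly (those totals may come from raw defect counts), so this is only an illustration of the computation:

```python
# Per-phase data from the two project tables:
# phase -> (percent of defects introduced, average fix cost)
project1 = {"Req.": (2, 600), "Des.": (20, 300), "Cod.": (75, 80), "Test": (5, 46)}
project2 = {"Req.": (5, 575), "Des.": (15, 235), "Cod.": (65, 118), "Test": (15, 75)}

def weighted_avg_cost(phases):
    # Average fix cost per defect, weighted by phase-of-introduction share
    return sum(pct * cost for pct, cost in phases.values()) / 100

print(weighted_avg_cost(project1))
print(weighted_avg_cost(project2))
```

Whatever the exact totals, the direction of the comparison holds: Project 1's weighted cost per defect comes out lower than Project 2's.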
Slide 49: What About Taking Action? (Response)
This module is primarily about interpreting things properly. Other modules address various ways of taking action on the basis of measurements.
Slide 50: Summary
- Drawing proper significance from data depends on methods of displaying the data and interpreting the results
- Proper analysis avoids incorrect conclusions
Slide 51: References
- Andersen, Bjorn, and Tom Fagerhaug. Root Cause Analysis. ASQ Press, 2006.
- Satir, Virginia, et al. The Satir Model: Family Therapy and Beyond. Palo Alto, CA: Science and Behavior Books, 1991. ISBN 0831400781.
- Weinberg, Gerald M. Quality Software Management, Volume 2: First-Order Measurement. New York: Dorset House, 1993. ISBN 0-932633-24-2.
Slide 52: END OF MODULE 31