SMU CSE 8314 Software Measurement and Quality Engineering, Module 31 (CSE8314 M31 - Version 7.09). Copyright 1995-2007, Dennis J. Frailey.

Presentation transcript:

Copyright 1995-2007, Dennis J. Frailey. SMU CSE 8314 Software Measurement and Quality Engineering (CSE8314 M31 - Version 7.09). Module 31: Methods of Observation, Part 2 - Significance and Response (Displaying, Analyzing and Interpreting Data)

The Observation Process (*): Intake → Meaning → Significance → Response (previous module / this module / other modules). (*) Satir, Virginia et al. (references)

Significance: Methods of Display and Interpretation (*). (*) Weinberg, Vol. 2, chapters 4-8; Grady, chapters 2 and 12 (references)

Methods of Display and Interpretation: “Diagrams are Nothing; Diagramming is Everything” -- Weinberg, after Eisenhower (*). (*) Eisenhower, Dwight: “Plans are Nothing, Planning is Everything.”

Methods of Diagramming
- The way you present the data is associated with the message you want to convey
- It is important to know what you want to communicate before selecting a way of graphing the data
- An important objective of many types of diagrams is to show the relationships between different factors

Methods to be Described
- Fishbone Diagrams
- Scatter Charts
- Histograms
- Pareto Charts
- Run (Trend) Charts
- Control Charts

Fishbone (Ishikawa) Diagrams
[Figure: example fishbone diagram for the effect “Project Late”, with branches labeled People, Resources, Design, Skills, Motivation, Workstations, Errors, and Complexity]
See the previous module for more discussion.

Scatter Charts
Purpose:
- To show correlation (or lack of correlation) between two variables
Method:
- Plot two or more variables on an x-y (scatter) chart
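A minimal sketch of how such a chart could be produced, assuming matplotlib is available; the variable names (module_size, defect_count) and the data are made up purely for illustration:

```python
# Minimal scatter-chart sketch (illustrative data; names are hypothetical)
import matplotlib.pyplot as plt

# Example: one point per software module
module_size = [120, 340, 560, 800, 1020, 1500, 2100]   # e.g., lines of code
defect_count = [2, 5, 9, 11, 16, 22, 30]                # defects found in each module

plt.scatter(module_size, defect_count)
plt.xlabel("Module size (LOC)")
plt.ylabel("Defects found")
plt.title("Scatter chart: defects vs. module size")
plt.show()
```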

Scatter Chart Example

A form of Linear Correlation: V2 = a * V1 + b
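The coefficients a and b of such a linear relationship can be estimated from the plotted data; a minimal sketch using NumPy's least-squares polynomial fit, with illustrative data only:

```python
import numpy as np

v1 = np.array([120, 340, 560, 800, 1020, 1500, 2100], dtype=float)
v2 = np.array([2, 5, 9, 11, 16, 22, 30], dtype=float)

# Fit V2 = a * V1 + b by least squares
a, b = np.polyfit(v1, v2, deg=1)

# Correlation coefficient indicates how strong the linear relationship is
r = np.corrcoef(v1, v2)[0, 1]
print(f"a = {a:.4f}, b = {b:.2f}, r = {r:.3f}")
```

An r close to +1 or -1 suggests a strong linear relationship, though, as a later slide notes, correlation alone does not establish cause and effect.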

A Non-Linear Correlation

No Apparent Correlation

Further Categorization of Data
- Sometimes the scatter chart hides a relationship because we did not segregate the data sufficiently well
  - e.g., suppose the last diagram showed fault density and weeks late on shipping
  - And suppose we had two kinds of projects: those that do inspections and walkthroughs and those that use only testing to identify defects
  - If we further segregate the data, we might see the following

Segregated Scatter Chart
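One way to produce a segregated scatter chart like this is simply to split the points by project type and plot each group with its own marker; the records and field positions below are hypothetical:

```python
import matplotlib.pyplot as plt

# Hypothetical records: (fault density, weeks late, uses inspections/walkthroughs?)
projects = [
    (2.1, 1, True), (3.0, 2, True), (4.2, 3, True),
    (5.5, 2, False), (6.8, 5, False), (8.1, 7, False),
]

for uses_inspections, marker, label in [(True, "o", "Inspections & walkthroughs"),
                                        (False, "x", "Testing only")]:
    xs = [p[0] for p in projects if p[2] == uses_inspections]
    ys = [p[1] for p in projects if p[2] == uses_inspections]
    plt.scatter(xs, ys, marker=marker, label=label)

plt.xlabel("Fault density (defects/KLOC)")
plt.ylabel("Weeks late shipping")
plt.legend()
plt.show()
```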

Notes about Correlation
- Correlation does NOT necessarily mean cause and effect
- Correlation can come in many shapes

Histograms
Purpose:
- To show the variance in a collection of data
- Usually the data are expected to cluster about a mean
Method:
- Use a bar or column chart to show data values as a function of some variable
- Can also show frequency of occurrence vs. data value
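A minimal histogram sketch, assuming matplotlib; the values are invented and only meant to show the frequency-of-occurrence form described above:

```python
import matplotlib.pyplot as plt

# Hypothetical measurements (e.g., defect fix times in staff hours)
values = [4, 5, 5, 6, 6, 6, 7, 7, 7, 7, 8, 8, 8, 9, 9, 10, 12]

# Frequency of occurrence vs. data value, grouped into bins
plt.hist(values, bins=6, edgecolor="black")
plt.xlabel("Data value")
plt.ylabel("Number of occurrences")
plt.title("Histogram")
plt.show()
```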

Histogram Example Second Data Value vs. Core Index

Histogram Example Unexpected Variation

Histogram Example # of Occurrences vs. Core Index

Low Variability about a Mean

High Variability about a Mean

Skewed Data (target value shown)

Uncontrolled Data (target value shown)

Do Projects with Walkthroughs Ship On Time More Often?

No, but they have Fewer Defects

Pareto Charts
Purpose:
- To identify the most significant cases
- To highlight where to focus
- To separate the significant few from the trivial many
Method:
- Sort the data by vertical-axis (“y”) value
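A minimal Pareto-chart sketch, assuming matplotlib; the cause categories and counts are invented, and the key step is simply sorting by the y value before plotting:

```python
import matplotlib.pyplot as plt

# Hypothetical defect counts by cause
causes = {"Requirements": 42, "Design": 18, "Coding": 65, "Test env.": 7, "Docs": 3}

# Sort categories by descending count so the "significant few" stand out
ordered = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
labels, counts = zip(*ordered)

plt.bar(labels, counts)
plt.ylabel("Number of defects")
plt.title("Pareto chart of defect causes")
plt.show()
```

A common extension is a cumulative-percentage line on a secondary axis, which makes the "vital few vs. trivial many" split even more visible.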

Example: Days Lost due to Vacation

Example: No Apparent Discriminator
- Sometimes Pareto analysis shows that there is no significant difference
- Showing only the top of each bar helps reveal the differences that do exist

Discussion
- The previous charts are good at showing correlations between factors
- And sometimes there is a genuine causal relationship between the factors
- But the charts do not say much about the meaning of an observation
  - What does it mean to say that there are 5 defects per KLOC in the output?
  - Is this good, bad, or typical?
- For such purposes we need to show reference points in our data

Reference Point Example: Corporate Goal

Run Charts or Trend Charts
Purpose:
- To show the variance over time
- To show how much variation is normal
- To help understand what constitutes normal variance and what constitutes exceptional data
Method:
- Plot all data using a line chart, then compute and (optionally) plot the average as a separate line
- Note that the average is based on current data, not past history
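A minimal run-chart sketch, assuming matplotlib, with invented weekly defect counts; the average line is computed from the current data, as the slide notes:

```python
import matplotlib.pyplot as plt

# Hypothetical weekly defect counts
weeks = list(range(1, 13))
defects = [14, 11, 15, 13, 16, 12, 14, 18, 13, 15, 12, 14]

average = sum(defects) / len(defects)   # average of the current data

plt.plot(weeks, defects, marker="o", label="Weekly defects")
plt.axhline(average, linestyle="--", label=f"Average = {average:.1f}")
plt.xlabel("Week")
plt.ylabel("Defects")
plt.legend()
plt.show()
```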

Run Chart Example: Data relative to Recent History

Run Chart Example

Run Chart with Moving Average
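A moving average smooths a run chart by averaging each point with its recent neighbors. A small sketch of a trailing moving average (the window size of 3 is an arbitrary choice for illustration):

```python
def moving_average(values, window=3):
    """Trailing moving average; early points use as much history as exists."""
    result = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        result.append(sum(chunk) / len(chunk))
    return result

defects = [14, 11, 15, 13, 16, 12, 14, 18, 13, 15, 12, 14]
print(moving_average(defects))  # plot this series alongside the raw data
```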

Control Charts
Purpose:
- To track performance
- To know when a process or a machine is performing out of its normal range
- To know when to take action
Method:
- Show actual data vs. the average and the expected variation (control limits)
- Very much like a run chart, but with control limits added and with the average based on prior data rather than current data
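A minimal control-chart sketch. The slide does not specify how the control limits are computed; the common plus/minus three-sigma convention is assumed here, with the center line and limits derived from prior (baseline) data, consistent with the note above. All numbers are invented:

```python
import statistics
import matplotlib.pyplot as plt

# Hypothetical baseline (prior) data used to set the center line and control limits
baseline = [14, 11, 15, 13, 16, 12, 14, 13, 15, 12]
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # assumed +/- 3-sigma limits

# Current data to monitor against those limits (22 exceeds the upper limit)
current = [13, 15, 12, 14, 22, 13, 14]

plt.plot(range(1, len(current) + 1), current, marker="o", label="Observed")
plt.axhline(center, linestyle="-", label="Center (prior average)")
plt.axhline(ucl, linestyle="--", label="Upper control limit")
plt.axhline(lcl, linestyle="--", label="Lower control limit")
plt.xlabel("Sample")
plt.ylabel("Value")
plt.legend()
plt.show()

out_of_control = [x for x in current if x > ucl or x < lcl]
print("Points outside control limits:", out_of_control)
```

Points outside the control limits are the signal that the process is performing out of its normal range and that action may be needed.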

Typical Control Chart

More on Control Charts: this will be addressed in a later module on quantitative process management.

Recommended Reading: Weinberg, Vol. 2, Chapter 5 -- Slip Charts (see references)

Significance: Data Analysis

Data Analysis
- Improper analysis can lead to wrong conclusions
- Proper analysis is very hard; it requires:
  - Insight into the problem
  - Knowledge about software development
  - Knowledge about the application
  - Knowledge of the customer situation
  - Tracking down the real facts
  - Looking at the data in several ways
- Telling the difference can be even harder!

Example of Need for Proper Analysis

Naive Conclusion: “Don’t inspect designs or code; wait until code is done, because it is cheaper to find and fix the defects while testing.”

Proper Analysis Shows...
- Each phase detects different defects
  - Those introduced early may not be detected during the code and test phase
[Table: defects of Type X, Type Y, and Type Z detected by phase (RA, PD, DD, C&T, I&T); values omitted]

Proper Analysis Also Shows...
- Net cost for post-release defects is higher for those introduced in early stages
[Figure: Defect Correction Cost by Phase when Defect was Introduced]

An Alternative Way to Evaluate Inspections and Walkthroughs
Track the net gain for inspections and walkthroughs:
- D = defects detected per inspection
- T = time (staff hours) per inspection
- f = time (staff hours) to fix a defect after an inspection
- F = time (staff hours) to find and fix defects after release
  - F_R = F for requirements defects
  - F_D = F for design defects
  - etc. (i.e., the subscript is the phase of origin)

Tracking the Gain Metric: Gain for the inspection = D * (F - f) - T
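A worked illustration of the gain metric with made-up numbers (not course data): with D = 6, T = 12 staff hours, f = 2 staff hours, and F_D = 20 staff hours, the gain is 6 * (20 - 2) - 12 = 96 staff hours per design inspection. The same arithmetic as a short script:

```python
# Hypothetical values, purely to illustrate the arithmetic of the gain metric
D = 6       # defects detected per inspection
T = 12.0    # staff hours spent per inspection
f = 2.0     # staff hours to fix a defect found in an inspection
F_D = 20.0  # staff hours to find and fix a design defect after release

gain = D * (F_D - f) - T   # = 6 * 18 - 12 = 96 staff hours saved per inspection
print(f"Net gain for the inspection: {gain} staff hours")
```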

Analyzing the Impact of Defect Prevention Activities
- Collect key post-release data:
  - Incoming defects
  - Repair time and cost per defect (staff hours)
  - Calendar time per defect
  - Phase during which the defect was introduced
- Compare different projects to see the impact (a small sketch of such a comparison follows)
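A small sketch of how the collected post-release data might be summarized per project before comparison; the record format and numbers are hypothetical:

```python
from collections import defaultdict

# Hypothetical post-release defect records for one project:
# (phase introduced, repair staff hours)
post_release_defects = [
    ("Req", 40), ("Req", 55), ("Des", 30), ("Cod", 12), ("Cod", 9), ("Test", 6),
]

by_phase = defaultdict(list)
for phase, hours in post_release_defects:
    by_phase[phase].append(hours)

total = len(post_release_defects)
for phase, costs in by_phase.items():
    pct = 100 * len(costs) / total
    avg = sum(costs) / len(costs)
    print(f"{phase}: {pct:.0f}% of defects, avg fix cost {avg:.1f} staff hours")
```

Running the same summary for each project produces tables like those on the next slide, which can then be compared.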

Comparison of Two Projects
[Tables for Project 1 and Project 2: Phase Introduced (Req, Des, Cod, Test, Total) vs. % of Defects and Avg Fix Cost; values omitted]
- Project 1 invested more money in up-front activities and ended up with a significantly lower net cost to fix defects
- We also need to understand the total number of defects

Response: What About Taking Action?
- This module is primarily about interpreting things properly
- Other modules address various ways of taking action on the basis of measurements

Summary
- Drawing proper significance from data depends on the methods of displaying the data and interpreting the results
- Proper analysis avoids incorrect conclusions

References
- Andersen, Bjorn and Tom Fagerhaug, Root Cause Analysis, ASQ Press.
- Satir, Virginia, et al., The Satir Model: Family Therapy and Beyond, Palo Alto, CA: Science and Behavior Books.
- Weinberg, Gerald M., Quality Software Management, Volume 2: First-Order Measurement, Dorset House, New York.

END OF MODULE 31