
1 Copyright 1995-2007, Dennis J. Frailey. CSE8314 - Software Measurement and Quality Engineering, Module 29, Version 7.09. SMU CSE 8314 Software Measurement and Quality Engineering, Module 29: Measuring and Improving the Software Development Process

2 Outline
- The Measurement Process
- Tying Measures to the Software Development Process
- Root Cause Analysis

3 The Measurement Process

4 The ISO 15939 Software Measurement Process Model
[Diagram: the core process, Plan the Measurement Process and Perform the Measurement Process, is driven by the information needs of project management and software development and feeds a measurement database. Around the core sit Establish and Sustain Commitment and Evaluate Measurements, which take information and evaluation results and produce improvement actions.]

5 Core Measurement Process
[Diagram: applied to a project activity such as managing project risks, the core process cycles through four steps, Plan, Collect, Analyze, and Utilize, grouped into a Plan phase and a Perform phase.]

6 The Measurement Process: Plan
- Define goals and information needs
  - Work with customer and stakeholders
  - Prioritize and align
- Select measures (see the prior module on selecting software measures)
- Plan the data collection, analysis, and reporting procedures
- Define criteria for evaluation
- Obtain approval for resources
- Deploy supporting technologies

7 The Measurement Process: Collect
- Integrate the data collection process into the software and management processes
  - Consider the need for culture change
  - Consider the impact of data collection
- Communicate the process to affected parties
- Deploy the collection process
- Collect and verify the data
- Record and store the data

8 The Measurement Process: Analyze
- Analyze the data
  - Interpret relative to models, etc.
- Report and communicate the results
  - Document
  - Display/communicate to users
  - Interpret

9 The Measurement Process: Utilize (and Evaluate Measurements, which is not part of the core process)
- Change the project
  - Identify trends early
  - Adjust plans to head off problems
- Change the process
  - At the project level
  - At the organizational level
- Change the measurement process
  - Data definitions, collection process, validation, etc.

10 Each Measure Must Have an Objective and a Customer
- The objective must be clear
  - Otherwise, people will not know why they are measuring and may resist or collect the wrong thing
- The interpretation of the data must be clear
  - Otherwise, measurements can be misused and misinterpreted
- The end user (customer) of the measurement must want the information
  - Otherwise, all of the effort serves no purpose

11 The Objective Can Be Stated in Terms of Three Elements
- To understand / evaluate / control / predict / report
- An attribute of an entity (resource, process, product)
- In order to satisfy a purpose (related to the goal of the measure)

12 Construct a Sentence for Each Measure
"We are measuring MMM so we can OOO the AAA of the EEE in order to PPP"
- MMM is a measure
- OOO is an objective
- AAA is an attribute
- EEE is an entity
- PPP is a purpose
Example: "We are measuring the number of lines of code so we can understand the size of the software in order to predict the cost and schedule of the project."
This idea is due to Linda Westfall (see references).
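The sentence template can be sketched mechanically. The following is a minimal, illustrative Python snippet (the dictionary keys and example measure are taken from the slide; the template string simply spells out the MMM/OOO/AAA/EEE/PPP slots):

```python
# Westfall's objective-sentence template: every measure in a measurement
# plan should fill all five slots, or it is not well defined.
TEMPLATE = ("We are measuring {measure} so we can {objective} "
            "the {attribute} of the {entity} in order to {purpose}.")

measures = [
    {"measure": "the number of lines of code",
     "objective": "understand",
     "attribute": "size",
     "entity": "software",
     "purpose": "predict the cost and schedule of the project"},
]

for m in measures:
    print(TEMPLATE.format(**m))
```

A measure whose dictionary is missing a slot fails loudly at `format` time, which is the point: an unfillable sentence signals an ill-defined measure.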

13 Purpose and Use Must Be Communicated to All Concerned
- If people do not know why they are being measured, they will mistrust the measurements and generate bad data

14 Use of Measurements must be Demonstrated

15 Integrating Measures and Data Collection with the Software Development Process

16 Who Cares about What?
- Managers: project measures
  - That's how they are evaluated
  - But if the project is in trouble, they need to know more
- Developers: product measures
  - That's how they are evaluated
- Both should care about process measures
  - This is usually where you learn the reasons for a problem

17 What Should We Measure?
[Diagram: the process determines the quality of the product, which determines the success of the project; process measures reveal root causes.]
- Process measures
  - How effective is the process?
  - How well are we following the process?
  - Risk monitoring
- Product measures
  - Performance and quality
  - How well is the product meeting its requirements?
- Project measures
  - Used to monitor the state of the project
  - How are we doing relative to cost, schedule, staffing, etc.?

18 Example: Same Base Measure, Different Uses
Base measure: the number of defects in the software.
- Use in a project measure: the number of defects must be below a target before the project is complete, so the data are used to measure project status.
- Use in a product measure: the product quality index is "defects per 1000 LOC," used to determine probable warranty cost.
- Use in a process measure: the quality index is compared across different processes and process improvements to determine which processes are best for future projects in the organization.
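The three uses above can be sketched from a single base measure. This is a minimal, illustrative Python example (the function names, thresholds, and defect/LOC figures are invented for illustration; only the three roles of the measure come from the slide):

```python
def project_status(defect_count, target):
    """Project measure: are we below the defect target needed to finish?"""
    return defect_count <= target

def quality_index(defect_count, loc):
    """Product measure: defects per 1000 lines of code."""
    return defect_count / (loc / 1000)

def better_process(index_a, index_b):
    """Process measure: which of two processes has the lower quality index?"""
    return "A" if index_a < index_b else "B"

# One base measure (42 defects) feeding all three views:
defects, loc = 42, 60_000
print(project_status(defects, target=50))   # project view: True (under target)
print(quality_index(defects, loc))          # product view: 0.7 per KLOC
print(better_process(0.7, 1.2))             # process view: "A"
```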

19 How Do You Fix Problems Identified by Project Measures?
- An effective solution is usually based on analyzing and fixing the process

20 Example
- Project measure: schedule performance
- Problem: schedule performance is poor (we are behind schedule)
- Solution: understand why the process is not achieving the desired schedule
  - Are we not following the process?
  - Is the process inefficient?
  - Are people poorly trained?
  - Is the process ill suited to this project?
These questions help identify process measures.

21 Example: A Measure and Its Impact
Information need: productivity.
- Measure 1: lines of code per day
  - Use: reward those who produce the most lines of code per day
  - Result: people produce bloated, inflated code in order to look good
- Measure 2: requirements met and tested, weighted by complexity of requirement
  - Use: track against history and use to identify process bottlenecks
  - Result: people will use the data to make the process more efficient, resulting in lower cost
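Measure 2 can be sketched as a simple weighted count. A minimal, illustrative Python snippet (the requirement IDs, complexity values, and status flags are invented; only the "weighted by complexity" idea comes from the slide):

```python
# Measure 2: requirements met and tested, weighted by requirement complexity.
requirements = [
    {"id": "R1", "complexity": 3, "met_and_tested": True},
    {"id": "R2", "complexity": 1, "met_and_tested": True},
    {"id": "R3", "complexity": 5, "met_and_tested": False},
]

# Credit is earned only for requirements that are both met and tested,
# so easy-but-unverified work cannot inflate the number.
earned = sum(r["complexity"] for r in requirements if r["met_and_tested"])
total = sum(r["complexity"] for r in requirements)
print(f"weighted requirements progress: {earned}/{total}")
```

Unlike lines of code per day, this number cannot be gamed by producing more output; it only moves when verified functionality is delivered.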

22 Good and Not-So-Good Measures
Goal: produce software more efficiently. Information needed: efficiency.
- Measure 1: tests completed per week
  - Result: easy tests are done first; corners are cut in testing; hard problems are ignored or deferred
- Measure 2: rework
  - Result: process and methods are improved to reduce rework, resulting in more efficient software development
  - But rework is a lagging indicator; it does not spot problems in advance

23 What Attributes Can We Measure?
- We want attributes that relate to our goals: time, resources, performance, quality, etc.
- The following type of matrix (attributes versus project, product, and process) can help:

Attribute   | Project                 | Product                          | Process
Time        | Are we on schedule?     | How fast can we manufacture?     | What is our cycle time?
Resources   | Expenses vs. budget?    | What will it cost?               | What is our productivity?
Performance | Meets mgt. goals?       | Meets perf. goals? Does it work? |
Quality     | Customer satisfaction?  | Post-release defects?            | In-process defects?

24 Tie Measures to the Process: A Measure for Every Phase

Measure                | Requirements | Design | Coding | Integration | Test Planning
Staffing               |      X       |   X    |   X    |      X      |      X
Requirements Stability |      X       |   X    |   X    |             |
Design Complexity      |              |   X    |        |             |
Code Complexity        |              |        |   X    |             |
Etc.                   |              |        |        |      X      |

25 Plan the Measurement Process to Be Part of the Software Process
- The software process and associated procedures define the day-to-day actions
  - Thus they are the ideal place to communicate details of how data should be collected and evaluated
- The software process is defined in terms of tasks to be done, inputs and outputs, and sequencing

26 Typical Process Description
[Flow diagram: Design Software Module, then Inspect Design (which produces defect data), then Software Design Review, then Place Module under Configuration Control; in parallel, Design Module Test Cases, then Place Tests under Configuration Control.]

27 "Inspect Design" Detailed Process Description
[The Inspect Design step of the previous flow expands into its own flow: Identify Inspection Participants, Hold Inspection Meeting, Record Defects Detected, Track Defects to Closure, and Decide if OK to Release to CM.]

28 "Record Defects Detected" (Text Description of Task)
- For each defect, identify the following:
  - Type
  - Origin
  - Severity
  - Date and time found
  - Person assigned to correct
  - Estimated effort to correct
- Record the above in the defect database
- When the defect is corrected, record the date and time corrected and the total effort to correct
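The record described above maps naturally onto a small data structure. This is a minimal, illustrative Python sketch (the class name, field names, and sample values are invented; the fields themselves follow the slide's list):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DefectRecord:
    defect_type: str                      # e.g. "logic error"
    origin: str                           # phase where the defect originated
    severity: str
    found: datetime                       # date and time found
    assigned_to: str                      # person assigned to correct
    estimated_effort_hours: float
    corrected: Optional[datetime] = None  # filled in at closure
    total_effort_hours: Optional[float] = None

    def close(self, when: datetime, effort: float) -> None:
        """On correction, record date/time corrected and total effort."""
        self.corrected = when
        self.total_effort_hours = effort

# A defect is opened at inspection time and closed when fixed:
d = DefectRecord("logic error", "Design", "major",
                 datetime(2007, 3, 1, 9, 30), "j.doe", 4.0)
d.close(datetime(2007, 3, 2, 16, 0), effort=6.5)
```

Comparing `estimated_effort_hours` against `total_effort_hours` at closure is one way the recorded data can later feed process measures.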

29 Tie the Software Process to the Measures
- Consider measures for each process task, procedure, input, or output:
  - Why (the goal of this measure)
  - What to collect (base measures; clear definitions)
  - How to collect (forms, procedures, automation)
  - Responsibility for collection (individuals, organizations)
  - How it will be used (analysis, formulas, etc.)

30 Tie the Software Process to the Measures (continued)
- Add this information to the process description, usually as part of the text description of the bottom-level task
- But don't overdo it
- And make sure there is a concrete benefit for each measurement being recommended

31 Another Example, Focusing on Status Measures: Process for Design Walkthrough
1) Collect design documents
2) ... (walkthrough details) ...
3) ... (walkthrough details) ...
4) Identify defects
5) Categorize by priority and type
6) Document in walkthrough report
7) Define an action plan for each defect
8) Assign each defect to an analyst for resolution
9) Report status each week until closure, using the "open defects" report

32 Typical Graph of Status Data
[Graph: "Open Defects Report" for module XXX123, plotting the number of open defects per week.]
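The status data behind such a graph can be sketched as a running balance of defects opened and closed. A minimal, illustrative Python snippet (the weekly counts are invented; only the open-defects-per-week idea comes from the slide):

```python
# Weekly "open defects" status data: defects opened and closed each week,
# accumulated into a running count of open defects (the plotted line).
opened_per_week = [5, 8, 3, 2, 1]
closed_per_week = [0, 4, 5, 4, 3]

open_defects = []
running = 0
for opened, closed in zip(opened_per_week, closed_per_week):
    running += opened - closed
    open_defects.append(running)

print(open_defects)  # [5, 9, 7, 5, 3]
```

A downward trend in the running count is what a weekly status review looks for; a flat or rising trend flags the module for attention.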

33 Root Cause Analysis

34 What Is Root Cause Analysis?
- Finding the real, underlying cause of a problem rather than just dealing with the symptoms
- Used for problems that occur often and consume a lot of resources
- The most expedient solution to any problem is usually to deal with symptoms
  - Root cause analysis provides a more lasting and, ultimately, more cost-effective solution

35 From systems-thinking.org
"To find root causes there is really only one question that's relevant: 'What can we learn from this situation?' Research has repeatedly shown that unwanted situations within organizations are about 95% related to process problems and only 5% related to personnel problems. Yet most organizations spend far more time looking for culprits than causes, and because of this misdirected effort they seldom gain the benefit they could from understanding the foundation of the unwanted situation."

36 Example
- Problem: expediting certain jobs causes others to be delayed, resulting in an overall loss of efficiency
- Root cause: why are we expediting those jobs in the first place?
- Why is the root cause not usually addressed? Because the person in charge of solving the problem (expediting the jobs) isn't being asked to look at the whole picture

37 Example of Effective RCA (1 of 5) (from systems-thinking.org)
- The Plant Manager walked into the plant and found oil on the floor. <<< Symptom
- He called the Foreman over and asked him why there was oil on the floor.
- The Foreman indicated that it was due to a leaky gasket in the pipe joint above. <<< Cause
- The Plant Manager then asked when the gasket had been replaced, and the Foreman responded that Maintenance had installed 4 gaskets over the past few weeks and each one seemed to leak.

38 Example of Effective RCA (2 of 5) (from systems-thinking.org)
- The Foreman also indicated that Maintenance had been talking to Purchasing about the gaskets because it seemed they were all bad. <<< More fundamental cause
- The Plant Manager then went to talk with Purchasing about the situation with the gaskets. The Purchasing Manager indicated that they had in fact received a bad batch of gaskets from the supplier.

39 Example of Effective RCA (3 of 5) (from systems-thinking.org)
- The Purchasing Manager also indicated that they had been trying for the past 2 months to get the supplier to make good on the last order of 5,000 gaskets, which all seemed to be bad.
- The Plant Manager then asked the Purchasing Manager why they went with the lowest bidder, and he indicated that was the direction he had received from the VP of Finance. <<< More fundamental cause

40 Example of Effective RCA (4 of 5) (from systems-thinking.org)
- The Plant Manager then went to talk to the VP of Finance about the situation.
- When the Plant Manager asked the VP of Finance why Purchasing had been directed to always take the lowest bidder, the VP of Finance said, "Because you indicated that we had to be as cost conscious as possible, and purchasing from the lowest bidder saves us lots of money!"

41 Example of Effective RCA (5 of 5) (from systems-thinking.org)
- The Plant Manager was horrified when he realized that he was the reason there was oil on the plant floor. <<< Root cause!
- Note the string of causes, across multiple parts of the organization, before the manager found the underlying root cause.

42 Some Methods of Root Cause Analysis
- Cause-and-effect charts
- The Five Whys
  - Keep asking "why" until you find the underlying reason for a problem
- Fault tree analysis
- Matrix diagrams
  - Similar to QFD
- There are many other methods (see Andersen)
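The Five Whys can be sketched as a simple chain of question/answer pairs. A minimal, illustrative Python snippet (the questions and answers paraphrase the oil-leak story from the preceding slides; the structure itself is invented for illustration):

```python
# The Five Whys: keep asking "why" until the chain of causes bottoms out.
chain = [
    ("Why is there oil on the floor?",
     "A gasket in the pipe joint above is leaking."),
    ("Why is the gasket leaking?",
     "The whole batch of gaskets was bad."),
    ("Why was a bad batch purchased?",
     "Purchasing always takes the lowest bidder."),
    ("Why always the lowest bidder?",
     "The VP of Finance directed it."),
    ("Why that directive?",
     "The Plant Manager demanded maximum cost consciousness."),
]

for why, answer in chain:
    print(f"{why} -> {answer}")
print("Root cause:", chain[-1][1])
```

Five is a rule of thumb, not a law: the chain ends when the answer is something the organization can actually change.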

43 Root Cause Analysis for Assignment 5
- After you have identified the most important problems, you will be expected to use root cause analysis to determine their underlying causes
- You will be expected to read Andersen and use at least two of the techniques found there in your analysis in Assignment 5
- This will come AFTER you have done the value-added analysis and the cost-of-quality analysis

44 Root Cause Analysis Applied to Defects
- Used to identify the causes of defects so you can change the process to prevent them
- The methods can also be used to trace forward:
  - Given a defect, what problems might it cause?
- Unfortunately, the terminology differs greatly among authors.

45 IEEE Standard for Defect Classification
- See IEEE standards for the latest status
- Rather elaborate and comprehensive
- Tends to be rather generic and therefore hard to apply to specific projects
- This illustrates one of the problems with standards: they are often too universal and generic to be of much use for specific applications

46 Hewlett-Packard Model for Defect Analysis
Each defect is classified in terms of three factors:
1) Origin (the phase of the process where the defect originated)
- Specification/Requirements
- Design
- Coding
- Environment and Support
- Documentation
- Operator or Other
See Grady pp. 127-129 and Leath (references).

47 Typical Results from Origins Analysis
[Chart: distribution of defects by origin phase.]

48 Origins Analysis Weighted by Cost to Fix Defects (from HP)
Weighting factors:
- Specification: 14.25
- Design: 6.25
- Code: 2.50
- Documentation: 1.00
- Operator: 1.00
- Other: 1.00
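Applying the weights is a simple multiplication per origin. A minimal, illustrative Python snippet (the weights are HP's cost-to-fix factors from the slide; the defect counts are invented for illustration):

```python
# Weight raw defect counts by cost-to-fix so expensive origins stand out.
weights = {"Specification": 14.25, "Design": 6.25, "Code": 2.50,
           "Documentation": 1.00, "Operator": 1.00, "Other": 1.00}
defect_counts = {"Specification": 10, "Design": 20, "Code": 50,
                 "Documentation": 15, "Operator": 3, "Other": 2}

weighted = {origin: defect_counts[origin] * weights[origin]
            for origin in weights}
for origin, cost in sorted(weighted.items(), key=lambda kv: -kv[1]):
    print(f"{origin:13s} {cost:7.2f}")
```

Note how the weighting reorders the picture: in this illustrative data, code has five times as many defects as specification, yet specification still tops the weighted ranking because each specification defect costs over five times as much to fix.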

49 Hewlett-Packard Model (continued)
2) Type of defect (what caused the problem)
- Specification error
- Functionality (impossible or incorrectly described)
- Interface (hardware, software, or user; design or implementation)
- Interprocess communication
- Data definition or data handling error
- Module design

50 More Types of Defects
- Logic error or description error
- Error checking
- Standards specified or applied incorrectly
- Computation error
- Test error (hardware, software, process)
- Integration error
- Tools error

51 Distribution of Defects by Type (from HP)
[Chart: frequency of each defect type.]

52 Hewlett-Packard Model (continued)
3) Mode (why it happened)
- Something missing
- Something unclear
- Something wrong
- Something changed
- A better way is known
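Putting the three factors together, each defect becomes an (origin, type, mode) triple that can be tallied along any axis. A minimal, illustrative Python snippet (the sample defects are invented; the three factors are HP's, from the preceding slides):

```python
from collections import Counter

# HP three-factor classification: each defect gets (origin, type, mode).
defects = [
    ("Design", "Interface", "Something unclear"),
    ("Coding", "Logic error", "Something wrong"),
    ("Design", "Module design", "Something missing"),
]

# Tally along the origin axis to see which phase injects the most defects.
by_origin = Counter(origin for origin, _type, _mode in defects)
print(by_origin.most_common(1))
```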

53 Advanced Uses
- Organizations with more experience and data may identify defects in more complex ways, such as:
  - Most expensive defect types
  - Most time-consuming defect types
  - Most customer-sensitive defect types
  - Most frequent defect types

54 Changing the Process
- They may also identify process changes that will reduce the number of defects
- Examples:
  - Change the measure by which the project manager is rewarded
  - Eliminate wasteful documentation
  - Plan testing earlier in the process

55 Cause and Effect Charts (Ishikawa Diagrams; Fishbone Diagrams)
- Useful for brainstorming possible causes of problems
- The idea is to categorize defects and then subcategorize them to reach a better understanding of what kinds of defects are occurring and why
- The basic approach is a "fishbone" picture where the main "backbone" is the category of defect and the "ribs" are subcategories
Andersen (p. 119); Grady & Caswell, p. 127; Ishikawa (see references).

56 Concept
[Fishbone diagram: the backbone is the category of defect, the ribs are subcategories, and the smaller bones off each rib are causes.]

57 Example
[Fishbone diagram for a register allocation error. Ribs include incorrect usage (with causes such as lack of knowledge and incorrect documentation) and side effects of correct usage (with causes such as keeping track improperly and poor design documentation).]
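A fishbone chart is just a two-level tree, which is easy to capture in data. A minimal, illustrative Python snippet (the labels come from the slide's example, but the assignment of causes to ribs is a plausible reading of the lost diagram, not confirmed by it):

```python
# Fishbone as nested categories: backbone -> ribs -> causes.
fishbone = {
    "Register Allocation Error": {
        "Incorrect Usage": [
            "Lack of Knowledge",
            "Incorrect Documentation",
        ],
        "Side Effect of Correct Usage": [
            "Keep Track Improperly",
            "Poor Design Documentation",
        ],
    }
}

for backbone, ribs in fishbone.items():
    print(backbone)
    for rib, causes in ribs.items():
        print("  " + rib)
        for cause in causes:
            print("    - " + cause)
```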

58 Beware of Linear Thinking
- Just because Joe is always associated with register allocation defects does not mean that Joe is the cause of those defects
- Perhaps Joe is the person who wrote the code that produces the problem, but that does not mean Joe is the problem
- Perhaps the problem is inherent in the requirements, for example

59 Many Authors Have Categorized Defects in Various Ways
- Materials, Workers, Tools, and Inspections
- Materials, Methods, Measures
- Process, Operator, Equipment, and Material
- Process, People, Tools, Inputs
- FURPS+, etc.
- Each "rib" on the fishbone diagram might be one of the defect categories.

60 Example
[Fishbone diagram whose four ribs are People, Tools, Inputs, and Process.]

61 An Interesting Finding from HP Experience Using Ishikawa
- Over half of the errors reported occurred during redesigns
- These were traced to the fact that design reviews were not carried out during redesigns, because redesigns happened in a hurried environment
- Consider the design flow diagram a few pages back: did we re-inspect the corrections?

62 Summary
- The measurement process has four steps:
  - Plan
  - Collect
  - Analyze
  - Utilize
- Construct a sentence to make sure that each measure has an appropriate objective

63 Summary (continued)
- Measure project, product, and process
  - Measure project and product to identify problems
  - Measure process to identify causes
  - Use process information to identify organizational characteristics and trends
- Tie measures to the software development process

64 Summary (concluded)
- Determine causes of problems
  - Root cause analysis
  - Ishikawa diagrams
- Beware of linear thinking

65 References
- Andersen, Bjorn, and Tom Fagerhaug. Root Cause Analysis. ASQ Quality Press, 2006.
- Grady, Robert B. Practical Software Metrics for Project Management and Process Improvement. Englewood Cliffs, N.J.: Prentice-Hall, 1992. ISBN 0-13-720384-5.
- Grady, Robert B., and Deborah L. Caswell. Software Metrics: Establishing a Company-Wide Program. Englewood Cliffs, N.J.: Prentice-Hall, 1987. ISBN 0-13-821844-7.
- Ishikawa, K. Guide to Quality Control. Asian Productivity Organization, Tokyo, 1976, pp. 18-28.

66 References (continued)
- IEEE. A Standard for Software Errors, Faults and Failures. IEEE Working Group P1044, March 1987.
- Leath, C. "A Software Defect Analysis." HP Software Productivity Conference Proceedings (April 1987), 4:147-161.
- Robitaille, Denise. Root Cause Analysis: Basic Tools and Techniques. Paton Press, 2004.
- Westfall, Linda. Software Metrics that Meet Your Information Needs. Dallas SPIN Tutorial, October 11, 1994.

67 END OF MODULE 29

