Slide 1: Software Engineering: A Practitioner’s Approach, 6/e. Chapter 22: Process and Project Metrics
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 6/e and are provided with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001, 2005.
For University Use Only. May be reproduced ONLY for student use at the university level when used in conjunction with Software Engineering: A Practitioner’s Approach. Any other reproduction or use is expressly prohibited.
Slide 2: Administrative Stuff
- Detailed info about the presentation is available on the website.
- The final review is posted on the website.
- T/TH class: when do you want to present, Tuesday or Thursday?
- Class evaluations are coming.
- See me if you are interested in becoming a TA or GTA for this class.
Slide 3:
"Until you can measure something and express it in numbers, you have only the beginning of understanding." - Lord Kelvin
Slide 4: A Good Manager Measures
(Diagram) What do we use as a basis for measurement? Size? Function? Measurement of the process yields process metrics; measurement of the product yields project and product metrics.
Slide 5: Why Do We Measure?
- To assess the status of an ongoing project.
- To track potential risks.
- To uncover problem areas before they go "critical."
- To adjust work flow or tasks.
- To evaluate the project team's ability to control the quality of software work products.
Slide 6: Process versus Project Metrics
- Process metrics: measure the process to help update and change the process as needed across many projects.
- Project metrics: measure specific aspects of a single project to improve the decisions made on that project.
- Frequently the same measurements can be used for both purposes.
Slide 7: Process Measurement
- We measure the efficacy of a software process indirectly; that is, we derive a set of metrics based on the outcomes of the process.
- Outcomes include:
  - measures of errors uncovered before release of the software
  - defects delivered to and reported by end users
  - work products delivered (productivity)
  - human effort expended
  - calendar time expended
  - schedule conformance
  - many others
- We also derive process metrics by measuring the characteristics of specific software engineering tasks.
Slide 8: Process Metrics Guidelines
- Use common sense and organizational sensitivity when interpreting metrics data.
- Provide regular feedback to the individuals and teams who collect measures and metrics.
- Don't use metrics to appraise individuals.
- Work with practitioners and teams to set clear goals and the metrics that will be used to achieve them.
- Never use metrics to threaten individuals or teams.
- Metrics data that indicate a problem area should not be considered "negative"; these data are merely an indicator for process improvement.
- Don't obsess over a single metric to the exclusion of other important metrics.
Slide 9: Software Process Improvement
(Diagram) The SPI cycle: the process model and improvement goals feed process metrics, which in turn drive process improvement recommendations. Make your metrics actionable!
Slide 10: Typical Process Metrics
- Quality-related: focus on the quality of work products and deliverables (e.g., correctness, maintainability, integrity, usability; severity of errors on a 1-5 scale; MTTF, mean time to failure; MTTR, mean time to repair).
- Productivity-related: production of work products relative to effort expended (e.g., earned value analysis).
- Statistical SQA data: error categorization and analysis.
- Defect removal efficiency: propagation of errors from process activity to activity. For a given stage: defects found in this stage / (defects found in this stage + defects found in the next stage).
- Reuse data: the number of components produced and their degree of reusability.
- Note: within a single project these can also be "project metrics"; across projects they are "process metrics."
Slide 11: Effective Metrics (ch. 15)
- Simple and computable
- Empirically and intuitively persuasive
- Consistent and objective
- Consistent in use of units and dimensions
- Programming-language independent
- Actionable
Slide 12: Actionable Metrics
- Actionable metrics (or information in general) are metrics that guide change or decisions about something.
- Actionable: measure the amount of human effort versus use cases completed. Too high: more training, more design, etc. Very low: maybe we can shorten the schedule.
- Not actionable: measure the number of times the letter "e" appears in code.
- Think before you measure. Don't waste people's time!
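The actionable example above can be sketched in a few lines. The 40/10 person-hour cutoffs and the sample figures below are invented for illustration; real thresholds would come from an organization's own baseline data.

```python
# A minimal sketch of an actionable metric: person-hours of effort per
# completed use case, with hypothetical thresholds that trigger a decision.

def effort_per_use_case(person_hours: float, use_cases_completed: int) -> float:
    """Return effort (person-hours) spent per completed use case."""
    if use_cases_completed <= 0:
        raise ValueError("need at least one completed use case")
    return person_hours / use_cases_completed

def interpret(ratio: float, high: float = 40.0, low: float = 10.0) -> str:
    """Map the metric to an action; the 40/10 cutoffs are illustrative only."""
    if ratio > high:
        return "investigate: consider more training or more up-front design"
    if ratio < low:
        return "ahead of plan: consider shortening the schedule"
    return "within expectations: no action needed"

ratio = effort_per_use_case(person_hours=600, use_cases_completed=12)
print(ratio, interpret(ratio))  # 50.0 -> investigate
```

The point of the slide is the `interpret` step: a metric is actionable only if some range of its values changes a decision.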
Slide 13: Project Metrics
- Used to minimize the development schedule by making the adjustments necessary to avoid delays and mitigate potential problems and risks.
- Used to assess product quality on an ongoing basis and, when necessary, modify the technical approach to improve quality.
- Every project should measure:
  - Inputs: measures of the resources (e.g., people, tools) required to do the work.
  - Outputs: measures of the deliverables or work products created during the software engineering process.
  - Results: measures that indicate the effectiveness of the deliverables.
Slide 14: Typical Project Metrics
- Effort/time per software engineering task
- Errors uncovered per review hour
- Scheduled vs. actual milestone dates
- Number and characteristics of changes
- Distribution of effort across software engineering tasks
Slide 15: Metrics Guidelines
Same as the process metrics guidelines (slide 8): use common sense and organizational sensitivity when interpreting metrics data; provide regular feedback to the individuals and teams who collect measures and metrics; don't use metrics to appraise individuals; work with practitioners and teams to set clear goals and the metrics that will be used to achieve them; never use metrics to threaten individuals or teams; treat data that indicate a problem area as an indicator for process improvement, not as "negative"; and don't obsess over a single metric to the exclusion of other important metrics.
Slide 16: Typical Size-Oriented Metrics
- errors per KLOC (thousand lines of code)
- defects per KLOC
- $ per LOC
- pages of documentation per KLOC
- errors per person-month
- errors per review hour
- LOC per person-month
- $ per page of documentation
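Computing several of the size-oriented metrics above is simple normalization by KLOC or by effort. The project figures below are invented for illustration:

```python
# A small sketch of size-oriented metrics, normalizing by KLOC
# (thousands of lines of code). All project figures are made up.

loc = 24_500            # lines of code delivered
errors = 112            # errors found before release
defects = 19            # defects found after release
cost_dollars = 168_000  # total project cost
effort_pm = 14.0        # person-months expended

kloc = loc / 1000.0
print(f"errors per KLOC:      {errors / kloc:.2f}")
print(f"defects per KLOC:     {defects / kloc:.2f}")
print(f"$ per LOC:            {cost_dollars / loc:.2f}")
print(f"LOC per person-month: {loc / effort_pm:.0f}")
```

Note that every one of these depends on how LOC is counted (blank lines? comments? generated code?), which is one reason the deck later argues for function points.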
Slide 17: Typical Function-Oriented Metrics
- errors per Function Point (FP)
- defects per FP
- $ per FP
- pages of documentation per FP
- FP per person-month
Slide 18: But... What is a Function Point?
- Function points (FP) are a unit measure for software size, developed at IBM in 1979 by Allan Albrecht.
- To determine your number of FPs, you classify a system into five classes:
  - Transactions: External Inputs, External Outputs, External Inquiries
  - Data storage: Internal Logical Files and External Interface Files
- Each class is then weighted by complexity as low/average/high.
- The total is multiplied by a value adjustment factor (determined by asking questions based on 14 system characteristics).
Slide 19: But... What is a Function Point?

  Component                  Count   Low   Average   High   Total
  External Inputs                    x3    x4        x6
  External Outputs                   x4    x5        x7
  External Inquiries                 x3    x4        x6
  Internal Logical Files             x7    x10       x15
  External Interface Files           x5    x7        x10

  Unadjusted total:
  Value adjustment factor:
  Total adjusted value:
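The worksheet above can be sketched as a small calculation. The weights are the slide's; the sample component counts and the value adjustment factor of 1.05 are invented for illustration:

```python
# A sketch of a function point count using the weights from the slide's
# worksheet. The sample counts and the VAF value are invented.

# weight by complexity: (low, average, high)
WEIGHTS = {
    "external_inputs": (3, 4, 6),
    "external_outputs": (4, 5, 7),
    "external_inquiries": (3, 4, 6),
    "internal_logical_files": (7, 10, 15),
    "external_interface_files": (5, 7, 10),
}

def unadjusted_fp(counts: dict) -> int:
    """counts maps component name -> (n_low, n_avg, n_high)."""
    total = 0
    for name, (n_low, n_avg, n_high) in counts.items():
        w_low, w_avg, w_high = WEIGHTS[name]
        total += n_low * w_low + n_avg * w_avg + n_high * w_high
    return total

def adjusted_fp(ufp: int, vaf: float) -> float:
    """Apply the value adjustment factor (derived in practice from the
    14 general system characteristics; here supplied directly)."""
    return ufp * vaf

counts = {
    "external_inputs": (3, 2, 0),           # 3*3 + 2*4       = 17
    "external_outputs": (2, 1, 1),          # 2*4 + 1*5 + 1*7 = 20
    "external_inquiries": (1, 1, 0),        # 1*3 + 1*4       = 7
    "internal_logical_files": (1, 1, 0),    # 1*7 + 1*10      = 17
    "external_interface_files": (0, 1, 0),  # 1*7             = 7
}
ufp = unadjusted_fp(counts)          # 68
print(ufp, adjusted_fp(ufp, vaf=1.05))  # 68, then 68 scaled by the VAF
```

The unadjusted total is what the worksheet's middle rows produce; the last two worksheet lines correspond to `vaf` and `adjusted_fp`.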
Slide 20: Function Point Example
http://www.his.sunderland.ac.uk/~cs0mel/Alb_Example.doc
Slide 21: Comparing LOC and FP
(Chart) Representative LOC-per-FP values for various programming languages, developed by QSM.
Slide 22: Why Opt for FP?
- Programming-language independent.
- Uses readily countable characteristics that are determined early in the software process.
- Does not "penalize" inventive (short) implementations that use fewer LOC than other, clumsier versions.
- Makes it easier to measure the impact of reusable components.
Slide 23: Object-Oriented Metrics
- Number of scenario scripts (use cases)
- Number of support classes (classes required to implement the system but not immediately related to the problem domain)
- Average number of support classes per key class (analysis class)
- Number of subsystems (an aggregation of classes that support a function visible to the end user of a system)
Slide 24: WebEngineering Project Metrics
- Number of static Web pages (the end user has no control over the content displayed on the page)
- Number of dynamic Web pages (end-user actions result in customized content displayed on the page)
- Number of internal page links (pointers that provide a hyperlink to some other Web page within the WebApp)
- Number of persistent data objects
- Number of external systems interfaced
- Number of static content objects
- Number of dynamic content objects
- Number of executable functions
Slide 25: Measuring Quality
- Correctness: the degree to which a program operates according to specification. Measured as verified non-conformance with requirements per KLOC.
- Maintainability: the degree to which a program is amenable to change. Measured by MTTC (mean time to change): the time to analyze, design, implement, and deploy a change.
- Integrity: the degree to which a program is impervious to outside attack. Given a threat probability and security (the likelihood of repelling an attack): integrity = 1 - (threat * (1 - security)). E.g., threat = 0.25, security = 0.95 gives integrity ≈ 0.99.
- Usability: the degree to which a program is easy to use. Many options; see ch. 12.
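The integrity formula above is easy to check numerically; a minimal sketch, using the slide's own example values (in practice the measure is accumulated over each threat type, but a single threat is shown here):

```python
# A sketch of the integrity metric from the slide:
# integrity = 1 - threat * (1 - security), where threat is the
# probability of an attack and security is the likelihood of repelling it.

def integrity(threat: float, security: float) -> float:
    """Probability that the system is not successfully attacked."""
    if not (0.0 <= threat <= 1.0 and 0.0 <= security <= 1.0):
        raise ValueError("threat and security must be probabilities in [0, 1]")
    return 1.0 - threat * (1.0 - security)

print(integrity(threat=0.25, security=0.95))  # 0.9875, i.e. ~0.99 as on the slide
```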
Slide 26: Defect Removal Efficiency
DRE = E / (E + D)
- E is the number of errors found before delivery of the software to the end user.
- D is the number of defects found after delivery.
Slide 27: Defect Removal Efficiency
DRE = E / (E + D). Defects found during each phase, with the per-phase DRE worked out as defects found in this stage / (this stage + next stage):
- Requirements (10): 10 / (10 + 20) = 33%
- Design (20): 20 / (20 + 50) = 28%
- Construction: Implementation (5): 5 / (5 + 50) = 9%
- Construction: Unit Testing (50): 50 / (50 + 100) = 33%
- Testing: Integration Testing (100): 100 / (100 + 250) = 28%
- Testing: System Testing (250): 250 / (250 + 5) = 98%
- Testing: Acceptance Testing (5): 5 / (5 + 10) = 33%
- By Customer (10)
What are the rest?
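Using the slide's per-phase counts, the overall DRE of slide 26 falls out directly, treating everything found before "By Customer" as pre-delivery (an assumption about where delivery falls in this example):

```python
# Overall defect removal efficiency, DRE = E / (E + D), computed from the
# slide's per-phase defect counts: E is everything found before delivery,
# D is what the customer found after delivery.

found_before_delivery = {
    "Requirements": 10,
    "Design": 20,
    "Implementation": 5,
    "Unit Testing": 50,
    "Integration Testing": 100,
    "System Testing": 250,
    "Acceptance Testing": 5,
}
found_by_customer = 10

E = sum(found_before_delivery.values())  # 440
D = found_by_customer
dre = E / (E + D)
print(f"DRE = {E} / ({E} + {D}) = {dre:.1%}")  # DRE = 440 / (440 + 10) = 97.8%
```

A high overall DRE can hide a weak early phase, which is why the slide also computes the stage-by-stage ratios.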
Slide 28: Metrics for Small Organizations
- time (hours or days) elapsed from the time a request is made until the evaluation is complete, t_queue
- effort (person-hours) to perform the evaluation, W_eval
- time (hours or days) elapsed from completion of the evaluation to assignment of the change order to personnel, t_eval
- effort (person-hours) required to make the change, W_change
- time (hours or days) required to make the change, t_change
- errors uncovered during the work to make the change, E_change
- defects uncovered after the change is released to the customer base, D_change
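The raw measures above only become useful when combined into indicators. A sketch of two such derived indicators, with field names following the slide's symbols and invented sample values:

```python
# A sketch aggregating the small-organization change measures into two
# derived indicators: total elapsed time for a change, and a per-change
# defect removal efficiency. The sample values are invented.

from dataclasses import dataclass

@dataclass
class ChangeRecord:
    t_queue: float   # days from request until evaluation is complete
    w_eval: float    # person-hours to perform the evaluation
    t_eval: float    # days from evaluation to assignment of the change order
    w_change: float  # person-hours to make the change
    t_change: float  # days to make the change
    e_change: int    # errors uncovered while making the change
    d_change: int    # defects uncovered after the change is released

    def elapsed_days(self) -> float:
        """Total calendar time from request to completed change."""
        return self.t_queue + self.t_eval + self.t_change

    def change_dre(self) -> float:
        """Defect removal efficiency for this change, E / (E + D)."""
        return self.e_change / (self.e_change + self.d_change)

r = ChangeRecord(t_queue=3.0, w_eval=4.0, t_eval=1.0,
                 w_change=20.0, t_change=5.0, e_change=9, d_change=1)
print(r.elapsed_days(), r.change_dre())  # 9.0 0.9
```

Tracked over many change requests, averages of these indicators give a small organization a low-cost process baseline.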
Slide 29: Establishing a Metrics Program
Set goals:
- Identify your business goals.
- Identify what you want to know or learn.
- Identify your subgoals.
- Identify the entities and attributes related to your subgoals.
- Formalize your measurement goals.
Determine indicators for goals:
- Identify quantifiable questions and the related indicators that you will use to help you achieve your measurement goals.
- Identify the data elements that you will collect to construct the indicators that help answer your questions.
Define measurements:
- Define the measures to be used, and make these definitions operational.
- Identify the actions that you will take to implement the measures.
- Prepare a plan for implementing the measures.
Slide 30: Questions
- What are some reasons NOT to use lines of code to measure size?
- What do you expect the DRE rate will be for the implementation (or construction) phase of the software lifecycle? What about for testing?
- Give an example of a usability metric.
- According to the chart, Smalltalk is much more efficient than Java and C++. Why don't we use it for everything?