1
Software Engineering: A Practitioner’s Approach, 6/e
Chapter 22: Process and Project Metrics
copyright © 1996, 2001, 2005 R.S. Pressman & Associates, Inc.
For University Use Only. May be reproduced ONLY for student use at the university level when used in conjunction with Software Engineering: A Practitioner's Approach. Any other reproduction or use is expressly prohibited.
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 6/e and are provided with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001, 2005.
2
Until you can measure something and express it in numbers, you have only the beginning of understanding. - Lord Kelvin
3
Until you can measure something and express it in numbers, you have only the beginning of understanding. - Lord Kelvin
Non-software metrics you use every day: the gas tank gauge, the speedometer, the thermostat in your house, the battery monitor in your laptop.
What would happen if these were not numeric? What other examples do you have?
Gas Tank: ☐ A Lot ☐ Some ☐ A Little
4
Until you can measure something and express it in numbers, you have only the beginning of understanding. - Lord Kelvin
The problem is that non-numeric measurements are subjective… they mean different things to different people. Numbers are objective… they mean the same thing to everyone!
When your friend says “oh yeah, John/Jane Doe is super attractive, you should go out with him/her”… that is a subjective measurement… so you ask, “Send me a photo.” Why? Because the subjective term “super attractive” has vastly different meanings for different people!
5
Metrics for Software
When asked to measure something, always try to determine an objective measurement. If that is not possible, try to get as close as you can!
6
A Good Manager Measures
[Figure: measurement applies to the process (yielding process metrics and project metrics) and to the product (yielding product metrics).]
What do we use as a basis? Size? Function?
7
We need a basis to say 20 defects per X lines of code
We need a basis to say 20 defects per X lines of code. Why is this important?
A. Because lines of code equals cost
B. We want our metrics to be valid across projects of many sizes
C. Because you just caused me to die in Halo 3… stop asking these questions!
D. Because this helps us understand how big our program is
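For instance, here is a minimal sketch, using made-up defect and size figures, of why a size basis such as KLOC lets us compare projects of very different scales:

```python
# Hypothetical defect and size figures for two projects of very different scale.
projects = {
    "small_tool":   {"defects": 20,  "loc": 8_000},
    "large_system": {"defects": 200, "loc": 400_000},
}

for name, data in projects.items():
    kloc = data["loc"] / 1000                   # size basis: thousands of lines of code
    density = data["defects"] / kloc            # normalized defect density
    print(f"{name}: {data['defects']} defects over {kloc:.0f} KLOC "
          f"-> {density:.2f} defects/KLOC")

# Raw counts make the large system look worse (200 vs. 20 defects),
# but per KLOC the small tool is actually denser (2.50 vs. 0.50).
```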
8
Why Do We Measure?
To assess the status of an ongoing project, track potential risks, uncover problem areas before they go “critical,” adjust work flow or tasks, and evaluate the project team’s ability to control the quality of software work products.
9
Process versus Project Metrics
Process metrics measure the process itself, to help update and change the process as needed across many projects.
Project metrics measure specific aspects of a single project, to improve the decisions made on that project.
For example, a process metric might aggregate the % delay of deliverables (schedule conformance) across many projects to determine how good our scheduling/planning process is; a project would use the same measurement to make project-level decisions.
Frequently the same measurements can be used for both purposes.
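As an illustration, here is a small sketch, with hypothetical milestone data, of the same measurement serving both purposes: per-project schedule conformance for project decisions, and an aggregate across projects for process decisions:

```python
# Hypothetical milestone records: (planned_day, actual_day) per project.
milestones = {
    "project_a": [(10, 12), (20, 20), (30, 35)],
    "project_b": [(15, 15), (25, 24), (40, 41)],
}

def on_time_rate(records):
    """Fraction of milestones delivered on or before the planned date."""
    return sum(1 for planned, actual in records if actual <= planned) / len(records)

# Project metric: used to steer each individual project.
rates = {name: on_time_rate(recs) for name, recs in milestones.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.0%} of milestones on time")

# Process metric: the same measurement aggregated across many projects
# to judge how good the scheduling/planning process is overall.
process_conformance = sum(rates.values()) / len(rates)
print(f"organization-wide schedule conformance: {process_conformance:.0%}")
```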
10
Process Measurement
We measure the efficacy of a software process indirectly; that is, we derive a set of metrics based on the outcomes of the process. Outcomes include measures of:
errors uncovered before release of the software
defects delivered to and reported by end-users
work products delivered (productivity)
human effort expended
calendar time expended
schedule conformance
many others…
We also derive process metrics by measuring the characteristics of specific software engineering tasks.
Efficacy: does the process do what is intended?
Specific tasks: measure any of the above for a specific phase (communication, analysis, design, construction).
11
Process Metrics Guidelines
Use common sense and organizational sensitivity when interpreting metrics data.
Provide regular feedback to the individuals and teams who collect measures and metrics.
Don’t use metrics to appraise individuals.
Work with practitioners and teams to set clear goals and metrics that will be used to achieve them.
Never use metrics to threaten individuals or teams.
Metrics data that indicate a problem area should not be considered “negative.” These data are merely an indicator for process improvement.
Don’t obsess on a single metric to the exclusion of other important metrics.
12
Suppose I calculate the number of defects per developer, rank the developers, and then assign salary raises based on that rank.
A. This is good
B. This is bad
C. Helloo…. Halo 3? Stop it.
13
Software Process Improvement
[Figure: the software process improvement (SPI) loop: the process model yields process metrics, which are assessed against improvement goals; SPI then produces process improvement recommendations that feed back into the process model.]
Make your metrics actionable!
14
Typical Process Metrics
Quality-related: focus on the quality of work products and deliverables (correctness, maintainability, integrity, usability).
Productivity-related: production of work products relative to the effort expended (e.g., earned value analysis).
Statistical SQA data: error categorization & analysis (e.g., types of errors, severity of errors on a 1–5 scale, MTTF = mean time to failure, MTTR = mean time to repair).
Defect removal efficiency: propagation of errors from process activity to activity (defects found in this stage / (this stage + next stage)).
Reuse data: the number of components produced and their degree of reusability.
Within a single project these can also be “project metrics”; across projects they are “process metrics.”
Quality attributes: correctness = adherence to requirements; maintainability = how easy is it to fix?; integrity = attack vulnerability; usability = training time, number of screens, etc.
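As a concrete illustration of two of the statistical SQA measures mentioned above, here is a minimal sketch, assuming an invented failure log, of computing MTTF and MTTR:

```python
# Hypothetical failure log: (hours_of_operation_before_failure, hours_to_repair).
failure_log = [
    (120.0, 2.5),
    (310.0, 4.0),
    (95.0,  1.5),
    (480.0, 6.0),
]

mttf = sum(up for up, _ in failure_log) / len(failure_log)          # mean time to failure
mttr = sum(repair for _, repair in failure_log) / len(failure_log)  # mean time to repair

print(f"MTTF: {mttf:.1f} hours of operation between failures (on average)")
print(f"MTTR: {mttr:.1f} hours to diagnose and repair a failure (on average)")
```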
15
Can you calculate a metric that records the number of times ‘e’ appears in a program? A. Yes B. No
Should you calculate the number of ‘e’s in a program? A. Yes B. No
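The answer to the first question is clearly yes; counting occurrences of a letter is trivially computable, as this toy sketch (with a stand-in string instead of a real program) shows:

```python
# Trivially computable, but (as a later slide argues) not actionable.
source_code = "def greet(name):\n    return 'hello, ' + name\n"  # stand-in for a real program
print("occurrences of 'e':", source_code.count("e"))
```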
16
Effective Metrics (ch 15)
Simple and computable
Empirically and intuitively persuasive
Consistent and objective
Consistent in use of units and dimensions
Programming language independent
Should be actionable
Persuasive: the metric behaves the way you would naturally expect a metric to behave.
17
Actionable Metrics
Actionable metrics (or information in general) are metrics that guide change or decisions about something.
Actionable: measure the amount of human effort versus use cases completed. Too high: more training, more design, etc… Very low: maybe we can shorten the schedule.
Not actionable: measure the number of times the letter “e” appears in code.
Think before you measure. Don’t waste people’s time!
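Here is a minimal sketch of the actionable version, assuming hypothetical effort and use-case figures and arbitrary example thresholds:

```python
# Hypothetical project-to-date figures.
person_hours_spent = 1400
use_cases_completed = 7

effort_per_use_case = person_hours_spent / use_cases_completed
print(f"effort per completed use case: {effort_per_use_case:.0f} person-hours")

# Arbitrary illustrative thresholds: whatever the value, the metric points to a decision.
if effort_per_use_case > 250:
    print("higher than expected -> consider more training, more design effort, ...")
elif effort_per_use_case < 100:
    print("lower than expected -> maybe the schedule can be shortened")
else:
    print("within the expected range -> no adjustment needed")
```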
18
Project Metrics
Used to minimize the development schedule by making the adjustments necessary to avoid delays and mitigate potential problems and risks.
Used to assess product quality on an ongoing basis and, when necessary, modify the technical approach to improve quality.
Every project should measure:
Inputs — measures of the resources (e.g., people, tools) required to do the work.
Outputs — measures of the deliverables or work products created during the software engineering process.
Results — measures that indicate the effectiveness of the deliverables.
19
Typical Project Metrics
Effort/time per software engineering task
Errors uncovered per review hour
Scheduled vs. actual milestone dates
Changes (number) and their characteristics
Distribution of effort on software engineering tasks
Distribution of effort on SWE tasks = effort spent in each phase of the lifecycle (then see where you have problems and correlate them).
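A small sketch, using invented review and milestone data, of computing two of these project metrics:

```python
# Hypothetical review and milestone data for one project.
errors_found_in_reviews = 46
review_hours = 23.0
milestones = [("design complete", 30, 34),   # (name, planned_day, actual_day)
              ("code complete",   60, 59)]

print(f"errors per review hour: {errors_found_in_reviews / review_hours:.1f}")

for name, planned, actual in milestones:
    slip = actual - planned
    status = "late" if slip > 0 else "on time / early"
    print(f"{name}: planned day {planned}, actual day {actual} ({status}, slip {slip:+d} days)")
```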
20
Metrics Guidelines (same as the process metrics guidelines)
Use common sense and organizational sensitivity when interpreting metrics data.
Provide regular feedback to the individuals and teams who have worked to collect measures and metrics.
Don’t use metrics to appraise individuals.
Work with practitioners and teams to set clear goals and metrics that will be used to achieve them.
Never use metrics to threaten individuals or teams.
Metrics data that indicate a problem area should not be considered “negative.” These data are merely an indicator for process improvement.
Don’t obsess on a single metric to the exclusion of other important metrics.
21
Typical Size-Oriented Metrics
errors per KLOC (thousand lines of code)
defects per KLOC
$ per LOC
pages of documentation per KLOC
errors per person-month
errors per review hour
LOC per person-month
$ per page of documentation
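A minimal sketch of these size-oriented metrics, assuming hypothetical totals for one completed project:

```python
# Hypothetical totals for one completed project.
loc = 12_100
errors = 134            # found before release
defects = 29            # reported after release
cost_dollars = 168_000
doc_pages = 365
person_months = 24

kloc = loc / 1000
print(f"errors per KLOC:  {errors / kloc:.1f}")
print(f"defects per KLOC: {defects / kloc:.1f}")
print(f"$ per LOC:        {cost_dollars / loc:.2f}")
print(f"pages of documentation per KLOC: {doc_pages / kloc:.1f}")
print(f"LOC per person-month: {loc / person_months:.0f}")
```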
22
Typical Function-Oriented Metrics
errors per Function Point (FP)
defects per FP
$ per FP
pages of documentation per FP
FP per person-month
23
But.. What is a Function Point?
Function points (FP) are a unit measure for software size developed at IBM by Allan Albrecht and published in 1979.
To determine your number of FPs, you classify a system’s features into five classes:
Transactions: External Inputs, External Outputs, External Inquiries
Data storage: Internal Logical Files and External Interface Files
Each class is then weighted by complexity as low/average/high, and the total is multiplied by a value adjustment factor (determined by asking questions about 14 system characteristics).
EI: information coming into the system.
EO: provides derived information out of the system (uses ILFs and EIFs to create information).
External Inquiry: provides non-derived information out of the system (echoes back EI).
ILF: think structures in RAM, or data files updated ONLY based on EI.
EIF: think a database or data file updated by anything.
24
But.. What is a Function Point?
For each class, multiply the count by its weight (low / average / high):
External Inputs: ×3 / ×4 / ×6
External Outputs: ×4 / ×5 / ×7
External Inquiries: ×3 / ×4 / ×6
Internal Logical Files: ×7 / ×10 / ×15
External Interface Files: ×5 / ×7 / ×10
Sum the weighted counts to get the Unadjusted Total, then apply the Value Adjustment Factor to get the Total Adjusted Value.
Be wary of statistics!
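A minimal sketch of the calculation, assuming hypothetical counts, complexity classifications, and ratings for the 14 general system characteristics (the value adjustment factor follows the usual VAF = 0.65 + 0.01 × Σ ratings form):

```python
# Weighting factors per class (low, average, high), as in the table above.
WEIGHTS = {
    "external_inputs":          (3, 4, 6),
    "external_outputs":         (4, 5, 7),
    "external_inquiries":       (3, 4, 6),
    "internal_logical_files":   (7, 10, 15),
    "external_interface_files": (5, 7, 10),
}
LEVEL = {"low": 0, "average": 1, "high": 2}

# Hypothetical counts for one system: {class: {complexity: count}}.
counts = {
    "external_inputs":          {"low": 6, "average": 2, "high": 0},
    "external_outputs":         {"low": 4, "average": 3, "high": 1},
    "external_inquiries":       {"low": 3, "average": 1, "high": 0},
    "internal_logical_files":   {"low": 1, "average": 2, "high": 0},
    "external_interface_files": {"low": 2, "average": 0, "high": 0},
}

# Unadjusted total: each count multiplied by its complexity weight, summed.
unadjusted_fp = sum(
    n * WEIGHTS[cls][LEVEL[complexity]]
    for cls, per_level in counts.items()
    for complexity, n in per_level.items()
)

# Hypothetical ratings (0-5) for the 14 general system characteristics.
gsc_ratings = [3, 2, 4, 3, 1, 0, 2, 3, 4, 2, 3, 1, 2, 3]
value_adjustment_factor = 0.65 + 0.01 * sum(gsc_ratings)

adjusted_fp = unadjusted_fp * value_adjustment_factor
print(f"unadjusted FP: {unadjusted_fp}")
print(f"value adjustment factor: {value_adjustment_factor:.2f}")
print(f"adjusted FP: {adjusted_fp:.1f}")
```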
25
Function Point Example
26
Comparing LOC and FP Representative values developed by QSM
27
At IBM in the 70s or 80s (I don’t remember) they paid people per line-of-code they wrote
What happened?
A. The best programmers got paid the most
B. The worst programmers got paid the most
C. The sneakiest programmers got paid the most
D. The lawyers got paid the most
28
Why Opt Against LOC?
Arguments for function points over LOC:
Programming language independent
Use readily countable characteristics that are determined early in the software process
Do not “penalize” inventive (short) implementations that use fewer LOC than other, clumsier versions
Make it easier to measure the impact of reusable components
Other options: COCOMO, Planning Poker, SLIM, Story Points (remember Scrum?), many others…
29
Object-Oriented Metrics
Number of scenario scripts (use-cases)
Number of support classes (required to implement the system but not immediately related to the problem domain)
Average number of support classes per key class (analysis class)
Number of subsystems (an aggregation of classes that support a function that is visible to the end-user of a system)
30
WebEngineering Project Metrics
Number of static Web pages (the end-user has no control over the content displayed on the page)
Number of dynamic Web pages (end-user actions result in customized content displayed on the page)
Number of internal page links (pointers that provide a hyperlink to some other Web page within the WebApp)
Number of persistent data objects
Number of external systems interfaced
Number of static content objects
Number of dynamic content objects
Number of executable functions
31
Measuring Quality
Correctness — the degree to which a program operates according to specification. Measured, for example, as verified non-conformances with requirements per KLOC.
Maintainability — the degree to which a program is amenable to change. Measured, for example, as MTTC (mean time to change): the time to analyze, design, implement, and deploy a change.
Integrity — the degree to which a program is impervious to outside attack. With threat = probability of an attack occurring and security = likelihood of repelling the attack: integrity = 1 − (threat × (1 − security)). E.g., threat = 0.25, security = 0.96 → integrity = 0.99.
Usability — the degree to which a program is easy to use. Measured, for example, by training time, number of screens, etc.
Many options. See ch 12.
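For example, a minimal sketch of the integrity calculation above, assuming hypothetical threat and security estimates per attack type:

```python
# Hypothetical estimates per attack type:
#   threat   = probability that an attack of this type occurs in a given period
#   security = probability that an attack of this type is repelled
attack_types = {
    "sql_injection":     {"threat": 0.25, "security": 0.96},
    "denial_of_service": {"threat": 0.10, "security": 0.90},
}

for name, est in attack_types.items():
    integrity = 1 - (est["threat"] * (1 - est["security"]))
    print(f"{name}: integrity = {integrity:.2f}")
```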
32
Defect Removal Efficiency
DRE = E / (E + D)
E is the number of errors found before delivery of the software to the end-user.
D is the number of defects found after delivery.
33
Defect Removal Efficiency
DRE = E / (E + D)
Defects found during each phase:
Requirements (10)
Design (20)
Construction: Implementation (5), Unit Testing (50)
Testing: Integration Testing (100), System Testing (250), Acceptance Testing (5)
By Customer (10)
Requirements: 10 / (10 + 20) = 33%. What are the rest?
Design: 20 / (20 + 50) = 28%
Implementation: 5 / (5 + 50) = 9%
Unit Testing: 50 / (50 + 100) = 33%
Integration Testing: 100 / (100 + 250) = 28%
System Testing: 250 / (250 + 5) = 98%
Acceptance Testing: 5 / (5 + 10) = 33%
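Using the phase counts above, here is a minimal sketch of the overall DRE for the whole project, where E is everything found before delivery and D is what the customer found:

```python
# Defect counts per phase, taken from the example above.
found_before_delivery = {
    "requirements": 10, "design": 20, "implementation": 5, "unit_testing": 50,
    "integration_testing": 100, "system_testing": 250, "acceptance_testing": 5,
}
found_by_customer = 10

E = sum(found_before_delivery.values())   # errors found before delivery (440)
D = found_by_customer                     # defects found after delivery
dre = E / (E + D)
print(f"overall DRE = {E} / ({E} + {D}) = {dre:.1%}")   # ~97.8%
```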
34
Metrics for Small Organizations
time (hours or days) elapsed from the time a request is made until evaluation is complete, tqueue
effort (person-hours) to perform the evaluation, Weval
time (hours or days) elapsed from completion of evaluation to assignment of change order to personnel, teval
effort (person-hours) required to make the change, Wchange
time required (hours or days) to make the change, tchange
errors uncovered during work to make the change, Echange
defects uncovered after the change is released to the customer base, Dchange
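A minimal sketch, assuming a hypothetical log of change requests, of turning those raw measures into a few simple indicators (average queue time, average change effort, and a DRE for changes):

```python
# Hypothetical per-change-request measurements (times in days, effort in person-hours).
change_requests = [
    {"t_queue": 3, "w_eval": 4, "t_eval": 1, "w_change": 12, "t_change": 4, "e_change": 2, "d_change": 0},
    {"t_queue": 6, "w_eval": 6, "t_eval": 2, "w_change": 30, "t_change": 9, "e_change": 5, "d_change": 1},
    {"t_queue": 1, "w_eval": 2, "t_eval": 1, "w_change": 8,  "t_change": 2, "e_change": 1, "d_change": 0},
]

n = len(change_requests)
avg_queue_days = sum(cr["t_queue"] for cr in change_requests) / n
avg_change_effort = sum(cr["w_change"] for cr in change_requests) / n

errors_before_release = sum(cr["e_change"] for cr in change_requests)
defects_after_release = sum(cr["d_change"] for cr in change_requests)
dre_changes = errors_before_release / (errors_before_release + defects_after_release)

print(f"average queue time: {avg_queue_days:.1f} days")
print(f"average change effort: {avg_change_effort:.1f} person-hours")
print(f"DRE for changes: {dre_changes:.0%}")
```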
35
Metrics give you information!
Metrics about your process help you determine whether you need to make changes or whether your process is working.
Metrics about your project do the same thing.
Metrics about your software can help you understand it better and see where possible problems may lurk.
Let’s see the complexity measurement (after a few questions…)
36
Questions
What are some reasons NOT to use lines of code to measure size?
What do you expect the DRE rate will be for the implementation (or construction) phase of the software lifecycle? What about for testing?
Give an example of a usability metric.
According to the chart, Smalltalk is much more efficient than Java and C++. Why don’t we use it for everything?