1
Metrics That Matter Real Measures to Improve Software Development
Steven Borg, Principal ALM Consultant Northwest Cadence
2
Every ‘best in class’ company measures software quality
Every ‘best in class’ company measures software quality. There are no exceptions. If your company does not do this it is not an industry leader and there is a good chance that your software quality levels are marginal at best.
Capers Jones, Applied Software Measurement
3
You get what you measure. If you don’t measure it, you can’t manage it
You get what you measure.
If you don’t measure it, you can’t manage it.
You cannot improve what you can’t measure.
Garbage in, garbage out.
If you don’t measure it, it’s just a hobby.
"You can't manage what you can't control, and you can't control what you don't measure."
Tom DeMarco
4
Metrics Matter
Without metrics:
you can’t predict
you can’t judge quality
you can’t accurately estimate
you can’t measure impacts
you can’t improve consistently
5
Characteristics of Effective Metrics
Comparable, Honest, Simple, Actionable
6
True Metrics vs Proxy Metrics
True Metric: what you SHOULD measure
Proxy Metric: what you CAN measure
Tip #1 Focus on the true metric, not proxy metrics
7
“The companies that wish to improve but do not measure are at the mercy of fads and chance. Progress may not be impossible, but it is certainly unlikely.”
Capers Jones
Tip #2 Only introduce one or two new metrics at a time
8
Deming Cycle: Plan → Do → Check → Act
9
ITIL Seven Step Improvement Process
1. Define what you should measure.
2. Define what you can measure.
3. Gather the data. (Who? How? When? Integrity of the data.)
4. Process the data. (Frequency, format, system, accuracy.)
5. Analyze the data. (Relationships, trends, according to plan, targets met, corrective action.)
6. Present and use the information.
7. Implement the corrective action.
ITIL: Information Technology Infrastructure Library
10
Tip #3 Standardize by setting triggers to alert you to slips in prior metrics (a minimal trigger sketch follows after this list)
A few effective metrics, by category:
Quality Metrics
Sizing Metrics
Productivity Metrics
User Metrics
Business Metrics
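Tip #3 mentions triggers without showing one. Below is a minimal sketch of what such a trigger could look like, assuming a simple percentage-drop threshold; the function name, the threshold, and the sample numbers are illustrative, not from the presentation. In an integrated ALM tool this would typically be a built-in alert or report subscription rather than hand-written code.

```python
# Minimal sketch of a metric "trigger": alert when a previously adopted
# metric slips past a threshold. Names and thresholds are illustrative.

def check_trigger(name, current, baseline, max_drop_pct=10.0):
    """Return an alert string if `current` has slipped more than
    `max_drop_pct` percent below `baseline`, else None."""
    if baseline == 0:
        return None
    drop_pct = (baseline - current) / baseline * 100
    if drop_pct > max_drop_pct:
        return f"ALERT: {name} slipped {drop_pct:.1f}% (from {baseline} to {current})"
    return None

# Example: code coverage was 78% last sprint, 65% this sprint.
alert = check_trigger("code coverage", current=65.0, baseline=78.0)
if alert:
    print(alert)  # ALERT: code coverage slipped 16.7% (from 78.0 to 65.0)
```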
11
Quality Metrics
Defect Removal Efficiency (Warning: Not always "simple"; see the sketch below)
Code Coverage (Warning: Not always "honest")
Performance Profiling
Test Cases per feature
Bugs per feature (bug density metrics)
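The slide lists Defect Removal Efficiency but does not define it. The usual definition (popularized by Capers Jones) is defects removed before release divided by total defects, pre-release plus post-release, expressed as a percentage. A small sketch under that assumption; the sample numbers are invented.

```python
# Hedged sketch: Defect Removal Efficiency (DRE) as commonly defined --
# defects removed before release divided by total defects (pre-release
# plus those reported after release), as a percentage.

def defect_removal_efficiency(pre_release_defects, post_release_defects):
    total = pre_release_defects + post_release_defects
    if total == 0:
        return 100.0  # nothing found anywhere; treat as fully efficient
    return pre_release_defects / total * 100

# Example: 450 defects found in-house, 50 reported by users after release.
print(defect_removal_efficiency(450, 50))  # 90.0
```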
12
Sizing Metrics
Lines of Code (Warning: Not always "honest"; see the sketch below)
Function Points (Warning: Not "simple")
Effort in hours, days, weeks, etc. (Warning: Not always "honest" or "comparable")
Story Points (Warning: Rarely "comparable")
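To make the "not always honest" warning on Lines of Code concrete, here is a purely illustrative sketch: two functions do the same work, yet a naive physical-line count rewards the verbose one. The function names and the counting rule (physical lines of the definition) are my own choices, not the presentation's.

```python
# Illustrative only: why Lines of Code is "not always honest".
import inspect

def total_verbose(values):
    total = 0
    for v in values:
        total = total + v
    return total

def total_compact(values):
    return sum(values)

for fn in (total_verbose, total_compact):
    loc = len(inspect.getsource(fn).strip().splitlines())
    print(f"{fn.__name__}: {loc} physical lines")
# total_verbose: 5 physical lines
# total_compact: 2 physical lines
```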
13
Productivity Metrics
Velocity (Warning: Not always "simple"; velocity and throughput are sketched below)
Throughput
Lines of Code / Developer / Day (???)
Quality*
*Note: Defect Removal Efficiency is highly correlated with productivity, so highly correlated that it can often act as a proxy metric for productivity.
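Velocity and throughput are named on the slide but not defined. A common reading is velocity as story points completed per sprint and throughput as work items completed per sprint (or other time unit); the sketch below assumes those definitions and uses invented sample data.

```python
# Hedged sketch of two productivity measures: velocity (story points
# completed per sprint) and throughput (work items completed per sprint).

completed_per_sprint = [
    # (sprint, story_points_done, items_done)
    ("Sprint 1", 21, 9),
    ("Sprint 2", 34, 12),
    ("Sprint 3", 27, 11),
]

velocity = sum(p for _, p, _ in completed_per_sprint) / len(completed_per_sprint)
throughput = sum(i for _, _, i in completed_per_sprint) / len(completed_per_sprint)

print(f"Average velocity:   {velocity:.1f} points / sprint")   # 27.3
print(f"Average throughput: {throughput:.1f} items / sprint")  # 10.7
```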
14
User Metrics
User satisfaction (Warning: Not always "comparable"; see the sketch below)
Post-release bug count
Number of Help Desk calls (Warning: Not always "honest")
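A small, assumption-laden sketch of how the user metrics above might be summarized: a mean survey score for satisfaction, plus raw counts for post-release bugs and help desk calls. All names and numbers are invented for illustration.

```python
# Illustrative summary of the user metrics named on the slide.
survey_scores = [5, 4, 4, 3, 5, 2, 4]  # 1-5 satisfaction ratings
bugs_by_origin = {"pre-release": 180, "post-release": 23}
help_desk_calls_last_30_days = 57

satisfaction = sum(survey_scores) / len(survey_scores)
print(f"Mean satisfaction: {satisfaction:.2f} / 5")            # 3.86
print(f"Post-release bugs: {bugs_by_origin['post-release']}")  # 23
print(f"Help desk calls (30 days): {help_desk_calls_last_30_days}")
```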
15
Business Metrics
Schedule Estimation Accuracy (Warning: Not always "honest"; see the sketch below)
Cost Estimation Accuracy
Scope Estimation Accuracy (Warning: Rarely "honest")
ROI Estimation Accuracy (Warning: Rarely "simple", Not always "honest")
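The slide does not say how estimation accuracy should be computed. One common convention is the percentage error of the actual result against the estimate; the sketch below assumes that convention and applies it to schedule and cost, with invented figures.

```python
# Hedged sketch: estimation accuracy expressed as percentage error of
# actual versus estimated. The formula choice is an assumption.

def estimation_error_pct(estimated, actual):
    """Positive result = overran the estimate; negative = came in under."""
    return (actual - estimated) / estimated * 100

# Example: estimated 60 days of schedule, actually took 78.
print(f"Schedule error: {estimation_error_pct(60, 78):+.1f}%")        # +30.0%
# Example: estimated $100k of cost, actual was $92k.
print(f"Cost error: {estimation_error_pct(100_000, 92_000):+.1f}%")   # -8.0%
```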
16
Tip #4 First identify your problem, THEN identify the metric
Tip #5 Balance introduction of new metrics across different categories
The problem: "He broke the Build!!!"
17
Balance introduction of new metrics across different categories
Tip #5 Balance introduction of new metrics across different categories
The problem today is not a deficiency in software measurement technology itself; rather, it is cultural resistance on the part of software management and staff. The resistance is due to the natural human belief that measures might be used against them. This feeling is the greatest barrier to applied software measurement.
Capers Jones, Applied Software Measurement
18
Tip #6 NEVER use metrics to evaluate individuals
Tip #7 Watch your metrics closely; adjust when necessary
19
Estimation and Planning
“There is a perfect correlation between measurement accuracy and estimation accuracy: Companies that measure well can estimate well; companies that do not measure accurately cannot estimate accurately either.”
Capers Jones
"Optimism is an occupational hazard of programming: feedback is the treatment."
Kent Beck
“Planning and estimation are the mirror images of measurement.”
20
ALM (Application Lifecycle Management Tooling)
21
Why ALM (Application Lifecycle Management Tooling)?
Tip #8 Count first, Calculate next, use Judgment last
"Tools are a very important element of defining a path of least resistance. If I can set up a tool so that it’s easier for a developer to do something the way that I want the developer to do it, and harder for the developer to do it some other way, then I think it’s very likely the developer is going to do it the way I want them to, because it’s easier. It’s the path of least resistance."
Steve McConnell
22
Application Lifecycle Management Tooling
Tip #9 When possible, use an integrated ALM Tool to gather metrics
Tip #10 Start using the built-in reports right away!
23
Cultural issues with Metrics
The problem today is not a deficiency in software measurement technology itself; rather, it is cultural resistance on the part of software management and staff. The resistance is due to the natural human belief that measures might be used against them. This feeling is the greatest barrier to applied software measurement.
Capers Jones, Applied Software Measurement
24
"The problem of software process change are often complicated by the fact that no one is responsible to make it happen. If software process improvement isn't anybody's job, it is not surprising that is doesn't get done! If it is important enough to do, however, someone must be assigned the responsibility and given the necessary resources. Until this is done, software process development will remain a nice thing to do someday, but never today." Watts Humpthey
25
Tip #11 Put someone in charge of process improvement… use metrics to show change
"If it ain't broke, don't fix it," the saying goes. Common software development practices are seriously broken, and the cost of not fixing them has become extreme. Traditional thinking would have it that the change represents the greatest risk. In software's case, the greatest risk lies with not changing - staying mired in unhealthy, profligate development practices instead of switching to practices that were proven more effective many years ago.
Steve McConnell
26
Tip #12 Start collecting metrics now
Tip #12 Start collecting metrics now. Next year, you shouldn’t be saying “I wish I had some metrics.”
Steven Borg