“Every ‘best in class’ company measures software quality. There are no exceptions. If your company does not do this, it is not an industry leader, and there is a good chance that your software quality levels are marginal at best.”
Capers Jones, Applied Software Measurement
Steven Borg, VSTS Practice Lead, Northwest Cadence
DPR304
Your job!
What is Quality?
You get what you measure.
If you don’t measure it, you can’t manage it.
You cannot improve what you can’t measure.
Garbage in, garbage out.
If you don’t measure it, it’s just a hobby.
You can’t manage what you can’t control, and you can’t control what you don’t measure.
Metrics Matter
Without metrics you can’t predict.
Without metrics you can’t judge quality.
Without metrics you can’t accurately estimate.
Without metrics you can’t measure impacts.
Without metrics you can’t improve consistently.
Questions for you
How many of you have implemented a process improvement initiative?
What was it? Move to agile? CMMI? Six Sigma?
What metrics did you use to measure its success?
Questions for you
Is your development team more effective than they were 4 years ago?
What code files have the most bugs?
How effective was your last process improvement initiative?
How often do your teams check in code?
What percentage of bugs get removed prior to shipping?
What is your velocity? Throughput?
DEFINING METRICS THAT MATTER What makes a metric effective?
Characteristics of Effective Metrics
Comparable – a metric should be easily compared with the same metric across geography, time, and technology.
Actionable – a metric must provide information that can be used to improve some aspect of the process.
Simple – a metric should be easily understandable, and how it was generated should be easily understood.
Honest – a metric should clearly align with organizational goals, and shouldn’t appear good when actual business value is low or negative. Honest metrics are difficult to ‘game’.
Characteristics of Effective Metrics Comparable Simple Actionable Honest
Lines of Code. Honest?

Before:
For i = 1 To 10
    Print i.ToString()
Next i

After:
Print 1.ToString()
Print 2.ToString()
Print 3.ToString()
Print 4.ToString()
Print 5.ToString()
Print 6.ToString()
Print 7.ToString()
Print 8.ToString()
Print 9.ToString()
Print 10.ToString()
Determining the right metrics
Collecting metrics has a cost.
At a minimum, that cost includes time and effort.
Tools such as Team Foundation Server can dramatically reduce the overall cost of metric collection and simplify gathering metrics.
Collect only those metrics that you can directly leverage to improve your process, or to validate an improvement.
With powerful tools, there is a tendency to want to collect “everything” – this is a mistake.
True Metrics vs Proxy Metrics True Metric What you SHOULD measure Proxy Metric What you CAN measure
True Metrics vs Proxy Metrics It’s a snake! It’s a hill! It’s a spear! It’s a bag filled with goo!
Determining the right metrics
Collecting metrics has a cost.
Collect only those metrics that you can directly leverage to:
improve your process
validate an improvement
verify a milestone
Corollary to Tip #2
Don’t waste upfront time:
thinking about which other metrics you’ll want in the future
holding meetings about how to gather metrics
Start thinking about the next metrics when you’re getting value from your current metrics (or are not getting the expected value from the metrics you have).
ITIL Seven Step Improvement Process
1. Define what you should measure.
2. Define what you can measure.
3. Gather the data.
4. Process the data.
5. Analyze the data.
6. Present and use the information.
7. Implement the corrective action.
“Only companies that measure software productivity and quality can find their way through the morass of conflicting assertions and pick a truly effective path. As it turns out, heavy investment in tools prior to resolving organizational and methodological issues is normally counterproductive and will improve neither quality nor productivity. Only accurate measurements can navigate a path that is both cost-effective and pragmatic.” Capers Jones
A FEW EFFECTIVE METRICS Some widely used metrics with proven value
Quality Metrics
Defect Removal Efficiency (Warning: not always “simple”)
Code Coverage (Warning: not always “honest”)
Performance Profiles
Test Cases per Requirement
Bugs per Requirement (bug density metrics)
Passing Tests per Requirement
Cyclomatic Complexity
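The first metric above, Defect Removal Efficiency, has a standard definition popularized by Capers Jones: the fraction of all known defects that were removed before release. A minimal sketch (the bug counts are hypothetical illustration values):

```python
# Sketch: Defect Removal Efficiency (DRE).
# DRE = defects removed before release / (pre-release + post-release defects).

def defect_removal_efficiency(pre_release_defects, post_release_defects):
    """Fraction of all known defects that were removed before shipping."""
    total = pre_release_defects + post_release_defects
    if total == 0:
        return 1.0  # no defects found anywhere: treat as perfect removal
    return pre_release_defects / total

# Example: 95 bugs fixed before release, 5 reported by users afterward.
dre = defect_removal_efficiency(95, 5)
print(f"DRE: {dre:.0%}")  # → DRE: 95%
```

The warning on the slide applies: the denominator keeps growing as users report bugs, so DRE for a release is only meaningful after a fixed post-release window (Jones typically uses 90 days).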
Sizing Metrics
Lines of Code (Warning: not always “honest”)
Function Points (Warning: not “simple”)
Effort, in hours, days, weeks, etc. (Warning: not always “honest” or “comparable”)
Story Points (Warning: rarely “comparable”)
Productivity Metrics
Velocity (Warning: not always “simple”)
Throughput
Quality*
*Note: Defect Removal Efficiency is so highly correlated with productivity that it can often act as a proxy metric for productivity.
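Velocity and throughput, as used above, reduce to simple averages over sprint history. A minimal sketch, with hypothetical sprint data:

```python
# Sketch: velocity as mean story points completed per sprint,
# and throughput as mean work items completed per sprint.
# The sprint histories below are hypothetical.

completed_points_per_sprint = [18, 22, 20, 25]  # story points finished each sprint
completed_items_per_sprint = [7, 9, 8, 10]      # work items finished each sprint

velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)
throughput = sum(completed_items_per_sprint) / len(completed_items_per_sprint)

print(f"Velocity: {velocity} points/sprint")     # → Velocity: 21.25 points/sprint
print(f"Throughput: {throughput} items/sprint")  # → Throughput: 8.5 items/sprint
```

Note the “not always simple” warning: only count items that meet the team’s definition of done, or velocity becomes easy to game.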
User Metrics
User satisfaction (Warning: not always “comparable”)
Post-release bug count
Number of Help Desk calls (Warning: not always “honest”)
Business Metrics
Schedule Estimation Accuracy (Warning: not always “honest”)
Cost Estimation Accuracy (Warning: not always “honest”)
Scope Estimation Accuracy (Warning: rarely “honest”)
ROI Estimation Accuracy (Warning: rarely “simple”, not always “honest”)
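Each of the estimation-accuracy metrics above can be expressed as the relative error between estimate and actual. A minimal sketch (the schedule numbers are hypothetical):

```python
# Sketch: estimation accuracy as signed relative error.
# Applies equally to schedule, cost, or scope estimates.

def estimation_error(estimate, actual):
    """Signed relative error: positive means the actual exceeded the estimate."""
    return (actual - estimate) / estimate

# Example: an iteration estimated at 30 days that actually took 39.
err = estimation_error(30, 39)
print(f"Schedule overrun: {err:+.0%}")  # → Schedule overrun: +30%
```

The “honesty” warnings apply here too: estimates revised mid-project to match reality make this metric look perfect while telling you nothing.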
True Metrics vs Proxy Metrics
Tip XXX
View metrics as a group to help avoid metric “gaming”
Setting the Stage
Process was ad hoc.
Delivery dates were often missed.
Quality problems occurred frequently.
Deployment was generally problematic.
The decision was made to adopt Scrum.
Goals
Provide process around software development, through Scrum adoption.
No failures in releases.
Better quality.
Track process (visibility to management).
More predictability.
Metrics:
Estimates vs. actuals
% of iteration tasks completed vs. committed
Status measured by state transitions
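One of the adoption metrics named above, “% of iteration tasks completed vs. committed,” is a single ratio per sprint. A minimal sketch with hypothetical task counts:

```python
# Sketch: sprint commitment ratio - share of committed tasks actually delivered.
# Task counts are hypothetical.

def commitment_ratio(completed, committed):
    """Fraction of the sprint commitment that was delivered; 0 if nothing committed."""
    return completed / committed if committed else 0.0

# Example: 17 of 20 committed tasks finished in the sprint.
print(f"Completed vs. committed: {commitment_ratio(17, 20):.0%}")  # → 85%
```

Tracked across sprints, the trend matters more than any single value: a team that consistently hits 80-90% is predictable; one oscillating between 40% and 120% is not.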
Sprint 1
No data, because the team was in the middle of an upgrade.
There was a Sprint 0 to examine technologies for use in the project.
Requirements were poorly conceived.
A large part of Sprint 1 was fixing the structure and content of the backlog.
Sprint 2 (1/7/2009 – 1/23/2009)
Sprint 3 (1/26/2009 – 2/13/2009)
Sprint 4 (2/18/2009 – 3/13/2009)
Sprint 5 (3/16/2009 – 4/10/2009)
Fortune Telling
I want to complete this project by 1 July.
I started it on 5 January 2009.
There are 68 requirements.
Will I finish…
Trending Development 1 July
Trending Development 1 July
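The “fortune telling” question above comes down to a naive linear projection: average the completion rate over the sprints so far and extrapolate to the 68-requirement total. A minimal sketch (the cumulative sprint totals are hypothetical; only the requirement count comes from the slides):

```python
# Sketch: forecasting completion by linear trend on cumulative requirements done.
# The per-sprint history below is hypothetical illustration data.

TOTAL_REQUIREMENTS = 68

completed_by_sprint = [6, 14, 23, 31]  # cumulative requirements done after sprints 1-4

# Average completion rate per sprint so far.
rate = completed_by_sprint[-1] / len(completed_by_sprint)

remaining = TOTAL_REQUIREMENTS - completed_by_sprint[-1]
sprints_left = remaining / rate

print(f"Rate: {rate} reqs/sprint, about {sprints_left:.1f} sprints remaining")
```

Multiply the projected sprints by the sprint length and compare against 1 July; the straight-line assumption is exactly what the trending charts on the previous slides visualize.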
ESTIMATION AND PLANNING A special case for metrics
“There is a perfect correlation between measurement accuracy and estimation accuracy: Companies that measure well can estimate well; companies that do not measure accurately cannot estimate accurately either.”
Capers Jones

“Optimism is an occupational hazard of programming: feedback is the treatment.”
Kent Beck

“Planning and estimation are the mirror images of measurement.”
Capers Jones
Demo Using Team System to estimate effort and timelines.
TEAM FOUNDATION SERVER Understanding the power of tooling
Why Tool?
"Tools are a very important element of defining a path of least resistance. If I can set up a tool so that it’s easier for a developer to do something the way that I want the developer to do it, and harder for the developer to do it some other way, then I think it’s very likely the developer is going to do it the way I want them to, because it’s easier. It’s the path of least resistance."
Steve McConnell
Demo Accessing the TFS Cube
Demo Changing the Process Template to gather metrics. Correctly.
REPORTING Transparency and Visibility
How Far Can We Get in the Available Time?
Work planned vs. work completed
What Requirements Haven’t Been Tested? Tracks progression of requirements’ states from Untested to Passed by successive Build
How Effectively Is Our (Outsourced) Team Delivering? Test rates (pass, inconclusive, fail) shown in bars Against code coverage, … code churn, … and active bugs
Solution Stuck in Testing Bulge in resolved → Insufficient resources or inadequate quality from dev
Development Practices Too Loose Growing “Fault Feedback Ratio” – Bugs requiring multiple handling
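The Fault Feedback Ratio named above is the share of bug resolutions that bounce back (get reactivated by test) rather than staying closed. A minimal sketch with hypothetical counts:

```python
# Sketch: Fault Feedback Ratio (FFR) - reactivated bugs as a share of resolutions.
# A growing FFR suggests fixes are reaching test before they are ready.
# The bug counts below are hypothetical.

def fault_feedback_ratio(reactivations, resolutions):
    """Share of resolved bugs that testers reopened; 0 if nothing was resolved."""
    return reactivations / resolutions if resolutions else 0.0

# Example: 12 of 60 resolved bugs were reopened by testers.
print(f"FFR: {fault_feedback_ratio(12, 60):.0%}")  # → FFR: 20%
```

In TFS, reactivations are visible as Resolved → Active state transitions on bug work items, which is why this metric falls out of the warehouse with no extra data collection.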
Scope Creep “Dark matter” emerging during iteration Planned work is squeezed out
CULTURAL ISSUES WITH METRICS The social aspects of collecting metrics
The problem today is not a deficiency in software measurement technology itself; rather, it is cultural resistance on the part of software management and staff. The resistance is due to the natural human belief that measures might be used against them. This feeling is the greatest barrier to applied software measurement. Capers Jones, Applied Software Measurement
"The problem of software process change are often complicated by the fact that no one is responsible to make it happen. If software process improvement isn't anybody's job, it is not surprising that is doesn't get done! If it is important enough to do, however, someone must be assigned the responsibility and given the necessary resources. Until this is done, software process development will remain a nice thing to do someday, but never today." Watts Humpthey
""If it ain't broke, don't fix it," the saying goes. Common software development practices are seriously broken, and the cost of not fixing them has become extreme. Traditional thinking would have it that the change represents the greatest risk. In software's case, the greatest risk lies with not changing - staying mired in unhealthy, profligate development practices instead of switching to practices that were proven more effective many years ago. " Steve McConnell
WRAP UP Final thoughts…
Enable Communication
Related Content
Metrics that Matter: Interactive Lab – Today, immediately following this session
Overcoming Project Estimation Madness – DPR205 – 9:00 AM Friday – Room 502
Brian the Build Bunny: Extending Team Build – DTL307 – 9:00 AM Friday – Room 404
Better Change and Config Mgmt using TFS – DTL305 – 10:45 AM Friday – Room 515B
Ask Doug and Steve: A VSTS Q&A Challenge – DPR01-INT – 10:45 AM Friday – Blue Theatre 2
Branching & Merging for Parallel Development – DTL302 – 1:00 PM Friday – Room 411
Sessions On-Demand & Community
Resources for IT Professionals
Resources for Developers
Microsoft Certification & Training Resources
TechEd 2009 is not producing a DVD; attendees can access session recordings at TechEd Online.
Complete an evaluation on CommNet and enter to win!
© 2009 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries. The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.