Metrics That Matter: Real Measures to Improve Software Development
Presentation transcript:

Every ‘best in class’ company measures software quality. There are no exceptions. If your company does not do this, it is not an industry leader, and there is a good chance that your software quality levels are marginal at best. Capers Jones, Applied Software Measurement

Steven Borg VSTS Practice Lead Northwest Cadence DPR304

Your job!

What is Quality?

You get what you measure.
If you don’t measure it, you can’t manage it.
You cannot improve what you can’t measure.
Garbage in, garbage out.
If you don’t measure it, it’s just a hobby.
You can't manage what you can't control, and you can't control what you don't measure.

Metrics Matter
Without metrics you can’t predict.
Without metrics you can’t judge quality.
Without metrics you can’t accurately estimate.
Without metrics you can’t measure impacts.
Without metrics you can’t improve consistently.

Questions for you: How many of you have implemented a process improvement initiative? What was it? A move to agile? CMMI? Six Sigma? What metrics did you use to measure its success?

Questions for you:
Is your development team more effective than it was four years ago?
What code files have the most bugs?
How effective was your last process improvement initiative?
How often do your teams check in code?
What percentage of bugs get removed prior to shipping?
What is your velocity? Your throughput?

DEFINING METRICS THAT MATTER What makes a metric effective?

Characteristics of Effective Metrics
Comparable – a metric should be easily compared with the same metric across geography, time, and technology.
Actionable – a metric must provide information that can be used to improve some aspect of the process.
Simple – a metric should be easily understandable, and how it was generated should be easily understood.
Honest – a metric should clearly align with organizational goals, and shouldn’t appear good when actual business value is low or negative. Honest metrics are difficult to ‘game’.

Characteristics of Effective Metrics Comparable Simple Actionable Honest

Lines of Code. Honest?
Before:
For i = 1 To 10
    Print i.ToString()
Next i
After:
Print 1.ToString()
Print 2.ToString()
Print 3.ToString()
Print 4.ToString()
Print 5.ToString()
Print 6.ToString()
Print 7.ToString()
Print 8.ToString()
Print 9.ToString()
Print 10.ToString()

Determining the right metrics
Collecting metrics has a cost; at a minimum that cost includes time and effort.
Tools such as Team Foundation Server can dramatically reduce the overall cost of metric collection and simplify gathering metrics.
Collect only those metrics that you can directly leverage to improve your process, or to validate an improvement.
With powerful tools, there is a tendency to want to collect “everything” – this is a mistake.

True Metrics vs Proxy Metrics
True Metric: what you SHOULD measure
Proxy Metric: what you CAN measure

True Metrics vs Proxy Metrics It’s a snake! It’s a hill! It’s a spear! It’s a bag filled with goo!

Determining the right metrics
Collecting metrics has a cost. Collect only those metrics that you can directly leverage to:
improve your process
validate an improvement
verify a milestone

Corollary to Tip #2
Don’t waste upfront time thinking about which other metrics you’ll want in the future, or holding meetings about how to gather metrics. Start thinking about the next metrics when you’re getting value from your current metrics (or are not getting the expected value from the metrics you have).

ITIL Seven-Step Improvement Process
1. Define what you should measure.
2. Define what you can measure.
3. Gather the data.
4. Process the data.
5. Analyze the data.
6. Present and use the information.
7. Implement the corrective action.

“Only companies that measure software productivity and quality can find their way through the morass of conflicting assertions and pick a truly effective path. As it turns out, heavy investment in tools prior to resolving organizational and methodological issues is normally counterproductive and will improve neither quality nor productivity. Only accurate measurements can navigate a path that is both cost-effective and pragmatic.” Capers Jones

A FEW EFFECTIVE METRICS Some widely used metrics with proven value

Quality Metrics
Defect Removal Efficiency (warning: not always “simple”)
Code Coverage (warning: not always “honest”)
Performance Profiles
Test Cases per Requirement
Bugs per Requirement (bug density metrics)
Passing Tests per Requirement
Cyclomatic Complexity
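The first of these quality metrics can be sketched directly. Below is a minimal Python sketch of Defect Removal Efficiency using its standard definition (defects removed before release divided by total defects found); the defect counts are hypothetical:

```python
def defect_removal_efficiency(found_before_release, found_after_release):
    """DRE = defects removed before release / total defects found.

    A DRE of 0.95 means 95% of all known defects were caught
    before the software shipped.
    """
    total = found_before_release + found_after_release
    if total == 0:
        return 1.0  # nothing found anywhere; treat as perfect removal
    return found_before_release / total

# Hypothetical release: 190 defects caught in-house, 10 reported by users
print(defect_removal_efficiency(190, 10))  # 0.95
```

Note that DRE is only as honest as its inputs: post-release defects keep trickling in, so the metric is usually computed at a fixed interval (e.g. 90 days) after release.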

Sizing Metrics
Lines of Code (warning: not always “honest”)
Function Points (warning: not “simple”)
Effort in hours, days, weeks, etc. (warning: not always “honest” or “comparable”)
Story Points (warning: rarely “comparable”)

Productivity Metrics
Velocity (warning: not always “simple”)
Throughput
Quality*
*Note: Defect Removal Efficiency is so highly correlated with productivity that it can often act as a proxy metric for it.
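Velocity and throughput fall out of the same sprint history: velocity averages completed story points per sprint, while throughput counts completed items regardless of size. A minimal Python sketch with hypothetical sprint data:

```python
from statistics import mean

# Hypothetical sprint history: story-point estimates of the
# work items completed in each sprint.
sprints = [
    [3, 5, 8, 5],   # sprint 1: 4 items, 21 points
    [5, 5, 8],      # sprint 2: 3 items, 18 points
    [3, 8, 8, 5],   # sprint 3: 4 items, 24 points
]

velocity = mean(sum(s) for s in sprints)     # story points per sprint
throughput = mean(len(s) for s in sprints)   # work items per sprint

print(velocity)     # 21
print(throughput)
```

The two can diverge: a sprint full of small items has high throughput but low velocity, which is one reason to view metrics as a group rather than in isolation.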

User Metrics
User satisfaction (warning: not always “comparable”)
Post-release bug count
Number of Help Desk calls (warning: not always “honest”)

Business Metrics
Schedule Estimation Accuracy (warning: not always “honest”)
Cost Estimation Accuracy (warning: not always “honest”)
Scope Estimation Accuracy (warning: rarely “honest”)
ROI Estimation Accuracy (warning: rarely “simple”, not always “honest”)
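Estimation-accuracy metrics like these are commonly operationalized as relative error. A minimal Python sketch using the widely used mean magnitude of relative error (MMRE) formulation; the project history is hypothetical:

```python
def relative_error(estimate, actual):
    """Magnitude of relative error: how far off the estimate was,
    as a fraction of the actual outcome."""
    return abs(actual - estimate) / actual

# Hypothetical history of (estimated days, actual days) per project
history = [(30, 45), (20, 22), (60, 55)]

# Mean magnitude of relative error across past projects
mmre = sum(relative_error(e, a) for e, a in history) / len(history)
print(round(mmre, 3))
```

Because the error is unsigned, MMRE doesn't reveal whether a team systematically over- or under-estimates; tracking the signed error alongside it answers that.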

True Metrics vs Proxy Metrics

Tip XXX
View metrics as a group to help avoid metric “gaming”.

Setting the Stage
The process was ad hoc.
Delivery dates were often missed.
Quality problems occurred frequently.
Deployment was generally problematic.
The decision was made to adopt Scrum.

Goals
Provide process around software development, through Scrum adoption
No failures in releases
Better quality
Track process (visibility to management)
More predictability
Metrics:
Estimates vs. Actuals
% of iteration tasks completed vs. committed
Status measured by state transitions

Sprint 1
No data, because the team was in the middle of an upgrade.
There was a Sprint 0 to examine technologies for use in the project.
Requirements were poorly conceived; a large part of Sprint 1 was fixing the structure and content of the backlog.

Sprint 2 (1/7/2009 – 1/23/2009)

Sprint 3 (1/26/2009 – 2/13/2009)

Sprint 4 (2/18/2009 – 3/13/2009)

Sprint 5 (3/16/2009 – 4/10/2009)

Fortune Telling
I want to complete this project by 1 July, and I started it on 5 January 2009. There are 68 requirements. Will I finish…
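The fortune-telling question above can be answered with simple linear trending: observe the completion rate so far and extrapolate to the remaining work. A minimal Python sketch using the dates and requirement count from the slide; the progress figures (22 requirements done by the end of Sprint 5) are hypothetical:

```python
from datetime import date, timedelta

start = date(2009, 1, 5)
deadline = date(2009, 7, 1)
total_requirements = 68

# Hypothetical progress observed over the first five sprints
completed_so_far = 22
as_of = date(2009, 4, 10)  # end of Sprint 5

days_elapsed = (as_of - start).days            # 95 days
rate = completed_so_far / days_elapsed         # requirements per day
remaining = total_requirements - completed_so_far

projected_finish = as_of + timedelta(days=round(remaining / rate))
print(projected_finish)                        # 2009-10-26
print(projected_finish <= deadline)            # False: we will not finish
```

A real burndown trend (as in the charts that follow) uses the same arithmetic, just fit over the per-sprint history rather than a single aggregate rate.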

Trending Development 1 July


ESTIMATION AND PLANNING A special case for metrics

“There is a perfect correlation between measurement accuracy and estimation accuracy: Companies that measure well can estimate well; companies that do not measure accurately cannot estimate accurately either.” Capers Jones "Optimism is an occupational hazard of programming: feedback is the treatment." Kent Beck “Planning and estimation are the mirror images of measurement. ” Capers Jones

Demo: Using Team System to estimate effort and timelines.

TEAM FOUNDATION SERVER Understanding the power of tooling

Why Tool? "Tools are a very important element of defining a path of least resistance. If I can set up a tool so that it’s easier for a developer to do something the way that I want the developer to do it, and harder for the developer to do it some other way, then I think it’s very likely the developer is going to do it the way I want them to, because it’s easier. It’s the path of least resistance." Steve McConnell

Demo: Accessing the TFS Cube

Demo: Changing the Process Template to gather metrics. Correctly.

REPORTING Transparency and Visibility

How Far Can We Get in The Available Time? Work planned Work completed

What Requirements Haven’t Been Tested? Tracks progression of requirements’ states from Untested to Passed by successive Build

How Effectively Is Our (Outsourced) Team Delivering? Test rates (pass, inconclusive, fail) shown in bars against code coverage, code churn, and active bugs

Solution Stuck in Testing Bulge in resolved → Insufficient resources or inadequate quality from dev

Development Practices Too Loose Growing “Fault Feedback Ratio” – Bugs requiring multiple handling
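The Fault Feedback Ratio mentioned above is, as commonly defined, the fraction of bug resolutions that bounce back and need multiple handling. A minimal Python sketch with hypothetical counts:

```python
def fault_feedback_ratio(reopened_events, resolved_events):
    """FFR: fraction of bug resolutions that get reopened.

    A rising FFR suggests fixes are being rejected in test --
    a sign that development practices are too loose.
    """
    if resolved_events == 0:
        return 0.0  # no resolutions yet, nothing to bounce back
    return reopened_events / resolved_events

# Hypothetical iteration: 80 bugs resolved, 12 of them reopened
print(fault_feedback_ratio(12, 80))  # 0.15
```

In TFS this can be derived from work-item state transitions (Resolved to Active counts as a reopen), which is one reason to get the process template's state model right before collecting the metric.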

Scope Creep “Dark matter” emerging during iteration Planned work is squeezed out

CULTURAL ISSUES WITH METRICS The social aspects of collecting metrics

The problem today is not a deficiency in software measurement technology itself; rather, it is cultural resistance on the part of software management and staff. The resistance is due to the natural human belief that measures might be used against them. This feeling is the greatest barrier to applied software measurement. Capers Jones, Applied Software Measurement

"The problems of software process change are often complicated by the fact that no one is responsible to make it happen. If software process improvement isn't anybody's job, it is not surprising that it doesn't get done! If it is important enough to do, however, someone must be assigned the responsibility and given the necessary resources. Until this is done, software process development will remain a nice thing to do someday, but never today." Watts Humphrey

"If it ain't broke, don't fix it," the saying goes. Common software development practices are seriously broken, and the cost of not fixing them has become extreme. Traditional thinking would have it that the change represents the greatest risk. In software's case, the greatest risk lies with not changing - staying mired in unhealthy, profligate development practices instead of switching to practices that were proven more effective many years ago." Steve McConnell

WRAP UP Final thoughts…

Enable Communication

Related Content
Metrics that Matter: Interactive Lab – Today, immediately following this session
Overcoming Project Estimation Madness – DPR205 – 9:00 AM Friday – Room 502
Brian the Build Bunny: Extending Team Build – DTL307 – 9:00 AM Friday – Room 404
Better Change and Config Mgmt using TFS – DTL305 – 10:45 AM Friday – Room 515B
Ask Doug and Steve: A VSTS Q&A Challenge – DPR01-INT – 10:45 AM Friday – Blue Theatre 2
Branching & Merging for Parallel Development – DTL302 – 1:00 PM Friday – Room 411

Sessions On-Demand & Community – Resources for IT Professionals – Resources for Developers – Microsoft Certification & Training Resources. Speakers: TechEd 2009 is not producing a DVD; attendees can access session recordings at TechEd Online.

Complete an evaluation on CommNet and enter to win!

© 2009 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries. The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.