Test Metrics: A Practical Approach to Tracking & Interpretation. Presented By: Shaun Bradshaw, Director of Quality Solutions. May 20, 2004.


Slide 1 Test Metrics: A Practical Approach to Tracking & Interpretation. Presented By: Shaun Bradshaw, Director of Quality Solutions. May 20, 2004

Slide 2 Objectives
- Why Measure?
- Definition
- Metrics Philosophy
- Types of Metrics
- Interpreting the Results
- Metrics Case Study
- Q & A

Slide 3 Why Measure?
“Software bugs cost the U.S. economy an estimated $59.5 billion per year. An estimated $22.2 billion could be eliminated by improved testing that enables earlier and more effective identification and removal of defects.” - US Department of Commerce (NIST)

Slide 4 Why Measure?
It is often said, “You cannot improve what you cannot measure.”

Slide 5 Definition
Test Metrics:
- Are a standard of measurement.
- Gauge the effectiveness and efficiency of several software development activities.
- Are gathered and interpreted throughout the test effort.
- Provide an objective measurement of the success of a software project.

Slide 6 Metrics Philosophy
When tracked and used properly, test metrics can aid in software development process improvement by providing pragmatic and objective evidence of process change initiatives. The philosophy rests on four principles: Keep It Simple, Make It Meaningful, Track It, Use It.

Slide 7 Metrics Philosophy: Keep It Simple
- Measure the basics first
- Clearly define each metric
- Get the most “bang for your buck”

Slide 8 Metrics Philosophy: Make It Meaningful
- Metrics are useless if they are meaningless (use the GQM model)
- Must be able to interpret the results
- Metrics interpretation should be objective
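The GQM (Goal-Question-Metric) model mentioned on this slide ties each tracked metric to a question it answers, and each question to a business goal. A minimal sketch in Python; the goal, questions, and metric names below are illustrative assumptions, not content from the presentation:

```python
# A hypothetical GQM model: goal -> questions -> metrics.
# All names here are invented for illustration.
gqm = {
    "goal": "Improve the quality of code delivered to the test team",
    "questions": {
        "How good is the code when testing begins?":
            ["1st Run Fail Rate"],
        "Is quality improving across projects?":
            ["1st Run Fail Rate trend", "Overall Fail Rate trend"],
    },
}

def metrics_for(gqm_model):
    """Collect every distinct metric the model says we need to track."""
    return sorted({metric
                   for metrics in gqm_model["questions"].values()
                   for metric in metrics})

print(metrics_for(gqm))
```

The point of the structure: if a metric cannot be traced back to a question and a goal, it is a candidate for dropping, which keeps the program simple and meaningful.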

Slide 9 Metrics Philosophy: Track It
- Incorporate metrics tracking into the Run Log or defect tracking system
- Automate the tracking process to remove time burdens
- Accumulate throughout the test effort and across multiple projects

Slide 10 Metrics Philosophy: Use It
- Interpret the results
- Provide feedback to the Project Team
- Implement changes based on objective data

Slide 11 Types of Metrics: Base Metrics
- Raw data gathered by Test Analysts
- Tracked throughout the test effort
- Used to provide project status and evaluations/feedback
Examples: # Test Cases, # Executed, # Passed, # Failed, # Under Investigation, # Blocked, # 1st Run Failures, # Re-Executed, Total Executions, Total Passes, Total Failures
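Base metrics of this kind can be tallied mechanically from a run log. A minimal sketch, assuming the log is a list of per-execution records; the field names (`case_id`, `run`, `status`) and the sample entries are invented for illustration, not taken from the presentation's run log:

```python
from collections import Counter

# Hypothetical run-log records: one entry per test-case execution.
run_log = [
    {"case_id": "TC-01", "run": 1, "status": "passed"},
    {"case_id": "TC-02", "run": 1, "status": "failed"},
    {"case_id": "TC-02", "run": 2, "status": "passed"},
    {"case_id": "TC-03", "run": 1, "status": "blocked"},
]

def base_metrics(log):
    """Tally the raw counts a Test Analyst would report."""
    statuses = Counter(entry["status"] for entry in log)
    return {
        "total_executions": len(log),
        "total_passes": statuses["passed"],
        "total_failures": statuses["failed"],
        "blocked": statuses["blocked"],
        "first_run_failures": sum(1 for e in log
                                  if e["run"] == 1 and e["status"] == "failed"),
        "re_executed": sum(1 for e in log if e["run"] > 1),
    }

print(base_metrics(run_log))
```

Automating this tally (per the "Track It" principle) removes the time burden of hand-counting and keeps the numbers consistent across projects.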

Slide 12 Types of Metrics: # Blocked (a base metric)
- The number of distinct test cases that cannot be executed during the test effort due to an application or environmental constraint.
- Defines the impact of known system defects on the ability to execute specific test cases.

Slide 13 Types of Metrics: Calculated Metrics
- Tracked by the Test Lead/Manager
- Convert base metrics into useful data
- Combinations of metrics can be used to evaluate process changes
Examples: % Complete, % Test Coverage, % Test Cases Passed, % Test Cases Blocked, 1st Run Fail Rate, Overall Fail Rate, % Defects Corrected, % Rework, % Test Effectiveness, Defect Discovery Rate

Slide 14 Types of Metrics: 1st Run Fail Rate (a calculated metric)
- The percentage of executed test cases that failed on their first execution.
- Used to determine the effectiveness of the analysis and development process.
- Comparing this metric across projects shows how process changes have impacted the quality of the product at the end of the development phase.
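Given the base counts from a run log, the two fail-rate metrics defined here reduce to simple ratios. A minimal sketch; the 109 first-run failures used in the example is an inference from the 30.7% rate and 355 scenarios quoted later for Project V, not a figure stated on the slides:

```python
def first_run_fail_rate(first_run_failures, test_cases_executed):
    """Percentage of executed test cases that failed on their first run."""
    return 100.0 * first_run_failures / test_cases_executed

def overall_fail_rate(total_failures, total_executions):
    """Percentage of all executions (including re-runs) that failed."""
    return 100.0 * total_failures / total_executions

# Roughly Project V's numbers: ~109 first-run failures of 355 scenarios.
print(round(first_run_fail_rate(109, 355), 1))  # 30.7
```

A gap between the two rates is itself informative: an Overall Fail Rate well above the 1st Run Fail Rate suggests fixes are failing on re-test.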

Slide 15 Sample Run Log

Slide 16 Sample Run Log

Slide 17 Metrics Program – No Analysis
Issue: The test team tracks and reports various test metrics, but there is no effort to analyze the data.
Result: Potential improvements are not implemented, leaving process gaps throughout the SDLC. This reduces the effectiveness of the project team and the quality of the applications.

Slide 18 Metrics Analysis & Interpretation
Solution:
- Closely examine all available data
- Use the objective information to determine the root cause
- Compare to other projects: Are the current metrics typical of software projects in your organization? What effect do changes have on the software development process?
Result: Future projects benefit from a more effective and efficient application development process.

Slide 19 Metrics Case Study
Volvo IT of North America had little or no testing involvement in its IT projects. The organization’s projects were primarily maintenance related and operated in a COBOL/CICS/mainframe environment. The organization wanted to migrate to newer technologies and felt that testing involvement would assure and enhance this technological shift. While establishing a test team, we also instituted a metrics program to track the benefits of having a QA group.

Slide 20 Metrics Case Study: Project V
- Introduced a test methodology and metrics program
- Project was 75% complete (development was nearly finished)
- Test team developed 355 test scenarios
- 1st Run Fail Rate: 30.7%
- Overall Fail Rate: 31.4%
- Defect Repair Costs: $519,000

Slide 21 Metrics Case Study: Project T
- Instituted requirements walkthroughs and design reviews with test team input
- Same resources comprised both project teams
- Test team developed 345 test scenarios
- 1st Run Fail Rate: 17.9%
- Overall Fail Rate: 18.0%
- Defect Repair Costs: $346,000

Slide 22 Metrics Case Study: Results
- A 33.3% reduction in the cost of defect repairs
- Every project moving forward that applies the same QA principles can achieve the same type of savings.
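The 33.3% figure follows directly from the defect-repair costs reported on the two project slides:

```python
# Defect-repair costs from the case study slides.
project_v_cost = 519_000  # Project V: no early QA involvement
project_t_cost = 346_000  # Project T: walkthroughs and design reviews added

# Relative reduction in repair cost after the process change.
reduction = (project_v_cost - project_t_cost) / project_v_cost
print(f"{reduction:.1%}")  # 33.3%
```

This is exactly the kind of pragmatic, objective evidence of a process-change initiative that the metrics philosophy on slide 6 calls for.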

Slide 23 Q & A