Test Metrics

Metrics Framework

"Lord Kelvin, a renowned British physicist, is reputed to have said: 'When you can measure what you are speaking about, and express it in numbers, you know something about it ... [otherwise] your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in thought advanced to the stage of science.'"
- quoted from Measuring Business Performance, A. Neely, The Economist Books, 1998

Mission: Establish a standard framework and implement appropriate tools to gather, track, and report standardized performance metrics and measurements, enabling an organization to manage test planning and test execution effectively, evaluate the quality of delivered solutions accurately, and assess the overall test effectiveness of applications and systems consistently.

Metrics are critical for gathering the information that leads to good decision-making. They should be limited to a "vital few".

Metrics - Definitions

Basic definitions:

- Metric: A quantitative performance measure of the degree to which a system, component, or process possesses a given attribute; a calculated or composite indicator based upon a measured value. Example: total test effort.
- Measure: To ascertain or appraise by comparing to a standard. A standard or unit of measurement; the extent, dimension, capacity, etc. of anything, especially as determined by a standard; a result of a measurement. Example: # of total actual hours billed to the project.
- Measurement: The act or process of recording or assessing a value of something, such as a figure, extent, or amount. Example: 368 actual hours billed to project A.
- Core metric: A required metric that is essential to support solution-delivery test management on systems development projects. Example: percentage of requirements met.
- Non-core metric: An optional metric that can help create a more balanced picture of the quality and effectiveness of test efforts. Example: total number of defects by test phase.
- Base measure: A direct performance measurement of a system, component, or process. Example: total number of open defects.
- Derived measure: A calculated performance measurement that uses base measures as input to obtain a more sophisticated view of a system, component, or process. Example: return on quality investment.
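The base-vs-derived distinction above can be illustrated with a minimal Python sketch; the function name and sample numbers are illustrative, not from the deck:

```python
# Base measures are counted directly (e.g. requirements met, requirements total);
# a derived measure is calculated from them.

def percentage_requirements_met(requirements_met: int, requirements_total: int) -> float:
    """Derived measure built from two base measures (simple counts)."""
    if requirements_total == 0:
        return 0.0
    return requirements_met / requirements_total * 100

print(percentage_requirements_met(45, 50))  # 90.0
```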

Sample Testing Metrics

Sample Metrics – Test Planning

Test Activity | Category | Primary Indicator | Metric | Calculation
Approach | Productivity | Test Plan | Total number of hours spent establishing test approach | Total hours dedicated to test planning, by stage
Approach | Productivity | Test Plan | Total number of signoffs on test approach |
Plan | Productivity | Conditions | Total number of hours spent planning the test (test conditions) | Total hours dedicated to test planning, by stage
Plan | Productivity | Conditions | Total number of test conditions documented and signed off | Total number of test conditions that have been signed off
Plan | Productivity | Conditions | Test conditions / function |
Plan | Productivity | Conditions | Test conditions / release or package |
Plan | Productivity | Conditions | Test planning rate | Total number of hours spent planning the test stage / total number of conditions documented and signed off for the stage
Prep | Productivity | Scripts | Total # of test scripts created |
Prep | Productivity | Scripts | Total % of test scripts created | (Total # of test scripts created / total # of planned scripts) * 100
Prep | Productivity | Scripts | Total number of hours spent preparing scripts | Total hours dedicated to test planning, by stage
Prep | Productivity | Scripts | Total number of hours spent preparing scripts by stage | Total hours dedicated to creating test scripts, by stage
Prep | Productivity | Scripts | Number of scripts entered into repository |
Prep | Productivity | Scripts | % of scripts entered into repository |
Prep | Productivity | Scripts | Test preparation rate | Total number of hours spent scripting the test stage / total number of cycles scripted and signed off in this test stage
Environment | Productivity | Environments | Test environments established on time |
Environment | Productivity | Environments | Test environments validated on time |

Sample Metrics – Test Execution

Below are sample metrics for productivity measures of test execution using test scripts.

Test Activity | Category | Primary Indicator | Metric | Calculation
Execution | Productivity | Scripts | Total hours spent executing test scripts | Total hours of testing
Execution | Productivity | Scripts | Total hours spent executing each test stage | Total hours of testing per stage
Execution | Productivity | Scripts | Test hours / function |
Execution | Productivity | Scripts | Test scripts / hour |
Execution | Productivity | Scripts | Test execution rate | Total hours spent executing the test stage / total number of test scripts executed in the stage
Execution | Productivity | Scripts | Total # of test scripts executed |
Execution | Productivity | Scripts | Total # of test scripts in progress |
Execution | Productivity | Scripts | Total # of test scripts not started |
Execution | Productivity | Scripts | Total # of test scripts scheduled |
Execution | Productivity | Scripts | Total % of test scripts executed | Total # of test scripts executed / total number of scripts
Execution | Productivity | Scripts | Total % of test scripts in progress | Total # of test scripts in progress / total number of scripts
Execution | Productivity | Scripts | Total % of test scripts not started | Total # of test scripts not started / total number of scripts
Execution | Productivity | Scripts | Total % of test scripts scheduled | Total # of test scripts scheduled / total number of scripts
Execution | Productivity | Scripts | Total # of test scripts added |
Execution | Productivity | Scripts | Total # of test scripts removed |
Execution | Productivity | Scripts | Total # of test scripts (results) signed off |
Execution | Productivity | Scripts | Total % of test scripts (results) signed off |
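The script-status percentages above all share one shape: a status count divided by the total number of scripts, times 100. A small sketch, assuming per-script status strings (the status names are illustrative):

```python
from collections import Counter

def script_status_percentages(statuses):
    # statuses: one entry per test script, e.g. "Executed", "In Progress",
    # "Not Started", "Scheduled".
    total = len(statuses)
    counts = Counter(statuses)
    return {state: counts[state] / total * 100 for state in counts}

pcts = script_status_percentages(
    ["Executed"] * 6 + ["In Progress"] * 2 + ["Not Started"] * 2
)
print(pcts["Executed"])  # 60.0
```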

Sample Metrics – Defects

Test Activity | Category | Primary Indicator | Metric | Calculation
Execution | Productivity | Defects | Average time to fix a defect for a given stage | Total time to fix defects / total # of defects
Execution | Productivity | Defects | Number of hours spent fixing defects at a given stage | Total hours to fix defects, by stage
Execution | Productivity | Defects | Number of defects fixed correctly for a given stage | # of defects fixed per stage
Execution | Productivity | Defects | (Stage containment) repair effort percentage | (# of hours spent repairing defects from a given stage / original number of hours to build the stage) * 100
Execution | Productivity | Defects | Repair effectiveness percentage | (# of defects fixed correctly the first time for a given stage / total # of attempted fixes for the stage) * 100
Execution | Productivity | Defects | Average turnaround days of defects, by priority | Sum of turnaround days of all defects of priority P resolved during the week / number of defects of priority P resolved during the week
Execution | Productivity | Defects | Best defect turnaround, by priority | Minimum among the turnaround days of all defects of priority P
Execution | Productivity | Defects | Worst defect turnaround, by priority | Maximum among the turnaround days of all defects of priority P
Execution | Productivity | Defects | Defect fix rate | Number of defects fixed / hours spent fixing defects
Execution | Productivity | Defects | Defect closure rate, by priority | Defects of priority P whose closed date falls within the start and end of the reporting week
Execution | Productivity | Defects | Defect emergence rate, by priority | Defects of priority P whose open date falls within the start and end of the reporting week
Execution | Productivity | Defects | Testing effectiveness percentage | (Number of defects found during testing for a given stage / hours of testing for the stage) * 100
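The average/best/worst turnaround metrics above reduce to the mean, minimum, and maximum over one priority's turnaround days for the reporting week. A minimal sketch, with illustrative sample data:

```python
def turnaround_stats(turnaround_days):
    # turnaround_days: days to resolve each defect of one priority
    # that was closed during the reporting week.
    avg = sum(turnaround_days) / len(turnaround_days)
    return {
        "average": avg,                 # average turnaround by priority
        "best": min(turnaround_days),   # best (fastest) turnaround
        "worst": max(turnaround_days),  # worst (slowest) turnaround
    }

stats = turnaround_stats([1, 2, 2, 5])
print(stats)  # {'average': 2.5, 'best': 1, 'worst': 5}
```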

Sample Metrics – General Productivity

Test Activity | Category | Primary Indicator | Metric | Calculation
Execution | Quality | Defects | Total # of defects found |
Execution | Quality | Defects | Total # of defects fixed |
Execution | Quality | Defects | Total # of outstanding defects |
Execution | Quality | Defects | Number of defects by module | # of defects per module
Execution | Quality | Defects | Total defects by stage | # of defects by stage
Execution | Quality | Defects | Total defects by severity | # of total defects by severity
Execution | Quality | Defects | Total defects by severity by stage | # of total defects by severity, by stage
Execution | Quality | Defects | # of defects by type (application, data, environment, script) | # of defects created for each category
Execution | Quality | Defects | # of defects by component (online, batch, report, etc.) | # of defects created for each component
Execution | Quality | Defects | # of defects by component complexity | # of defects for each component, by component complexity
Execution | Quality | Defects | Defect rate | Number of defects / days or weeks of execution
Execution | Quality | Defects | Defect ratio | # of defects / software size
Execution | Quality | Defects | Defect density | # of defects / KSLOC (thousand lines of code)
Execution | Quality | Scripts | Total # of test scripts passed |
Execution | Quality | Scripts | Total % of test scripts passed | Total # of test scripts passed / total number of scripts
Execution | Quality | Scripts | Total # of test scripts failed |
Execution | Quality | Scripts | Total % of test scripts failed | Total # of test scripts failed / total number of scripts
All/General | Productivity | Time/schedule | Duration variance percentage | ((actual duration / planned duration) * 100) - 100
All/General | Productivity | Work/effort | Effort variance percentage | ((actual effort / planned effort) * 100) - 100
All/General | Productivity | Work/effort | Number of personnel entering time into the time management tool |
All/General | Productivity | Work/effort | % of personnel entering time into the time management tool |
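Two calculations in this table are worth spelling out: defect density (defects per KSLOC) and the shared variance-percentage formula used for both duration and effort. A minimal sketch with illustrative numbers:

```python
def defect_density(defects: int, ksloc: float) -> float:
    # Defects per thousand source lines of code (KSLOC).
    return defects / ksloc

def variance_percentage(actual: float, planned: float) -> float:
    # Shared formula for duration and effort variance:
    # ((actual / planned) * 100) - 100. Positive values mean an overrun,
    # negative values mean coming in under plan.
    return (actual / planned) * 100 - 100

print(defect_density(30, 12.0))       # 2.5 defects per KSLOC
print(variance_percentage(120, 100))  # 20.0 (a 20% overrun)
```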

Test Metric Start Up

[Diagram: data sources feeding a central test metric repository and dashboards.]

- Hours accounting: total effort, total QA effort, total rework effort, total test effort, cost variance, schedule variance
- MS Project project workplans: test process task mapping, compliance reports, accuracy and compliance reports, cost/schedule reports
- Defect tracking: development defects (test management tool, project database (interim), project tools (temporary), project reports, test management tool reports); production problems
- Other sources: risk-based testing, training statistics, customer satisfaction

These sources pass through accuracy, completeness, and integrity verification into a dashboard reporting engine and archive (central repository), which feeds both the executive dashboard and the test manager dashboard.

The consolidation of metric data sources, integrated verification of data quality, and a centralized repository for capturing measurement results enable an organization to successfully establish a test metric framework and its associated processes.