Copyright © 1996-2003, Satisfice, Inc. V1. James Bach, Satisfice, Inc. (540)631-0600.


Copyright Notice

These slides are distributed under the Creative Commons License. In summary, you may make and distribute copies of these slides so long as you give the original author credit and, if you alter, transform, or build upon this work, you distribute the resulting work only under a license identical to this one. For the rest of the details of the license, see sa/2.0/legalcode.

Public Relations Problem

“What’s the status of testing?”
“What are you doing today?”
“When will you be finished?”
“Why is it taking so long?”
“Have you tested _______, yet?”

The Problem

- Management has little patience for detailed test status reports.
- Management doesn’t understand testing.
- Testing is confused with improving.
- Testing is considered a linear, independent task.
- Testing is assumed to be exhaustive.
- Testing is assumed to be continuous.
- Test results are assumed to stay valid over time.
- The impact of regression testing is not appreciated.
- Test metrics are hard to interpret.

A Solution

Report test cycle progress in a simple, structured way...
- ...that shows progress toward a goal...
- ...manages expectations...
- ...and inspires support...
- ...for an effective test process.

No process? No problem! Easy setup.

The Dashboard Concept

- Project conference room
- Large dedicated whiteboard
- “Do Not Erase”
- Project status meeting

Test Cycle Report

Product Areas vs. Test Effort, Test Coverage, Quality Assessment, and Time.

Testing Dashboard (example whiteboard; layout lost in transcription) — Updated: 2/21, Build: 38

Columns: Area | Effort | C(overage) | Q(uality) | Comments
Areas: file/edit, view, insert, format, tools, slideshow, online help, clipart, converters, install, compatibility, general GUI.
Sample effort entries: high, low, blocked, none, start 3/17.
Sample comments: problem IDs 1345, 1363, 1401; “automation broken”; “crashes: 1406, 1407”; “animation memory leak”; “new files not delivered”; “need help to test...”; “lab time is scheduled”.
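The dashboard is just a table of one row per product area. As an illustrative sketch (not from the talk; all names are mine), it can be modeled as a small data structure and rendered as plain text, which is all the low-tech approach requires:

```python
# Hypothetical model of the dashboard: one row per product area,
# carrying the four columns the whiteboard tracks.
from dataclasses import dataclass

@dataclass
class AreaStatus:
    area: str         # e.g. "file/edit"
    effort: str       # none | start | low | high | pause | blocked | ship
    coverage: str     # one of the coverage levels, e.g. "0".."3"
    quality: str      # green | yellow | red
    comments: str = ""

def render(rows):
    """Render the dashboard as a plain-text table, one line per area."""
    lines = [f"{'Area':<12} {'Effort':<8} {'Cov':<4} {'Q':<7} Comments"]
    for r in rows:
        lines.append(f"{r.area:<12} {r.effort:<8} {r.coverage:<4} {r.quality:<7} {r.comments}")
    return "\n".join(lines)

# Two rows loosely based on the example board above.
board = [
    AreaStatus("file/edit", "high", "2", "yellow", "1345, 1363, 1401"),
    AreaStatus("view", "blocked", "1", "red", "automation broken"),
]
```

Keeping the model this small is deliberate: the whole point of the dashboard is that it fits on one whiteboard and can be read at a glance.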

Product Area

- areas (keep it simple)
- Avoid sub-areas: they’re confusing.
- Areas should have roughly equal value.
- Areas together should be inclusive of everything reasonably testable.
- “Product areas” can include tasks or risks, but put them at the end.
- Minimize overlap between areas.
- Areas must “make sense” to your clients, or they won’t use the board.

Example areas: file/edit, view, insert, format, tools, slideshow, online help, clipart, converters, install, compatibility, general GUI.

Test Effort

None: Not testing; not planning to test.
Start: No testing yet, but expect to start soon.
Low: Regression or spot testing only; maintaining coverage.
High: Focused testing effort; increasing coverage.
Pause: Temporarily ceased testing, though area is testable.
Blocked: Can’t effectively test, due to blocking problem.
Ship: Going through final tests and signoff procedure.

Test Effort

- Use red to denote significant problems or stoppages, as in blocked, none, or pause.
- Color ship green once the final tests are complete and everything else on that row is green.
- Use a neutral color (such as black or blue, but pick only one) for the others, as in start, low, or high.
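The color rules above are mechanical enough to state as a tiny function. This is a sketch of those rules (the function name and the `final_tests_done` flag are my own shorthand for "final tests are complete and everything else on that row is green"):

```python
def effort_color(effort, final_tests_done=False):
    """Map a test-effort status to a whiteboard color, per the stated rules."""
    if effort in ("blocked", "none", "pause"):
        return "red"        # significant problem or stoppage
    if effort == "ship" and final_tests_done:
        return "green"      # final tests complete, rest of the row green
    return "black"          # neutral: start, low, high (or ship in progress)
```

Any status that comes back red is exactly the kind of cell the comment field is meant to explain.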

Test Coverage

0: We don’t have good information about this area.
1. Sanity Check: major functions & simple data.
1+: More than sanity, but many functions not tested.
2. Common & Critical: all functions touched; common & critical tests executed.
2+: Some data, state, or error coverage beyond level 2.
3. Complex Cases: strong data, state, error, or stress testing.

Test Coverage

- Color green if coverage level is acceptable for ship; otherwise color black.
- Levels 1 and 2 focus on functional requirements and capabilities: can this product work at all?
- Level 2 may span 50%-90% code coverage.
- Levels 2+ and 3 focus on information to judge performance, reliability, compatibility, and other “ilities”: will this product work under realistic usage?
- Level 3 or 3+ implies “if there were a bad bug in this area, we would probably know about it.”
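Because the coverage levels form an ordered scale, the green/black rule reduces to an ordering check. A sketch under the assumption that the scale runs 0 through 3 as listed above, with the ship-acceptable threshold left configurable (the "2+" default is mine, for illustration only):

```python
# The coverage scale as an ordered list, lowest to highest.
COVERAGE_SCALE = ["0", "1", "1+", "2", "2+", "3"]

def coverage_color(level, acceptable_for_ship="2+"):
    """Green if coverage is at or above the ship-acceptable level, else black."""
    reached = COVERAGE_SCALE.index(level) >= COVERAGE_SCALE.index(acceptable_for_ship)
    return "green" if reached else "black"
```

In practice the threshold is a judgment call per area; encoding it as a parameter just makes the call explicit.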

Quality Assessment

Green: “We know of no problems in this area that threaten to stop ship or interrupt testing, nor do we have any definite suspicions about any.”
Yellow: “We know of problems that are possible showstoppers, or we suspect that there are important problems not yet discovered.”
Red: “We know of problems in this area that definitely stop ship or interrupt testing.”
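The three assessments pair naturally with the board's color scheme. A sketch (the dictionary and helper names are mine) that also captures the deck's rule that anything non-green deserves an explanation in the comment field:

```python
# Each quality color paired with a short form of the assessment it stands for.
QUALITY = {
    "green":  "No known or suspected ship-stopping problems.",
    "yellow": "Possible showstoppers, or suspected undiscovered problems.",
    "red":    "Known problems that stop ship or interrupt testing.",
}

def needs_comment(color):
    """True when the comment field should explain this quality indicator."""
    return color != "green"
```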

Comments

Use the comment field to explain anything colored red, or any non-green quality indicator:
- Problem ID numbers.
- Reasons for pausing, or delayed start.
- Nature of blocking problems.
- Why an area is unstaffed.

Using the Dashboard

- Updates: 2-5 per week, or at each build, or prior to each project meeting.
- Progress: Set expectations about the duration of the “testing clock” and how new builds reset it.
- Justification: Be ready to justify the contents of any cell in the dashboard. The authority of the board depends on meaningful, actionable content.
- Going high tech: Sure, you can put this on the web, but will anyone actually look at it?

KEY IDEA

It Boils Down To…
- YOU: Skills, equipment, experience, attitude
- THE BALL: The product, testing tasks, bugs
- YOUR TEAM: Coordination, roles,