441 Copyright © 1996-2003, Satisfice, Inc. V1.6.1 James Bach, Satisfice, Inc. (540)631-0600.


442 Copyright Notice These slides are distributed under the Creative Commons License. In brief summary, you may make and distribute copies of these slides so long as you give the original author credit and, if you alter, transform or build upon this work, you distribute the resulting work only under a license identical to this one. For the rest of the details of the license, see sa/2.0/legalcode.

443 Acknowledgements  Some of this material was developed in collaboration with Dr. Cem Kaner, of the Florida Institute of Technology.  Many of the ideas in this presentation were inspired by or contributed by other colleagues including Bret Pettichord, Brian Marick, Doug Hoffman, Dave Gelperin, Elisabeth Hendrickson, and Noel Nyman.  This class is under continuous development. Many ideas were improved or contributed by students in earlier versions of the class since 1996.

444 Assumptions  You test software.  You have at least some control over the design of your tests and some time to create new tests.  One of your goals is to find important bugs fast.  You test things under conditions of uncertainty and time pressure.  You have control over how you think and what you think about.  You want to get very good at software testing.

445 Primary Goal of this Class To teach you how to test a product when you have to test it right now, under conditions of uncertainty, in a way that stands up to scrutiny.

446 Background

447 Your Moves: Rapid Testing Cycle (cycle diagram) START → make sense of your status → focus on what needs doing → do a burst of testing → compare status against mission → report → STOP, looping until the mission is met.

448 What About “Slow” Testing? (diagram) Within all testing, rapid testing is the part you can do no matter what. Rigorous or thorough testing requires automation, extensive preparation, super testability, and super skill. Management likes to talk about the rigorous kind… but they don’t fund it.

449 What is quality? What is a bug? Quality is value to some person. A bug is anything that threatens the value of the product.  These definitions are designed to be inclusive.  Inclusive definitions minimize the chance that you will inadvertently overlook an important problem.

450 Testing is in Your Head The important parts of testing don’t take place in the computer or on your desk. (Diagram elements: specifications, the product, technical knowledge, domain knowledge, critical thinking, and experience feed the moment of recognition, “Ah! Problem!”, which becomes coverage, problem reports, and communication.)

451 A Test (or Test Suite) is Like a Question You Ask the Product  A tester’s questions seek potentially valuable information.  To some degree, good tests have these attributes:  Power. When a problem exists, the test will reveal it.  Validity. When the test reveals a problem, it is a genuine problem.  Value. It reveals things your clients want to know about the product or project.  Pop. (short for Karl Popper) It reveals things about our basic or critical assumptions.  Coverage. It exercises the product in some way.  Performability. It can be performed as designed; repeated as needed.  Accountability. You can explain, justify, and prove you ran it.  Cost. This includes time and effort, as well as direct costs.  Opportunity Cost. Performing it may prevent you from doing other tests.

452 Contrasting Approaches In scripted testing, tests are first designed and recorded; they may then be executed at some later time, or by a different tester. In exploratory testing, tests are designed and executed at the same time, and they often are not recorded.

453 Contrasting Approaches Scripted testing emphasizes accountability and decidability. Exploratory testing emphasizes adaptability and learning.

454 Exploratory Testing Defined Exploratory testing is simultaneous learning, test design, and test execution. (Continuum: pure scripted → vague scripts → fragmentary test cases → charters → roles → freestyle exploratory.) When I say “exploratory testing” and don’t qualify it, I mean anything on the exploratory side of this continuum.

455 ET Done Well is a Structured Process  Exploratory testing, as I teach it, is a structured process conducted by a skilled tester, or by lesser-skilled testers or users working under reasonable supervision.  The structure of ET comes from:  Test design heuristics  Chartering  Time boxing  Perceived product risks  The nature of specific tests  The structure of the product being tested  The process of learning the product  Development activities  Constraints and resources afforded by the project  The skills, talents, and interests of the tester  The overall mission of testing In other words, it’s not “random”, but reasoned.

456 ET is an Adaptive Process  Exploratory testing decentralizes the testing problem.  Instead of trying to solve it:  only before test execution begins.  by investing in expensive test documentation that tends to reduce the total number of tests that can be created.  only via a designer who is not necessarily the tester.  while trying to eliminate the variations among testers.  completely, and all at once.  It is solved:  over the course of the project.  by minimizing the need for expensive test documentation so that more tests and more complex tests can be created with the same effort.  via testers who may also be test designers.  by taking maximum advantage of variations among testers.  incrementally and cyclically.

457 Exploratory Forks New test ideas occur continually during an ET session; each one is a potential fork in your path.

458 Lateral Thinking Let yourself be distracted… ‘cause you never know what you’ll find… but periodically take stock of your status against your mission.

459 Exploratory Testing Tasks (diagram) The session cycles among three tasks, producing testing notes, tests, and problems found:  Learning: discover the elements of the product; discover how the product should work; discover test design techniques that can be used.  Design Tests: decide which elements to test (product coverage); speculate about possible quality problems; select test design techniques.  Execute Tests: configure & operate the product; observe product behavior; evaluate behavior against expectations (quality oracles).

460 Taking Notes  Test Coverage Outline/Matrix  Oracle Notes  Risk/Strategy List  Test Execution Log  Issues, Questions & Anomalies  It would be easier to test if you changed/added…  How does … work?  Is this important to test? How should I test it?  I saw something strange…

461 KEY IDEA

462 Models  A Model is…  A map of a territory  A simplified perspective  A relationship of ideas  An incomplete representation of reality  A diagram, list, outline, matrix… No good test design has ever been done without models. The trick is to become aware of how you model the product, and learn different ways of modeling.

463 The Universal Test Procedure “Try it and see if it works.”  Models: learn about it; model it; speculate about it.  Coverage: configure it; operate it.  Oracles: know what to look for; see what’s there; understand the requirements; identify problems; distinguish bad problems from not-so-bad problems.

464 All Product Testing is Something Like This (diagram) The project environment, product elements, and quality criteria shape the test techniques you apply, which yield perceived quality.

465 Seven Big Problems of Testing  Logistics Problem  Coverage Problem  Oracle Problem  Reporting Problem  Stopping Problem  Pesticide Problem  Agency Problem

466 Coverage  There are as many kinds of coverage as there are ways to model the product.  Structural  Functional  Data  Platform  Operations Product coverage is the proportion of the product that has been tested.
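The idea that coverage is the proportion of a *model* that testing has touched can be sketched in code. This is a minimal illustration, not anything from the original course; the model names and elements are hypothetical.

```python
# A minimal sketch: product coverage tracked as the proportion of a modeled
# area that tests have touched. Each model is just a named set of elements;
# tests record what they exercised. All names here are illustrative.

class CoverageModel:
    def __init__(self, name, elements):
        self.name = name                  # e.g. "Functional", "Platform"
        self.elements = set(elements)     # everything the model says exists
        self.covered = set()              # what testing has touched so far

    def record(self, element):
        # Only elements the model knows about count toward coverage.
        if element in self.elements:
            self.covered.add(element)

    def proportion(self):
        return len(self.covered) / len(self.elements)

functional = CoverageModel("Functional", {"open", "save", "print", "export"})
functional.record("open")
functional.record("save")
print(f"{functional.name} coverage: {functional.proportion():.0%}")  # 50%
```

Note that the number is only as meaningful as the model behind it: a different model of the same product (data, platform, operations) yields a different coverage figure.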

467 Sometimes your coverage is disputed… “No user would do that.” Translation: “No user I can think of, who I like, would do that on purpose.” Ask: Who aren’t you thinking of? Who don’t you like who might really use this product? What might good users do by accident?

468 Useful Oracle Heuristics  Consistent with History: Present function behavior is consistent with past behavior.  Consistent with our Image: Function behavior is consistent with an image that the organization wants to project.  Consistent with Comparable Products: Function behavior is consistent with that of similar functions in comparable products.  Consistent with Claims: Function behavior is consistent with what people say it’s supposed to be.  Consistent with User’s Expectations: Function behavior is consistent with what we think users want.  Consistent within Product: Function behavior is consistent with behavior of comparable functions or functional patterns within the product.  Consistent with Purpose: Function behavior is consistent with apparent purpose.
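One of these heuristics, “Consistent with Comparable Products,” can be illustrated with a small sketch: check a function under test against a comparable implementation and flag inconsistencies as questions to investigate. The function `product_round` is a stand-in for a product feature, and Python’s built-in `round()` plays the comparable product; both names are assumptions for the example.

```python
# A hedged sketch of the "Consistent with Comparable Products" oracle.
# product_round() is a hypothetical function under test; Python's round()
# acts as the comparable product. A mismatch is not automatically a bug --
# it is an inconsistency worth investigating.

import decimal

def product_round(x):
    # Rounds half away from zero (many desktop apps behave this way).
    return decimal.Decimal(str(x)).quantize(
        decimal.Decimal("1"), rounding=decimal.ROUND_HALF_UP)

def oracle_comparable(value, result):
    # Python's round() uses banker's rounding, so 0.5 and 2.5 will differ.
    return int(result) == round(value)

for v in [0.5, 1.5, 2.5, 3.3]:
    r = product_round(v)
    status = "consistent" if oracle_comparable(v, r) else "INVESTIGATE"
    print(f"{v} -> {r}  [{status}]")
```

The point of the sketch is the *interpretation*: the oracle does not decide pass/fail by itself; it surfaces an inconsistency that a tester then judges against the other heuristics (claims, purpose, user expectations).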

469 Rapid, Frequent Feedback to Clients (test cycle diagram) Receive build → sanity check (is it testable?) → fix verifications (were they fixed?) → new stuff (is it functional?) → common and critical tests → complex tests → general regression tests.

470 Risk Focus: Common and Critical Cases  Core functions: the critical and the popular.  Capabilities: can the functions work at all?  Common situations: popular data and pathways.  Common threats: likely stress and error situations.  User impact: failures that would do a lot of damage.  Most wanted: problems of special interest to someone else on the team.

471 Rapid Bug Investigation  Identification  Notice a problem.  Recall what you were doing just prior to the problem.  Examine symptoms of the problem without disturbing system state.  Consider the possibility of tester error.  Investigation  How can the problem be reproduced?  What are the symptoms of the problem?  How severe could the problem be?  What might be causing the problem?  What might be a workaround?  Reality Check  Do we know enough about the problem to report it?  Is it important to investigate this problem right now?  Is this problem, or any variant of it, already known?  How do we know this is really a problem?  Is there someone else who can help us?

472 KEY IDEA

473 Test Strategy  Strategy: “The set of ideas that guide your test design.”  Logistics: “The set of ideas that guide your application of resources to fulfilling the test strategy.”  Plan: “The set of ideas that guide your test project.”  A good test strategy is:  Product-Specific  Risk-focused  Diversified  Practical

474 Test Strategy  Test Approach and Test Architecture are other terms commonly used to describe what I’m calling test strategy.  Example of a poorly stated (and probably poorly conceived) test strategy:  “We will use black box testing, cause-effect graphing, boundary testing, and white box testing to test this product against its specification.”

475 Test Strategy  Not to be confused with test logistics, which involve the details of bringing resources to bear on the test strategy at the right time and place.  You don’t have to know the entire strategy in advance. The strategy should change as you learn more about the product and its problems.

476 One way to make a strategy… 1. Learn the product. 2. Think of important potential problems. 3. Think of ways to test that will cover the product and look for those important problems. 4. Make sure you are taking advantage of resources. 5. Make sure that your strategy is reasonably practical.

477 Test Strategy Heuristic: Diverse Half-Measures  There is no single technique that finds all bugs.  We can’t do any technique perfectly.  We can’t do all conceivable techniques. So use “diverse half-measures”: lots of different points of view, approaches, and techniques, even if no one strategy is performed completely.

478 Strategy Heuristic: Function/Data Square (2×2 diagram with axes Functions and Data; its quadrants are smoke testing, function testing, reliability testing, and risk testing.)

479 Test Techniques A test technique is a recipe for performing these tasks that will reveal something worth reporting  Analyze the situation.  Model the test space.  Select what to cover.  Define your oracles.  Configure the test system.  Operate the test system.  Observe the test system.  Evaluate the test results.
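One concrete instance of the recipe above can be sketched as boundary testing. The function `validate_age` is a hypothetical system under test, assumed to accept integers 0 through 130 inclusive; comments map the code to the recipe’s tasks.

```python
# A sketch of the recipe applied as boundary testing. validate_age() is a
# stand-in for a real product function; the 0..130 range is an assumed claim.

def validate_age(n):
    # The system under test (hypothetical): accepts ages 0..130 inclusive.
    return isinstance(n, int) and 0 <= n <= 130

# Analyze the situation / model the test space: the integer line around the
# claimed range. Select what to cover: values at and adjacent to each boundary.
# Define the oracle: the claimed valid range itself.
boundary_cases = [(-1, False), (0, True), (1, True),
                  (129, True), (130, True), (131, False)]

# Configure, operate, observe, and evaluate:
for value, expected in boundary_cases:
    actual = validate_age(value)
    assert actual == expected, (
        f"validate_age({value}) = {actual}, expected {expected}")
print("all boundary cases pass")
```

The recipe is technique-agnostic: swap the model (equivalence classes, state transitions, stress inputs) and the same eight tasks still apply.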

480 Dynamic Quality Paradigm (diagram) A quality scale runs from Awful to Perfect. A floating “good enough” bar divides unacceptable quality from unnecessary quality. Item A sits above the bar: further improvement would not be a good use of resources. Item B sits below it: further improvement is necessary. It’s more important to work on Item B.

481 A Heuristic for Good Enough 1. X has sufficient benefits. 2. X has no critical problems. 3. Benefits of X sufficiently outweigh problems. 4. In the present situation, and all things considered, improving X would be more harmful than helpful. All conditions must apply.
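Because all four conditions must hold, the heuristic reduces to a simple conjunction. This sketch encodes it directly; the inputs are human judgments, and the function only enforces “all conditions must apply.”

```python
# The four "good enough" conditions encoded as a predicate. Every input is a
# judgment a person supplies; the code only enforces the conjunction.

def good_enough(sufficient_benefits, no_critical_problems,
                benefits_outweigh_problems, improving_would_harm_more):
    return all([sufficient_benefits, no_critical_problems,
                benefits_outweigh_problems, improving_would_harm_more])

# A release with a critical problem fails condition 2, so it is not good
# enough no matter how large its benefits are.
print(good_enough(True, False, True, True))   # False
print(good_enough(True, True, True, True))    # True
```

The structure makes the rhetorical point visible: “good enough” is not an average of strengths and weaknesses; a single failed condition vetoes the whole judgment.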

482 Good Enough...  …with what level of confidence?  …to meet ethical obligations?  …in what time frame?  …compared to what?  …for what purpose?  …or else what?  …for whom? Perspective is Everything

483 MISSION: The most important part  Find important problems  Assess quality  Certify to standard  Fulfill process mandates  Satisfy stakeholders  Assure accountability  Advise about QA  Advise about testing  Advise about quality  Maximize efficiency  Minimize time  Minimize cost The quality of testing depends on which of these possible missions matter and how they relate. Many debates about the goodness of testing are really debates over missions and givens.

484 Testability  Controllability  Observability  Availability  Simplicity  Stability  Information For example: log files! A scriptable interface!

485 It Boils Down To…  YOU: Skills, equipment, experience, attitude  THE BALL: The product, testing tasks, bugs  YOUR TEAM: Coordination, roles, support  THE GAME: Risks, rewards, project environment, corporate environment, your mission as a tester  YOUR MOVES: How you spend your attention and energy to help your team win the game.

486 Rapid Testing  Develop your scientific mind.  Use exploratory testing.  Know your coverage and oracles.  Run crisp test cycles that focus first on areas of risk.  Use a diversified test strategy that serves the mission.  Assure that your testing fits the logistics of the project.

487 Exploratory Process

488 Introducing the Test Session 1) Charter 2) Time Box 3) Reviewable Result 4) Debriefing

489 Charter: A clear mission for the session  A charter may suggest what should be tested, how it should be tested, and what problems to look for.  A charter is not meant to be a detailed plan.  General charters may be necessary at first:  “Analyze the Insert Picture function”  Specific charters provide better focus, but take more effort to design:  “Test clip art insertion. Focus on stress and flow techniques, and make sure to insert into a variety of documents. We’re concerned about resource leaks or anything else that might degrade performance over time.”

490 Time Box: Focused test effort of fixed duration  Brief enough for accurate reporting.  Brief enough to allow flexible scheduling.  Brief enough to allow course correction.  Long enough to get solid testing done.  Long enough for efficient debriefings.  Beware of overly precise timing. Short: 60 minutes (±15) Normal: 90 minutes (±15) Long: 120 minutes (±15)

491 Debriefing: Measurement begins with observation  The manager reviews the session sheet to assure that he understands it and that it follows the protocol.  The tester answers any questions.  Session metrics are checked.  Charter may be adjusted.  Session may be extended.  New sessions may be chartered.  Coaching happens.

492 Reviewable Result: A scannable session sheet Sections:  Charter (with #AREAS tags)  Start Time  Tester Name(s)  Task Breakdown (#DURATION, #TEST DESIGN AND EXECUTION, #BUG INVESTIGATION AND REPORTING, #SESSION SETUP, #CHARTER/OPPORTUNITY)  Data Files  Test Notes  Bugs (#BUG)  Issues (#ISSUE) Example: CHARTER: Analyze MapMaker’s View menu functionality and report on areas of potential risk. #AREAS: OS | Windows 2000; Menu | View; Strategy | Function Testing; Strategy | Functional Analysis. START: /30/00 03:20 pm. TESTER: Jonathan Bach. TASK BREAKDOWN: #DURATION short; #TEST DESIGN AND EXECUTION 65; #BUG INVESTIGATION AND REPORTING 25; #SESSION SETUP 20.
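Because the sheet is tagged text, its metrics are easy to extract mechanically, which is what makes the session “scannable.” This sketch assumes a tag-on-one-line, value-on-the-next layout inferred from the example above; a real session sheet format may differ.

```python
# A sketch of pulling task-breakdown metrics out of a session sheet. The
# "#TAG on one line, value on the next" layout is an assumption based on
# the example in the slide, not a documented format.

SHEET = """\
TASK BREAKDOWN
#DURATION
short
#TEST DESIGN AND EXECUTION
65
#BUG INVESTIGATION AND REPORTING
25
#SESSION SETUP
20
"""

def parse_breakdown(text):
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    metrics = {}
    # Pair each line with its successor; lines starting with '#' are tags.
    for tag, value in zip(lines, lines[1:]):
        if tag.startswith("#"):
            metrics[tag[1:]] = int(value) if value.isdigit() else value
    return metrics

m = parse_breakdown(SHEET)
print(m["TEST DESIGN AND EXECUTION"])   # 65
```

Aggregating these numbers across many sessions is what gives session-based test management its metrics, e.g. the share of session time spent on test design and execution versus setup and bug investigation.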

493 Coverage: Specifying coverage areas  These are text labels listed in the Charter section of the session sheet. (e.g. “insert picture”)  Coverage areas can include anything  areas of the product  test configuration  test strategies  system configuration parameters  Use the debriefings to check the validity of the specified coverage areas.

494 Closing Concepts

495 Common Concerns About ET  Concerns  We have a long-life product and many versions, and we want a good corporate memory of key tests and techniques. Corporate memory is at risk because of the lack of documentation.  The regulators would excommunicate us. The lawyers would massacre us. The auditors would reject us.  We have specific tests that should be rerun regularly.  Replies  So, use a balanced approach, not purely exploratory.  Even if you script all tests, you needn’t outlaw exploratory behavior.  Let no regulation or formalism be an excuse for bad testing.

496  Concerns  Some tests are too complex to be kept in the tester’s head. The tester has to write stuff down or he will not do a thorough or deep job.  Replies  There is no inherent conflict between ET and documentation.  There is often a conflict between writing high quality documentation and doing ET when both must be done at the same moment. But why do that?  Automatic logging tools can solve part of the problem.  Exploratory testing can be aided by any manner of test tool, document, or checklist. It can even be done from detailed test scripts. Common Concerns About ET

497  Concerns  ET works well for expert testers, but we don’t have any.  Replies  Detailed test procedures do not solve that problem, they merely obscure it, like perfume on a rotten egg.  Our goal as test managers should be to develop skilled testers so that this problem disappears, over time.  Since ET requires test design skill in some measure, ET management must constrain the testing problem to fit the level and type of test design skill possessed by the tester.  I constrain the testing problem by personally supervising the testers, and making use of concise documentation, NOT by using detailed test scripts. Humans make poor robots. Common Concerns About ET

498  Concerns  How do I tell the difference between bluffing and exploratory testing?  If I send a scout and he comes back without finding anything, how do I know he didn’t just go to sleep behind some tree?  Replies  You never know for sure, just as you don’t know if a tester truly followed a test procedure.  It’s about reputation and relationships. Managing testers is like managing executives, not factory workers.  Give novice testers short leashes; better testers long leashes. An expert tester may not need a leash at all.  Work closely with your testers, and these problems go away. Common Concerns About ET

499 Challenges of High Accountability Exploratory Testing  Architecting the system of charters (test planning)  Making time for debriefings  Getting the metrics right  Creating good test notes  Keeping the technique from dominating the testing  Maintaining commitment to the approach For example session sheets and metrics, see