Testing Under Pressure: Five Key Principles


1 Testing Under Pressure: Five Key Principles
Robert Sabourin, President, AmiBug.Com, Inc., Montreal, Canada
April 20, 2017
© Robert Sabourin, 2009

2 Under Pressure
Pain points? What hurts? How much?

3 Testing Under Pressure
(1) Begin with the end in mind
(2) Active context listening
(3) Decision making
(4) Ruthlessly triage
(5) The last best build

4 Under Pressure
(1) Begin with the end in mind

5 Fundamental Question
How do you know when you are finished?

6 Crosby on Quality
“Quality is defined as conformance to requirements.”
“Quality is not a measure of GOODNESS.”
(Phil B. Crosby, Quality Is Free)

7 Gerald M. Weinberg
“Quality is value to some person.”
(Exploring Requirements: Quality Before Design, Dorset House)

8 Edsger W. Dijkstra
“Program testing can be used to show the presence of bugs, but never to show their absence.”

9 Pareto Principle
Vilfredo Pareto, 1848-1923, economist:
80% of the wealth was in the hands of 20% of the population.

10 Pareto Principle
Joseph Juran, quality control engineer; 1950, Quality Control Handbook:
20% of the study population accounts for 80% of the measure under consideration.

11 Under Pressure
(2) Active context listening

12 Context Drivers: BTO (Business, Technology, Organization)
Business: Value? To whom? Why?
Technology: Solutions.
Organization: Corporate structure, team structure, roles and responsibilities.

13 (image-only slide)

14 Context Listeners
Find sources. Monitor drivers. Anticipate change. React.

15 Under Pressure
(3) Decision making

16 First Things First: Begin with the end in mind
Gain consensus on:
Goals: How do we know we are finished?
Purpose: Why are we doing this project? How will we react to change?
Meaning: What is a bug? What is a test? What is quality?

17 Yoda
"No! Try not. Do. Or do not. There is no try."

18 Bug Flow
We will be testing … imagine that we actually find a bug!
What are we going to do about it? How will we decide?
When should we decide how to decide? When should we change how we decide?
When should we review our past decisions?

19 Bug Flow
States: Entered, Reviewed, Prioritized, Assigned, Unassigned, Fixed, Closed
Transitions: CHECK, TRIAGE, DESIGNATE, MANDATE, CORRECT, CONFIRM, FAILURE, REFUSE
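The bug-flow states and transition names above can be captured as a small transition table. This is a minimal sketch, not the author's tool; where the slide leaves a transition's source or target ambiguous (REFUSE, MANDATE, FAILURE), the wiring below is my assumption.

```python
# Sketch of the bug-flow state model; state and action names come from the deck.
# The REFUSE, MANDATE and FAILURE wirings are assumptions, not from the slide.
TRANSITIONS = {
    ("Entered", "CHECK"): "Reviewed",
    ("Reviewed", "TRIAGE"): "Prioritized",
    ("Reviewed", "REFUSE"): "Closed",          # assumed: triage refuses the report
    ("Prioritized", "DESIGNATE"): "Assigned",
    ("Prioritized", "MANDATE"): "Unassigned",  # assumed: deferred, nobody owns it yet
    ("Unassigned", "DESIGNATE"): "Assigned",
    ("Assigned", "CORRECT"): "Fixed",
    ("Fixed", "CONFIRM"): "Closed",
    ("Fixed", "FAILURE"): "Assigned",          # assumed: failed confirmation reopens work
}

def advance(state: str, action: str) -> str:
    """Apply an action to a bug's state, rejecting transitions the model forbids."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"illegal transition {action!r} from state {state!r}") from None

state = "Entered"
for action in ("CHECK", "TRIAGE", "DESIGNATE", "CORRECT", "CONFIRM"):
    state = advance(state, action)
print(state)  # Closed
```

A table like this makes the team's agreed workflow executable: any bug-tracker update that is not a legal transition is caught immediately instead of silently corrupting the bug list.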

20 Bug Workflow: Seven Steps
1. Identify key stakeholders
2. Learn about decisions
3. Define bug priority and severity
4. List steps when a bug is found
5. Build bug-flow state model
6. Get stakeholder buy-in
7. Adapt bug flow as required

21 Bug Workflow (1): Identify key stakeholders

22 Bug Workflow (2): How is a decision made?

23 Bug Workflow (3): Define a priority and severity scheme

24 Bug Workflow (4): Define steps to follow
1. A bug is discovered in testing or reported from the field.
2. A bug report form is completed.
3. The bug report form is reviewed.
4. The bug report is added to the bug list.
5. A decision is made, at a bug review meeting, about whether the bug should be fixed.
6. If the bug is fixed, the software is re-tested to reconfirm that the bug has indeed been fixed.
7. If the bug is not fixed (on purpose!), a description of the workaround is published or made available to help-desk staff.

25 Bug Workflow (5): Build the state model
States: Entered, Reviewed, Prioritized, Assigned, Unassigned, Fixed, Closed
Transitions: CHECK, TRIAGE, DESIGNATE, MANDATE, CORRECT, CONFIRM, FAILURE, REFUSE

26 Bug Workflow (6): Get stakeholder buy-in

27 R2-D2 and Chewbacca are playing the holographic game aboard the Millennium Falcon.
Chewbacca: Aaaaaaaaaaaaaaaarrrgh!
C-3PO: He made a fair move. Screaming about it can't help you.
Han Solo: Let him have it. It's not wise to upset a Wookiee.
C-3PO: But sir, nobody worries about upsetting a droid.
Han Solo: That's 'cause droids don't pull people's arms out of their sockets when they lose. Wookiees are known to do that.
Chewbacca: Grrf.
C-3PO: I see your point, sir. I suggest a new strategy, R2: let the Wookiee win.

28 Bug Workflow (7): Adapt as required

29 Under Pressure
(4) Ruthlessly triage

30 Test Idea Sources
Capabilities, failure modes, quality factors, usage scenarios, creative ideas, states, data, environments, white box, taxonomies.
Capture testing ideas.

31 Triage
Criticality, resources, trade-offs, benefit, consequence, credibility.

32 Which test? Impact estimation
For each test idea, guesstimate:
the benefit of implementing it
the consequence of implementing it
the benefit of not implementing it
the consequence of not implementing it
How credible is the information?
Triage testing ideas.
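One way to fold the four guesstimates and the credibility factor into a single ranking number is a credibility-weighted net-value score. This scheme is my illustration, not one prescribed in the deck, and the idea names and numbers are made up.

```python
def triage_score(idea: dict) -> float:
    """Credibility-weighted net value of running a test idea versus skipping it."""
    run = idea["benefit_of_implementing"] - idea["consequence_of_implementing"]
    skip = idea["benefit_of_not_implementing"] - idea["consequence_of_not_implementing"]
    return idea["credibility"] * (run - skip)

# Hypothetical test ideas scored on a 0-10 scale; credibility uses Gilb's 0.0-1.0 scale.
ideas = [
    {"name": "login smoke check", "benefit_of_implementing": 9,
     "consequence_of_implementing": 1, "benefit_of_not_implementing": 1,
     "consequence_of_not_implementing": 8, "credibility": 0.7},
    {"name": "rare-locale rendering", "benefit_of_implementing": 3,
     "consequence_of_implementing": 4, "benefit_of_not_implementing": 2,
     "consequence_of_not_implementing": 1, "credibility": 0.2},
]

for idea in sorted(ideas, key=triage_score, reverse=True):
    print(f"{idea['name']}: {triage_score(idea):.1f}")
# login smoke check: 10.5
# rare-locale rendering: -0.4
```

Sorting by this score gives a defensible "test this next" order; low-credibility guesses are automatically damped rather than dominating the plan.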

33 How to Decide? Rank Credibility
(Understanding Complex Technology Quantitatively, by Tom Gilb)
0.0 Wild guess, no credibility
0.1 We know it has been done somewhere
0.2 We have one measurement somewhere
0.3 There are several measurements in the estimated range
0.4 The measurements are relevant to our case
0.5 The method of measurement is considered reliable
0.6 We have used the method in-house
0.7 We have reliable measurements in-house
0.8 Reliable in-house measurements correlate to independent external measurements
0.9 We have used the idea on this project and measured it
1.0 Perfect credibility: we have rock-solid, contract-guaranteed, long-term, credible experience with this idea on this project, and the results are unlikely to disappear

34 Which test? Test Idea Rejection: What If?
If the cost/benefit does not make business sense, then consider implementing:
part of the test: could that lead to part of the benefit at a more reasonable cost?
more than the stated test: would that generate more benefit?
a different test than the stated idea: could that generate more benefit for less cost?

35 Test Triage
JIT projects: high frequency.
Daily test triage session; experience dictates the timing:
early AM (Rob's preference)
late PM (several clients)

36 Test Triage Meeting
Review context: business, technical.
Information since the last triage: test results, bug results, new testing ideas.

37 Test Triage: Allocate Testing Assignments to Testers
Make sure testers know the context.
The best thing to test, the best person to test it, the best people to explore it, the best lead.
Are subject-matter experts required?

38 Test Triage
Requirement triage, change control, test triage and bug flow combined are equivalent to a CCB (change control board).
Few people. Fluid.

39 Test Triage: Life of a Test Idea
Comes into existence. Clarified. Prioritized:
test now (before further testing)
test before shipping
nice to have
may be of interest in some future release
not of interest in current form
will never be of interest
Integrated into a testing objective.

40 Which test is next? Questions
Given the state of the project, the state of the business, the state of the technology, our abilities, our experience and our history, what we know and what we do not know: what should we test next?
How much effort are we willing to spend continuing to test this project?
Can we ship yet?

41 Which test is next? Magic crystal ball
If it existed, how would you use it?
What question would you ask? What question would it ask?

42 Deciding what not to test
Time pressure: should we skip a test?
If the test failed, could the system still be of value to some stakeholder?
If the test were skipped, could important bugs have been found some other way?

43 Guidelines and Decisions
For each stakeholder, consider:
the risk of failure
the consequence of failure
the value of success
How much certainty do we have? Is it a wild guess or an absolute truth?

44 Under Pressure
(5) The last best build

45 Under Pressure
Always know the last best build. Rank recent builds.
If forced to ship at a fixed date, I often let project stakeholders trade off between recent builds.
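Keeping the "last best build" known at all times can be as simple as ranking recent builds by their test results. The build records and ranking criteria below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical build records; a real team would pull these from its test results.
builds = [
    {"id": "build-101", "smoke_passed": True,  "open_blockers": 0, "tests_passed": 412},
    {"id": "build-102", "smoke_passed": True,  "open_blockers": 2, "tests_passed": 430},
    {"id": "build-103", "smoke_passed": False, "open_blockers": 5, "tests_passed": 0},
]

def rank_key(build: dict) -> tuple:
    """A failed smoke test disqualifies a build; then fewer blockers, then more passes."""
    return (build["smoke_passed"], -build["open_blockers"], build["tests_passed"])

last_best = max(builds, key=rank_key)
print(last_best["id"])  # build-101
```

With a ranking like this, the stakeholder trade-off the slide describes becomes concrete: ship build-102 with two known blockers, or fall back to build-101 with fewer features proven but nothing blocking.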

46 Under Pressure
The right stuff

47 Testing Under Pressure
So what exactly did they throw over the wall?

48 Getting Things Done
Adapt to change: revised risks? new test objectives?
Triage testing: what to test? what not to test?
Periodically prioritize bugs: what to fix? what not to fix?
Track progress: what do we know so far? what don't we know yet?
Smoke test: should the new build be tested at all? On failure, continue with the previous build in test.
FAST test: each testable object has a simple test. Is the testable object stable enough to test?
Regression test: does the application still work as before? Did we accidentally break something?
Confirmation test: have bugs really been fixed? Double-check!
Stress test: how well does the application behave in harsh conditions? Experiment.

49 Smoke Testing
A smoke test is run on a new build of software to make sure all functions operate well enough to continue testing.
“Turn on a new appliance at the store.”

50 FAST Testing: Functional Acceptance Simple Tests
Wide in breadth, low in depth.
Exercise every function of the application at least once.

51 Regression Testing
Previously executed tests are re-executed against a new version of the application.
Have code changes broken something that used to work? Have we introduced new defects?

52 Confirmation Testing
Typically, the tester confirms that the fixed bug is really fixed in the appropriate software build.

53 Stress Testing
Testing the operational characteristics of the application within a harshly constrained environment:
limit the processor, limit memory, limit disk space, diminish access to shared resources.

54 Under Pressure
Chartered session-based testing

55 C. Northcote Parkinson
Parkinson’s Law: “…work expands so as to fill the time available for its completion…”

56 Just-In-Time Testing: Exploratory Testing

57 Exploratory Testing
James Bach (www.satisfice.com)
General Functionality and Stability Test Procedure for Windows 2000 Application Certification

58 Mandate to Explore: William Clark and Meriwether Lewis
“The object of your mission is to explore the Missouri river, & such principle streams of it, as, by its course and communication with the waters of the Pacific ocean...may offer the most direct & practicable water communication across this continent for the purposes of commerce.” (Thomas Jefferson's letter to Meriwether Lewis, June 1803)

59 Make Intelligent Decisions
Take notes about your decisions. Map out where you have been. Others can use the result.

60 Chart as you explore
Further exploration yields a good idea of the state of the world! One bit at a time.

61 Exploration Notes
Tabular, chronological, schematic, point form, concise.

62 Exploratory Testing
Test cases: not known in advance; defined and executed “on the fly” while you learn about the product.
Map-making skills: a consistent note-taking style, and practice.

63 Exploratory Testing
During a test we must capture:
the function, options or sub-functions being explored
test cases attempted
comments, notes, images or attachments
hints, reminders and observations which may be useful to future testers
date, platform, build or configuration under test
the name of the person running the test
oracles: the “strategy to assess correctness”
other relevant details
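A session note capturing the fields listed above might be modelled as a small record type. The class and field names below are illustrative, not from the deck.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SessionNote:
    """One exploratory-session record; fields mirror the capture list on the slide.

    The class and field names are this sketch's assumptions, not the author's tool.
    """
    charter: str          # function, option or sub-function being explored
    tester: str           # name of the person running the test
    build: str            # platform/build/configuration under test
    oracle: str           # the "strategy to assess correctness"
    run_date: date = field(default_factory=date.today)
    tests_attempted: list = field(default_factory=list)
    observations: list = field(default_factory=list)  # hints, reminders, attachments

note = SessionNote(
    charter="Export to CSV: options and sub-functions",
    tester="R. Tester",
    build="build-102 / Windows",
    oracle="compare exported row counts against the on-screen grid",
)
note.tests_attempted.append("export a 0-row table")
note.observations.append("empty export writes a header-only file; is that intended?")
print(len(note.tests_attempted))  # 1
```

Structured notes like this are what make the "map making" on the next slides possible: sessions can be merged, searched, and handed to the next tester.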

64 An Exploratory Test Process
Confirm test objective: ensure the context is known, ensure hardware and software are OK, all tools available.
Kick off: a chunk of 90 to 120 minutes; test, plan, discover; prepare; run.
Wrap up: collect all notes and data; complete.
Review: review results with the test lead.
Follow up: reassess goals; piece together the map.

65 Finished?
How do you know you are finished?

66 You know you are finished when …
… the only bugs left are the ones that are acceptable (based on your objective test team input) …

67 You know you are finished when …
… the only bugs left are the ones that are acceptable (based on your objective test team input) …
At least for now!

68 Bottom Line
My experience is that it is better to omit a test on purpose than to skip it because you ran out of time or forgot about it!
Systematically collecting, evaluating and triaging testing ideas helps me decide what not to test, at least for now.

69 Thank You
Questions?

