User Acceptance Testing The Hard Way
Graham Thomas
BCS SIGIST, 10th May 1996
2 CONTENTS
- Background
- Test Method
- Test Environment
- Test Execution
- Implementation
- Measures of Success
- Lessons Learnt
3 BACKGROUND
- The Project
- Project Structure
- The Environment
- Start Point
4 The Project
- Link 3 computer systems:
  - Sales & Marketing
  - Registration
  - Billing
- In 3 separate business areas
- With 3 different product lifecycles
- Supported by 3 competing suppliers
5 The Environment
6 Project Structure
7 Start Point
8 TEST METHOD
- Method
- Test Planning
- Test Analysis
- Test Scripting
- Data Definition
9 Method
10 Test Planning
- Plans
  - Pre-determined end date
  - Stress & volume testing required
  - Re-usable test environment to be built
  - Users want to see bills produced
- Resources
  - 2 testers for 10 weeks
  - 1 strategist for 10 days
11 Test Planning (2)
- Proposed Strategy
  - Structured testing, driven by the User Requirements Spec.
  - Involve User Representatives
  - Data Tidy & User Procedures to be in place for test execution
  - Build a regression test environment
  - Extra time required
  - Additional resource required
12 Test Analysis
- Requirements Spec
  - A technical document
  - Not understood by users
  - Not understood by testers
- Technical Design Specs
  - Written by individual suppliers
  - Difficult to interpret without access to system design docs.
13 Test Analysis (2)
- Requirements Spec rewritten in plain English
- 200+ requirements extracted (see the traceability sketch below)
- Business scenarios developed in workshops
- Business scenarios reviewed by suppliers
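How requirement coverage was tracked is not described in the deck. Purely as an illustration, the sketch below shows a minimal traceability check in Python: each business scenario is mapped to the requirements it exercises, and any requirement left uncovered is flagged. All requirement IDs, scenario names and counts are hypothetical.

```python
# Minimal traceability check: which requirements are exercised by which
# business scenarios, and which are not covered at all.
# Requirement IDs and scenario names are illustrative only.

scenarios = {
    "New customer signs up and receives first bill": ["REQ-001", "REQ-014", "REQ-027"],
    "Existing customer changes tariff mid-cycle":    ["REQ-014", "REQ-032"],
    "Customer cancels registration before billing":  ["REQ-001", "REQ-045"],
}

# Stand-in for the 200+ extracted requirements.
all_requirements = {f"REQ-{n:03d}" for n in range(1, 51)}

covered = {req for reqs in scenarios.values() for req in reqs}
uncovered = sorted(all_requirements - covered)

for name, reqs in scenarios.items():
    print(f"{name}: covers {len(reqs)} requirement(s)")
print(f"Coverage: {len(covered)}/{len(all_requirements)} requirements; "
      f"{len(uncovered)} uncovered, e.g. {uncovered[:3]}")
```

With 200+ requirements and only two testers, a cross-check of this kind is what makes "satisfy more than one requirement with a single script" a deliberate decision rather than an accident.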
14 Test Scripting
- Legacy systems lacked design documentation
- Design documentation for enhancements not delivered
- No one knew how all three systems would interface
- Management only interested in the number of scripts, not their content
15 Test Scripting (2)
- Management view that the Test Team could not 'cut the mustard'
- Suppliers' view that only they could test their own systems
- Brought members of the suppliers' development teams on board
- Suppliers not paid until completion of testing
16 Data Definition
- User Representatives limited their involvement to a review capacity
- Pragmatic decisions taken to:
  - Generate test data from the limited set supplied by User Reps (see the sketch below)
  - Satisfy more than one requirement with a single script
- Reported this as a risk to the Project Board
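The deck does not say how the test data was expanded. The sketch below is one plausible approach, combining a small seed set from the User Reps with product and billing-cycle variations so that a single script can touch several requirements. The field names and values are invented for the example.

```python
from itertools import product

# Hypothetical seed records, as a User Rep might supply them.
seed_customers = [
    {"name": "A N Other", "segment": "residential"},
    {"name": "Acme Ltd",  "segment": "business"},
]

# Variations combined with each seed record to widen coverage.
products = ["standard", "premium"]
billing_cycles = ["monthly", "quarterly"]

test_data = [
    {**cust, "product": prod, "billing_cycle": cycle}
    for cust, prod, cycle in product(seed_customers, products, billing_cycles)
]

print(f"{len(seed_customers)} seed records expanded to {len(test_data)} test records")
for rec in test_data[:3]:
    print(rec)
```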
17 TEST ENVIRONMENT
- Determine requirements
- Specify environment
- Then get real!
- Complete copy of production data for all three systems
- Beg, borrow and steal!
- 'Virtual Environment'
18 TEST EXECUTION
- Problems
- Problems, problems, problems...
- Resource Requirements
- Progress Monitoring
19 Problems
- Delayed by late delivery of code
- Incident reporting system required (a minimal record is sketched below)
- Test harness didn't work
- Project Board intervention required to bring User Reps back 'on side' and commit more of their time
- Changes!
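The deck only records that an incident reporting system had to be put in place, not what it looked like. As a hedged illustration, a minimal incident record with a simple status lifecycle is sketched below; the field names and severity scale are assumptions, not the project's actual scheme.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative incident record; the real system's fields are not described in the deck.
@dataclass
class Incident:
    ref: str                      # e.g. "INC-042"
    raised: date
    system: str                   # Sales & Marketing, Registration or Billing
    severity: int                 # 1 = showstopper .. 4 = cosmetic (assumed scale)
    description: str
    status: str = "open"          # open -> assigned -> fixed -> retested -> closed
    history: list = field(default_factory=list)

    def move_to(self, new_status: str) -> None:
        # Record the transition so progress on each incident can be reported.
        self.history.append((self.status, new_status, date.today()))
        self.status = new_status

inc = Incident("INC-001", date(1996, 3, 4), "Billing", 1,
               "Bill run fails when tariff is changed mid-cycle")
inc.move_to("assigned")
inc.move_to("fixed")
print(inc.ref, inc.status, len(inc.history), "transitions")
```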
20 More Problems
- Additional testers, but no accommodation, hardware or software
- Systems integration found wanting
- System not stable enough to benefit from automation tools
- Short-term planning!
21 Resource Usage
22 Progress Monitoring
23 Progress Monitoring (2)
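The progress-monitoring slides were charts in the original deck and their figures are not reproduced in this transcript. As an illustration only, the snippet below computes the kind of numbers such charts typically plot: scripts executed against plan and the pass rate per week. The weekly counts are made up.

```python
# Illustrative weekly script counts: (planned cumulative, executed, passed).
weeks = {
    1: (20, 12, 9),
    2: (45, 30, 22),
    3: (75, 48, 40),
    4: (100, 71, 60),
}

for week, (planned, executed, passed) in weeks.items():
    completion = executed / planned * 100
    pass_rate = passed / executed * 100 if executed else 0.0
    print(f"Week {week}: {executed}/{planned} scripts run "
          f"({completion:.0f}% of plan), pass rate {pass_rate:.0f}%")
```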
24 IMPLEMENTATION
- Roll-out plan
  - Three days
  - Round the clock
  - Multi-site co-ordination
- Power outage
- Tape drive failure
- Unforeseen system interaction
25 MEASURES OF SUCCESS
- Objectives met
- Suppliers' view
- Users change operating practice
- Structured releases
- Everything tested first
- Full documentation produced
27 LESSONS LEARNT
- Plan testing at project inception
- Start testing early
- Expect the worst
- Gather metrics
- Measure, Monitor & Manage
- Be prepared to change
- Testing is not development contingency!!!