User Acceptance Testing The Hard Way

1 User Acceptance Testing The Hard Way
Graham Thomas, BCS SIGIST, 10th May 1996

Synopsis: The aim of this talk is to share with the SIGIST direct experience of User Acceptance Testing. Sound planning, structured testing, following the V model, introducing traceability throughout the lifecycle and gathering metrics are not enough. You actually have to do the testing and deliver as near to a fault-free product as possible. Unfortunately nothing ever goes smoothly, and this talk will identify where in User Acceptance Testing problems can occur, what they might be, and how to resolve some of them.

So why 'The Hard Way'? 'Because if it was easy someone would have already done it!'

2 CONTENTS
Background
Test Method
Test Environment
Test Execution
Implementation
Measures of Success
Lessons Learnt

3 BACKGROUND
The Project
Project Structure
The Environment
Start Point

4 The Project
Link 3 computer systems: Sales & Marketing, Registration, Billing
In 3 separate business areas
With 3 different product lifecycles
Supported by 3 competing suppliers

5 The Environment Three separate computer systems, in three different locations, on three different hardware platforms, with three different software sets, supported by three different support groups. Each owned by a separate Business unit. The link between the Mainframe and the UNIX environment had never been proven before.

6 Project Structure Project Board met monthly.
Project Control Group met weekly.

7 Start Point User Acceptance Testing started half way through the build phase of the project. The opportunity for early testing feedback into the Requirements, Analysis, Design and Build phases of the project was lost. We started testing late, so we were in catch-up mode.

8 TEST METHOD
Method
Test Planning
Test Analysis
Test Scripting
Data Definition

9 Method

10 Test Planning
Plans:
Pre-determined end date
Stress & volume testing required
Re-usable test environment to be built
Users want to see bills produced
Resources:
2 testers for 10 weeks
1 strategist for 10 days
(A rough person-day sum for these planned resources is sketched after this list.)
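As a rough illustration only, the planned resource figures above can be turned into a person-day total. The five-working-day week and the variable names below are assumptions for the sketch, not figures stated in the presentation.

```python
# Rough effort sum for the planned test resources (illustrative only).
# Assumes a five-day working week; names and structure are made up for the sketch.

TESTERS = 2
TESTER_WEEKS = 10
STRATEGIST_DAYS = 10
DAYS_PER_WEEK = 5  # assumed working days per week

tester_days = TESTERS * TESTER_WEEKS * DAYS_PER_WEEK
total_days = tester_days + STRATEGIST_DAYS

print(f"Planned testing effort: {tester_days} tester-days "
      f"+ {STRATEGIST_DAYS} strategist-days = {total_days} person-days")
# Planned testing effort: 100 tester-days + 10 strategist-days = 110 person-days
```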

11 Test Planning (2)
Proposed Strategy:
Structured testing, driven by the User Requirements Spec.
Involve User Representatives
Data Tidy & User Procedures to be in place for test execution
Build a regression test environment
Extra time required
Additional resource required

12 Test Analysis
Requirements Spec:
A technical document
Not understood by users
Not understood by testers
Technical Design Specs:
Written by individual suppliers
Difficult to interpret without access to system design docs.

13 Test Analysis (2)
Requirements Spec rewritten in English
200+ requirements extracted
Workshopped business scenarios
Business scenarios reviewed by suppliers
(A simple traceability sketch for mapping requirements to scenarios follows this list.)
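The presentation does not say how the 200+ extracted requirements were tracked against the workshopped business scenarios, but a minimal traceability check might look like the sketch below. The identifiers and data are invented for illustration and are not from the project.

```python
# Minimal requirements-to-scenario traceability sketch (illustrative data only).
# Each business scenario lists the requirement IDs it is intended to exercise.

scenarios = {
    "New customer registered and billed": ["REQ-001", "REQ-014", "REQ-027"],
    "Existing customer changes tariff":   ["REQ-014", "REQ-035"],
    "Sales lead converted to order":      ["REQ-002"],
}

# The full set of requirements extracted from the rewritten spec.
requirements = {"REQ-001", "REQ-002", "REQ-014", "REQ-027", "REQ-035", "REQ-036"}

covered = {req for reqs in scenarios.values() for req in reqs}
uncovered = sorted(requirements - covered)

print(f"Covered: {len(covered)} of {len(requirements)} requirements")
print("Not yet covered by any scenario:", ", ".join(uncovered))
```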

14 Test Scripting
Legacy systems lacked design documentation
Design documentation for enhancements not delivered
No one had knowledge of how all three systems would interface
Management only interested in the number of scripts, not their content
Test scripting was a slow process; it took about 4 weeks to understand the task and to begin estimating the remaining work accurately.

15 Test Scripting (2)
Management view that the Test Team could not 'cut the mustard'
Suppliers' view that only they could test their systems
Brought members of suppliers' development teams on board
Suppliers not paid until completion of testing

16 Data Definition
User Representatives limited their involvement to a review capacity
Pragmatic decisions taken to:
Generate test data from the limited set supplied by User Reps.
Satisfy more than one requirement with a single script
Reported this as a risk through to the Project Board
Due to the workload within the user departments, the user representatives kept their level of support to a review capacity just at the point when the test team required active involvement in identifying test data. (A small sketch of expanding a limited seed set into test data variations follows.)
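The slides do not describe how the limited data set was expanded, but one common approach is to generate systematic variations from a small seed set. The sketch below is a hypothetical illustration along those lines; the field names and values are assumptions, not project data.

```python
# Hypothetical sketch: expanding a small seed set of customer records into
# test data variations by combining them with the product and billing options
# a scenario needs. Field names and values are invented for illustration.
from itertools import product

seed_customers = [
    {"name": "A. Smith", "region": "North"},
    {"name": "B. Jones", "region": "South"},
]
tariffs = ["standard", "business"]
billing_cycles = ["monthly", "quarterly"]

test_cases = [
    {**customer, "tariff": tariff, "billing": cycle}
    for customer, tariff, cycle in product(seed_customers, tariffs, billing_cycles)
]

print(f"{len(seed_customers)} seed records expanded into {len(test_cases)} cases")
for case in test_cases[:3]:
    print(case)
```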

17 TEST ENVIRONMENT
Determine requirements
Specify environment
Then get real!
Complete copy of production data for all three systems
Beg, borrow and steal!
'Virtual Environment'
Circumstance and timescale dictated that the project could not incur the overhead or cost on the mainframe system to produce a cut-down acceptance testing environment, so we were forced to use a copy of production. All 19GB. This meant using production volumes on all three systems, and therefore finding three production-size environments. A tall order.

18 TEST EXECUTION
Problems, problems, problems . . .
Resource Requirements
Progress Monitoring

19 Problems
Delayed by late delivery of code
Incident Reporting System required
Test Harness didn't work
Project Board intervention required to bring User Reps. back 'on side' and commit more of their time
Changes!
We needed an incident reporting system that could cater for forwarding problems from supplier to supplier to find a resolution; a sketch of such a record follows. As with all systems, faults bring the inevitable changes, and changes mean re-testing.
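The presentation only says that incidents had to be forwarded from supplier to supplier until one of them could resolve them. As a hedged illustration of what such a record might carry, here is a minimal sketch; the suppliers, fields and values are invented, not taken from the project.

```python
# Minimal sketch of an incident record that tracks its routing between
# suppliers. All names, fields and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Incident:
    ref: str
    summary: str
    current_supplier: str
    history: list = field(default_factory=list)

    def forward(self, to_supplier: str, reason: str) -> None:
        """Record a hand-off to another supplier with the reason given."""
        self.history.append((self.current_supplier, to_supplier, reason))
        self.current_supplier = to_supplier

incident = Incident("INC-042", "Bill not produced after registration", "Billing supplier")
incident.forward("Registration supplier", "Record never reached billing interface")
incident.forward("Sales & Marketing supplier", "Order missing mandatory field at source")

for frm, to, reason in incident.history:
    print(f"{incident.ref}: {frm} -> {to} ({reason})")
```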

20 More Problems
Additional testers, but no accommodation, hardware or software for them
Systems Integration found wanting
System not stable enough to benefit from automation tools
Short-term planning!
It was evident that the systems had not been successfully integrated by the suppliers in their integration testing, because of the high number of showstopping faults being experienced: several per day with new releases.

21 Resource Usage In December we actually used more testing resource than the original estimate for the whole project. This was causing a significant overrun on project costs!

22 Progress Monitoring The existing systems brought a legacy of unsolved problems to be waded through before we could get to the enhancements. This analysis clearly showed which systems we suspected had further faults to be found.

23 Progress Monitoring (2)
This was the rate at which we were detecting errors throughout the project. These graphs always require interpretation. Initial progress was slow because of showstopping integration issues. Error detection then increased dramatically through a combination of new code releases and an increase in the amount of effort expended on testing. Why did it drop off at the end? Was this because we did less testing over Christmas? (A small sketch of computing such a detection rate is given below.)
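The graph itself is not reproduced in the transcript, but the detection-rate measure behind it can be approximated by counting incidents raised per week. The sketch below is an assumed illustration with invented dates; it is not the project's data.

```python
# Illustrative sketch: deriving a weekly error-detection rate from the dates
# on which incidents were raised. Dates are invented for the example.
from collections import Counter
from datetime import date

incident_dates = [
    date(1995, 11, 6), date(1995, 11, 8), date(1995, 11, 9),
    date(1995, 11, 15), date(1995, 11, 16), date(1995, 11, 17),
    date(1995, 12, 4), date(1995, 12, 5),
    date(1995, 12, 27),  # quiet week over Christmas
]

# Group by (ISO year, ISO week) and count incidents detected in each week.
per_week = Counter(d.isocalendar()[:2] for d in incident_dates)

for (year, week), count in sorted(per_week.items()):
    print(f"{year} week {week:02d}: {count} incidents detected")
```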

24 IMPLEMENTATION
Roll-out plan: three days round the clock, multi-site co-ordination
Power outage
Tape drive failure
Unforeseen system interaction
If I were a conspiracy theorist I would have had plenty of grounds, but I am not; I just think it happens that way.

25 MEASURES OF SUCCESS
Objectives met
Suppliers' view
Users change operating practice
Structured releases
Everything tested first
Full documentation produced

26 The spirit of innocence in which we set off testing on the project.

27 LESSONS LEARNT
Plan testing at project inception
Start testing early
Expect the worst
Gather metrics
Measure, Monitor & Manage
Be prepared to change
Testing is not Development contingency!!!

