1
Sharing the good, the bad, the ugly, and what can we do about it?
GUI Test Automation
Pierre Veragen
To the audience: How many are using GUI test automation today? How many feel good about the results? How many tried it and have given up?
2
What are we going to cover today?
- Why Consider Test Automation?
- Test Automation -- an Investment
- The Different Sides of Test Automation
- Forces Affecting Test Automation
- Three Case Studies
- The “No-Record-Playback” Approach
- An Iterative Approach to Implementation
- Be Selective
- The Deadliest Pitfalls
- Review
- Q&A
3
Why Consider Test Automation?
Our world today:
- Frequent changes
- Frequent builds
- Frequent deliveries to the customer
- “To catch bugs early, we must test often”
What test automation can bring:
- A high-speed cycle: code-test-analyze-fix-test…
- Better testing, if designed right
- Repetitive, reliable testing
- Good coverage for data-oriented and/or repetitive verification tasks
4
Test Automation -- an Investment 1/2
Cost:
- Licenses
- Training
- Consulting
- Building the library
- Writing scripts
- The main cost: MAINTENANCE (95%)
Benefits:
- Catch bugs early, when they happen… get the “Duh” effect
- Get bug ownership
- Frees QA resources for other tasks
- Forces us to write down what we actually test
- Can expand on it
- Can avoid the biggest embarrassment
5
Test Automation -- an Investment 2/2
Our experience:
- App A: 210,000 lines of code
- App B: 37,000 lines of code
- App B again, with a different test tool: ??? lines of code
(8,000 LOC are shared by the two applications.)
The easy traps (all a bunch of fairy dust):
- “Volume is quality…”
- “Record now, clean up later”
- “Don’t build/run automated tests now because the GUI is changing”
- “Automated testing is enough”
6
The Different Sides of Test Automation
| Component type | Main test tool | Target code coverage | Notes |
|---|---|---|---|
| Methods, classes, components | Unit test (nUnit) | High | Test-Driven Design |
| API | API driver: code in C++, .NET, VB6… | High when combined with unit tests, but coverage is not what drives the test | Test scenarios and test cases are driven by the expected usage of the system (use cases…) |
| GUI | GUI test tool | (see note below) | Do not try to test components through a GUI |
In all cases, use code coverage to pinpoint weaknesses in the testing.
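The table pairs unit testing with nUnit. Purely as an illustrative sketch in Python’s analogous xUnit framework (the standard-library unittest; the PriceCalculator component is invented), this is what testing a component directly, rather than through the GUI, looks like:

```python
import unittest

class PriceCalculator:
    """Invented component under test; stands in for any business class."""
    def total(self, unit_price, quantity):
        if quantity < 0:
            raise ValueError("quantity must be non-negative")
        return unit_price * quantity

class PriceCalculatorTest(unittest.TestCase):
    # The component is exercised directly; no GUI is involved.
    def test_total_multiplies_price_by_quantity(self):
        self.assertEqual(PriceCalculator().total(2.50, 4), 10.0)

    def test_negative_quantity_is_rejected(self):
        with self.assertRaises(ValueError):
            PriceCalculator().total(2.50, -1)

if __name__ == "__main__":
    unittest.main()
```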
7
Forces Affecting Test Automation
“We may not know the details of future changes, but we can guess where it is going to change.”
- Control details will change: attach names, enabled/visible states, list contents…
- GUI behavior will change: added message boxes, multilingual versions, new forms…
- Different OS, machine speed, network speed, file size
- Requirement updates
- …and we make errors too
See the handout for design strategies.
8
Case Study 1: “Record & Playback”
Also called Capture & Replay.
Demo 1:
- Demonstrate brittleness
- Demonstrate code duplication
- Demonstrate maintainability issues
Code duplication (set window, control names) is like hard-coding values all over your code.
9
Case Study 1: “Record & Playback”
The reality (see the sketch below):
- The AUT is always on the move.
- Every time you record, you duplicate code (buy now, pay later…).
- Every time you duplicate code, you increase your maintenance cost.
- You’ll have to maintain it in hundreds of places.
- Pretty soon we will record again to add clicking on a message box… in hundreds of places.
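To make the duplication concrete, here is a hypothetical sketch of what recorder output looks like (Python-style; attach_window, click, type_text, and all window and control names are invented stand-ins for a generic GUI test tool’s API):

```python
# Minimal stubs standing in for a generic GUI test tool's API (hypothetical
# names; a real tool would provide these). Here they only print the action.
def attach_window(title): print(f"attach window {title!r}")
def click(control): print(f"click {control!r}")
def type_text(control, text): print(f"type {text!r} into {control!r}")

# Typical recorder output: every window title and control name is hard-coded
# inline, and the same details are duplicated in every other recorded script.
def test_save_order_recorded():
    attach_window("Order Entry - MainFrame v2.3")  # breaks when the title changes
    click("btnNew")
    type_text("txtCustomerName", "ACME Corp")
    click("btnSave")
    attach_window("Confirmation")  # a message box added later means
    click("btnOK")                 # re-recording in hundreds of places

test_save_order_recorded()
```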
10
“Record & Playback”
11
Case Study 2: “Encapsulate Recorded Scripts”
Demo 2: two methods to limit code duplication (see the sketch below):
- Code approach: think in terms of GUI functionality; the drawback is that scripters can’t keep up with method names and arguments.
- DB approach: move the control-name details to a database.
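A minimal sketch of both methods in Python (every name is invented for illustration): GUI actions are wrapped once in intent-revealing functions, and control identifiers live in a single lookup table, a dict standing in here for the slide’s database:

```python
# The "DB approach": control identifiers live in one lookup table (a dict
# here, standing in for a database), not inside every script.
CONTROLS = {
    "order.new":      ("Order Entry", "btnNew"),
    "order.customer": ("Order Entry", "txtCustomerName"),
    "order.save":     ("Order Entry", "btnSave"),
}

def click(key):
    window, control = CONTROLS[key]  # resolve the physical name in one place
    print(f"click {control!r} in {window!r}")  # a real tool call would go here

def type_text(key, text):
    window, control = CONTROLS[key]
    print(f"type {text!r} into {control!r} in {window!r}")

# The "code approach": scripts express GUI *functionality*, not raw controls.
def create_order(customer):
    click("order.new")
    type_text("order.customer", customer)
    click("order.save")

create_order("ACME Corp")
```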
12
“Encapsulate Recorded Scripts”
13
Case Study 3: “Development Project”
- Demo of an existing system
- Demo: writing a script
14
“Development Project”
15
“Development Project” Business Requirements
- Mimic the user’s actions
- Provide a scripter façade (sketched below)
- Encapsulate “attach” functionality
- Design for maintainability and scalability
- Provide timely error reporting
- Provide database-driven tests for data and behavioral verification
- Provide a framework to extend the test tool’s functionality
A few good coding practices: encapsulate; no code duplication.
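As a hedged illustration of the scripter-façade requirement (Python; the class, control names, and window title are all hypothetical, not from the presentation), the façade exposes user-level actions and keeps attach logic, control names, and error reporting in one place:

```python
import time

class ScripterFacade:
    """Hypothetical façade: scripters call user-level actions, while attach
    logic, control names, and error reporting live behind this one class."""

    CONTROLS = {"customer": "txtCustomerName", "save": "btnSave"}

    def __init__(self, window_title):
        self.window_title = window_title

    def _do(self, action, control_key):
        # Attach, act, and report errors in exactly one place.
        try:
            print(f"attach {self.window_title!r}")
            print(f"{action} {self.CONTROLS[control_key]!r}")
        except Exception as exc:
            print(f"[{time.strftime('%H:%M:%S')}] FAILED {action}: {exc}")
            raise

    def enter_customer(self, name):
        self._do(f"type {name!r} into", "customer")

    def save_order(self):
        self._do("click", "save")

app = ScripterFacade("Order Entry")
app.enter_customer("ACME Corp")
app.save_order()
```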
16
“No-Record-Playback” Approach
The golden rules:
- Record only to understand how the test tool thinks and how it differentiates one control from another. Then DELETE IT.
- Never duplicate code.
- Choose your testing goals wisely.
The benefits:
- Fully encapsulated code: one place for the click code, one place for the actual control name…
- Easy to upgrade and expand, if well designed.
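One way the database-driven tests named in the business requirements could look (a sketch only; the data rows are inline stand-ins for records pulled from a real database, and order_total fakes driving the AUT and reading back the result):

```python
# Data rows are inline stand-ins for records read from a real database.
TEST_DATA = [
    # (customer, quantity, expected_total)
    ("ACME Corp",   4, 10.0),
    ("Widgets Inc", 0, 0.0),
]

def order_total(quantity, unit_price=2.50):
    """Fake stand-in for driving the AUT and reading back the shown total."""
    return unit_price * quantity

def run_data_driven_tests():
    for customer, quantity, expected in TEST_DATA:
        actual = order_total(quantity)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status} {customer}: quantity={quantity} "
              f"expected={expected} actual={actual}")

run_data_driven_tests()
```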
17
Iterative Approach to Implementation
1. List what needs to be tested: test scenarios at a high level, using an organized thought process (risk, use cases, requirements…).
2. Earmark what could be done through test automation.
3. Prioritize.
4. Set some reasonable goals (like a build smoke test) to validate the automation approach (tool, encapsulation techniques, library design…).
5. After the first script is running, reconsider your solution.
18
Be Selective in What to Automate 1/2
Automate tests that are:
- Repetitive to the point of boredom
- Where we spend most of the time manually
- Test scenarios that would catch issues as they happen
- Suitable for unattended testing
19
Be Selective… 2/2
- Automating a test takes 2 to 5 times the time it takes to perform it manually.
- Some tests are nearly impossible to do by hand.
- Expect to run a script 5 to 20 times before it becomes fully stable.
- Build script stability with daily builds, i.e. when instability has little effect on the project schedule.
20
The Deadliest Pitfalls
- Using Record & Playback
- Having no programming experience in the team
- Treating it as a back-burner project
- Ignoring the fact that you are building a framework
- “The test passes because the script ran without errors”
- Sloppy baselining practices
21
Review for Success
- Consider building this test automation framework as a development project (requirements, architecture…) with a development team (SME, automation engineer, and tester).
- Think reusable components.
- Pick a small initial goal, like automating the smoke test, then a basic regression system test.
- Do not attempt to automate everything; think ROI.
- Manage timers and synchronization issues in an encapsulated way (see the sketch below).
- Test your test tool.
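A minimal sketch of encapsulated synchronization (Python; the helper name and timings are assumptions, not from the presentation): every script waits through one polling helper instead of scattering sleeps, so timing policy changes in one place:

```python
import time

def wait_until(predicate, timeout=10.0, interval=0.25, description="condition"):
    """Poll until the predicate holds or the timeout expires. Keeping all
    waiting here means timing policy changes in one place, not per script."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return
        time.sleep(interval)
    raise TimeoutError(f"timed out after {timeout}s waiting for {description}")

# Usage sketch: the predicate is a stand-in; a real script would ask the AUT
# whether a window exists or a control is enabled.
start = time.monotonic()
wait_until(lambda: time.monotonic() - start > 1.0,
           timeout=5.0, description="demo condition")
print("condition met")
```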
22
Review
- Think maintainability, as the AUT will change.
- Think reusable components (script / library).
- Think scalability: use a library architecture to adapt easily to other applications under test and other environments.
- Get a developer involved from the get-go.
- Address synchronization problems early on.
- Balance the cool and the productive features of the test tool.
- Build a balanced team: SME, automation engineer, and tester.
23
The End
More material:
- Expected app changes and what to do about them
- Functional requirements built on experience
- Web sites: Fit and FitNesse…