1
From 3 weeks to 30 minutes – a journey through the ups and downs of test automation
2
Who am I?
Peter Thomas – Chief Software Engineer – Operations IT, UBS Investment Bank
Developer (mostly)
I do some architecture
I have done Testing
I talk a lot (Mentor/Coach)
From the dark side (consulting) but getting better
3
Where did we start?
Existing mainframe legacy application
3 week manual PPT testing cycle
12 week delivery cycle
4
What did we want to do?
Belief there was a better way to deliver software
Incremental development to deliver business value quickly
Address the rapidly changing business landscape with flexibility in delivery
Build quality into the solutions
Deliver the software rapidly, but in a cost-effective manner
Put the fun back into software delivery
5
New York, London, Kiev, Hyderabad, Hong Kong
2M trades per day
100 billion settling per day in all major currencies
50+ exchanges across EMEA and APAC
15 scrum teams / 120 people
9 applications
Production releases every 2 weeks
6
New York, London, Kiev, Hyderabad, Hong Kong
200 commits per day
1000 artefacts updated per day
1 commit every 5 minutes at peak
7
New York, London, Kiev, Hyderabad, Hong Kong
24 Build Targets
60+ Test Targets
800 Automated Functional Tests
10,000 Unit/Integration Tests
7,000 Behavioural Tests
8
But……..
9
Our tests were…
Complicated
Obscure
Random failures
Slow to run
Difficult to fix
10
“The TDD rut”
Complicated
Obscure
Random failures
Slow to run
Difficult to fix
11
Test the Right Thing and Test the Thing Right When all you have is a hammer, everything looks like a nail
12
Why do you test?
13
Because TDD tells me so? Because (insert favourite method here) says I should? So I meet the 80% coverage metric?
14
Why do you test?
To accept the solution
To understand and document the solution
To prove it’s not broken
To find the unknown unknowns
To help us design and build new features
To help us explore what is really needed
To show it won’t crash under load, to show it is secure (to test the ‘ilities’)
…?
15
Why do you test? The Agile Testing Quadrants – Lisa Crispin, Janet Gregory
16
Testing Purposefully
17
The Right Thing At The Right Level: Unit | Component | System
18
The Right Thing At The Right Level: Unit | Component | System
Unit tests:
Test a single class with no dependencies
If dependencies such as a Spring context or a database are involved, the tests are better called unit-integration tests
Test technical correctness and robustness
Very specific: a failing test points to an issue in one specific class
Difficult to perform on poor-quality code
Very fast to run; should run on the developer’s desktop in the IDE
19
The Right Thing At The Right Level: Unit | Component | System
Component tests:
Test a group of components integrated to perform a business-relevant function
Can test technical or business correctness, but should be expressed in domain concepts
Specific: a failing test indicates problems in that component
Easier to perform on poor-quality code, provided component boundaries are clear
Can be quick to run; don’t need the full application; should run on the developer’s desktop
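By way of illustration only (this is not from the original deck): a minimal component-level test in JUnit 4. InMemoryReferenceData, TradeEnricher and Trade are invented names; the point is that the expectation is stated in domain terms and the test runs without the full application.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class TradeEnrichmentComponentTest {

    @Test
    public void enrichesBookedTradeWithSettlementInstructions() {
        // In-memory stub instead of the real reference-data service
        // (all class names here are hypothetical).
        InMemoryReferenceData referenceData = new InMemoryReferenceData();
        referenceData.addSettlementVenue("UBS-LDN", "CREST");

        TradeEnricher enricher = new TradeEnricher(referenceData);

        Trade enriched = enricher.enrich(new Trade("VOD.L", "UBS-LDN"));

        // The assertion is expressed as a domain concept, not a technical one.
        assertEquals("CREST", enriched.getSettlementVenue());
    }
}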
20
The Right Thing At The Right Level: Unit | Component | System
System tests:
Test a system at its boundaries as a ‘black box’
Primarily test for business correctness
Not specific: a failing test could be caused anywhere in the system flow
Easy to perform on legacy applications; require little code modification
Slow to run, can be fragile, and may not run on the developer’s desktop
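A rough sketch of the black-box style (again, not from the deck): the test drives a deployed system purely through its public boundary, here an HTTP endpoint; the URL, trade id and expected status are placeholders.

import static org.junit.Assert.assertEquals;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

import org.junit.Test;

public class TradeStatusSystemTest {

    @Test
    public void acceptedTradeIsVisibleAtTheServiceBoundary() throws Exception {
        // Placeholder endpoint for a deployed test environment.
        URL url = new URL("http://test-environment.example.com/trades/T-1/status");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");

        // Only observable behaviour is asserted; nothing inside the system is inspected.
        assertEquals(200, connection.getResponseCode());
        try (BufferedReader body = new BufferedReader(
                new InputStreamReader(connection.getInputStream()))) {
            assertEquals("ACCEPTED", body.readLine());
        }
    }
}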
21
What We Wanted
22
What We Had
Unit tests which weren’t really unit tests
End-to-end tests where unit tests would have been sufficient
Duplicate and redundant end-to-end tests
23
The TDD Cycle
24
TDD?

@Test
public void shouldBeEmptyAfterCreation() {
    ReportObject aTrm = new ReportObject();
    assertNull(aTrm.getEligibleTrm());
    assertNull(aTrm.getProcessedEvent());
    assertNull(aTrm.getPayloadXml());
}

@Test
public void shouldCorrectlySetAttributesViaConstructor() {
    ReportObject aTrm = new ReportObject(eligibleObject, REPORTABLE_XML);
    assertEquals(eligibleObject, aTrm.getEligibleTrm());
    assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
}

@Test
public void shouldCorrectlySetFieldsViaSetters() {
    ReportObject aTrm = new ReportObject();
    aTrm.setEligibleTrm(eligibleObject);
    aTrm.setProcessedEvent(child);
    aTrm.setPayloadXml(REPORTABLE_XML);
    assertEquals(eligibleObject, aTrm.getEligibleTrm());
    assertEquals(child, aTrm.getProcessedEvent());
    assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
}
25
The Hollow Egg
26
98 Tests / 2.5K LOC vs. 30 Tests / 200 LOC
27
RSpec model
28
Outside In - The TDD Spiral
29
Make the Intent Clear How to achieve acceptance without showing your IDE or log file to the users
30
Unit Test Naming?
testProcessError()
whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride()
shouldAllowAnActioningWorkItemToBeUpdated()
31
Test Data Nightmare
32
What Do You Demo?
34
Executable Specification
35
Improve Testing Stability Avoiding the Broken Windows syndrome
36
Separate Progress & Regression Tests
37
Speed-up Through Parallelism
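A small sketch of class-level parallelism using JUnit 4’s ParallelComputer; the two test classes are illustrative names, and in practice the build tool (for example Surefire’s parallel settings) usually drives this rather than a main method.

import org.junit.experimental.ParallelComputer;
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;

public class ParallelTestRun {

    public static void main(String[] args) {
        // Runs independent test classes concurrently instead of one after another.
        Result result = JUnitCore.runClasses(
                ParallelComputer.classes(),   // one worker per test class
                FxRateServiceTest.class,      // illustrative test classes
                TradeValidationTest.class);

        System.out.printf("Ran %d tests, %d failures%n",
                result.getRunCount(), result.getFailureCount());
    }
}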
38
Identify Unstable Tests
39
Quarantine Unstable Tests
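A possible quarantine mechanism, again assuming JUnit 4 categories (the Quarantined marker and the suite members are illustrative): the main build excludes quarantined tests so a flaky test stops breaking the pipeline, while a separate scheduled job can still run them until they are fixed.

import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.ExcludeCategory;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;

// Marker interface for tests pulled out of the main run until they are fixed.
interface Quarantined {}

// Main-build suite: any test annotated with @Category(Quarantined.class)
// is skipped here but can still be run by a separate scheduled job.
@RunWith(Categories.class)
@ExcludeCategory(Quarantined.class)
@SuiteClasses({ TradeBookingTest.class, SettlementTest.class })
public class MainBuildSuite {}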
40
Avoid External Dependencies
41
Introduce Fakes
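An illustrative in-memory fake (the SettlementGateway interface and the SettlementProcessor class are invented for the example): the fake honours the same contract as the real gateway but keeps everything in memory, so the test needs no network and is fast and deterministic.

import static org.junit.Assert.assertTrue;

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

// Hypothetical gateway the production code talks to.
interface SettlementGateway {
    void submit(String tradeId);
    boolean isSubmitted(String tradeId);
}

// In-memory fake: same contract, no external system behind it.
class FakeSettlementGateway implements SettlementGateway {
    private final Map<String, Boolean> submitted = new HashMap<>();

    @Override public void submit(String tradeId) { submitted.put(tradeId, true); }
    @Override public boolean isSubmitted(String tradeId) {
        return submitted.getOrDefault(tradeId, false);
    }
}

public class SettlementProcessorTest {

    @Test
    public void submitsEligibleTradeToSettlement() {
        FakeSettlementGateway gateway = new FakeSettlementGateway();

        // SettlementProcessor is an illustrative production class that depends
        // only on the SettlementGateway interface, so the fake slots straight in.
        new SettlementProcessor(gateway).process("TRADE-1");

        assertTrue(gateway.isSubmitted("TRADE-1"));
    }
}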
42
Avoid Time-Dependent Tests
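One way to take wall-clock time out of the test, sketched with java.time.Clock; SettlementCalendar is an invented class that accepts a Clock instead of reading the system time internally.

import static org.junit.Assert.assertTrue;

import java.time.Clock;
import java.time.Instant;
import java.time.ZoneOffset;

import org.junit.Test;

public class ValueDateTest {

    @Test
    public void tradeBookedTodayIsNotYetDueForSettlement() {
        // Pin "now" so the test gives the same answer at 9am and at midnight.
        Clock fixedClock = Clock.fixed(Instant.parse("2015-06-01T10:00:00Z"),
                                       ZoneOffset.UTC);

        // Illustrative production class that takes a Clock as a dependency.
        SettlementCalendar calendar = new SettlementCalendar(fixedClock);

        assertTrue(calendar.isBeforeValueDate("TRADE-1"));
    }
}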
43
Test Isolation
44
Asynchronous Testing Headache
45
Don’t!
Does your test need to be asynchronous? 80/20 rule?
Create a synchronous test-runner harness
46
Asynchronous Testing using Events
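A sketch of event-driven waiting with a CountDownLatch instead of sleeping (TradeBookingService and its onTradeConfirmed callback are illustrative): the test blocks until the event arrives or a timeout expires, so it is both fast when things work and bounded when they don’t.

import static org.junit.Assert.assertTrue;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.junit.Test;

public class TradeEventTest {

    @Test
    public void publishesConfirmationEventWhenTradeIsBooked() throws Exception {
        CountDownLatch confirmed = new CountDownLatch(1);

        // Illustrative asynchronous service; the listener releases the latch
        // when the confirmation event is published.
        TradeBookingService service = new TradeBookingService();
        service.onTradeConfirmed(event -> confirmed.countDown());

        service.book("TRADE-1");

        // Wait on the event itself rather than on an arbitrary sleep.
        assertTrue("no confirmation event within 5 seconds",
                   confirmed.await(5, TimeUnit.SECONDS));
    }
}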
47
So…?
48
Treat your Tests Like you Treat your Code
“It’s just a test class” is not an excuse
Clean Code applies to tests too
49
Think about Why You are Testing
Specification tests for internal quality
Business tests for external quality
50
Think about Who You are Testing For
More people are interested in your tests than you may think
51
Zero Tolerance to Instability
“It runs OK on my machine” is not a valid response
52
Interested in a career at UBS?
peter.thomas@ubs.com
@peterrhysthomas
peterrhysthomas.wordpress.com