From 3 weeks to 30 minutes – a journey through the ups and downs of test automation
Who am I? Peter Thomas – Chief Software Engineer – Operations IT, UBS Investment Bank. Developer (mostly); I do some architecture; I have done testing; I talk a lot (mentor/coach); from the dark side (consulting), but getting better.
Where did we start? Existing mainframe legacy application; 3-week manual PPT testing cycle; 12-week delivery cycle.
What did we want to do? Belief there was a better way to deliver software; incremental development to deliver business value quickly; address the rapidly changing business landscape with flexibility in delivery; build quality into the solutions; deliver the software rapidly, but in a cost-effective manner; put the fun back into software delivery.
New York, London, Kiev, Hyderabad, Hong Kong: 2M trades per day; 100 billion settling per day in all major currencies; 50+ exchanges across EMEA and APAC; 15 scrum teams / 120 people; 9 applications; production releases every 2 weeks
New York, London, Kiev, Hyderabad, Hong Kong: 200 commits per day; 1,000 artefacts updated per day; 1 commit every 5 minutes at peak
New York, London, Kiev, Hyderabad, Hong Kong: 24 Build Targets; 60+ Test Targets; 800 Automated Functional Tests; 10,000 Unit/Integration Tests; 7,000 Behavioural Tests
But……..
Our tests were… Complicated, Obscure, Prone to random failures, Slow to run, Difficult to fix
“The TDD rut”: Complicated, Obscure, Prone to random failures, Slow to run, Difficult to fix
Test the Right Thing and Test the Thing Right When all you have is a hammer, everything looks like a nail
Why do you test?
Because TDD tells me so? Because (insert favourite method here) says I should? So I meet the 80% coverage metric?
Why do you test? To accept the solution; to understand and document the solution; to prove it’s not broken; to find the unknown unknowns; to help us design and build new features; to help us explore what is really needed; to show it won’t crash under load, to show it is secure (to test the ‘ilities’); …?
Why do you test? Agile testing Quadrants – Lisa Crispin, Janet Gregory
Testing Purposefully
The Right Thing At The Right Level – Unit | Component | System
The Right Thing At The Right Level – Unit: tests a single class with no dependencies; if dependencies such as a Spring context or a database are used, it is called a unit integration test. Tests technical correctness and robustness. Very specific: a failing test indicates an issue in a specific class. Difficult to perform on poor-quality code. Very fast to run; should run on the developer’s desktop in the IDE.
The Right Thing At The Right Level – Component: tests a group of components which are integrated to perform a business-relevant function. Can test technical or business correctness, but should be expressed in domain concepts. Specific: a failing test indicates problems in that component. Easier to perform on poor-quality code, provided component boundaries are clear. Can be quick to run; doesn’t need the full application; should run on the developer’s desktop.
The Right Thing At The Right Level – System: tests a system at its boundaries as a ‘black box’. Primarily testing for business correctness. Not specific: a failing test could be caused anywhere in the system flow. Easy to perform on legacy applications; requires little code modification. Slow to run; can be fragile; may not run on the developer’s desktop.
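As a sketch of the component level (all class names below are illustrative stand-ins, not from the original system): a small cluster of real collaborators is exercised through a domain-level entry point, with no application server, and the test is expressed in domain language.

import static org.junit.Assert.assertTrue;

import java.util.HashSet;
import java.util.Set;

import org.junit.Test;

public class ReportingEligibilityComponentTest {

    // Minimal stand-ins for a group of collaborating production classes.
    static class EligibilityRules {
        boolean isReportable(String region) { return "EMEA".equals(region); }
    }

    static class TradeBook {
        private final Set<String> emeaTrades = new HashSet<>();
        void bookEmeaTrade(String id) { emeaTrades.add(id); }
        String regionOf(String id) { return emeaTrades.contains(id) ? "EMEA" : "OTHER"; }
    }

    // The component boundary: wires real collaborators together, no full application.
    static class ReportingService {
        private final TradeBook book;
        private final EligibilityRules rules;
        ReportingService(TradeBook book, EligibilityRules rules) {
            this.book = book;
            this.rules = rules;
        }
        boolean isEligibleForReporting(String tradeId) {
            return rules.isReportable(book.regionOf(tradeId));
        }
    }

    @Test
    public void emeaTradeIsEligibleForReporting() {
        TradeBook book = new TradeBook();
        book.bookEmeaTrade("T1");

        ReportingService service = new ReportingService(book, new EligibilityRules());

        assertTrue(service.isEligibleForReporting("T1"));
    }
}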
What We Wanted
What We Had: Unit tests which weren’t really unit tests; end-to-end tests when unit tests would have been sufficient; duplicate and redundant end-to-end tests.
The TDD Cycle
@Test
public void shouldBeEmptyAfterCreation() {
    ReportObject aTrm = new ReportObject();
    assertNull(aTrm.getEligibleTrm());
    assertNull(aTrm.getProcessedEvent());
    assertNull(aTrm.getPayloadXml());
}

@Test
public void shouldCorrectlySetAttributesViaConstructor() {
    ReportObject aTrm = new ReportObject(eligibleObject, REPORTABLE_XML);
    assertEquals(eligibleObject, aTrm.getEligibleTrm());
    assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
}

@Test
public void shouldCorrectlySetFieldsViaSetters() {
    ReportObject aTrm = new ReportObject();
    aTrm.setEligibleTrm(eligibleObject);
    aTrm.setProcessedEvent(child);
    aTrm.setPayloadXml(REPORTABLE_XML);
    assertEquals(eligibleObject, aTrm.getEligibleTrm());
    assertEquals(child, aTrm.getProcessedEvent());
    assertEquals(REPORTABLE_XML, aTrm.getPayloadXml());
}
The Hollow Egg
98 Tests / 2.5K LOC – 30 Tests / 200 LOC
RSpec model
Outside In - The TDD Spiral
Make the Intent Clear How to achieve acceptance without showing your IDE or log file to the users
Unit Test Naming? testProcessError() whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride() shouldAllowAnActioningWorkItemToBeUpdated()
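A hedged sketch of where that naming progression ends up, in JUnit 4 (the WorkItem and ClientRule types are hypothetical stand-ins taken from the names above): the test name states the behaviour, so a failure reads as a broken specification rather than a broken method.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class WorkItemAssignmentTest {

    // The name describes behaviour, not implementation: given a manual
    // assignment, the client rule becomes MANUAL_OVERRIDE.
    @Test
    public void whenWorkItemIsManuallyAssignedThenClientRuleShouldBeSetToManualOverride() {
        WorkItem item = new WorkItem();

        item.assignManuallyTo("operations-user");

        assertEquals(ClientRule.MANUAL_OVERRIDE, item.getClientRule());
    }
}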
Test Data Nightmare
What Do You Demo?
Executable Specification
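One common way to turn acceptance criteria into an executable specification; the talk does not name a tool, so this sketch assumes Cucumber-JVM (io.cucumber.java.en annotations), and the step text and TradeReportingWorld helper are hypothetical:

import static org.junit.Assert.assertTrue;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

public class TradeReportingSteps {

    // Hypothetical test-world object holding the scenario state.
    private final TradeReportingWorld world = new TradeReportingWorld();

    @Given("an eligible EMEA equity trade")
    public void anEligibleEmeaEquityTrade() {
        world.bookEligibleEquityTrade();
    }

    @When("the end of day report is generated")
    public void theEndOfDayReportIsGenerated() {
        world.generateEndOfDayReport();
    }

    @Then("the trade appears on the report")
    public void theTradeAppearsOnTheReport() {
        assertTrue(world.reportContainsBookedTrade());
    }
}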
Improve Testing Stability Avoiding the Broken Windows syndrome
Separate Progress & Regression Tests
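One way to separate them, assuming JUnit 4 categories (the Progress and Regression marker interfaces and the test classes are hypothetical): new, still-stabilising tests run in a progress suite, while the stable regression suite stays green.

import org.junit.Test;
import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.IncludeCategory;
import org.junit.experimental.categories.Category;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;

public class TestSuites {

    // Marker interfaces used as JUnit categories.
    public interface Progress { }
    public interface Regression { }

    // A test for a feature still under development.
    public static class NewReportFormatTest {
        @Test
        @Category(Progress.class)
        public void rendersIsoDatesInTheNewFormat() { /* ... */ }
    }

    // A long-stable test protecting existing behaviour.
    public static class ExistingReportTest {
        @Test
        @Category(Regression.class)
        public void totalsMatchTheBookedTrades() { /* ... */ }
    }

    // Run only the stable regression tests, e.g. on every commit.
    @RunWith(Categories.class)
    @IncludeCategory(Regression.class)
    @SuiteClasses({ NewReportFormatTest.class, ExistingReportTest.class })
    public static class RegressionSuite { }
}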
Speed-up Through Parallelism
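A minimal sketch of one option, assuming JUnit 4's experimental ParallelComputer and the hypothetical test classes from the previous sketch (same package); build-tool parallelism, such as forked Surefire or Gradle workers, is the more common route in practice.

import org.junit.experimental.ParallelComputer;
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;

public class ParallelTestLauncher {

    public static void main(String[] args) {
        // Runs the listed test classes concurrently, one thread per class.
        Result result = JUnitCore.runClasses(
                ParallelComputer.classes(),
                TestSuites.NewReportFormatTest.class,
                TestSuites.ExistingReportTest.class);

        System.out.printf("%d tests, %d failures%n",
                result.getRunCount(), result.getFailureCount());
    }
}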
Identify Unstable Tests
Quarantine Unstable Tests
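A sketch of one quarantine mechanism, again using JUnit 4 categories (the Quarantine marker and the test classes are hypothetical): unstable tests are excluded from the gating build, so they cannot break it while they are being investigated.

import org.junit.Test;
import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.ExcludeCategory;
import org.junit.experimental.categories.Category;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;

public class QuarantineExample {

    // Marker for tests that fail intermittently and are under investigation.
    public interface Quarantine { }

    public static class FlakyOrderingTest {
        @Test
        @Category(Quarantine.class)
        public void eventsArriveInBookingOrder() { /* fails intermittently */ }
    }

    public static class StableTotalsTest {
        @Test
        public void reportTotalsAreConsistent() { /* stable */ }
    }

    // The gating build runs this suite: quarantined tests are skipped here.
    @RunWith(Categories.class)
    @ExcludeCategory(Quarantine.class)
    @SuiteClasses({ FlakyOrderingTest.class, StableTotalsTest.class })
    public static class GatingSuite { }
}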
Avoid External Dependencies
Introduce Fakes
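This also addresses the previous point about external dependencies. A minimal sketch with a hypothetical MarketDataSource interface: the real implementation would call an external service, while the fake is a deterministic in-memory substitute the tests control.

import static org.junit.Assert.assertEquals;

import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

public class FakeMarketDataTest {

    // The dependency the production code is written against.
    interface MarketDataSource {
        BigDecimal priceFor(String instrument);
    }

    // In-memory fake: deterministic, fast, no network, no shared state.
    static class FakeMarketDataSource implements MarketDataSource {
        private final Map<String, BigDecimal> prices = new HashMap<>();

        void setPrice(String instrument, BigDecimal price) {
            prices.put(instrument, price);
        }

        @Override
        public BigDecimal priceFor(String instrument) {
            return prices.get(instrument);
        }
    }

    @Test
    public void usesThePriceFromTheConfiguredSource() {
        FakeMarketDataSource marketData = new FakeMarketDataSource();
        marketData.setPrice("UBSG", new BigDecimal("27.50"));

        // The class under test would receive the fake via its constructor;
        // here we just show the fake standing in for the real dependency.
        assertEquals(new BigDecimal("27.50"), marketData.priceFor("UBSG"));
    }
}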
Avoid Time-Dependent Tests
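One way to keep wall-clock time out of tests, assuming the code under test accepts a java.time.Clock (the EndOfDayChecker class here is hypothetical): production code uses the system clock, the test pins it to a fixed instant.

import static org.junit.Assert.assertTrue;

import java.time.Clock;
import java.time.Instant;
import java.time.LocalTime;
import java.time.ZoneId;

import org.junit.Test;

public class EndOfDayCheckerTest {

    // Hypothetical class under test: time comes from an injected Clock,
    // never from LocalTime.now() called with no arguments.
    static class EndOfDayChecker {
        private final Clock clock;

        EndOfDayChecker(Clock clock) {
            this.clock = clock;
        }

        boolean isAfterMarketClose() {
            return LocalTime.now(clock).isAfter(LocalTime.of(17, 30));
        }
    }

    @Test
    public void reportsAfterMarketCloseAtSixPm() {
        // Pin the clock to 18:00 UTC so the test is deterministic.
        Clock sixPm = Clock.fixed(Instant.parse("2016-03-01T18:00:00Z"), ZoneId.of("UTC"));

        assertTrue(new EndOfDayChecker(sixPm).isAfterMarketClose());
    }
}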
Test Isolation
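A small sketch of the idea, using a hypothetical in-memory TradeStore: every test builds and tears down its own state, so tests can run in any order, and in parallel, without leaking data into each other.

import static org.junit.Assert.assertEquals;

import java.util.ArrayList;
import java.util.List;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class TradeStoreIsolationTest {

    // Hypothetical store; in a real suite this might be a per-test
    // database schema or an in-memory substitute.
    static class TradeStore {
        private final List<String> trades = new ArrayList<>();
        void add(String trade) { trades.add(trade); }
        int count() { return trades.size(); }
        void clear() { trades.clear(); }
    }

    private TradeStore store;

    @Before
    public void freshStoreForEachTest() {
        // No shared or static state: each test starts from a known-empty store.
        store = new TradeStore();
    }

    @After
    public void cleanUp() {
        store.clear();
    }

    @Test
    public void startsEmpty() {
        assertEquals(0, store.count());
    }

    @Test
    public void countsOnlyItsOwnTrades() {
        store.add("T1");
        assertEquals(1, store.count());
    }
}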
Asynchronous Testing Headache
Don’t! Does your test really need to be asynchronous? 80/20 rule? Create a synchronous test-runner harness.
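A sketch of the synchronous-harness idea: if the class under test accepts a java.util.concurrent.Executor, the test can supply one that runs tasks inline on the calling thread, so no waiting or polling is needed (the TradeProcessor class is hypothetical).

import static org.junit.Assert.assertEquals;

import java.util.concurrent.Executor;
import java.util.concurrent.atomic.AtomicInteger;

import org.junit.Test;

public class SynchronousHarnessTest {

    // Hypothetical class under test: hands work off to an Executor.
    static class TradeProcessor {
        private final Executor executor;
        private final AtomicInteger processed = new AtomicInteger();

        TradeProcessor(Executor executor) {
            this.executor = executor;
        }

        void submit(String trade) {
            executor.execute(processed::incrementAndGet);
        }

        int processedCount() {
            return processed.get();
        }
    }

    @Test
    public void processesTradesSynchronouslyInTheTest() {
        // Direct executor: runs each task on the calling thread,
        // making the 'asynchronous' flow synchronous for the test.
        Executor directExecutor = Runnable::run;

        TradeProcessor processor = new TradeProcessor(directExecutor);
        processor.submit("T1");

        assertEquals(1, processor.processedCount());
    }
}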
Asynchronous Testing using Events
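Where the flow genuinely has to stay asynchronous, one event-based pattern is for the test to subscribe to a completion event and block on a CountDownLatch with a timeout instead of sleeping (the listener interface and processor below are hypothetical).

import static org.junit.Assert.assertTrue;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.junit.Test;

public class EventBasedAsyncTest {

    // Hypothetical callback the system raises when processing completes.
    interface TradeProcessedListener {
        void onTradeProcessed(String tradeId);
    }

    // Hypothetical asynchronous component, simulated with a thread here.
    static class AsyncTradeProcessor {
        void process(String tradeId, TradeProcessedListener listener) {
            new Thread(() -> listener.onTradeProcessed(tradeId)).start();
        }
    }

    @Test
    public void signalsCompletionThroughAnEvent() throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);

        // The test registers as a listener and waits on the latch with a
        // timeout, rather than sleeping for an arbitrary amount of time.
        new AsyncTradeProcessor().process("T1", tradeId -> done.countDown());

        assertTrue("Timed out waiting for the trade-processed event",
                done.await(2, TimeUnit.SECONDS));
    }
}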
So…?
Treat your Tests Like you Treat your Code: “it’s just a test class” is not an excuse; Clean Code applies to tests too.
Think about Why You are Testing: specification tests for internal quality; business tests for external quality.
Think about Who You are Testing For: more people are interested in your tests than you may think.
Zero Tolerance to Instability: “It runs OK on my machine” is not a valid response.
Interested in a career at peterrhysthomas.wordpress.com