1
Empirical Studies in Test-Driven Development Laurie Williams, NCSU williams@csc.ncsu.edu
2
© Laurie Williams 2007 Agenda Overview of Test-Driven Development (TDD) TDD Case Studies TDD within XP Case Studies Summary
3
© Laurie Williams 2007 Test-driven Development
4
© Laurie Williams 2007 Overview of TDD via JUnit
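The slide's walkthrough is not reproduced in this text, so the following is a minimal sketch of the TDD rhythm with JUnit: write a small failing test first (red), write just enough production code to make it pass (green), then refactor with the tests as a safety net. The BankAccount class and its methods are assumed purely for illustration.

```java
// Step 1 (red): the test is written before the production code.
import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class BankAccountTest {

    @Test
    public void depositIncreasesBalance() {
        BankAccount account = new BankAccount();  // hypothetical class under test
        account.deposit(50);
        assertEquals(50, account.getBalance());
    }
}

// Step 2 (green): the simplest production code that makes the test pass.
class BankAccount {
    private int balance = 0;

    public void deposit(int amount) {
        balance += amount;
    }

    public int getBalance() {
        return balance;
    }
}

// Step 3 (refactor): clean up duplication and design while all tests stay green.
```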
5
© Laurie Williams 2007 xUnit tools
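Whatever the language, the xUnit tools (JUnit for Java, NUnit for .NET, and so on) share the same skeleton: a fixture built in a setup method, individual test methods with assertions, and a teardown. A minimal JUnit sketch of that shared pattern, with an assumed InMemoryCache class, looks like this:

```java
// The common xUnit pattern: setUp() builds a fresh fixture before every test,
// tearDown() releases it afterwards, and each test method checks one behavior.
// InMemoryCache is a hypothetical class used only to show the structure.
import static org.junit.Assert.assertEquals;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class InMemoryCacheTest {

    private InMemoryCache cache;

    @Before
    public void setUp() {
        cache = new InMemoryCache();
    }

    @After
    public void tearDown() {
        cache.clear();
    }

    @Test
    public void storesAndRetrievesValue() {
        cache.put("key", "value");
        assertEquals("value", cache.get("key"));
    }
}
```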
6
© Laurie Williams 2007 Agenda Overview of Test-Driven Development (TDD) TDD Case Studies TDD within XP Case Studies Summary
7
© Laurie Williams 2007 Structured Experiment
– 3 companies, 24 professional developers, all working in pairs: RoleModel, John Deere, Ericsson
– Random group assignment: test-first vs. test-last
– The test-last group “refused” to write automated JUnit tests (effectively test-never)
– Test-first group: 18% more manual, black-box test cases passed; 16% more time; good test coverage (98% method, 92% statement, 97% branch)
8
© Laurie Williams 2007 IBM Retail Device Drivers IBM research partners: Julio Sanchez (Mexico) and Michael Maximilien (Almaden)
9
© Laurie Williams 2007 Building Test Assets
10
© Laurie Williams 2007 Defect Density
11
© Laurie Williams 2007 The New Anti-aging Formula?
12
© Laurie Williams 2007 Lessons Learned
1. A passionate champion for the practice and tools is necessary.
2. JUnit is a well-structured framework relative to “ad hoc” automated testing.
3. JUnit can be extended to handle necessary manual intervention.
4. Not all tests can be automated.
5. Create a good design using object-oriented principles.
6. Execute all tests prior to checking code into the code base.
7. Write tests prior to code.
8. Institute a nightly build that includes running all the unit tests.
9. Create a test whenever a defect is detected internally or externally (see the sketch below).
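The last lesson can be illustrated with a small, hypothetical example (not taken from the IBM study itself): when a defect report arrives, the fix begins with a failing JUnit test that reproduces it, and that test stays in the suite as a regression guard. PriceCalculator, its methods, and the defect number are all assumed for illustration.

```java
// Hypothetical illustration of Lesson 9: capture a reported defect as a failing
// automated test, fix the code until it passes, and keep the test in the suite.
import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class PriceCalculatorDefectTest {

    // Assumed defect 1234: a 100% discount produced a negative total instead of
    // zero. The test fails until the bug is fixed, then guards against regression.
    @Test
    public void fullDiscountYieldsZeroTotal_defect1234() {
        PriceCalculator calculator = new PriceCalculator();         // hypothetical class
        double total = calculator.totalAfterDiscount(19.99, 1.00);  // price, discount rate
        assertEquals(0.0, total, 0.001);
    }
}
```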
13
© Laurie Williams 2007 Test-first Performance Research partners: Chih-wei Ho; IBM: Mike Johnson and Michael Maximilien
14
© Laurie Williams 2007 Sample Results
15
© Laurie Williams 2007 Agenda Overview of Test-Driven Development (TDD) TDD Case Studies TDD within XP Case Studies Summary
16
© Laurie Williams 2007 16 IBM: XP-Context Factors (XP-cf)
– Small team (7-10)
– Co-located
– Web development (toolkit)
– Supplier and customer distributed (US and overseas)
– Examined one release “old” (low XP) to the next “new” (more XP)
17
© Laurie Williams 2007 17 IBM: XP-Adherence Metrics (XP-am)
XP-am Metric | Practice | Old | New
Automated test classes per user story | Testing | 0.11 | 0.45
Test coverage (statement) | Testing | 30% | 46%
Unit test runs per person-day | Testing | 14% | 11%
Test LOC / Source LOC | Testing | 0.26 | 0.42
Acceptance test execution | Testing | -- | Manual
Did customers run your acceptance tests? | Testing | -- | No
Pairing frequency | Pair Programming | <5% | 48%
Release length | Short Release | 10 months | 5 months
Iteration length | Short Release | -- | Weekly
18
© Laurie Williams 2007 18 IBM: XP-Outcome Measures (XP-om)
XP Outcome Measure | Old | New
Response to customer change (ratio: (user stories in + out) / total) | NA | 0.23
Pre-release quality (test defects / KLOEC of code) | 1.0 | 0.50
Post-release quality (released defects / KLOEC of code) | 1.0 | 0.61
Productivity (stories / PM; relative KLOEC / PM; Putnam productivity parameter) | 1.0 | 1.34; 1.70; 1.92
Customer satisfaction | NA | High (qualitative)
Morale (via survey) | 1.0 | 1.11
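Reading note (an interpretation of the repeated 1.0 entries, not stated explicitly on the slide): the quality, productivity, and morale values appear to be normalized so that the “old”, low-XP release equals 1.0. Under that assumption, the post-release quality entry is

\[
\frac{\text{released defects}/\text{KLOEC (new)}}{\text{released defects}/\text{KLOEC (old)}} = 0.61,
\]

i.e. roughly 39% fewer released defects per KLOEC in the more-XP release, and a relative productivity of 1.70 means about 70% more KLOEC per person-month than before.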
19
© Laurie Williams 2007 19 Sabre-A: XP Context Factors (XP-cf)
20
© Laurie Williams 2007 20 Sabre-A: XP-Adherence Metrics (XP-am)
XP-am Metric | Practice | Old | New
Automated test classes per new/changed class | Testing | 0.036 | 0.572
Test coverage (statement) | Testing | N/A | 32.9%
Unit test runs per person-day | Testing | 0 | 1.0
Test LOC / Source LOC | Testing | 0.054 | 0.296
Acceptance test execution | Testing | -- | Manual
Did customers run your acceptance tests? | Testing | -- | No
Pairing frequency | Pair Programming | ~0% | 50%
Release length | Short Release | 18 months | 3.5 months
Iteration length | Short Release | -- | 10 days
21
© Laurie Williams 2007 21 Sabre-A: XP-Outcome Measures (XP-om)
XP Outcome Measure | Old | New
Response to customer change (ratio: (user stories in + out) / total) | NA | N/A
Pre-release quality (test defects / KLOEC of code) | 1.0 | 0.35
Post-release quality (released defects / KLOEC of code) | 1.0 | 0.70
Productivity (stories / PM; relative KLOEC / PM; Putnam productivity parameter) | N/A; 1.0; N/A | 1.46 (relative KLOEC / PM); 2.89 (Putnam)
Customer satisfaction | NA | High (anecdotal)
Morale (via survey) | N/A | 68.1%
22
© Laurie Williams 2007 22 Sabre-P: XP Context Factors (XP-cf)
– Medium-sized team (15)
– Co-located
– Large web application (1M LOC)
– Customers domestic & overseas
– Examined the 13th release of the product; 20 months after starting XP
23
© Laurie Williams 2007 23 Sabre-P: XP-Adherence Metrics (XP-am)
XP-am Metric | Practice | New
Automated test classes per new/changed class | Testing | 0.0225
Test coverage (statement) | Testing | 7.7%
Unit test runs per person-day | Testing | 0.4
Test LOC / Source LOC | Testing | 0.296
Pairing frequency | Pair Programming | 70%
Release length | Short Release | 3 months
Iteration length | Short Release | 10 days
24
© Laurie Williams 2007 24 Sabre-P: XP-Outcome Measures (XP-om)
XP Outcome Measure | Bangalore SPIN benchmarking group | Capers Jones
Pre-release defect density | Similar | Lower
Total defect density | -- | Lower
Productivity | Similar | Higher
25
© Laurie Williams 2007 25 Tekelec: XP Context Factors (XP-cf)
– Small team (4-7; 2 during maintenance phase)
– Geographically distributed
  – Contractors in the Czech Republic for a US development organization (Tekelec)
– Simulator for a telecommunications signal transfer point system (to train new customers)
– Considerable amount of requirements volatility
26
© Laurie Williams 2007 26 Tekelec: XP-Adherence Metrics (XP-am)
XP-am Metric | Practice | New
Automated test classes per new/changed class | Testing | 1.0
Test coverage (statement) | Testing | N/A
Unit test runs per person-day | Testing | 1/day for all tests; 1/hour for the quick set
Test LOC / Source LOC | Testing | 0.91
Pairing frequency | Pair Programming | 77.5%
Release length | Short Release | 4 months
Iteration length | Short Release | 10 days
27
© Laurie Williams 2007 27 Tekelec: XP-Outcome Measures (XP-om)
Outcome Measure | F-15 project
Pre-release quality (test defects / KLOEC) | N/A
Post-release quality (post-release defects / KLOEC) | 1.62 defects/KLOEC [lower than industry standards]
Customer satisfaction (interview) | Capability: Neutral; Reliability: Satisfied; Communication: Very Satisfied
Productivity | 1.22 KLOEC/PM [lower than industry standards]; 2.32 KLOEC/PM including test code [on par with industry standards]
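As a cross-check (not shown on the slide), the two productivity figures are consistent with the Test LOC / Source LOC ratio of 0.91 reported on the adherence slide:

\[
1.22 \;\text{KLOEC/PM} \times (1 + 0.91) \approx 2.33 \;\text{KLOEC/PM},
\]

which matches the 2.32 KLOEC/PM reported when test code is included, up to rounding.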
28
© Laurie Williams 2007 28 Empirical Studies of XP Teams
29
© Laurie Williams 2007 Agenda Overview of Test-Driven Development (TDD) TDD Case Studies TDD within XP Case Studies Summary
30
© Laurie Williams 2007 Summary
– Increased quality with “no” long-run productivity impact
– Valuable test assets created in the process
– Indications:
  – Improved design
  – Anti-aging