1 Evolutionary Testing Metaheuristic search techniques applied to test problems Stella Levin Advanced Software Tools Seminar Tel-Aviv University 11.2004
2 Contents 1. Introduction 2. Metaheuristic Search Techniques 3. White-Box Testing 4. Black-Box Testing 5. Object-Oriented Testing 6. Non-Functional Testing 7. Search Based Software Engineering
3 Introduction - Why testing? “The biggest part of software cost is the cost of bugs: the cost of detecting them, the cost of correcting them, the cost of designing tests and the cost of running those tests” - Beizer. “In embedded systems errors could result in high risk, endanger human life, big cost” - Wegener
4 Successful EA applications NASA evolvable antenna: equal to 12 years of work by an experienced designer, though there is no guarantee that an evolved design will be as good [8]
5 1. Metaheuristic Search Problem characteristics Large solution space No precise algorithm, no “best” solution Classification of “better” solution “Due to non-linearity of software (if, loops…) test problems are converted to complex, discontinuous, non-linear search spaces”- Baresel
6 Transforming a problem into an optimization problem: a candidate-solution representation (an individual), a fitness function for individuals, and a way of moving from one individual to another
7 Hill Climbing - “local” search 1. Select a point in the search space 2. Investigate neighboring points 3. If a neighbor has better fitness, jump to it 4. Repeat steps 2-3 until the current position has no better neighbors. Requires a definition of neighboring points
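The loop above can be sketched in a few lines of Python (a minimal illustration; the fitness function, start point, and ±1 neighborhood below are made-up toy examples, not from the slides):

```python
def hill_climb(fitness, start, neighbors, max_steps=1000):
    """Greedy local search: keep jumping to the best improving neighbor."""
    current = start
    for _ in range(max_steps):
        better = [n for n in neighbors(current) if fitness(n) > fitness(current)]
        if not better:
            return current                    # no better neighbor: local optimum
        current = max(better, key=fitness)    # jump to the best neighbor
    return current

# Toy example: maximize f(x) = -(x - 7)^2 over the integers, neighbors are x +/- 1.
f = lambda x: -(x - 7) ** 2
result = hill_climb(f, start=0, neighbors=lambda x: [x - 1, x + 1])  # -> 7
```

Starting from 0 the climb reaches the global optimum 7 here, but on a multi-modal landscape (the next slide's "bigger hill") it would stop at whatever local peak is nearest the start.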
8 Hill Climbing - Problem Is there a bigger hill?
9 Simulated Annealing Analogy to the physical process of cooling a material in a heat bath. If F(X2) < F(X1) then move to neighbor X2, else move to X2 with probability P = e^(-ΔF/T). Initially T is high; as T decreases, the search behaves more and more like hill climbing. Requires a definition of the neighborhood and a cooling function
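A compact sketch of the acceptance rule and cooling schedule described above (the geometric cooling factor and the toy cost function are illustrative assumptions):

```python
import math
import random

def simulated_annealing(cost, start, neighbor, t0=100.0, cooling=0.95, steps=500):
    """Minimize cost: always accept improvements, accept worse moves
    with probability e^(-dF/T); the temperature T decays every step."""
    x, t = start, t0
    for _ in range(steps):
        y = neighbor(x)
        d = cost(y) - cost(x)
        if d < 0 or random.random() < math.exp(-d / t):
            x = y            # improving move, or an uphill move that got lucky
        t *= cooling         # geometric cooling: behaves like hill climbing as T -> 0
    return x

# Toy example: minimize (x - 3)^2 with +/-1 integer moves.
cost = lambda x: (x - 3) ** 2
best = simulated_annealing(cost, start=20,
                           neighbor=lambda x: x + random.choice([-1, 1]))
```

With a high initial temperature the search wanders almost freely; once T is small, only improving moves survive and it settles into a minimum.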
10 Simulated Annealing
11 Evolutionary Algorithms Genetic algorithms: developed by J. Holland in the 1970s. Evolution strategies: developed in Germany at about the same time. Analogy with Darwin’s theory of evolution: survival of the fittest
13 Evolutionary Algorithms Selection: roulette wheel, weighted by fitness. Crossover: parents 01101 and 11000 crossed at a random point give 01000 and 11101. Mutation: random bit change, e.g. 11101 → 10101
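The slide's bit-string operators can be reproduced exactly (a minimal sketch; roulette-wheel selection is omitted):

```python
def crossover(p1, p2, point):
    """One-point crossover: swap the tails of two bit strings at `point`."""
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(bits, i):
    """Flip the single bit at index i."""
    flipped = '0' if bits[i] == '1' else '1'
    return bits[:i] + flipped + bits[i + 1:]

# The slide's example: crossing 01101 and 11000 after the second bit.
c1, c2 = crossover('01101', '11000', 2)   # -> '01000', '11101'
m = mutate('11101', 1)                    # -> '10101'
```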
14 Genetic Programming A program is an individual. Crossover and mutation operate on the program’s abstract syntax tree. A particular use: finding functions that describe data
15 Genetic Programming - Crossover
16 2. White-Box Testing Statement coverage, branch coverage, specific path (statement) selection
17 White-Box Testing Input variables (x1, x2, …, xk); program domain D1 × D2 × … × Dk; individuals encode the program’s input. Goal: to find input data that satisfies a coverage criterion (statement / branch / path)
18 Fitness Function = AL + D. AL: approximation level, according to McMinn [3]. Critical branch: a branch at which execution can diverge and miss the target. AL = (number of critical branches between the target and the diverging point) - 1
19 Fitness Function = AL + D. D: branch distance.
If (x==y): if x != y then D = abs(x-y), else D = 0
If (x<y): if x >= y then D = x-y, else D = 0
If (flag): if flag == false then D = K, else D = 0
D is normalized to [0,1]. Goal: minimize fitness (minimize D)
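These distance rules translate directly into code (the value of K and the d/(d+1) normalization are common choices assumed here, not mandated by the slide):

```python
K = 1.0  # flat penalty when no numeric gradient exists (value assumed)

def dist_eq(x, y):
    """Branch distance for the condition x == y."""
    return abs(x - y)

def dist_lt(x, y):
    """Branch distance for the condition x < y."""
    return x - y if x >= y else 0

def dist_flag(flag):
    """A boolean flag gives no gradient: the distance is either 0 or K."""
    return 0 if flag else K

def normalize(d):
    """One common way to map a raw distance into [0, 1)."""
    return d / (d + 1.0)

def fitness(approximation_level, branch_distance):
    """Fitness = AL + normalized D; the search minimizes this."""
    return approximation_level + normalize(branch_distance)
```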
20 Example: Triangle Classification Input: int a,b,c from [0,15] 1. Sort a,b,c so that a<=b<=c 2. If a+b<=c then NOT A TRIANGLE 3. If a==b and b==c then EQUILATERAL 4. If a==b or b==c then ISOSCELES 5. Else REGULAR
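A direct transcription of the classifier, following the usual semantics (all sides equal = EQUILATERAL); the search on the next slides looks for inputs that reach each of its branches:

```python
def classify(a, b, c):
    """Triangle classification over ints in [0, 15]."""
    a, b, c = sorted((a, b, c))      # step 1: a <= b <= c
    if a + b <= c:
        return 'NOT A TRIANGLE'      # step 2: degenerate or impossible
    if a == b and b == c:
        return 'EQUILATERAL'         # step 3: all sides equal
    if a == b or b == c:
        return 'ISOSCELES'           # step 4: exactly two sides equal
    return 'REGULAR'                 # step 5: all sides different
```

Note the ordering matters: the equilateral check must come before the isosceles one, or its branch becomes unreachable.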
21 Optimization problem Program domain: I × I × I. Individual: a bit string encoding (a, b, c). Goal: branch coverage. Sub-goals: NOT A TRIANGLE, EQUILATERAL, ISOSCELES, REGULAR
22 Simulation by hand Goal: EQUILATERAL. (Figure: successive generations of 12-bit individuals, initially classified REGULAR and NOT A TRIANGLE, evolving through crossover and mutation until an EQUILATERAL input is found.)
23 GA vs. Random Testing (Schatz [11]) Comp(x,y): x = nesting (“if”) complexity; y = condition complexity
24 Real Example (Schatz [11]) Autopilot system: 2046 LOC, 75 conditions. Branch coverage performance: GA vs. random
25 Real Example Performance of gradient descent algorithm
26 White-Box Testing Summary A variety of problem mappings and fitness functions. Flag and state problems
27 3. Black-Box Testing - Tracey [4] Specification with pre/post-conditions. Search for test input data that satisfies pre-condition && !post-condition. Good fitness for data that is near to satisfying the condition:
bool: if TRUE then 0 else K
a==b: if abs(a-b)==0 then 0 else abs(a-b)+K
a<b: if a-b<0 then 0 else (a-b)+K
a&&b: fit(a) + fit(b)
a||b: min(fit(a), fit(b))
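Tracey's rules above compose into a fitness for an arbitrary boolean formula; a small sketch (the value of K is an assumption):

```python
K = 1.0  # penalty constant added whenever a relation is not yet satisfied (assumed)

def fit_bool(b):      # bool:  0 if TRUE, else K
    return 0 if b else K

def fit_eq(a, b):     # a == b
    return 0 if a == b else abs(a - b) + K

def fit_lt(a, b):     # a < b
    return 0 if a < b else (a - b) + K

def fit_and(fa, fb):  # A && B: both conjuncts must be driven to 0
    return fa + fb

def fit_or(fa, fb):   # A || B: the closest disjunct dominates
    return min(fa, fb)

# Example: how close is (a == 4 && a < 2) for a = 5?
score = fit_and(fit_eq(5, 4), fit_lt(5, 2))   # (1 + K) + (3 + K)
```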
28 int wrap_counter(int n) {  // pre: (n>=0 && n<=10)
  if (n>=10) i=0; else i=n+1;
  return i;
  // post: (n<10 -> i==n+1)
  // post: (n==10 -> i==0)
}
Goal1: n>=0 && n<=10 && (n<10 && i!=n+1)
Goal2: n>=0 && n<=10 && (n==10 && i!=0)
29 Example Inserted error: if (n>10) i=0. Goal2: n>=0 && n<=10 && (n==10 && i!=0)
n=2, i=3: 0+0+(8+K)+0 = 8+K
n=7, i=8: 0+0+(3+K)+0 = 3+K
n=10, i=11: 0+0+0+0 = 0 FOUND!!!
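The hand computation above can be replayed in code (a sketch using the Tracey-style distances; K and the helper names are my own):

```python
K = 1.0  # penalty constant (assumed value)

def wrap_counter_buggy(n):
    """wrap_counter with the inserted error: '>' instead of '>='."""
    return 0 if n > 10 else n + 1

def goal2_fitness(n):
    """Distance to satisfying Goal2: n>=0 && n<=10 && n==10 && i!=0."""
    i = wrap_counter_buggy(n)
    d = 0 if n >= 0 else -n + K             # n >= 0
    d += 0 if n <= 10 else (n - 10) + K     # n <= 10
    d += 0 if n == 10 else abs(n - 10) + K  # n == 10
    d += 0 if i != 0 else K                 # i != 0
    return d
```

This reproduces the slide's values: 8+K for n=2, 3+K for n=7, and 0 for n=10, where the fault is exposed.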
30 Application Applied to safety-critical nuclear protection system Use simulated annealing and GA for search Use mutation testing to insert errors About 2000 lines of executable code 733 different disjunctive goals 100% error detection The code was simple
31 3. Black-Box Testing Automated Parking System [10] Individual: geometry data of the parking space and vehicle – 6 parameters. Fitness: minimum distance to the collision area. Results: 880 scenarios, 25 incorrect
32 Parking System - Results
33 Parking System - Results
34 4. Object-Oriented Testing Tonella [5]: Unit Testing of Classes 1. Create an object of the class under test 2. Put the object into the proper state (repeat 1 and 2 for all required objects) 3. Invoke the method under test 4. Examine the final state
35 Individual String and Fitness $a=A():$b=B():$a.m(int,$b)@3 That means: A a = new A(); B b = new B(); a.m(3,b); Goal: branch coverage. Fitness: proportion of exercised decision nodes that lead to the target
36 Crossover Crossover at a random point after the constructors and before the tested method; repair the individual string.
Parents: $a=A():$b=B(int):$a.m(int,$b)@1,5 and $a=A(int,int):$b=B():$b.g():$a.m(int,$b)@0,3,4
Children: $a=A():$b=B(int):$b.g():$a.m(int,$b)@1,4 and $a=A(int,int):$b=B():$a.m(int,$b)@0,3,5
37 Mutations Mutation of an input value; constructor change; insertion of a method call; removal of a method call.
$a=A():$b=B(int):$a.m(int,$b)@1,5
$a=A():$b=B(int):$a.m(int,$b)@1,7 (input value mutated)
$a=A():$c=C():$b=B($c):$a.m(int,$b)@1,7 (constructor changed)
$a=A():$c=C():$b=B($c):$b.f():$a.m(int,$b)@1,7 (method call inserted)
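One of these operators, mutating an input value, can be sketched on this chromosome encoding (the string grammar handling and the ±1 perturbation are simplifying assumptions):

```python
import random

def mutate_inputs(chromosome, rng):
    """Perturb the numeric input values listed after the '@' separator,
    leaving the constructor/method-call sequence untouched."""
    actions, _, values = chromosome.partition('@')
    mutated = [str(int(v) + rng.choice([-1, 1])) for v in values.split(',')]
    return actions + '@' + ','.join(mutated)

rng = random.Random(0)
parent = '$a=A():$b=B(int):$a.m(int,$b)@1,5'
child = mutate_inputs(parent, rng)
```

Only the input values drift; the call sequence is preserved, mirroring the slide's @1,5 → @1,7 step (where the second value happened to jump by 2).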
38
38 ClassLOCExecs Time (sec) String Tokenizer 313611/1138202 BitSet104626172/17728387004930-1.5h HashMap19821339/41843801338 HashMap29821339/4174310697 LinkedList7042356/579256902034 Stack118510/1022301 TreeSet4821520/214248060 Public methods Branch Coverage Test cases
39 5. Non-Functional Testing Execution Time Testing Real-time systems: worst-case / best-case execution time. Fitness: execution time for a given input. Problem: execution time gives insufficient guidance for the search. Wegener [6] on real systems: GA is better than random and hand-made tests. Problems with low-probability branches. No guarantee of finding the worst-/best-case execution time
40 6. SBSE - Search Based Software Engineering Module clustering using simulated annealing and genetic algorithms. Cost/time estimation using genetic programming. Re-engineering using program transformation. Ryan [7]: automatic parallelization using genetic programming
41
41 1. Goldberg “Genetic Algorithms” 2. McMinn “SBS Test Data Generation: A Survey” 3. McMinn “Hybridizing ET with the Chaining Approach” 4. Tracey “A Search Based Automated Test-Data Generation Framework for Safety-Critical Systems” 5. Tonella “Evolutionary Testing of Classes” 6. Wegener,Pitschinetz,Sthamer “Automated testing of real- time tasks” 7. Ryan “Automatic re-engineering of software using genetic programming” 8. Nasa “Intelligence report” 9. “Reformulating Software Engineering as a Search Problem” Clarke,Jones…(11 authors) 10. Buehler,Wegener “Evolutionary Functional Testing of an Automated Parking System” 11. McGraw, G., Michael, C., Schatz, M. “Generating Software Test Data by Evolution” References