A new crossover technique in Genetic Programming
Janet Clegg
Intelligent Systems Group, Electronics Department
This presentation will:
- describe basic evolutionary optimisation
- give an overview of failed attempts at crossover methods
- describe the new crossover technique
- present results from testing on two regression problems
Evolutionary optimisation
Start by choosing a set of random trial solutions (the population).
Each trial solution is evaluated and given a fitness (cost) value. (Diagram: the population with example fitness values such as 1.2, 0.9, 0.8, ...)
Parent selection: a mother and a father are selected from the population. (Diagram: candidate fitness values with "select mother" and "select father" arrows.)
Crossover is performed on the two parents to produce child 1 and child 2. (Diagram.)
Mutation: each offspring is mutated with a small probability (say 0.1).
This provides a new population of solutions, the next generation. Repeat, generation after generation:
1. select parents
2. perform crossover
3. mutate
until converged. (A minimal sketch of this loop follows.)
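The loop can be written down directly. This is a minimal sketch assuming a cost-style fitness (lower is better) and caller-supplied operators; the names (`evolve`, `POP_SIZE`, `P_MUT`) and the truncation selection are illustrative, not the talk's exact scheme:

```python
import random

POP_SIZE = 30   # illustrative population size
P_MUT = 0.1     # small mutation probability, as on the slide

def evolve(random_solution, fitness, crossover, mutate, max_gens=200):
    """Generic evolutionary loop: select parents, cross over, mutate, repeat."""
    population = [random_solution() for _ in range(POP_SIZE)]
    for _ in range(max_gens):
        population.sort(key=fitness)              # lower cost is better
        if fitness(population[0]) == 0:           # converged
            break
        parents = population[:POP_SIZE // 2]      # simple truncation selection
        children = []
        while len(children) < POP_SIZE:
            mother, father = random.sample(parents, 2)
            for child in crossover(mother, father):
                if random.random() < P_MUT:
                    child = mutate(child)
                children.append(child)
        population = children[:POP_SIZE]
    return min(population, key=fitness)
```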
Two types of evolutionary optimisation:
- Genetic Algorithms (GA): optimise some quantity by varying parameters which have numerical values.
- Genetic Programming (GP): optimise some quantity by varying parameters which are functions, parts of computer code, or circuit components.
Representation
- Traditional GAs use a binary representation, e.g. 1011100001111.
- A floating-point GA, e.g. 7.2674554, performs better than binary.
- A Genetic Program (GP) uses trees, where nodes represent functions whose inputs lie below the branches attached to the node.
Some crossover methods
Crossover in a binary GA (here with the cut after the first bit):
Mother: 1 0 0 0 0 0 1 0 = 130
Father: 0 1 1 1 1 0 1 0 = 122
Child 1: 1 1 1 1 1 0 1 0 = 250
Child 2: 0 0 0 0 0 0 1 0 = 2
(A sketch of this operator follows.)
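A minimal sketch of single-point binary crossover; the function name and the string encoding are illustrative:

```python
import random

def one_point_crossover(mother, father):
    """Single-point crossover on two equal-length bit strings."""
    point = random.randint(1, len(mother) - 1)   # cut somewhere between two bits
    return (mother[:point] + father[point:],
            father[:point] + mother[point:])

# With the cut after the first bit this reproduces the slide's example:
# one_point_crossover("10000010", "01111010") can yield ("11111010", "00000010"),
# i.e. parents 130 and 122 produce children 250 and 2.
```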
Crossover in a floating-point GA: the offspring is chosen as a random point between mother and father. (Diagram: mother and father lying between the minimum and maximum parameter values, with the offspring between them.)
Traditional method of crossover in a GP. (Diagram: mother and father trees exchange sub-branches to give child 1 and child 2.)
Motivation for this work
Tree crossover in a GP does not always perform well. Angeline, and Luke and Spencer, compared
(1) the performance of tree crossover with
(2) simple mutation of the branches,
and found the difference in performance statistically insignificant. Consequently some people implement GPs with no crossover, mutation only.
Motivation for this work
In a GP many people do not use crossover, so mutation is the more important operator. In a GA the crossover operator contributes a great deal to performance, and mutation is a secondary operator.
AIM: find a crossover technique in GP which works as well as the crossover in a GA.
Cartesian Genetic Programming
Cartesian Genetic Programming (CGP)
- Julian Miller introduced CGP.
- It replaces the tree representation with directed graphs, represented by a string of integers.
- The CGP representation will be explained within the first test problem.
- CGP uses mutation only, no crossover.
First simple test problem
A simple regression problem: finding the function to best fit data taken from a known target function (shown as an equation on the slide). The GP should find this exact function as the optimal fit.
The traditional GP method for this problem
Set of functions and terminals:
Functions: + - *
Terminals: 1 x
The initial population is created by randomly choosing functions and terminals within the tree structures. (Diagram: an example tree encoding (1-x) * (x*x).)
Crossover by randomly swapping sub-branches of the parent trees. (Diagram: mother and father trees give child 1 and child 2.)
CGP representation
Set of functions, each identified by an integer:
+ : 0, - : 1, * : 2
Set of terminals, each identified by an integer:
1 : 0, x : 1
Creating the initial population
- The first integer of a node is a random choice of function: 0 (+), 1 (-) or 2 (*).
- For the first node, the next two integers are a random choice of terminals: 0 (1) or 1 (x).
- For later nodes, the input integers are a random choice from the terminals 0 (1) and 1 (x) or any earlier node (2, 3, ...).
Creating the initial population (continued)
- Each subsequent node's inputs are a random choice from the terminals (0, 1) and the earlier nodes (2, 3, 4, 5).
- The final integer is a random choice of output from the terminals and all nodes (0 to 6). (A sketch of this construction follows.)
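A sketch of this genotype construction, assuming the numbering above (functions 0: +, 1: -, 2: *; terminals 0 and 1) and five nodes; the helper name is illustrative:

```python
import random

N_TERMINALS = 2   # terminal 0 is the constant 1, terminal 1 is x
N_FUNCTIONS = 3   # 0 is +, 1 is -, 2 is *

def random_genotype(n_nodes=5):
    """Build a CGP genotype: three integers per node, then one output gene."""
    genes = []
    for node in range(n_nodes):
        genes.append(random.randrange(N_FUNCTIONS))             # function gene
        for _ in range(2):                                      # two input genes, drawn from
            genes.append(random.randrange(N_TERMINALS + node))  # terminals and earlier nodes
    genes.append(random.randrange(N_TERMINALS + n_nodes))       # output gene
    return genes
```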
Example genotype: 2 0 1 | 0 1 1 | 1 3 1 | 0 2 3 | 2 4 1, output: 5.
Decoding: node 2 is *(1, x), node 3 is +(x, x), node 4 is -(node 3, x), node 5 is +(node 2, node 3), node 6 is *(node 4, x). The output gene selects node 5, giving (1*x) + (x+x) = 3x. (A decoding sketch follows.)
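A sketch of how such a genotype is decoded and evaluated, assuming the gene layout above (three genes per node, then one output gene):

```python
FUNCS = {0: lambda a, b: a + b,   # +
         1: lambda a, b: a - b,   # -
         2: lambda a, b: a * b}   # *

def evaluate(genes, x):
    """Evaluate a CGP genotype at input value x."""
    values = [1.0, x]                      # terminal 0 is the constant 1, terminal 1 is x
    for i in range(0, len(genes) - 1, 3):  # every node gene triple, skipping the output gene
        f, a, b = genes[i:i + 3]
        values.append(FUNCS[f](values[a], values[b]))
    return values[genes[-1]]               # the output gene selects a terminal or node

# The slide's genotype decodes to 3x, e.g.
# evaluate([2,0,1, 0,1,1, 1,3,1, 0,2,3, 2,4,1, 5], 2.0) == 6.0
```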
Run the CGP with test data taken from the target function:
- Population size 30, with 28 offspring created at each generation
- Mutation only to begin with
- Fitness is the sum of squared differences between the data and the function
Result: genotype 0 0 1 | 2 2 1 | 1 2 2 | 0 3 2 | 0 5 1, output: 5, which decodes to (1+x)*x + (1+x) = (1+x)^2.
Statistical analysis of the GP
Any two runs of a GP (or GA) will not be exactly the same, so to analyse the convergence of the GP we need many runs. All the following graphs depict the average convergence over 4000 runs.
Introduce crossover
Introducing some crossover: pick a random crossover point along the string.
Parent 1: 0 0 1 2 2 1 1 | 2 2 0 3 2 0 5 1 5
Parent 2: 2 0 1 0 1 1 1 | 3 1 0 2 3 2 4 1 5
Child 1: 0 0 1 2 2 1 1 | 3 1 0 2 3 2 4 1 5
Child 2: 2 0 1 0 1 1 1 | 2 2 0 3 2 0 5 1 5
GP with and without crossover
GA with and without crossover
Random crossover point, but the cut must fall between the nodes:
Parent 1: 0 0 1 2 2 1 | 1 2 2 0 3 2 0 5 1 5
Parent 2: 2 0 1 0 1 1 | 1 3 1 0 2 3 2 4 1 5
Child 1: 0 0 1 2 2 1 | 1 3 1 0 2 3 2 4 1 5
Child 2: 2 0 1 0 1 1 | 1 2 2 0 3 2 0 5 1 5
(Diagram: an offspring genotype decoded into its directed graph of +, - and * nodes over the terminals 1 and x.)
GP crossover at nodes
Pick a random node along the string and swap this single node (here the third node, in brackets):
Parent 1: 0 0 1 2 2 1 [1 2 2] 0 3 2 0 5 1 5
Parent 2: 2 0 1 0 1 1 [1 3 1] 0 2 3 2 4 1 5
Child 1: 0 0 1 2 2 1 [1 3 1] 0 3 2 0 5 1 5
Child 2: 2 0 1 0 1 1 [1 2 2] 0 2 3 2 4 1 5
Crossover only one node
Random-swap crossover: each integer in a child randomly takes its value from either the mother or the father.
Parent 1: 0 0 1 2 2 1 1 2 2 0 3 2 0 5 1 5
Parent 2: 2 0 1 0 1 1 1 3 1 0 2 3 2 4 1 5
Child 1: 2 0 1 0 2 1 1 2 1 0 2 2 2 5 1 5
Child 2: 0 0 1 2 1 1 1 3 2 0 3 3 0 4 1 5
(A sketch of this operator follows.)
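A sketch of this random-swap (uniform) crossover on integer genotypes:

```python
import random

def uniform_crossover(mother, father):
    """Each gene of child 1 is taken from a randomly chosen parent;
    child 2 receives the same gene from the other parent."""
    child1, child2 = [], []
    for m, f in zip(mother, father):
        if random.random() < 0.5:
            child1.append(m)
            child2.append(f)
        else:
            child1.append(f)
            child2.append(m)
    return child1, child2
```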
Random swap crossover
Comparison with random search
The GP with no crossover performs better than any of the trial crossover methods here. How much better than a completely random search is it? The only means by which it can improve on a random search are
- parent selection
- mutation
Comparison with a random search
The GP converges in 58 generations; the random search takes 73 generations.
GA performance compared with a completely random search
The GA was tested on a large problem: a random search would have involved searching through 150,000,000 data points, whereas the GA reached the solution after testing 27,000 data points (average convergence of 5000 GA runs). The probability of a random search reaching the solution in 27,000 trials is 0.00018.
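That figure is consistent with the proportion of the search space sampled; assuming a single solution point and independent uniform trials:

$$P \approx 1 - \Bigl(1 - \tfrac{1}{1.5 \times 10^{8}}\Bigr)^{27000} \approx \frac{27000}{1.5 \times 10^{8}} = 1.8 \times 10^{-4}$$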
Why does GP tree crossover not always work well?
(Diagram: two parent trees of functions over eight inputs.)
Parent 1: $f_1\{ f_2[ f_4(x_1,x_2), f_5(x_3,x_4) ], f_3[ f_6(x_5,x_6), f_7(x_7,x_8) ] \}$
Parent 2: $g_1\{ g_2[ g_4(y_1,y_2), g_5(y_3,y_4) ], g_3[ g_6(y_5,y_6), g_7(y_7,y_8) ] \}$
Child: $f_1\{ f_2[ f_4(x_1,x_2), f_5(x_3,x_4) ], f_3[ f_6(x_5,x_6), g_7(y_7,y_8) ] \}$
(Diagram: a subtree swap where the incoming subtree matches the displaced one, $g(x_1) = f(x_2)$. Good!)
(Diagram: the corresponding comparison of $f(x_1)$, $f(x_2)$, $g(x_1)$ and $g(x_2)$ for the other swap. Good!)
Suppose we are trying to find a function to fit a set of data looking like this. (Graph: sample data points.)
Suppose we have two parents which fit the data fairly well. (Graph: both parent functions plotted through the data.)
(Diagram: the first parent tree, built from /, +, -, ^ and exp with constants such as 1, 2 and 0.5; a crossover point is chosen.)
(Diagram: the second parent tree, built from /, +, -, ^ with constants such as 1, 2, 0.5 and 0.85; a crossover point is chosen.)
Swapping subtrees at these points produces offspring which fit the data far less well than either parent.
Introducing the new technique
- Based on Julian Miller's Cartesian Genetic Programming (CGP)
- The CGP representation is changed: integer values are replaced by floating-point values
- Crossover is performed as in a floating-point GA
CGP representation: 0 0 1 2 2 1 1 2 2 0 3 2 0 5 1 5 (output: 5)
New representation: replace the integers with floating-point variables $x_1, x_2, \ldots, x_{16}$, where the $x_i$ lie in a defined range, say $[0, 1]$.
Interpretation of the new representation
For the variables $x_i$ which represent a choice of function, the range is divided equally among the options. If the set of functions is {+, -, *}, then values in $[0, 0.33)$ select +, values in $[0.33, 0.66)$ select -, and values in $[0.66, 1]$ select *.
The input and output variables are interpreted in the same way: a variable choosing among {1, x, node 1} divides $[0, 1]$ at 0.33 and 0.66, while a variable choosing among {1, x, node 1, node 2, node 3} divides it at 0.2, 0.4, 0.6 and 0.8. (A sketch of this mapping follows.)
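A sketch of this mapping from a float gene to a discrete choice, dividing [0, 1] into equal segments; the helper name is illustrative:

```python
def pick(gene, options):
    """Map a float gene in [0, 1] onto one of `options` by equal segments."""
    index = min(int(gene * len(options)), len(options) - 1)  # clamp gene == 1.0
    return options[index]

# pick(0.5, ['+', '-', '*']) -> '-'       (the segment [0.33, 0.66))
# pick(0.7, ['1', 'x', 'node1', 'node2', 'node3']) -> 'node2'  (the segment [0.6, 0.8))
```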
The crossover operator
Crossover is performed as in a floating-point GA. Two parents $p_1$ and $p_2$ are chosen, uniformly distributed random numbers $0 < r_i < 1$ are generated, and offspring $o_1$ and $o_2$ are created gene by gene as

$$o_i = p_1 + r_i \,(p_2 - p_1), \quad p_1 < p_2.$$
Crossover in the new representation: exactly as in the floating-point GA, each offspring value is chosen as a random point between the mother's and the father's values. (A sketch follows.)
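A sketch of this operator applied gene by gene to two floating-point genotypes:

```python
import random

def float_crossover(p1, p2):
    """Each offspring gene is a uniform random point between the parents' genes."""
    o1, o2 = [], []
    for a, b in zip(p1, p2):
        lo, hi = min(a, b), max(a, b)
        o1.append(lo + random.random() * (hi - lo))
        o2.append(lo + random.random() * (hi - lo))
    return o1, o2
```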
Why is this crossover likely to work better than tree crossover?
Mathematical interpretation of tree crossover
Parent 1: $f_1\{ f_2[ f_4(x_1,x_2), f_5(x_3,x_4) ], f_3[ f_6(x_5,x_6), f_7(x_7,x_8) ] \}$
Parent 2: $g_1\{ g_2[ g_4(y_1,y_2), g_5(y_3,y_4) ], g_3[ g_6(y_5,y_6), g_7(y_7,y_8) ] \}$
Child: $f_1\{ f_2[ f_4(x_1,x_2), f_5(x_3,x_4) ], f_3[ f_6(x_5,x_6), g_7(y_7,y_8) ] \}$
Tree crossover replaces whole sub-expressions in one discrete jump.
Mathematical interpretation of the new method
With the genotype written as floating-point variables $x_1, x_2, \ldots, x_{16}$, the fitness can be thought of as a function of these 16 variables, say $F(x_1, \ldots, x_{16})$, and the optimisation becomes that of finding the values of these 16 variables which give the best fitness. The new crossover guides each variable towards its optimal value in a continuous manner.
The new technique has been tested on two regression problems studied by Koza (the two target functions were shown as equations on the slides).
- Test data: 50 points in the interval [-1, 1]
- Fitness: the sum of the absolute errors over the 50 points
- Population size 50, with 48 offspring each generation
- Tournament selection used to select parents
- Rates of crossover tested: 0%, 25%, 50%, 75%
- Number of nodes in the representation: 10
Statistical analysis of the new method
The following results are based on the average convergence of the new method over 1000 runs. A run is considered converged when the absolute error is less than 0.01 at all of the 50 data points (the same criterion as Koza). Results are based on:
(1) average convergence graphs
(2) the average number of generations to converge
(3) Koza's computational-effort figure (recalled below)
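For reference, Koza's computational-effort figure (not spelled out on the slide) is usually computed as the minimum over generations $i$ of

$$I(M, i) = M \,(i + 1) \left\lceil \frac{\ln(1 - z)}{\ln\bigl(1 - P(M, i)\bigr)} \right\rceil$$

where $M$ is the population size, $P(M, i)$ is the observed fraction of runs that have succeeded by generation $i$, and $z$ is the desired success probability, typically 0.99.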
Average convergence for the first test problem. (Graph.)
Convergence over the later generations. (Graph.)
Introduce variable crossover
At generation 1, crossover is performed 90% of the time; the rate of crossover then decreases linearly until, at generation 180, it is 0%. (A sketch of this schedule follows.)
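A sketch of this schedule; the exact interpolation is an assumption, fixed only by the two endpoints given (90% at generation 1, 0% at generation 180):

```python
def crossover_rate(generation, start=0.9, end_gen=180):
    """Crossover rate falling linearly from 90% at generation 1 to 0% at generation 180."""
    if generation >= end_gen:
        return 0.0
    return start * (end_gen - generation) / (end_gen - 1)
```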
Variable crossover
Average number of generations to converge and computational effort (first problem):
Percentage crossover | Average generations to converge | Koza's computational effort
0% | 168 | 30,000
25% | 84 | 9,000
50% | 57 | 8,000
75% | 71 | 6,000
Variable crossover | 47 | 10,000
Average convergence for the second test problem. (Graph.)
Variable crossover
Average number of generations to converge and computational effort (second problem):
Percentage crossover | Average generations to converge | Koza's computational effort
0% | 516 | 44,000
25% | 735 | 24,000
50% | 691 | 14,000
75% | 655 | 11,000
Variable crossover | 278 | 13,000
Number of generations to converge for both problems over 100 runs. (Graph.)
Average convergence ignoring runs which take over 1000 generations to converge. (Graph.)
Conclusions
The new technique has reduced the average number of generations to converge:
- from 168 down to 47 for the first problem tested
- from 516 down to 278 for the second problem
Conclusions
When crossover is 0%, this new method is equivalent to the traditional CGP, mutation only. The computational-effort figures for 0% crossover here are similar to those reported for the traditional CGP, although a larger mutation rate and population size have been used here.
Future work
- Investigate the effects of varying the GP parameters: population size, mutation rate, selection strategies
- Test the new technique on other problems: larger problems, other types of problem
Thank you for listening!
Average convergence for the first test problem using 50 nodes instead of 10. (Graph.)
Average number of generations to converge and computational effort (first problem, 50 nodes):
Percentage crossover | Average generations to converge | Koza's computational effort
0% | 78 | 18,000
25% | 85 | 13,000
50% | 71 | 11,000
75% | 104 | 13,000
Variable crossover | 45 | 14,000
Average convergence for the second test problem using 50 nodes instead of 10. (Graph.)
Average number of generations to converge and computational effort (second problem, 50 nodes):
Percentage crossover | Average generations to converge | Koza's computational effort
0% | 131 | 18,000
25% | 193 | 17,000
50% | 224 | 12,000
75% | 152 | 19,000
Variable crossover | 58 | 16,000