Genetic Algorithms
José Galaviz Casas
Facultad de Ciencias, UNAM

Contents
– Introduction, motivation, fundamental concepts.
– How genetic algorithms work.
– Operators.
– Theoretical framework.
– Variations around the same theme.

Nature as optimizer
"Although the human mind can create a lot of inventions, it cannot create better, simpler and more direct inventions than nature, since in its creations nothing is missing and nothing is superfluous." (Leonardo da Vinci, Notebook)
– Optimal individuals live in very complicated environments.
– Lots of variables: atmospheric pressure, temperature, predators, resources, chemical substances, etc.

How (we think that) nature works
– An evolutionary process: selection of well-adapted individuals. The better the fitness, the larger the chance of reproduction.
– Searching phenotypes directly is hard (how could we determine the shape, color, and physiological and functional features of an optimal marine predator if we had never seen a white shark?).
– The easier way: search genotypes, encoding the problem domain.
– Mutation is evolution's engine.

Genetic algorithms
– Search methods inspired by natural evolution: let nature be your guide.
– John Holland (1960s).
– Originally a model for natural evolution; now a method for optimization and machine learning.

How does a GA work?
– Encode the domain.
– Create an initial population of codes: proposed solutions to the optimization problem.
– Evaluate fitness.
– Select individuals as a function of their fitness.
– Create new proposed solutions by mixing the codes of the selected individuals.
– Create new proposed solutions by random alteration of genetic codes.
– Iterate the whole process.

Domain encoding
– We must know the problem domain (the phenotype space).
– We define an encoding procedure that maps phenotypes to codes (genetic code, genotype). Typically this is not an injective function: several phenotypes may be mapped to the same genotype.
– We are interested in the inverse mapping. Formally this is not a function, but we impose restrictions so that it behaves like one.
– We hope the actual solution can be obtained from some code in the genotype space; at the very least we want some code(s) to be mapped close enough to that solution.

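A minimal sketch of such an encoding, assuming a real-valued domain [LO, HI] discretized onto L-bit binary strings; the domain, the parameter values, and the helper names encode/decode are illustrative, not from the slides:

```python
L = 8                # bits per genotype
LO, HI = 0.0, 1.0    # hypothetical phenotype space

def decode(code: str) -> float:
    """Inverse mapping: genotype (bit string) -> phenotype in [LO, HI]."""
    return LO + int(code, 2) * (HI - LO) / (2**L - 1)

def encode(x: float) -> str:
    """Phenotype -> nearest representable genotype. Several nearby
    phenotypes map to the same code, so this is not injective."""
    n = round((x - LO) / (HI - LO) * (2**L - 1))
    return format(n, f"0{L}b")

print(encode(0.6363))            # '10100010'
print(decode(encode(0.6363)))    # 0.635..., close but not exact
```

Every phenotype in a bucket of width (HI − LO)/(2^L − 1) collapses onto the same code, which is exactly the non-injectivity noted above; we can only hope that some code decodes close enough to the true solution.
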
Evaluation
– Fitness function: maps every possible genotype to an aptitude level.
– Formally a non-negative function, although this is violated in practice. Greater values for better individuals.
– Fitness is population-relative: how fast must an impala be in order to survive a cheetah's hunt?

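As a hedged illustration, the sketch below decodes each genotype, applies a stand-in objective, and shifts the raw scores by the population minimum: one common way to restore the non-negativity that, as noted above, raw values often violate, and which also makes fitness population-relative.

```python
def objective(x: float) -> float:
    # hypothetical objective; may take negative values
    return -(x - 0.5) ** 2

def fitnesses(codes: list[str], lo: float = 0.0, hi: float = 1.0) -> list[float]:
    l = len(codes[0])
    xs = [lo + int(c, 2) * (hi - lo) / (2**l - 1) for c in codes]
    raw = [objective(x) for x in xs]
    shift = min(raw)   # offset so that all fitness values are non-negative
    # The same code can score differently depending on the rest of the
    # population: fitness is relative, like the impala's speed.
    return [r - shift for r in raw]

print(fitnesses(["000", "100", "111"]))
```
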
Selection
– Proportional to fitness in the simple genetic algorithm (SGA), but there are other alternatives.
– Survival of the fittest.

New individuals (from selected codes)
– Given two (or more) selected individuals, their codes are mixed in order to generate offspring.
– We manipulate only codes. The genotypes obtained correspond to some phenotypes in the problem's domain, but generally we don't care about that.
– Sometimes we need to guarantee that hybrid individuals are valid phenotypes.

New individuals (from random alterations)
– Some elements in the code of new individuals are randomly changed.
– Generally we don't care about the phenotypes; sometimes we need to restrict the changes in order to obtain codes for valid phenotypes, as in the sketch below.

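A minimal sketch of this bitwise alteration, with p_m as the per-locus mutation probability; the validity check in mutate_valid is a hypothetical way to restrict changes to valid phenotypes:

```python
import random

def mutate(code: str, p_m: float = 0.01) -> str:
    """Flip each bit independently with probability p_m."""
    return ''.join(
        ('1' if b == '0' else '0') if random.random() < p_m else b
        for b in code)

def mutate_valid(code: str, is_valid, p_m: float = 0.01) -> str:
    """Re-draw until the mutated code encodes a valid phenotype."""
    while True:
        candidate = mutate(code, p_m)
        if is_valid(candidate):
            return candidate

print(mutate("10100010", p_m=0.2))
```
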
The general procedure
1. Define the problem domain encoding.
2. Generate an initial population of codes (genotypes). This will be called the current generation.
3. Evaluate the fitness of every individual in the current generation.
4. Select two individuals from the current generation.
5. Determine whether the selected individuals must be crossed: a random event with probability p_c.
6. If the selected individuals must be crossed, perform crossover and generate two offspring, called new individuals.
7. If the selected individuals must not be crossed, the selected individuals themselves are called new individuals.
8. For every new individual, determine whether mutation must be performed on each element of its code: a random event with probability p_m.
9. Add the two new individuals to the new generation.
10. If the new generation has N individuals, call it the current generation and return to step 3, until some convergence criterion has been met.
11. Else return to step 4.

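The numbered steps translate almost line by line into code. Below is a sketch of the whole loop in Python; the objective function and the parameter values are illustrative assumptions, not taken from the slides:

```python
import random

L, N = 8, 20                  # code length, population size
P_C, P_M = 0.9, 0.01          # crossover and mutation probabilities
GENERATIONS = 50

def fitness(code: str) -> float:
    x = int(code, 2) / (2**L - 1)        # decode (step 1)
    return 1.0 + x * (1.0 - x)           # hypothetical non-negative objective

def select(pop, fits):                   # step 4: proportional selection
    return random.choices(pop, weights=fits, k=1)[0]

def crossover(a, b):                     # step 6: 1-point crossover
    cut = random.randrange(1, L)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(code):                        # step 8: uniform mutation
    return ''.join(('1' if b == '0' else '0') if random.random() < P_M else b
                   for b in code)

pop = [''.join(random.choice('01') for _ in range(L)) for _ in range(N)]  # step 2
for _ in range(GENERATIONS):
    fits = [fitness(c) for c in pop]     # step 3
    new_gen = []
    while len(new_gen) < N:              # steps 4-9
        a, b = select(pop, fits), select(pop, fits)
        if random.random() < P_C:        # step 5
            a, b = crossover(a, b)       # step 6 (else step 7: keep as-is)
        new_gen += [mutate(a), mutate(b)]
    pop = new_gen[:N]                    # step 10

print(max(pop, key=fitness))
```
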
Proportional selection
In the SGA, individual i is selected with probability equal to its share of the total fitness, P(i) = f_i / Σ_j f_j, usually pictured as a roulette wheel whose slices are proportional to fitness.

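The slide's illustration did not survive extraction; a sketch of that wheel in code (the helper name roulette is illustrative):

```python
import random

def roulette(population: list[str], fits: list[float]) -> str:
    """Spin once: land on individual i with probability f_i / sum(f)."""
    spin = random.uniform(0, sum(fits))
    acc = 0.0
    for code, f in zip(population, fits):
        acc += f
        if spin <= acc:
            return code
    return population[-1]   # guard against floating-point round-off
```

With the example population a few slides below, the code 101 (fitness 3.99) is selected roughly twelve times as often as 010 (fitness 0.335).
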
1-point crossover
– Choose a random cut point in the genetic code of each parent.
– Mix the complementary parts.

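A sketch of the operator (the helper name one_point is illustrative):

```python
import random

def one_point(a: str, b: str) -> tuple[str, str]:
    """Cut both parents at the same interior point and swap the tails."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

# e.g. with cut = 2: one_point("000111", "111000")
# yields ("001000", "110111")
```
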
Example

No.  Code  Phenotype (x)  f(x)
0    000   0.000          1.285
1    001   0.125          1.629
2    010   0.250          0.335
3    011   0.375          0.792
4    100   0.500          2.000
5    101   0.625          3.990
6    110   0.750          3.104
7    111   0.875          1.093

The maximum is at x_m = 7/11 = 0.6363..., which is not in the genotype space!

Example run: successive populations P0 through P3 (code and fitness per individual), with the population average fitness below each column.

P0           P1           P2           P3
000  1.285   001  1.629   101  3.990   101  3.990
001  1.629   001  1.629   000  1.285   101  3.990
001  1.629   100  2.000   001  1.629   101  3.990
111  1.093   011  0.792   001  1.629   100  2.000
011  0.792   001  1.629   110  3.104   101  3.990

Avg: 1.285   1.536        2.327        3.592

The average fitness increases from one generation to the next.

Why does a GA work?
– Suppose individuals are encoded as binary strings.
– Schema: a pattern or template matched by several codewords. Example: 010010110 and 111010010 are both instances of the schema *1*010*10, as well as of *1*******, *1*0****0, etc.
– Defining length δ(H) = distance between the first and last defined positions of the schema: 7 in the example.
– Order o(H) = number of defined positions in the schema: 6 in the example.

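Both measures, and the instance relation itself, are easy to state in code; a small sketch reproducing the example above, with '*' as the "don't care" symbol:

```python
def order(schema: str) -> int:
    """o(H): number of defined (non-*) positions."""
    return sum(1 for c in schema if c != '*')

def defining_length(schema: str) -> int:
    """delta(H): distance between first and last defined positions."""
    fixed = [i for i, c in enumerate(schema) if c != '*']
    return fixed[-1] - fixed[0]

def matches(schema: str, code: str) -> bool:
    """Is the codeword an instance of the schema?"""
    return all(s in ('*', c) for s, c in zip(schema, code))

print(defining_length("*1*010*10"), order("*1*010*10"))   # 7 6
print(matches("*1*010*10", "010010110"))                  # True
print(matches("*1*010*10", "111010010"))                  # True
```
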
The first model
Let m(H, t) be the number of instances of schema H in the t-th generation of a simple genetic algorithm. We assume:
– proportional selection;
– 1-point crossover: P(H is broken) = δ(H)/(l − 1), where l is the string length;
– uniform mutation: P(H survives) = (1 − p_m)^o(H) ≥ 1 − o(H) p_m, the bound being good for small p_m.

The schema theorem
The expected number of instances of schema H in the generation at time t + 1 satisfies

E[m(H, t+1)] ≥ m(H, t) · (f(H)/f̄) · [1 − p_c δ(H)/(l − 1)] · [1 − o(H) p_m],

where f(H) is the average fitness of the instances of H in the current generation and f̄ is the average fitness of the whole population.

Not very useful
– Only the disruptive effects of the genetic operators are considered.
– It gives only a lower bound, and not a very tight one.
– Long-term behavior is not accurately predicted.
– It is very particular: binary codes, proportional selection, 1-point crossover, uniform mutation.

Other kinds of crossover
– 2-point crossover.
– Uniform crossover.

Some crossover operators

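The operator diagrams on this slide did not survive extraction; as a substitute, a hedged sketch of the two variants named on the previous slide:

```python
import random

def two_point(a: str, b: str) -> tuple[str, str]:
    """Exchange the segment between two random cut points."""
    i, j = sorted(random.sample(range(1, len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

def uniform(a: str, b: str) -> tuple[str, str]:
    """Swap each locus independently with probability 1/2."""
    pairs = [(x, y) if random.random() < 0.5 else (y, x)
             for x, y in zip(a, b)]
    return ''.join(p[0] for p in pairs), ''.join(p[1] for p in pairs)

print(two_point("00000000", "11111111"))
print(uniform("00000000", "11111111"))
```
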
The building block hypothesis
Schemas survive easily if:
– they are short;
– they are good (high fitness).
Therefore the solutions obtained by the GA should be constructed from schemas with these characteristics: building blocks. There is contradictory evidence, however (hitchhiking).

Implicit parallelism
– Every binary string of length l is an instance of 2^l schemas.
– Evaluating one string is therefore the implicit evaluation of a sample of exponentially many schemas.

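A small sketch verifying the count: enumerating, for each subset of positions left defined, the corresponding schema yields exactly 2^l of them.

```python
from itertools import combinations

def schemas_of(code: str):
    """Yield every schema the string instantiates: each position is
    either kept fixed or replaced by the don't-care symbol '*'."""
    n = len(code)
    for k in range(n + 1):
        for fixed in combinations(range(n), k):
            yield ''.join(c if i in fixed else '*'
                          for i, c in enumerate(code))

print(sum(1 for _ in schemas_of("0110")))   # 16 == 2**4
```
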
Exploitation vs. exploration
There are two opposing forces at work in a GA:
– Selection pressure: exploitation of acquired knowledge. Selection causes convergence, even to a sub-optimal solution. Gravity.
– Mutation: random exploration of the search space. Mutation favors finding the optimal solution, but causes divergence. Expansive pressure.
Trade-off: emphasizing one of them diminishes the other, with an impact on performance and/or robustness.

The two-armed bandit
– A slot machine has two arms; the payoff of each arm is a normally distributed random variable. The mean of one of the arms is higher, but we don't know which one.
– We have a limited amount of money for exploration and exploitation, simultaneously. How should the sampling be performed?
– Answer: the arm with the best observed mean so far must receive exponentially many more trials than the other one.

GAs and the bandit
Consider proportional selection only, and let H be a schema whose fitness stays above the population average by a factor 1 + c, with c > 0. The schema theorem then gives

E[m(H, t)] ≥ m(H, 0) (1 + c)^t,

so an above-average schema receives an exponentially increasing number of trials, just as the bandit argument prescribes.

The building block hypothesis, revisited
– Mitchell, Holland and Forrest, 1994: the Royal Road functions, created ad hoc to support the BBH.
– The genetic algorithm was carefully tuned to prevent premature convergence.
– Three hill-climbers for comparison: SAHC, NAHC, RMHC.
– RMHC outperforms the GA. Oops!
– The cause: spurious correlation (hitchhiking).

The idealized GA (IGA)
– Works on single strings, not on a population.
– Always preserves the best code found.
– Chooses a new individual at random; if it is better than the current best individual, crosses them.

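A sketch of the IGA as stated above, assuming binary codes; the fitness function and the way a cross is resolved (keep the best of the two parents and their 1-point child) are illustrative assumptions, since the slide does not pin them down:

```python
import random

def iga(fitness, l: int, steps: int) -> str:
    best = ''.join(random.choice('01') for _ in range(l))
    for _ in range(steps):
        # choose a new individual at random
        new = ''.join(random.choice('01') for _ in range(l))
        if fitness(new) > fitness(best):
            cut = random.randrange(1, l)          # cross the two strings
            child = best[:cut] + new[cut:]
            best = max((best, new, child), key=fitness)  # preserve the best
    return best

print(iga(lambda c: c.count('1'), l=16, steps=1000))
```
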
How can we approach the IGA?
It can be approximated by a GA if:
– there is no locus with a fixed value in a high proportion of the population;
– good schemas are favored by strong enough selection, while hitchhiking is avoided;
– the crossover probability is high enough to guarantee that good schemas diffuse through the population.

Alternative GAs
– Elitism. An important feature: it has been proved that elitism is a sufficient condition for convergence to the optimum.
– Deterministic selection schemes, on a population sorted by decreasing fitness (see the sketch below):
  – Nietzsche: the i-th individual is crossed with the (i+1)-th.
  – Vasconcelos: the i-th individual is crossed with the (N−i)-th. A good approximation to the IGA; it has been statistically shown (Kuri, 2002) that GA(Vasconcelos) plus elitism achieves the best performance.
– Self-adaptation: control parameters such as p_m and p_c are encoded in the individuals themselves.
– GA plus hill-climbers, or GA plus catastrophic events.
– Coevolution.

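A sketch of the two deterministic pairing schemes, assuming the population list is sorted by decreasing fitness and using 0-based indexing (the indexing convention is an assumption):

```python
def nietzsche_pairs(pop: list[str]) -> list[tuple[str, str]]:
    """i-th crossed with (i+1)-th: neighbors in the fitness ranking."""
    return [(pop[i], pop[i + 1]) for i in range(len(pop) - 1)]

def vasconcelos_pairs(pop: list[str]) -> list[tuple[str, str]]:
    """i-th crossed with (N-i)-th: best with worst, working inward."""
    n = len(pop)
    return [(pop[i], pop[n - 1 - i]) for i in range(n // 2)]

print(vasconcelos_pairs(["best", "good", "fair", "worst"]))
# [('best', 'worst'), ('good', 'fair')]
```
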
Variations
– Several encoding schemes: binary (Gray, weighted positional, etc.) and non-binary.
– Ad hoc, problem-dependent operators.
– Panmictic (orgiastic) crossover operators.
– Initial populations biased by prior knowledge.
– Several selection schemes: fitness-based (proportional, sigma truncation), tournament, ranking (linear, non-linear).
– Additional features: imagination is the limit.