Genetic Algorithms
José Galaviz Casas, Facultad de Ciencias, UNAM



February 19, 2004. Genetic Algorithms, J. Galaviz.

Contents
–Introduction, motivation, fundamental concepts.
–How genetic algorithms work.
–Operators.
–Theoretical framework.
–Variations around the same theme.

Nature as optimizer
"Although the human mind can create a lot of inventions, it cannot create better, simpler and more direct inventions than nature, since in its creations nothing is missing and nothing is superfluous." (Leonardo da Vinci, Notebook).
Optimal individuals live in very complicated environments, with lots of variables (atmospheric pressure, temperature, predators, resources, chemical substances, etc.).

How (we think that) nature works
Evolutionary process. Selection of well-adapted individuals: the better the fitness, the larger the chance of reproduction.
Searching phenotypes is hard (how could we determine the shape, color, and the physiological and functional features of an optimal marine predator if we didn't know the white shark?).
Easiest way: search genotypes. Encode the problem domain.
Mutation is evolution's engine.

Genetic algorithms
Search methods inspired by natural evolution: let nature be your guide.
John Holland (1960s).
Originally: a model of natural evolution. Now: a method for optimization and machine learning.

How does a GA work?
Encode the domain.
Create an initial population of codes: proposed solutions to the optimization problem.
Evaluate fitness.
Select individuals as a function of their fitness.
Create new proposed solutions by combining the selected individuals.
Create new proposed solutions by randomly altering genetic codes.
Iterate the whole process.

Domain encoding
We must know the entire domain of the problem (the phenotype space).
We define an encoding procedure that maps phenotypes to codes (genetic code, genotype). Typically this is not an injective function: several phenotypes may be mapped to the same genotype.
We are interested in the inverse mapping. Formally this is not a function, but we impose restrictions.
We hope that the actual solution can be obtained from some code in the genotype space. At least we want some code(s) to be mapped close enough to that solution.
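A concrete sketch of such an encoding (an illustration, not from the slides): map a real phenotype x in [0, 1] to an l-bit binary genotype. Only 2^l phenotypes are exactly representable, which is why we can only hope some code lands close enough to the optimum.

```python
L = 8  # code length (illustrative choice)

def encode(x, l=L):
    """Map a phenotype x in [0, 1] to an l-bit genotype (string of '0'/'1')."""
    n = round(x * (2**l - 1))
    return format(n, f'0{l}b')

def decode(code):
    """Inverse mapping: genotype back to a phenotype in [0, 1]."""
    return int(code, 2) / (2**len(code) - 1)

# The round trip is lossy: decode(encode(x)) is close to x, not equal to it.
print(encode(0.5))
print(decode(encode(0.5)))
```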

Evaluation
Fitness function: maps every possible genotype to an aptitude level.
Formally a non-negative function, although this is violated in practice. Greater values for better individuals.
Population-relative: how fast must an impala be in order to survive a cheetah's hunt?

Selection
Proportional to fitness (simple genetic algorithm, SGA), but there are other alternatives.
Survival of the fittest.

New individuals (from selected codes)
Given two (or more) selected individuals, their codes are mixed in order to generate offspring.
We manipulate only codes. The genotypes obtained correspond to some phenotypes in the problem's domain, but generally we don't care about that. Sometimes we need to guarantee that hybrid individuals are valid phenotypes.

New individuals (from random alterations)
Some elements in the code of the new individuals are randomly changed.
Generally we don't care about phenotypes. Sometimes we need to restrict the changes in order to obtain codes of valid phenotypes.

The general procedure
1. Define the problem domain encoding.
2. Generate an initial population of codes (genotypes). This will be called the current generation.
3. Evaluate the fitness of every individual in the current generation.
4. Select two individuals from the current generation.
5. Determine whether the selected individuals must be crossed: a random event with probability p_c.

6. If the selected individuals must be crossed, perform crossover, generating two offspring called new individuals.
7. If the selected individuals must not be crossed, the selected individuals themselves are called new individuals.
8. For every new individual, determine for every element of its code whether mutation must be performed: a random event with probability p_m.
9. Add the two new individuals to the new generation.
10. If the new generation has N individuals, call it the current generation and return to step 3, until some convergence criterion has been met.
11. Otherwise return to step 4.
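The whole procedure can be sketched in a few lines of Python. This is a minimal illustration; the OneMax fitness function and all parameter values are assumptions chosen for the example, not taken from the slides.

```python
import random

def sga(fitness, l=10, N=20, pc=0.7, pm=0.01, generations=50):
    # Step 2: initial population of random binary codes.
    current = [[random.randint(0, 1) for _ in range(l)] for _ in range(N)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in current]  # step 3
        new = []
        while len(new) < N:
            # Step 4: proportional selection of two parents.
            a, b = random.choices(current, weights=fits, k=2)
            # Steps 5-7: 1-point crossover with probability pc.
            if random.random() < pc:
                cut = random.randint(1, l - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            # Step 8: per-bit mutation with probability pm.
            a = [g ^ 1 if random.random() < pm else g for g in a]
            b = [g ^ 1 if random.random() < pm else g for g in b]
            new.extend([a, b])  # step 9
        current = new[:N]  # step 10: the new generation becomes current
    return max(current, key=fitness)

# Usage: maximize the number of ones (OneMax), fitness = sum of bits.
best = sga(fitness=sum)
print(sum(best), best)
```

Note that this loop runs for a fixed number of generations instead of testing a convergence criterion, which keeps the sketch short.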

Proportional selection
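A sketch of the usual roulette-wheel implementation of proportional selection: each individual is chosen with probability fitness_i / sum of all fitnesses (function names are illustrative).

```python
import random

def proportional_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return population[-1]  # guard against floating-point round-off

# Example: 'c' holds half the total fitness, so it should win about
# half of the draws, twice as often as 'a' or 'b'.
counts = {'a': 0, 'b': 0, 'c': 0}
for _ in range(10000):
    counts[proportional_select(['a', 'b', 'c'], [1.0, 1.0, 2.0])] += 1
print(counts)
```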

1-point crossover
Choose a random cut point in the genetic code of each individual. Mix the complementary parts.
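The operation above in code (a sketch; the cut is chosen strictly inside the string so both parents actually contribute):

```python
import random

def one_point_crossover(p1, p2):
    """Cut both parents at the same random point and swap the tails."""
    assert len(p1) == len(p2)
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

c1, c2 = one_point_crossover('000000', '111111')
print(c1, c2)  # e.g. '000111' and '111000' for a cut at position 3
```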

Example
(The slide's table of individuals, with code, phenotype x and fitness f(x) columns, did not survive in this transcript.)
Maximum at x_m = 7/11 ≈ 0.636. Not in the genotype space!

(Worked selection table for populations P0 through P3 with their fitness values; the numeric data did not survive in this transcript.)

Why do GAs work?
We suppose individuals are encoded as binary strings.
Schema: a pattern or template matched by several codewords. Example: 010010010 and 111010110 are instances of the schema *1*010*10; they are also instances of *1*******, *1*0****0, etc.
Defining length δ(H) = distance between the first and last defined positions in the schema: 7 in the example.
Order o(H) = number of defined positions in the schema: 6 in the example.
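The definitions above are easy to check mechanically (helper names are illustrative): '*' is the wild card, o(H) counts fixed positions, and δ(H) is the span between the first and last fixed positions.

```python
def matches(schema, string):
    """True if string is an instance of schema ('*' matches 0 and 1)."""
    return all(s == '*' or s == c for s, c in zip(schema, string))

def order(schema):
    """o(H): number of defined (non-'*') positions."""
    return sum(1 for s in schema if s != '*')

def defining_length(schema):
    """delta(H): distance between first and last defined positions."""
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0

H = '*1*010*10'
print(order(H))            # 6
print(defining_length(H))  # 7
print(matches(H, '010010010'))  # True
```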

The first model
Let m(H, t) be the number of instances of schema H in the t-th generation of a simple genetic algorithm. We assume:
–proportional selection
–1-point crossover: P(H is broken) = δ(H)/(l−1), where l is the string length.
–uniform mutation: P(H survives) ≈ 1 − o(H)·p_m

The schema theorem
The expected number of instances of schema H in the generation at time t+1 satisfies:
E[m(H, t+1)] ≥ m(H, t) · (f(H)/f̄) · [1 − p_c·δ(H)/(l−1)] · [1 − o(H)·p_m]
where f(H) is the average fitness of the instances of H and f̄ is the average fitness of the population.
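A numeric sketch of this lower bound, E[m(H, t+1)] ≥ m(H, t)·(f(H)/f̄)·(1 − p_c·δ(H)/(l−1))·(1 − o(H)·p_m), combining the three disruption factors; all parameter values below are illustrative assumptions.

```python
def schema_bound(m, f_H, f_avg, delta, o, l, pc, pm):
    """Lower bound on E[m(H, t+1)] under proportional selection,
    1-point crossover and uniform mutation."""
    return m * (f_H / f_avg) * (1 - pc * delta / (l - 1)) * (1 - o * pm)

# Schema *1*010*10: delta = 7, o = 6, string length l = 9.
# Even a 20%-above-average schema loses ground here because its
# defining length is large relative to l.
print(schema_bound(m=10, f_H=1.2, f_avg=1.0, delta=7, o=6, l=9, pc=0.7, pm=0.01))
```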

Not very useful
Only the disruptive effects of the genetic operators are considered.
Only a lower bound, and not a very tight one.
Long-term behavior is not accurately predicted.
Very particular.

Other kinds of crossover
–2-point crossover
–Uniform crossover
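Sketches of both variants (function names are illustrative): 2-point crossover swaps the segment between two random cut points, uniform crossover decides position by position.

```python
import random

def two_point_crossover(p1, p2):
    """Swap the segment between two random cut points."""
    i, j = sorted(random.sample(range(1, len(p1)), 2))
    return (p1[:i] + p2[i:j] + p1[j:],
            p2[:i] + p1[i:j] + p2[j:])

def uniform_crossover(p1, p2):
    """Swap each position independently with probability 1/2."""
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            a, b = b, a
        c1.append(a)
        c2.append(b)
    return ''.join(c1), ''.join(c2)

a, b = two_point_crossover('00000000', '11111111')
print(a, b)  # e.g. '00111000' and '11000111'
```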

Some crossover operators

The building block hypothesis
Schemas survive easily if:
–they are short
–they are good (high fitness)
Therefore the solutions obtained by the GA should be constructed from schemas with these characteristics: building blocks.
There is contradictory evidence (hitchhiking).

Implicit parallelism
Every binary string of length l is an instance of 2^l schemas. Evaluating one string implicitly evaluates a sample of exponentially many schemas.
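The 2^l count can be checked directly: a string matches exactly the schemas obtained by independently keeping each position or replacing it with '*' (a small illustrative sketch).

```python
from itertools import product

def schemas_of(string):
    """All schemas that the given binary string is an instance of."""
    choices = [(c, '*') for c in string]  # keep the bit, or wildcard it
    return {''.join(s) for s in product(*choices)}

s = '101'
print(len(schemas_of(s)))  # 2**3 = 8 schemas for a length-3 string
```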

Exploitation vs. exploration
There are two opposite forces at work in a GA:
–Selection pressure: exploitation of acquired knowledge.
–Mutation: random exploration of the search space.
Selection causes convergence, even to a sub-optimal solution. Gravity.
Mutation favors finding the optimal solution, but causes divergence. Expansive pressure.
Trade-off: emphasizing one of them diminishes the other, with an impact on performance and/or robustness.

Two-armed bandit
A bandit machine with two arms; the payoff of each arm is a normally distributed random variable. The mean of one of the arms is higher, but we don't know which one. We have a limited amount of money for exploration and exploitation, simultaneously. How should the sampling be performed?
Answer: the arm with the current best observed mean must receive exponentially many more experiments than the other one.

GAs and the bandit
Consider proportional selection only. Let H be a schema with above-average fitness: f(H) = (1 + c)·f̄, with c > 0.
The schema theorem then says: m(H, t+1) ≥ (1 + c)·m(H, t), hence m(H, t) ≥ m(H, 0)·(1 + c)^t. Above-average schemas receive an exponentially increasing number of trials, as the bandit argument prescribes.
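The exponential allocation is plain compound growth: with f(H) = (1 + c)·f̄ and disruption ignored, m(H, t) ≥ m(H, 0)·(1 + c)^t (a small sketch with illustrative values).

```python
def expected_instances(m0, c, t):
    """Lower bound m(H, t) >= m(H, 0) * (1 + c)**t for an
    above-average schema, ignoring crossover and mutation losses."""
    return m0 * (1 + c) ** t

# A schema just 10% above average roughly 2.6x-es its count in 10 generations.
for t in (0, 5, 10):
    print(t, expected_instances(m0=1, c=0.1, t=t))
```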

The building block hypothesis
Mitchell, Holland and Forrest: Royal Roads. Functions created ad hoc to support the BBH. The genetic algorithm was carefully adapted to prevent premature convergence. Three hillclimbers for comparison purposes: SAHC, NAHC, RMHC.
RMHC outperforms the GA. Oops!
Spurious correlation (hitchhiking).

The idealized GA (IGA)
Works on single strings, not a population.
Always preserves the best code found.
Chooses a new individual randomly; if that individual is better than the current best individual, crosses them.

How can we reach the desired IGA?
It can be approximated by a GA if:
–There are no loci with a fixed value in a high proportion of the population.
–Good schemas are favored by strong enough selection, but hitchhiking is avoided.
–The crossover probability is high enough to guarantee that good schemas diffuse sufficiently through the population.

Alternative GAs
Elitism. An important feature: it has been proved that elitism is a sufficient condition for convergence to the optimum.
Deterministic selection schemes, in a population sorted decreasingly by fitness:
–Nietzsche: the i-th individual is crossed with the (i+1)-th.
–Vasconcelos: the i-th individual is crossed with the (N−i)-th. A good approximation to the IGA. It has been statistically shown (Kuri, 2002) that GA(Vasconcelos) + elitism achieves the best performance.
Self-adaptation: control parameters such as p_m and p_c are encoded in the individuals.
GA + hillclimbers, or GA + catastrophic events. Coevolution.
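The two deterministic pairing schemes above can be sketched as index lists over a population sorted by decreasing fitness (note the slide's 1-based "(N−i)-th" becomes N−1−i with the 0-based indices used here):

```python
def nietzsche_pairs(N):
    """Nietzsche scheme: the i-th individual is crossed with the (i+1)-th."""
    return [(i, i + 1) for i in range(N - 1)]

def vasconcelos_pairs(N):
    """Vasconcelos scheme: the i-th individual is crossed with the
    (N-1-i)-th, i.e. best with worst, second best with second worst..."""
    return [(i, N - 1 - i) for i in range(N // 2)]

print(vasconcelos_pairs(6))  # [(0, 5), (1, 4), (2, 3)]
```

Vasconcelos pairing mixes the extremes of the population, which helps keep diversity while elitism protects the best individual.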

Variations
Several encoding schemes: binary (Gray, weighted positional, etc.), non-binary.
Ad-hoc operators: problem-dependent.
Panmictic (orgiastic) crossover operators.
Knowledge-biased initial population.
Several selection schemes: fitness-based (proportional, sigma truncation), tournament, ranking (linear, non-linear).
Additional features: imagination is the limit.