Genetic Algorithms & Simulated Evolution

Genetic Algorithms & Simulated Evolution (Chapter 4.3+)
- Concept of Genetic Algorithms
- Simulated Evolution Algorithm
- Initialization and Evaluation
- Selecting the Fittest
- Producing New Individuals
- Genetic Algorithms as Search
©2001-2004 James D. Skrentny, from notes by C. Dyer et al.

Genetic Algorithms (GA)
Inspired by natural evolution:
- living things evolved into more successful organisms
- offspring exhibit some traits of each parent
- hereditary traits are determined by genes
- genetic instructions are contained in chromosomes
- chromosomes are strands of DNA
- DNA is composed of base pairs (A, C, G, T) that, in meaningful combinations, encode hereditary traits

Genetic Algorithms (GA)
Mechanisms of evolutionary change:
- crossover: the random exchange of parents' chromosomes during reproduction, resulting in offspring that have some traits of each parent
- crossover requires genetic diversity among the parents to ensure sufficiently varied offspring

Genetic Algorithms (GA)
Mechanisms of evolutionary change:
- mutation: the rare occurrence of errors during the copying of chromosomes, resulting in:
  - nonsensical/deadly changes that produce organisms that can't survive
  - beneficial changes that produce "stronger" organisms
  - harmless changes that aren't beneficial and don't produce improved organisms

Genetic Algorithms (GA)
Mechanisms of evolutionary change:
- natural selection: the fittest survive in a competitive environment, resulting in better organisms
  - individuals with better survival traits generally survive longer
  - this provides a better chance of reproducing and passing the successful traits on to offspring
  - over many generations the species improves, since better traits come to outnumber weaker ones

Genetic Algorithms (GA)
IDEA: keep a population of individuals that are complete (or partial) solutions
- explore the solution space by having these individuals interact and compete
  - interaction produces new individuals
  - competition eliminates weak individuals
- after multiple generations a strong individual (i.e. solution) should be found

Representation of Individuals
Some problems have solutions that can be represented as a vector of values:
- e.g. the satisfiability problem (SAT): determine whether a propositional logic statement is satisfiable
  (P1 ∧ P2) ∨ (P1 ∧ ¬P3) ∨ (P1 ∧ ¬P4) ∨ (¬P3 ∧ ¬P4)
- each element corresponds to a proposition with a truth value of either true (i.e. 1) or false (i.e. 0)
  vector: P1 P2 P3 P4
  values:  1  0  1  1
Some problems have solutions that can be represented as a permutation of values:
- e.g. the traveling salesperson problem (TSP)
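The bit-vector SAT encoding above can be sketched in code. This is an illustrative assumption (the slides give only the formula and the vector); the fitness here is the #ofTermsSatisfied measure used later in the deck, applied to the slide's DNF formula.

```python
# Each individual is a list of 0/1 truth values for P1..P4.
# Fitness counts how many terms of the slide's formula
# (P1 ^ P2) v (P1 ^ ~P3) v (P1 ^ ~P4) v (~P3 ^ ~P4) the assignment satisfies.

def num_terms_satisfied(bits):
    p1, p2, p3, p4 = bits
    terms = [
        p1 and p2,
        p1 and not p3,
        p1 and not p4,
        (not p3) and (not p4),
    ]
    return sum(bool(t) for t in terms)
```

Note that the slide's example vector 1 0 1 1 satisfies none of the four terms, while 1 1 0 0 satisfies all of them.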

Simulated Evolution Algorithm
Individual evolutionaryAlgorithm(Problem problem)
Problem: object that contains
- initialize: returns an initial population of individuals
  - a population size (e.g. 100)
- evaluate: sorts the population of individuals by fitness
- atFitnessThreshold: tests whether the desired solution quality has been reached
- select: chooses the individuals that survive
- alter: generates new individuals
  - a method that implements crossover
  - the fraction of the population to be generated by crossover (e.g. 0.6)
  - a method that implements mutation
  - a mutation rate (e.g. 0.001)
Individual: the best solution found

Simulated Evolution Algorithm

Individual evolutionaryAlgorithm(Problem problem) {
    Population currGen = problem.initialize();
    problem.evaluate(currGen);
    while (!problem.atFitnessThreshold(currGen)) {
        Population nextGen = problem.select(currGen);
        problem.alter(nextGen);
        problem.evaluate(nextGen);
        currGen = nextGen;
    }
    return problem.mostFitIndividual(currGen);
}
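The loop above can be made runnable as a sketch. The method names mirror the pseudocode; the concrete problem (maximize the number of 1-bits in a 10-bit string, "one-max"), the population size, and the selection/alteration policies are illustrative assumptions, not from the slides.

```python
import random

class OneMaxProblem:
    def __init__(self, length=10, pop_size=20, threshold=10):
        self.length, self.pop_size, self.threshold = length, pop_size, threshold

    def initialize(self):
        return [[random.randint(0, 1) for _ in range(self.length)]
                for _ in range(self.pop_size)]

    def fitness(self, ind):
        return sum(ind)

    def evaluate(self, pop):
        pop.sort(key=self.fitness, reverse=True)   # best first

    def at_fitness_threshold(self, pop):
        return self.fitness(pop[0]) >= self.threshold

    def select(self, pop):
        # Keep copies of the top half (a deterministic "parents survive" policy).
        return [list(ind) for ind in pop[:self.pop_size // 2]]

    def alter(self, pop):
        # Refill the population with bit-flip mutants of surviving parents.
        while len(pop) < self.pop_size:
            child = list(random.choice(pop))
            child[random.randrange(len(child))] ^= 1
            pop.append(child)

    def most_fit_individual(self, pop):
        return max(pop, key=self.fitness)

def evolutionary_algorithm(problem):
    curr_gen = problem.initialize()
    problem.evaluate(curr_gen)
    while not problem.at_fitness_threshold(curr_gen):
        next_gen = problem.select(curr_gen)
        problem.alter(next_gen)
        problem.evaluate(next_gen)
        curr_gen = next_gen
    return problem.most_fit_individual(curr_gen)
```

Because the best parents survive each generation unchanged, the best fitness never decreases, and the loop terminates once an all-ones individual appears.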

Initialization: Seeding the Population
Initialization sets the beginning population of individuals from which future generations are produced.
Concerns:
- size of the initial population: experimentally determined for the problem
- diversity of the initial population (genetic diversity)
  - a common issue resulting from a lack of diversity is premature convergence to a non-optimal solution

Initialization: Seeding the Population
How is a diverse initial population generated?
- uniformly random: randomly generate individuals from the solution space with a uniform distribution
- grid initialization: choose individuals at regular "intervals" in the solution space
- non-clustering: require individuals to be a predefined "distance" away from those already in the population
- local optimization: use another technique (e.g. hill climbing) to find an initial population of local optima; this doesn't ensure diversity, but guarantees the solution is no worse than the local optima
The definitions of "interval" and "distance" are problem dependent.
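Two of the seeding strategies above can be sketched as follows. The solution space here (real-valued vectors in [lo, hi]^dim with Euclidean distance) is an assumption, since "interval" and "distance" are problem dependent as the slide notes.

```python
import random

def uniform_random_init(pop_size, dim, lo, hi, rng=random):
    # Uniformly random seeding: each gene drawn independently from [lo, hi].
    return [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]

def non_clustering_init(pop_size, dim, lo, hi, min_dist, rng=random):
    # Non-clustering seeding via rejection sampling: only admit candidates at
    # least min_dist (Euclidean) away from everyone already in the population.
    pop = []
    while len(pop) < pop_size:
        cand = [rng.uniform(lo, hi) for _ in range(dim)]
        if all(sum((a - b) ** 2 for a, b in zip(cand, other)) ** 0.5 >= min_dist
               for other in pop):
            pop.append(cand)
    return pop
```

Rejection sampling can loop for a long time if min_dist is large relative to the space; a practical version would cap the number of attempts.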

Evaluation: Ranking by Fitness
Evaluation ranks the individuals by some fitness measure that corresponds to the quality of the individual solutions.
For example, given individual i:
- classification: (correct(i))^2
- TSP: distance(i)
- SAT: #ofTermsSatisfied(i)
- walking animation: subjective rating
Often fitness functions have local minima.

Selecting the Fittest
Selection chooses which individuals survive and possibly reproduce in the next generation.
Selection depends on the evaluation function:
- if too dependent, then like greedy search it may settle on a non-optimal solution (e.g. keeping only the top n individuals is too dependent)
- if not dependent enough, then it may never converge to a solution
Nature doesn't eliminate all "unfit" genes; they usually become recessive for a long period of time, and may later mutate into something useful.

Selecting the Fittest
Selection may result in crowding:
- the most fit individuals quickly reproduce, so that a large percentage of the entire population looks very similar
- this reduces diversity in the population
- it may hinder the long-run progress of the algorithm

Selecting the Fittest
Deterministic Selection:
- relies heavily on evaluation
- converges fastest
Two approaches:
- next generation is the parents and their children
  - parents are the best of the current generation
  - parents produce children and survive into the next generation
- next generation is only the children
  - parents are used only to produce children
  - parents don't survive (counters early convergence)
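The two deterministic schemes above can be sketched as follows. The population is assumed sorted best-first, and make_children is a hypothetical stand-in for the crossover/mutation step (not a name from the slides).

```python
def parents_plus_children(pop, num_parents, make_children):
    # First approach: the best parents survive alongside their children.
    parents = pop[:num_parents]
    return parents + make_children(parents, len(pop) - num_parents)

def children_only(pop, num_parents, make_children):
    # Second approach: parents only breed; none survive into the next
    # generation (this counters early convergence).
    return make_children(pop[:num_parents], len(pop))
```

In evolution-strategies jargon these correspond to "plus" and "comma" selection.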

Selecting the Fittest
Proportional Fitness Selection:
- each individual is selected with probability proportional to its evaluation score
- even the worst individual has a chance to survive, which helps prevent stagnation in the population
Two approaches:
- rank selection: an individual is selected with a probability proportional to its rank in the population sorted by fitness
- proportional selection: an individual is selected with probability Fitness(individual) / sum of Fitness over all individuals

Selecting the Fittest
Rank selection example. Given the following population and fitness:

Indiv.  Fit.  Rank  Prob.
A        5     5     7%
B       20     1    33%
C       11     2    27%
D        8     3    20%
E        6     4    13%

Sum the ranks: 5 + 1 + 2 + 3 + 4 = 15
Determine probabilities: ((5+1) - Rank) / 15
e.g. for C: (6 - 2)/15 = 4/15 ≈ 27/100

Selecting the Fittest
Proportional selection example. Given the following population and fitness:

Indiv.  Fit.  Prob.
A        5    10%
B       20    40%
C       11    22%
D        8    16%
E        6    12%

Sum the fitness: 5 + 20 + 11 + 8 + 6 = 50
Determine probabilities: Fitness(i) / 50
e.g. for C: 11/50 = 22/100
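Both probability rules can be sketched and checked against the two worked examples above (fitnesses A=5, B=20, C=11, D=8, E=6). The dict-based representation is an assumption for illustration.

```python
def proportional_probs(fitness):
    # Proportional selection: Fitness(i) / sum of all fitnesses.
    total = sum(fitness.values())
    return {ind: f / total for ind, f in fitness.items()}

def rank_probs(fitness):
    # Rank selection: rank 1 = most fit; selection probability is
    # proportional to (s + 1 - rank), normalized by 1 + 2 + ... + s.
    s = len(fitness)
    ordered = sorted(fitness, key=fitness.get, reverse=True)
    denom = s * (s + 1) // 2
    return {ind: (s - r) / denom for r, ind in enumerate(ordered)}
```

With the slide's population, proportional_probs gives C a probability of 11/50 = 22% and rank_probs gives C 4/15 ≈ 27%, matching the examples.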

Selecting the Fittest
Tournament Selection:
- randomly select two individuals; the one with the higher rank goes on and reproduces
- only the rank matters, not the spread between the two fitness scores
- the probability that an individual reproduces into the next generation is (2s - 2r + 1) / s^2, where s is the size of the population and r is the rank of the "winning" individual
- can be generalized to select the best of n individuals

Selecting the Fittest
Tournament selection example. Given the following population and fitness:

Indiv.  Fit.  Rank  Prob.
A        5     5    1/25 =  4%
B       20     1    9/25 = 36%
C       11     2    7/25 = 28%
D        8     3    5/25 = 20%
E        6     4    3/25 = 12%

Select B and D: B wins.
Probability: (2s - 2r + 1) / s^2 with s = 5
e.g. for C: (2·5 - 2·2 + 1) / 5^2 = 7/25 = 28/100
A has a 4% probability: it reproduces only if it is chosen as both competitors.
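A sketch of binary tournament selection and the closed-form win probability quoted above. The formula assumes the two competitors are drawn uniformly with replacement, which is how the slide's 4% for A (chosen for both slots) comes out.

```python
import random

def tournament_select(ranked, rng=random):
    # ranked: list of individuals, best first (rank 1 at index 0).
    # Draw two competitors with replacement; the better-ranked one wins.
    a, b = rng.choice(ranked), rng.choice(ranked)
    return a if ranked.index(a) <= ranked.index(b) else b

def tournament_prob(s, r):
    # Probability that the individual of rank r wins a binary tournament
    # in a population of size s: (2s - 2r + 1) / s^2.
    return (2 * s - 2 * r + 1) / s ** 2
```

Summing tournament_prob over r = 1..s gives exactly 1, so the formula is a proper distribution over ranks.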

Producing New Individuals
Alteration is used to produce new individuals.
Mutation: randomly change an individual
- e.g. TSP: two-swap, two-interchange
- e.g. SAT: bit flip
Parameters:
- mutation rate
- size of the mutation
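The two mutation operators named above can be sketched as follows: a per-gene bit flip for vector (SAT-style) individuals and a two-swap for permutation (TSP-style) tours. The mutation_rate default echoes the 0.001 figure from the algorithm slide; the rng parameter is an assumption.

```python
import random

def bit_flip_mutation(bits, mutation_rate=0.001, rng=random):
    # Flip each bit independently with probability mutation_rate.
    return [b ^ 1 if rng.random() < mutation_rate else b for b in bits]

def two_swap_mutation(tour, rng=random):
    # Exchange the cities at two distinct random positions,
    # leaving a valid permutation.
    i, j = rng.sample(range(len(tour)), 2)
    child = list(tour)
    child[i], child[j] = child[j], child[i]
    return child
```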

Producing New Individuals
Crossover for vector representations:
- pick one or more pairs of individuals as parents and randomly swap their segments
- also known as "cut and splice"
Parameters:
- crossover rate
- number of crossover points
- positions of the crossover points

Producing New Individuals
1-point crossover: pick a dividing point in the parents' vectors and swap the segments.
For example:
- crossover point: after the 4th digit
- given parents: 1101101101 and 0001001000
- children produced: 1101 + 001000 and 0001 + 101101
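The operation is a pair of slices. This sketch reproduces the slide's example when the cut point is after the 4th gene:

```python
def one_point_crossover(p1, p2, point):
    # Swap everything after the dividing point, producing two children.
    return p1[:point] + p2[point:], p2[:point] + p1[point:]
```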

Producing New Individuals
N-point crossover: a generalization of 1-point crossover
- pick n dividing points in the parents' vectors and splice together alternating segments
Uniform crossover:
- the value of each element of the vector is randomly chosen from the values in the corresponding elements of the two parents
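Both generalizations can be sketched as follows. In n_point_crossover the cut points are supplied by the caller; in uniform_crossover each child gene comes from one parent or the other with equal probability (the 0.5 mixing ratio is an assumption).

```python
import random

def n_point_crossover(p1, p2, points):
    # Splice together alternating segments, switching source at each cut.
    c1, c2 = [], []
    src1, src2 = p1, p2
    prev = 0
    for point in sorted(points) + [len(p1)]:
        c1 += src1[prev:point]
        c2 += src2[prev:point]
        src1, src2 = src2, src1
        prev = point
    return c1, c2

def uniform_crossover(p1, p2, rng=random):
    # Each position independently keeps or swaps the parents' values.
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if rng.random() < 0.5:
            a, b = b, a
        c1.append(a)
        c2.append(b)
    return c1, c2
```

With n = 1 and a single cut point, n_point_crossover reduces to the 1-point operator from the previous slide.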

Producing New Individuals
Example technique for permutation representations (TSP): Edge Recombination
- a highly successful crossover operator for TSP
- assumes undirected edges
- recognizes that the important trait of a tour is its edges
- uses an "edge map" containing, for each city, an edge list of adjacent cities
- connects cities with the fewest edges first, to prevent them from becoming isolated

Producing New Individuals
Edge Recombination Algorithm:
1. Choose from the edge map the city with the fewest adjacent cities. Break ties randomly. Call this first city curr.
2. Remove all occurrences of curr from the edge map.
3. Add curr as the next city in the child tour. If curr has entries in its edge list, go to 4; else, go to 5.
4. Choose from the cities in curr's edge list the one with the fewest cities in its own edge list. Break ties randomly. Make it curr, and go to 2.
5. If no cities remain in the edge map, go to 6. Otherwise, set curr to one of the remaining cities, introduce a new edge (a mutation!) between this city and the previous curr, and go to 2.
6. Add an edge between the first and last cities of the tour (may be a mutation).
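The steps above can be sketched as one function. Parent tours are given as city lists and treated as undirected cycles; ties are broken with the rng, per the slide's "break ties randomly". The final closing edge of step 6 is implicit in the returned tour.

```python
import random

def edge_recombination(parent1, parent2, rng=random):
    # Build the edge map: each city's set of neighbours in either parent.
    edge_map = {city: set() for city in parent1}
    for tour in (parent1, parent2):
        for i, city in enumerate(tour):
            edge_map[city].add(tour[i - 1])
            edge_map[city].add(tour[(i + 1) % len(tour)])

    # Step 1: start from a city with the fewest adjacent cities.
    curr = min(edge_map, key=lambda c: (len(edge_map[c]), rng.random()))
    child = []
    while True:
        # Step 2: remove curr from every edge list; step 3: append it.
        neighbours = edge_map.pop(curr)
        for edges in edge_map.values():
            edges.discard(curr)
        child.append(curr)
        if not edge_map:
            break                              # step 6: tour closes implicitly
        remaining = neighbours & edge_map.keys()
        if remaining:
            # Step 4: prefer the neighbour whose own edge list is shortest.
            curr = min(remaining, key=lambda c: (len(edge_map[c]), rng.random()))
        else:
            # Step 5: curr was isolated; introduce a new edge (a mutation!).
            curr = rng.choice(list(edge_map))
    return child
```

When both parents are the same tour, the edge map contains only that tour's cycle edges, so the child is the parent tour up to rotation and direction.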

Genetic Algorithms (GA) as Search
- GA is similar to a randomized beam search
- GA is a kind of hill-climbing (HC) search, but a significant difference is that in GAs it is generally a good idea to create a population of individuals from the local maxima
- overall, GAs have fewer problems with local maxima than back-propagation neural networks (later...)

Genetic Algorithms (GA) as Search
Local maxima can cause problems:
- individuals get stuck at good but not optimal solutions
- any small mutation gives worse fitness
- since mutation is a random process, a large mutation is possible, and this may help get individuals unstuck
- crossover can also help get out of local maxima

Summary
- keep a diverse population of individuals
- explore the solution space by having these individuals interact and compete
  - producing new individuals: crossover, mutation
  - eliminating weaklings: selection techniques
- after multiple generations a strong individual (i.e. solution) should be found

Summary
Applicable to a wide range of problems:
- optimization, like TSP
- inductive concept learning
- scheduling
- animation of motion
The results can be very good on some problems, and rather poor on others.
GA is very slow if only mutation is used; crossover makes the algorithm much faster.