Evolution Strategies (Chapter 4). A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing.

Evolution strategies Chapter 4

ES quick overview
Developed: Germany in the 1970s
Early names: I. Rechenberg, H.-P. Schwefel
Typically applied to:
– numerical optimisation
Attributed features:
– fast
– good optimizer for real-valued optimisation
– relatively well-developed theory
Special:
– self-adaptation of (mutation) parameters is standard

ES technical summary tableau
Representation: real-valued vectors
Recombination: discrete or intermediary
Mutation: Gaussian perturbation
Parent selection: uniform random
Survivor selection: (μ,λ) or (μ+λ)
Specialty: self-adaptation of mutation step sizes

Introductory example
Task: minimise f : R^n → R
Algorithm: "two-membered ES" using
– Vectors from R^n directly as chromosomes
– Population size 1
– Only mutation, creating one child
– Greedy selection

Introductory example: pseudocode
Set t = 0
Create initial point x^t = ⟨x_1^t, …, x_n^t⟩
REPEAT UNTIL (TERMINATION CONDITION satisfied) DO
  Draw z_i from a normal distribution for all i = 1, …, n
  y_i^t = x_i^t + z_i
  IF f(x^t) < f(y^t) THEN x^{t+1} = x^t
  ELSE x^{t+1} = y^t
  FI
  Set t = t + 1
OD
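The loop above can be sketched in Python as follows; this is a minimal illustration with a fixed step size and an assumed sphere objective (the function name and parameter defaults are ours, not from the slides):

```python
import random

def two_membered_es(f, x0, sigma=0.1, max_iters=10000):
    """Sketch of the "two-membered" (1+1)-ES from the pseudocode:
    one parent, one mutated child per step, greedy survivor selection.
    The step size sigma is fixed here; the 1/5 rule (next slide) adapts it."""
    x = list(x0)
    for _ in range(max_iters):
        # mutate every coordinate by adding Gaussian noise
        y = [xi + random.gauss(0.0, sigma) for xi in x]
        if f(y) < f(x):      # keep the child only if it improves (minimisation)
            x = y
    return x

# usage: minimise the sphere function f(x) = sum of x_i^2
best = two_membered_es(lambda v: sum(t * t for t in v), [5.0, -3.0])
```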

Introductory example: mutation mechanism
z values drawn from a normal distribution N(ξ, σ)
– mean ξ is set to 0
– the deviation σ is called the mutation step size
σ is varied on the fly by the "1/5 success rule":
this rule resets σ after every k iterations by
– σ = σ / c  if p_s > 1/5
– σ = σ · c  if p_s < 1/5
– σ = σ      if p_s = 1/5
where p_s is the fraction of successful mutations and 0.8 ≤ c ≤ 1
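A small Python sketch of the rule exactly as stated (the function name and the default c are illustrative):

```python
def one_fifth_rule(sigma, p_s, c=0.9):
    """Rechenberg's 1/5 success rule: adapt the mutation step size sigma
    from the observed fraction p_s of successful mutations over the last
    k iterations. Requires 0.8 <= c <= 1."""
    if p_s > 1 / 5:
        return sigma / c   # too many successes: search too cautious, widen
    if p_s < 1 / 5:
        return sigma * c   # too few successes: search too bold, narrow
    return sigma           # exactly 1/5: leave sigma unchanged
```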

Illustration of normal distribution (figure)

Another historical example: the jet nozzle experiment
Task: to optimise the shape of a jet nozzle
Approach: random mutations to the shape + selection
(Figure: initial shape vs. final shape)

Another historical example: the jet nozzle experiment cont'd

The famous jet nozzle experiment (movie)

Representation
Chromosomes consist of three parts:
– Object variables: x_1, …, x_n
– Strategy parameters:
  - Mutation step sizes: σ_1, …, σ_{n_σ}
  - Rotation angles: α_1, …, α_{n_α}
Not every component is always present
Full size: ⟨x_1, …, x_n, σ_1, …, σ_n, α_1, …, α_k⟩ where k = n(n−1)/2 (the number of i, j pairs)

Mutation
Main mechanism: changing values by adding random noise drawn from a normal distribution
x'_i = x_i + N(0, σ)
Key idea:
– σ is part of the chromosome ⟨x_1, …, x_n, σ⟩
– σ is also mutated into σ' (see later how)
Thus: the mutation step size σ coevolves with the solution x

Mutate σ first
Net mutation effect: ⟨x, σ⟩ → ⟨x', σ'⟩
Order is important:
– first σ → σ' (see later how)
– then x → x' = x + N(0, σ')
Rationale: the new ⟨x', σ'⟩ is evaluated twice
– Primary: x' is good if f(x') is good
– Secondary: σ' is good if the x' it created is good
Reversing the mutation order would not work

Mutation case 1: uncorrelated mutation with one σ
Chromosomes: ⟨x_1, …, x_n, σ⟩
σ' = σ · exp(τ · N(0,1))
x'_i = x_i + σ' · N_i(0,1)
Typically the "learning rate" τ ∝ 1/√n
And we have a boundary rule: σ' < ε_0 ⇒ σ' = ε_0
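The one-σ scheme above can be sketched as follows (a minimal illustration; the function name and the ε_0 default are ours):

```python
import math
import random

def mutate_one_sigma(x, sigma, eps0=1e-6):
    """Uncorrelated self-adaptive mutation with a single step size:
    mutate sigma first, then use the new sigma to perturb every coordinate."""
    n = len(x)
    tau = 1.0 / math.sqrt(n)             # learning rate, proportional to 1/sqrt(n)
    sigma_new = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    sigma_new = max(sigma_new, eps0)     # boundary rule: sigma' < eps0 => eps0
    x_new = [xi + sigma_new * random.gauss(0.0, 1.0) for xi in x]
    return x_new, sigma_new
```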

Mutants with equal likelihood (figure)
Circle: mutants having the same chance to be created

Mutation case 2: uncorrelated mutation with n σ's
Chromosomes: ⟨x_1, …, x_n, σ_1, …, σ_n⟩
σ'_i = σ_i · exp(τ' · N(0,1) + τ · N_i(0,1))
x'_i = x_i + σ'_i · N_i(0,1)
Two learning rate parameters:
– τ' overall learning rate
– τ coordinate-wise learning rate
τ' ∝ 1/√(2n) and τ ∝ 1/√(2√n)
And σ'_i < ε_0 ⇒ σ'_i = ε_0
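The n-σ scheme can be sketched in the same way (again a minimal illustration with names of our choosing):

```python
import math
import random

def mutate_n_sigmas(x, sigmas, eps0=1e-6):
    """Uncorrelated self-adaptive mutation with one step size per coordinate."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)        # overall learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))   # coordinate-wise learning rate
    common = tau_prime * random.gauss(0.0, 1.0) # one draw shared by all coordinates
    new_sigmas = [max(s * math.exp(common + tau * random.gauss(0.0, 1.0)), eps0)
                  for s in sigmas]              # boundary rule applied per sigma
    x_new = [xi + s * random.gauss(0.0, 1.0) for xi, s in zip(x, new_sigmas)]
    return x_new, new_sigmas
```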

Mutants with equal likelihood (figure)
Ellipse: mutants having the same chance to be created

Mutation case 3: correlated mutations
Chromosomes: ⟨x_1, …, x_n, σ_1, …, σ_n, α_1, …, α_k⟩ where k = n(n−1)/2
and the covariance matrix C is defined as:
– c_ii = σ_i²
– c_ij = 0 if i and j are not correlated
– c_ij = ½ (σ_i² − σ_j²) tan(2 α_ij) if i and j are correlated
Note the numbering / indices of the α's

Correlated mutations cont'd
The mutation mechanism is then:
σ'_i = σ_i · exp(τ' · N(0,1) + τ · N_i(0,1))
α'_j = α_j + β · N(0,1)
x' = x + N(0, C')
– x stands for the vector ⟨x_1, …, x_n⟩
– C' is the covariance matrix C after mutation of the α values
τ' ∝ 1/√(2n), τ ∝ 1/√(2√n), and β ≈ 5°
σ'_i < ε_0 ⇒ σ'_i = ε_0 and |α'_j| > π ⇒ α'_j = α'_j − 2π · sign(α'_j)
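A sketch for the smallest non-trivial case, n = 2 with a single rotation angle (the general scheme composes k rotations; this 2-D simplification and its function name are ours, while the β ≈ 5° default follows the slide):

```python
import math
import random

def mutate_correlated_2d(x, sigmas, alpha, beta=math.radians(5), eps0=1e-6):
    """Correlated self-adaptive mutation for n = 2 (so k = 1 angle).
    Step sizes and the angle are mutated first; the correlated Gaussian
    perturbation is then an axis-aligned sample rotated by the new angle."""
    tau_prime = 1.0 / math.sqrt(2.0 * 2)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(2))
    common = tau_prime * random.gauss(0.0, 1.0)
    s = [max(si * math.exp(common + tau * random.gauss(0.0, 1.0)), eps0)
         for si in sigmas]
    a = alpha + beta * random.gauss(0.0, 1.0)
    if abs(a) > math.pi:                      # wrap the angle back into (-pi, pi]
        a -= 2.0 * math.pi * math.copysign(1.0, a)
    # axis-aligned Gaussian sample, then rotation by the (mutated) angle
    z0 = s[0] * random.gauss(0.0, 1.0)
    z1 = s[1] * random.gauss(0.0, 1.0)
    dx0 = math.cos(a) * z0 - math.sin(a) * z1
    dx1 = math.sin(a) * z0 + math.cos(a) * z1
    return [x[0] + dx0, x[1] + dx1], s, a
```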

Mutants with equal likelihood (figure)
Ellipse: mutants having the same chance to be created

Recombination
Creates one child
Acts per variable / position by either
– Averaging parental values, or
– Selecting one of the parental values
From two or more parents by either:
– Using two selected parents to make a child
– Selecting two parents anew for each position

Names of recombinations
– z_i = (x_i + y_i)/2, two fixed parents: local intermediary
– z_i = (x_i + y_i)/2, two parents selected anew for each i: global intermediary
– z_i is x_i or y_i chosen randomly, two fixed parents: local discrete
– z_i is x_i or y_i chosen randomly, two parents selected anew for each i: global discrete
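The four variants can be sketched in one function (the mode names are ours):

```python
import random

def recombine(parents, mode="global_discrete"):
    """Sketch of the four ES recombination variants named above.
    'local_*' picks two parents once; 'global_*' re-picks two parents
    for every position. '*_discrete' copies one parental value;
    '*_intermediary' averages the two parental values."""
    n = len(parents[0])
    fixed = random.sample(parents, 2)       # the two fixed parents (local modes)
    child = []
    for i in range(n):
        p1, p2 = fixed if mode.startswith("local") else random.sample(parents, 2)
        if mode.endswith("discrete"):
            child.append(random.choice((p1[i], p2[i])))
        else:  # intermediary
            child.append((p1[i] + p2[i]) / 2.0)
    return child
```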

Parent selection
Parents are selected by a uniform random distribution whenever an operator needs one or more
Thus: ES parent selection is unbiased - every individual has the same probability to be selected
Note that in ES "parent" means a population member (in GAs: a population member selected to undergo variation)

Survivor selection
Applied after creating λ children from the μ parents by mutation and recombination
Deterministically chops off the "bad stuff"
Basis of selection is either:
– The set of children only: (μ,λ)-selection
– The set of parents and children: (μ+λ)-selection

Survivor selection cont'd
(μ+λ)-selection is an elitist strategy
(μ,λ)-selection can "forget"
Often (μ,λ)-selection is preferred because it is:
– Better at leaving local optima
– Better at following moving optima
– Free of a drawback of the + strategy: there, bad σ values can survive in ⟨x, σ⟩ too long if their host x is very fit
Selective pressure in ES is very high (λ ≈ 7·μ is the common setting)
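Both schemes can be sketched as one deterministic truncation step (assuming minimisation; the function name is ours):

```python
def survivor_selection(parents, children, fitness, mu, plus=False):
    """Sketch of (mu,lambda) vs (mu+lambda) survivor selection:
    rank the selection pool by fitness (lower is better) and
    deterministically keep the best mu individuals."""
    pool = parents + children if plus else list(children)
    return sorted(pool, key=fitness)[:mu]

# usage: (2,4)-selection on scalar "individuals" scored by absolute value
kept = survivor_selection([5, -6], [1, -3, 2, 7], abs, mu=2)
```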

Self-adaptation illustrated
Given a dynamically changing fitness landscape (optimum location shifted every 200 generations)
A self-adaptive ES is able to:
– follow the optimum and
– adjust the mutation step size after every shift!

Self-adaptation illustrated cont'd (figure)
Changes in the fitness values (left) and the mutation step sizes (right)

Prerequisites for self-adaptation
– μ > 1, to carry different strategies
– λ > μ, to generate an offspring surplus
– Not "too" strong selection, e.g., λ ≈ 7·μ
– (μ,λ)-selection, to get rid of misadapted σ's
– Mixing strategy parameters by (intermediary) recombination on them

Example application: the cherry brandy experiment
Task: to create a colour mix yielding a target colour (that of a well-known cherry brandy)
Ingredients: water + red, yellow, blue dye
Representation: ⟨w, r, y, b⟩; no self-adaptation!
Values scaled to give a predefined total volume (30 ml)
Mutation: low / medium / high σ values used with equal chance
Selection: (1,8) strategy

Example application: cherry brandy experiment cont'd
Fitness: students actually make the mix and compare it with the target colour
Termination criterion: student satisfied with the mixed colour
A solution is mostly found within 20 generations
Accuracy is very good

Example application: the Ackley function (Bäck et al. '93)
The Ackley function (here used with n = 30):
f(x) = −20 · exp(−0.2 · √((1/n) Σ x_i²)) − exp((1/n) Σ cos(2π x_i)) + 20 + e
Evolution strategy:
– Representation:
  - −30 < x_i < 30 (coincidence of the 30's!)
  - 30 step sizes
– (30,200) selection
– Termination: after a fixed number of fitness evaluations
– Results: average best solution on the order of 10⁻⁸ (very good)
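The Ackley function as commonly defined can be sketched directly from the formula above:

```python
import math

def ackley(x):
    """The standard Ackley test function; global minimum f(0, ..., 0) = 0."""
    n = len(x)
    sum_sq = sum(xi * xi for xi in x)
    sum_cos = sum(math.cos(2.0 * math.pi * xi) for xi in x)
    return (-20.0 * math.exp(-0.2 * math.sqrt(sum_sq / n))
            - math.exp(sum_cos / n) + 20.0 + math.e)
```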