Neural and Evolutionary Computing - Lecture 6


Evolution Strategies
- Particularities
- General structure
- Recombination
- Mutation
- Selection
- Adaptive and self-adaptive variants

Particularities
Evolution strategies: evolutionary techniques used for solving continuous optimization problems.
History: the first strategy was developed in 1964 by Bienert, Rechenberg and Schwefel (students at the Technical University of Berlin) in order to design a flexible pipe.
Data encoding: real (the individuals are vectors of floating-point values belonging to the definition domain of the objective function).
Main operator: mutation.
Particularity: self-adaptation of the mutation control parameters.

General structure
Structure of the algorithm:
  Population initialization
  Population evaluation
  REPEAT
    construct offspring by recombination
    change the offspring by mutation
    offspring evaluation
    survivors selection
  UNTIL <stopping condition>
Problem class: find x* in D ⊆ R^n such that f(x*) <= f(x) for all x in D.
The population consists of elements from D (vectors with real components).
Remark: a configuration is better if the value of f is smaller.
Stopping conditions:
- criteria related to convergence (e.g. the value of f)
- resource-related criteria (e.g. number of generations, number of function evaluations (nfe))
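The loop above can be sketched in Python; this is a minimal illustration, where the sphere objective, the population sizes and the fixed step size are assumptions chosen for the example, not taken from the slides:

```python
import random

random.seed(1)  # for reproducibility of the example

def sphere(x):
    # illustrative objective: f(x) = sum of squares, minimum at the origin
    return sum(xi * xi for xi in x)

def evolve(f, n, mu=10, lam=70, sigma=0.3, generations=100):
    """Minimal (mu, lambda)-ES skeleton following the slide's loop."""
    # population initialization + evaluation (evaluation happens via f below)
    pop = [[random.uniform(-5, 5) for _ in range(n)] for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            # construct offspring by recombination (intermediate, two parents)
            p1, p2 = random.sample(pop, 2)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]
            # change the offspring by mutation (Gaussian perturbation)
            child = [xi + random.gauss(0, sigma) for xi in child]
            offspring.append(child)
        # survivors selection: keep the best mu offspring (comma strategy)
        pop = sorted(offspring, key=f)[:mu]
    return min(pop, key=f)

best = evolve(sphere, n=2)
```

With a fixed step size the search stagnates near the optimum; the adaptive and self-adaptive variants discussed later replace the constant `sigma`.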

Recombination
Aim: construct an offspring starting from a set of parents.
Intermediate (convex): the offspring is a linear (convex) combination of the parents.
Discrete: the offspring consists of components randomly taken from the parents.

Recombination
Geometric recombination. Remark: introduced by Michalewicz for solving constrained optimization problems.
Heuristic recombination: y = x_i + u(x_i - x_k), with x_i an element at least as good as x_k and u a random value from (0,1).
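The intermediate, discrete and heuristic operators above can be sketched as follows (a minimal illustration; the function names are my own):

```python
import random

def intermediate(parents):
    """Convex combination: the offspring is the average of the parents."""
    n = len(parents[0])
    return [sum(p[i] for p in parents) / len(parents) for i in range(n)]

def discrete(parents):
    """Each component is copied from a randomly chosen parent."""
    n = len(parents[0])
    return [random.choice(parents)[i] for i in range(n)]

def heuristic(xi, xk, f):
    """y = xi + u*(xi - xk), where xi is at least as good as xk (minimization)."""
    assert f(xi) <= f(xk), "xi must be at least as good as xk"
    u = random.random()  # u in (0, 1)
    return [a + u * (a - b) for a, b in zip(xi, xk)]
```

The heuristic operator pushes the offspring away from the worse parent, in the direction of the better one.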

Mutation
Basic idea: perturb each element in the population by adding a random vector.
Particularity: this mutation favors small changes of the current element, unlike the mutation typical of genetic algorithms, which does not differentiate small perturbations from large ones.

Mutation
Variant 1: the components of the random vector are independent random variables having the same distribution (so E(z_i z_j) = E(z_i)E(z_j) = 0 for i ≠ j).
Examples:
a) each component is a random value uniformly distributed in [-s, s]
b) each component has the normal (Gaussian) distribution N(0, s)
Remark: the covariance matrix is diagonal, C = diag(s^2, s^2, ..., s^2), with s the only control parameter of the mutation.

Mutation
Variant 2: the components of the random vector are independent random variables having different distributions (E(z_i z_j) = E(z_i)E(z_j) = 0 for i ≠ j).
Examples:
a) the component z_i of the perturbation vector has the uniform distribution on [-s_i, s_i]
b) each component of the perturbation vector has the distribution N(0, s_i)
Remark: the covariance matrix is diagonal, C = diag(s_1^2, s_2^2, ..., s_n^2), and the control parameters of the mutation are s_1, s_2, ..., s_n.
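The independent-perturbation variants (uniform and Gaussian, with per-component step sizes) can be sketched as:

```python
import random

def mutate_uniform(x, s):
    """Each component perturbed by a value uniformly distributed in [-s_i, s_i]."""
    return [xi + random.uniform(-si, si) for xi, si in zip(x, s)]

def mutate_gauss(x, s):
    """Each component perturbed by a value with distribution N(0, s_i)."""
    return [xi + random.gauss(0, si) for xi, si in zip(x, s)]
```

Passing the same value for every s_i recovers Variant 1 (a single control parameter).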

Mutation
Variant 3: the components are dependent random variables.
Example: the vector z has the distribution N(0, C).
Remark: there are n(n+1)/2 control parameters of the mutation:
- s_1, s_2, ..., s_n: mutation steps
- a_1, a_2, ..., a_k: rotation angles (k = n(n-1)/2)
The off-diagonal covariances are c_ij = (1/2)(s_i^2 - s_j^2) tan(2 a_ij).
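A correlated perturbation with these parameters is usually sampled not by building C explicitly but by scaling an uncorrelated Gaussian vector by the step sizes and then applying the planar rotations (Schwefel's construction). A sketch, with my own function name and argument order:

```python
import numpy as np

def correlated_mutation(x, s, alphas, rng=np.random.default_rng()):
    """Sample z ~ N(0, C), with C induced by the step sizes s_1..s_n and the
    n(n-1)/2 rotation angles a_ij, as z = R(a) @ diag(s) @ N(0, I)."""
    n = len(x)
    z = s * rng.standard_normal(n)        # uncorrelated step, scaled per axis
    k = 0
    for i in range(n - 1):                # apply the planar rotations in turn
        for j in range(i + 1, n):
            c, sn = np.cos(alphas[k]), np.sin(alphas[k])
            zi, zj = z[i], z[j]
            z[i] = c * zi - sn * zj
            z[j] = sn * zi + c * zj
            k += 1
    return x + z
```

With all angles equal to zero this reduces to the independent Variant 2.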

Mutation variants [Hansen, PPSN 2006]

Mutation
Problem: choice of the control parameters.
Example: perturbation of type N(0, s):
- s large -> large perturbations
- s small -> small perturbations
Solutions:
- adaptive heuristic methods (example: the 1/5 rule)
- self-adaptation (change of the parameters by recombination and mutation)
(figure: N(0, s) densities for s = 0.5, s = 1, s = 2)

Mutation
The 1/5 rule. This is a heuristic rule developed for ES having independent perturbations characterized by a single parameter s.
Idea: s is adjusted by using the success ratio of the mutation.
Success ratio: ps = number of mutations leading to better configurations / total number of mutations.
Remarks:
1. The success ratio is estimated by using the results of at least n mutations (n is the problem size).
2. This rule was initially proposed for populations containing just one element.

Mutation
The 1/5 rule. Theoretical studies conducted for some particular objective functions (e.g. the sphere function) led to the remark that the adjustment factor c should satisfy 0.8 <= c < 1 (e.g. c = 0.817).
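The rule itself, as usually stated (the update was presumably shown as a figure on the original slide): increase s when more than 1/5 of the mutations succeed, decrease it when fewer do. A sketch:

```python
def one_fifth_rule(s, ps, c=0.817):
    """Rechenberg's 1/5 success rule: enlarge the step when success is
    frequent, shrink it when success is rare (with 0.8 <= c < 1)."""
    if ps > 1/5:
        return s / c   # many successes -> explore more boldly
    if ps < 1/5:
        return s * c   # few successes -> search more cautiously
    return s
```

The intuition: frequent successes suggest the steps are too timid, while rare successes suggest the steps overshoot the improving region.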

Mutation: self-adaptation
Idea:
- extend the elements of the population with components corresponding to the control parameters
- apply specific recombination and mutation operators also to the control parameters
Thus the values of the control parameters leading to competitive individuals will have a higher chance to survive.
Extended population elements: the decision variables together with their control parameters.

Mutation
Steps:
1. Change the components corresponding to the control parameters.
2. Change the components corresponding to the decision variables.
Example: the case of independent perturbations. The step sizes are multiplied by lognormally distributed variables; the lognormal distribution
- ensures that s_i > 0
- is symmetric around 1
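A sketch of the two steps with the lognormal update, using the learning rates commonly attached to this scheme (tau' global, tau per component; the exact formula was presumably on the original slide as an image, so treat the constants here as the usual textbook choices rather than the lecture's own):

```python
import math, random

def self_adaptive_mutation(x, s):
    """Self-adaptation for independent perturbations: first mutate the step
    sizes with lognormal factors, then perturb the decision variables."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)            # global learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))       # per-component rate
    g = random.gauss(0, 1)                          # shared global draw
    # step 1: multiply each s_i by exp(tau'*g + tau*N_i(0,1)) -> stays > 0
    new_s = [si * math.exp(tau_prime * g + tau * random.gauss(0, 1))
             for si in s]
    # step 2: perturb the decision variables with the NEW step sizes
    new_x = [xi + random.gauss(0, si) for xi, si in zip(x, new_s)]
    return new_x, new_s
```

Mutating the step sizes before using them lets selection judge each s_i by the offspring it actually produced.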

Mutation
Variant proposed by Michalewicz (1996):
- a_i and b_i are the bounds of the interval corresponding to component x_i
- u is a random value in (0,1)
- t is the iteration counter
- T is the maximal number of iterations
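The formula itself was presumably shown as an image; Michalewicz's non-uniform mutation is usually stated as x'_i = x_i + Δ(t, b_i - x_i) or x_i - Δ(t, x_i - a_i) with Δ(t, y) = y·(1 - u^((1 - t/T)^b)). A sketch under that assumption (the shape parameter b, often 5, is not from the slides):

```python
import random

def nonuniform_mutation(x, bounds, t, T, b=5.0):
    """Non-uniform mutation: early in the run (t small) the perturbation can
    span the whole interval; as t approaches T it shrinks toward zero."""
    def delta(y):
        u = random.random()
        return y * (1.0 - u ** ((1.0 - t / T) ** b))
    out = []
    for xi, (ai, bi) in zip(x, bounds):
        if random.random() < 0.5:
            out.append(xi + delta(bi - xi))   # move toward the upper bound
        else:
            out.append(xi - delta(xi - ai))   # move toward the lower bound
    return out
```

The offspring always stays inside [a_i, b_i], and at t = T the perturbation vanishes entirely.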

Mutation
CMA-ES (Covariance Matrix Adaptation ES) [Hansen, 1996]

Survivors selection
Variants:
- (μ,λ): from the set of μ parents construct λ > μ offspring and select the best μ survivors among them (the number of offspring must be larger than the number of parents)
- (μ+λ): from the set of μ parents construct λ offspring and select the best μ survivors from the joined population of parents and offspring (truncated selection). This is an elitist selection (it preserves the best element in the population).
Remark: if the number of parents involved in recombination is ρ, the usual notations are (μ/ρ, λ) and (μ/ρ + λ).
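The two selection variants reduce to truncation over different candidate sets (a minimal sketch, for minimization):

```python
def comma_selection(offspring, f, mu):
    """(mu, lambda): survivors come only from the offspring (lambda > mu)."""
    return sorted(offspring, key=f)[:mu]

def plus_selection(parents, offspring, f, mu):
    """(mu + lambda): truncation over parents and offspring; elitist, since
    the best element of the joined population always survives."""
    return sorted(parents + offspring, key=f)[:mu]
```

The comma variant can lose the best solution found so far, which is precisely what lets it forget outdated step-size settings in self-adaptive ES.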

Survivors selection
Particular cases:
- (1+1): from one parent generate one offspring and choose the better of the two
- (1,λ) / (1+λ): from one parent generate several offspring and choose the best element
- (μ+1): from a set of μ parents construct one offspring and insert it into the population if it is better than the worst element in the population

Survivors selection
The variant (μ+1) corresponds to the so-called steady-state (asynchronous) strategy.
Generational strategy:
- at each generation a new population of offspring is constructed
- the selection is applied to the offspring or to the joined population
- this is a synchronous process
Steady-state strategy:
- at each iteration only one offspring is generated; it is assimilated into the population if it is good enough
- this is an asynchronous process

ES variants
Strategies with extended notation:
- each element has a limited lifetime (k generations)
- the recombination is based on ρ parents
Fast evolution strategies:
- the perturbation is based on the Cauchy distribution instead of the normal one
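The Cauchy perturbation used by fast ES can be sampled by inverse transform (a sketch; the function name is my own):

```python
import math, random

def cauchy_mutation(x, s):
    """Fast-ES style mutation: heavy-tailed Cauchy steps of scale s_i make
    very large jumps far more likely than Gaussian noise, helping the
    search escape local optima."""
    # inverse CDF of the standard Cauchy distribution: tan(pi*(u - 1/2))
    return [xi + si * math.tan(math.pi * (random.random() - 0.5))
            for xi, si in zip(x, s)]
```

The trade-off: the Cauchy distribution has no finite variance, so the occasional huge step is the price of better global exploration.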

Analysis of the behavior of ES
Evaluation criteria:
- Efficiency: the number of function evaluations necessary for the objective function to reach a given value (a desired accuracy)
- Effectiveness: the value of the objective function after a given number of function evaluations (nfe)
- Success ratio: the number of runs in which the algorithm reaches the goal divided by the total number of runs

Summary
Encoding: real vectors
Recombination: discrete or intermediate
Mutation: additive random perturbation (uniform, Gaussian, Cauchy)
Parents selection: uniformly random
Survivors selection: (μ,λ) or (μ+λ)
Particularity: self-adaptive mutation parameters