Parameter Control A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing Chapter 8.

Presentation transcript:

Parameter Control A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing Chapter 8

Motivation 1
- An EA has many strategy parameters, e.g.
  - mutation operator and mutation rate
  - crossover operator and crossover rate
  - selection mechanism and selective pressure (e.g. tournament size)
  - population size
- Good parameter values facilitate good performance
- Q1: How to find good parameter values?

Motivation 2
- EA parameters are rigid (constant during a run)
- BUT an EA is a dynamic, adaptive process
- THUS optimal parameter values may vary during a run
- Q2: How to vary parameter values?

Parameter tuning
- Parameter tuning: the traditional way of testing and comparing different values before the "real" run
- Problems:
  - user mistakes in settings can be sources of errors or sub-optimal performance
  - it costs much time
  - parameters interact: exhaustive search is not practicable
  - good values may become bad during the run

Parameter control
- Parameter control: setting values on-line, during the actual run, e.g.
  - predetermined time-varying schedule p = p(t)
  - using feedback from the search process
  - encoding parameters in chromosomes and relying on natural selection
- Problems:
  - finding an optimal p is hard, finding an optimal p(t) is harder still
  - user-defined feedback mechanism: how to "optimize" it?
  - when would natural selection work for strategy parameters?
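The simplest of these options, a predetermined schedule p = p(t), fits in a few lines. The Python sketch below decays a mutation probability exponentially over a run of T generations; the function name, start value, end value and run length are illustrative assumptions, not taken from the slides.

```python
# Predetermined time-varying schedule p = p(t): a minimal sketch.
def p_m(t, p_start=0.2, p_end=0.01, T=100):
    """Mutation probability interpolated exponentially from p_start to p_end."""
    return p_start * (p_end / p_start) ** (t / T)

print([round(p_m(t), 3) for t in (0, 25, 50, 75, 100)])   # 0.2 ... 0.01
```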

Example
- Task to solve: minimise f(x_1, …, x_n) subject to
  - L_i ≤ x_i ≤ U_i for i = 1, …, n (bounds)
  - g_i(x) ≤ 0 for i = 1, …, q (inequality constraints)
  - h_i(x) = 0 for i = q+1, …, m (equality constraints)
- Algorithm: EA with
  - real-valued representation (x_1, …, x_n)
  - arithmetic averaging crossover
  - Gaussian mutation: x'_i = x_i + N(0, σ), where the standard deviation σ is called the mutation step size
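To make the running example concrete, here is a minimal Python sketch of the two variation operators named above, assuming NumPy, a fixed step size σ, and illustrative bounds; the function names are not from the book.

```python
import numpy as np

def arithmetic_crossover(parent1, parent2, alpha=0.5):
    """Arithmetic averaging crossover: the child is a weighted average of the parents."""
    return alpha * parent1 + (1.0 - alpha) * parent2

def gaussian_mutation(x, sigma, lower, upper):
    """Gaussian mutation x'_i = x_i + N(0, sigma), clipped back into the bounds."""
    child = x + np.random.normal(0.0, sigma, size=x.shape)
    return np.clip(child, lower, upper)

# Usage: two parents in [L_i, U_i] = [-5, 5]^3 and a constant sigma = 0.1
p1, p2 = np.random.uniform(-5, 5, 3), np.random.uniform(-5, 5, 3)
child = gaussian_mutation(arithmetic_crossover(p1, p2), sigma=0.1, lower=-5, upper=5)
```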

Varying mutation step size: option 1
- Replace the constant σ by a function σ(t), where t is the current generation number, 0 ≤ t ≤ T
- Features:
  - changes in σ are independent of the search progress
  - strong user control of σ through the chosen formula
  - σ is fully predictable
  - a given σ acts on all individuals of the population
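A minimal sketch of such a deterministic schedule. The slide's own formula is not reproduced in the transcript; the linearly decreasing σ(t) = σ0·(1 − 0.9·t/T) below is a common textbook-style choice and should be read as an illustrative assumption.

```python
def sigma_schedule(t, T, sigma0=1.0):
    """Deterministic step-size control: sigma decays linearly from sigma0 to 0.1 * sigma0."""
    return sigma0 * (1.0 - 0.9 * t / T)

for t in range(0, 101, 25):          # e.g. T = 100 generations
    print(t, sigma_schedule(t, T=100))
```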

Varying mutation step size: option 2
- Replace the constant σ by a function σ(t), updated after every n steps by the 1/5 success rule (cf. the ES chapter)
- Features:
  - changes in σ are based on feedback from the search progress
  - some user control of σ through the update rule
  - σ is not predictable
  - a given σ acts on all individuals of the population
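A minimal sketch of the 1/5 success rule referred to above: if more than one fifth of recent mutations were successful, the step size is increased, otherwise it is decreased. The factor c = 0.9 and the success window are illustrative assumptions.

```python
def one_fifth_rule(sigma, success_rate, c=0.9):
    """Adapt sigma from the observed fraction of successful mutations."""
    if success_rate > 0.2:
        return sigma / c        # many successes: take bolder steps
    elif success_rate < 0.2:
        return sigma * c        # few successes: take more cautious steps
    return sigma

# Usage: after 20 mutations, 3 of which improved on the parent
sigma = one_fifth_rule(sigma=0.5, success_rate=3 / 20)
```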

Varying mutation step size: option 3
- Assign a personal σ to each individual
- Incorporate this σ into the chromosome: (x_1, …, x_n, σ)
- Apply variation operators to the x_i's and σ
- Features:
  - changes in σ are results of natural selection
  - (almost) no user control of σ
  - σ is not predictable
  - a given σ acts on one individual
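A minimal sketch of self-adapting a single step size per individual, using the common lognormal update σ' = σ·exp(τ·N(0, 1)); the learning rate τ and the (x, σ) tuple representation are assumptions for illustration.

```python
import numpy as np

def self_adaptive_mutation(x, sigma, tau=None):
    """Mutate sigma first, then mutate x with the new sigma."""
    n = len(x)
    tau = tau if tau is not None else 1.0 / np.sqrt(n)
    sigma_new = sigma * np.exp(tau * np.random.normal())
    x_new = x + np.random.normal(0.0, sigma_new, size=n)
    return x_new, sigma_new

x, sigma = np.zeros(5), 1.0
x, sigma = self_adaptive_mutation(x, sigma)
```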

Varying mutation step size: option 4
- Assign a personal σ to each variable in each individual
- Incorporate the σ's into the chromosome: (x_1, …, x_n, σ_1, …, σ_n)
- Apply variation operators to the x_i's and σ_i's
- Features:
  - changes in σ_i are results of natural selection
  - (almost) no user control of σ_i
  - σ_i is not predictable
  - a given σ_i acts on one gene of one individual
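The per-gene variant can be sketched with the standard uncorrelated mutation with n step sizes; the learning rates and the lower bound on σ_i follow common evolution-strategy practice and are assumptions here.

```python
import numpy as np

def mutate_n_sigmas(x, sigmas, eps0=1e-6):
    """Self-adaptive mutation with one step size per gene."""
    n = len(x)
    tau_prime = 1.0 / np.sqrt(2 * n)            # global learning rate
    tau = 1.0 / np.sqrt(2 * np.sqrt(n))         # per-gene learning rate
    global_step = tau_prime * np.random.normal()
    new_sigmas = sigmas * np.exp(global_step + tau * np.random.normal(size=n))
    new_sigmas = np.maximum(new_sigmas, eps0)   # keep step sizes above a floor
    new_x = x + new_sigmas * np.random.normal(size=n)
    return new_x, new_sigmas

x, sigmas = mutate_n_sigmas(np.zeros(4), np.ones(4))
```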

Example cont'd
- The constraints
  - g_i(x) ≤ 0 for i = 1, …, q (inequality constraints)
  - h_i(x) = 0 for i = q+1, …, m (equality constraints)
  are handled by penalties:
- eval(x) = f(x) + W × penalty(x), where penalty(x) measures the amount of constraint violation
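A minimal sketch of the penalised evaluation. The slide's exact penalty formula is not reproduced in the transcript, so the squared-violation penalty below is an illustrative assumption; the eval line itself follows the slide.

```python
def penalty(x, inequality_constraints, equality_constraints):
    """Sum of squared constraint violations (zero for feasible x)."""
    viol = sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)
    viol += sum(h(x) ** 2 for h in equality_constraints)
    return viol

def evaluate(x, f, W, inequality_constraints, equality_constraints):
    """eval(x) = f(x) + W * penalty(x)"""
    return f(x) + W * penalty(x, inequality_constraints, equality_constraints)

# Usage: minimise f(x) = x0^2 + x1^2 subject to g(x) = 1 - x0 <= 0
val = evaluate([0.5, 0.0], f=lambda x: x[0] ** 2 + x[1] ** 2, W=10.0,
               inequality_constraints=[lambda x: 1 - x[0]],
               equality_constraints=[])
```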

Varying penalty: option 1
- Replace the constant W by a function W(t), where t is the current generation number, 0 ≤ t ≤ T
- Features:
  - changes in W are independent of the search progress
  - strong user control of W through the chosen formula
  - W is fully predictable
  - a given W acts on all individuals of the population
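A minimal sketch of a deterministic penalty schedule. The increasing schedule W(t) = (C·t)^α is one widely cited choice in the penalty literature; the formula and constants are illustrative assumptions, since the slide's formula is not reproduced here.

```python
def penalty_weight(t, C=0.5, alpha=2.0):
    """Penalty weight grows with the generation number t, tightening feasibility over time."""
    return (C * t) ** alpha

for t in (1, 10, 50, 100):
    print(t, penalty_weight(t))
```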

Varying penalty: option 2
- Replace the constant W by W(t), updated in each generation based on the feasibility of recent champions (β1, β2 > 1, β1 ≠ β2; champion: the best individual of its generation)
- Features:
  - changes in W are based on feedback from the search progress
  - some user control of W through the update rule
  - W is not predictable
  - a given W acts on all individuals of the population
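A minimal sketch of a champion-feasibility update in that spirit: shrink W if the last k champions were all feasible, grow it if they were all infeasible, otherwise leave it unchanged. The exact rule and the values of β1 and β2 are assumptions, not a verbatim copy of the slide.

```python
def update_penalty_weight(W, recent_champions_feasible, beta1=2.0, beta2=3.0):
    """recent_champions_feasible: booleans for the champions of the last k generations."""
    if all(recent_champions_feasible):
        return W / beta1          # all feasible: relax the penalty
    if not any(recent_champions_feasible):
        return W * beta2          # all infeasible: tighten the penalty
    return W                      # mixed evidence: keep W unchanged

W = update_penalty_weight(W=10.0, recent_champions_feasible=[True, True, False])
```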

Varying penalty: option 3
- Assign a personal W to each individual
- Incorporate this W into the chromosome: (x_1, …, x_n, W)
- Apply variation operators to the x_i's and W
- Alert: eval((x, W)) = f(x) + W × penalty(x), while for mutation step sizes we had eval((x, σ)) = f(x)
  - this option is thus sensitive to "cheating": selection favours individuals carrying a small W, so it makes no sense
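A tiny illustration of why this invites cheating (the objective and constraint are made up): because W appears inside eval, an individual can lower its evaluated cost simply by carrying a smaller W, without becoming any more feasible.

```python
def evaluate(x, W, f, penalty):
    return f(x) + W * penalty(x)

f = lambda x: x[0] ** 2
penalty = lambda x: max(0.0, 1 - x[0]) ** 2     # violated whenever x0 < 1

same_solution = [0.0]                           # identical x, different encoded W
print(evaluate(same_solution, W=10.0, f=f, penalty=penalty))   # 10.0
print(evaluate(same_solution, W=0.01, f=f, penalty=penalty))   # 0.01 -- the "cheat"
```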

Lessons learned from the examples
- Various forms of parameter control can be distinguished by:
  - primary features:
    - what component of the EA is changed
    - how the change is made
  - secondary features:
    - evidence/data backing up the changes
    - level/scope of the change

What
- Practically any EA component can be parameterised and thus controlled on-the-fly:
  - representation
  - evaluation function
  - variation operators
  - selection operator (parent or mating selection)
  - replacement operator (survival or environmental selection)
  - population (size, topology)

How
- Three major types of parameter control:
  - deterministic: some rule modifies the strategy parameter without feedback from the search (e.g. based on a counter)
  - adaptive: a feedback rule based on some measure monitoring search progress
  - self-adaptive: parameter values evolve along with the solutions; encoded onto the chromosomes, they undergo variation and selection

Global taxonomy
- (diagram not reproduced in the transcript: parameter setting splits into tuning, before the run, and control, during the run; control is further divided into deterministic, adaptive and self-adaptive)

Evidence informing the change
- The parameter changes may be based on:
  - time or number of evaluations (deterministic control)
  - population statistics (adaptive control):
    - progress made
    - population diversity
    - gene distribution, etc.
  - relative fitness of individuals created with given values (adaptive or self-adaptive control)

Evidence informing the change (cont'd)
- Absolute evidence: a predefined event triggers the change, e.g. increase p_m by 10% if population diversity falls below a threshold x
  - the direction and magnitude of the change are fixed
- Relative evidence: compare parameter values through the solutions created with them, e.g. increase p_m if the best offspring were produced with high mutation rates
  - the direction and magnitude of the change are not fixed
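A minimal sketch of the absolute-evidence rule quoted above. The 10% increase comes from the slide; the diversity measure, threshold and cap are illustrative assumptions.

```python
import numpy as np

def adapt_mutation_rate(p_m, population, diversity_threshold=0.1, p_m_max=0.5):
    """Increase p_m by 10% whenever population diversity drops below the threshold."""
    diversity = np.mean(np.std(population, axis=0))   # average per-gene spread
    if diversity < diversity_threshold:
        p_m = min(p_m * 1.1, p_m_max)
    return p_m

pop = np.random.uniform(-0.05, 0.05, size=(20, 10))   # a converged-looking population
p_m = adapt_mutation_rate(0.01, pop)
```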

Scope/level
- The parameter may take effect on different levels:
  - environment (fitness function)
  - population
  - individual
  - sub-individual
- Note: the given component (parameter) determines the possibilities
- Thus: scope/level is a derived, secondary feature in the classification scheme

Refined taxonomy
- Combinations of control types and evidence types (table not reproduced in the transcript; possible combinations are marked +, impossible ones -)

Evaluation / Summary
- Parameter control offers the possibility to use appropriate values in various stages of the search
- Adaptive and self-adaptive parameter control offer users "liberation" from parameter tuning:
  - they delegate the parameter setting task to the evolutionary process
  - the latter implies a double task for an EA: problem solving + self-calibrating (overhead)
- Robustness, i.e. insensitivity of the EA to variations, is assumed:
  - if the number of parameters is increased by using (self-)adaptation
  - for the "meta-parameters" introduced by these methods