Ch 20. Parameter Control, Ch 21. Self-adaptation. Evolutionary Computation vol. 2: Advanced Algorithms and Operators. Summarized and presented by Jung-Woo Ha.


Ch 20. Parameter Control, Ch 21. Self-adaptation. Evolutionary Computation vol. 2: Advanced Algorithms and Operators. Summarized and presented by Jung-Woo Ha, Biointelligence Laboratory, Seoul National University

Chapter 21. Parameter control (C) 2011, SNU Biointelligence Lab,

Significance of Parameters Heuristic search algorithms depend on the problem  Representation  Evaluation Components in EC  Select operators that suit the representation and evaluation  Most components have parameters  e.g., mutation rate, tournament size in selection, initial population size Parameter settings determine the performance of EC  Evaluation (global vs. local)  Time cost Parameter tuning vs. parameter control  Finding good values before the run and keeping them fixed vs. adjusting them during the run

Classification of parameter control Two aspects  How the mechanism of change works  Which component of the EA is affected by the mechanism Previous studies  Angeline (1995), Hinterding et al. (1997), Smith and Fogarty (1997) This study  The type of the mechanism  The EA components that are adapted

Parameter Tuning Early work  De Jong (1975), Grefenstette (1986) After the mid-1980s  The GA was considered a robust solver  Methods for finding an optimal parameter set became more important than any particular optimal parameter set itself Drawbacks  Done by hand  The process of parameter tuning costs a lot of time, even if parameters are optimized one by one, regardless of their interactions  For a given problem the selected parameter values are not necessarily optimal, even if significant effort was made in setting them

Parameter Setting by Analogy and Theoretical Analysis Parameter setting by analogy  The use of parameter settings that have proved successful for similar problems  But does similarity of problems imply similarity of good parameter sets? Theoretical analysis  Requires simplifying both the EA and the problems  Some theoretical investigations exist, e.g., on population size and operator rates  Of limited applicability to real-world problems

Solutions for Drawbacks of Parameter Tuning General drawbacks of parameter tuning  An evolutionary run is an intrinsically dynamic and adaptive process  The use of rigid parameters contrasts with the general evolutionary spirit  Different parameter values may be optimal at different stages of the evolutionary process  The use of static parameters itself can lead to inferior algorithm performance Solution  Use parameters p(t) that may change over time, although this is difficult  In the simplest case, the parameter value p(t) changes by a deterministic rule triggered by the progress of time t

How to modify a parameter controlling mutation In numerical optimization  σ: a constant mutation step size  σ(t): a time-varying mutation step size (parameter)  Incorporate feedback: Rechenberg's 1/5 success rule
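The two control schemes this slide refers to can be sketched in Python. The linear decay σ(t) = 1 − 0.9·t/T and the update constant c = 0.9 follow the well-known textbook example (Eiben and Smith, Ch. 8); they are assumed here, since the slide's own formulas were images and are not in the transcript.

```python
def deterministic_sigma(t, T):
    """Deterministic control: the mutation step size decays linearly
    from 1.0 at generation 0 to 0.1 at generation T, with no feedback
    from the search (assumed textbook schedule)."""
    return 1.0 - 0.9 * t / T

def one_fifth_rule(sigma, success_ratio, c=0.9):
    """Adaptive control via Rechenberg's 1/5 success rule: if more than
    1/5 of recent mutations improved their parent, the step size grows;
    if fewer did, it shrinks; otherwise it stays the same."""
    if success_ratio > 1 / 5:
        return sigma / c   # many successes: steps too cautious, enlarge them
    if success_ratio < 1 / 5:
        return sigma * c   # few successes: steps overshoot, reduce them
    return sigma
```

In a run, `success_ratio` would be measured over the last k generations; with c < 1, dividing grows the step size and multiplying shrinks it, matching the usual statement of the rule.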

How to modify a parameter controlling mutation Assign a 'personal' mutation step size to each individual  The step size is applied to all components of the individual
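A minimal sketch of the 'personal' step-size idea, assuming the standard log-normal update used in evolution strategies; the learning rate τ = 1/√n is a common default, not taken from the slide.

```python
import math
import random

def self_adaptive_mutate(x, sigma, tau=None):
    """Self-adaptive mutation with one 'personal' step size per
    individual: sigma is mutated first (log-normally), and the NEW
    sigma then perturbs every component of x, so a good step size is
    rewarded through the offspring it produces."""
    n = len(x)
    if tau is None:
        tau = 1.0 / math.sqrt(n)  # assumed common default learning rate
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
    return new_x, new_sigma
```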

How to control penalty functions In numerical optimization  Static penalty function: constant weight W  Dynamic W: the weight changes over time according to a fixed schedule

How to control penalty functions  Dynamic W with feedback from the search  Self-adaptation of W: the weight is encoded in the individuals and evolved  How the penalty-function case differs from the mutation case  The scope of the parameter: W affects how the whole population is evaluated, not a single individual
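A hedged sketch of dynamic penalty control; the schedule W(t) = (C·t)^α and the constants C = 0.5, α = 2 follow the Joines-and-Houck-style example often cited in this context and are assumptions, not the slide's exact formulas.

```python
def dynamic_penalty_weight(t, C=0.5, alpha=2):
    """Dynamic penalty control: the weight W(t) = (C*t)**alpha grows
    with generation t, so constraint violations are tolerated early in
    the run and punished heavily late (C and alpha are assumed)."""
    return (C * t) ** alpha

def penalized_fitness(raw_fitness, violation, t):
    """For minimization: add the time-scaled penalty for the
    (non-negative) constraint violation to the raw objective value."""
    return raw_fitness + dynamic_penalty_weight(t) * violation
```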

Aspects of classifying parameter control methods What is changed?  Representation, evaluation function, operators, selection process, mutation rate, etc. How is the change made?  Deterministic heuristic, feedback-based heuristic, or self-adaptive The scope/level of the change  Population level, individual level, etc. The evidence upon which the change is carried out  Monitoring the performance of operators, the diversity of the population, etc.

What is changed? It is necessary to agree on a list of all components of an evolutionary algorithm (which is a difficult task in itself) Despite this drawback, the 'what' aspect should be maintained as one of the main classification features, as it allows us to locate where a specific mechanism has its effect

How is the change made? Deterministic parameter control  The value of a strategy parameter is altered by some deterministic rule  Without using any feedback from the search, e.g., a time-varying schedule Adaptive parameter control  Some form of feedback from the search is used to determine the direction and/or magnitude of the change to the strategy parameter Self-adaptive parameter control  The parameters to be adapted are encoded onto the chromosome(s) of the individuals and undergo mutation and recombination

The scope/level of change and Evidence The scope depends on the interpretation mechanism of the given parameters  σ: sub-individual level / α: individual level The evidence used for determining the change of a parameter value  The performance of operators  The diversity of the population Main criteria  What and How

Chapter 22. Self-adaptation

Introduction Self-adaptation of strategy parameters Strategy parameters  Parameters that control the evolutionary search process  e.g., mutation rates, mutation variances, and recombination probabilities  Self-adapted by incorporating them into the representation of individuals, in addition to the set of object variables  There is an indirect link between the fitness value and the strategy parameters  The speed of the adaptation at the level of strategy parameters is under the control of the user by means of so-called learning rates

Introduction Difference from dynamic and adaptive parameter control  Dynamic parameter control  The parameter settings take different values according to a deterministic schedule prescribed by the user  Adaptive parameter control  New values are set by a feedback mechanism that monitors evolution and explicitly rewards or punishes operators according to their impact on the objective function value  Self-adaptive parameter control  Parameters are encoded in the individuals, and the parameters themselves evolve

Mutation operators Continuous search spaces  Case 1:  Case 2:

Mutation operators  Case 3:  Case 4:

Mutation operators Settings for the learning rates (τ, τ', τ0) For the sphere model For each variant  Case 1:

Mutation operators  Time-varying optimal location

Mutation operators  Case 2:  Case 3:
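The learning-rate settings referred to in the slides above can be sketched as follows; τ0 = 1/√n, τ' = 1/√(2n), and τ = 1/√(2√n) are the values commonly recommended in the ES literature, assumed here because the slides' formulas were not captured in the transcript.

```python
import math
import random

def es_learning_rates(n):
    """Learning rates commonly recommended for self-adaptive ES mutation
    in n dimensions: tau0 for a single shared step size, and
    (tau_prime, tau) for the n-step-size variant."""
    tau0 = 1.0 / math.sqrt(n)
    tau_prime = 1.0 / math.sqrt(2.0 * n)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))
    return tau0, tau_prime, tau

def mutate_n_stepsizes(x, sigmas):
    """n-step-size self-adaptation: one global log-normal factor shared
    by all step sizes (drawn once) plus an individual factor per
    coordinate; the object variables are then perturbed with the new
    step sizes."""
    n = len(x)
    _, tau_prime, tau = es_learning_rates(n)
    g = random.gauss(0.0, 1.0)  # shared draw, couples all step sizes
    new_sigmas = [s * math.exp(tau_prime * g + tau * random.gauss(0.0, 1.0))
                  for s in sigmas]
    new_x = [xi + s * random.gauss(0.0, 1.0)
             for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas
```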

Mutation operators Binary search space  Canonical GA (bit-string representation)  Self-adaptive mutation rates were reported to outperform the canonical GA

Mutation operators Smith and Fogarty's conclusions for self-adaptation  Replacing the oldest member of the population with the best offspring, conditional on the latter being the better of the two, is the best selection-and-deletion strategy. Because replacing the oldest (rather than the worst) drops the elitist property of the (μ+1) strategy, this confirms observations from evolution strategies that self-adaptation needs a non-elitist selection strategy to work successfully  A value of c = 5 was consistently found to produce the best results, so that the necessity of producing a surplus of offspring individuals as found by Bäck (1992b) and the 1/5 success rule are both confirmed  Gray coding and standard binary coding showed similar performance, both substantially outperforming the exponential encoding; on the most complex landscapes, however, Gray coding also outperformed standard binary coding

Mutation operators Schwefel's requirements for self-adapting the mutation rate  The expected change of p m by repeated mutations should be equal to zero  Mutation of p m ∈ (0, 1) must yield a feasible mutation rate p’ m ∈ (0, 1)  Small changes should be more likely than large ones  The median of the change should equal one
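The four requirements above are satisfied by the logistic log-normal update used in the literature for self-adapting bit-mutation rates (e.g., by Bäck and Schütz); a sketch, with the learning rate γ = 0.2 as an assumed default:

```python
import math
import random

def mutate_rate(p_m, gamma=0.2):
    """Self-adaptive update of a bit-mutation rate meeting Schwefel's
    requirements: the result stays strictly inside (0, 1), small changes
    are more likely than large ones (z is standard normal), and the
    median change factor is 1 (z = 0 returns p_m unchanged)."""
    z = random.gauss(0.0, 1.0)
    return 1.0 / (1.0 + (1.0 - p_m) / p_m * math.exp(-gamma * z))
```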

Mutation operators Integer search space  Mean step size s  Modified step size s’  A realization of a one-dimensional random variable

Mutation operators  Modified probability  Geometric random variable Finite-state machines  Self-adaptation of the probability of mutating each component  Multimutational self-adaptation: the mutation probability of each component is independent of the others

Recombination operators Crossover has received much less attention  Self-adaptation originated in ES and EP, which use mutation only Binary search spaces  Crossover punctuation: self-adapting the number and locations of crossover points  One-bit self-adaptation  A single strategy-parameter bit added to an individual indicates whether uniform crossover or two-point crossover is performed on the parents  Applied to N-peak problems
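The one-bit self-adaptation scheme described above can be sketched as follows; the pair representation and the inheritance rule for the strategy bit are illustrative assumptions, not details from the slide.

```python
import random

def one_bit_sa_crossover(parent1, parent2):
    """One-bit self-adaptation of recombination: each parent carries one
    extra strategy bit saying whether it 'prefers' uniform (1) or
    two-point (0) crossover; the child inherits one parent's preference
    at random and is created with the chosen operator."""
    bits1, strat1 = parent1
    bits2, strat2 = parent2
    use_uniform = random.choice([strat1, strat2])
    n = len(bits1)
    if use_uniform:
        # uniform crossover: each gene picked from either parent
        child = [random.choice(pair) for pair in zip(bits1, bits2)]
    else:
        # two-point crossover: swap in the middle segment from parent 2
        i, j = sorted(random.sample(range(n + 1), 2))
        child = bits1[:i] + bits2[i:j] + bits1[j:]
    return child, use_uniform
```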