Ch 20. Parameter Control Ch 21. Self-adaptation Evolutionary Computation vol. 2: Advanced Algorithms and Operators Summarized and presented by Jung-Woo Ha.


1 Ch 20. Parameter Control Ch 21. Self-adaptation Evolutionary Computation vol. 2: Advanced Algorithms and Operators Summarized and presented by Jung-Woo Ha Biointelligence Laboratory, Seoul National University http://bi.snu.ac.kr/

2 Chapter 20. Parameter Control (C) 2011, SNU Biointelligence Lab, http://bi.snu.ac.kr/

3 Significance of Parameters
- Heuristic search algorithms are tied to the problem through its representation and evaluation function
- Components in EC: select operators that suit the representation and evaluation; most components have parameters (mutation rate, tournament size in selection, initial population size, etc.)
- Parameter settings determine the performance of an EC run: the quality of the evaluation result (global vs. local optima) and the time cost
- Parameter tuning vs. parameter control: finding good fixed values before the run vs. adjusting values during the run

4 Classification of Parameter Control
- Two aspects: how the mechanism of change works, and which component of the EA is affected by the mechanism
- Previous studies: Angeline (1995), Hinterding et al. (1997), Smith and Fogarty (1997)
- This study classifies by the type of the mechanism and by the EA components that are adapted

5 Parameter Tuning
- Early work: De Jong (1975), Grefenstette (1986)
- After the mid-1980s, the GA came to be seen as a robust solver; methods for finding an optimal parameter set became more important than any particular optimal parameter set itself
- Drawbacks:
  - tuning is done by hand
  - the process of parameter tuning costs a lot of time, even if parameters are optimized one by one, regardless of their interactions
  - for a given problem the selected parameter values are not necessarily optimal, even if the effort made in setting them was significant

6 Parameter Setting by Analogy and Theoretical Analysis
- Parameter setting by analogy: reuse parameter settings that have proved successful for similar problems; but does similarity of problems imply similarity of good parameter sets?
- Theoretical analysis: requires simplifying both the EA and the problem; some theoretical results exist (population size, operator rates), but their applicability to real-world problems is limited

7 Solutions for Drawbacks of Parameter Tuning
- General drawbacks of parameter tuning:
  - an evolutionary run is intrinsically a dynamic, adaptive process
  - the use of rigid parameters runs contrary to this general evolutionary spirit
  - different parameter values might be optimal at different stages of the evolutionary process
  - static parameters themselves can therefore lead to inferior algorithm performance
- Solution: use parameters p(t) that may change over time; this is difficult in general, but in the simplest case the value p(t) is changed by a deterministic rule triggered by the progress of time t

8 How to Modify a Parameter Controlling Mutation
- In numerical optimization, the mutation step size can be
  - a constant σ
  - a time-varying schedule σ(t)
  - adapted by incorporating feedback from the search (Rechenberg's 1/5 success rule: increase σ when more than 1/5 of recent mutations succeed, decrease it when fewer do)
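The 1/5 success rule can be sketched as follows. This is a minimal illustration, not the chapter's code; the factor c = 0.85 is a commonly cited choice, and the function name is hypothetical.

```python
def one_fifth_rule(sigma, success_ratio, c=0.85):
    """Rechenberg's 1/5 success rule: enlarge the mutation step size
    sigma when more than 1/5 of recent mutations improved fitness,
    shrink it when fewer did, leave it unchanged at exactly 1/5.
    c in (0, 1); c = 0.85 is a commonly used constant."""
    if success_ratio > 1 / 5:
        return sigma / c      # dividing by c < 1 enlarges sigma
    elif success_ratio < 1 / 5:
        return sigma * c      # multiplying by c < 1 shrinks sigma
    return sigma
```

In a run, the success ratio is measured over a window of recent generations and the rule is applied periodically rather than every step.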

9 How to Modify a Parameter Controlling Mutation
- Assign a 'personal' mutation step size to each individual, carried alongside its object variables
- The same step size is applied to all components of the individual
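A per-individual step size is typically mutated log-normally before it is used, so the step size itself evolves. A minimal sketch, assuming the standard log-normal update with learning rate τ = 1/√n (function and parameter names are illustrative):

```python
import math
import random

def mutate(x, sigma, tau=None):
    """Mutate an individual (x, sigma) that carries its own step size.
    The step size is mutated first (log-normal update), then the new
    step size perturbs every object variable with Gaussian noise."""
    n = len(x)
    if tau is None:
        tau = 1.0 / math.sqrt(n)          # common default learning rate
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_x = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
    return new_x, new_sigma
```

Mutating σ before x is what links the step size to fitness: an individual survives only if its σ produced a useful x.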

10 How to Control Penalty Functions
- In numerical optimization with constraints, the penalty weight W can be
  - static (a fixed penalty function)
  - dynamic (W grows with time)
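A dynamic penalty weight can be sketched as below. The growth law W(t) = (C·t)^α with C = 0.5, α = 2 is one published choice (Joines and Houck style), used here only as an illustration; the chapter's exact formula may differ.

```python
def penalized_fitness(f_value, violation, t, C=0.5, alpha=2.0):
    """Dynamic penalty: eval(x) = f(x) + W(t) * violation, where the
    weight W(t) = (C * t) ** alpha grows with the generation counter t,
    so constraint violations are punished ever more harshly."""
    W = (C * t) ** alpha
    return f_value + W * violation
```

Early in the run infeasible points are tolerated (small W), which helps exploration; late in the run they are effectively excluded.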

11 How to Control Penalty Functions
- Further options: a dynamic W driven by feedback from the search, or self-adaptation of W by encoding it in the individuals
- Difference between mutation step sizes and penalty weights: the scope of the parameter (a step size can act per individual, while W acts on the evaluation of the whole population)

12 Aspects of Classifying Parameter Control Methods
- What is changed? Representation, evaluation function, operators, selection process, mutation rate, etc.
- How is the change made? By a deterministic heuristic, a feedback-based heuristic, or self-adaptation
- The scope/level of change: population level, individual level, etc.
- The evidence upon which the change is carried out: monitoring the performance of operators, the diversity of the population, etc.

13 What Is Changed?
- Classifying by this aspect requires agreeing on a list of all components of an evolutionary algorithm, which is a difficult task in itself
- Despite this drawback, the 'what' aspect should be kept as one of the main classification features, as it allows us to locate where a specific mechanism has its effect

14 How Is the Change Made?
- Deterministic parameter control: the value of a strategy parameter is altered by some deterministic rule, without any feedback from the search (e.g. a time-varying schedule)
- Adaptive parameter control: some form of feedback from the search is used to determine the direction and/or magnitude of the change to the strategy parameter
- Self-adaptive parameter control: the parameters to be adapted are encoded onto the chromosomes of the individuals and undergo mutation and recombination themselves
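The simplest of the three flavors, a deterministic schedule, can be sketched as a mutation rate that decays purely as a function of time. The linear decay and the endpoint values here are illustrative, not taken from the chapter:

```python
def mutation_rate(t, T, p_start=0.5, p_end=0.01):
    """Deterministic parameter control: a mutation rate that decays
    linearly from p_start to p_end over T generations.  The schedule
    uses no feedback from the search, only the generation counter t."""
    frac = min(t, T) / T                  # clamp so the rate stays at p_end
    return p_start + (p_end - p_start) * frac
```

An adaptive controller would replace the time argument with a measured quantity (e.g. operator success rates); a self-adaptive one would drop the schedule entirely and put the rate on the chromosome.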

15 The Scope/Level of Change and Evidence
- The scope depends on the interpretation mechanism of the given parameters (e.g. σ can act below the individual level, while α acts at the individual level)
- Evidence used for determining the change of a parameter value: the performance of operators, the diversity of the population
- The main classification criteria remain the 'what' and the 'how'

16 Chapter 21. Self-adaptation

17 Introduction
- Self-adaptation of strategy parameters
- Strategy parameters are the parameters that control the evolutionary search process: mutation rates, mutation variances, recombination probabilities
- They are self-adapted by incorporating them into the representation of individuals in addition to the set of object variables
- This creates an indirect link between fitness value and strategy parameters
- The speed of adaptation at the level of strategy parameters is under the user's control by means of so-called learning rates

18 Introduction
- Difference from dynamic and adaptive parameter control:
  - Dynamic parameter control: the parameter settings take different values according to a deterministic schedule prescribed by the user
  - Adaptive parameter control: new values are set by a feedback mechanism that monitors evolution and explicitly rewards or punishes operators according to their impact on the objective function value
  - Self-adaptive parameter control: parameters are encoded in the individuals and the parameters themselves are evolved

19 Mutation Operators
Continuous search spaces
- Case 1 (equation not transcribed)
- Case 2 (equation not transcribed)

20 Mutation Operators
- Case 3 (equation not transcribed)
- Case 4 (equation not transcribed)

21 Mutation Operators
- Settings for the learning rates (τ, τ', τ0)
- Analysis on the sphere model, for each variant (Case 1; equations not transcribed)

22 Mutation Operators
- Time-varying optimal location (figure not transcribed)

23 Mutation Operators
- Case 2, Case 3 (equations not transcribed)
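The exact case formulas did not survive in this transcript, but the widely used ES scheme with one step size per variable, σ'_i = σ_i · exp(τ'·N(0,1) + τ·N_i(0,1)), can be sketched as follows. The learning rates τ' = 1/√(2n) and τ = 1/√(2√n) are the textbook recommendations; which numbered case they correspond to here is not certain.

```python
import math
import random

def es_mutate(x, sigmas):
    """Self-adaptive ES mutation with one step size per variable.
    A global log-normal factor (shared by all coordinates) and a
    coordinate-wise factor mutate the step sizes; the mutated step
    sizes then perturb the object variables."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)              # global learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))         # coordinate-wise rate
    common = tau_prime * random.gauss(0.0, 1.0)       # drawn once per individual
    new_sigmas = [s * math.exp(common + tau * random.gauss(0.0, 1.0))
                  for s in sigmas]
    new_x = [xi + s * random.gauss(0.0, 1.0)
             for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas
```

The shared factor lets the overall mutation strength change quickly, while the coordinate-wise factors let the search scale each dimension independently.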

24 Mutation Operators
Binary search spaces
- Self-adapting the mutation rate in a canonical GA (bit-string representation)
- The self-adaptive variant was reported to outperform the canonical GA

25 Mutation Operators
Smith and Fogarty's conclusions for self-adaptation:
- Replacing the oldest member of the population with the best offspring, conditional on the latter being the better of the two, is the best selection-and-deletion scheme. Because replacing the oldest (rather than the worst) drops the elitist property of the (μ+1) strategy, this confirms observations from evolution strategies that self-adaptation needs a non-elitist selection strategy to work successfully
- A value of c = 5 was consistently found to produce the best results, so both the necessity of producing a surplus of offspring individuals, as found by Bäck (1992b), and the 1/5 success rule are confirmed
- Gray coding and standard binary coding showed similar performance, both substantially outperforming the exponential encoding; on the most complex landscapes, however, Gray coding outperformed standard binary coding as well

26 Mutation Operators
Schwefel's requirements for mutating the mutation rate:
- the expected change of p_m under repeated mutations should be equal to zero
- mutating a rate p_m ∈ (0, 1) must yield a feasible mutation rate p'_m ∈ (0, 1)
- small changes should be more likely than large ones
- the median of the multiplicative change should equal one
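A logistic transformation satisfying these requirements (it keeps p'_m in (0,1), favors small changes, and leaves the median unchanged) is the update p' = 1 / (1 + (1-p)/p · exp(-γ·N(0,1))). A sketch, assuming the learning-rate value γ = 0.22 reported in the self-adaptation literature:

```python
import math
import random

def mutate_rate(p, gamma=0.22):
    """Self-adapt a mutation rate p in (0, 1) via the logistic
    transformation p' = 1 / (1 + (1-p)/p * exp(-gamma * N(0,1))).
    When the Gaussian draw is zero, p' = p, so the median of p'
    equals p; the result can never leave the open interval (0, 1)."""
    z = random.gauss(0.0, 1.0)
    return 1.0 / (1.0 + (1.0 - p) / p * math.exp(-gamma * z))
```

Compare this with a naive additive mutation of p, which would have to be clipped at the interval boundaries and would violate the feasibility requirement.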

27 Mutation Operators
Integer search spaces
- Mean step size s, self-adapted to a modified step size s'
- The actual mutation step is a realization of a one-dimensional integer-valued random variable

28 Mutation Operators
- Modified probability of a geometric random variable (equation not transcribed)
- Finite-state machines: self-adaptation of the probability of mutating each component
- Multimutational self-adaptation: the mutation probability of each component is independent of the others
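For integer spaces, a symmetric step around zero can be built as the difference of two geometric random variables, so small steps are most likely. This sketch samples the geometrics by inverse-CDF; the mapping from the self-adapted mean step size to the geometric parameter p_geo is omitted here, and the function name is illustrative:

```python
import math
import random

def geometric_step(p_geo):
    """Draw a symmetric integer mutation step as G1 - G2, where G1, G2
    are i.i.d. geometric variables (support 0, 1, 2, ...) with success
    probability p_geo.  Sampling uses the inverse CDF:
    G = floor(log(U) / log(1 - p_geo)) for U uniform on (0, 1]."""
    g1 = int(math.log(1.0 - random.random()) / math.log(1.0 - p_geo))
    g2 = int(math.log(1.0 - random.random()) / math.log(1.0 - p_geo))
    return g1 - g2
```

Larger p_geo concentrates the distribution near zero (short steps); smaller p_geo produces longer steps, which is the knob the self-adapted step size would control.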

29 Recombination Operators
- Crossover has received much less attention, since self-adaptation originated in ES and EP, which use only mutation
- Binary search spaces:
  - Crossover punctuation: self-adapting the number and locations of crossover points
  - One-bit self-adaptation: a single strategy-parameter bit added to an individual indicates whether uniform crossover or two-point crossover is performed on the parents
  - Applied to N-peak problems
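One-bit self-adaptation of the crossover operator can be sketched as below. The convention that the first parent's strategy bit decides the operator is an assumption for illustration (the original scheme may resolve a parental disagreement differently):

```python
import random

def self_adaptive_crossover(parent1, parent2):
    """Each parent is (bits, flag): a bit string plus one strategy bit.
    flag == 1 selects uniform crossover, flag == 0 selects two-point
    crossover; here the first parent's flag decides (an illustrative
    convention).  The child inherits the deciding flag."""
    bits1, flag = parent1
    bits2, _ = parent2
    n = len(bits1)
    if flag:
        # uniform crossover: each position taken from a random parent
        child = [random.choice(pair) for pair in zip(bits1, bits2)]
    else:
        # two-point crossover: swap the middle segment
        i, j = sorted(random.sample(range(n + 1), 2))
        child = bits1[:i] + bits2[i:j] + bits1[j:]
    return child, flag
```

Because the strategy bit is inherited and mutated like any other gene, the population can shift toward whichever operator is producing fitter offspring.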

