Evolutionary Programming (A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing, Chapter 5)

EP quick overview
- Developed: USA in the 1960s
- Early names: L. Fogel (1960s), later D. Fogel
- Typically applied to:
  - traditional EP: machine learning tasks by finite state machines
  - contemporary EP: (numerical) optimisation
- Attributed features:
  - very open framework: any representation and mutation operators are OK
  - crossbred with ES (contemporary EP)
  - consequently: hard to say what "standard" EP is
- Special:
  - no recombination
  - self-adaptation of parameters standard (contemporary EP)

EP technical summary tableau
- Representation: real-valued vectors
- Recombination: none
- Mutation: Gaussian perturbation
- Parent selection: deterministic (each parent creates one offspring)
- Survivor selection: probabilistic (μ + μ)
- Specialty: self-adaptation of mutation step sizes (in meta-EP)

Historical EP perspective
- EP aimed at achieving intelligence
- Intelligence was viewed as adaptive behaviour
- Prediction of the environment was considered a prerequisite for adaptive behaviour
- Thus: the capability to predict is key to intelligence

Prediction by finite state machines
A finite state machine (FSM) consists of:
- states S
- inputs I
- outputs O
- transition function δ : S × I → S × O
An FSM transforms an input stream into an output stream, and can be used for prediction, e.g. to predict the next input symbol in a sequence.

FSM example
Consider the FSM with:
- S = {A, B, C}
- I = {0, 1}
- O = {a, b, c}
- δ given by a diagram (not reproduced here)

FSM as predictor
- Consider the following FSM
- Task: predict the next input symbol
- Quality: percentage of steps where out_i = in_(i+1)
- Given initial state C and input sequence 011101, the FSM produces output 110111
- Quality: 3 correct predictions out of 5
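The predictor mechanics above can be sketched in a few lines of Python. The transition table below is an assumption (the slide's FSM diagram is not reproduced), chosen so that it reproduces the slide's numbers: start state C, input 011101, output 110111, quality 3/5.

```python
def run_fsm(delta, start, inputs):
    """Feed the input stream through the FSM, collecting the output stream."""
    state, outputs = start, []
    for symbol in inputs:
        state, out = delta[(state, symbol)]
        outputs.append(out)
    return outputs

def prediction_quality(inputs, outputs):
    """(hits, trials): how often out_i correctly predicted in_(i+1)."""
    hits = sum(1 for i in range(len(inputs) - 1) if outputs[i] == inputs[i + 1])
    return hits, len(inputs) - 1

# Assumed transition table over S = {A, B, C}, I = O = {0, 1}:
# delta[(state, input)] = (next_state, output)
delta = {
    ('A', 0): ('B', 1), ('A', 1): ('A', 1),
    ('B', 0): ('A', 0), ('B', 1): ('C', 1),
    ('C', 0): ('B', 1), ('C', 1): ('A', 0),
}
outputs = run_fsm(delta, 'C', [0, 1, 1, 1, 0, 1])   # -> [1, 1, 0, 1, 1, 1]
hits, trials = prediction_quality([0, 1, 1, 1, 0, 1], outputs)  # -> (3, 5)
```

Note that for prediction the output alphabet must coincide with the input alphabet, so O = {0, 1} here rather than {a, b, c}.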

Introductory example: evolving FSMs to predict primes
- P(n) = 1 if n is prime, 0 otherwise
- I = N = {1, 2, 3, …, n, …}
- O = {0, 1}
- Correct prediction: out_i = P(in_(i+1))
- Fitness function: 1 point for a correct prediction of the next input, 0 points for an incorrect prediction
- Penalty for "too many" states

Introductory example: evolving FSMs to predict primes
- Parent selection: each FSM is mutated once
- Mutation operators (one selected randomly):
  - change an output symbol
  - change a state transition (i.e. redirect an edge)
  - add a state
  - delete a state
  - change the initial state
- Survivor selection: (μ + μ)
- Results: overfitting; after 202 inputs the best FSM had one state and both outputs 0, i.e. it always predicted "not prime"
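The five mutation operators above can be sketched as follows. The encoding is an assumption: an FSM is a triple (states, start, delta) with integer state ids and delta[(state, input)] = (next_state, output) over I = O = {0, 1}.

```python
import random

def mutate_fsm(states, start, delta, rng=random):
    """Apply one randomly chosen EP-style mutation to a copy of the FSM."""
    states, delta = list(states), dict(delta)
    op = rng.choice(['output', 'transition', 'add', 'delete', 'start'])
    if op == 'output':                        # change an output symbol
        key = rng.choice(list(delta))
        nxt, out = delta[key]
        delta[key] = (nxt, 1 - out)
    elif op == 'transition':                  # redirect an edge
        key = rng.choice(list(delta))
        nxt, out = delta[key]
        delta[key] = (rng.choice(states), out)
    elif op == 'add':                         # add a state with random edges
        new = max(states) + 1
        states.append(new)
        for i in (0, 1):
            delta[(new, i)] = (rng.choice(states), rng.choice((0, 1)))
    elif op == 'delete' and len(states) > 1:  # delete a non-initial state
        dead = rng.choice([s for s in states if s != start])
        states.remove(dead)
        delta = {k: v for k, v in delta.items() if k[0] != dead}
        for key, (nxt, out) in delta.items():
            if nxt == dead:                   # reroute edges into the dead state
                delta[key] = (rng.choice(states), out)
    else:                                     # change the initial state
        start = rng.choice(states)            # (also the fallback for a 1-state FSM)
    return states, start, delta
```

Each operator preserves well-formedness: every surviving state keeps a transition for both input symbols, and every edge targets an existing state.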

Modern EP
- No predefined representation in general
- Thus: no predefined mutation (it must match the representation)
- Often applies self-adaptation of mutation parameters
- In the sequel we present one EP variant, not the canonical EP

Representation
- For continuous parameter optimisation
- Chromosomes consist of two parts:
  - object variables: x_1, …, x_n
  - mutation step sizes: σ_1, …, σ_n
- Full chromosome: ⟨x_1, …, x_n, σ_1, …, σ_n⟩

Mutation
- Chromosomes: ⟨x_1, …, x_n, σ_1, …, σ_n⟩
- σ_i′ = σ_i · (1 + α · N(0,1))
- x_i′ = x_i + σ_i′ · N_i(0,1)
- α ≈ 0.2
- Boundary rule: σ′ < ε_0 ⇒ σ′ = ε_0
- Other variants proposed and tried:
  - lognormal scheme as in ES
  - using variance instead of standard deviation
  - mutate σ last
  - other distributions, e.g. Cauchy instead of Gaussian
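As a minimal sketch, the self-adaptive update above looks like this in Python (α = 0.2 as on the slide; the floor ε_0 = 1e-4 is an assumed value, and some variants mutate the object variables before the step sizes):

```python
import random

ALPHA, EPS0 = 0.2, 1e-4  # alpha from the slide; epsilon_0 is assumed

def mutate(x, sigma, rng=random):
    """Self-adaptive Gaussian mutation: update sigma first, then the x_i."""
    # sigma_i' = sigma_i * (1 + alpha * N(0,1)), floored by the boundary rule
    sigma_new = [max(s * (1 + ALPHA * rng.gauss(0, 1)), EPS0) for s in sigma]
    # x_i' = x_i + sigma_i' * N_i(0,1)
    x_new = [xi + si * rng.gauss(0, 1) for xi, si in zip(x, sigma_new)]
    return x_new, sigma_new
```

The boundary rule is implemented with `max`, guaranteeing that no step size collapses to zero (or goes negative), which would freeze that coordinate forever.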

Recombination
- None
- Rationale: a point in the search space stands for a species, not for an individual, and there can be no crossover between species
- Much historical debate: "mutation vs. crossover"
- A pragmatic approach seems to prevail today

Parent selection
- Each individual creates one child by mutation
- Thus: deterministic, and not biased by fitness

Survivor selection
- P(t): μ parents, P′(t): μ offspring
- Pairwise competitions in round-robin format:
  - each solution x from P(t) ∪ P′(t) is evaluated against q other randomly chosen solutions
  - for each comparison, a "win" is assigned if x is better than its opponent
  - the μ solutions with the greatest number of wins are retained as parents of the next generation
- The parameter q allows tuning the selection pressure (typically q = 10)
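The round-robin scheme above can be sketched as follows (`fitness` is assumed to be a score to maximise; for minimisation, flip the comparison):

```python
import random

def survivor_selection(parents, offspring, fitness, mu, q=10, rng=random):
    """(mu + mu) round-robin survivor selection with q opponents each."""
    pool = parents + offspring
    scores = [fitness(x) for x in pool]
    wins = []
    for i in range(len(pool)):
        # q randomly chosen opponents other than i itself
        rivals = rng.sample([j for j in range(len(pool)) if j != i], q)
        wins.append(sum(1 for j in rivals if scores[i] > scores[j]))
    ranked = sorted(range(len(pool)), key=lambda i: wins[i], reverse=True)
    return [pool[i] for i in ranked[:mu]]
```

With q equal to the full pool size minus one, selection becomes a deterministic (μ + μ) truncation; smaller q makes it increasingly stochastic, which is exactly the pressure-tuning role of q described above.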

Example application: the Ackley function (Bäck et al. ’93)
- The Ackley function (here used with n = 30)
- Representation:
  - −30 < x_i < 30 (coincidence of 30s!)
  - 30 variances as step sizes
- Mutation with changing object variables first!
- Population size μ = 200, selection with q = 10
- Termination: after 200,000 fitness evaluations
- Results: average best solution is 1.4 · 10⁻²

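For reference, the Ackley function in its standard form (constants a = 20, b = 0.2, c = 2π; global minimum f(0, …, 0) = 0):

```python
import math

def ackley(x):
    """Standard n-dimensional Ackley function; minimum 0 at the origin."""
    n = len(x)
    s1 = sum(xi * xi for xi in x)                       # sum of squares
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x)    # sum of cosines
    return (-20 * math.exp(-0.2 * math.sqrt(s1 / n))
            - math.exp(s2 / n) + 20 + math.e)
```

The many shallow local optima produced by the cosine term are what make self-adapting step sizes useful here: large steps early to cross basins, small steps late to descend the central funnel.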

Example application: evolving checkers players (Fogel ’02)
- Neural nets for evaluating future values of moves are evolved
- NNs have a fixed structure with 5046 weights; these are evolved, plus one weight for "kings"
- Representation:
  - vector of 5046 real numbers for the object variables (weights)
  - vector of 5046 real numbers for the σ's
- Mutation:
  - Gaussian, lognormal scheme with σ-first
  - plus a special mechanism for the kings' weight
- Population size 15

Example application: evolving checkers players (Fogel ’02)
- Tournament size q = 5
- Programs (with the NN inside) play against other programs; no human trainer or hard-wired intelligence
- After 840 generations (6 months!) the best strategy was tested against humans via the Internet
- The program earned an "expert class" ranking, outperforming 99.61% of all rated players