A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing: Evolutionary Programming

Presentation transcript:

Evolutionary Programming
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing

EP quick overview
Developed: USA in the 1960s
Early names: D. Fogel
Typically applied to:
– traditional EP: machine learning tasks by finite state machines
– contemporary EP: (numerical) optimization
Attributed features:
– very open framework: any representation and mutation op's OK
– crossbred with ES (contemporary EP)
– consequently: hard to say what "standard" EP is
Special:
– no recombination
– self-adaptation of parameters standard (contemporary EP)

EP technical summary tableau
Representation: real-valued vectors
Recombination: none
Mutation: Gaussian perturbation
Parent selection: deterministic
Survivor selection: probabilistic (μ + λ)
Specialty: self-adaptation of mutation step sizes (in meta-EP)

Historical EP perspective
EP aimed at achieving intelligence
Intelligence was viewed as adaptive behaviour
Prediction of the environment was considered a prerequisite to adaptive behaviour
Thus: the capability to predict is key to intelligence

Prediction by finite state machines
Finite state machine (FSM):
– states S
– inputs I
– outputs O
– transition function δ : S × I → S × O
– transforms an input stream into an output stream
Can be used for prediction, e.g. to predict the next input symbol in a sequence

Vending machine
A vending machine sells water for 15p
States:
1. Got 0p
2. Got 5p
3. Got 10p
Inputs:
1. 5p
2. 10p
Output: a can of water
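A minimal sketch of this machine as a transition table, purely illustrative (the state names, the dict-based encoding, and the overpayment-as-credit rule are assumptions, not from the slides):

```python
# Hypothetical encoding of the 15p vending machine as a Mealy-style FSM.
# (state, coin) -> (next_state, output); state names are illustrative.
TRANSITIONS = {
    ("got0p",  "5p"):  ("got5p",  None),
    ("got0p",  "10p"): ("got10p", None),
    ("got5p",  "5p"):  ("got10p", None),
    ("got5p",  "10p"): ("got0p",  "water"),   # 15p reached: dispense
    ("got10p", "5p"):  ("got0p",  "water"),   # 15p reached: dispense
    ("got10p", "10p"): ("got5p",  "water"),   # assumes 5p overpayment is kept as credit
}

def run_fsm(coins, state="got0p"):
    """Feed a coin sequence through the machine, collecting outputs."""
    outputs = []
    for coin in coins:
        state, out = TRANSITIONS[(state, coin)]
        outputs.append(out)
    return outputs

print(run_fsm(["5p", "5p", "5p"]))  # [None, None, 'water']
```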

FSM example
Consider the FSM with:
– S = {A, B, C}
– I = {0, 1}
– O = {a, b, c}
– δ given by a diagram (the transition diagram is not reproduced in this transcript)

FSM as predictor
Consider the FSM above
Task: predict the next input
Quality: percentage of steps where out_i = in_(i+1)
Given initial state C, the input sequence (shown on the original slide) leads to an output sequence with quality 3 out of 5
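As a sketch, the quality measure can be computed by running the FSM over the input stream and counting how often its output matches the next input; this hypothetical helper assumes the same (state, input) -> (next_state, output) encoding as the vending-machine sketch above:

```python
def prediction_quality(transitions, state, inputs):
    """Run the FSM over the input stream and return the fraction of steps
    where the emitted output equals the next input symbol."""
    hits = 0
    for current, nxt in zip(inputs, inputs[1:]):
        state, out = transitions[(state, current)]
        hits += (out == nxt)
    return hits / (len(inputs) - 1) if len(inputs) > 1 else 0.0
```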

Introductory example: evolving FSMs to predict primes
P(n) = 1 if n is prime, 0 otherwise
I = N = {1, 2, 3, …, n, …}
O = {0, 1}
Correct prediction: out_i = P(in_(i+1))
Fitness function:
– 1 point for a correct prediction of the next input
– 0 points for an incorrect prediction
– penalty for too many states

Introductory example: evolving FSMs to predict primes
Parent selection: each FSM is mutated once
Mutation operators (one selected randomly; a sketch of one operator follows below):
– change an output symbol
– change a state transition (i.e. redirect an edge)
– add a state
– delete a state
– change the initial state
Survivor selection: (μ + λ)
Results: overfitting; after 202 inputs the best FSM had one state and both outputs were 0, i.e., it always predicted "not prime"
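As an illustration of the second operator in the list (redirecting an edge), a rough sketch under the same dict-based FSM encoding assumed earlier:

```python
import random

def mutate_redirect_edge(fsm):
    """Redirect one randomly chosen transition to a random target state,
    leaving its output symbol unchanged."""
    child = dict(fsm)                      # copy: (mu + lambda) keeps the parent
    states = list({s for s, _ in child})   # states occurring as sources
    key = random.choice(list(child))
    _, out = child[key]
    child[key] = (random.choice(states), out)
    return child
```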

Modern EP
No predefined representation in general
Thus: no predefined mutation (it must match the representation)
Often applies self-adaptation of mutation parameters
In the sequel we present one EP variant, not the canonical EP

Representation
For continuous parameter optimisation
Chromosomes consist of two parts:
– object variables: x_1, …, x_n
– mutation step sizes: σ_1, …, σ_n
Full chromosome: ⟨x_1, …, x_n, σ_1, …, σ_n⟩

Mutation
Chromosomes: ⟨x_1, …, x_n, σ_1, …, σ_n⟩
σ_i' = σ_i · (1 + α · N(0,1))
x_i' = x_i + σ_i' · N_i(0,1)
α ≈ 0.2
Boundary rule: σ' < ε_0 ⇒ σ' = ε_0
Other variants proposed & tried:
– lognormal scheme as in ES
– using variance instead of standard deviation
– mutate σ last
– other distributions, e.g. Cauchy instead of Gaussian
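A minimal sketch of the scheme above; α = 0.2 and the boundary rule come from the slide, while the concrete value of ε_0 is an assumption:

```python
import random

ALPHA = 0.2    # alpha from the slide
EPS_0 = 1e-6   # assumed floor for step sizes (epsilon_0 not given on the slide)

def ep_mutate(xs, sigmas):
    """sigma_i' = sigma_i * (1 + alpha * N(0,1)), floored at EPS_0;
    x_i' = x_i + sigma_i' * N_i(0,1)."""
    new_sigmas = [max(s * (1 + ALPHA * random.gauss(0, 1)), EPS_0) for s in sigmas]
    new_xs = [x + s * random.gauss(0, 1) for x, s in zip(xs, new_sigmas)]
    return new_xs, new_sigmas
```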

Recombination
None
Rationale: one point in the search space stands for a species, not for an individual, and there can be no crossover between species
Much historical debate about "mutation vs. crossover"
A pragmatic approach seems to prevail today

Parent selection
Each individual creates one child by mutation
Thus:
– deterministic
– not biased by fitness

Survivor selection
P(t): μ parents, P'(t): μ offspring
Pairwise competitions in round-robin format:
– each solution x from P(t) ∪ P'(t) is evaluated against q other randomly chosen solutions
– for each comparison, a "win" is assigned if x is better than its opponent
– the μ solutions with the greatest number of wins are retained to be parents of the next generation
The parameter q allows tuning of the selection pressure
Typically q = 10
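A sketch of this round-robin tournament, assuming fitness is to be maximised, fitness values are precomputed, and the merged pool has more than q members:

```python
import random

def survivor_selection(candidates, fitnesses, mu, q=10):
    """candidates: merged P(t) + P'(t); fitnesses: matching fitness list.
    Each candidate meets q random opponents; the mu with most wins survive."""
    indices = range(len(candidates))
    wins = []
    for i in indices:
        opponents = random.sample([j for j in indices if j != i], q)
        wins.append(sum(fitnesses[i] > fitnesses[j] for j in opponents))
    ranked = sorted(indices, key=lambda i: wins[i], reverse=True)
    return [candidates[i] for i in ranked[:mu]]
```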

Example application: the Ackley function (Bäck et al. '93)
The Ackley function (here used with n = 30):
f(x) = −20 · exp(−0.2 · √((1/n) Σ_i x_i²)) − exp((1/n) Σ_i cos(2π x_i)) + 20 + e
Representation:
– −30 < x_i < 30 (a coincidence of 30s!)
– 30 variances as step sizes
Mutation with the object variables changed first!
Population size μ = 200, selection with q = 10
Termination: after a fixed number of fitness evaluations
Results: average best solutions on the order of 10⁻²
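For reference, the standard Ackley function in code (the slide's formula image did not survive transcription; this is the usual textbook definition):

```python
import math

def ackley(xs):
    """Standard Ackley function; global minimum f(0, ..., 0) = 0."""
    n = len(xs)
    sum_sq = sum(x * x for x in xs)
    sum_cos = sum(math.cos(2 * math.pi * x) for x in xs)
    return (-20 * math.exp(-0.2 * math.sqrt(sum_sq / n))
            - math.exp(sum_cos / n) + 20 + math.e)

print(ackley([0.0] * 30))  # 0.0 (up to floating-point error)
```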

Example application: evolving checkers players (Fogel '02)
Neural nets for evaluating future values of moves are evolved
The NNs have a fixed structure with 5046 weights; these are evolved, plus one weight for "kings"
Representation:
– vector of 5046 real numbers for the object variables (weights)
– vector of 5046 real numbers for the σ's
Mutation:
– Gaussian, lognormal scheme with σ-first
– plus a special mechanism for the kings' weight
Population size 15
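A sketch of the lognormal, σ-first scheme in the ES style; the learning rates below follow common ES heuristics, since the slide does not give Fogel's exact settings:

```python
import math
import random

def lognormal_mutate(xs, sigmas):
    """Lognormal self-adaptation, sigma-first: update all step sizes,
    then perturb each object variable with its new step size."""
    n = len(xs)
    tau_global = 1.0 / math.sqrt(2.0 * n)            # common ES heuristic
    tau_local = 1.0 / math.sqrt(2.0 * math.sqrt(n))  # common ES heuristic
    g = random.gauss(0, 1)                           # one draw shared by all sigmas
    new_sigmas = [s * math.exp(tau_global * g + tau_local * random.gauss(0, 1))
                  for s in sigmas]
    new_xs = [x + s * random.gauss(0, 1) for x, s in zip(xs, new_sigmas)]
    return new_xs, new_sigmas
```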

Example application: evolving checkers players (Fogel '02)
Tournament size q = 5
Programs (with the NN inside) play against other programs; no human trainer or hard-wired intelligence
After 840 generations (six months!) the best strategy was tested against humans via the Internet
The program earned an "expert class" ranking, outperforming 99.61% of all rated players

D. B. Fogel, T. J. Hays, S. L. Hahn and J. Quon, "A self-learning evolutionary chess program," Proceedings of the IEEE, vol. 92, no. 12, Dec. 2004.
Abstract: A central challenge of artificial intelligence is to create machines that can learn from their own experience and perform at the level of human experts. Using an evolutionary algorithm, a computer program has learned to play chess by playing games against itself. The program learned to evaluate chessboard configurations by using the positions of pieces, material and positional values, and neural networks to assess specific sections of the chessboard. During evolution, the program improved its play by almost 400 rating points. Testing under simulated tournament conditions against Pocket Fritz 2.0 indicated that the evolved program performs above the master level.