Machine Learning Evolutionary Algorithm (2)

Evolutionary Programming

EP quick overview
- Developed: USA in the 1960s
- Early names: L. Fogel (traditional EP); D. Fogel (contemporary EP)
- Typically applied to:
  - traditional EP: machine learning tasks via finite state machines
  - contemporary EP: (numerical) optimisation
- Attributed features:
  - very open framework: any representation and any mutation operators are OK
  - crossbred with ES (contemporary EP)
  - consequently: hard to say what "standard" EP is
- Special:
  - no recombination
  - self-adaptation of parameters is standard (contemporary EP)

Evolutionary Programming (EP)
- There is no fixed structure for the representation.
- Mutation is the only variation operator; crossover is not used in this method.
- Each child is derived from its parent by mutation.
So we can conclude that there are three steps:
1. Initialize the population and calculate fitness values for the initial population
2. Mutate the parents and generate the new population
3. Calculate fitness values of the new generation and continue from the second step
A minimal code sketch of this loop follows.
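Below is an illustrative sketch of these three steps for contemporary EP on real-valued vectors with Gaussian mutation. The fitness function, population size, and mutation scale are assumptions for the example, and the probabilistic survivor selection of standard EP is simplified to truncation here.

    import random

    def evolutionary_programming(fitness, dim=2, pop_size=20, generations=100, sigma=0.1):
        # Step 1: initialise the population (fitness is evaluated on demand below)
        population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            # Step 2: every parent produces exactly one child by Gaussian mutation;
            # there is no recombination in EP
            children = [[x + random.gauss(0, sigma) for x in parent] for parent in population]
            # Step 3: evaluate the new generation and select survivors from
            # parents + children (standard EP uses probabilistic round-robin
            # tournaments here; plain truncation keeps the sketch short)
            pool = population + children
            pool.sort(key=fitness)  # minimisation
            population = pool[:pop_size]
        return population[0]

    # Illustrative usage: minimise the sphere function
    best = evolutionary_programming(lambda v: sum(x * x for x in v))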

EP technical summary tableau
- Representation: finite state machines (traditional EP); real-valued vectors (contemporary EP)
- Recombination: none
- Mutation: Gaussian perturbation (of real-valued vectors)
- Parent selection: deterministic (each parent creates one offspring via mutation)
- Survivor selection: probabilistic (μ+μ)
- Specialty: self-adaptation of mutation step sizes (in meta-EP)

Prediction by finite state machines
A finite state machine (FSM) consists of:
- states S
- inputs I
- outputs O
- a transition function δ : S × I → S × O
An FSM transforms an input stream into an output stream, and can be used for prediction, e.g. to predict the next input symbol in a sequence.

FSM example
Consider the FSM with:
- S = {A, B, C}
- I = {0, 1}
- O = {a, b, c}
- δ given by a state diagram (not reproduced in this transcript)

FSM as predictor
- Consider the FSM above; the task is to predict the next input
- Quality: the percentage of positions where out_i = in_(i+1)
- Given initial state C and input sequence 011101, the FSM produces output 110111
- Quality: 3 correct predictions out of 5
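The transition diagram itself is not reproduced in this transcript, so the table below is a hypothetical stand-in chosen to reproduce the slide's numbers; the code only illustrates how an FSM maps the input stream 011101 to an output stream and how the prediction quality is scored.

    # Hypothetical transition table: delta maps (state, input) -> (next state, output)
    delta = {
        ('A', 0): ('B', 1), ('A', 1): ('C', 0),
        ('B', 0): ('A', 0), ('B', 1): ('B', 1),
        ('C', 0): ('C', 1), ('C', 1): ('A', 1),
    }

    def run_fsm(delta, state, inputs):
        # Transform an input stream into an output stream
        outputs = []
        for symbol in inputs:
            state, out = delta[(state, symbol)]
            outputs.append(out)
        return outputs

    def prediction_quality(inputs, outputs):
        # Count positions where out_i equals in_(i+1)
        return sum(o == nxt for o, nxt in zip(outputs, inputs[1:]))

    inputs = [0, 1, 1, 1, 0, 1]
    outputs = run_fsm(delta, 'C', inputs)       # -> [1, 1, 0, 1, 1, 1], i.e. 110111
    print(prediction_quality(inputs, outputs))  # -> 3 (out of 5)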

Introductory example: evolving FSMs to predict primes
- P(n) = 1 if n is prime, 0 otherwise
- I = N = {1, 2, 3, …, n, …}
- O = {0, 1}
- Correct prediction: out_i = P(in_(i+1))
- Fitness function:
  - 1 point for a correct prediction of the next input
  - 0 points for an incorrect prediction
  - a penalty for too many states

Introductory example: evolving FSMs to predict primes
- Parent selection: each FSM is mutated once
- Mutation operators (one selected randomly):
  - change an output symbol
  - change a state transition (i.e. redirect an edge)
  - add a state
  - delete a state
  - change the initial state
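Here is a sketch of the five operators on a simple dictionary encoding of an FSM; the encoding and helper names are illustrative assumptions, not from the slides. States are integers, and fsm['delta'] maps (state, input) to (next state, output).

    import random

    def mutate_fsm(fsm, inputs=(0, 1), outputs=(0, 1)):
        # Apply one randomly chosen mutation operator to a copy of the FSM
        fsm = {'states': list(fsm['states']), 'init': fsm['init'],
               'delta': dict(fsm['delta'])}
        op = random.choice(['output', 'transition', 'add', 'delete', 'init'])
        if op == 'output':        # change an output symbol
            key = random.choice(list(fsm['delta']))
            nxt, _ = fsm['delta'][key]
            fsm['delta'][key] = (nxt, random.choice(outputs))
        elif op == 'transition':  # change a state transition (redirect an edge)
            key = random.choice(list(fsm['delta']))
            _, out = fsm['delta'][key]
            fsm['delta'][key] = (random.choice(fsm['states']), out)
        elif op == 'add':         # add a state with random outgoing edges
            new = max(fsm['states']) + 1
            fsm['states'].append(new)
            for i in inputs:
                fsm['delta'][(new, i)] = (random.choice(fsm['states']),
                                          random.choice(outputs))
        elif op == 'delete' and len(fsm['states']) > 1:  # delete a non-initial state
            gone = random.choice([s for s in fsm['states'] if s != fsm['init']])
            fsm['states'].remove(gone)
            fsm['delta'] = {k: v for k, v in fsm['delta'].items() if k[0] != gone}
            for k, (nxt, out) in fsm['delta'].items():   # repair dangling edges
                if nxt == gone:
                    fsm['delta'][k] = (random.choice(fsm['states']), out)
        else:                     # change the initial state
            fsm['init'] = random.choice(fsm['states'])
        return fsm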

Genetic Programming

THE CHALLENGE
"How can computers learn to solve problems without being explicitly programmed? In other words, how can computers be made to do what is needed to be done, without being told exactly how to do it?" (attributed to Arthur Samuel, 1959)

GP quick overview
- Developed: USA in the 1990s
- Early names: J. Koza
- Typically applied to: machine learning tasks (prediction, classification, …)
- Attributed features:
  - competes with neural nets and the like
  - needs huge populations (thousands)
  - slow
- Special:
  - non-linear chromosomes: trees, graphs
  - mutation possible but not necessary (disputed!)

GP technical summary tableau
- Representation: tree structures
- Recombination: exchange of subtrees
- Mutation: random change in trees
- Parent selection: fitness proportional
- Survivor selection: generational replacement

REPRESENTATIONS
- Decision trees
- If-then production rules
- Horn clauses
- Neural nets
- Bayesian networks
- Frames
- Propositional logic
- Binary decision diagrams
- Formal grammars
- Coefficients for polynomials
- Reinforcement learning tables
- Conceptual clusters
- Classifier systems

A COMPUTER PROGRAM (slide figure not reproduced in the transcript)

Introductory example: credit scoring
A bank wants to distinguish good from bad loan applicants, so it needs a model that matches historical data:

ID    No of children  Salary  Marital status  OK?
ID-1  2               45000   Married         0
ID-2  0               30000   Single          1
ID-3  1               40000   Divorced        1
…

Introductory example: credit scoring
A possible model: IF (NOC = 2) AND (S > 80000) THEN good ELSE bad
In general: IF formula THEN good ELSE bad
The only unknown is the right formula, hence:
- natural fitness of a formula: the percentage of well-classified cases of the model it stands for
- natural representation of formulas (genotypes): parse trees
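A small sketch of this "natural fitness": the percentage of historical cases a candidate rule classifies correctly. The rule, data, and labels here are illustrative (the data echoes the table above, reading OK? = 1 as "good").

    def classify(noc, salary):
        # The example model: IF (NOC = 2) AND (S > 80000) THEN good ELSE bad
        return 'good' if noc == 2 and salary > 80000 else 'bad'

    def fitness(model, cases):
        # Percentage of well-classified historical cases
        hits = sum(model(noc, s) == label for noc, s, label in cases)
        return 100.0 * hits / len(cases)

    # (NOC, salary, label) triples based on the table above
    data = [(2, 45000, 'bad'), (0, 30000, 'good'), (1, 40000, 'good')]
    print(fitness(classify, data))  # -> 33.3... for this particular rule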

Introductory example: credit scoring
IF (NOC = 2) AND (S > 80000) THEN good ELSE bad can be represented by the following parse tree:

AND
├── =
│   ├── NOC
│   └── 2
└── >
    ├── S
    └── 80000

Tree based representation (example parse tree figure not reproduced in the transcript)

Tree based representation
(x ∧ true) → ((x ∨ y) ∨ (z ↔ (x ∧ y)))

Tree based representation
while (i < 20) { i = i + 1 }

Tree based representation
- In GA, ES, EP chromosomes are linear structures (bit strings, integer strings, real-valued vectors, permutations)
- Tree-shaped chromosomes are non-linear structures
- In GA, ES, EP the size of the chromosomes is fixed
- Trees in GP may vary in depth and width

Tree based representation
Symbolic expressions can be defined by:
- a terminal set T
- a function set F (with the arities of the function symbols)
adopting the following general recursive definition:
1. every t ∈ T is a correct expression
2. f(e1, …, en) is a correct expression if f ∈ F, arity(f) = n, and e1, …, en are correct expressions
3. there are no other forms of correct expressions
In general, expressions in GP are not typed (closure property: any f ∈ F can take any g ∈ F as argument)

CREATING RANDOM PROGRAMS
- Available functions: F = {+, -, *, %, IFLTE} (% is protected division)
- Available terminals: T = {X, Y, Random-Constants}
The random programs are:
- of different sizes and shapes
- syntactically valid
- executable (a sketch of executing such programs follows)
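The following sketch shows one way such programs can be represented (as nested tuples) and executed in Python; the names are illustrative, and protected division returns 1 on a zero divisor, following Koza's convention. + and * are written as variadic so the four-argument example on the mutation slides below also runs.

    import operator
    from functools import reduce

    def protected_div(a, b):
        # Koza-style protected division: returns 1 when dividing by zero
        return a / b if b != 0 else 1

    FUNCTIONS = {
        '+': lambda *args: sum(args),
        '-': lambda a, b: a - b,
        '*': lambda *args: reduce(operator.mul, args, 1),
        '%': protected_div,
    }

    def evaluate(expr, env):
        # expr is a nested tuple such as ('+', 'X', ('*', 3, 'Y'))
        if isinstance(expr, tuple):
            op, *args = expr
            if op == 'IFLTE':  # (IFLTE a b then other): if a <= b return then, else other
                a, b, then, other = (evaluate(x, env) for x in args)
                return then if a <= b else other
            return FUNCTIONS[op](*(evaluate(a, env) for a in args))
        if isinstance(expr, str):  # terminal variable
            return env[expr]
        return expr                # random constant

    print(evaluate(('+', 2, 3, ('*', 'X', 7), ('%', 'Y', 5)), {'X': 1, 'Y': 10}))  # -> 14.0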

GA flowchart vs. GP flowchart (side-by-side comparison figure not reproduced in the transcript)

Mutation
Start from the program (+ 2 3 (* X 7) (/ Y 5)):

+
├── 2
├── 3
├── *
│   ├── X
│   └── 7
└── /
    ├── Y
    └── 5

Mutation
First pick a random node, e.g. the subtree (* X 7) in (+ 2 3 (* X 7) (/ Y 5)).

Mutation
Delete the node and its children, and replace it with a randomly generated program, giving (+ 2 3 (+ (* 4 2) 3) (/ Y 5)):

+
├── 2
├── 3
├── +
│   ├── *
│   │   ├── 4
│   │   └── 2
│   └── 3
└── /
    ├── Y
    └── 5

A code sketch of this operator follows.
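Below is a sketch of subtree mutation on programs stored as nested Python lists; random_subtree, all_nodes, and replace_at are illustrative helpers, not names from the slides.

    import random

    def random_subtree(depth=2):
        # Tiny stand-in generator for the randomly created replacement program
        if depth == 0 or random.random() < 0.3:
            return random.choice(['X', 'Y', random.randint(0, 9)])
        return [random.choice(['+', '-', '*', '%']),
                random_subtree(depth - 1), random_subtree(depth - 1)]

    def all_nodes(tree, path=()):
        # Yield (path, subtree) pairs; a path is a tuple of child indices
        yield path, tree
        if isinstance(tree, list):
            for i, child in enumerate(tree[1:], start=1):
                yield from all_nodes(child, path + (i,))

    def replace_at(tree, path, new):
        # Return a copy of tree with the subtree at path replaced by new
        if not path:
            return new
        tree = list(tree)
        tree[path[0]] = replace_at(tree[path[0]], path[1:], new)
        return tree

    def mutate(tree):
        # Pick a random node, delete it and its children, splice in a random program
        path, _ = random.choice(list(all_nodes(tree)))
        return replace_at(tree, path, random_subtree())

    program = ['+', 2, 3, ['*', 'X', 7], ['%', 'Y', 5]]
    print(mutate(program))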

Crossover
Parent 1: (+ X (* 3 Y))
+
├── X
└── *
    ├── 3
    └── Y

Parent 2: (- (/ 25 X) 7)
-
├── /
│   ├── 25
│   └── X
└── 7

Crossover
Pick a random node in each program, e.g. the terminal 3 in parent 1, (+ X (* 3 Y)), and the subtree (/ 25 X) in parent 2, (- (/ 25 X) 7).

Crossover
Swap the two subtrees, giving offspring (+ X (* (/ 25 X) Y)) and (- 3 7). A code sketch follows.
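Subtree crossover can reuse the same tree helpers as the mutation sketch; they are repeated here so the sketch is self-contained, and the helper names remain illustrative.

    import random

    def all_nodes(tree, path=()):
        # Yield (path, subtree) pairs; a path is a tuple of child indices
        yield path, tree
        if isinstance(tree, list):
            for i, child in enumerate(tree[1:], start=1):
                yield from all_nodes(child, path + (i,))

    def replace_at(tree, path, new):
        # Return a copy of tree with the subtree at path replaced by new
        if not path:
            return new
        tree = list(tree)
        tree[path[0]] = replace_at(tree[path[0]], path[1:], new)
        return tree

    def crossover(p1, p2):
        # Pick a random node in each parent and swap the two subtrees
        path1, sub1 = random.choice(list(all_nodes(p1)))
        path2, sub2 = random.choice(list(all_nodes(p2)))
        return replace_at(p1, path1, sub2), replace_at(p2, path2, sub1)

    parent1 = ['+', 'X', ['*', 3, 'Y']]
    parent2 = ['-', ['%', 25, 'X'], 7]
    child1, child2 = crossover(parent1, parent2)  # e.g. the swap shown above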

Mutation cont'd
Mutation has two parameters:
- the probability pm of choosing mutation vs. recombination
- the probability of choosing an internal point as the root of the subtree to be replaced
Remarkably, pm is advised to be 0 (Koza '92) or very small, like 0.05 (Banzhaf et al. '98). The size of the child can exceed the size of the parent.

Recombination
The most common recombination exchanges two randomly chosen subtrees between the parents. Recombination has two parameters:
- the probability pc of choosing recombination vs. mutation
- the probability of choosing an internal point within each parent as the crossover point
The size of the offspring can exceed that of the parents.

Selection
- Parent selection is typically fitness proportionate
- Over-selection is used in very large populations (see the sketch below):
  - rank the population by fitness and divide it into two groups: group 1 is the best x% of the population, group 2 the other (100 - x)%
  - 80% of selection operations choose from group 1, 20% from group 2
  - for population sizes 1000, 2000, 4000, 8000 use x = 32%, 16%, 8%, 4%
  - motivation: increased efficiency; the percentages come from a rule of thumb
- Survivor selection:
  - typical: generational scheme (thus no survivor selection as such)
  - recently, steady-state replacement is becoming popular for its elitism
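A sketch of over-selection under the rule of thumb above; uniform choice inside each group is a simplification of the fitness-proportionate choice used in practice.

    import random

    # Rule-of-thumb x for population sizes 1000 / 2000 / 4000 / 8000
    X_FOR_SIZE = {1000: 32, 2000: 16, 4000: 8, 8000: 4}

    def over_select(population, fitness, x_percent):
        # Rank by fitness, best first, and split into the top x% and the rest
        ranked = sorted(population, key=fitness, reverse=True)
        cut = max(1, len(ranked) * x_percent // 100)
        group1, group2 = ranked[:cut], ranked[cut:]
        # 80% of selection operations draw from group 1, 20% from group 2
        if random.random() < 0.8 or not group2:
            return random.choice(group1)
        return random.choice(group2)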

Initialisation
A maximum initial depth Dmax of trees is set.
- Full method (each branch has depth = Dmax):
  - nodes at depth d < Dmax are randomly chosen from the function set F
  - nodes at depth d = Dmax are randomly chosen from the terminal set T
- Grow method (each branch has depth ≤ Dmax):
  - nodes at depth d < Dmax are randomly chosen from F ∪ T
  - nodes at depth d = Dmax are randomly chosen from T
Common GP initialisation: ramped half-and-half, where the grow and full methods each deliver half of the initial population.
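A sketch of the two methods and ramped half-and-half, assuming every function takes exactly two arguments; the sets F and T are illustrative.

    import random

    F = ['+', '-', '*', '%']             # function set (all binary here)
    T = ['X', 'Y', 0, 1, 2, 3, 4, 5]     # terminal set

    def full(depth):
        # Full method: functions above the maximum depth, terminals at it
        if depth == 0:
            return random.choice(T)
        return [random.choice(F), full(depth - 1), full(depth - 1)]

    def grow(depth):
        # Grow method: draw uniformly from F ∪ T above the maximum depth
        if depth == 0 or random.random() < len(T) / (len(T) + len(F)):
            return random.choice(T)
        return [random.choice(F), grow(depth - 1), grow(depth - 1)]

    def ramped_half_and_half(pop_size, d_max):
        # Half the population from grow, half from full, ramped over depths 2..Dmax
        depths = list(range(2, d_max + 1))
        return [full(depths[i % len(depths)]) if i % 2 == 0
                else grow(depths[i % len(depths)])
                for i in range(pop_size)]

    population = ramped_half_and_half(pop_size=10, d_max=4)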

FIVE MAJOR PREPARATORY STEPS FOR GP
1. Determining the set of terminals
2. Determining the set of functions
3. Determining the fitness measure
4. Determining the parameters for the run (e.g. population size, number of generations, minor parameters)
5. Determining the method for designating a result and the criterion for terminating a run

Building a Better Mouse
Apply genetic programming to the problem of navigating a maze. What are our terminal and function sets?
- Function set = {If-Movement-Blocked, While-Not-At-Cheese*}
- Terminal set = {Move-Forward, Turn-Left, Turn-Right}
* While-Not-At-Cheese will be used exclusively as the root node of the parse tree

Building a Better Mouse
How do we get the starving mouse to the cheese? One possible solution:

While not at the cheese:
    If the way ahead is blocked:
        Turn left 90 degrees
    Otherwise:
        Move forward
        Turn right 90 degrees

(The maze figure is not reproduced in the transcript.) How good is this solution? Is there a better solution for this maze?

Building a Better Mouse
A fitness function:
- Each function and terminal other than the root node costs one unit to execute
- If the mouse spends more than 100 units, it dies of hunger
- The fitness measure for a program is determined by executing the program, then squaring the sum of the total units spent and the final distance from the exit
- A lower fitness measure is preferable to a higher one
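The fitness computation itself is one line; units_spent and final_distance would come from actually simulating the program in the maze (the simulator is not shown here).

    def mouse_fitness(units_spent, final_distance):
        # Square of (total units spent + final distance from the exit); lower is better
        return (units_spent + final_distance) ** 2

    # One way to reach the 12996 quoted on the following slides:
    # a mouse that starves after 100 units while 14 squares from the cheese
    print(mouse_fitness(100, 14))  # -> 12996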

Building a Better Mouse
Program (fitness 12996):

While not at the cheese:
    If the way ahead is blocked:
        Turn left 90 degrees
    Otherwise:
        Move forward one space
        Turn right 90 degrees

Building a Better Mouse
Parent (fitness 12996):

While not at the cheese:
    If the way ahead is blocked:
        Turn left 90 degrees
    Otherwise:
        Move forward one space
        Turn right 90 degrees

Mutation produces (fitness 12996):

While not at the cheese:
    If the way ahead is blocked:
        Turn left 90 degrees
    Otherwise:

(The mutated Otherwise branch is not captured in the transcript.)

Building a Better Mouse
Parent (fitness 12996):

While not at the cheese:
    If the way ahead is blocked:
        Turn left 90 degrees
    Otherwise:
        Move forward one space
        Turn right 90 degrees

Crossover produces (fitness 11664):

While not at the cheese:
    If the way ahead is blocked:
        Move forward one space
        Turn right 90 degrees
    Otherwise:

and (fitness 12996):

While not at the cheese:
    Turn left 90 degrees

Building a Better Mouse
After 4202 generations, with 1000 programs per generation:

While not at the cheese:
    If the way ahead is blocked:
        Turn right 90 degrees
        Move forward one space
    Otherwise:
        Turn left 90 degrees

Fitness measure: 2809. Is this better?

Example
We seek a Boolean function that returns particular Boolean output values (0 or 1). The Boolean even-k-parity function of k Boolean arguments returns T (true = 1) if an even number of its Boolean arguments are T, and otherwise returns F (false = 0); this defines the fitness function. Let the inputs be D0, D1, and D2, and the output S.
- Terminal set: D0, D1, D2
- Function set: AND, OR, NAND, NOR
- Fitness cases: the 8 combinations of the three Boolean arguments D0, D1, D2 (the truth table); the error is the number of cases for which the program's output does not equal the correct value of the even-3-parity function
Exercise:
1. Construct 4 genes for this problem using genetic programming
2. Compute the fitness for these genes
3. Perform one crossover (on the best two genes) and one mutation (on the worst gene)
4. Construct the new population
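A sketch of the fitness evaluation for this exercise: a candidate gene is a nested tuple over the function set, and its fitness (error) is the number of the 8 truth-table cases it gets wrong. The example gene at the bottom is arbitrary.

    from itertools import product

    FUNCS = {
        'AND':  lambda a, b: a & b,
        'OR':   lambda a, b: a | b,
        'NAND': lambda a, b: 1 - (a & b),
        'NOR':  lambda a, b: 1 - (a | b),
    }

    def evaluate(expr, env):
        # expr is a nested tuple such as ('NAND', ('OR', 'D0', 'D1'), 'D2')
        if isinstance(expr, tuple):
            op, a, b = expr
            return FUNCS[op](evaluate(a, env), evaluate(b, env))
        return env[expr]

    def even_3_parity(d0, d1, d2):
        # Target S: 1 iff an even number of the arguments are 1
        return 1 if (d0 + d1 + d2) % 2 == 0 else 0

    def fitness(expr):
        # Error: number of truth-table cases the gene classifies wrongly
        return sum(evaluate(expr, {'D0': d0, 'D1': d1, 'D2': d2})
                   != even_3_parity(d0, d1, d2)
                   for d0, d1, d2 in product((0, 1), repeat=3))

    print(fitness(('NAND', ('OR', 'D0', 'D1'), 'D2')))  # error count out of 8 cases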

Summary