Machine Learning Chapter 9. Genetic Algorithm


Machine Learning Chapter 9. Genetic Algorithm Tom M. Mitchell

Genetic Algorithms Evolutionary computation Prototypical GA An example: GABIL Genetic Programming Individual learning and population evolution

Evolutionary Computation Computational procedures patterned after biological evolution Search procedure that probabilistically applies search operators to set of points in the search space

Biological Evolution Lamarck and others: species "transmute" over time. Darwin and Wallace: consistent, heritable variation among individuals in a population; natural selection of the fittest. Mendel and genetics: a mechanism for inheriting traits; genotype → phenotype mapping.

Prototypical Genetic Algorithm (1/2)
GA(Fitness, Fitness_threshold, p, r, m)
  Initialize: P ← p random hypotheses
  Evaluate: for each h in P, compute Fitness(h)
  While max_h Fitness(h) < Fitness_threshold:
    1. Select: probabilistically select (1 − r)·p members of P to add to P_S

Prototypical Genetic Algorithm (2/2)
    2. Crossover: probabilistically select r·p/2 pairs of hypotheses from P. For each pair <h1, h2>, produce two offspring by applying the Crossover operator. Add all offspring to P_S.
    3. Mutate: invert a randomly selected bit in m·p random members of P_S
    4. Update: P ← P_S
    5. Evaluate: for each h in P, compute Fitness(h)
  Return the hypothesis from P that has the highest fitness.
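A minimal Python sketch of this prototypical GA loop, assuming bitstring hypotheses and a user-supplied fitness function; the operator and selection details are expanded on later slides:

```python
import random

def genetic_algorithm(fitness, fitness_threshold, p, r, m, n_bits, max_gens=1000):
    """Prototypical GA: a population of p bitstrings evolved by fitness-proportionate
    selection, single-point crossover, and point mutation (sketch only)."""
    population = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(p)]
    for _ in range(max_gens):
        scores = [fitness(h) for h in population]
        if max(scores) >= fitness_threshold:
            break
        # Select: probabilistically keep (1 - r) * p members of P
        survivors = [list(h) for h in
                     random.choices(population, weights=scores, k=int((1 - r) * p))]
        # Crossover: r * p / 2 pairs produce r * p offspring
        offspring = []
        for _ in range(int(r * p / 2)):
            h1, h2 = random.choices(population, weights=scores, k=2)
            point = random.randrange(1, n_bits)
            offspring.append(h1[:point] + h2[point:])
            offspring.append(h2[:point] + h1[point:])
        population = survivors + offspring
        # Mutate: invert a randomly chosen bit in m * p random members
        for h in random.sample(population, int(m * len(population))):
            i = random.randrange(n_bits)
            h[i] = 1 - h[i]
    return max(population, key=fitness)

# Example: maximize the number of 1-bits ("one-max")
best = genetic_algorithm(fitness=sum, fitness_threshold=20,
                         p=50, r=0.6, m=0.1, n_bits=20)
print(best, sum(best))
```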

Representing Hypotheses
Represent (Outlook = Overcast ∨ Rain) ∧ (Wind = Strong) by
  Outlook: 011   Wind: 10
Represent IF Wind = Strong THEN PlayTennis = yes by
  Outlook: 111   Wind: 10   PlayTennis: 10
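A small sketch of this encoding, assuming the attribute orderings Outlook ∈ {Sunny, Overcast, Rain}, Wind ∈ {Strong, Weak}, PlayTennis ∈ {Yes, No}; one bit per attribute value, 1 meaning the value is allowed:

```python
# Hypothetical encoding helper; attribute names and order are assumptions.
ATTRS = {"Outlook": ["Sunny", "Overcast", "Rain"],
         "Wind": ["Strong", "Weak"],
         "PlayTennis": ["Yes", "No"]}

def encode(constraints):
    """constraints maps attribute -> set of allowed values;
    a missing attribute means 'don't care' (all bits 1)."""
    bits = []
    for attr, values in ATTRS.items():
        allowed = constraints.get(attr, set(values))
        bits.extend(1 if v in allowed else 0 for v in values)
    return bits

# IF Wind = Strong THEN PlayTennis = Yes  ->  111 10 10
print(encode({"Wind": {"Strong"}, "PlayTennis": {"Yes"}}))
```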

Operators for Genetic Algorithms: single-point crossover, two-point crossover, uniform crossover, point mutation. [Figure omitted: initial strings, crossover masks, and resulting offspring for each operator.]
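A hedged Python sketch of the four operators, using explicit crossover masks as in the omitted figure (a mask bit of 1 means "take this position from the first parent"):

```python
import random

def apply_mask(h1, h2, mask):
    """Produce two offspring: positions where mask is 1 come from h1 (resp. h2)."""
    child1 = [a if m else b for a, b, m in zip(h1, h2, mask)]
    child2 = [b if m else a for a, b, m in zip(h1, h2, mask)]
    return child1, child2

def single_point_crossover(h1, h2):
    point = random.randrange(1, len(h1))
    mask = [1] * point + [0] * (len(h1) - point)
    return apply_mask(h1, h2, mask)

def two_point_crossover(h1, h2):
    i, j = sorted(random.sample(range(1, len(h1)), 2))
    mask = [1] * i + [0] * (j - i) + [1] * (len(h1) - j)
    return apply_mask(h1, h2, mask)

def uniform_crossover(h1, h2):
    mask = [random.randint(0, 1) for _ in h1]
    return apply_mask(h1, h2, mask)

def point_mutation(h):
    i = random.randrange(len(h))
    return h[:i] + [1 - h[i]] + h[i + 1:]
```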

Selecting Most Fit Hypotheses
Fitness-proportionate selection: Pr(h_i) = Fitness(h_i) / Σ_j Fitness(h_j); can lead to crowding.
Tournament selection: pick h1, h2 at random with uniform probability; with probability p, select the more fit.
Rank selection: sort all hypotheses by fitness; the probability of selection is proportional to rank.
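A sketch of the three selection schemes, assuming a population list and a parallel list of fitness values:

```python
import random

def fitness_proportionate(pop, fitnesses):
    """Pr(h_i) proportional to Fitness(h_i)."""
    return random.choices(pop, weights=fitnesses, k=1)[0]

def tournament(pop, fitnesses, p=0.8):
    """Pick two at random; with probability p return the fitter one."""
    i, j = random.sample(range(len(pop)), 2)
    fitter, weaker = (i, j) if fitnesses[i] >= fitnesses[j] else (j, i)
    return pop[fitter] if random.random() < p else pop[weaker]

def rank_selection(pop, fitnesses):
    """Selection probability proportional to rank (1 = least fit)."""
    order = sorted(range(len(pop)), key=lambda i: fitnesses[i])
    ranks = [0] * len(pop)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return random.choices(pop, weights=ranks, k=1)[0]
```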

GABIL [DeJong et al. 1993]
Learns a disjunctive set of propositional rules; competitive with C4.5.
Fitness: Fitness(h) = (correct(h))^2
Representation: IF a1 = T ∧ a2 = F THEN c = T; IF a2 = T THEN c = F
is represented by
  a1: 10  a2: 01  c: 1    a1: 11  a2: 10  c: 0
Genetic operators: ???
  want variable-length rule sets
  want only well-formed bitstring hypotheses
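A minimal sketch (with hypothetical helper names and a simplified first-match prediction rule) of evaluating GABIL's fitness, the squared fraction of training examples classified correctly:

```python
def rule_matches(rule_bits, example_bits):
    """A precondition matches when every value bit set in the example is
    allowed (set to 1) in the rule (hypothetical one-hot encoding)."""
    return all(not e or r for r, e in zip(rule_bits, example_bits))

def gabil_fitness(rule_set, examples):
    """rule_set: list of (precondition_bits, predicted_class);
    examples: list of (attribute_bits, true_class).
    Fitness(h) = (fraction correct)^2."""
    correct = 0
    for attrs, label in examples:
        prediction = False          # default if no rule matches
        for pre, cls in rule_set:
            if rule_matches(pre, attrs):
                prediction = cls
                break
        correct += (prediction == label)
    return (correct / len(examples)) ** 2
```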

Crossover with Variable-Length Bitstrings
Start with two rule-set bitstrings h1 and h2 (shown in the original figure, omitted here).
1. Choose crossover points for h1, e.g., after bits 1 and 8.
2. Now restrict the points in h2 to those that produce bitstrings with well-defined semantics, e.g., <1, 3>, <1, 8>, <6, 8>.
If we choose <1, 3>, the result is two well-formed offspring rule sets of different lengths.

GABIL Extensions
Add new genetic operators, also applied probabilistically:
1. AddAlternative: generalize the constraint on a_i by changing a 0 to 1
2. DropCondition: generalize the constraint on a_i by changing every 0 to 1
Also add new fields to the bitstring that determine whether to allow these operators, so the learning strategy itself also evolves! (A short sketch of the two operators follows.)
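A short sketch, assuming a rule's constraint on one attribute is stored as a list of bits, of the two generalization operators:

```python
import random

def add_alternative(attr_bits):
    """AddAlternative: generalize a constraint by flipping one 0 to 1."""
    zeros = [i for i, b in enumerate(attr_bits) if b == 0]
    if zeros:
        attr_bits = list(attr_bits)
        attr_bits[random.choice(zeros)] = 1
    return attr_bits

def drop_condition(attr_bits):
    """DropCondition: set every bit to 1 (the attribute becomes don't-care)."""
    return [1] * len(attr_bits)

print(add_alternative([0, 1, 0]))  # e.g. [1, 1, 0] or [0, 1, 1]
print(drop_condition([0, 1, 0]))   # [1, 1, 1]
```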

GABIL Results
Performance of GABIL is comparable to symbolic rule/tree learning methods such as C4.5, ID5R, and AQ14.
Average performance on a set of 12 synthetic problems:
  GABIL without AA and DC operators: 92.1% accuracy
  GABIL with AA and DC operators: 95.2% accuracy
  symbolic learning methods ranged from 91.2% to 96.6%

Schemas
How to characterize the evolution of the population in a GA?
Schema = string containing 0, 1, * ("don't care")
  Typical schema: 10**0*
  Instances of the above schema: 101101, 100000, ...
Characterize the population by the number of instances representing each possible schema:
  m(s, t) = number of instances of schema s in the population at time t
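A tiny sketch of checking schema membership and counting m(s, t):

```python
def matches(schema, bitstring):
    """True if bitstring is an instance of schema ('*' positions are don't-care)."""
    return all(s == "*" or s == b for s, b in zip(schema, bitstring))

def m(schema, population):
    """m(s, t): number of instances of schema s in the population at time t."""
    return sum(matches(schema, h) for h in population)

print(matches("10**0*", "101101"))                   # True
print(m("10**0*", ["101101", "100000", "111111"]))   # 2
```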

Consider Just Selection
  f̄(t) = average fitness of the population at time t
  m(s, t) = number of instances of schema s in the population at time t
  û(s, t) = average fitness of the instances of s at time t
Probability of selecting h in one selection step:
  Pr(h) = f(h) / Σ_i f(h_i) = f(h) / (n · f̄(t))
Probability of selecting an instance of s in one step:
  Pr(h ∈ s) = Σ_{h ∈ s ∩ p_t} f(h) / (n · f̄(t)) = (û(s, t) / (n · f̄(t))) · m(s, t)
Expected number of instances of s after n selections:
  E[m(s, t+1)] = (û(s, t) / f̄(t)) · m(s, t)

Schema Theorem
  E[m(s, t+1)] ≥ (û(s, t) / f̄(t)) · m(s, t) · (1 − p_c · d(s)/(l − 1)) · (1 − p_m)^o(s)
where
  m(s, t) = number of instances of schema s in the population at time t
  f̄(t) = average fitness of the population at time t
  û(s, t) = average fitness of the instances of s at time t
  p_c = probability of the single-point crossover operator
  p_m = probability of the mutation operator
  l = length of the individual bit strings
  o(s) = number of defined (non-"*") bits in s
  d(s) = distance between the leftmost and rightmost defined bits in s
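A small numeric sketch (the example values are made up, not from the source) of evaluating the schema-theorem lower bound:

```python
def schema_bound(m_st, u_st, f_bar, p_c, p_m, l, o_s, d_s):
    """Lower bound on E[m(s, t+1)] from the schema theorem."""
    return (u_st / f_bar) * m_st * (1 - p_c * d_s / (l - 1)) * (1 - p_m) ** o_s

# Example: schema 10**0* (o(s)=3, d(s)=4) in 6-bit strings, 20 instances,
# instances 10% fitter than the population average, p_c=0.6, p_m=0.01.
print(schema_bound(m_st=20, u_st=1.1, f_bar=1.0,
                   p_c=0.6, p_m=0.01, l=6, o_s=3, d_s=4))  # about 11.1
```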

Genetic Programming Population of programs represented by trees

Crossover [Figure omitted: exchanging randomly chosen subtrees between two parent program trees produces two offspring programs]
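A hedged Python sketch of tree-based crossover, representing programs as nested lists and swapping randomly chosen subtrees (the example trees and helper names are illustrative):

```python
import copy
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node of a nested-list program tree."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):  # tree[0] is the operator
            yield from subtrees(child, path + (i,))

def replace_at(tree, path, new_subtree):
    """Return a copy of tree with the node at path replaced by new_subtree."""
    if not path:
        return copy.deepcopy(new_subtree)
    tree = copy.deepcopy(tree)
    node = tree
    for i in path[:-1]:
        node = node[i]
    node[path[-1]] = copy.deepcopy(new_subtree)
    return tree

def gp_crossover(parent1, parent2):
    """Exchange a randomly chosen subtree of each parent."""
    p1_path, p1_sub = random.choice(list(subtrees(parent1)))
    p2_path, p2_sub = random.choice(list(subtrees(parent2)))
    return replace_at(parent1, p1_path, p2_sub), replace_at(parent2, p2_path, p1_sub)

# Example program trees: sin(x) + sqrt(x**2 + y) and x*y - 2
t1 = ["+", ["sin", "x"], ["sqrt", ["+", ["**", "x", 2], "y"]]]
t2 = ["-", ["*", "x", "y"], 2]
print(gp_crossover(t1, t2))
```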

Block Problem (1/2) Goal: spell UNIVERSAL Terminals: CS (“current stack”) = name of the top block on stack, or F. TB (“top correct block”) = name of topmost correct block on stack NN (“next necessary”) = name of the next block needed above TB in the stack

Block Problem (2/2)
Primitive functions:
  (MS x): ("move to stack") if block x is on the table, moves x to the top of the stack and returns T; otherwise does nothing and returns F
  (MT x): ("move to table") if block x is somewhere in the stack, moves the block at the top of the stack to the table and returns T; otherwise returns F
  (EQ x y): ("equal") returns T if x equals y, and F otherwise
  (NOT x): returns T if x = F, else returns F
  (DU x y): ("do until") executes the expression x repeatedly until expression y returns T

Learned Program
Trained to fit 166 test problems.
Using a population of 300 programs, found this after 10 generations:
  (EQ (DU (MT CS) (NOT CS)) (DU (MS NN) (NOT NN)))
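A hedged Python sketch of an interpreter for these terminals and primitives, run on the learned program above; the class and helper names (World, learned_program, the iteration cap in DU) are my own, and the initial configuration is just one example:

```python
GOAL = list("UNIVERSAL")  # goal stack, bottom to top

class World:
    """Block-world state: a stack (bottom..top) and blocks on the table."""
    def __init__(self, stack, table):
        self.stack = list(stack)
        self.table = set(table)

    def _correct_prefix(self):
        n = 0
        while n < len(self.stack) and self.stack[n] == GOAL[n]:
            n += 1
        return n

    # Terminals
    def CS(self):  # current stack: top block, or F (False) if empty
        return self.stack[-1] if self.stack else False

    def TB(self):  # topmost correct block, or F
        n = self._correct_prefix()
        return self.stack[n - 1] if n else False

    def NN(self):  # next block needed above TB, or F if the word is spelled
        n = self._correct_prefix()
        return GOAL[n] if n < len(GOAL) else False

    # Primitive functions
    def MS(self, x):  # move block x from the table to the top of the stack
        if x in self.table:
            self.table.discard(x)
            self.stack.append(x)
            return True
        return False

    def MT(self, x):  # if x is in the stack, move the TOP block to the table
        if x in self.stack:
            self.table.add(self.stack.pop())
            return True
        return False

def NOT(x):
    return x is False

def EQ(x, y):
    return x == y

def DU(x, y, limit=100):
    """Do x, then test y; repeat until y returns T (bounded to avoid looping forever)."""
    for _ in range(limit):
        x()
        if y():
            return True
    return False

# Learned program: empty the stack, then rebuild UNIVERSAL from the table.
def learned_program(w):
    return EQ(DU(lambda: w.MT(w.CS()), lambda: NOT(w.CS())),
              DU(lambda: w.MS(w.NN()), lambda: NOT(w.NN())))

w = World(stack=["U", "N", "I", "V"], table=["E", "R", "S", "A", "L"])
learned_program(w)
print(w.stack)  # ['U', 'N', 'I', 'V', 'E', 'R', 'S', 'A', 'L']
```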

Genetic Programming
A more interesting example: designing electronic filter circuits.
Individuals are programs that transform a beginning circuit into a final circuit by adding/subtracting components and connections.
Uses a population of 640,000, run on a 64-node parallel processor.
Discovers circuits competitive with the best human designs.

GP for Classifying Images [Teller and Veloso, 1997]
Fitness: based on coverage and accuracy
Representation:
  Primitives include Add, Sub, Mult, Div, Not, Max, Min, Read, Write, If-Then-Else, Either, Pixel, Least, Most, Ave, Variance, Difference, Mini, Library
  Mini refers to a local subroutine that is separately co-evolved
  Library refers to a global library subroutine (evolved by selecting the most useful Minis)
Genetic operators:
  Crossover, mutation
  Create "mating pools" and use rank-proportionate reproduction

Biological Evolution
Lamarck (19th century) believed an individual's genetic makeup was altered by lifetime experience, but current evidence contradicts this view.
What is the impact of individual learning on population evolution?

Baldwin Effect (1/2)
Assume:
  Individual learning has no direct influence on individual DNA
  But the ability to learn reduces the need to "hard wire" traits in DNA
Then:
  The ability of individuals to learn will support a more diverse gene pool, because learning allows individuals with various "hard-wired" traits to be successful
  A more diverse gene pool will support faster evolution of the gene pool
  ⇒ individual learning (indirectly) increases the rate of evolution

Baldwin Effect (2/2)
Plausible example:
1. A new predator appears in the environment
2. Individuals who can learn (to avoid it) will be selected
3. The increase in learning individuals will support a more diverse gene pool
4. resulting in faster evolution
5. possibly resulting in new non-learned traits such as an instinctive fear of the predator

Computer Experiments on the Baldwin Effect [Hinton and Nowlan, 1987]
Evolve simple neural networks:
  Some network weights are fixed during the individual's lifetime, others are trainable
  Genetic makeup determines which are fixed, and their weight values
Results:
  With no individual learning, the population failed to improve over time
  When individual learning was allowed:
    Early generations: the population contained many individuals with many trainable weights
    Later generations: higher fitness, while the number of trainable weights decreased
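A hedged Python sketch in the spirit of Hinton and Nowlan's experiment: each gene is 0, 1, or "?" (trainable); fitness rewards finding the single all-ones target, and trainable genes let an individual search for it during its "lifetime". Population size, trial count, and the fitness scaling here are illustrative assumptions, not the exact original settings:

```python
import random

L, TARGET = 20, [1] * 20            # genome length and the single "good" network
POP, TRIALS, GENS = 1000, 1000, 30  # illustrative sizes

def fitness(genome):
    """Guess the '?' genes for up to TRIALS lifetime trials; fitness is higher
    the sooner the individual matches TARGET (baseline 1.0 if it never does)."""
    if any(g != t for g, t in zip(genome, TARGET) if g != "?"):
        return 1.0                   # a wrong fixed gene can never be repaired
    unknown = [i for i, g in enumerate(genome) if g == "?"]
    for trial in range(TRIALS):
        if all(random.random() < 0.5 for _ in unknown):
            return 1.0 + 19.0 * (TRIALS - trial) / TRIALS
    return 1.0

def reproduce(pop, fits):
    """Fitness-proportionate selection plus single-point crossover."""
    new = []
    while len(new) < POP:
        p1, p2 = random.choices(pop, weights=fits, k=2)
        cut = random.randrange(1, L)
        new.append(p1[:cut] + p2[cut:])
    return new

# Initial alleles: roughly 25% 0, 25% 1, 50% trainable '?'
population = [[random.choice([0, 1, "?", "?"]) for _ in range(L)] for _ in range(POP)]
for g in range(GENS):
    fits = [fitness(ind) for ind in population]
    trainable = sum(ind.count("?") for ind in population) / POP
    print(f"gen {g}: mean fitness {sum(fits)/POP:.2f}, mean '?' genes {trainable:.1f}")
    population = reproduce(population, fits)
```

Run over a few dozen generations, mean fitness should rise while the average number of trainable "?" genes falls, which is the signature of the Baldwin effect described above.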

Summary: Evolutionary Programming
Conducts a randomized, parallel, hill-climbing search through H
Approaches learning as an optimization problem (optimize fitness)
Nice feature: the evaluation of Fitness can be very indirect
  consider learning a rule set for multistep decision making
  there is no issue of assigning credit/blame to individual steps