Chapter 9 Genetic Algorithms Evolutionary computation Prototypical GA

1 Chapter 9 Genetic Algorithms Evolutionary computation Prototypical GA
An example: GABIL Genetic Programming

2 Overview of GAs GAs are a form of evolutionary computation.
A GA is a general optimization method that searches a large space of candidate objects (hypotheses, forming a population), seeking the one that performs best according to a predefined numerical measure, the fitness function. It is NOT guaranteed to find an optimal object. GAs are broadly applied to optimization, machine learning, circuit layout, job-shop scheduling, and so on.

3 Motivation for GAs Evolution is known to be a successful, robust method for adaptation within biological systems. Benefits: GAs can search spaces of hypotheses containing complex interacting parts. GAs are easily parallelized and can take advantage of the decreasing cost of powerful computer hardware.

4 A Prototypical GA

5 A prototypical GA GA(Fitness, Fitness_threshold, p, r, m)
Initialize: P ← p random hypotheses
Evaluate: for each h in P, compute Fitness(h)
While [max_h Fitness(h)] < Fitness_threshold:
1. Select: Probabilistically select (1-r)·p members of P to add to Ps, where hypothesis h_i is chosen with probability
Pr(h_i) = Fitness(h_i) / Σ_{j=1}^{p} Fitness(h_j)
This is called "fitness-proportionate selection" or "roulette wheel selection".

6 A prototypical GA 2. Crossover: Probabilistically select r·p/2 pairs of hypotheses from P. For each pair <h1, h2>, produce two offspring by applying the Crossover operator. Add all offspring to Ps. 3. Mutate: Invert a randomly selected bit in m·p random members of Ps. 4. Update: P ← Ps. 5. Evaluate: for each h in P, compute Fitness(h). Return the hypothesis from P with the highest fitness.
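The loop above can be sketched in Python. This is a minimal illustration, not any particular system: parameter names follow the pseudocode (p = population size, r = crossover fraction, m = mutation rate), while the bitstring length `hyp_len` and the toy "OneMax" fitness (count of 1 bits) are our own assumptions.

```python
import random

def genetic_algorithm(fitness, fitness_threshold, p, r, m, hyp_len):
    # Initialize: p random bitstring hypotheses
    P = [[random.randint(0, 1) for _ in range(hyp_len)] for _ in range(p)]
    while max(fitness(h) for h in P) < fitness_threshold:
        scores = [fitness(h) for h in P]
        # Select: keep (1-r)*p members via fitness-proportionate selection
        Ps = random.choices(P, weights=scores, k=round((1 - r) * p))
        # Crossover: r*p/2 pairs, two offspring each (single-point)
        for _ in range(round(r * p / 2)):
            h1, h2 = random.choices(P, weights=scores, k=2)
            cut = random.randrange(1, hyp_len)
            Ps.append(h1[:cut] + h2[cut:])
            Ps.append(h2[:cut] + h1[cut:])
        # Mutate: flip one random bit in m*p random members of Ps
        for h in random.sample(Ps, round(m * p)):
            h[random.randrange(hyp_len)] ^= 1
        # Update
        P = Ps
    return max(P, key=fitness)

# Toy run: maximize the number of 1 bits ("OneMax")
random.seed(1)
best = genetic_algorithm(sum, 14, p=50, r=0.6, m=0.05, hyp_len=16)
```

Note the loop only exits once some hypothesis reaches the threshold, so the returned hypothesis is guaranteed to satisfy it.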

7 Main loop of GA

8 Representing Hypotheses: PlayTennis Example
(Outlook = Overcast ∨ Rain) ∧ (Wind = Strong) is represented by
Outlook Wind
011     10
IF Wind = Strong THEN PlayTennis = yes is represented by
Outlook Wind PlayTennis
111     10   1
Applying GA for Rule Extraction
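The encoding above can be made concrete with a small helper; the attribute value orderings and the `encode` function are hypothetical, for illustration only.

```python
# Assumed value orderings for the two attributes (one bit per value):
OUTLOOK = ["Sunny", "Overcast", "Rain"]
WIND = ["Strong", "Weak"]

def encode(outlook_values, wind_values):
    """One bit per attribute value; 1 = the value is allowed by the constraint."""
    return ([1 if v in outlook_values else 0 for v in OUTLOOK] +
            [1 if v in wind_values else 0 for v in WIND])

# (Outlook = Overcast or Rain) AND (Wind = Strong)  ->  011 10
bits = encode({"Overcast", "Rain"}, {"Strong"})
```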

9 Operators for Genetic Algorithms
[Figure: for each operator, the initial strings, the crossover mask, and the resulting offspring]
Single-point crossover, two-point crossover, uniform crossover, point mutation
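The four operators can be sketched in a few lines, assuming hypotheses are Python lists of bits (the function names are ours).

```python
import random

def single_point(h1, h2):
    """Swap the tails after one random cut point."""
    cut = random.randrange(1, len(h1))
    return h1[:cut] + h2[cut:], h2[:cut] + h1[cut:]

def two_point(h1, h2):
    """Swap the segment between two random cut points."""
    i, j = sorted(random.sample(range(1, len(h1)), 2))
    return h1[:i] + h2[i:j] + h1[j:], h2[:i] + h1[i:j] + h2[j:]

def uniform(h1, h2):
    """Each bit position inherited independently, per a random mask."""
    mask = [random.randint(0, 1) for _ in h1]
    return ([a if m else b for m, a, b in zip(mask, h1, h2)],
            [b if m else a for m, a, b in zip(mask, h1, h2)])

def point_mutation(h):
    """Flip a single randomly chosen bit."""
    i = random.randrange(len(h))
    return h[:i] + [1 - h[i]] + h[i + 1:]
```

Each crossover variant corresponds to a different family of crossover masks; all of them conserve bits positionwise between the two offspring.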

10 Selecting the Most Fit Hypotheses
Fitness-proportionate selection can lead to crowding (p. 259, Mitchell). Tournament selection: pick h1, h2 at random with uniform probability; with probability p select the more fit one, otherwise (with probability 1-p) the other. Fitness sharing: the measured fitness of an individual is reduced by the presence of other similar individuals in the population.
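Tournament selection is easy to sketch (a hypothetical helper; `p` here is the slide's selection probability, not the population size).

```python
import random

def tournament_select(population, fitness, p=0.9):
    """Pick two hypotheses at random with uniform probability; with
    probability p return the more fit one, otherwise the less fit one."""
    h1, h2 = random.sample(population, 2)
    fitter, weaker = (h1, h2) if fitness(h1) >= fitness(h2) else (h2, h1)
    return fitter if random.random() < p else weaker
```

Because selection pressure depends only on the rank within each pair, tournament selection tends to maintain a more diverse population than fitness-proportionate selection.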

11 Crowding (p. 259 Mitchell)

12 Applications of GA Many optimization problems, e.g.: circuit layout,
job-shop scheduling, function approximation, choosing network topology for ANNs, learning Boolean concepts represented by a disjunctive set of propositional rules (GABIL; DeJong et al. 1993)

13 IF a1 = Ta2 = F THEN c = T; IF a2 = T THEN c = F represented by
GABIL [DeJong et al. 1993] Learn disjunctive set of propositional rules, competitive with C4.5 Fitness: Fitness(h) = (correct(h))2 Representation: IF a1 = Ta2 = F THEN c = T; IF a2 = T THEN c = F represented by Genetic operators: ??? want variable length rule sets want only well-formed bitstring hypotheses a1 a2 c 10 01 1 a1 a2 c 11 10

14 Crossover with Variable-Length Bitstrings
Start with h1 = 10 01 1 11 10 0 and h2 = 01 11 0 10 01 0.
1. Choose crossover points for h1, e.g., after bits 1 and 8.
2. Now restrict the points in h2 to those that produce bitstrings with well-defined semantics, e.g., <1, 3>, <1, 8>, <6, 8>.
If we choose <1, 3>, the result is h3 = 11 10 0 and h4 = 00 01 1 11 11 0 10 01 0.

15 GABIL Extensions Add two new genetic operators, also applied probabilistically:
1. AddAlternative: generalize the constraint on a specific attribute by changing a 0 to a 1
2. DropCondition: generalize the constraint on a specific attribute by changing all bits of that attribute to 1
These extensions improve accuracy from 92.1% (basic GA) to 95.2%. For comparison, symbolic rule and decision-tree learners (C4.5, ID5R, AQ14) range from 91.2% to 96.6%.
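The two operators can be sketched as bit-level transforms. The rule layout (two 2-bit attributes plus a 1-bit postcondition, as on slide 13) and the function names are our assumptions.

```python
import random

# Assumed rule layout: a1 and a2 take 2 bits each, then a 1-bit postcondition.
# ATTRS lists the (start, end) bit span of each precondition attribute.
ATTRS = [(0, 2), (2, 4)]

def add_alternative(rule):
    """Generalize one attribute constraint by changing a random 0 to a 1."""
    zeros = [i for s, e in ATTRS for i in range(s, e) if rule[i] == 0]
    if not zeros:
        return rule
    i = random.choice(zeros)
    return rule[:i] + [1] + rule[i + 1:]

def drop_condition(rule):
    """Generalize by setting every bit of one random attribute to 1."""
    s, e = random.choice(ATTRS)
    return rule[:s] + [1] * (e - s) + rule[e:]

# IF a1 = T AND a2 = F THEN c = T  ->  10 01 1
rule = [1, 0, 0, 1, 1]
```

Both operators only ever turn 0s into 1s inside precondition fields, so well-formedness of the bitstring is preserved.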

16 Genetic-based Approaches
Genetic Algorithms: solutions or hypotheses are represented as bitstrings. Genetic Programming: solutions or hypotheses are represented as computer programs.

17 Genetic Programming Population of programs represented by trees

18 Crossover In GP, crossover swaps randomly chosen subtrees between the two parent programs; in the formulation described here, GP does not use a mutation operation.
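Subtree crossover can be sketched on programs represented as nested tuples (a representation we assume for illustration; Koza's systems use Lisp S-expressions).

```python
import random

# Programs as nested tuples: ("OP", child, ...) for calls, strings for terminals.
def subtrees(t, path=()):
    """Yield (path, subtree) for every node; path is a tuple of child indices."""
    yield path, t
    if isinstance(t, tuple):
        for i, child in enumerate(t[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(t, path, new):
    """Return a copy of t with the subtree at path replaced by new."""
    if not path:
        return new
    i = path[0]
    return t[:i] + (replace(t[i], path[1:], new),) + t[i + 1:]

def gp_crossover(p1, p2):
    """Swap one randomly chosen subtree of each parent."""
    path1, sub1 = random.choice(list(subtrees(p1)))
    path2, sub2 = random.choice(list(subtrees(p2)))
    return replace(p1, path1, sub2), replace(p2, path2, sub1)
```

Unlike bitstring crossover, the offspring can differ in size from the parents, which is why GP populations need no fixed hypothesis length.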

19 A GP Example: Block Problem
Goal: spell UNIVERSAL. Terminals:
CS (current stack) = name of the top block on the stack, or F if there is no current stack.
TB (top correct block) = name of the topmost block on the stack such that it and all blocks beneath it are in correct order.
NN (next necessary) = name of the next block needed above TB in the stack. (Koza 1992)

20 A GP Example: Block Problem
Primitive functions:
(MS x) (move to stack) = if block x is on the table, moves x to the top of the stack and returns T; otherwise does nothing and returns F.
(MT x) (move to table) = if block x is somewhere in the stack, moves the block at the top of the stack to the table and returns T; otherwise returns F.
(EQ x y) (equal) = returns T if x equals y, F otherwise.
(NOT x) = returns T if x = F, else returns F.
(DU x y) (do until) = executes the expression x repeatedly until expression y returns T.

21 Learned Program Using a random initial population of 300 programs, GP found the following after 10 generations: (EQ (DU (MT CS) (NOT CS)) (DU (MS NN) (NOT NN)))
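As a sanity check, the learned program can be executed against a hypothetical mini-interpreter for the block world (our own sketch: tuples for calls, Python True/False for T/F, TB omitted since this program never uses it, and DU bounded at 100 iterations so it always terminates).

```python
GOAL = list("UNIVERSAL")

class World:
    def __init__(self, stack, table):
        self.stack = list(stack)   # bottom ... top
        self.table = set(table)    # blocks lying on the table

    def cs(self):                  # current stack: top block, or F
        return self.stack[-1] if self.stack else False

    def nn(self):                  # next necessary block above the
        n = 0                      # correct prefix, or F when done
        while n < len(self.stack) and self.stack[n] == GOAL[n]:
            n += 1
        return GOAL[n] if n < len(GOAL) else False

    def ms(self, x):               # move x from the table to the stack top
        if x in self.table:
            self.table.remove(x)
            self.stack.append(x)
            return True
        return False

    def mt(self, x):               # if x is in the stack, move the top
        if x in self.stack:        # block of the stack to the table
            self.table.add(self.stack.pop())
            return True
        return False

def run(expr, w):
    """Evaluate a program; terminals are strings, calls are tuples."""
    if expr == "CS": return w.cs()
    if expr == "NN": return w.nn()
    op = expr[0]
    if op == "MS": return w.ms(run(expr[1], w))
    if op == "MT": return w.mt(run(expr[1], w))
    if op == "EQ": return run(expr[1], w) == run(expr[2], w)
    if op == "NOT": return run(expr[1], w) is False
    if op == "DU":                 # do x until y returns T (bounded)
        for _ in range(100):
            run(expr[1], w)
            if run(expr[2], w) is not False:
                return True
        return False

PROGRAM = ("EQ", ("DU", ("MT", "CS"), ("NOT", "CS")),
                 ("DU", ("MS", "NN"), ("NOT", "NN")))

# Two blocks mis-stacked, the rest on the table:
w = World(stack=["R", "S"], table=set("UNIVEAL"))
run(PROGRAM, w)
```

The program's strategy is visible in the trace: the first DU unstacks everything, the second restacks the needed block one at a time, and EQ merely glues the two loops together.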

22 Applications of Genetic Programming
Designing electronic filter circuits; classifying segments of protein molecules

23 Summary GA evolves a population of hypotheses rather than refining a single extremum point. GA does not guarantee finding the extremum. GA needs no derivative information; it works with the hypotheses directly. GAs have strong potential for parallel implementation. They usually need many generations (epochs).

24 MiniProject 3 Exercises 9.1 and 9.2, or Exercise 9.4 (p. 270, Mitchell)

