Biologically Inspired Computing: Evolutionary Algorithms: Some Details and Examples
This is additional material for lecture 2 of `Biologically Inspired Computing'.
Contents: Steady state and generational EAs, basic ingredients, example run-throughs on the TSP, example encodings, hillclimbing, landscapes.

A Standard Evolutionary Algorithm
The algorithm whose pseudocode is on the next slide is a steady state, replace-worst EA with tournament selection, using mutation but no crossover. Parameters are popsize, tournament size and mutation rate. It can be applied to any problem; the details glossed over are all problem-specific.

A steady state, mutation-only, replace-worst EA with tournament selection
0. Initialise: generate a population of popsize random solutions, and evaluate their fitnesses.
1. Run Select to obtain a parent solution X.
2. With probability mute_rate, mutate a copy of X to obtain a mutant M (otherwise M = X).
3. Evaluate the fitness of M.
4. Let W be the current worst in the population (BTR). If M is not less fit than W, then replace W with M (otherwise do nothing).
5. If a termination condition is met (e.g. we have done 10,000 evaluations) then stop. Otherwise go to 1.
Select: randomly choose tsize individuals from the population. Let X be the one with best fitness (BTR); return X.
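
To make the pseudocode concrete, here is a minimal Python sketch of this steady-state EA (illustrative, not from the original slides). The functions random_solution, mutate and fitness are placeholders for the problem-specific parts, higher fitness is taken to be better, and mutate is assumed to return a mutated copy of its argument.

```python
import random

def steady_state_ea(random_solution, mutate, fitness,
                    popsize=50, tsize=3, mute_rate=0.9, max_evals=10000):
    """Steady-state, replace-worst EA with tournament selection (higher fitness is better)."""
    # 0. Initialise: popsize random solutions and their fitnesses.
    pop = [random_solution() for _ in range(popsize)]
    fits = [fitness(x) for x in pop]
    evals = popsize

    def select():
        # Tournament selection: pick tsize individuals at random and
        # return the index of the fittest (shuffling first breaks ties randomly - 'BTR').
        contestants = random.sample(range(popsize), tsize)
        random.shuffle(contestants)
        return max(contestants, key=lambda i: fits[i])

    while evals < max_evals:                      # 5. termination condition
        parent = pop[select()]                    # 1. select a parent X
        if random.random() < mute_rate:           # 2. with probability mute_rate, mutate a copy of X
            child = mutate(parent)
        else:
            child = parent
        f_child = fitness(child)                  # 3. evaluate the mutant
        evals += 1
        worst = min(range(popsize), key=lambda i: fits[i])   # 4. current worst
        if f_child >= fits[worst]:                #    replace the worst if the mutant is not less fit
            pop[worst], fits[worst] = child, f_child

    best = max(range(popsize), key=lambda i: fits[i])
    return pop[best], fits[best]
```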

A generational, elitist, crossover+mutation EA with Rank-Based selection
0. Initialise: generate a population G of popsize random solutions, and evaluate their fitnesses.
1. Run Select 2*(popsize – 1) times to obtain a collection I of 2*(popsize – 1) parents.
2. Randomly pair up the parents in I (into popsize – 1 pairs) and apply Vary to produce a child from each pair. Let the set of children be C.
3. Evaluate the fitness of each child.
4. Keep the best individual in the population G (BTR) and delete the rest.
5. Add all the children to G.
6. If a termination condition is met (e.g. we have done 100 or more generations, i.e. runs through steps 1–5), then stop. Otherwise go to 1.
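
A Python sketch of this generational loop (illustrative, not from the slides). The select and vary arguments stand for the rank-based Select and the Vary operator described on the next slide; see the sketch after that slide.

```python
def generational_ea(random_solution, fitness, select, vary,
                    popsize=50, max_gens=100):
    """Generational, elitist EA (higher fitness is better).
    select(pop, fits) returns one parent; vary(p1, p2) returns one child."""
    pop = [random_solution() for _ in range(popsize)]        # 0. initialise G
    fits = [fitness(x) for x in pop]
    for gen in range(max_gens):                              # 6. termination condition
        # 1. select 2*(popsize-1) parents; 2. pair them up and produce popsize-1 children
        parents = [select(pop, fits) for _ in range(2 * (popsize - 1))]
        children = [vary(parents[2 * i], parents[2 * i + 1]) for i in range(popsize - 1)]
        child_fits = [fitness(c) for c in children]          # 3. evaluate each child
        best = max(range(popsize), key=lambda i: fits[i])    # 4. keep only the best of G (elitism)
        pop = [pop[best]] + children                         # 5. add all the children to G
        fits = [fits[best]] + child_fits
    best = max(range(popsize), key=lambda i: fits[i])
    return pop[best], fits[best]
```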

A generational, elitist, crossover+mutation EA with Rank-Based selection, continued …
Select: sort the contents of G from best to worst, assigning rank popsize to the best, popsize – 1 to the next best, and so on down to rank 1 for the worst. The ranks sum to F = popsize(popsize+1)/2. Associate the probability Rank_i/F with each individual i. Using these probabilities, choose one individual X, and return X.
Vary:
1. With probability cross_rate, do a crossover: i.e. produce a child by applying a crossover operator to the two parents. Otherwise, let the child be a randomly chosen one of the parents.
2. Apply mutation to the child.
3. Return the mutated child.
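
A sketch of this rank-based Select and of Vary, to plug into the generational loop above (illustrative). The crossover and mutation operators remain problem-specific placeholders, and cross_rate = 0.7 is just an example default.

```python
import random

def rank_based_select(pop, fits):
    """Rank-based Select: the best gets rank popsize, the worst rank 1, and individual i
    is chosen with probability Rank_i / F, where F = popsize*(popsize+1)/2."""
    popsize = len(pop)
    order = sorted(range(popsize), key=lambda i: fits[i])    # indices from worst to best
    rank = [0] * popsize
    for r, idx in enumerate(order):
        rank[idx] = r + 1                                    # worst -> 1, ..., best -> popsize
    chosen = random.choices(range(popsize), weights=rank, k=1)[0]
    return pop[chosen]

def make_vary(crossover, mutation, cross_rate=0.7):
    """Build the Vary operator from problem-specific crossover and mutation operators."""
    def vary(p1, p2):
        if random.random() < cross_rate:       # 1. crossover with probability cross_rate ...
            child = crossover(p1, p2)
        else:
            child = random.choice([p1, p2])    #    ... otherwise copy a randomly chosen parent
        return mutation(child)                 # 2-3. mutate the child and return it
    return vary
```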

The basic ingredients
A running EA can be seen as a sequence of generations. A generation is simply characterised by a population. An EA goes from generation t to generation t+1 in 3 steps:
1. Select parents from generation t.
2. Produce children from the parents.
3. Merge the parent and child populations, and remove some individuals so that popsize is maintained; the result is generation t+1.

Steady State vs Generational
There are two extremes:
Steady state: the population changes only slightly in each generation (e.g. select 1 parent, produce 1 child, add that child to the population and remove the worst).
Generational: the population changes completely in each generation (select some [could still be 1] parents, produce popsize children, and they become the next generation).
In practice, if we are using a generational EA, we usually use elitism, which means the new generation always contains the best solution from the previous generation, the remaining popsize – 1 individuals being new.

Selection and Variation
A selection method is a way to choose a parent from the population, in such a way that the fitter an individual is, the more likely it is to be selected.
A variation operator (or genetic operator) is any method that takes in a set of one or more parents and produces a new individual, called the child. If the input is a single parent, it is called a mutation operator. If there are two parents, it is called crossover or recombination.

Replacement
A replacement method is a way to decide how to produce the next generation from the merged previous generation and children. E.g. we might simply sort them in order of fitness, and take the best popsize of them. What else might we do instead?
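
For example, the "sort and keep the best popsize" idea could be sketched like this (the function name and the higher-is-better fitness assumption are illustrative):

```python
def truncation_replacement(previous_gen, children, fitness, popsize):
    # Merge the previous generation with the children, sort by fitness
    # (higher is better here) and keep the best popsize individuals.
    merged = previous_gen + children
    merged.sort(key=fitness, reverse=True)
    return merged[:popsize]
```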

Encodings
We want to evolve schedules, networks, routes, coffee percolators, drug designs – how do we encode or represent solutions? How you encode dictates what your operators can be, and imposes certain constraints that the operators must meet.

E.g. Bin-Packing
The Problem: You have items of different sizes or weights, e.g. item 1 (30), item 2 (30), item 3 (30), item 4 (20), item 5 (10), and you have several `bins' with a max capacity, say 40. Find a way to pack the items into the smallest number of bins. This is a hard problem. We will look at the equalize-bins version, also hard: pack these items into N bins (we will choose N = 3), with arbitrary capacity, so that the weights of the 3 bins are as equal as possible.
Example Encoding: have nitems numbers, each between 1 and 3. So 1,2,1,3,3 means: items 1 and 3 are in bin 1, item 2 is in bin 2, items 4 and 5 are in bin 3.
Fitness: minimise the difference between the heaviest and lightest bin. In this case what is the fitness?
Mutation: choose a gene at random, and change it to a random new value.
Crossover: choose a random crossover point c (c can be gene 1, 2, 3 or 4); the child is a copy of parent 1 up to and including gene c, and a copy of parent 2 thereafter.
E.g. crossover of 3,1,2,3,3 and 2,2,1,2,1 at gene 2: 3,1,1,2,1
Crossover of 1,1,3,2,2 and 2,3,1,1,1 at gene 2: 1,1,1,1,1
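
A Python sketch of this encoding, fitness, mutation and crossover, using the five item weights and N = 3 bins from this slide (function names are illustrative). For the genome 1,2,1,3,3 the bin loads are 60, 30 and 30, so the fitness asked about above is 30.

```python
import random

WEIGHTS = [30, 30, 30, 20, 10]   # weights of items 1..5 from the slide
N_BINS = 3                       # the equalize-bins version with N = 3

def fitness(genome):
    """genome[i] in {1..N_BINS} says which bin item i+1 goes into; fitness is the
    difference between the heaviest and lightest bin (to be minimised)."""
    loads = [0] * N_BINS
    for item, b in enumerate(genome):
        loads[b - 1] += WEIGHTS[item]
    return max(loads) - min(loads)

def mutate(genome):
    """Choose a gene at random and change it to a random new value."""
    child = list(genome)
    g = random.randrange(len(child))
    child[g] = random.choice([b for b in range(1, N_BINS + 1) if b != child[g]])
    return child

def crossover(p1, p2):
    """Choose a random crossover point c; the child copies parent 1 up to and
    including gene c, and parent 2 thereafter."""
    c = random.randrange(1, len(p1))          # c in {1, ..., nitems-1}
    return list(p1[:c]) + list(p2[c:])

print(fitness([1, 2, 1, 3, 3]))               # bin loads 60, 30, 30 -> fitness 30
p1, p2 = [3, 1, 2, 3, 3], [2, 2, 1, 2, 1]
print(p1[:2] + p2[2:])                        # crossover at gene 2, as on the slide: [3, 1, 1, 2, 1]
```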

Back to Basics
With your thirst for seeing example EAs temporarily quenched, the story now skips to simpler algorithms. This will help to explain what it is about the previous ones that makes them work.

The Travelling Salesperson Problem
An example (hard) problem, for illustration: find the shortest tour through the cities A, B, C, D, E, whose pairwise distances are:

    A   B   C   D   E
A   –   5   7   4   15
B   5   –   3   4   10
C   7   3   –   2   7
D   4   4   2   –   9
E   15  10  7   9   –

The tour shown on the original slide has length 33.

Simplest possible EA: Hillclimbing
0. Initialise: Generate a random solution c; evaluate its fitness, f(c). Call c the current solution.
1. Mutate a copy of the current solution – call the mutant m – and evaluate the fitness of m, f(m).
2. If f(m) is no worse than f(c), then replace c with m; otherwise do nothing (effectively discarding m).
3. If a termination condition has been reached, stop. Otherwise, go to 1.
Note: there is no population (well, a population of 1). This is a very simple version of an EA, although it has been around for much longer.
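
A minimal Python sketch of this hillclimber (illustrative). Again random_solution, mutate and fitness are problem-specific placeholders; lower fitness is taken to be better here, to match the TSP run-through that follows, where fitness is tour length.

```python
def hillclimb(random_solution, mutate, fitness, max_iters=10000):
    """Hillclimbing: a 'population' of one; lower fitness is better."""
    current = random_solution()               # 0. initialise
    f_current = fitness(current)
    for _ in range(max_iters):                # 3. termination condition
        mutant = mutate(current)              # 1. mutate a copy of the current solution
        f_mutant = fitness(mutant)
        if f_mutant <= f_current:             # 2. accept if no worse, otherwise discard
            current, f_current = mutant, f_mutant
    return current, f_current
```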

Why “Hillclimbing”?
Suppose that solutions are lined up along the x axis, and that mutation always gives you a nearby solution. Fitness is on the y axis; this is a landscape.
1. Initial solution; 2. rejected mutant; 3. new current solution; 4. new current solution; 5. new current solution; 6. new current solution; 7. rejected mutant; 8. rejected mutant; 9. new current solution; 10. rejected mutant, …

Example: HC on the TSP
We can encode a candidate solution to the TSP as a permutation of the cities. Here is our initial random solution: ACEDB, with fitness 32. This is our Current solution.
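
A sketch of the permutation encoding, tour-length fitness and adjacent-swap mutation used in this run-through, with the distance matrix from the earlier slide (names are illustrative).

```python
import random

DIST = {                         # the distance matrix from the earlier slide (symmetric)
    ('A', 'B'): 5,  ('A', 'C'): 7,  ('A', 'D'): 4,  ('A', 'E'): 15,
    ('B', 'C'): 3,  ('B', 'D'): 4,  ('B', 'E'): 10,
    ('C', 'D'): 2,  ('C', 'E'): 7,
    ('D', 'E'): 9,
}

def dist(a, b):
    return DIST[(a, b)] if (a, b) in DIST else DIST[(b, a)]

def tour_length(tour):
    """Fitness of a permutation such as 'ACEDB': the length of the closed tour."""
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def swap_adjacent(tour):
    """Mutation: swap a randomly chosen pair of adjacent nodes (the first and
    last positions of the permutation also count as adjacent)."""
    i = random.randrange(len(tour))
    j = (i + 1) % len(tour)
    t = list(tour)
    t[i], t[j] = t[j], t[i]
    return ''.join(t)

print(tour_length('ACEDB'))      # 32, the fitness quoted on this slide
```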

Example: HC on the TSP
Here is our initial random solution ACEDB with fitness 32. We are going to mutate it – first make Mutant a copy of Current.

HC on the TSP
We randomly mutate the Mutant (swap a randomly chosen pair of adjacent nodes) from ACEDB to ACDEB, which has fitness 33, so Current stays the same: we reject this mutant.

HC on the TSP
We now try another mutation of Current (swap randomly chosen adjacent nodes), from ACEDB to CAEDB. Its fitness is 38, so we reject that too.

HC on the TSP
Our next mutant of Current takes ACEDB to AECDB. Fitness 33, so we reject this too.

HC on the TSP
Our next mutant of Current takes ACEDB to ACDEB. Fitness 33, so we reject this too.

HC on the TSP
Our next mutant of Current takes ACEDB to ACEBD. Its fitness is 32, equal to Current's, so this becomes the new Current.

HC on the TSP
ACEBD is our Current solution, with fitness 32.

HC on the TSP
ACEBD is our Current solution, with fitness 32. We mutate it to DCEBA (note that the first and last positions count as adjacent nodes); its fitness is 28, so this becomes our new Current solution.

HC on the TSP
Our new Current is DCEBA, with fitness 28.

HC on the TSP
Our new Current, DCEBA, has fitness 28. We mutate it, this time getting DCEAB, with fitness 33 – so we reject that, and DCEBA is still our Current solution.

And so on …

Landscapes
Recall S, the search space, and f(s), the fitness of a candidate s in S.
[Figure: f(s) plotted against the members of S lined up along the x axis.]
The structure we get by imposing f(s) on S is called a landscape.
What does the landscape look like if f(s) is a random number generator? What kinds of problems would have very smooth landscapes? What is the importance of the mutation operator in all this?

Neighbourhoods
Let s be an individual in S, f(s) our fitness function, and M our mutation operator, so that M(s1) → s2, where s2 is a mutant of s1. Given M, we can usually work out the neighbourhood of an individual point s – the neighbourhood of s is the set of all possible mutants of s.
E.g. Encoding: permutations of k objects (e.g. for the k-city TSP). Mutation: swap any adjacent pair of objects. Neighbourhood: each individual has k neighbours. E.g. the neighbours of EABDC are: { AEBDC, EBADC, EADBC, EABCD, CABDE }
Encoding: binary strings of length L (e.g. for L-item bin-packing). Mutation: choose a bit randomly and flip it. Neighbourhood: each individual has L neighbours. E.g. the neighbours of 00110 are: { 10110, 01110, 00010, 00100, 00111 }
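
A short sketch that enumerates these two neighbourhoods (adjacent swaps, including the wrap-around pair as in the TSP example, and single bit-flips); it reproduces the two example neighbour sets above.

```python
def perm_neighbours(perm):
    """Adjacent-swap neighbourhood of a permutation of k objects (the first and last
    positions also count as adjacent), giving k neighbours."""
    k = len(perm)
    neighbours = []
    for i in range(k):
        j = (i + 1) % k
        t = list(perm)
        t[i], t[j] = t[j], t[i]
        neighbours.append(''.join(t))
    return neighbours

def bitflip_neighbours(bits):
    """Bit-flip neighbourhood of a binary string of length L, giving L neighbours."""
    return [bits[:i] + ('1' if bits[i] == '0' else '0') + bits[i + 1:]
            for i in range(len(bits))]

print(perm_neighbours('EABDC'))     # ['AEBDC', 'EBADC', 'EADBC', 'EABCD', 'CABDE']
print(bitflip_neighbours('00110'))  # ['10110', '01110', '00010', '00100', '00111']
```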

Landscape Topology
Mutation operators lead to slight changes in the solution, which tend to lead to slight changes in fitness. I.e. the fitnesses in the neighbourhood of s are often similar to the fitness of s. Landscapes tend to be locally smooth. What about big mutations? It turns out that …

Typical Landscapes
[Figure: f(s) plotted against the members of S lined up along the x axis.]
Typically, with large (realistic) problems, the huge majority of the landscape has very poor fitness – there are tiny areas where the decent solutions lurk. So, big random changes are very likely to take us outside the nice areas.

Quiz Questions for lecture 2
3. Here is an altered distance matrix for the TSP problem in these slides. Illustrate five steps of Hillclimbing, in the same way as done in these slides (inventing your own random mutations etc.), using this altered matrix.
[Altered distance matrix: only row C is legible in this transcript: C–A 1, C–B 7, C–D 3, C–E 2.]
4. Consider the example encoding and mutation operator given in these slides for the equalize-bins problem, and imagine nitems = 100 and N (the number of bins) is 10. What is the neighbourhood size?