Speciation/Niching The original SGA (Simple GA) is designed to rapidly search the landscape (exploration) and then zoom in (exploitation) on a single solution. This scheme suffers from genetic drift and premature convergence: the entire population often converges to a single individual, losing the diversity contained in the initial population. For these reasons there has been considerable research in multimodal optimization, that is, searching simultaneously for multiple solutions (peaks) within a single population.

Speciation Numerous techniques and methodologies have been proposed to promote speciation in GAs; we will look carefully at two well-known methods. Most methods attempt to maintain multiple subpopulations running in parallel, where individuals in these “species” are allowed to interbreed only among themselves. Because each species (niche) is usually small, some form of incest prevention may be desirable. [Goldberg & Richardson 1987] The best-known method, usually termed fitness sharing. [Spears 1994] A simple subpopulation scheme using tag bits.

Fitness Sharing “Genetic Algorithms with Sharing for Multimodal Function Optimization” by Goldberg and Richardson. In fitness sharing the fitness function is modified to “allow” only so many individuals to occupy a single peak at a time. The entire population is thus spread out over many different peaks, and the cluster around each peak interbreeds and exploits that peak.

Fitness Sharing Technique Fitness sharing restricts the number of individuals climbing a peak by reducing the fitness of those individuals as the niche count increases. Individuals are said to be in the same niche if the distance between them is less than some threshold (the niche radius).

Niche Example

Fitness Sharing The shared fitness of an individual is Fs(i) = fi/mi, where fi is the raw fitness of the individual and mi is called the niche count. To calculate mi we need a distance metric d(i,j) between individuals i and j; this may be Euclidean (for real-coded genomes) or Hamming distance (for binary strings). For Euclidean distance: d(i,j) = sqrt( (xi1 − xj1)^2 + (xi2 − xj2)^2 + … + (xin − xjn)^2 )

Fitness Sharing The shared fitness of an individual is calculated using a sharing function sh() that makes distant individuals share less: sh(dij) = 1 − (dij/radius)^k for 0 ≤ dij < radius; sh(dij) = 0 otherwise. The niche count mi is then mi = Σ sh(dij), summed over j = 1 to the population size.
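The sharing computation above can be sketched in Python. This is a minimal illustration; the function names (sh, shared_fitnesses) and the toy three-individual population are assumptions for this example, not from the slides:

```python
import math

def sh(d, radius, k=1.0):
    """Sharing function: individuals within the radius share fitness."""
    if 0 <= d < radius:
        return 1.0 - (d / radius) ** k
    return 0.0

def euclidean(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def shared_fitnesses(pop, fitnesses, radius, k=1.0):
    """Fs(i) = f(i) / m(i), where m(i) = sum over j of sh(d(i, j))."""
    shared = []
    for i, fi in enumerate(fitnesses):
        # The sum includes j == i, and sh(0) = 1, so m >= 1 always.
        m = sum(sh(euclidean(pop[i], pop[j]), radius, k)
                for j in range(len(pop)))
        shared.append(fi / m)
    return shared

# Two individuals crowded on one peak, one alone on another:
pop = [(0.0,), (0.1,), (5.0,)]
fit = [10.0, 10.0, 10.0]
# The crowded pair share (niche count 1.9 each); the isolated
# individual keeps its full raw fitness.
print(shared_fitnesses(pop, fit, radius=1.0))
```

Note that the niche count of an isolated individual is exactly 1 (it only "shares" with itself), so its shared fitness equals its raw fitness.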

Fitness Sharing (let k = 1) sh(dij) = 1 − dij/radius for 0 ≤ dij < radius; sh(dij) = 0 otherwise. [Diagram: individual i at distances dij and dik from individuals j and k.]

Fitness Sharing Flaws The radius is set prior to execution, and determining a good value is not easy in most landscapes. A single fixed radius effectively assumes the peaks are evenly distributed throughout the space and all have the same width; several extensions by other authors therefore allow variable niche sizes. We also need to know (or estimate) the number of peaks in the space. Finally, the complexity is O(n2), since computing each individual's niche count requires its distance to every other individual in the population.

Speciation with Tag Bits (http://www.aic.nrl.navy) Spears developed this method, which does not require a distance metric; a “label” is used instead for each individual. Restricted mating and sharing with labels can then be accomplished efficiently. Spears uses tag bits to label individuals.

Tag Bits In standard fitness-proportional selection the expected number of offspring for an individual is fi/ave_fit, where ave_fit is the average fitness of the population. Here fitness is instead shared over a set of subpopulations {S0, S1, …, Sn-1}. All individuals in the same set have the same tag bits, and each individual belongs to exactly one set.

Tag Bits The shared fitness is therefore calculated using the set cardinalities: Fi = fi/|Sj|, where i ∈ Sj. The sum of these shared fitnesses divided by N, the population size, gives the new average fitness F. Hence the expected number of offspring is now Fi/F. Restricted mating is enforced by allowing mating only between individuals that have the same tag bits.
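The tag-bit sharing arithmetic can be sketched as follows. The helper name tag_shared_fitness and the four-individual example are assumptions for illustration:

```python
from collections import Counter

def tag_shared_fitness(tags, fitnesses):
    """F(i) = f(i) / |S_j|, where S_j is the subpopulation
    of individuals carrying the same tag value as i."""
    sizes = Counter(tags)  # cardinality of each subpopulation
    return [f / sizes[t] for t, f in zip(tags, fitnesses)]

tags = [0, 0, 0, 1]            # one tag bit -> two possible species
fit  = [6.0, 6.0, 6.0, 4.0]
F = tag_shared_fitness(tags, fit)       # [2.0, 2.0, 2.0, 4.0]
F_bar = sum(F) / len(F)                 # new average fitness = 2.5
expected_offspring = [x / F_bar for x in F]
```

Unlike fitness sharing, this requires only one pass to count the subpopulation sizes, so it avoids the O(n2) pairwise distance computation.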

Tag Bits If we use n tag bits then the population can represent 2^n species. Population size is of course related to the number of tag bits we wish to use: a population of size 100 with 4 tag bits implies roughly 100/2^4 = 100/16 ≈ 6 individuals per species. Tag bits can be mutated as well.

Multiple Peaks

Spears' Thought Experiment Suppose we have only one tag bit and two peaks, one twice as high as the other. Allocate tag bits to each individual randomly; after random sampling the two subpopulations could settle on either of the two peaks, the same one or different ones. With fitness sharing the higher peak can support only twice as many individuals as the lower peak, so in a population of 100 roughly 66 individuals will end up on the higher peak and 34 on the lower peak.
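The 66/34 figure follows from the equilibrium condition of sharing: subpopulations stop shifting when shared fitnesses are equal, i.e. h1/|S1| = h2/|S2|, so subpopulation sizes become proportional to peak heights. A small sketch (the function name is an assumption):

```python
def equilibrium_split(peak_heights, pop_size):
    """Expected subpopulation sizes at the sharing equilibrium,
    where each species' shared fitness h_j / |S_j| is equal:
    sizes are proportional to peak heights."""
    total = sum(peak_heights)
    return [pop_size * h / total for h in peak_heights]

# Two peaks, one twice as high, population of 100:
print(equilibrium_split([2.0, 1.0], 100))  # roughly [66.7, 33.3]
```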

Mutating Tag Bits Tag bits are not modified by crossover, but mutation can be used to allow individuals to move from one subpopulation to another. The rate of this migration can be controlled by the amount of mutation applied to the tags.

Tag Bit Structure What data structure would you use to store the individuals so that it is efficient and easy to select two individuals from the same species?
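One possible answer (an assumption on my part, not given in the slides): group the population in a dictionary keyed by tag value, so that after one grouping pass, drawing two mates from the same species is a constant-time lookup plus a sample:

```python
import random

def group_by_tag(population):
    """Bucket individuals by tag value: species[tag] -> list of individuals."""
    species = {}
    for individual in population:
        species.setdefault(individual["tag"], []).append(individual)
    return species

pop = [{"tag": 0, "genome": "0101"},
       {"tag": 0, "genome": "1100"},
       {"tag": 1, "genome": "0011"}]
species = group_by_tag(pop)
# Restricted mating: pick two distinct parents from the same species.
parents = random.sample(species[0], 2)
```

The grouping pass is O(n); it would need to be rebuilt (or updated incrementally) each generation as tags mutate.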

Example Problems to which these schemes may apply: The magic square problem is a nice example. There are many solutions to a 4×4 magic square, each with exactly the same fitness; if encoded using real numbers, the landscape has quite a few peaks (at least 880 unique ones). Diophantine problems are also interesting, since they often have several solutions per equation.