2 Fundamentals of Genetic Algorithms Alexandre P. Alves da Silva and Djalma M. Falcão

2.1 Introduction Main design issues in tailoring GAs to large-scale optimization problems:
–Encoding schemes
–Selection procedures
–Self-adaptive and knowledge-based operators

2.2 Modern Heuristic Search Techniques Traditional optimization techniques begin with a single candidate and search iteratively for the optimal solution by applying static heuristics. In contrast, the GA approach uses a population of candidates to search several areas of the solution space simultaneously and adaptively.

2.2 Modern Heuristic Search Techniques (cont) With GAs:
–No need for an explicit objective function (it can be a black box)
–Objective functions need not be differentiable
–Good at combinatorial optimization problems, achieving high-quality solutions within reasonable run time
Heuristic optimization accepts inferior candidate solutions occasionally in order to escape from local optima.

2.2 Modern Heuristic Search Techniques (cont) The choice of representation for a GA is fundamental to achieving good results. Hybrid methods have been proposed in order to improve the robustness of the search:
–Heuristic method as coarse-grain optimizer
–Local search as fine-grain optimizer

2.3 Introduction to GAs Genetic algorithms operate on a population of individuals. Each individual is a potential solution to a given problem. After an initial population is randomly or heuristically generated, the algorithm evolves the population through sequential and iterative application of three operators: selection, crossover, and mutation.
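A minimal sketch of this loop in Python, assuming binary chromosomes, binary tournament selection, one-point crossover, and bit-flip mutation (all illustrative choices, not prescribed by the chapter):

import random

def simple_ga(fitness, n_bits=20, n_pop=50, n_gen=100, p_cross=0.8, p_mut=0.01):
    # Randomly generated initial population of bit strings.
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_pop)]
    for _ in range(n_gen):
        scores = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < n_pop:
            # Selection: binary tournament (one of several possible schemes).
            i1 = max(random.sample(range(n_pop), 2), key=lambda i: scores[i])
            i2 = max(random.sample(range(n_pop), 2), key=lambda i: scores[i])
            c1, c2 = pop[i1][:], pop[i2][:]
            # Crossover: single point, applied with probability p_cross.
            if random.random() < p_cross:
                cut = random.randint(1, n_bits - 1)
                c1, c2 = c1[:cut] + c2[cut:], c2[:cut] + c1[cut:]
            # Mutation: independent bit flips with small probability p_mut.
            for child in (c1, c2):
                for i in range(n_bits):
                    if random.random() < p_mut:
                        child[i] = 1 - child[i]
            new_pop += [c1, c2]
        pop = new_pop[:n_pop]
    return max(pop, key=fitness)

# Example usage: maximize the number of ones in the string.
best = simple_ga(fitness=sum)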

2.3 Introduction to GAs (cont) Incorporate prior knowledge whenever possible, but do not restrict population diversity. The population size involves an inevitable tradeoff between computational cost and solution quality. The coding scheme and the fitness function are the most important aspects of any GA, because they are problem dependent.

2.4 Encoding A decision must be taken on how the parameters of the problem will be mapped into a finite string of symbols, known as genes (with constant or dynamic length), encoding a possible solution in a given problem space. The symbol alphabet used is often binary, though other representations have also been used, including character-based and real-valued encodings. Design GA operators to effectively operate on the representation.
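As an illustration (not from the chapter), a real-valued parameter on [lo, hi] can be mapped to an n-bit gene and back:

def encode(x, lo, hi, n_bits):
    # Quantize x in [lo, hi] onto 2**n_bits levels and write it as a bit list.
    level = round((x - lo) / (hi - lo) * (2**n_bits - 1))
    return [(level >> i) & 1 for i in reversed(range(n_bits))]

def decode(bits, lo, hi):
    # Inverse map: bit list -> integer level -> real value in [lo, hi].
    level = int("".join(map(str, bits)), 2)
    return lo + level * (hi - lo) / (2**len(bits) - 1)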

2.5 Fitness Function It is useful to distinguish between the objective function and the fitness function used by a GA. The specification of an appropriate fitness function is crucial for the correct operation of a GA. Reshaping is applied if needed.

2.5 Fitness Function (cont) Four basic techniques for handling constraints when using GAs:
–Discard infeasible chromosomes
–Repair infeasible chromosomes
–Problem-specific GA operators
–Penalty functions (sketched below)
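The last of these can be sketched as follows, assuming inequality constraints expressed as g(x) <= 0 and an illustrative penalty weight:

def penalized_fitness(objective, constraints, x, weight=1000.0):
    # Subtract a penalty proportional to the total constraint violation,
    # so infeasible chromosomes survive in the population but are graded down.
    violation = sum(max(0.0, g(x)) for g in constraints)
    return objective(x) - weight * violation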

2.5 Fitness Function (cont) Some problems associated with the fitness function specification are the following:
–Dependence on whether the problem is one of maximization or minimization;
–The fitness function may be noisy in a nondeterministic environment;
–The fitness function may change dynamically as the GA is executed;
–The fitness function evaluation can be so time consuming that only approximations to fitness values can be computed;
–The fitness function should allocate very different values to strings in order to make the selection operator's work easier;
–It must consider the constraints of the problem; and
–It could incorporate different sub-objectives.

2.5 Fitness Function (cont) The fitness function is a black box for the GA. Internally, this may be achieved by a mathematical function, a simulator program, or a human expert that decides the quality of a string.

2.5.1 Premature Convergence Once the population has converged, the ability of the GA to continue searching for better solutions is nearly eliminated. Crossover of almost identical chromosomes generally produces similar offspring. Only mutation, with its random perturbation mechanism, remains to explore new regions of the search space.

2.5.1 Premature Convergence (cont) Control the number of reproductive opportunities each individual gets. The strategy is to compress the range of fitness, without losing selection pressure, and prevent any super-fit individual from suddenly dominating the population.

2.5.2 Slow Finishing This is the opposite of premature convergence. After many generations, the average fitness is high, and the difference between the best and the average individuals is small. Therefore, there is insufficient variance in the fitness values to localize the maxima. The remedy is to expand the range of population fitness, instead of compressing it.

2.6 Basic Operators Several important design issues for the selection, crossover, and mutation operators are presented:
–Selection implements the survival of the fittest according to some predefined fitness function
–Crossover is likely to create even better individuals by recombining portions of good individuals
–Mutation maintains diversity within the population and inhibits premature convergence to local optima by randomly sampling new points in the search space

2.6.1 Selection Selection pressure is the degree to which the best individuals are favored. The convergence rate of a GA is determined largely by the magnitude of the selection pressure. Reshaping of the fitness function (compression or expansion) controls the selection pressure.

2.6.1 Selection (cont) Two classes of selection schemes:
–Proportionate-based schemes select individuals based on their fitness values relative to the fitness of the other individuals in the population.
–Ordinal-based schemes select individuals not based on their fitness but based on their rank within the population.

Tournament Selection This selection scheme is implemented by choosing some number of individuals randomly from the population, copying the best individual from this group into the intermediate population, and repeating the process until the mating pool is complete. By adjusting the tournament size, the selection pressure can be made arbitrarily large or small; bigger tournaments increase the selection pressure.
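A sketch of tournament selection (maximization assumed; names are illustrative):

import random

def tournament_select(pop, scores, k=2):
    # Pick k individuals at random; the fittest of the group enters the
    # mating pool. Larger k means higher selection pressure.
    contenders = random.sample(range(len(pop)), k)
    return pop[max(contenders, key=lambda i: scores[i])]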

Truncation Selection In truncation selection, only a subset of the best individuals is eligible for selection, and each of them is selected with the same probability.
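A corresponding sketch, keeping an illustrative best fraction of the population:

import random

def truncation_select(pop, scores, frac=0.5):
    # Rank from best to worst, keep the top frac, and choose uniformly
    # among the survivors.
    ranked = sorted(range(len(pop)), key=lambda i: scores[i], reverse=True)
    elite = ranked[:max(1, int(frac * len(pop)))]
    return pop[random.choice(elite)]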

Linear Ranking Selection The individuals are sorted according to their fitness values. The selection probability is linearly assigned to the individuals according to their ranks.

Exponential Ranking Selection Exponential ranking selection differs from linear ranking selection only in that the probabilities of the ranked individuals are exponentially weighted.
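Both ranking schemes can be sketched together (the rank weights are illustrative; the base c of the exponential weighting is a user-set constant below 1):

import random

def rank_select(pop, scores, mode="linear", c=0.97):
    # Sort indices from worst to best fitness.
    order = sorted(range(len(pop)), key=lambda i: scores[i])
    n = len(pop)
    if mode == "linear":
        # Linear ranking: the worst individual gets weight 1, the best gets n.
        weights = [r + 1 for r in range(n)]
    else:
        # Exponential ranking: weights decay geometrically from the best.
        weights = [c ** (n - 1 - r) for r in range(n)]
    return pop[random.choices(order, weights=weights, k=1)[0]]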

Elitist Selection Preservation of the elite solutions from the preceding generation ensures that the best solutions known so far will remain in the population and have more opportunities to produce offspring. Elitist selection is used in combination with other selection strategies.

Proportional Selection The probability that an individual will be selected is simply proportional to its fitness value. The selection probabilities strongly depend on the scaling of the fitness function. There are different scaling operators that help in separating the fitness values in order to improve the work of the proportional selection mechanism. The most common ones are linear scaling, sigma truncation, and power scaling.
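A roulette-wheel sketch of proportional selection (non-negative fitness values assumed):

import random

def roulette_select(pop, scores):
    # Spin a roulette wheel whose slots are sized by fitness.
    total = sum(scores)
    spin = random.uniform(0.0, total)
    running = 0.0
    for ind, s in zip(pop, scores):
        running += s
        if running >= spin:
            return ind
    return pop[-1]  # guard against floating-point round-off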

Linear Scaling

Sigma Truncation

Power Scaling
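The equations for these three slides did not survive transcription; the usual textbook forms, with the constants a, b, c, and k chosen by the user, can be sketched as:

from statistics import mean, pstdev

def linear_scaling(scores, a=1.0, b=0.0):
    # f' = a*f + b; a and b are typically recomputed each generation so the
    # best individual receives a fixed multiple of the average fitness.
    return [a * f + b for f in scores]

def sigma_truncation(scores, c=2.0):
    # f' = max(0, f - (mean - c*sigma)); discounts individuals more than
    # c standard deviations below the population average.
    m, s = mean(scores), pstdev(scores)
    return [max(0.0, f - (m - c * s)) for f in scores]

def power_scaling(scores, k=1.005):
    # f' = f**k; k > 1 expands the fitness range, k < 1 compresses it.
    return [f ** k for f in scores]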

2.6.2 Crossover Crossover exchanges genes of two chromosomes.
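The classical single-point form can be sketched as:

import random

def one_point_crossover(p1, p2):
    # Cut both parents at the same random point and swap the tails.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]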

2.6.2 Crossover (cont) Uniform crossover is another important recombination mechanism [25]. Offspring is created by randomly picking each bit from either of the two parent strings (Fig. 2.3).
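A sketch of uniform crossover:

import random

def uniform_crossover(p1, p2):
    # Each offspring gene is drawn from either parent with equal
    # probability; the second child receives the complementary genes.
    c1, c2 = [], []
    for g1, g2 in zip(p1, p2):
        if random.random() < 0.5:
            c1.append(g1); c2.append(g2)
        else:
            c1.append(g2); c2.append(g1)
    return c1, c2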

2.6.2 Crossover (cont) Crossover is a very controversial operator due to its disruptive nature (i.e., it can split important information).

2.6.3 Mutation The GA literature has shown a growing recognition of the importance of mutation, in contrast with the traditional view of it as merely responsible for reintroducing inadvertently lost gene values. The mutation operator is more important in the final generations, when the majority of the individuals present similar quality. As is shown in Section 2.6.4, a variable mutation rate is very important for search efficiency; its setting is much more critical than that of the crossover rate.

2.6.3 Mutation (cont) In the case of binary encoding, mutation is carried out by flipping bits at random, with some small probability (usually in the range [0.001, 0.05]). For real-valued encoding, the mutation operator can be implemented by random replacement (i.e., replace the value with a random one). Another possibility is to add/subtract (or multiply by) a random (e.g., uniformly or Gaussian distributed) amount. Mutation can also be used as a hill-climbing mechanism.
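Both variants can be sketched as follows (the rates and noise scale are illustrative):

import random

def bit_flip_mutation(bits, p_mut=0.01):
    # Binary encoding: flip each bit independently with probability p_mut.
    return [1 - b if random.random() < p_mut else b for b in bits]

def gaussian_mutation(genes, p_mut=0.05, sigma=0.1):
    # Real-valued encoding: perturb each gene by Gaussian noise with
    # probability p_mut.
    return [g + random.gauss(0.0, sigma) if random.random() < p_mut else g
            for g in genes]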

2.6.4 Control Parameters Estimation Typical values for the population size, crossover, and mutation rates have been selected in the intervals [30, 200], [0.5, 1.0], and [0.001, 0.05], respectively. Tuning parameters manually is common practice in GA design. The GA search technique is an adaptive process, which requires continuous tracking of the search dynamics. Therefore, the use of constant parameters leads to inferior performance.
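As one illustrative alternative to constant parameters (a simple linear schedule, not necessarily the adaptation scheme the chapter has in mind):

def decaying_mutation_rate(gen, n_gen, p_start=0.05, p_end=0.001):
    # Interpolate the mutation rate from p_start at generation 0 down to
    # p_end at the final generation.
    return p_start + (p_end - p_start) * gen / max(1, n_gen - 1)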

2.7 Niching Methods The identification of multiple optima is beneficial for GA performance. Niching methods extend standard GAs by creating stable subpopulations around global and local optimal solutions. Niching methods maintain population diversity and allow GAs to explore many peaks simultaneously. They are based on either fitness sharing or crowding schemes.

2.7 Niching Methods (cont) Fitness sharing decreases each element's fitness in proportion to the number of similar individuals in the population (i.e., in the same niche). The similarity measure is based on either the genotype (e.g., Hamming distance) or the phenotype (e.g., Euclidean distance).
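A sketch using the common triangular sharing function (the niche radius sigma_share and exponent alpha are user-set; the distance function is supplied by the caller):

def shared_fitness(pop, scores, distance, sigma_share=1.0, alpha=1.0):
    # Divide each fitness by its niche count: the sum of sharing-function
    # values over all individuals closer than sigma_share.
    def sh(d):
        return 1.0 - (d / sigma_share) ** alpha if d < sigma_share else 0.0
    shared = []
    for ind, f in zip(pop, scores):
        niche = sum(sh(distance(ind, other)) for other in pop)
        shared.append(f / niche)  # niche >= 1, since sh(0) = 1 for ind itself
    return shared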

2.7 Niching Methods (cont) Crowding implicitly defines neighborhood by the application of tournament rules. It can be implemented as follows. When an offspring is created, one individual is chosen, from a random subset of the population, to disappear. The chosen one is the element that most closely resembles the new offspring.
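A sketch of that replacement rule:

import random

def crowding_replace(pop, offspring, distance, subset_size=3):
    # Draw a random subset and replace the member that most closely
    # resembles the new offspring.
    candidates = random.sample(range(len(pop)), subset_size)
    victim = min(candidates, key=lambda i: distance(pop[i], offspring))
    pop[victim] = offspring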

2.7 Niching Methods (cont) Another idea used by niching methods is restricted mating. This mechanism avoids the recombination of individuals that do not belong to the same niche. Highly fit, but not similar, parents can produce highly unfit offspring. Restricted mating is based on the assumption that if similar parents (i.e., from the same niche) are mated, then offspring will be similar to them.

2.8 Parallel Genetic Algorithms In several cases, the application of GAs to actual problems in business, engineering, and science requires long processing times (hours, days). Most of this time is usually spent in thousands of fitness function evaluations that may involve long calculations. Fortunately, these evaluations can be performed quite independently of each other, which makes GAs well suited to parallel processing.
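In Python, for example, the independent evaluations can be farmed out to worker processes (the fitness function must be a picklable, module-level function):

from multiprocessing import Pool

def evaluate_population(fitness, pop, n_workers=4):
    # Master-slave pattern: distribute the independent fitness calls
    # across n_workers processes and collect the results in order.
    with Pool(n_workers) as pool:
        return pool.map(fitness, pop)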

2.8 Parallel Genetic Algorithms (cont) The usual classification of parallel GAs found in the literature is:
–Single-population master–slave GAs: One master node executes the basic GA operations (selection, crossover, and mutation), and the fitness evaluations are distributed among several slave processors.
–Multiple-population GAs: This type of GA consists of several subpopulations that exchange individuals occasionally.

2.8 Parallel Genetic Algorithms (cont)
–Fine-grained GAs: These consist of a single spatially structured population, usually in a two-dimensional rectangular grid, with one individual per grid point. The fitness evaluation is performed simultaneously for all the individuals, and selection and mating are usually restricted to a small neighborhood around each individual. This type of parallel GA is well suited for massively parallel multiple instruction, multiple data (MIMD) computers.