
GAs: why do they sometimes not work?
- The coding moves the GA to operate on a different search space; bad coding might deceive the GA or slow it down. Bad coding might also make it very hard for the GA to identify building blocks of good solutions, resulting in a somewhat randomized, unfocused search behavior.
- The GA doesn't use binary strings, so the schema theorem does not apply.
- Population sizes have to be finite.
- The number of iterations has to be finite.
- Poor handling of constraints.
- Good solutions are lost or destroyed.
- Insufficient search space coverage during the search process.
- Too much selective pressure (-> premature convergence) or not enough (-> random search).
- Unbalanced sampling:
  - unbalanced selection of parents.
  - fitness function not well integrated with the selection method (might need scaling or another selection method).
  - bad random number generators.
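The selective-pressure trade-off above (too much -> premature convergence, too little -> random search) can be illustrated with a minimal tournament-selection sketch. This is not from the slides; the toy population, fitness function, and tournament sizes are illustrative:

```python
import random

def tournament_select(pop, fitness, k):
    """Pick the best of k randomly sampled individuals;
    a larger k means stronger selective pressure."""
    contenders = random.sample(pop, k)
    return max(contenders, key=fitness)

random.seed(0)
pop = list(range(100))          # toy "individuals": the integers 0..99
fit = lambda x: x               # toy fitness: the integer itself

# Average fitness of 1000 selected parents under weak vs. strong pressure.
weak   = sum(tournament_select(pop, fit, 2) for _ in range(1000)) / 1000
strong = sum(tournament_select(pop, fit, 8) for _ in range(1000)) / 1000
print(weak, strong)   # larger tournaments pick fitter parents on average
```

Tuning k is one concrete way to move between the two failure modes: k near the population size collapses diversity quickly, while k = 1 degenerates into random search.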

Fitness Function Scaling (cont.)
- min_eval, max_eval, and av_eval denote the minimum, maximum, and average fitness value in a particular population.
- Making the search process less randomized by scaling the fitness function:
  - F(x) = 2**eval(x)
  - F(x) = (eval(x) + c)**k with c + min_eval > 1, k > 1
- Making the search process more randomized by scaling the fitness function:
  - F(x) = log(eval(x) + c) with c + min_eval > 1
- The selection strategy employed by the GA is very important for fitness function scaling: fitness functions that work well with one selection strategy might cooperate poorly with others.
- Fitness functions can also be scaled with respect to the average fitness of the population (e.g. σ-coding [Forrest 1985]); if this approach is applied successfully, the selective pressure remains approximately the same even though the average fitness of the population improves during the evolution process.
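The three scaling formulas above can be sketched directly. A minimal Python sketch; the toy eval values and the constant c are illustrative, chosen so that c + min_eval > 1 holds:

```python
import math

def scale_exp(ev):                 # less randomized: F(x) = 2**eval(x)
    return 2.0 ** ev

def scale_power(ev, c, k):         # less randomized: F(x) = (eval(x)+c)**k, k > 1
    return (ev + c) ** k

def scale_log(ev, c):              # more randomized: F(x) = log(eval(x)+c)
    return math.log(ev + c)

evals = [1.0, 2.0, 4.0]            # raw fitness values of a toy population
c = 1.0                            # satisfies c + min_eval > 1

# Exponential scaling widens the ratios between individuals (sharper
# selection); log scaling compresses them (flatter, more random selection).
print([scale_exp(e) for e in evals])
print([round(scale_log(e, c), 3) for e in evals])
```

Comparing the ratio of the best to the median individual before and after scaling makes the effect concrete: 4/2 = 2 raw, 2**4/2**2 = 4 under exponential scaling, and log(5)/log(3) ≈ 1.46 under log scaling.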

Popular Selection Strategies
Two major factors influence the genetic search [Whitley]:
- Population diversity
- Selective pressure
Improved strategies:
- Elitist Model ("The best solution never dies.")
- Expected Value Model: f(v)/f_av is decreased, by subtracting a constant or by dividing by a constant when the chromosome is selected, to reduce stochastic errors in the selection routine.
- Crowding [DeJong] ("a newly generated solution replaces a solution in the population that is similar to it").
- Ranking Selection
- Tournament Selection (combines fitness-value-based selection with ranking selection)
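Two of the strategies above, ranking selection and the elitist model, can be combined in a short sketch. This is a minimal illustration, not from the slides; the toy fitness function and averaging "crossover" are illustrative:

```python
import random

def rank_select(pop, fitness, rng):
    """Ranking selection: parents are chosen by rank, not raw fitness,
    which moderates selective pressure when fitness values vary wildly."""
    ranked = sorted(pop, key=fitness)            # worst first, best last
    weights = range(1, len(ranked) + 1)          # weight = rank
    return rng.choices(ranked, weights=weights, k=1)[0]

def next_generation(pop, fitness, make_child, rng):
    """Elitist model: the best solution never dies; the rest of the new
    population is bred from rank-selected parents."""
    elite = max(pop, key=fitness)
    children = [make_child(rank_select(pop, fitness, rng),
                           rank_select(pop, fitness, rng))
                for _ in range(len(pop) - 1)]
    return [elite] + children

rng = random.Random(1)
pop = [rng.randint(0, 100) for _ in range(20)]
fit = lambda x: -abs(x - 42)                     # toy goal: get close to 42
child = lambda a, b: (a + b) // 2                # toy "crossover": averaging
best_before = max(fit(x) for x in pop)
for _ in range(5):
    pop = next_generation(pop, fit, child, rng)
```

Because the elite is copied unchanged into every new population, the best fitness in the population can never decrease between generations.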

Hybrid Schemes
- Idea: combine the GA with other problem-solving paradigms, occasionally taking advantage of already existing problem-specific knowledge that is incorporated into the other problem-solving paradigm.
- Example: Goldberg's G-Improvement [1983]:
  1. Select one or more strings from the current population.
  2. Sweep bit by bit, performing successive one-bit changes to the subject strings as long as the fitness improves (local search).
  3. At the end of the sweep, reinsert the improved strings into the population and continue the normal GA process (until the next local search occurs).
- Architecture: a search controller coordinates the GA search (relying on crossover, mutation, and selection) with the local search (calculus-based, greedy, or other techniques).
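Step 2 of the sweep can be sketched as a one-bit hill climber. A minimal sketch in the spirit of the description above, not Goldberg's original code; the OneMax fitness is an illustrative stand-in:

```python
def bit_sweep(bits, fitness):
    """Sweep bit by bit: flip each bit in turn and keep the flip
    whenever it improves fitness, otherwise undo it (local search)."""
    bits = bits[:]                   # do not mutate the caller's string
    best = fitness(bits)
    for i in range(len(bits)):
        bits[i] ^= 1                 # tentative one-bit change
        f = fitness(bits)
        if f > best:
            best = f                 # improvement: keep the flip
        else:
            bits[i] ^= 1             # no improvement: undo the flip
    return bits

ones = lambda b: sum(b)              # toy fitness: OneMax (count of 1-bits)
print(bit_sweep([0, 1, 0, 0, 1], ones))  # -> [1, 1, 1, 1, 1]
```

In the hybrid scheme the returned string would be reinserted into the population (step 3) before the normal GA process resumes.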

Hierarchically Embedded GAs
- Hierarchical approach that employs nested GAs:
  - The outer GA identifies a subregion / reference points / solution framework.
  - The inner GA searches within the subregion / framework provided by the outer GA.
- Delta Coding, developed by Whitley et al., applies GA techniques at two levels:
  - the level of potential solutions
  - the level of delta changes (slight modifications of a solution)
- A similar two-level scheme is also employed by the Dynamic Parameter Encoding strategy (DPE).
- Grefenstette employs a Meta-GA/GA setting in which the Meta-GA learns the control parameters of a particular GA (similar ideas are also currently explored at Stanford University for various learning algorithms).
- Also related to meta-learning research and to approaches that employ multi-layered decision-making strategies.
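The inner level of the delta-changes idea can be sketched as a search over small perturbations around a solution supplied by the outer level. This is a strongly simplified illustration of the two-level structure, not Whitley et al.'s delta coding; the quadratic fitness and all parameters are illustrative:

```python
import random

def delta_search(base, fitness, width, iters, rng):
    """Inner search: try small 'delta' changes around the outer
    solution and keep the best candidate found."""
    best = base
    for _ in range(iters):
        delta = [rng.uniform(-width, width) for _ in base]
        cand = [b + d for b, d in zip(base, delta)]
        if fitness(cand) > fitness(best):
            best = cand
    return best

rng = random.Random(0)
f = lambda x: -sum(v * v for v in x)        # toy fitness: maximum at origin
outer = [3.0, -2.0]                         # solution from the outer GA (illustrative)
inner = delta_search(outer, f, width=1.0, iters=200, rng=rng)
# The inner level refines the outer solution without replacing the outer search.
```

In a full delta-coding scheme the refined solution would in turn become the new base, with the delta width typically shrinking as the search converges.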

Multi-Layered Learning & Optimization Approaches
[Diagram: four architectures — embedded approaches (a module M1 solves a subproblem for another module), multi-layered approaches (a meta-decision-maker sits above the base modules), meta-learning / meta-optimization, and tree-like approaches.]
Remarks:
- Meta-decision-makers can use voting, evidence combination, or might employ a strategy that was learnt by training the meta-decision-maker with meta-data.
- Tree-like approaches employ the same decision-making scheme at all levels; however, decisions in intermediate nodes are "navigational", whereas decisions in leaves are final.