
1 GAs: why do they sometimes not work?
- The coding moves the GA to operate on a different search space --- bad coding might deceive the GA or might slow the GA down. Bad coding might also make it very hard for the GA to identify building blocks of good solutions, resulting in a somewhat randomized, unfocused search behavior.
- The GA doesn't use binary strings -> schema theorem does not apply.
- Population sizes have to be finite.
- The number of iterations has to be finite.
- Poor handling of constraints.
- Good solutions are lost or destroyed.
- Insufficient search space coverage during the search process.
- Too much selective pressure (-> premature convergence) or not enough selective pressure (-> random search).
- Unbalanced sampling:
  – unbalanced selection of parents.
  – fitness function not well integrated with the selection method (might need scaling or another selection method).
  – bad random number generators.
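For orientation, the sketch below is a minimal, generic GA loop in Python (bitstring encoding, roulette-wheel selection, one-point crossover, bit-flip mutation). It is not from the slides; the toy fitness function and all parameter values are assumptions chosen only to make the terms above concrete.

```python
import random

# Minimal GA sketch (illustrative only): bitstring encoding, roulette-wheel
# selection, one-point crossover, bit-flip mutation. The fitness function and
# all parameters below are assumptions, not details from the slides.

def eval_fn(bits):
    # Toy fitness: "one-max" -- count of 1-bits. Replace with a real objective.
    return sum(bits)

def roulette_select(pop, fitnesses):
    # Fitness-proportionate selection; too much or too little selective
    # pressure here is one of the failure modes listed above.
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def crossover(a, b):
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    return [1 - b if random.random() < rate else b for b in bits]

def run_ga(length=32, pop_size=50, generations=100):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):                      # finite number of iterations
        fit = [eval_fn(ind) for ind in pop]
        pop = [mutate(crossover(roulette_select(pop, fit),
                                roulette_select(pop, fit)))
               for _ in range(pop_size)]              # finite population size
    return max(pop, key=eval_fn)

if __name__ == "__main__":
    best = run_ga()
    print("best fitness:", eval_fn(best))
```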

2 Fitness Function Scaling (cont.)
- min_eval, max_eval, and av_eval denote the minimum, maximum, and average fitness value in a particular population.
- Making the search process less randomized by scaling the fitness function (see the sketch below):
  – F(x) = 2**eval(x)
  – F(x) = (eval(x) + c)**k with c + min_eval > 1, k > 1
- Making the search process more randomized by scaling the fitness function:
  – F(x) = log(eval(x) + c) with c + min_eval > 1
- The selection strategy employed by the GA is very important for fitness function scaling: fitness functions that work well with one selection strategy might cooperate poorly with others. One approach is to define fitness functions that are scaled with respect to the average fitness of the population (e.g. σ-scaling [Forrest 1985]); that is, although the average fitness of a population might improve during the evolution process, the selective pressure remains approximately the same if this approach is applied successfully.
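A small Python sketch of the three scaling transforms listed above; the parameter values, the helper names, and the toy fitness values are assumptions.

```python
import math

# Sketch of the fitness-scaling transforms above: power-law and exponential
# scaling sharpen selection, log scaling softens it. `raw` stands for the
# eval(x) values of one population; all constants here are illustrative.

def power_scale(raw, c, k=2.0):
    # F(x) = (eval(x) + c) ** k, with c + min_eval > 1 and k > 1:
    # exaggerates fitness differences -> less randomized search.
    assert c + min(raw) > 1 and k > 1
    return [(f + c) ** k for f in raw]

def exp_scale(raw):
    # F(x) = 2 ** eval(x): even stronger amplification of differences.
    return [2.0 ** f for f in raw]

def log_scale(raw, c):
    # F(x) = log(eval(x) + c), with c + min_eval > 1:
    # compresses fitness differences -> more randomized search.
    assert c + min(raw) > 1
    return [math.log(f + c) for f in raw]

raw = [0.2, 0.5, 0.9, 1.4]          # toy raw fitness values
print(power_scale(raw, c=1.0))
print(exp_scale(raw))
print(log_scale(raw, c=1.0))
```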

3 Popular Selection Strategies
- Two major factors influence the genetic search [Whitley]:
  – Population Diversity
  – Selective Pressure
- Improved Strategies:
  – Elitist Model ("The best solution never dies.")
  – Expected Value Model: f(v)/f_av is decreased by subtracting a constant or by dividing through a constant when the chromosome is selected, to reduce stochastic errors in the selection routine.
  – Crowding [DeJong] ("a newly generated solution replaces a solution in the population that is similar to it").
  – Ranking Selection
  – Tournament Selection (combines fitness-value based selection with ranking selection)
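A hedged sketch of two of the strategies above, tournament selection plus an elitist step. The tournament size, the `(individual, fitness)` pair representation, and the `make_offspring` callback are assumptions, not details from the slide.

```python
import random

# population: list of (individual, fitness) pairs.
# make_offspring: hypothetical callback that takes two parent individuals
# and returns one child individual (e.g. crossover + mutation).

def tournament_select(population, k=2):
    # Pick k individuals at random and return the fittest of them.
    contestants = random.sample(population, k)
    return max(contestants, key=lambda pair: pair[1])[0]

def next_generation(population, make_offspring, pop_size):
    # Elitist model: the best solution never dies -- copy it over unchanged.
    elite = max(population, key=lambda pair: pair[1])[0]
    children = [elite]
    while len(children) < pop_size:
        parent_a = tournament_select(population)
        parent_b = tournament_select(population)
        children.append(make_offspring(parent_a, parent_b))
    return children   # individuals only; the caller re-evaluates fitness
```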

4 Hybrid Schemes
- Idea: Combine the GA with other problem-solving paradigms, taking advantage of already existing problem-specific knowledge that is incorporated into the other problem-solving paradigm.
- Example: Goldberg's G-Improvement [1983] (see the sketch below):
  1. Select one or more strings from the current population.
  2. Sweep bit by bit, performing successive one-bit changes to the selected strings as long as the fitness improves (local search).
  3. At the end of the sweep, reinsert the improved strings into the population, and continue the normal GA process (until the next local search occurs).
[Diagram: a Search Controller alternating between GA-Search (relying on crossover, mutation, and selection) and Local Search (calculus-based, greedy, or other techniques).]
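The following is a rough Python sketch of the bit-sweep local-improvement step (step 2 above); the slide does not specify the fitness function or the exact acceptance rule, so `eval_fn` and the greedy accept rule are assumptions here.

```python
# Sketch of the bit-sweep local improvement used in the hybrid scheme above.
# eval_fn: fitness function over a bit list (higher is better) -- assumed.

def bit_sweep_improve(bits, eval_fn):
    # Try flipping each bit once, left to right; keep a flip only if it
    # improves fitness (greedy hill-climbing on the bitstring).
    bits = list(bits)
    best = eval_fn(bits)
    for i in range(len(bits)):
        bits[i] = 1 - bits[i]
        candidate = eval_fn(bits)
        if candidate > best:
            best = candidate          # keep the improving one-bit change
        else:
            bits[i] = 1 - bits[i]     # revert the flip
    return bits

# Usage within the hybrid loop: select strings from the population, improve
# them with bit_sweep_improve, and reinsert them before the GA continues.
```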

5 Hierarchically Embedded GAs
- Hierarchical approach that employs nested GAs:
  – The outer GA identifies a subregion / reference points / solution framework.
  – The inner GA searches within the subregion / framework provided by the outer GA.
- Delta Coding, developed by Whitley et al., applies GA techniques at two levels:
  – the level of potential solutions
  – the level of delta changes (slight modifications of a solution)
- This idea is also employed by the Dynamic Parameter Encoding strategy (DPE).
- Grefenstette employs a Meta-GA/GA setting, in which the Meta-GA learns the control parameters of a particular GA (similar ideas are also currently explored at Stanford University for various learning algorithms).
- Also related to meta-learning research (-> Stolfo@Columbia) and approaches that employ multi-layered decision-making strategies.
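To show the outer/inner structure only (this is not Whitley's actual Delta Coding algorithm), here is a hedged Python sketch of a nested search: an outer loop proposes reference points, and an inner GA explores small "delta" changes around them. All function names, the real-valued encoding, and the parameters are assumptions.

```python
import random

def inner_ga(center, eval_fn, delta=0.1, pop_size=20, generations=50):
    # Inner level: search small delta changes around the solution framework
    # provided by the outer level (here: a real-valued vector `center`).
    pop = [[c + random.uniform(-delta, delta) for c in center]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=eval_fn, reverse=True)           # maximize eval_fn
        parents = pop[:pop_size // 2]                 # truncation selection
        pop = parents + [[(a + b) / 2 + random.uniform(-delta, delta)
                          for a, b in zip(random.choice(parents),
                                          random.choice(parents))]
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=eval_fn)

def hierarchical_search(eval_fn, dim=5, outer_rounds=10):
    # Outer level: keep a reference point and let the inner GA refine it.
    best = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(outer_rounds):
        candidate = inner_ga(best, eval_fn)
        if eval_fn(candidate) > eval_fn(best):
            best = candidate
    return best
```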

6 Multi-Layered Learning & Optimization Approaches
[Diagram: three architectures built from base decision-makers M1 — Embedded Approaches (one model solves a subproblem inside another), Multi-Layered Approaches (Meta-Learning / Meta-Optimization, with a Meta-Decision-Maker on top of the base models), and Tree-like Approaches.]
Remarks:
- Meta-decision-makers can use voting or evidence combination, or might employ a strategy that was learned by training the meta-decision-maker with meta-data.
- Tree-like approaches employ the same decision-making scheme at all levels; however, decisions in intermediate nodes are "navigational", whereas decisions in leaves are final.
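As a concrete illustration of the voting option mentioned in the remarks, here is a minimal majority-vote meta-decision-maker sketch in Python; the base models and all names are purely illustrative assumptions.

```python
from collections import Counter

# Minimal voting meta-decision-maker: the base decision-makers are modeled
# as plain callables that each return a decision for the input x.

def majority_vote(base_models, x):
    # Ask every base model for a decision and return the most common answer.
    votes = [model(x) for model in base_models]
    return Counter(votes).most_common(1)[0][0]

# Example with three toy base models voting on a threshold decision:
models = [lambda x: x > 0, lambda x: x >= 1, lambda x: x > -1]
print(majority_vote(models, 0.5))   # -> True (two of three models vote True)
```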

