1 Optimization with Genetic Algorithms Walter Reade October 31, 2002 TAG Meeting

2 Outline  Background on Optimization  Introduction to Genetic Algorithms  Using GAs to Solve Difficult Problems  A MatLab Implementation  Summary / Questions

3 How Do We Find the Minimum?

4 Gradient Methods (Steepest Descent)  Move in the direction of steepest descent (the negative gradient).  Simple to implement, guaranteed convergence.  Must know something about the derivative.  Can easily get stuck in a local minimum.
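
A minimal MATLAB sketch of the steepest-descent update just described; the quadratic test function, step size, and stopping tolerance are illustrative assumptions, not from the talk.

% Steepest descent on a simple smooth function
f     = @(x) 0.25*(x(1)^2 + x(2)^2);   % example objective
gradf = @(x) [0.5*x(1); 0.5*x(2)];     % its analytic gradient
x     = [4; -3];                       % starting point
alpha = 0.5;                           % fixed step size
for k = 1:100
    g = gradf(x);
    if norm(g) < 1e-8, break; end      % stop when the gradient is (almost) zero
    x = x - alpha*g;                   % move against the gradient
end
f(x)                                   % objective value at the final point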

5 Stochastic Methods  Heuristic: a search using “rules of thumb.”  Metaheuristic: a framework of heuristics used to update a set of solutions during a search.  Simulated Annealing  Tabu Search  Ant Colony Systems

6 Genetic Algorithms  Use a population of candidate solutions drawn from the search space.  Each solution is encoded in a string called a chromosome (or genome).  Chromosomes are evaluated for fitness each generation (iteration); chromosomes that are more fit have a higher probability of surviving.
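
A sketch of the encoding idea in MATLAB, assuming a 16-bit binary chromosome and a [-5, 5] variable range (both are illustrative choices, not the talk's):

% Encode a real value as a bit string and decode it back
nBits = 16;  lo = -5;  hi = 5;
x = 1.234;                                         % value to encode
level = round((x - lo)/(hi - lo)*(2^nBits - 1));   % quantized integer gene value
chrom = bitget(level, nBits:-1:1);                 % chromosome: bit string, MSB first
weights = 2.^(nBits-1:-1:0);                       % bit place values, MSB first
decoded = lo + (hi - lo)*(chrom*weights')/(2^nBits - 1);   % recovered real value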

7 Genetic Algorithms (cont.)  Once the surviving population is chosen, different “parent” chromosomes are combined to form “child” chromosomes.  Chromosomes may also undergo mutation.  A new generation is formed, and the process is repeated.  Through selection, cross-over, and mutation, GAs search the solution space while creating stronger solutions with each generation.
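
A minimal generational GA sketch in MATLAB tying these steps together, using tournament selection, blend crossover, and Gaussian mutation on a toy objective; the population size, rates, and objective are all illustrative assumptions, and this is not the toolbox code shown later.

% Simple real-coded GA: evaluate, select, cross over, mutate, repeat
fitFn  = @(x) -(x(1)^2 + x(2)^2);        % fitness (maximized) = negative objective
nPop   = 40;  nVar = 2;  nGen = 100;
pCross = 0.8; pMut = 0.05;
pop = 10*rand(nPop, nVar) - 5;           % random initial population in [-5, 5]
for gen = 1:nGen
    fit = zeros(nPop, 1);
    for i = 1:nPop, fit(i) = fitFn(pop(i,:)); end
    % Selection: two-way tournaments pick the surviving parents
    newPop = zeros(size(pop));
    for i = 1:nPop
        a = randi(nPop); b = randi(nPop);
        if fit(a) >= fit(b), newPop(i,:) = pop(a,:); else, newPop(i,:) = pop(b,:); end
    end
    % Crossover: blend adjacent pairs of parents into two children
    for i = 1:2:nPop-1
        if rand < pCross
            w  = rand;
            p1 = newPop(i,:); p2 = newPop(i+1,:);
            newPop(i,:)   = w*p1 + (1-w)*p2;
            newPop(i+1,:) = w*p2 + (1-w)*p1;
        end
    end
    % Mutation: perturb a small fraction of genes
    mask = rand(size(newPop)) < pMut;
    newPop(mask) = newPop(mask) + randn(nnz(mask), 1);
    pop = newPop;                        % next generation
end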

8 Fitness and Selection  Roulette Wheel  Competition  Etc.
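
A roulette-wheel selection sketch (the fitness values below are made up for illustration): each chromosome's slice of the wheel is proportional to its fitness, so fitter chromosomes are picked more often but weaker ones still have a chance.

% Spin a fitness-proportional roulette wheel nPick times
fit    = [3 1 6 2 8];                     % example (positive) fitness values
edges  = cumsum(fit / sum(fit));          % cumulative sector boundaries
nPick  = 5;
picked = zeros(1, nPick);
for k = 1:nPick
    picked(k) = find(rand <= edges, 1);   % index of the sector the spin lands in
end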

9 Cross-Over  Replaces two parent solutions with two child solutions.  Mechanism for covering a large area of the search space.
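
A single-point crossover sketch on bit-string chromosomes (the parent strings and random cut point are illustrative):

% Swap the tails of two parents at a random cut point
parent1 = [1 0 1 1 0 0 1 0];
parent2 = [0 1 0 0 1 1 0 1];
cut    = randi(length(parent1) - 1);             % crossover point
child1 = [parent1(1:cut), parent2(cut+1:end)];
child2 = [parent2(1:cut), parent1(cut+1:end)];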

10 Mutation  Operates on a single chromosome.  Mechanism for exploring the search space locally.
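
A bit-flip mutation sketch (the mutation rate and chromosome are illustrative):

% Flip each gene independently with small probability pMut
pMut  = 0.05;
child = [1 0 1 1 0 0 1 0];
mask  = rand(size(child)) < pMut;   % genes selected for mutation
child(mask) = 1 - child(mask);      % flip the selected bits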

11 Advantages of Using GAs  Flexible and adaptive to a wide variety of problems.  Robust, global search capability.  Does not require the solution space to be smooth, continuous, or differentiable.  Can be used in permutation problems.  No practical drawbacks beyond slow local convergence and a perceived learning overhead.

12 Applications  Function optimization  Job shop scheduling  Process planning  Assembly line optimization  Process control  Airplane landing  Nested design  Keyboard layout  Creativity  VLSI  Traveling salesman  Chemical kinetics  Etc.

13 Solving Difficult Problems

14 Difficult Problems  Appeared in the Jan/Feb 2002 SIAM News as part of the 100-Dollar, 100-Digit Challenge; the task is to find the global minimum of: exp(sin(50*x)) + sin(60*exp(y)) + sin(70*sin(x)) + sin(sin(80*y)) - sin(10*(x+y)) + 0.25*(x^2 + y^2)  The genetic algorithm was able to solve this to 10 digits of precision in 2,000 generations, which took 25 seconds on a P-III 1.0 GHz (35% success rate).
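
The same test function written directly as a MATLAB anonymous function (the evaluation-function form used with the GA toolbox appears on slide 19):

% SIAM 100-Digit Challenge test function
f = @(x,y) exp(sin(50*x)) + sin(60*exp(y)) + sin(70*sin(x)) + ...
           sin(sin(80*y)) - sin(10*(x+y)) + 0.25*(x.^2 + y.^2);
f(0, 0)   % evaluate at a sample point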

15 Permutation (Order-based) Problems  Time-share Example  One condo building at a ski resort  Four identical condo units  16-week ski season – 64 total owners  2nd choice = 2 free lift tickets per person, 3rd choice = 5 free tickets, otherwise $$ and 7 free tickets.  5 out of 16 weeks are twice as popular  Maximum occupancy = 22  Possible solutions: 1x10^67
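
A sketch of the order-based encoding this implies, mirroring the evaluation function on slide 20 (where the week is recovered as ceil(assignment(i)/4)); the random permutation here is just an illustration:

% A chromosome is a permutation assigning each of the 64 owners a condo-week slot
assignment = randperm(64);     % assignment(i) = slot given to owner i
weeks = ceil(assignment/4);    % week (1-16) each owner receives; 4 identical units per week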

16 Results of GA  A previously published result (using SA) found a minimum of 224 after 261 iterations, and no improvement after 1,000,000 iterations.  The GA found a cost of 200 after 2,150 iterations, and a minimum of 172 after 250,000 iterations.  (The author of the previous work was “astonished” at the new result.)

17 Using GAs in MatLab  http://www.ie.ncsu.edu/mirage/GAToolBox/gaot/

18 MatLab Code

% Bounds on the variables
bounds = [-5 5; -5 5];

% Evaluation Function
evalFn  = 'Four_Eval';
evalOps = [];

% Generate an initial population of size 80
startPop = initializega(80, bounds, evalFn, [1e-10 1]);

% GA Options [epsilon float/binary display]
gaOpts = [1e-10 1 0];

% Termination Operators -- 500 Generations
termFns = 'maxGenTerm';
termOps = [500];

% Selection Function
selectFn  = 'normGeomSelect';
selectOps = [0.08];

% Crossover Operators
xFns  = 'arithXover heuristicXover simpleXover';
xOpts = [1 0; 1 3; 1 0];

% Mutation Operators
mFns  = 'boundaryMutation multiNonUnifMutation nonUnifMutation unifMutation';
mOpts = [2 0 0; 3 200 3; 2 200 3; 2 0 0];

% Apply the genetic algorithm
[soln endPop bestPop trace] = ga(bounds, evalFn, evalOps, startPop, gaOpts, ...
    termFns, termOps, selectFn, selectOps, xFns, xOpts, mFns, mOpts);

19 Evaluation Function

function [x, soln] = Four_Eval(x, options)
% Negated so that maximizing fitness corresponds to minimizing the test function
soln = -(exp(sin(50*x(1))) + sin(60*exp(x(2))) + sin(70*sin(x(1))) + ...
         sin(sin(80*x(2))) - sin(10*(x(1)+x(2))) + 0.25*(x(1)^2 + x(2)^2));

20 Time Share Evaluation Function

function [assignment, soln] = local_min(assignment, options)
global family_info
cost = 0;
occupancy = zeros(16,1);
for i = 1:64
    if ceil(assignment(i)/4) == family_info(i,2)      % first choice
        cost = cost + 0;
    elseif ceil(assignment(i)/4) == family_info(i,3)  % second choice
        cost = cost + 2*family_info(i,1);
    elseif ceil(assignment(i)/4) == family_info(i,4)  % third choice
        cost = cost + 4*family_info(i,1);
    else                                              % didn't get any choice
        cost = cost + 50 + 7*family_info(i,1);
    end
    building = ceil(assignment(i)/4);                 % week this family receives
    occupancy(building) = occupancy(building) + family_info(i,1);
end
for i = 1:16
    if occupancy(i) > 22                              % penalize over-occupied weeks
        cost = cost + 1000;
    end
end
soln = -cost;                                         % negated: the GA maximizes fitness

21 Summary  Genetic Algorithms are: Powerful Flexible Easy to use and understand  Consider using a GA for your next optimization problem!
