Metaheuristics for the New Millennium. Bruce L. Golden, RH Smith School of Business, University of Maryland. Presented at the University of Iowa, March 2003.


Metaheuristics for the New Millennium
Bruce L. Golden, RH Smith School of Business, University of Maryland
Presented at the University of Iowa, March 2003

Focus of Paper (slide 1)
- Introduce two new metaheuristics: search space smoothing and demon algorithms
- Discuss several variants of each
- Illustrate performance using the traveling salesman problem
- Explain why the new metaheuristics work so well
- Point out connections to "old" metaheuristics: simulated annealing and genetic algorithms

Search Space Smoothing: Overview (slide 3)
- Developed by Gu and Huang in 1994
- Applied to a small number of small TSP instances
- In smoothing, we first normalize distances so that they fall between 0 and 1
- At each iteration, the original distances are transformed by the smoothing transformation and two-opt is applied
- The degree of transformation decreases from one iteration to the next, until the original distances re-emerge

Search Space Smoothing Algorithm (slide 6)
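The procedure described in the overview can be sketched in Python. This is a hedged reconstruction, not the talk's exact algorithm: the power-law transform below (pulling normalized distances toward their mean, with the exponent annealed down to 1) is one common form attributed to Gu and Huang, and the function names, the `improve` hook, and the alpha schedule are all illustrative assumptions.

```python
def smooth(dist, alpha):
    """Smooth a normalized distance matrix (entries in [0, 1]).

    Deviations from the mean distance are raised to the power alpha:
    alpha > 1 flattens the landscape, and alpha = 1 recovers the original
    distances. A sketch of a Gu-and-Huang-style transform; the exact form
    used in the talk may differ.
    """
    n = len(dist)
    mean = sum(dist[i][j] for i in range(n) for j in range(n) if i != j) / (n * (n - 1))
    out = [row[:] for row in dist]
    for i in range(n):
        for j in range(n):
            if i != j:
                dev = dist[i][j] - mean
                out[i][j] = mean + (dev ** alpha if dev >= 0 else -((-dev) ** alpha))
    return out

def smoothing_schedule(dist, improve, alphas=(6, 3, 2, 1)):
    """Run a local-improvement routine (e.g. two-opt) on successively
    less-smoothed landscapes, ending on the original distances (alpha = 1).
    `improve(matrix, tour)` is a hypothetical hook returning an improved tour."""
    tour = None
    for alpha in alphas:
        tour = improve(smooth(dist, alpha), tour)
    return tour
```

Because two-opt on a heavily smoothed (nearly flat) landscape is unlikely to get trapped, each stage hands a good starting tour to the next, slightly rougher, stage.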

Search Space Smoothing: Summary (slide 17)
- Smoothing clearly outperforms two-opt and takes approximately the same amount of time
- Further experimentation with different "smoothing" functions has led to even better results
- Smoothing, like simulated annealing, can accept uphill moves
- Smoothing suggests a new way of classifying heuristics

Other Smoothing Functions (Coy et al., 2000) (slide 18)
- Exponential
- Logarithmic
- Hyperbolic
- Sigmoidal
- Concave
- Convex

Part II: Demon Algorithms (slide 19)
- Review previous work: simulated annealing (SA), the demon algorithm, and preliminary computational work
- Introduce three new demon algorithm variants
- Perform a computational study
- Present conclusions

Simulated Annealing (slide 20)
- Generate an initial tour and set T (temperature)
- Repeat until stopping condition:
  - Generate a new tour and calculate ΔE (change in energy)
  - If ΔE ≤ 0, accept new tour
  - Else, if rand(0,1) < exp(−ΔE/T), accept new tour
  - Else, reject new tour
  - Implement annealing schedule (T = a·T)
- The choice of T and a is essential
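The loop on this slide can be written as a short, self-contained Python sketch. The helper names, the 2-opt neighborhood, and the temperature floor are illustrative assumptions; the talk's implementation details may differ.

```python
import math
import random

def tour_length(tour, dist):
    """Length of the closed tour under distance matrix dist."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_neighbor(tour):
    """Reverse a random segment (a 2-opt move) and return the new tour."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def simulated_annealing(dist, t, a, iters=10000):
    """The slide's loop: always accept downhill moves, accept uphill moves
    with probability exp(-dE/T), and anneal T by T = a*T each iteration."""
    tour = list(range(len(dist)))
    random.shuffle(tour)
    cur_len = tour_length(tour, dist)
    best, best_len = tour[:], cur_len
    for _ in range(iters):
        cand = two_opt_neighbor(tour)
        d_e = tour_length(cand, dist) - cur_len  # change in energy
        if d_e <= 0 or random.random() < math.exp(-d_e / t):
            tour, cur_len = cand, cur_len + d_e
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t = max(t * a, 1e-12)  # annealing schedule, floored to avoid T = 0
    return best, best_len
```

As the slide notes, the behavior hinges on the choice of the initial T and the cooling rate a: a near 1 cools slowly and accepts many uphill moves early on.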

Demon Algorithms (slide 21)
- Wood and Downs developed several demon algorithms for solving the TSP
- In DA, the demon acts as a creditor
- The demon begins with credit D > 0
- Consider an arc exchange: if ΔE < D, accept the new tour and set D = D − ΔE
- Arc exchanges with ΔE < 0 build credit; arc exchanges with ΔE > 0 reduce credit
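The creditor analogy amounts to a one-line acceptance rule. A minimal Python sketch, applied to a stream of proposed cost changes (the function name and interface are illustrative, not Wood and Downs' code):

```python
def run_demon(deltas, credit):
    """Apply the demon acceptance rule to a stream of proposed changes in
    tour length (dE values from candidate arc exchanges).

    Accept when dE < D (the credit), then set D = D - dE: downhill moves
    (dE < 0) build credit, uphill moves (dE > 0) spend it.
    Returns the list of accepted dE values and the final credit.
    """
    accepted = []
    for d_e in deltas:
        if d_e < credit:
            credit -= d_e
            accepted.append(d_e)
    return accepted, credit
```

Unlike SA's probabilistic test, the rule is deterministic: an uphill move is accepted exactly when the demon can afford it.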

Demon Algorithms (continued) (slide 22)
- To encourage minimization, Wood and Downs propose two techniques:
  - Impose an upper bound on the demon value, restricting the demon value after energy-decreasing moves
  - Anneal the demon value
- Wood and Downs also propose a random component:
  - The demon value is a normal random variable centered around the demon mean value
  - All changes in tour length impact the demon mean value

Demon Algorithms (continued) (slide 23)
- This leads to four algorithms (Wood and Downs):
  - Bounded demon algorithm (BD)
  - Randomized bounded demon algorithm (RBD)
  - Annealed demon algorithm (AD)
  - Randomized annealed demon algorithm (RAD)

New Demon Algorithms (slide 24)
- Two new techniques come to mind (Pepper et al.):
  - Annealed bounded demon algorithm (ABD)
  - Randomized annealed bounded demon algorithm (RABD)
- The idea is to impose a bound on the demon value (or demon mean value) and anneal that bound in ABD and RABD
- For RAD and RABD, anneal both the bound on the demon mean and the standard deviation; this leads to two additional algorithms, ADH and ABDH
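A sketch of one ABD acceptance step under this reading of the slides: the basic demon rule plus a cap on the credit, with the cap annealed by the caller between iterations. The function name and the exact placement of the cap are assumptions, not Pepper et al.'s implementation.

```python
def abd_step(d_e, credit, bound):
    """One acceptance decision of an annealed-bounded demon sketch.

    Accept when dE < credit, then set credit = credit - dE, capped at
    `bound`. The caller anneals the bound (e.g. bound *= a) between
    iterations; in the randomized variant (RABD) the cap applies to the
    demon mean value instead. Returns (accepted, new_credit).
    """
    if d_e < credit:
        return True, min(credit - d_e, bound)
    return False, credit
```

The cap keeps a long run of downhill moves from banking so much credit that arbitrarily bad uphill moves are later accepted, while annealing the cap gradually forces pure descent.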

Computational Study (slide 25)
- Eleven algorithms in all
- We selected 29 instances from TSPLIB
- The instances range in size from 105 to 1,432 nodes
- The instances have different structures
- Each algorithm was applied 25 times to each instance from a randomized greedy start
- Best and average performance and running time statistics were gathered

Parameter Settings (slide 26)
- We selected three representative test instances
- For each algorithm, a GA determines a set of parameter values (parameter vector) that works well on these instances
- The resulting parameter vector is applied to all 29 instances

Preliminary Computational Results & Observations (slide 27)
- Simulated annealing was best overall
- RABD and ABD are nearly competitive with SA
- The intuition behind the hybrids makes sense, but parameter setting becomes more difficult
- The normal distribution can be replaced by "easier" distributions
- Smarter DA variants may exist

New Variant #1: Triangular Demon Algorithm (slide 28)
- Instead of sampling from a normal distribution, the demon value is sampled from a triangular p.d.f. on [0.5 DM, 1.5 DM] with mode DM (the demon mean value)

New Variant #2: Uniform Demon Algorithm (slide 29)
- Instead of sampling from a normal distribution, the demon value is sampled from a uniform p.d.f. on [0.5 DM, 1.5 DM]

New Variant #3: Annealed Uniform Demon Algorithm (slide 30)
- Instead of sampling from a normal distribution, the demon value is sampled from a uniform p.d.f. on [(1−f) DM, (1+f) DM]
- f is set to 0.5 initially and is annealed over time
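All three samplers are one-liners with Python's `random` module (here `d_m` denotes DM, the demon mean value; the function names are illustrative):

```python
import random

def triangular_demon(d_m):
    """Variant #1: triangular p.d.f. on [0.5*DM, 1.5*DM] with mode DM."""
    return random.triangular(0.5 * d_m, 1.5 * d_m, d_m)

def uniform_demon(d_m):
    """Variant #2: uniform p.d.f. on [0.5*DM, 1.5*DM]."""
    return random.uniform(0.5 * d_m, 1.5 * d_m)

def annealed_uniform_demon(d_m, f):
    """Variant #3: uniform p.d.f. on [(1-f)*DM, (1+f)*DM]; f starts at 0.5
    and is annealed toward 0 over time."""
    return random.uniform((1 - f) * d_m, (1 + f) * d_m)
```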

Advantages of New Variants (slide 31)
- Only two parameters need to be set (initial demon value and annealing schedule), the same as for SA
- The mean and standard deviation are annealed at the same time
- Sampling is easier in these three cases than sampling from a normal distribution

Experimental Design (slide 32)
- We selected 36 symmetric, Euclidean instances from TSPLIB
- The instances range in size from 105 to 1,432 nodes
- For each algorithm, parameters were set using a GA-based procedure on a small subset of the instances

Setting Parameters (slide 33)
- Single-stage genetic algorithm
- Fitness of parameter vector v is
  Fitness(v) = sqrt( (1/m) Σ_{i=1..m} [100 · (D(v,i) − B(i)) / B(i)]² )
  where m is the number of test problems in the subset, D(v,i) is the tour length generated by vector v on test problem i, and B(i) is the optimal solution to problem i
- The fitness is the root mean square of percent above optimal
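The fitness measure described verbally, the root mean square of percent above optimal, computes directly (a sketch; the argument names are illustrative):

```python
import math

def fitness(lengths, optima):
    """Root mean square of percent above optimal over the m test problems.

    lengths[i] is D(v, i), the tour length generated by parameter vector v
    on test problem i; optima[i] is B(i), the optimal value for problem i.
    """
    m = len(optima)
    pct = [100.0 * (d - b) / b for d, b in zip(lengths, optima)]
    return math.sqrt(sum(p * p for p in pct) / m)
```

Squaring before averaging penalizes a parameter vector that does very badly on one instance more than one that is mediocre everywhere.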

Experimental Design (continued) (slide 34)
- Starting tour: greedy heuristic or savings heuristic
- Tour improvement using 2-opt
- Termination condition: 50 iterations with no improvement after going below the initial tour length, or a maximum of 500 iterations

Experimental Design (continued) (slide 35)
- Each algorithm was run 25 times for each of the 36 instances
- Averaged results are presented
- All runs were carried out on a Sun Ultra 10 workstation on a Solaris 7 platform
- The six best algorithms are compared

Experimental Results
The results table compares six algorithms (Simulated Annealing, Annealed Bounded, Randomized Annealed Bounded, Uniform, Annealed Uniform, Triangular), reporting the average % above optimal, the standard deviation, and the running time (hours) under both a greedy starting tour and a savings starting tour. [Numeric entries not captured in the transcript.]

Part II: Conclusions (slide 37)
- With a greedy start, Annealed Uniform is best
- When the savings heuristic is used at the start:
  - Annealed Bounded is best
  - Triangular and Annealed Uniform also perform well and beat SA
- Demon algorithms are sensitive to the starting conditions
- Using the savings heuristic significantly reduces computation times
- Demon algorithms can be applied to other combinatorial optimization problems

Final Comments (slide 38)
- Smoothing and demon algorithms are widely applicable
- They are simple and elegant
- There are few parameters to tune
- They involve approximately 20 lines of code