Optimization with Meta-Heuristics


Optimization with Meta-Heuristics

Question: Can you ever prove that a solution generated using a meta-heuristic is optimal?
Answer: Yes. For a minimization problem, if the value of the solution equals a lower bound, the solution is optimal.

Question: If the solution of a meta-heuristic for a minimization problem does not equal the lower bound, does that mean the solution is not optimal?
Answer: Not necessarily; you just don't know.

Observation: Developing a good lower bound is just as important as developing a good meta-heuristic algorithm.

DOE for Meta-Heuristics

Question: With all the seeming randomness, and the choices of neighborhoods and algorithm parameters, how do you know whether you have developed a good approach?
Answer: Design of Experiments (DOE).

DOE for Meta-Heuristics

Recall the classic Johnson et al. simulated annealing algorithm:

1. Get an initial solution S.
2. Get an initial temperature T > 0.
3. While not yet frozen, do the following:
   3.1 Perform the following loop l times:
       3.1.1 Pick a random neighbor S' of S.
       3.1.2 Let D = cost(S') - cost(S).
       3.1.3 If D <= 0 (downhill move), set S = S'.
       3.1.4 If D > 0 (uphill move), set S = S' with probability e^(-D/T).
   3.2 Set T = rT (reduce the temperature).
4. Return S.

What are potential experimental parameters?
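The pseudocode above translates almost line for line into Python. This is a minimal sketch, not the authors' implementation: the `cost` and `random_neighbor` callables are placeholders the caller supplies, and the default parameter values are illustrative only.

```python
import math
import random

def simulated_annealing(initial, cost, random_neighbor,
                        T=100.0, r=0.95, l=50, T_min=0.01):
    """Johnson-style simulated annealing (minimization).

    initial         -- starting solution S
    cost            -- objective function to minimize
    random_neighbor -- returns a random neighbor S' of a solution
    T, r, l, T_min  -- start temperature, cooling rate,
                       iterations per temperature, freeze point
    """
    S = initial
    best = S
    while T > T_min:                      # "not yet frozen"
        for _ in range(l):                # l iterations at this temperature
            S_new = random_neighbor(S)
            D = cost(S_new) - cost(S)
            # Always accept downhill moves; accept uphill moves
            # with probability e^(-D/T).
            if D <= 0 or random.random() < math.exp(-D / T):
                S = S_new
                if cost(S) < cost(best):
                    best = S
        T *= r                            # reduce temperature
    return best

# Illustrative use: minimize f(x) = x^2 over the integers, starting at 40.
random.seed(0)
best = simulated_annealing(40, lambda x: x * x,
                           lambda x: x + random.choice([-1, 1]))
```

Tracking `best` separately from `S` is a common practical addition: the classic algorithm returns the final (frozen) solution, which uphill moves may have left slightly worse than the best solution visited.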

DOE for Meta-Heuristics

Design parameters for a simulated annealing algorithm include:
- Problem instances
- Cooling approach
  - Starting temperature
  - Number of iterations at each temperature
  - Temperature reduction rate
  - Termination condition
- Variations on Johnson's classic algorithm
  - Neighborhoods
  - Acceptance probability function
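One common way to organize factors like those above is a full-factorial design: every combination of every factor level becomes a treatment. The sketch below enumerates such a design with `itertools.product`; the factor names and levels are hypothetical and would come from pilot runs in practice.

```python
from itertools import product

# Hypothetical factor levels for a simulated annealing experiment.
factors = {
    "start_temp":     [10.0, 100.0, 1000.0],
    "cooling_rate":   [0.90, 0.95, 0.99],
    "iters_per_temp": [50, 500],
    "neighborhood":   ["swap", "insert"],
}

# Full-factorial design: one treatment per combination of levels.
design = [dict(zip(factors, levels))
          for levels in product(*factors.values())]

print(len(design))  # 3 * 3 * 2 * 2 = 36 treatment combinations
```

Each treatment would then be run on several problem instances (with replicates, since the algorithm is randomized) before comparing results.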

DOE for Meta-Heuristics

Obtaining problem instances:
- Benchmark problems
  - www.palette.ecn.purdue.edu/~uzsoy2/spssm.html
  - http://w.cba.neu.edu/~msolomon/problems.htm
  - many others
- Problem generator
  - How many problems?
  - Size of each problem
  - Problem characteristics (unique to different problem types)
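A problem generator complements benchmark sets because instance size and characteristics can be controlled directly. As an illustrative sketch (not a standard generator), the function below produces random single-machine scheduling instances, with the due-date range exposed as an explicit, controllable characteristic:

```python
import random

def generate_instance(n_jobs, p_max=100, due_date_range=0.5, seed=None):
    """Random single-machine instance (illustrative sketch).

    n_jobs         -- number of jobs
    p_max          -- processing times drawn uniformly from [1, p_max]
    due_date_range -- fraction of total processing time over which
                      due dates are spread; wider range -> looser dates
    seed           -- fixing the seed makes the instance reproducible
    """
    rng = random.Random(seed)
    p = [rng.randint(1, p_max) for _ in range(n_jobs)]
    total = sum(p)
    lo = int(total * (1 - due_date_range))
    d = [rng.randint(max(lo, 1), total) for _ in range(n_jobs)]
    return p, d

# Seeded generation: the same (n_jobs, seed) pair always yields
# the same instance, so experiments can be rerun exactly.
p, d = generate_instance(20, seed=42)
```

Reproducibility via seeds matters for DOE: every algorithm variant should be run on identical instances, not merely on instances drawn from the same distribution.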

www.palette.ecn.purdue.edu/~uzsoy2/spssm.html

Shop Scheduling Benchmark Problems
- General Information
- C Programs for Problem Generation
- Parameter Values for Problem Generation
- J//Cmax Problems
- J//Lmax Problems
- J/2SETS/Cmax Problems
- J/2SETS/Lmax Problems
- F//Cmax Problems
- F//Lmax Problems

DOE for Meta-Heuristics

How to report results:
- Evaluate against a reference value (e.g., solution value minus lower bound)
- Compare solution quality versus run time
- Compare across a problem generation parameter (e.g., due date range)
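The first point can be made concrete with the relative optimality gap: raw objective values are not comparable across instances, but the percentage gap to a lower bound is. A minimal sketch:

```python
def optimality_gap(solution_value, lower_bound):
    """Relative gap to the lower bound, as a percentage.

    A gap of 0% proves the solution is optimal; a positive gap only
    bounds how far from optimal the solution could be -- the true
    optimum lies somewhere between the lower bound and the solution.
    """
    return 100.0 * (solution_value - lower_bound) / lower_bound

# Example using instance r_20_15_1_1_3 from the benchmark results:
# LB = 1182, best VFSA value after 120 minutes = 1258.
gap = optimality_gap(1258, 1182)
```

For that instance the gap is about 6.4%, so the VFSA solution is provably within 6.4% of optimal even though its exact distance from the optimum is unknown.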

DOE for Meta-Heuristics

How to report results: compare across different-sized problem instances.

DOE for Meta-Heuristics

How to report results: comparison on benchmark problems.
VFSA parameters: K = 4, S = 1.5, B = 0.8.

| Problem Name    | LB   | Prev. UB (Uzsoy) | Best UB (Balas) | Balas CPU (s) | 5 min | 30 min | 60 min | 120 min | Best VFSA |
|-----------------|------|------------------|-----------------|---------------|-------|--------|--------|---------|-----------|
| r_20_15_1_1_2   | 1140 | 1464             | 1263            | 175           | 1299  | 1275   | 1244   |         |           |
| r_20_15_1_1_3   | 1182 | 1501             | 1304            | 98            | 1271  | 1268   | 1266   | 1258    |           |
| r_20_15_1_1_4   | 1160 | 1492             | 1396            | 127           | 1367  | 1332   | 1326   |         |           |
| r_20_15_1_1_6   | 1027 | 1448             |                 | 145           | 1229  | 1222   | 1208   |         |           |
| r_20_15_1_1_8   | 1127 | 1552             | 1459            | 135           | 1449  | 1417   | 1388   |         |           |
| r_20_15_1_2_1   | 1721 | 2090             | 1817            | 85            | 1818  |        |        |         |           |
| r_20_15_1_2_10  | 1775 | 2092             | 1873            | 39            |       |        |        |         |           |
| r_20_15_1_2_5   | 1925 | 2181             | 1949            | 37            | 1930  |        |        |         |           |
| r_20_15_1_2_8   | 1599 | 1785             | 1636            | 15            |       |        |        |         |           |
| r_20_15_1_2_9   | 1956 | 2246             | 2020            |               |       |        |        |         |           |
| r_20_15_2_1_1   |      | 2165             | 2000            | 11            | 2014  | 1972   | 1970   | 1963    | 1960      |
| r_20_15_2_1_3   | 1727 | 2100             | 1976            | 131           | 1971  | 1918   | 1901   | 1900    | 1898      |
| r_20_15_2_1_5   | 1521 | 1839             | 1726            | 121           | 1697  | 1690   | 1684   | 1683    |           |
| r_20_15_2_1_7   | 1575 | 1957             | 1908            | 118           | 1846  | 1830   | 1824   |         |           |
| r_20_15_2_1_9   | 1858 | 2143             | 1968            | 110           | 1955  | 1929   | 1914   | 1913    |           |


DOE for Meta-Heuristics

In-class assignment: Develop a DOE for the traveling salesman problem.
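As one possible starting point for the assignment (an illustrative sketch, not a prescribed solution): a TSP experiment needs at minimum an instance generator, a tour cost function, and a neighborhood move. The version below assumes random Euclidean instances and the classic 2-opt move, leaving the factor levels to your experimental design.

```python
import math
import random

def random_tsp(n, size=100.0, seed=None):
    """n random city coordinates in a size x size square."""
    rng = random.Random(seed)
    return [(rng.uniform(0, size), rng.uniform(0, size))
            for _ in range(n)]

def tour_length(tour, pts):
    """Total length of the closed tour (returns to the start city)."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt_neighbor(tour, rng=random):
    """Reverse a random segment of the tour (a classic 2-opt move)."""
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
```

These three pieces plug directly into a generic simulated annealing routine, and factors such as instance size, starting temperature, cooling rate, and choice of neighborhood become the columns of your design matrix.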