GRASP: A Sampling Meta-Heuristic


GRASP: A Sampling Meta-Heuristic
Topics:
- What is GRASP
- The procedure
- Applications
- Merits

What is GRASP
GRASP: Greedy Randomized Adaptive Search Procedure.
Random construction (e.g. TSP: randomly select the next city to add): high solution variance, low solution quality.
Greedy construction (e.g. TSP: select the nearest city to add): high solution quality, low solution variance.
GRASP tries to combine the advantages of random and greedy solution construction.

The Knapsack Example
Knapsack problem: a backpack with 8 units of space and 4 items to pick.
Item values (dollars): 2, 5, 7, 9
Item costs (space units): 1, 3, 5, 7
Construction heuristics:
1. Pick the most valuable item
2. Pick the most valuable item per unit of space
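The two construction heuristics can be sketched as follows; `greedy_knapsack` and its `key` arguments are illustrative names, not from the slides, and items are 0-indexed in the code.

```python
def greedy_knapsack(values, costs, capacity, key):
    """Repeatedly add the best remaining item (ranked by `key`) that fits."""
    chosen, space = [], capacity
    remaining = set(range(len(values)))
    while True:
        fitting = [i for i in remaining if costs[i] <= space]
        if not fitting:
            return sorted(chosen)
        best = max(fitting, key=key)
        chosen.append(best)
        space -= costs[best]
        remaining.remove(best)

# The slide's instance: 8 units of space, items valued 2, 5, 7, 9
# costing 1, 3, 5, 7 space units.
values, costs, capacity = [2, 5, 7, 9], [1, 3, 5, 7], 8

# Heuristic 1: pick the most valuable item first.
h1 = greedy_knapsack(values, costs, capacity, key=lambda i: values[i])
# Heuristic 2: pick the most valuable item per unit of space first.
h2 = greedy_knapsack(values, costs, capacity, key=lambda i: values[i] / costs[i])
```

Heuristic 1 packs item 4 then item 1 (0-indexed: 3 then 0); heuristic 2 packs items 1 and 2 (0-indexed: 0 and 1) and then runs out of space.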

Solution Quality
For heuristic 1 (most valuable item): items {1, 4}, value 11.
For heuristic 2 (most valuable per unit): items {1, 2}, value 7.
Optimal solution: items {2, 3}, value 12.
Neither heuristic finds the optimal solution, and this can happen with any heuristic: unless P = NP, there is no polynomial-time exact algorithm for an NP-hard problem.

Semi-Greedy Heuristics
At each step, add a solution component that is not necessarily the highest rated. Repeat the following until a full solution is constructed:
- Put highly rated (not only the highest rated) solution components into a restricted candidate list (RCL)
- Choose one element of the RCL at random and add it to the partial solution
Adaptive element: the greedy function depends on the partial solution constructed so far.

Mechanism of the RCL
Size of the restricted candidate list:
1) If the RCL is made very large, the semi-greedy heuristic turns into a pure random heuristic.
2) If the RCL has size 1, the semi-greedy heuristic turns into the pure greedy heuristic.
Typically the size is set between 3 and 5.
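A minimal sketch of a semi-greedy construction for the knapsack instance above; the function name and the value-per-unit greedy measure are assumptions for illustration. Note how `rcl_size = 1` recovers the pure greedy heuristic and a very large `rcl_size` recovers pure random construction.

```python
import random

def semi_greedy_knapsack(values, costs, capacity, rcl_size, rng):
    """Build a solution by repeatedly choosing at random from the RCL:
    the rcl_size best-ranked items (by value per unit) that still fit."""
    chosen, space = [], capacity
    remaining = set(range(len(values)))
    while True:
        fitting = [i for i in remaining if costs[i] <= space]
        if not fitting:
            return sorted(chosen)
        # Adaptive greedy ranking over the items that still fit.
        ranked = sorted(fitting, key=lambda i: values[i] / costs[i], reverse=True)
        pick = rng.choice(ranked[:rcl_size])  # random choice within the RCL
        chosen.append(pick)
        space -= costs[pick]
        remaining.remove(pick)

values, costs, capacity = [2, 5, 7, 9], [1, 3, 5, 7], 8
# With an RCL of size 1 this is exactly the greedy per-unit heuristic.
greedy_result = semi_greedy_knapsack(values, costs, capacity, 1, random.Random(0))
```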

GRASP
While the stopping criterion is unsatisfied, do the following:
Phase I: construct the current solution according to a greedy myopic measure of goodness (GMMOG), with random selection from a restricted candidate list.
Phase II: use a local search improvement heuristic to obtain a better solution.
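The two-phase loop can be sketched end-to-end on the knapsack instance; the construction mirrors the semi-greedy idea and the local search is a simple 1-exchange improver. All names below are illustrative, not from the slides.

```python
import random

values, costs, capacity = [2, 5, 7, 9], [1, 3, 5, 7], 8

def construct(rng, rcl_size=2):
    """Phase I: semi-greedy construction with a restricted candidate list."""
    chosen, space = [], capacity
    remaining = set(range(len(values)))
    while True:
        fitting = [i for i in remaining if costs[i] <= space]
        if not fitting:
            return chosen
        ranked = sorted(fitting, key=lambda i: values[i] / costs[i], reverse=True)
        pick = rng.choice(ranked[:rcl_size])
        chosen.append(pick)
        space -= costs[pick]
        remaining.remove(pick)

def local_search(solution):
    """Phase II: swap one chosen item for one unchosen item while it helps."""
    solution = list(solution)
    improved = True
    while improved:
        improved = False
        space = capacity - sum(costs[i] for i in solution)
        for pos, i in enumerate(solution):
            for j in set(range(len(values))) - set(solution):
                if costs[j] - costs[i] <= space and values[j] > values[i]:
                    solution[pos] = j
                    improved = True
                    break
            if improved:
                break
    return solution

def grasp(iterations, rng):
    """Repeat construction + local search, keeping the best solution found."""
    best, best_value = None, -1
    for _ in range(iterations):
        s = local_search(construct(rng))
        value = sum(values[i] for i in s)
        if value > best_value:
            best, best_value = s, value
    return sorted(best), best_value
```

Over repeated iterations the randomized construction samples different starting points, and the local search phase lifts most of them to the optimum {2, 3} (0-indexed: [1, 2]) with value 12, which a single greedy run misses.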

GRASP
GRASP is a combination of a semi-greedy heuristic with a local search procedure.
Local search from random constructions: the best solution found is often better than the greedy one (for problems that are not too large), but the average solution quality is worse than the greedy heuristic's, and the variance is high.
Local search from the greedy construction: average solution quality better than random, but low (or no) variance, so restarts add little.

The Knapsack Example
Knapsack problem: a backpack with 8 units of space and 4 items to pick.
Item values (dollars): 2, 5, 7, 9
Item costs (space units): 1, 3, 5, 7
Two greedy functions:
1. Pick the most valuable item
2. Pick the most valuable item per unit of space

GRASP
The most valuable item, with |RCL| = 2: items 4 and 3 (values 9 and 7) are in the RCL; flip a coin and select ….
The most valuable per unit, with |RCL| = 2: items 1 and 2 are in the RCL, with ratios 2/1 = 2 and 5/3 ≈ 1.7.

GRASP Extensions
Merits:
- Fast, high-quality solutions, suited to time-critical decisions
- Few parameters to tune
Extensions:
- Reactive GRASP (adapting the RCL size)
- Use of elite solutions found: long-term memory, path relinking

Literature
T.A. Feo and M.G.C. Resende, "A Probabilistic Heuristic for a Computationally Difficult Set Covering Problem," Operations Research Letters, 8:67-71, 1989.
P. Festa and M.G.C. Resende, "GRASP: An Annotated Bibliography," in P. Hansen and C.C. Ribeiro, editors, Essays and Surveys on Metaheuristics, Kluwer Academic Publishers, 2001.
M.G.C. Resende and C.C. Ribeiro, "Greedy Randomized Adaptive Search Procedures," in Handbook of Metaheuristics, F. Glover and G. Kochenberger, editors, Kluwer Academic Publishers, 219-249, 2002.

Neighbourhood
For each solution S ∈ 𝒮, a neighbourhood N(S) ⊆ 𝒮 is defined.
Each T ∈ N(S) is in some sense "close" to S.
The neighbourhood is defined in terms of some operation, very like the "action" in search.

Neighbourhood
Exchange neighbourhood: exchange k things in a sequence or partition.
Examples:
Knapsack problem: exchange k1 things inside the bag with k2 things outside it (for k1, k2 ∈ {0, 1, 2, 3}).
Matching problem: exchange one marriage for another.
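The knapsack exchange neighbourhood can be sketched as a generator of feasible (k1, k2)-exchanges; `exchange_neighbours` is an illustrative name.

```python
from itertools import combinations

def exchange_neighbours(inside, outside, costs, capacity, k1, k2):
    """Yield every feasible neighbour obtained by taking k1 items out of the
    bag and putting k2 currently-unpacked items in (a (k1, k2)-exchange)."""
    for removed in combinations(inside, k1):
        for inserted in combinations(outside, k2):
            neighbour = (set(inside) - set(removed)) | set(inserted)
            if sum(costs[i] for i in neighbour) <= capacity:
                yield tuple(sorted(neighbour))

# Example: the bag holds items 0 and 1; list all feasible (1, 1)-exchanges.
costs, capacity = [1, 3, 5, 7], 8
moves = list(exchange_neighbours([0, 1], [2, 3], costs, capacity, 1, 1))
```

The exchange {1, 3} is generated but filtered out because its cost (3 + 7) exceeds the capacity; the feasible neighbours are {1, 2}, {0, 2}, and {0, 3}.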

2-opt Exchange

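The 2-opt slides (figures omitted in this transcript) show a move that removes two edges of a tour and reconnects it by reversing the segment between them. A minimal sketch, with illustrative names:

```python
import math

def two_opt_once(tour, dist):
    """Return the first improving 2-opt neighbour of `tour` (reverse one
    segment so edges (a,b),(c,d) become (a,c),(b,d)), or None if none exists."""
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # this move would just reverse the whole tour
            a, b = tour[i], tour[i + 1]
            c, d = tour[j], tour[(j + 1) % n]
            if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                return tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
    return None

# Four cities at the corners of a unit square; the tour 0-1-2-3 crosses itself,
# and one 2-opt move uncrosses it.
pts = [(0, 0), (1, 1), (1, 0), (0, 1)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
improved = two_opt_once([0, 1, 2, 3], dist)
```

Repeating the move until it returns `None` is the classic 2-opt local search for the TSP.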

3-opt Exchange
Select three arcs and replace them with three others; two orientations of the reconnected segments are possible.


Neighbourhood
Strongly connected: any solution can be reached from any other (e.g. 2-opt).
Weakly optimally connected: the optimum can be reached from any starting solution.

Neighbourhood
Hard constraints create impenetrable mountain ranges in the solution landscape; soft constraints allow passes through the mountains.
E.g. map colouring (k-colouring):
Colour a map (graph) so that no two adjacent countries (nodes) have the same colour.
Use at most k colours.
Minimize the number of colours.

Map Colouring
[Figure: a starting solution and two optimal solutions]
Define the neighbourhood as: change the colour of at most one vertex.
Make the k-colour constraint soft…
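The soft-constraint view can be sketched as: count the conflicting edges as the cost, and explore the "change one vertex's colour" neighbourhood. The function names below are illustrative.

```python
def conflicts(edges, colouring):
    """Soft-constraint cost: number of edges whose endpoints share a colour."""
    return sum(1 for u, v in edges if colouring[u] == colouring[v])

def recolour_neighbours(colouring, k):
    """Neighbourhood: every colouring that changes the colour of one vertex."""
    for v, c in enumerate(colouring):
        for new_c in range(k):
            if new_c != c:
                yield colouring[:v] + [new_c] + colouring[v + 1:]

# A triangle needs 3 colours; start from a colouring with one conflict and
# check that some neighbour repairs it.
edges = [(0, 1), (1, 2), (0, 2)]
start = [0, 0, 1]
best = min(recolour_neighbours(start, 3), key=lambda c: conflicts(edges, c))
```

Because infeasible colourings merely cost more instead of being forbidden, local search can pass through them on the way to a feasible optimum.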

Variable Neighbourhood Search
Large neighbourhoods are expensive to search; small neighbourhoods are less effective.
Only search the larger neighbourhood when the smaller one is exhausted.

Variable Neighbourhood Search
m neighbourhoods Nk, with |N1| < |N2| < … < |Nm|
Find an initial solution S; best = z(S); k = 1
Repeat:
  Search Nk(S) to find the best solution T
  If z(T) < z(S): S = T; k = 1
  Else: k = k + 1
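The pseudocode above (a variable neighbourhood descent) can be sketched generically; `best_in` is an assumed callback that returns the best solution in neighbourhood Nk of S.

```python
def vnd(s, neighbourhood_count, best_in, z):
    """Variable neighbourhood descent: search neighbourhood k of s; on an
    improvement go back to k = 1, otherwise move on to the next, larger one."""
    k = 1
    while k <= neighbourhood_count:
        t = best_in(s, k)
        if t is not None and z(t) < z(s):
            s, k = t, 1   # improvement: restart from the smallest neighbourhood
        else:
            k += 1        # neighbourhood exhausted: widen the search
    return s

# Toy use: minimise z(x) = (x - 7)^2 over the integers, where neighbourhood
# k contains the two points at distance steps[k] from x.
steps = {1: 1, 2: 4}
z = lambda x: (x - 7) ** 2
best_in = lambda x, k: min((x - steps[k], x + steps[k]), key=z)
result = vnd(0, 2, best_in, z)
```

The loop terminates only when every neighbourhood has been searched without improvement, so the result is a local optimum with respect to all m neighbourhoods at once.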

VNS does not follow a trajectory through the solution space, unlike SA and tabu search.