Course: Logic Programming and Constraints


Appendix B: 1. Tabu Search for Graph Coloring  2. Variable Neighborhood Search

1. Tabu search for graph coloring

Definitions and notations

Given a graph G = (V, E) with vertex set V and edge set E, and given an integer k, a k-coloring of G is a function c : V → {1, 2, …, k}. The value c(x) of a vertex x is called the color of x. The vertices with color i (1 ≤ i ≤ k) define a color class, denoted Vi. If two adjacent vertices x and y have the same color i, then the vertices x and y, the edge [x, y] and the color i are said to be conflicting. A k-coloring without conflicting edges is said to be legal, and its color classes are called stable sets. The Graph Coloring Problem (GCP) is to determine the smallest integer k, called the chromatic number of G, such that there exists a legal k-coloring of G.

How to solve the GCP

Given a fixed integer k, the problem of determining whether there exists a legal k-coloring of G is called the k-GCP. An algorithm that solves the k-GCP can be used to solve the GCP.

Scheme:
1. Use a local search algorithm to find a legal coloring c.
2. Set k* equal to the number of colors used in c, and set k := k* - 1.
3. Solve the k-GCP; if a legal k-coloring is found, go to 2; otherwise return k*.
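The scheme above can be sketched in Python. The function names `solve_k_gcp` and `initial_coloring` are assumptions for illustration: any k-GCP solver (such as Tabucol, described below) and any legal starting coloring (e.g. from a greedy heuristic) can be plugged in.

```python
def solve_gcp(graph, solve_k_gcp, initial_coloring):
    """Shrink k until the k-GCP solver fails, following the scheme above.

    `solve_k_gcp(graph, k)` is an assumed black box that returns a legal
    k-coloring (dict: vertex -> color) or None if it finds none;
    `initial_coloring` is any legal coloring of the graph.
    """
    best = initial_coloring
    k_star = len(set(best.values()))      # number of colors actually used
    while True:
        c = solve_k_gcp(graph, k_star - 1)
        if c is None:                     # no legal (k* - 1)-coloring found
            return k_star, best
        best = c                          # fewer colors: update k* and retry
        k_star = len(set(best.values()))
```

Note that when `solve_k_gcp` is a heuristic, failure to find a (k*-1)-coloring only makes k* an upper bound on the chromatic number, not a proof of optimality.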

The Tabucol algorithm

Tabucol was introduced in 1987 by Hertz and de Werra. It is a tabu search algorithm for the k-GCP. It first generates an initial random k-coloring, which typically contains a large number of conflicting edges. The algorithm then iteratively modifies the color of a single vertex, the objective being to decrease the number of conflicting edges until a legal k-coloring is obtained. A tabu list is used in order to escape from local optima and to avoid short-term cycling.

The search space S is the set of all k-colorings of a given graph G. A solution c ∈ S is a partition of the vertex set into k subsets V1, …, Vk. The evaluation function f measures the number of conflicting edges: for a solution c = (V1, …, Vk) in S, f(c) = Σ_{i=1..k} |Ei|, where Ei denotes the set of edges with both endpoints in Vi. The goal of Tabucol is to determine a k-coloring c such that f(c) = 0. An elementary transformation, called a 1-move, consists of changing the color of a single vertex.
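As a minimal sketch, the evaluation function f and the conflicting (critical) vertices used below can be computed directly from the edge list; the representation of a coloring as a dict from vertex to color is an assumption of this sketch.

```python
def conflicts(edges, c):
    """f(c): the number of conflicting edges under coloring c (dict: vertex -> color)."""
    return sum(1 for u, v in edges if c[u] == c[v])

def critical_vertices(edges, c):
    """The critical vertices: those incident to at least one conflicting edge."""
    return {x for u, v in edges if c[u] == c[v] for x in (u, v)}
```

For a triangle colored with only two colors, exactly one edge is conflicting and its two endpoints are critical.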

For a vertex v and a color i ≠ c(v), we denote by (v, i) the 1-move that assigns color i to v; the solution resulting from this 1-move is denoted c ⊕ (v, i). The k-coloring c' = c ⊕ (v, i) can be described as follows:
c'(v) = i
c'(w) = c(w) for all w ∈ V \ {v}
The neighborhood N(c) of a solution c ∈ S is defined as the set of k-colorings that can be reached from c by applying a single 1-move; N(c) contains |V|(k - 1) solutions. The performance of a 1-move (v, i) on a solution c is measured by δ(v, i) = f(c ⊕ (v, i)) - f(c). Tabucol involves the notion of critical vertices (i.e. vertices involved in a conflicting edge); F(c) denotes the number of critical vertices in c. A 1-move (v, i) that involves a critical vertex is said to be critical. Tabucol performs only critical 1-moves.

When the 1-move (v, i) is applied on c, then (v, c(v)) becomes a tabu 1-move for L + λ·F(c) iterations. The duration of the tabu status of (v, c(v)) thus depends on the number of critical vertices in c and on two parameters, L and λ. A 1-move (v, i) is a candidate if it is both critical and not tabu, or if f(c ⊕ (v, i)) = 0; the latter condition is a very elementary aspiration criterion. At each iteration, Tabucol performs the best candidate 1-move. The algorithm stops as soon as f(c) = 0 and returns c, which is then a legal k-coloring. Note: the size of the tabu list increases with the number of critical vertices (typically L is chosen in [0, 9] and λ = 0.6).

Algorithm Tabucol

Input: a graph G = (V, E) and an integer k > 0.
Parameters: MaxIter, L and λ.
Output: solution c*.

Build a random solution c;
Set c* := c and iter := 0;
Set the tabu list to the empty list;
Repeat until f(c) = 0 or iter = MaxIter:
    Set iter := iter + 1;
    Choose a candidate 1-move (v, i) with minimum value δ(v, i);
    Introduce the move (v, c(v)) into the tabu list for L + λ·F(c) iterations;
    Set c := c ⊕ (v, i);
    if f(c) < f(c*) then set c* := c;
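The pseudocode above can be turned into a minimal, self-contained Python sketch. Two simplifications are assumptions of this sketch, not part of the original algorithm: L is a fixed parameter rather than being drawn from [0, 9] at each iteration, and the incumbent c* is not tracked separately since only a legal coloring is returned.

```python
import random

def tabucol(vertices, edges, k, max_iter=10000, L=9, lam=0.6):
    """Sketch of Tabucol: returns a legal k-coloring (dict: vertex -> color) or None."""
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    c = {v: random.randint(1, k) for v in vertices}   # random initial k-coloring

    def f(col):                                       # number of conflicting edges
        return sum(1 for u, v in edges if col[u] == col[v])

    tabu = {}   # (vertex, color) -> last iteration at which the move is still tabu
    for it in range(max_iter):
        fc = f(c)
        if fc == 0:
            return c                                  # legal k-coloring found
        crit = {x for u, v in edges if c[u] == c[v] for x in (u, v)}
        F = len(crit)                                 # number of critical vertices
        best_move, best_delta = None, None
        for v in crit:                                # only critical 1-moves
            same = sum(1 for w in adj[v] if c[w] == c[v])
            for i in range(1, k + 1):
                if i == c[v]:
                    continue
                delta = sum(1 for w in adj[v] if c[w] == i) - same
                aspire = fc + delta == 0              # aspiration: move yields a legal coloring
                if (tabu.get((v, i), -1) < it or aspire) and \
                        (best_delta is None or delta < best_delta):
                    best_move, best_delta = (v, i), delta
        if best_move is None:
            continue                                  # every candidate move is tabu
        v, i = best_move
        tabu[(v, c[v])] = it + int(L + lam * F)       # forbid reverting this move
        c[v] = i
    return None                                       # no legal coloring within MaxIter
```

On a small bipartite graph such as a path, `tabucol(vertices, edges, 2)` quickly returns a proper 2-coloring.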

2. Variable Neighborhood Search

Main idea of VNS

Variable neighborhood search (VNS) is a metaheuristic proposed by Mladenović and Hansen in 1997. It is based on a simple principle: a systematic change of neighborhood within the search. Many extensions have been made to allow solving large problem instances, and the method has found many applications.

Basic scheme

Let Nk (k = 1, 2, …, kmax) be a finite set of pre-selected neighborhood structures, with Nk(x) the set of solutions in the k-th neighborhood of x. The neighborhoods Nk may be induced from one or more metric (or quasi-metric) functions introduced into the solution space S. An optimal solution xopt is a feasible solution at which the minimum of the cost function f is reached. We call x' ∈ X (the feasible set) a local minimum of f w.r.t. Nk if there is no solution x ∈ Nk(x') ∩ X such that f(x) < f(x').

VNS is based on three simple facts:
Fact 1: A local minimum w.r.t. one neighborhood structure is not necessarily a local minimum w.r.t. another.
Fact 2: A global minimum is a local minimum w.r.t. all possible neighborhood structures.
Fact 3: For many problems, local minima w.r.t. one or several Nk are relatively close to each other.

Basic scheme (cont.)

In order to find a local minimum using several neighborhoods, facts 1-3 can be exploited in three different ways: (i) deterministic, (ii) stochastic, and (iii) both deterministic and stochastic. The Variable Neighborhood Descent (VND) method is obtained if the change of neighborhood is performed in a deterministic way. The Reduced VNS (RVNS) method is obtained if random points are selected from Nk(x) without being followed by descent.

Steps of the basic Variable Neighborhood Descent

Initialization. Select the set of neighborhood structures Nk, for k = 1, 2, …, kmax, that will be used in the descent; find an initial solution x.
Repeat the following sequence until no improvement is obtained:
(1) Set k := 1;
(2) Repeat the following steps until k = kmax:
    (a) Exploration of neighborhood: find the best neighbor x' of x (x' ∈ Nk(x));
    (b) Move or not: if the solution x' thus obtained is better than x, set x ← x' and k ← 1; otherwise, set k ← k + 1;
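The VND steps above can be sketched in Python. The representation of neighborhoods as a list of functions, each yielding the solutions of one Nk, is an assumption of this sketch.

```python
def vnd(x, neighborhoods, f):
    """Variable Neighborhood Descent sketch following the steps above.

    `neighborhoods` is a list of functions; neighborhoods[k](x) yields the
    solutions in the (k+1)-th neighborhood of x.  Returns a solution that
    is a local minimum w.r.t. every neighborhood in the list (fact 2).
    """
    improved = True
    while improved:                      # repeat until no improvement
        improved = False
        k = 0
        while k < len(neighborhoods):
            # (a) Exploration: best neighbor of x in the current neighborhood
            best = min(neighborhoods[k](x), key=f, default=x)
            if f(best) < f(x):           # (b) Move: restart from the first neighborhood
                x, k, improved = best, 0, True
            else:                        # otherwise switch to the next neighborhood
                k += 1
    return x
```

For example, minimizing f(x) = x² over the integers with N1(x) = {x-1, x+1} and N2(x) = {x-3, x+3} descends from any start to 0.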

Reduced VNS The Reduced VNS (RVNS) method is obtained if random points are selected from Nk(x), without being followed by descent. RVNS is useful for very large instances for which local search is costly. The best value for the parameter kmax is often 2.

Reduced VNS

Initialization. Select the set of neighborhood structures Nk, for k = 1, 2, …, kmax, that will be used in the search; find an initial solution x; choose a stopping condition.
Repeat the following sequence until the stopping condition is met:
(1) Set k := 1;
(2) Repeat the following steps until k = kmax:
    (a) Shaking: generate a point x' at random from the k-th neighborhood of x (x' ∈ Nk(x));
    (b) Move or not: if this point is better than the incumbent, move there (x ← x') and continue the search with N1 (k ← 1); otherwise, set k ← k + 1;

The stopping condition may be:
- maximum CPU time allowed;
- maximum number of iterations;
- maximum number of iterations between two improvements.
The point x' is generated at random in step (2a) in order to avoid cycling, which might occur if a deterministic rule were used. The basic VNS method combines deterministic and stochastic changes of neighborhood. In the basic VNS, the local search step (2b) may be replaced by VND; using VNS/VND has led to many successful applications.

Steps of the basic VNS

Initialization. Select the set of neighborhood structures Nk, for k = 1, 2, …, kmax, that will be used in the search; find an initial solution x; choose a stopping condition.
Repeat the following sequence until the stopping condition is met:
(1) Set k := 1;
(2) Repeat the following steps until k = kmax:
    (a) Shaking: generate a point x' at random from the k-th neighborhood of x (x' ∈ Nk(x));
    (b) Local search: apply some local search method with x' as initial solution; denote by x'' the local optimum so obtained;
    (c) Move or not: if this local optimum is better than the incumbent, move there (x ← x'') and continue the search with N1 (k ← 1); otherwise, set k ← k + 1;
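The basic VNS steps above can be sketched as follows. The `shake` and `local_search` callables and the use of an iteration budget as the stopping condition are assumptions of this sketch; in practice `local_search` could itself be a VND.

```python
import random

def basic_vns(x, shake, local_search, f, k_max, max_iter=100):
    """Basic VNS sketch combining shaking and local search.

    `shake(x, k)` returns a random point in the k-th neighborhood of x;
    `local_search(x)` returns a local optimum starting from x;
    `max_iter` bounds the number of outer iterations (stopping condition).
    """
    for _ in range(max_iter):
        k = 1
        while k <= k_max:
            x1 = shake(x, k)              # (a) Shaking
            x2 = local_search(x1)         # (b) Local search
            if f(x2) < f(x):              # (c) Move or not
                x, k = x2, 1              # recenter and restart with N1
            else:
                k += 1
    return x
```

As a toy usage example, minimizing f(x) = (x - 3)² over the integers with shaking by a random step of size at most k and a simple unit-step descent as local search reaches the global minimum x = 3.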

C. Avanthay, A. Hertz and N. Zufferey, A variable neighborhood search for graph coloring, European Journal of Operational Research 151, pp. 379-388, 2003. They designed 12 different large neighborhoods. The reported experiments show that the method is more efficient than Tabucol alone. http://mat.gsia.cmu.edu/COLOR2