EMIS 8373: Integer Programming
Primal Heuristics: Greedy Solutions and Local Search
Updated 8 February 2005

Combinatorial Optimization Problems
Input:
- A finite set N = {1, 2, …, n}
- Weights (costs) c_j for all j ∈ N
- Cost(S ⊆ N) = Σ_{j ∈ S} c_j
- A set F of feasible subsets of N
Optimization problem: find a minimum-weight feasible subset S ∈ F.

Generic Greedy Heuristic for COP
Start with an "empty" solution:
1. Let S_0 = ∅ and t = 1.
2. Choose j ∈ N \ S_{t-1} such that the cost of the resulting solution S_{t-1} ∪ {j} is minimized.
3. Let S_t = S_{t-1} ∪ {j}.
4. If S_{t-1} is feasible and Cost(S_{t-1}) ≤ Cost(S_t), stop and return S_{t-1}.
5. If t = n, stop. Otherwise, let t = t + 1 and go to step 2.
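This loop is easy to state in code. A minimal Python sketch, assuming the caller supplies a `cost` function and an `is_feasible` test on subsets (both names are placeholders, not from the slides):

```python
def greedy_cop(n, cost, is_feasible):
    """Generic greedy heuristic for a minimization COP over N = {1, ..., n}.

    cost(S) returns the weight of subset S; is_feasible(S) tests membership in F.
    Implements the slide's rule: once the current set is feasible and the best
    single addition no longer improves it, return the current set.
    """
    remaining = set(range(1, n + 1))   # N \ S_{t-1}
    current = frozenset()              # S_0 = empty set
    while remaining:
        # Step 2: element whose addition yields the cheapest new subset.
        j = min(remaining, key=lambda e: cost(current | {e}))
        candidate = current | {j}      # S_t = S_{t-1} union {j}
        # Step 4: stop if S_{t-1} is feasible and adding j does not improve it.
        if is_feasible(current) and cost(current) <= cost(candidate):
            return current
        current = candidate
        remaining.discard(j)
    return current                     # t = n: every element has been added
```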

A Greedy Heuristic for UFL
UFL can be modeled as a COP where N is the set of depots, and we assume that for any given set of open depots S ⊆ N, each client's demand is satisfied entirely by its nearest open depot.
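Under this assumption, Cost(S) is the opening cost of the depots in S plus each client's cost of being served from its nearest open depot. A sketch, with hypothetical inputs `fixed[j]` (opening cost of depot j) and `serve[j][i]` (cost of serving client i from depot j):

```python
def ufl_cost(S, fixed, serve, clients):
    """Cost of the depot set S: opening costs plus nearest-open-depot service costs."""
    if not S:
        return float("inf")  # no depot open: no client can be served, treat as infeasible
    opening = sum(fixed[j] for j in S)
    service = sum(min(serve[j][i] for j in S) for i in clients)
    return opening + service
```

Plugging this in as the `cost` argument of `greedy_cop`, with `is_feasible(S)` true whenever S is nonempty, reproduces the iterations traced on the next slides.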

A Greedy Heuristic for UFL: Initialization and First Iteration
S_0 = ∅, which is infeasible.
Cost({1}) = 10 + 1 + 2 + 3 + 5 + 6 + 8 = 35
Cost({2}) = 11 + 9 + 5 + 4 + 3 + 2 + 1 = 35
Cost({3}) = 34
Cost({4}) = 50
Choose j = 3: S_1 = {3}, Cost(S_1) = 34.

A Greedy Heuristic for UFL: Iteration 2
Cost({1, 3}) = 10 + 16 + 11 = 37
[Figure: clients c1–c6 assigned to the nearest of depots 1 and 3, with the corresponding service costs]

A Greedy Heuristic for UFL: Iteration 2
Cost({2, 3}) = 16 + 11 + 16 = 43
[Figure: clients c1–c6 assigned to the nearest of depots 2 and 3]

A Greedy Heuristic for UFL: Iteration 2
Cost({3, 4}) = 16 + 10 + 15 = 41
[Figure: clients c1–c6 assigned to the nearest of depots 3 and 4]

A Greedy Heuristic for UFL: Second Iteration
S_1 = {3}, Cost(S_1) = 34
S_2 = {1, 3}, Cost(S_2) = 37 > 34
Stop. Return S_1.

Generic Local Search for COP
1. For each S ∈ F define a neighborhood N(S) ⊆ F \ {S}.
2. Select an initial solution S ∈ F.
3. If Cost(T) ≥ Cost(S) for all T ∈ N(S), then stop and return S.
4. Otherwise, select a minimum-cost neighbor T ∈ N(S), let S = T, and go to step 3.
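A corresponding Python sketch of steepest-descent local search, again with user-supplied `cost` and `neighbors` functions standing in for Cost and N(S) (placeholder names):

```python
def local_search(initial, cost, neighbors):
    """Move to the cheapest neighbor until no neighbor improves on the current solution."""
    current = initial
    while True:
        best = min(neighbors(current), key=cost, default=None)
        if best is None or cost(best) >= cost(current):
            return current   # current solution is a local minimum
        current = best
```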

Local Search
The initial solution S could be generated in a variety of ways, such as:
- a randomly generated solution,
- a solution returned by a greedy heuristic, or
- a solution obtained by solving an LP relaxation and rounding.
It is up to the user to define the neighborhood. The neighborhood of a solution is the set of solutions that are "close" to it. "Good" neighborhoods are usually ones that are "easy" to evaluate.

A Local Search Heuristic for UFL
The neighborhood is defined by two "moves":
- T = S ∪ {j} where j ∈ N \ S (i.e., add another depot to the current solution), and
- T = S \ {j} where j ∈ S (i.e., shut down one of the depots in the current solution).
Equivalently, let N(S) be the set of all T ∈ F such that |(S ∪ T) \ (S ∩ T)| = 1.
Evaluating the neighborhood requires solving |N| transportation problems.
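The add/drop neighborhood can be enumerated directly from the current depot set; a sketch, where `depots` is the full set N (hypothetical name) and the empty set is excluded because it is infeasible:

```python
def ufl_neighbors(S, depots):
    """Depot sets reachable by opening one more depot or closing one open depot."""
    added = [frozenset(S | {j}) for j in depots - S]         # T = S union {j}
    dropped = [frozenset(S - {j}) for j in S if len(S) > 1]  # T = S \ {j}; never close the last depot
    return added + dropped
```

In the uncapacitated, nearest-depot setting of these slides, each candidate T is evaluated with `ufl_cost`, so the per-move transportation problem reduces to taking a minimum over the open depots.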

UFL Local Search Example 1
Start at {3, 4}, Cost = 41.
Neighbors: {1, 3, 4} Cost = 47; {2, 3, 4} Cost = 52; {4} Cost = 50; {3} Cost = 34 → move to {3}.
Neighbors of {3}: {1, 3} Cost = 37; {2, 3} Cost = 43; {3, 4} Cost = 41 → no improvement.
Stop. Return {3}.

UFL Local Search Example 2
Start at {1, 3, 4}, Cost = 47.
Neighbors: {3, 4} Cost = 41; {1, 2, 3, 4} Cost = 58; {1, 4} Cost = 42; {1, 3} Cost = 37 → move to {1, 3}.
Neighbors of {1, 3}: {3} Cost = 34; {1, 2, 3} Cost = 48; {1} Cost = 35; {1, 3, 4} Cost = 47 → move to {3}.
Neighbors of {3}: {1, 3} Cost = 37; {2, 3} Cost = 43; {3, 4} Cost = 41 → no improvement.
Stop. Return {3}.

UFL Local Search Example 3
Start at {1, 2, 3, 4}, Cost = 58.
Neighbors: {2, 3, 4} Cost = 52; {1, 3, 4} Cost = 47; {1, 2, 4} Cost = 43; {1, 2, 3} Cost = 48 → move to {1, 2, 4}.
Neighbors of {1, 2, 4}: {2, 4} Cost = 39; {1, 4} Cost = 42; {1, 2, 3, 4} Cost = 58; {1, 2} Cost = 33 → move to {1, 2}.
Neighbors of {1, 2}: {2} Cost = 35; {1} Cost = 35; {1, 2, 3} Cost = 48; {1, 2, 4} Cost = 43 → no improvement.
Stop. Return {1, 2}.

Solution Space for UFL Example
[Figure: costs of the depot subsets; depending on the starting solution, the local search above ends at {3} (Cost = 34) or at {1, 2} (Cost = 33)]

Local Minima
Depending on where we start in the solution space, the local search heuristic may converge to different local minima.
A solution S ∈ F is a local minimum if Cost(S) ≤ Cost(T) for all T ∈ N(S).
Usually there is no a priori guarantee that a local search will converge to a global minimum. There are some exceptions, such as the Simplex Method for LP and local search for MST.
Metaheuristics such as Simulated Annealing and Tabu Search attempt to overcome this drawback of local search.

Symmetric TSP
[Figure: a five-city symmetric TSP instance, shown both as a table of distances t_ij and as a graph with edge costs between 1 and 10]

Greedy Heuristic: Nearest Neighbor
1. Start at city 1.
2. Let N = N \ {1}.
3. Let i = 1.
4. Move to city j where j = argmin_{j ∈ N} d[i, j].
5. Let N = N \ {j}.
6. If |N| > 0, then let i = j and go to step 4; otherwise return to city 1 to close the tour.
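A short Python sketch of this construction, assuming `d` is a symmetric distance matrix indexed from 0 (the indexing is an implementation choice, not from the slides):

```python
def nearest_neighbor_tour(d, start=0):
    """Greedy TSP construction: from each city, move to the nearest unvisited city."""
    n = len(d)
    unvisited = set(range(n)) - {start}
    tour, i = [start], start
    while unvisited:
        j = min(unvisited, key=lambda k: d[i][k])  # step 4: nearest remaining city
        tour.append(j)
        unvisited.discard(j)
        i = j
    return tour                                    # the edge back to `start` closes the tour


def tour_cost(tour, d):
    """Length of the closed tour, including the edge from the last city back to the first."""
    return sum(d[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))
```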

Nearest Neighbor Example
[Figure: the tour produced by the nearest-neighbor heuristic, starting from city 1, on the five-city instance]
Cost = 20

Local Search Heuristic: Pairwise Exchange

First Pairwise Exchange
Exchanging cities 1 and 4 in the nearest-neighbor tour:
(1, 3, 4, 5, 2), Cost = 20 → (4, 3, 1, 5, 2), Cost = 17
[Figure: the two tours drawn on the graph]

Pairwise Exchange Example: Second Pass

Second Pairwise Exchange
Exchanging cities 4 and 5 in the improved tour:
(4, 3, 1, 5, 2), Cost = 17 → (5, 3, 1, 4, 2), Cost = 16
[Figure: the two tours drawn on the graph]

Pairwise Exchange Example: Third Pass
Stop: every pairwise exchange leads to a worse solution.
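A sketch of pairwise-exchange local search matching these passes: swap two cities in the permutation, keep the swap if it shortens the tour, and stop after a full pass with no improvement. It reuses `tour_cost` from the nearest-neighbor sketch; the first-improvement acceptance rule is an assumption, not stated on the slides.

```python
def pairwise_exchange(tour, d):
    """Improve a tour by swapping pairs of cities until no swap reduces its cost."""
    tour = list(tour)
    best_cost = tour_cost(tour, d)
    improved = True
    while improved:                                # each iteration is one "pass"
        improved = False
        for a in range(len(tour) - 1):
            for b in range(a + 1, len(tour)):
                candidate = list(tour)
                candidate[a], candidate[b] = candidate[b], candidate[a]
                c = tour_cost(candidate, d)
                if c < best_cost:                  # accept the improving exchange
                    tour, best_cost = candidate, c
                    improved = True
    return tour, best_cost
```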