Approximation Algorithm Instructor: YE, Deshi


1 Approximation Algorithm Instructor: YE, Deshi

2 Dealing with Hard Problems What to do when none of the standard techniques (divide and conquer, dynamic programming, greedy, linear programming/network flows, ...) gives a polynomial-time algorithm?

3 Dealing with Hard Problems Solution I: ignore the problem. Can't do it! There are thousands of problems for which we do not know polynomial-time algorithms, for example the Traveling Salesman Problem (TSP) and Set Cover.

4 Traveling Salesman Problem Traveling Salesman Problem (TSP). Input: an undirected graph with lengths on the edges. Output: a shortest cycle that visits each vertex exactly once. Best known exact algorithm: O(n^2 · 2^n) time.
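The O(n^2 · 2^n) bound comes from dynamic programming over vertex subsets (the Held-Karp recurrence). A minimal Python sketch, not part of the slides; the function name and the adjacency-matrix input format are illustrative choices:

```python
from itertools import combinations

def held_karp(dist):
    """Exact TSP via the Held-Karp DP: O(n^2 * 2^n) time.
    dist[i][j] is the length of the edge between vertices i and j."""
    n = len(dist)
    # dp[(S, j)] = length of the shortest path that starts at vertex 0,
    # visits every vertex of the frozenset S, and ends at j (j in S).
    dp = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                dp[(S, j)] = min(dp[(S - {j}, k)] + dist[k][j]
                                 for k in S - {j})
    full = frozenset(range(1, n))
    # Close the cycle by returning to vertex 0.
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))
```

For a unit square with side length 1 and diagonals of length 2, the optimal tour walks the four sides and has length 4.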

5 The vertex-cover problem A vertex cover of an undirected graph G = (V, E) is a subset V' ⊆ V such that if (u, v) ∈ E, then u ∈ V' or v ∈ V' (or both). A vertex cover for G is a set of vertices that covers all the edges in E. As a decision problem, we define VERTEX-COVER = {〈G, k〉 : graph G has a vertex cover of size k}. Brute-force algorithm: O(k · n^k) (try all subsets of size k).

6 Dealing with Hard Problems Exponential-time algorithms for small inputs: e.g., (100/99)^n time is not bad as long as n is small. Polynomial-time algorithms for some (e.g., average-case) inputs. Polynomial-time algorithms for all inputs, but which return approximate solutions.

7 Approximation Algorithms An algorithm A is ρ-approximate if, on any input of size n, the cost C_A of the solution produced by the algorithm and the cost C_OPT of the optimal solution satisfy C_A ≤ ρ · C_OPT. We will see: a 2-approximation algorithm for TSP in the plane, and a 2-approximation algorithm for Vertex Cover.

8 Comments on Approximation "C_A ≤ ρ · C_OPT" makes sense only for minimization problems. For maximization problems, replace it by C_OPT ≤ ρ · C_A. Additive approximation, "C_A ≤ ρ + C_OPT", also makes sense, although it is usually more difficult to achieve.

9 The Vertex-cover problem

10 The vertex-cover problem A vertex cover of an undirected graph G = (V, E) is a subset V' ⊆ V such that if (u, v) ∈ E, then u ∈ V' or v ∈ V' (or both). A vertex cover for G is a set of vertices that covers all the edges in E. The goal is to find a vertex cover of minimum size in a given undirected graph G.

11 Naive Algorithm

APPROX-VERTEX-COVER(G)
    C ← Ø
    E′ ← E[G]
    while E′ ≠ Ø
        do let (u, v) be an arbitrary edge of E′
           C ← C ∪ {u, v}
           remove from E′ every edge incident on either u or v
    return C
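The pseudocode translates directly; a minimal Python sketch (not from the slides), with the graph given as a list of vertex pairs:

```python
def approx_vertex_cover(edges):
    """2-approximation for vertex cover (APPROX-VERTEX-COVER above):
    repeatedly pick an arbitrary remaining edge, add both of its
    endpoints to the cover, and discard every edge they touch."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]          # an arbitrary edge of E'
        cover.update((u, v))
        # remove from E' every edge incident on either u or v
        remaining = [(a, b) for (a, b) in remaining
                     if a not in (u, v) and b not in (u, v)]
    return cover
```

On the example graph of the next slide the edges may be examined in a different order than in the illustration, but the output is always a valid cover of size at most twice the optimum.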

12 Illustration of Naive algorithm Input graph; edge (b, c) is chosen, giving C = {b, c}; edge (e, f) is chosen; edge (d, g) is chosen. The naive algorithm outputs C = {b, c, d, e, f, g}, while an optimal solution is {b, e, d}.

13 Approximation 2 Theorem. APPROX-VERTEX-COVER is a 2-approximation algorithm. Pf. Let A denote the set of edges picked by APPROX-VERTEX-COVER. To cover the edges in A, any vertex cover, in particular an optimal cover C*, must include at least one endpoint of each edge in A. No two edges in A share an endpoint, so no two edges in A are covered by the same vertex from C*, and we have the lower bound |C*| ≥ |A|. On the other hand, the algorithm only picks edges for which neither endpoint is already in C, so |C| = 2|A|. Hence, |C| = 2|A| ≤ 2|C*|. ▪

14 Vertex cover: summary No better constant-factor approximation is known! More precisely, minimum vertex cover is known to be approximable within a factor slightly below 2 (a factor depending on |V| that tends to 2), but, unless P = NP, it cannot be approximated within some constant factor greater than 1, even for graphs of sufficiently large vertex degree.

15 The Traveling Salesman Problem Traveling Salesman Problem (TSP). Input: an undirected graph G = (V, E) with a cost c(u, v) associated with each edge (u, v) ∈ E. Output: a shortest cycle that visits each vertex exactly once. Triangle inequality: for all vertices u, v, w ∈ V, c(u, w) ≤ c(u, v) + c(v, w).

16 2-approximation for TSP with triangle inequality Compute an MST T (an edge between any pair of points; weight = distance between endpoints). Compute a tree walk W of T, in which each edge is visited twice. Convert W into a cycle H using shortcuts.

17 Algorithm

APPROX-TSP-TOUR(G, c)
    select a vertex r ∈ V[G] to be a "root" vertex
    compute a minimum spanning tree T for G from root r using MST-PRIM(G, c, r)
    let L be the list of vertices visited in a preorder tree walk of T
    return the Hamiltonian cycle H that visits the vertices in the order L
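A runnable sketch of APPROX-TSP-TOUR for points in the plane (not part of the slides; Prim's algorithm with a heap and an iterative preorder walk are one possible realization):

```python
import heapq
from collections import defaultdict

def approx_tsp_tour(points):
    """2-approximation for metric TSP (APPROX-TSP-TOUR above):
    build an MST with Prim's algorithm rooted at vertex 0, then
    shortcut a preorder walk of the tree into a Hamiltonian cycle.
    `points` are 2-D coordinates, so Euclidean distance obeys the
    triangle inequality."""
    n = len(points)

    def d(i, j):
        (x1, y1), (x2, y2) = points[i], points[j]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    # Prim's algorithm from root 0.
    tree = defaultdict(list)            # parent -> list of children
    in_tree = {0}
    pq = [(d(0, j), 0, j) for j in range(1, n)]
    heapq.heapify(pq)
    while len(in_tree) < n:
        _, u, v = heapq.heappop(pq)
        if v in in_tree:
            continue
        tree[u].append(v)
        in_tree.add(v)
        for j in range(n):
            if j not in in_tree:
                heapq.heappush(pq, (d(v, j), v, j))

    # A preorder walk of the MST gives the visiting order L;
    # the returned order, closed back to its first vertex, is H.
    order, stack = [], [0]
    while stack:
        u = stack.pop()
        order.append(u)
        stack.extend(reversed(tree[u]))
    return order
```

On the four corners of a unit square the tour visits every vertex exactly once, and shortcutting never increases cost because distances are Euclidean.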

18 Preorder Traversal Preorder (root, left, right): visit the root first, then traverse the left subtree, then traverse the right subtree. Example order: A, B, C, D, E, F, G, H, I.

19 Illustration A full walk of the tree visits the vertices in the order a, b, c, b, h, b, a, d, e, f, e, g, e, d, a. (Figure panels: the MST; the tree walk W; the preorder walk, i.e., the final solution H; the OPT solution.)

20 2-approximation Theorem. APPROX-TSP-TOUR is a polynomial-time 2-approximation algorithm for the traveling-salesman problem with the triangle inequality.
Pf. Let C_OPT be the optimal cycle.
Cost(T) ≤ Cost(C_OPT): removing an edge from C_OPT gives a spanning tree, and T is a spanning tree of minimum cost.
Cost(W) = 2 · Cost(T): each edge is visited twice.
Cost(H) ≤ Cost(W): by the triangle inequality, each shortcut can only decrease the cost.
Hence Cost(H) ≤ 2 · Cost(C_OPT). ▪

21 Load Balancing Input. m identical machines; n jobs, job j has processing time t_j. Job j must run contiguously on one machine. A machine can process at most one job at a time. Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J(i)} t_j. Def. The makespan is the maximum load on any machine, L = max_i L_i. Load balancing. Assign each job to a machine so as to minimize the makespan.

22 Load Balancing: List Scheduling

List-scheduling algorithm. Consider the n jobs in some fixed order; assign job j to the machine whose load is smallest so far. Implementation: O(n log n) using a priority queue.

List-Scheduling(m, n, t_1, t_2, …, t_n) {
    for i = 1 to m {
        L_i ← 0          // load on machine i
        J(i) ← ∅         // jobs assigned to machine i
    }
    for j = 1 to n {
        i = argmin_k L_k   // machine i has smallest load
        J(i) ← J(i) ∪ {j}  // assign job j to machine i
        L_i ← L_i + t_j    // update load of machine i
    }
}
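A sketch of the algorithm in Python (not part of the slides), keeping the machine loads in a min-heap:

```python
import heapq

def list_schedule(m, times):
    """Graham's list scheduling: assign each job, in the given order,
    to the machine whose current load is smallest. Returns the
    makespan. The heap of (load, machine) pairs gives O(n log m)."""
    loads = [(0, i) for i in range(m)]
    heapq.heapify(loads)
    for t in times:
        load, i = heapq.heappop(loads)   # machine with smallest load
        heapq.heappush(loads, (load + t, i))
    return max(load for load, _ in loads)
```

On the tight instance of the later slides (m = 10: ninety jobs of length 1 followed by one job of length 10) it returns makespan 19, against an optimum of 10.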

23 Load Balancing: List Scheduling Analysis

Theorem. [Graham, 1966] The greedy algorithm is a (2 − 1/m)-approximation.
This was the first worst-case analysis of an approximation algorithm. We need to compare the resulting solution with the optimal makespan L*.
Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job. ▪
Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf. The total processing time is Σ_j t_j, and one of the m machines must do at least a 1/m fraction of the total work. ▪

24 Load Balancing: List Scheduling Analysis

Theorem. The greedy algorithm is a (2 − 1/m)-approximation.
Pf. Consider the load L_i of the bottleneck machine i. Let j be the last job scheduled on machine i. When job j was assigned to machine i, machine i had the smallest load; its load before the assignment was L_i − t_j, so L_i − t_j ≤ L_k for all 1 ≤ k ≤ m. (Figure: on machine i, the jobs scheduled before j fill load L_i − t_j, and job j brings the load to L = L_i.)

25 Load Balancing: List Scheduling Analysis

Theorem. The greedy algorithm is a (2 − 1/m)-approximation.
Pf. Consider the load L_i of the bottleneck machine i. Let j be the last job scheduled on machine i. When job j was assigned to machine i, machine i had the smallest load; its load before the assignment was L_i − t_j, so L_i − t_j ≤ L_k for all 1 ≤ k ≤ m.
Sum these inequalities over all k and divide by m; since the loads at that moment do not yet include job j,
L_i − t_j ≤ (1/m) Σ_k L_k ≤ (1/m) Σ_{j′ ≠ j} t_{j′} = (1/m) Σ_{j′} t_{j′} − t_j/m ≤ L* − t_j/m   (Lemma 2).
Now L_i = (L_i − t_j) + t_j ≤ L* + (1 − 1/m) t_j ≤ L* + (1 − 1/m) L* = (2 − 1/m) L*   (Lemma 1). ▪

26 Load Balancing: List Scheduling Analysis

Q. Is our analysis tight?
A. Essentially yes: the list-scheduling algorithm has tight bound 2 − 1/m.
Ex: m machines, m(m − 1) jobs of length 1, then one job of length m. For m = 10, list scheduling yields makespan 19, with nine machines idle while one machine finishes the long job.

27 Load Balancing: List Scheduling Analysis

On the same instance (m machines, m(m − 1) jobs of length 1, one job of length m), the optimal makespan is m: for m = 10, optimal makespan = 10.

28 Load Balancing on 2 Machines

Claim. Load balancing is hard even with only 2 machines.
Pf. NUMBER-PARTITIONING ≤_P LOAD-BALANCE. (Figure: a yes-instance, with jobs a, d, f on machine 1 and jobs b, c, e, g on machine 2, both machines finishing at the same time L.)

29 Load Balancing: LPT Rule

Longest processing time (LPT). Sort the n jobs in descending order of processing time, then run the list-scheduling algorithm.

LPT-List-Scheduling(m, n, t_1, t_2, …, t_n) {
    Sort jobs so that t_1 ≥ t_2 ≥ … ≥ t_n
    for i = 1 to m {
        L_i ← 0          // load on machine i
        J(i) ← ∅         // jobs assigned to machine i
    }
    for j = 1 to n {
        i = argmin_k L_k   // machine i has smallest load
        J(i) ← J(i) ∪ {j}  // assign job j to machine i
        L_i ← L_i + t_j    // update load of machine i
    }
}
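LPT is just list scheduling run on the sorted job list; a minimal sketch (not part of the slides):

```python
import heapq

def lpt_schedule(m, times):
    """LPT rule: sort jobs in descending order of processing time,
    then greedily assign each to the least-loaded machine."""
    loads = [0] * m                  # min-heap of machine loads
    heapq.heapify(loads)
    for t in sorted(times, reverse=True):
        smallest = heapq.heappop(loads)
        heapq.heappush(loads, smallest + t)
    return max(loads)
```

On the instance that is tight for plain list scheduling (m = 10, ninety unit jobs plus one job of length 10), LPT finds the optimal makespan 10. On m = 2 with jobs 3, 3, 2, 2, 2 it returns 7 against an optimum of 6, matching the 4/3 − 1/(3m) = 7/6 bound for m = 2.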

30 Load Balancing: LPT Rule

Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine. ▪
Lemma 3. If there are more than m jobs, then L* ≥ 2 t_{m+1}.
Pf. Consider the first m + 1 jobs t_1, …, t_{m+1}. Since the t_i's are in descending order, each takes at least t_{m+1} time. There are m + 1 jobs and m machines, so by the pigeonhole principle, at least one machine gets two of them. ▪
Theorem. The LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling, bounding the last job on the bottleneck machine by t_{m+1} ≤ L*/2 via Lemma 3 (by the observation, we may assume the number of jobs exceeds m). ▪

31 Load Balancing: LPT Rule

Q. Is our 3/2 analysis tight?
A. No.
Theorem. [Graham, 1969] The LPT rule is a (4/3 − 1/(3m))-approximation.
Pf. A more sophisticated analysis of the same algorithm.
Q. Is Graham's (4/3 − 1/(3m)) analysis tight?
A. Essentially yes. Ex: m machines, n = 2m + 1 jobs: two jobs each of length m + 1, m + 2, …, 2m − 1, and three jobs of length m.

32 LPT

Pf. Jobs are indexed so that t_1 ≥ t_2 ≥ … ≥ t_n.
If n ≤ m, LPT is already optimal (each machine processes at most one job).
If n > 2m, then t_n ≤ L*/3 (if every job were longer than L*/3, each machine of the optimal schedule could hold at most two jobs, forcing n ≤ 2m), and an analysis similar to that of the LS algorithm gives the bound.
Otherwise there are 2m − h jobs in total, for some 0 ≤ h < m; one checks that LPT is already an optimal solution in this case.

33 Approximation Scheme Some NP-complete problems allow polynomial-time approximation algorithms that can achieve smaller and smaller approximation ratios by using more and more computation time: a tradeoff between computation time and the quality of the approximation. An approximation scheme for an optimization problem is an algorithm that, for any fixed ε > 0, is a (1 + ε)-approximation algorithm.

34 PTAS and FPTAS We say that an approximation scheme is a polynomial-time approximation scheme (PTAS) if, for any fixed ε > 0, the scheme runs in time polynomial in the size n of its input instance. Example: O(n^{2/ε}). An approximation scheme is a fully polynomial-time approximation scheme (FPTAS) if its running time is polynomial both in 1/ε and in the size n of the input instance. Example: O((1/ε)^2 · n^3).

35 The Subset Sum Input. A pair (S, t), where S = {x_1, x_2, ..., x_n} is a set of positive integers and t is a positive integer. Output. A subset S′ of S. Goal. Maximize the sum of S′ subject to that sum being at most t.

36 An exponential-time exact algorithm If L is a list of positive integers and x is another positive integer, then we let L + x denote the list of integers derived from L by increasing each element of L by x. For example, if L = 〈 1, 2, 3, 5, 9 〉, then L + 2 = 〈 3, 4, 5, 7, 11 〉. We also use this notation for sets, so that S + x = {s + x : s ∈ S}.

37 Exact algorithm

MERGE-LISTS(L, L′): returns the sorted list that is the merge of its two sorted input lists L and L′ with duplicate values removed.

EXACT-SUBSET-SUM(S, t)
    n ← |S|
    L_0 ← 〈0〉
    for i ← 1 to n
        do L_i ← MERGE-LISTS(L_{i−1}, L_{i−1} + x_i)
           remove from L_i every element that is greater than t
    return the largest element in L_n
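A direct Python rendering of EXACT-SUBSET-SUM (not part of the slides; a set union stands in for MERGE-LISTS):

```python
def exact_subset_sum(S, t):
    """EXACT-SUBSET-SUM from the slide: L holds every achievable
    subset sum of the elements seen so far, capped at t. Since the
    list can double at each step, this is exponential time in the
    worst case."""
    L = [0]
    for x in S:
        # merge L with L + x, dropping duplicates and sums > t
        L = sorted(set(L) | {y + x for y in L if y + x <= t})
    return L[-1]   # largest achievable sum <= t
```

For S = {1, 4, 5} and t = 6 the best achievable sum is 6 (= 1 + 5).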

38 Example For example, if S = {1, 4, 5}, then P_1 = {0, 1}, P_2 = {0, 1, 4, 5}, P_3 = {0, 1, 4, 5, 6, 9, 10}, where P_i denotes the set of all sums obtainable from subsets of {x_1, …, x_i}. These follow from the identity P_i = P_{i−1} ∪ (P_{i−1} + x_i). Since the length of L_i can be as much as 2^i, this is an exponential-time algorithm.

39 The Subset-sum problem: FPTAS Trimming (or rounding): if two values in L are close to each other, then for the purpose of finding an approximate solution there is no reason to maintain both of them explicitly. Let δ be such that 0 < δ < 1. L′ is the result of trimming L if, for every element y removed from L, there is an element z still in L′ that approximates y, that is, y/(1 + δ) ≤ z ≤ y.

40 Example

For example, if δ = 0.1 and L = 〈10, 11, 12, 15, 20, 21, 22, 23, 24, 29〉, then trimming L yields L′ = 〈10, 12, 15, 20, 23, 29〉.

TRIM(L, δ)
    m ← |L|
    L′ ← 〈y_1〉
    last ← y_1
    for i ← 2 to m
        do if y_i > last · (1 + δ)    ▹ y_i ≥ last because L is sorted
            then append y_i onto the end of L′
                 last ← y_i
    return L′
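TRIM in Python, a sketch mirroring the pseudocode (not part of the slides):

```python
def trim(L, delta):
    """TRIM from the slide: L is sorted ascending; keep an element
    only if it exceeds the last kept element by more than a factor
    of (1 + delta), so every removed y leaves behind a kept z with
    y / (1 + delta) <= z <= y."""
    trimmed = [L[0]]
    last = L[0]
    for y in L[1:]:
        if y > last * (1 + delta):
            trimmed.append(y)
            last = y
    return trimmed
```

Running it on the slide's example reproduces the trimmed list above.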

41 (1 + ε)-Approximation algorithm

APPROX-SUBSET-SUM(S, t, ε)
1 n ← |S|
2 L_0 ← 〈0〉
3 for i ← 1 to n
4     do L_i ← MERGE-LISTS(L_{i−1}, L_{i−1} + x_i)
5        L_i ← TRIM(L_i, ε/2n)
6        remove from L_i every element that is greater than t
7 let z* be the largest value in L_n
8 return z*
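The full scheme in Python, a sketch combining the merge, trim, and filter steps of the pseudocode above (the function name and list representation are illustrative choices):

```python
def approx_subset_sum(S, t, eps):
    """APPROX-SUBSET-SUM: like the exact algorithm, but trim each
    list with delta = eps/(2n), so list lengths stay polynomial in
    n and 1/eps while the answer z* satisfies y*/z* <= 1 + eps."""
    n = len(S)
    delta = eps / (2 * n)
    L = [0]
    for x in S:
        merged = sorted(set(L) | {y + x for y in L})
        # trim: keep elements more than a (1 + delta) factor apart
        L, last = [merged[0]], merged[0]
        for y in merged[1:]:
            if y > last * (1 + delta):
                L.append(y)
                last = y
        # drop every element greater than t
        L = [y for y in L if y <= t]
    return max(L)
```

On the instance S = {104, 102, 201, 101}, t = 308, ε = 0.40 (so δ = 0.05), the scheme returns 302, within the promised factor of the optimum 307.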

42 FPTAS Theorem. APPROX-SUBSET-SUM is a fully polynomial-time approximation scheme for the subset-sum problem. Pf. The operations of trimming L_i in line 5 and removing from L_i every element greater than t maintain the property that every element of L_i is also a member of P_i. Therefore, the value z* returned in line 8 is indeed the sum of some subset of S.

43 Pf. Con. Let y* ∈ P_n denote an optimal solution to the subset-sum problem; we know that z* ≤ y*. We need to show that y*/z* ≤ 1 + ε. By induction on i, it can be shown that for every element y in P_i that is at most t, there is a z ∈ L_i such that y/(1 + ε/2n)^i ≤ z ≤ y. Thus, there is a z ∈ L_n such that y*/(1 + ε/2n)^n ≤ z ≤ y*.

44 Pf. Con. And thus y*/z ≤ (1 + ε/2n)^n. Since there is such a z ∈ L_n and z* is the largest value in L_n, the same bound holds for z*: y*/z* ≤ (1 + ε/2n)^n. Hence, using (1 + ε/2n)^n ≤ e^{ε/2} ≤ 1 + ε/2 + (ε/2)^2 ≤ 1 + ε for 0 < ε < 1, we get y*/z* ≤ 1 + ε.

45 Pf. Con. To show that the scheme is an FPTAS, we need to bound |L_i|. After trimming, successive elements z and z′ of L_i must satisfy z′/z > 1 + ε/2n. Each list therefore contains the value 0, possibly the value 1, and up to ⌊log_{1+ε/2n} t⌋ additional values, and log_{1+ε/2n} t = (ln t)/ln(1 + ε/2n) ≤ (2n(1 + ε/2n) ln t)/ε ≤ (3n ln t)/ε, which is polynomial in the input size and in 1/ε.