Approximation Algorithms


Chapter 9: Approximation Algorithms

NP-complete problems can be handled by enumeration, branch and bound, or greedy methods, or by approximation; an approximation problem may admit a PTAS, a k-approximation, or no approximation at all.

Approximation algorithms. Up to now, the best algorithms for solving NP-complete problems require exponential time in the worst case, which is too time-consuming. To reduce the time required, we can relax the problem and accept a feasible solution that is "close" to an optimal one.

Approximation ratios. Optimization problems: we have a problem instance x with many feasible "solutions", and we are trying to minimize (or maximize) some cost function c(S) over the solutions S to x. Examples: finding a minimum spanning tree of a graph, finding a smallest vertex cover of a graph, and finding a shortest traveling salesperson tour in a graph.

Approximation ratios. An approximation algorithm produces a solution T. Relative approximation ratio: T is a k-approximation to the optimal solution OPT if c(T)/c(OPT) ≤ k (for a minimization problem; for a maximization problem the ratio is inverted). Absolute approximation ratio: for example, in the chromatic number problem, if the optimal solution of an instance is three and the approximation T is four, then T approximates the optimal solution within an absolute error of 1.

The Euclidean traveling salesperson problem (ETSP): find a shortest closed path through a set S of n points in the plane. The ETSP is NP-hard.

An approximation algorithm for the ETSP. Input: a set S of n points in the plane. Output: an approximate traveling salesperson tour of S. Step 1: Find a minimum spanning tree T of S. Step 2: Find a minimum-weight Euclidean matching M on the set of vertices of odd degree in T. Let G = M ∪ T. Step 3: Find an Eulerian cycle of G, then traverse it, bypassing all previously visited vertices, to obtain a Hamiltonian cycle as an approximate tour.
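
As a concrete illustration, here is a minimal sketch of the three steps in Python, assuming the NetworkX library is available (its minimum_spanning_tree, min_weight_matching, and eulerian_circuit routines stand in for Steps 1 to 3); etsp_approx and the input convention (a list of (x, y) points) are our own illustrative choices, not part of the slides.

```python
# A sketch of the ETSP approximation (Christofides-style), assuming NetworkX.
import math
import networkx as nx

def etsp_approx(S):
    """S: list of (x, y) points; returns the visiting order of a tour."""
    n = len(S)
    # Complete graph with Euclidean edge weights.
    G = nx.Graph()
    G.add_weighted_edges_from((i, j, math.dist(S[i], S[j]))
                              for i in range(n) for j in range(i + 1, n))
    # Step 1: minimum spanning tree T.
    T = nx.minimum_spanning_tree(G)
    # Step 2: minimum-weight matching M on the odd-degree vertices of T.
    odd = [v for v in T if T.degree(v) % 2 == 1]
    M = nx.min_weight_matching(G.subgraph(odd))
    # Step 3: Eulerian cycle of T ∪ M, shortcutting repeated vertices.
    multigraph = nx.MultiGraph(T)
    multigraph.add_edges_from(M)
    tour, seen = [], set()
    for u, _ in nx.eulerian_circuit(multigraph, source=0):
        if u not in seen:
            seen.add(u)
            tour.append(u)
    return tour
```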

An example for the ETSP algorithm. Step 1: Find a minimum spanning tree.

Step 2: Perform minimum-weight matching on the odd-degree vertices. The number of vertices of odd degree must be even, because the sum of all vertex degrees, Σv deg(v) = 2|E|, is even.

Step 3: Construct the tour from an Eulerian cycle and then a Hamiltonian cycle.

Time complexity: O(n^3). Step 1: O(n log n). Step 2: O(n^3). Step 3: O(n). How close is the approximate solution to an optimal one? The approximate tour is within 3/2 of the optimal one, i.e., the approximation ratio is 3/2. (See the proof on the next page.)

Proof of the approximation ratio. Let L be an optimal tour, and let {i1, i2, …, i2m} be the set of odd-degree vertices of T, listed in the order they appear on L: j1…i1 j2…i2 j3…i2m. Consider the two matchings M1 = {[i1,i2], [i3,i4], …, [i2m-1,i2m]} and M2 = {[i2,i3], [i4,i5], …, [i2m,i1]}. By the triangle inequality, length(L) ≥ length(M1) + length(M2) ≥ 2·length(M), so length(M) ≤ (1/2)·length(L). Also length(T) ≤ length(L), since deleting one edge of L leaves a spanning tree, which cannot be shorter than the minimum spanning tree T. Since G = T ∪ M, the tour's length is at most length(T) + length(M) ≤ length(L) + (1/2)·length(L) = (3/2)·length(L).

The bottleneck traveling salesperson problem (BTSP): minimize the longest edge of a tour. This is a minimax problem, and it is NP-hard. The input data fulfill the following assumptions: the graph is a complete graph, and all edges obey the triangle inequality.

An algorithm for finding an optimal solution. Step 1: Sort all edges in G = (V, E) into a nondecreasing sequence |e1| ≤ |e2| ≤ … ≤ |em|. Let G(ei) denote the subgraph obtained from G by deleting all edges longer than ei. Step 2: i ← 1. Step 3: If there exists a Hamiltonian cycle in G(ei), then this cycle is the solution; stop. Step 4: i ← i + 1. Go to Step 3.
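
A minimal sketch of this exact algorithm follows; the names btsp_exact and has_hamiltonian_cycle are our own, and the Hamiltonicity test is a plain exponential-time backtracking search, which is exactly the step that makes the exact algorithm impractical for large graphs.

```python
# Exact BTSP by scanning edges in nondecreasing weight order.
def has_hamiltonian_cycle(adj, n):
    """Exponential-time backtracking test for a Hamiltonian cycle."""
    def extend(path, used):
        if len(path) == n:
            return path[0] in adj[path[-1]]  # close the cycle
        return any(v in adj[path[-1]] and v not in used
                   and extend(path + [v], used | {v})
                   for v in range(n))
    return extend([0], {0})

def btsp_exact(n, weighted_edges):
    """weighted_edges: list of (u, v, w); returns the optimal bottleneck."""
    adj = [set() for _ in range(n)]
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        adj[u].add(v)        # grow G(e_i) one edge at a time
        adj[v].add(u)
        if has_hamiltonian_cycle(adj, n):
            return w         # longest edge of an optimal bottleneck tour
```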

An example for the BTSP algorithm. There is a Hamiltonian cycle, A-B-D-C-E-F-G-A, in G(BD). The optimal solution is 13.

A theorem for Hamiltonian cycles. Definition: The t-th power of G = (V, E), denoted G^t = (V, E^t), is the graph in which (u, v) ∈ E^t if and only if there is a path from u to v with at most t edges in G. Theorem: If a graph G is biconnected, then G^2 has a Hamiltonian cycle.

An example for the theorem: in G^2, a Hamiltonian cycle is A-B-C-D-E-F-G-A.

An approximation algorithm for the BTSP. Input: a complete graph G = (V, E) whose edges satisfy the triangle inequality. Output: a tour in G whose longest edge is at most twice the value of an optimal solution to the bottleneck traveling salesperson problem on G. Step 1: Sort the edges into |e1| ≤ |e2| ≤ … ≤ |em|. Step 2: i := 1. Step 3: If G(ei) is biconnected, construct G(ei)^2, find a Hamiltonian cycle in G(ei)^2, and return it as the output. Step 4: i := i + 1. Go to Step 3.
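
The outer loop of this approximation is easy to sketch, again assuming NetworkX (nx.is_biconnected and nx.power are existing routines). Extracting a Hamiltonian cycle from G(ei)^2 relies on the constructive proof of the theorem above, which we leave as a hypothetical helper here.

```python
import networkx as nx

def btsp_approx(G):
    """G: complete weighted nx.Graph satisfying the triangle inequality."""
    H = nx.Graph()
    H.add_nodes_from(G)
    # Steps 1-4: add edges in nondecreasing weight order until biconnected.
    for u, v, w in sorted(G.edges(data="weight"), key=lambda e: e[2]):
        H.add_edge(u, v, weight=w)      # H is now G(e_i)
        if nx.is_biconnected(H):
            square = nx.power(H, 2)     # G(e_i)^2 has a Hamiltonian cycle
            # Hypothetical helper realizing the constructive proof:
            return hamiltonian_cycle_in_square(square)
```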

An example. Add some more edges; then the graph becomes biconnected.

A Hamiltonian cycle: A-G-F-E-D-C-B-A. The longest edge: 16. Time complexity: polynomial time.

How good is the solution? The approximate solution is bounded by twice an optimal solution. Reasoning: a Hamiltonian cycle is biconnected. Let eop be the longest edge of an optimal solution and G(ei) the first biconnected subgraph in the scan; then |ei| ≤ |eop|. The length of the longest edge in G(ei)^2 is at most 2|ei| (by the triangle inequality), which is at most 2|eop|.

NP-completeness. Theorem: If there is a polynomial-time approximation algorithm for the BTSP that guarantees a ratio less than two, then NP = P. (The Hamiltonian cycle decision problem reduces to this problem.) Proof: For an arbitrary graph G = (V, E), expand G to a complete graph Gc with costs Cij = 1 if (i, j) ∈ E and Cij = 2 otherwise. (This definition of Cij satisfies the triangle inequality.)

Let V* denote the value of an optimal solution of the bottleneck TSP on Gc. Then V* = 1 if and only if G has a Hamiltonian cycle. Because Gc contains only edges of cost 1 and 2, any approximate solution with value less than 2V* would also solve the Hamiltonian cycle decision problem.
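
To make the construction concrete, here is a short sketch of the expansion (expand_to_complete is an illustrative name, not from the slides):

```python
# Expand an arbitrary graph on vertices 0..n-1 into a complete cost matrix:
# cost 1 on original edges, cost 2 elsewhere (triangle inequality holds).
def expand_to_complete(n, edges):
    E = {frozenset(e) for e in edges}
    return {(i, j): 1 if frozenset((i, j)) in E else 2
            for i in range(n) for j in range(i + 1, n)}
```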

The bin packing problem: given n items a1, a2, …, an with sizes 0 < ai ≤ 1, 1 ≤ i ≤ n, determine the minimum number of bins of unit capacity needed to accommodate all n items. E.g., n = 5, {0.3, 0.5, 0.8, 0.2, 0.4}. The bin packing problem is NP-hard.

An approximation algorithm for the bin packing problem (first fit): place ai into the lowest-indexed bin that can accommodate it. Theorem: The number of bins used by the first-fit algorithm is at most twice the optimal number.
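
A minimal first-fit sketch on the slide's five-item instance (exact rationals are used so the fit test is not disturbed by floating-point round-off):

```python
from fractions import Fraction as F

def first_fit(items, capacity=F(1)):
    """Place each item into the lowest-indexed bin that can hold it."""
    bins = []                            # remaining capacity of each open bin
    for a in items:
        for i, free in enumerate(bins):
            if a <= free:
                bins[i] = free - a
                break
        else:
            bins.append(capacity - a)    # no open bin fits: open a new one
    return len(bins)

# The slide's instance: n = 5, sizes {0.3, 0.5, 0.8, 0.2, 0.4}.
sizes = [F(3, 10), F(1, 2), F(4, 5), F(1, 5), F(2, 5)]
print(first_fit(sizes))   # 3 bins: {0.3, 0.5, 0.2}, {0.8}, {0.4}
```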

Proof of the approximation ratio. Notation: S(ai) is the size of item ai; OPT(I) is the number of bins in an optimal solution of instance I; FF(I) is the number of bins used by first fit; C(Bi) is the sum of the sizes of the items packed into bin Bi by first fit. For any two consecutive bins, C(Bi) + C(Bi+1) > 1, since otherwise first fit would have placed the items of Bi+1 into Bi. If m nonempty bins are used by first fit, summing over the bins gives C(B1) + C(B2) + … + C(Bm) > m/2. Therefore FF(I) = m < 2·Σi C(Bi) = 2·Σi S(ai) ≤ 2·OPT(I), i.e., FF(I) < 2·OPT(I).

Knapsack problems. The fractional knapsack problem is in P. The 0/1 knapsack problem is NP-complete, but it can be approximated; in fact it has a PTAS.

The fractional knapsack problem: n objects, each with a weight wi > 0 and a profit pi > 0; knapsack capacity M. Maximize Σi pi·xi subject to Σi wi·xi ≤ M and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.

The knapsack greedy algorithm. Step 1: Sort the ratios pi/wi into nonincreasing order. Step 2: Put the objects into the knapsack in the sorted order, as far as capacity allows. E.g., n = 3, M = 20, (p1, p2, p3) = (25, 24, 15), (w1, w2, w3) = (18, 15, 10). Solution: p1/w1 = 25/18 ≈ 1.39, p2/w2 = 24/15 = 1.6, p3/w3 = 15/10 = 1.5. Optimal solution: x1 = 0, x2 = 1, x3 = 1/2.
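
The greedy algorithm on the slide's instance, as a short sketch (fractional_knapsack is our own name):

```python
def fractional_knapsack(profits, weights, M):
    """Greedy: take items in nonincreasing profit/weight order,
    splitting the last item if it does not fit entirely."""
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    total, remaining = 0.0, M
    for p, w in items:
        if remaining <= 0:
            break
        x = min(1.0, remaining / w)      # fraction of the item taken
        total += x * p
        remaining -= x * w
    return total

# The slide's instance: n = 3, M = 20.
print(fractional_knapsack([25, 24, 15], [18, 15, 10], 20))   # 31.5
```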

The 0/1 knapsack problem. Definition: n objects, each with a weight wi > 0 and a profit pi > 0; knapsack capacity M. Maximize Σi pi·xi subject to Σi wi·xi ≤ M, xi = 0 or 1, 1 ≤ i ≤ n. Decision version: given K, is Σi pi·xi ≥ K? (The fractional knapsack problem instead allows 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.) Theorem: PARTITION reduces to the 0/1 knapsack decision problem.

Polynomial-time approximation schemes. A problem L has a polynomial-time approximation scheme (PTAS) if it has a polynomial-time (1+ε)-approximation algorithm for any fixed ε > 0 (ε may appear in the running time). The 0/1 knapsack problem has a PTAS with running time O(n^3/ε).

Knapsack PTAS: intuition for the approximation algorithm. Given an error ratio ε, we calculate a threshold T that classifies the items: BIG items are handled by enumeration, SMALL items by a greedy strategy. In our case T works out to 46.8, so BIG = {1, 2, 3} and SMALL = {4, 5, 6, 7, 8}.

i      1     2     3     4     5     6     7     8
pi     90    61    50    33    29    23    15    13
wi     33    30    25    17    15    12    10    9
pi/wi  2.72  2.03  2.0   1.94  1.93  1.91  1.5   1.44

Knapsack PTAS. For the BIG items we enumerate all possible solutions. Solution 1: select items 1 and 2; the sum of normalized profits is 15, the corresponding sum of original profits is 90 + 61 = 151, and the sum of weights is 63. Solution 2: select items 1, 2, and 3; the sum of normalized profits is 20, the corresponding sum of original profits is 90 + 61 + 50 = 201, and the sum of weights is 88.

Knapsack PTAS. For the SMALL items we use a greedy strategy to extend each enumerated solution. For Solution 1, we can add items 4 and 6; the sum of profits becomes 151 + 33 + 23 = 207. For Solution 2, we cannot add any item from SMALL, so the sum of profits stays 201.
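
The BIG/SMALL scheme the last three slides walk through can be sketched as follows. The threshold computation and profit normalization are omitted; the capacity M = 92 is our assumption, chosen to be consistent with the two solutions above, and big_small_knapsack is an illustrative name.

```python
from itertools import combinations

def big_small_knapsack(big, small, M):
    """big, small: lists of (profit, weight); enumerate BIG, greedy SMALL."""
    small = sorted(small, key=lambda pw: pw[0] / pw[1], reverse=True)
    best = 0
    for r in range(len(big) + 1):
        for subset in combinations(big, r):       # enumerate BIG subsets
            profit = sum(p for p, _ in subset)
            weight = sum(w for _, w in subset)
            if weight > M:
                continue
            for p, w in small:                    # greedy fill with SMALL
                if weight + w <= M:
                    profit += p
                    weight += w
            best = max(best, profit)
    return best

big = [(90, 33), (61, 30), (50, 25)]
small = [(33, 17), (29, 15), (23, 12), (15, 10), (13, 9)]
print(big_small_knapsack(big, small, 92))   # 207, as in Solution 1 above
```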

A bad example. A convex hull of n points in the plane can be computed in O(n log n) time in the worst case. An approximation algorithm: Step 1: Find the leftmost and rightmost points.

Step 2: Divide the points into K vertical strips, and find the highest and lowest points in each strip.

Step 3: Apply the Graham scan to those highest and lowest points to construct an approximate convex hull. (The highest and lowest points are already sorted by their x-coordinates.)
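
A sketch of the whole strip method follows; Andrew's monotone chain is used in place of the Graham scan for brevity (the candidates are already x-sorted, so the two are interchangeable here), and approx_convex_hull is our own name.

```python
def approx_convex_hull(points, K):
    """points: list of (x, y); K: number of vertical strips."""
    xmin = min(p[0] for p in points)          # Step 1: leftmost/rightmost
    xmax = max(p[0] for p in points)
    width = (xmax - xmin) / K or 1.0
    top, bottom = {}, {}
    for p in points:                          # Step 2: extremes per strip
        s = min(int((p[0] - xmin) / width), K - 1)
        if s not in top or p[1] > top[s][1]:
            top[s] = p
        if s not in bottom or p[1] < bottom[s][1]:
            bottom[s] = p
    candidates = sorted(set(top.values()) | set(bottom.values()))
    # Step 3: convex hull of the O(K) candidates (monotone chain).
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    def half(pts):
        chain = []
        for p in pts:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain[:-1]
    if len(candidates) <= 2:
        return candidates
    return half(candidates) + half(candidates[::-1])
```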

Time complexity: O(n + K). Step 1: O(n). Step 2: O(n). Step 3: O(K).

How good is the solution? How far can a point outside the approximate convex hull be from it? Answer: at most L/K, where L is the distance between the leftmost and rightmost points.