Approximate Algorithms

Approximate Greedy For some problems, the obvious greedy algorithm does not always give an optimal solution, but it may guarantee a solution within, say, a factor of 2 of optimal. Max Cut: a random cut is expected to have half the edges cross it.
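The Max Cut claim can even be made deterministic with a simple greedy placement. A minimal sketch (not from the slides; the vertex-count-plus-edge-list graph encoding and function name are my own assumptions):

```python
def greedy_max_cut(n, edges):
    """Place each vertex on the side opposite the majority of its
    already-placed neighbors. When vertex v is placed, at least half of
    its edges to placed neighbors are cut, so the final cut has >= |E|/2
    edges -- matching the expectation of a random cut."""
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    side = {}
    for v in range(n):
        # count already-placed neighbors on each side
        zeros = sum(1 for u in adj[v] if side.get(u) == 0)
        ones = sum(1 for u in adj[v] if side.get(u) == 1)
        # going opposite the majority cuts max(zeros, ones) placed edges
        side[v] = 0 if ones >= zeros else 1
    cut = sum(1 for a, b in edges if side[a] != side[b])
    return side, cut
```

On a triangle, for example, any 2-coloring cuts exactly 2 of the 3 edges, which the greedy placement achieves.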

Approximate Knapsack Get as much value as you can into the knapsack.

Approximate Knapsack Ingredients: Instances: The volume V of the knapsack. The volume and price of n objects <<v1,p1>,<v2,p2>,…,<vn,pn>>. Solutions: A set S of objects that fit in the knapsack, i.e. Σ_{i∈S} vi ≤ V. Cost of Solution: The total price of the objects in the set, i.e. Σ_{i∈S} pi. Goal: Get as much value as you can into the knapsack.

Approximate Knapsack Dynamic Programming Running time = Θ(V × n) = Θ(2^(#bits in V) × n). Poly time if the size of the knapsack is small; exponential time if the size is an arbitrary integer.

Approximate Knapsack Dynamic Programming Running time = Θ(V × n) = Θ(2^(#bits in V) × n). NP-Complete. Approximate Algorithm: in poly time O(n³/ε), a solution can be found that is feasible, i.e. Σ_{i∈S} vi ≤ V, and within a factor of (1+ε) of optimal with respect to Σ_{i∈S} pi. E.g., with ε = .001, the time is 1000n³.

Approximate Knapsack Subinstance: ∀ V’∈[0..V], i∈[0..n], knapsack(V’,i) = maximize Σ_{i∈S} pi subject to S ⊆ {1..i} and Σ_{i∈S} vi ≤ V’. Recurrence Relation: knapsack(V’,i) = max( knapsack(V’,i-1), knapsack(V’-vi,i-1)+pi ) — the first term excludes item i (“No”), the second includes it (“Yes”).
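The recurrence translates directly into a table-filling algorithm. A minimal sketch (the function and table names are mine, not the slides'):

```python
def knapsack(V, items):
    """items: list of (volume, price) pairs.
    best[i][Vp] = max total price of a subset of the first i items
    with total volume <= Vp, i.e. knapsack(V', i) from the recurrence."""
    n = len(items)
    best = [[0] * (V + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        vi, pi = items[i - 1]
        for Vp in range(V + 1):
            best[i][Vp] = best[i - 1][Vp]          # "No": exclude item i
            if vi <= Vp:                           # "Yes": include item i
                best[i][Vp] = max(best[i][Vp],
                                  best[i - 1][Vp - vi] + pi)
    return best[n][V]
```

The two loops fill n rows of V+1 entries, giving the O(nV) time bound discussed on the following slides.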

Approximate Knapsack  V’[0..V], i[0..n], knapsack(V’,i) 1 2 V’-vi 1 2 V’-vi V’ V OptSol price 1 2 same + pi No i-1 same Yes i Take best of best. Our price? n

Approximate Knapsack  V’[0..V], i[0..n], knapsack(V’,i) 1 2 V’ V 1 2 V’ V OptSol price 1 2 i-1 i Time = O(nV) n

Approximate Knapsack Ingredients (strange version): Instances: The price P wanted from the knapsack. The volume and price of n objects <<v1,p1>,<v2,p2>,…,<vn,pn>>. Solutions: A set S of objects with total price at least P, i.e. Σ_{i∈S} pi ≥ P. Cost of Solution: The total volume of the objects in the set, i.e. Σ_{i∈S} vi. Goal: Minimize the volume needed to obtain this price P.

Approximate Knapsack Subinstance: ∀ P’∈[0..P], i∈[0..n], knapsack’(P’,i) = minimize Σ_{i∈S} vi subject to S ⊆ {1..i} and Σ_{i∈S} pi ≥ P’. Recurrence Relation: knapsack’(P’,i) = min( knapsack’(P’,i-1), knapsack’(P’-pi,i-1)+vi ) — the first term excludes item i (“No”), the second includes it (“Yes”).
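The strange recurrence can be sketched the same way, minimizing volume over a price-indexed table (names are mine; returning the whole last row lets the original problem be answered by scanning it for the largest affordable price):

```python
import math

def knapsack_min_volume(P, items):
    """items: list of (volume, price) pairs.
    vol[i][Pp] = minimum total volume of a subset of the first i items
    with total price >= Pp, i.e. knapsack'(P', i) from the recurrence.
    Returns the last row vol[n][0..P]."""
    n = len(items)
    INF = math.inf
    vol = [[INF] * (P + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        vol[i][0] = 0                  # price target 0 needs no volume
    for i in range(1, n + 1):
        vi, pi = items[i - 1]
        for Pp in range(1, P + 1):
            vol[i][Pp] = vol[i - 1][Pp]                 # "No": exclude i
            rest = max(0, Pp - pi)                      # "Yes": include i
            vol[i][Pp] = min(vol[i][Pp], vol[i - 1][rest] + vi)
    return vol[n]
```

With P = Σi pi, the table has n rows of P+1 entries, giving the O(nP) bound; the answer to the original problem is the largest P’ whose minimum volume is at most V.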

Approximate Knapsack  V’[0..V], i[0..n], knapsack’(P’,i) 1 2 P’-pi 1 2 P’-pi P’ P OptSol volume 1 2 same + vi No i-1 same Yes i Take best of best. Our volume? n

Approximate Knapsack Original problem knapsack(V,n): [Slide figure: the completed table for knapsack’(P’,i).] Find the largest price P’ not using more than V volume. Time = O(nP), where P = Σi pi.

Approximate Knapsack Dynamic Programming Running time = Θ(V × n) = Θ(2^(#bits in V) × n). Poly time if the size of the knapsack is small; exponential time if the size is an arbitrary integer. Strange Dynamic Programming Running time = Θ(P × n) = Θ(2^(#bits in P) × n). Poly time if the prices are small; exponential time if the prices are arbitrary integers.

Approximate Knapsack Approximation Algorithm: Given V, <<v1,p1>,<v2,p2>,…,<vn,pn>>, and ε, lose precision in the prices. E.g., with pi = 101101101011₂ and k = 4 (k chosen later): p’i = pi with the low k bits removed = 10110110₂ = ⌊pi / 2^k⌋, or equivalently pi with the low k bits zeroed = 101101100000₂ ≥ pi - 2^k. Solve knapsack using the strange algorithm on the rounded prices.
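Putting the pieces together, here is a sketch of the whole scheme. It picks 2^k ≈ ε·pmax/n (using Popt ≥ maxi pi, so this can only make the chosen 2^k smaller than the slides' ε·Popt/n, preserving the guarantee); all helper names are mine:

```python
import math

def fptas_knapsack(V, items, eps):
    """Scaling scheme: drop the low k bits of each price, solve exactly
    with the price-indexed ("strange") DP, then return the true price of
    the selected set. Guarantee: result >= (1 - eps) * optimal."""
    items = [(v, p) for v, p in items if v <= V]   # drop unusable items
    n = len(items)
    if n == 0:
        return 0
    pmax = max(p for _, p in items)
    target = eps * pmax / n          # stand-in for eps * Popt / n
    k = max(0, int(math.floor(math.log2(target)))) if target >= 1 else 0
    scaled = [(v, p >> k) for v, p in items]       # low k bits removed
    P = sum(p for _, p in scaled)
    # strange DP: vol[i][Pp] = min volume for scaled price >= Pp
    INF = math.inf
    vol = [[INF] * (P + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        vol[i][0] = 0
    for i in range(1, n + 1):
        vi, pi = scaled[i - 1]
        for Pp in range(1, P + 1):
            vol[i][Pp] = min(vol[i - 1][Pp],                      # "No"
                             vol[i - 1][max(0, Pp - pi)] + vi)    # "Yes"
    # largest scaled price achievable within volume V
    best = max(Pp for Pp in range(P + 1) if vol[n][Pp] <= V)
    # backtrack to recover the chosen set and report its true price
    chosen, Pp = [], best
    for i in range(n, 0, -1):
        if vol[i][Pp] != vol[i - 1][Pp]:           # item i was taken
            chosen.append(i - 1)
            Pp = max(0, Pp - scaled[i - 1][1])
    return sum(items[i][1] for i in chosen)
```

The DP runs over the scaled price range P/2^k, which is where the O(n·P/2^k) term in the analysis below comes from.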

Approximate Knapsack Original problem knapsack(V,n): [Slide figure: the table for the strange DP on the rounded prices.] Time = O(n · Σi p’i) = O(n P/2^k).

Approximate Knapsack Approximation Algorithm: Let Salg be the set of items selected by our algorithm, and Sopt the set of items in the optimal solution. Let Palg = Σ_{i∈Salg} pi be the price returned by our algorithm, and Popt = Σ_{i∈Sopt} pi the price of the optimal solution. Need: Palg ≥ Popt (1-ε).

Approximate Knapsack = pi Palg because rounded down ≥ p’i iSalg Palg because rounded down ≥ p’i iSalg because Salg is an optimal solution for the <vi,p’i> problem ≥ p’i iSopt because rounded by at most 2k. ≥ (pi -2k) iSopt = Popt – n 2k

Approximate Knapsack  Popt  Popt  Palg ≥ Popt – n2k n2k Popt = 1 - (need) n2k Popt = 1 - = 1 -  Popt 2k  Popt n = Time = O(n P/2k) = O( n2 P )  Popt P = i pi Popt ≥ maxi pi ≥ P n Time = O( n3)  done

Approximate Clique Clique: Given <G,k>, does G contain a k-clique? Brute Force: Try all (n choose k) possible subsets. If k = Θ(n), that is 2^Θ(n) subsets to check; if k = 3, only O(n³). NP-Complete: i.e., if there is a poly-time algorithm for it, then there is one for many important problems.
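The brute-force check is easy to write down, which makes the exponential blow-up concrete. A sketch (the vertex-count-plus-edge-list graph encoding is an assumption):

```python
from itertools import combinations

def has_k_clique(n, edges, k):
    """Try all C(n, k) vertex subsets and test whether every pair inside
    one subset is an edge: O(n^k) for fixed k, exponential if k grows
    with n."""
    adj = {frozenset(e) for e in edges}
    for subset in combinations(range(n), k):
        if all(frozenset((a, b)) in adj
               for a, b in combinations(subset, 2)):
            return True
    return False
```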

Approximate Clique Clique: Given <G,k>, does G contain a k-clique? Given a graph G, how well can you approximate the size of the largest clique? Not at all: it is NP-hard even to distinguish whether the largest clique has size n^ε or n^(1-ε).