Approximation Algorithms for Knapsack Problems
Tsvi Kopelowitz, modified by Ariel Rosenfeld

Knapsack
Given: a set S of n objects with weights and values, and a weight bound:
◦ w_1, w_2, …, w_n, B (weights, weight bound).
◦ v_1, v_2, …, v_n (values, i.e. profits).
Find: a subset of S with total weight at most B and maximum total value.
Formally: find T ⊆ {1, …, n} maximizing Σ_{i∈T} v_i subject to Σ_{i∈T} w_i ≤ B.
The problem is known to be NP-hard.

Assumptions
◦ w_i ≤ B for every i (every item can be added to T).
◦ v_i ≥ 0 for every i (non-negative values).
◦ Values, weights, and the bound are all integers.
Note: this is a maximization problem.
Define: OPT = the value of an optimal solution.
We will see a 2-approximation for two versions of knapsack.

Uniform Knapsack (simple form, value = weight)
2-approximation algorithm:
◦ Sort the items such that v_1 ≥ v_2 ≥ … ≥ v_n.
◦ Pick j such that Σ_{i=1}^{j} v_i ≤ B and Σ_{i=1}^{j+1} v_i > B, and output the prefix {1, …, j}. (If all n items fit, take them all.)
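A minimal Python sketch of this greedy prefix rule, assuming positive integer values with every single value at most B (the function name and the tiny instance at the bottom are made up for illustration):

```python
def uniform_knapsack_2approx(values, bound):
    """Greedy prefix rule for uniform knapsack (value = weight)."""
    # Sort item indices by value, largest first.
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    chosen, total = [], 0
    for i in order:
        if total + values[i] > bound:
            break              # this is item j+1: the first item that no longer fits
        chosen.append(i)
        total += values[i]
    return chosen


# Made-up instance: B = 10, values (7, 5, 4, 1).
# The greedy prefix is {7} with value 7; the optimum {5, 4, 1} has value 10,
# so the greedy value is within a factor of 2, as the claim on the next slide shows.
print(uniform_knapsack_2approx([7, 5, 4, 1], 10))   # -> [0]
```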

Uniform Knapsack (proof)
Claim: Σ_{i=1}^{j} v_i ≥ OPT/2.
Proof:
◦ Assume (by contradiction): Σ_{i=1}^{j} v_i < OPT/2.
◦ By the algorithm: Σ_{i=1}^{j+1} v_i > B ≥ OPT (since value = weight, any feasible solution has value at most B), so v_{j+1} > OPT − Σ_{i=1}^{j} v_i > OPT/2.
◦ But v_{j+1} ≤ v_1 ≤ Σ_{i=1}^{j} v_i < OPT/2 (since the items were sorted), a contradiction. ∎

2-approximation (general knapsack)
Define the ratio r_i = v_i / w_i.
Sort the items such that r_1 ≥ r_2 ≥ … ≥ r_n.
Pick j such that Σ_{i=1}^{j} w_i ≤ B and Σ_{i=1}^{j+1} w_i > B, and let A = {1, …, j}.
If v_{j+1} > Σ_{i∈A} v_i, return {j+1}; else return A.
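A minimal Python sketch of this rule, assuming non-negative values, positive weights, and w_i ≤ B for every item (the function name is illustrative, not from the slides):

```python
def knapsack_2approx(values, weights, bound):
    """Greedy-by-ratio 2-approximation for 0/1 knapsack."""
    # Sort item indices by value/weight ratio, largest first.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    A, weight_A, value_A = [], 0, 0
    breaker = None
    for i in order:
        if weight_A + weights[i] <= bound:
            A.append(i)
            weight_A += weights[i]
            value_A += values[i]
        else:
            breaker = i        # this is item j+1 in the slide's notation
            break
    # Return whichever of A and {j+1} has the larger total value.
    if breaker is not None and values[breaker] > value_A:
        return [breaker]
    return A
```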

2-approximation: try it yourself…

2-approximation (general knapsack)
Claim: the returned set has value at least OPT/2.
Proof sketch (fill in the details yourselves): consider the fractional knapsack (where we can take a fraction of each item). Its optimal solution is A' = A plus a fraction of item j+1.
◦ OPT ≤ OPT_fractional = Σ_{i∈A} v_i + (fraction of item j+1) · v_{j+1} ≤ Σ_{i∈A} v_i + v_{j+1}.
◦ Hence max(Σ_{i∈A} v_i, v_{j+1}) ≥ OPT/2, and the algorithm returns a set achieving this maximum.

A DP algorithm for knapsack
A(i, j) = smallest weight of a subset of objects 1, …, i with a total value of exactly j.
Recurrence: A(i, j) = min{ A(i−1, j), A(i−1, j − v_i) + w_i }.
The table has rows i = 1, …, n and columns j = 0, …, n·v_max, where n·v_max is an upper bound on the optimal profit.

A DP algorithm for knapsack
The result is the maximum j such that A(n, j) ≤ B.
The runtime is O(n²·v_max), which is pseudo-polynomial.
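A self-contained Python sketch of this DP, using the rolled one-dimensional form of the table (A[j] keeps only the current row); the function name is illustrative:

```python
def knapsack_dp(values, weights, bound):
    """Exact 0/1 knapsack: A[j] = minimum weight of a subset of the items
    processed so far whose total value is exactly j (infinity if impossible)."""
    INF = float("inf")
    n = len(values)
    if n == 0:
        return 0
    V = n * max(values)                 # upper bound on the optimal profit
    A = [INF] * (V + 1)
    A[0] = 0
    for i in range(n):
        # scan j downwards so each item is used at most once (0/1 knapsack)
        for j in range(V, values[i] - 1, -1):
            if A[j - values[i]] + weights[i] < A[j]:
                A[j] = A[j - values[i]] + weights[i]
    # the result: the largest value j whose cheapest realisation fits within the bound
    return max(j for j in range(V + 1) if A[j] <= bound)
```

The table has n·v_max + 1 entries and each of the n items touches every entry once, giving the O(n²·v_max) pseudo-polynomial bound from the slide.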

Definitions
Definition (Fully Polynomial Time Approximation Scheme, FPTAS): given ε > 0, the algorithm delivers a solution with a ratio of (1 − ε) for a maximization problem, or (1 + ε) for a minimization problem, and runs in time polynomial in the size of the input and in 1/ε.
Definition (pseudo-polynomial): an algorithm that runs in polynomial time when the input integers are given in unary.

FPTAS for knapsack
The idea: use scaling!
Given an error bound ε > 0, define K = ε·v_max / n.
For each object i, define v'_i = ⌊v_i / K⌋.
Use the DP to find the optimal solution OPT' for the rounded values.
Return the set OPT' with the original values.
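A self-contained Python sketch of the whole scheme; the scaling step follows the slide, while the function name, the two-dimensional table, and the back-tracking step used to recover the chosen set are illustrative choices:

```python
import math

def knapsack_fptas(values, weights, bound, eps):
    """Value-scaling FPTAS for 0/1 knapsack.

    Assumes positive integer values, weights[i] <= bound for every i, and 0 < eps < 1.
    Returns (total original value, chosen indices), with value >= (1 - eps) * OPT.
    """
    n = len(values)
    K = eps * max(values) / n                  # the scale factor K
    v = [math.floor(x / K) for x in values]    # rounded values v'_i
    V = sum(v)                                 # upper bound on the rounded profit
    INF = float("inf")
    # A[i][j] = minimum weight of a subset of items 0..i-1 with rounded value exactly j
    A = [[INF] * (V + 1) for _ in range(n + 1)]
    A[0][0] = 0
    for i in range(1, n + 1):
        for j in range(V + 1):
            A[i][j] = A[i - 1][j]              # option 1: skip item i-1
            if j >= v[i - 1] and A[i - 1][j - v[i - 1]] + weights[i - 1] < A[i][j]:
                A[i][j] = A[i - 1][j - v[i - 1]] + weights[i - 1]   # option 2: take it
    best = max(j for j in range(V + 1) if A[n][j] <= bound)
    # walk back through the table to recover OPT', the optimal set for the rounded values
    chosen, j = [], best
    for i in range(n, 0, -1):
        if A[i][j] != A[i - 1][j]:             # item i-1 must have been taken
            chosen.append(i - 1)
            j -= v[i - 1]
    # report OPT' with the original (unrounded) values
    return sum(values[i] for i in chosen), chosen
```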

Correctness
Claim: v(OPT') ≥ (1 − ε)·OPT.
Proof: For every i: v_i − K ≤ K·v'_i ≤ v_i, since v'_i = ⌊v_i / K⌋.  (1)
OPT' is optimal for the rounded values, so v'(OPT') ≥ v'(OPT).  (2)
Summing (1) over the at most n items of any set S: v(S) − nK ≤ K·v'(S) ≤ v(S).  (3)
nK = ε·v_max ≤ ε·OPT, since the most valuable item alone is a feasible solution.  (4)

Correctness
Proof continued:
v(OPT') ≥ K·v'(OPT')   [right side of (3), applied to OPT']
 ≥ K·v'(OPT)   [by (2)]
 ≥ v(OPT) − nK   [left side of (3), applied to OPT]
 = OPT − ε·v_max   [by the choice of K]
 ≥ (1 − ε)·OPT   [by (4)]. ∎

Complexity and Notes
Time: the DP on the rounded values runs in O(n²·v'_max) = O(n²·⌊v_max/K⌋) = O(n³/ε), which is polynomial in n and in 1/ε, so the algorithm is an FPTAS.
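A quick usage check, assuming the knapsack_dp and knapsack_fptas sketches above are in scope; the instance below is made up purely for illustration:

```python
values = [60, 100, 120, 40]
weights = [2, 4, 6, 3]
bound = 10

print(knapsack_dp(values, weights, bound))              # exact optimum: 220
print(knapsack_fptas(values, weights, bound, eps=0.1))  # at least 0.9 * 220; here (220, [2, 1])
```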