The Knapsack Problem.

The Knapsack Problem

The Knapsack Problem
Given: a knapsack of capacity M, and N types of indivisible items (an unlimited number of each type) with values V1, V2, ..., VN.
Problem: What selection of items fits in the knapsack while maximizing the total value of its contents?

The Knapsack Problem

          Type A   Type B   Type C
Value       100      76       54
Weight        5       4        3

[Figure: decision tree for a knapsack of capacity 8. Each branch adds one item (weight 5, 4, or 3); the leaves show total values 154, 130, 152, 108, 130, and 154.]

Note: For each node in this tree, we have a set of possible decisions. Each decision has a cost (its weight) and leads to an associated yield. The goal is to find a sequence of decisions that leads to an optimal solution. The number of possible solutions grows exponentially with M, so a brute-force approach would have to generate them all and then choose the very best.

The Knapsack Problem
The recursive nature of the problem jumps out at us when we observe the decision tree. The problem has optimal substructure and overlapping sub-problems, so it is solvable with dynamic programming. What we have to figure out is how to map the problem onto a data structure that stores the solution to each sub-problem as the tree is traversed.
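
Written as a recurrence on the remaining capacity (a restatement of the pseudocode on the next slides, using its size and val arrays), the sub-problem structure is:

    knap(m) = 0                                                          if no item type fits in capacity m
    knap(m) = max over i with size[i] <= m of ( val[i] + knap(m - size[i]) )   otherwise

The answer to the original problem is knap(M). Since every sub-problem is identified by a single remaining capacity between 0 and M, one array indexed by capacity (the maxKnown array of the final slide) is enough to store all sub-problem solutions.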

The Knapsack Problem
Example: N = 3, M = 27

          Type A   Type B   Type C
Value       100      76       54
Weight        5       4        3

[Figure: partial decision tree for capacity 27. From the root (value 0, capacity 27), taking a Type A, B, or C item leads to values 100, 76, 54 with remaining capacities 22, 23, 24; repeatedly taking Type A items continues one branch through values 200, 300, 400, 500 with remaining capacities 17, 12, 7, 2.]

Question: How many possibilities have to be tested if you use a brute-force approach?
Question: Can you find overlapping sub-problems?
Question: Can you see the problem's optimal substructure?
Question: Can you see what the optimal gain is for N=3, M=27?
Question: What is the strategy that you follow to get to the optimal solution?
Question: What kind of data structure is needed to apply DP to this problem?

The Knapsack Problem (recursive solution)

knap(M)
    max = 0
    for i = 1 to N do                  // loop through item types
        // solve the problem assuming we include an item of type i
        spaceLeft = M - size[i]
        if spaceLeft >= 0 then         // if type i fits
            // compute candidate solution t
            t = knap(spaceLeft) + val[i]
            if t > max then max = t
    return max
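
As a concrete reference, here is a direct Python transcription of this recursive solution. It is a minimal sketch; the example calls reuse the item data from the earlier slides (types A, B, C with weights 5, 4, 3 and values 100, 76, 54).

def knap(M, size, val):
    # Recursive, exponential-time unbounded knapsack, mirroring the pseudocode above.
    best = 0
    for i in range(len(size)):          # loop through item types
        space_left = M - size[i]
        if space_left >= 0:             # item type i fits
            t = knap(space_left, size, val) + val[i]
            if t > best:
                best = t
    return best

print(knap(8, [5, 4, 3], [100, 76, 54]))    # 154, matching the best leaf of the decision tree
print(knap(27, [5, 4, 3], [100, 76, 54]))   # 530

Note that the same sub-problems (the smaller remaining capacities) are recomputed many times; avoiding that recomputation is exactly what the memoized version on the next slide does.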

The Knapsack Problem (DP solution)

knap(M)
    if maxKnown[M] != unknown then
        return maxKnown[M]
    // otherwise, result not yet known:
    max = 0
    for i = 1 to N do                  // try each item type
        spaceLeft = M - size[i]
        if spaceLeft >= 0 then         // if item type i fits
            // compute candidate solution t
            t = knap(spaceLeft) + val[i]
            if t > max then
                max = t
                maxi = i
    maxKnown[M] = max                  // memoize result
    return max
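
A memoized Python version of the same idea, as a sketch (the dictionary used as the memo and the optional memo parameter are illustrative choices, not part of the original pseudocode):

def knap_dp(M, size, val, memo=None):
    # Memoized unbounded knapsack: each capacity in 0..M is solved at most once.
    if memo is None:
        memo = {}
    if M in memo:                       # result already known
        return memo[M]
    best = 0
    for i in range(len(size)):          # try each item type
        space_left = M - size[i]
        if space_left >= 0:             # item type i fits
            t = knap_dp(space_left, size, val, memo) + val[i]
            if t > best:
                best = t
    memo[M] = best                      # memoize result
    return best

print(knap_dp(27, [5, 4, 3], [100, 76, 54]))   # 530

Because each capacity value is computed only once and each computation loops over the N item types, the memoized version runs in O(N * M) time, in contrast to the exponential running time of the plain recursive version.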