
CS 312: Algorithm Design & Analysis
Lecture #26: 0/1 Knapsack
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.
Slides by: Eric Ringger, with contributions from Mike Jones, Eric Mercer, Sean Warnick

Announcements
- Homework #17: due now
- Project #5 (Gene Sequence Alignment)
  - WB Experience: due Wednesday
  - Early: next Monday
  - Due: next Wednesday
- Mid-term exam
  - Review: Wednesday
  - Exam days: Thu., Fri., Sat.
  - In Testing Center

Objectives
- Contrast 0/1 Knapsack with Divisible Knapsack
- Apply the DP methodology to 0/1 Knapsack
- Analyze efficiency

Divisible Knapsack Problem
Given n objects, where object i has weight w_i and value v_i, a knapsack of capacity W, and the freedom to take any fraction of an object: how would you formulate this problem mathematically?

Divisible Knapsack Problem
Fix an order of the objects, 1 through n. Let w_1, ..., w_n denote the corresponding object weights and v_1, ..., v_n the corresponding object values, and let x_i denote the fraction of object i placed in the knapsack (x_i = 0 means take none of it; x_i = 1 means take all of it).

Divisible Knapsack Problem
The problem is formulated mathematically as:

    maximize    v_1·x_1 + v_2·x_2 + ... + v_n·x_n
    subject to  w_1·x_1 + w_2·x_2 + ... + w_n·x_n ≤ W
                0 ≤ x_i ≤ 1    for i = 1, ..., n

Divisible Knapsack Problem Problems of this form are called Linear Programs. How to solve?
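
Not on the original slides, but a natural answer to "How to solve?": the divisible version is solved exactly by a greedy strategy that takes objects in decreasing order of value per unit weight (it could also be handed to a general linear-programming solver). A minimal Python sketch, with hypothetical item data:

def divisible_knapsack(weights, values, capacity):
    """Greedy solution to the divisible (fractional) knapsack problem."""
    # Consider objects in decreasing order of value per unit weight.
    order = sorted(range(len(weights)), key=lambda i: values[i] / weights[i], reverse=True)
    total_value = 0.0
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        take = min(weights[i], remaining)               # take as much of object i as fits
        total_value += values[i] * (take / weights[i])
        remaining -= take
    return total_value

if __name__ == "__main__":
    # Hypothetical example: three objects, capacity 10 -> 50 + 28 + (1/6)*30 = 83.0
    print(divisible_knapsack([5, 4, 6], [50, 28, 30], 10))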

0/1 Knapsack
- Still looking for optimal loads
- Still constrained by weight capacity
- BUT we cannot divide objects into fractions
- Will a greedy approach find an optimal solution every time?
  - No. There is no universally optimal selection function (a small counterexample follows below).
- Other solution ideas?
  - Try all combinations
  - DP
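
To back up the "No" above, here is a small counterexample (not from the slides; the object data is hypothetical) showing that greedy-by-density can miss the optimal 0/1 load:

def greedy_by_density(weights, values, capacity):
    """Take whole objects in decreasing value/weight order while they still fit."""
    order = sorted(range(len(weights)), key=lambda i: values[i] / weights[i], reverse=True)
    total, remaining = 0, capacity
    for i in order:
        if weights[i] <= remaining:
            total += values[i]
            remaining -= weights[i]
    return total

if __name__ == "__main__":
    # Hypothetical counterexample: capacity 10, objects (weight, value) = (6, 30), (5, 20), (5, 20).
    # Greedy grabs the densest object (6, 30); nothing else fits, giving 30.
    # The optimal load is the two (5, 20) objects, giving 40.
    print(greedy_by_density([6, 5, 5], [30, 20, 20], 10))  # 30, not the optimal 40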

Naïve Solution
- Try each combination of the n objects and see which one gives the best answer.
- There are 2^n combinations, so this takes O(2^n) time.
- Can we do better?
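
For concreteness (not from the slides), a minimal brute-force sketch that tries all 2^n subsets; the item data is hypothetical:

from itertools import combinations

def knapsack_brute_force(weights, values, capacity):
    """Try every subset of the n objects and keep the best feasible one: Theta(2^n) subsets."""
    n = len(weights)
    best = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if sum(weights[i] for i in subset) <= capacity:
                best = max(best, sum(values[i] for i in subset))
    return best

if __name__ == "__main__":
    # Hypothetical example: the answer is 7 (the objects with weights 2 and 3).
    print(knapsack_brute_force([2, 3, 4], [3, 4, 5], 5))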

Dynamic Programming
1. Ask: Am I solving an optimization problem?
2. Devise a minimal description (address) for any problem instance and sub-problem.
3. Divide problems into sub-problems: define the recurrence to specify the relationship of problems to sub-problems.
4. Check that the optimality property holds: an optimal solution to a problem is built from optimal solutions to sub-problems.
5. Store results (typically in a table) and re-use the solutions to sub-problems in the table as you build up to the overall solution.
6. Back-trace / analyze the table to extract the composition of the final solution.

Optimization? (Step #1)
Yes: we are choosing a load to maximize total value subject to the knapsack's weight capacity, so this is an optimization problem and the DP methodology applies.

0/1 Knapsack Sub-Problems (Step #2)
- How should we define the sub-problems?
- What if we use the same idea as the DP coins algorithm to inspire our thinking here?
  - Enforce weight limits
- Idea: sub-problems indexed by:
  - i: consider objects up through type i
  - j: consider knapsack with capacity j (up to W)
- Sub-problem: V[i,j] stores the max value for a load of capacity j using objects up through type i
- Final answer: V[n,W]

0/1 Knapsack Sub-Problems (Steps #3 and #4)
- Relationships among sub-problems: to compute V[i,j], we can either include an object of type i or not.
  - If we do not include one, then V[i,j] = V[i-1,j]
  - If we do include one (possible only if w[i] ≤ j), then V[i,j] = V[i-1,j-w[i]] + v[i]
- Optimality property: an optimal load for (i, j) is built from an optimal solution to one of these two sub-problems, so the property holds.
- Initial conditions / boundary cases:
  - For j = 0, V[i,j] = 0
  - For i = 0, V[i,j] = 0
- Summary:
    V[i,j] = max( V[i-1,j], V[i-1,j-w[i]] + v[i] )   if w[i] ≤ j
    V[i,j] = V[i-1,j]                                 otherwise
  with V[0,j] = V[i,0] = 0.
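
A bottom-up Python sketch of this recurrence (Steps #3-#5). It illustrates the table-filling idea rather than serving as the course's reference implementation; the toy data in the usage example is hypothetical:

def knapsack_01(weights, values, capacity):
    """Fill the DP table bottom-up. Object i (1-indexed as on the slides) has
    weight weights[i-1] and value values[i-1]. Returns the full table V, where
    V[i][j] is the best value achievable using objects 1..i with capacity j."""
    n = len(weights)
    V = [[0] * (capacity + 1) for _ in range(n + 1)]    # boundary cases: V[0][j] = V[i][0] = 0
    for i in range(1, n + 1):
        w_i, v_i = weights[i - 1], values[i - 1]
        for j in range(1, capacity + 1):
            V[i][j] = V[i - 1][j]                        # exclude object i
            if w_i <= j:                                 # include object i only if it fits
                V[i][j] = max(V[i][j], V[i - 1][j - w_i] + v_i)
    return V

if __name__ == "__main__":
    # Hypothetical toy data: 3 objects, capacity 5; the best load is the first two objects (value 7).
    print(knapsack_01([2, 3, 4], [3, 4, 5], 5)[3][5])    # 7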


0/1 Knapsack: Larger Example
Remember: entry V[i,j] represents the maximum value you can get from objects 1 through i with weight capacity j.
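
The example table image itself did not survive, but it can be rebuilt. Note the assumptions: W = 11 and objects 3-5 (weights 5, 6, 7 with values 18, 22, 28) are taken from the back-trace steps on the following slides, while objects 1 and 2 (weight 1/value 1 and weight 2/value 6) are assumed values chosen to be consistent with every number in that trace:

# W = 11 and objects 3-5 are stated in the back-trace; objects 1 and 2 are assumed.
weights = [1, 2, 5, 6, 7]
values = [1, 6, 18, 22, 28]
W = 11

n = len(weights)
V = [[0] * (W + 1) for _ in range(n + 1)]
for i in range(1, n + 1):
    for j in range(1, W + 1):
        V[i][j] = V[i - 1][j]                                      # exclude object i
        if weights[i - 1] <= j:                                    # include object i if it fits
            V[i][j] = max(V[i][j], V[i - 1][j - weights[i - 1]] + values[i - 1])

for i, row in enumerate(V):
    print(f"i={i}: {row}")      # row i holds V[i][0..11]
print(V[n][W])                  # 40, matching V[5,11] on the slides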

Extracting Solution Components (Step #6)
Extracting the solution: start with V[5,11] (which means "use objects 1-5 to get the best solution with weight capacity 11"). V[4,11] = V[5,11] and V[4,11-7]+28 ≠ 40, so do not include object 5.

Extracting Solution Components
Next, try V[4,11], which means "use objects 1-4 to get the best solution with weight capacity 11". V[4,11] ≠ V[3,11] and V[3,11-6]+22 = V[3,5]+22 = 40, so do include object 4.

Extracting Solution Components
Next, try V[3,5], which means "use objects 1-3 to get the best solution with weight capacity 5". V[3,5] ≠ V[2,5] and V[2,5-5]+18 = V[2,0]+18 = 18 = V[3,5], so do include object 3. The capacity is now exhausted, so the back-trace stops: the optimal load is objects 3 and 4, with total value 40.

Summary: Extracting Solution Components
Interpret the blue arrows: each arrow points from an entry V[i,j] back to the sub-problem entry its value came from, namely V[i-1,j] when object i is excluded and V[i-1,j-w[i]] when object i is included.

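
A Python sketch of the back-trace (Step #6). It walks the filled table exactly as the slides do: whenever V[i][j] differs from V[i-1][j], object i was included and the remaining capacity shrinks by w[i]. The usage example reuses the partly inferred data from above:

def backtrack(V, weights, capacity):
    """Walk the filled table from V[n][capacity] back toward row 0, recording the
    objects that were included: if V[i][j] == V[i-1][j], object i was skipped;
    otherwise it was included and the capacity drops by its weight."""
    chosen = []
    j = capacity
    for i in range(len(weights), 0, -1):
        if V[i][j] != V[i - 1][j]:        # object i must have been included
            chosen.append(i)              # object numbers are 1-indexed, as on the slides
            j -= weights[i - 1]
    return sorted(chosen)

if __name__ == "__main__":
    # Same partly inferred example data as above: W = 11, expected answer objects [3, 4] with value 40.
    weights, values, W = [1, 2, 5, 6, 7], [1, 6, 18, 22, 28], 11
    V = [[0] * (W + 1) for _ in range(len(weights) + 1)]
    for i in range(1, len(weights) + 1):
        for j in range(1, W + 1):
            V[i][j] = V[i - 1][j]
            if weights[i - 1] <= j:
                V[i][j] = max(V[i][j], V[i - 1][j - weights[i - 1]] + values[i - 1])
    print(backtrack(V, weights, W))       # [3, 4]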

How long does it take?

Efficiency
The table has (n+1)·(W+1) entries, and each entry is computed in constant time from the recurrence, so filling the table takes O(nW) time and O(nW) space; the back-trace adds only O(n) more work. Note that this is pseudo-polynomial: the bound depends on the numeric value of W, not just on the number of bits needed to write the input down.

0/1 Knapsack on-the-fly, Visualized with the Table

0/1 Knapsack on-the-fly
The recursion expands V(5,11) only into the sub-problems it actually needs:
  V(5,11)
    V(4,11)
      V(3,11), V(3,5)+22
    V(4,11-w[5])+v[5] = V(4,7)+28
      V(3,7), V(3,1)+22
  ... and so on down through V(2,11), V(1,11), ...
It's a DAG, constructed in a constrained (top-down) fashion, and shared sub-problems are computed only once.
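
A top-down ("on-the-fly") sketch using memoization, so only the sub-problems reachable from V(n, W) are ever computed and each shared sub-problem is solved once; again, the example data is the partly inferred set from above:

from functools import lru_cache

def knapsack_01_memoized(weights, values, capacity):
    """Top-down ('on-the-fly') 0/1 knapsack: compute V(i, j) only when it is needed,
    caching results so each distinct sub-problem is solved exactly once."""
    @lru_cache(maxsize=None)
    def V(i, j):
        if i == 0 or j <= 0:                             # boundary cases
            return 0
        best = V(i - 1, j)                               # exclude object i
        if weights[i - 1] <= j:                          # include object i if it fits
            best = max(best, V(i - 1, j - weights[i - 1]) + values[i - 1])
        return best

    return V(len(weights), capacity)

if __name__ == "__main__":
    # Same partly inferred example data as above: expected value 40.
    print(knapsack_01_memoized([1, 2, 5, 6, 7], [1, 6, 18, 22, 28], 11))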

Assignment
- Homework #16.5
  - 0/1 Knapsack Example
  - New DP problem!