CS 312: Algorithm Design & Analysis
Lecture #23: Making Optimal Change with Dynamic Programming
Slides by: Eric Ringger, with contributions from Mike Jones, Eric Mercer, Sean Warnick
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.

Announcements
- Project #4: “Intelligent scissors”
  - Due: today
- Project #5: Gene Sequence Alignment
  - Begin discussing main ideas on Wednesday
- Reading: worth your time
- Mid-term Exam: coming up next week

Objectives
- Use the Dynamic Programming strategy to solve another example problem: “the coins problem”
- Extract the composition of a solution from a DP table

The Coins Problem: Making Change

DP Strategy: From Problem to Table to Algorithm
1. Start with a problem definition.
2. Devise a minimal description (address) for any problem instance and sub-problem.
3. Define a recurrence to specify the relationship of problems to sub-problems
   - i.e., define the conceptual DAG on sub-problems.
4. Embed the DAG in a table
   - Use the address as indexes
   - e.g., in the 2-D case, index rows and columns.
5. Two possible strategies:
   1. Fill in the sub-problem cells, proceeding from the smallest to the largest.
   2. Draw the DAG in the table from the top problem down to the smallest sub-problems; solve the relevant sub-problems in their table cells, from smallest to largest.
      - Equivalently: solve the top problem instance recursively, using the table as a memory function.

Making Optimal Change using DP

Recurrence
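The recurrence itself appears to have been lost from the transcript (it was likely shown as a figure). The following reconstruction is inferred from the pseudocode later in the deck: C(i, j) is the fewest coins needed to make amount j using only the first i denominations d[1..m], with d[1] = 1 assumed so that change can always be made.

\[
C(i, j) =
\begin{cases}
0 & \text{if } j = 0 \\
j & \text{if } i = 1 \\
\min\bigl( C(i-1, j),\ 1 + C(i, j - d_i) \bigr) & \text{if } i > 1 \text{ and } j \ge d_i \\
C(i-1, j) & \text{if } i > 1 \text{ and } j < d_i
\end{cases}
\]

The answer to the full problem is C(m, J).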

Next Steps Using DP
- Design the table to contain the needed sub-problem results
- Design the DP algorithm to walk the table and apply the recurrence relation
- Solve an example by running the DP algorithm
- Modify the resulting algorithm for space and time, if necessary

Example

[The DP table for the example: columns j = amounts 0 through 7; rows i = denominations senine = 1, seon = 2, shum = 4, limnah = 7. The table starts empty.]

Making Change  [The DP table shown again, ready to be filled in row by row.]

Making Change  How does one compute C(2,2)?  [The table is shown with the senine row complete and the seon row filled through C(2,1); cell C(2,2) is marked with a “?”.]

Making Change  How does one compute C(2,2)?  [C(2,2) = 1 is entered, reached by the “+1” step from C(2,0): take one seon.]

Making Change  How does one compute C(2,3)?  [C(2,3) = 2 is entered, reached by the “+1” step from C(2,1).]
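For concreteness, here is how the recurrence (as reconstructed above) yields these two entries, using the denominations senine = 1 and seon = 2:

\[
C(2, 2) = \min\bigl( C(1, 2),\ 1 + C(2, 0) \bigr) = \min(2,\ 1) = 1
\]
\[
C(2, 3) = \min\bigl( C(1, 3),\ 1 + C(2, 1) \bigr) = \min(3,\ 2) = 2
\]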

Making Change  [The seon row continues to fill using the same “+1” step.]

Making Change  [The senine and seon rows are now complete; the seon row reads 0, 1, 1, 2, 2, 3, 3, 4.]

Making Change  [The shum row begins: for j < 4 the values 0, 1, 1, 2 are copied down from the seon row.]

Making Change  [The shum row is complete: 0, 1, 1, 2, 1, 2, 2, 3.]

Making Change  [The limnah row fills in the same way.]

Making Change  [The completed DP table (rows indexed by i, columns by the amount j):]

              j:  0  1  2  3  4  5  6  7
  senine = 1      0  1  2  3  4  5  6  7
  seon   = 2      0  1  1  2  2  3  3  4
  shum   = 4      0  1  1  2  1  2  2  3
  limnah = 7      0  1  1  2  1  2  2  1

Question
- Which coins?
- Extract the composition of a solution from the table.

Extracting a Solution: C(4,7)  [The completed table is shown.]  C(4,7) = C(4, 7-7) + 1, so include a limnah; move to C(4, 7-7).

Extracting a Solution: C(3,7)  [The completed table is shown.]  C(3,7) = C(3, 7-4) + 1: give a shum and move left. C(2,3) = C(2,1) + 1: give a seon and move left. C(1,1) = C(1,0) + 1: give a senine and move left. Can you think of other ways to extract a solution?
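One possible answer: walk the filled table from the target cell, comparing each entry with the one directly above it. The following is a minimal Python sketch, not from the original slides; it assumes a 0-based table c[i][j] built from the recurrence above, a 0-based list of denominations d, and the hypothetical function name extract_coins.

def extract_coins(c, d, j):
    """Recover one optimal multiset of coins for amount j from a filled DP table.

    c[i][j] = fewest coins to make amount j using only denominations d[0..i].
    """
    coins_used = []
    i = len(d) - 1
    while j > 0:
        if i > 0 and c[i - 1][j] == c[i][j]:
            # Dropping denomination d[i] costs nothing, so move up one row.
            i -= 1
        else:
            # Otherwise an optimal solution takes a coin of d[i]: record it and move left.
            coins_used.append(d[i])
            j -= d[i]
    return coins_used

On the example table, starting from c[3][7] this returns [7] (one limnah); restricting to the first three denominations (c[:3], d[:3]) it returns [4, 2, 1], matching the walk above.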

Efficiency  [The completed DP table is shown again: m rows by J + 1 columns.]

Algorithm in Pseudo-code

function coins(d, J)
  Input: array d[1..m] specifies the denominations; J is the balance for which to make change
  Output: minimum number of coins needed to make change for J units using coins from d

  array c[1..m, 0..J]
  for i = 1 to m do
      c[i, 0] = 0
      for j = 1 to J do
          if i < 1 then c[i, j] = 0
          else if i = 1 then c[i, j] = j / d[i]
          else if i > 1 && j >= d[i] then c[i, j] = min(c[i-1, j], 1 + c[i, j - d[i]])
          else if i > 1 && 0 < j < d[i] then c[i, j] = c[i-1, j]
          else if i > 1 && j <= 0 then c[i, j] = 0
  return c[m, J]

Easy translation from table + recurrence to pseudo-code!
How much space? How much time?
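A minimal runnable Python version of this pseudocode, as a sketch rather than the lecture's official code: it uses 0-based indexing, drops the two branches the loops can never reach (i < 1 and j <= 0), and assumes d[0] = 1 as in the i = 1 case above.

def coins(d, J):
    """Bottom-up DP: fewest coins from denominations d needed to make change for J.

    Assumes d[0] == 1 so that change can always be made.
    """
    m = len(d)
    # c[i][j] = fewest coins to make amount j using only d[0..i]
    c = [[0] * (J + 1) for _ in range(m)]
    for i in range(m):
        for j in range(1, J + 1):
            if i == 0:
                c[i][j] = j // d[0]                          # only the unit coin is available
            elif j >= d[i]:
                c[i][j] = min(c[i - 1][j], 1 + c[i][j - d[i]])
            else:
                c[i][j] = c[i - 1][j]                        # d[i] is too large for amount j
    return c[m - 1][J]

print(coins([1, 2, 4, 7], 7))   # expected: 1 (a single limnah)

The table has m(J + 1) cells and each is filled in constant time, which suggests Θ(mJ) time and space as an answer to the questions on the slide.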

Pre-computing vs. on-the-fly
- Eager: pre-computing
  - Fill the table bottom-up, then extract a solution
- Lazy: on-demand
  - Build the DAG (in the table) top-down
  - Solve only the necessary table entries (from bottom-up or top-down)
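A sketch of the lazy, on-demand version under the same assumptions, using Python's functools.lru_cache as the memory function (the name coins_topdown is mine, not the lecture's):

from functools import lru_cache

def coins_topdown(d, J):
    """Top-down DP: compute only the table entries actually reachable from (m-1, J)."""
    @lru_cache(maxsize=None)
    def c(i, j):
        if j == 0:
            return 0                                   # no coins needed for amount 0
        if i == 0:
            return j // d[0]                           # assumes d[0] == 1
        if j >= d[i]:
            return min(c(i - 1, j), 1 + c(i, j - d[i]))
        return c(i - 1, j)                             # d[i] is too large for amount j
    return c(len(d) - 1, J)

print(coins_topdown([1, 2, 4, 7], 7))   # expected: 1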

Making Change: Top Down  [The DP table is shown.]  Top down: start at the bottom right and make recursive calls. Save time by storing every intermediate value.

Making Change: Top Down  [The DP table is shown after the top-down computation.]  How much space and time did this require?

Comparison
- How does the DP algorithm for making change compare to the greedy algorithm?
  - speed
  - space
  - correctness
  - simplicity
  - optimality
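The slide leaves these as discussion questions. On the correctness/optimality axis, one concrete observation is that a greedy strategy is not optimal for every denomination system. The sketch below illustrates this with the classic counterexample d = {1, 3, 4} and amount 6 (my choice of example, not necessarily the lecture's), reusing the coins function from the sketch after the pseudocode slide:

def greedy_coins(d, J):
    """Greedy: repeatedly take the largest denomination that still fits."""
    count = 0
    for coin in sorted(d, reverse=True):
        count += J // coin
        J %= coin
    return count

print(greedy_coins([1, 3, 4], 6))   # 3 coins: 4 + 1 + 1
print(coins([1, 3, 4], 6))          # 2 coins: 3 + 3 (the DP optimum)

Greedy is faster and needs only constant extra space, but only the DP solution is guaranteed optimal for arbitrary denominations.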

Assignment
- HW #15
- Read 6.3
- Read Project #5 Instructions