CSC 172 DATA STRUCTURES. DYNAMIC PROGRAMMING TABULATION MEMOIZATION.

Presentation transcript:

CSC 172 DATA STRUCTURES

DYNAMIC PROGRAMMING TABULATION MEMOIZATION

Dynamic Programming  If you can mathematically express a problem recursively, then you can express it as a recursive algorithm.  However, sometimes, this can be inefficiently expressed by a compiler  Fibonacci numbers  To avoid this recursive “explosion” we can use dynamic programming

Fibonacci Numbers

static int F(int x) {
    if (x <= 2) return 1;        // base cases: F(1) = F(2) = 1
    return F(x-1) + F(x-2);      // two recursive calls: exponential blow-up
}

static int[] knownF = new int[maxN];     // maxN: an upper bound on x, declared elsewhere

static int F(int x) {
    if (knownF[x] != 0) return knownF[x];    // already computed: reuse the stored answer
    int t = 1;                               // F(1) = F(2) = 1
    if (x > 2) t = F(x-1) + F(x-2);
    return knownF[x] = t;                    // store the answer before returning it
}
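The slide above is the memoized (top-down) version. For contrast, here is a minimal bottom-up tabulation sketch of the same idea (the name fibTab and the table layout are ours, not from the slides): fill the table from the base cases upward, so no value is ever recomputed.

static int fibTab(int n) {                      // assumes n >= 1
    int[] table = new int[Math.max(n + 1, 3)];
    table[1] = 1;                               // F(1) = F(2) = 1, matching the recursive version
    table[2] = 1;
    for (int i = 3; i <= n; i++)
        table[i] = table[i-1] + table[i-2];     // each entry uses two already-filled entries
    return table[n];
}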

Example Problem: Making Change  For a currency with coin denominations C1, C2, …, Cn (cents), what is the minimum number of coins needed to make K cents of change?  US currency has 1, 5, 10, and 25 cent denominations  Anyone got a 50-cent piece?  We can make 63 cents by using two quarters, one dime & 3 pennies  What if we had a 21 cent piece?

63 cents  25,25,10,1,1,1  Suppose a 21 cent coin?  21,21,21 is optimal

Recursive Solution 1. If we can make change using exactly one coin, then that is a minimum 2. Otherwise for each possible value j compute the minimum number of coins needed to make j cents in change and K – j cents in change independently. Choose the j that minimizes the sum of the two computations.

public static int makeChange (int[] coins, int change) {
    int minCoins = change;                       // worst case: all pennies
    for (int k = 0; k < coins.length; k++)
        if (coins[k] == change) return 1;        // one coin matches exactly
    for (int j = 1; j <= change/2; j++) {        // split the amount into j and change-j
        int thisCoins = makeChange(coins, j) + makeChange(coins, change - j);
        if (thisCoins < minCoins) minCoins = thisCoins;
    }
    return minCoins;
}   // How long will this take?

How many calls?  (diagram: the recursion tree for 63¢. The top-level call branches into the pairs 1¢ & 62¢, 2¢ & 61¢, 3¢ & 60¢, …, 31¢ & 32¢, and each of those subproblems branches the same way, so small amounts like 2¢ are recomputed over and over.)

How many times do you call for 2¢?

Some Solutions
1 (1) & 62 (21,21,10,10)
2 (1,1) & 61 (25,25,10,1)
…
21 (21) & 42 (21,21)
…
31 (21,10) & 32 (21,10,1)

Improvements?  Limit the inner loops to the coins 1 & 21,21,10,10 5 & 25,21,10,1,1 10 & 21,21,10,1 21 & 21,21 25 & 25,10,1,1,1 Still, a recursive branching factor of 5 How many times do we solve for 52 cents?

public static int makeChange (int[] coins, int change) {
    int minCoins = change;                                 // worst case: all pennies
    for (int k = 0; k < coins.length; k++)
        if (coins[k] == change) return 1;                  // one coin matches exactly
    for (int j = 0; j < coins.length; j++) {               // try each denomination as one of the coins
        if (change < coins[j]) continue;
        int thisCoins = 1 + makeChange(coins, change - coins[j]);
        if (thisCoins < minCoins) minCoins = thisCoins;
    }
    return minCoins;
}   // How long will this take?

How many calls?  (diagram: the recursion tree for the improved version with denominations {1,5,10,21,25}. 63¢ branches into 62¢, 58¢, 53¢, 42¢, and 38¢; 62¢ branches into 61¢, 57¢, 52¢, 41¢, and 37¢; and so on. Amounts such as 52¢ still appear in many branches.)

Tabulation aka Dynamic Programming  Build a table of partial results.  The trick is to save answers to the sub-problems in an array.  Use the stored sub-solutions to solve the larger problems

DP for change making  Find optimum solution for 1 cent  Find optimum solution for 2 cents using previous  Find optimum solution for 3 cents using previous  …etc.  At any amount a, for each denomination d, check the minimum coins for the (previously calculated) amount a-d  We can always get from a-d to a with one more coin

public static int makeChange (int[] coins, int differentCoins, int maxChange, int[] coinsUsed, int[] lastCoin) {
    coinsUsed[0] = 0;
    lastCoin[0] = 1;
    for (int cents = 1; cents <= maxChange; cents++) {
        int minCoins = cents;                              // worst case: all pennies
        int newCoin = 1;
        for (int j = 0; j < differentCoins; j++) {
            if (coins[j] > cents) continue;                // coin too big for this amount
            if (coinsUsed[cents - coins[j]] + 1 < minCoins) {
                minCoins = coinsUsed[cents - coins[j]] + 1;
                newCoin = coins[j];
            }
        }
        coinsUsed[cents] = minCoins;                       // best coin count for this amount
        lastCoin[cents] = newCoin;                         // the coin that achieved it
    }
    return coinsUsed[maxChange];                           // minimum number of coins for maxChange
}

Dynamic Programming solution  O(NK)  N denominations  K amount of change  By backtracking through the lastCoin[] array, we can generate the sequence needed for the amount in question.
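A small sketch of that backtracking step (printCoins is our name, not from the slides), assuming lastCoin[] has been filled in by makeChange above:

static void printCoins(int[] lastCoin, int change) {
    while (change > 0) {
        System.out.print(lastCoin[change] + " ");   // the last coin used for this amount
        change -= lastCoin[change];                 // step back to the remaining subproblem
    }
    System.out.println();                           // e.g. for 63 cents with a 21-cent coin: 21 21 21
}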

LONGEST COMMON SUBSEQUENCE  Suppose we have two lists and we want to know the difference between them:  - file systems  - web sites  - DNA sequences

LONGEST COMMON SUBSEQUENCE Consider strings from {a,b,c} What is the LCS of abcabba and cbabac ?

LONGEST COMMON SUBSEQUENCE  Consider strings from {a,b,c}  What is the LCS of abcabba and cbabac?  Two answers of length 4: baba and cbba

(diagram: the strings abcabba and cbabac written one above the other, with the matched characters of the common subsequences baba and cbba highlighted)

Recursive LCS Length  To find the length of an LCS of lists x and y, we need to find the lengths of the LCSs of all pairs of prefixes, one from x and one from y.  Suppose x = (a1, a2, …, am) and y = (b1, b2, …, bn), where i ranges from 0 to m and j ranges from 0 to n.

BASIS: if i+j = 0, then the LCS is empty and L(0,0) = 0.
INDUCTION: Consider i and j, and suppose we have already computed L(g,h) for any g and h such that g+h < i+j. There are 3 cases:
(1) If either i or j is 0, then L(i,j) = 0.
(2) If i>0 and j>0 and ai != bj, then L(i,j) = max(L(i,j-1), L(i-1,j)).
(3) If i>0 and j>0 and ai == bj, then L(i,j) = 1 + L(i-1,j-1).
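Written directly as (deliberately naive) recursive code, the recurrence looks like the sketch below; it assumes 1-indexed characters a[1..m] and b[1..n], and the name lcsLength is ours. The next slide explains why it is too slow.

static int lcsLength(char[] a, char[] b, int i, int j) {
    if (i == 0 || j == 0) return 0;                        // case (1)
    if (a[i] != b[j])                                      // case (2)
        return Math.max(lcsLength(a, b, i, j-1), lcsLength(a, b, i-1, j));
    return 1 + lcsLength(a, b, i-1, j-1);                  // case (3)
}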

Recursive LCS Length  The algorithm works, but is exponential in the smaller of m and n.  If we start with L(3,3) we end up computing L(0,0) twenty times.  We can build a 2D table, store the intermediate results, and get a runtime of O(mn).

Intuitively  (diagram: a table with the characters of cbabac labeling the rows and the characters of abcabba labeling the columns; successive slides fill in the entries row by row, each entry computed from its left, upper, and upper-left neighbors)

for (int j = 0; j <= n; j++)
    L[0][j] = 0;
for (int i = 1; i <= m; i++) {
    L[i][0] = 0;
    for (int j = 1; j <= n; j++)
        if (a[i] != b[j])                        // no match: take the better of the two neighbors
            if (L[i-1][j] >= L[i][j-1])
                L[i][j] = L[i-1][j];
            else
                L[i][j] = L[i][j-1];
        else                                     /* a[i] == b[j] */
            L[i][j] = 1 + L[i-1][j-1];
}
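The loop above computes only the length. One way to recover an actual LCS (not shown on the slides; recoverLCS is our name) is to walk back from L[m][n] using the same three cases, again assuming 1-indexed a and b:

static String recoverLCS(char[] a, char[] b, int[][] L, int m, int n) {
    StringBuilder lcs = new StringBuilder();
    int i = m, j = n;
    while (i > 0 && j > 0) {
        if (a[i] == b[j]) { lcs.append(a[i]); i--; j--; }   // matched character belongs to the LCS
        else if (L[i-1][j] >= L[i][j-1]) i--;               // follow the cell the value came from
        else j--;
    }
    return lcs.reverse().toString();                        // characters were collected back to front
}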