Introduction to Algorithms: Dynamic Programming


CS 421 - Introduction to Algorithms
Dynamic Programming
Longest common subsequence; optimal substructure; overlapping subproblems

Dynamic Programming
Design technique, like divide-and-conquer.
Example: Longest Common Subsequence (LCS)
Given two sequences x[1 . . m] and y[1 . . n], find a longest subsequence common to them both ("a" longest, not "the" longest: it need not be unique).
x: A B C B D A B
y: B D C A B A
BCBA = LCS(x, y)
(Functional notation, but LCS is not a function: x and y may have several common subsequences of maximum length.)

Brute-Force LCS Algorithm
Check every subsequence of x[1 . . m] to see if it is also a subsequence of y[1 . . n].
Analysis:
Checking takes O(n) time per subsequence.
There are 2^m subsequences of x (each bit vector of length m determines a distinct subsequence of x).
Worst-case running time = O(n · 2^m) = exponential time.
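To make the brute-force idea concrete, here is a minimal Python sketch (the function names are mine, not from the slides): it enumerates subsequences of x longest-first, via the bit-vector/index-set correspondence above, and returns the first one that is also a subsequence of y.

```python
from itertools import combinations

def is_subsequence(z, y):
    """Check whether z is a subsequence of y in O(|y|) time."""
    it = iter(y)
    # `ch in it` advances the iterator until ch is found (or it is exhausted),
    # so the characters of z must appear in y in order.
    return all(ch in it for ch in z)

def brute_force_lcs(x, y):
    """Try every subsequence of x, longest first: 2^m candidates in total."""
    for k in range(len(x), -1, -1):
        for idx in combinations(range(len(x)), k):
            z = "".join(x[i] for i in idx)
            if is_subsequence(z, y):
                return z
    return ""
```

On the slides' example this finds a length-4 answer; the point is the 2^m enumeration, not efficiency.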

Towards a Better Algorithm
Simplification: First compute only the length of a longest common subsequence; then extend the algorithm to find the LCS itself.
Notation: Denote the length of a sequence s by |s|.
Strategy: Consider prefixes of x and y.
Define c[i, j] = |LCS(x[1 . . i], y[1 . . j])|.
Then c[m, n] = |LCS(x, y)|.

Recursive Formulation
Theorem.
c[i, j] = c[i–1, j–1] + 1                  if x[i] = y[j],
c[i, j] = max{c[i–1, j], c[i, j–1]}        otherwise.
Proof. Case x[i] = y[j]: the prefixes x[1 . . i] and y[1 . . j] end in the same symbol.
Let z[1 . . k] = LCS(x[1 . . i], y[1 . . j]), where c[i, j] = k. Then z[k] = x[i] = y[j], or else z could be extended. Thus, z[1 . . k–1] is a common subsequence of x[1 . . i–1] and y[1 . . j–1].

Proof (cont'd)
Claim: z[1 . . k–1] = LCS(x[1 . . i–1], y[1 . . j–1]).
Suppose w is a longer common subsequence of x[1 . . i–1] and y[1 . . j–1], that is, |w| > k–1. Then, cut and paste: w || z[k] (w concatenated with z[k]) is a common subsequence of x[1 . . i] and y[1 . . j] with |w || z[k]| > k. Contradiction, proving the claim.
Thus, c[i–1, j–1] = k–1, which implies that c[i, j] = c[i–1, j–1] + 1.
The other cases are similar.

Dynamic-Programming Hallmark #1: Optimal Substructure
An optimal solution to a problem (instance) contains optimal solutions to subproblems.
For LCS: if z = LCS(x, y), then any prefix of z is an LCS of a prefix of x and a prefix of y.

Recursive Algorithm for LCS
LCS(x, y, i, j)
  if x[i] = y[j]
    then c[i, j] ← LCS(x, y, i–1, j–1) + 1
    else c[i, j] ← max{LCS(x, y, i–1, j), LCS(x, y, i, j–1)}
Worst case: x[i] ≠ y[j], in which case the algorithm evaluates two subproblems, each with only one parameter decremented.
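A direct Python transcription of this recursive algorithm, computing lengths only, might look like the following sketch (Python strings are 0-indexed, so the slides' x[i] becomes x[i - 1]):

```python
def lcs_length_naive(x, y, i=None, j=None):
    """Direct transcription of the recurrence; exponential time
    without memoization. i and j are prefix lengths of x and y."""
    if i is None:
        i, j = len(x), len(y)
    if i == 0 or j == 0:           # an empty prefix has LCS length 0
        return 0
    if x[i - 1] == y[j - 1]:       # case x[i] = y[j]
        return lcs_length_naive(x, y, i - 1, j - 1) + 1
    return max(lcs_length_naive(x, y, i - 1, j),   # drop last symbol of x
               lcs_length_naive(x, y, i, j - 1))   # drop last symbol of y
```

This is correct but, as the next slides show, repeats the same subproblems exponentially often.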

Recursion Tree
Let m = 3, n = 4:

                    3,4
               /         \
            2,4           3,3
           /    \        /    \
        1,4     2,3    2,3     3,2
               /   \   /   \
            1,3  2,2 1,3   2,2

Height = m + n, so the work is potentially exponential. But the tree contains the same subproblem many times: for example, the subtree rooted at 2,3 appears twice.

Dynamic-Programming Hallmark #2: Overlapping Subproblems
A recursive solution contains a "small" number of distinct subproblems repeated many times.
For LCS: the number of distinct subproblems for two strings of lengths m and n is only mn.
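The overlap can be quantified with a small counting sketch (a model of my own, not from the slides): it follows the worst case of the recursion, always branching into both subproblems, and tallies how often each pair (i, j) is visited.

```python
from collections import Counter

def count_subproblem_hits(m, n):
    """Tally how often each subproblem (i, j) is visited when the
    naive recursion always takes the mismatch branch (worst case)."""
    hits = Counter()

    def visit(i, j):
        hits[(i, j)] += 1
        if i == 0 or j == 0:
            return
        visit(i - 1, j)    # drop last symbol of x
        visit(i, j - 1)    # drop last symbol of y

    visit(m, n)
    return hits
```

For m = 3, n = 4 (the recursion tree above) this visits only 19 distinct pairs but makes 69 calls in total, and the gap widens exponentially with m + n.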

Memoization Algorithm
Memoization: After computing a solution to a subproblem, store it in a table. Subsequent calls check the table to avoid redoing work.
LCS(x, y, i, j)
  if c[i, j] = NIL
    then if x[i] = y[j]
      then c[i, j] ← LCS(x, y, i–1, j–1) + 1                    ⊳ same as before
      else c[i, j] ← max{LCS(x, y, i–1, j), LCS(x, y, i, j–1)}
Time = Θ(mn), since each table entry costs constant work. Space = Θ(mn).
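In Python, the same memoization can be sketched with functools.lru_cache standing in for the table c (a sketch, not the slides' pseudocode verbatim):

```python
from functools import lru_cache

def lcs_length_memo(x, y):
    """Memoized recurrence: each (i, j) is computed once,
    giving Theta(mn) time and Theta(mn) space."""
    @lru_cache(maxsize=None)      # the cache plays the role of c[i, j]
    def c(i, j):
        if i == 0 or j == 0:
            return 0
        if x[i - 1] == y[j - 1]:
            return c(i - 1, j - 1) + 1
        return max(c(i - 1, j), c(i, j - 1))

    return c(len(x), len(y))
```

The recurrence itself is unchanged; only the repeated work disappears.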

Dynamic-Programming Algorithm
IDEA: Compute the table bottom-up. Time = Θ(mn). Space = Θ(mn).
Then reconstruct an LCS by tracing backwards from c[m, n].
For x = ABCBDAB (columns) and y = BDCABA (rows):

        A  B  C  B  D  A  B
     0  0  0  0  0  0  0  0
  B  0  0  1  1  1  1  1  1
  D  0  0  1  1  1  2  2  2
  C  0  0  1  2  2  2  2  2
  A  0  1  1  2  2  2  3  3
  B  0  1  2  2  3  3  3  4
  A  0  1  2  2  3  3  4  4

The bottom-right entry is 4 = |LCS(x, y)|; tracing the matches backwards yields an LCS of length 4, e.g. BCBA.
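A bottom-up sketch in Python, including the backwards trace (the tie-breaking choice on equal values is mine; any consistent choice yields some valid LCS):

```python
def lcs(x, y):
    """Fill the c table bottom-up, then trace backwards from c[m][n]
    to recover an actual longest common subsequence."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # row/column 0 stay 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])

    # Trace backwards: a match contributes a character and moves
    # diagonally; otherwise follow whichever neighbor holds the max.
    z = []
    i, j = m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            z.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(z))
```

With this tie-breaking, lcs("ABCBDAB", "BDCABA") returns "BCBA", the answer shown on the slides.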