Richard Anderson Lecture 18 Dynamic Programming

CSE 421 Algorithms Richard Anderson Lecture 18 Dynamic Programming

Announcements

One dimensional dynamic programming: Interval scheduling
Opt[j] = max(Opt[j - 1], w_j + Opt[p(j)])
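As a sketch, the weighted interval scheduling recurrence above can be turned into an iterative algorithm (the function name and the use of Python's bisect to compute p(j) are my own; the arrays are 1-indexed to match the recurrence):

```python
# Weighted interval scheduling via Opt[j] = max(Opt[j-1], w_j + Opt[p(j)]).
# p(j) is the last interval (in finish-time order) that finishes no later
# than interval j starts.
import bisect

def max_weight_schedule(intervals):
    # intervals: list of (start, finish, weight)
    intervals = sorted(intervals, key=lambda t: t[1])  # sort by finish time
    finishes = [f for _, f, _ in intervals]
    n = len(intervals)
    opt = [0] * (n + 1)  # opt[0] = 0: empty schedule
    for j in range(1, n + 1):
        s, f, w = intervals[j - 1]
        # p(j): count of earlier intervals finishing at or before s
        p = bisect.bisect_right(finishes, s, 0, j - 1)
        opt[j] = max(opt[j - 1], w + opt[p])
    return opt[n]
```

Each interval is either skipped (Opt[j-1]) or taken together with the best schedule ending at its predecessor p(j).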

Two dimensional dynamic programming: K-segment linear approximation
Opt_k[j] = min_i { Opt_{k-1}[i] + E_{i,j} } for 0 < i < j

Two dimensional dynamic programming: Subset sum and knapsack
Subset sum: Opt[j, K] = max(Opt[j - 1, K], Opt[j - 1, K - w_j] + w_j)
Knapsack: Opt[j, K] = max(Opt[j - 1, K], Opt[j - 1, K - w_j] + v_j)
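A minimal sketch of the knapsack recurrence (the function name is my own; the Opt[j-1, K - w_j] branch is taken only when w_j fits, which the recurrence leaves implicit):

```python
# 0-1 knapsack via Opt[j, K] = max(Opt[j-1, K], Opt[j-1, K - w_j] + v_j).
def knapsack(items, capacity):
    # items: list of (weight, value); opt is (n+1) x (capacity+1), row 0 all zero
    n = len(items)
    opt = [[0] * (capacity + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        w, v = items[j - 1]
        for k in range(capacity + 1):
            opt[j][k] = opt[j - 1][k]          # skip item j
            if w <= k:                         # take item j, if it fits
                opt[j][k] = max(opt[j][k], opt[j - 1][k - w] + v)
    return opt[n][capacity]
```

The subset sum version is the same table with v_j replaced by w_j.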

Alternate approach for Strict Subset Sum
Alternate solution of Strict Subset Sum with a dynamic programming algorithm:
Sum[i, K] = true if there is a subset of {w_1, ..., w_i} that sums to exactly K, false otherwise
Sum[i, K] = Sum[i - 1, K] OR Sum[i - 1, K - w_i]
Sum[0, 0] = true; Sum[0, K] = false for K != 0
Saving space: can you replace the two dimensional array with a one dimensional array?
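One possible answer to the space question (a sketch; the function name is my own): a single boolean array works if each item's pass scans K downward, so that Sum[K - w] still refers to the previous row when Sum[K] is updated.

```python
# Strict subset sum with a one-dimensional boolean array.
# reachable[K] plays the role of Sum[i, K]; iterating K downward ensures
# each weight w is used at most once per pass.
def subset_sum(weights, target):
    reachable = [False] * (target + 1)
    reachable[0] = True  # Sum[0, 0] = true
    for w in weights:
        for k in range(target, w - 1, -1):
            if reachable[k - w]:
                reachable[k] = True
    return reachable[target]
```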

Run time for Subset Sum
With n items and target sum K, the run time is O(nK) - pseudo-polynomial: polynomial in the numeric value of K, not in the number of bits needed to write K
If K is 1,000,000,000,000,000,000,000,000 this is very slow
Alternate brute force algorithm: examine all subsets: O(n 2^n)

Dynamic Programming Examples
Optimal billboard placement: text, solved exercise, pg 307
Line breaking with hyphenation: compare with HW problem 6, pg 317
String approximation: text, solved exercise, pg 309
Longest common subsequence: text, section 6.6

Billboard Placement
Maximize income from placing billboards
Input: b_i = (p_i, v_i), where v_i is the value of placing a billboard at position p_i
Constraint: at most one billboard every five miles
Example: {(6, 5), (8, 6), (12, 5), (14, 1)}

Design a Dynamic Programming Algorithm for Billboard Placement Compute Opt[1], Opt[2], . . ., Opt[n] What is Opt[k]? Input b1, …, bn, where bi = (pi, vi), position and value of billboard i

Solution
Opt[0] = 0; j = 0;
// j: the last valid location for an earlier billboard,
// if one is placed at P[k] (at least five miles behind it)
for k := 1 to n {
    while (P[j] < P[k] - 5)
        j := j + 1;
    j := j - 1;
    Opt[k] = max(Opt[k - 1], V[k] + Opt[j]);
}
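A runnable sketch of this solution (the function name is my own; bisect_left replaces the while loop, mirroring the strict comparison P[j] < P[k] - 5; billboards are assumed sorted by position):

```python
# Billboard placement: Opt[k] = max(Opt[k-1], V[k] + Opt[j]),
# where j indexes the last billboard strictly more than five miles
# before billboard k's position.
import bisect

def max_billboard_value(billboards):
    # billboards: list of (position, value), sorted by position
    positions = [p for p, _ in billboards]
    n = len(billboards)
    opt = [0] * (n + 1)
    for k in range(1, n + 1):
        p, v = billboards[k - 1]
        # number of earlier billboards with position < p - 5
        j = bisect.bisect_left(positions, p - 5, 0, k - 1)
        opt[k] = max(opt[k - 1], v + opt[j])
    return opt[n]
```

On the slide's example {(6,5), (8,6), (12,5), (14,1)}, the optimum places billboards at positions 6 and 12 for value 10.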

Optimal line breaking

Optimal Paragraphing
Optimal line breaking: assign words to lines to minimize the sum of line penalties (square of excess white space)
Words w_1, w_2, ..., w_n
Pen(i, j): penalty of filling a line with w_i, ..., w_j
Opt[j, k]: minimum penalty of ending line k with w_j

String approximation Given a string S, and a library of strings B = {b1, …bm}, construct an approximation of the string S by using copies of strings in B. B = {abab, bbbaaa, ccbb, ccaacc} S = abaccbbbaabbccbbccaabab

Formal Model
Strings from B assigned to non-overlapping positions of S
Strings from B may be used multiple times
Cost of d for each unmatched character in S
Cost of g for each mismatched character in S
MisMatch(i, j): number of mismatched characters of b_j when aligned starting at position i of S

Design a Dynamic Programming Algorithm for String Approximation Compute Opt[1], Opt[2], . . ., Opt[n] What is Opt[k]? Target string S = s1s2…sn Library of strings B = {b1,…,bm} MisMatch(i,j) = number of mismatched characters with bj when aligned starting at position i of S.

Opt[k] = fun(Opt[0],…,Opt[k-1]) How is the solution determined from sub problems? Target string S = s1s2…sn Library of strings B = {b1,…,bm} MisMatch(i,j) = number of mismatched characters with bj when aligned starting at position i of S.

Solution
Opt[0] = 0;
for k := 1 to n {
    Opt[k] = Opt[k - 1] + d;
    for j := 1 to |B| {
        p = k - len(b_j) + 1;    // b_j starts at p so that it ends at k
        if (p >= 1)
            Opt[k] = min(Opt[k], Opt[p - 1] + g * MisMatch(p, j));
    }
}
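A runnable sketch of the string approximation DP (the function name is my own; MisMatch is computed inline, and p is the 1-indexed position where a library string would start so that it ends at k):

```python
# String approximation:
# Opt[k] = min(Opt[k-1] + d,
#              min over b in B of Opt[p-1] + g * MisMatch(p, b))
# where p = k - len(b) + 1.
def approx_cost(s, library, d, g):
    n = len(s)
    opt = [0] * (n + 1)
    for k in range(1, n + 1):
        opt[k] = opt[k - 1] + d              # leave s_k unmatched
        for b in library:
            p = k - len(b) + 1               # 1-indexed start position of b
            if p >= 1:
                # mismatches of b against s_p ... s_k
                mism = sum(1 for x, y in zip(s[p - 1:k], b) if x != y)
                opt[k] = min(opt[k], opt[p - 1] + g * mism)
    return opt[n]
```

For example, covering "abab" with two copies of "ab" costs 0, while "abc" needs one unmatched character.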

Longest Common Subsequence
C = c_1 ... c_g is a subsequence of A = a_1 ... a_m if C can be obtained by removing elements from A (but retaining order)
LCS(A, B): a maximum length sequence that is a subsequence of both A and B
Examples: ocurranec / occurrence; attacggct / tacgacca

Determine the LCS of the following strings BARTHOLEMEWSIMPSON KRUSTYTHECLOWN

String Alignment Problem Align sequences with gaps Charge dx if character x is unmatched Charge gxy if character x is matched to character y CAT TGA AT CAGAT AGGA Note: the problem is often expressed as a minimization problem, with gxx = 0 and dx > 0

LCS Optimization A = a1a2…am B = b1b2…bn Opt[ j, k] is the length of LCS(a1a2…aj, b1b2…bk)

Optimization recurrence If aj = bk, Opt[ j,k ] = 1 + Opt[ j-1, k-1 ] If aj != bk, Opt[ j,k] = max(Opt[ j-1,k], Opt[ j,k-1])

Give the Optimization Recurrence for the String Alignment Problem
Charge d_x if character x is unmatched
Charge g_xy if character x is matched to character y
Let a_j = x and b_k = y
Expressed as a minimization:
Opt[j, k] = min(g_xy + Opt[j-1, k-1], d_x + Opt[j-1, k], d_y + Opt[j, k-1])
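A sketch of this recurrence with uniform costs, as in the slide's note (gap cost d per unmatched character, mismatch cost g, g_xx = 0; the function name and the uniform-cost simplification are my own):

```python
# Minimization form of string alignment:
# Opt[j,k] = min(g_xy + Opt[j-1,k-1], d + Opt[j-1,k], d + Opt[j,k-1]),
# with base cases charging d per character aligned against nothing.
def align_cost(a, b, d, g):
    m, n = len(a), len(b)
    opt = [[0] * (n + 1) for _ in range(m + 1)]
    for j in range(1, m + 1):
        opt[j][0] = j * d                    # a_1..a_j all unmatched
    for k in range(1, n + 1):
        opt[0][k] = k * d                    # b_1..b_k all unmatched
    for j in range(1, m + 1):
        for k in range(1, n + 1):
            match = (0 if a[j - 1] == b[k - 1] else g) + opt[j - 1][k - 1]
            opt[j][k] = min(match,
                            d + opt[j - 1][k],   # a_j unmatched
                            d + opt[j][k - 1])   # b_k unmatched
    return opt[m][n]
```

With d = g = 1 this is the classic edit distance.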

Dynamic Programming Computation

Code to compute Opt[j,k]

Storing the path information
A[1..m], B[1..n]
for i := 1 to m
    Opt[i, 0] := 0;
for j := 1 to n
    Opt[0, j] := 0;
Opt[0, 0] := 0;
for i := 1 to m
    for j := 1 to n
        if A[i] = B[j] {
            Opt[i, j] := 1 + Opt[i-1, j-1]; Best[i, j] := Diag;
        } else if Opt[i-1, j] >= Opt[i, j-1] {
            Opt[i, j] := Opt[i-1, j]; Best[i, j] := Left;
        } else {
            Opt[i, j] := Opt[i, j-1]; Best[i, j] := Down;
        }
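The pseudocode above can be sketched in runnable form, including the traceback that the Best pointers enable (the function name is my own; which length-maximal subsequence is returned depends on how ties are broken):

```python
# LCS table with Best pointers, then a traceback from (m, n) to
# recover an actual longest common subsequence.
def lcs(a, b):
    m, n = len(a), len(b)
    opt = [[0] * (n + 1) for _ in range(m + 1)]      # row/col 0 are zero
    best = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                opt[i][j] = 1 + opt[i - 1][j - 1]
                best[i][j] = "Diag"
            elif opt[i - 1][j] >= opt[i][j - 1]:
                opt[i][j] = opt[i - 1][j]
                best[i][j] = "Left"
            else:
                opt[i][j] = opt[i][j - 1]
                best[i][j] = "Down"
    # Walk the Best pointers back, collecting matched characters.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if best[i][j] == "Diag":
            out.append(a[i - 1])
            i, j = i - 1, j - 1
        elif best[i][j] == "Left":
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```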

How good is this algorithm? Is it feasible to compute the LCS of two strings of length 300,000 on a standard desktop PC? Why or why not?