CSE 421 Algorithms Richard Anderson Lecture 18 Dynamic Programming

Announcements

Homework deadlines:
– HW 6: Friday, November 13
– HW 7: Wednesday, November 18
– HW 8: Wednesday, November 25
– HW 9: Friday, December 4
– HW 10: Friday, December 11

Optimal linear interpolation

Error = Σ_i (y_i – a·x_i – b)^2

Optimal linear interpolation with K segments

Notation

Points p_1, p_2, ..., p_n ordered by x-coordinate (p_i = (x_i, y_i))

E[i, j] is the least-squares error for the optimal line interpolating p_i, ..., p_j

Optimal interpolation with k segments

Optimal segmentation with three segments:
– min_{i,j} { E[1, i] + E[i, j] + E[j, n] }
– O(n^2) combinations considered

Generalization to k segments leads to considering O(n^{k-1}) combinations

Opt_k[j]: minimum error approximating p_1, ..., p_j with k segments

Express Opt_k[j] in terms of Opt_{k-1}[1], ..., Opt_{k-1}[j]:

Opt_k[j] = min_i { Opt_{k-1}[i] + E[i, j] } for 0 < i < j

Optimal sub-solution property

An optimal solution with k segments extends an optimal solution with k-1 segments on a smaller problem

Optimal multi-segment interpolation

Compute Opt[k, j] for 0 < k < j < n:

for j := 1 to n
    Opt[1, j] := E[1, j];
for k := 2 to n-1
    for j := 2 to n
        t := E[1, j]
        for i := 1 to j-1
            t := min(t, Opt[k-1, i] + E[i, j])
        Opt[k, j] := t
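As a concrete illustration, here is a minimal runnable Python sketch of the table computation above. It assumes a precomputed 1-indexed error table E, where E[i][j] is the least-squares error of the best single line through p_i, ..., p_j (with E[i][i] = 0); the function name and the k_max parameter are illustrative, not from the slides.

def multi_segment(E, n, k_max):
    """Opt[k][j] = minimum error approximating points 1..j with k segments."""
    INF = float("inf")
    Opt = [[INF] * (n + 1) for _ in range(k_max + 1)]
    for j in range(1, n + 1):
        Opt[1][j] = E[1][j]              # one segment covers p_1..p_j
    for k in range(2, k_max + 1):
        for j in range(2, n + 1):
            t = E[1][j]                  # fall back to a single segment
            for i in range(1, j):        # last segment runs p_i..p_j
                t = min(t, Opt[k - 1][i] + E[i][j])
            Opt[k][j] = t
    return Opt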

Determining the solution

When Opt[k, j] is computed, record the value of i that minimized the sum. Store this value in an auxiliary array and use it to reconstruct the solution.

Variable number of segments

– Segments not specified in advance
– Penalty function associated with segments
– Cost = interpolation error + C × (number of segments)

Penalty cost measure

Opt[j] = min(E[1, j], min_i (Opt[i] + E[i, j] + P))
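The penalty formulation is a small variation on the same sketch. Here P is the per-segment penalty from the slide (the constant C on the previous slide), and the E table is assumed precomputed as before.

def penalized_segmentation(E, n, P):
    INF = float("inf")
    Opt = [0.0] * (n + 1)
    for j in range(1, n + 1):
        # either one segment covers 1..j, or an optimal prefix ends at i
        Opt[j] = min(E[1][j],
                     min((Opt[i] + E[i][j] + P for i in range(1, j)),
                         default=INF))
    return Opt[n]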

Subset Sum Problem

Let w_1, ..., w_n = {6, 8, 9, 11, 13, 16, 18, 24}

Find a subset that has as large a sum as possible, without exceeding 50

Adding a variable for weight

Opt[j, K]: the largest sum of a subset of {w_1, ..., w_j} that is at most K

Example with {2, 4, 7, 10}:
– Opt[2, 7] = 6 (2 + 4)
– Opt[3, 7] = 7 (7 alone)
– Opt[3, 12] = 11 (4 + 7)
– Opt[4, 12] = 12 (2 + 10)

Subset Sum Recurrence

Opt[j, K]: the largest sum of a subset of {w_1, ..., w_j} that is at most K

Subset Sum Grid

{2, 4, 7, 10}

Opt[j, K] = max(Opt[j – 1, K], Opt[j – 1, K – w_j] + w_j)

Subset Sum Code

for j := 1 to n
    for k := 1 to W
        Opt[j, k] = max(Opt[j-1, k], Opt[j-1, k - w_j] + w_j)

(The second term applies only when w_j ≤ k.)
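For reference, a runnable Python version of this loop, with the implicit guard w_j ≤ k made explicit; the function name is illustrative. On the earlier example it finds 8 + 18 + 24 = 50.

def subset_sum(w, W):
    """w[j-1] is the weight of item j; W is the capacity bound."""
    n = len(w)
    Opt = [[0] * (W + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        for k in range(1, W + 1):
            Opt[j][k] = Opt[j - 1][k]            # skip item j
            if w[j - 1] <= k:                    # or take item j
                Opt[j][k] = max(Opt[j][k],
                                Opt[j - 1][k - w[j - 1]] + w[j - 1])
    return Opt[n][W]

# subset_sum([6, 8, 9, 11, 13, 16, 18, 24], 50) returns 50  (8 + 18 + 24)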

Knapsack Problem

Items have weights and values; the problem is to maximize total value subject to a bound on weight.

Items {I_1, I_2, ..., I_n}
– Weights {w_1, w_2, ..., w_n}
– Values {v_1, v_2, ..., v_n}
– Bound K

Find a set S of indices to:
– Maximize Σ_{i∈S} v_i such that Σ_{i∈S} w_i ≤ K

Knapsack Recurrence

Subset Sum recurrence:
Opt[j, K] = max(Opt[j – 1, K], Opt[j – 1, K – w_j] + w_j)

Knapsack recurrence:
Opt[j, K] = max(Opt[j – 1, K], Opt[j – 1, K – w_j] + v_j)

Knapsack Grid

Weights {2, 4, 7, 10}; Values {3, 5, 9, 16}

Opt[j, K] = max(Opt[j – 1, K], Opt[j – 1, K – w_j] + v_j)
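The knapsack code differs from subset sum only in adding v_j instead of w_j. A minimal Python sketch with the slide's example data (function name illustrative):

def knapsack(weights, values, K):
    n = len(weights)
    Opt = [[0] * (K + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        for k in range(1, K + 1):
            Opt[j][k] = Opt[j - 1][k]            # leave item j out
            if weights[j - 1] <= k:              # or pack item j
                Opt[j][k] = max(Opt[j][k],
                                Opt[j - 1][k - weights[j - 1]] + values[j - 1])
    return Opt[n][K]

# knapsack([2, 4, 7, 10], [3, 5, 9, 16], 12) returns 19  (weights 2 and 10)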

Dynamic Programming Examples

– Optimal billboard placement: text, Solved Exercise, p. 307
– Line breaking with hyphenation: compare with HW problem 6, p. 317
– String approximation: text, Solved Exercise, p. 309

Billboard Placement

Maximize income from placing billboards
– b_i = (p_i, v_i), where v_i is the value of placing a billboard at position p_i

Constraint:
– At most one billboard every five miles

Example:
– {(6, 5), (8, 6), (12, 5), (14, 1)}

Design a Dynamic Programming Algorithm for Billboard Placement

Compute Opt[1], Opt[2], ..., Opt[n]. What is Opt[k]?

Input: b_1, ..., b_n, where b_i = (p_i, v_i) gives the position and value of billboard i

Opt[k] = fun(Opt[0], ..., Opt[k-1])

How is the solution determined from subproblems?

Input: b_1, ..., b_n, where b_i = (p_i, v_i) gives the position and value of billboard i

Solution

j := 0;  // j trails the current position: the last valid location
         // for an earlier billboard, if one is placed at P[k]
for k := 1 to n
    while (P[j] < P[k] – 5)
        j := j + 1;
    j := j – 1;
    Opt[k] := max(Opt[k-1], V[k] + Opt[j]);
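A Python rendering of this solution, assuming billboards are sorted by position, 1-indexed arrays, and a sentinel P[0] far to the left so that index 0 means "no earlier billboard":

def billboards(P, V):
    """P[1..n] sorted positions, V[1..n] values; P[0] = -inf, V[0] = 0."""
    n = len(P) - 1
    Opt = [0] * (n + 1)
    j = 0
    for k in range(1, n + 1):
        # advance j to the first billboard within 5 miles of P[k] ...
        while P[j] < P[k] - 5:
            j += 1
        j -= 1        # ... then step back to the last compatible one
        # either skip billboard k, or place it on top of Opt[j]
        Opt[k] = max(Opt[k - 1], V[k] + Opt[j])
    return Opt[n]

# billboards([float("-inf"), 6, 8, 12, 14], [0, 5, 6, 5, 1]) returns 10
# (place the billboards at positions 6 and 12)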

Optimal line breaking and hyphenation

Problem: break lines and insert hyphens to make lines as balanced as possible

Typographical considerations:
– Avoid excessive white space
– Limit the number of hyphens
– Avoid widows and orphans
– Etc.

Penalty Function

Pen(i, j): penalty of starting a line at position i and ending at position j

Key technical idea:
– Number the breaks between words/syllables

Opt-i-mal line break-ing and hyph-en-a-tion is com-put-ed with dy-nam-ic pro-gram-ming
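The slide implies a one-dimensional recurrence over the numbered break points. Here is a minimal sketch under that reading, where Pen(i, j) is assumed given and break 0 precedes the first syllable:

def line_breaking(n, Pen):
    """Minimum total penalty for breaking at points chosen from 1..n."""
    INF = float("inf")
    Opt = [INF] * (n + 1)
    Opt[0] = 0
    for j in range(1, n + 1):
        # best way to end a line at break j, starting it at some break i
        Opt[j] = min(Opt[i] + Pen(i, j) for i in range(j))
    return Opt[n]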

String approximation

Given a string S and a library of strings B = {b_1, ..., b_m}, construct an approximation of the string S by using copies of strings in B.

B = {abab, bbbaaa, ccbb, ccaacc}
S = abaccbbbaabbccbbccaabab

Formal Model

– Strings from B are assigned to non-overlapping positions of S
– Strings from B may be used multiple times
– Cost of δ for each unmatched character in S
– Cost of γ for each mismatched character in S
– MisMatch(i, j): the number of mismatched characters of b_j when aligned starting at position i of S

Design a Dynamic Programming Algorithm for String Approximation

Compute Opt[1], Opt[2], ..., Opt[n]. What is Opt[k]?

Target string S = s_1 s_2 ... s_n
Library of strings B = {b_1, ..., b_m}
MisMatch(i, j) = number of mismatched characters with b_j when aligned starting at position i of S

Opt[k] = fun(Opt[0], ..., Opt[k-1])

How is the solution determined from subproblems?

Target string S = s_1 s_2 ... s_n
Library of strings B = {b_1, ..., b_m}
MisMatch(i, j) = number of mismatched characters with b_j when aligned starting at position i of S

Solution

for k := 1 to n
    Opt[k] := Opt[k-1] + δ;
    for j := 1 to |B|
        p := k – len(b_j) + 1;
        if (p ≥ 1)
            Opt[k] := min(Opt[k], Opt[p-1] + γ · MisMatch(p, j));
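A runnable Python sketch mirroring this loop on 0-indexed strings; delta and gamma stand for the per-character costs δ and γ above, and mismatches are counted directly instead of through a precomputed MisMatch table:

def string_approx(S, B, delta, gamma):
    n = len(S)
    Opt = [0.0] * (n + 1)                # Opt[k]: cost of covering S[0:k]
    for k in range(1, n + 1):
        Opt[k] = Opt[k - 1] + delta      # leave character k unmatched
        for b in B:
            p = k - len(b)               # b would cover S[p:k]
            if p >= 0:
                mismatches = sum(x != y for x, y in zip(S[p:k], b))
                Opt[k] = min(Opt[k], Opt[p] + gamma * mismatches)
    return Opt[n]

# e.g. string_approx("abaccbbbaabbccbbccaabab",
#                    ["abab", "bbbaaa", "ccbb", "ccaacc"], 1.0, 0.5)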