CSEP 521 Applied Algorithms

Presentation transcript:

CSEP 521 Applied Algorithms Richard Anderson Lecture 6 Dynamic Programming

Announcements Midterm today! 60 minutes, start of class, closed book. Reading for this week: 6.1, 6.2, 6.3, 6.4. Makeup lecture February 19, 6:30 pm; still waiting on confirmation of the MS room.

Dynamic Programming: Weighted Interval Scheduling Given a collection of intervals I1,…,In with weights w1,…,wn, choose a maximum-weight set of non-overlapping intervals. [Figure: intervals sorted by finish time, labeled with their weights.]

Optimality Condition Opt[ j ] is the value of the maximum-weight independent set of intervals I1, I2, …, Ij. Opt[ j ] = max( Opt[ j - 1 ], wj + Opt[ p[ j ] ] ), where p[ j ] is the index of the last interval that finishes before Ij starts.

Algorithm
MaxValue( j ) =
    if j = 0
        return 0
    else
        return max( MaxValue( j - 1 ), wj + MaxValue( p[ j ] ) )
Worst-case run time: O(2^n); the naive recursion recomputes the same subproblems exponentially many times.

A better algorithm
M[ j ] is initialized to -1 for all j before the first recursive call.
MaxValue( j ) =
    if j = 0
        return 0;
    else if M[ j ] != -1
        return M[ j ];
    else
        M[ j ] := max( MaxValue( j - 1 ), wj + MaxValue( p[ j ] ) );
        return M[ j ];

Iterative version
MaxValue( j ) {
    M[ 0 ] := 0;
    for ( k := 1; k <= j; k++ )
        M[ k ] := max( M[ k - 1 ], wk + M[ p[ k ] ] );
    return M[ j ];
}

Fill in the array with the Opt values Opt[ j ] = max( Opt[ j - 1 ], wj + Opt[ p[ j ] ] ) [Figure: example instance with interval weights 2, 4, 7, 4, 6, 7, 6.]

Computing the solution Opt[ j ] = max( Opt[ j - 1 ], wj + Opt[ p[ j ] ] ) Record which case is used in the Opt computation. [Figure: the same instance, weights 2, 4, 7, 4, 6, 7, 6.]
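
To make the last few slides concrete, here is a minimal runnable sketch in Python (not the lecture's code; the tuple representation, the binary-search computation of p, and the convention that an interval may start exactly when another finishes are assumptions):

    # Weighted interval scheduling: minimal sketch of the O(n log n) DP.
    # Intervals are (start, finish, weight) tuples; p[j] is the number of
    # intervals that finish no later than interval j starts.
    import bisect

    def max_weight_schedule(intervals):
        # Sort by finish time, as the slides assume.
        intervals = sorted(intervals, key=lambda t: t[1])
        n = len(intervals)
        finishes = [f for _, f, _ in intervals]
        # p[j] indexes directly into M, which is 1-indexed by interval count.
        p = [bisect.bisect_right(finishes, intervals[j][0]) for j in range(n)]
        M = [0] * (n + 1)
        for k in range(1, n + 1):
            s, f, w = intervals[k - 1]
            M[k] = max(M[k - 1], w + M[p[k - 1]])
        # Traceback: re-apply the recurrence and keep interval k whenever
        # the "take it" case wins, as the slide's recorded cases would.
        chosen, k = [], n
        while k > 0:
            s, f, w = intervals[k - 1]
            if w + M[p[k - 1]] > M[k - 1]:
                chosen.append(intervals[k - 1])
                k = p[k - 1]
            else:
                k -= 1
        return M[n], list(reversed(chosen))

    print(max_weight_schedule([(0, 3, 5), (2, 5, 6), (4, 7, 4)]))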

Dynamic Programming The most important algorithmic technique covered in CSEP 521. Key ideas: express the solution in terms of a polynomial number of subproblems, and order the subproblems to avoid recomputation.

Optimal linear interpolation Error = Σi (yi - a·xi - b)^2

What is the optimal linear interpolation with three line segments?

What is the optimal linear interpolation with two line segments?

What is the optimal linear interpolation with n line segments?

Notation Points p1, p2, …, pn ordered by x-coordinate (pi = (xi, yi)). Ei,j is the least-squares error of the optimal line interpolating pi, …, pj. Comment: Ei,j can be computed in O(n^2) time.

Optimal interpolation with two segments Give an equation for the optimal interpolation of p1,…,pn with two line segments, where Ei,j is the least-squares error of the optimal line interpolating pi, …, pj. (Answer: min over i of E1,i + Ei,n.)

Optimal interpolation with k segments Optimal segmentation with three segments: min over i, j of { E1,i + Ei,j + Ej,n }. O(n^2) combinations considered. Generalization to k segments leads to considering O(n^(k-1)) combinations.

Optk[ j ]: minimum error approximating p1…pj with k segments. How do you express Optk[ j ] in terms of Optk-1[ 1 ],…,Optk-1[ j ]? (Answer: Optk[ j ] = min over i < j of Optk-1[ i ] + Ei,j.)

Optimal sub-solution property Optimal solution with k segments extends an optimal solution of k-1 segments on a smaller problem

Optimal multi-segment interpolation
Compute Opt[ k, j ] for 1 <= k <= j <= n:
for j := 1 to n
    Opt[ 1, j ] := E1,j;
for k := 2 to n - 1
    for j := 2 to n {
        t := E1,j;
        for i := 1 to j - 1
            t := min( t, Opt[ k - 1, i ] + Ei,j );
        Opt[ k, j ] := t;
    }

Determining the solution When Opt[ k, j ] is computed, record the value of i that minimized the sum. Store this value in an auxiliary array and use it to reconstruct the solution.
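
A sketch of the whole pipeline in Python, hedged: seg_error computes each Ei,j naively from the closed-form least-squares fit (O(n) per pair rather than the O(n^2)-total precomputation the Notation slide mentions), and consecutive segments share an endpoint as in the pseudocode above:

    # Multi-segment interpolation: the Opt[k, j] table from the slides,
    # with points given as (x, y) pairs sorted by x (0-indexed here).

    def seg_error(pts, i, j):
        # Least-squares error of the best line through pts[i..j] inclusive.
        n = j - i + 1
        xs = [pts[t][0] for t in range(i, j + 1)]
        ys = [pts[t][1] for t in range(i, j + 1)]
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        denom = n * sxx - sx * sx
        a = (n * sxy - sx * sy) / denom if denom else 0.0
        b = (sy - a * sx) / n
        return sum((y - a * x - b) ** 2 for x, y in zip(xs, ys))

    def best_k_segments(pts, k):
        n = len(pts)
        E = [[seg_error(pts, i, j) if i < j else 0.0 for j in range(n)]
             for i in range(n)]
        INF = float("inf")
        # Opt[s][j]: minimum error covering pts[0..j] with s segments.
        Opt = [[INF] * n for _ in range(k + 1)]
        Opt[1] = E[0][:]
        for s in range(2, k + 1):
            for j in range(1, n):
                Opt[s][j] = min(Opt[s - 1][i] + E[i][j] for i in range(j))
        return Opt[k][n - 1]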

Variable number of segments Segments not specified in advance; a penalty is charged per segment. Cost = interpolation error + C × (number of segments).

Penalty cost measure Opt[ j ] = min( E1,j + C, min over i of ( Opt[ i ] + Ei,j + C ) ), where C is the per-segment penalty.
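
A short sketch of this penalty recurrence, assuming err[i][j] = Ei,j has already been tabulated (e.g. with seg_error above); applying the per-segment penalty C to the single-segment case as well is a reading of the slide, not something it states:

    # Penalty-cost variant: the DP itself chooses the number of segments.

    def best_segmentation(err, C):
        n = len(err)
        Opt = [0.0] * n
        for j in range(n):
            # One segment covering everything up to j, plus its penalty...
            Opt[j] = err[0][j] + C
            # ...or split at an earlier shared endpoint i.
            for i in range(1, j):
                Opt[j] = min(Opt[j], Opt[i] + err[i][j] + C)
        return Opt[n - 1]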

Subset Sum Problem Let w1,…,wn = {6, 8, 9, 11, 13, 16, 18, 24} Find a subset that has as large a sum as possible, without exceeding 50

Adding a variable for weight Opt[ j, K ]: the largest sum achievable using a subset of {w1, …, wj} that totals at most K. With weights {2, 4, 7, 10}: Opt[2, 7] = 6, Opt[3, 7] = 7, Opt[3, 12] = 11, Opt[4, 12] = 12.

Subset Sum Recurrence Opt[ j, K ]: the largest sum achievable using a subset of {w1, …, wj} that totals at most K. Opt[ j, K ] = max( Opt[ j - 1, K ], Opt[ j - 1, K - wj ] + wj ), the second case applying only when wj <= K.

Subset Sum Grid Opt[ j, K ] = max( Opt[ j - 1, K ], Opt[ j - 1, K - wj ] + wj ) [Grid: rows j = 4, 3, 2, 1 for weights {2, 4, 7, 10}; one column per capacity K.]

Subset Sum Code Opt[ j, K] = max(Opt[ j – 1, K], Opt[ j – 1, K – wj] + wj)
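
The code itself did not survive in this transcript; a direct Python translation of the recurrence, checked against the earlier example instance:

    # Subset sum DP: the table the grid slide depicts.
    # Opt[j][K] = best achievable sum using the first j weights, at most K.

    def subset_sum(weights, capacity):
        n = len(weights)
        Opt = [[0] * (capacity + 1) for _ in range(n + 1)]
        for j in range(1, n + 1):
            wj = weights[j - 1]
            for K in range(capacity + 1):
                Opt[j][K] = Opt[j - 1][K]          # skip item j
                if wj <= K:                        # take item j if it fits
                    Opt[j][K] = max(Opt[j][K], Opt[j - 1][K - wj] + wj)
        return Opt[n][capacity]

    # The example from the Subset Sum Problem slide: best sum <= 50.
    print(subset_sum([6, 8, 9, 11, 13, 16, 18, 24], 50))  # 50 (e.g. 6+8+9+11+16)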

Knapsack Problem Items have weights and values; the problem is to maximize total value subject to a bound on weight. Items {I1, I2, …, In}, weights {w1, w2, …, wn}, values {v1, v2, …, vn}, bound K. Find a set S of indices to maximize Σ(i in S) vi such that Σ(i in S) wi <= K.

Knapsack Recurrence Subset Sum recurrence: Opt[ j, K ] = max( Opt[ j - 1, K ], Opt[ j - 1, K - wj ] + wj ). Knapsack recurrence: Opt[ j, K ] = max( Opt[ j - 1, K ], Opt[ j - 1, K - wj ] + vj ).

Knapsack Grid Opt[ j, K ] = max( Opt[ j - 1, K ], Opt[ j - 1, K - wj ] + vj ) Weights {2, 4, 7, 10}; values {3, 5, 9, 16}. [Grid: rows j = 4, 3, 2, 1; one column per capacity K.]
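
The same table as subset sum, but the cell stores value rather than weight; a sketch, with the capacity of 12 chosen here only for illustration:

    # 0/1 knapsack DP over the grid's instance.

    def knapsack(weights, values, capacity):
        n = len(weights)
        Opt = [[0] * (capacity + 1) for _ in range(n + 1)]
        for j in range(1, n + 1):
            wj, vj = weights[j - 1], values[j - 1]
            for K in range(capacity + 1):
                Opt[j][K] = Opt[j - 1][K]          # skip item j
                if wj <= K:                        # take item j: gain vj
                    Opt[j][K] = max(Opt[j][K], Opt[j - 1][K - wj] + vj)
        return Opt[n][capacity]

    # Weights {2, 4, 7, 10}, values {3, 5, 9, 16}, hypothetical capacity 12.
    print(knapsack([2, 4, 7, 10], [3, 5, 9, 16], 12))  # 19: items of weight 2 and 10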

Dynamic Programming Examples Optimal billboard placement: text, solved exercise, p. 307. Line breaking with hyphenation: compare with HW problem 6, p. 317. String approximation: text, solved exercise, p. 309.

Billboard Placement Maximize income from placing billboards. Input: bi = (pi, vi), where vi is the value of placing a billboard at position pi. Constraint: at most one billboard every five miles. Example: {(6, 5), (8, 6), (12, 5), (14, 1)}.

Design a Dynamic Programming Algorithm for Billboard Placement Compute Opt[1], Opt[2], . . ., Opt[n] What is Opt[k]? Input b1, …, bn, where bi = (pi, vi), position and value of billboard i

Opt[k] = fun(Opt[0],…,Opt[k-1]) How is the solution determined from subproblems? Input b1, …, bn, where bi = (pi, vi), position and value of billboard i

Solution
j := 0;  // j tracks the last valid location for a billboard, if one is placed at P[ k ]
Opt[ 0 ] := 0;  // assumes a sentinel position P[ 0 ] far to the left
for k := 1 to n {
    while ( P[ j ] < P[ k ] - 5 )   // advance j to the first position within five miles
        j := j + 1;
    j := j - 1;                     // step back: P[ j ] is more than five miles behind P[ k ]
    Opt[ k ] := max( Opt[ k - 1 ], V[ k ] + Opt[ j ] );
}
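
An equivalent sketch in Python, replacing the two-pointer scan with a binary search and reading "at most one billboard every five miles" as positions differing by more than five, which is what the pseudocode above implies:

    # Billboard placement DP on the slide's example instance.
    import bisect

    def best_billboards(boards):
        boards = sorted(boards)            # sort by position
        pos = [p for p, _ in boards]
        n = len(boards)
        Opt = [0] * (n + 1)
        for k in range(1, n + 1):
            p, v = boards[k - 1]
            # Number of billboards strictly more than five miles behind p.
            j = bisect.bisect_left(pos, p - 5)
            Opt[k] = max(Opt[k - 1], v + Opt[j])
        return Opt[n]

    print(best_billboards([(6, 5), (8, 6), (12, 5), (14, 1)]))  # 10: positions 6 and 12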

Optimal line breaking and hyphenation Problem: break lines and insert hyphens to make lines as balanced as possible. Typographical considerations: avoid excessive white space, limit the number of hyphens, avoid widows and orphans, etc.

Penalty Function Pen(i, j) – penalty of starting a line at position i and ending it at position j. Key technical idea: number the breaks between words/syllables. Opt-i-mal line break-ing and hyph-en-a-tion is com-put-ed with dy-nam-ic pro-gram-ming
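
The slides leave the recurrence implicit; a natural formulation (an assumption here, not stated in the lecture) is Opt[ j ] = min over i < j of ( Opt[ i ] + Pen(i, j) ) with Opt[ 0 ] = 0, sketched below with Pen supplied as a function:

    # Line breaking: Opt[j] = best total penalty for the first j breaks.
    # pen(i, j) is assumed given: penalty of a line running from break i
    # to break j, per the Penalty Function slide.

    def break_lines(n, pen):
        INF = float("inf")
        Opt = [0.0] + [INF] * n
        back = [0] * (n + 1)
        for j in range(1, n + 1):
            for i in range(j):
                c = Opt[i] + pen(i, j)
                if c < Opt[j]:
                    Opt[j], back[j] = c, i
        # Walk back[] to recover the chosen break positions.
        breaks, j = [], n
        while j > 0:
            breaks.append(j)
            j = back[j]
        return Opt[n], list(reversed(breaks))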

String approximation Given a string S, and a library of strings B = {b1, …bm}, construct an approximation of the string S by using copies of strings in B. B = {abab, bbbaaa, ccbb, ccaacc} S = abaccbbbaabbccbbccaabab

Formal Model Strings from B are assigned to non-overlapping positions of S; strings from B may be used multiple times. Cost d for each unmatched character in S; cost g for each mismatched character. MisMatch(i, j) – number of mismatched characters of bj when aligned starting at position i of S.

Design a Dynamic Programming Algorithm for String Approximation Compute Opt[1], Opt[2], . . ., Opt[n] What is Opt[k]? Target string S = s1s2…sn Library of strings B = {b1,…,bm} MisMatch(i,j) = number of mismatched characters with bj when aligned starting at position i of S.

Opt[k] = fun(Opt[0],…,Opt[k-1]) How is the solution determined from subproblems? Target string S = s1s2…sn Library of strings B = {b1,…,bm} MisMatch(i,j) = number of mismatched characters with bj when aligned starting at position i of S.

Solution
Opt[ 0 ] := 0;
for k := 1 to n {
    Opt[ k ] := Opt[ k - 1 ] + d;       // leave character sk unmatched
    for j := 1 to |B| {
        p := k - len( bj ) + 1;         // bj would cover positions p … k of S
        if ( p >= 1 )
            Opt[ k ] := min( Opt[ k ], Opt[ p - 1 ] + g * MisMatch( p, j ) );
    }
}
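
The same algorithm as runnable Python, 0-indexed, using the example from the String approximation slide (the costs d = 1 and g = 1 are arbitrary choices for illustration):

    # String approximation DP: Opt[k] = best cost of covering s_1..s_k.

    def approximate(S, B, d, g):
        n = len(S)
        Opt = [0.0] * (n + 1)
        for k in range(1, n + 1):
            Opt[k] = Opt[k - 1] + d            # leave s_k unmatched
            for b in B:
                p = k - len(b)                 # b would cover S[p:k]
                if p >= 0:
                    mism = sum(1 for a, c in zip(S[p:k], b) if a != c)
                    Opt[k] = min(Opt[k], Opt[p] + g * mism)
        return Opt[n]

    B = ["abab", "bbbaaa", "ccbb", "ccaacc"]
    S = "abaccbbbaabbccbbccaabab"
    print(approximate(S, B, d=1.0, g=1.0))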