Richard Anderson Lecture 17, Winter 2019 Dynamic Programming

CSE 421 Algorithms Richard Anderson Lecture 17, Winter 2019 Dynamic Programming

Announcements HW 7 is partially posted. I've moved; my new office is CSE2 344.

Optimal linear interpolation Optimal linear interpolation with K segments. Per-segment error for a line y = a x + b: Error = Σ_i (y_i − a x_i − b)^2

Opt_k[ j ] = min_{0 < i < j} ( Opt_{k−1}[ i ] + E_{i,j} ). An optimal solution with k segments extends an optimal solution with k − 1 segments on a smaller problem.

Variable number of segments Segments are not specified in advance; instead, a penalty is charged for each segment used. Cost = interpolation error + C × (number of segments)

Penalty cost measure Opt[ j ] = min( E_{1,j}, min_i ( Opt[ i ] + E_{i,j} + P ) )
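The penalty recurrence above can be sketched as a runnable Python function. This is an illustrative O(n^2) version (function names and the closed-form least-squares fit are my own; the slides leave E_{i,j} abstract):

```python
# Sketch of the penalty-based recurrence Opt[j] = min(E_{1,j}, min_i Opt[i] + E_{i,j} + P),
# using 0-based point indices. Names here are illustrative, not from the lecture.

def segment_error(xs, ys, i, j):
    """Least-squares error of fitting points i..j (inclusive) with one line y = a x + b."""
    n = j - i + 1
    sx = sum(xs[i:j + 1]); sy = sum(ys[i:j + 1])
    sxx = sum(x * x for x in xs[i:j + 1])
    sxy = sum(x * y for x, y in zip(xs[i:j + 1], ys[i:j + 1]))
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom if denom else 0.0  # degenerate: single x value
    b = (sy - a * sx) / n
    return sum((ys[k] - a * xs[k] - b) ** 2 for k in range(i, j + 1))

def min_cost_segmentation(xs, ys, P):
    """Minimum of (total interpolation error + P per segment) over all segmentations."""
    n = len(xs)
    opt = [0.0] * n  # opt[j]: best cost for the prefix of points 0..j
    for j in range(n):
        # either one segment covers 0..j, or we split after some earlier point i
        best = segment_error(xs, ys, 0, j) + P
        for i in range(j):
            best = min(best, opt[i] + segment_error(xs, ys, i + 1, j) + P)
        opt[j] = best
    return opt[n - 1]
```

Precomputing all E_{i,j} values first (as the lecture's analysis assumes) would bring the total cost down from the naive O(n^3) of recomputing each error.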

Subset Sum Problem Let w_1, …, w_n = {6, 8, 9, 11, 13, 16, 18, 24}. Find a subset that has as large a sum as possible without exceeding 50. Answer: {6, 8, 9, 11, 16}, which sums to exactly 50.

Adding a variable for Weight Opt[ j, K ]: the largest sum achievable by a subset of {w_1, …, w_j} that totals at most K. For weights {2, 4, 7, 10}: Opt[2, 7] = 6, Opt[3, 7] = 7, Opt[3, 12] = 11, Opt[4, 12] = 12.

Subset Sum Recurrence Opt[ j, K ]: the largest sum achievable by a subset of {w_1, …, w_j} that totals at most K. Opt[ j, K ] = max( Opt[ j − 1, K ], Opt[ j − 1, K − w_j ] + w_j ), where the second option applies only when w_j ≤ K.

Subset Sum Grid Opt[ j, K ] = max( Opt[ j − 1, K ], Opt[ j − 1, K − w_j ] + w_j ). [Grid: rows j = 4, 3, 2, 1 for weights {2, 4, 7, 10}; columns K = 0, 1, 2, …; filled in during lecture.]

Subset Sum Code
for j = 1 to n
    for k = 0 to W
        if k >= w_j
            Opt[ j, k ] = max( Opt[ j − 1, k ], Opt[ j − 1, k − w_j ] + w_j )
        else
            Opt[ j, k ] = Opt[ j − 1, k ]
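The table fill can be written as a runnable Python function (the function name and 0-based list indexing are my own; the table layout matches the slides):

```python
# Subset sum by dynamic programming: opt[j][k] is the largest sum achievable
# using the first j weights without exceeding capacity k.

def subset_sum(weights, W):
    n = len(weights)
    opt = [[0] * (W + 1) for _ in range(n + 1)]  # row 0: no items, sum 0
    for j in range(1, n + 1):
        wj = weights[j - 1]
        for k in range(W + 1):
            opt[j][k] = opt[j - 1][k]            # skip item j
            if k >= wj:                           # take item j if it fits
                opt[j][k] = max(opt[j][k], opt[j - 1][k - wj] + wj)
    return opt[n][W]
```

On the lecture's example, `subset_sum([6, 8, 9, 11, 13, 16, 18, 24], 50)` hits the bound exactly.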

Knapsack Problem Items have weights and values; the problem is to maximize total value subject to a bound on weight. Items {I_1, I_2, …, I_n}; weights {w_1, w_2, …, w_n}; values {v_1, v_2, …, v_n}; bound K. Find a set S of indices to maximize Σ_{i∈S} v_i such that Σ_{i∈S} w_i ≤ K.

Knapsack Recurrence Subset Sum recurrence: Opt[ j, K ] = max( Opt[ j − 1, K ], Opt[ j − 1, K − w_j ] + w_j ). Knapsack recurrence: Opt[ j, K ] = max( Opt[ j − 1, K ], Opt[ j − 1, K − w_j ] + v_j ).

Knapsack Grid Opt[ j, K ] = max( Opt[ j − 1, K ], Opt[ j − 1, K − w_j ] + v_j ). Weights {2, 4, 7, 10}; values {3, 5, 9, 16}. [Grid: rows j = 4, 3, 2, 1; columns K = 0, 1, 2, …; filled in during lecture.]
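The knapsack table is the same fill as subset sum, except each cell stores value rather than weight. A runnable sketch (function name and 0-based indexing are my own):

```python
# 0/1 knapsack: opt[j][k] is the best total value using the first j items
# within weight capacity k, via Opt[j,K] = max(Opt[j-1,K], Opt[j-1,K-w_j] + v_j).

def knapsack(weights, values, W):
    n = len(weights)
    opt = [[0] * (W + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        wj, vj = weights[j - 1], values[j - 1]
        for k in range(W + 1):
            opt[j][k] = opt[j - 1][k]            # skip item j
            if k >= wj:                           # take item j: gain v_j, spend w_j
                opt[j][k] = max(opt[j][k], opt[j - 1][k - wj] + vj)
    return opt[n][W]
```

On the grid's data with capacity 12, the best choice is items of weight 2 and 10 (values 3 + 16 = 19), even though other pairs fill the capacity less.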

Dynamic Programming Examples Optimal billboard placement: text, Solved Exercise, p. 307. Line breaking with hyphenation: compare with HW problem 6, p. 317. String approximation: text, Solved Exercise, p. 309.

Billboard Placement Maximize income from placing billboards. Input: b_i = (p_i, v_i), where v_i is the value of placing a billboard at position p_i. Constraint: at most one billboard every five miles. Example: {(6, 5), (8, 6), (12, 5), (14, 1)}

Design a Dynamic Programming Algorithm for Billboard Placement Input: b_1, …, b_n, where b_i = (p_i, v_i) gives the position and value of billboard i. Compute Opt[1], Opt[2], …, Opt[n]. What is Opt[k]?

Opt[k] = fun( Opt[0], …, Opt[k−1] ). How is the solution determined from subproblems? Input: b_1, …, b_n, where b_i = (p_i, v_i), the position and value of billboard i.

Solution
// Assumes billboards are sorted by position, with sentinels Opt[0] = 0 and P[0] = −∞
j := 0;
for k := 1 to n {
    // advance j past every billboard within five miles of position P[k]
    while (P[ j ] < P[ k ] − 5)
        j := j + 1;
    // back up to the last valid location for a billboard, if one is placed at P[k]
    j := j − 1;
    Opt[ k ] := max( Opt[ k − 1 ], V[ k ] + Opt[ j ] );
}
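A compact Python sketch of the same solution (the function name is mine, and it uses `bisect` to find the last compatible billboard instead of the sliding j pointer; both compute the same index):

```python
# Billboard DP: Opt[k] = max(Opt[k-1], v_k + Opt[j]) where j indexes the last
# billboard strictly more than `gap` miles before billboard k's position.
import bisect

def max_billboard_revenue(billboards, gap=5):
    """billboards: list of (position, value) pairs, assumed sorted by position."""
    positions = [p for p, _ in billboards]
    n = len(billboards)
    opt = [0] * (n + 1)  # opt[k]: best revenue using the first k billboards
    for k in range(1, n + 1):
        p, v = billboards[k - 1]
        # number of billboards with position < p - gap, i.e. more than gap miles back
        j = bisect.bisect_left(positions, p - gap)
        opt[k] = max(opt[k - 1], v + opt[j])  # skip billboard k, or place it
    return opt[n]
```

On the slide's example {(6, 5), (8, 6), (12, 5), (14, 1)}, placing billboards at positions 6 and 12 yields the maximum revenue of 10.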