Week 8 - Friday CS322

Last time
- What did we talk about last time? Exam 2!
- And before that? Review:
  - Integer multiplication
  - Master Theorem
  - Solved exercises from Chapter 5

Questions?

Logical warmup
- There are five pirates who wish to divide 100 gold coins
- These pirates are ranked:
  - Captain
  - Lieutenant
  - Master
  - Midshipman
  - Seaman
- In order of rank, each pirate gets the opportunity to propose a plan for dividing up the gold
- If at least half of the pirates (including the proposer) agree to the proposal, it is carried out
- Otherwise, the pirate is killed and the next-highest-ranking pirate makes a proposal
- Pirates are completely rational and value, in this order:
  1. Staying alive
  2. Maximizing gold coins received
  3. Seeing other pirates die
- If you were the captain, what would you propose? Hint: Work backwards!

Dynamic Programming

Previous approaches
- We covered greedy approaches, where you simply want to take the next best thing
  - These are often linear or O(n log n) due to sorting
- We looked at divide-and-conquer approaches
  - Usually taking an unimpressive polynomial running time like O(n²) and improving it, perhaps to O(n log n)
- But there are harder problems that appear as if they might take exponential time

Dynamic Programming
- Dynamic programming shares some similarities with divide and conquer
  - We break a problem down into subproblems
  - We build correct solutions up from smaller subproblems into larger ones
- The subproblems tend not to be as simple as just dividing the problem in half
- Dynamic programming dances on the edge of exploring an exponential number of solutions
  - But somehow manages to look at only a polynomial set

Weighted Interval Scheduling Student Lecture

Weighted Interval Scheduling

Interval scheduling In the interval scheduling problem, some resource (a phone, a motorcycle, a toilet) can only be used by one person at a time. People make requests to use the resource for a specific time interval [s, f]. The goal is to schedule as many uses as possible. There's no preference based on who or when the resource is used.

Weighted interval scheduling
- The weighted interval scheduling problem extends interval scheduling by attaching a weight (usually a real number) to each request
- Now the goal is not to maximize the number of requests served but the total weight
- Our greedy approach is worthless, since some high-value requests might be tossed out
- We could try all possible subsets of requests, but there are exponentially many of those
- Dynamic programming will allow us to save parts of optimal answers and combine them efficiently

Weighted interval scheduling example [figure: overlapping intervals with weights 3, 12, 9, 7, 20, 5, 4, 14, and 2]

Notation
- We have n requests labeled 1, 2, …, n
- Request i has a start time s_i and a finish time f_i
- Request i has a value v_i
- Two intervals are compatible if they don't overlap

Designing the algorithm
- Let's go back to our intuition from the unweighted problem
- Imagine that the requests are sorted by finish time so that f_1 ≤ f_2 ≤ … ≤ f_n
- We say that request i comes before request j if i < j, giving a natural left-to-right order
- For any request j, let p(j) be the largest index i < j such that request i ends before request j begins (that is, f_i ≤ s_j)
- If there is no such request, then p(j) = 0

p(j) examples [figure: six intervals sorted by finish time, for which p(1) = 0, p(2) = 0, p(3) = 1, p(4) = 0, p(5) = 3, p(6) = 3]
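The figure itself isn't reproduced here, but computing p(j) is mechanical once the requests are sorted by finish time. Below is a minimal Python sketch (my illustration, not from the slides); the endpoints and values are invented so that the p values match the figure, and each p(j) falls out of a single binary search over the finish times.

from bisect import bisect_right

# Hypothetical requests as (start, finish, value) tuples, already
# sorted by finish time; chosen so that p = [0, 0, 1, 0, 3, 3]
requests = [(0, 3, 2), (1, 5, 4), (4, 7, 4), (2, 9, 7), (8, 10, 2), (8, 11, 1)]

def compute_p(requests):
    """p[j] = largest index i < j with finish(i) <= start(j), else 0.
    Requests are 1-indexed to match the slides; p[0] is unused."""
    finishes = [f for _, f, _ in requests]
    p = [0] * (len(requests) + 1)
    for j, (s, _, _) in enumerate(requests, start=1):
        # Among the first j - 1 requests, the number that finish by
        # time s is exactly the 1-based index of the last one (0 if none)
        p[j] = bisect_right(finishes, s, 0, j - 1)
    return p

print(compute_p(requests))  # [0, 0, 0, 1, 0, 3, 3]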

More algorithm design
- Consider an optimal solution O
- It either contains the last request n or it doesn't
- If O contains n:
  - It does not contain any requests between p(n) and n
  - Furthermore, O contains an optimal solution for the subproblem on just requests 1, 2, …, p(n)
  - Since those requests don't overlap with n, they have to be the best possible, or O wouldn't be optimal
- If O does not contain n, then O is simply the optimal solution for requests 1, 2, …, n – 1

Subproblems found!
- It might not be obvious, but the last slide laid out a way to break a problem into smaller subproblems
- Let OPT(j) be the value of the optimal solution to the subproblem of requests 1, 2, …, j
- OPT(j) = max(v_j + OPT(p(j)), OPT(j – 1))
- Another way to look at this is that we will include j in our optimal solution for requests 1, 2, …, j if and only if v_j + OPT(p(j)) ≥ OPT(j – 1)
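To see the recurrence in action, run it on the hypothetical six-request data from the earlier sketch (values 2, 4, 4, 7, 2, 1 and p values 0, 0, 1, 0, 3, 3):
OPT(1) = max(2 + OPT(0), OPT(0)) = 2
OPT(2) = max(4 + OPT(0), OPT(1)) = 4
OPT(3) = max(4 + OPT(1), OPT(2)) = 6
OPT(4) = max(7 + OPT(0), OPT(3)) = 7
OPT(5) = max(2 + OPT(3), OPT(4)) = 8
OPT(6) = max(1 + OPT(3), OPT(5)) = 8
The optimal total weight is 8, achieved by requests 1, 3, and 5.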

We've already got an algorithm!
Compute-Opt(j)
  If j = 0 then
    Return 0
  Else
    Return max(v_j + Compute-Opt(p(j)), Compute-Opt(j – 1))

How long does Compute-Opt take?
- Well, for every request j, we have to do two recursive calls
- Look at the tree of calls for the requests from a few slides back [figure: recursion tree of calls on requests 6, 5, 4, 3, 2, and 1]
- Uh oh.

Needless recomputation
- The issue here is that we are needlessly recomputing optimal values for smaller subproblems
- You might recall that we had a similar problem with the naïve implementation of a recursive Fibonacci function
- In the worst case, the algorithm has an exponential running time
- Just how exponential depends on the structure of the problem
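To make that Fibonacci comparison concrete, here is a minimal sketch (my illustration, not from the slides) that counts how many calls the naïve recursion makes:

def fib(n, counter):
    """Naive recursive Fibonacci; counter[0] tracks total calls made."""
    counter[0] += 1
    if n <= 1:
        return n
    return fib(n - 1, counter) + fib(n - 2, counter)

calls = [0]
print(fib(30, calls), calls[0])  # 832040, reached via 2,692,537 calls

Caching each value after its first computation, exactly as the M array does below, drops this to n + 1 computations plus cheap lookups.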

Memoization
- The solution is something called memoization, which means storing the value for an optimal solution whenever you compute it
- Then, if you need it again, you just look it up (from the memo you left yourself)
- To make this work, we need an array M of length n that stores the optimal value found for each request
- Initially, it's all -1, null, or another value that indicates empty

Updated algorithm
M-Compute-Opt(j)
  If j = 0 then
    Return 0
  Else if M[j] is not empty then
    Return M[j]
  Else
    M[j] = max(v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j – 1))
    Return M[j]
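A direct Python rendering of M-Compute-Opt, continuing the hypothetical example (v and p are 1-indexed lists with index 0 unused, and None marks an empty memo slot):

def m_compute_opt(j, v, p, M):
    """Memoized recursion: M[j] caches the value of OPT(j)."""
    if j == 0:
        return 0
    if M[j] is None:
        M[j] = max(v[j] + m_compute_opt(p[j], v, p, M),
                   m_compute_opt(j - 1, v, p, M))
    return M[j]

v = [0, 2, 4, 4, 7, 2, 1]   # v[0] unused
p = [0, 0, 0, 1, 0, 3, 3]   # p[0] unused
M = [None] * len(v)
print(m_compute_opt(6, v, p, M))  # 8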

Running time of memoized algorithm
- Constant non-recursive work is done inside of each call
- A call returns in constant time if M[j] already has a value
- There will only be n calls where M[j] doesn't have a value
- The running time is O(n)
- Note that sorting the requests in the first place takes O(n log n)

Going beyond the value
- We have only found the value of an optimal solution, not the actual intervals included
- As with many dynamic programming solutions, the value is the hard part
- For each optimal value, we could keep the solution at that point, but each solution would be linear in length
- Storing those solutions would require O(n²) space and make the algorithm O(n²) in order to keep them updated

Reconstructing the solution
- Recall that j is in our optimal solution if and only if v_j + OPT(p(j)) ≥ OPT(j – 1)
- On that basis, we can backtrack through the M array and output request j only if the inequality holds

Algorithm for solution
Find-Solution(j, M)
  If j = 0 then
    Output nothing
  Else if v_j + M[p(j)] ≥ M[j – 1] then
    Output j together with the result of Find-Solution(p(j), M)
  Else
    Output the result of Find-Solution(j – 1, M)
Algorithm is O(n)
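Continuing the Python sketch, the backtracking step can be written as follows (assuming M has already been filled in by the m_compute_opt call above; treating OPT(0) = 0 handles lookups at index 0):

def find_solution(j, v, p, M):
    """Backtrack through the filled-in memo table to recover the set."""
    opt = lambda k: 0 if k == 0 else M[k]  # OPT(0) = 0; the rest are cached
    if j == 0:
        return []
    if v[j] + opt(p[j]) >= opt(j - 1):
        return find_solution(p[j], v, p, M) + [j]
    return find_solution(j - 1, v, p, M)

print(find_solution(6, v, p, M))  # [1, 3, 5], total weight 8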

Principles of Dynamic Programming

Why is this dynamic programming?
- The key element that separates dynamic programming from divide-and-conquer is that you have to keep the answers to subproblems around
- It's not simply a one-and-done situation
- Based on which intervals overlap with which other intervals, it's hard to predict when you'll need an earlier M[j] value
- Thus, dynamic programming can often give us polynomial algorithms but with linear (and sometimes even larger) space requirements

Iterative vs. recursive
- The array M is thus the key to this (and other similar) dynamic programming algorithms
- The same problem could have been tackled with a non-recursive approach where you compute each M[j] in order
- Every dynamic programming problem can use either:
  - Memoized recursion
  - Building up solutions iteratively
- The two approaches are equivalent, but the book will prefer iterative solutions, since they are usually faster in practice and easier to analyze

Iterative solution to weighted interval scheduling
Iterative-Compute-Opt
  M[0] = 0
  For j = 1 up to n
    M[j] = max(v_j + M[p(j)], M[j – 1])
Algorithm is (even more obviously) O(n)
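In Python, the iterative version is just a loop over the same recurrence (same hypothetical v and p as before; M is indexed 0 through n so that M[0] = 0 covers the base case):

def iterative_compute_opt(v, p):
    """Fill M bottom-up; M[j] holds the value of OPT(j) for j = 0..n."""
    n = len(v) - 1
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = max(v[j] + M[p[j]], M[j - 1])
    return M

print(iterative_compute_opt(v, p))  # [0, 2, 4, 6, 7, 8, 8]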

Informal guidelines
Weighted interval scheduling follows a set of informal guidelines that are essentially universal in dynamic programming solutions:
1. There are only a polynomial number of subproblems
2. The solution to the original problem can easily be computed from (or is one of) the solutions to the subproblems
3. There is a natural ordering of subproblems from "smallest" to "largest"
4. There is an easy-to-compute recurrence that lets us compute the solution to a subproblem from the solutions of smaller subproblems

Upcoming

Next time…
- Segmented least squares
- Subset sums and knapsacks

Reminders
- Read sections 6.3 and 6.4
- Employer Meet and Greet
  - Monday, March 19, 2018 from 11 a.m. – 1 p.m.
  - Masters Center (Mineral Gallery)
  - Employers attending:
    - Clark Associates
    - Donegal Insurance Group
    - Gross Investments
    - Hershey Entertainment and Resorts
    - Members 1st Credit Union
    - Northwestern Mutual
    - WebpageFX