Weighted Interval Scheduling


Optimal Substructure
Is the solution equal to the “sum” of smaller solutions? Or: does the solution to the whole problem define solutions to the sub-problems?

Shortest Path: Optimal Substructure?
(Figure: a path from node u to node v through node w.) Suppose the shortest path from node u to node v goes through node w. Is the u-to-w portion the shortest path from u to w? Is the w-to-v portion the shortest path from w to v? Yes – improving either piece would improve the whole.

Longest Path: Optimal Substructure?
(Figure: a path from node u to node v through node w.) Suppose the longest path from node u to node v goes through node w. Is the u-to-w portion the longest path from u to w?

Quiz: The (u,v) path does define the longest path from u to w – true or false?

Longest path from u to v
(Figure: counterexample graph.) The longest path from u to v is not composed of the longest path from u to w plus the longest path from w to v – the two sub-paths may share vertices, so they cannot simply be concatenated. Longest path does not have optimal substructure.

Interval Scheduling Problem
Input: a list of intervals, each with a start time s_i and a finish time f_i.
Output: a subset of non-overlapping intervals.
Goal: the largest possible subset.

Weighted Interval Scheduling Problem
Input: a list of intervals, each with a start time s_i, a finish time f_i, and a weight w_i.
Output: a subset of non-overlapping intervals.
Goal: the largest possible weight sum.

Weighted “Classroom” Scheduling Problem
Input: a list of classes, each with a start time s_i, a finish time f_i, and an enrollment w_i.
Output: a set of (non-overlapping) classes to be scheduled in the same room.
Goal: the largest possible enrollment.

(Unweighted) Interval Scheduling
Solution: sort by finish time, then repeatedly take the next interval that does not overlap the intervals already chosen.
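
The greedy rule above can be sketched in Python (an illustrative sketch; representing intervals as (start, finish) tuples is an assumption, not from the slides):

```python
# Unweighted greedy: sort by finish time, then take each interval whose
# start is at or after the finish of the last interval chosen.
def greedy_interval_scheduling(intervals):
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:   # does not overlap the last chosen interval
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Example: of the four intervals below, (1,3) and (4,7) can both be kept.
# greedy_interval_scheduling([(1, 3), (2, 5), (4, 7), (6, 8)])
# → [(1, 3), (4, 7)]
```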

Weighted Interval Scheduling
(Figure: nine intervals with weights 6, 1, 3, 100, 8, 2, 16, 100, 1.) Optimal weight: 100 + 3 + 100 = 203.

Notation
Number the objects by finish time and consider object i. Define p(i) = the largest j such that f_j < s_i (i.e., the largest j that does not overlap i). Notice: if we have already sorted by finish time, we know that p(i) < i (since f_p(i) < s_i < f_i).
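
Assuming the intervals are stored as parallel, finish-sorted arrays of start and finish times (an illustrative representation, not specified on the slides), all p(i) values can be computed with one binary search per interval:

```python
import bisect

# Computes p(i) for intervals already sorted by finish time.
# p(i) is the largest j with f_j < s_i, or 0 if no such interval
# exists (1-indexed, matching the slides; p[0] is unused padding).
def compute_p(starts, finishes):
    p = [0] * (len(starts) + 1)
    for i in range(1, len(starts) + 1):
        # bisect_left counts the finish times strictly less than s_i;
        # since the list is finish-sorted, that count is exactly p(i).
        p[i] = bisect.bisect_left(finishes, starts[i - 1])
    return p
```

With n intervals this costs O(n log n), matching the sort that precedes it.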

Weighted Interval Scheduling
Note: we are numbering from 1, not 0. (It will be less confusing later on.)
(Figure: intervals labeled index:weight – 1:8, 2:1, 3:100, 4:2, 5:3, 6:1, 7:6, 8:16, 9:100.) Examples: p(6) = 4, p(8) = 5, p(7) = undefined. What is p(9)?

Quiz: What is p(9)? (Answer choices 1–8; from the figure, p(9) = 6.)

Dynamic Programming
We will solve the (weighted) problem by dynamic programming, which allows us to try a limited number of potential solutions at once: more than one solution (as opposed to a greedy approach), but still a limited number (as opposed to brute force).

DP Hints
We follow the “steps” that apply to all DP problems:
1. Order the objects.
2. Concentrate on the optimal score (not the solution).
3. Use the optimal substructure property to create a recursion.
4. Use the overlapping subproblems property to avoid recomputation in the recursion.

Step 1: Order Objects
A DP solution usually requires that we place some order on the objects. In this case, we again order by finish time. Assume this has already been done, so if i < j then f_i ≤ f_j.

Step 2: Concentrate on the Optimal Score
We start by looking for the optimal score only – in this case, the highest weight total we can achieve. Define: opt(i) = the optimal score using only elements 1 to i. Finding the optimal score is easier than trying to find the optimal solution directly, and recovering the optimal solution afterwards will be an easy second step.

Step 3: Exploit Optimal Substructure
Is the solution equal to the “sum” of smaller solutions? Or: does the solution to the whole problem define solutions to the sub-problems?

Optimal Substructure
(Figure: intervals with weights 6, 1, 3, 20, 100, 100, 4, 3, 7.) Here opt(9) = 100+100+20+7 = 227. Is 100+100 optimal for its sub-range, and 7+20 for the other? Yes – since opt(n) = (100+100) + (7+20).

Optimal Substructure
(Figure: intervals with weights 6, 1, 3, 5, 8, 2, 4, 100, 1.) Here opt(9) = 100+3+4+5 = 112. Is opt(8) = 100+3+4 = 107? Yes – so opt(9) = opt(8) + 5.

Optimal Substructure
(Figure: intervals with weights 6, 1, 3, 2, 100, 100, 4, 3, 7.) Here opt(9) = 100+100+7 = 207. Is opt(8) = 100+100+7 = 207? Yes – so opt(9) = opt(8).

Using Optimal Substructure
We want to find the value of opt(n). There are two possibilities: interval n is used in the optimal solution, or interval n is not used in the optimal solution. This is a tautology. (Or it isn’t.)

Using Optimal Substructure
We can make use of optimal substructure to solve the problem: compute the best solution possible when using interval n, compute the best solution possible when not using interval n, and use whichever is better.

Case 1: Interval n Is Used
If we know interval n is used, what else do we need to do? p(n) is the last interval we can still use (if j > p(n), then f_j ≥ s_n, so j overlaps n). Claim: opt(n) = opt(p(n)) + w_n.

Case 1: Interval n Is Used
Claim: opt(n) = opt(p(n)) + w_n.
Must have opt(n) ≥ opt(p(n)) + w_n: take the solution for opt(p(n)) and add interval n (this must be possible); the resulting total is opt(p(n)) + w_n.
Must have opt(n) ≤ opt(p(n)) + w_n: remove interval n from the optimal solution; the new weight is opt(n) − w_n, and the remaining intervals are picked from {1, 2, …, p(n)}, hence their weight must be ≤ opt(p(n)); result: opt(n) − w_n ≤ opt(p(n)).
The result comes from the optimal substructure property.

Case 1: Illustrated
(Figure: the running example with weights 8, 2, 16, 100, 1 visible.) opt(9) = 100+3+100 = 203, and p(9) = 6. opt(6) = 100 + 3 = 103, so opt(9) = opt(6) + w_9.

Case 1: Recap
If interval n is in the optimal solution, then opt(n) = opt(p(n)) + w_n.

Case 2: Interval n Is Not Used
If we know interval n is not used, what else do we need to do? Claim: opt(n) = opt(n−1). If interval n is not used, then opt(n) and opt(n−1) must correspond to the same solution score.

Case 2: Illustrated
(Figure: intervals labeled index:weight – 1:8, 2:1, 3:100, 4:2, 5:3, 6:100, 7:6, 8:1, 9:2.) opt(9) = 100 + 100 + 3 = 203. The optimal solution clearly doesn’t use interval 9, and opt(8) = 100 + 100 + 3 = 203, so opt(9) = opt(8): we get the same score if we remove interval 9 and consider only intervals 1 to 8.

Summing Up
If n is in the optimal solution: opt(n) = opt(p(n)) + w_n. If n is not in the optimal solution: opt(n) = opt(n−1). Either we use interval n, or we don’t, so:
opt(n) = max{ opt(p(n)) + w_n, opt(n−1) }, with opt(0) = 0.

Recursive Formula
int WIS(array W, array P, int n) {
    if n == 0 return 0
    int c1 = WIS(W, P, n-1)
    int c2 = WIS(W, P, P[n]) + W[n]
    return max(c1, c2)
}
Bad news: the runtime is terrible (exponential). Good news: overlapping sub-problems.
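
A sketch of how caching exploits those overlapping sub-problems (Python; the array names follow the pseudocode, with index 0 used as padding so the 1-indexing of the slides carries over):

```python
from functools import lru_cache

# Memoized version of the recursion above: each opt(n) is computed once,
# so the runtime drops from exponential to O(n).
# W[i] is the weight of interval i; P[i] is p(i); W[0] and P[0] are padding.
def wis_memo(W, P):
    @lru_cache(maxsize=None)
    def opt(n):
        if n == 0:
            return 0
        # Case 2 (skip interval n) vs. Case 1 (use interval n)
        return max(opt(n - 1), opt(P[n]) + W[n])
    return opt(len(W) - 1)
```

On the lecture’s running example (weights 8, 1, 100, 2, 3, 1, 6, 16, 100 with the p-values shown on the earlier slide) this returns 203.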

Step 5: Bottom-Up Calculation
opt(n) = max{ opt(p(n)) + w_n, opt(n−1) }, with opt(0) = 0.
You can calculate opt(i) as long as you already know opt(j) for all j < i, so calculate opt(1), then opt(2), then …

Execution
(Figure: the table of opt values filled in left to right – 8, 8, 100, 100, 103, 103, 103, 119, 203.)
opt(1) = max{opt(0) + w_1, opt(0)} = 8
opt(2) = max{opt(0) + w_2, opt(1)} = 8
opt(3) = max{opt(0) + w_3, opt(2)} = 100
opt(4) = max{opt(2) + w_4, opt(3)} = 100
opt(5) = max{opt(3) + w_5, opt(4)} = 103
opt(6) = max{opt(4) + w_6, opt(5)} = 103
opt(7) = max{opt(0) + w_7, opt(6)} = 103
opt(8) = max{opt(5) + w_8, opt(7)} = 119
opt(9) = max{opt(6) + w_9, opt(8)} = 203

Trace-back
(Figure: trace-back through the opt table 8, 8, 100, 100, 103, 103, 103, 119, 203, recovering intervals 9, 5, and 3 – weight 100 + 3 + 100 = 203.)

Maintains Solutions
(Figure: the intervals with their p-values – 1:8 (p=0), 2:1 (p=0), 3:100 (p=0), 4:2 (p=2), 5:3 (p=3), 6:1 (p=4), 7:6 (p=0), 8:16 (p=5), 9:100 (p=6) – above the opt table.) As we continue, we track up to n different solutions – more than a greedy algorithm, fewer than brute force.

Runtime
We need to fill in values for n boxes. Each box requires O(1) time to fill (if we know p()). Total runtime: O(n).

Bottom-Up Code
int WIS(array W, array P, int n)
    array opt, opt[0] = 0
    for i ← 1 to n
        c1 ← opt[P[i]] + W[i]
        c2 ← opt[i-1]
        opt[i] ← max{c1, c2}
    return opt[n]

array WIS_Sol(array opt, array P, int n)
    array solution
    i ← n
    while (i > 0)
        if opt[i] == opt[i-1]
            i ← i-1
        else
            solution.push(i)
            i ← P[i]
    return solution

What is the runtime of the WIS algorithm?
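
The pseudocode above transcribes directly into runnable Python (a sketch; the 0-padding of the 1-indexed arrays is the same convention as before):

```python
# Bottom-up table fill: opt[i] = max{ opt[P[i]] + W[i], opt[i-1] }.
# W[i] is the weight of interval i; P[i] is p(i); index 0 is padding.
def wis(W, P):
    n = len(W) - 1
    opt = [0] * (n + 1)
    for i in range(1, n + 1):
        opt[i] = max(opt[P[i]] + W[i], opt[i - 1])
    return opt

# Trace-back: walk the table from n down, taking interval i whenever
# opt[i] differs from opt[i-1] (so interval i must have been used).
def wis_solution(opt, P):
    solution, i = [], len(opt) - 1
    while i > 0:
        if opt[i] == opt[i - 1]:
            i -= 1                 # interval i was not used
        else:
            solution.append(i)     # interval i was used; jump to p(i)
            i = P[i]
    return solution[::-1]
```

On the lecture’s running example the table ends at opt[9] = 203 and the trace-back recovers intervals 3, 5, and 9, matching the earlier slides.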

Quiz: Runtime of WIS? O(1), O(log n), O(n), O(n log n), or O(n^2)? (As computed above: O(n), given intervals sorted by finish time and p() precomputed.)