Lecture 7 Paradigm #5 Greedy Algorithms

This is called the Interval Scheduling Problem, a simple member of a large class of scheduling problems. – Can add weights. – Can add multiple resources. – Can ask for a schedule of all the requests that minimizes some objective function. – … Interval Scheduling.

How do we pick the largest number of non-overlapping intervals? In principle: – Decide which interval to pick first, say I 1, based on some rule. – Remove all intervals that overlap with I 1 from consideration. – Pick the next one using the same rule. Rule-based algorithms of this kind, especially when the rule is local, are called greedy algorithms. – Careful: they may not always work. Interval Scheduling

Some first rules for this problem. Suppose we sort the intervals by their start times. Rule 1: Among the available intervals, pick the one that starts the earliest. – Intuitively, this should maximize the usage of the resource. – But intuition can often be misleading. Interval Scheduling

The Early Start rule does not work! Interval Scheduling
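The slide relies on a picture; a small numeric counterexample (hypothetical intervals, not from the slides) makes the failure concrete. One long interval that starts earliest blocks every later request:

```python
def earliest_start_schedule(reqs):
    """Greedy Rule 1: repeatedly take the compatible request that starts earliest."""
    chosen, last_finish = [], float("-inf")
    for s, f in sorted(reqs):   # sorted by start time
        if s >= last_finish:    # compatible with everything chosen so far
            chosen.append((s, f))
            last_finish = f
    return chosen

# One long early-starting interval hides three disjoint later ones.
intervals = [(0, 10), (1, 3), (4, 6), (7, 9)]  # (start, finish)
print(earliest_start_schedule(intervals))      # picks only (0, 10); optimal picks 3
```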

Why did Rule 1 fail? – As the picture suggests, the request that starts earliest may not release the resource early enough. – In that case, we miss several other requests. Rule 2: Shortest Duration. – Should free the resource as quickly as possible. – Does it work? – Question: Check whether this works. Interval Scheduling

Rule 2 also does not work. Interval Scheduling
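Again the slide's picture can be replaced by a small hypothetical instance: a short interval that straddles the gap between two long ones defeats the shortest-duration rule.

```python
def shortest_first_schedule(reqs):
    """Greedy Rule 2: repeatedly take the compatible request of shortest duration."""
    chosen = []
    for s, f in sorted(reqs, key=lambda r: r[1] - r[0]):   # sorted by duration
        # keep (s, f) only if it overlaps nothing already chosen
        if all(f <= cs or s >= cf for cs, cf in chosen):
            chosen.append((s, f))
    return chosen

# The short middle interval (9, 12) conflicts with both long ones.
intervals = [(0, 10), (9, 12), (11, 20)]
print(shortest_first_schedule(intervals))  # picks only (9, 12); optimal picks 2
```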

The second rule did not work because the interval with the shortest duration may itself block several mutually non-overlapping intervals. Yet another attempt. Rule 3: Early Finish. – Pick the request that finishes the earliest. – Intuitively, this frees the resource for future requests as early as possible. – Skeptically: will it work, given the failure of the other rules? Interval Scheduling

Algorithm EarlyFinish( I )
Begin
  Sort the requests in increasing order of finish times as S = { r 1, r 2, …, r n }
  A = ∅
  While S is not empty do
    Let r i be the request in S with the earliest finish time
    Add r i to the solution A
    Delete from S all requests that overlap with r i
  End-while
  Return A
End
Interval Scheduling
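The pseudocode above can be sketched directly in Python; the sample request set is hypothetical.

```python
def early_finish_schedule(reqs):
    """Greedy Rule 3: repeatedly take the compatible request that finishes
    earliest. One sort plus a linear scan, so O(n log n) overall."""
    chosen, last_finish = [], float("-inf")
    for s, f in sorted(reqs, key=lambda r: r[1]):  # sorted by finish time
        if s >= last_finish:   # does not overlap anything already chosen
            chosen.append((s, f))
            last_finish = f
    return chosen

requests = [(0, 10), (1, 3), (2, 5), (4, 6), (5, 8), (7, 9)]
print(early_finish_schedule(requests))  # → [(1, 3), (4, 6), (7, 9)]
```

Sorting once and scanning replaces the explicit "delete all overlapping requests" step: skipping a request whose start precedes the last chosen finish has the same effect.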

Question: What is the runtime of the algorithm? Give a brief justification. Interval Scheduling

In our proof, let O be any optimal (best possible) solution: O = {o 1, o 2, …, o m } is the set of requests accepted by the optimal solution. Let A be the solution produced by our algorithm (using Rule 3): A = {a 1, a 2, …, a k }. Ideally, we would have A = O, meaning that every element of A is in O and vice versa. That may not always be true, so we settle for showing |A| = | O |. Interval Scheduling

We therefore prove that |A| = | O |. To do so, we first show that our algorithm releases the resource no later than O does. – In a way, the algorithm "stays ahead" of the game. We then use this property to show, via contradiction, that indeed |A| = | O |. Interval Scheduling

Proof that the Algorithm Stays Ahead, by induction. Let s(i) and f(i) denote the start and finish times of request i. The elements of A are indexed in the order the algorithm picked them, i.e., in increasing order of finish times; sort the elements of O in increasing order of finish times as well. We want to show that for every i between 1 and k (both inclusive), f(a i ) <= f(o i ). Base case: i = 1. The algorithm picks the request with the earliest finish time overall, so f(a 1 ) <= f(o 1 ), always. Interval Scheduling

Hypothesis: Assume that for the ith request in A, a i, it holds that f(a i ) <= f(o i ). Step: Consider a i+1. We have s(a i+1 ) >= f(a i ) and s(o i+1 ) >= f(o i ). By the hypothesis, f(a i ) <= f(o i ). Putting these together, s(o i+1 ) >= f(o i ) >= f(a i ). So both requests a i+1 and o i+1 are still available at the (i+1)st step of the algorithm. Since the algorithm picks the available request with the earliest finish time, f(a i+1 ) <= f(o i+1 ). Interval Scheduling
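The inductive step is, written out as a single chain of inequalities:

```latex
% o_{i+1} is still available after a_i because
%   s(o_{i+1}) \ge f(o_i)   (O is a set of mutually compatible requests)
%            \ge f(a_i)     (inductive hypothesis)
% and the algorithm picks the earliest-finishing available request, so
\[
  s(o_{i+1}) \;\ge\; f(o_i) \;\ge\; f(a_i)
  \quad\Longrightarrow\quad
  f(a_{i+1}) \;\le\; f(o_{i+1}).
\]
```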

Why does the above help? It tells us that the algorithm CAN accept as many requests as the best solution did. Now we show that |A| = | O |. Suppose that |A| = k, | O | = m, and m > k. Apply the above result at i = k: f(a k ) <= f(o k ). Since s(o k+1 ) >= f(o k ) >= f(a k ), the request o k+1 is still available after a k is chosen. So the algorithm could not have stopped after a k without considering o k+1, a contradiction. Interval Scheduling

Our algorithm follows what is called the greedy algorithm design. The design principle is based on the following ideas: – The solution is built incrementally. – At each step, the algorithm can look at the currently built-up solution and the input, – and make a decision (which cannot be changed later). The decision is made using a rule called the greedy rule. The Algorithm Design

Notice that not all greedy rules lead to the optimal solution. Therefore, in general, one needs to prove that the greedy rule used works. It is good to know that there are also proof design strategies: – Algorithm Stays Ahead is one such. – Structure-based proof. – Exchange argument. The Algorithm Design

Consider the following scheduling problem. It occurs in many settings such as register allocation, classroom allocation, coloring, … Formally, there are requests with start and finish times. We have to satisfy all the requests using as few resources as possible. The constraint is that the same resource cannot be used by more than one request at any given time. Example – Structure Based

There are several ways to solve this problem in the literature. One can convert it to a problem on graphs – called Interval Graph Coloring. We will not use graphs, but the solutions share common ideas. Resource Allocation

How many resources would ANY solution need? For each request i, denote the interval of the request as the line segment from s(i) to f(i). Define the depth of the input as the maximum number of intervals that pass over any single point in time. Since a resource cannot be used simultaneously by overlapping intervals, any solution needs at least depth(Input) many resources. Resource Allocation
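The depth of an input can be computed by a sweep over the sorted endpoints. A minimal sketch, assuming half-open intervals [s, f) so that back-to-back requests do not conflict:

```python
def depth(intervals):
    """Maximum number of intervals passing over any single point,
    for half-open intervals [s, f), via an endpoint sweep."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # interval opens
        events.append((f, -1))   # interval closes
    # At a tie, process closings (-1) before openings (+1),
    # so [a, b) and [b, c) do not count as overlapping.
    events.sort(key=lambda e: (e[0], e[1]))
    best = cur = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

print(depth([(0, 4), (1, 3), (2, 5), (6, 8)]))  # three intervals overlap near t = 2
```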

The lower bound on the number of resources is a structural property: it does not depend on which algorithm is used. Even the best possible (optimal) solution needs this many resources. Now we just have to design an algorithm that uses no more than this number of resources. – That then also works as a proof of optimality. Resource Allocation

Our algorithm is based on the following simple rule: start a new resource only if no previously started resource is available. Viewed as steps of the greedy algorithm: – The incremental step is to assign a resource to the current request. – The algorithm looks at the currently available resources, and – DECIDES whether to start one more resource. – Of course, once started, a resource cannot be taken back. Resource Allocation

Question: Express the above as an algorithm, and state and justify its runtime. Resource Allocation
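One possible answer, sketched in Python with a min-heap of resource finish times (the heap is an implementation choice, not prescribed by the slides): process requests in order of start time, and reuse the resource that frees up earliest if it is free, otherwise open a new one. The sort and the heap operations give O(n log n).

```python
import heapq

def allocate_resources(reqs):
    """Greedy interval partitioning: scan requests by start time and open a
    new resource only when no existing one is free. Returns a resource index
    per request (in input order) and the number of resources used."""
    order = sorted(range(len(reqs)), key=lambda i: reqs[i][0])
    free_at = []                      # min-heap of (finish_time, resource_index)
    assignment = [None] * len(reqs)
    n_resources = 0
    for i in order:
        s, f = reqs[i]
        if free_at and free_at[0][0] <= s:   # earliest-freed resource is free by s
            _, r = heapq.heappop(free_at)
        else:                                # no resource free: start a new one
            r = n_resources
            n_resources += 1
        assignment[i] = r
        heapq.heappush(free_at, (f, r))      # resource r is busy until f
    return assignment, n_resources

print(allocate_resources([(0, 3), (1, 4), (2, 5), (4, 7)]))
```

On this sample input the depth is 3 (three intervals overlap around t = 2.5), and the sketch indeed uses 3 resources, reusing resource 0 for the last request.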

We will show that our algorithm uses exactly d = Depth(Input) many resources. In fact, one can argue that if our algorithm opens a d-th resource, for any d, then at that moment d intervals overlap, so the depth of the input is at least d. Resource Allocation

This example is still about scheduling. It turns out that scheduling problems are a vast and fertile area for research – more papers are being written even now. Recall the proof of our Generic MST Algorithm: – We modified an MST T to get another tree T' such that Wt(T') <= Wt(T). – Moreover, T' is (closer to) what the algorithm builds. – So, we modify the optimal solution to make it look like our solution, without diluting its quality. An Example – Exchange Argument