Single Machine Scheduling Problem, Lesson 5: Maximum Lateness and Related Criteria. Problem 1|r_j|L_max is NP-hard.



1|r_j|L_max. Polynomially solvable cases:
- r_j = r for all j = 1,...,n. Jackson's rule: schedule the jobs in order of nondecreasing due dates.
- d_j = d for all j = 1,...,n. Schedule the jobs in order of nondecreasing release dates.
- p_j = 1 for all j = 1,...,n. Horn's rule: at any time, schedule an available job with the smallest due date.
The correctness of all these rules is easy to prove by interchange arguments.

Precedence relations. The previous results extend to the corresponding problems with precedence relations between jobs. In the case d_j = d we have to modify the release dates before applying the corresponding rule; the other cases require a similar modification of the due dates.

Modification of Release Dates. If i → j and r_i + p_i > r_j, replace r_j by r'_j := r_i + p_i. The modification is applied repeatedly along chains: if additionally j → k with r_k too small, then r'_k = r'_j + p_j = r_i + p_i + p_j.

Modification of Due Dates. If i → j and d_j − p_j < d_i, replace d_i by d'_i := d_j − p_j. The modification does not change the optimal maximum lateness: if j is processed immediately after i, the lateness of i with respect to d'_i equals the lateness of j (L'_i = L_j).
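Both modifications can be computed with one forward and one backward pass over a topological order of the precedence graph. The following Python sketch is illustrative; the function name and the (r_j, p_j, d_j) tuple encoding are assumptions, not from the text.

```python
from collections import defaultdict

def modify_dates(jobs, prec):
    """Propagate precedence constraints into release and due dates.

    jobs: dict j -> (r_j, p_j, d_j); prec: list of pairs (i, j) meaning i -> j.
    Returns dicts of modified release dates r' and due dates d'.
    Assumes the precedence graph is acyclic.
    """
    succ, pred = defaultdict(list), defaultdict(list)
    for i, j in prec:
        succ[i].append(j)
        pred[j].append(i)

    # Topological order via Kahn's algorithm.
    indeg = {j: len(pred[j]) for j in jobs}
    order = [j for j in jobs if indeg[j] == 0]
    for u in order:
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)

    # Forward pass: i -> j forces r'_j >= r'_i + p_i.
    r = {j: jobs[j][0] for j in jobs}
    for i in order:
        for j in succ[i]:
            r[j] = max(r[j], r[i] + jobs[i][1])

    # Backward pass: i -> j forces d'_i <= d'_j - p_j.
    d = {j: jobs[j][2] for j in jobs}
    for j in reversed(order):
        for i in pred[j]:
            d[i] = min(d[i], d[j] - jobs[j][1])
    return r, d
```

Each pass touches every arc once, so the modification runs in O(n + |prec|) time.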

1|prec; p_j = 1; r_j|L_max. Again, one can prove by exchange arguments that, after modifying the release times and due dates, the above scheduling rules provide optimal schedules. We give a proof only for problem 1|prec; p_j = 1; r_j|L_max; the other proofs are similar.

1|prec; p_j = 1; r_j|L_max. Theorem 4.3. The schedule constructed by Horn's rule with modified due dates is optimal. Horn's rule: at any time, schedule an available job with the smallest due date.
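Horn's rule for unit-time jobs can be sketched as follows. This is a minimal illustration assuming release times and due dates have already been modified as above; the encoding of a job as a pair (r_j, d_j) is an assumption.

```python
def horn_unit(jobs):
    """Horn's rule for 1|p_j = 1; r_j|L_max with modified due dates.

    jobs: dict j -> (r_j, d_j), all processing times equal to 1.
    Returns (schedule as list of (start time, job), L_max).
    """
    unscheduled = set(jobs)
    t = min(jobs[j][0] for j in unscheduled)
    schedule, lmax = [], float('-inf')
    while unscheduled:
        avail = [j for j in unscheduled if jobs[j][0] <= t]
        if not avail:
            # Machine is idle: jump to the next release time.
            t = min(jobs[j][0] for j in unscheduled)
            continue
        # Among available jobs, pick one with the smallest due date.
        j = min(avail, key=lambda x: jobs[x][1])
        schedule.append((t, j))
        lmax = max(lmax, t + 1 - jobs[j][1])   # C_j - d_j with C_j = t + 1
        unscheduled.remove(j)
        t += 1
    return schedule, lmax
```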

Active schedule. A schedule is called active if no job can be scheduled earlier without violating the constraints. In an active schedule, each job starts at a release time or at a finishing time of some job. There exists an optimal schedule which is active. Consider an optimal active schedule S* which coincides as long as possible with the schedule S constructed by Horn's rule.

Proof of Theorem 4.3. Let t be the first time at which the job i scheduled in S differs from the job j scheduled in S*, and let i_1,...,i_l be the successors of j scheduled in S* between t and the slot of i. Both jobs are available: r_i, r_j ≤ t. Horn's rule together with the modified due dates gives d_i ≤ d_j ≤ d_{i_ν} for ν = 1,...,l. Construct S' from S* by scheduling i at time t and shifting j, i_1,...,i_l one slot later, towards the slot formerly occupied by i. Precedence constraints are respected, and every job whose completion time increases has a due date at least d_i, so L_max does not increase. Hence S' is optimal and coincides with S longer than S* does, a contradiction.

1|prec; pmtn; r_j|L_max. Earliest Due Date rule (EDD rule): start scheduling at the smallest r_j-value. At each decision point t, given by a release time or a finishing time of some job, schedule a job j such that r_j ≤ t, all its predecessors are completed, and its modified due date is smallest. (4.7)

1|prec; pmtn; r_j|L_max. Theorem 4.4. The schedule constructed by the EDD rule is optimal for problem 1|prec; pmtn; r_j|L_max.
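The EDD rule (4.7) can be sketched in code: at each decision point we run the precedence-feasible available job with the smallest modified due date, preempting at the next release time. The (r_j, p_j, d_j) encoding and function name are illustrative assumptions.

```python
def preemptive_edd(jobs, prec=()):
    """EDD rule for 1|prec; pmtn; r_j|L_max with modified due dates.

    jobs: dict j -> (r_j, p_j, d_j); prec: pairs (i, j) meaning i -> j.
    Returns (pieces as list of (start, end, job), L_max).
    """
    pred = {j: set() for j in jobs}
    for i, j in prec:
        pred[j].add(i)
    remaining = {j: jobs[j][1] for j in jobs}
    done = set()
    t = min(jobs[j][0] for j in jobs)
    pieces, lmax = [], float('-inf')
    while len(done) < len(jobs):
        avail = [j for j in jobs if j not in done
                 and jobs[j][0] <= t and pred[j] <= done]
        if not avail:
            t = min(jobs[j][0] for j in jobs if j not in done)
            continue
        # Smallest modified due date among available jobs.
        j = min(avail, key=lambda x: jobs[x][2])
        # Run j until it finishes or until the next release time (preemption).
        future = [jobs[k][0] for k in jobs if k not in done and jobs[k][0] > t]
        nxt = min(future) if future else float('inf')
        run = min(remaining[j], nxt - t)
        pieces.append((t, t + run, j))
        remaining[j] -= run
        t += run
        if remaining[j] == 0:
            done.add(j)
            lmax = max(lmax, t - jobs[j][2])
    return pieces, lmax
```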

Proof of Theorem 4.4. Assume that both schedules coincide until time t, where S processes job i while the optimal schedule S* processes a different job j; let i_1,...,i_l be the successors of j. Both jobs are available (r_i, r_j ≤ t), and the EDD rule gives d_i ≤ d_j ≤ d_{i_ν} for ν = 1,...,l. As in the proof of Theorem 4.3, exchange the processing: schedule a piece of i at time t in S* and shift the displaced pieces of j, i_1,...,i_l towards the slots formerly used by i. All jobs whose completion times increase have due dates at least d_i, so L_max does not increase. The resulting schedule S' is optimal and coincides with S beyond t, a contradiction.

Head-Tail Problem. In the head-tail formulation each job j has a head r_j, a processing time p_j, and a tail q_j; the objective is to minimize max_j (C_j + q_j). This is equivalent to minimizing L_max, with tails playing the role of due dates via d_j = −q_j.

Largest Tail Rule. Corollary 4.5. A preemptive schedule for the one-machine head-tail problem with precedence constraints can be constructed in O(n^2) time using the following rule: at each time t given by a head or by a finishing time of some job, schedule a precedence-feasible job j with r_j ≤ t which has a largest tail.

1|prec; p_j = 1|L_max. The first step is to modify the due dates so that they are compatible with the precedence relations. Additionally, we assume that all modified due dates are nonnegative, so we have L_max ≥ 0. Using the modified due dates d_j, an optimal schedule can be calculated in O(n) time.

Two ideas. First, the jobs are processed in [0,n]. This implies that no job j with d_j ≥ n is late, even if it is processed as the last job; because L_max ≥ 0, these jobs have no influence on the L_max-value. Second, to sort the jobs we may use a bucket sorting method, i.e. we construct the sets of jobs with equal due dates, S_k := {j | d_j = k} for k = 0,...,n−1, together with one bucket for all jobs with d_j ≥ n.
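The two ideas above can be sketched together: bucket the jobs by due date (capped at n) and read the buckets out in order. A minimal sketch under the stated assumptions (unit jobs, precedence-modified nonnegative integer due dates); the function name is illustrative.

```python
def unit_lmax(d):
    """1|p_j = 1|L_max with precedence-modified due dates, via bucket sort.

    d: dict job -> modified due date (nonnegative integer).
    Jobs run in [0, n]; a job with d_j >= n is never late, so all such
    jobs share the last bucket. Returns (sequence, L_max).
    """
    n = len(d)
    buckets = [[] for _ in range(n + 1)]
    for j, dj in d.items():
        buckets[min(dj, n)].append(j)
    # Within the last bucket, order by the true due date; modified due
    # dates are strictly increasing along precedence chains, so this
    # keeps the sequence precedence-feasible.
    buckets[n].sort(key=lambda j: d[j])
    seq = [j for bucket in buckets for j in bucket]
    lmax = max(t + 1 - d[j] for t, j in enumerate(seq))
    return seq, lmax
```

Apart from the sort of the (never-late) last bucket, every job is placed and read once, which is the O(n) behaviour the slide claims.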

1|tree|Σw_jC_j. We have to schedule jobs with arbitrary processing times on a single machine so that the weighted sum of completion times is minimized. The processing times are assumed to be positive. Precedence constraints are given by a tree. We first assume that the tree is an outtree (i.e. each node in the tree has at most one predecessor). Before presenting an algorithm for outtrees, we prove some basic properties of optimal schedules.

Notation. For each job i = 1,...,n, define q_i = w_i/p_i, and let S(i) be the set of (not necessarily immediate) successors of i, including i itself. For a set of jobs I ⊆ {1,...,n} define w(I) := Σ_{i∈I} w_i, p(I) := Σ_{i∈I} p_i, and q(I) := w(I)/p(I). Two disjoint subsets I, J ⊆ {1,...,n} are parallel (I ~ J) if, for all i ∈ I and j ∈ J, neither i is a successor of j nor vice versa. In the case {i} ~ {j} we simply write i ~ j.

Property of Optimal Schedule (1). Lemma 4.6. Let π be an optimal sequence and let I, J be two blocks (sets of jobs processed consecutively) of π such that I is scheduled immediately before J. Let π' be the sequence obtained from π by swapping I and J. Then:
a) I ~ J implies q(I) ≥ q(J);
b) if I ~ J and q(I) = q(J), then π' is also optimal.

Proof of Lemma 4.6 (a). Let f := Σ w_jC_j. Swapping the adjacent blocks I and J delays every job of I by p(J) and advances every job of J by p(I); all other completion times are unchanged, so f(π') − f(π) = w(I)p(J) − w(J)p(I). Since I ~ J, the swap is feasible, and since π is optimal, f(π) ≤ f(π'). Hence 0 ≤ f(π') − f(π) = w(I)p(J) − w(J)p(I), i.e. q(I) = w(I)/p(I) ≥ w(J)/p(J) = q(J).

Proof of Lemma 4.6 (b). If I ~ J and q(I) = q(J), then w(I)p(J) = w(J)p(I), so f(π') = f(π) + w(I)p(J) − w(J)p(I) = f(π). The swapped sequence π' is feasible and has the same objective value, hence it is also optimal.

Property of Optimal Schedule (2). Theorem 4.7. Let i, j be jobs with i → j and q_j = max{q_k | k ∈ S(i)}. Then there exists an optimal schedule in which i is processed immediately before j.

Proof of Theorem 4.7. Each schedule can be represented by a sequence. Let π be an optimal sequence in which the number l of jobs scheduled between i and j is minimal. Assume l > 0, and let k be the job processed immediately before j. Recall i → j and q_j = max{q_k | k ∈ S(i)}.

Case 1: k ∈ S(i). Since i → j and the precedence graph is an outtree, k ~ j, so k and j may be swapped. Lemma 4.6(a) applied to the optimal schedule π gives q(k) ≥ q(j), while q_j = max{q_k | k ∈ S(i)} gives q(k) ≤ q(j); hence q(k) = q(j). By Lemma 4.6(b), the swap yields an optimal schedule π' with fewer jobs between i and j, contradicting the minimality of l.

Case 2: k ∉ S(i). Let h be the last job scheduled between i and j with h ∈ S(i) (take h = i if no such job exists), and let K be the block of jobs scheduled between h and j; by the choice of h, r ∉ S(i) for every r ∈ K.

Since the precedence graph is an outtree and i → j, every predecessor e of j satisfies i ∈ S(e). No r ∈ K can be such a predecessor, because i is scheduled before r and hence i ∉ S(r); and j is scheduled after K, so j precedes no r ∈ K. Hence K ~ j, and Lemma 4.6(a) gives q(K) ≥ q(j).

Similarly K ~ h: no r ∈ K lies in S(h) ⊆ S(i), and h, being scheduled before K, has no predecessor in K. Lemma 4.6(a) gives q(h) ≥ q(K). On the other hand, h ∈ S(i) and q_j = max{q_k | k ∈ S(i)} imply q(h) ≤ q(j). Hence q(h) = q(K) = q(j), and by Lemma 4.6(b) we may first swap h with the block K and then swap h with j without losing optimality. The resulting optimal schedule has fewer than l jobs between i and j, a contradiction.

Idea of the Algorithm. The conditions of Theorem 4.7 are satisfied if we choose a job j, different from the root, with maximal q_j-value, together with its unique father i. Since there exists an optimal schedule in which i is processed immediately before j, we merge nodes i and j and make all sons of j additional sons of i. The new node i, which represents the subsequence π_i : i, j, gets the label q(i) := q(J_i) with J_i = {i, j}. Note that for a son of j, its new father i (represented by J_i) can be identified by looking for the set J_i which contains j.

Merging Procedure. The merging process is applied recursively. In the general step, each node i of the current tree represents a set of jobs J_i and a corresponding sequence π_i of the jobs in J_i, where i is the first job of this sequence. We select a node j, different from the root, with maximal q(j)-value. Let f be the unique father of j in the original outtree. We then find the node i of the current tree with f ∈ J_i and merge j into i, replacing J_i and π_i by J_i ∪ J_j and π_i ○ π_j, where π_i ○ π_j is the concatenation of the sequences π_i and π_j.
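Under an illustrative encoding (dicts keyed by job, a parent map for the outtree; all names are assumptions), the merging procedure can be sketched as follows:

```python
def outtree_wc(p, w, parent):
    """Merging procedure for 1|outtree|sum w_j C_j.

    p, w: dicts job -> processing time / weight.
    parent: job -> its unique father in the outtree (root maps to None).
    Returns an optimal job sequence.
    """
    root = next(j for j in p if parent[j] is None)
    groups = {j: [j] for j in p}     # node -> its sequence pi_i
    gp = {j: p[j] for j in p}        # p(J_i)
    gw = {j: w[j] for j in p}        # w(J_i)
    member = {j: j for j in p}       # job -> node currently containing it
    alive = set(p)
    while len(alive) > 1:
        # Pick a non-root node with maximal q(J) = w(J)/p(J).
        j = max((v for v in alive if v != root), key=lambda v: gw[v] / gp[v])
        # Find the node containing j's father in the original outtree.
        i = member[parent[j]]
        groups[i] += groups[j]       # pi_i := pi_i o pi_j
        gp[i] += gp[j]
        gw[i] += gw[j]
        for k in groups[j]:
            member[k] = i
        alive.remove(j)
    return groups[root]
```

A naive scan for the maximal q(J) at each of the n − 1 merges gives O(n^2) overall; a priority queue would speed this up.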

Optimality. Theorem 4.8. The merging procedure can be implemented in polynomial time and calculates an optimal sequence for the 1|outtree|Σw_jC_j problem.

Proof of Theorem 4.8. We prove optimality by induction on the number of jobs. Clearly the procedure is correct if there is only one job. Let P be a problem with n jobs, and assume that i, j are the first jobs merged by the algorithm. Let P' be the resulting problem with n − 1 jobs, in which i is replaced by I := {i, j} with w(I) = w(i) + w(j) and p(I) = p(i) + p(j).

Proof of Theorem 4.8 (2). Let R be the set of sequences of the form π: π(1),...,π(k), i, j, π(k+3),...,π(n), and let R' be the set of sequences of the form π': π(1),...,π(k), I, π(k+3),...,π(n). Note that by Theorem 4.7 the set R contains an optimal sequence. For the corresponding objective function values f_n(π) and f_{n−1}(π') we have
f_n(π) − f_{n−1}(π') = w(i)p(i) + w(j)(p(i) + p(j)) − (w(i) + w(j))(p(i) + p(j)) = −w(i)p(j).
Since this difference is a constant, we conclude that π is optimal ⇔ π' is optimal.

1|intree|Σw_jC_j. To solve a 1|intree|Σw_jC_j problem P, we reduce it to a 1|outtree|Σw'_jC_j problem P' with: i is a successor of j in P' ⇔ j is a successor of i in P, and w'_j = −w_j for j = 1,...,n. Then a sequence π: 1,...,n is feasible for P if and only if the reversed sequence π': n,...,1 is feasible for P'.


1||Σw_jC_j. Smith's ratio rule: sequence the jobs in order of nondecreasing ratios p_j/w_j, which applies if w_j > 0. Homework: prove, using interchange arguments, that Smith's ratio rule yields an optimal sequence.
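Smith's ratio rule is a one-line sort; the sketch below also evaluates the objective (function name and encoding are illustrative):

```python
def smith_wspt(p, w):
    """Smith's ratio rule (WSPT) for 1||sum w_j C_j.

    p, w: dicts job -> processing time / weight, all w_j > 0.
    Returns (sequence, sum of w_j C_j).
    """
    # Sort by nondecreasing p_j / w_j.
    seq = sorted(p, key=lambda j: p[j] / w[j])
    t, total = 0, 0
    for j in seq:
        t += p[j]          # completion time C_j
        total += w[j] * t
    return seq, total
```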

1||ΣC_j. Smith's rule (SPT rule): sequence the jobs in order of nondecreasing processing times.

1|pmtn; r_j|ΣC_j. Modified Smith's rule: at each release time or finishing time of a job, schedule an available unfinished job with the smallest remaining processing time.
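The modified Smith's rule can be sketched with a heap of remaining processing times, preempting at each release time. The function name and the (r_j, p_j) encoding are illustrative assumptions.

```python
import heapq

def srpt(jobs):
    """Modified Smith's rule (SRPT) for 1|pmtn; r_j|sum C_j.

    jobs: dict j -> (r_j, p_j). Returns dict job -> completion time.
    """
    events = sorted(jobs, key=lambda j: jobs[j][0])   # jobs by release date
    heap, completion = [], {}
    t, k = 0, 0
    while heap or k < len(events):
        if not heap:
            # Machine idle: jump to the next release time.
            t = max(t, jobs[events[k]][0])
        # Release everything available at time t.
        while k < len(events) and jobs[events[k]][0] <= t:
            j = events[k]
            heapq.heappush(heap, (jobs[j][1], j))     # (remaining time, job)
            k += 1
        rem, j = heapq.heappop(heap)                  # smallest remaining time
        nxt = jobs[events[k]][0] if k < len(events) else float('inf')
        run = min(rem, nxt - t)                       # run until done or preempted
        t += run
        if run < rem:
            heapq.heappush(heap, (rem - run, j))
        else:
            completion[j] = t
    return completion
```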

Optimality. Theorem 4.9. A schedule constructed by the modified Smith's rule is optimal for problem 1|pmtn; r_j|ΣC_j.

Proof of Theorem 4.9. Assume that the schedule S constructed by the modified Smith's rule and an optimal schedule S* coincide until time t, where S processes job i while S* processes a different job j. By the modified Smith's rule, the remaining processing time of j at time t is not smaller than that of i. In S*, exchange the time slots devoted to i and j after time t: then i completes no later than j did before, j completes exactly when i did before, and all other completion times are unchanged, so ΣC_j does not increase. Hence the resulting schedule S' is optimal and coincides with S beyond t, a contradiction.

Exercise: Partition. Given n positive integers s_1, s_2,...,s_n, is there a subset J ⊆ I = {1,...,n} such that Σ_{j∈J} s_j = ½ Σ_{j∈I} s_j? The partition problem is NP-complete. Show that problem 1|r_j|L_max is NP-hard by reducing the partition problem to it. (Hint: given an instance of the partition problem, construct an instance of jobs with release dates such that, if a partition exists, no job is late, and if no partition exists, at least one job is late.)
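As a concrete illustration of the hint, one standard construction (a sketch, not necessarily the intended solution) adds an "enforcer" job that is on time only if it occupies exactly the interval [b, b+1]:

```python
def partition_to_lmax(s):
    """Reduce PARTITION to 1|r_j|L_max (one standard construction).

    Given positive integers s with even sum 2b, build jobs (r_j, p_j, d_j)
    such that a schedule with L_max <= 0 exists iff a subset of s sums to b.
    """
    total = sum(s)
    assert total % 2 == 0, "no partition can exist if the sum is odd"
    b = total // 2
    # Partition jobs: released at 0, never late on their own.
    jobs = [(0, sj, total + 1) for sj in s]
    # Enforcer job: to be on time it must run exactly in [b, b+1].
    jobs.append((b, 1, b + 1))
    return jobs
```

If a partition exists, one half fills [0, b], the enforcer runs in [b, b+1], and the other half fills [b+1, 2b+1], so no job is late; if no partition exists, either the enforcer or some partition job must finish late.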

Exercise 5.1. Find an optimal schedule for the following instance of the 1|pmtn; r_j|ΣC_j problem. [Table: release dates r_j and processing times p_j for jobs J_1,...,J_7.]

Exercise 5.2. Find an optimal schedule for the following instance of the 1|outtree|Σw_jC_j problem. [Table: processing times p_j and weights w_j for jobs J_1,...,J_7.]

Running time. The complexity of the algorithm is O(n^2), which can be seen as follows. Excluding the recursive calls in Step 7, the number of steps of Procedure Decompose is O(|S|). Thus, for the number f(n) of computational steps we have the recursion f(n) = cn + Σ f(n_i), where n_i is the number of jobs in the i-th block and Σ n_i ≤ n.