Approximation Algorithms for Scheduling


O3||Cmax Reduction from PARTITION. Given natural numbers a1,…,an with a1 + … + an = 2B, is there a subset S ⊆ {1,…,n} such that the sum of the ai over S equals B? Reduction: n+1 jobs; one job J0 with p01 = p02 = p03 = B, and jobs Ji with pi1 = pi2 = pi3 = ai, i = 1,…,n. Does there exist a schedule with Cmax = 3B? [Figure: Gantt chart on M1, M2, M3 with time marks B, 2B, 3B.]

[Figure: two Gantt charts on M1, M2, M3 with time marks B, 2B, 3B, showing how the operations of the partition jobs fill the gaps around job J0 in both directions of the reduction.]

F3||Cmax Reduction from 3-PARTITION. Given natural numbers a1,…,a3n and B with a1 + … + a3n = nB and B/4 < ai < B/2 for all i, can the numbers a1,…,a3n be partitioned into n triples S1,…,Sn such that the sum of the numbers in each triple equals B? Reduction: n–1 “special” jobs and 3n “partition” jobs. Special jobs: q11 = B, q12 = B, q13 = 2B; qi1 = 2B, qi2 = B, qi3 = 2B for i = 2,…,n–2; q(n–1)1 = 2B, q(n–1)2 = B, q(n–1)3 = B. Partition jobs: pi1 = 0, pi2 = ai, pi3 = 0 for i = 1,…,3n. Does there exist a schedule with Cmax = (2n–1)B? [Figure: Gantt chart of the special jobs 1,…,n–1 on M1, M2, M3 with time marks 2B, 4B, 6B,…,(2n–1)B.]

Approximation Scheduling problems provide a good starting point for the study of approximation algorithms, and historically they were among the first problems to be analyzed this way. The first worst-case analysis of an approximation algorithm was probably given in 1966, when Ron Graham analyzed a simple procedure for one of the most basic scheduling problems: minimizing makespan in an identical parallel machine environment.

P||Cmax We have a set of n jobs, J1,..., Jn, and m identical machines, M1,..., Mm. Each job must be processed without interruption for a time pj > 0 on one of the m machines, each of which can process at most one job at a time. The objective is to minimize the makespan Cmax = maxj Cj, the completion time of the last job to finish.

List Scheduling (LS) We are given a list of the jobs in some arbitrary order. Whenever a machine becomes available, the next job on the list is assigned to begin processing on that machine.
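As a sketch, the rule above can be implemented with a priority queue of machine-available times; this minimal Python version (function and variable names are my own) assigns each job, in list order, to the machine that frees up first.

```python
import heapq

def list_schedule(p, m):
    """Graham's list scheduling for P||Cmax.
    p: processing times in list order; m: number of machines.
    Returns (makespan, assignment), assignment[j] = machine of job j."""
    # heap of (time the machine becomes free, machine index)
    machines = [(0, i) for i in range(m)]
    heapq.heapify(machines)
    assignment = []
    makespan = 0
    for pj in p:
        free, i = heapq.heappop(machines)          # earliest-available machine
        heapq.heappush(machines, (free + pj, i))   # it is busy until free + pj
        assignment.append(i)
        makespan = max(makespan, free + pj)
    return makespan, assignment
```

On the instance of six unit jobs followed by one job of length 3 on m = 3 machines, this returns makespan 5 while the optimum is 3, matching the tight ratio 2 – 1/m = 5/3.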

First Approximation Algorithm Theorem 10.1 (Graham [1966]) The List Scheduling Algorithm is a 2 – (1/m) factor approximation algorithm for P||Cmax.

Proof Let Jk be a job that completes last, starting at time sk and completing at Ck = Cmax(LS). Until time sk no machine is idle, so sk ≤ (1/m) Σj≠k pj. Hence Cmax(LS) = sk + pk ≤ (1/m) Σj pj + (1 – 1/m) pk ≤ C*max + (1 – 1/m) C*max = (2 – 1/m) C*max, using the lower bounds C*max ≥ (1/m) Σj pj and C*max ≥ pk on the optimal makespan C*max. [Figure: Gantt chart on M1,…,M4 with job Jk starting at sk and completing at Ck.]

Notation Jobs are denoted as Jj, and n denotes the number of jobs. Machines are denoted Mi, and when there is more than one machine, we use m to denote the number of machines. A schedule σ is an assignment of jobs to machines and associated starting times s1(σ),...,sn(σ) of jobs J1,..., Jn, respectively.

Feasible Schedule A feasible schedule has the property that each machine processes at most one job at any point in time; that is, if two jobs Jj and Jk are assigned to the same machine, then either sj + pj ≤ sk or sk + pk ≤ sj. We use Cj to denote the time at which job Jj completes processing in a given schedule, i.e., Cj = sj + pj. The schedule produced by a given algorithm A is denoted σ(A).
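The feasibility condition above is easy to check mechanically. A small Python sketch (the representation of a schedule as (machine, start, processing time) triples is my own choice):

```python
def is_feasible(schedule):
    """Check that no machine runs two jobs at once.
    schedule: list of (machine, start, processing_time) per job.
    Jobs J_j, J_k on the same machine must satisfy
    s_j + p_j <= s_k or s_k + p_k <= s_j."""
    by_machine = {}
    for machine, s, p in schedule:
        by_machine.setdefault(machine, []).append((s, s + p))
    for intervals in by_machine.values():
        intervals.sort()  # sort by start time, then compare neighbors
        for (s1, c1), (s2, c2) in zip(intervals, intervals[1:]):
            if s2 < c1:   # next job starts before the previous completes
                return False
    return True
```

For example, two non-overlapping jobs on machine 0 pass the check, while two overlapping ones fail it.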

Precedence constraints

1||Lmax We consider a scheduling problem on a single machine. In a given instance, each job Jj has a due date dj associated with it, as well as a processing time pj > 0; for a given schedule σ we define the lateness of a job as Lj(σ) = Cj(σ) – dj. In particular, if a job completes before its due date, its “lateness” is negative. Our goal is to find a schedule that minimizes the maximum lateness over all jobs, Lmax(σ) = maxj Lj(σ).
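For 1||Lmax itself (no release dates), sequencing the jobs by earliest due date minimizes the maximum lateness. A minimal Python sketch (function name is my own):

```python
def edd_lmax(jobs):
    """Earliest-due-date (EDD) rule for 1||Lmax: sequence jobs by
    nondecreasing due date and report the maximum lateness
    L_j = C_j - d_j (negative if a job finishes early).
    jobs: list of (p_j, d_j) pairs. EDD is optimal for 1||Lmax."""
    t = 0
    lmax = float("-inf")
    for p, d in sorted(jobs, key=lambda job: job[1]):  # by due date
        t += p              # completion time on the single machine
        lmax = max(lmax, t - d)
    return lmax
```

For instance, jobs (p, d) = (2, 3), (1, 2), (3, 9) are sequenced as J2, J1, J3 and attain Lmax = 0.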

Delivery times We consider a new model with delivery times. In this model, each job Jj must be processed on the machine and then spend an additional amount of time qj being delivered. This “delivery” can be interpreted as an additional processing requirement on a non-bottleneck machine, or as a physical delivery (travel) time. The delivery-completion time of a job Jj is sj + pj + qj.

1|rj|Lmax Next we introduce job release dates into the model: each job Jj cannot begin its processing before its release date rj. This problem is strongly NP-hard; a reduction from 3-PARTITION is relatively straightforward.

Lower Bounds

List Scheduling Algorithm (LS) Let us first consider a straightforward generalization of Graham’s list scheduling algorithm LS; whenever the machine becomes available, schedule the first available job on the list. A job is defined as “available” if it has already been released.

Algorithm LS Theorem 10.2 Algorithm LS produces a schedule whose objective value is at most twice the optimal value, and this bound is tight.

Proof

Bad case
job   r   p   q
J1    0   M   0
J2    1   1   M
LS: J1, J2 gives objective 2M + 1; OPT: J2, J1 gives M + 2; ratio = (2M+1)/(M+2) → 2 as M → ∞.

Jackson’s Rule (J) Apply List Scheduling Algorithm to a list with jobs ordered by non-increasing delivery times (equivalent to an EDD, or earliest due-date, ordering). Of course, the two-job instance just given illustrates that the worst-case performance does not improve over arbitrary list scheduling.
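Jackson's rule can be sketched in Python (function and variable names are my own); whenever the machine is free, it starts the released job with the largest delivery time, idling only when no job has been released.

```python
import heapq

def jackson(jobs):
    """Jackson's rule for the single-machine model with release
    dates r_j and delivery times q_j.
    jobs: list of (r_j, p_j, q_j).
    Returns (max_j (s_j + p_j + q_j), job order)."""
    pending = sorted(range(len(jobs)), key=lambda j: jobs[j][0])  # by release
    ready = []                      # max-heap on q via negated key
    t, i, obj, order = 0, 0, 0, []
    while i < len(pending) or ready:
        # move every job released by time t into the ready heap
        while i < len(pending) and jobs[pending[i]][0] <= t:
            j = pending[i]
            heapq.heappush(ready, (-jobs[j][2], j))
            i += 1
        if not ready:               # machine idles until the next release
            t = jobs[pending[i]][0]
            continue
        _, j = heapq.heappop(ready)
        t += jobs[j][1]             # process the chosen job
        obj = max(obj, t + jobs[j][2])
        order.append(j)
    return obj, order
```

On the two-job bad instance (r, p, q) = (0, M, 0) and (1, 1, M) with M = 10, it returns objective 21 = 2M + 1 against the optimum M + 2 = 12, confirming that the worst-case ratio does not improve.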

Algorithm J Corollary 10.3 Jackson’s rule also produces a schedule whose objective value is at most twice the optimal value, and this bound is tight.

Critical job Let us examine the heuristic J in slightly more detail. Define a critical job Jc as one whose lateness attains that of the schedule, Lc = Lmax.

Critical job Jc, Interference job Jb Let the critical sequence S be the set of jobs processed contiguously, without idle time, ending with the critical job Jc, so that Lmax = Lc. The interference job Jb is the last job in S with qb < qc; the jobs of S scheduled from Jb onward form the subsequence S′ ⊆ S. [Figure: Gantt chart of the critical sequence S with jobs Ja, Jb, Jc, starting at ra.]

A simple 3/2-Approximation Algorithm

Algorithm NS
1. Construct the schedule given by Jackson’s rule, and determine a critical job Jc and the critical sequence for this schedule.
2. If there exists no interference job Jb, then stop and return this schedule.
3. If min{pb, qc} ≤ P/2, then stop and return the schedule of Step 1.
4. Otherwise, order the jobs of A according to nondecreasing release dates and the jobs of B according to nonincreasing delivery times. (Note that Jb = Jd.) Construct the schedule given by the ordered set A, followed by Jb, followed by the ordered set B. Return the better of this schedule and that constructed in Step 1.

Algorithm NS (2) Theorem 10.4 (Nowicki & Smutnicki [1994]) Algorithm NS is a (3/2)-approximation algorithm.

Proof (1st Schedule) If there is no interference job Jb, the jobs of the critical sequence S are processed without idle time and in nonincreasing order of delivery times, so the schedule is optimal; the schedule is likewise optimal in the cases b = d and a = b = c. [Figure: Gantt charts of the critical sequence S with jobs Ja, Jb, Jc starting at ra, with Lmax = Lc.]

Proof (2nd Schedule) Partition the jobs around Jb into A = {Jj : rj ≤ qj} and B = {Jj : rj > qj}. [Figure: Gantt chart with the jobs of A scheduled before Jb and the jobs of B after Jb.]

Case 1.1: Jh, Jk ∈ A. Here Jb ∉ Q and rh ≤ rk. [Figure: segment Q containing Jh,…,Jk within the block A (rj ≤ qj).]

Case 1.2: Jh, Jk ∈ B. Here Jb ∉ Q and qk ≤ qh. [Figure: segment Q containing Jh,…,Jk within the block B (rj > qj).]

Case 2.1: Jk = Jb. Case 2.1.1: all of the jobs of Q\{Jb} are processed before job Jb in σ*. [Figure: segment Q ending with Jk = Jb, spanning the blocks A (rj ≤ qj) and B (rj > qj).]

Case 2.1 (continued): Jk = Jb. Case 2.1.2: at least one job of Q\{Jb} is processed after job Jb in σ*. Here Q\{Jb} ⊆ A, so rj ≤ qj for these jobs. [Figure: as before.]

Case 2.2: Jh = Jb. Case 2.2.1: all of the jobs of Q\{Jb} are processed after job Jb in σ*. [Figure: segment Q beginning with Jh = Jb, spanning the blocks A (rj ≤ qj) and B (rj > qj).]

Case 2.2 (continued): Jh = Jb. Case 2.2.2: at least one job of Q\{Jb} is processed before job Jb in σ*. Here Q\{Jb} ⊆ B, so rj > qj for these jobs. [Figure: as before.]

Case 3: Jh ∈ A, Jk ∈ B. Let Ji be the job of Q that is processed first in σ*. If Ji ∈ A, then rh ≤ ri; if Ji ∈ B, then ri ≥ qi ≥ qk. Here Pi = Pb. [Figure: segment Q containing Jh,…,Jb,…,Jk spanning the blocks A (rj ≤ qj) and B (rj > qj).]

PTAS While 1|rj|Lmax is strongly NP-hard, it admits a polynomial approximation scheme, that is, a family of (1 + 1/k)-approximation algorithms for k = 1, 2, 3, …. To explain the scheme, we first examine an artificial situation that we shall use as a tool in analyzing the eventual algorithm.

Modified Instance

Modified Instance (2) [Figure: schedules of instance I with jobs Ji, Jj and release dates ri, rj, before and after the modification, each with its Lmax marked.]

Big and small jobs Let us analyze the schedule σJ delivered by Jackson’s rule on this instance; Jc is the critical job and Jb the interference job. If there is no interference job, the schedule σJ is optimal. Otherwise Jb ∈ A. [Figure: Gantt chart with the job sets A and B and jobs Jb, Jj.]

Big and small jobs (2) Jc is the critical job; Jb is the interference job. Case Jb ∈ B: job Jc begins processing after job Jb in σ*. [Figure: Gantt chart with the job sets A and B and jobs Ji, Jb.]

Conclusion

How to design it? In fact, in order to reconstruct σ it suffices to know only the positions that the jobs of B occupy in σ; we can then run Jackson’s rule with the exception that when a position belonging to a job of B is reached, we place that job next in the ordering. Of course, we do not know the positions of the jobs of B in σ either, but by trying every possible choice of positions for the jobs of B and selecting the best schedule generated, we are guaranteed to find a schedule as good as σ. There are O(n^|B|) such choices.
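The enumeration described above can be sketched as follows; this is a hedged sketch in which the (r, p, q) job representation and all function names are my own, with ordered selections of positions giving at most n^|B| choices.

```python
from itertools import permutations

def schedule_with_fixed_positions(jobs, slots):
    """Run Jackson's rule, except that slots[k], when not None,
    forces a particular job into the k-th position.
    jobs: list of (r, p, q). Returns max_j (s_j + p_j + q_j)."""
    n = len(jobs)
    fixed = {j for j in slots if j is not None}
    done = [False] * n
    t = obj = 0
    for k in range(n):
        if slots[k] is not None:
            j = slots[k]                     # reserved position: take that job
        else:
            free = [j for j in range(n) if not done[j] and j not in fixed]
            avail = [j for j in free if jobs[j][0] <= t]
            if not avail:                    # idle until the next release
                t = min(jobs[j][0] for j in free)
                avail = [j for j in free if jobs[j][0] <= t]
            j = max(avail, key=lambda i: jobs[i][2])  # largest delivery time
        r, p, q = jobs[j]
        t = max(t, r) + p
        obj = max(obj, t + q)
        done[j] = True
    return obj

def best_over_big_positions(jobs, big):
    """Try every assignment of distinct positions to the jobs in `big`
    and return the best objective value found."""
    n, best = len(jobs), float("inf")
    for positions in permutations(range(n), len(big)):
        slots = [None] * n
        for j, k in zip(big, positions):
            slots[k] = j
        best = min(best, schedule_with_fixed_positions(jobs, slots))
    return best
```

With no positions fixed, the first function reproduces plain Jackson's rule; on the two-job instance (0, 10, 0), (1, 1, 10), fixing both jobs' positions recovers the optimal value 12 where Jackson's rule gives 21.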

How to choose δ?

Exercise LPT rule: order the job list by nonincreasing processing times, then apply the List Scheduling Algorithm. Prove that LPT is a 4/3-factor approximation algorithm for P||Cmax.
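For experimenting with the exercise, LPT is a one-line change to list scheduling; a minimal Python sketch (function name is my own):

```python
def lpt(p, m):
    """Longest Processing Time first for P||Cmax: sort jobs by
    nonincreasing processing time, then greedily assign each
    to the currently least-loaded machine."""
    loads = [0] * m
    for pj in sorted(p, reverse=True):
        i = min(range(m), key=lambda k: loads[k])  # least-loaded machine
        loads[i] += pj
    return max(loads)
```

On the instance p = (3, 3, 2, 2, 2) with m = 2 machines, LPT yields makespan 7 while the optimum is 6 ({3, 3} vs. {2, 2, 2}), so the ratio 7/6 shows the analysis in the exercise cannot be pushed below 4/3 – 1/(3m) for m = 2.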