Accept or Reject: Can we get the work done in time? Marjan van den Akker Joint work with Han Hoogeveen.


Outline of the talk
- Problem description
- Review of Moore-Hodgson
- Stochastic processing times
  – consequences
  – four classes of instances
- Summary

Problem setting
- n jobs become available at time 0
- Known processing times
- Known due dates
- Known reward for completing on time (for now: 1 per job)
- Decision to make now: accept or reject each job
- Goal: minimize the number of tardy jobs on a single machine

Moore-Hodgson
1. Number the jobs in EDD order
2. Let S denote the EDD schedule
3. Find the first job in S that is not on time (say this is job j)
4. Remove from S the largest job among jobs 1,…,j still in S
5. Repeat from Step 3 with the new schedule S until all jobs are on time
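The steps above can be sketched in a few lines of code. This is a minimal sketch, not the authors' implementation; the jobs in the example are illustrative (processing time, due date) pairs.

```python
def moore_hodgson(jobs):
    """Return the maximum number of on-time jobs on a single machine.

    jobs: list of (processing_time, due_date) pairs.
    """
    # Step 1: number the jobs in EDD order (non-decreasing due dates).
    jobs = sorted(jobs, key=lambda job: job[1])
    on_time = []    # current candidate schedule, kept in EDD order
    total = 0       # total processing time of the on-time set
    for p, d in jobs:
        on_time.append((p, d))
        total += p
        if total > d:
            # The job just added finishes late: remove the largest job
            # scheduled so far (Steps 3-4); the remaining jobs finish earlier.
            largest = max(on_time, key=lambda job: job[0])
            on_time.remove(largest)
            total -= largest[0]
    return len(on_time)

# Rejecting the long first job lets the other two finish on time:
print(moore_hodgson([(4, 4), (3, 5), (2, 6)]))  # prints 2
```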

Solving the problem from scratch
Observations:
- Schedule the on-time jobs first
- The on-time jobs can be put in EDD order
- The late jobs can be forgotten (rejected)
- Knowing the on-time set is therefore sufficient

Dominance rule
Let E_1 and E_2 be two subsets of jobs 1,…,j such that:
- all jobs in E_1 and E_2 are on time (both subsets are feasible)
- E_1 and E_2 have the same cardinality
- the total processing time of the jobs in E_2 is more than the total processing time of the jobs in E_1
Then subset E_2 can be discarded.

Proof (sketch)
Take an optimal schedule starting with E_2 (the remainder consists of jobs from j+1,…,n). Exchanging E_2 for E_1 keeps every job on time: the jobs of E_1 are on time when scheduled from time 0, and the remainder starts no later because E_1 has smaller total processing time.
[Figure lost in transcription: a timeline from time 0 showing E_2 followed by the remainder, with E_1 substituted for E_2]

Use dynamic programming
- Find E_j*(k): a feasible subset of jobs 1,…,j with cardinality k and minimum total processing time
- Use state variables f_j(k) equal to p(E_j*(k))
- Define z_j as the maximum number of on-time jobs among jobs 1,…,j
- Initialization: f_j(k) = 0 for j = k = 0 (and +∞ otherwise)

Recurrence relation
- Put f_{j+1}(0) = 0
- f_{j+1}(k) = min{ f_j(k), f_j(k-1) + p_{j+1} } for k = 1,…,z_j
- If f_j(z_j) + p_{j+1} ≤ d_{j+1}, then z_{j+1} = z_j + 1 and f_{j+1}(z_{j+1}) = f_j(z_j) + p_{j+1}; otherwise z_{j+1} = z_j
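A sketch of this recurrence in code (my own illustration; the example jobs are made up). The array f plays the role of f_j(·): f[k] is the minimum total processing time of a feasible on-time subset of cardinality k, and its length tracks z_j.

```python
def max_on_time_dp(jobs):
    """DP over f_j(k); jobs are (processing_time, due_date) pairs."""
    jobs = sorted(jobs, key=lambda job: job[1])  # EDD order
    f = [0]               # f[k] for k = 0..z_j; initially z_0 = 0
    for p, d in jobs:
        z = len(f) - 1    # z_j before considering job j+1
        if f[z] + p <= d:
            f.append(f[z] + p)        # z_{j+1} = z_j + 1
        # f_{j+1}(k) = min{f_j(k), f_j(k-1) + p_{j+1}} for k = z_j..1;
        # going downwards keeps the old value f_j(k-1) on the right-hand side
        for k in range(z, 0, -1):
            f[k] = min(f[k], f[k - 1] + p)
    return len(f) - 1     # maximum number of on-time jobs

print(max_on_time_dp([(4, 4), (3, 5), (2, 6)]))  # prints 2
```

On the same instance as before, the DP agrees with Moore-Hodgson, as the next slide explains.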

Relation with Moore-Hodgson
- The set E_j*(k-1) can be computed from E_j*(k) by removing the largest job!
- The recurrence f_{j+1}(k) = min{ f_j(k), f_j(k-1) + p_{j+1} } therefore amounts to: remove the largest job

Moore-Hodgson
1. Number the jobs in EDD order
2. Compute the values f_j(z_j):
   – if f_j(z_j) + p_{j+1} ≤ d_{j+1}, then z_{j+1} = z_j + 1 and f_{j+1}(z_{j+1}) = f_j(z_j) + p_{j+1}, i.e. job J_{j+1} is added
   – else z_{j+1} = z_j and f_{j+1}(z_{j+1}) = min{ f_j(z_j), f_j(z_j - 1) + p_{j+1} }, i.e. the largest job is removed

Stochastic processing times
- Completion times are uncertain
- The decision to accept or reject must be made before running the schedule
- When do you consider a job on time?

On time stochastically
- Work with a sequence of on-time jobs (instead of a set of completion times)
- Add a job to this sequence and compute the probability that it is ready on time
- If this probability is large enough (at least equal to the minimum success probability, msp), then accept the job as on time

Classes of processing times
- Gamma distribution
- Negative binomial distribution
- Equally disturbed processing times p_j
- Normal distribution
The jobs must be independent.

Class 1: Gamma distribution
- Parameters a_j and b (b is common to all jobs)
- If x_1 and x_2 are independent and gamma distributed with parameters (a_1, b) and (a_2, b), then x_1 + x_2 is gamma distributed with parameters (a_1 + a_2, b)

More gamma
- Define S as the set consisting of job j and all its predecessors in the schedule
- Define p(S) as the sum of the processing times of the jobs in S
- Then C_j = p(S) follows a gamma distribution with parameters a(S) and b

Even more gamma
- Denote the msp of job j by y_j
- Job j is on time if the probability that C_j is no more than d_j is at least y_j
- This probability depends on a(S) only
- Given d_j and y_j, you can compute the maximum value of a(S) such that P(C_j ≤ d_j) is at least y_j: call it D_j

Last of Gamma
- Treat the values D_j as ordinary due dates
- Treat the values a_j as ordinary deterministic processing times
- The dominance rule still holds
- You can use Moore-Hodgson!
- Negative binomial distribution: similar

More complicated problems
- Normally distributed processing times
- [Two-job example table lost in transcription: columns p_j, Var_j, d_j, msp_j for jobs 1 and 2]
- Optimum: first job 2 and then job 1
- From now on: equal msp values ⇒ EDD order
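Since the slide's numbers were lost in transcription, here is a hypothetical two-job instance of my own construction that shows the same effect: with unequal msp values the EDD order can be suboptimal. All parameter values below are illustrative, not the talk's.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def on_time_count(order, jobs):
    """Count jobs whose on-time probability reaches their msp in this order."""
    mean = var = 0.0
    count = 0
    for j in order:
        p, v, d, msp = jobs[j]
        mean += p          # C_j ~ N(mean, var): sums for independent normals
        var += v
        if phi((d - mean) / math.sqrt(var)) >= msp:
            count += 1
    return count

# Hypothetical data: job -> (expected p_j, Var_j, d_j, msp_j)
jobs = {1: (2, 1, 4.1, 0.50), 2: (2, 1, 5.0, 0.99)}
print(on_time_count([1, 2], jobs))  # EDD order (d_1 < d_2): prints 1
print(on_time_count([2, 1], jobs))  # job 2 first: prints 2
```

Job 2 demands a very high success probability, so it must go first even though its due date is later; with equal msp values this conflict disappears.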

Equal disturbances
- The on-time probability of job j depends on:
  – the number of predecessors (on-time jobs before j)
  – the total processing time of its predecessors
- Dominance rule: given the cardinality of the on-time set, take the one with minimum total processing time
- Use dynamic programming with state variables f_j(k) that indicate the minimum total processing time possible (as before)
- Hence: Moore-Hodgson solves it!

Normal distribution (1)
- Parameters: the expected processing time and the variance of job j
- Reminder: the expected value and the variance of X_1 + X_2 are equal to the respective sums (for independent X_1 and X_2)
- Needed for computing the on-time probability of job j:
  – the total processing time of its predecessors
  – the total variance of its predecessors

Normal distribution (2)
- Dominance rule: if cardinality and total processing time are equal, then take the set with minimum total variance (valid for msp > 0.5)
- Use state variables f_j(k, P):
  – k is the cardinality of the on-time set
  – P is the total processing time of the on-time set
  – f_j(k, P) is the minimum total variance possible
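A sketch of this DP as a dictionary keyed by (k, P) (my own illustration, not the authors' code; it assumes integer expected processing times so that P stays discrete, and one msp value shared by all jobs):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def max_on_time_normal(jobs, msp):
    """jobs: (integer expected processing time, variance, due date) triples."""
    jobs = sorted(jobs, key=lambda job: job[2])  # EDD order (equal msp values)
    # f[(k, P)] = minimum total variance of a feasible on-time set with
    # cardinality k and total expected processing time P
    f = {(0, 0): 0.0}
    for p, v, d in jobs:
        new_f = dict(f)
        for (k, P), V in f.items():
            P2, V2 = P + p, V + v
            # the job is on time if P(C_j <= d_j) >= msp with C_j ~ N(P2, V2)
            if V2 > 0 and phi((d - P2) / math.sqrt(V2)) >= msp:
                key = (k + 1, P2)
                if V2 < new_f.get(key, math.inf):
                    new_f[key] = V2  # dominance: keep the minimum variance
        f = new_f
    return max(k for (k, P) in f)

print(max_on_time_normal([(2, 1, 4.1), (2, 1, 5.0), (3, 4, 6.0)], 0.5))  # prints 2
```

The state space is bounded by n times the total processing time, which gives the pseudo-polynomial running time mentioned on the next slide.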

Normal distribution: details
- The running time is pseudo-polynomial
- The problem is NP-hard
- The roles of total variance and total processing time in the dominance rule and in the DP are interchangeable

What to remember (optional)
- Moore-Hodgson = dynamic programming
- DP is applicable in a stochastic environment
  – stochastic on time: work with the minimum success probability
  – is the EDD sequence optimal for the on-time set?
- The weighted case can be solved in a similar way

Known problem?
- Yes: single machine, minimize the number of tardy jobs
- Solvable in O(n log n) time by Moore-Hodgson

Miscellaneous remarks
- The DP computes more state variables than necessary
- The DP can be used for the weighted case:
  – use f_j(W), where W is the total weight of the on-time set (instead of its cardinality)
- The DP can be used for more problems (to be shown next)

Negative binomial distribution
- Parameters s_j and p (p is common to all jobs)
- If the jobs are independent, then C_j = p(S) follows a negative binomial distribution with parameters s(S) and p
- Same approach as for the gamma distribution