8/14/04. J. Bard and J. W. Barnes, Operations Research Models and Methods. Copyright 2004. All rights reserved.


Lecture 11 – Stochastic Processes
Topics: definitions, review of probability, realization of a stochastic process, continuous vs. discrete systems, examples, classification scheme

2 Basic Definitions
Stochastic process: a system that changes over time in an uncertain manner
State: a snapshot of the system at some fixed point in time
Transition: movement from one state to another
Examples: automated teller machine (ATM); printed circuit board assembly operation; runway activity at an airport

3 Elements of Probability Theory
Experiment: any situation where the outcome is uncertain.
Sample space, S: all possible outcomes of an experiment (we will call it the "state space").
Event: any collection of outcomes (points) in the sample space. A collection of events E1, E2, ..., En is mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j.
Random variable: a function that assigns a real number to each outcome in the sample space.
Cumulative distribution function (CDF), F(·): the probability distribution function of the random variable X, with F(a) = Pr{X ≤ a}.
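As a concrete illustration of a CDF, here is a short Python sketch of the exponential case (the rate parameter and function name are illustrative choices, not from the slides):

```python
import math

def exp_cdf(a, rate=1.0):
    """CDF of an exponentially distributed random variable X:
    F(a) = Pr{X <= a} = 1 - exp(-rate * a) for a >= 0, else 0."""
    return 0.0 if a < 0 else 1.0 - math.exp(-rate * a)

# F is nondecreasing, equals 0 for a < 0, and approaches 1 as a grows.
# With rate = 1, Pr{X <= ln 2} = 1 - 1/2 = 0.5.
median = math.log(2)
```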

4 Model Components (continued)
Time: either a continuous or a discrete parameter.
State: describes the attributes of the system at some point in time; s = (s1, s2, ..., sv). For the ATM example, s = (n), where n is the number in the system.
It is convenient to assign a unique nonnegative integer index to each possible value of the state vector; we call this index X, so each state s corresponds to a value of X. For the ATM example, X = n. In general, Xt is a random variable.

5 Activity: takes some amount of time (its duration) and culminates in an event. For the ATM example: a service completion.
Transition: caused by an event; results in movement from one state to another. For the ATM example, ….
Stochastic process: a collection of random variables {Xt}, where t ∈ T = {0, 1, 2, ...}.

6 Markovian Property
Given that the present state is known, the conditional probability of the next state is independent of all states prior to the present state.
Present state at time t is i: Xt = i. Next state at time t + 1 is j: Xt+1 = j.
Conditional probability statement of the Markovian property:
Pr{Xt+1 = j | X0 = k0, X1 = k1, ..., Xt–1 = kt–1, Xt = i} = Pr{Xt+1 = j | Xt = i}
for t = 0, 1, ..., and all possible sequences i, j, k0, k1, ..., kt–1.
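The Markovian property means a chain can be simulated knowing only the current state and the one-step transition probabilities. A minimal Python sketch (the two-state matrix is a made-up illustration, not from the slides):

```python
import random

def simulate_chain(P, start, steps, seed=42):
    """Simulate a discrete-time Markov chain: P[i][j] is the one-step
    probability Pr{X_{t+1} = j | X_t = i}.  By the Markovian property,
    only the current state is needed to draw the next state."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        u, cum = rng.random(), 0.0
        for j, p in enumerate(P[state]):
            cum += p
            if u < cum:          # inverse-transform draw of the next state
                state = j
                break
        path.append(state)
    return path

# Hypothetical two-state chain (states 0 and 1); numbers are illustrative.
P = [[0.3, 0.7],
     [0.4, 0.6]]
path = simulate_chain(P, start=0, steps=10)
```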

7 Realization of the Process: Deterministic Process
Time between arrivals: Pr{ta ≤ τ} = 0 for τ < 1 min; = 1 for τ ≥ 1 min. Arrivals occur every minute.
Time for servicing a customer: Pr{ts ≤ τ} = 0 for τ < 0.75 min; = 1 for τ ≥ 0.75 min. Processing takes exactly 0.75 minutes.
Number in system, n (no transient response).

8 Realization of the Process (continued): Stochastic Process
Time for servicing a customer: Pr{ts ≤ τ} = 0 for τ < 0.75 min; = 0.6 for 0.75 ≤ τ < 1.5 min; = 1 for τ ≥ 1.5 min.
Number in system, n.
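A realization like the ones above can be reproduced numerically. A minimal single-server FIFO sketch in Python (the particular stochastic service-time sample is an illustration):

```python
def waiting_times(arrivals, services):
    """FIFO single-server queue: time each customer waits before service.
    Customer k starts service at max(arrival_k, previous departure)."""
    waits, last_departure = [], 0.0
    for a, s in zip(arrivals, services):
        start = max(a, last_departure)
        waits.append(start - a)
        last_departure = start + s
    return waits

# Deterministic realization (slide 7): arrivals every minute, service
# exactly 0.75 min -> nobody ever waits.
det = waiting_times([0, 1, 2, 3], [0.75] * 4)

# One stochastic realization (slide 8): service is 0.75 min w.p. 0.6
# and 1.5 min w.p. 0.4; here a fixed sample is used for illustration.
sto = waiting_times([0, 1, 2], [1.5, 1.5, 0.75])
```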

9 Birth and Death Processes
Pure birth process, e.g., hurricanes.
Pure death process, e.g., delivery of a truckload of parcels.
Birth-death process, e.g., a repair shop for a taxi company.

10 Queueing Systems
Queue discipline: the order in which customers are served, e.g., FIFO, LIFO, random, priority.
Five-field notation: arrival distribution / service distribution / number of servers / maximum number in the system / number in the calling population.

11 Queueing Notation
Distributions (interarrival and service times): M = exponential, D = constant time, Ek = Erlang, GI = general independent (arrivals only), G = general.
Parameters: s = number of servers, K = maximum number in system, N = size of calling population.
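Here M denotes exponentially distributed times, so an M/M/1 queue can be sampled with exponential interarrival and service times. A seeded Python sketch (the rates lam = 0.8 and mu = 1.0 are arbitrary illustrative values, not from the slides):

```python
import random

def mm1_waits(n, lam, mu, seed=1):
    """Waiting times (before service) of the first n customers in an
    M/M/1 queue: exponential interarrival times at rate lam, exponential
    service times at rate mu, one server, FIFO discipline."""
    rng = random.Random(seed)
    t, last_departure, waits = 0.0, 0.0, []
    for _ in range(n):
        t += rng.expovariate(lam)          # next arrival epoch
        start = max(t, last_departure)     # service can begin
        waits.append(start - t)
        last_departure = start + rng.expovariate(mu)
    return waits

# Utilization rho = lam/mu = 0.8, so queues do form in a long run.
waits = mm1_waits(1000, lam=0.8, mu=1.0)
```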

12 Characteristics of Queues
Infinite queue, e.g., a mail order company (GI/G/s).
Finite queue, e.g., an airline reservation system (M/M/s/K): (a) a customer arrives but then leaves; (b) no more arrivals once K customers are in the system.

13 Characteristics of Queues (continued)
Finite input source, e.g., a repair shop for a trucking firm (N vehicles) with s service bays and a limited-capacity parking lot (K – s spaces). Each repair takes 1 day (GI/D/s/K/N). In this diagram N = K, so we have a GI/D/s/K/K system.

14 Examples of Stochastic Processes
Service completion triggers an arrival, e.g., a multistage assembly process with a single worker and no queue.
state = 0: worker is idle
state = k: worker is performing operation k, k = 1, ..., 5

15 Examples (continued)
Multistage assembly process with a single worker and a queue (assume 3 stages only, k = 1, 2, 3).
s = (s1, s2), where s1 = number of parts in the system and s2 = current operation being performed.
[State-transition network: states (s1, s2), with arcs labeled a for arrivals and dk for completion of operation k.]

16 Queueing Model with Two Servers, One Operation
s = (s1, s2, s3), where si = 0 if server i is idle and 1 if server i is busy (i = 1, 2), and s3 = number in queue.
[State-transition network]

17 Series System with No Queues
Component / Notation / Definition:
State: s = (s1, s2, s3), where si = 0 if server i is idle and 1 if server i is busy, for i = 1, 2, 3.
State space: S = {(0,0,0), (1,0,0), ..., (0,1,1), (1,1,1)}; all possible binary vectors of 3 components.
Activities: Y = {a, d1, d2, d3}, where a = arrival at operation 1 and dj = completion of operation j, for j = 1, 2, 3.

18 Transitions for Markov Processes
Exponential interarrival and service times (M/M/s). State space: S = {1, 2, ...}.
Probability of going from state i to state j in one move: pij.
Theoretical requirements: 0 ≤ pij ≤ 1 and Σj pij = 1 for i = 1, ..., m.
State-transition matrix: P = (pij).
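These requirements can be checked in code, and the n-step probabilities Pr{Xt+n = j | Xt = i} obtained by raising P to the nth power. A small Python sketch (the 2-state matrix is hypothetical):

```python
def n_step(P, n):
    """n-step transition probabilities via repeated matrix multiplication."""
    size = len(P)
    # Check that P is a stochastic matrix: nonnegative rows summing to 1.
    for row in P:
        assert all(p >= 0 for p in row) and abs(sum(row) - 1) < 1e-9
    # Start from the identity matrix (0-step transition probabilities).
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = [[sum(result[i][k] * P[k][j] for k in range(size))
                   for j in range(size)] for i in range(size)]
    return result

# Hypothetical two-state chain.
P = [[0.9, 0.1],
     [0.5, 0.5]]
P2 = n_step(P, 2)   # e.g., P2[0][0] = 0.9*0.9 + 0.1*0.5 = 0.86
```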

19 Single Channel Queue – Two Kinds of Service
Bank teller: normal service (d), travelers checks (c), idle (i).
Let p = portion of customers who buy checks after normal service, s1 = number in system, and s2 = status of the teller, where s2 ∈ {i, d, c}.
[State-transition network]

20 Part Processing with Rework
Consider a machining operation in which there is a 0.4 probability that, upon completion, a processed part will not be within tolerance.
The machine is in one of three states: 0 = idle, 1 = working on a part for the first time, 2 = reworking a part.
State-transition network events: a = arrival, s1 = service completion from state 1, s2 = service completion from state 2.

21 Markov Chains
A discrete state space; the Markovian property holds for transitions; the one-step transition probabilities pij remain constant over time (stationary).
Example: game of craps. Roll 2 dice: win on 7 or 11; lose on 2, 3, or 12; otherwise the total (4, 5, 6, 8, 9, or 10) becomes the point and you roll again: win if the point comes up before a 7, lose if a 7 comes up first, otherwise roll again, and so on. (There are other possible bets not included here.)
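The transition structure also yields the overall win probability in closed form (this calculation is implied by, but not shown on, the slides): condition on the first roll, and note that once a point t is set, the remaining rolls form a race between t and 7 that is won with probability p(t)/(p(t) + p(7)).

```python
from fractions import Fraction

# Number of ways to roll each total with two fair dice.
ways = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}
p = {t: Fraction(w, 36) for t, w in ways.items()}

# Win immediately on 7 or 11; on a point t, win the race against 7
# with probability p[t] / (p[t] + p[7]).
p_win = p[7] + p[11] + sum(p[t] * p[t] / (p[t] + p[7])
                           for t in (4, 5, 6, 8, 9, 10))
# p_win = 244/495, about 0.4929
```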

22 State-Transition Network for Craps

23 Transition Matrix for Game of Craps

24 State-Transition Network for Simple Markov Chain

25 Classification of States
Accessible: it is possible to go from state i to state j (a path exists in the network from i to j).
Two states communicate if each is accessible from the other.
A system is irreducible if all states communicate.
State i is recurrent if, after leaving it, the system is certain to return to it at some time in the future. A state that is not recurrent is transient.
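These definitions translate directly into a reachability computation. A sketch that groups states into communicating classes from a 0/1 "can move to" matrix (the 3-state example chain is hypothetical):

```python
def communicating_classes(adj):
    """Group states into communicating classes: i and j communicate
    if each is reachable from the other.  adj[i][j] = 1 means a
    one-step transition i -> j has positive probability."""
    n = len(adj)

    def reachable(i):
        """All states reachable from i (including i itself)."""
        seen, stack = {i}, [i]
        while stack:
            u = stack.pop()
            for v in range(n):
                if adj[u][v] and v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    reach = [reachable(i) for i in range(n)]
    classes = []
    for i in range(n):
        cls = frozenset(j for j in range(n)
                        if j in reach[i] and i in reach[j])
        if cls not in classes:
            classes.append(cls)
    return classes

# Hypothetical 3-state chain: 0 <-> 1 communicate, 2 is absorbing.
adj = [[1, 1, 0],
       [1, 0, 1],
       [0, 0, 1]]
classes = communicating_classes(adj)
```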

26 Classification of States (continued)
A state is periodic if it can only return to itself after a number of transitions that is a multiple of some fixed integer greater than 1. A state that is not periodic is aperiodic.
(a) Each state is visited every 3 iterations. (b) Each state is visited in multiples of 3 iterations.
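The period of a state can be computed as the gcd of all step counts at which a return to the state is possible. A sketch using boolean matrix products (the 3-cycle and self-loop examples are hypothetical, and max_n = 50 is an arbitrary truncation):

```python
from math import gcd

def period(adj, state, max_n=50):
    """Period of a state: gcd of all n <= max_n for which an n-step
    return to the state is possible.  adj[i][j] = 1 if a one-step
    move i -> j has positive probability."""
    m = len(adj)
    returns, current = [], adj          # current holds n-step reachability
    for n in range(1, max_n + 1):
        if current[state][state]:
            returns.append(n)
        # Boolean matrix product: which moves are possible in n+1 steps.
        current = [[any(current[i][k] and adj[k][j] for k in range(m))
                    for j in range(m)] for i in range(m)]
    d = 0
    for n in returns:
        d = gcd(d, n)
    return d

# A 3-cycle 0 -> 1 -> 2 -> 0 has period 3; a self-loop gives period 1.
cycle = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
loopy = [[1, 1], [1, 0]]
```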

27 Classification of States (continued)
An absorbing state is one that locks the system in once it is entered.
This diagram might represent the wealth of a gambler who begins with $2 and makes a series of wagers of $1 each. Let ai be the event of winning in state i and di the event of losing in state i. There are two absorbing states: 0 and 4.
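For this gambler's-ruin diagram, the probability of being absorbed at 4 rather than 0 can be found from the first-step equations h(i) = p·h(i+1) + (1 − p)·h(i−1) with h(0) = 0 and h(4) = 1. A sketch via fixed-point iteration, assuming fair wagers (p = 0.5 is an assumption; the slide does not state the odds):

```python
def win_probabilities(N, p, sweeps=10_000):
    """h[i] = probability of reaching wealth N before 0, starting at i.
    Solves h_i = p*h_{i+1} + (1-p)*h_{i-1}, h_0 = 0, h_N = 1,
    by repeated (Gauss-Seidel) sweeps until it converges."""
    h = [0.0] * (N + 1)
    h[N] = 1.0
    for _ in range(sweeps):
        for i in range(1, N):
            h[i] = p * h[i + 1] + (1 - p) * h[i - 1]
    return h

# Gambler starting with $2, absorbing states 0 and 4.  With fair
# wagers (p = 0.5, an assumption), h[i] = i/4, so h[2] = 0.5.
h = win_probabilities(4, 0.5)
```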

28 Classification of States (continued)
Class: a set of states that communicate with each other. A class is either all recurrent or all transient, and is either all periodic or all aperiodic.
States in a transient class communicate only with each other, so no arcs enter any of the corresponding nodes in the network diagram from outside the class. Arcs may leave, though, passing from a node in the class to one outside.

29 Illustration of Concepts: Example 1
Every pair of states communicates, forming a single recurrent class; moreover, the states are not periodic. Thus the stochastic process is aperiodic and irreducible.

30 Illustration of Concepts: Example 2
States 0 and 1 communicate and form a recurrent class. States 3 and 4 form separate transient classes. State 2 is an absorbing state and forms its own recurrent class.

31 Illustration of Concepts: Example 3
Every state communicates with every other state, so we have an irreducible stochastic process. Periodic? Yes, so the Markov chain is irreducible and periodic.

32 What You Should Know About Stochastic Processes
What a state is
What a realization is (stationary vs. transient)
What the difference is between a continuous- and a discrete-time system
What the common applications are
What a state-transition matrix is
How systems are classified