Lecture 11 – Stochastic Processes


Topics
- Definitions
- Review of probability
- Realization of a stochastic process
- Continuous vs. discrete systems
- Examples

Basic Definitions
Stochastic process: a system that changes over time in an uncertain manner.
Examples: automated teller machine (ATM), printed circuit board assembly operation, runway activity at an airport.
State: a snapshot of the system at some fixed point in time.
Transition: movement from one state to another.

Elements of Probability Theory
Experiment: any situation where the outcome is uncertain.
Sample space, S: all possible outcomes of an experiment (we will call them the "state space").
Event: any collection of outcomes (points) in the sample space. A collection of events E1, E2, …, En is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, i, j = 1, …, n.
Random variable (RV): function or procedure that assigns a real number to each outcome in the sample space.
Cumulative distribution function (CDF), F(·): probability distribution function for the random variable X such that F(a) = Pr{X ≤ a}.
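To make these definitions concrete, here is a small sketch (my own illustration, not from the slides) using the sum of two fair dice as the random variable X: the sample space is the 36 ordered pairs, X maps each pair to its sum, and the CDF is F(a) = Pr{X ≤ a}.

```python
from fractions import Fraction

# Random variable X = sum of two fair dice.
# Sample space: the 36 equally likely ordered pairs (d1, d2).
def pmf_two_dice():
    p = {}
    for d1 in range(1, 7):
        for d2 in range(1, 7):
            s = d1 + d2
            p[s] = p.get(s, Fraction(0)) + Fraction(1, 36)
    return p

# CDF: F(a) = Pr{X <= a}, the total probability of all values at most a.
def cdf(a, pmf):
    return sum(prob for value, prob in pmf.items() if value <= a)

pmf = pmf_two_dice()
print(cdf(7, pmf))   # Pr{X <= 7} = 21/36 = 7/12
```

The same pattern works for any discrete random variable: list the outcomes, assign each a probability, and accumulate to get the CDF.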

Components of a Stochastic Model
Time: either a continuous or a discrete parameter.
State: describes the attributes of the system at some point in time; s = (s1, s2, . . . , sv). For the ATM example, s = (n).
It is convenient to assign a unique nonnegative integer index to each possible value of the state vector. We call this index X, so each state s corresponds to a value of X. For the ATM example, X = n. In general, Xt is a random variable.

Model Components (continued)
Activity: takes some amount of time (its duration) and culminates in an event. For the ATM example, a service completion.
Transition: caused by an event; results in movement from one state to another. In the ATM transition diagram, # = state, a = arrival, d = departure.
Stochastic process: a collection of random variables {Xt}, where t ∈ T = {0, 1, 2, . . .}.

Realization of the Process
Deterministic process
Time between arrivals:
Pr{ta ≤ τ} = 0 for τ < 1 min, and 1 for τ ≥ 1 min.
Arrivals occur every minute.
Time for servicing a customer:
Pr{ts ≤ τ} = 0 for τ < 0.75 min, and 1 for τ ≥ 0.75 min.
Processing takes exactly 0.75 minutes.
Number in system, n (no transient response).
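The deterministic realization can be traced step by step. The sketch below (my own illustration, assuming a single server and first-come-first-served order) computes departure times; because each 0.75-minute service finishes before the next arrival a minute later, no queue ever forms and the number in system alternates between 1 and 0.

```python
# Deterministic single-server trace: arrivals every 1.0 min, service 0.75 min.
def departure_times(n_customers, interarrival=1.0, service=0.75):
    departures = []
    server_free = 0.0  # time at which the server next becomes idle
    for k in range(n_customers):
        arrival = k * interarrival
        start = max(arrival, server_free)  # wait only if server is busy
        server_free = start + service
        departures.append(server_free)
    return departures

print(departure_times(4))  # [0.75, 1.75, 2.75, 3.75]
```

Each customer departs 0.75 minutes after arriving, confirming there is no transient response in this realization.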

Realization of the Process (continued)
Stochastic process
Time for servicing a customer:
Pr{ts ≤ τ} = 0 for τ < 0.75 min, 0.6 for 0.75 ≤ τ < 1.5 min, and 1 for τ ≥ 1.5 min.
Number in system, n.
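This CDF describes a two-point service-time distribution: ts = 0.75 min with probability 0.6 and ts = 1.5 min with probability 0.4. A minimal sampling sketch (my own illustration, not from the slides):

```python
import random

# Sample from the two-point service-time distribution implied by the CDF:
# t_s = 0.75 min with probability 0.6, t_s = 1.5 min with probability 0.4.
def sample_service_time(rng):
    return 0.75 if rng.random() < 0.6 else 1.5

rng = random.Random(42)
samples = [sample_service_time(rng) for _ in range(10000)]
print(sum(t == 0.75 for t in samples) / len(samples))  # close to 0.6
```

Feeding such samples into the arrival pattern of the previous slide produces a different realization of the number-in-system process on each run, which is what distinguishes the stochastic case from the deterministic one.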

Markovian Property
Given that the present state is known, the conditional probability of the next state is independent of the states prior to the present state.
Present state at time t is i: Xt = i.
Next state at time t + 1 is j: Xt+1 = j.
Conditional probability statement of the Markovian property:
Pr{Xt+1 = j | X0 = k0, X1 = k1, …, Xt–1 = kt–1, Xt = i} = Pr{Xt+1 = j | Xt = i}
for t = 0, 1, …, and all possible states i, j, k0, k1, . . . , kt–1.
Interpretation: given the present, the past is irrelevant in determining the future.
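The property is visible directly in simulation code: to generate the next state, only the current state's row of the transition matrix is consulted, never the earlier history. A sketch with a hypothetical two-state chain (the matrix here is my own example, not from the slides):

```python
import random

# Simulate a discrete-time Markov chain. The next state is drawn using only
# the current state's row of P -- earlier states never enter the computation,
# which is exactly the Markovian property.
def simulate_chain(P, start, steps, rng):
    path = [start]
    for _ in range(steps):
        i = path[-1]
        j = rng.choices(range(len(P[i])), weights=P[i])[0]
        path.append(j)
    return path

# Hypothetical 2-state transition matrix, for illustration only.
P = [[0.5, 0.5],
     [0.2, 0.8]]
path = simulate_chain(P, start=0, steps=10, rng=random.Random(0))
print(len(path))  # 11: the start state plus 10 transitions
```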

Transitions for Markov Processes
State space: S = {1, 2, . . . , m}.
Probability of going from state i to state j in one move: pij.
These probabilities form the state-transition matrix P = [pij].
Theoretical requirements: 0 ≤ pij ≤ 1 and Σj pij = 1 for each i = 1, …, m.
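The two requirements can be checked mechanically; a short validation sketch (my own helper, with a small floating-point tolerance for the row sums):

```python
# Check that P is a valid (row-)stochastic matrix:
# every entry lies in [0, 1] and every row sums to 1.
def is_stochastic(P, tol=1e-9):
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic([[0.6, 0.4], [0.3, 0.7]]))  # True
print(is_stochastic([[0.6, 0.5], [0.3, 0.7]]))  # False: first row sums to 1.1
```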

Discrete-Time Markov Chain
- A discrete state space
- Markovian property for transitions
- One-step transition probabilities, pij, remain constant over time (stationary)
Simple example: a two-state chain, given by a state-transition matrix and the corresponding state-transition diagram [the numerical entries did not survive the transcript].

Game of Craps
Roll 2 dice.
Outcomes:
- Win: 7 or 11
- Lose: 2, 3, 12
- Point: 4, 5, 6, 8, 9, 10
If a point is rolled, roll again:
- Win if the point comes up
- Lose if a 7 comes up
- Otherwise roll again, and so on
(There are other possible bets not included here.)
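These rules can be analyzed exactly. The sketch below (my own illustration, assuming fair dice) uses the fact that once a point is set, only the point and 7 matter, so Pr{make point} = p_point / (p_point + p_7):

```python
from fractions import Fraction

# Pr{sum = s} for two fair dice: (6 - |s - 7|) of 36 ordered outcomes.
p = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

# Win immediately on 7 or 11; otherwise win by making the point before a 7.
win = p[7] + p[11]
for point in (4, 5, 6, 8, 9, 10):
    win += p[point] * p[point] / (p[point] + p[7])

print(win)         # 244/495
print(float(win))  # ~0.4929, slightly less than fair
```

The conditioning step (ignoring all rolls that are neither the point nor 7) is itself a small Markov-chain argument: from the "point" state, the chain eventually absorbs in either "win" or "lose".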

State-Transition Network for Craps

Transition Matrix for Game of Craps
Probabilities for the sum of two dice:

Sum:   2     3     4     5     6     7     8     9     10    11    12
Prob.: 0.028 0.056 0.083 0.111 0.139 0.167 0.139 0.111 0.083 0.056 0.028

Probability of a win on the first roll = Pr{7 or 11} = 0.167 + 0.056 = 0.222
Probability of a loss on the first roll = Pr{2, 3, 12} = 0.028 + 0.056 + 0.028 = 0.111
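The table entries and the two first-roll probabilities can be verified with exact fractions (a check I added; not part of the original slide):

```python
from fractions import Fraction

# Each sum s occurs in (6 - |s - 7|) of the 36 equally likely ordered outcomes.
p = {s: Fraction(6 - abs(s - 7), 36) for s in range(2, 13)}

print(p[7] + p[11])          # 2/9  ~ 0.222, win on the first roll
print(p[2] + p[3] + p[12])   # 1/9  ~ 0.111, lose on the first roll
```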

Examples of Stochastic Processes
Single-stage assembly process with a single worker, no queue:
- State = 0: worker is idle
- State = 1: worker is busy
Multistage assembly process with a single worker, no queue:
- State = 0: worker is idle
- State = k: worker is performing operation k, k = 1, . . . , 5

Examples (continued)
Multistage assembly process with a single worker and a queue (assume 3 stages only, i.e., 3 operations k = 1, 2, 3):
s = (s1, s2), where the components were defined in a table on the original slide [not recoverable from the transcript].

Queuing Model with Two Servers
s = (s1, s2, s3), i = 1, 2, where the components and the state-transition network were given on the original slide [not recoverable from the transcript].

Series System with No Queues
Component notation and definitions:
State: s = (s1, s2, s3)
State space: S = {(0,0,0), (1,0,0), . . . , (0,1,1), (1,1,1)}; the state space consists of all possible binary vectors of 3 components.
Activities: Y = {a, d1, d2, d3}, where
a = arrival at operation 1
dj = completion of operation j, for j = 1, 2, 3
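The state space above is just the set of all binary 3-vectors, which can be enumerated directly (a small illustration of how such state spaces grow as 2^v):

```python
from itertools import product

# All binary state vectors (s1, s2, s3) for the three-stage series system.
S = list(product((0, 1), repeat=3))

print(len(S))        # 8 states
print(S[0], S[-1])   # (0, 0, 0) (1, 1, 1)
```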

What You Should Know About Stochastic Processes
- What a state is
- What a realization is (stationary vs. transient)
- The difference between a continuous- and a discrete-time system
- The common applications
- What a state-transition matrix is