Stochastic Models Lecture 3 Continuous-Time Markov Processes


Stochastic Models Lecture 3 Continuous-Time Markov Processes Nan Chen MSc Program in Financial Engineering The Chinese University of Hong Kong (ShenZhen) Oct 14, 2015

Outline
3.1 Introduction to Continuous-Time Markov Processes
3.2 Limiting Probabilities

3.1 Introduction

A New Perspective on Poisson Process A Poisson process with rate λ can be constructed as follows: at each state n, it stays for an exponentially distributed time with mean 1/λ. Then, it proceeds from n to n+1.
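As a quick illustration, this construction can be simulated directly: sum i.i.d. exponential sojourn times and record each jump. The function name and the rate value 2.0 below are illustrative choices, not from the lecture.

```python
import random

def simulate_poisson_path(rate, horizon, rng):
    """Simulate a Poisson process by summing i.i.d. Exp(rate) sojourn times.

    Starting in state 0, the process waits an exponential time with mean
    1/rate in each state n, then jumps to n + 1.  Returns the jump times
    up to `horizon`.
    """
    times = []
    t = rng.expovariate(rate)
    while t <= horizon:
        times.append(t)
        t += rng.expovariate(rate)
    return times

rng = random.Random(42)
# With rate 2 over horizon 1000 we expect about 2000 jumps,
# since the number of jumps in [0, T] is Poisson with mean rate * T.
n_jumps = len(simulate_poisson_path(2.0, 1000.0, rng))
print(n_jumps)
```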

Continuous-Time Markov Chains We may follow a similar recipe to construct a continuous-time Markov chain on a state space S: each time the process enters a state i, we select a random sojourn time, exponentially distributed with rate ν_i, independently of its history; after that time it exits state i and enters state j with probability P_ij. (Remark: the jump probabilities (P_ij) define a discrete-time Markov chain embedded in the process.)

Transition Probability Function Let X(t) record the state of the above Markov chain over time. The quantity P_ij(t) = P(X(t+s) = j | X(s) = i) is known as the transition probability function of the process. Obviously, we have P_ij(t) ≥ 0 and Σ_j P_ij(t) = 1.

Kolmogorov Backward/Forward Equations Theorem (Kolmogorov Backward Equation) For all states i, j and times t ≥ 0, P'_ij(t) = Σ_{k≠i} q_ik P_kj(t) − ν_i P_ij(t), where ν_i is the rate at which the process leaves state i and q_ik = ν_i P_ik.

Kolmogorov Backward/Forward Equations (Continued) Theorem (Kolmogorov Forward Equation) Under suitable regularity conditions, for all states i, j and times t ≥ 0, P'_ij(t) = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t).

Example I: Two-State Chain Consider a machine that works for an exponential amount of time having mean 1/λ before breaking down; and suppose that it takes an exponential amount of time having mean 1/μ to repair the machine. If the machine is in working condition at time 0, then what is the probability that it will be working at time t?

Example I: Two-State Chain From the backward equations, we have P'_00(t) = λ[P_10(t) − P_00(t)] and P'_10(t) = μ[P_00(t) − P_10(t)]. From them, we can solve for P_00(t) = μ/(λ+μ) + λ/(λ+μ) e^{−(λ+μ)t}.
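One way to sanity-check a solution of this kind is to plug it back into the backward equations numerically. The sketch below does this by central finite differences for the standard closed-form solution of the two-state chain (state 0 = working, breakdown rate λ, repair rate μ); the rate values λ = 1, μ = 2 are illustrative assumptions.

```python
import math

def p00(t, lam, mu):
    """P(working at t | working at 0) for the two-state machine."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

def p10(t, lam, mu):
    """P(working at t | broken at 0)."""
    s = lam + mu
    return (mu / s) * (1.0 - math.exp(-s * t))

# Check the backward equations P00' = lam*(P10 - P00) and
# P10' = mu*(P00 - P10) by central finite differences.
lam, mu, h = 1.0, 2.0, 1e-6
for t in (0.1, 0.5, 2.0):
    d00 = (p00(t + h, lam, mu) - p00(t - h, lam, mu)) / (2 * h)
    d10 = (p10(t + h, lam, mu) - p10(t - h, lam, mu)) / (2 * h)
    assert abs(d00 - lam * (p10(t, lam, mu) - p00(t, lam, mu))) < 1e-6
    assert abs(d10 - mu * (p00(t, lam, mu) - p10(t, lam, mu))) < 1e-6
print(round(p00(1.0, lam, mu), 4))
```

Note that as t grows, p00 tends to μ/(λ+μ), a limit that no longer depends on the initial state; this is the phenomenon studied in Section 3.2.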

Computing Transition Probabilities For any pair of states i and j with i ≠ j, let q_ij = ν_i P_ij denote the rate at which the process makes a transition from i to j. Then, we can rewrite the backward/forward equations as follows: (backward) P'_ij(t) = Σ_{k≠i} q_ik P_kj(t) − ν_i P_ij(t); (forward) P'_ij(t) = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t).

Computing Transition Probabilities (Continued) Introduce the matrix R by letting the element in row i, column j be r_ij = q_ij for i ≠ j and r_ii = −ν_i. The two systems of equations can then be written in matrix form as (backward) P'(t) = R P(t); (forward) P'(t) = P(t) R. The solution should be P(t) = e^{Rt}.
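For a small finite chain, P(t) = e^{Rt} can be computed by truncating the Taylor series e^{Rt} = Σ_k (Rt)^k / k!. The sketch below does this in pure Python (fine for small rate matrices and moderate t; a production code would use a library routine such as SciPy's expm) and recovers the two-state example with the illustrative rates λ = 1, μ = 2.

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(R, t, terms=60):
    """Truncated Taylor series for the matrix exponential e^{Rt}."""
    n = len(R)
    Rt = [[R[i][j] * t for j in range(n)] for i in range(n)]
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]  # running (Rt)^k / k!, starts at I
    for k in range(1, terms):
        term = mat_mul(term, Rt)
        term = [[x / k for x in row] for row in term]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

# Two-state machine: r_01 = lam = 1, r_10 = mu = 2, r_ii = -nu_i.
R = [[-1.0, 1.0], [2.0, -2.0]]
P = expm(R, 1.0)
# Each row of P(t) is a probability distribution, so it sums to 1.
print(round(P[0][0], 4))
```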

3.2 Limiting Probabilities

Limiting Probabilities In analogy with a basic result for discrete-time Markov chains, the probability that a continuous-time Markov chain will be in state j at time t often converges to a limiting value that is independent of the initial state. We intend to study P_j := lim_{t→∞} P_ij(t).

Limiting Probabilities (Continued) To derive a set of equations for the P_j, we may let t approach ∞ in the forward equation. Then, ν_j P_j = Σ_{k≠j} q_kj P_k for all states j (the rate out of j balances the rate into j). In addition, Σ_j P_j = 1.

Limiting Probabilities (Continued) It is easy to see that if the embedded Markov chain has a limiting stationary distribution π = (π_j), i.e., π_j = Σ_k π_k P_kj with Σ_j π_j = 1, then the solution to the equations on the last slide should be P_j = (π_j/ν_j) / Σ_k (π_k/ν_k).
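This relation (P_j proportional to π_j/ν_j, i.e. weighting each embedded-chain state by its mean sojourn time) is easy to check on the two-state machine: its embedded chain alternates, so π = (1/2, 1/2), and with the illustrative rates ν = (1, 2) we should recover the limit μ/(λ+μ) = 2/3 found earlier.

```python
def limits_from_embedded(pi, nu):
    """Limiting probabilities of a CTMC from the embedded chain's
    stationary distribution: P_j proportional to pi_j / nu_j."""
    w = [p / v for p, v in zip(pi, nu)]
    s = sum(w)
    return [x / s for x in w]

# Two-state machine with lam = 1, mu = 2: embedded chain alternates,
# so pi = (1/2, 1/2); sojourn rates are nu = (1, 2).
P = limits_from_embedded([0.5, 0.5], [1.0, 2.0])
print([round(x, 4) for x in P])
```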

Example II: A Shoe Shine Shop Consider a shoe shine establishment consisting of two chairs, chair 1 and chair 2. A customer upon arrival goes initially to chair 1, where his shoes are cleaned and polish is applied. After this is done, the customer moves on to chair 2, where the polish is buffed. The service times at the two chairs are independent and exponentially distributed with respective rates μ1 and μ2.

Example II: A Shoe Shine Shop (Continued) Suppose that potential customers arrive in accordance with a Poisson process having rate λ, and that a potential customer will enter the system only if both chairs are empty. Let us define the state of the system as follows: State 0: empty system; State 1: chair 1 is taken; State 2: chair 2 is taken.

Example II: A Shoe Shine Shop (Continued) What are the limiting probabilities P_0, P_1, and P_2 for this continuous-time Markov chain?
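As a worked sketch, the balance equations for this chain are λP_0 = μ2 P_2, μ1 P_1 = λ P_0, and μ2 P_2 = μ1 P_1, plus P_0 + P_1 + P_2 = 1; they can be solved exactly by expressing P_1 and P_2 relative to P_0. The numerical rates λ = 1, μ1 = 2, μ2 = 3 below are illustrative assumptions, not from the lecture.

```python
from fractions import Fraction

def shoe_shine_limits(lam, mu1, mu2):
    """Solve the balance equations of the shoe shine chain exactly.

    States: 0 = empty, 1 = chair 1 taken, 2 = chair 2 taken.
    From mu1*P1 = lam*P0 and mu2*P2 = mu1*P1 = lam*P0, each P_j is a
    multiple of P0; normalization then pins down P0.
    """
    lam, mu1, mu2 = map(Fraction, (lam, mu1, mu2))
    p1 = lam / mu1          # P1 relative to P0
    p2 = lam / mu2          # P2 relative to P0
    total = 1 + p1 + p2
    return [1 / total, p1 / total, p2 / total]

P0, P1, P2 = shoe_shine_limits(1, 2, 3)
print(P0, P1, P2)
```

With these rates the shop is empty a fraction P_0 = 6/11 of the time, so the long-run rate of admitted customers is λ P_0.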

Homework Assignments Read Ross, Sections 6.1, 6.2, 6.4, 6.5, and 6.9. Answer questions: Exercise 8 (Page 399, Ross); Exercises 10, 13 (Page 400, Ross); Exercises 14, 17 (Page 401, Ross); Exercise 20 (Page 402, Ross); (Optional, Extra Bonus) Exercise 48 (Page 407).