
Lecture on Markov Chain


1 Lecture on Markov Chain
Definition: A Markov chain is a discrete-time, discrete-alphabet random process {Xn, n ≥ 0} such that
P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)
for all n and all states, i.e., the future depends on the past only through the present state.
Definition: The initial distribution is the probability distribution of X_0.
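A minimal sketch (not from the slides) of what this definition means operationally: given a transition matrix P and an initial distribution, the next state is drawn from the row of P indexed by the current state alone. The function name simulate_chain and the two-state example matrix are illustrative assumptions.

```python
import numpy as np

def simulate_chain(P, pi0, n_steps, rng=None):
    """Draw X_0 from pi0, then X_{n+1} from row P[X_n, :]."""
    rng = rng or np.random.default_rng(0)
    P = np.asarray(P)
    x = rng.choice(len(pi0), p=pi0)          # initial state from the initial distribution
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(P.shape[1], p=P[x])   # next state depends only on the current state
        path.append(x)
    return path

# Two-state example: stays in the same state with probability 0.9 or 0.8.
P = [[0.9, 0.1],
     [0.2, 0.8]]
print(simulate_chain(P, pi0=[1.0, 0.0], n_steps=10))
```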

2 Prove that the transition matrix P of a Markov chain is a stochastic matrix. Definition: P is a stochastic matrix means that P has all non-negative components and each row adds up to 1.

3 Example: Bernoulli Process
P = [ q  p  0  0  …
      0  q  p  0  …
      0  0  q  p  …
      0  0  0  q  … ]
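A short sketch (illustration only, not slide content): build a finite truncation of this matrix and check the stochastic-matrix property from slide 2. The truncation level K and the value of p are arbitrary assumptions; the last row is made absorbing so the truncated matrix still has rows summing to 1.

```python
import numpy as np

p, q, K = 0.3, 0.7, 6            # q = 1 - p; K = truncation level (assumed)
P = np.zeros((K, K))
for i in range(K):
    P[i, i] = q                  # no success: count stays the same
    if i + 1 < K:
        P[i, i + 1] = p          # success: count increases by one
P[K - 1, K - 1] = 1.0            # absorb at the truncation boundary
print(P)
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))
```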

4 Example: Let Tn be the time of the nth success. Then {Tn} is a Markov chain.
P = [ 0  p  pq  pq^2  …
      0  0  p   pq    …
      0  0  0   p     …
      …                 ]
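Another hedged sketch: rather than writing the matrix down, simulate one long Bernoulli(p) sequence and check empirically that the gaps T_{n+1} − T_n have the geometric probabilities p·q^(k−1) appearing in the rows of P. The parameter values and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
p, q = 0.3, 0.7
trials = rng.random(200_000) < p                 # one long Bernoulli(p) sequence
success_times = np.flatnonzero(trials) + 1       # times of successes (1-indexed)
gaps = np.diff(success_times)                    # T_{n+1} - T_n
for k in range(1, 5):
    print(k, (gaps == k).mean(), p * q ** (k - 1))   # empirical vs. p*q^(k-1)
```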

5 Example: {Xn} is a discrete-alphabet i.i.d. process. Then {Xn} is a Markov chain. (Show it for yourself.)

6 Example: a discrete-alphabet i.i.d. process with ...

7 Classification of States


10 Little’s Formula Scenario:
Customers arrive at random times; equivalently, packets are assigned to links at random times. The probability distribution of the time between two arrivals is given.
Service time is random; the probability distribution of the service time is given.
We are interested in finding:
the average number of customers in the queue, and
the average delay seen by customers.
Packet interpretation: packets arriving with random packet lengths, transmitted at a constant rate (so the service time is random).

11 N(t): number of customers at time t.
p_n(t): probability of n customers in queue or in service at time t.
p_n(0): initial probability of having n customers; assumed to be known.

12 Little's Theorem: N = λT, where N is the average number of customers in the system, λ is the arrival rate, and T is the average time a customer spends in the system.
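A tiny numerical illustration (the arrival and departure times below are made up for the sketch, not taken from the slides): computed over a horizon by which everyone has departed, the time-average number in system and λ·T are both the total customer-time divided by the horizon, which is the accounting identity behind Little's theorem.

```python
import numpy as np

arrivals   = np.array([0.0, 1.0, 2.5, 4.0, 6.0])
departures = np.array([2.0, 3.0, 5.0, 5.5, 7.0])   # departure time of each customer
horizon = departures.max()                          # all customers have left by this time

lam = len(arrivals) / horizon                       # arrival rate over the horizon
T   = (departures - arrivals).mean()                # average time in system
N   = (departures - arrivals).sum() / horizon       # time-average number in system
print("lambda*T =", lam * T, "  time-average N =", N)
```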

13 Queuing Models
The notation is: arrival process / service-time distribution / number of servers.
Nature of the arrival process: M = memoryless (Poisson), G = general, D = deterministic.
Probability distribution of the service time: M = memoryless (exponential), D = deterministic, etc.
For M/M/1, the arrival process is Poisson with rate λ, the service time is an exponential random variable with mean 1/μ, and there is one server.
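A minimal M/M/1 simulation sketch, assuming FIFO service and arbitrary example rates λ = 0.8, μ = 1: Poisson arrivals are generated through exponential inter-arrival times, service times are exponential, and waiting times follow the Lindley recursion W_{k+1} = max(0, W_k + S_k − A_{k+1}). The simulated average delay should be close to the 1/(μ − λ) value on the later slides.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu, n = 0.8, 1.0, 200_000
A = rng.exponential(1 / lam, n)      # inter-arrival times (Poisson arrival process)
S = rng.exponential(1 / mu, n)       # exponential service times
W = np.zeros(n)                      # waiting time in queue of each customer
for k in range(n - 1):
    W[k + 1] = max(0.0, W[k] + S[k] - A[k + 1])   # Lindley recursion

T_sim = (W + S).mean()               # time in system = wait + service
print("simulated T:", T_sim, " theory 1/(mu-lam):", 1 / (mu - lam))
```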

14 Poisson Process

15 Properties of Poisson Process


17 Customer Service Time
Service times are also independent of each other and of the inter-arrival times. For an exponential random variable X with rate λ, P(X > t + τ | X > t) = P(X > τ) = e^{−λτ} for all t, τ ≥ 0: this is the memoryless property of the exponential distribution. In terms of arrivals, the probability that there will be no arrivals in the next τ seconds is the same as the conditional probability of no arrival in the next τ seconds given that there have been no arrivals for the past t seconds, for any t.
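A quick numerical check of the memoryless property stated above (the sample size and the values of λ, t, τ are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, tau = 2.0, 0.5, 0.3
X = rng.exponential(1 / lam, 1_000_000)   # exponential inter-arrival times

cond = (X[X > t] > t + tau).mean()        # estimate of P(X > t + tau | X > t)
uncond = (X > tau).mean()                 # estimate of P(X > tau)
print(cond, uncond, np.exp(-lam * tau))   # all three should be close
```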

18 M/M/1 Queuing Model: Balance Equations
Consider the interval (kδ, (k+1)δ] for k ≥ 0 and small δ. The number of transitions from state n to state n+1 in such an interval is at most one, and the number of transitions from state n+1 to state n is also at most one. (Transition diagram: state n goes to n+1 with probability λδ; state n+1 goes to n with probability μδ.) In steady state, the rate at which the system goes from state n to state n+1 equals the rate at which it goes from state n+1 to n, giving the balance equation λ p_n = μ p_{n+1}.
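A sketch that turns the balance equation into numbers: iterate p_{n+1} = (λ/μ) p_n over a truncated state space (the truncation level K is an assumption of the sketch), normalize, and compare with the closed form (1 − ρ)ρ^n on the next slide.

```python
import numpy as np

lam, mu, K = 0.8, 1.0, 50            # example rates; K = truncation level (assumed)
rho = lam / mu
p = np.zeros(K)
p[0] = 1.0
for n in range(K - 1):
    p[n + 1] = (lam / mu) * p[n]     # balance equation: p_{n+1} = rho * p_n
p /= p.sum()                         # normalize so the probabilities sum to 1
print(p[:4])
print([(1 - rho) * rho ** n for n in range(4)])
```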

19 Solving the balance equations: p_{n+1} = ρ p_n with ρ = λ/μ, so p_n = ρ^n p_0. For ρ < 1, normalization gives p_0 = 1 − ρ, hence p_n = (1 − ρ) ρ^n, and the average number of customers in the system is N = Σ n p_n = ρ/(1 − ρ).

20 By Little's theorem, T, the average delay in the system, i.e., queue time plus service time, is equal to T = N/λ = 1/(μ − λ).
Average delay in queue: W = average delay in system − average service time = T − 1/μ.
NQ = average number of customers in queue = λW.
Note: ρ is the utilization factor, i.e., the long-run proportion of time the server is busy. Then ρ = 1 − p_0, where p_0 is the probability that the system is empty.

21 Example: Suppose we increase the arrival and transmission rates by the same factor k.

22 Faster arrivals and departures
Therefore, the delay is reduced by a factor of k while the average number of customers stays the same. The reason is that time is compressed by a factor of k, so the customers move out faster.
(Figure: N(t), arrivals, and departures for the original system and for the system with faster arrivals and departures.)
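The scaling argument in numbers (k, λ, μ are arbitrary example values for the sketch): multiplying both rates by k leaves ρ, and hence N, unchanged, while T = 1/(μ − λ) shrinks by a factor of k.

```python
k, lam, mu = 10, 0.8, 1.0
for a, b in [(lam, mu), (k * lam, k * mu)]:
    rho = a / b                      # unchanged by the scaling
    N = rho / (1 - rho)              # unchanged by the scaling
    T = 1 / (b - a)                  # divided by k
    print(f"lam={a:.1f}, mu={b:.1f}: rho={rho:.2f}, N={N:.2f}, T={T:.3f}")
```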

23 Multiplexing
Assume m Poisson arrival streams of rate λ/m each, with a total transmission capacity of rate μ. Two possible scenarios:
Divide the capacity into m equal parts and transmit each arrival stream over its own part. Then we have m M/M/1 queues, each with arrival rate λ/m and service rate μ/m. Time-division multiplexing (TDM) or frequency-division multiplexing (FDM) accomplishes this.
Merge the m arrival processes into one Poisson process of rate λ and use the entire transmission capacity for it. Statistical multiplexing achieves this.
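A numerical comparison of the two scenarios (m, λ, μ are arbitrary example values): dividing the capacity multiplies the average delay by m relative to the merged system.

```python
m, lam, mu = 8, 0.8, 1.0
T_split  = 1 / (mu / m - lam / m)    # TDM/FDM: each queue has rates lam/m and mu/m
T_merged = 1 / (mu - lam)            # statistical multiplexing: rates lam and mu
print("TDM/FDM delay:", T_split)
print("statistical multiplexing delay:", T_merged)
print("ratio:", T_split / T_merged)  # equals m
```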

