Meaning of Markov Chain

A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Every state of a Markov chain is one of two types: transient or recurrent. A state is recurrent if, starting from that state, the chain returns to it with probability 1. A state is transient if, starting from that state, the chain returns to it with probability strictly less than 1; in other words, there is a positive probability that the chain never returns to that state. (Figure: an example of a recurrent state and a transient state.) A short simulation sketch below illustrates the distinction.
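To make this concrete, here is a minimal Monte Carlo sketch in Python/NumPy. The 3-state chain, its transition matrix P, and the helper returns_to_start are all hypothetical, chosen only for illustration: state 0 can drift into the closed class {1, 2} and never come back, so its return probability is about 0.5, which is less than 1 but not 0.

import numpy as np

# Hypothetical 3-state chain: from state 0 the chain either stays at 0
# (prob 0.5) or enters the closed class {1, 2} (prob 0.5) and never returns.
# So state 0 is transient with return probability 0.5; states 1, 2 are recurrent.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

rng = np.random.default_rng(0)

def returns_to_start(P, start, max_steps=1000):
    """Simulate one path and report whether it ever revisits `start`."""
    state = start
    for _ in range(max_steps):
        state = rng.choice(len(P), p=P[state])
        if state == start:
            return True
    return False

n_paths = 10_000
est = sum(returns_to_start(P, 0) for _ in range(n_paths)) / n_paths
print(f"Estimated return probability of state 0: {est:.3f}")  # about 0.5, i.e. < 1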
Consider now a finite-state Markov chain and suppose that the states are numbered so that T = {1, 2, ..., t} denotes the set of transient states. Let

P_T = \begin{pmatrix} P_{11} & P_{12} & \cdots & P_{1t} \\ \vdots & \vdots & & \vdots \\ P_{t1} & P_{t2} & \cdots & P_{tt} \end{pmatrix}

and note that since P_T specifies only the transition probabilities from transient states into transient states, some of its row sums are less than 1 (otherwise, T would be a closed class of states). For transient states i and j, let s_{ij} denote the expected number of time periods that the Markov chain is in state j, given that it starts in state i. Let \delta_{i,j} = 1 when i = j and let it be 0 otherwise. Conditioning on the initial transition gives

s_{ij} = \delta_{i,j} + \sum_{k} P_{ik} s_{kj} = \delta_{i,j} + \sum_{k=1}^{t} P_{ik} s_{kj}
where the final equality follows because it is impossible to go from a recurrent state to a transient state, implying that s_{kj} = 0 when k is a recurrent state. Let S denote the matrix of values s_{ij}, i, j = 1, ..., t. That is,

S = \begin{pmatrix} s_{11} & s_{12} & \cdots & s_{1t} \\ \vdots & \vdots & & \vdots \\ s_{t1} & s_{t2} & \cdots & s_{tt} \end{pmatrix}

In matrix notation, the preceding equations for the transient states can be written as

S = I + P_T S

where I is the identity matrix of size t, P_T is the matrix of transition probabilities among the transient states, and S is the matrix of expected numbers of periods spent in each transient state. Rearranging,

S - P_T S = I
(I - P_T) S = I

and therefore

S = (I - P_T)^{-1}
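Numerically, this is a single matrix inversion. Here is a minimal Python/NumPy sketch (the function name expected_visits is ours, introduced only for illustration):

import numpy as np

def expected_visits(PT):
    """Given the transient-to-transient transition matrix P_T,
    return S = (I - P_T)^{-1}, whose (i, j) entry is the expected
    number of periods spent in transient state j starting from i."""
    t = PT.shape[0]
    return np.linalg.inv(np.eye(t) - PT)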
Example: Consider the gambler's ruin problem with p = 0.4 and N = 7. Starting with 3 units, determine (a) the expected amount of time the gambler has 5 units, and (b) the expected amount of time the gambler has 2 units.

Solution: The matrix P_T, which specifies P_{ij}, i, j ∈ {1, 2, 3, 4, 5, 6}, is as follows (from each transient fortune the gambler wins one unit with probability 0.4 and loses one unit with probability 0.6):

P_T = \begin{pmatrix} 0 & 0.4 & 0 & 0 & 0 & 0 \\ 0.6 & 0 & 0.4 & 0 & 0 & 0 \\ 0 & 0.6 & 0 & 0.4 & 0 & 0 \\ 0 & 0 & 0.6 & 0 & 0.4 & 0 \\ 0 & 0 & 0 & 0.6 & 0 & 0.4 \\ 0 & 0 & 0 & 0 & 0.6 & 0 \end{pmatrix}

Applying the above equation, with I the 6 x 6 identity matrix, we first form I - P_T:
I - P_T = \begin{pmatrix} 1 & -0.4 & 0 & 0 & 0 & 0 \\ -0.6 & 1 & -0.4 & 0 & 0 & 0 \\ 0 & -0.6 & 1 & -0.4 & 0 & 0 \\ 0 & 0 & -0.6 & 1 & -0.4 & 0 \\ 0 & 0 & 0 & -0.6 & 1 & -0.4 \\ 0 & 0 & 0 & 0 & -0.6 & 1 \end{pmatrix}

Inverting I - P_T (this computation was done using MATLAB) gives

S = (I - P_T)^{-1} = \begin{pmatrix} 1.6149 & 1.0248 & 0.6314 & 0.3691 & 0.1943 & 0.0777 \\ 1.5372 & 2.5619 & 1.5784 & 0.9228 & 0.4857 & 0.1943 \\ 1.4206 & 2.3677 & 2.9990 & 1.7533 & 0.9228 & 0.3691 \\ 1.2458 & 2.0763 & 2.6299 & 2.9990 & 1.5784 & 0.6314 \\ 0.9835 & 1.6391 & 2.0763 & 2.3677 & 2.5619 & 1.0248 \\ 0.5901 & 0.9835 & 1.2458 & 1.4206 & 1.5372 & 1.6149 \end{pmatrix}

Hence, s_{3,5} = 0.9228 and s_{3,2} = 2.3677.
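The same numbers can be reproduced in Python/NumPy rather than MATLAB; here is a sketch that builds P_T for this chain. Note that NumPy arrays are 0-indexed, so fortune j sits at index j - 1.

import numpy as np

p, t = 0.4, 6                      # win probability and number of transient states
PT = np.zeros((t, t))
for i in range(t):
    if i + 1 < t:
        PT[i, i + 1] = p           # win one unit
    if i > 0:
        PT[i, i - 1] = 1 - p       # lose one unit

S = np.linalg.inv(np.eye(t) - PT)  # equivalently, expected_visits(PT) from above
print(round(S[2, 4], 4))           # s_{3,5} -> 0.9228
print(round(S[2, 1], 4))           # s_{3,2} -> 2.3677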
For i ∈ T, j ∈ T, the quantity f_{ij}, equal to the probability that the Markov chain ever makes a transition into state j given that it starts in state i, is easily determined from P_T. To determine the relationship, let us start by deriving an expression for s_{ij} by conditioning on whether state j is ever entered. This yields

s_{ij} = E[time in j | start in i, ever transit to j] f_{ij} + E[time in j | start in i, never transit to j] (1 - f_{ij})
       = (\delta_{i,j} + s_{jj}) f_{ij} + \delta_{i,j} (1 - f_{ij})
       = \delta_{i,j} + f_{ij} s_{jj}

since s_{jj} is the expected number of additional time periods spent in state j given that it is eventually entered from state i. Solving the preceding equation yields

f_{ij} = (s_{ij} - \delta_{i,j}) / s_{jj}

Example: What is the probability that the gambler ever has a fortune of 1?

Solution: Since s_{3,1} = 1.4206 and s_{1,1} = 1.6149, then

f_{3,1} = s_{3,1} / s_{1,1} = 1.4206 / 1.6149 ≈ 0.8797

A quick computational check follows.
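Continuing the NumPy sketch above (with S as computed there), the hitting probability drops out in one line:

# f_ij = (s_ij - delta_ij) / s_jj; here i = 3, j = 1, so delta_{3,1} = 0.
f_31 = S[2, 0] / S[0, 0]
print(round(f_31, 4))   # -> 0.8797

THE END. THANK YOU.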