Solutions to group exercises

1. (a) Truncating the chain at $M$ is equivalent to setting the transition probabilities into every state in $\{M+1, M+2, \ldots\}$ to zero. Renormalizing the transition matrix to make it stochastic, we get, for $i, j \le M$,
$$\hat p_{ij} = p_{ij} \ (j \ne i), \qquad \hat p_{ii} = p_{ii} + \sum_{k > M} p_{ik}.$$
(b) If the original chain satisfies detailed balance, $\pi_i p_{ij} = \pi_j p_{ji}$, then with $\hat\pi_i = \pi_i / \sum_{k \le M} \pi_k$ we have, for $i \ne j \le M$,
$$\hat\pi_i \hat p_{ij} = \frac{\pi_i p_{ij}}{\sum_{k \le M} \pi_k} = \frac{\pi_j p_{ji}}{\sum_{k \le M} \pi_k} = \hat\pi_j \hat p_{ji},$$
yielding detailed balance for the truncated chain.

2. Let $Z_n = (X_{2n}, X_{2n+1})$. Then $Z_n$ is a Markov chain, with a transition matrix built from two steps of the original chain. Let $T_k = P(\text{hit } (0,1) \text{ before } (1,0) \mid \text{start at } k)$. First-step analysis yields a system of linear equations for the $T_k$.
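The truncation rule in 1(a) and the detailed-balance claim in 1(b) can be checked numerically. A minimal sketch, in which the weight matrix, state-space size, and truncation level $M$ are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reversible chain on {0,...,9}: symmetric weights w give
# detailed balance by construction (p_ij = w_ij / w_i, pi_i proportional to w_i).
w = rng.random((10, 10))
w = (w + w.T) / 2
P = w / w.sum(axis=1, keepdims=True)
pi = w.sum(axis=1) / w.sum()

# Truncate to {0,...,M}: zero the transitions past M and dump the lost
# mass onto the diagonal so each row sums to 1 again.
M = 4
Ph = P[:M + 1, :M + 1].copy()
for i in range(M + 1):
    Ph[i, i] += P[i, M + 1:].sum()
pih = pi[:M + 1] / pi[:M + 1].sum()   # renormalized stationary distribution

flux = pih[:, None] * Ph              # entries pi_i * p_ij
assert np.allclose(Ph.sum(axis=1), 1.0)
assert np.allclose(flux, flux.T)      # detailed balance survives truncation
```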
(b) If $q_0 = p_1 = p$ we get $p_{01,01} = 1/2$: a fair coin.

3. (a) Conditioning on the state at time $t$,
$$P_0(t+h) = P_0(t)(1 - \lambda h) + (1 - P_0(t))\mu h + o(h),$$
so
$$\frac{P_0(t+h) - P_0(t)}{h} = \mu - (\lambda + \mu)P_0(t) + o(1),$$
whence the differential equation $P_0'(t) = \mu - (\lambda + \mu)P_0(t)$ follows by letting $h \to 0$.
(b) Since $P_1(t) = 1 - P_0(t)$ we get
$$P_0(t) = \frac{\mu}{\lambda + \mu} + \Big(P_0(0) - \frac{\mu}{\lambda + \mu}\Big)e^{-(\lambda + \mu)t},$$
which can either be solved directly, or one can check that the given solution satisfies the differential equation.
(c) Letting $t \to \infty$ we get $P_0(t) \to \mu/(\lambda + \mu)$. This value as a starting distribution also yields a marginal distribution that is free of $t$, so it behaves like a stationary distribution (which we will define later).
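A quick numerical sanity check that the claimed solution satisfies the differential equation, using made-up rates `lam` and `mu`:

```python
import math

# Made-up rates for the two-state chain: leave 0 at rate lam, leave 1 at rate mu.
lam, mu = 1.5, 0.5
p0_init = 1.0                 # start in state 0
pi0 = mu / (lam + mu)         # limiting value from part (c)

def p0(t):
    """Claimed solution of P0'(t) = mu - (lam + mu) * P0(t)."""
    return pi0 + (p0_init - pi0) * math.exp(-(lam + mu) * t)

# Central-difference check that the solution satisfies the ODE.
for t in (0.1, 0.5, 2.0):
    h = 1e-6
    deriv = (p0(t + h) - p0(t - h)) / (2 * h)
    assert abs(deriv - (mu - (lam + mu) * p0(t))) < 1e-6
```

Starting from `p0_init = pi0` instead makes `p0(t)` constant in `t`, the stationarity claim in (c).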
Announcement

MathAcrossCampus Colloquium (http://www.math.washington.edu/mac/)
"Evolutionary trees, coalescents, and gene trees: can mathematicians find the woods?"
Joe Felsenstein, Genome Sciences, UW
Thursday, November 13, 2008, 3:30, Kane Hall 210. Reception to follow.
The Markov property

$X(t)$ is a Markov process if for any $n$,
$$P(X(t) = j \mid X(t_0) = i_0, \ldots, X(t_n) = i_n) = P(X(t) = j \mid X(t_n) = i_n)$$
for all $j, i_0, \ldots, i_n$ in $S$ and any $t_0 < t_1 < \cdots < t_n < t$. The transition probabilities $p_{ij}(s,t) = P(X(t) = j \mid X(s) = i)$ are homogeneous if $p_{ij}(s,t) = p_{ij}(0, t-s)$. We will usually assume this, and write $p_{ij}(t)$.
Semigroup property

Let $P_t = [p_{ij}(t)]$. Then $(P_t)$ is a substochastic semigroup, meaning that
(a) $P_0 = I$
(b) $P_{s+t} = P_s P_t$
(c) $P_t$ is a substochastic matrix, i.e. has nonnegative entries with row sums at most 1.
Proof

(a) $p_{ij}(0) = P(X(s) = j \mid X(s) = i) = \delta_{ij}$, so $P_0 = I$.
(b) Conditioning on the state at time $s$ (Chapman-Kolmogorov):
$$p_{ij}(s+t) = \sum_k P(X(s+t) = j \mid X(s) = k)\, P(X(s) = k \mid X(0) = i) = \sum_k p_{ik}(s)\, p_{kj}(t),$$
which is the $(i,j)$ entry of $P_s P_t$.
(c) Each $p_{ij}(t) \ge 0$, and $\sum_j p_{ij}(t) = P(X(t) \in S \mid X(0) = i) \le 1$.
Standard semigroup

$(P_t,\ t \ge 0)$ is a standard semigroup if $P_t \to I$ as $t \downarrow 0$.

Theorem: For a standard semigroup the transition probabilities are continuous.

Proof: By Chapman-Kolmogorov,
$$p_{ij}(t+h) = \sum_k p_{ik}(h)\, p_{kj}(t) \to p_{ij}(t) \quad \text{as } h \downarrow 0,$$
since $p_{ik}(h) \to \delta_{ik}$. Unless otherwise specified we will consider standard semigroups.
Infinitesimal generator

By continuity of the transition probabilities, Taylor expansion suggests
$$p_{ij}(h) = \delta_{ij} + g_{ij} h + o(h) \quad \text{as } h \downarrow 0.$$
We must have $g_{ij} \ge 0$ for $i \ne j$, and $g_{ii} \le 0$. Let $G = [g_{ij}]$, so that (under regularity conditions)
$$G = \lim_{h \downarrow 0} \frac{P_h - I}{h}.$$
$G$ is called the infinitesimal generator of $(P_t)$.
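The limit defining the generator can be seen numerically. Here is a sketch for the two-state chain (rates `lam`, `mu` and the closed-form transition matrix are assumptions; the closed form is derived on a later slide):

```python
import numpy as np

# Two-state chain, assumed rates lam (0 -> 1) and mu (1 -> 0).
lam, mu = 2.0, 1.0
r = lam + mu

def P(t):
    """Closed-form transition matrix of the two-state process."""
    e = np.exp(-r * t)
    return np.array([[mu / r + lam / r * e, lam / r - lam / r * e],
                     [mu / r - mu / r * e,  lam / r + mu / r * e]])

G = np.array([[-lam, lam],
              [mu, -mu]])

# (P_h - I)/h approaches the generator G as h decreases to 0.
for h in (1e-2, 1e-4, 1e-6):
    err = np.abs((P(h) - np.eye(2)) / h - G).max()
    assert err < r * r * h   # the error shrinks linearly in h
```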
Birth process

$$G = \begin{pmatrix} -\lambda_0 & \lambda_0 & 0 & 0 & \cdots \\ 0 & -\lambda_1 & \lambda_1 & 0 & \cdots \\ 0 & 0 & -\lambda_2 & \lambda_2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}$$

Under regularity conditions we have $\sum_j p_{ij}(h) = 1$, so $\sum_j g_{ij} = 0$, and with $g_{i,i+1} = \lambda_i$ we must have $g_{ii} = -\lambda_i$.
Forward equations

By the semigroup property, $P_{t+h} = P_t P_h$, so
$$\frac{P_{t+h} - P_t}{h} = P_t\, \frac{P_h - I}{h},$$
and letting $h \downarrow 0$,
$$P_t' = P_t G, \quad \text{or componentwise} \quad p_{ij}'(t) = \sum_k p_{ik}(t)\, g_{kj}.$$
Backward equations

Instead of looking at $(t, t+h]$, look at $(0, h]$: $P_{t+h} = P_h P_t$, so
$$P_t' = G P_t, \quad \text{or componentwise} \quad p_{ij}'(t) = \sum_k g_{ik}\, p_{kj}(t).$$
Formal solution

In many cases we can solve both these equations by
$$P_t = e^{tG} = \sum_{n=0}^{\infty} \frac{t^n G^n}{n!}.$$
But this can be difficult to actually calculate.
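For a finite state space the series can be summed directly, and the result can be checked against the semigroup property. A sketch, with a made-up three-state birth-death generator:

```python
import numpy as np

def expm_series(A, terms=60):
    """exp(A) via the power series sum_n A^n / n! (fine for small norms)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

# Assumed example: a birth-death generator on three states.
G = np.array([[-1.0,  1.0,  0.0],
              [ 2.0, -3.0,  1.0],
              [ 0.0,  2.0, -2.0]])

s, t = 0.3, 0.7
Ps, Pt, Pst = expm_series(s * G), expm_series(t * G), expm_series((s + t) * G)

assert np.allclose(Ps @ Pt, Pst)          # semigroup property P_{s+t} = P_s P_t
assert np.allclose(Pst.sum(axis=1), 1.0)  # stochastic rows
assert (Pst > -1e-12).all()               # nonnegative entries
```

In practice a library routine such as `scipy.linalg.expm` is preferable to the raw series, which can lose accuracy for matrices with large norm.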
The 0-1 case

Two states, with generator
$$G = \begin{pmatrix} -\lambda & \lambda \\ \mu & -\mu \end{pmatrix}.$$
The backward equations are
$$p_{00}'(t) = -\lambda p_{00}(t) + \lambda p_{10}(t), \qquad p_{10}'(t) = \mu p_{00}(t) - \mu p_{10}(t).$$
0-1 case, continued

Thus
$$p_{00}(t) = \frac{\mu}{\lambda+\mu} + \frac{\lambda}{\lambda+\mu} e^{-(\lambda+\mu)t}, \qquad p_{11}(t) = \frac{\lambda}{\lambda+\mu} + \frac{\mu}{\lambda+\mu} e^{-(\lambda+\mu)t},$$
with $p_{01}(t) = 1 - p_{00}(t)$ and $p_{10}(t) = 1 - p_{11}(t)$.
Marginal distribution

Let $\pi_j(t) = P(X(t) = j)$. Then for a starting distribution $\pi(0)$ we have $\pi(t) = \pi(0) P_t$. For the 0-1 process we get
$$\pi_0(t) = \frac{\mu}{\lambda+\mu} + \Big(\pi_0(0) - \frac{\mu}{\lambda+\mu}\Big) e^{-(\lambda+\mu)t}.$$
Exponential holding times

Suppose $X(t) = j$. Let $U$ be the time spent in $j$ until the next transition after time $t$. By the Markov property, P(stay in $j$ in $(u, u+v]$, given a stay of at least $u$) is precisely P(stay in $j$ for $v$ time units). Mathematically, let $g(v) = P(U > v)$. Then we have
$$g(u+v) = g(u)\, g(v),$$
and it follows that $g(u) = e^{-\lambda u}$ for some $\lambda \ge 0$. By the backward equation, $p_{jj}(v) = 1 + g_{jj} v + o(v)$, and $P(U > v) = p_{jj}(v) + o(v)$ for small $v$, so $\lambda = -g_{jj}$.
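The functional equation $g(u+v) = g(u)g(v)$ is the memorylessness of the exponential distribution, and it is easy to see empirically. A sketch with an assumed rate $q = -g_{jj}$:

```python
import math
import random

random.seed(1)

# Assumed rate: holding time in state j is exponential with rate q = -g_jj.
q = 1.3
samples = [random.expovariate(q) for _ in range(200_000)]

def g(v):
    """Empirical survival function g(v) = P(U > v)."""
    return sum(x > v for x in samples) / len(samples)

u, v = 0.4, 0.7
# Memorylessness: g(u+v) = g(u) g(v), and g matches exp(-q u).
assert abs(g(u + v) - g(u) * g(v)) < 0.01
assert abs(g(u) - math.exp(-q * u)) < 0.01
```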
Jump chain

Given that the chain jumps from $i$ at a particular time, the probability that it jumps to $j$ is $-g_{ij}/g_{ii}$. Here is why (roughly): suppose the jump time $\tau$ satisfies $t < \tau \le t+h$, and there is only one jump in $(t, t+h]$ (likely for small $h$). Then
$$P(X(t+h) = j \mid X(t) = i,\ \text{jump in } (t, t+h]) \approx \frac{p_{ij}(h)}{1 - p_{ii}(h)} \approx \frac{g_{ij} h}{-g_{ii} h} = -\frac{g_{ij}}{g_{ii}}.$$
Construction

The way continuous-time Markov chains work is:
(1) Draw an initial value $i_0$ from $\pi(0)$.
(2) If $g_{i_0 i_0} < 0$, stay in $i_0$ for a random time which is exponentially distributed with rate $-g_{i_0 i_0}$ (if $g_{i_0 i_0} = 0$, the state is absorbing).
(3) Draw a new state $j \ne i_0$ from the jump-chain distribution $r_{i_0 j} = -g_{i_0 j}/g_{i_0 i_0}$, and repeat from (2).
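The construction above is directly simulable, and the simulated marginal at time $t$ should match $\pi(0) e^{tG}$. A sketch with an assumed three-state generator:

```python
import random
import numpy as np

random.seed(2)

# Assumed example generator on states {0, 1, 2}, started in state 0.
G = np.array([[-2.0,  1.5,  0.5],
              [ 1.0, -1.0,  0.0],
              [ 0.5,  0.5, -1.0]])
pi_start = np.array([1.0, 0.0, 0.0])
t_end = 1.0

def simulate(start):
    """Run the chain by the recipe above: exponential holds, then jump-chain draws."""
    i, t = start, 0.0
    while True:
        rate = -G[i, i]
        if rate == 0.0:
            return i                       # absorbing state
        t += random.expovariate(rate)      # step (2): exponential holding time
        if t > t_end:
            return i                       # still in i at time t_end
        weights = [G[i, j] if j != i else 0.0 for j in range(3)]
        i = random.choices(range(3), weights=weights)[0]   # step (3)

n = 50_000
counts = np.zeros(3)
for _ in range(n):
    counts[simulate(0)] += 1
empirical = counts / n

# Compare with the exact marginal pi(t) = pi(0) exp(tG) (series sum).
P, term = np.eye(3), np.eye(3)
for k in range(1, 40):
    term = term @ (t_end * G) / k
    P = P + term
exact = pi_start @ P

assert np.abs(empirical - exact).max() < 0.01
```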
Death process

Let $g_{i,i-1} = i\mu = -g_{ii}$, starting from $X(0) = n$. The forward equation is
$$p_{nj}'(t) = (j+1)\mu\, p_{n,j+1}(t) - j\mu\, p_{nj}(t).$$
Write $\phi(s,t) = \sum_j p_{nj}(t) s^j$ for the probability generating function. Then
$$\frac{\partial \phi}{\partial t} = \mu(1-s)\, \frac{\partial \phi}{\partial s}.$$
This is a Lagrange equation with solution
$$\phi(s,t) = \big(1 + (s-1)e^{-\mu t}\big)^n,$$
or $X(t) \sim \text{Bin}(n, e^{-\mu t})$.
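The binomial conclusion can be checked by simulating the death process directly: from state $i$, wait an exponential time with rate $i\mu$ and step down to $i-1$. A sketch with made-up values of $\mu$, $n$, and $t$; it compares the simulated mean with $E X(t) = n e^{-\mu t}$:

```python
import math
import random

random.seed(3)

# Assumed parameters: per-individual death rate mu, starting population n0.
mu, n0, t_end = 0.8, 10, 1.0

def x_at_t_end():
    """Simulate the pure death process up to t_end and return X(t_end)."""
    i, t = n0, 0.0
    while i > 0:
        t += random.expovariate(i * mu)   # holding time in state i has rate i*mu
        if t > t_end:
            break
        i -= 1
    return i

trials = 50_000
mean = sum(x_at_t_end() for _ in range(trials)) / trials

# Bin(n0, exp(-mu t)) predicts E X(t) = n0 * exp(-mu * t_end).
assert abs(mean - n0 * math.exp(-mu * t_end)) < 0.05
```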