Day 3: Markov Chains. For some interesting demonstrations of this topic visit: http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/Tools/index.htm
Equations of the form u_{k+1} = A u_k are called discrete equations because they only model the system at whole-number time increments. A difference equation is an equation involving differences. We can view a difference equation from at least three points of view: as a sequence of numbers, as a discrete dynamical system, and as an iterated function. It is the same thing looked at from different angles; a small illustration of the three views is sketched below.
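As a quick illustration (my own sketch, not from the slides), here is a tiny scalar difference equation in Python; the equation x_{k+1} = 0.5 x_k + 1 is a made-up example.

```python
def f(x):
    return 0.5 * x + 1           # iterated-function view: x_{k+1} = f(x_k)

x = 0.0                          # initial state of the discrete dynamical system
sequence = [x]                   # sequence-of-numbers view
for _ in range(10):
    x = f(x)                     # advance the system one whole time step
    sequence.append(x)

print(sequence)                  # the sequence approaches the fixed point x = 2
```

The same list can be read as a sequence of numbers, as the trajectory of a one-dimensional dynamical system, or as repeated application of the function f.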
Difference Equations vs. Differential Equations. Dynamical systems come with many different names. The dynamical systems of particular interest to us are those whose state depends on the input history. In discrete time, such a system is described by a difference equation (the counterpart of a differential equation in continuous time).
Markov Matrices. Properties of Markov matrices: all entries are ≥ 0, and each column adds up to one. Each column represents probabilities. Note: the powers of the matrix maintain these properties. Consider the matrix on the slide after next.
Markov Matrices. 1 is an eigenvalue of every Markov matrix. Why? Subtract 1 from each entry on the diagonal to form A - I. Each column of A - I then adds to zero, which means the rows are linearly dependent, which means A - I is singular, so λ = 1 is an eigenvalue of A.
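In symbols (a standard argument, added here for completeness): because each column of A sums to 1, the all-ones row vector is a left eigenvector of A with eigenvalue 1:

```latex
\mathbf{1}^{\mathsf T} A = \mathbf{1}^{\mathsf T}
\;\Longrightarrow\;
\mathbf{1}^{\mathsf T}(A - I) = \mathbf{0}^{\mathsf T}
\;\Longrightarrow\;
\det(A - I) = 0 .
```

So A - I is singular, and since A and its transpose have the same eigenvalues, λ = 1 is an eigenvalue of A.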
Markov Matrices. Consider

A = [ 0.1  0.01  0.3 ]
    [ 0.2  0.99  0.3 ]
    [ 0.7  0.00  0.4 ]

One eigenvalue is 1; all other eigenvalues have absolute value ≤ 1. We are interested in raising A to powers: if 1 is an eigenvalue and all other eigenvalues have absolute value less than 1, then A^k u_0 approaches a steady state, namely a multiple of the eigenvector for λ = 1. Note: this argument requires n independent eigenvectors.
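A minimal numerical check of these claims for this particular A, using numpy (my own sketch, not part of the original slides):

```python
import numpy as np

A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])

print(A.sum(axis=0))                    # every column sums to 1
print((A >= 0).all())                   # all entries are nonnegative

vals, _ = np.linalg.eig(A)
print(vals)                             # one eigenvalue is 1; the others have |lambda| < 1

# Powers of A keep the Markov properties and approach a steady-state matrix.
print(np.linalg.matrix_power(A, 50).sum(axis=0))   # columns still sum to 1
```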
Shortcuts for finding eigenvectors. To find the eigenvector that corresponds to λ = 1, form

A - I = [ -0.9   0.01   0.3 ]
        [  0.2  -0.01   0.3 ]
        [  0.7   0.00  -0.6 ]

Use the last row (0.7, 0, -0.6) to make it zero by choosing the first and third components of the eigenvector: x1 = 0.6 and x3 = 0.7. Then use the top row to get the missing middle value (worked out on the next slide).
Shortcuts for finding eigenvectors. Then use the top row to get the missing middle value: (-0.9)(0.6) + (0.01)(x) + (0.3)(0.7) = 0, so x = 33. Or use the 2nd row: (0.2)(0.6) + (-0.01)(x) + (0.3)(0.7) = 0, so again x = 33. Either way, the eigenvector for λ = 1 is (0.6, 33, 0.7).
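A one-line check of this result with numpy (again a sketch, not from the slides):

```python
import numpy as np

A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])
v = np.array([0.6, 33.0, 0.7])          # eigenvector found by the shortcut

print(np.allclose(A @ v, v))            # True: A v = v, so v belongs to lambda = 1
```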
Applications of Markov Matrices. Markov matrices are used when the probability of an event depends on the current state. For this model, the transition probabilities must remain constant over time, and the total population does not change over time. Markov matrices have applications in electrical engineering, waiting times, and stochastic processes.
Applications of Markov Matrices: u_{k+1} = A u_k. Suppose we have two cities, Suzhou (S) and Hangzhou (H), with initial condition at k = 0 of S = 0 and H = 1000. We would like to describe the movement of population between these two cities:

[ u_S ]        [ 0.9  0.2 ] [ u_S ]
[ u_H ]k+1  =  [ 0.1  0.8 ] [ u_H ]k

The left side is the population of Suzhou and Hangzhou at time k+1; the vector on the right is the population at time k. Column 1: 0.9 of the people in S stay there and 0.1 move to H. Column 2: 0.8 of the people in H stay there and 0.2 move to S. A direct simulation of this model is sketched below.
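Here is that simulation, a minimal sketch assuming only the transition matrix and initial condition given on this slide:

```python
import numpy as np

A = np.array([[0.9, 0.2],               # columns: where people from S and from H end up
              [0.1, 0.8]])
u = np.array([0.0, 1000.0])             # k = 0: S = 0, H = 1000

for k in range(1, 6):
    u = A @ u                           # one step of u_{k+1} = A u_k
    print(k, u)                         # populations drift toward the 2:1 steady state
```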
Applications of Markov Matrices: u_{k+1} = A u_k, where u_{k+1} is the next state and u_k the current state, with

A = [ 0.9  0.2 ]
    [ 0.1  0.8 ]

Find the eigenvalues and eigenvectors.
Applications of Markov Matrices: for the same system u_{k+1} = A u_k, find the eigenvalues and eigenvectors. Eigenvalues: λ1 = 1 and λ2 = 0.7 (λ1 = 1 from the properties of Markov matrices, and λ2 = trace - 1 = 1.7 - 1 = 0.7). Eigenvectors: ker(A - I) and ker(A - 0.7I), which give x1 = (2, 1) and x2 = (-1, 1).
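If you let numpy do the work, the same eigenvalues and eigenvectors come out (a sketch; numpy normalizes its eigenvectors, so they appear as scaled versions of (2, 1) and (-1, 1)):

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

vals, vecs = np.linalg.eig(A)
print(vals)                             # [1.0, 0.7]
print(vecs)                             # columns proportional to (2, 1) and (-1, 1)
```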
Applications of Markov Matrices: the eigenvalues tell us what happens as time goes to ∞. The eigenvalue λ = 1 gives a steady state, while the λ = 0.7 term disappears as k → ∞. The eigenvector for λ = 1 tells us that the steady state has a 2:1 ratio of S to H. The total population is still 1000, so the final populations will be S = 1000(2/3) ≈ 667 and H = 1000(1/3) ≈ 333.
Applications. To find the amounts after a finite number of steps, expand in eigenvectors:

A^k u_0 = c1 (1)^k [ 2 ]  +  c2 (0.7)^k [ -1 ]
                   [ 1 ]                [  1 ]

Use the initial condition u_0 (at k = 0, S = 0 and H = 1000) to solve for the constants:

[    0 ]  =  c1 [ 2 ]  +  c2 [ -1 ]        c1 = 1000/3,  c2 = 2000/3
[ 1000 ]        [ 1 ]        [  1 ]
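A quick numerical check of this closed form against direct matrix powers (my own sketch, using the constants derived above):

```python
import numpy as np

A  = np.array([[0.9, 0.2],
               [0.1, 0.8]])
u0 = np.array([0.0, 1000.0])
c1, c2 = 1000/3, 2000/3

for k in (1, 5, 50):
    direct = np.linalg.matrix_power(A, k) @ u0
    closed = c1 * np.array([2, 1]) + c2 * 0.7**k * np.array([-1, 1])
    print(k, direct, closed)            # the two agree and tend to (2000/3, 1000/3)
```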
Steady state for Markov Matrices. Every such Markov chain approaches a steady state. The steady state is the eigenvector for the eigenvalue λ = 1, scaled to match the total population.
Homework: p. 487 #3-6, 8, 9, 13 (white book); eigenvalue review worksheet #1-5. "Genius is one per cent inspiration, ninety-nine per cent perspiration." (Thomas Alva Edison)
More Info:
http://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/lecture-24-markov-matrices-fourier-series/
www.math.hawaii.edu/~pavel/fibonacci.pdf
http://people.revoledu.com/kardi/tutorial/DifferenceEquation/WhatIsDifferenceEquation.htm
https://www.math.duke.edu//education/ccp/materials/linalg/diffeqs/diffeq2.html
Fibonacci via matrices. For more information visit: http://www.maths.leeds.ac.uk/applied/0380/fibonacci03.pdf