Published by Emily Price. Modified over 9 years ago.
Hidden Markov Models Tunghai University Fall 2005
Simple Model: Markov Chains
Markov Property: the state of the system at time t+1 depends only on the state of the system at time t.
X1 → X2 → X3 → X4 → X5
Markov Chains: Stationarity Assumption
Probabilities are independent of t when the process is "stationary". So p_ij = P(X_{t+1} = j | X_t = i) for all t. This means that if the system is in state i, the probability that it will transition to state j is p_ij, no matter what the value of t is.
Simple Example
Weather:
- raining today → rain tomorrow: p_rr = 0.4
- raining today → no rain tomorrow: p_rn = 0.6
- not raining today → rain tomorrow: p_nr = 0.2
- not raining today → no rain tomorrow: p_nn = 0.8
Transition Matrix for Example
           rain   no rain
rain       0.4    0.6
no rain    0.2    0.8
Note that the rows sum to 1. Such a matrix is called a Stochastic Matrix. If the rows and the columns of a matrix all sum to 1, we have a Doubly Stochastic Matrix.
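To illustrate how a stochastic matrix is used, the two-day-ahead weather probabilities come from multiplying the matrix by itself. A minimal pure-Python sketch (matrix values are the ones from the slide; the helper name `mat_mul` is ours, not from the deck):

```python
def mat_mul(A, B):
    """Multiply two small matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Row/column order: 0 = rain, 1 = no rain.
P = [[0.4, 0.6],
     [0.2, 0.8]]

# Stochastic matrix: every row sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

P2 = mat_mul(P, P)  # two-step transition probabilities
print(P2[0][0])     # P(rain in 2 days | rain today) = 0.4*0.4 + 0.6*0.2 = 0.28
```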
Gambler's Example
At each play:
- the gambler wins $1 with probability p
- the gambler loses $1 with probability 1-p
The game ends when the gambler goes broke ($0) or reaches a fortune of $100. Both $0 and $100 are absorbing states.
States 0, 1, 2, …, N-1, N (N = $100); from each intermediate state the chain moves up by 1 with probability p and down by 1 with probability 1-p. Start: $10.
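The absorbing-state behavior can be seen by simulation. A hedged sketch (the function name and parameters are illustrative, not from the slide; for the fair case p = 0.5 the exact ruin probability is 1 - start/goal = 0.9):

```python
import random

def gamblers_ruin(rng, start=10, goal=100, p=0.5):
    """Simulate one game; return the final fortune (0 or goal)."""
    fortune = start
    while 0 < fortune < goal:  # play until an absorbing state is reached
        fortune += 1 if rng.random() < p else -1
    return fortune

# Estimate the ruin probability from many simulated games.
rng = random.Random(0)
games = [gamblers_ruin(rng) for _ in range(2000)]
ruin_prob = sum(g == 0 for g in games) / len(games)
print(ruin_prob)  # should be close to the exact value 0.9
```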
Coke vs. Pepsi
Given that a person's last cola purchase was Coke, there is a 90% chance that her next cola purchase will also be Coke. If a person's last cola purchase was Pepsi, there is an 80% chance that her next cola purchase will also be Pepsi.
Transitions: Coke → Coke 0.9, Coke → Pepsi 0.1, Pepsi → Pepsi 0.8, Pepsi → Coke 0.2.
Coke vs. Pepsi
Given that a person is currently a Pepsi purchaser, what is the probability that she will purchase Coke two purchases from now?
The transition matrix (corresponding to one purchase ahead; rows and columns ordered Coke, Pepsi) is:
P = | 0.9  0.1 |
    | 0.2  0.8 |
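The two-step probability follows by conditioning on the intermediate purchase; a minimal sketch with the transition probabilities from the slide:

```python
# Rows/columns: 0 = Coke, 1 = Pepsi.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Two purchases ahead, conditioning on the purchase in between:
# P(Coke in 2 | Pepsi now) = P(P->C)*P(C->C) + P(P->P)*P(P->C)
p_pepsi_to_coke_2 = P[1][0] * P[0][0] + P[1][1] * P[1][0]
print(p_pepsi_to_coke_2)  # 0.2*0.9 + 0.8*0.2 = 0.34
```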
Coke vs. Pepsi
Given that a person is currently a Coke drinker, what is the probability that she will purchase Pepsi three purchases from now?
Coke vs. Pepsi
Assume each person makes one cola purchase per week. Suppose 60% of all people now drink Coke and 40% drink Pepsi. What fraction of people will be drinking Coke three weeks from now?
Let (Q_0, Q_1) = (0.6, 0.4) be the initial probabilities. Regarding Coke as state 0 and Pepsi as state 1, we want to find P(X_3 = 0).
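The question can be answered by propagating the initial distribution through the transition matrix once per week; a sketch using the slide's numbers:

```python
# Rows/columns: 0 = Coke, 1 = Pepsi.
P = [[0.9, 0.1],
     [0.2, 0.8]]
Q = [0.6, 0.4]  # initial distribution: 60% Coke, 40% Pepsi

# One week of purchases: Q'[j] = sum_i Q[i] * P[i][j]; repeat three times.
for _ in range(3):
    Q = [sum(Q[i] * P[i][j] for i in range(2)) for j in range(2)]

print(Q[0])  # fraction drinking Coke three weeks from now: 0.6438
```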
Hidden Markov Models (HMM)
H1 → H2 → … → H_{L-1} → H_L   (hidden variables)
↓     ↓          ↓        ↓
X1    X2   …   X_{L-1}   X_L   (observed data)
Each observation X_i depends only on the hidden state H_i.
Coin-Tossing Example
An HMM over L tosses: the hidden state H_i is Fair/Loaded, the observation X_i is Head/Tail.
Transitions: Fair → Fair 0.9, Fair → Loaded 0.1, Loaded → Loaded 0.9, Loaded → Fair 0.1. Start: Fair or Loaded with probability 1/2 each.
Emissions: fair coin: head 1/2, tail 1/2; loaded coin: head 3/4, tail 1/4.
Coin-Tossing Example (cont.)
Query: what are the most likely values in the H-nodes (Fair/Loaded) to generate the given data?
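One standard way to answer this query is the Viterbi algorithm (the slide does not name it; this is a sketch, using the transition and emission probabilities from the previous slide):

```python
import math

# Coin-tossing HMM from the slide: states Fair ("F") and Loaded ("L").
states = ["F", "L"]
start = {"F": 0.5, "L": 0.5}
trans = {"F": {"F": 0.9, "L": 0.1},
         "L": {"F": 0.1, "L": 0.9}}
emit = {"F": {"H": 0.5, "T": 0.5},
        "L": {"H": 0.75, "T": 0.25}}

def viterbi(obs):
    """Most likely hidden state sequence (log-space to avoid underflow)."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for x in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda r: V[-1][r] + math.log(trans[r][s]))
            row[s] = V[-1][best] + math.log(trans[best][s]) + math.log(emit[s][x])
            ptr[s] = best
        V.append(row)
        back.append(ptr)
    # Trace back from the best final state.
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

print(viterbi(list("HHHHHHTHTT")))
```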
Coin-Tossing Example
Query: what are the probabilities for fair/loaded coins given the set of outcomes {x_1,…,x_L}? That is, seeing the outcomes {x_1,…,x_L}, compute p(loaded | x_1,…,x_L) for each coin toss:
1. Compute the posterior belief in H_i (for a specific i) given the evidence {x_1,…,x_L}, for each of H_i's values h_i; namely, compute p(h_i | x_1,…,x_L).
2. Do the same computation for every H_i, but without repeating the first task L times.
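Task 2 is what the forward-backward algorithm accomplishes: one forward and one backward pass yield p(h_i | x_1,…,x_L) for every i at once, instead of L separate computations. A sketch with the coin parameters from the earlier slide (the algorithm name is ours, not the deck's):

```python
# States: 0 = Fair, 1 = Loaded; parameters from the coin-tossing slide.
states = [0, 1]
start = [0.5, 0.5]
trans = [[0.9, 0.1], [0.1, 0.9]]
emit = [{"H": 0.5, "T": 0.5}, {"H": 0.75, "T": 0.25}]

def posteriors(obs):
    """Return p(H_i = s | x_1..x_L) for every position i, via forward-backward."""
    n = len(obs)
    # Forward pass: f[i][s] = p(x_1..x_{i+1}, H_{i+1} = s)
    f = [[start[s] * emit[s][obs[0]] for s in states]]
    for x in obs[1:]:
        f.append([sum(f[-1][r] * trans[r][s] for r in states) * emit[s][x]
                  for s in states])
    # Backward pass: b[i][s] = p(x_{i+2}..x_L | H_{i+1} = s)
    b = [[1.0, 1.0]]
    for x in reversed(obs[1:]):
        b.append([sum(trans[s][r] * emit[r][x] * b[-1][r] for r in states)
                  for s in states])
    b.reverse()
    # Posterior = forward * backward / p(x_1..x_L)
    evidence = sum(f[-1])
    return [[f[i][s] * b[i][s] / evidence for s in states] for i in range(n)]

for i, (p_fair, p_loaded) in enumerate(posteriors(list("HHHHTT"))):
    print(i, round(p_loaded, 3))
```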
C-G Islands Example
C-G islands are parts of DNA that are very rich in C and G. The model uses two Markov chains over the nucleotides A, C, G, T, one for regular DNA and one for C-G islands, with a small probability of a "change" transition switching between the two chains.
[Diagram: transition probabilities for the two chains and the change transitions, parameterized by p and q, e.g. (1-p)/4, p/6, q/4.]
C-G Islands Example
As an HMM: the hidden variable H_i indicates whether position i lies inside a C-G island, and the observation X_i is the nucleotide A/C/G/T at that position.