1
Bayan Turki Bagasi
2
Introduction
Generating a Test Sequence
Estimating the State Sequence
Estimating Transition and Emission Matrices
Estimating Posterior State Probabilities
3
You observe a sequence of emissions, but you do not know the sequence of states the model went through to generate them.
4
Analyses of hidden Markov models seek to recover the sequence of states from the observed data.
5
The model uses:
A red die, having six sides, labeled 1 through 6.
A green die, having twelve sides, five of which are labeled 2 through 6, while the remaining seven sides are labeled 1.
6
A weighted red coin, for which the probability of heads is 0.9 and the probability of tails is 0.1.
A weighted green coin, for which the probability of heads is 0.95 and the probability of tails is 0.05.
Together, the dice and coins define a Markov model with two states and six possible emissions.
8
The model creates a sequence of numbers from the set {1, 2, 3, 4, 5, 6} with the following rules:
Begin by rolling the red die and writing down the number that comes up, which is the emission.
Toss the red coin and do one of the following:
If the result is heads, roll the red die and write down the result.
If the result is tails, roll the green die and write down the result.
9
At each subsequent step, you flip the coin that has the same color as the die you rolled in the previous step.
If the coin comes up heads, roll the same die as in the previous step.
If the coin comes up tails, switch to the other die.
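A minimal MATLAB sketch of this dice-and-coins process, simulated directly (the variable names here, such as redDie, greenDie, and nSteps, are illustrative and not from the slides):

% Direct simulation of the generative process described above.
nSteps = 100;
redDie   = 1:6;                        % fair six-sided die
greenDie = [ones(1,7), 2:6];           % twelve-sided die: seven faces show 1, one face each for 2-6
pStayRed   = 0.9;                      % red coin: probability of heads (keep rolling the red die)
pStayGreen = 0.95;                     % green coin: probability of heads (keep rolling the green die)
emissions = zeros(1, nSteps);
state = 1;                             % start with the red die
for k = 1:nSteps
    if state == 1
        emissions(k) = redDie(randi(6));
        if rand > pStayRed, state = 2; end      % tails: switch to the green die
    else
        emissions(k) = greenDie(randi(12));
        if rand > pStayGreen, state = 1; end    % tails: switch back to the red die
    end
end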
10
State transition probabilities:
                 To red die    To green die
From red die        0.9            0.1
From green die      0.05           0.95
11
[Diagram: the red-die and green-die states with their transitions and the possible emissions 1 through 6]
12
The model is not hidden, because you know the sequence of states from the colors of the coins and dice.
13
Given a sequence of emissions:
What is the most likely state path?
How can you estimate the transition and emission probabilities of the model?
What is the forward probability that the model generates a given sequence?
What is the posterior probability that the model is in a particular state at any point in the sequence?
14
hmmgenerate — Generates a sequence of states and emissions from a Markov model
hmmestimate — Calculates maximum likelihood estimates of transition and emission probabilities from a sequence of emissions and a known sequence of states
hmmtrain — Calculates maximum likelihood estimates of transition and emission probabilities from a sequence of emissions
hmmviterbi — Calculates the most probable state path for a hidden Markov model
hmmdecode — Calculates the posterior state probabilities of a sequence of emissions
15
TRANS = [.9 .1; .05 .95];
EMIS = [1/6, 1/6, 1/6, 1/6, 1/6, 1/6; ...
        7/12, 1/12, 1/12, 1/12, 1/12, 1/12];
[seq, states] = hmmgenerate(100, TRANS, EMIS);
hmmgenerate begins in state 1 at step 0.
seq is a 100-element sequence of emissions, each an integer from 1 to 6.
states is the corresponding 100-element sequence of hidden states, each entry 1 or 2.
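For example, you can inspect the first few entries of each output (a small usage note, not from the original slides):

seq(1:10)      % first ten emitted symbols, each an integer from 1 to 6
states(1:10)   % the hidden state (1 or 2) that produced each of those symbols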
16
Use the Viterbi algorithm to compute the most likely sequence of states, given the sequence seq of emissions:
likelystates = hmmviterbi(seq, TRANS, EMIS);
likelystates is a sequence of the same length as seq.
17
Compute the percentage of the actual sequence states that agrees with the sequence likelystates:
sum(states == likelystates)/100
ans = 0.8200
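One illustrative way to locate the disagreements (not from the original slides):

mismatch = find(states ~= likelystates);   % positions where the Viterbi path differs from the true states
numel(mismatch)                            % number of misclassified steps out of 100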
18
Using hmmestimate requires that you know the sequence of states, states, that the model went through to generate seq.
[TRANS_EST, EMIS_EST] = hmmestimate(seq, states)
19
TRANS_EST =
    0.8989    0.1011
    0.0585    0.9415
EMIS_EST =
    0.1721    0.1721    0.1749    0.1612    0.1803    0.1393
    0.5836    0.0741    0.0804    0.0789    0.0726    0.1104
You can compare these outputs with the original transition and emission matrices, TRANS and EMIS.
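A quick, illustrative way to quantify how close the estimates are to the true matrices:

max(abs(TRANS_EST(:) - TRANS(:)))   % largest absolute error in the transition estimates
max(abs(EMIS_EST(:) - EMIS(:)))     % largest absolute error in the emission estimates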
20
Using hmmtrain: if you do not know the sequence of states, but you have initial guesses for TRANS and EMIS:
TRANS_GUESS = [.85 .15; .1 .9];
EMIS_GUESS = [.17 .16 .17 .16 .17 .17; .6 .08 .08 .08 .08 .08];
[TRANS_EST2, EMIS_EST2] = hmmtrain(seq, TRANS_GUESS, EMIS_GUESS)
21
Two factors reduce the reliability of the output matrices of hmmtrain:
The accuracy of the initial guesses TRANS_GUESS and EMIS_GUESS; try different initial guesses.
The length of the sequence seq; a short sequence may not train the matrices well, so use a longer sequence.
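If hmmtrain stops before converging, the toolbox documentation describes optional name-value arguments for the iteration limit and convergence tolerance; a sketch of how they might be used (treat the exact option names and values as assumptions to verify against your MATLAB release):

% 'Maxiterations' and 'Tolerance' are assumed option names; check the hmmtrain docs for your version.
[TRANS_EST2, EMIS_EST2] = hmmtrain(seq, TRANS_GUESS, EMIS_GUESS, ...
    'Maxiterations', 500, 'Tolerance', 1e-6);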
22
The posterior state probabilities of an emission sequence seq are the conditional probabilities that the model is in a particular state when it generates a symbol in seq, given that seq is emitted.
23
PSTATES = hmmdecode(seq, TRANS, EMIS)
The output PSTATES is an M-by-L matrix, where M is the number of states and L is the length of seq. PSTATES(i,j) is the conditional probability that the model is in state i when it generates the jth symbol of seq, given that seq is emitted.
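hmmdecode can also return a second output with the logarithm of the probability of the whole sequence, which addresses the forward-probability question raised earlier; a short usage sketch (the variable names mostLikely and logpseq are illustrative):

[PSTATES, logpseq] = hmmdecode(seq, TRANS, EMIS);   % logpseq is the log of P(seq | model)
[~, mostLikely] = max(PSTATES);                     % state with the highest posterior probability at each step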
24
The most important functions for recognition are:
hmmgenerate: to test the model.
hmmtrain: to estimate the transition and emission probabilities from initial transition and emission matrices.
25
Hidden Markov Models (HMM). (n.d.). Retrieved November 17, 2013, from http://www.mathworks.com/help/stats/hidden-markov-models-hmm.html