1
Hidden Markov Models
CISC 5800
Professor Daniel Leeds
2
Representing sequence data
Examples: spoken language, DNA sequences, daily stock values
Spoken language example: “F?r plu? fi?e is nine”
Between “F” and “r” we expect a vowel: “aw”, “ee”, “ah”; NOT “oh”, “uh”
At the end of “plu” we expect a consonant: “g”, “m”, “s”; NOT “d”, “p”
3
Markov Models
[State-transition diagram: three states (1, 2, 3) whose arrows carry the transition probabilities .5, .3, .2, .8, .1, .9]
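To make the diagram concrete, here is a minimal sketch of a Markov chain in Python. The matrix entries below are an assumption for illustration, since the slide does not say which probability labels which arrow; all that matters is that each row of A sums to 1.

```python
import numpy as np

# Hypothetical 3-state transition matrix: A[i, j] = P(next state = j | current state = i).
# The slide's arrow labels are not recoverable, so these rows are illustrative.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9]])
pi = np.array([1.0, 0.0, 0.0])  # assume the chain starts in state 1 (index 0)

rng = np.random.default_rng(0)
state = int(rng.choice(3, p=pi))
path = [state + 1]                          # report states as 1, 2, 3
for _ in range(9):
    state = int(rng.choice(3, p=A[state]))  # next state depends only on the current one
    path.append(state + 1)
print(path)
```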
4
A dice-y example
Two colored dice: a red die (state s_A) and a blue die (state s_B)
What is the probability we start at s_A?
What is the probability we have the sequence of die choices s_A, s_A?
What is the probability we have the sequence of die choices s_B, s_A, s_B, s_A?
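By the chain rule and the Markov property, each of these factors into the initial-state probabilities π and the transition probabilities A (written symbolically here, since the slides give π via the table on the next slide, but not A numerically):

```latex
\begin{align*}
P(s_A) &= \pi_A = 0.3 \\
P(s_A, s_A) &= \pi_A \, A_{A,A} \\
P(s_B, s_A, s_B, s_A) &= \pi_B \, A_{B,A} \, A_{A,B} \, A_{B,A}
\end{align*}
```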
5
A dice-y example
What is the probability we have the sequence of die choices s_B, s_A, s_B, s_A?
Dynamic programming: find the answer for q_t, then compute q_{t+1}

State\Time   t1    t2    t3
s_A          0.3
s_B          0.7
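A minimal sketch of that dynamic program, assuming NumPy: the state distribution at time t+1 is one matrix-vector product away from the distribution at time t. The initial probabilities (0.3, 0.7) come from the t1 column above; the transition matrix is a placeholder, since the slides do not give one.

```python
import numpy as np

pi = np.array([0.3, 0.7])        # P(q_1 = s_A), P(q_1 = s_B): the t1 column above
A = np.array([[0.6, 0.4],        # placeholder transitions (rows: current state,
              [0.5, 0.5]])       #  columns: next state); not given on the slide

q = pi
for t in (1, 2, 3):
    print(f"t{t}: P(s_A) = {q[0]:.3f}, P(s_B) = {q[1]:.3f}")
    q = q @ A                    # q_{t+1}[j] = sum_i q_t[i] * A[i, j]
```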
6
Hidden Markov Models
[Diagram: a chain of hidden states Q connected by transition probabilities A, with each state emitting an observation O]
7
Deducing the die based on observed “emissions”
Each color is biased: A (red) vs. B (blue)
We see: 5. What is the probability of o=5 | s_B (blue)?
We see: 5, 3. What is the probability of o=5,3 | s=B,B?
What is the probability of o=5,3 AND s=B,B?
What is the MOST probable s | o=5,3?

o   P(o|s_A)   P(o|s_B)
1   .3         .1
2   .2         .1
3   .2         .1
4   .1         .2
5   .1         .2
6   .1         .3
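Reading the needed values off the table (and leaving π and A symbolic, since the slides do not give them numerically here):

```latex
\begin{align*}
P(o{=}5 \mid s_B) &= 0.2 \\
P(o{=}5,3 \mid s{=}B,B) &= P(5 \mid s_B)\, P(3 \mid s_B) = 0.2 \times 0.1 = 0.02 \\
P(o{=}5,3 \text{ and } s{=}B,B) &= \pi_B \, P(5 \mid s_B) \, A_{B,B} \, P(3 \mid s_B) \\
\hat{s} &= \arg\max_{s_1, s_2} \; \pi_{s_1}\, P(5 \mid s_1)\, A_{s_1,s_2}\, P(3 \mid s_2)
\end{align*}
```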
8
Goal: calculate most likely states given observable data
9
10
Parameters in HMM
How do we learn these values?
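The deck works with three parameter sets: initial-state probabilities π, transition probabilities A_{i,j}, and emission probabilities P(o|s). A minimal container for them, filling in π from the t1 column and P(o|s) from the dice table earlier; the values in A are placeholders:

```python
import numpy as np

pi = np.array([0.3, 0.7])                        # initial state probabilities (t1 column)
A = np.array([[0.6, 0.4],                        # transition probabilities; placeholder
              [0.5, 0.5]])                       #  values, not given in the deck
B = np.array([[0.3, 0.2, 0.2, 0.1, 0.1, 0.1],    # P(o | s_A), o = 1..6, from the dice table
              [0.1, 0.1, 0.1, 0.2, 0.2, 0.3]])   # P(o | s_B), o = 1..6

# pi and every row of A and B are probability distributions.
assert np.isclose(pi.sum(), 1.0)
assert np.allclose(A.sum(axis=1), 1.0) and np.allclose(B.sum(axis=1), 1.0)
```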
11
First, assume we know the states
12
Learning HMM parameters: A_{i,j}
First, assume we know the states
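With the states observed, the maximum-likelihood estimate of each transition probability is a normalized count; this is the standard result the slide presumably derives:

```latex
\hat{A}_{i,j} \;=\; \frac{\#\{t : q_t = i,\ q_{t+1} = j\}}{\#\{t : q_t = i\}}
```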
13
First, assume we know the states
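Assuming this slide covers the emission parameters in the same known-states setting, the matching count estimate is:

```latex
\hat{B}_i(o) \;=\; \hat{P}(o \mid s_i) \;=\; \frac{\#\{t : q_t = i,\ o_t = o\}}{\#\{t : q_t = i\}}
```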
14
Challenges in HMM learning
In practice the states are hidden, so the counts above cannot be computed directly
15
Expectation-Maximization, or “EM”
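A compact sketch of one EM (Baum-Welch) iteration for this kind of model, assuming the standard algorithm: the E-step computes forward/backward probabilities and state posteriors; the M-step re-estimates π, A, and B from the resulting expected counts. Function and variable names are illustrative, and no log-space scaling is used, so this is only suitable for short sequences.

```python
import numpy as np

def forward(pi, A, B, obs):
    """alpha[t, i] = P(o_1..o_t, q_t = i); obs holds 0-based observation indices."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

def backward(A, B, obs):
    """beta[t, i] = P(o_{t+1}..o_T | q_t = i)."""
    T, N = len(obs), A.shape[0]
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

def baum_welch_step(pi, A, B, obs):
    """One EM iteration; returns updated (pi, A, B) and the data likelihood."""
    obs = np.asarray(obs)
    alpha, beta = forward(pi, A, B, obs), backward(A, B, obs)
    likelihood = alpha[-1].sum()              # P(o_1..o_T)
    # E-step: state posteriors gamma and transition posteriors xi.
    gamma = alpha * beta / likelihood         # gamma[t, i] = P(q_t = i | o)
    xi = (alpha[:-1, :, None] * A[None, :, :]
          * (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood
    # M-step: re-estimate parameters from expected counts.
    pi_new = gamma[0]
    A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.stack([gamma[obs == o].sum(axis=0) for o in range(B.shape[1])], axis=1)
    B_new /= gamma.sum(axis=0)[:, None]
    return pi_new, A_new, B_new, likelihood
```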
16
Computing states q_t
17
Details of forward and backward probabilities
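For reference, the standard definitions and recursions (with B_i(o) denoting P(o | s_i)); the slide's exact notation may differ:

```latex
\begin{align*}
\alpha_t(i) &= P(o_1, \dots, o_t,\ q_t = i), &
  \alpha_1(i) &= \pi_i\, B_i(o_1), &
  \alpha_{t+1}(j) &= \Big[ \sum_i \alpha_t(i)\, A_{i,j} \Big] B_j(o_{t+1}) \\
\beta_t(i) &= P(o_{t+1}, \dots, o_T \mid q_t = i), &
  \beta_T(i) &= 1, &
  \beta_t(i) &= \sum_j A_{i,j}\, B_j(o_{t+1})\, \beta_{t+1}(j)
\end{align*}
```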
18
E-step: State probabilities
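The E-step state posterior combines the forward and backward quantities:

```latex
\gamma_t(i) \;=\; P(q_t = i \mid o_1, \dots, o_T)
         \;=\; \frac{\alpha_t(i)\, \beta_t(i)}{\sum_j \alpha_t(j)\, \beta_t(j)}
```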
19
Recall: when states known
20
M-step
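The M-step replaces the hard counts from the known-states case with expected counts, using γ_t(i) and the pairwise posterior ξ_t(i,j) = P(q_t = i, q_{t+1} = j | o_1,…,o_T):

```latex
\hat{\pi}_i = \gamma_1(i), \qquad
\hat{A}_{i,j} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}, \qquad
\hat{B}_i(o) = \frac{\sum_{t \,:\, o_t = o} \gamma_t(i)}{\sum_{t=1}^{T} \gamma_t(i)}
```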
21
Review of HMMs in action
For classification, find the highest-probability class given the features
Features for one sound: [q_1, o_1, q_2, o_2, …, q_T, o_T]
[Diagram: a word generates the state sequence Q, which emits observations O: sound1, sound2, sound3]
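Concretely, this amounts to training one HMM per word and choosing the word whose model best explains the observations; writing λ_w for word w's parameters (a standard formulation, assumed here rather than stated on the slide):

```latex
\hat{w} \;=\; \arg\max_w \; P(o_1, \dots, o_T \mid \lambda_w)\, P(w)
```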