CMSC 726: Hidden Markov Models. Material from slides by Sebastian Thrun and Yair Weiss.
Outline
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
Audio Spectrum of the Song of the Prothonotary Warbler
Bird Sounds: Chestnut-sided Warbler, Prothonotary Warbler
Questions One Could Ask
- What bird is this? (time series classification)
- How will the song continue? (time series prediction)
- Is this bird sick? (outlier detection)
- What phases does this song have? (time series segmentation)
Other Sound Samples
Another Time Series Problem: stock prices (Intel, Cisco, General Electric, Microsoft)
Questions One Could Ask
- Will the stock go up or down? (time series prediction)
- What type of stock is this (e.g., risky)? (time series classification)
- Is the behavior abnormal? (outlier detection)
Music Analysis
Questions One Could Ask
- Is this Beethoven or Bach? (time series classification)
- Can we compose more of that? (time series prediction/generation)
- Can we segment the piece into themes? (time series segmentation)
CiteSeer.com (Citation Index): Dave Rumelhart, Takeo Kanade, Tom Mitchell, Raj Reddy, Jim Morris
Questions One Could Ask
- Shall UMD give tenure? (time series classification)
- Shall UMD hire? (time series prediction)
- Shall UMD fire? (outlier detection)
Disclaimer: This is a joke!
The Real Question
- How do we model these problems?
- How do we formulate these questions as inference/learning problems?
Outline For Today
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
Weather: A Markov Model
[State diagram with three states, Sunny, Rainy, Snowy. One consistent reading of the transition probabilities: from Sunny: 80% Sunny, 15% Rainy, 5% Snowy; from Rainy: 60% Rainy, 38% Sunny, 2% Snowy; from Snowy: 20% Snowy, 75% Rainy, 5% Sunny.]
Ingredients of a Markov Model
- States: $S = \{s_1, s_2, \dots, s_N\}$
- State transition probabilities: $a_{ij} = P(q_{t+1} = s_j \mid q_t = s_i)$
- Initial state distribution: $\pi_i = P(q_1 = s_i)$
Ingredients of Our Markov Model
- States: {Sunny, Rainy, Snowy}
- State transition probabilities (reading the diagram as above; rows are the from-state, in the order Sunny, Rainy, Snowy, and spelled out as arrays in the sketch below): $A = ((0.80, 0.15, 0.05), (0.38, 0.60, 0.02), (0.05, 0.75, 0.20))$
- Initial state distribution: e.g., $\pi = (1, 0, 0)$ if the series starts on a sunny day
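To make these ingredients concrete, here is the weather model as numpy arrays; a minimal sketch, assuming the edge-to-number reading of the diagram given above (the variable names are ours, not from the slides):

```python
import numpy as np

# States of the weather Markov model.
states = ["Sunny", "Rainy", "Snowy"]

# Transition matrix: A[i, j] = P(next = states[j] | current = states[i]).
# The row/column assignment follows the reading of the diagram above;
# each row sums to 1.
A = np.array([
    [0.80, 0.15, 0.05],  # from Sunny
    [0.38, 0.60, 0.02],  # from Rainy
    [0.05, 0.75, 0.20],  # from Snowy
])

# Initial state distribution (assumed uniform here for illustration).
pi = np.full(3, 1 / 3)

assert np.allclose(A.sum(axis=1), 1.0)  # each row is a distribution
```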
Probability of a Time Series
- Given a state sequence $q_1, q_2, \dots, q_T$ (e.g., a run of weather states), what is its probability?
- $P(q_1, \dots, q_T) = \pi_{q_1} \prod_{t=2}^{T} a_{q_{t-1} q_t}$ (computed in the sketch below)
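A minimal sketch of this chain-rule product, reusing the weather arrays assumed above; `chain_prob` is our name, not from the slides:

```python
import numpy as np

def chain_prob(seq, pi, A):
    """P(q_1, ..., q_T) = pi[q_1] * prod_t A[q_{t-1}, q_t]."""
    p = pi[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= A[prev, cur]
    return p

A = np.array([[0.80, 0.15, 0.05],
              [0.38, 0.60, 0.02],
              [0.05, 0.75, 0.20]])
pi = np.full(3, 1 / 3)

# Sunny(0), Sunny(0), Rainy(1): 1/3 * 0.80 * 0.15
print(chain_prob([0, 0, 1], pi, A))  # 0.04
```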
Outline For Today
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
Hidden Markov Models
[The same weather state diagram, but the states are now NOT OBSERVABLE. Each state additionally emits one of three observation symbols, with observation probabilities 60%/10%/30%, 65%/5%/30%, and 50%/0%/50% for the three states.]
Ingredients of an HMM
- States: $S = \{s_1, s_2, \dots, s_N\}$
- Observations: $O = \{o_1, o_2, \dots, o_M\}$
- State transition probabilities: $a_{ij} = P(q_{t+1} = s_j \mid q_t = s_i)$
- Initial state distribution: $\pi_i = P(q_1 = s_i)$
- Observation probabilities: $b_i(k) = P(x_t = o_k \mid q_t = s_i)$
Ingredients of Our HMM
- States: {Sunny, Rainy, Snowy} (hidden)
- Observations: three observation symbols (shown as icons on the original slide)
- State transition probabilities: as in the Markov model above
- Initial state distribution: as in the Markov model above
- Observation probabilities: 60%/10%/30%, 65%/5%/30%, and 50%/0%/50% for the three states
Probability of a Time Series
- Given an observation sequence $O = O_1 O_2 \cdots O_T$, what is its probability?
- $P(O) = \sum_{q_1, \dots, q_T} \pi_{q_1} b_{q_1}(O_1) \prod_{t=2}^{T} a_{q_{t-1} q_t} \, b_{q_t}(O_t)$
Calculating Data Likelihood
- Problem: the sum runs over all $N^T$ hidden state sequences, i.e., it is exponential in the length of the series (a brute-force sketch follows)
- Is there a more efficient way?
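To see the blow-up concretely, here is a brute-force sketch that literally sums over all $N^T$ hidden paths; usable only for toy sizes (the function name and array conventions are ours):

```python
import itertools
import numpy as np

def brute_force_likelihood(obs, pi, A, B):
    """P(O) by explicit summation over all N**T hidden paths."""
    N, T = len(pi), len(obs)
    total = 0.0
    for path in itertools.product(range(N), repeat=T):  # N**T terms
        p = pi[path[0]] * B[path[0], obs[0]]
        for t in range(1, T):
            p *= A[path[t - 1], path[t]] * B[path[t], obs[t]]
        total += p
    return total
```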
The Forward Algorithm (1)
[Trellis diagram: states $s_1, s_2, s_3$ unrolled over time, each step emitting one of the observations $O_1, O_2, O_3, \dots$]
Define $\alpha_t(i) = P(O_1 \dots O_t, q_t = s_i)$. Then $\alpha_1(i) = \pi_i \, b_i(O_1)$ and $\alpha_{t+1}(j) = b_j(O_{t+1}) \sum_i \alpha_t(i) \, a_{ij}$, so $P(O) = \sum_i \alpha_T(i)$, computable in $O(N^2 T)$ time. A sketch follows.
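A minimal sketch of the recursion, using the array conventions from earlier (rows of `B` are states, columns are observation symbols):

```python
import numpy as np

def forward(obs, pi, A, B):
    """alpha[t, i] = P(O_1..O_t, q_t = s_i); O(N^2 T) instead of N^T."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # base case
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # recursion
    return alpha                                      # P(O) = alpha[-1].sum()
```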
Question
- Does this solve our problem of calculating $P(O)$? Yes or no?
Answer
- And the answer is... Yes!
Exercise
A two-state model: transitions $s_1 \to s_1$: 0.4, $s_1 \to s_2$: 0.6, $s_2 \to s_1$: 0.7, $s_2 \to s_2$: 0.3; emission probabilities $s_1$: P(A) = 0.2, P(B) = 0.8; $s_2$: P(A) = 0.3, P(B) = 0.7.
What is the probability of observing AB? (Both answers are verified numerically in the sketch below.)
a. Initial state $s_1$: $0.2 \cdot (0.4 \cdot 0.8 + 0.6 \cdot 0.7) = 0.148$
b. Initial state chosen at random: $0.5 \cdot 0.148 + 0.5 \cdot 0.3 \cdot (0.7 \cdot 0.8 + 0.3 \cdot 0.7) = 0.1895$
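The forward pass reproduces both answers; the arrays encode the two-state model as reconstructed above:

```python
import numpy as np

A = np.array([[0.4, 0.6],    # from s1
              [0.7, 0.3]])   # from s2
B = np.array([[0.2, 0.8],    # s1: P(A), P(B)
              [0.3, 0.7]])   # s2: P(A), P(B)
obs = [0, 1]                 # the sequence "AB" (A=0, B=1)

def likelihood(obs, pi, A, B):
    """P(O) via the forward pass."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(likelihood(obs, np.array([1.0, 0.0]), A, B))  # a. 0.148
print(likelihood(obs, np.array([0.5, 0.5]), A, B))  # b. 0.1895
```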
Next Question
- What is the probability that the state at time $t$ was $s_i$, i.e., $P(q_t = s_i \mid O)$?
- Can we answer this with the forward pass alone? No!
The Backward Algorithm (2)
[Trellis diagram as before.]
Define $\beta_t(i) = P(O_{t+1} \dots O_T \mid q_t = s_i)$. Then $\beta_T(i) = 1$ and $\beta_t(i) = \sum_j a_{ij} \, b_j(O_{t+1}) \, \beta_{t+1}(j)$. A sketch follows.
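A minimal sketch, mirroring the forward pass but run right to left:

```python
import numpy as np

def backward(obs, A, B):
    """beta[t, i] = P(O_{t+1}..O_T | q_t = s_i)."""
    T, N = len(obs), A.shape[0]
    beta = np.ones((T, N))                              # base case: beta_T = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])  # recursion
    return beta
```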
The Forward-Backward Algorithm (3)
[Trellis diagram as before.]
Combining the two passes: $P(q_t = s_i \mid O) = \frac{\alpha_t(i) \, \beta_t(i)}{P(O)}$, where $P(O) = \sum_j \alpha_t(j) \, \beta_t(j)$ for any $t$ (sketched below).
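Putting the two passes together; the normalization uses the fact that $\sum_i \alpha_t(i) \beta_t(i) = P(O)$ at every $t$:

```python
import numpy as np

def posteriors(obs, pi, A, B):
    """gamma[t, i] = P(q_t = s_i | O) = alpha_t(i) * beta_t(i) / P(O)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                                # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):                       # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)      # divide by P(O)
```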
Summary (So Far)
For a given HMM, we can compute the data likelihood $P(O)$ and the state posteriors $P(q_t = s_i \mid O)$.
Finding the Best State Sequence
We would like the most likely path (and not just the most likely state at each time slice). The Viterbi algorithm is an efficient method for finding the MPE: define $\delta_t(i) = \max_{q_1 \dots q_{t-1}} P(q_1 \dots q_{t-1}, q_t = s_i, O_1 \dots O_t)$, computed by the recursion $\delta_{t+1}(j) = b_j(O_{t+1}) \max_i \delta_t(i) \, a_{ij}$, and we keep backpointers $\psi_{t+1}(j) = \arg\max_i \delta_t(i) \, a_{ij}$ to reconstruct the path. A sketch follows.
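A minimal sketch of Viterbi; structurally it is the forward pass with the sum replaced by a max, plus backpointers:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most probable state path by dynamic programming."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))           # best path probability ending in state i
    psi = np.zeros((T, N), dtype=int)  # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # scores[i, j]: from i, to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):            # follow backpointers
        path.append(int(psi[t, path[-1]]))
    return path[::-1], float(delta[-1].max())
```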
Outline
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
Hidden Markov Models (recap)
[The same hidden weather diagram as before: states NOT OBSERVABLE, observations emitted with probabilities 60%/10%/30%, 65%/5%/30%, 50%/0%/50%.]
Summary So Far
- HMMs: generative probabilistic models of time series with hidden state
- Forward-backward algorithm: efficient algorithm for calculating $P(O)$ and $P(q_t = s_i \mid O)$
What about learning?
EM
- Problem: find the HMM $\lambda$ that makes the data most likely
- E-Step: compute $P(q_t = s_i \mid O, \lambda)$ for the given model $\lambda$
- M-Step: compute the new model $\lambda$ under these expectations (this is now as easy as estimating a fully observed Markov model)
E-Step
- Calculate $P(q_t = s_i \mid O, \lambda)$ (and the pairwise terms $P(q_t = s_i, q_{t+1} = s_j \mid O, \lambda)$) using the forward-backward algorithm, for a fixed model $\lambda$
The M-Step: generate $\lambda' = (\pi, a, b)$
- $\pi_i$ = expected relative frequency of starting in state $s_i$
- $a_{ij}$ = expected number of transitions from $s_i$ to $s_j$, divided by the expected number of times in $s_i$
- $b_i(k)$ = expected number of times in $s_i$ while observing $o_k$, divided by the expected number of times in $s_i$
Summary (Learning)
- Given observation sequence $O$
- Guess an initial model $\lambda$
- Iterate (a sketch of one iteration follows):
  - Calculate the expected times in state $s_i$ at time $t$ (and in $s_i$ at time $t$ together with $s_j$ at time $t+1$) using the forward-backward algorithm
  - Find the new model $\lambda$ by (expected) frequency counts
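A sketch of one Baum-Welch iteration for a single observation sequence, combining the E- and M-steps above; no smoothing or log-space tricks, so it is illustrative rather than production-ready:

```python
import numpy as np

def baum_welch_step(obs, pi, A, B):
    """One EM iteration: forward-backward E-step, count-ratio M-step."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    # E-step: forward and backward passes.
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    like = alpha[-1].sum()
    gamma = alpha * beta / like                      # P(q_t = i | O)
    # xi[t, i, j] = P(q_t = i, q_{t+1} = j | O)
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :]) / like
    # M-step: re-estimate by expected frequency counts.
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[obs == k].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B, like
```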
Outline For Today
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
Three Problems
- What bird is this? (time series classification)
- How will the song continue? (time series prediction)
- Is this bird abnormal? (outlier detection)
Time Series Classification
- Train one HMM $\lambda_\ell$ for each bird $\ell$
- Given a time series $O$, calculate $P(\ell \mid O) \propto P(O \mid \lambda_\ell) \, P(\ell)$ and pick the most likely bird (sketched below)
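A sketch of the resulting classifier; `models` is a hypothetical dictionary mapping each bird label to its trained $(\pi, a, b)$ triple, and a uniform prior over birds is assumed:

```python
import numpy as np

def likelihood(obs, pi, A, B):
    """P(O | lambda) via the forward pass."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def classify(obs, models):
    """models: {bird_label: (pi, A, B)}; pick the highest-likelihood HMM."""
    return max(models, key=lambda bird: likelihood(obs, *models[bird]))
```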
Outlier Detection
- Train an HMM $\lambda$
- Given a time series $O$, calculate its probability $P(O \mid \lambda)$
- If abnormally low, raise a flag
- If abnormally high, raise a flag as well
Time Series Prediction
- Train an HMM $\lambda$
- Given a time series $O$, calculate the distribution over the final state (via $\alpha_T$), then 'hallucinate' new states and observations according to $a$ and $b$ (sketched below)
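A sketch of this 'hallucination' step: the forward pass gives the posterior over the final state, after which we simply sample forward (function and parameter names are ours):

```python
import numpy as np

def continue_series(obs, pi, A, B, horizon, seed=0):
    """Sample a continuation of an observation sequence."""
    rng = np.random.default_rng(seed)
    alpha = pi * B[:, obs[0]]                  # forward pass ...
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    p_final = alpha / alpha.sum()              # ... gives P(q_T = i | O)
    state = rng.choice(len(pi), p=p_final)
    future = []
    for _ in range(horizon):
        state = rng.choice(A.shape[1], p=A[state])         # next hidden state
        future.append(rng.choice(B.shape[1], p=B[state]))  # sampled observation
    return future
```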
Typical HMM in Speech Recognition
- 20-dimensional frequency space, clustered using EM
- A linear (left-to-right) HMM represents one phoneme
- Use Bayes rule + Viterbi for classification
[Rabiner 86] + everyone else
Typical HMM in Robotics [Blake/Isard 98; Fox/Dellaert et al. 99]
Problems with HMMs
- Zero probabilities (demonstrated in the sketch below)
  - Training sequence: AAABBBAAA
  - Test sequence: AAABBBCAAA
- Finding the "right" number of states and the right structure
- Numerical instabilities
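The zero-probability problem in code: a model whose emission estimates come from AAABBBAAA assigns the symbol C probability 0, so the entire test sequence gets likelihood 0. (Add-one/Laplace smoothing of the counts is one standard remedy; it is mentioned here as an assumption, not something covered on the slide.)

```python
import numpy as np

# Single-state toy model fit to AAABBBAAA (A: 6/9, B: 3/9, C: 0/9).
pi = np.array([1.0])
A = np.array([[1.0]])
B = np.array([[6 / 9, 3 / 9, 0.0]])  # P(A), P(B), P(C) -- C never seen

def likelihood(obs, pi, A, B):
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

test = [0, 0, 0, 1, 1, 1, 2, 0, 0, 0]  # AAABBBCAAA (A=0, B=1, C=2)
print(likelihood(test, pi, A, B))      # 0.0 -- one unseen C kills it
```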
Outline
- Time Series
- Markov Models
- Hidden Markov Models
- Learning HMMs with EM
- Applying HMMs
- Summary
HMMs: Main Lessons
- HMMs: generative probabilistic models of time series (with hidden state)
- Forward-backward: algorithm for computing probabilities over hidden states
- Learning models: EM, which iterates estimation of the hidden states and fitting of the model
- Extremely practical; among the best-known methods in speech, computer vision, robotics, ...
- Numerous extensions exist (continuous observations and states; factorial HMMs; controllable HMMs = POMDPs; ...)