Slide 1: Part 4c: Baum-Welch Algorithm
CSE 717, Spring 2008, CUBS, Univ at Buffalo
Slide 2: Review of Last Class
- Production probability: forward-backward algorithm (dynamic programming)
- Decoding problem: Viterbi algorithm (dynamic programming)
Slide 3: Parameter Estimation in HMM (Known Hidden States)
- Parameters in HMM: initial state probability, state transition probabilities
- State sequence: known
Slide 4: Parameter Estimation in HMM (Unknown Hidden States)
- Parameters in HMM: initial state probability, state transition probabilities
- Possible state sequences are unknown, so the parameters are estimated iteratively with an E-step and an M-step (EM)
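The slide's own equations are not preserved in this transcript. The quantity that EM maximizes in Baum-Welch is usually written as the auxiliary function below (standard textbook notation, an assumption here), where lambda is the current model, lambda-bar the re-estimated model, and q ranges over state sequences:

```latex
Q(\lambda, \bar\lambda) = \sum_{q} P(q \mid O, \lambda)\, \log P(O, q \mid \bar\lambda)
```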
Slide 5: E-Step (Baum-Welch)
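The E-step formulas were images in the original deck and are lost here. In the usual Rabiner-style notation (an assumption, with alpha and beta the forward and backward variables), the expected state occupancy and transition probabilities are:

```latex
\gamma_t(i) = P(q_t = i \mid O, \lambda)
            = \frac{\alpha_t(i)\,\beta_t(i)}{P(O \mid \lambda)}

\xi_t(i,j) = P(q_t = i,\, q_{t+1} = j \mid O, \lambda)
           = \frac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{P(O \mid \lambda)}
```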
Slide 6: M-Step (Baum-Welch)
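The M-step re-estimation formulas are likewise lost from this transcript; the standard updates for the initial and transition probabilities, written with the E-step quantities gamma and xi (same hedged notation as above), are:

```latex
\bar\pi_i = \gamma_1(i)

\bar a_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i,j)}{\sum_{t=1}^{T-1} \gamma_t(i)}
```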
Slide 7: Termination Condition of the Baum-Welch Algorithm
- If the quality measure (the likelihood of the observations under the model) is considerably improved by the updated model, continue with the E/M steps
- Otherwise, stop
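The loop above can be sketched end to end for a discrete HMM. This is a minimal NumPy implementation of my own, not the course's code; the names A, B, pi for the transition matrix, emission matrix, and initial distribution are assumptions:

```python
import numpy as np

def baum_welch(A, B, pi, obs, tol=1e-4, max_iter=100):
    """One-sequence Baum-Welch for a discrete HMM.
    A: (N, N) transitions, B: (N, M) emissions, pi: (N,) initial probs,
    obs: integer observation sequence. Iterates E/M steps until the
    log-likelihood improvement falls below tol (the termination test)."""
    obs = np.asarray(obs)
    T, N = len(obs), len(A)
    prev_ll = -np.inf
    for _ in range(max_iter):
        # E-step: forward pass, alpha[t, i] = P(o_1..o_t, q_t = i).
        alpha = np.zeros((T, N))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        # Backward pass, beta[t, i] = P(o_{t+1}..o_T | q_t = i).
        beta = np.zeros((T, N))
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        likelihood = alpha[-1].sum()
        gamma = alpha * beta / likelihood          # P(q_t = i | O)
        # xi[t, i, j] = P(q_t = i, q_{t+1} = j | O)
        xi = (alpha[:-1, :, None] * A[None, :, :] *
              (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood

        # M-step: re-estimate parameters from the expected counts.
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros_like(B, dtype=float)
        for k in range(B.shape[1]):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]

        # Termination: stop once the quality measure barely improves.
        ll = np.log(likelihood)
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return A, B, pi, ll
```

A real implementation would also rescale alpha and beta per time step to avoid underflow on long sequences; that is omitted here for brevity.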
Slide 8: Multiple Observation Sequences
- A small modification is needed for multiple observation sequences
- For example: a single observation sequence O versus a set of multiple observation sequences
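The "small modification" is usually written by summing expected counts over all L sequences in both numerator and denominator (hedged reconstruction; the slide's own formula is not preserved):

```latex
\bar a_{ij} = \frac{\sum_{l=1}^{L} \sum_{t=1}^{T_l - 1} \xi_t^{(l)}(i,j)}
                   {\sum_{l=1}^{L} \sum_{t=1}^{T_l - 1} \gamma_t^{(l)}(i)}
```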
Slide 9: Updating Observation Likelihood (Discrete HMM)
- The observation likelihood is represented non-parametrically, as a probability table over the discrete symbols
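For the discrete case the standard re-estimate of the emission table (notation assumed, since the slide's formula is lost) counts expected occupancies of state j at the times when symbol v_k was observed:

```latex
\bar b_j(k) = \frac{\sum_{t\,:\,o_t = v_k} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}
```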
Slide 10: Updating Observation Likelihood (Continuous HMM)
- The observation likelihood is represented by a mixture density model
- Each mixture component is a multivariate normal distribution
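A common way to write this mixture density (a hedged reconstruction; c_jk are the mixture weights of state j, and d is the observation dimension) is:

```latex
b_j(o) = \sum_{k=1}^{K} c_{jk}\, \mathcal{N}(o;\, \mu_{jk}, \Sigma_{jk}),
\qquad
\mathcal{N}(o;\, \mu, \Sigma) = (2\pi)^{-d/2} |\Sigma|^{-1/2}
  \exp\!\left(-\tfrac12 (o - \mu)^{\top} \Sigma^{-1} (o - \mu)\right)
```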
Slide 11: E-Step: given the observation O, estimate the current state and mixture-component labeling
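The joint state-and-component posterior is typically written as the state occupancy gamma_t(j) split across components by their responsibilities (assumed notation):

```latex
\gamma_t(j,k) = \gamma_t(j)\,
  \frac{c_{jk}\, \mathcal{N}(o_t;\, \mu_{jk}, \Sigma_{jk})}
       {\sum_{m=1}^{K} c_{jm}\, \mathcal{N}(o_t;\, \mu_{jm}, \Sigma_{jm})}
```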
Slide 12: M-Step: update the parameters of the mixture model
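The standard mixture-parameter updates, weighted by the joint posteriors gamma_t(j,k) from the E-step (again a hedged reconstruction of formulas lost from the transcript), are:

```latex
\bar c_{jk} = \frac{\sum_t \gamma_t(j,k)}{\sum_t \sum_{k'} \gamma_t(j,k')},
\qquad
\bar\mu_{jk} = \frac{\sum_t \gamma_t(j,k)\, o_t}{\sum_t \gamma_t(j,k)},
\qquad
\bar\Sigma_{jk} = \frac{\sum_t \gamma_t(j,k)\,(o_t - \bar\mu_{jk})(o_t - \bar\mu_{jk})^{\top}}
                       {\sum_t \gamma_t(j,k)}
```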
Slide 13: Updating Observation Likelihood (Semi-continuous HMM)
- All states share a single set of component densities for building the mixture model
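Sharing one codebook of densities means the component means and covariances lose their state index; only the mixture weights remain state-specific (assumed notation, consistent with the mixture form above):

```latex
b_j(o) = \sum_{k=1}^{K} c_{jk}\, \mathcal{N}(o;\, \mu_k, \Sigma_k)
```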
Slide 14: E-Step: given the observation O, estimate the current state and mixture-component labeling
Slide 15: M-Step: update the parameters of the mixture model