Savyasachi Singh, Computational NeuroEngineering Lab, March 19, 2008
Introduction
Model Parameters
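In the standard Rabiner notation, an HMM with N hidden states S_1 … S_N and M observation symbols v_1 … v_M is specified by:

```latex
\lambda = (A, B, \pi): \quad
a_{ij} = P(q_{t+1} = S_j \mid q_t = S_i), \quad
b_j(k) = P(o_t = v_k \mid q_t = S_j), \quad
\pi_i = P(q_1 = S_i)
```

For continuous observations, as in the speech homework at the end, each b_j is a Gaussian mixture density instead of a discrete distribution.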
Assumptions
Three basic problems
Evaluation Problem
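The evaluation problem asks for the likelihood of an observation sequence O = o_1 … o_T under a model λ, summed over all possible state sequences:

```latex
P(O \mid \lambda) = \sum_{q_1, \dots, q_T} \pi_{q_1}\, b_{q_1}(o_1) \prod_{t=2}^{T} a_{q_{t-1} q_t}\, b_{q_t}(o_t)
```

Direct enumeration over the N^T state sequences costs O(T N^T) operations; the forward algorithm below computes the same quantity in O(N^2 T).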
Forward Algorithm
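A minimal MATLAB sketch of the scaled forward pass, assuming the frame likelihoods obslik(i,t) = b_i(o_t) are precomputed; the function name and interface are illustrative, not from any particular toolbox:

```matlab
% Scaled forward recursion: after normalization, alpha(:,t) holds
% P(q_t = i | o_1..o_t); the scale factors c(t) recover log P(O | lambda).
function [alpha, c, loglik] = forward_scaled(prior, transmat, obslik)
  [N, T] = size(obslik);                 % N states, T frames
  alpha = zeros(N, T);  c = zeros(1, T);
  alpha(:,1) = prior(:) .* obslik(:,1);  % initialization
  c(1) = sum(alpha(:,1));  alpha(:,1) = alpha(:,1) / c(1);
  for t = 2:T
    alpha(:,t) = (transmat' * alpha(:,t-1)) .* obslik(:,t);   % induction
    c(t) = sum(alpha(:,t));  alpha(:,t) = alpha(:,t) / c(t);  % rescale
  end
  loglik = sum(log(c));                  % termination: log P(O | lambda)
end
```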
Backward Algorithm
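The matching backward pass, again as an illustrative sketch. Each column of beta is renormalized to prevent underflow; this is harmless because the per-frame constants cancel when the posterior γ_t(i) ∝ α_t(i) β_t(i) is normalized:

```matlab
% Scaled backward recursion: beta(:,t) is proportional to
% P(o_{t+1}..o_T | q_t = i); columns are rescaled to avoid underflow.
function beta = backward_scaled(transmat, obslik)
  [N, T] = size(obslik);
  beta = ones(N, T);
  for t = T-1:-1:1
    beta(:,t) = transmat * (obslik(:,t+1) .* beta(:,t+1));  % induction
    beta(:,t) = beta(:,t) / sum(beta(:,t));                 % rescale
  end
end
```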
Decoding Problem
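The decoding problem asks for the single best hidden state sequence given the observations:

```latex
q^{*} = \arg\max_{q_1, \dots, q_T} P(q_1, \dots, q_T \mid O, \lambda)
```

The Viterbi algorithm solves this maximization over whole paths by dynamic programming.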
Viterbi Algorithm
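A log-domain Viterbi sketch in the same illustrative style (working in logs avoids the underflow that plagues the product form):

```matlab
% Viterbi decoding in the log domain: delta(i,t) is the best log-score of
% any path ending in state i at time t; psi stores back-pointers.
function path = viterbi_path(prior, transmat, obslik)
  [N, T] = size(obslik);
  logA = log(transmat);
  delta = zeros(N, T);  psi = zeros(N, T);  path = zeros(1, T);
  delta(:,1) = log(prior(:)) + log(obslik(:,1));
  for t = 2:T
    scores = repmat(delta(:,t-1), 1, N) + logA;  % scores(i,j): arrive at j from i
    [best, arg] = max(scores, [], 1);            % best predecessor for each j
    delta(:,t) = best' + log(obslik(:,t));
    psi(:,t) = arg';
  end
  [~, path(T)] = max(delta(:,T));                % best final state
  for t = T-1:-1:1
    path(t) = psi(path(t+1), t+1);               % backtrack
  end
end
```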
Learning Problem
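The learning problem is to adjust the model parameters so as to maximize the likelihood of the training observations:

```latex
\lambda^{*} = \arg\max_{\lambda} P(O \mid \lambda)
```

There is no closed-form solution; the EM iterations below are guaranteed only to reach a local maximum.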
ML Estimation: EM algorithm
Baum-Welch Algorithm
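Baum-Welch is the EM algorithm specialized to HMMs. Its E-step combines the forward and backward variables into posteriors over states and transitions:

```latex
\gamma_t(i) = \frac{\alpha_t(i)\,\beta_t(i)}{\sum_{j=1}^{N}\alpha_t(j)\,\beta_t(j)},
\qquad
\xi_t(i,j) = \frac{\alpha_t(i)\,a_{ij}\,b_j(o_{t+1})\,\beta_{t+1}(j)}
                  {\sum_{k=1}^{N}\sum_{l=1}^{N}\alpha_t(k)\,a_{kl}\,b_l(o_{t+1})\,\beta_{t+1}(l)}
```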
Re-estimation formulae
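The M-step turns these posteriors into expected counts and re-normalizes (discrete-observation case):

```latex
\bar{\pi}_i = \gamma_1(i),
\qquad
\bar{a}_{ij} = \frac{\sum_{t=1}^{T-1}\xi_t(i,j)}{\sum_{t=1}^{T-1}\gamma_t(i)},
\qquad
\bar{b}_j(k) = \frac{\sum_{t:\,o_t = v_k}\gamma_t(j)}{\sum_{t=1}^{T}\gamma_t(j)}
```

Each Baum-Welch iteration is guaranteed not to decrease P(O | λ).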
Gradient-based method
Practical Pitfalls
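The classic pitfall is numerical underflow: the raw forward values shrink geometrically with t. Per-frame scaling, as used in the forward sketch above, fixes this and yields the log-likelihood directly from the scale factors (here c_t is the sum of the unnormalized values at frame t):

```latex
c_t = \sum_{i=1}^{N} \tilde{\alpha}_t(i),
\qquad
\log P(O \mid \lambda) = \sum_{t=1}^{T} \log c_t
```

Other standard issues are zero counts in A or B after re-estimation and sensitivity of EM to its initialization.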
Limitations
Isolated Word Recognition [block diagram: FEATURE EXTRACTION feeding K word models (HMM Word 1, HMM Word 2, HMM Word 3, …, HMM Word K), followed by SELECT MAXIMUM]
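Each word gets its own HMM λ_k; a test utterance O is scored against all K models and the highest-scoring model wins:

```latex
\hat{k} = \arg\max_{1 \le k \le K} \log P(O \mid \lambda_k)
```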
Typical Implementations
HW 4 part c pseudocode (a runnable sketch of steps 2-4 follows this list)
1. Chop the speech signal into frames and extract features (preferably MFCCs).
2. Choose the HMM parameters: N, M, covariance type, A, etc.
3. Learning procedure, over the training set:
   for each word
       for each state
           initialize the GMMs and get their parameters (use mixgauss_init.m)
       end
       train the HMM with EM (use mhmm_em.m)
   end
4. Testing procedure, over the test set:
   for each test utterance
       compare against all trained models and get a log-likelihood score
           via the forward-backward algorithm (use mhmm_logprob.m)
       select the model with the highest score as the recognized word
   end
5. Tabulate the confusion matrix.
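A minimal MATLAB sketch of steps 2-4 for a single word model, assuming Kevin Murphy's HMM toolbox (the source of mixgauss_init, mhmm_em, and mhmm_logprob) is on the path; mfcc_feats and test_feats are hypothetical placeholders for the features extracted in step 1, and the parameter values are arbitrary:

```matlab
% Train one word model (steps 2-3), assuming mfcc_feats is a cell array of
% [O x T_i] MFCC matrices, one per training utterance of this word.
O = 13;  Q = 5;  M = 3;  cov_type = 'diag';    % step 2: Q states, M mixtures

allframes = cell2mat(mfcc_feats(:)');           % O x (total frames), for init
prior0    = normalise(rand(Q,1));
transmat0 = mk_stochastic(rand(Q,Q));
[mu0, Sigma0] = mixgauss_init(Q*M, allframes, cov_type);
mu0     = reshape(mu0,    [O Q M]);
Sigma0  = reshape(Sigma0, [O O Q M]);
mixmat0 = mk_stochastic(rand(Q,M));
[LL, prior1, transmat1, mu1, Sigma1, mixmat1] = ...
    mhmm_em(mfcc_feats, prior0, transmat0, mu0, Sigma0, mixmat0, 'max_iter', 20);

% Step 4: score a test utterance against this word's model; repeat over all
% K word models and take the argmax as the recognized word (step 5 tallies
% these decisions into the confusion matrix).
loglik = mhmm_logprob(test_feats, prior1, transmat1, mu1, Sigma1, mixmat1);
```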