MLVQ (EM Algorithm)
Speaker: 楊志民
Date:
[Block diagram: system flow. Training: speech → remove DC bias → feature extraction (modules 411.C, Silence.c, Duration.c, Breath.c) → model training → initial models. Testing: test data → feature extraction → speech features → recognition → recognition rate.]
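The "remove DC bias" step is plain mean subtraction on the waveform. A minimal sketch in Python (NumPy assumed; the slide's actual C modules such as 411.C are not reproduced here):

```python
import numpy as np

def remove_dc_bias(waveform: np.ndarray) -> np.ndarray:
    """Remove the DC component by subtracting the sample mean."""
    return waveform - np.mean(waveform)
```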
[Block diagram: model initialization. Initialize VQ → VQ (get mixture means) → Initialize MLVQ → MLVQ (get means, variances, weights, determinants), with a loop over the initial states.]
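The VQ stage is ordinary codebook training (k-means); its codewords serve as the initial mixture means handed to MLVQ. A hedged sketch, with all function and variable names illustrative rather than the author's code:

```python
import numpy as np

def vq_init_means(features: np.ndarray, n_mix: int, n_iter: int = 20, seed: int = 0) -> np.ndarray:
    """Plain VQ (k-means): returns n_mix codewords used as initial mixture means."""
    rng = np.random.default_rng(seed)
    means = features[rng.choice(len(features), n_mix, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assign every feature vector to its nearest codeword (Euclidean distance).
        dist = np.linalg.norm(features[:, None, :] - means[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # Move each codeword to the centroid of its cell (skip empty cells).
        for m in range(n_mix):
            if np.any(labels == m):
                means[m] = features[labels == m].mean(axis=0)
    return means
```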
Mixture Gaussian density function
A mixture of Gaussians can fit virtually any distribution. [Figure: plot of a mixture density f(x) versus x.]
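The slide's formula was lost in extraction; a standard form of the mixture density, with assumed notation ($w_m$ weights, $\mu_m$ means, $\Sigma_m$ covariances), is:

```latex
% Mixture Gaussian density with M components (notation assumed):
f(x) = \sum_{m=1}^{M} w_m \,\mathcal{N}(x;\mu_m,\Sigma_m),
\qquad w_m \ge 0, \quad \sum_{m=1}^{M} w_m = 1
```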
Estimation theory
Let $y$ be the observable (incomplete) data, $X$ the unobserved data, and $\Phi$ the parameter vector. By Bayes' theorem,
$$f(x \mid y, \Phi) = \frac{f(x, y \mid \Phi)}{f(y \mid \Phi)}.$$
Our goal is to maximize the log-likelihood of the observable data,
$$L(\Phi) = \log f(y \mid \Phi) = \log f(x, y \mid \Phi) - \log f(x \mid y, \Phi).$$
We take the conditional expectation of $\log f(y \mid \Phi)$ over $X$, computed with parameter vector $\Phi'$. The following expression is obtained:
$$L(\Phi) = E\big[\log f(X, y \mid \Phi) \,\big|\, y, \Phi'\big] - E\big[\log f(X \mid y, \Phi) \,\big|\, y, \Phi'\big] = Q(\Phi \mid \Phi') - H(\Phi \mid \Phi'),$$
where $Q(\Phi \mid \Phi')$ is the auxiliary Q-function. By Jensen's inequality,
$$H(\Phi \mid \Phi') \le H(\Phi' \mid \Phi').$$
The convergence of the EM algorithm lies in the fact that if we choose $\Phi$ so that $Q(\Phi \mid \Phi') \ge Q(\Phi' \mid \Phi')$, then
$$L(\Phi) - L(\Phi') \ge Q(\Phi \mid \Phi') - Q(\Phi' \mid \Phi') \ge 0,$$
so each iteration cannot decrease the log-likelihood.
Jensen's inequality
The logarithm $f(x) = \log(x)$ is a concave function, so it satisfies the inequality
$$\log(\lambda x_1 + (1 - \lambda) x_2) \ge \lambda \log x_1 + (1 - \lambda) \log x_2, \qquad 0 \le \lambda \le 1.$$
Generalizing the above,
$$\log\Big(\sum_i \lambda_i x_i\Big) \ge \sum_i \lambda_i \log x_i,$$
where the weights must satisfy $\lambda_i \ge 0$ and $\sum_i \lambda_i = 1$.
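A quick numerical check of the inequality (illustrative Python, not from the slides):

```python
import numpy as np

# Numerical check of Jensen's inequality for the concave log:
# log(sum_i lambda_i * x_i) >= sum_i lambda_i * log(x_i).
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 10.0, size=5)   # positive sample points
lam = rng.dirichlet(np.ones(5))      # lambda_i >= 0, sum_i lambda_i = 1
lhs = np.log(np.dot(lam, x))
rhs = np.dot(lam, np.log(x))
assert lhs >= rhs                    # Jensen's inequality holds
print(f"log(E[x]) = {lhs:.4f} >= E[log x] = {rhs:.4f}")
```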
Thus, we can increase $L(\Phi)$ by maximizing the auxiliary Q-function instead of maximizing the log-likelihood directly. The EM method is an iterative method, so we need an initial model: starting from $\Phi^{(0)}$, each maximization step yields a new estimate, $\Phi^{(0)} \to \Phi^{(1)} \to \Phi^{(2)} \to \cdots$.
Steps to implement EM:
1. Initialization: choose an initial estimate $\Phi^{(0)}$ and set $t = 0$.
2. E-step: estimate the unobserved data using the auxiliary Q-function $Q(\Phi \mid \Phi^{(t)})$.
3. M-step: compute $\Phi^{(t+1)}$ to maximize the auxiliary Q-function.
4. Iteration: set $t \leftarrow t + 1$ and repeat from step 2 until convergence.
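Putting the steps together for the Gaussian-mixture case the slides target, here is a minimal diagonal-covariance EM sketch; all names are assumptions for illustration, not the author's implementation:

```python
import numpy as np

def em_gmm(X: np.ndarray, n_mix: int, n_iter: int = 50, seed: int = 0):
    """EM for a diagonal-covariance Gaussian mixture (illustrative sketch).

    E-step: responsibilities of each mixture for each frame under the
            current parameters.
    M-step: closed-form re-estimation of weights, means, and variances
            that maximizes the auxiliary Q-function.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    mu = X[rng.choice(n, n_mix, replace=False)].astype(float)  # initial means (e.g., from VQ)
    var = np.tile(X.var(axis=0), (n_mix, 1))                   # initial variances
    w = np.full(n_mix, 1.0 / n_mix)                            # initial weights

    for _ in range(n_iter):
        # E-step: log of w_m * N(x; mu_m, var_m) for every frame/mixture pair.
        log_p = -0.5 * (((X[:, None, :] - mu[None]) ** 2 / var[None]).sum(-1)
                        + np.log(var).sum(-1)[None] + d * np.log(2 * np.pi))
        log_p += np.log(w)[None]
        log_p -= log_p.max(axis=1, keepdims=True)              # numerical stabilization
        gamma = np.exp(log_p)
        gamma /= gamma.sum(axis=1, keepdims=True)              # responsibilities

        # M-step: update parameters from the responsibilities.
        Nm = gamma.sum(axis=0)                                 # effective counts per mixture
        w = Nm / n
        mu = (gamma.T @ X) / Nm[:, None]
        var = (gamma.T @ (X ** 2)) / Nm[:, None] - mu ** 2
        var = np.maximum(var, 1e-6)                            # variance floor
    return w, mu, var
```

In the flow of these slides, `mu` would be seeded from the VQ codebook shown earlier rather than from random frames.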