

1 MLVQ (EM Algorithm) Speaker: 楊志民 Date: 96.10.4

2 Training and recognition flow:
Training: remove DC bias → feature extraction (411.C, Silence.c, Duration.c, Breath.c) → speech features → train model (starting from initial models).
Recognition: test data → feature extraction → recognize → recognition rate.

3 Initialization loop:
Initialize VQ (initial state) → VQ (get mixture means) → initialize MLVQ → MLVQ (get mean, variance, weight, and determinant).
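The "get mixture means" step is a vector-quantization (k-means/LBG-style) clustering. A minimal sketch in Python, assuming 1-D features for simplicity (the function name and sample data are illustrative, not from the slides):

```python
import random

def vq_means(data, k, iters=20, seed=0):
    """Simple k-means (VQ) to obtain initial mixture means.
    `data` is a list of floats (1-D features for simplicity)."""
    rng = random.Random(seed)
    means = rng.sample(data, k)          # pick k initial codewords
    for _ in range(iters):
        # Assignment step: each point goes to its nearest codeword.
        clusters = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda i: (x - means[i]) ** 2)
            clusters[j].append(x)
        # Update step: move each codeword to its cluster centroid.
        means = [sum(c) / len(c) if c else means[i]
                 for i, c in enumerate(clusters)]
    return sorted(means)

data = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9]
print(vq_means(data, 2))
```

The resulting codewords serve as the initial component means that MLVQ then refines with full mean/variance/weight re-estimation.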

4 Mixture Gaussian density function. A mixture of Gaussians can fit virtually any kind of distribution. (Slide figure: a multi-modal density f(x) plotted over x.)
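For reference, the mixture Gaussian density the slide plots can be written in standard notation (the symbols w_m, μ_m, Σ_m are the usual GMM parameters, not taken from the slide):

```latex
f(x) \;=\; \sum_{m=1}^{M} w_m \,\mathcal{N}(x;\,\mu_m,\Sigma_m),
\qquad w_m \ge 0, \qquad \sum_{m=1}^{M} w_m = 1,
```

where N(x; μ_m, Σ_m) is a Gaussian density with mean μ_m and covariance Σ_m, and the weights w_m form a convex combination.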

5 Estimation theory. Let X be the observable (incomplete) data and Y the complete data. By Bayes' theorem, f(Y | X, Φ) = f(Y | Φ) / f(X | Φ), so log f(X | Φ) = log f(Y | Φ) − log f(Y | X, Φ). Our goal is to maximize the log-likelihood of the observable data, L(Φ) = log f(X | Φ).

6 We take the conditional expectation of log f(Y | Φ) given X, computed with the current parameter vector Φ̄; the following expression (the auxiliary Q-function) is obtained: Q(Φ̄, Φ) = E[ log f(Y | Φ) | X, Φ̄ ].

7 By Jensen's inequality, log f(X | Φ) − log f(X | Φ̄) ≥ Q(Φ̄, Φ) − Q(Φ̄, Φ̄). The convergence of the EM algorithm lies in the fact that if we choose Φ so that Q(Φ̄, Φ) ≥ Q(Φ̄, Φ̄), then log f(X | Φ) ≥ log f(X | Φ̄).

8 Jensen's inequality. The logarithm f(x) = log(x) is a concave function, so it satisfies log(λx₁ + (1 − λ)x₂) ≥ λ log x₁ + (1 − λ) log x₂. Generalizing, log(Σᵢ λᵢ xᵢ) ≥ Σᵢ λᵢ log xᵢ, where the weights must satisfy λᵢ ≥ 0 and Σᵢ λᵢ = 1.
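A quick numeric sanity check of the generalized inequality, with illustrative weights and values:

```python
import math

xs = [1.0, 4.0, 9.0]
lams = [0.2, 0.3, 0.5]          # weights: nonnegative, sum to 1

# Jensen for the concave log: log of the weighted mean
# dominates the weighted mean of the logs.
lhs = math.log(sum(l * x for l, x in zip(lams, xs)))
rhs = sum(l * math.log(x) for l, x in zip(lams, xs))
print(lhs >= rhs)
```

This is exactly the bound the previous slide applies, with the mixture responsibilities playing the role of the weights λᵢ.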

9 Jensen’s inequality

10 Thus, we can increase the log-likelihood by maximizing the auxiliary Q-function at each step. The EM method is an iterative method, and we need an initial model: Q0 → Q1 → Q2 → …, where each step is a maximization.

11 Steps to implement EM:
1. Initialization: choose an initial estimate Φ.
2. E-step: estimate the unobserved data using the auxiliary Q-function.
3. M-step: compute Φ̄ to maximize the auxiliary Q-function.
4. Iteration: set Φ = Φ̄; repeat from step 2 until convergence.
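The four steps above can be sketched for a 1-D, two-component Gaussian mixture. This is an illustrative implementation, not the MLVQ code the slides describe; the crude range-based initialization stands in for the VQ initialization of slide 3, and it assumes the data has nonzero spread:

```python
import math

def gaussian(x, mu, var):
    """1-D Gaussian density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, k=2, iters=50):
    """EM for a 1-D Gaussian mixture, following the slide's loop."""
    # Initialization: spread means over the data range (a real system
    # would use the VQ means instead); broad equal variances and weights.
    lo, hi = min(data), max(data)
    mus = [lo + (hi - lo) * (i + 1) / (k + 1) for i in range(k)]
    vars_ = [((hi - lo) / k) ** 2] * k
    ws = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [ws[j] * gaussian(x, mus[j], vars_[j]) for j in range(k)]
            s = sum(p)
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weight, mean, and variance per component.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            ws[j] = nj / len(data)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            vars_[j] = sum(r[j] * (x - mus[j]) ** 2
                           for r, x in zip(resp, data)) / nj
            vars_[j] = max(vars_[j], 1e-6)   # floor to avoid collapse
    return ws, mus, vars_

data = [0.0, 0.1, -0.1, 5.0, 5.1, 4.9]
ws, mus, vars_ = em_gmm_1d(data)
print(sorted(mus))
```

Each iteration performs one E-step and one M-step; by the Jensen-based bound of slide 7, the log-likelihood is non-decreasing across iterations.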

