Slide 1: Mixture Density Networks
Qiang Lou
Slide 2: Outline
Simple Example, Introduction of MDN, Analysis of MDN, Weights Optimization, Prediction
Slide 3: Simple Example
The inverse problem of x = t + 0.3*sin(2πt).
Input: x, output: t
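The data set for this example can be reproduced with a short NumPy sketch; the sample size, the noise level, and the variable names below are my own assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample the "target" variable t uniformly, then map it through the
# forward function x = t + 0.3*sin(2*pi*t) plus a little additive noise.
N = 1000
t = rng.uniform(0.0, 1.0, size=N)
x = t + 0.3 * np.sin(2 * np.pi * t) + rng.uniform(-0.1, 0.1, size=N)

# For the inverse problem the network sees x as input and must predict t.
# Several values of t map to (almost) the same x, so the mapping x -> t
# is multi-valued.
inputs, targets = x.reshape(-1, 1), t.reshape(-1, 1)
print(inputs.shape, targets.shape)   # (1000, 1) (1000, 1)
```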
Slide 4: Learn from the example
This example exposes a problem of conventional neural networks: the inverse mapping from x to t is multi-valued. The reason is that a network trained by least squares approximates f(x, w*) = E[t|x], the conditional average of the correct target values, and for a multi-valued mapping this average is often not itself a correct solution.
Slide 5: Solution: Mixture Density Networks
An MDN overcomes the limitation above by modelling the full conditional density of the target as a linear combination of kernel functions, p(t|x) = Σ_i α_i(x) φ_i(t|x), with Gaussian kernels φ_i. Three sets of parameters must be determined: the mixing coefficients α_i(x), the means μ_i(x), and the variances σ_i(x).
Slide 6: How to model the parameters?
The parameters are taken from the outputs of a conventional neural network. The mixing coefficients are obtained by a softmax, α_i(x) = exp(z_i^α) / Σ_j exp(z_j^α), so that they are positive and sum to one; the variances are obtained through an exponential, σ_i(x) = exp(z_i^σ), so that they stay positive; and the means can be directly represented by network outputs, μ_ik(x) = z_ik^μ.
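A minimal NumPy sketch of this parameterization, assuming the raw output vector z is laid out as [L coefficient outputs, L variance outputs, L*c mean outputs]; the ordering and names are my assumptions, not the slides'.

```python
import numpy as np

def mdn_parameters(z, L, c):
    """Split a raw MDN output vector z of length (c + 2) * L into
    mixing coefficients, scales, and means (layout is an assumption)."""
    z_alpha = z[:L]                    # L outputs for the mixing coefficients
    z_sigma = z[L:2 * L]               # L outputs for the scale parameters
    z_mu = z[2 * L:].reshape(L, c)     # L * c outputs for the means

    # Softmax keeps the coefficients positive and summing to one.
    e = np.exp(z_alpha - z_alpha.max())
    alpha = e / e.sum()

    # Exponential keeps the scale parameters strictly positive.
    sigma = np.exp(z_sigma)

    # Means are used directly as network outputs.
    mu = z_mu
    return alpha, sigma, mu

alpha, sigma, mu = mdn_parameters(np.random.randn(9), L=3, c=1)
print(alpha.sum())   # 1.0 (up to floating point)
```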
Slide 7: Basic structure of MDN
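The slide's diagram is not reproduced here, so the sketch below only illustrates the general structure under discussion: an ordinary single-hidden-layer feed-forward network whose (c + 2)*L outputs supply the raw values z that are turned into mixture parameters as on the previous slide. The layer sizes and initialization are illustrative assumptions, not the slide's exact architecture.

```python
import numpy as np

class MDN:
    """Single-hidden-layer feed-forward network with (c + 2) * L outputs
    that parameterize a Gaussian mixture (a minimal sketch)."""
    def __init__(self, n_in, n_hidden, L, c, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, ((c + 2) * L, n_hidden))
        self.b2 = np.zeros((c + 2) * L)

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)   # ordinary hidden layer
        return self.W2 @ h + self.b2         # raw outputs z, split as on slide 6

net = MDN(n_in=1, n_hidden=20, L=3, c=1)
print(net.forward(np.array([0.5])).shape)    # (9,) = (1 + 2) * 3
```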
Slide 8: Weights Optimization
As with a conventional neural network, the weights are optimized by maximum likelihood, in practice by minimizing the negative logarithm of the likelihood, E(w) = -Σ_n ln( Σ_i α_i(x_n) φ_i(t_n|x_n) ). Minimizing E(w) is equivalent to maximizing the likelihood.
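A sketch of the per-pattern error E^n, assuming the usual choice of isotropic Gaussian kernels with a single scale σ_i per component; summing E^n over the training set gives E(w). The example values are made up.

```python
import numpy as np

def gaussian_kernel(t, mu, sigma):
    """Isotropic Gaussian phi_i(t|x) with mean mu (length c) and scale sigma."""
    c = t.shape[0]
    norm = (2 * np.pi) ** (c / 2) * sigma ** c
    return np.exp(-np.sum((t - mu) ** 2) / (2 * sigma ** 2)) / norm

def mdn_nll(alpha, sigma, mu, t):
    """Negative log-likelihood of one target t under the mixture:
    E^n = -ln sum_i alpha_i * phi_i(t|x)."""
    mix = sum(a * gaussian_kernel(t, m, s) for a, s, m in zip(alpha, sigma, mu))
    return -np.log(mix + 1e-12)   # small constant guards against log(0)

alpha = np.array([0.5, 0.3, 0.2])
sigma = np.array([0.1, 0.2, 0.3])
mu = np.array([[0.2], [0.5], [0.8]])
print(mdn_nll(alpha, sigma, mu, np.array([0.45])))
```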
Slide 9: Weights Optimization
The derivatives of E(w) with respect to the network outputs are obtained with the chain rule and propagated through the network by back-propagation. To start off the algorithm it is convenient to introduce the posterior probability (responsibility) of each mixture component, π_i(x, t) = α_i φ_i / Σ_j α_j φ_j; the derivative with respect to the softmax inputs, for example, is ∂E^n/∂z_i^α = α_i - π_i.
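The sketch below computes the responsibilities π_i and the gradient α_i - π_i with respect to the softmax inputs for one pattern, and checks it against a finite-difference derivative. The setup values and function names are arbitrary examples of mine.

```python
import numpy as np

def nll_from_z_alpha(z_alpha, sigma, mu, t):
    """NLL of one pattern as a function of the raw softmax inputs z_alpha."""
    alpha = np.exp(z_alpha - z_alpha.max())
    alpha = alpha / alpha.sum()
    c = t.shape[0]
    phi = np.array([np.exp(-np.sum((t - m) ** 2) / (2 * s ** 2))
                    / ((2 * np.pi) ** (c / 2) * s ** c)
                    for m, s in zip(mu, sigma)])
    return -np.log(np.dot(alpha, phi)), alpha, phi

z_alpha = np.array([0.1, -0.3, 0.4])
sigma = np.array([0.1, 0.2, 0.3])
mu = np.array([[0.2], [0.5], [0.8]])
t = np.array([0.45])

E, alpha, phi = nll_from_z_alpha(z_alpha, sigma, mu, t)
pi = alpha * phi / np.dot(alpha, phi)     # responsibilities pi_i(x, t)
grad_analytic = alpha - pi                # dE/dz_alpha_i = alpha_i - pi_i

# Finite-difference check of the analytic gradient.
eps = 1e-6
grad_numeric = np.zeros_like(z_alpha)
for i in range(len(z_alpha)):
    zp, zm = z_alpha.copy(), z_alpha.copy()
    zp[i] += eps
    zm[i] -= eps
    grad_numeric[i] = (nll_from_z_alpha(zp, sigma, mu, t)[0]
                       - nll_from_z_alpha(zm, sigma, mu, t)[0]) / (2 * eps)
print(np.allclose(grad_analytic, grad_numeric, atol=1e-5))   # True
```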
Slide 10: Prediction
General way: take the conditional average of the target data, E[t|x] = Σ_i α_i(x) μ_i(x). More accurate way for multi-valued problems: take the mean of the most probable component, μ_k, where k = arg max_i α_i(x).
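Both prediction rules are easy to state in code once the mixture parameters for a given x are available; the parameter values below are made-up examples.

```python
import numpy as np

def predict_conditional_mean(alpha, mu):
    """General way: E[t|x] = sum_i alpha_i(x) * mu_i(x)."""
    return np.sum(alpha[:, None] * mu, axis=0)

def predict_most_probable(alpha, mu):
    """More accurate way: the mean of the component with the
    largest mixing coefficient."""
    return mu[np.argmax(alpha)]

alpha = np.array([0.1, 0.7, 0.2])
mu = np.array([[0.2], [0.5], [0.9]])
print(predict_conditional_mean(alpha, mu))   # weighted average of the means
print(predict_most_probable(alpha, mu))      # [0.5], mean of the dominant component
```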
Slide 11: Results of example
Slide 12: Problems
The number of outputs of the MDN grows quickly. Assume the mixture model has L components and the corresponding conventional network would need K outputs (the dimension of the target). The MDN then needs (K + 2) * L outputs: L mixing coefficients, L variances, and K * L means. For example, with K = 1 and L = 3 the MDN already has 9 output units instead of 1.
Slide 13: Thank you!