Slide 1: A gentle introduction to the Gaussian distribution
Slide 2: Review - Random variable. In the coin flip experiment, X = 0 for tails and X = 1 for heads; X is a random variable.
Slide 3: Review - Probability mass function (discrete). Example: coin flip experiment, with P(x) defined on x in {0, 1} and P(x) >= 0. Any other constraints? Hint: what is the sum? (The probabilities must sum to 1.)
Slide 4: Review - Probability density function (continuous). f(x) >= 0. Examples? Unlike the discrete case, the density does not itself represent a probability; it is the rate of change of cumulative probability, and its value at a point is called the "likelihood".
Slide 5: Review - Probability density function (continuous). f(x) >= 0, and for a small interval P(x_0 < x < x_0 + dx) = f(x_0) dx. But P(x = x_0) = 0 for any single point, and the density integrates to 1.0.
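The relation P(x_0 < x < x_0 + dx) ≈ f(x_0) dx can be checked numerically for a standard Gaussian. A minimal sketch using only the standard library (the helper names and the values of x_0 and dx are illustrative, not from the slides):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # density of N(mu, sigma^2) at x
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    # cumulative probability P(X <= x), via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

x0, dx = 0.5, 1e-4
exact = normal_cdf(x0 + dx) - normal_cdf(x0)   # P(x0 < X < x0 + dx)
approx = normal_pdf(x0) * dx                   # f(x0) * dx
print(exact, approx)                           # the two agree to many digits
```

As dx shrinks, the two numbers agree to more digits, while P(x = x_0) itself is exactly zero.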
Slide 6: The Gaussian Distribution. Courtesy: http://research.microsoft.com/~cmbishop/PRML/index.htm
Slide 7: A 2D Gaussian
Slide 8: Central Limit Theorem. The distribution of the sum of N i.i.d. random variables becomes increasingly Gaussian as N grows. Example: N uniform [0, 1] random variables.
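The uniform [0, 1] example can be simulated directly: the mean of N uniforms has expectation 0.5 and standard deviation 1/sqrt(12N), and its histogram looks increasingly bell-shaped as N grows. A quick stdlib sketch (sample sizes are arbitrary choices for illustration):

```python
import random
import statistics

random.seed(0)

def mean_of_uniforms(n, trials=20000):
    # draw `trials` samples of the mean of n Uniform[0,1] variables
    return [sum(random.random() for _ in range(n)) / n for _ in range(trials)]

for n in (1, 2, 10):
    samples = mean_of_uniforms(n)
    # mean stays at 0.5; spread shrinks like 1/sqrt(12 n)
    print(n, statistics.mean(samples), statistics.stdev(samples))
```

For n = 1 the samples are flat (uniform); already at n = 10 the empirical distribution is close to a Gaussian with standard deviation 1/sqrt(120) ≈ 0.091.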
Slide 9: Central Limit Theorem (coin flip). Flip a coin N times. Each outcome has an associated random variable X_i (= 1 if heads, otherwise 0). The number of heads N_H is itself a random variable, the sum of N i.i.d. random variables: N_H = X_1 + X_2 + ... + X_N.
Slide 10: Central Limit Theorem (coin flip). Probability mass function of N_H with P(head) = 0.5 (fair coin), shown for N = 5, N = 10, and N = 40.
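The PMF of N_H is the binomial distribution, which can be computed exactly and compared across the three values of N from the slide. A sketch (function name is my own):

```python
from math import comb

def binom_pmf(n, k, p=0.5):
    # P(N_H = k) for n coin flips with P(head) = p
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

for n in (5, 10, 40):
    pmf = [binom_pmf(n, k) for k in range(n + 1)]
    # the peak probability falls as the bell widens and smooths out
    print(n, max(pmf))
```

Plotting these three PMFs reproduces the slide's picture: the n = 40 curve is already visually indistinguishable from a Gaussian with mean n/2 and variance n/4.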
Slide 11: Geometry of the Multivariate Gaussian
Slide 12: Moments of the Multivariate Gaussian (1). In the derivation of the mean, the term linear in z = x - mu vanishes thanks to the anti-symmetry of the integrand in z, giving E[x] = mu.
Slide 13: Moments of the Multivariate Gaussian (2)
Slide 14: Maximum likelihood. Fit a probability density model p(x | theta) to the data, i.e., estimate theta. Given independent identically distributed (i.i.d.) data X = (x_1, x_2, ..., x_N), the likelihood is p(X | theta) = prod_n p(x_n | theta), and the log likelihood is ln p(X | theta) = sum_n ln p(x_n | theta). Maximum likelihood: maximize ln p(X | theta) w.r.t. theta.
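The recipe on this slide - sum the per-point log densities, then maximize over theta - can be sketched for a 1-D Gaussian with known variance. The data values and the grid search are made up for illustration; a real implementation would use the closed-form solution or a proper optimizer:

```python
import math

def normal_pdf(x, mu, sigma):
    # density of N(mu, sigma^2) at x
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

data = [1.2, 0.8, 1.0, 1.4, 0.6]  # hypothetical i.i.d. sample

def log_likelihood(mu, sigma=1.0):
    # ln p(X | mu) = sum_n ln p(x_n | mu)
    return sum(math.log(normal_pdf(x, mu, sigma)) for x in data)

# crude grid search for the maximizing mu
best_mu = max((m / 100 for m in range(-300, 300)), key=log_likelihood)
print(best_mu)  # lands on the sample mean, 1.0
```

The grid maximum coincides with the sample mean, previewing the closed-form result on the next slides.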
Slide 15: Maximum Likelihood for the Gaussian (1). Given i.i.d. data, the log likelihood function is ln p(X | mu, Sigma) = -(ND/2) ln(2 pi) - (N/2) ln |Sigma| - (1/2) sum_n (x_n - mu)^T Sigma^{-1} (x_n - mu). The sufficient statistics are sum_n x_n and sum_n x_n x_n^T.
Slide 16: Maximum Likelihood for the Gaussian (2). Set the derivative of the log likelihood function to zero and solve to obtain mu_ML = (1/N) sum_n x_n. Similarly, Sigma_ML = (1/N) sum_n (x_n - mu_ML)(x_n - mu_ML)^T.
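In one dimension the two ML estimators reduce to the sample mean and the (biased, divide-by-N) sample variance. A minimal check on a made-up sample:

```python
data = [1.2, 0.8, 1.0, 1.4, 0.6]  # hypothetical 1-D sample
N = len(data)

mu_ml = sum(data) / N                                   # (1/N) sum_n x_n
var_ml = sum((x - mu_ml) ** 2 for x in data) / N        # note: divides by N, not N - 1

print(mu_ml, var_ml)
```

Dividing by N rather than N - 1 is what makes the ML variance estimate biased; the correction factor N/(N - 1) recovers the unbiased estimator.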
Slide 17: Mixtures of Gaussians (1). Old Faithful data set: a single Gaussian vs. a mixture of two Gaussians.
Slide 18: Mixtures of Gaussians (2). Combine simple models into a complex model: p(x) = sum_{k=1..K} pi_k N(x | mu_k, Sigma_k), where each N(x | mu_k, Sigma_k) is a component and pi_k is its mixing coefficient (K = 3 in the illustration).
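A K = 3 mixture density in one dimension is just a weighted sum of component densities; as long as the weights are non-negative and sum to 1, the mixture still integrates to 1. The parameter values below are invented for illustration:

```python
import math

def normal_pdf(x, mu, sigma):
    # density of N(mu, sigma^2) at x
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# hypothetical K = 3 mixture: mixing coefficients must be >= 0 and sum to 1
weights = [0.5, 0.3, 0.2]
means = [-2.0, 0.0, 3.0]
sigmas = [0.5, 1.0, 0.8]

def mixture_pdf(x):
    # p(x) = sum_k pi_k * N(x | mu_k, sigma_k^2)
    return sum(w * normal_pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

print(mixture_pdf(0.0))
```

Plotting mixture_pdf over a range of x shows the multi-modal shapes a single Gaussian cannot represent.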
Slide 19: Mixtures of Gaussians (3)
Slide 20: Mixtures of Gaussians (4). Determining the parameters mu, Sigma, and pi by maximizing the log likelihood. The log likelihood contains the log of a sum, so it has no closed-form maximum. Solution: use standard iterative numeric optimization methods, or the expectation-maximization (EM) algorithm (Chapter 9).
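The EM iteration the slide points to can be sketched for a two-component 1-D mixture: the E-step computes each point's responsibilities under the current parameters, and the M-step re-estimates the parameters from those responsibilities. The synthetic data, initial guesses, and iteration count below are all illustrative choices, not from the slides:

```python
import math
import random

random.seed(1)

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# synthetic data from two well-separated Gaussians (assumed example)
data = [random.gauss(-3, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(200)]

# initial guesses for pi_k, mu_k, sigma_k
pi_k = [0.5, 0.5]
mu_k = [-1.0, 1.0]
sig_k = [1.0, 1.0]

for _ in range(50):
    # E-step: responsibility of each component for each point
    gamma = []
    for x in data:
        weighted = [pi_k[k] * normal_pdf(x, mu_k[k], sig_k[k]) for k in range(2)]
        total = sum(weighted)
        gamma.append([w / total for w in weighted])
    # M-step: re-estimate parameters from the responsibilities
    for k in range(2):
        Nk = sum(g[k] for g in gamma)
        mu_k[k] = sum(g[k] * x for g, x in zip(gamma, data)) / Nk
        sig_k[k] = math.sqrt(sum(g[k] * (x - mu_k[k]) ** 2 for g, x in zip(gamma, data)) / Nk)
        pi_k[k] = Nk / len(data)

print(sorted(mu_k))  # should be near [-3, 3]
```

Each iteration never decreases the log likelihood; the full treatment, including the general M-step formulas, is in Chapter 9 of Bishop's PRML.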
Slide 21: Thank you!