Kernel Methods Part 2
Bing Han
June 26, 2008
Local Likelihood Logistic Regression
Logistic regression models are usually fit by maximum likelihood. We denote the probabilities p_k(x; θ) = Pr(G = k | X = x; θ); after a simple calculation, the log-likelihood can be written in a compact form (see the reconstruction below).
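The equations on this slide were images and did not survive extraction. The standard maximum-likelihood setup they presumably refer to (the deck appears to follow Hastie, Tibshirani and Friedman, The Elements of Statistical Learning) is reconstructed here as an assumption:

$$ l(\theta) = \sum_{i=1}^{N} \log p_{g_i}(x_i; \theta), \qquad p_k(x_i; \theta) = \Pr(G = k \mid X = x_i; \theta), $$

and in the two-class case, coding the response y_i \in \{0, 1\},

$$ l(\beta) = \sum_{i=1}^{N} \Big\{ y_i \beta^T x_i - \log\big(1 + e^{\beta^T x_i}\big) \Big\}. $$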
Local Likelihood. The data have features x_i and class labels in {1, 2, …, J}. The linear model is given below.
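The linear model itself was an image; the usual J-class logistic form it refers to is, presumably,

$$ \Pr(G = j \mid X = x) = \frac{e^{\beta_{j0} + \beta_j^T x}}{1 + \sum_{k=1}^{J-1} e^{\beta_{k0} + \beta_k^T x}}, \qquad j = 1, \dots, J - 1, $$

with the last class J serving as the reference:

$$ \Pr(G = J \mid X = x) = \frac{1}{1 + \sum_{k=1}^{J-1} e^{\beta_{k0} + \beta_k^T x}}. $$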
Local Likelihood. Local logistic regression fits this model by maximizing a kernel-weighted log-likelihood at each query point x_0. The local log-likelihood for this J-class model is given below.
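The local log-likelihood was an image; its standard form (a reconstruction, not verbatim from the slide) is

$$ l(\beta(x_0)) = \sum_{i=1}^{N} K_\lambda(x_0, x_i) \Big\{ \beta_{g_i 0}(x_0) + \beta_{g_i}(x_0)^T (x_i - x_0) - \log\Big[ 1 + \sum_{k=1}^{J-1} \exp\big( \beta_{k0}(x_0) + \beta_k(x_0)^T (x_i - x_0) \big) \Big] \Big\}, $$

where g_i denotes the class of observation i and K_\lambda is the smoothing kernel centered at the query point x_0.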
Kernel Density Estimation
Suppose we have a random sample x_1, x_2, …, x_N drawn from a probability density f_X, and we want to estimate f_X at a point x_0. A natural local estimate counts the fraction of observations falling in a small neighborhood of x_0; the smooth Parzen estimate replaces this count with kernel weights (see below).
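The two estimates appeared as images; reconstructed from the standard definitions:

$$ \hat f_X(x_0) = \frac{\#\{\, x_i \in \mathcal{N}(x_0) \,\}}{N \lambda} \qquad \text{and} \qquad \hat f_X(x_0) = \frac{1}{N \lambda} \sum_{i=1}^{N} K_\lambda(x_0, x_i), $$

where \mathcal{N}(x_0) is a small metric neighborhood of width \lambda around x_0; the second, smooth Parzen estimate spreads each observation's contribution with the kernel K_\lambda.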
Kernel Density Estimation. A popular choice is the Gaussian kernel. In p dimensions, a natural generalization of the Gaussian density estimate uses the Gaussian product kernel (see below).
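The formulas were images; assuming the standard Gaussian-kernel forms,

$$ K_\lambda(x_0, x) = \phi\!\left( \frac{|x - x_0|}{\lambda} \right), \qquad \hat f_X(x) = \frac{1}{N} \sum_{i=1}^{N} \phi_\lambda(x - x_i), $$

and, with the Gaussian product kernel in \mathbb{R}^p,

$$ \hat f_X(x_0) = \frac{1}{N (2 \lambda^2 \pi)^{p/2}} \sum_{i=1}^{N} e^{- \frac{1}{2} \left( \| x_i - x_0 \| / \lambda \right)^2}. $$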
Kernel Density Classification. Fit a nonparametric density estimate f̂_j(X) separately in each class, estimate the class priors π̂_j (typically the sample proportions), and combine them by Bayes' theorem (see below).
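The Bayes'-theorem formula, reconstructed:

$$ \widehat{\Pr}(G = j \mid X = x_0) = \frac{\hat\pi_j \, \hat f_j(x_0)}{\sum_{k=1}^{J} \hat\pi_k \, \hat f_k(x_0)}. $$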
Kernel Density Classification
Naïve Bayes Classifier. Assume that, given a class G = j, the features X_k are independent.
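Written out, the independence assumption is

$$ f_j(X) = \prod_{k=1}^{p} f_{jk}(X_k), $$

so each marginal density f_{jk} can be estimated separately, for example by a one-dimensional kernel density estimate.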
Naïve Bayes Classifier. Taking logits of the resulting posterior probabilities yields a generalized additive model (see below).
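The derivation was an image; under the naïve Bayes assumption the logit (relative to a reference class J) has the standard additive form

$$ \log \frac{\Pr(G = \ell \mid X)}{\Pr(G = J \mid X)} = \log \frac{\pi_\ell f_\ell(X)}{\pi_J f_J(X)} = \log \frac{\pi_\ell}{\pi_J} + \sum_{k=1}^{p} \log \frac{f_{\ell k}(X_k)}{f_{J k}(X_k)} = \alpha_\ell + \sum_{k=1}^{p} g_{\ell k}(X_k). $$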
The resulting model is similar in form to a logistic regression model, although the two are fit in quite different ways.
Radial Basis Functions. Functions can be represented as expansions in basis functions. Radial basis function methods treat kernel functions as basis functions, each indexed by its own location ξ_j and scale λ_j. This leads to the model shown below.
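The model was an image; the usual radial basis expansion is

$$ f(x) = \sum_{j=1}^{M} K_{\lambda_j}(\xi_j, x) \, \beta_j = \sum_{j=1}^{M} D\!\left( \frac{\| x - \xi_j \|}{\lambda_j} \right) \beta_j, $$

where D is typically the standard Gaussian density.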
Method of learning the parameters: optimize the sum of squares with respect to all of the parameters (prototypes, scales, and coefficients), as shown below.
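A reconstruction of the least-squares criterion the slide refers to, with Gaussian radial basis functions:

$$ \min_{\{\lambda_j, \xi_j, \beta_j\}_{1}^{M}} \; \sum_{i=1}^{N} \left( y_i - \beta_0 - \sum_{j=1}^{M} \beta_j \exp\!\left\{ - \frac{(x_i - \xi_j)^T (x_i - \xi_j)}{\lambda_j^2} \right\} \right)^{2}. $$

This criterion is non-convex with multiple local minima, and is typically optimized with the same kinds of algorithms used for neural networks.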
Radial Basis Functions. Reducing the parameter set, for example by assuming a single constant scale λ_j = λ, can produce an undesirable effect: regions of the input space where none of the kernels has appreciable support ("holes"). Renormalized radial basis functions avoid this problem.
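The renormalized basis functions, reconstructed from the standard definition:

$$ h_j(x) = \frac{ D\!\left( \| x - \xi_j \| / \lambda \right) }{ \sum_{k=1}^{M} D\!\left( \| x - \xi_k \| / \lambda \right) }. $$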
Radial Basis Functions
Mixture models. The Gaussian mixture model for density estimation is given below. In general, mixture models can use any component densities; the Gaussian mixture model is the most popular.
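The mixture density was an image; its standard form is

$$ f(x) = \sum_{m=1}^{M} \alpha_m \, \phi(x; \mu_m, \boldsymbol{\Sigma}_m), \qquad \sum_{m=1}^{M} \alpha_m = 1, $$

where the \alpha_m are the mixing proportions and \phi denotes the Gaussian density with the indicated mean and covariance.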
Mixture models. If the covariance matrices are constrained to be scalar, Σ_m = σ_m² I, the mixture has the form of a radial basis expansion. If, in addition, σ_m = σ is held fixed and M grows to N with one component centered at each observation (α_m = 1/N, μ_m = x_m), the maximum-likelihood fit approaches a kernel density estimate (see below).
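Under those constraints (a reconstruction of the limiting case), the fitted mixture becomes exactly the Gaussian kernel density estimate:

$$ \hat f(x) = \frac{1}{N} \sum_{m=1}^{N} \phi(x; x_m, \sigma^2 \mathbf{I}). $$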
Mixture models. The parameters are usually fit by maximum likelihood, for example with the EM algorithm. The mixture model also provides an estimate of the probability that observation i belongs to component m (the responsibility), shown below.
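The responsibility formula was an image; its standard form is

$$ \hat r_{im} = \frac{ \hat\alpha_m \, \phi(x_i; \hat\mu_m, \hat{\boldsymbol{\Sigma}}_m) }{ \sum_{k=1}^{M} \hat\alpha_k \, \phi(x_i; \hat\mu_k, \hat{\boldsymbol{\Sigma}}_k) }. $$

As a practical illustration (not part of the original slides), a minimal sketch using scikit-learn's GaussianMixture, which fits the model by EM and exposes these responsibilities through predict_proba:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Toy 1-D sample drawn from two well-separated Gaussians.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 0.7, 200),
                        rng.normal(3.0, 1.0, 300)]).reshape(-1, 1)

    # Fit a two-component Gaussian mixture by maximum likelihood (EM).
    gmm = GaussianMixture(n_components=2, random_state=0).fit(x)

    print(gmm.weights_)             # mixing proportions alpha_m
    print(gmm.means_.ravel())       # component means mu_m
    resp = gmm.predict_proba(x)     # responsibilities r_im, shape (N, 2)
    logdens = gmm.score_samples(x)  # log of the fitted mixture density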
Questions?