LDA (Linear Discriminant Analysis) ShaLi
Limitation of PCA The direction of maximum variance is not always good for classification
Limitation of PCA There are better directions that support classification tasks. LDA tries to find the best direction to separate the classes.
Idea of LDA
Find the $w$ that maximizes the distance between the projected class means, $|\tilde m_1 - \tilde m_2|^2$, while minimizing the scatter of the projected data within each class, $\tilde s_1^2 + \tilde s_2^2$.
Limitations of LDA If the class distributions are significantly non-Gaussian, the LDA projection may not preserve complex structure in the data that is needed for classification. LDA will also fail if the discriminatory information is not in the means but in the variances of the data.
LDA for two classes and $k=1$ Compute the class means: $m_i = \frac{1}{N_i}\sum_{x \in C_i} x$. Projected class means: $\tilde m_i = w^T m_i$. Difference between projected class means: $\tilde m_1 - \tilde m_2 = w^T (m_1 - m_2)$.
LDA for two classes and $k=1$ Scatter of the projected data in the 1-dimensional space: $\tilde s_i^2 = \sum_{y \in C_i} (y - \tilde m_i)^2$, where $y = w^T x$.
Objective function Find the $w$ that maximizes $|\tilde m_1 - \tilde m_2|^2$ while minimizing $\tilde s_1^2 + \tilde s_2^2$. LDA does this by maximizing the Fisher criterion: $J(w) = \dfrac{|\tilde m_1 - \tilde m_2|^2}{\tilde s_1^2 + \tilde s_2^2}$.
Objective function—Numerator We can rewrite $|\tilde m_1 - \tilde m_2|^2 = w^T S_B w$, where $S_B = (m_1 - m_2)(m_1 - m_2)^T$ is the between-class scatter matrix.
Objective function—Denominator We can rewrite $\tilde s_1^2 + \tilde s_2^2 = w^T S_W w$, where $S_W = \sum_{i=1}^{2} \sum_{x \in C_i} (x - m_i)(x - m_i)^T$ is the within-class scatter matrix.
Objective function Putting it all together: $J(w) = \dfrac{w^T S_B w}{w^T S_W w}$, where $S_B$ is the between-class scatter and $S_W$ is the within-class scatter. Maximize $J(w)$ by setting its first derivative with respect to $w$ to zero; this yields the generalized eigenvalue problem $S_W^{-1} S_B\, w = J(w)\, w$, whose two-class solution is $w \propto S_W^{-1}(m_1 - m_2)$.
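The two-class solution $w \propto S_W^{-1}(m_1 - m_2)$ can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the slides; the function name and normalization are my own choices:

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher LDA direction: w proportional to S_W^{-1} (m1 - m2).

    X1, X2: (n_i, d) arrays of samples from class 1 and class 2.
    Returns a unit-norm projection direction w.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter S_W = sum over classes of (x - m_i)(x - m_i)^T
    Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    # Solve S_W w = (m1 - m2) instead of explicitly inverting S_W
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)
```

On data where the classes are separated along one axis, the returned direction aligns with that axis, and the projected class means are far apart relative to the projected scatter.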
Extension to $K>1$ For $K=1$, $w \propto S_W^{-1}(m_1 - m_2)$. For $K>1$, take the $k$ eigenvectors of $S_W^{-1} S_B$ with the largest eigenvalues as the columns of $W$ (with more than two classes, $S_B = \sum_i N_i (m_i - m)(m_i - m)^T$, where $m$ is the overall mean). Transform the data onto the new subspace: $Y = X W$, where $X$ is $n \times d$, $W$ is $d \times k$, and $Y$ is $n \times k$.
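The multi-class extension above can be sketched as an eigendecomposition of $S_W^{-1} S_B$. A NumPy sketch under the standard formulation (the function name is my own, and the multi-class $S_B$ uses the class-size weighting stated above):

```python
import numpy as np

def lda_projection(X, y, k):
    """Multi-class LDA: project X onto the top-k eigenvectors of S_W^{-1} S_B.

    X: (n, d) data, y: (n,) integer class labels, k: target dimension.
    Returns Y = X W of shape (n, k).
    """
    d = X.shape[1]
    m = X.mean(axis=0)                      # overall mean
    Sw = np.zeros((d, d))                   # within-class scatter
    Sb = np.zeros((d, d))                   # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - m).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T       # N_c (m_c - m)(m_c - m)^T
    # Eigenvectors of S_W^{-1} S_B, sorted by decreasing eigenvalue
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    W = evecs[:, order[:k]].real            # d x k projection matrix
    return X @ W                            # n x k transformed data
```

Note that $S_B$ has rank at most (number of classes $- 1$), so at most that many useful discriminant directions exist.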
Prediction as a classifier Classification rule: $x$ is in Class 2 if $y(x) > 0$, else $x$ is in Class 1, where $y(x) = w^T\left(x - \frac{m_1 + m_2}{2}\right)$, i.e. the threshold sits at the midpoint of the projected class means (assuming equal class priors and $w$ oriented from Class 1 toward Class 2).
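This decision rule is short enough to state directly in code. A sketch assuming equal class priors and the sign convention above (if $w \propto S_W^{-1}(m_1 - m_2)$, flip its sign first); the function name is my own:

```python
import numpy as np

def lda_classify(x, w, m1, m2):
    """Two-class LDA decision rule with the threshold at the midpoint
    of the projected class means (equal priors assumed).

    Assumes w points from class 1 toward class 2, so y(x) > 0 lies on
    the class-2 side of the threshold.
    """
    y = w @ (x - (m1 + m2) / 2)
    return 2 if y > 0 else 1
```

For example, with $m_1 = (0,0)$, $m_2 = (3,0)$ and $w = (1,0)$, the decision boundary is the vertical line $x_1 = 1.5$.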
Comparison of PCA and LDA PCA: performs dimensionality reduction while preserving as much of the variance in the high-dimensional space as possible. LDA: performs dimensionality reduction while preserving as much of the class-discriminatory information as possible. PCA is the standard choice for unsupervised problems (no labels). LDA exploits class labels to find a subspace that separates the classes as well as possible.
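The contrast can be demonstrated on synthetic data (not the slides' datasets): two classes separated along one axis, with most of the variance along another. PCA picks the high-variance axis, where the classes overlap; LDA picks the discriminative axis:

```python
import numpy as np

# Two classes separated along x, with much larger variance along y
rng = np.random.default_rng(1)
X1 = rng.normal([0.0, 0.0], [0.3, 3.0], size=(200, 2))
X2 = rng.normal([2.0, 0.0], [0.3, 3.0], size=(200, 2))
X = np.vstack([X1, X2])

# PCA direction: leading eigenvector of the covariance (max variance)
cov = np.cov(X, rowvar=False)
evals, evecs = np.linalg.eigh(cov)     # eigh sorts eigenvalues ascending
pca_dir = evecs[:, -1]                 # ends up near the y-axis

# LDA direction: S_W^{-1} (m1 - m2)
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
lda_dir = np.linalg.solve(Sw, m1 - m2)
lda_dir /= np.linalg.norm(lda_dir)     # ends up near the x-axis
```

Projecting onto `pca_dir` mixes the classes; projecting onto `lda_dir` separates them, which is exactly the limitation-of-PCA point from the opening slides.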
PCA and LDA example Data: Springleaf customer information, 2 classes; original dimension: d=1934; reduced dimension: k=1. (Figure annotations: the variances of Var1 and Var2 are large, the classes seriously overlap, and the class means $m_1$ and $m_2$ are close.)
PCA and LDA example (PCA vs. LDA projections) Data: Iris, 3 classes; original dimension: d=4; reduced dimension: k=2.
PCA and LDA example Data: coffee bean recognition, 5 classes; original dimension: d=60; reduced dimension: k=3.
Questions?