1
Feature Extraction (I)
Data Mining, II Year. Lluís Belanche, Alfredo Vellido
2
Dimensionality reduction (1)
3
Dimensionality reduction (2)
4
Signal representation vs classification
5
Principal Components Analysis (PCA)
General goal: project the data onto a new subspace so that as much of the relevant information as possible is preserved. In PCA, the relevant information is variance (dispersion).
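As a worked reading of "maximum variance" (the formulation below is standard PCA notation, not taken verbatim from the slides): the first principal direction w1 is the unit vector that maximizes the variance of the projected data,

$$
w_1 = \arg\max_{\|w\|=1} \operatorname{Var}(Xw) = \arg\max_{\|w\|=1} w^{\top} \Sigma\, w ,
$$

where $\Sigma$ is the sample covariance matrix of the centered data $X$. The maximizer is the eigenvector of $\Sigma$ with the largest eigenvalue, and subsequent components are the remaining eigenvectors taken in decreasing order of eigenvalue.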
6
PCA Theory (1)
7
PCA Theory (2)
8
PCA Theory (3)
9
PCA Theory (4)
10
Algorithm for PCA
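The slide names the algorithm without giving its steps here; the following is a minimal sketch of the usual eigendecomposition-based procedure, assuming NumPy, with function and variable names of our own choosing.

```python
import numpy as np

def pca(X, k):
    """Project the n x d data matrix X onto its first k principal components."""
    # 1. Center the data (subtract the mean of each feature)
    Xc = X - X.mean(axis=0)
    # 2. Sample covariance matrix (d x d)
    S = np.cov(Xc, rowvar=False)
    # 3. Eigendecomposition; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(S)
    # 4. Sort directions by decreasing variance and keep the first k
    order = np.argsort(eigvals)[::-1]
    W = eigvecs[:, order[:k]]                      # d x k projection matrix
    var_explained = eigvals[order[:k]] / eigvals.sum()
    # 5. Project the centered data onto the new subspace
    return Xc @ W, W, var_explained

# Toy usage on synthetic correlated data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))
Z, W, ratios = pca(X, 2)
print(Z.shape, ratios)
```

The variance-explained ratios make the "relevant information is variance" criterion concrete: they show how much of the total dispersion survives the projection onto the first k components.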
11
PCA examples (1)
12
PCA examples (2)
13
PCA examples (2)
14
PCA examples (3)
15
PCA examples (4)
16
Two solutions: in which sense are they optimal?
In the signal representation sense
In the signal separation sense
In both
In none
17
Other approaches to FE
Kernel PCA: perform PCA in the transformed space x → Φ(x), where K(x, y) = <Φ(x), Φ(y)> is a kernel.
ICA (Independent Component Analysis): seeks statistical independence of the features (PCA seeks only uncorrelated features); it is equivalent to PCA iff the features are Gaussian.
Image and audio analysis bring their own methods:
Series expansion descriptors (from the DFT, DCT or DST)
Moment-based features
Spectral features
Wavelet descriptors
Cao, J.J. et al. A comparison of PCA, KPCA and ICA for dimensionality reduction. Neurocomputing 55 (2003).
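As an illustration only (the slide gives no code), the sketch below assumes scikit-learn is available and contrasts the three decompositions mentioned above on toy data; the ring-shaped data and the RBF gamma value are our own choices.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA

# Toy nonlinear data: two noisy concentric rings in 2D
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 400)
radii = np.r_[np.ones(200), 3 * np.ones(200)] + 0.1 * rng.normal(size=400)
X = np.c_[radii * np.cos(angles), radii * np.sin(angles)]

Z_pca  = PCA(n_components=2).fit_transform(X)                                  # uncorrelated features
Z_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0).fit_transform(X)   # PCA in Phi(x)
Z_ica  = FastICA(n_components=2, random_state=0).fit_transform(X)              # independent features
```

On data like this, the RBF Kernel PCA projection can capture the ring structure that linear PCA cannot, while ICA rotates the representation toward statistically independent (maximally non-Gaussian) directions.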