1 Lecture 14 PCA, pPCA, ICA

2 Principal Components Analysis
PCA is a data analysis technique that finds the subspace of input space carrying most of the variance of the data. It is therefore useful as a tool for reducing the dimensionality of input space. The solution is found by an eigenvalue decomposition of the sample covariance matrix. PPCA is a probabilistic model whose maximum-likelihood (ML) solution equals the PCA solution. It is a special case of factor analysis (FA) with isotropic noise variance, so the EM algorithm for FA can be used for learning.
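The eigendecomposition route described above can be sketched in a few lines of NumPy. This is a minimal illustration (the synthetic data, its dimensions, and the choice of keeping two components are assumptions for the example, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 500 points in 3-D with most variance along the first axis
# (scales 5.0, 1.0, 0.2 are illustrative choices).
X = rng.normal(size=(500, 3)) @ np.diag([5.0, 1.0, 0.2])

# Centre the data, then eigendecompose the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # returned in ascending order

# Sort descending and project onto the top-k principal directions.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
k = 2
Z = Xc @ eigvecs[:, :k]                  # reduced-dimension representation
```

Note that the variance of each projected coordinate equals the corresponding eigenvalue, which is exactly the sense in which PCA keeps the directions "carrying most of the variance".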

3 Independent Component Analysis
FA and PPCA have Gaussian prior models. In ICA we use non-Gaussian priors (e.g. heavy-tailed or bi-modal). We also do not insist on dimensionality reduction; it is possible, but not necessary. The canonical example is 2 speakers producing different mixtures of sound in 2 microphones that we wish to unmix. The source distributions are non-Gaussian but independent; the noise model is typically Gaussian. The simplest ICA model is square and has no noise, so we can use a change of variables to go from sources to inputs. Learning is through gradient descent with the "natural gradient".
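The square, noiseless case with natural-gradient learning can be sketched as follows. This is a minimal NumPy example, not the lecture's implementation: the Laplacian sources, the mixing matrix values, the tanh score function, the learning rate, and the iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent heavy-tailed (Laplacian) sources, mixed by a square,
# noiseless mixing matrix A (values chosen for illustration).
n = 5000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Natural-gradient ML updates for the unmixing matrix W:
#   W <- W + lr * (I - E[g(y) y^T]) W,  with g(y) = tanh(y),
# a standard score function for super-Gaussian (heavy-tailed) sources.
W = np.eye(2)
lr = 0.1
for _ in range(300):
    Y = W @ X
    W += lr * (np.eye(2) - np.tanh(Y) @ Y.T / n) @ W

Y = W @ X  # recovered sources, up to permutation and scaling
```

The `@ W` on the right of the update is what makes this the natural gradient rather than the ordinary gradient: it rescales the update by the geometry of the parameter space and avoids inverting `W` at every step.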

