A Fast Fixed-Point Algorithm for Independent Component Analysis

1 A Fast Fixed-Point Algorithm for Independent Component Analysis
Neural Computation, 9, 1997. A. Hyvärinen and E. Oja. Summarized by Seong-woo Chung.

2 Introduction
Independent Component Analysis (ICA) expresses a set of observed random variables as linear combinations of statistically independent component variables.
Two main applications of ICA are blind source separation and feature extraction.
(C) 2001, SNU CSE Biointelligence Lab

3 Introduction (2)
In the simplest form of ICA:
- m observable scalar variables v1, v2, …, vm
- n unknown independent components s1, s2, …, sn, with n < m
- the vector v is a linear combination of s: v = As, where A is an unknown m×n matrix (the mixing matrix)
Only non-Gaussian independent components can be estimated (at most one of the independent components may be Gaussian).
The independent components si are defined to have unit variance.

4 Introduction (3)
The problem of estimating the matrix A can be simplified by sphering, or prewhitening, the data v:
v is linearly transformed to a vector x = Mv whose elements xi are mutually uncorrelated and all have unit variance.
Thus the correlation matrix of x equals unity: E{xx^T} = I.
B = MA is then an orthogonal matrix, due to the assumptions on the components si.
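A minimal NumPy sketch of this prewhitening step (all names here are illustrative, not from the paper): M is computed from the eigendecomposition of the covariance matrix of v, so that x = Mv has identity covariance.

```python
import numpy as np

# Prewhitening sketch: v is an (m, T) array of observations (assumed
# roughly zero-mean); M = E D^{-1/2} E^T is built from the eigenvalues D
# and eigenvectors E of the covariance of v.
def whiten(v):
    cov = np.cov(v)                       # m x m covariance of the observations
    d, E = np.linalg.eigh(cov)            # eigenvalues d, orthonormal eigenvectors E
    M = E @ np.diag(d ** -0.5) @ E.T      # whitening matrix
    return M @ v, M

rng = np.random.default_rng(0)
v = rng.standard_normal((3, 3)) @ rng.standard_normal((3, 10000))
x, M = whiten(v)
print(np.round(np.cov(x), 2))             # covariance of x is now the identity
```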

5 ICA by Kurtosis Minimization and Maximization
ICA can use the fourth-order cumulant, or kurtosis, of the signals, defined for a zero-mean random variable v as
kurt(v) = E{v^4} - 3 (E{v^2})^2
For a Gaussian density the kurtosis is zero; for densities peaked at zero it is positive; for flatter densities it is negative.
To find a vector w such that w^T x equals one of the independent components (up to sign), the objective function kurt(w^T x) has to be minimized or maximized under the constraint that w^T x has unit variance.
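A quick numerical check of this definition (sample averages in place of expectations; the three test densities are my own choice, not from the slides):

```python
import numpy as np

# Sample kurtosis: kurt(v) = E{v^4} - 3 (E{v^2})^2 for zero-mean v.
# Gaussian data should give roughly 0, a density peaked at zero (Laplacian)
# a positive value, and a flat density (uniform) a negative one.
def kurt(v):
    return np.mean(v ** 4) - 3 * np.mean(v ** 2) ** 2

rng = np.random.default_rng(1)
n = 200_000
k_gauss = kurt(rng.standard_normal(n))        # approx 0
k_laplace = kurt(rng.laplace(size=n))         # approx +12 for scale b = 1
k_uniform = kurt(rng.uniform(-1, 1, size=n))  # approx -2/15
print(k_gauss, k_laplace, k_uniform)
```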

6 ICA by Kurtosis Minimization and Maximization (2)
Using a gradient rule on kurt(w^T x), with w normalized to unit norm after each step:
w(t+1) = w(t) ± mu(t) [ x(t) (w(t)^T x(t))^3 - 3 ||w(t)||^2 w(t) ]
where the sign is chosen according to whether the kurtosis is being maximized or minimized.
The advantage is fast adaptation in a non-stationary environment.
The resulting trade-off is that convergence is slow and depends on a good choice of the learning-rate sequence mu(t).
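A rough batch sketch of this gradient rule (my own illustration, not the paper's code): the expectation replaces the single sample x(t), the kurtosis sign is assumed known, and a small constant step size mu stands in for the sequence mu(t).

```python
import numpy as np

# One gradient step for kurt(w^T x) on prewhitened data x of shape (n, T),
# followed by projection back onto the unit sphere. kurt_sign is assumed
# known: +1 to maximize kurtosis (super-Gaussian sources), -1 to minimize.
def gradient_step(w, x, mu, kurt_sign):
    # gradient direction: E{x (w^T x)^3} - 3 ||w||^2 w
    grad = np.mean(x * (w @ x) ** 3, axis=1) - 3 * np.dot(w, w) * w
    w = w + mu * kurt_sign * grad
    return w / np.linalg.norm(w)

# Toy run: two super-Gaussian (Laplace) sources, mixed and prewhitened.
rng = np.random.default_rng(0)
s = rng.laplace(scale=1 / np.sqrt(2), size=(2, 20000))  # unit-variance sources
v = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s
d, E = np.linalg.eigh(np.cov(v))
x = (E @ np.diag(d ** -0.5) @ E.T) @ v                  # whitened mixtures
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(300):
    w = gradient_step(w, x, mu=0.1, kurt_sign=+1.0)
y = w @ x   # approximates one independent component, up to sign
```

Even in this batch form the iteration count depends on mu, which illustrates the trade-off the slide mentions.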

7 Fixed-Point Algorithm
Using the above derivation, we get the following fixed-point algorithm for ICA:
1. Take a random initial vector w(0) of norm 1. Let k = 1.
2. Let w(k) = E{ x (w(k-1)^T x)^3 } - 3 w(k-1).
3. Divide w(k) by its norm.
4. If |w(k)^T w(k-1)| is not close to 1, let k = k + 1 and go back to step 2. Otherwise, output the vector w(k).
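The four steps above can be sketched in NumPy as follows (a minimal one-unit version; the expectation E{.} is replaced by a sample average over prewhitened data, and all names are illustrative):

```python
import numpy as np

# One-unit fixed-point iteration on prewhitened data x of shape (n, T).
def fastica_one_unit(x, tol=1e-6, max_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(x.shape[0])
    w /= np.linalg.norm(w)                                  # step 1: random w(0), norm 1
    for _ in range(max_iter):
        w_prev = w
        w = np.mean(x * (w @ x) ** 3, axis=1) - 3 * w_prev  # step 2
        w /= np.linalg.norm(w)                              # step 3: divide by its norm
        if abs(w @ w_prev) > 1 - tol:                       # step 4: converged?
            break
    return w

# Toy separation problem: two Laplace sources, mixed and prewhitened.
rng = np.random.default_rng(42)
s = rng.laplace(scale=1 / np.sqrt(2), size=(2, 20000))
v = np.array([[1.0, 0.5], [0.3, 1.0]]) @ s
d, E = np.linalg.eigh(np.cov(v))
x = (E @ np.diag(d ** -0.5) @ E.T) @ v
w = fastica_one_unit(x)
y = w @ x   # recovers one independent component, up to sign
```

As the slides note, this finds one component per run; further components would be obtained by rerunning from new initial vectors and decorrelating against the ones already found.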

8 Applications
Blind source separation
Feature extraction

9 Blind source separation
<Eight independent components of the EEG data>

10 Feature extraction
<Some ICA basis vectors of natural image data>

11 Discussion (about the fast fixed-point algorithm)
The convergence of the algorithm is very fast.
There is no learning rate or other adjustable parameters.
It finds the independent components one at a time.
Components of both negative and positive kurtosis can be found.

