
Introduction to Independent Component Analysis
Math 285 Project, Fall 2015
Jingmei Lu, Xixi Lu
12/10/2015

Agenda
- The “Cocktail Party” Problem
- ICA model
- Principle of ICA
- FastICA algorithm
- Separate mixed audio signal
- References

The “Cocktail Party” Problem
Sources: s_1(t), s_2(t). Observations: x_1(t), x_2(t).
Purpose: estimate the two original speech signals s_1(t) and s_2(t), using only the recorded signals x_1(t) and x_2(t).

Motivation
(Figure: independent sources and the resulting mixture signals.)

Motivation
(Figure: independent sources and the recovered signals.)

What is ICA? “Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate (multi-dimensional) statistical data. What distinguishes ICA from other methods is that it looks for components that are both statistically independent and non-Gaussian.” A. Hyvärinen, J. Karhunen, E. Oja, Independent Component Analysis

ICA Model
Observe n linear mixtures x_1, ..., x_n of n independent components:
x_j = a_j1 s_1 + a_j2 s_2 + ... + a_jn s_n, for all j
x_j: observed random variable; s_j: independent source variable.
In matrix form the ICA model is x = As, where a_ij is the (i, j) entry of the mixing matrix A.
Task: estimate A and s using only the observable random vector x.
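A minimal numerical sketch of the mixing model x = As, assuming two toy sources (a sinusoid and a sawtooth) and an arbitrary 2x2 mixing matrix; none of these values come from the project itself:

```python
import numpy as np

n_samples = 1000
t = np.linspace(0, 8, n_samples)

# Two independent, non-Gaussian sources (illustrative choices).
s = np.vstack([np.sin(2 * t),        # s_1(t): sinusoid
               2 * (t % 1) - 1])     # s_2(t): sawtooth

# Arbitrary mixing matrix A (a_ij = contribution of source j to observation i).
A = np.array([[1.0, 0.5],
              [0.7, 1.2]])

x = A @ s   # observed mixtures x_1(t), x_2(t); ICA must recover A and s from x alone
```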

ICA Model: Two assumptions
1. The components s_i are statistically independent.
2. The independent components must have non-Gaussian distributions.

Why non-Gaussian?
Assume: (1) s_1 and s_2 are Gaussian, and (2) the mixing matrix A is orthogonal. Then x_1 and x_2 are Gaussian, uncorrelated, and of unit variance. Their joint density is
p(x_1, x_2) = (1 / (2π)) exp(−(x_1² + x_2²) / 2).

Why non-Gaussian?
Since this density is rotationally symmetric, it does not contain any information on the directions of the columns of the mixing matrix A, so A cannot be estimated from Gaussian data.

Why non-Gaussian?
Now assume s_1 and s_2 follow a uniform distribution with zero mean and unit variance, and mix them with a mixing matrix A: x = As. The joint density of (x_1, x_2) is then uniform on a parallelogram, and the edges of the parallelogram are in the direction of the columns of A, so A can be identified from the data.
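A short sketch of this example; the uniform sources are scaled to unit variance, and the mixing matrix below is illustrative (the slide's own matrix is not reproduced here):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Uniform sources with zero mean and unit variance: U(-sqrt(3), sqrt(3)).
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 5000))

A = np.array([[2.0, 3.0],     # illustrative mixing matrix
              [2.0, 1.0]])
x = A @ s

# The scatter of (x_1, x_2) fills a parallelogram whose edges point along the columns of A.
plt.scatter(x[0], x[1], s=2)
plt.axis("equal")
plt.show()
```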

Principle of ICA
Consider y = w^T x = w^T A s = z^T s, where z = A^T w. Thus y is a linear combination of the s_i, with weights given by the z_i.
Central Limit Theorem: the distribution of a sum of independent random variables tends toward a Gaussian distribution, under certain conditions.
Hence z^T s is more Gaussian than any single s_i, and becomes least Gaussian when it equals one of the s_i. So we can take w to be a vector that maximizes the non-Gaussianity of w^T x.
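A quick numerical check of this Central Limit Theorem argument, using excess kurtosis (zero for a Gaussian) as a rough stand-in for Gaussianity; the weight vector and sources are illustrative:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)

# Two independent uniform sources with zero mean and unit variance.
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 100_000))

z = np.array([0.6, 0.8])   # arbitrary unit-norm weight vector
y = z @ s                  # the linear combination z^T s

print(kurtosis(s[0]))      # about -1.2: a single uniform source is clearly non-Gaussian
print(kurtosis(y))         # closer to 0: the mixture is "more Gaussian" than the sources
```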

Measure of non-Gaussianity
Entropy H: the degree of information that an observation gives. A Gaussian variable has the largest entropy among all random variables of equal variance.
Negentropy: J(y) = H(y_gauss) − H(y), where y_gauss is a Gaussian variable with the same covariance as y. Negentropy is nonnegative and zero only for Gaussian variables, but it is computationally difficult to evaluate directly.

Negentropy approximations
The FastICA algorithm uses the approximation J(y) ≈ [E{G(y)} − E{G(v)}]², where G is some non-quadratic function and v is a Gaussian variable of zero mean and unit variance. Maximizing J(y) maximizes non-Gaussianity.
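A small sketch of this approximation with the common choice G(u) = log cosh(u); the Gaussian expectation E{G(v)} is estimated by Monte Carlo here, and the function name is ours:

```python
import numpy as np

def negentropy_approx(y, n_mc=100_000, seed=0):
    """Approximate J(y) = (E{G(y)} - E{G(v)})^2 for zero-mean, unit-variance y."""
    G = lambda u: np.log(np.cosh(u))
    v = np.random.default_rng(seed).standard_normal(n_mc)   # Gaussian reference variable
    return (np.mean(G(y)) - np.mean(G(v))) ** 2

rng = np.random.default_rng(0)
gaussian = rng.standard_normal(100_000)
uniform = rng.uniform(-np.sqrt(3), np.sqrt(3), 100_000)     # zero mean, unit variance

print(negentropy_approx(gaussian))   # near zero: Gaussian data has (almost) no negentropy
print(negentropy_approx(uniform))    # noticeably larger: non-Gaussian data
```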

FastICA
- Data preprocessing: centering, whitening
- FastICA algorithm: maximize non-Gaussianity

Data Preprocessing
Centering: subtract the mean, x ← x − E{x}, so the data has zero mean.
Whitening: linearly transform the centered data so that its components are uncorrelated and have unit variance, e.g. x̃ = E D^{-1/2} E^T x, where E{x x^T} = E D E^T is the eigendecomposition of the covariance matrix.
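A minimal sketch of both preprocessing steps, whitening via the eigendecomposition of the sample covariance; the function and variable names are ours:

```python
import numpy as np

def preprocess(x):
    """Center and whiten data of shape (n_components, n_samples)."""
    x = x - x.mean(axis=1, keepdims=True)        # centering: subtract the mean
    d, E = np.linalg.eigh(np.cov(x))             # covariance = E diag(d) E^T
    whitening = E @ np.diag(d ** -0.5) @ E.T     # E D^{-1/2} E^T
    x_white = whitening @ x                      # now E{x_white x_white^T} ≈ I
    return x_white, whitening
```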

FastICA Algorithm
1. Choose an initial weight vector w.
2. Let w⁺ = E{x g(w^T x)} − E{g′(w^T x)} w, where g is the derivative of the non-quadratic function G.
3. Normalize: w = w⁺ / ||w⁺||.
4. If not converged, go back to step 2. Convergence: ||w_new − w_old|| < ξ for a small tolerance ξ.
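A minimal sketch of this one-unit iteration on whitened data, assuming G(u) = log cosh(u) so that g(u) = tanh(u) and g′(u) = 1 − tanh(u)²; the names and the tolerance value are ours:

```python
import numpy as np

def fastica_one_unit(x_white, tol=1e-6, max_iter=200, seed=0):
    """x_white: whitened data of shape (n_components, n_samples). Returns one weight vector w."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(x_white.shape[0])
    w /= np.linalg.norm(w)                                        # 1. initial weight vector

    for _ in range(max_iter):
        wx = w @ x_white                                          # w^T x for every sample
        g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
        w_new = (x_white * g).mean(axis=1) - g_prime.mean() * w   # 2. w+ = E{x g(w^T x)} - E{g'(w^T x)} w
        w_new /= np.linalg.norm(w_new)                            # 3. normalization
        if np.linalg.norm(w_new - w) < tol:                       # 4. convergence check
            break
        w = w_new
    return w_new
```

Since w and −w define the same component, practical implementations often test |w_new · w_old| ≈ 1 instead of the plain norm of the difference; the version above follows the criterion on the slide.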

Separate mixed audio signal
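A hedged end-to-end sketch of this experiment using scikit-learn's FastICA; the WAV filenames and the mixing matrix are placeholders, not the ones used in the project, and the whiten="unit-variance" argument assumes a recent scikit-learn:

```python
import numpy as np
from scipy.io import wavfile
from sklearn.decomposition import FastICA

rate, s1 = wavfile.read("speech1.wav")        # hypothetical mono recordings
_, s2 = wavfile.read("speech2.wav")           # (assumed to share the sample rate)
n = min(len(s1), len(s2))
S = np.c_[s1[:n], s2[:n]].astype(float)       # sources as columns, shape (n_samples, 2)

A = np.array([[1.0, 0.5],                     # placeholder mixing matrix
              [0.6, 1.0]])
X = S @ A.T                                   # observed mixtures

ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_est = ica.fit_transform(X)                  # recovered sources (up to order and scale)

for i in range(2):
    out = S_est[:, i] / np.abs(S_est[:, i]).max()          # rescale to [-1, 1] for playback
    wavfile.write(f"separated{i + 1}.wav", rate, out.astype(np.float32))
```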

Mixed signals

Separated signals

Separated signals by PCA

Other applications
- Separation of artifacts in MEG data
- Finding hidden factors in financial data
- Reducing noise in natural images
- Telecommunications

References
Hyvärinen, A., Karhunen, J., and Oja, E. (2001). Independent Component Analysis. Wiley, New York.
Särelä, J. “Cocktail Party Problem.” N.p., 20 Apr. Web.