ICA
Alphan Altinok



Outline  PCA  ICA  Foundation  Ambiguities  Algorithms  Examples  Papers

PCA & ICA  PCA  Projects d-dimensional data onto a lower-dimensional subspace in a way that is optimal in the least-squares sense, i.e., it minimizes the reconstruction error Σ‖x₀ − x‖².  ICA  Seeks directions in feature space such that the resulting signals are statistically independent of each other.

PCA  Compute d-dimensional μ (mean).  Compute d x d covariance matrix.  Compute eigenvectors and eigenvalues.  Choose k largest eigenvalues.  k is the inherent dimensionality of the subspace governing the signal and (d – k) dimensions generally contain noise.  Form a d x k matrix A with k columns of eigenvalues.  The representation of data by principal components consists of projecting data into k-dimensional subspace by x = A t (x – μ).

PCA  A simple 3-layer neural network can form such a representation when trained.

ICA  While PCA seeks directions that represents data best in a Σ|x 0 – x| 2 sense, ICA seeks such directions that are most independent from each other.  Used primarily for separating unknown source signals from their observed linear mixtures.  Typically used in Blind Source Separation problems. ICA is also used in feature extraction.

ICA – Foundation  q source signals s_1(k), s_2(k), …, s_q(k)  with zero means  k is the discrete time index (or pixel index in images)  scalar valued  mutually independent for each value of k  h measured mixture signals x_1(k), x_2(k), …, x_h(k)  Statistical independence of the source signals means  p[s_1(k), s_2(k), …, s_q(k)] = Π_i p[s_i(k)]

ICA – Foundation  The measured signals are given by  x_j(k) = Σ_i a_ij s_i(k) + n_j(k)  for j = 1, 2, …, h, where the elements a_ij are unknown.  Define vectors x(k) and s(k), and matrix A  Observed: x(k) = [x_1(k), x_2(k), …, x_h(k)]ᵀ  Source: s(k) = [s_1(k), s_2(k), …, s_q(k)]ᵀ  Mixing matrix: A = [a_1, a_2, …, a_q]  The equation above can then be stated in vector-matrix form  x(k) = A s(k) + n(k) = Σ_i s_i(k) a_i + n(k)
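The mixing model can be made concrete with a small NumPy sketch (the sources, sizes, and noise level here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
q, h, n = 2, 2, 1000                     # q sources, h sensors, n samples

# Zero-mean, mutually independent sources s_i(k).
s = np.vstack([np.sign(rng.normal(size=n)),   # binary-valued source
               rng.uniform(-1, 1, size=n)])   # uniform source
s -= s.mean(axis=1, keepdims=True)

A = rng.normal(size=(h, q))              # mixing matrix (unknown to the separator)
noise = 0.01 * rng.normal(size=(h, n))   # additive noise n(k)
x = A @ s + noise                        # x(k) = A s(k) + n(k)
print(x.shape)                           # (2, 1000)
```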

Ambiguities with ICA  The ICA expansion  x(k) = A s(k) + n(k) = Σ_i s_i(k) a_i + n(k)  The amplitudes (variances) of the separated signals cannot be determined.  There is a sign ambiguity associated with the separated signals.  The order of the separated signals cannot be determined.
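These ambiguities follow directly from the model: rescaling or permuting the sources can always be absorbed into A without changing the observations. A tiny NumPy check (the matrices here are invented, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(2, 2))              # some mixing matrix
s = rng.normal(size=(2, 100))            # some sources

# Scale a source by c and divide the matching column of A by c:
# the observations x = A s are unchanged, so amplitude is unrecoverable.
c = 3.0
A2, s2 = A.copy(), s.copy()
A2[:, 0] /= c
s2[0] *= c
assert np.allclose(A @ s, A2 @ s2)

# Permuting the sources and the columns of A likewise leaves x unchanged,
# so the order of the separated signals is unrecoverable.
P = np.array([[0.0, 1.0], [1.0, 0.0]])   # permutation matrix
assert np.allclose(A @ s, (A @ P.T) @ (P @ s))
```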

ICA – Using NNs  Prewhitening – transform the input vectors x(k) by  v(k) = V x(k)  The whitening matrix V can be obtained by a NN or by PCA  Separation (NN or contrast-function approximation)  Estimation of the ICA basis vectors (NN or batch approach)
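PCA-based prewhitening can be sketched as V = D^(−1/2) Eᵀ, where E holds the eigenvectors and D the eigenvalues of the covariance of x (the data below are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented mixed observations x(k): 2 sensors, 2000 samples.
s = np.vstack([rng.uniform(-1, 1, 2000), np.sign(rng.normal(size=2000))])
x = np.array([[1.0, 0.5], [0.5, 1.0]]) @ s
x -= x.mean(axis=1, keepdims=True)

# Whitening matrix V = D^{-1/2} E^T from the covariance eigendecomposition,
# so that v(k) = V x(k) has (sample) identity covariance.
d_vals, E = np.linalg.eigh(np.cov(x))
V = np.diag(d_vals ** -0.5) @ E.T
v = V @ x
print(np.round(np.cov(v), 6))            # identity matrix (up to rounding)
```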

ICA – Fast Fixed-Point Algorithm  The fast fixed-point algorithm (FastICA) converges rapidly (typically cubically) to the most accurate solution allowed by the structure of the data, by iteratively maximizing the non-Gaussianity of projections of the whitened data.
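A minimal one-unit fixed-point iteration on whitened data, using the tanh nonlinearity, might look as follows (a sketch on invented data; variable names are illustrative, and a full implementation would deflate or symmetrically decorrelate several units):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

# Invented independent, non-Gaussian sources; mix and whiten them first
# (FastICA assumes whitened input).
s = np.vstack([np.sign(rng.normal(size=n)), rng.uniform(-1, 1, n)])
x = np.array([[2.0, 1.0], [1.0, 2.0]]) @ s
x -= x.mean(axis=1, keepdims=True)
d_vals, E = np.linalg.eigh(np.cov(x))
v = np.diag(d_vals ** -0.5) @ E.T @ x    # whitened observations

# One-unit fixed-point iteration with g(u) = tanh(u):
#   w <- E{v g(w^T v)} - E{g'(w^T v)} w, then renormalize.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):
    u = w @ v
    g = np.tanh(u)
    g_prime = 1.0 - g ** 2
    w_new = (v * g).mean(axis=1) - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    if abs(abs(w_new @ w) - 1.0) < 1e-10:  # converged up to sign
        w = w_new
        break
    w = w_new

y = w @ v                                # one recovered source (up to sign/scale)
```

Up to sign and scale, `y` should match one of the two sources; correlating `y` against each row of `s` confirms the recovery.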

ICA – Example

 BSS of recorded speech and music signals. Audio demos: speech / music, speech / speech, and speech / speech in a difficult environment (two microphone recordings, mic1 and mic2, and the two separated outputs, separated1 and separated2).

ICA – Example  Source images  separation demo

ICA – Papers  Hinton – A New View of ICA  Interprets ICA as a probability density model.  Overcomplete, undercomplete, and multi-layer non-linear ICA become simpler.  Cardoso – Blind Signal Separation: Statistical Principles  Model identifiability.  Contrast functions.  Estimating functions.  Adaptive algorithms.  Performance issues.

ICA – Papers  Hyvärinen – ICA Applied to Feature Extraction from Color and Stereo Images  Seeks to extend ICA by comparing it to the processing done in neural receptive fields.  Hyvärinen – Survey on ICA  Lawrence – Face Recognition: A Convolutional Neural Network Approach  Combines local image sampling, a SOM, and a convolutional NN that provides partial invariance to translation, rotation, scaling, and deformation.

ICA – Papers  Sejnowski – Independent Component Representations for Face Recognition  Sejnowski – A Comparison of Local vs Global Image Decompositions for Visual Speechreading  Bartlett – Viewpoint Invariant Face Recognition Using ICA and Attractor Networks  Bartlett – Image Representations for Facial Expression Coding

ICA – Links