1 On the combination of ICA and CPA. Maarten De Vos, Dimitri Nion, Sabine Van Huffel, Lieven De Lathauwer (SISTA – SCD – BIOMED, K.U.Leuven).



2 Roadmap: What is ICA? What is CPA? Why combine ICA and CPA? Our algorithm. Results. Conclusion.


4 Independent Component Analysis (ICA): decomposing a measurement (EEG) into contributing sources. ICA estimates statistically independent sources s_1, s_2 and s_3, and mixing coefficients m_ij:
EEG_1 = m_11 s_1 + m_12 s_2 + m_13 s_3
EEG_2 = m_21 s_1 + m_22 s_2 + m_23 s_3
EEG_3 = m_31 s_1 + m_32 s_2 + m_33 s_3
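The mixing equations above can be demonstrated numerically. A minimal sketch, with three invented synthetic sources and an invented mixing matrix M, using scikit-learn's FastICA as a stand-in (the slides themselves use JADE and SOBI):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
# Three synthetic sources standing in for s_1, s_2, s_3
S = np.c_[np.sin(2 * t),                  # s_1: sinusoid
          np.sign(np.sin(3 * t)),         # s_2: square wave
          rng.laplace(size=t.size)]       # s_3: super-Gaussian noise
M = np.array([[1.0, 0.5, 0.3],
              [0.6, 1.0, 0.4],
              [0.2, 0.7, 1.0]])           # mixing coefficients m_ij
X = S @ M.T                               # row i of X: EEG_i = sum_j m_ij s_j

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)              # sources, up to scaling/permutation
M_est = ica.mixing_                       # estimated mixing matrix
# The mixing model is recovered: X ~ S_est @ M_est.T + mean
assert np.allclose(X, S_est @ M_est.T + ica.mean_, atol=1e-6)
```

The estimated sources match the true ones only up to permutation and scaling, which is the intrinsic indeterminacy of ICA mentioned on the next slide.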

5 Decomposition of a measured signal: Y = m_1 s_1^T + … + m_R s_R^T + E. PCA estimates orthogonal sources (a basis), but for any orthogonal Q we can write Y = M S = (M Q)(Q* S), so matrix decompositions such as PCA are often not unique. ICA resolves this by imposing statistical independence on the sources.

6 Computation of ICA: different implementations of 'independence'. JADE (Joint Approximate Diagonalization of Eigenmatrices): all higher-order cross-cumulants of independent sources are zero, so the fourth-order cumulant tensor is diagonal in the source domain; the mixing matrix is the matrix that approximately jointly diagonalizes the eigenmatrices of the cumulant tensor.
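The diagonality property that JADE exploits can be checked directly. A sketch in plain NumPy (the Laplace sources and sizes are invented for the illustration): estimate the fourth-order cumulant tensor of zero-mean data and verify that it is nearly diagonal for independent sources.

```python
import numpy as np

def fourth_order_cumulant(X):
    """Sample fourth-order cumulant tensor of data X (channels x samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    n = X.shape[1]
    M4 = np.einsum('it,jt,kt,lt->ijkl', X, X, X, X) / n   # fourth moments
    C = X @ X.T / n                                       # covariance
    # cum_ijkl = E[x_i x_j x_k x_l] - C_ij C_kl - C_ik C_jl - C_il C_jk
    return (M4 - np.einsum('ij,kl->ijkl', C, C)
               - np.einsum('ik,jl->ijkl', C, C)
               - np.einsum('il,jk->ijkl', C, C))

rng = np.random.default_rng(0)
S = rng.laplace(size=(3, 50000))       # independent super-Gaussian sources
cum = fourth_order_cumulant(S)
off = cum.copy()
for i in range(3):
    off[i, i, i, i] = 0.0              # zero out the diagonal entries
# Off-diagonal cumulants are small relative to the diagonal ones
assert np.abs(off).max() < 0.2 * np.abs(cum).max()
```

After mixing (X = M S), the cumulant tensor is no longer diagonal, and JADE searches for the transformation that restores (approximate) diagonality.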

7 SOBI (Second Order Blind Identification): assumes that the sources are autocorrelated. The mixing matrix then also approximately jointly diagonalizes a set of matrices: the correlation matrices of the data at different time lags.
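The assumption SOBI rests on can be illustrated with two independent AR(1) sources (the AR coefficients and sizes here are invented): their time-lagged correlation matrices are close to diagonal, which is what joint diagonalization exploits to recover the mixing matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

def ar1(phi):
    # Autocorrelated source: s[t] = phi * s[t-1] + noise
    e = rng.standard_normal(n)
    s = np.zeros(n)
    for k in range(1, n):
        s[k] = phi * s[k - 1] + e[k]
    return s

S = np.vstack([ar1(0.9), ar1(-0.5)])   # 2 x n matrix of independent sources

def lagged_corr(X, tau):
    # Symmetrized correlation matrix at lag tau: E[x(t) x(t + tau)^T]
    C = X[:, :-tau] @ X[:, tau:].T / (X.shape[1] - tau)
    return 0.5 * (C + C.T)

for tau in (1, 2, 3):
    C = lagged_corr(S, tau)
    # Diagonal entries carry the autocorrelation; off-diagonals are near zero
    assert abs(C[0, 1]) < 0.1
```

For mixed data X = M S, the same matrices become M D_tau M^T with diagonal D_tau, so jointly diagonalizing them identifies M from second-order statistics only.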

8 Decomposition of a measured signal (continued). If a signal is multi-dimensional (a higher-order tensor), multilinear algebra tools can be used that better exploit the multi-dimensional nature of the data. Tucker / HOSVD estimates subspaces: Y = S ×_1 A ×_2 B ×_3 C, and like PCA it is only determined up to invertible factors (Q Q*, P P*, O O*) in each mode.

9 CPA (Canonical / Parallel Factor Analysis): if a signal is multi-dimensional (a higher-order tensor), it can instead be decomposed into a sum of rank-1 terms, Y = a_1 ∘ b_1 ∘ s_1 + … + a_R ∘ b_R ∘ s_R + E.

10 Something about CPA: the components are not orthogonal; the best rank-R approximation may not exist; the R components are not ordered. But the decomposition is unique: no rotation is possible without changing the model part.

11 Computation of CPA. CPA is often computed by Alternating Least Squares (ALS), minimizing the Frobenius norm of the residuals Y - sum_r a_r ∘ b_r ∘ s_r:
1) Initialize A, S, B
2) Update A, given S and B
3) Update S, given A and B
4) Update B, given A and S
5) Iterate (2-3-4) until convergence
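The ALS loop above can be sketched in a few lines of NumPy for a third-order tensor. This is a minimal didactic version, without the column normalization, convergence test, or line search that a production implementation would add:

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: (I*J) x R
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(Y, R, n_iter=500, seed=0):
    """Rank-R CP decomposition of a 3rd-order tensor Y by plain ALS."""
    I, J, K = Y.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, R))           # 1) initialize A, B, S
    B = rng.standard_normal((J, R))
    S = rng.standard_normal((K, R))
    Y1 = Y.reshape(I, -1)                     # mode-1 unfolding
    Y2 = np.moveaxis(Y, 1, 0).reshape(J, -1)  # mode-2 unfolding
    Y3 = np.moveaxis(Y, 2, 0).reshape(K, -1)  # mode-3 unfolding
    for _ in range(n_iter):                   # 5) iterate until convergence
        A = Y1 @ np.linalg.pinv(khatri_rao(B, S)).T  # 2) update A, given B, S
        B = Y2 @ np.linalg.pinv(khatri_rao(A, S)).T  # 3) update B, given A, S
        S = Y3 @ np.linalg.pinv(khatri_rao(A, B)).T  # 4) update S, given A, B
    return A, B, S

# Sanity check on an exactly rank-2 tensor
rng = np.random.default_rng(42)
A0, B0, S0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
Y = np.einsum('ir,jr,kr->ijk', A0, B0, S0)
A, B, S = cp_als(Y, R=2)
err = np.linalg.norm(Y - np.einsum('ir,jr,kr->ijk', A, B, S)) / np.linalg.norm(Y)
```

Each update is a linear least-squares problem because, with the other two factors fixed, the model is linear in the remaining one; the swamps discussed on the next slide show up as the fit decreasing very slowly over many such sweeps.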

12 Computation of CPA (2): ALS sometimes suffers long swamps, stretches during which the cost function converges very slowly.

13 Improvement of ALS: line search. To reduce swamps, interpolate A, B and S from the estimates of the two previous iterations and use the interpolated matrices at the current iteration:
1) Line search along the search directions between the two previous estimates
2) Then the ALS update
The choice of the step size ρ is crucial; ρ = 1 annihilates the line-search step (i.e. we get standard ALS).
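The interpolation can be written compactly. A sketch of the step (the linear form below is reconstructed from the standard ELS literature cited on the next slide; the slide's own formula did not survive extraction):

```python
import numpy as np

def line_search_step(F_two_ago, F_prev, rho):
    """Interpolate/extrapolate a factor matrix (A, B or S) along the
    direction defined by the two previous ALS estimates; the result is
    used as the starting point of the current ALS sweep."""
    return F_two_ago + rho * (F_prev - F_two_ago)

F_two_ago = np.zeros((3, 2))
F_prev = np.ones((3, 2))
# rho = 1 annihilates the line-search step: we keep the latest ALS
# estimate unchanged, i.e. standard ALS
assert np.allclose(line_search_step(F_two_ago, F_prev, 1.0), F_prev)
# rho > 1 extrapolates beyond it, which is what lets the iteration
# jump across a swamp
```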

14 Improvement of ALS: line search. [Harshman, 1970] "LSH"; [Bro, 1997] "LSB"; [Rajih, Comon, 2005] "Enhanced Line Search (ELS)"; [Nion, De Lathauwer, 2006] "Enhanced Line Search with Complex Step (ELSCS)".


16 Overview: What is ICA? What is CPA? Why combine ICA and CPA? Our algorithm. Simulation results. Conclusion.

17 In multi-subject data, the same activations appear with different ratios in different subjects, which gives rise to a trilinear CPA structure [Beckmann et al., 2005].

18 Tensor pICA [Beckmann et al., 2005]: a combination of ICA and CPA. Tensor pICA outperforms CPA due to the low signal-to-noise ratio. Algorithm described in the paper: one iteration step to optimize the ICA cost function, one iteration step to optimize the trilinear structure, repeated 'until convergence'. Algorithm actually implemented in the paper: compute ICA on the matricized tensor, then decompose the mixing vectors afterwards to obtain the trilinear decomposition.

19 Does it make sense to add constraints? Con: uniqueness. Pros: robustness; more identifiable if the constraints make sense; see the results.

20 Overview: What is ICA? What is CPA? Why combine ICA and CPA? Our algorithm. Results. Conclusion.

21 We developed a new algorithm that simultaneously imposes the independence and the trilinear constraint: Y = a_1 ∘ b_1 ∘ s_1 + … + a_R ∘ b_R ∘ s_R + E.

22 ICA-CPA: compute the fourth-order cumulant tensor; compute the 'eigenmatrices' of this tensor, yielding a 3rd-order tensor; add a slice with the covariance matrix to this tensor. This tensor has a 3rd-order CPA structure.

23 ICA-CPA (continued): compute the fourth-order cumulant tensor and the 'eigenmatrices' of this tensor, yielding a 3rd-order tensor with a 3rd-order CPA structure. When the mixing matrix has a bilinear structure (each mixing vector has a Khatri-Rao structure), this tensor can be rewritten as a 5th-order tensor with CPA structure.

24 How to compute the 5th-order CPA? Plain ALS breaks the symmetry, and simulations showed bad performance. Instead, take the partial symmetry into account; it is naturally preserved in a line-search scheme: search directions between the current estimate and the ALS update, with the step size found by rooting a real polynomial of degree 10.

25 Overview: What is ICA? What is CPA? Why combine ICA and CPA? Our algorithm. Results. Conclusion.

26 Application in fMRI? [Stegeman, 2007]: CPA on fMRI is comparable to tensor pICA if the correct number of components is chosen. [Daubechies, 2009]: ICA (Infomax) on fMRI works because of sparsity rather than independence.

27 Application in telecommunications. We consider narrow-band sources received by a uniform circular array (UCA) of I identical sensors of radius P, assuming free-space propagation. The entries of A represent the gain between a transmitter and an antenna. We generated BPSK user signals: all source distributions are binary (+1 or -1), with equal probability of both values. B contains the chips of the spreading codes of the different users.
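This setup yields exactly a CPA-structured data tensor. A sketch with invented dimensions (5 antennas, 3 users, 8 chips, 1000 symbols; the sizes used in the actual simulations may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ant, R, n_chip, n_sym = 5, 3, 8, 1000
# BPSK symbols: binary (+1 or -1), both values equally probable
S = rng.choice([-1.0, 1.0], size=(n_sym, R))
# B: chips of the spreading code of each user
B = rng.choice([-1.0, 1.0], size=(n_chip, R))
# A: complex gains between each transmitter and each antenna of the array
A = rng.standard_normal((n_ant, R)) + 1j * rng.standard_normal((n_ant, R))
# Noiseless received data: Y[i, j, k] = sum_r A[i, r] B[j, r] S[k, r],
# i.e. a rank-R CPA model over (antenna x chip x symbol)
Y = np.einsum('ir,jr,kr->ijk', A, B, S)
```

The BPSK sources are non-Gaussian and mutually independent, so in this scenario both the trilinear structure and the independence assumption of ICA-CPA hold by construction.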

28 Simulation results (figures): well-conditioned mixture; mixture of dimensions (5, 2, 1000); rank overestimated; colored noise.

29 Results (figure): the independence constraint outperforms the orthogonality constraint.

30 Conclusion. We developed a new algorithm, ICA-CPA, that imposes both independence and trilinear constraints simultaneously. We showed that the method outperforms both standard ICA and CPA in certain situations. It should only be used when the assumptions are validated.
