Exercise 1
Submission: Monday 19 Dec, 2010. Late submission: 4 points deducted per week.

1. How would you efficiently calculate the PCA of data where the dimensionality d is much larger than the number of vector observations n? Give the equation and explain.
2. Download the EEG data that appears next to the exercise. It contains frames of EEG recordings from three electrodes. There are two class labels, organized in two files.
3. Extract the principal components from the data; compare scatter plots of the original data with plots of the data projected onto the principal components; plot the eigenvalues. Projections onto which principal components are most correlated with the class labels?
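For question 1, the standard trick: with a centered data matrix X of size d x n, the covariance C = (1/n) X X^T is d x d and too large to decompose directly, but the n x n Gram matrix X^T X has the same non-zero eigenvalues, and if X^T X v = lambda v then u = X v satisfies X X^T u = lambda u. A minimal numpy sketch of this (the function name and array shapes are my own choices, not part of the exercise):

    import numpy as np

    def pca_small_n(X, k):
        """PCA for d >> n via the n x n Gram matrix.
        X: (d, n) array, one observation per column. Returns the top-k
        principal directions (d, k) and the covariance eigenvalues."""
        Xc = X - X.mean(axis=1, keepdims=True)   # center each dimension
        G = Xc.T @ Xc                            # n x n Gram matrix, cheap since n << d
        evals, V = np.linalg.eigh(G)             # eigenvalues in ascending order
        idx = np.argsort(evals)[::-1][:k]        # keep the k largest
        evals, V = evals[idx], V[:, idx]
        U = Xc @ V / np.sqrt(evals)              # back to d dimensions, unit-norm eigenvectors
        return U, evals / Xc.shape[1]            # eigenvalues of (1/n) Xc Xc^T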

Ex1. Part 2
Submit by e-mail, subject: Ex1 NC and last names.
Additional information about the data:
1. There are four groups in the data; you can seek separation between ESP and EHP, and between ESR and EHR. I suggest that you try both and see where the separation is better.

Ex1. Part 2 (cont.)
1. Given high-dimensional data, is there a way to know whether all possible projections of the data are Gaussian? Explain. What if there is some additive Gaussian noise?
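One practical probe (my suggestion, not part of the exercise): checking every projection is impossible, but since any projection w^T x of a Gaussian vector is Gaussian, strong non-Gaussianity along any sampled direction rules the Gaussian hypothesis out. A sketch that scores many random unit directions by excess kurtosis:

    import numpy as np
    from scipy.stats import kurtosis

    def max_abs_kurtosis(X, n_proj=2000, seed=0):
        """X: (n, d) data, rows are observations. For Gaussian data every
        1-D projection has excess kurtosis near 0; a large value in some
        direction is evidence of non-Gaussian structure."""
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((X.shape[1], n_proj))
        W /= np.linalg.norm(W, axis=0)              # unit-norm random directions
        Y = X @ W                                   # (n, n_proj) projected samples
        return np.max(np.abs(kurtosis(Y, axis=0)))  # Fisher (excess) kurtosis

Note that additive Gaussian noise pulls such statistics toward their Gaussian values, which is what the follow-up question is probing.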

Ex1. (cont.)
2. Use FastICA (easily found on Google): …e/dlcode.html
– Get the data from the Web site.
– Apply FastICA to de-mix (try the different non-linear functions).
– Devise ways to demonstrate that the new data are more independent than the original. How do you tell which non-linearity is best? (Show this graphically as well.)
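If you prefer Python to the MATLAB package, scikit-learn ships a FastICA with the same three contrast non-linearities; a sketch of the comparison loop (the data file name is hypothetical, and the dependence score, mean absolute correlation between squared components, is one crude choice among many):

    import numpy as np
    from sklearn.decomposition import FastICA

    def dependence_score(S):
        """Mean |correlation| between squared components; independent
        sources drive even these nonlinear cross-moments toward zero."""
        C = np.corrcoef((S ** 2).T)                  # components as variables
        off_diag = C[~np.eye(C.shape[0], dtype=bool)]
        return np.mean(np.abs(off_diag))

    X = np.loadtxt("mixed_signals.txt")              # hypothetical name for the course data
    print("mixed:", dependence_score(X))
    for fun in ("logcosh", "exp", "cube"):           # the three FastICA contrasts
        S = FastICA(n_components=X.shape[1], fun=fun, random_state=0).fit_transform(X)
        print(fun, dependence_score(S))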

Ex1. (cont.)
3. Create a BCM learning rule which can go into the FastICA algorithm of Hyvärinen.
– Run it on the previous data.
– Explain the results.
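One way to read this task (my interpretation, not the official solution): the one-unit FastICA fixed point w <- E[x g(w^T x)] - E[g'(w^T x)] w only needs a non-linearity g and its derivative, so a BCM-flavoured choice g(y) = y(y - theta) with sliding threshold theta = E[y^2] can be dropped in. A sketch, assuming whitened input:

    import numpy as np

    def fastica_bcm(X, n_iter=200, tol=1e-6, seed=0):
        """One-unit FastICA fixed point with a BCM-style non-linearity.
        X: (d, n) whitened data (zero mean, identity covariance).
        g(y) = y*(y - theta), g'(y) = 2*y - theta, theta = E[y^2]."""
        rng = np.random.default_rng(seed)
        d, n = X.shape
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            y = w @ X                                # current projections, shape (n,)
            theta = np.mean(y ** 2)                  # BCM sliding threshold
            g, g_prime = y * (y - theta), 2 * y - theta
            w_new = (X @ g) / n - np.mean(g_prime) * w   # fixed-point update
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1) < tol:        # converged up to a sign flip
                return w_new
            w = w_new
        return w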