Dimensionality reduction
Kenneth D. Harris, 24/6/15

Exploratory vs. confirmatory analysis

Exploratory analysis
- Helps you formulate a hypothesis
- End result is usually a nice-looking picture
- Any method is equally valid, because it just helps you think of a hypothesis

Confirmatory analysis
- Where you test your hypothesis
- Multiple ways to do it (classical, Bayesian, cross-validation)
- You have to stick to the rules

Inductive vs. deductive reasoning (K. Popper)

Principal component analysis
- Finds the directions of maximum variance in a data set
- These are the eigenvectors of the covariance matrix with the largest eigenvalues
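
A minimal numpy sketch of PCA by eigendecomposition of the covariance matrix; the toy data, shapes and variable names are illustrative assumptions, not from the slides:

    import numpy as np

    # Toy data: 200 observations x 5 variables (illustrative only)
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))

    Xc = X - X.mean(axis=0)            # centre each variable
    C = np.cov(Xc, rowvar=False)       # 5 x 5 covariance matrix
    evals, evecs = np.linalg.eigh(C)   # eigh: C is symmetric; eigenvalues ascending
    order = np.argsort(evals)[::-1]    # reorder by decreasing variance
    evals, evecs = evals[order], evecs[:, order]

    scores = Xc @ evecs[:, :2]         # project onto the top 2 components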

Relation to SVD
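
For centred data X with SVD X = U S V^T, the rows of V^T are the principal components and the covariance eigenvalues are s_i^2 / (n - 1). A small numpy check of this relation (the toy data are an assumption):

    import numpy as np

    rng = np.random.default_rng(0)
    Xc = rng.standard_normal((200, 5))
    Xc -= Xc.mean(axis=0)              # centre, as PCA requires
    n = Xc.shape[0]

    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Covariance eigenvalues (descending) equal s**2 / (n - 1),
    # and the rows of Vt are the principal components (up to sign)
    evals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
    assert np.allclose(evals, s**2 / (n - 1))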

PCA: auditory cortex population vectors
Bartho et al., EJN 2009

Discriminant analysis
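
A hedged sketch of linear discriminant analysis with scikit-learn; the data, labels and shapes below are hypothetical stand-ins for the population-vector example on the next slide:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Hypothetical stand-in data: trials x neurons, with a class label per trial
    rng = np.random.default_rng(0)
    X = rng.standard_normal((300, 40))
    y = rng.integers(0, 3, size=300)    # e.g. 3 stimulus classes

    # Projections chosen to maximally separate the classes
    lda = LinearDiscriminantAnalysis(n_components=2)
    Z = lda.fit_transform(X, y)

Unlike PCA, the projection depends on the class labels, which is why it can look completely different.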

Discriminant analysis: auditory cortex
- Projections chosen to maximally separate sustained responses
- Looks completely different to PCA!
Bartho et al., EJN 2009

Factor analysis
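
A minimal scikit-learn sketch of factor analysis; the data and the number of factors are illustrative assumptions:

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))   # toy data: observations x variables

    fa = FactorAnalysis(n_components=3)  # latent factors + per-variable noise
    Z = fa.fit_transform(X)              # factor scores
    loadings = fa.components_            # loadings (factors x variables)

Unlike PCA, factor analysis models a separate noise variance for each observed variable, so it is not just a rotation of the data.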

Canonical correlation analysis
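
A minimal scikit-learn sketch of canonical correlation analysis between two sets of variables; the toy data are assumptions:

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 8))                    # first variable set
    Y = X[:, :3] + 0.5 * rng.standard_normal((100, 3))   # correlated second set

    cca = CCA(n_components=2)
    Xs, Ys = cca.fit_transform(X, Y)     # paired projections, maximally correlated
    r1 = np.corrcoef(Xs[:, 0], Ys[:, 0])[0, 1]  # first canonical correlation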

Independent component analysis

ICA in practice
- FastICA algorithm
- Need to choose a measure of non-Gaussianity
- Do an SVD first! It will take less time and give better results (see the sketch below)
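
A sketch of this recipe with scikit-learn's FastICA, reducing to 12 SVD/PCA components first as the slide recommends; the toy data and component count are illustrative assumptions:

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 50))   # e.g. movie frames x pixels (toy data)

    # SVD/PCA first: faster and more stable than ICA on the raw data
    Z = PCA(n_components=12).fit_transform(X)

    # fun is the measure of non-Gaussianity ('logcosh', 'exp' or 'cube')
    ica = FastICA(n_components=12, fun='logcosh', max_iter=500, random_state=0)
    S = ica.fit_transform(Z)              # independent components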

Wide-field movie: SVD 1

Wide-field movie: SVD 2

Wide-field movie: SVD 3

IC 1 (from 12 SVDs)

IC 2 (from 12 SVDs)

IC 3 (from 12 SVDs)

IC 4 (from 12 SVDs)

IC 1 (from 100 SVDs)

IC 2 (from 100 SVDs)

Non-negative matrix factorization
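
A minimal scikit-learn sketch of non-negative matrix factorization; the toy data are an assumption (the factor count of 7 matches the slides that follow):

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    V = rng.random((500, 64))             # non-negative data, e.g. frames x pixels

    nmf = NMF(n_components=7, init='nndsvd', max_iter=500)
    W = nmf.fit_transform(V)              # non-negative time courses (frames x factors)
    H = nmf.components_                   # non-negative spatial factors (factors x pixels)

Both factors are constrained to be non-negative, which often gives parts-based, more interpretable components than PCA.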

Non-negative factor 1

Non-negative factor 2

Non-negative factor 3

Non-negative factor 4

Non-negative factor 5

Non-negative factor 6

Non-negative factor 7

Many more methods…
- jPCA: Churchland, Cunningham et al., Nature 2012
- Mante, Sussillo et al., Nature 2013

Summary
- There are lots of methods for doing dimensionality reduction
- THEY ARE EXPLORATORY ANALYSES
- Different methods will often give you different results
- Use them: they might help you formulate a hypothesis
- Then do a confirmatory analysis. These usually do not use dimensionality reduction.