Neuroinformatics
Aapo Hyvärinen, Professor, Group Leader

Presentation transcript:


Slide 2: Neuroinformatics Group: Members
- Aapo Hyvärinen, professor, leader
- Patrik Hoyer, academy research fellow, co-leader
- Michael Gutmann, post-doc
- Ilmari Kurki, post-doc
- Kun Zhang, post-doc (11/ /2009)
- Jun-ichiro Hirayama, visiting post-doc (1-12/2010)
Graduated PhD students: Urs Köster (12/2009), Jussi Lindgren (12/2008)
Current PhD students: Doris Entner, Antti Hyttinen, Miika Pihlaja, Jouni Puuronen

Slide 3: Neuroinformatics Group: Projects
- Natural image statistics: build probabilistic models of natural images to model biological vision.
- Computational estimation theory: computationally efficient estimation of probabilistic models (unnormalized models or latent variable models).
- Brain imaging data analysis: finding sources and their interactions in EEG/MEG data.
- Causal analysis (talk by D. Entner): analyze which variables are causes and which are effects, using non-Gaussian Bayesian networks / structural equation models.

Slide 4: Natural Image Statistics
The first book on the subject, published in June 2009. A combined textbook/monograph; a free preprint is available on the web.

Slide 5: Computational estimation theory
New intuitive principle: train a classifier to distinguish between your data and artificial noise.
If we use logistic regression with log p(x|θ) as the regression function (AISTATS 2010, poster on display), this:
- provides a consistent (convergent) estimator for θ;
- works directly for unnormalized models;
- generalizes to a family which includes normalization using importance sampling (submitted);
- is useful for complex models of natural images, e.g. with 3 layers (COSYNE 2010 poster on display).
[Figure: MRF filters estimated from natural images]
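The principle on this slide (published by the group as noise-contrastive estimation) can be sketched in a few lines. A minimal illustration, assuming a one-dimensional Gaussian whose precision and log normalizing constant are both treated as free parameters of an unnormalized model; all numerical settings here are illustrative, not from the actual papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data: zero-mean Gaussian with true precision 0.25 (variance 4).
n = 20000
x = rng.normal(0.0, 2.0, n)
# Artificial noise: a Gaussian whose density we know exactly.
noise_var = 4.0
y = rng.normal(0.0, np.sqrt(noise_var), n)

def log_pn(u):
    # Known log-density of the noise distribution.
    return -0.5 * np.log(2 * np.pi * noise_var) - u ** 2 / (2 * noise_var)

def log_pm(u, prec, c):
    # Unnormalized model: c is a free parameter standing in for the
    # (unknown) log normalizing constant.
    return c - 0.5 * prec * u ** 2

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Logistic regression with the log-density ratio as regression function,
# fitted by plain gradient ascent (the objective is concave in (prec, c)).
prec, c = 1.0, 0.0
lr = 0.02
for _ in range(8000):
    g_x = log_pm(x, prec, c) - log_pn(x)   # log-ratio on data points
    g_y = log_pm(y, prec, c) - log_pn(y)   # log-ratio on noise points
    w_x = 1.0 - sigmoid(g_x)               # gradient weights on data
    w_y = sigmoid(g_y)                     # gradient weights on noise
    d_prec = np.mean(w_x * (-0.5 * x ** 2)) - np.mean(w_y * (-0.5 * y ** 2))
    d_c = np.mean(w_x) - np.mean(w_y)
    prec += lr * d_prec
    c += lr * d_c

true_c = -0.5 * np.log(2 * np.pi * 4.0)    # true log normalizer, for comparison
print(prec, c, true_c)
```

The point the slide makes is visible here: the normalizing constant is estimated as just another parameter c, so the model density never has to be normalized analytically.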

Slide 6: Brain imaging data analysis: the inverse problem in EEG/MEG
Linear inverse problem: x = As, with dim(x) << dim(s). The matrix A is known from physics.
[Figure: MEG data (x) and estimated activity (s); Uutela, Hämäläinen, Somersalo, 1999]
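A standard way to solve such an underdetermined system x = As is a Tikhonov-regularized minimum-norm estimate. The sketch below uses toy dimensions and an arbitrary regularization value, not an actual MEG lead field:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions: far fewer sensors (rows of A) than sources (columns),
# mimicking dim(x) << dim(s).
n_sensors, n_sources = 30, 500
A = rng.normal(size=(n_sensors, n_sources))   # "lead field", known from physics

s_true = np.zeros(n_sources)
s_true[[40, 250, 400]] = [2.0, -1.5, 1.0]     # a few active sources
x = A @ s_true + 0.01 * rng.normal(size=n_sensors)

# Regularized minimum-norm estimate:
#   s_hat = A^T (A A^T + lam I)^{-1} x
lam = 1.0
s_hat = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(n_sensors), x)

# The estimate reproduces the measurements, but spreads energy over many
# sources -- which is exactly why the next slides ask for further analysis.
print(np.linalg.norm(A @ s_hat - x))
```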

Slide 7: Beyond the inverse problem
The inverse solution transforms a 306 x 100,000 matrix into a 10,000 x 100,000 matrix, which is not easy to understand. We need data analysis methods to understand its content; something like ICA should help. Actually, does ICA solve something like an inverse problem?

Slide 8: Blind source separation for MEG
- Improve BSS by taking short-time Fourier transforms as preprocessing (Hyvärinen, Ramkumar, Parkkonen, Hari, NeuroImage, 2010). This takes into account the oscillatory nature of the data.
- A spatial ICA using a basic inverse-problem solver: after the inverse solution there are many variables, so take the transpose of the data matrix, as with fMRI, and force independence of the spatial patterns, not of the time courses.
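A toy version of the first idea: mix two amplitude-modulated oscillations, take short-time Fourier transforms of each channel, and run ICA on the stacked spectral coefficients, where bursty rhythms become sparse and hence strongly non-Gaussian. For simplicity this sketch applies a real-valued FastICA to the real and imaginary parts separately, rather than the complex-valued ICA of the paper; all window sizes and frequencies are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two oscillatory sources with bursty amplitude envelopes (as in MEG rhythms).
t = np.arange(60000)
env1 = np.abs(rng.laplace(size=t.size // 200)).repeat(200)
env2 = np.abs(rng.laplace(size=t.size // 200)).repeat(200)
S = np.vstack([env1 * np.sin(2 * np.pi * 0.013 * t),
               env2 * np.sin(2 * np.pi * 0.031 * t)])

A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing ("sensor" data)
X = A @ S

# Preprocessing: short-time Fourier transform of each channel, with all
# windows stacked as samples; real and imaginary parts kept separately.
win = 200
frames = X[:, : (X.shape[1] // win) * win].reshape(2, -1, win)
F = np.fft.rfft(frames, axis=2)
Z = np.concatenate([F.real, F.imag], axis=1).reshape(2, -1)
Z = Z - Z.mean(axis=1, keepdims=True)

# Whitening, then symmetric FastICA with a tanh nonlinearity.
d, E = np.linalg.eigh(np.cov(Z))
Zw = E @ np.diag(d ** -0.5) @ E.T @ Z

W = rng.normal(size=(2, 2))
for _ in range(100):
    G = np.tanh(W @ Zw)
    W_new = G @ Zw.T / Zw.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)           # symmetric decorrelation
    W = U @ Vt

est = W @ Zw                                  # recovered spectral-domain sources
```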

Slide 9: Combining inverse modelling with ICA
Deep question: what is the connection between ICA and inverse problems? In both, x = As with x the observed data. In ICA, A is square and unknown; in the inverse problem, A is known but has many more columns than rows. Two ideas we are working on:
- combine inverse modelling with ICA by constructing independent components in cortical space;
- use a prior on the matrix A to make the sources localized on the cortex.

Slide 10: Connectivity analysis
After separating the sources, analyze their interactions. Are connections simply correlations? We can find the directions of influence using time structure or non-Gaussianity (causal inference). Clinical applications: schizophrenia, Alzheimer's disease, etc.
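The non-Gaussianity idea for finding directions can be illustrated with a toy stand-in (a simplification, not the group's actual method): regress in both directions and check whether the residual is independent of the presumed cause. Linear regression makes the residual uncorrelated with the regressor in either direction, but with non-Gaussian data a higher-order dependence survives in the wrong causal direction:

```python
import numpy as np

rng = np.random.default_rng(3)

# Ground truth: x causes y, with non-Gaussian (uniform) disturbances.
n = 50000
x = rng.uniform(-np.sqrt(3), np.sqrt(3), n)          # unit variance, non-Gaussian
y = x + rng.uniform(-np.sqrt(3), np.sqrt(3), n)

def dependence(cause, effect):
    """Regress effect on cause and measure a higher-order dependence
    between the cause and the residual via squared values."""
    b = np.cov(cause, effect)[0, 1] / np.var(cause)
    r = effect - b * cause
    return abs(np.corrcoef(cause ** 2, r ** 2)[0, 1])

forward = dependence(x, y)    # residual independent of x: near zero
backward = dependence(y, x)   # residual depends on y: clearly nonzero
print(forward, backward)
```

The asymmetry between the two scores is what identifies x as the cause; for Gaussian data both scores would vanish and the direction would be unidentifiable.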

Slide 11: Towards two-person neuroscience
Measure brain activity in two subjects while they are interacting, and find the connectivity between the two subjects. This is a completely new field, and data analysis method development is definitely needed. It is a collaborative project with Riitta Hari in the Computational Sciences program of the Academy of Finland, and also the title of her ERC Advanced Investigator grant. Two post-docs are starting in September.

Slide 12: Vision
Probabilistic methods with an emphasis on computational aspects, at the interface between informatics and statistics. The work is typically unsupervised learning: discovery of hidden components, connections, etc. We also need an abstract theory of computationally efficient estimation methods. There are applications in many areas; we have special expertise in neuroscience, and the brain imaging project is going to be important in the future. We are moving towards more application-inspired research.