Particle Filtering in MEG: from single dipole filtering to Random Finite Sets
A. Sorrentino, CNR-INFM LAMIA, Genova
Methods for Image and Data Analysis, www.dima.unige.it/~piana/mida/group.html


Co-workers
Genova group: Cristina Campi (Math. Dep.), Annalisa Pascarella (Comp. Sci. Dep.), Michele Piana (Math. Dep.)
Long-time collaboration: Lauri Parkkonen (Brain Research Unit, LTL, Helsinki)
Recent collaboration: Matti Hamalainen (MEG Core Lab, Martinos Center, Boston)

Basics of MEG modeling
The measured magnetic field is given by the Biot-Savart law applied to the total current, i.e. the neural (primary) current plus the ohmic (volume) term; the electric potential driving the volume currents satisfies a Poisson equation, whose solution requires an accurate model of brain conductivity.
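A minimal LaTeX sketch of the quasi-static forward model the slide refers to (notation assumed here: J_p primary neural current, sigma conductivity, V electric potential):

\[
\mathbf{B}(\mathbf{r}) = \frac{\mu_0}{4\pi} \int \mathbf{J}(\mathbf{r}') \times \frac{\mathbf{r}-\mathbf{r}'}{|\mathbf{r}-\mathbf{r}'|^{3}} \, d\mathbf{r}', \qquad
\mathbf{J} = \mathbf{J}_p - \sigma \nabla V, \qquad
\nabla \cdot (\sigma \nabla V) = \nabla \cdot \mathbf{J}_p .
\]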

Two approaches to MEG source modeling
- Imaging approach: model = continuous current distribution; unknown of large dimension N; method = regularization methods.
- Parametric approach: model = focal currents (dipoles); small number M of unknowns; method = non-linear optimization methods.

Automatic current dipole estimation
Common methods: manual dipole modeling and automatic dipole modeling. Both estimate the number of sources and the source locations, and use least squares for the source strengths. Manual dipole modeling is still the main reference method for comparisons (Stenbacka et al. 2002, Liljestrom et al. 2005).
Common approximations used to solve this problem: the number of sources is kept constant and the source locations are fixed. Bayesian filtering allows these limitations to be overcome.

Bayesian filtering in MEG - assumptions
Two stochastic processes: the neural currents J_1, J_2, ..., J_t, ... and the measurements B_1, B_2, ..., B_t, ...
Markovian assumptions (our actual model): the current sequence is a Markov process; each measurement depends only on the current at the same time instant (instantaneous propagation); the measurements do not influence the currents (no feedback).
The final aim: estimating the current J_t from the measurements B_1, ..., B_t.
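In symbols, a sketch of these assumptions (p denotes probability densities; the shorthand J_{1:t} = J_1, ..., J_t is assumed here):

\[
p(\mathbf{J}_t \mid \mathbf{J}_{1:t-1}) = p(\mathbf{J}_t \mid \mathbf{J}_{t-1}), \qquad
p(\mathbf{B}_t \mid \mathbf{J}_{1:t}, \mathbf{B}_{1:t-1}) = p(\mathbf{B}_t \mid \mathbf{J}_t),
\]
and the final aim is the filtering density \( p(\mathbf{J}_t \mid \mathbf{B}_{1:t}) \).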

Bayesian filtering in MEG – key equations
The recursion alternates an "evolution" (prediction) step, governed by the transition kernel, and an "observation" (update) step, governed by the likelihood function; the estimates are then computed from the filtering densities at each time point.
Linear-Gaussian model: Kalman filter. Non-linear model: particle filter.
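A sketch of the standard filtering recursion behind these key equations (transition kernel \( p(\mathbf{J}_t \mid \mathbf{J}_{t-1}) \), likelihood \( p(\mathbf{B}_t \mid \mathbf{J}_t) \)):

\[
p(\mathbf{J}_t \mid \mathbf{B}_{1:t-1}) = \int p(\mathbf{J}_t \mid \mathbf{J}_{t-1}) \, p(\mathbf{J}_{t-1} \mid \mathbf{B}_{1:t-1}) \, d\mathbf{J}_{t-1}, \qquad
p(\mathbf{J}_t \mid \mathbf{B}_{1:t}) \propto p(\mathbf{B}_t \mid \mathbf{J}_t) \, p(\mathbf{J}_t \mid \mathbf{B}_{1:t-1}).
\]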

Particle filtering of current dipoles
The key idea: sequential Monte Carlo sampling in the single-dipole space. Draw random samples ("particles") from the prior, update the particle weights with the likelihood of the new measurement, then resample and let the particles evolve.
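Below is a minimal Python sketch of this sequential importance resampling loop, for a single dipole parametrized only by its position. The toy forward model, the uniform prior, the random-walk transition kernel, and all parameter values are illustrative assumptions, not the implementation used in the work presented here.

    import numpy as np

    rng = np.random.default_rng(0)

    def forward(pos, sensors):
        # Toy forward model: signal decays with squared distance to each sensor.
        # A real MEG application would use the Biot-Savart-based lead field instead.
        d = np.linalg.norm(sensors - pos, axis=1)
        return 1.0 / (d ** 2 + 0.1)

    def particle_filter(measurements, sensors, n_particles=1000,
                        noise_std=0.05, step_std=0.02):
        # Draw the initial particles from the prior (uniform in a cube here).
        particles = rng.uniform(-1.0, 1.0, size=(n_particles, 3))
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates = []
        for b in measurements:
            # Update the weights with the Gaussian likelihood of the new data.
            predicted = np.array([forward(p, sensors) for p in particles])
            log_lik = -0.5 * np.sum((b - predicted) ** 2, axis=1) / noise_std ** 2
            weights = weights * np.exp(log_lik - log_lik.max())
            weights /= weights.sum()
            # Point estimate: weighted mean of the particle positions.
            estimates.append(weights @ particles)
            # Resample, then let the particles evolve (random-walk transition kernel).
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx] + rng.normal(0.0, step_std, size=(n_particles, 3))
            weights = np.full(n_particles, 1.0 / n_particles)
        return np.array(estimates)

    # Usage on synthetic data from a single static source.
    sensors = rng.uniform(-1.0, 1.0, size=(10, 3))
    true_pos = np.array([0.2, -0.1, 0.3])
    data = [forward(true_pos, sensors) + rng.normal(0.0, 0.05, size=10)
            for _ in range(20)]
    print(particle_filter(data, sensors)[-1])  # final position estimate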

A 2D example – the data

A 2D example – the particles

The full 3D case – auditory stimuli. Sorrentino et al., ICS 1300 (2007).

Comparison with beamformers and RAP-MUSIC – two quasi-correlated sources. Beamformers suffer from suppression of correlated sources. Pascarella et al., ICS 1300 (2007); Sorrentino et al., J. Phys. Conf. Ser. 135 (2008).

Comparison with beamformers and RAP-MUSIC – two orthogonal sources. RAP-MUSIC recovers a wrong source orientation and a wrong source waveform. Pascarella et al., ICS 1300 (2007); Sorrentino et al., J. Phys. Conf. Ser. 135 (2008).

Rao-Blackwellization
Can we exploit the linear substructure of the model? The linear part (the dipole moment) is solved analytically with a Kalman filter, while the non-linear part is sampled with the particle filter. This yields accurate results with far fewer particles and increased statistical efficiency (reduced variance of the importance weights), at the price of an increased computational cost.
Campi et al., Inverse Problems (2008); Sorrentino et al., J. Phys. Conf. Ser. (2008).
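A sketch of the factorization underlying this idea, with r_t the dipole location and q_t the dipole moment (notation assumed here, not taken from the slide):

\[
p(\mathbf{r}_{1:t}, \mathbf{q}_t \mid \mathbf{B}_{1:t}) =
p(\mathbf{q}_t \mid \mathbf{r}_{1:t}, \mathbf{B}_{1:t}) \; p(\mathbf{r}_{1:t} \mid \mathbf{B}_{1:t}),
\]
where the first factor is Gaussian and is computed analytically by a Kalman filter (the data are linear in the moment once the location is fixed), and the second factor is approximated by the particle filter.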

Bayesian filtering with multiple dipoles
A collection of spaces (single-dipole space D, double-dipole space, ...) and a collection of posterior densities, one on each space. All spaces (up to ...) are explored with particles: one particle may represent one dipole, two dipoles, or three dipoles. Reversible jumps (Green 1995) move particles from one space to another; a sketch of the acceptance rule is given below.
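For reference, a sketch of the generic reversible-jump acceptance probability (Green 1995) for a proposed move from a state x to a state x' of different dimension:

\[
\alpha = \min\left\{ 1, \; \frac{\pi(x')\, g'(u')}{\pi(x)\, g(u)} \,
\left| \frac{\partial (x', u')}{\partial (x, u)} \right| \right\},
\]
where \( \pi \) is the target posterior, \( g \) and \( g' \) are the densities of the auxiliary variables used to match dimensions, and the move-type probabilities are omitted for brevity.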

Random Finite Sets – why
Non-uniqueness of vector representations of multi-dipole states: (dipole_1, dipole_2) and (dipole_2, dipole_1) are the same physical state but different points in D x D. Consequence: a multi-modal posterior density, with a non-unique maximum and a non-representative mean.
Let (Ω, σ(Ω), P) be a probability space. A random finite set of dipoles is a measurable function from Ω to the set of all finite subsets of the single-dipole space D, equipped with the Mathéron topology. Different realizations may contain different numbers of dipoles (possibly none).
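In symbols, a sketch of the definition (the symbols Ξ and F(D) are assumed here):

\[
\Xi : \Omega \to \mathcal{F}(D), \qquad
\mathcal{F}(D) = \{ X \subset D : |X| < \infty \},
\]
a measurable map from the probability space (Ω, σ(Ω), P) to the finite subsets of the single-dipole space D, with F(D) endowed with the Mathéron topology.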

Random Finite Sets - how
The probability measure of an RFS has a conceptual definition; in practice a belief measure is used instead of a probability measure, and multi-dipole belief measures can be derived from single-dipole probability measures. The Probability Hypothesis Density (PHD) is the RFS analogue of the conditional mean: the integral of the PHD over a volume equals the expected number of dipoles in that volume, and the peaks of the PHD provide estimates of the dipole parameters. Model order selection: the number of sources is estimated dynamically.
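A sketch of the defining property of the PHD (the symbol D_t for the PHD is assumed here):

\[
\int_S D_t(\mathbf{x}) \, d\mathbf{x} = \mathbb{E}\big[ \, |\Xi_t \cap S| \, \big]
\quad \text{for any region } S \subseteq D,
\]
i.e. the integral over S gives the expected number of dipoles of the random set Ξ_t lying in S.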

RFS-based particle filter: results (Sorrentino et al., Human Brain Mapping, 2009)
Monte Carlo simulations: data sets with random source locations (mutual distance > 2 cm), always the same temporal waveforms, 2 time-correlated sources, peak-SNR between 1 and 20.
Results: 75% of the sources recovered (within 2 cm); average localization error of 6 mm, independent of the SNR; the temporal correlation affects detectability only slightly.

RFS-based particle filter: results (Sorrentino et al., Human Brain Mapping, 2009)
Comparison with manual dipole modeling. Data: 10 sources mimicking a complex visual activation. The particle filter performed on average like manual dipole modeling carried out by uninformed users (on average 6 out of 10 sources correctly recovered).

In progress: source space limited to the cortical surface; two simulated sources.

In progress: two sources recovered with the orientation constraint; only one source recovered without the orientation constraint.

References
- Sorrentino A., Parkkonen L., Pascarella A., Campi C. and Piana M. Dynamical MEG source modeling with multi-target Bayesian filtering. Human Brain Mapping 30: 1911-1921 (2009).
- Sorrentino A., Pascarella A., Campi C. and Piana M. A comparative analysis of algorithms for the magnetoencephalography inverse problem. Journal of Physics: Conference Series 135 (2008).
- Sorrentino A., Pascarella A., Campi C. and Piana M. Particle filters for the magnetoencephalography inverse problem: increasing the efficiency through a semi-analytic approach (Rao-Blackwellization). Journal of Physics: Conference Series 124 (2008).
- Campi C., Pascarella A., Sorrentino A. and Piana M. A Rao-Blackwellized particle filter for magnetoencephalography. Inverse Problems 24 (2008).
- Sorrentino A., Parkkonen L. and Piana M. Particle filters: a new method for reconstructing multiple current dipoles from MEG data. International Congress Series 1300 (2007).