MCMC Inference over Latent Diffeomorphisms

MCMC Inference over Latent Diffeomorphisms Angel Yu

Diffeomorphisms
A space of transformations on manifolds. These transformations are:
- Bijective
- Differentiable
- Inverse also differentiable

CPAB Transformation
Obtained by integrating a Continuous Piecewise-Affine (CPA) velocity field over time t.
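The talk's implementation is in Julia and works in 2D; as a rough illustration only, here is a hypothetical 1D Python sketch that integrates a piecewise-affine velocity field with fixed-step Euler integration (the real CPAB construction uses the closed-form solution within each cell, so this is an approximation):

```python
from bisect import bisect_right

def cpab_transform_1d(x, A, cell_edges, T=1.0, n_steps=100):
    """Integrate a piecewise-affine velocity field over time T.

    A[c] = (a, b) gives the velocity v(x) = a*x + b inside cell c;
    cell_edges are the (sorted) tessellation boundaries.
    """
    dt = T / n_steps
    x = float(x)
    for _ in range(n_steps):
        c = bisect_right(cell_edges, x) - 1   # which cell x is currently in
        c = min(max(c, 0), len(A) - 1)        # clamp to the tessellation
        a, b = A[c]
        x = x + dt * (a * x + b)              # forward-Euler step
    return x

# Constant unit velocity in both cells moves a point by T = 1
print(cpab_transform_1d(0.0, [(0.0, 1.0), (0.0, 1.0)], [0.0, 0.5, 1.0]))
```

With a zero velocity field the map is the identity; continuity of the velocity field across cell boundaries is what makes the resulting map a diffeomorphism.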

2D CPAB Transformation
Tessellate the domain into triangles.
Example CPAB transformation: 64 triangles => 58 parameters

Parallel Computation of CPAB
The computation of a CPAB transformation is independent at each point, so it is embarrassingly parallel over the points. For a 2D CPAB transformation on a 512x512 image, the computation time is fast enough that the communication overhead is significant.

Infer the Underlying Transformation
Given two images and correspondences between them, infer the parameters 𝜃 of the underlying transformation by minimizing a least-squares objective over the corresponding points (xᵢ, yᵢ):
f(𝜃) = Σᵢ ‖T𝜃(xᵢ) − yᵢ‖²
Using:
- Gradient Descent
- Metropolis' Algorithm
- Particle Filter

Gradient Descent
A common optimization algorithm:
- Start with an initial 𝜃₀
- At each iteration 𝑖, take 𝜃ᵢ₊₁ = 𝜃ᵢ − 𝛾𝑓′(𝜃ᵢ)
- Easy to get stuck in local minima
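As a minimal sketch (Python for illustration; the talk's code is in Julia), the update above on a toy one-dimensional objective, with gamma as the step size:

```python
def gradient_descent(grad, theta0, gamma=0.1, n_iters=100):
    """Iterate theta_{i+1} = theta_i - gamma * f'(theta_i)."""
    theta = theta0
    for _ in range(n_iters):
        theta = theta - gamma * grad(theta)
    return theta

# Toy example: f(theta) = (theta - 3)^2, so f'(theta) = 2*(theta - 3)
theta_hat = gradient_descent(lambda t: 2.0 * (t - 3.0), theta0=0.0)
print(theta_hat)  # converges to the minimum at 3
```

On a non-convex objective the result depends on the initial 𝜃₀, which is exactly the local-minimum issue.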

Metropolis' Algorithm
An MCMC sampling algorithm used to sample from a target distribution 𝑃(𝑥):
- Start off with an initial sample 𝑥₀
- At each iteration:
  - Sample 𝑥′ from a proposal distribution 𝑄(𝑥′|𝑥ᵢ), usually a Gaussian centered at 𝑥ᵢ
  - Calculate 𝛼 = 𝑃(𝑥′)/𝑃(𝑥ᵢ)
  - Accept 𝑥′ with probability min(1, 𝛼) and set 𝑥ᵢ₊₁ = 𝑥′
  - Otherwise reject and set 𝑥ᵢ₊₁ = 𝑥ᵢ
To use this to minimize our objective function 𝑓 we take the target 𝑃(𝜃) ∝ exp(−𝑓(𝜃)).
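A minimal Python sketch of the algorithm (illustrative only; it assumes the target is proportional to exp(−f) for an objective f, and works in log space to avoid underflow):

```python
import math
import random

def metropolis(log_p, x0, n_iters=5000, sigma=0.5, seed=0):
    """Metropolis sampling with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x, lp = x0, log_p(x0)
    samples = []
    for _ in range(n_iters):
        x_prop = x + rng.gauss(0.0, sigma)   # Q(x'|x_i): Gaussian at x_i
        lp_prop = log_p(x_prop)
        # accept with probability min(1, P(x')/P(x_i)), computed in log space
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = x_prop, lp_prop
        samples.append(x)                    # a rejected move repeats x_i
    return samples

# Target proportional to exp(-f) with f(x) = (x - 2)^2
samples = metropolis(lambda x: -(x - 2.0) ** 2, x0=0.0)
```

After burn-in, the samples concentrate around the minimizer of f.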

Metropolis Algorithm Results Takes 91s for 50,000 iterations

Particle Filter
A sequential Monte Carlo method, primarily used when the target 𝑃(𝑥) changes over time:
- Start off with 𝑛 particles randomly sampled from the prior
- At each iteration:
  - Calculate the weight of each particle, proportional to its probability under 𝑃
  - Resample from the current particles according to the distribution of weights
  - Perturb each particle using a perturbation distribution, usually a Gaussian centered at the particle
In our case, each particle 𝜃 gets weight proportional to exp(−𝑓(𝜃)).
Allows easy parallelization over the number of particles.
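Sketched in Python (illustrative; it assumes weights proportional to exp(−f) for an objective f, a uniform prior on [−5, 5], and a fixed perturbation width):

```python
import math
import random

def particle_filter(f, n_particles=500, n_iters=50, sigma=0.2, seed=0):
    """Resample-perturb loop with weights proportional to exp(-f)."""
    rng = random.Random(seed)
    # start with particles sampled from a (here: uniform) prior
    particles = [rng.uniform(-5.0, 5.0) for _ in range(n_particles)]
    for _ in range(n_iters):
        # weight each particle proportionally to its probability
        weights = [math.exp(-f(p)) for p in particles]
        # resample according to the distribution of weights
        particles = rng.choices(particles, weights=weights, k=n_particles)
        # perturb each particle with a Gaussian centered at it
        particles = [p + rng.gauss(0.0, sigma) for p in particles]
    return particles

# Objective f(theta) = (theta - 1.5)^2: particles cluster near 1.5
particles = particle_filter(lambda x: (x - 1.5) ** 2)
```

The weighting and perturbation steps are independent per particle, which is what makes the method easy to parallelize.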

Particle Filter Results

Particle Filter Timings
- Time required for 200 iterations on 16x16 correspondences
- Time required for each iteration on 512x512 correspondences

Summary
- Implemented the calculation of CPAB transformations in Julia, both serial and parallel; works with both SharedArrays and DArrays
- Implemented Metropolis' Algorithm and the Particle Filter; both produced good results on the inference problem
- Almost linear speedup when using the parallel Particle Filter on large enough problems