Slides for Sampling from Posterior of Shape


Sampling from Posterior of Shape
MURI Kickoff Meeting, Randolph L. Moses, November 3, 2008
MURI: Integrated Fusion, Performance Prediction, and Sensor Management for Automatic Target Exploitation

Shape Analysis (Fan, Fisher & Willsky)

Topics:
- Shape analysis (Fan, Fisher & Willsky)
- MCMC sampling from shape priors/posteriors

MCMC Sampling from Shape Posteriors

MCMC Shape Sampling

Key contributions:
- Establishes conditions for detailed balance when sampling from normal perturbations of (simply connected) shapes, thus enabling samples from a posterior over shapes
- An associated MCMC algorithm using this approximation
- Analysis of the distribution over the shape posterior (rather than a point estimate such as the MAP)

Publications:
- MICCAI ’07
- Journal version to be submitted to Anuj’s special issue of PAMI

Embedding the curve

Force the level set ψ to be zero on the curve:
  ψ(C(t), t) = 0
The chain rule then gives us the evolution of ψ:
  ψ_t + ∇ψ · C_t = 0
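As a concrete illustration (my own sketch, not from the slides), the following numpy snippet embeds a circle as the zero level set of a signed distance function and takes one explicit Euler step of the PDE implied by the chain rule. With curve motion C_t = F·N and outward normal N = ∇ψ/|∇ψ|, substituting into ψ_t + ∇ψ · C_t = 0 gives ψ_t = -F|∇ψ|.

```python
# Minimal sketch (illustration only): embed a circle as the zero level set
# of a signed distance function and take one explicit Euler step of the
# induced PDE.  With C_t = F·N and outward normal N = ∇ψ/|∇ψ|,
# the chain rule ψ_t + ∇ψ·C_t = 0 gives ψ_t = -F|∇ψ|.
import numpy as np

n = 128
h = 2.0 / (n - 1)
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))

# Signed distance to a circle of radius 0.5: negative inside, zero on the curve.
psi = np.sqrt(x**2 + y**2) - 0.5

# The zero level set passes within one grid cell of radius 0.5.
assert np.abs(psi[n // 2]).min() < h

# One explicit Euler step under constant outward speed F (an inflation term).
F, dt = 1.0, 1e-3
gy, gx = np.gradient(psi, h)           # derivatives along rows, then columns
grad_norm = np.sqrt(gx**2 + gy**2)
psi -= dt * F * grad_norm              # ψ_t = -F|∇ψ| expands the region
```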

Popular energy functionals:
- Geodesic active contours (Caselles et al.): E(C) = ∮_C g(|∇I|) ds
- Separating the means (Yezzi et al.): E(C) = -(u - v)²/2, where u and v are the mean intensities inside and outside C
- Piecewise constant intensities (Chan and Vese): E(C) = ∫_inside (I - u)² dA + ∫_outside (I - v)² dA
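A small numpy sketch (my illustration, not from the slides) evaluating the two region-based energies on a toy image, given a binary mask for the region inside the curve:

```python
# Minimal sketch (illustration only): evaluate the two region-based
# energies on a toy piecewise-constant image, given a binary mask for
# the region inside the curve.
import numpy as np

rng = np.random.default_rng(0)
n = 64
yy, xx = np.mgrid[0:n, 0:n]
inside = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2

# Piecewise-constant image (1 inside, 0 outside) plus Gaussian noise.
img = inside.astype(float) + 0.2 * rng.standard_normal((n, n))

u, v = img[inside].mean(), img[~inside].mean()

# Separating the means (Yezzi et al.): drive the two region means apart.
e_means = -0.5 * (u - v) ** 2

# Chan-Vese data term: squared residuals against each region's mean.
e_cv = ((img[inside] - u) ** 2).sum() + ((img[~inside] - v) ** 2).sum()

print(e_means, e_cv)
```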

Markov Chain Monte Carlo

- C is a curve, y is the observed image (which can be vector-valued), and S is a shape model
- We typically model the data as i.i.d. given the curve
- We wish to sample from the posterior p(C | y; S), but cannot do so directly
- Instead, we iteratively sample from a proposal distribution q and keep samples according to an acceptance rule a; the goal is to form a Markov chain with stationary distribution p
- Examples include Gibbs sampling and Metropolis-Hastings

Metropolis-Hastings algorithm:
1. Start with x_0
2. At time t, sample a candidate f_t from q(· | x_{t-1})
3. Calculate the Hastings ratio:
     r_t = [ p(f_t) q(x_{t-1} | f_t) ] / [ p(x_{t-1}) q(f_t | x_{t-1}) ]
4. Set x_t = f_t with probability min(1, r_t); otherwise x_t = x_{t-1}
5. Go back to step 2
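As a concrete illustration (my own sketch, not from the slides), here is the algorithm above for a simple 1D target with a symmetric random-walk proposal, for which the q terms in the Hastings ratio cancel:

```python
# Minimal sketch (illustration only): Metropolis-Hastings with a symmetric
# random-walk proposal.  Target: an unnormalized 1D density p(x).
import numpy as np

rng = np.random.default_rng(1)

def p(x):
    # Unnormalized target: a two-component Gaussian mixture.
    return np.exp(-0.5 * (x - 2) ** 2) + np.exp(-0.5 * (x + 2) ** 2)

x = 0.0
samples = []
for t in range(10_000):
    f = x + 0.5 * rng.standard_normal()   # candidate from q(. | x)
    r = p(f) / p(x)                       # Hastings ratio (symmetric q)
    if rng.random() < min(1.0, r):        # accept with probability min(1, r)
        x = f
    samples.append(x)                     # rejected moves repeat the state

print(np.mean(samples), np.std(samples))
```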

Asymptotic Convergence

We want to form a Markov chain whose stationary distribution is p(x): if x_t ~ p(x), then x_{t+1} ~ p(x).
For asymptotic convergence, sufficient conditions are:
- Ergodicity
- Detailed balance: p(x) T(x' | x) = p(x') T(x | x') for all x, x', where T is the chain's transition kernel
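A quick numeric check (my own sketch, not from the slides): build the Metropolis-Hastings transition matrix on a small discrete state space and verify that detailed balance holds entrywise, which makes p stationary:

```python
# Minimal sketch (illustration only): verify detailed balance
# p(i) T(i->j) = p(j) T(j->i) for Metropolis-Hastings on a discrete space.
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])           # target distribution
K = len(p)
q = np.full((K, K), 1.0 / K)                 # uniform independent proposal

T = np.zeros((K, K))
for i in range(K):
    for j in range(K):
        if i != j:
            a = min(1.0, (p[j] * q[j, i]) / (p[i] * q[i, j]))
            T[i, j] = q[i, j] * a
    T[i, i] = 1.0 - T[i].sum()               # rejected mass stays at i

# Detailed balance holds entrywise, so p is stationary: p @ T == p.
assert np.allclose(p[:, None] * T, (p[:, None] * T).T)
assert np.allclose(p @ T, p)
```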

MCMC Curve Sampling

Generate a perturbation on the curve by adding a smooth random field along its normal direction; schematically,
  C^(t)(s) = C^(t-1)(s) + f(s) N(s),   with f = S ∗ n + μ and μ = γ_κ κ + γ_g
where n is white Gaussian noise:
- S controls the degree of smoothness in the field
- the κ term is a curve smoothing term
- γ_g is an inflation term
- the mean term μ moves the average behavior toward higher-probability areas of p
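A sketch of one such perturbation (my own illustration; the kernel, coefficients, and circle test curve are assumptions, not the authors' settings): smoothness comes from circularly convolving white noise with a Gaussian kernel, and a curvature term plus a constant inflation term form the mean of the field.

```python
# Minimal sketch (illustration only): perturb a closed curve by a smooth
# random field along its normals.
import numpy as np

rng = np.random.default_rng(2)
m = 200
s = np.linspace(0, 2 * np.pi, m, endpoint=False)
curve = np.stack([np.cos(s), np.sin(s)], axis=1)      # unit circle, m points

# Smooth field: circular convolution of white noise with a Gaussian kernel.
noise = rng.standard_normal(m)
kernel = np.exp(-0.5 * (np.arange(m) - m // 2) ** 2 / 10.0 ** 2)
kernel /= kernel.sum()
smooth = np.real(np.fft.ifft(np.fft.fft(noise) * np.fft.fft(np.fft.ifftshift(kernel))))

# Outward normals of the discrete curve; curvature is exactly 1 on the unit circle.
t = np.gradient(curve, axis=0)                        # tangent vectors
t /= np.linalg.norm(t, axis=1, keepdims=True)
normal = np.stack([t[:, 1], -t[:, 0]], axis=1)        # rotate tangent by -90 deg
kappa = 1.0

# Negative curvature coefficient smooths (curve-shortening); gamma_g inflates.
gamma_kappa, gamma_g, amp = -0.01, 0.02, 0.05
f = amp * smooth + gamma_kappa * kappa + gamma_g      # field plus mean terms
new_curve = curve + f[:, None] * normal               # perturb along normals
```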

Synthetic noisy image

Piecewise-constant observation model: the image is constant inside and outside the curve, plus i.i.d. Gaussian noise with variance σ².
Chan-Vese energy functional: E(C) = ∫_inside (y - u)² dA + ∫_outside (y - v)² dA
Probability distribution (with temperature T = 2σ²): p(y | C) ∝ exp(-E(C) / T)
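Putting the pieces together (my own sketch, not from the slides): generate a synthetic piecewise-constant image with Gaussian noise, compute the Chan-Vese energy for a candidate mask, and evaluate the induced log-likelihood with T = 2σ².

```python
# Minimal sketch (illustration only): synthetic noisy image, Chan-Vese
# energy for a candidate region, and the induced unnormalized likelihood.
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 64, 0.3
yy, xx = np.mgrid[0:n, 0:n]
truth = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2
img = truth.astype(float) + sigma * rng.standard_normal((n, n))

def energy(mask):
    u, v = img[mask].mean(), img[~mask].mean()
    return ((img[mask] - u) ** 2).sum() + ((img[~mask] - v) ** 2).sum()

def log_prob(mask):
    return -energy(mask) / (2 * sigma ** 2)   # log p(y|C) up to a constant

# The true region should score at least as well as a shifted candidate.
shifted = np.roll(truth, 5, axis=1)
print(log_prob(truth), log_prob(shifted))
```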

Needle in the Haystack

Most likely samples

Least likely samples

Confidence intervals

When “best” is not best

- In this example, the most likely samples under the model are not the most accurate according to the underlying truth
- The 10%/90% “confidence” bands do a good job of enclosing the true answer
- The histogram image shows greater uncertainty in the upper-right corner
- The “median” curve is quite close to the true curve
- Optimization would thus yield subpar results

Conditional simulation

- In many problems, the model admits many reasonable solutions
- We can use user information to reduce the number of reasonable solutions: regions of inclusion or exclusion, partial segmentations
- Curve evolution methods largely limit user input to the initialization
- With conditional simulation, we are given the values of a subset of the variables and wish to generate sample paths that fill in the remainder (e.g., simulating pinned Brownian motion; see the sketch below)
- This can yield an interactive segmentation algorithm
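The pinned Brownian motion mentioned above (a Brownian bridge) is the simplest example of conditional simulation; here is a sketch (my own illustration, not from the slides) that conditions a random walk on its endpoints by simulating a free walk and subtracting the linear correction that pins both ends:

```python
# Minimal sketch (illustration only): "pinned" Brownian motion, i.e. a
# Brownian bridge, as the simplest case of conditional simulation.
import numpy as np

rng = np.random.default_rng(4)
m, a, b = 500, 0.0, 1.5          # number of steps; pinned values at the ends
t = np.linspace(0.0, 1.0, m)

w = np.concatenate([[0.0], np.cumsum(rng.standard_normal(m - 1))]) / np.sqrt(m)
bridge = a + (w - t * w[-1]) + t * (b - a)   # pinned at a (t=0) and b (t=1)

assert np.isclose(bridge[0], a) and np.isclose(bridge[-1], b)
```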

Future approaches

- More general density models (piecewise smooth)
- Full 3D volumes
- Additional features can be added to the base model: uncertainty on the expert segmentations, shape models (semi-local), exclusion/inclusion regions, topological change (through level sets)
- Better perturbations