Bayesian Belief Propagation


Bayesian Belief Propagation Reading Group

Overview
- Problem background
- Bayesian modelling
- Markov random fields
- Examine the use of Bayesian Belief Propagation (BBP) in three low-level vision applications:
  - Contour motion estimation
  - Dense depth estimation
  - Unwrapping phase images
- Convergence issues
- Conclusions

Problem Background
- A problem of probabilistic inference: estimate unknown variables given observed data.
- For low-level vision: estimate unknown scene properties (e.g. depth) from image properties (e.g. intensity gradients).

Bayesian Models in Low-Level Vision
- A statistical description of an estimation problem: given data d, we want to estimate unknown parameters u.
- Two components:
  - Prior model p(u): captures known information about the unknown data, independent of the observed data; a distribution over probable solutions.
  - Sensor model p(d|u): describes the relationship between the sensed measurements d and the unknown hidden data u.
- Combine using Bayes' rule to give the posterior: p(u|d) ∝ p(d|u) p(u).
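The prior/sensor combination can be made concrete with a toy discrete example (all numbers below are illustrative, not taken from any of the papers):

```python
import numpy as np

# Toy discrete example: infer a hidden scene property u with two states
# from one observation d, combining a prior p(u) with a sensor model p(d|u).
prior = np.array([0.7, 0.3])        # p(u): prior belief over the two states
likelihood = np.array([0.2, 0.9])   # p(d|u): likelihood of the observed d under each state

# Bayes' rule: p(u|d) is proportional to p(d|u) * p(u)
posterior = likelihood * prior
posterior /= posterior.sum()
print(posterior)  # the observation outweighs the prior in favour of the second state
```

Even with a strong prior on the first state, a sufficiently confident sensor reading flips the posterior.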

Markov Random Fields
- Pairwise Markov random field: a model commonly used to represent images.
[Figure: image data nodes (d) are each attached to a hidden scene node (u) by the sensor model; neighbouring hidden nodes u_i (neighbourhood N_i) are linked by the prior model]
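A minimal sketch of how a pairwise MRF scores a configuration: the joint is a product of sensor compatibilities φ(u_i, d_i) and prior compatibilities ψ(u_i, u_j) over neighbouring pairs. Here the hidden nodes are binary and sit on a 1-D chain for brevity; all table values are made up.

```python
import numpy as np

def joint_score(u, d, phi, psi):
    """Unnormalised pairwise-MRF joint: prod_i phi(u_i, d_i) * prod_(i,j) psi(u_i, u_j)."""
    score = 1.0
    for ui, di in zip(u, d):
        score *= phi[ui, di]            # sensor model: compatibility with the data
    for a, b in zip(u, u[1:]):
        score *= psi[a, b]              # prior model: smoothness between neighbours
    return score

phi = np.array([[0.9, 0.1], [0.1, 0.9]])   # hidden state tends to match its observation
psi = np.array([[0.8, 0.2], [0.2, 0.8]])   # neighbouring hidden states tend to agree
print(joint_score([0, 0, 1], [0, 0, 1], phi, psi))
```

Configurations that agree with the data and vary smoothly receive higher scores than inconsistent ones.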

Contour Motion Estimation Yair Weiss

Contour Motion Estimation
- Estimate the motion of a contour using only local information.
- Less computationally intensive than optical flow.
- Application example: object tracking.
- Difficult due to the aperture problem.

Contour Motion Estimation
[Figure: the aperture problem — actual vs. ideal motion estimates along a contour]

Contour Motion Estimation
- Sensor model: the brightness constancy constraint equation at each contour point i, where I_i = I(x_i, y_i, t).
- Prior model: u_{i+1} = u_i + n, where n ~ N(0, σ_p).
[Figure: a chain of hidden velocity nodes u_{i−2} … u_{i+2}, each connected to a data node d_{i−2} … d_{i+2}]
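A small sketch of why the sensor model is ambiguous at each point: the brightness constancy constraint I_x u + I_y v + I_t = 0 is one equation in two unknowns, so only the velocity component along the image gradient is determined (the aperture problem). The derivative values below are made up.

```python
import numpy as np

# Illustrative local image derivatives at one contour point (made-up values).
Ix, Iy, It = 2.0, 0.5, -1.5
grad = np.array([Ix, Iy])

# The constraint only fixes the component of (u, v) along the gradient.
normal_speed = -It / np.linalg.norm(grad)
normal_velocity = normal_speed * grad / np.linalg.norm(grad)

# Any velocity with an extra tangential component satisfies the same constraint.
tangent = np.array([-Iy, Ix]) / np.linalg.norm(grad)
alt_velocity = normal_velocity + 0.7 * tangent
for v in (normal_velocity, alt_velocity):
    assert abs(Ix * v[0] + Iy * v[1] + It) < 1e-9
print(normal_velocity)
```

This is exactly the ambiguity the smoothness prior resolves by sharing information along the contour.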

1D Belief Propagation
- Messages are passed along the chain of hidden nodes u_i (each with its data node d_i); iterate until the message values converge.
[Figure: message passing along the 1-D chain of nodes u_{i−2} … u_{i+2} and d_{i−2} … d_{i+2}]
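The 1-D scheme can be sketched for the Gaussian case. This is a generic Gaussian chain with observation noise var_obs and smoothness prior u_{i+1} = u_i + n, not Weiss's exact parameterisation; all numbers are illustrative.

```python
import numpy as np

def product(m1, v1, m2, v2):
    """Product of two Gaussians (up to scale): precisions add."""
    v = 1.0 / (1.0 / v1 + 1.0 / v2)
    return v * (m1 / v1 + m2 / v2), v

def chain_bp(obs, var_obs, var_prior):
    """Exact Gaussian belief propagation on a 1-D chain.
    obs: local observations d_i; var_obs: sensor noise variance;
    var_prior: variance of the smoothness prior u_{i+1} = u_i + n."""
    n = len(obs)
    fwd = [(0.0, np.inf)] * n            # messages arriving from the left
    bwd = [(0.0, np.inf)] * n            # messages arriving from the right
    for i in range(1, n):                # forward pass
        m, v = product(*fwd[i - 1], obs[i - 1], var_obs)
        fwd[i] = (m, v + var_prior)      # propagate through the prior
    for i in range(n - 2, -1, -1):       # backward pass
        m, v = product(*bwd[i + 1], obs[i + 1], var_obs)
        bwd[i] = (m, v + var_prior)
    beliefs = []
    for i in range(n):                   # belief = local evidence x both messages
        m, v = product(obs[i], var_obs, *fwd[i])
        m, v = product(m, v, *bwd[i])
        beliefs.append(m)
    return beliefs

# With a very tight prior, every node's belief settles near the mean observation.
print(chain_bp([1.0, 2.0, 3.0], var_obs=1.0, var_prior=1e-3))
```

On a chain, one forward and one backward sweep are enough: the beliefs are exact marginals, which is the tree-structured case where convergence is guaranteed.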

Results: contour motion estimation [Weiss]
- Faster and more accurate than pre-existing methods such as relaxation.
- Results after iteration n are optimal given all data within a distance of n nodes.
- Due to the nature of the problem, all velocity components should (and do) converge to the same value. It would be interesting to try the algorithm on problems where this does not hold:
  - Multiple motions within the same contour.
  - Rotating contours (requires a new prior model).
- Only one-dimensional problems are tackled, but extensions to 2D are discussed.
- The algorithm is also applied to the direction-of-figure (DOF) problem using convexity (not discussed here).

Dense Depth Estimation Richard Szeliski

Depth Estimation
- Assume smooth variation in disparity.
- Depth Z_i; disparity u_i = 1 / Z_i.
- Define the prior using a Gibbs distribution, p(u) ∝ exp(−E_p(u)), where E_p(u) is an energy functional measuring smoothness.
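The Gibbs prior p(u) ∝ exp(−E_p(u)) can be made concrete with a first-difference smoothness energy over a tiny disparity field (the weight and field values below are made up):

```python
import numpy as np

def prior_energy(u, lam=1.0):
    """Smoothness energy: penalise squared differences between neighbours."""
    return lam * np.sum(np.diff(u) ** 2)

smooth = np.array([1.0, 1.1, 1.2])   # slowly varying disparity
rough = np.array([1.0, 2.0, 0.5])    # jagged disparity
print(prior_energy(smooth), prior_energy(rough))
```

Lower energy means higher prior probability, so the smooth field is preferred, which is exactly the smooth-variation assumption above.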

Depth Estimation
- Sensor model: the disparity measurements d_i are related to a correlation metric computed over the image sequence; H is a measurement matrix, and E_s(u) is the corresponding sensor (data) energy functional.
[Figure: image sequence T = 0, 1, …, t, t+1, t+2, t+3 with per-pixel disparity measurements d_i]

Depth Estimation
- Posterior: p(u|d) ∝ exp(−E(u)), where the overall energy E(u) combines the prior and sensor energies E_p(u) and E_s(u).
- E(u) is quadratic, so it is minimized when u = A⁻¹b.
- The matrix A⁻¹ is large and expensive to compute directly.
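The minimisation u = A⁻¹b can be sketched on a tiny 1-D problem: A combines a data term (identity) with a smoothness term (a discrete Laplacian weighted by lam). In practice one solves A u = b with a (sparse) linear solver rather than forming A⁻¹; the sizes and weights below are illustrative.

```python
import numpy as np

n, lam = 5, 2.0
d = np.array([0.0, 1.0, 4.0, 1.0, 0.0])   # assumed noisy disparity observations
# Discrete Laplacian as the smoothness penalty (quadratic prior energy).
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A = np.eye(n) + lam * L                    # data term + weighted smoothness term
b = d
u = np.linalg.solve(A, b)                  # solve A u = b; never form A^-1 explicitly
print(u)
```

The solution is a smoothed version of the observations: the sharp peak at the centre is pulled down toward its neighbours.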

Gauss-Seidel Relaxation
- Minimize the energy locally for each node u_i, keeping all other nodes fixed. This leads to the update rule u_i ← (b_i − Σ_{j≠i} A_ij u_j) / A_ii.
- This is also the estimated mean of the marginal probability distribution p(u_i|d) given by Gibbs sampling.
- The same update applies to the 1-D example given by Weiss.
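A minimal Gauss-Seidel sketch: each sweep minimises the quadratic energy with respect to one u_i while all other nodes are held fixed. The test system below is illustrative (diagonally dominant, so the sweeps converge).

```python
import numpy as np

def gauss_seidel(A, b, sweeps=200):
    """Iteratively solve A u = b by local single-coordinate energy minimisation."""
    u = np.zeros_like(b, dtype=float)
    for _ in range(sweeps):
        for i in range(len(b)):
            # u_i <- (b_i - sum_{j != i} A_ij u_j) / A_ii
            u[i] = (b[i] - A[i] @ u + A[i, i] * u[i]) / A[i, i]
    return u

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(gauss_seidel(A, b))
```

Unlike the direct solve, each update uses only a node's immediate neighbours, which is why the scheme maps naturally onto the MRF structure, but information propagates only one node per sweep.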

Results: dense depth estimation [Szeliski]
- Dense (per-pixel) depth estimation from a sequence of images with known camera motion.
- Adapted Kalman filter: estimates of depth from time t−1 are used to improve estimates at time t.
- Uses a multi-resolution technique (image pyramid) to improve convergence times.
- Uses Gibbs sampling to sample the posterior (stochastic Gauss-Seidel relaxation); not guaranteed to converge.
- The problem can be reformulated to use message passing, but this does not account for loops in the network; only recently has belief propagation in networks with loops been fully understood [Yedidia et al.].

Unwrapping Phase Images Brendan Frey et al

Unwrapping Phase Images
- Wrapped phase images are produced by devices such as MRI scanners and radar.
- Unwrapping involves finding the shift values between neighbouring points.
- Unwrapping is simple in one dimension: there is only one path through the data, and the local gradient estimates the shift.
- For 2D images the problem is more difficult (NP-hard): there are many paths through the data, and the shifts along all paths must be consistent.
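The easy one-dimensional case can be sketched directly: follow the single path through the data and add a multiple of 2π whenever the local gradient jumps by more than π, which is the rule NumPy's `np.unwrap` applies. The signal below is illustrative.

```python
import numpy as np

true_phase = np.linspace(0, 4 * np.pi, 50)              # smooth underlying phase
wrapped = np.mod(true_phase + np.pi, 2 * np.pi) - np.pi  # wrap into [-pi, pi)
unwrapped = np.unwrap(wrapped)                           # correct jumps larger than pi
print(np.abs(unwrapped - true_phase).max())
```

This works because the true gradient never exceeds π between samples; the 2-D problem is hard precisely because no single path suffices and every loop must be consistent.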

Zero-Curl Constraint
[Figure: around each grid cell at (x, y), the horizontal shift nodes a(x, y), a(x, y+1) and vertical shift nodes b(x, y), b(x+1, y) are tied together by a constraint node enforcing zero curl, i.e. the shifts around the cell must cancel]

Sensor Data
- Estimate the relative shift values (variables a and b, each in {−1, 0, 1}) between neighbouring data points.
- Use the local image gradient, estimated from the wrapped image, as sensor input.
- A Gaussian sensor model relates the observed gradient at each sensor node to the hidden shift nodes.
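A sketch of such a Gaussian sensor model for one shift variable: the observed local gradient g is modelled as the true integer shift s plus Gaussian noise, giving a likelihood over the three candidate values. The gradient value and noise level are made up, and this simplifies the actual model in Frey et al.

```python
import numpy as np

g, sigma = 0.8, 0.3                  # assumed local gradient estimate and noise level
shifts = np.array([-1, 0, 1])        # candidate shift values for this edge

# Gaussian likelihood p(g | s) for each candidate shift, then normalise.
lik = np.exp(-((g - shifts) ** 2) / (2 * sigma ** 2))
lik /= lik.sum()
print(lik)   # probability mass concentrates on s = +1
```

These per-edge likelihoods are the local evidence that belief propagation then reconciles with the zero-curl constraints.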

Belief Propagation
[Figure: messages m1 … m5 passed between data points, shift nodes, and constraint nodes; the belief for a(x, y) is a distribution over the values −1, 0, 1]

Results: unwrapping phase images [Frey et al.]
- Initialize the messages to a uniform distribution and iterate to convergence.
- Estimates a solution to an NP-hard problem in time linear in the number of nodes.
- Reduced reconstruction error compared with relaxation methods.
- Does not account for loops in the network: messages could cycle, leading to incorrect belief estimates.
- Not guaranteed to converge.

Convergence
- Convergence is only guaranteed when the network is a tree and all data are available.
- In networks with loops, messages can cycle, resulting in incorrect belief estimates.
- Multi-resolution methods such as image pyramids can be used to speed up convergence (and improve results).

Conclusion
- BBP is used to infer the marginal posterior distribution of hidden information from observable data.
- The message-passing scheme is efficient: linear in the number of nodes rather than exponential.
- It propagates local information globally to achieve more reliable estimates.
- Useful for low-level vision applications:
  - Contour motion estimation [Weiss]
  - Dense depth estimation [Szeliski]
  - Unwrapping phase images [Frey et al.]
- Improved results over standard relaxation algorithms.
- Can be combined with a multi-resolution framework to improve convergence times.
- Loops must be accounted for to prevent cycling of messages [Yedidia et al.].