Announcements Readings for today: Markov Random Field Modeling in Computer Vision, Li (first two chapters, on reserve); "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," Geman and Geman (on reserve).

Markov Random Fields Markov chains and HMMs have a 1D structure: at every time step there is one state, which enabled the use of dynamic programming. Markov Random Fields break this 1D structure: there is a field of sites, each of which carries a label simultaneously, and the label at one site depends on others with no 1D structure to the dependencies. This means no optimal, efficient algorithms in general. Why MRFs? Objects have parts with complex dependencies, and we need to model these. MRFs (and belief nets) model complex dependencies.

Example: Image Restoration Every pixel is a site. Its label is the true intensity, uncorrupted by noise. The label depends on the observation (the pixel value, corrupted by noise) and also on the other labels. If you see an image with one pixel missing, you can guess the value of the missing pixel pretty well.

Example: Stereo Every pixel is a site. The label of a pixel is its disparity. A disparity implies that two pixels match, so its probability depends on the similarity of those pixels. The disparity at one pixel is related to the others, since nearby pixels tend to have similar disparities.

Definitions S indexes a discrete set of sites: S = {1, ..., m}, or S = {(i,j) | 1 <= i, j <= n} for an n x n grid. L_d is a discrete set of labels, e.g. {1, ..., M}. Labels could be continuous, but we skip that. A labeling assigns a label to every site: f = {f_1, ..., f_m}, where f_i is the label of site i.
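
A minimal sketch of these definitions in Python (the array and variable names are mine, not the slides'):

```python
import numpy as np

n = 4        # sites on an n x n grid: S = {(i, j) | 1 <= i, j <= n}
M = 8        # discrete label set L_d = {0, ..., M-1}

# A labeling f assigns one label to every site; on a grid it is convenient
# to store it as an n x n integer array, so f[i, j] is the label of site (i, j).
rng = np.random.default_rng(0)
f = rng.integers(0, M, size=(n, n))
```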

Neighborhoods Neighborhood specifies dependencies. N = {N_i | for all i in S}, where N_i is the neighborhood of i; j in N_i means i and j are neighbors. A site is not its own neighbor, and the neighborhood relation is symmetric. Neighborhood structure implies conditional independence: F is an MRF on S with respect to N iff P(f) > 0 for all f, and P(f_i | f_{S - {i}}) = P(f_i | f_{N_i}).
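
As a concrete illustration (not from the slides), here is a 4-connected neighborhood system on an n x n grid; the Markov property above then says a site's label, conditioned on all other labels, depends only on these neighbors:

```python
def neighbors_4(i, j, n):
    """4-connected neighborhood N_{(i,j)} on an n x n grid (0-indexed).

    Note the required properties: a site is never its own neighbor,
    and the relation is symmetric (q in N_p iff p in N_q).
    """
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for (a, b) in candidates if 0 <= a < n and 0 <= b < n]
```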

Using MRFs We need to define sites and labels, define a neighborhood structure capturing the conditional probability structure, assign probabilities that capture the problem, and find the most probable labeling. The Gibbs Distribution is a useful conceptualization.

Gibbs Distribution Cliques capture the dependencies of neighborhoods. {i} is a clique for all i. {i_1, i_2, ..., i_n} is a clique if every pair of distinct sites in it are neighbors, i.e. i_k is in N_{i_j} for all k != j.

Gibbs Distribution (2) U(f) is the energy function, V_c(f) is a clique potential, Z is the normalizing value (a sum over all labelings), and T is the temperature.
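
The formulas on this slide did not survive the transcript; the standard form of the Gibbs distribution (as in Li's book) is:

\[
P(f) = \frac{1}{Z}\,\exp\!\left(-\frac{U(f)}{T}\right), \qquad
U(f) = \sum_{c \in \mathcal{C}} V_c(f), \qquad
Z = \sum_{f'} \exp\!\left(-\frac{U(f')}{T}\right)
\]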

MRF=GRF Given any MRF, we can define an equivalent GRF. That means finding an appropriate energy U(f). To find the f that maximizes P(f), it suffices to minimize U(f).
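
Written out (the slide's formula is missing from the transcript), the equivalence is:

\[
f^* = \arg\max_f P(f) = \arg\min_f U(f) = \arg\min_f \sum_{c \in \mathcal{C}} V_c(f)
\]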

Example: Piecewise Constant Image Restoration Every pixel is a site, with four-connected neighborhoods. Each site is a clique, and all pairs of neighbors are cliques. There is an observation d_i of the intensity at site i.

Example, cont'd The slide gives a prior on the labels, a prior on the discontinuities, and the energy to minimize.
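
The slide's formulas are also missing here; a typical energy for piecewise-constant restoration (my reconstruction of the standard form, not necessarily the slide's exact one) combines a quadratic data term with a Potts-style discontinuity penalty:

\[
U(f) = \sum_{i \in S} (f_i - d_i)^2 \;+\; \lambda \sum_{\{i,j\}\ \text{neighbors}} [\,f_i \neq f_j\,]
\]

where [.] is 1 when its argument is true and 0 otherwise, and lambda weights the prior on discontinuities against the data.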

Optimization Our problem is to choose f to minimize this energy. Usually this is NP-hard, so we settle for heuristics or exponential algorithms. Greedy: loop through the sites, changing each site's label to reduce the energy; it takes constant time to make each decision.
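
A minimal Python sketch of one greedy sweep, assuming the piecewise-constant energy reconstructed above (the function and parameter names are illustrative):

```python
import numpy as np

def greedy_sweep(f, d, labels, lam=1.0):
    """One greedy pass: visit each site and give it whichever label lowers
    the local energy, holding all other labels fixed.  Each decision only
    looks at the site's observation and its four neighbors, so it takes
    constant time."""
    n, m = f.shape
    for i in range(n):
        for j in range(m):
            nbrs = [f[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                    if 0 <= a < n and 0 <= b < m]
            costs = [(lab - d[i, j]) ** 2 + lam * sum(lab != q for q in nbrs)
                     for lab in labels]
            f[i, j] = labels[int(np.argmin(costs))]
    return f
```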

Optimization (2) Simulated Annealing: pick a site i at random. Let f' be the old labeling and f be f' with f_i randomly changed. Set p = min(1, P(f)/P(f')) and replace f' with f with probability p. As T -> 0 the method becomes deterministic. By slowly lowering T, the states of f form a Markov chain guaranteed to converge to the global optimum, but this takes exponential time.
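
A sketch of a single annealing move in Python (`energy` stands for whatever energy U(f) is being minimized; the names are mine, not the slides'):

```python
import math
import random

def anneal_step(f, energy, labels, T):
    """Pick a site at random, propose a random new label, and accept with
    probability p = min(1, P(f_new)/P(f_old)) = min(1, exp((E_old - E_new)/T))."""
    i = random.randrange(len(f))
    old_label = f[i]
    E_old = energy(f)
    f[i] = random.choice(labels)
    E_new = energy(f)
    # Moves that lower the energy are always accepted; others with prob. p.
    accept = E_new <= E_old or random.random() < math.exp((E_old - E_new) / T)
    if not accept:
        f[i] = old_label          # reject the move: restore the old label
    return f
```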

Optimization (3) Belief Propagation. At each step, a site collects information from its neighbors about their probable labeling, and passes information to each neighbor based on what it has heard from the other neighbors (avoiding repeating to a neighbor what that neighbor has already said). In a graph with no loops this is exact, like dynamic programming or the forward-backward method. In a general MRF it is a heuristic (one that has been analyzed; e.g., Yedidia, Freeman and Weiss).
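
A sketch of one min-sum (energy-domain) message update, the operation described above; the data structures here are illustrative, not from the slides:

```python
import numpy as np

def message_to(q, unary_p, pairwise, msgs_into_p):
    """Compute the message from site p to its neighbor q.

    unary_p[l]      : data cost of label l at p
    pairwise[l, k]  : smoothness cost of labels (l at p, k at q)
    msgs_into_p     : dict neighbor -> incoming message vector at p;
                      q's own previous message is excluded, as the slide notes.
    """
    total = unary_p.copy()
    for nbr, msg in msgs_into_p.items():
        if nbr != q:
            total += msg
    out = np.min(pairwise + total[:, None], axis=0)  # best label at p, per label at q
    return out - out.min()                           # normalize to keep values bounded
```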

Optimization (4) Graph cuts (e.g., Boykov, Veksler, and Zabih). Find all sites labeled b and relabel a subset of them to a, so that the energy is minimized over all possible such relabelings. This can be posed as a graph-cut problem and solved optimally in polynomial time.
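
The slides do not include the construction itself. As a rough sketch of the underlying idea (not Boykov-Veksler-Zabih's exact algorithm), here is the basic two-label case posed as an s-t minimum cut, using networkx; the multi-label moves described above repeat cuts of this kind:

```python
import networkx as nx
import numpy as np

def binary_cut_labeling(d0, d1, lam=1.0):
    """Two-label Potts MRF energy minimized exactly by an s-t minimum cut.

    d0[i, j], d1[i, j] : non-negative data costs of labels 0 and 1 at pixel (i, j)
    lam                : Potts smoothness weight on 4-connected neighbor pairs
    """
    n, m = d0.shape
    G = nx.DiGraph()
    for i in range(n):
        for j in range(m):
            p = (i, j)
            G.add_edge('s', p, capacity=float(d1[i, j]))  # paid if p ends up on the sink side (label 1)
            G.add_edge(p, 't', capacity=float(d0[i, j]))  # paid if p ends up on the source side (label 0)
            for q in ((i + 1, j), (i, j + 1)):            # n-links: paid when neighbors get different labels
                if q[0] < n and q[1] < m:
                    G.add_edge(p, q, capacity=lam)
                    G.add_edge(q, p, capacity=lam)
    _, (source_side, _) = nx.minimum_cut(G, 's', 't')
    f = np.ones((n, m), dtype=int)
    for node in source_side:
        if node != 's':
            f[node] = 0
    return f
```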

Skeletons Intuition: abstract a 2D shape into a stick figure. Why is this good? It captures part structure and represents 2D structure in 1D.

Similarity of structure is more apparent in the skeleton than in the boundary. (Sebastian and Kimia)

Grassfire Transform (Blum)

Alternate Definition A point is on the skeleton of a closed curve if and only if: it is inside the closed curve, and a circle centered on the point touches the curve in at least two places but contains no points outside the curve. The shape is the union of these circles.
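
scikit-image's `medial_axis` computes exactly this maximal-inscribed-disc skeleton of a binary shape, along with the grassfire distance; a minimal usage sketch (assuming scikit-image is available):

```python
import numpy as np
from skimage.morphology import medial_axis

# A small binary shape: a filled rectangle with a bump on top.
shape = np.zeros((40, 60), dtype=bool)
shape[10:30, 5:55] = True
shape[5:10, 25:35] = True

skel, dist = medial_axis(shape, return_distance=True)
# `skel` marks the skeleton points; `dist` is the distance transform, i.e.
# the radius of the maximal inscribed disc at each point of the shape.
```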

Sensitive to noise

Skeletons of 3D Objects Grassfire produces a 2D surface, but intuitively skeletons seem 1D. It is harder to compare 2D surfaces, extract parts, etc.

Generalized Cylinders A generalized cylinder consists of an axis and a cross-section that sweeps along that axis, changing shape as it goes.
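
A toy example (my own, purely illustrative): a generalized cylinder with a straight axis along z and a circular cross-section whose radius changes as it sweeps:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50)                 # position along the axis
theta = np.linspace(0.0, 2 * np.pi, 40)       # angle around the cross-section
T, TH = np.meshgrid(t, theta)

R = 1.0 + 0.5 * np.sin(2 * np.pi * T)         # cross-section radius varies along the axis
x, y, z = R * np.cos(TH), R * np.sin(TH), T   # points on the swept surface
```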

Problem How do you define a 1D skeleton of a 3D shape, and how do you relate it to the 1D skeleton of a 2D image of that shape?