Introduction to Markov Random Fields and Graph Cuts Simon Prince

Plan of Talk Denoising problem Markov random fields (MRFs) Max-flow / min-cut Binary MRFs (exact solution)

Binary Denoising (before and after images shown) The image is represented as binary discrete variables; some proportion of pixels have randomly changed polarity.
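The corruption process described above can be sketched in a few lines. The flip probability and image size here are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(image, rho):
    """Flip each binary pixel independently with probability rho."""
    flips = rng.random(image.shape) < rho
    return np.where(flips, 1 - image, image)

clean = np.zeros((4, 4), dtype=int)
noisy = corrupt(clean, 0.2)   # some proportion of pixels change polarity
```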

Denoising Task (figure: observed data and uncorrupted image)

Denoising via Bayes' rule: Pr(y | x) ∝ Pr(x | y) Pr(y). Likelihood: Pr(x | y). Prior: Pr(y), a Markov random field (smoothness). MAP inference: graph cuts. Many other vision tasks have this structure (stereo, segmentation, etc.).

Plan of Talk Denoising problem Markov random fields (MRFs) Max-flow / min-cut Binary MRFs - submodular (exact solution)

Undirected Models An MRF is an example of an undirected model: a product of positive potential functions, Pr(x) = (1/Z) ∏_c φ_c(x_c). Z is a normalizing constant, referred to as the "partition function".
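For a tiny model the partition function Z can be computed by brute-force enumeration. A minimal sketch, assuming an illustrative pairwise potential phi on a 3-variable chain (the potential values are made up for the example):

```python
import itertools

# Pairwise potential on a 3-variable chain x1 - x2 - x3:
# higher value when neighbours agree (the values 2 and 1 are illustrative).
def phi(a, b):
    return 2.0 if a == b else 1.0

def unnormalised(x):
    return phi(x[0], x[1]) * phi(x[1], x[2])

states = list(itertools.product([0, 1], repeat=3))
Z = sum(unnormalised(x) for x in states)          # partition function
probs = {x: unnormalised(x) / Z for x in states}  # normalised distribution
```

Enumeration is exponential in the number of variables, which is exactly why the talk turns to graph cuts for inference instead.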

Alternate Formulation The product of positive potential functions can equivalently be written as a product of exponential costs: Pr(x) = (1/Z) ∏_c φ_c(x_c) = (1/Z) exp[-∑_c ψ_c(x_c)], where ψ_c = -log φ_c.

MRF Example Consider a product of functions over neighbours. In this model a variable is conditionally independent of all the others given its neighbours; for example, in a chain, Pr(x_2 | all other variables) = Pr(x_2 | x_1, x_3). So connections in the graphical model tell us about independence relations.

Proof of Markov Property Using the conditional probability relation, Pr(x_i | all other variables) = Pr(x) / Pr(all other variables); every potential function not involving x_i cancels between numerator and denominator, so the result depends only on the neighbours of x_i.
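The cancellation argument can be checked numerically on a small chain. This is an illustrative sketch with an assumed potential phi (not code from the talk): conditioning x_2 on everything gives the same answer as conditioning on its neighbours only.

```python
import itertools

def phi(a, b):
    return 2.0 if a == b else 1.0  # illustrative agreement potential

# 4-variable chain x1 - x2 - x3 - x4
def p(x):
    return phi(x[0], x[1]) * phi(x[1], x[2]) * phi(x[2], x[3])

states = list(itertools.product([0, 1], repeat=4))
Z = sum(p(x) for x in states)

def joint(x):
    return p(x) / Z

# Pr(x2 = 1 | x1, x3, x4): condition on all other variables.
def cond_full(x1, x3, x4):
    num = joint((x1, 1, x3, x4))
    den = joint((x1, 0, x3, x4)) + joint((x1, 1, x3, x4))
    return num / den

# Pr(x2 = 1 | x1, x3): condition on neighbours only (marginalise x4).
def cond_neigh(x1, x3):
    num = sum(joint((x1, 1, x3, x4)) for x4 in (0, 1))
    den = sum(joint((x1, a, x3, x4)) for a in (0, 1) for x4 in (0, 1))
    return num / den
```

Evaluating both shows cond_full does not depend on x_4 and agrees with cond_neigh, as the cancellation proof predicts.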

MRF Example Consider the case where variables are binary, so each pairwise function returns 4 different values depending on the combination of neighbours. Let's choose particular values (the table is shown on the slide).

2D MRF Example Product of pairwise functions between neighbouring variables, with lower probability when neighbours do not take the same value. Samples from this distribution look like mostly smooth images.
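Such samples can be drawn with a Gibbs sampler, resampling each pixel from its conditional given its neighbours. A sketch under assumed settings: the agreement weight theta, grid size, and sweep count are all illustrative, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_sample_prior(shape, theta=1.0, sweeps=50):
    """Gibbs sampler for a binary pairwise MRF favouring equal neighbours.

    Pr(y) ∝ exp(theta * number of agreeing 4-connected neighbour pairs).
    """
    h, w = shape
    y = rng.integers(0, 2, size=shape)
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                # Count 4-connected neighbours agreeing with each label.
                neigh = [y[a, b]
                         for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < h and 0 <= b < w]
                e1 = theta * sum(1 for v in neigh if v == 1)
                e0 = theta * sum(1 for v in neigh if v == 0)
                p1 = np.exp(e1) / (np.exp(e0) + np.exp(e1))
                y[i, j] = int(rng.random() < p1)
    return y

sample = gibbs_sample_prior((8, 8), theta=1.5)  # mostly smooth patches
```

Larger theta produces smoother samples, matching the slide's observation that draws from this prior look like mostly smooth images.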

Denoising with MRFs Observed image x, original image y. MRF prior over y (pairwise cliques); likelihood Pr(x | y). Inference via Bayes' rule: Pr(y | x) ∝ Pr(x | y) Pr(y).

MAP Inference Minimize a sum of unary terms (compatibility of the data with label y) and pairwise terms (compatibility of neighbouring labels): ŷ = argmin_y [∑_i U_i(y_i) + ∑_(i,j) P_ij(y_i, y_j)].

Graph Cuts Overview Graph cuts are used to optimise this cost function: a sum of unary terms (compatibility of the data with label y) and pairwise terms (compatibility of neighbouring labels). Three main cases:

Graph Cuts Overview Graph cuts are used to optimise this cost function: a sum of unary terms (compatibility of the data with label y) and pairwise terms (compatibility of neighbouring labels). Approach: convert the minimization into the form of a standard CS problem, MAXIMUM FLOW or MINIMUM CUT ON A GRAPH. Low-order polynomial methods for solving this problem are known.
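The cost function itself is easy to write down directly. A sketch assuming a shared Potts pairwise cost (an illustrative simplification of the general pairwise term P_ij):

```python
import numpy as np

def energy(y, unary, pairwise_cost):
    """E(y) = sum_i U_i(y_i) + sum over neighbouring (i, j) of P(y_i, y_j).

    unary: array of shape (h, w, 2) holding U(0), U(1) per pixel.
    pairwise_cost: a single Potts cost paid when 4-neighbours disagree
    (an assumed simplification for the sketch).
    """
    h, w = y.shape
    e = sum(unary[i, j, y[i, j]] for i in range(h) for j in range(w))
    for i in range(h):
        for j in range(w):
            if i + 1 < h:
                e += pairwise_cost * (y[i, j] != y[i + 1, j])
            if j + 1 < w:
                e += pairwise_cost * (y[i, j] != y[i, j + 1])
    return e

# Illustrative 2x2 labelling: one disagreeing pixel, two cut neighbour pairs.
y_example = np.array([[0, 0], [0, 1]])
```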

Plan of Talk Denoising problem Markov random fields (MRFs) Max-flow / min-cut Binary MRFs - submodular (exact solution)

Max-Flow Problem Goal: to push as much "flow" as possible through the directed graph from the source to the sink, without exceeding the (non-negative) capacities c_ij associated with each edge.

Saturated Edges When we are pushing the maximum amount of flow, there must be at least one saturated edge on any path from source to sink (otherwise we could push more flow). The set of saturated edges hence separates the source and sink.

Min Cut Define a cut on the graph as a set of edges that separates the source and sink. Cost of a cut = total capacity of these edges. The edges that saturate in the maximum-flow solution form the minimum-cost cut, so we can talk interchangeably about max-flow or min-cut.
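One classical low-order polynomial method is Edmonds-Karp: repeatedly push flow along shortest augmenting paths found by BFS. A self-contained sketch; the 4-node example graph at the end is illustrative.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow on an n x n capacity matrix (list of lists).

    cap is modified in place into the residual graph.
    """
    n = len(cap)
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total            # no augmenting path left: flow is maximal
        # Bottleneck capacity along the path, then update residuals.
        push, v = float('inf'), t
        while v != s:
            push = min(push, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= push       # forward residual decreases
            cap[v][u] += push       # reverse residual increases
            v = u
        total += push

# Illustrative example: source 0, internal nodes 1 and 2, sink 3.
example = [[0, 3, 2, 0],
           [0, 0, 1, 2],
           [0, 0, 0, 3],
           [0, 0, 0, 0]]
```

At termination the saturated edges crossing from the source-reachable side of the residual graph form the minimum cut, as described above.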

Plan of Talk Denoising problem Markov random fields (MRFs) Max-flow / min-cut Binary MRFs - submodular (exact solution)

Graph Cuts: Binary MRF Graph cuts are used to optimise the cost function of unary terms (compatibility of the data with label y) and pairwise terms (compatibility of neighbouring labels). First work with the binary case (i.e. each true label y_i is 0 or 1), and constrain the pairwise costs to be "zero-diagonal" (zero cost when neighbouring labels agree).

Graph Construction One node per pixel (here a 3x3 image). An edge from the source to every pixel node, an edge from every pixel node to the sink, and reciprocal edges between neighbours. Note that in the minimum cut EITHER the edge connecting a pixel to the source will be cut, OR the edge connecting it to the sink, but NOT BOTH (cutting both is unnecessary). This determines whether we give that pixel label 1 or label 0, so there is a one-to-one mapping between possible labellings and possible minimum cuts.

Graph Construction Now add capacities so that the minimum cut minimizes our cost function. Unary costs U(0), U(1) are attached to the links to the source and sink; either one or the other is paid. Pairwise costs between pixel nodes are as shown. Why? Easiest to understand with some worked examples.
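The construction on this slide can be sketched as code. The node numbering, the shared Potts pairwise cost theta, and the function name are illustrative assumptions; the capacities follow the slide's recipe (unary costs on the terminal links, reciprocal pairwise edges between neighbours).

```python
import numpy as np

def build_capacities(unary, theta):
    """Capacity matrix for the s-t graph of a binary, zero-diagonal MRF.

    unary: (h, w, 2) array of costs U(0), U(1) per pixel.
    theta: shared Potts pairwise cost P(0,1) = P(1,0); P(0,0) = P(1,1) = 0.
    Node numbering (assumed): source = 0, sink = 1, pixel (i,j) = 2 + i*w + j.
    A pixel left on the source side of the cut takes label 0 (its cut
    pixel-to-sink edge costs U(0)); a pixel on the sink side takes label 1
    (its cut source-to-pixel edge costs U(1)).
    """
    h, w = unary.shape[:2]
    n = h * w + 2
    s, t = 0, 1

    def node(i, j):
        return 2 + i * w + j

    cap = np.zeros((n, n))
    for i in range(h):
        for j in range(w):
            cap[s, node(i, j)] = unary[i, j, 1]   # paid if pixel labelled 1
            cap[node(i, j), t] = unary[i, j, 0]   # paid if pixel labelled 0
            for di, dj in ((1, 0), (0, 1)):
                a, b = i + di, j + dj
                if a < h and b < w:
                    cap[node(i, j), node(a, b)] = theta  # reciprocal edges
                    cap[node(a, b), node(i, j)] = theta
    return cap
```

Either the source link or the sink link of each pixel is cut, never both, so exactly one of U(0), U(1) is paid per pixel, as the slide states.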

Example 1

Example 2

Example 3

Graph Cuts: Binary MRF Graph cuts are used to optimise the cost function of unary terms (compatibility of the data with label y) and pairwise terms (compatibility of neighbouring labels). Summary of approach: associate each possible solution with a cut on a graph; set capacities on the graph so that the cost of a cut matches the cost function; use augmenting paths to find the minimum cut. This minimizes the cost function and finds the MAP solution.
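The summarised approach can be sketched end to end for a tiny image. Everything here (node numbering, shared Potts pairwise cost theta, function name) is an illustrative assumption; the steps are exactly those above: build the graph, find the minimum cut with augmenting paths, read the labels off the cut.

```python
import itertools
from collections import deque
import numpy as np

def mrf_map_graph_cut(unary, theta):
    """MAP labelling of a binary zero-diagonal pairwise MRF via min-cut.

    unary: (h, w, 2) array of non-negative costs U(0), U(1) per pixel;
    theta: shared Potts pairwise cost. Source-side pixels take label 0.
    """
    h, w = unary.shape[:2]
    n = h * w + 2
    s, t = 0, 1

    def node(i, j):
        return 2 + i * w + j

    # Capacities so that the cost of any cut equals the energy E(y).
    cap = np.zeros((n, n))
    for i in range(h):
        for j in range(w):
            cap[s, node(i, j)] = unary[i, j, 1]
            cap[node(i, j), t] = unary[i, j, 0]
            for di, dj in ((1, 0), (0, 1)):
                a, b = i + di, j + dj
                if a < h and b < w:
                    cap[node(i, j), node(a, b)] = theta
                    cap[node(a, b), node(i, j)] = theta
    # Edmonds-Karp; cap becomes the residual graph.
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 1e-12 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break
        push, v = float('inf'), t
        while v != s:
            push = min(push, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= push
            cap[v][u] += push
            v = u
    # Source-reachable nodes in the residual graph lie on the source side.
    seen = {s}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if cap[u][v] > 1e-12 and v not in seen:
                seen.add(v)
                q.append(v)
    return np.array([[0 if node(i, j) in seen else 1 for j in range(w)]
                     for i in range(h)])
```

On a 2x2 image the result can be checked against brute-force enumeration of all labellings, confirming the cut achieves the minimum energy.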

Denoising Results (figure: original image, then results with pairwise costs increasing)

Things I haven't told you! How to solve the max-flow problem. How to use more general costs (not always possible). How to generalize to more labels: sometimes possible exactly (Schlesinger & Flach), sometimes approximately (alpha expansion), sometimes NP-hard.

The End Contact me if you want slides or notes