Markov Random Fields in Vision


Markov Random Fields in Vision
Many slides drawn from presentations by Simon Prince/UCL and Kevin Wayne/Princeton

Image Denoising Foreground Extraction Stereo Disparity

Why study MRFs? Image denoising is based on modeling which kinds of images are more probable. Foreground extraction is based on modeling which spatial distributions of foreground pixels are more probable. Stereo disparity estimation is based on modeling which kinds of disparity fields are more probable.

Modeling the joint probability distribution: associate a random variable with each pixel.

Conditional independence assumptions are necessary for tractability. Directed graphical models (a.k.a. Bayes nets, belief networks) encode one such set of assumptions.

MRF Definition: a Markov random field is an undirected graphical model in which each variable is conditionally independent of all the others given its neighbours in the graph (the Markov property).

Example: Image with 4-connected pixels

Hammersley-Clifford Theorem. Any positive distribution that obeys the Markov property can be written in the form Pr(x) = (1/Z) ∏_c φ_c(x_c), where the product runs over the maximal cliques c, each φ_c is a positive potential function, and Z is the normalizing partition function. Cliques are subsets of variables that all connect to each other; maximal means that no more variables can be added while keeping it a clique.
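The factorization can be checked numerically on a tiny chain MRF. This is an illustrative sketch: the potential function and the 3-variable chain below are invented for the example, not taken from the slides.

```python
import itertools

# A hypothetical clique potential: any positive function of a clique's variables.
def phi(a, b):
    return 2.0 if a == b else 1.0

# Chain MRF x1 - x2 - x3: the maximal cliques are the edges {x1,x2} and {x2,x3}.
def unnormalized(x):
    x1, x2, x3 = x
    return phi(x1, x2) * phi(x2, x3)

states = list(itertools.product([0, 1], repeat=3))
Z = sum(unnormalized(x) for x in states)          # partition function
P = {x: unnormalized(x) / Z for x in states}      # the Hammersley-Clifford form

assert abs(sum(P.values()) - 1.0) < 1e-12         # a proper distribution
```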

MRF on a line which favors smoothness. Consider the case where the variables are binary, so each pairwise function returns one of 4 different values depending on the combination of neighbouring labels. Let's choose potentials that reward agreement between neighbours.
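The slide's concrete potential values are not reproduced here, so the sketch below substitutes an assumed table that rewards agreement (any such choice favours smoothness) and checks by brute force that the smoothest labellings of a short line are the most probable.

```python
import itertools

# Assumed pairwise potential on binary neighbours: one value per combination
# 00, 01, 10, 11. These numbers stand in for the slide's elided choice.
def phi(a, b):
    return 3.0 if a == b else 1.0   # reward agreement -> favours smoothness

def score(x):
    """Unnormalized probability of a labelling of the line (product over edges)."""
    s = 1.0
    for a, b in zip(x, x[1:]):
        s *= phi(a, b)
    return s

labellings = list(itertools.product([0, 1], repeat=6))
best = max(labellings, key=score)
# The smoothest labellings (all-0 and all-1) tie for the highest probability.
assert score((0,) * 6) == score((1,) * 6) == max(score(x) for x in labellings)
```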

Denoising with MRFs. An MRF prior over the original image y (built from pairwise cliques) is combined with per-pixel likelihoods of the observed image x. Inference via Bayes' rule: Pr(y|x) ∝ Pr(x|y) Pr(y).

MAP Inference. The cost to be minimized splits into unary terms U_i(y_i) (compatibility of the data with label y_i) and pairwise terms P(y_i, y_j) (compatibility of neighbouring labels): ŷ = argmin_y [ Σ_i U_i(y_i) + Σ_{(i,j)} P(y_i, y_j) ].
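A brute-force version of MAP inference makes the cost structure concrete. The unary and pairwise costs below are made-up numbers for a hypothetical 2x2 binary image, not values from the slides.

```python
from itertools import product

# Hypothetical costs for a 2x2 binary image (pixels numbered 0 1 / 2 3).
U = [(1, 8), (5, 2), (2, 5), (6, 1)]      # U[i] = (cost of label 0, cost of label 1)
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]  # 4-connected grid

def P(a, b):                              # pairwise cost: penalise disagreement
    return 0 if a == b else 3

def energy(y):
    return (sum(U[i][y[i]] for i in range(4)) +
            sum(P(y[i], y[j]) for i, j in edges))

# MAP estimate by exhaustive search (fine for 2^4 labellings).
y_map = min(product([0, 1], repeat=4), key=energy)
assert y_map == (0, 1, 0, 1) and energy(y_map) == 12
```

Brute force is only viable for toy problems; the rest of the presentation is about replacing this search with a min-cut computation.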

Graph Cuts Overview. Graph cuts are used to optimise this cost function: a sum of unary terms (compatibility of data with label y) and pairwise terms (compatibility of neighbouring labels). Three main cases:

Graph Cuts Overview. Approach: convert the minimization into the form of a standard CS problem, MAXIMUM FLOW or MINIMUM CUT ON A GRAPH. Low-order polynomial methods for solving this problem are known.

Chapter 7 Network Flow Slides by Kevin Wayne. Copyright © 2005 Pearson-Addison Wesley. All rights reserved.

Minimum Cut Problem. A flow network is an abstraction for material flowing through edges: G = (V, E) is a directed graph with no parallel edges and two distinguished nodes, s = source (where material originates) and t = sink (where material goes); c(e) = capacity of edge e. We use cut to mean s-t cut. [Figure: an example flow network with edge capacities.]

Cuts. Def. An s-t cut is a partition (A, B) of V with s ∈ A and t ∈ B. Def. The capacity of a cut (A, B) is cap(A, B) = Σ_{e out of A} c(e), the total capacity of the edges leaving A. [Figure: an example cut with Capacity = 10 + 5 + 15 = 30.]
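The capacity definition translates directly into code. The small network below is invented for illustration and is not the figure from the slide.

```python
def cut_capacity(edges, A):
    """Capacity of the s-t cut (A, B): sum of capacities of edges leaving A.
    edges: {(u, v): capacity}; A: set of nodes containing s (B is the rest)."""
    return sum(c for (u, v), c in edges.items() if u in A and v not in A)

# A toy network with made-up capacities:
edges = {('s', 'a'): 10, ('s', 'b'): 5, ('a', 'b'): 15,
         ('a', 't'): 10, ('b', 't'): 10}

assert cut_capacity(edges, {'s'}) == 15            # 10 + 5
assert cut_capacity(edges, {'s', 'a', 'b'}) == 20  # 10 + 10
```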

Cuts. The same definitions on a different cut; note that only edges leaving A are counted (the edge 7→3 entering A is not counted). [Figure: Capacity = 9 + 15 + 8 + 30 = 62.]

Minimum Cut Problem. Min s-t cut problem: find an s-t cut of minimum capacity. [Figure: the minimum cut, with Capacity = 10 + 8 + 10 = 28.]

Flows. Def. An s-t flow is a function f that satisfies: for each e ∈ E, 0 ≤ f(e) ≤ c(e) (capacity); for each v ∈ V − {s, t}, Σ_{e into v} f(e) = Σ_{e out of v} f(e) (conservation). Def. The value of a flow f is v(f) = Σ_{e out of s} f(e). Intuition: flow is an abstract entity generated at the source, transmitted across edges, and absorbed at the sink; conservation (otherwise a warehouse overfills or an oil pipe bursts) is analogous to Kirchhoff's law. Assume no arcs enter s or leave t (this keeps things a little cleaner, with no loss of generality). [Figure: a flow of Value = 4.]
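Both constraints can be checked mechanically. The sketch below assumes edges are stored as a dict of capacities; the toy network is made up for the example.

```python
def is_valid_flow(edges, flow, s, t):
    """Check the capacity and conservation constraints for an s-t flow.
    edges: {(u, v): capacity}; flow: {(u, v): flow value} (missing edges = 0)."""
    # Capacity: 0 <= f(e) <= c(e) on every edge.
    if any(not (0 <= flow.get(e, 0) <= c) for e, c in edges.items()):
        return False
    # Conservation: flow in == flow out at every node except s and t.
    nodes = {u for e in edges for u in e} - {s, t}
    for v in nodes:
        f_in = sum(flow.get(e, 0) for e in edges if e[1] == v)
        f_out = sum(flow.get(e, 0) for e in edges if e[0] == v)
        if f_in != f_out:
            return False
    return True

edges = {('s', 'a'): 4, ('a', 't'): 4, ('s', 'b'): 3, ('b', 't'): 3}
assert is_valid_flow(edges, {('s', 'a'): 4, ('a', 't'): 4}, 's', 't')
assert not is_valid_flow(edges, {('s', 'a'): 4}, 's', 't')  # conservation fails at a
```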

Flows. The same definitions, illustrated on a larger flow. [Figure: a flow of Value = 24.]

Maximum Flow Problem. Max flow problem: find an s-t flow of maximum value, i.e., equalize inflow and outflow at every intermediate node while maximizing the flow sent from s to t. [Figure: the maximum flow, of Value = 28.]
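A minimal sketch of a max-flow solver, using the Edmonds-Karp variant of Ford-Fulkerson (augmenting paths chosen by BFS in the residual graph); the example network is invented, not the one in the figure.

```python
from collections import deque, defaultdict

def max_flow(capacity, s, t):
    """Edmonds-Karp max flow. capacity: {(u, v): c}. Returns the flow value."""
    residual = defaultdict(int)
    for e, c in capacity.items():
        residual[e] += c
    nodes = {u for e in capacity for u in e}
    value = 0
    while True:
        # BFS for a shortest augmenting path s -> t in the residual graph.
        parent, q = {s: None}, deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in nodes:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return value               # no augmenting path left: flow is maximum
        # Recover the path, push the bottleneck, update residual capacities.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[e] for e in path)
        for (u, v) in path:
            residual[(u, v)] -= bottleneck
            residual[(v, u)] += bottleneck
        value += bottleneck

caps = {('s', 'a'): 10, ('s', 'b'): 5, ('a', 'b'): 15,
        ('a', 't'): 5, ('b', 't'): 10}
value = max_flow(caps, 's', 't')
```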

Flows and Cuts. Flow value lemma: let f be any flow, and let (A, B) be any s-t cut; then the net flow sent across the cut is equal to the amount leaving s. [Figure: a cut through a flow of Value = 24.]

Flows and Cuts. Flow value lemma, illustrated on another cut: the net flow across the cut is the total amount of flow that leaves A minus the amount that "swirls" back. [Figure: Value = 6 + 0 + 8 − 1 + 11 = 24.]

Flows and Cuts. Flow value lemma, illustrated on a third cut. [Figure: Value = 10 − 4 + 8 − 0 + 10 = 24.]

Max-Flow Min-Cut Theorem. [Ford-Fulkerson 1956] The value of the max flow is equal to the value of the min cut. Low-order polynomial time algorithms for computing both are known, and associated software is available.
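The theorem can be spot-checked by brute force on a tiny network: enumerate every s-t cut, take the minimum capacity, and exhibit a feasible flow of the same value. The network and flow below are made up for illustration.

```python
from itertools import combinations

# A toy network with made-up capacities.
edges = {('s', 'a'): 4, ('s', 'b'): 2, ('a', 'b'): 3,
         ('a', 't'): 1, ('b', 't'): 6}

def cut_capacity(A):
    return sum(c for (u, v), c in edges.items() if u in A and v not in A)

# Enumerate every s-t cut: s is always in A, t never is.
cuts = [{'s'} | set(S) for r in range(3) for S in combinations(['a', 'b'], r)]
min_cut = min(cut_capacity(A) for A in cuts)

# A feasible flow of the same value, found by hand for this network.
flow = {('s', 'a'): 4, ('s', 'b'): 2, ('a', 'b'): 3,
        ('a', 't'): 1, ('b', 't'): 5}
assert all(flow[e] <= edges[e] for e in edges)       # capacity constraints hold
flow_value = flow[('s', 'a')] + flow[('s', 'b')]     # amount leaving s
assert flow_value == min_cut == 6                    # max flow = min cut here
```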

Graph Cuts: Binary MRF. Graph cuts are used to optimise this cost function: unary terms (compatibility of data with label y) plus pairwise terms (compatibility of neighbouring labels). We first work with the binary case (i.e., the true label y is 0 or 1) and constrain the pairwise costs to be "zero-diagonal" (no cost when neighbouring labels agree).

Graph Construction. One node per pixel (here a 3×3 image); an edge from the source to every pixel node; an edge from every pixel node to the sink; reciprocal edges between neighbours. Note that in the minimum cut EITHER the edge connecting a pixel to the source will be cut OR the edge connecting it to the sink, but NOT BOTH (cutting both would be unnecessary); which one is cut determines whether we give that pixel label 1 or label 0. There is now a 1-to-1 mapping between possible labellings and possible minimum cuts.

Graph Construction. Now add capacities so that the minimum cut minimizes our cost function. Unary costs U(0) and U(1) are attached to the links to the source and sink; either one or the other is paid, depending on which side of the cut the pixel falls. Pairwise costs sit on the edges between pixel nodes. Why does this work? It is easiest to understand with some worked examples.
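Putting the pieces together, here is one possible sketch of the construction for a 1×3 "image": terminal edges carry the unary costs, inter-pixel edges carry a zero-diagonal (Potts-style) pairwise cost, and the min cut (computed via max flow) matches the brute-force minimum energy. All numbers, and the convention that the source side of the cut means label 0, are assumptions of this sketch, not the slide's.

```python
from collections import deque, defaultdict
from itertools import product

def max_flow(cap, s, t):
    """Compact Edmonds-Karp max flow; cap: {(u, v): capacity}."""
    r = defaultdict(int)
    for e, c in cap.items():
        r[e] += c
    nodes = {u for e in cap for u in e}
    total = 0
    while True:
        parent, q = {s: None}, deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in nodes:
                if v not in parent and r[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(r[e] for e in path)
        for (u, v) in path:
            r[(u, v)] -= b
            r[(v, u)] += b
        total += b

# A 1x3 "image": illustrative unary costs and a Potts pairwise cost.
U = [(1, 4), (3, 2), (5, 1)]       # U[i] = (cost of label 0, cost of label 1)
neighbours = [(0, 1), (1, 2)]
P = 2                              # zero-diagonal pairwise cost, paid on disagreement

# Construction: a pixel ending on the source side of the cut takes label 0.
cap = defaultdict(int)
for i, (u0, u1) in enumerate(U):
    cap[('s', i)] += u1            # severed when pixel i lands on the sink side (label 1)
    cap[(i, 't')] += u0            # severed when pixel i lands on the source side (label 0)
for (i, j) in neighbours:
    cap[(i, j)] += P               # one of these is severed when labels disagree
    cap[(j, i)] += P

cut_value = max_flow(dict(cap), 's', 't')

def energy(y):                     # the MRF cost the cut should minimize
    return (sum(U[i][y[i]] for i in range(3)) +
            sum(P for (i, j) in neighbours if y[i] != y[j]))

assert cut_value == min(energy(y) for y in product([0, 1], repeat=3))
```

With the capacities arranged this way, exactly one terminal edge per pixel crosses any s-t cut, so every cut's capacity equals the energy of the corresponding labelling, and the min cut picks out the MAP labelling.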

Example 1

Example 2

Graph Cuts: Binary MRF. Graph cuts are used to optimise this cost function: unary terms (compatibility of data with label y) plus pairwise terms (compatibility of neighbouring labels). Summary of the approach: associate each possible solution with a cut on a graph; set the capacities on the graph so that the cost of a cut matches the cost function; then the minimum cut minimizes the cost function and finds the MAP solution.

Image Denoising Foreground Extraction Stereo Disparity

The connection between graph cuts and MRF inference was first made in this paper.