Graph Algorithms for Vision Amy Gale November 5, 2002.

Energy Minimization
- What is "energy" in this context?
- What are some "classic" methods for energy minimization?
- Huh? Graphs?
- What are some graph-based methods for energy minimization?

Dissecting the Energy Function
Energy of labeling f = data term + smoothness term.
- Data term: cost of giving pixel p label f(p) given the observed data at p. Intuition: penalty for a label that differs from the observed behavior of the pixel.
- Smoothness term: cost of giving pixel p label f(p) given that its neighbor(s) q have label(s) f(q). Intuition: penalty for differing from nearby pixels; we expect piecewise constant labels.
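The two terms can be made concrete in a small sketch (the pixel indexing, the cost values, and the Potts-style smoothness term are illustrative assumptions, not from the slides):

```python
def energy(f, data_cost, neighbors, lam=1.0):
    """E(f) = data term + smoothness term (Potts-style smoothness).

    f         : dict pixel -> label
    data_cost : dict (pixel, label) -> cost of that assignment
    neighbors : pairs (p, q) in the neighborhood system N
    lam       : weight of the smoothness term
    """
    data = sum(data_cost[(p, f[p])] for p in f)
    smooth = sum(lam for (p, q) in neighbors if f[p] != f[q])
    return data + smooth

# Three pixels on a scanline, binary labels; costs are invented.
D = {(0, 0): 0, (0, 1): 5, (1, 0): 4, (1, 1): 1, (2, 0): 4, (2, 1): 0}
N = [(0, 1), (1, 2)]
print(energy({0: 0, 1: 1, 2: 1}, D, N, lam=2.0))  # 1 + 2.0 = 3.0
```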

Data Term for Stereo
- f(p) is the disparity of pixel p; it should be integer-valued even if the actual disparity is not.
- The data term should evaluate the difference between I(p) and the best interpolated value "near" I´(p + f(p)).
[Figure: scanline intensities I and I´, with pixel p in I matched against p + f(p) in I´.]

Answers to questions you might not have asked yet
- λ is the "regularization parameter": it allows us to control the relative importance of the smoothness term vs. the data term.
- N is a set of ordered pairs that we can define according to the neighbors we want to consider important: 4-neighborhood, 8-neighborhood, etc.

Some Specific Energy Models
- Potts Model: V(α, β) = T[α ≠ β], a fixed penalty whenever two neighboring labels differ.
- Ising Model: the Potts Model with two possible labels.

Energy is Not Desirable
- E(f) represents combined data and smoothness conflicts, so we want to minimize it.
- Computer science already has some energy-minimization algorithms lying around: the Metropolis Algorithm; Simulated Annealing.

The Metropolis Algorithm
1. Start with some f.
2. Generate a random change to get f´ (for an image: change a single pixel label).
3. Compute ΔE = E(f´) − E(f).
4. If ΔE < 0, set f = f´; otherwise set f = f´ with probability e^(−ΔE/T).
5. Go to step 2.
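A minimal sketch of one such step in Python (the `energy` callable and the label set are assumed helpers, not from the slides):

```python
import math
import random

def metropolis_step(f, labels, energy, T):
    """One Metropolis update: relabel a random pixel, keep the change
    if it lowers E, otherwise keep it with probability e^(-dE/T)."""
    p = random.choice(list(f))
    f_new = dict(f)
    f_new[p] = random.choice(labels)
    dE = energy(f_new) - energy(f)
    if dE < 0 or random.random() < math.exp(-dE / T):
        return f_new
    return f
```

Iterating this step at a fixed temperature T is the whole algorithm.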

Metropolis: the role of T
- T is a parameter to the algorithm.
- When T is high, the search is effectively random.
- When T is low, it can easily get stuck in local minima.
- Not a great tradeoff either way.
- Plus, the result is sensitive to the initial estimate f.

Simulated Annealing
- Simply the Metropolis Algorithm with decreasing T.
- It can be proved to find the global minimum if T is decreased "slowly enough".
- A worthwhile vision algorithm?
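A sketch of the idea (the geometric cooling schedule, T0, and step count are invented parameters; the "slowly enough" guarantee actually requires a far slower, logarithmic schedule):

```python
import math
import random

def simulated_annealing(f, labels, energy, T0=10.0, cooling=0.99, steps=2000):
    """Metropolis updates with a geometrically decreasing temperature.
    Tracks the best labeling seen, since late moves can still go uphill."""
    T = T0
    best = dict(f)
    for _ in range(steps):
        p = random.choice(list(f))
        f_new = dict(f)
        f_new[p] = random.choice(labels)
        dE = energy(f_new) - energy(f)
        if dE < 0 or random.random() < math.exp(-dE / T):
            f = f_new
        if energy(f) < energy(best):
            best = dict(f)
        T *= cooling
    return best
```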

Graphs
- G = (V, E).
- The relevant algorithm here: min cut (= max flow) between source s and sink t on an undirected graph.
[Figure: small s-t example graphs with edge capacities such as 1 and 100, showing cuts of different costs.]
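For intuition, here is a minimal Edmonds-Karp max-flow routine; by the max-flow/min-cut theorem its value equals the min-cut capacity. The example graph and capacities are invented:

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting paths.
    capacity: dict (u, v) -> capacity (directed; add both directions
    for an undirected edge)."""
    cap = dict(capacity)
    flow = 0
    while True:
        # BFS for a shortest augmenting path from s to t.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for (a, b), c in cap.items():
                if a == u and c > 0 and b not in parent:
                    parent[b] = u
                    q.append(b)
        if t not in parent:
            return flow
        # Recover the path, find its bottleneck, and update residuals.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[e] for e in path)
        for (u, v) in path:
            cap[(u, v)] -= push
            cap[(v, u)] = cap.get((v, u), 0) + push
        flow += push

# Two disjoint s-t paths of capacity 100 plus a capacity-1 cross edge.
caps = {(0, 1): 100, (0, 2): 100, (1, 3): 100, (2, 3): 100, (1, 2): 1}
print(max_flow(caps, 0, 3))  # 200
```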

Energy Minimization by Graph Cuts
- Given points P, we want to build a graph G where cuts in G correspond to labelings of P.
- Labeling = cut; pixel = vertex; label = special vertex (s, t); edge = ...?

Building the Graph: Ising Model
- One pixel vertex per pixel (here a, b, c, d) plus two label vertices s and t.
- n-links between neighboring pixel vertices, with weight w(p, q) = λ.
- t-links to the label vertices, with weights w(p, s) = c₁(p) and w(p, t) = c₀(p).
- Example cut cost: c₁(a) + c₁(c) + c₁(d) + c₀(b) + 2λ.
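A brute-force check of this construction on a toy 2x2 instance (pixel names, cost values, and λ = 1 are invented for illustration): every s/t partition of the pixel vertices has cut cost equal to the Ising energy of the corresponding labeling.

```python
from itertools import product

pixels = ["a", "b", "c", "d"]
neighbors = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
lam = 1
c0 = {"a": 5, "b": 1, "c": 6, "d": 4}   # cost of giving a pixel label 0
c1 = {"a": 2, "b": 7, "c": 1, "d": 1}   # cost of giving a pixel label 1

# The graph: t-links carry the data costs, n-links carry lam.
edges = {}
for p in pixels:
    edges[("s", p)] = c1[p]   # severed when p lands on the t side (label 1)
    edges[(p, "t")] = c0[p]   # severed when p lands on the s side (label 0)
for (p, q) in neighbors:
    edges[(p, q)] = lam

def ising_energy(f):
    return (sum(c1[p] if f[p] else c0[p] for p in pixels)
            + sum(lam for (p, q) in neighbors if f[p] != f[q]))

def cut_cost(f):
    # f[p] == 1 puts p on the t side; sum the weights of severed edges.
    side = {"s": 0, "t": 1, **f}
    return sum(w for (u, v), w in edges.items() if side[u] != side[v])

for ls in product([0, 1], repeat=4):
    f = dict(zip(pixels, ls))
    assert ising_energy(f) == cut_cost(f)
print("cut cost = Ising energy for all 16 labelings")
```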

Cost of a Cut in this Graph
- The cut partitions the graph vertices into sets S and T.
- The cost of the cut is the total cost of the edges between S and T.

So Graph Cuts are Good Things (and that's that?)
- The Ising Model allows only two labels, which isn't enough for most interesting/useful problems.
- In general, the Potts model allows N possible labels, so what can we do?
- Multi-way Cut? This is NP-hard. By reduction, so is minimizing the Potts energy at all!
- We need some new approach.

Approximation Algorithms
- Sometimes our best option in the presence of NP-hardness.
- Recall that we can minimize the 2-label problem quickly; how can we leverage this?
- Idea: pick a pair of labels, optimally rearrange the pixels carrying those two labels, then choose 2 new labels and repeat...

α-β Swap Moves
DEFINITION: an α-β swap move is a reallocation of some set of pixels between α and β.
- What happens in a single α-β swap move: to the pixel labels? To the overall energy?
- What happens when we run to convergence?

How do we do this with graph cuts?
- For an α-β swap, find the min cut on the following graph:
  (wlog) s = α-vertex, t = β-vertex
  V = {s, t} ∪ {p : f(p) ∈ {α, β}}  (f is the current labeling)
- Conventions vary; authors in the field (Zabih, Kolmogorov et al.) say a cut gives label α to pixel p if it SEVERS the edge (α, p).

Example (with a nasty surprise)
Say we have pixels p1, p2, p3 (with neighbor pairs (p1, p2) and (p2, p3)) and possible labels α, β, γ, where d(α, β) = d(β, γ) = k/2 and d(α, γ) = k.
Data costs c(pixel, label):
        p1  p2  p3
  α      0   k   k
  β      k   0   k
  γ      2   2   0
Initially f = (α, β, γ): every data cost is 0 and each neighbor pair contributes k/2, so E(f) = k.
- α-β swap (p1, p2)? The alternatives cost 2k (both α) or 3k/2 (both β): no change.
- β-γ swap (p2, p3)? The best alternative (both γ) costs 2 + k: no change.
- α-γ swap (p1, p3)? The best alternative (p1 switched to γ) costs 2 + k: no change.
So f is a minimum under swap moves with E(f) = k, BUT the labeling (γ, γ, γ) has E = 4!!!

α-β Swap Algorithm
1. Start with an arbitrary f.
2. Set change = 0.
3. For each label pair (α, β): find the lowest-energy α-β swap f´ using a graph cut; if E(f´) < E(f), set f = f´ and change = 1.
4. If change == 1, go to step 2.
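The algorithm and the nasty surprise above can be reproduced in a few lines; here brute-force enumeration stands in for the graph cut that would normally find the lowest-energy swap (feasible because there are only three pixels):

```python
from itertools import product

# The example above, with labels 0, 1, 2 standing for alpha, beta, gamma.
K = 1000.0
data = [{0: 0, 1: K, 2: 2},   # data[p][label] for pixels p1, p2, p3
        {0: K, 1: 0, 2: 2},
        {0: K, 1: K, 2: 0}]
neighbors = [(0, 1), (1, 2)]

def d(a, b):
    """Smoothness costs: d(alpha, gamma) = K, other distinct pairs K/2."""
    if a == b:
        return 0.0
    return K if {a, b} == {0, 2} else K / 2

def energy(f):
    return (sum(data[p][f[p]] for p in range(3))
            + sum(d(f[p], f[q]) for p, q in neighbors))

def best_swap(f, a, b):
    """Lowest-energy a-b swap; brute force stands in for the graph cut."""
    movable = [p for p in range(3) if f[p] in (a, b)]
    best = list(f)
    for choice in product((a, b), repeat=len(movable)):
        g = list(f)
        for p, lab in zip(movable, choice):
            g[p] = lab
        if energy(g) < energy(best):
            best = g
    return best

def swap_algorithm(f):
    changed = True
    while changed:
        changed = False
        for a in range(3):
            for b in range(a + 1, 3):
                g = best_swap(f, a, b)
                if energy(g) < energy(f):
                    f, changed = g, True
    return f

f = swap_algorithm([0, 1, 2])   # start at (alpha, beta, gamma)
print(energy(f))                # 1000.0: stuck at E = K
print(energy([2, 2, 2]))        # 4.0: the global optimum
```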

α-β swap is not a c-approximation algorithm for any c
- Because the k in the last example could be anything at all...
- Is there something similar we can do?
- Idea: pick a single label α and let every pixel either keep its current label or switch to α; then choose a new label and repeat...

α-Expansion Moves
DEFINITION: an α-expansion move is a relabeling of some set of pixels to α.
- Intuition: let label α compete against the collection of all other labels.

Setting up the Graph
- Two label vertices; wlog let s correspond to α and t to "not α".
- A pixel vertex for every pixel in the image.

Setting up the Graph, ctd.
- We need some extra nodes and some constraints.
- For α-expansion, d must be a metric:
  1. d(α, β) = 0 ⇔ α = β
  2. d(α, β) = d(β, α) ≥ 0
  3. d(α, β) ≤ d(α, γ) + d(γ, β)
- Now add a gadget between p and q if (p, q) ∈ N and f(q) ≠ f(p).

Setting up the Graph: Example
[Figure: pixel vertices p1...p4 with f(p1) = α, f(p2) = f(p3) = β, f(p4) = γ; t-links weighted by c_α(p) and c_{f(p)}(p), with auxiliary vertices a12 and a34 inserted between neighbors whose current labels differ, their edges weighted by d(·, ·).]

α-Expansion Algorithm
1. Start with an arbitrary f.
2. Set change = 0.
3. For each label α: find the lowest-energy α-expansion f´ using a graph cut; if E(f´) < E(f), set f = f´ and change = 1.
4. If change == 1, go to step 2.
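On the same three-pixel example from the swap slides, a brute-force sketch of α-expansion (again standing in for the graph cut) escapes the swap algorithm's local minimum:

```python
from itertools import product

# Same instance as before: labels 0, 1, 2 for alpha, beta, gamma.
K = 1000.0
data = [{0: 0, 1: K, 2: 2},
        {0: K, 1: 0, 2: 2},
        {0: K, 1: K, 2: 0}]
neighbors = [(0, 1), (1, 2)]

def d(a, b):
    if a == b:
        return 0.0
    return K if {a, b} == {0, 2} else K / 2   # a metric: triangle inequality holds

def energy(f):
    return (sum(data[p][f[p]] for p in range(3))
            + sum(d(f[p], f[q]) for p, q in neighbors))

def best_expansion(f, a):
    """Lowest-energy a-expansion: each pixel keeps its label or switches to a.
    Brute force stands in for the graph cut."""
    best = list(f)
    for keep in product((True, False), repeat=3):
        g = [f[p] if keep[p] else a for p in range(3)]
        if energy(g) < energy(best):
            best = g
    return best

def expansion_algorithm(f):
    changed = True
    while changed:
        changed = False
        for a in range(3):
            g = best_expansion(f, a)
            if energy(g) < energy(f):
                f, changed = g, True
    return f

f = expansion_algorithm([0, 1, 2])   # the swap algorithm's local minimum
print(energy(f))                     # 4.0: the gamma-expansion to (gamma, gamma, gamma) escapes
```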

Optimality of α-Expansion
Let f* be the globally optimal solution and f´ the solution found by α-expansion.
THEOREM: E(f´) ≤ 2c·E(f*), where c is the ratio of the largest to the smallest nonzero smoothness cost d.

Some Results
[Figure: stereo results comparing α-expansion, Simulated Annealing (started from the solution given by yet another algorithm; otherwise its results would be much, much worse), and the Ground Truth.]