Color Image Segmentation (Segmentácia farebného obrazu)

Segmentation. Segmentation means to divide up the image into a patchwork of regions, each of which is "homogeneous", that is, the "same" in some sense: intensity, texture, colour, … The segmentation operation only subdivides an image; it does not attempt to recognize the segmented image parts!

Complete vs. partial segmentation. Complete segmentation divides an image into nonoverlapping regions that correspond to real-world objects. Cooperation with higher processing levels, which use specific knowledge of the problem domain, is necessary.

Partial segmentation: regions do not correspond directly to image objects. The image is divided into separate regions that are homogeneous with respect to a chosen property such as brightness, color, texture, etc.

Gestalt (holistic) laws of perceptual organization. The emphasis in the Gestalt approach was on the configuration of the elements. Proximity: objects that are closer to one another tend to be grouped together. Closure: humans tend to enclose a space by completing a contour and ignoring gaps.

Gestalt laws of perceptual organization. Similarity: elements that look similar will be perceived as part of the same form (color, shape, texture, and motion). Continuation: humans tend to continue contours whenever the elements of the pattern establish an implied direction.

Gestalt laws. A series of factors affect whether elements should be grouped together. Proximity: tokens that are nearby tend to be grouped. Similarity: similar tokens tend to be grouped together. Common fate: tokens that have coherent motion tend to be grouped together. Common region: tokens that lie inside the same closed region tend to be grouped together. Parallelism: parallel curves or tokens tend to be grouped together.

Gestalt laws. Closure: tokens or curves that tend to lead to closed curves tend to be grouped together. Symmetry: curves that lead to symmetric groups are grouped together. Continuity: tokens that lead to "continuous" curves tend to be grouped. Familiar configuration: tokens that, when grouped, lead to a familiar object, tend to be grouped together.

Consequence: groupings by invisible completions, stressing the invisible groupings. (* Images from Steve Lehar's Gestalt papers.)

Image segmentation. Segmentation criteria: a segmentation is a partition of an image I into a set of regions S_i satisfying: 1. ∪ S_i = S: the partition covers the whole image. 2. S_i ∩ S_j = ∅ for i ≠ j: no regions intersect. 3. ∀ S_i, P(S_i) = true: the homogeneity predicate is satisfied by each region. 4. P(S_i ∪ S_j) = false for i ≠ j, S_i adjacent to S_j: the union of adjacent regions does not satisfy it.

Image segmentation. So, all we have to do is define and implement the similarity predicate. But what do we want to be similar in each region? Is there any property that will cause the regions to be meaningful objects?

Segmentation methods: pixel-based (histogram, clustering); region-based (region growing, split and merge); edge-based; model-based; physics-based; graph-based.

Histogram-based segmentation. How many "orange" pixels are in this image? This type of question can be answered by looking at the histogram.
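
A minimal sketch of the idea (pure NumPy; the random image and the chosen "orange" bin are hypothetical stand-ins):

```python
# Count pixels in a color range via a coarse 3-D color histogram.
import numpy as np

img = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)  # stand-in image
hist, _ = np.histogramdd(img.reshape(-1, 3).astype(float),
                         bins=(8, 8, 8), range=((0, 256),) * 3)
# A hypothetical "orange" bin: high red, medium green, low blue.
orange_count = hist[7, 4, 0]
```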

Histogram-based segmentation. How many modes are there? Solve this by reducing the number of colors to K and mapping each pixel to the closest color. Here's what it looks like if we use two colors.

Clustering-based segmentation. How do we choose the representative colors? This is a clustering problem! The K-means algorithm can be used for clustering.
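
A minimal sketch of K-means color quantization, assuming scikit-learn (the random image stands in for a real photograph):

```python
# Cluster pixel colors with K-means, then map each pixel to its cluster mean.
import numpy as np
from sklearn.cluster import KMeans

img = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)  # stand-in image
pixels = img.reshape(-1, 3).astype(np.float64)                  # one row per pixel

K = 4                                             # number of representative colors
km = KMeans(n_clusters=K, n_init=10).fit(pixels)

# Replace every pixel by the mean color of its cluster.
segmented = km.cluster_centers_[km.labels_].reshape(img.shape).astype(np.uint8)
```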

Clustering-based segmentation. K-means clustering of color.

Results of K-means clustering: using intensity alone and color alone. (Figure: image; clusters on intensity; clusters on color. * From Marc Pollefeys.)

Clustering-based segmentation. Clustering can also be used with other features (e.g., texture) in addition to color. (Figure: original images; color regions; texture regions.)

Clustering-based segmentation. K-means variants: different ways to initialize the means; different stopping criteria; dynamic methods for determining the right number of clusters (K) for a given image. Problem: histogram-based and clustering-based segmentation can produce messy regions. How can these be fixed?

Clustering-based segmentation. The Expectation-Maximization (EM) algorithm can be used as a probabilistic clustering method where each cluster is modeled using a Gaussian. The clusters are updated iteratively by computing the parameters of the Gaussians. Example from UC Berkeley's Blobworld system.
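
A minimal sketch of EM-based probabilistic clustering with scikit-learn's Gaussian mixture, reusing the pixels array from the K-means sketch above:

```python
# Fit a Gaussian per cluster by EM; assignments can be hard or soft.
from sklearn.mixture import GaussianMixture

gmm = GaussianMixture(n_components=4, covariance_type="full").fit(pixels)
labels = gmm.predict(pixels)        # hard label per pixel
probs = gmm.predict_proba(pixels)   # soft (per-cluster) responsibilities
```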

Clustering-based segmentation. Examples from UC Berkeley's Blobworld system.

Region growing. Region growing techniques start with one pixel of a potential region and try to grow it by adding adjacent pixels until the pixels being compared are too dissimilar. The first pixel selected can be just the first unlabeled pixel in the image, or a set of seed pixels can be chosen from the image. Usually a statistical test is used to decide which pixels can be added to a region: the region is a population with similar statistics, and the test checks whether a neighbor on the border fits into the region population.
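
A minimal sketch of seeded region growing along these lines (pure NumPy; the tolerance tol stands in for the statistical test):

```python
# Grow a region from a seed, admitting 4-neighbors close to the region mean.
import numpy as np
from collections import deque

def region_grow(gray, seed, tol=10.0):
    h, w = gray.shape
    region = np.zeros((h, w), dtype=bool)
    region[seed] = True
    mean = float(gray[seed])                 # running region statistic
    count = 1
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not region[nr, nc]:
                # "Statistical test": does the neighbor fit the region population?
                if abs(float(gray[nr, nc]) - mean) <= tol:
                    region[nr, nc] = True
                    mean = (mean * count + float(gray[nr, nc])) / (count + 1)
                    count += 1
                    frontier.append((nr, nc))
    return region
```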

Region growing image segmentation

Split-and-merge. 1. Start with the whole image. 2. If the variance is too high, break into quadrants. 3. Merge any adjacent regions that are similar enough. 4. Repeat steps 2 and 3 iteratively until no more splitting or merging occurs. Idea: good. Results: blocky.
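
A minimal sketch of the split step as a recursive quadtree (pure NumPy; the variance threshold is a hypothetical parameter):

```python
# Recursively split quadrants whose variance is too high (step 2 above).
import numpy as np

def split(gray, r0, c0, r1, c1, var_thresh=100.0, out=None):
    if out is None:
        out = []
    block = gray[r0:r1, c0:c1]
    if block.var() <= var_thresh or min(r1 - r0, c1 - c0) <= 1:
        out.append((r0, c0, r1, c1))          # homogeneous leaf region
    else:
        rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
        for quad in ((r0, c0, rm, cm), (r0, cm, rm, c1),
                     (rm, c0, r1, cm), (rm, cm, r1, c1)):
            split(gray, *quad, var_thresh, out)
    return out
# (The merge step would then join adjacent leaves with similar statistics.)
```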

Split-and-merge

Split-and-merge. (Figure: a satellite image; a large connected region formed by merging pixels labeled as residential after classification; more compact sub-regions after the split-and-merge procedure.)

Mean Shift Segmentation

Mean Shift Algorithm. The mean shift algorithm seeks the "mode", or point of highest density, of a data distribution: 1. Choose a search window size. 2. Choose the initial location of the search window. 3. Compute the mean location (centroid of the data) in the search window. 4. Center the search window at the mean location computed in Step 3. 5. Repeat Steps 3 and 4 until convergence.
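
A minimal sketch of one mean-shift trajectory toward a mode, assuming NumPy (data is an N x d array; win is the search window radius):

```python
# Follow steps 3-5 above with a flat kernel of radius "win".
import numpy as np

def mean_shift_mode(data, start, win=1.0, tol=1e-3, max_iter=100):
    x = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        inside = np.linalg.norm(data - x, axis=1) < win   # window contents
        if not inside.any():
            break
        new_x = data[inside].mean(axis=0)                 # step 3: centroid
        if np.linalg.norm(new_x - x) < tol:               # step 5: converged
            return new_x
        x = new_x                                         # step 4: recenter
    return x
```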

Mean Shift Segmentation Algorithm. 1. Convert the image into tokens (via color, gradients, texture measures, etc.). 2. Choose initial search window locations uniformly in the data. 3. Compute the mean shift window location for each initial position. 4. Merge windows that end up on the same "peak" or mode. 5. The data these merged windows traversed are clustered together. (* Image from: Dorin Comaniciu and Peter Meer, Distribution Free Decomposition of Multivariate Data, Pattern Analysis & Applications (1999) 2:22–30.)

Mean Shift Segmentation Extension. Gary Bradski's internally published agglomerative clustering extension: mean shift dendrograms. 1. Place a tiny mean shift window over each data point. 2. Grow the window and mean shift it. 3. Track windows that merge, along with the data they traversed. 4. Repeat until everything is merged into one cluster. The method is scale (search window size) sensitive; the solution is to use all scales. (Figure: best 4 clusters; best 2 clusters.) Advantage over agglomerative clustering: highly parallelizable.

Mean Shift Segmentation Results:

Graph Cut

Graph Cut. First: select a region of interest.

Graph Cut. How to select the object automatically?

Graph Cut. What are graphs? Nodes: usually pixels, sometimes samples. Edges: with associated weights W(i,j), e.g., the RGB value difference.

Graph Cut. What are cuts? Each "cut" is scored by the weights W(i,j) it severs; finding the best cut is an optimization problem. Here W(i,j) = |RGB(i) − RGB(j)|.

Graph Cut. Go back to our selected region. Each "cut" is scored by the weights W(i,j) it severs; finding the best cut is an optimization problem, with W(i,j) = |RGB(i) − RGB(j)|.

Graph Cut. We want the highest sum of weights: each "cut" is scored by the weights W(i,j) = |RGB(i) − RGB(j)| across the severed links.

Graph Cut. These cuts give a low score: W(i,j) = |RGB(i) − RGB(j)| is low along them.

Graph Cut. These cuts give a high score: W(i,j) = |RGB(i) − RGB(j)| is high along them.

Graph-based segmentation. An image is represented by a graph whose nodes are pixels or small groups of pixels. The goal is to partition the nodes into disjoint sets so that the similarity within each set is high and across different sets is low.

Graph-based segmentation. Let G = (V,E) be a graph. Each edge (u,v) has a weight w(u,v) that represents the similarity between u and v. Graph G can be broken into 2 disjoint graphs with node sets A and B by removing the edges that connect these sets. Let cut(A,B) = Σ_{u∈A, v∈B} w(u,v). One way to segment G is to find the minimal cut.
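
A toy sketch of the minimal-cut idea on a four-node graph, assuming networkx (the pixels and similarity weights are hypothetical):

```python
# Global minimum cut severs the least total similarity weight.
import networkx as nx

G = nx.Graph()
G.add_edge("p1", "p2", weight=0.9)   # two strongly similar pairs ...
G.add_edge("p3", "p4", weight=0.8)
G.add_edge("p2", "p3", weight=0.1)   # ... joined by one weak link

cut_value, (A, B) = nx.stoer_wagner(G)   # global minimum cut
print(cut_value, A, B)                   # 0.1: the weak link is cut
```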

Graph-based segmentation

Normalized cut. Minimal cut favors cutting off small node groups, so Shi and Malik proposed the normalized cut: Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V), where assoc(A,V) = Σ_{u∈A, t∈V} w(u,t) measures how much A is connected to the graph as a whole.

Solve for minimum penalty. (Figure: partition A and partition B separated by the cut.)

Graph-based segmentation. (Figure: Ncut(A,B) computed for partitions A and B.)

Graph Cut. Optimization solver. Solver example, recursion: 1. Grow. 2. If W(i,j) is low, stop; otherwise continue.

Graph Cut. Result: isolation.

Image Segmentation and Graph Cuts. (Figure: an image segmentation and the corresponding graph cuts.)

The Pipeline. Input: image. Output: segments. Assign W(i,j) -> solve for minimum penalty -> cut into 2 -> subdivide? If yes, repeat on each piece; if no, output the segments. Each iteration cuts into 2 pieces.

Assign W(i,j). W(i,j) = |RGB(i) − RGB(j)| is noisy! We could use brightness and locality instead: w(i,j) = exp(−‖F(i) − F(j)‖² / σ_F²) · exp(−‖X(i) − X(j)‖² / σ_X²) if ‖X(i) − X(j)‖ < r, and 0 otherwise, where X(i) is the spatial location of node i and F(i) is the feature vector for node i, which can be intensity, color, texture, motion, … The formula is set up so that w(i,j) is 0 for nodes that are too far apart.
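
A minimal sketch of this affinity, assuming NumPy arrays F (N x d features) and X (N x 2 positions), with hypothetical scale parameters:

```python
# Feature + locality affinity: zero weight beyond the spatial radius r.
import numpy as np

def affinity(F, X, sigma_f=0.1, sigma_x=10.0, r=20.0):
    dF2 = np.linalg.norm(F[:, None] - F[None, :], axis=2) ** 2
    dX = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    W = np.exp(-dF2 / sigma_f**2) * np.exp(-dX**2 / sigma_x**2)
    W[dX >= r] = 0.0          # nodes too far apart get zero weight
    return W
```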

Graph-based segmentation. Shi and Malik turned graph cuts into an eigenvector/eigenvalue problem. Set up a weighted graph G = (V,E): V is the set of (N) pixels; E is a set of weighted edges (weight w_ij gives the similarity between nodes i and j). Define a length-N vector d, where d_i is the sum of the weights from node i to all other nodes; an N x N diagonal matrix D with d on its diagonal; and an N x N symmetric matrix W with W_ij = w_ij.

Graph-based segmentation. Let x be a characteristic vector of a set A of nodes: x_i = 1 if node i is in A, and x_i = −1 otherwise. Let y be a continuous approximation to x. Solve the generalized eigensystem (D − W) y = λ D y for the eigenvectors y and eigenvalues λ. Use the eigenvector y with the second smallest eigenvalue to bipartition the graph (y → x → A). If further subdivision is merited, repeat recursively.
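
A minimal sketch of this bipartition step, assuming NumPy/SciPy and a precomputed affinity matrix W (every node should have positive degree):

```python
# Normalized-cut bipartition via the generalized eigenproblem (D - W) y = lambda D y.
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    d = W.sum(axis=1)             # d_i: sum of weights from node i
    D = np.diag(d)
    vals, vecs = eigh(D - W, D)   # eigenvalues in ascending order
    y = vecs[:, 1]                # eigenvector with 2nd smallest eigenvalue
    return y > 0                  # simple threshold at 0 to bipartition
```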

Extensions: edge weights. How to calculate the edge weights? Intensity, color (HSV), texture.

Continued work: semantic segmentation, incorporating top-down information into low-level segmentation. Interactive graph cuts: Yuri Boykov et al.

Graph-based segmentation

GrabCut. Comparison of user input vs. result: Magic Wand (198?): regions. Intelligent Scissors, Mortensen and Barrett (1995): boundary. GrabCut, Rother et al. (2004): regions & boundary. Slides: C. Rother et al., Microsoft Research, Cambridge.

Data term. A Gaussian Mixture Model (typically 5-8 components) for foreground & background. (Figure: foreground & background color distributions plotted in the G-R plane.) Slides: C. Rother et al., Microsoft Research, Cambridge.

Smoothness term. An object is a coherent set of pixels. Iterate until convergence: 1. Compute a configuration given the mixture model (E-step). 2. Compute the model parameters given the configuration (M-step). Slides: C. Rother et al., Microsoft Research, Cambridge.
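
OpenCV ships an implementation of this iterated loop; a minimal sketch of using it, assuming opencv-python and a hypothetical file "image.png" with a hypothetical rectangle around the object:

```python
# Interactive foreground extraction with OpenCV's GrabCut.
import cv2
import numpy as np

img = cv2.imread("image.png")
mask = np.zeros(img.shape[:2], dtype=np.uint8)
bgd_model = np.zeros((1, 65), dtype=np.float64)   # GMM state for background
fgd_model = np.zeros((1, 65), dtype=np.float64)   # GMM state for foreground
rect = (50, 50, 200, 200)                         # (x, y, w, h) user rectangle

cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Pixels marked definite/probable foreground form the extracted object.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
result = img * fg[:, :, None]
```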

Moderately simple examples … GrabCut completes automatically. Slides: C. Rother et al., Microsoft Research, Cambridge.

Difficult examples: camouflage & low contrast; no telepathy; fine structure. (Figure: initial rectangle and result for each case.) Slides: C. Rother et al., Microsoft Research, Cambridge.

Markov Random Fields (MRF). A graphical model for describing spatial consistency in images. Suppose you want to label image pixels with some labels {l_1,…,l_k}, e.g., segmentation, stereo disparity, foreground-background, etc. Ref: 1. S. Z. Li, Markov Random Field Modeling in Image Analysis, Springer-Verlag. 2. S. Geman and D. Geman, "Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images," PAMI 6(6):721–741, 1984. From slides by S. Seitz, University of Washington.

Definition. MRF components: A set of sites P = {1,…,m}: each pixel is a site. A neighborhood for each pixel, N = {N_p | p ∈ P}. A set of random variables (a random field), one for each site, F = {F_p | p ∈ P}, denoting the label at each pixel. Each random variable takes a value f_p from the set of labels L = {l_1,…,l_k}. We have a joint event {F_1 = f_1,…, F_m = f_m}, or a configuration, abbreviated as F = f. The joint probability of such a configuration is Pr(F = f), or Pr(f). From slides by S. Seitz, University of Washington.

Definition. MRF components: Pr(f_i) > 0 for all variables f_i. Markov property: each random variable depends on the other RVs only through its neighbors, Pr(f_p | f_{S−{p}}) = Pr(f_p | f_{N_p}), ∀p. So, we need to define a neighborhood system: N_p (the neighbors of site p). There are no strict rules for the neighborhood definition. (Figure: cliques for this neighborhood.) From slides by S. Seitz, University of Washington.

Definition. MRF components: the joint probability of a configuration is Pr(F = f), or Pr(f). Markov property: Pr(f_p | f_{S−{p}}) = Pr(f_p | f_{N_p}), ∀p. Hammersley-Clifford theorem: Pr(f) ∝ exp(−Σ_C V_C(f)), where the sum runs over all cliques in the neighborhood system and V_C is the clique potential. We may decide 1. not to include all cliques in a neighborhood, or 2. to use a different V_C for different cliques in the same neighborhood. From slides by S. Seitz, University of Washington.

Optimal configuration. Hammersley-Clifford theorem: Pr(f) ∝ exp(−Σ_C V_C(f)). Consider MRFs with arbitrary cliques among neighboring pixels; the sum runs over all cliques in the neighborhood system, and the clique potential V_C encodes the prior probability that the elements of clique C have certain values. A typical potential is the Potts model: V(f_p, f_q) = u if f_p ≠ f_q, and 0 otherwise. From slides by S. Seitz, University of Washington.

Optimal configuration. Hammersley-Clifford theorem: Pr(f) ∝ exp(−Σ_C V_C(f)). Consider MRFs with clique potentials over pairs of neighboring pixels, the most commonly used form and very popular in vision. Energy function: E(f) = Σ_p D_p(f_p) + Σ_{(p,q)∈N} V(f_p, f_q). There are two constraints to satisfy: 1. Data constraint: the labeling should reflect the observation. 2. Smoothness constraint: the labeling should reflect spatial consistency (pixels close to each other are most likely to have similar labels).
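
A minimal sketch of evaluating this pairwise energy with a Potts smoothness term (pure NumPy; unary holds hypothetical per-pixel data costs, one per label):

```python
# E(f) = sum_p D_p(f_p) + sum_{(p,q)} V(f_p, f_q), Potts V on 4-neighborhoods.
import numpy as np

def mrf_energy(unary, labels, u=1.0):
    h, w = labels.shape
    rows, cols = np.indices((h, w))
    data = unary[rows, cols, labels].sum()          # data (observation) term
    # Potts potential: cost u whenever 4-neighbors disagree.
    smooth = u * ((labels[:, 1:] != labels[:, :-1]).sum() +
                  (labels[1:, :] != labels[:-1, :]).sum())
    return data + smooth
```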

Probabilistic interpretation. The problem is that we are not observing the labels; we observe something else that depends on these labels with some noise (e.g., intensity or disparity). At each site we have an observation i_p. The observed value at each site depends on its label: the probability of a certain observed value given a certain label at site p is g(i_p, f_p) = Pr(i_p | F_p = f_p). The overall observation probability given the labels is Pr(O | f). We need to infer the labels given the observation: Pr(f | O) ∝ Pr(O | f) Pr(f).

Using MRFs. How do we model different problems? Given observations y and the parameters of the MRF, how do we infer the hidden variables x? How do we learn the parameters of the MRF?

MRF-based segmentation. Modeling image pixel labels as an MRF. (Figure: real image and its label image.) Slides by R. Huang, Rutgers University.

MRF-based segmentation. Classifying image pixels into different regions under the constraint of both local observations and spatial relationships. Probabilistic interpretation: infer the region labels x from the image pixels y, given the model parameters Θ. Slides by R. Huang, Rutgers University.

Model the joint probability. The joint probability of the label image and the image factorizes into label-label compatibility functions, which enforce the smoothness constraint between neighboring label nodes, and image-label compatibility functions, which enforce the data constraint at the local observations (x: region labels, y: image pixels, Θ: model parameters). How did we factorize? Slides by R. Huang, Rutgers University.

Probabilistic interpretation. We need to infer the labels given the observation: Pr(f | O) ∝ Pr(O | f) Pr(f). The MAP estimate of f should minimize the posterior energy, the sum of a data (observation) term enforcing the data constraint and a neighborhood term enforcing the smoothness constraint.
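
One simple way to approximately minimize this posterior energy is a coordinate-descent sweep over pixels (an ICM-style scheme, not named in the slides); a minimal sketch reusing the hypothetical unary costs from the energy sketch above:

```python
# Repeatedly set each pixel to the label minimizing its local posterior energy.
import numpy as np

def icm(unary, u=1.0, sweeps=5):
    labels = unary.argmin(axis=2)                 # data-term-only initialization
    h, w, K = unary.shape
    for _ in range(sweeps):
        for r in range(h):
            for c in range(w):
                costs = unary[r, c].astype(float)
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if 0 <= nr < h and 0 <= nc < w:
                        # Potts neighborhood term: penalize disagreement.
                        costs += u * (np.arange(K) != labels[nr, nc])
                labels[r, c] = costs.argmin()
    return labels
```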

Applying and learning MRF. MRF-based segmentation with the EM algorithm. E-step: inference. M-step: learning (e.g., by the pseudo-likelihood method). Slides by R. Huang, Rutgers University.

Applying and learning MRF: example. Slides by R. Huang, Rutgers University.