
Image Segmentation: A Graph Theoretic Approach

Factors for Visual Grouping
- Similarity (gray level difference)
- Proximity
- Continuity

Reference: M. Wertheimer, "Laws of Organization in Perceptual Forms," A Sourcebook of Gestalt Psychology, W.B. Ellis, ed., Harcourt, Brace, 1938.

What is the “correct” grouping?

Subjectivity in Segmentation
- Prior world knowledge is needed.
- Agglomerative and divisive techniques in grouping (or region-based merge-and-split algorithms in image segmentation).
- Local properties – easier to specify but give poorer results, e.g. coherence of brightness, colour, texture, motion.
- Global properties – more difficult to specify but give better results, e.g. object symmetries.
- Image segmentation can therefore be modeled as a graph partitioning and optimization problem.

Partitioning
- Divisive, or top-down, approach.
- Inherently hierarchical.
- We should aim to return a tree structure (called a dendrogram) corresponding to a hierarchical partitioning scheme, rather than a single "flat" partition.

Challenges
- Picking an appropriate criterion to minimize, one that results in a "good" segmentation.
- Finding an efficient way to achieve the minimization.

Modeling as a Graph Partitioning Problem
- The set of points in the feature space is represented as a weighted, undirected graph G = (V, E).
- The points of the feature space are the nodes of the graph.
- There is an edge between every pair of nodes.
- The weight on each edge, w(i, j), is a function of the similarity between nodes i and j.
- Goal: partition the set of vertices into disjoint sets such that similarity within the sets is high and similarity across the sets is low.

Weight Function for Brightness Images
The weight measure reflects the likelihood of two pixels belonging to the same object. In Shi and Malik's formulation it is the product of a feature-similarity term and a spatial-proximity term:

w(i, j) = exp(−‖F(i) − F(j)‖² / σ_I²) · exp(−‖X(i) − X(j)‖² / σ_X²) if ‖X(i) − X(j)‖ < r, and 0 otherwise,

where F(i) is the brightness of pixel i and X(i) its spatial position.
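A minimal sketch of this weight function in Python, building a dense affinity matrix for a small grayscale image (the parameter values are illustrative, and the dense O(n²) construction is only practical for tiny images):

    import numpy as np

    def weight_matrix(image, sigma_i=0.1, sigma_x=4.0, r=5.0):
        """Shi-Malik style affinity: brightness similarity times spatial
        proximity, zeroed beyond radius r."""
        h, w = image.shape
        ys, xs = np.indices((h, w))
        coords = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
        vals = image.ravel().astype(float)

        # Pairwise squared differences in brightness and in image position.
        df2 = (vals[:, None] - vals[None, :]) ** 2
        dx2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=2)

        W = np.exp(-df2 / sigma_i**2) * np.exp(-dx2 / sigma_x**2)
        W[np.sqrt(dx2) >= r] = 0.0   # keep the graph locally connected
        return W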

Representing Images as Graphs

Graph Weight Matrix, W

Segmentation and Graphs – Other Common Approaches
- Minimal spanning tree
- Limited neighbourhood set

Both approaches are computationally efficient, but their criteria are based on local properties. Perceptual grouping is about extracting global impressions of a scene, so local criteria are often inadequate.

First Attempt at Global Criterion Selection
- A graph can be partitioned into two disjoint sets A and B by simply removing the edges connecting the two parts.
- The degree of dissimilarity between the two pieces can be computed as the total weight of the edges that have been removed. More formally, this quantity is called the cut:

cut(A, B) = Σ_{u∈A, v∈B} w(u, v)

Graph Cut

Optimization Problem
- Minimize the cut value, subject to the constraints A ∪ B = V and A ∩ B = ∅.
- The number of such partitions is exponential (2^N), but the minimum cut can be found efficiently.

Reference: Z. Wu and R. Leahy, "An Optimal Graph Theoretic Approach to Data Clustering: Theory and Its Application to Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 11, Nov. 1993.
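For small graphs the global minimum cut can be computed directly, for example with the Stoer–Wagner algorithm in networkx. A sketch on a made-up toy graph (two dense triangles joined by one weak edge):

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        (0, 1, 5.0), (1, 2, 5.0), (0, 2, 5.0),   # cluster A
        (3, 4, 5.0), (4, 5, 5.0), (3, 5, 5.0),   # cluster B
        (2, 3, 0.5),                             # weak bridge between them
    ])
    cut_value, (part_a, part_b) = nx.stoer_wagner(G)
    print(cut_value)   # 0.5 -- the weak bridge is cut, separating the triangles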

Problems with Min-Cut
The minimum cut criterion favors cutting small sets of isolated nodes in the graph.

Solution – Normalized Cut
- We must avoid an unnatural bias toward partitioning out small sets of points.
- The normalized cut computes the cut cost as a fraction of the total edge connections to all the nodes in the graph:

Ncut(A, B) = cut(A, B)/assoc(A, V) + cut(A, B)/assoc(B, V),

where assoc(A, V) = Σ_{u∈A, t∈V} w(u, t) is the total connection from the nodes in A to all nodes in the graph.

Looking at It Another Way
- Our criterion can also aim to tighten the similarity within the groups:

Nassoc(A, B) = assoc(A, A)/assoc(A, V) + assoc(B, B)/assoc(B, V)

- Minimizing Ncut and maximizing Nassoc are actually equivalent, since Ncut(A, B) = 2 − Nassoc(A, B).

Matrix Formulations
Let x be an indicator vector with x_i = 1 if node i belongs to A and x_i = 0 otherwise, and let D be the diagonal degree matrix with D_ii = Σ_j w(i, j). Then:
- assoc(A, A) = xᵀWx
- assoc(A, V) = xᵀDx
- cut(A, V − A) = xᵀ(D − W)x
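These identities are easy to check numerically. A quick sketch with a random symmetric affinity matrix (all names here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
    D = np.diag(W.sum(axis=1))          # degree matrix, D_ii = sum_j w(i, j)

    x = np.zeros(n); x[:4] = 1.0        # indicator of A = {0, 1, 2, 3}
    A = x.astype(bool)

    assert np.isclose(x @ W @ x, W[A][:, A].sum())         # assoc(A, A)
    assert np.isclose(x @ D @ x, W[A].sum())               # assoc(A, V)
    assert np.isclose(x @ (D - W) @ x, W[A][:, ~A].sum())  # cut(A, V - A)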

Computational Issues
- Exactly minimizing the normalized cut is an NP-complete problem.
- However, approximate discrete solutions can be found efficiently.
- The normalized cut criterion can be minimized approximately by solving a generalized eigenvalue problem.

Algorithm
1. Construct the weighted graph representing the image. Summarize the information into the matrices W and D. The edge weight is an exponential function of feature similarity as well as a distance measure.
2. Solve for the eigenvectors with the smallest eigenvalues of (D − W)x = λDx.

Algorithm (contd.)
3. Partition the graph into two pieces using the second smallest eigenvector: the signs of its components tell us exactly how to partition the graph.
4. Recursively run the algorithm on the two partitioned parts. Recursion stops once the Ncut value exceeds a certain limit; this maximum allowed Ncut value controls the number of groups segmented.
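Putting the four steps together, a compact sketch of the recursive procedure (dense solver, so only suitable for small graphs; the sign-based split is the simple variant from step 3, and the function and parameter names are illustrative):

    import numpy as np
    from scipy.linalg import eigh

    def ncut_split(W):
        """One two-way split: second smallest generalized eigenvector of
        (D - W) x = lambda D x, thresholded at zero. Assumes every node
        has positive degree, so D is positive definite."""
        D = np.diag(W.sum(axis=1))
        vals, vecs = eigh(D - W, D)     # eigenvalues in ascending order
        return vecs[:, 1] > 0

    def ncut_value(W, mask):
        """Ncut(A, B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V)."""
        cut = W[mask][:, ~mask].sum()
        return cut / W[mask].sum() + cut / W[~mask].sum()

    def recursive_ncut(W, nodes, max_ncut=0.2, parts=None):
        """Recurse until the best split's Ncut exceeds max_ncut."""
        if parts is None:
            parts = []
        if len(nodes) < 2:
            parts.append(nodes)
            return parts
        mask = ncut_split(W)
        if mask.all() or (~mask).all() or ncut_value(W, mask) > max_ncut:
            parts.append(nodes)         # stop: no acceptable split here
            return parts
        recursive_ncut(W[mask][:, mask], nodes[mask], max_ncut, parts)
        recursive_ncut(W[~mask][:, ~mask], nodes[~mask], max_ncut, parts)
        return parts

Called as recursive_ncut(W, np.arange(len(W))), this returns a list of index arrays, one per segment; max_ncut plays the role of the stopping threshold described in step 4.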

Computational Issues Revisited
- Solving a standard eigenvalue problem for all eigenvectors takes O(n³) operations, where n is the number of nodes in the graph.
- This becomes impractical for image segmentation, where n is the number of pixels in an image.
- For the problem at hand, however, the graphs are often only locally connected, only the top few eigenvectors are needed for graph partitioning, and the precision requirement for the eigenvectors is low: often only the correct sign bit is required. Sparse iterative solvers such as the Lanczos method exploit exactly these properties.
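A hedged sketch of the sparse route in SciPy, assuming W is a scipy.sparse affinity matrix (the tiny shift sigma is a numerical workaround of my choosing, since D − W is exactly singular at 0):

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    def second_eigenvector_sparse(W):
        """Second smallest generalized eigenvector of (D - W) x = lambda D x
        for a sparse affinity matrix W, via shift-invert Lanczos."""
        d = np.asarray(W.sum(axis=1)).ravel()
        D = sp.diags(d)
        L = (D - W).tocsc()
        # which='LM' with a shift returns the eigenvalues nearest sigma,
        # i.e. the smallest ones of the original problem.
        vals, vecs = eigsh(L, k=2, M=D, sigma=1e-8, which='LM')
        return vecs[:, 1]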

A Physical Interpretation
Think of the weighted graph as a spring-mass system:
- graph nodes → physical masses
- graph edges → springs
- graph edge weights → spring stiffness
- total incoming edge weight → mass of the node

A Physical Interpretation (contd.)
- Imagine giving this spring-mass system a hard shake, forcing the nodes to oscillate in the direction perpendicular to the image plane.
- Nodes that have stronger spring connections among them will tend to oscillate together; eventually, the group will "pop" off the image plane.
- The overall steady-state behavior of the nodes can be described by the system's fundamental modes of oscillation, and it can be shown that the fundamental modes of oscillation of this spring-mass system are exactly the generalized eigenvectors of the normalized cut.
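A sketch of why, reconstructing the spring-mass argument: with node masses $d_i = \sum_j w_{ij}$ and spring stiffnesses $w_{ij}$, the small-oscillation equations of motion are

    \sum_j w_{ij}\,(x_i - x_j) = -\,d_i\,\ddot{x}_i .

Substituting harmonic motion $x(t) = v \cos(\omega t)$, so that $\ddot{x} = -\omega^2 x$, gives in matrix form

    (D - W)\,v = \omega^2 D\,v ,

which is exactly the generalized eigensystem from the algorithm, with $\omega^2$ playing the role of $\lambda$.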

Comparisons with Other Criteria
- Average cut: AvgCut(A, B) = cut(A, B)/|A| + cut(A, B)/|B|.
- Analogously, average association can be defined as: AvgAssoc(A, B) = assoc(A, A)/|A| + assoc(B, B)/|B|.
- Unlike normalized cut and normalized association, average cut and average association do not have a simple relationship between them.
- Consequently, one cannot simultaneously minimize the disassociation across the partitions while maximizing the association within the groups.
- Normalized cut produces better results in practice.
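A small sketch computing all three criteria for a candidate partition, to make the definitions concrete (function and variable names are illustrative):

    import numpy as np

    def criteria(W, mask):
        """Return (Ncut, average cut, average association) for the
        partition A = mask, B = ~mask of an affinity matrix W."""
        A, B = mask, ~mask
        cut = W[A][:, B].sum()
        ncut = cut / W[A].sum() + cut / W[B].sum()
        avg_cut = cut / A.sum() + cut / B.sum()
        avg_assoc = W[A][:, A].sum() / A.sum() + W[B][:, B].sum() / B.sum()
        return ncut, avg_cut, avg_assoc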

Comparisons with Other Criteria (contd.)

- Average association has a bias for finding tight clusters – it runs the risk of finding small, tight clusters in the data.
- Average cut does not look at within-group similarity – this causes problems when the dissimilarity between groups is not clearly defined.

Consider random 1-D data points. Each data point is a node in the graph, and the weight of the graph edge connecting two points is defined to be inversely proportional to the distance between the two nodes. We will consider two different monotonically decreasing weight functions, w(i, j) = f(d(i, j)), defined on the distance function d(i, j), with different rates of fall-off.
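A sketch of this setup (cluster positions, spreads, and sigma values are illustrative, not the paper's exact parameters):

    import numpy as np

    rng = np.random.default_rng(1)
    # Two 1-D clusters; the right one is deliberately more spread out,
    # i.e. it has less within-group similarity.
    points = np.concatenate([rng.normal(0.0, 0.5, 40),
                             rng.normal(6.0, 1.5, 40)])

    d = np.abs(points[:, None] - points[None, :])   # pairwise distances

    # Two monotonically decreasing weight functions with different fall-off.
    W_fast = np.exp(-(d / 0.5) ** 2)  # fast fall-off: only nearby points connect
    W_slow = np.exp(-(d / 5.0) ** 2)  # slow fall-off: most pairs stay connected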

Fast-falling weight function: with this function, only close-by points are connected.

[Figure: the criterion used and the corresponding plot of the second smallest eigenvector]

Interpretation
The cluster on the right has less within-group similarity than the cluster on the left. In this case, average association fails to find the right partition; instead, it focuses on finding small clusters in each of the two main subgroups.

Slowly decreasing weight function: with this function, most points have non-trivial connections to the rest.

[Figure: the criterion used and the corresponding plot of the second smallest eigenvector]

Interpretation
To find a cut of the graph, a number of edges with heavy weights have to be removed. In this case, average cut has trouble deciding where to cut.

Reference
J. Shi and J. Malik, "Normalized Cuts and Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 888–905, Aug. 2000.