Image Segmentation February 27, 2007

The implicit scheme copes considerably better with topological change. Transition from active contours: –contour v(t) → front Γ(t) –contour energy → forces F_A, F_C –image energy → speed function k_I Level set: –The level set c_0 at time t of a function φ(x,y,t) is the set of arguments { (x,y) : φ(x,y,t) = c_0 } –Idea: define a function φ(x,y,t) so that at any time, Γ(t) = { (x,y) : φ(x,y,t) = 0 } there are many such φ φ has many other level sets, more or less parallel to Γ only Γ has a meaning for segmentation, not any other level set of φ

Need to figure out how to evolve the level set function!

Usual choice for φ: signed distance to the front Γ(0) φ(x,y,0) = −d(x,y,Γ) if (x,y) is inside the front φ(x,y,0) = 0 if (x,y) is on the front φ(x,y,0) = d(x,y,Γ) if (x,y) is outside the front Level Set Framework
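A minimal way to compute this initialization in practice, assuming Python with NumPy and SciPy (scipy.ndimage.distance_transform_edt gives the Euclidean distance to the nearest background pixel); the function name init_phi is illustrative:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def init_phi(inside_mask):
    """Signed distance to the front: negative inside, positive outside.

    inside_mask: boolean array, True for pixels inside the initial front.
    """
    dist_out = distance_transform_edt(~inside_mask)  # > 0 outside the front
    dist_in = distance_transform_edt(inside_mask)    # > 0 inside the front
    return dist_out - dist_in                        # phi < 0 inside, > 0 outside

# Example: a circular initial front in a 100x100 image
yy, xx = np.mgrid[0:100, 0:100]
phi0 = init_phi((xx - 50) ** 2 + (yy - 50) ** 2 < 30 ** 2)
```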

φ(x,y,t) → φ(x,y,t+1) = φ(x,y,t) + Δφ(x,y,t) no movement of samples, only change of values the front may change its topology the front location may fall between samples Level Set

Segmentation with LS: Initialise the front Γ(0) Compute φ(x,y,0) Iterate: φ(x,y,t+1) = φ(x,y,t) + Δφ(x,y,t) until convergence Mark the front Γ(t_end) Level Set

Link between spatial and temporal derivatives, but not the same type of motion as contours! The update φ(x,y,t+1) − φ(x,y,t) is a product of influences: Δφ(x,y,t) = − k̂_I(x,y) · ( F_A + F_C(κ) ) · ||∇φ(x,y,t)|| –constant "force" F_A (balloon pressure) –extension k̂_I of the speed function k_I (image influence) –smoothing "force" F_C depending on the local curvature κ (contour influence) –spatial derivative ||∇φ|| of φ Equation for Front Propagation

Speed function: –k_I is meant to stop the front on the object's boundaries –similar to image energy: k_I(x,y) = 1 / ( 1 + ||∇I(x,y)|| ) –only makes sense on the front (level set 0) –yet, the same equation is used for all level sets → extend k_I to all level sets, defining k̂_I –possible extension: k̂_I(x,y) = k_I(x′,y′) where (x′,y′) is the point on the front closest to (x,y) ( such a k̂_I(x,y) depends on the front location ) Equation for Front Propagation

1. compute the speed k_I on the front and extend it to all other level sets (k̂_I) 2. compute φ(x,y,t+1) = φ(x,y,t) + Δφ(x,y,t) 3. find the front location (for the next iteration): locate the zero level set of φ(x,y,t+1) by linear interpolation Algorithm
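A minimal NumPy sketch of this loop, under simplifying assumptions: it uses the cheap pixelwise speed extension k̂_I(x,y) = 1/(1 + ||∇I(x,y)||) from the narrow-band slide below rather than the closest-front-point extension, takes F_C(κ) = εκ, and skips step 3's sub-pixel interpolation. All names and parameter values are illustrative:

```python
import numpy as np

def evolve_level_set(image, phi, n_iter=200, dt=0.1, f_a=1.0, eps=0.5):
    """Minimal explicit level-set evolution (illustrative parameters).

    Update: phi(t+1) = phi(t) - dt * k_I * (F_A + eps * curvature) * |grad phi|
    """
    # Pixelwise speed: small where the image gradient is large (edges).
    gy, gx = np.gradient(image.astype(float))
    k_img = 1.0 / (1.0 + np.hypot(gx, gy))

    for _ in range(n_iter):
        py, px = np.gradient(phi)
        grad_norm = np.hypot(px, py) + 1e-8

        # Curvature of the level sets: div( grad phi / |grad phi| )
        nyy, _ = np.gradient(py / grad_norm)   # d(n_y)/dy
        _, nxx = np.gradient(px / grad_norm)   # d(n_x)/dx
        curvature = nxx + nyy

        phi = phi - dt * k_img * (f_a + eps * curvature) * grad_norm
    return phi

# The front at time t_end is the zero level set, e.g. abs(phi) < 0.5.
```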

Weaknesses of algorithm 1: –update of all φ(x,y,t): inefficient, we only care about the front –speed extension: computationally expensive Improvement: –narrow band: only update a few level sets around Γ –other extended speed: k̂_I(x,y) = 1 / ( 1 + ||∇I(x,y)|| ) Narrow band extension

Caution: –extrapolate the curvature κ at the edges of the band –re-select the narrow band regularly: an empty pixel cannot get a value → may restrict the evolution of the front Narrow band extension

Level sets: –function φ : [ 0, I_width ] x [ 0, I_height ] x N → R, ( x, y, t ) ↦ φ(x,y,t) –embed a curve Γ: Γ(t) = { (x,y) : φ(x,y,t) = 0 } –Γ(0) is provided externally, φ(x,y,0) is computed –φ(x,y,t+1) is computed by changing the values of φ(x,y,t) –changes use a product of influences –on convergence, Γ(t_end) is the border of the object Issue: –computation time (improved with the narrow band) Summary

Segmentation: a region in the image 1. with some homogeneous properties (intensity, colors, texture, …) 2. with cohesion (parts that moved in a similar way: motion segmentation) Active contours will have difficulties with natural images. Useful grouping cues: similarity (intensity difference), proximity, continuity. Image Segmentation

Image Segmentation The first step towards higher-level vision (object recognition etc.). There may not be a single correct answer. Segmentation can be considered a partition problem. The literature on this topic is tremendous. Many approaches: cues such as color, regions, contours, texture, motion, etc. Automatic vs. user-assisted.

Image Segmentation Results

1. Histogram-based segmentation 2. Region-based segmentation: edge detection; region growing, splitting and merging 3. Clustering: K-means 4. Graph-based clustering Main Approaches

Simple Example (text segmentation) Thresholding How to choose the threshold value? (128/256, median/mean, etc.)
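A sketch of thresholding with the choices mentioned above, plus Otsu's classic histogram rule (pick the threshold maximizing between-class variance), assuming a grayscale uint8 NumPy image; the function name is illustrative:

```python
import numpy as np

def threshold_segment(gray, method="otsu"):
    """Binarize a grayscale image; 'gray' is a 2-D uint8 array."""
    if method == "mean":
        t = gray.mean()
    elif method == "median":
        t = np.median(gray)
    else:
        # Otsu: choose t maximizing between-class variance w0*w1*(mu0-mu1)^2.
        hist = np.bincount(gray.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()
        best_t, best_var = 0, -1.0
        for t in range(1, 256):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (np.arange(t) * p[:t]).sum() / w0
            mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
            var = w0 * w1 * (mu0 - mu1) ** 2
            if var > best_var:
                best_t, best_var = t, var
        t = best_t
    return gray >= t
```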

Break the image into K regions by reducing the intensity values to K different levels, choosing K − 1 threshold values from the histogram. Histogram-based Methods

Consider the image as a set of points in an N-dimensional feature space: 1. intensity or color values [ (x, y, I) or (x, y, r, g, b) ] 2. texture and other features Segmentation as a clustering problem: work directly in the feature space and cluster the points there. Requirements: 1. a good definition of the feature space 2. distances between feature points should be meaningful

What is the “correct” grouping?

Segmentation as a clustering problem

Two Sub-problems

K-means clustering
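A minimal NumPy sketch of Lloyd's algorithm (the standard k-means iteration), with illustrative names; in practice a library implementation such as scikit-learn's KMeans would normally be used:

```python
import numpy as np

def kmeans(points, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm; points is (n, d), returns labels and centers."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to the nearest center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        new_centers = np.array([points[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Segmenting by color: features = img.reshape(-1, 3), then kmeans(features, K).
```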

Segmentation Result

Set of points of the feature space represented as a weighted, undirected graph, G = (V, E) The points of the feature space are the nodes of the graph. Edge between every pair of nodes. Weight on each edge, w(i, j), is a function of the similarity between the nodes i and j. Partition the set of vertices into disjoint sets where similarity within the sets is high and across the sets is low. Graph Partitioning Model

Weight measure (reflects the likelihood of two pixels belonging to the same object). Note: the weight function is based on local similarity. Weight function

Images as Graphs

A graph can be partitioned into two disjoint sets by simply removing the edges connecting the two parts. The degree of dissimilarity between these two pieces can be computed as the total weight of the edges that have been removed. More formally, this is called the 'cut': cut(A, B) = Σ_{u∈A, v∈B} w(u, v). Global Criterion Selection

Minimize the cut value, subject to the constraint that A and B form a partition of V (A ∪ B = V, A ∩ B = ∅). The number of such partitions is exponential (2^N), but the minimum cut can be found efficiently. Reference: Z. Wu and R. Leahy, "An Optimal Graph Theoretic Approach to Data Clustering: Theory and Its Application to Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 11, pp. 1101–1113, Nov. 1993. Optimization Problem
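For intuition, the global minimum cut of a small weighted graph can be computed with the Stoer-Wagner algorithm, here via the networkx library; the toy graph below is an illustrative example, not from the source:

```python
import networkx as nx

# Tiny weighted graph: two dense clusters joined by one weak edge.
G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 5.0), (1, 2, 5.0), (0, 2, 5.0),   # cluster A
    (3, 4, 5.0), (4, 5, 5.0), (3, 5, 5.0),   # cluster B
    (2, 3, 0.1),                              # weak link between clusters
])

# Global minimum cut; cut_value is the total weight of removed edges.
cut_value, (part_a, part_b) = nx.stoer_wagner(G)
print(cut_value)        # 0.1 (cutting the weak link)
print(part_a, part_b)   # the two clusters
```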

Picking an appropriate criterion to minimize which would result in a “good” segmentation Finding an efficient way to achieve the minimization Challenges

Set of points in the feature space with similarity (relation) defined for pairs of points. Problem: Partition the feature points into disjoint sets where similarity within the sets is high and across the sets is low. Construct a complete graph G = (V, E) whose nodes are the points, with an edge between every pair of nodes. Weight on each edge, w(i, j), is a function of the similarity between the nodes i and j. A cut (of V) gives the partition. Graph Partitioning Model

Minimize the cut value, subject to the constraint that A and B form a partition of V (A ∪ B = V, A ∩ B = ∅). The number of such partitions is exponential (2^N), but the minimum cut can be found efficiently. Reference: Z. Wu and R. Leahy, "An Optimal Graph Theoretic Approach to Data Clustering: Theory and Its Application to Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 11, pp. 1101–1113, Nov. 1993. Optimization Problem

Weights defined as W_ij = exp( −|s_i − s_j|² / 2σ² )
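A direct NumPy transcription of this weight definition, with the feature vectors s_i as rows of a matrix and sigma a free parameter:

```python
import numpy as np

def affinity_matrix(features, sigma=1.0):
    """W[i, j] = exp(-||s_i - s_j||^2 / (2 sigma^2)); features is (n, d)."""
    sq_dists = ((features[:, None, :] - features[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

# For images, the features can combine brightness/color with position, or a
# second factor exp(-||x_i - x_j||^2 / (2 sigma_x^2)) can be multiplied in,
# so that only nearby pixels are strongly connected (local similarity).
```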

We must avoid an unnatural bias toward partitioning out small sets of points. Normalized cut computes the cut cost as a fraction of the total edge connections to all the nodes in the graph: Ncut(A, B) = cut(A, B)/assoc(A, V) + cut(A, B)/assoc(B, V), where assoc(A, V) = Σ_{u∈A, t∈V} w(u, t) is the total connection from nodes in A to all nodes in the graph. Normalized Cut J. Shi and J. Malik, "Normalized Cuts and Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 888–905, Aug. 2000.

Our criterion can also aim to tighten similarity within the groups: Nassoc(A, B) = assoc(A, A)/assoc(A, V) + assoc(B, B)/assoc(B, V). Minimizing Ncut and maximizing Nassoc are equivalent, since Ncut(A, B) = 2 − Nassoc(A, B). Normalized Cut

Weights defined as W_ij = exp( −|s_i − s_j|² / 2σ² ). A larger Nassoc(A, B) value reflects a tighter cluster; a looser grouping gives a smaller Nassoc(A, B) value.

Let e be an indicator vector (of dimension N): e_i = 1 if i belongs to A, 0 otherwise, and let D be the diagonal matrix with D_ii = Σ_j W_ij. Then assoc(A, A) = e^T W e assoc(A, V) = e^T D e cut(A, V−A) = e^T (D − W) e Matrix Formulation Find two indicator vectors e_1, e_2 (with e_1^T e_2 = 0) such that Ncut = e_1^T (D − W) e_1 / e_1^T D e_1 + e_2^T (D − W) e_2 / e_2^T D e_2 is minimized.
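These identities are easy to sanity-check numerically; a small NumPy sketch on a random symmetric weight matrix (the choice A = {0, 1, 2} is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((6, 6))
W = (W + W.T) / 2            # symmetric weights
np.fill_diagonal(W, 0)
D = np.diag(W.sum(axis=1))   # degree matrix

e = np.array([1, 1, 1, 0, 0, 0], dtype=float)  # indicator of A = {0, 1, 2}
A, B = [0, 1, 2], [3, 4, 5]

assoc_AA = sum(W[i, j] for i in A for j in A)
assoc_AV = sum(W[i, j] for i in A for j in range(6))
cut_AB = sum(W[i, j] for i in A for j in B)

assert np.isclose(e @ W @ e, assoc_AA)
assert np.isclose(e @ D @ e, assoc_AV)
assert np.isclose(e @ (D - W) @ e, cut_AB)
```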

Exact solution to minimizing normalized cut is an NP-complete problem However, approximate discrete solutions can be found efficiently Normalized cut criterion can be computed efficiently by solving a generalized eigenvalue problem Computational Issues

Relaxation

1. Construct the weighted graph representing the image. Summarize the information in the matrices W and D. Edge weights are an exponential function of feature similarity as well as a distance measure. 2. Solve for the eigenvector with the second smallest eigenvalue of the generalized eigenvalue problem (D − W)x = λ D x, where L = D − W is the graph Laplacian. Algorithm (for image segmentation)

3. Partition the graph into two pieces using the second smallest eigenvector: the signs of its entries tell us exactly how to partition the graph. 4. Recursively run the algorithm on the two partitioned parts. Recursion stops once the Ncut value exceeds a certain limit; this maximum allowed Ncut value controls the number of groups segmented. Algorithm (cont.)
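A dense-matrix sketch of steps 2-3 for one bipartition, assuming SciPy (scipy.linalg.eigh solves the generalized symmetric eigenproblem; real implementations use sparse solvers for large images). Thresholding the eigenvector at zero is the simplest choice of splitting point:

```python
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    """Split a graph in two using the second smallest generalized eigenvector.

    Solves (D - W) x = lambda D x and thresholds the resulting vector at 0.
    """
    d = W.sum(axis=1)
    D = np.diag(d)
    L = D - W
    # eigh(L, D) solves the generalized symmetric problem L x = lambda D x;
    # eigenvalues are returned in ascending order.
    vals, vecs = eigh(L, D)
    second = vecs[:, 1]        # eigenvector of the second smallest eigenvalue
    return second >= 0         # boolean partition of the nodes
```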

Example

Results

K-way Normalized Cut One can recursively apply normalized cut to obtain the desired number of subgraphs. Better still, we can do the following. Let e_1, e_2, …, e_K denote the indicator vectors of a K-partition of the graph. We want to maximize the cost function C_W(e_1, …, e_K) = Σ_k ( e_k^T W e_k / e_k^T D e_k ). The trick is to rewrite this optimization problem.

K-way Normalized Cut Proposition (Bach-Jordan 2003): C_W(e_1, …, e_K) equals tr( Y^T D^{-1/2} W D^{-1/2} Y ) for any N-by-K matrix Y (N is the number of data points) such that: 1. the columns of D^{-1/2} Y are piecewise constant with respect to the clusters 2. Y has orthonormal columns (Y^T Y = I) Why this? Because it reduces (and relaxes) the problem to a very solvable form that appears frequently: maximize tr( Y^T M Y ) subject to Y^T Y = I (with M symmetric!). Answer: Y = the K largest eigenvectors of M.

Spectral Clustering (Ng, Jordan & Weiss, 2001) Using a spectral method (i.e., the eigenstructure of some symmetric matrix) to compute clusters. Given a set of points s_1, s_2, …, s_n: 1. Form the affinity matrix A (the W above): A_ij = exp( −|s_i − s_j|² / 2σ² ) if i ≠ j, and A_ii = 0. 2. Let D be the diagonal matrix whose (i, i) element is the sum of A's i-th row, and let L = D^{-1/2} A D^{-1/2}. 3. Find x_1, x_2, …, x_k, the k largest eigenvectors of L, and form X = [ x_1 x_2 … x_k ]. 4. Renormalize each of X's rows to have unit length, giving Y. 5. Treating each row of Y as a point in R^k, cluster the rows into k clusters via K-means or another algorithm. 6. Assign s_i to cluster j if and only if row i of Y was assigned to cluster j.
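A direct NumPy transcription of the six steps, assuming scikit-learn's KMeans for step 5; sigma is a free parameter and the function name is illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

def njw_spectral_clustering(points, k, sigma=1.0):
    """Ng-Jordan-Weiss spectral clustering, following the six steps above."""
    # 1. Affinity matrix with zero diagonal.
    sq = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    A = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    # 2. Normalized affinity L = D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # 3. Top-k eigenvectors (eigh returns eigenvalues in ascending order).
    _, vecs = np.linalg.eigh(L)
    X = vecs[:, -k:]
    # 4. Renormalize rows of X to unit length.
    Y = X / np.linalg.norm(X, axis=1, keepdims=True)
    # 5./6. K-means on the rows of Y; row i's cluster is point i's cluster.
    return KMeans(n_clusters=k, n_init=10).fit_predict(Y)
```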

Spectral Clustering In the ideal case, when there is no inter-cluster affinity, the matrix A is block-diagonal (after some row and column permutation). (Figure: computed affinity matrix)

Spectral Clustering Each block has 1 as its largest eigenvalue, so the top K eigenvalues of L are all 1, and the corresponding eigenvectors are supported one per block: X = [ X_1 X_2 X_3 … X_K ]. X's rows (after normalization) give a 'projection' of the data points onto the unit sphere in R^K.

Spectral Clustering

(Figure: K-means vs. the spectral technique)