Image segmentation Prof. Noah Snavely CS1114

Image segmentation  Given an image, can we find and segment all the objects? 2

Why image segmentation?
• Makes the image much easier to analyze
  – Large number of pixels → small number of segments
• Find structures we care about (e.g., lightsticks, bones)
• Compression

Image segmentation  A (much) more difficult version of thresholding to find the red lightstick –We don’t know what colors the objects are how many objects there are whether each object is even a constant color 4

Image segmentation  To start with, we’ll look at a very simple version of segmentation  Color quantization –Take an image with (possibly) many colors, convert it to an image with a small number of colors 5 Demo

How do we quantize the colors?
• Option 1: Choose a fixed palette (red, green, blue, purple, white, …)
• Option 2: Optimize the palette for a given image
[Figure: the image quantized to 16 colors]

How do we compute the optimal palette?
• This is an example of a clustering problem:
  1. Find groups of pixels (clusters) that have similar color (i.e., similar RGB values)
  2. Assign all the pixels in each cluster the same color

Applications of clustering
• Economics or politics
  – Finding similarly minded or similarly behaving groups of people (market segmentation)
  – Finding stocks that behave similarly
• Spatial clustering
  – Earthquake centers cluster along faults
• Social network analysis
  – Recognizing communities of similar people


Clustering  The distance between two items is the distance between their vectors  We’ll also assume for now that we know the number of clusters 10

Example
[Figure from Johan Everts]

Clustering algorithms
• There are many different approaches
• How is a cluster represented?
  – Is a cluster represented by a data point, or by a point in the middle of the cluster?
• What algorithm do we use?
  – An interesting class of methods uses graph partitioning, where edge weights are distances

One approach: k-means
• Suppose we are given n points, and want to find k clusters
• We will find k cluster centers (or means), and assign each of the n points to the nearest cluster center
  – A cluster is the subset of the n points assigned to the same center
  – We'll call each cluster center a mean

k-means  How do we define the best k means? 14 Legend - centers (means) - clusters

k-means  Idea: find the centers that minimize the sum of squared distances to the points  Objective: 15

Optimizing k-means  This is called an objective function  Goal is to find the clusters and means that minimize this objective function  How do we do this? 16

Optimizing k-means  Brute-force approach: 1. Try every possible clustering (The best mean for a cluster is just the average of the cluster elements) 2. Check the value of the objective for this clustering 3. Pick the clustering that gives the minimum value 17

Optimizing k-means  Brute-force approach: 1. Try every possible clustering (The best mean for a cluster is just the average of the cluster elements) 2. Check the value of the objective for this clustering 3. Pick the clustering that gives the minimum value  How much work is this? –(How many possible clusterings are there?) 18

Optimizing k-means  The bad news: it is practically impossible to find the global minimum of this objective function –no one has ever come up with an algorithm that is faster than exponential time (and probably never will)  There are many problems like this (called NP-hard) 19

Optimizing k-means  It’s possible to prove that this is a hard problem (you’ll learn how in future CS courses – it involves reductions)  What now?  We shouldn’t give up… it might still be possible to get a “pretty good” solution 20

Greedy algorithms
• Many CS problems can be solved by repeatedly doing whatever seems best at the moment
  – I.e., without needing a long-term plan
• These are called greedy algorithms
• Example: sorting by swapping out-of-order pairs (e.g., bubble sort)
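The sorting example above can be sketched in a few lines of Python: bubble sort is greedy in the sense that each step makes the locally best move (fixing one adjacent out-of-order pair) with no long-term plan:

```python
def bubble_sort(a):
    """Greedy sorting: repeatedly swap adjacent out-of-order pairs."""
    a = list(a)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:          # locally best move: fix this pair
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
    return a

print(bubble_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]
```

Here greedy happens to reach the globally correct answer; the next slide shows that it doesn't always.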

Making change
• For US currency (quarters, dimes, nickels, pennies) we can make change with a greedy algorithm:
  1. While remaining change is > 0
  2. Give the highest-denomination coin whose value is ≤ the remaining change
• What if our denominations are 50, 49, and 1?
  – How should we give back 98 cents in change?
  – Greedy algorithms don’t always work…
  – (This problem requires more advanced techniques)

A greedy method for k-means
1. Pick a random point to start with; this is your first cluster center
2. Find the farthest point from the cluster center; this is a new cluster center
3. Find the farthest point from any cluster center and add it as a new center
4. Repeat step 3 until we have k centers
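The steps above can be sketched as follows; this is the farthest-point heuristic. (The function name is mine, and for simplicity the "random" start is just the first point.)

```python
def farthest_point_centers(points, k):
    """Greedy center picking: start somewhere, then repeatedly add the
    point that is farthest from all centers chosen so far."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    centers = [points[0]]                 # step 1 ("random" start)
    while len(centers) < k:               # steps 2-4
        # The point whose nearest existing center is farthest away
        next_center = max(points,
                          key=lambda p: min(dist2(p, c) for c in centers))
        centers.append(next_center)
    return centers

pts = [(0, 0), (1, 0), (10, 0), (11, 0)]
print(farthest_point_centers(pts, 2))  # [(0, 0), (11, 0)]
```

Each iteration scans all n points against all current centers, so the whole method costs O(nk) distance computations.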


A greedy method for k-means
• Unfortunately, this doesn’t work that well
• The answer we get could be much worse than the optimum

The k-centers problem
• Let’s look at a related problem: k-centers
• Find k cluster centers that minimize the maximum distance between any point and its nearest center
  – We want the worst point in the worst cluster to still be good (i.e., close to its center)
  – Concrete example: place k hospitals in a city so as to minimize the maximum distance from any house to its nearest hospital
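The k-centers objective differs from k-means only in replacing the sum of squared distances with a maximum. A sketch of evaluating it (the function name is mine; `math.dist` is the standard-library Euclidean distance, Python 3.8+):

```python
import math

def k_centers_cost(points, centers):
    """k-centers objective: distance from the worst-served point
    to its nearest center (smaller is better)."""
    return max(
        min(math.dist(p, c) for c in centers)  # each point's nearest center
        for p in points
    )

pts = [(0, 0), (1, 0), (10, 0)]
print(k_centers_cost(pts, [(0, 0), (10, 0)]))  # 1.0: (1, 0) is the worst point
```

In the hospital analogy, this is the travel distance of the single worst-off house.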

An amazing property
• This greedy algorithm gives you a solution to the k-centers problem that is no worse than twice the optimum
• Such results are sometimes difficult to achieve, and are the subject of much research
  – Mostly in CS6810, a bit in CS4820
  – You can’t find the optimum, yet you can prove something about it!
• Sometimes related problems (e.g., k-means vs. k-centers) have very different guarantees

Next time
• More on clustering