Problem Sets
Problem Set 3
– Distributed Tuesday, 3/18.
– Due Thursday, 4/3.
Problem Set 4
– Distributed Tuesday, 4/1.
– Due Tuesday, 4/15.
Probably a total of 5 problem sets.

E-M
Reading:
– Forsyth & Ponce 16.1, 16.2.
– Forsyth & Ponce 16.3 for the challenge problem.
– Yair Weiss: Motion Segmentation using EM – a short tutorial.
– www
– Especially the first 2 pages.

Examples of Perceptual Grouping
Boundaries:
– Find closed contours near edges that are smooth.
– Gestalt laws: common form, good continuation, closure.
Smooth paths:
– Find open, smooth paths in images.
– Applications: road finding; intelligent scissors (click on points and follow the boundary between them).
– Gestalt laws: good continuation, common form.

Examples of Perceptual Grouping
Regions: find regions with uniform properties, to help find objects or parts of interest.
– Color
– Intensity
– Texture
– Gestalt laws: common form, proximity.

Examples of Perceptual Grouping
Useful features:
– Example: straight lines, which can be used to find vanishing points.
– Gestalt laws: collinearity, proximity.

Parametric Methods
We discussed RANSAC and the Hough transform. They have some limitations:
– The object must have few parameters.
– They find an answer, but is it the best answer?
– Hard to say, because the problem definition is a bit vague.

Expectation-Maximization (EM)
Can handle more parameters.
Uses a probabilistic model of all the data.
– Good: we are solving a well-defined problem.
– Bad: often we don't have a good model of the data, especially the noise.
– Bad: computations are only feasible with pretty simple models, which can be unrealistic.
Finds a (locally) optimal solution.

E-M Definitions
Models have parameters: u
– Examples: a line has slope and intercept; a Gaussian has mean and variance.
Data is what we know from the image: y
– Example: points we want to fit a line to.
Assignment of data to models: z
– E.g., which points came from line 1.
– z(i,j) = 1 means data point i came from model j.
Data and assignments together (y & z): x.

E-M Definitions
Missing data: we know y; the missing values are u and z.
Mixture model: the data is a mixture of more than one model.

E-M Quick Overview
We know the data (y). We want to find the assignments (z) and the parameters (u).
If we know y & u, we can find z more easily.
If we know y & z, we can find u more easily.
Algorithm:
1. Guess u.
2. Given the current guess of u and y, find z. (E-step)
3. Given the current guess of z and y, find u. (M-step)
4. Go to 2.

Example: Histograms
A histogram gives a 1-D clustering problem.
Constant regions + noise = Gaussians.
Guess the mean and variance of each cluster of pixel intensities.
Compute the membership of each pixel.
Compute the means as weighted averages.
Compute the variances as weighted sample variances.
Details: whiteboard; also Matlab and Weiss.
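The histogram example can be sketched as a short EM loop for a two-Gaussian mixture of pixel intensities. This is a minimal illustration, not the course's reference code; the initialization and iteration count are assumptions, and the synthetic intensities stand in for a real histogram.

```python
import numpy as np

def em_two_gaussians(y, n_iter=50):
    """Fit a mixture of two 1-D Gaussians to intensities y with EM."""
    # Initial guess: centers at the data extremes, high variance, equal
    # mixing weights.
    mu = np.array([y.min(), y.max()], dtype=float)
    var = np.full(2, y.var())
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: membership (responsibility) of each pixel in each model.
        dens = pi / np.sqrt(2 * np.pi * var) * \
            np.exp(-(y[:, None] - mu) ** 2 / (2 * var))
        z = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted averages and weighted sample variances.
        n_k = z.sum(axis=0)
        mu = (z * y[:, None]).sum(axis=0) / n_k
        var = (z * (y[:, None] - mu) ** 2).sum(axis=0) / n_k + 1e-6
        pi = n_k / len(y)
    return mu, var, pi

# Synthetic "two constant regions + noise" intensities.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(60, 5, 500), rng.normal(180, 8, 500)])
mu, var, pi = em_two_gaussians(y)
```

The recovered means land near the two region intensities (60 and 180), matching the "constant regions + noise" picture.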

More Subtle Points
The guess must be reasonable, or we won't converge to anything reasonable.
– It seems good to start with a high variance.
How do we stop?
– When things don't change much.
– We could look at the parameters (u).
– Or at the likelihood of the data.
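The likelihood-based stopping test can be sketched as below, for the same 1-D Gaussian-mixture setting (the function name and arrays here are illustrative, not course code). EM never decreases this quantity, so a common rule is to stop when the increase between iterations falls below a small tolerance.

```python
import numpy as np

def log_likelihood(y, mu, var, pi):
    """Total log-likelihood of intensities y under a 1-D Gaussian mixture."""
    dens = pi / np.sqrt(2 * np.pi * var) * \
        np.exp(-(y[:, None] - mu) ** 2 / (2 * var))
    return np.log(dens.sum(axis=1)).sum()

# Sanity check: parameters near the truth score higher than wrong ones.
rng = np.random.default_rng(0)
y = rng.normal(100.0, 10.0, 1000)
good = log_likelihood(y, np.array([100.0]), np.array([100.0]), np.array([1.0]))
bad = log_likelihood(y, np.array([0.0]), np.array([100.0]), np.array([1.0]))
```

In a full EM loop one would keep the previous value and stop once the improvement is below, say, 1e-6 times its magnitude.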

Overview Again
Break the unknowns into pieces. If we know one piece, the other is solvable.
Guess piece 1. Solve for piece 2. Then solve for piece 1. And so on.
A very useful strategy for solving problems.

Drawbacks
Local optimum. Optimization: we take steps that make the solution better and better, and stop when the next step doesn't improve it. But we don't try all possible steps.

Local maximum

which is an excellent fit to some points

Drawbacks
How do we get the models?
– But if we don't know the models, in some sense we just don't understand the problem.
Starting point? Use a non-parametric method.
How many models?

A dataset that is well fitted by four lines

Result of EM fitting, with one line (or at least, one available local maximum).

Result of EM fitting, with two lines (or at least, one available local maximum).

Seven lines can produce a rather logical answer

Segmentation with EM
Figure from "Color and Texture Based Image Segmentation Using EM and Its Application to Content-Based Image Retrieval," S.J. Belongie et al., Proc. Int. Conf. Computer Vision, 1998. © 1998 IEEE.

Motion Segmentation with EM
Model an image pair (or video sequence) as consisting of regions of parametric motion.
– Affine motion is popular.
Now we need to:
– determine which pixels belong to which region;
– estimate the motion parameters.
Likelihood:
– assume …
A straightforward missing-variable problem; the rest is calculation.
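The affine model mentioned above can be sketched concretely: each motion layer's flow is linear in pixel position, with six parameters per layer. The function name and parameter ordering below are illustrative, not from the slide.

```python
import numpy as np

def affine_flow(params, xs, ys):
    """Flow for one motion layer: linear in pixel position (6 parameters)."""
    a1, a2, a3, a4, a5, a6 = params
    u = a1 + a2 * xs + a3 * ys   # horizontal motion component
    v = a4 + a5 * xs + a6 * ys   # vertical motion component
    return u, v

# Pure translation: only the constant terms a1, a4 are nonzero.
xs, ys = np.meshgrid(np.arange(4.0), np.arange(3.0))
u, v = affine_flow((2.0, 0.0, 0.0, -1.0, 0.0, 0.0), xs, ys)
```

In the EM setting, the E-step would assign pixels to layers based on how well each layer's flow predicts the next frame, and the M-step would re-estimate the six parameters per layer from its assigned pixels.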

Three frames from the MPEG "flower garden" sequence. Figure from "Representing Moving Images with Layers," by J. Wang and E.H. Adelson, IEEE Transactions on Image Processing, 1994. © 1994 IEEE.

Segments and the motion fields associated with them; grey level shows the region number with the highest probability. Figure from "Representing Moving Images with Layers," by J. Wang and E.H. Adelson, IEEE Transactions on Image Processing, 1994. © 1994 IEEE.

If we use multiple frames to estimate the appearance of a segment, we can fill in occlusions, so we can re-render the sequence with some segments removed. Figure from "Representing Moving Images with Layers," by J. Wang and E.H. Adelson, IEEE Transactions on Image Processing, 1994. © 1994 IEEE.

Probabilistic Interpretation
We want P(u | y) (a probability distribution over models given the data). Or maybe P(u, z | y). Or the u maximizing P(u | y).
We compute: the (u, z) maximizing P(y | u, z).
– Find the model and assignments that make the data as likely to have occurred as possible.
– This is similar to finding the most likely model and assignments given the data, ignoring the prior on models.
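Written out, the two objectives contrasted on this slide are:

```latex
\underbrace{(\hat{u},\hat{z}) \;=\; \arg\max_{u,z}\; P(y \mid u, z)}_{\text{what we compute}}
\qquad\text{vs.}\qquad
\underbrace{\hat{u} \;=\; \arg\max_{u}\; P(u \mid y) \;=\; \arg\max_{u}\; \frac{P(y \mid u)\, P(u)}{P(y)}}_{\text{what we want}}
```

Since P(y) does not depend on u, a flat prior P(u) makes the two agree, up to replacing the sum over z inside P(y | u) with a maximization over z.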

Generalizations
Multi-dimensional Gaussians.
– Color, texture, …
– In 1-D this reduces to the ordinary Gaussian.
– In 2-D: nested ellipses of equal probability.
– The ellipses are axis-aligned if the covariance (Sigma) is diagonal, and become discs if Sigma is a multiple of the identity.
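A small check of that last point, with an assumed density helper (not course code): when Sigma is proportional to the identity, points at equal distance from the mean get equal density, so the equi-probability contours are circles (discs).

```python
import numpy as np

def gaussian_density(x, mu, sigma):
    """Multivariate Gaussian density evaluated at the rows of x (N x d)."""
    d = len(mu)
    diff = x - mu
    inv = np.linalg.inv(sigma)
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(sigma))
    # Quadratic form diff^T Sigma^{-1} diff, one value per row of x.
    return norm * np.exp(-0.5 * np.einsum('ni,ij,nj->n', diff, inv, diff))

# Three points on a circle of radius 2 around the mean.
mu = np.zeros(2)
sigma = 4.0 * np.eye(2)   # isotropic covariance
pts = np.array([[2.0, 0.0], [0.0, 2.0], [np.sqrt(2.0), np.sqrt(2.0)]])
dens = gaussian_density(pts, mu, sigma)
```

With a diagonal but unequal Sigma (e.g. diag(4, 1)), the three densities would differ, reflecting axis-aligned ellipses instead.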