CSSE463: Image Recognition Day 10. Lab 3 due Wednesday, 11:59 pm. Take-home quiz due Friday, 4:00 pm.


CSSE463: Image Recognition Day 10. Lab 3 due Wednesday, 11:59 pm. Take-home quiz due Friday, 4:00 pm. Today: region orientation (principal axes). Questions?

Principal Axes. Gives the orientation and elongation of a region. Demo. Uses the eigenvalues and eigenvectors of the covariance of the set of points in the region.

Some intuition from statistics. Sometimes changing axes can give more intuitive results. (Figure: scatter plot of height vs. weight, with rotated axes labeled size and girth.) The size axis is the principal component: the dimension giving the greatest variability. The girth axis is perpendicular to the size axis; it is uncorrelated with size and gives the direction of least variability.

How to find principal components? Recall from statistics: for a distribution of 2 variables, the variance of each variable and the covariance between the two variables are defined as
σ_xx = (1/n) Σ (x_i − x̄)²,  σ_yy = (1/n) Σ (y_i − ȳ)²,  σ_xy = (1/n) Σ (x_i − x̄)(y_i − ȳ),
where n = # of data (points in the region). (Q1)
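A minimal sketch of these formulas in MATLAB/Octave (the labs use MATLAB, so that language is assumed here; the height/weight numbers are made up purely for illustration):

% Covariance entries computed directly from the definitions (1/n normalization).
x = [60; 62; 65; 70; 72];       % e.g., heights
y = [115; 120; 128; 145; 150];  % e.g., weights
n = numel(x);
sxx = sum((x - mean(x)).^2) / n;                 % variance of x
syy = sum((y - mean(y)).^2) / n;                 % variance of y
sxy = sum((x - mean(x)) .* (y - mean(y))) / n;   % covariance of x and y
C = [sxx sxy; sxy syy];                          % 2x2 covariance matrix
% Built-in equivalent: cov([x y], 1), where the 1 selects 1/n instead of 1/(n-1).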

Intuitions. σ_xx: how much x alone varies. σ_xy: how much x and y co-vary (are they correlated or independent?). σ_yy: how much y alone varies. Together, they form the covariance matrix, C = [σ_xx σ_xy; σ_xy σ_yy]. Examples on board. (Q2, Q3)
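For intuition, two made-up covariance matrices in the spirit of the board examples: C = [4 0; 0 1] describes data in which x varies much more than y and the two are uncorrelated, so the principal directions are simply the x and y axes; C = [2.5 2.4; 2.4 2.5] describes strongly correlated data whose dominant direction of variation lies along the diagonal x = y, as the eigenvectors on the next slide will show.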

Theorem (without proof). The eigenvectors of the covariance matrix give the directions of variation, sorted from the one corresponding to the largest eigenvalue to the one corresponding to the smallest. Because the covariance matrix is symmetric and positive semi-definite, the eigenvalues are guaranteed to be real and non-negative, and the eigenvectors are orthogonal. (Q4)
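A short MATLAB/Octave sketch (assumed language) applying this to the correlated example matrix above:

% Eigen-decomposition of a 2x2 covariance matrix.
C = [2.5 2.4; 2.4 2.5];
[V, D] = eig(C);                          % columns of V are eigenvectors
[lambda, idx] = sort(diag(D), 'descend'); % eigenvalues, largest first
V = V(:, idx);                            % reorder eigenvectors to match
% lambda is [4.9; 0.1]; V(:,1) is [1; 1]/sqrt(2) up to sign (direction of greatest
% variation), V(:,2) is [1; -1]/sqrt(2) up to sign, and V(:,1)'*V(:,2) is 0.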

Application to images. Consider the points in a region to be a 2D distribution of points, given as 2 vectors: [r, c] (as returned by find). Use the covariance formulas, but replace x with c and y with r. The elements of the covariance matrix are called second-order spatial moments. (These are different from the spatial color moments in the sunset paper!)
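As a sketch (assuming MATLAB, with bw a hypothetical binary mask of the region):

[r, c] = find(bw);        % row and column coordinates of the region's pixels
C = cov([c r], 1);        % 2x2 spatial covariance; c plays the role of x, r of y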

How to find principal axes?
1. Calculate the spatial covariance matrix using the previous formulas.
2. Find the eigenvalues, λ1 and λ2, and the eigenvectors, v1 and v2.
3. The direction of the principal axis is the direction of the eigenvector corresponding to the largest eigenvalue.
4. Finally, the ratio of the eigenvalues (largest to smallest) gives a measure of the elongation of the shape. (Q5)
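A minimal MATLAB/Octave sketch of steps 1-4, continuing from the hypothetical binary mask bw above; the last line uses the eigenvalue ratio as the elongation measure:

[r, c] = find(bw);
C = cov([c r], 1);                      % step 1: spatial covariance matrix
[V, D] = eig(C);                        % step 2: eigenvalues and eigenvectors
[lambda, idx] = sort(diag(D), 'descend');
v1 = V(:, idx(1));                      % step 3: eigenvector of the largest eigenvalue
theta = atan2(v1(2), v1(1));            % orientation of the principal axis (radians, in (c, r) coordinates)
elongation = lambda(1) / lambda(2);     % step 4: ratio of largest to smallest eigenvalue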

Lab 4. Could you use the region properties we've studied to distinguish different shapes (squares, rectangles, circles, ellipses, triangles, ...)?