Digital Image Processing Lecture 21: Principal Components for Description Prof. Charlene Tsai *Chapter 11.4 of Gonzalez.

Introduction Principal-component analysis is applicable to both boundaries and regions. It was first developed by Hotelling. Goal: remove the correlation among the elements of a random vector. Aside: x is a random vector if each element x_i of x is a random variable. (What is a random variable?)

Mean and Covariance {x_1, x_2, …, x_K} is a population of K random vectors of length n. The mean vector is m_x = E{x}, estimated from the population as m_x = (1/K) Σ_k x_k. The covariance matrix is defined as C_x = E{(x − m_x)(x − m_x)^T}, estimated as C_x = (1/K) Σ_k x_k x_k^T − m_x m_x^T, where E{·} denotes the expected value.

Example (Gonzalez, pg. 677) Consider the four vectors x_1 = (0,0,0)^T, x_2 = (1,0,0)^T, x_3 = (1,1,0)^T and x_4 = (1,0,1)^T. The mean vector and covariance matrix are m_x = (3/4, 1/4, 1/4)^T and C_x = (1/16)[3 1 1; 1 3 −1; 1 −1 3]. How do we interpret the entries of the covariance matrix? The diagonal entries are the variances of the individual components, and the off-diagonal entries are the covariances between pairs of components.
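As a minimal sketch, the mean vector and covariance matrix of this example can be checked numerically; the NumPy code below is illustrative and not part of the original lecture (note the 1/K normalization used by Gonzalez, rather than the 1/(K−1) default of some libraries):

    import numpy as np

    # The four sample vectors of the example, one per row.
    X = np.array([[0., 0., 0.],
                  [1., 0., 0.],
                  [1., 1., 0.],
                  [1., 0., 1.]])
    K = len(X)

    m_x = X.mean(axis=0)                       # mean vector: (3/4, 1/4, 1/4)
    C_x = X.T @ X / K - np.outer(m_x, m_x)     # covariance: (1/16)*[[3,1,1],[1,3,-1],[1,-1,3]]

    print(m_x)
    print(16 * C_x)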

What to do with Cx? We want to transform the vectors so that the elements of the new vectors are uncorrelated, i.e. the off-diagonal elements of their covariance matrix become 0. Because C_x is real and symmetric, there exists a set of n orthonormal eigenvectors e_i, with the corresponding eigenvalues λ_1 ≥ λ_2 ≥ … ≥ λ_n arranged in descending order.

Hotelling Transform (Principal-Component Analysis) Let A be the matrix whose rows are the eigenvectors of C_x, ordered so that the first row is the eigenvector of the largest eigenvalue and the last row is the eigenvector of the smallest eigenvalue. Create a new set of vectors via y = A(x − m_x).
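A minimal sketch of the transform, assuming the example population above and using NumPy's symmetric eigensolver (the variable names are illustrative, not from the lecture):

    import numpy as np

    # Population from the earlier example, with its mean and covariance.
    X = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [1., 0., 1.]])
    m_x = X.mean(axis=0)
    C_x = X.T @ X / len(X) - np.outer(m_x, m_x)

    # C_x is real and symmetric, so eigh applies; it returns eigenvalues in
    # ascending order, so reverse both to get lambda_1 >= ... >= lambda_n.
    eigvals, eigvecs = np.linalg.eigh(C_x)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

    A = eigvecs.T                  # rows of A are eigenvectors, largest eigenvalue first
    Y = (X - m_x) @ A.T            # Hotelling transform y = A (x - m_x), one y per row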

my and Cy for y = A(x − m_x) The mean vector is m_y = 0 (zero vector). The covariance matrix is C_y = A C_x A^T, a diagonal matrix with the eigenvalues λ_1, …, λ_n of C_x on its diagonal. This is exactly what we want: 0 off-diagonal elements. C_x and C_y share the same eigenvalues, and the eigenvectors of the diagonal C_y are simply the coordinate unit vectors. Still remember matrix diagonalization?
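A quick numerical check of this diagonalization, assuming the covariance matrix from the example above (an illustrative sketch, not from the lecture):

    import numpy as np

    # Covariance matrix of the earlier example and its eigen-decomposition.
    C_x = np.array([[3., 1., 1.], [1., 3., -1.], [1., -1., 3.]]) / 16.0
    eigvals, eigvecs = np.linalg.eigh(C_x)
    A = eigvecs[:, ::-1].T                      # rows = eigenvectors, descending eigenvalues
    lambdas = eigvals[::-1]

    # C_y = A C_x A^T is diagonal, with the eigenvalues of C_x on its diagonal.
    C_y = A @ C_x @ A.T
    print(np.allclose(C_y, np.diag(lambdas)))   # True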

Reconstruction of x from y A has orthonormal rows, i.e. A^{-1} = A^T, so x can be recovered exactly from the corresponding y: x = A^T y + m_x. An approximation can be made by using only the first k eigenvectors of C_x to form the k×n matrix A_k, giving x̂ = A_k^T y_k + m_x; the mean squared reconstruction error is the sum of the discarded eigenvalues λ_{k+1} + … + λ_n.
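A sketch of exact and truncated reconstruction for the same example (k = 2 is chosen arbitrarily for illustration):

    import numpy as np

    # Population, mean, covariance, and transform matrix from the earlier example.
    X = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [1., 0., 1.]])
    m_x = X.mean(axis=0)
    C_x = X.T @ X / len(X) - np.outer(m_x, m_x)
    eigvals, eigvecs = np.linalg.eigh(C_x)
    A = eigvecs[:, ::-1].T             # rows = eigenvectors, descending eigenvalues
    Y = (X - m_x) @ A.T                # Hotelling transform

    # Exact reconstruction: x = A^T y + m_x (A has orthonormal rows).
    X_exact = Y @ A + m_x
    print(np.allclose(X_exact, X))     # True

    # Approximation using only the first k principal components.
    k = 2
    A_k = A[:k, :]                     # first k eigenvectors of C_x
    X_hat = Y[:, :k] @ A_k + m_x       # reconstruction from k components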

Application: Boundary Description e_1 and e_2 are the eigenvectors of the covariance matrix of the object's boundary points.

(cont'd) The points on the boundary are treated as 2D vectors x = (a, b)^T, where a and b are the coordinate values w.r.t. the x_1- and x_2-axes. The result of the Hotelling (principal-component) transformation is a new coordinate system: origin at the centroid of the boundary points, with axes along the directions of the eigenvectors of C_x.

(cont'd) Aligning the data with the eigenvectors, i.e. applying y = A(x − m_x) to every boundary point, we get the boundary expressed in the new (principal-axis) coordinate system. Aside: λ_i is the variance of component y_i along eigenvector e_i.
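A minimal sketch of this alignment for a 2D boundary, assuming the boundary points are available as an array of (a, b) coordinates (the points below are placeholders, not data from the lecture):

    import numpy as np

    def align_boundary(points):
        """Express 2D boundary points in their principal-axis coordinate system."""
        m_x = points.mean(axis=0)                       # centroid of the boundary
        centered = points - m_x
        C_x = centered.T @ centered / len(points)       # 2x2 covariance matrix
        eigvals, eigvecs = np.linalg.eigh(C_x)
        A = eigvecs[:, ::-1].T                          # rows = e_1, e_2 (largest variance first)
        return centered @ A.T                           # y = A (x - m_x) for every point

    # Placeholder boundary: a rotated, translated ellipse outline.
    t = np.linspace(0.0, 2.0 * np.pi, 100)
    ellipse = np.stack([3.0 * np.cos(t), 1.0 * np.sin(t)], axis=1)
    theta = np.deg2rad(30.0)
    R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    boundary = ellipse @ R.T + np.array([5.0, 2.0])

    aligned = align_boundary(boundary)                  # axes now follow e_1 and e_2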

Description Invariant to … Rotation: aligning with the principal axes removes rotation. Scaling: normalizing by the eigenvalues (variances) removes scale. Translation: accounted for by centering the object about its mean.