EE 290A: Generalized Principal Component Analysis
Lecture 4: Generalized Principal Component Analysis
Sastry & Yang © Spring 2011, University of California, Berkeley

This lecture: GPCA problem definition; segmentation of multiple hyperplanes.
Reminder: HW 1 due on Feb. 8th.

Problem Definition
Define a mixture subspace model.
The subspace segmentation problem:
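The model and problem statement on this slide were images and did not survive extraction. A sketch in standard GPCA notation (the symbols n, d_i, U_i are assumed here, not taken from the slide):

    x_j \in \bigcup_{i=1}^{n} S_i, \qquad
    S_i = \{\, x \in \mathbb{R}^D : x = U_i\, y,\ y \in \mathbb{R}^{d_i} \,\}, \quad i = 1, \dots, n,

where U_i \in \mathbb{R}^{D \times d_i} is a basis for the i-th subspace. Given the unlabeled samples \{x_j\}_{j=1}^{N}, the subspace segmentation problem asks for the number of subspaces n, their dimensions d_i and bases U_i, and the assignment of every sample to its subspace.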

Projectivization of Affine Subspaces
Every affine subspace can be "lifted" to a linear subspace by appending a homogeneous coordinate (the homogeneous representation).
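Concretely (notation assumed), an affine subspace A = \{\, x = x_0 + U y,\ y \in \mathbb{R}^d \,\} \subset \mathbb{R}^D is lifted by mapping each sample to its homogeneous coordinates,

    x \;\mapsto\; \begin{bmatrix} x \\ 1 \end{bmatrix} \in \mathbb{R}^{D+1},

so that A corresponds to the (d+1)-dimensional linear subspace of \mathbb{R}^{D+1} spanned by [x_0; 1] and [u_k; 0], k = 1, \dots, d.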

Conclusion: projectivization loses no information about the data model or the membership of the samples.

Subspace Projection
High-dimensional data may lie in low-dimensional subspaces; when d << D, estimating the model in the full ambient space is not efficient.
Images of a subject under illumination lie on a 20-dimensional subspace.

Subspace-Preserving Projections
Subspaces in a high-dimensional space can be projected onto a lower-dimensional space while the membership of the samples is preserved.
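A minimal numerical sketch of this claim (not from the slides; the dimensions and the choice of a random projection are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)
D, d_max = 100, 2                 # ambient dimension, largest subspace dimension

# Two random subspaces of R^D (dimensions 2 and 1) and 50 samples from each
bases = [np.linalg.qr(rng.standard_normal((D, d)))[0] for d in (2, 1)]
X = np.hstack([B @ rng.standard_normal((B.shape[1], 50)) for B in bases])   # D x 100

# Generic linear projection onto R^(d_max + 1)
P = rng.standard_normal((d_max + 1, D))
Xp = P @ X                        # projected samples, (d_max + 1) x 100

# Membership is preserved: the projected samples from subspace i still span a
# d_i-dimensional subspace of R^(d_max + 1), so their labels are unchanged.
for i, B in enumerate(bases):
    print(f"subspace {i}: original dim {B.shape[1]}, projected dim {np.linalg.matrix_rank(P @ B)}")

With probability one, a random P of this size keeps distinct subspaces distinct after projection, which is what makes the projection subspace-preserving.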

If the span of all subspaces is still a proper subspace of the ambient space, use PCA to project onto that span.
If the span is the whole space but the largest subspace dimension d_max is less than D-1, a generic (e.g., random) linear projection onto a (d_max+1)-dimensional subspace still preserves the number, dimensions, and membership of the subspaces.
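In symbols (a paraphrase of the standard subspace-preserving projection result; the slide's own formula is missing): with d_max = \max_i d_i < D - 1, a generic

    \pi_P : \mathbb{R}^D \to \mathbb{R}^{d_{\max}+1}, \qquad \pi_P(x) = P x, \qquad P \in \mathbb{R}^{(d_{\max}+1) \times D},

preserves the number and dimensions of the subspaces as well as the membership of the samples.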

The approach for mixture-subspace segmentation

Choosing a Subspace-Preserving (SP) Projection

3.2 Introductory Cases
Segmenting points on a line
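The formulas on this slide are missing; the standard construction, with notation assumed from the GPCA literature, is as follows. Samples x_j \in \mathbb{R} belong to one of K groups with centers \mu_1, \dots, \mu_K, so every sample, regardless of its group, satisfies

    p_K(x) \;=\; \prod_{i=1}^{K} (x - \mu_i) \;=\; x^K + c_{K-1} x^{K-1} + \cdots + c_1 x + c_0 \;=\; 0.

Stacking the monomials of all samples gives the linear system P c = 0 for the coefficient vector c, where the j-th row of P is [\, x_j^K,\ x_j^{K-1},\ \dots,\ x_j,\ 1 \,]; the group centers \mu_i are recovered as the roots of p_K.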

Determine the number of groups.
Question: when j = K, is the null space of P always one-dimensional in this case?
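A hedged note on the likely intended criterion (assuming the construction above): if the samples take exactly K distinct values, the degree-j embedded data matrix P_j, whose rows are [\, x_n^j, \dots, x_n, 1 \,], has rank \min(j+1, K), so

    K \;=\; \min\{\, j : \operatorname{rank}(P_j) = j \,\},

i.e., K is the smallest degree at which P_j drops rank, and at j = K the null space is one-dimensional, spanned by the coefficients of p_K.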

Segmenting lines on a plane
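The formulas for this case are missing from the slide; the standard setup (notation assumed): each line through the origin is L_i = \{\, x \in \mathbb{R}^2 : b_i^{\top} x = 0 \,\}, so every sample satisfies the homogeneous degree-K polynomial

    p_K(x) \;=\; \prod_{i=1}^{K} (b_i^{\top} x) \;=\; c^{\top} \nu_K(x) \;=\; 0, \qquad \nu_K(x) = [\, x_1^K,\ x_1^{K-1} x_2,\ \dots,\ x_2^K \,]^{\top},

and the coefficient vector c is found in the null space of the matrix V whose rows are \nu_K(x_j)^{\top}.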

Question 1: how do we determine the number of lines?
Question 2: when k = K, is the null space of V always one-dimensional?

Segmenting multiple hyperplanes
Segmenting point clusters on a line and segmenting lines on a plane are both special cases of segmenting a mixture of hyperplanes.
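In the general case (standard GPCA notation, assumed here), each hyperplane is S_i = \{\, x \in \mathbb{R}^D : b_i^{\top} x = 0 \,\} with normal b_i \in \mathbb{R}^D, and every sample from the union satisfies the vanishing polynomial

    p_K(x) \;=\; \prod_{i=1}^{K} (b_i^{\top} x) \;=\; c^{\top} \nu_K(x) \;=\; 0,

where \nu_K : \mathbb{R}^D \to \mathbb{R}^{M_K}, with M_K = \binom{K + D - 1}{K}, is the degree-K Veronese map collecting all monomials of degree K.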

Find the vanishing polynomial from the embedded data.
Determine the number of hyperplanes from the rank of the embedded data matrix V.
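A minimal computational sketch of both steps (the helper names veronese and fit_vanishing_polynomial are illustrative, not from the course code):

import numpy as np
from itertools import combinations_with_replacement

def veronese(X, K):
    # Degree-K Veronese embedding of the columns of X (D x N): each column x
    # is mapped to the vector of all degree-K monomials of its entries.
    D, N = X.shape
    monomials = list(combinations_with_replacement(range(D), K))
    V = np.empty((N, len(monomials)))
    for m, idx in enumerate(monomials):
        V[:, m] = np.prod(X[list(idx), :], axis=0)
    return V                             # N x M_K, with M_K = C(K + D - 1, K)

def fit_vanishing_polynomial(X, K_max=5, tol=1e-8):
    # Smallest degree K at which the embedded data matrix drops rank, plus the
    # coefficients of the corresponding vanishing polynomial. Assumes enough
    # samples that the null space is not an artifact of N < M_K.
    for K in range(1, K_max + 1):
        V = veronese(X, K)
        _, s, Vt = np.linalg.svd(V)
        if s[-1] < tol * s[0]:           # rank deficiency: K hyperplanes found
            return K, Vt[-1]             # null-space vector = coefficients of p_K
    raise ValueError("no vanishing polynomial of degree <= K_max found")

The accepted degree K is the estimated number of hyperplanes, and the returned coefficient vector c defines p_K(x) = c^T \nu_K(x).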

Recover the subspaces from the vanishing polynomial.
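The recovery formula itself is missing from the slide; in the standard formulation, for a sample y_i lying on S_i and on no other hyperplane,

    \nabla p_K(y_i) \;=\; \sum_{l=1}^{K} b_l \prod_{m \neq l} (b_m^{\top} y_i) \;=\; b_i \prod_{m \neq i} (b_m^{\top} y_i) \;\propto\; b_i,

so each normal b_i is obtained, up to scale, by evaluating the gradient of the vanishing polynomial at one point per hyperplane; the samples are then segmented by assigning x_j to the hyperplane whose normal minimizes |b_i^{\top} x_j|.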
