International Conference on Pattern Recognition, Hong Kong, August 2006

Multilinear Principal Component Analysis of Tensor Objects for Recognition
Haiping Lu, K.N. Plataniotis and A.N. Venetsanopoulos
The Edward S. Rogers Sr. Department of Electrical and Computer Engineering, University of Toronto

Motivation
- Real data in pattern recognition are high-dimensional, calling for dimensionality reduction, and often multidimensional, i.e., naturally represented as tensors
- PCA must reshape tensors into vectors, discarding their multidimensional structure
- Multilinear-algebra alternatives: 2DPCA, 3DPCA, multifactor analysis
- Objective: a multilinear PCA that operates directly on tensors

Overview
- MPCA: a natural extension of PCA, based on multilinear singular values and eigentensors
- Input: higher-order tensors
- Application: gait recognition
- Sample data set: a 4th-order tensor
- Gait sample: a (normalized) half gait cycle
- Recognition: outperforms the baseline algorithm

Notations
- Vector: lowercase boldface; matrix: uppercase boldface; tensor: calligraphic letter
- n-mode product B = A ×_n U: every n-mode vector of A is multiplied by the matrix U, i.e., B_(n) = U A_(n)
- Scalar product ⟨A, B⟩: the sum of the entry-wise products of A and B
- Frobenius norm: ||A||_F = sqrt(⟨A, A⟩)
- n-rank R_n = rank_n(A): the dimension of the space spanned by the n-mode vectors of A
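The notation above can be made concrete with a short NumPy sketch (not from the paper; the function names are illustrative). The n-mode product is computed exactly as the definition states: unfold the tensor along mode n, left-multiply by U, and fold back.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding A_(n): the n-mode vectors of A become columns."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(matrix, mode, shape):
    """Inverse of unfold for a tensor of the given target shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(matrix.reshape([shape[mode]] + rest), 0, mode)

def mode_n_product(tensor, matrix, mode):
    """n-mode product A x_n U, i.e. (A x_n U)_(n) = U @ A_(n)."""
    shape = list(tensor.shape)
    shape[mode] = matrix.shape[0]
    return fold(matrix @ unfold(tensor, mode), mode, shape)
```

For a tensor of shape (2, 3, 4), multiplying in mode 1 by a (5, 3) matrix yields a tensor of shape (2, 5, 4); multiplying by the identity leaves the tensor unchanged.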

Higher-order SVD
- Decomposition: A = S ×_1 U^(1) ×_2 U^(2) ... ×_N U^(N), with each U^(n) unitary
- Subtensors of the core tensor S are all-orthogonal
- Subtensors are ordered based on their Frobenius norms: ||S_{i_n = 1}|| ≥ ||S_{i_n = 2}|| ≥ ... for every mode n
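The HOSVD can be sketched in a few lines of NumPy (an illustrative sketch, not the authors' code): each U^(n) holds the left singular vectors of the mode-n unfolding, and the core follows by multiplying A with the transposes in every mode.

```python
import numpy as np

def unfold(t, mode):
    # mode-n unfolding: the n-mode vectors become columns
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_mult(t, m, mode):
    # n-mode product t x_mode m
    return np.moveaxis(np.tensordot(m, t, axes=(1, mode)), 0, mode)

def hosvd(A):
    """HOSVD: U^(n) = left singular vectors of the mode-n unfolding;
    core S = A x_1 U1^T x_2 U2^T ... x_N UN^T."""
    Us = [np.linalg.svd(unfold(A, n))[0] for n in range(A.ndim)]
    S = A
    for n, U in enumerate(Us):
        S = mode_mult(S, U.T, n)
    return S, Us
```

Multiplying the core back by the U^(n) recovers A exactly, and since every U^(n) is unitary the Frobenius norm of S equals that of A.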

PCA with tensor notation
- Basis vectors (PCs): columns of the eigenvector matrix U
- PCA subspace: truncate U to its leading columns
- Projection to feature space: y = U^T (x − mean)
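For reference, classical PCA on vector samples can be sketched as follows (an illustrative sketch via the SVD of the centered data matrix; the function name is not from the paper):

```python
import numpy as np

def pca(X, k):
    """Classical PCA: the principal components span the PCA subspace;
    truncating to k directions gives the reduced feature space."""
    Xc = X - X.mean(axis=0)                      # center the samples (rows)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    U = Vt[:k].T                                 # top-k principal directions
    return Xc @ U, U                             # features y = U^T (x - mean)
```

MPCA generalizes this: instead of one projection matrix applied to vectorized samples, one truncated matrix U^(n) is applied per tensor mode.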

Multilinear PCA
- Center the input tensor samples by subtracting the sample mean
- Apply HOSVD and keep the leading P_n columns of each U^(n)
- n-mode singular value: the Frobenius norm of the corresponding core subtensor
- Basis tensor (eigentensor): formed from the kept columns of the U^(n)
- Projection: multiply each centered sample by the truncated U^(n)T in every mode
- MPCA features: the entries of the projected tensor
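The MPCA steps above can be sketched in NumPy. This is a single, non-iterative pass under my reading of the slide (the paper's full algorithm may refine the projections iteratively); for each mode n, U^(n) holds the leading eigenvectors of the mode-n total scatter of the centered samples.

```python
import numpy as np

def unfold(t, mode):
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mpca_fit(samples, ranks):
    """One pass of MPCA: per mode n, keep the top-R_n eigenvectors of
    the mode-n scatter sum_m A_(n) A_(n)^T of the centered samples."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    Us = []
    for n, r in enumerate(ranks):
        scatter = sum(unfold(x, n) @ unfold(x, n).T for x in centered)
        _, V = np.linalg.eigh(scatter)       # eigenvalues in ascending order
        Us.append(V[:, ::-1][:, :r])         # top-r eigenvectors
    return mean, Us

def mpca_project(x, mean, Us):
    """MPCA features: B = (X - mean) x_1 U1^T x_2 U2^T ... x_N UN^T."""
    B = x - mean
    for n, U in enumerate(Us):
        B = np.moveaxis(np.tensordot(U.T, B, axes=(1, n)), 0, n)
    return B
```

A batch of 10 samples of shape (6, 5, 4) projected with ranks (3, 3, 2) yields feature tensors of shape (3, 3, 2).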

EigenTensorGait for recognition
- Gait sample: a half gait cycle (a 3rd-order tensor)
- To obtain samples: partition the sequence based on the number of foreground pixels in the silhouettes
- Noise removal: best rank approximation
- Temporal normalization: interpolation
- Feature distance: sum of the absolute differences (equivalent to the L1 norm)
- Sequence matching: sum of the minimum sample-to-sample distances
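The two matching rules on this slide are simple enough to state directly in code (an illustrative sketch; the function names are not from the paper):

```python
import numpy as np

def feature_distance(a, b):
    """Distance between two feature tensors: sum of absolute
    differences over all entries (the L1 norm of the difference)."""
    return np.abs(a - b).sum()

def sequence_distance(probe, gallery):
    """Sequence matching: score each probe sample by its closest
    gallery sample, then sum the per-sample minima."""
    return sum(min(feature_distance(p, g) for g in gallery) for p in probe)
```

A probe sample identical to some gallery sample contributes zero to the sequence distance, so matching sequences score low.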

Best rank approximation
[Figures: the original silhouettes vs. their best rank-(10,10,3) approximation]
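A rank-(R1, R2, R3) approximation of the kind shown above can be sketched via truncated HOSVD: project each mode onto its leading R_n mode-n singular vectors and map back, which suppresses the small-singular-value components treated as noise. Note the truncated HOSVD is a good but not in general optimal rank approximation; the paper's "best rank" approximation may use iterative refinement.

```python
import numpy as np

def unfold(t, mode):
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def rank_approx(A, ranks):
    """Truncated-HOSVD rank-(R1,...,RN) approximation: in each mode,
    project onto the leading R_n mode-n singular vectors and map back."""
    B = A
    for n, r in enumerate(ranks):
        U = np.linalg.svd(unfold(A, n))[0][:, :r]
        P = U @ U.T                       # orthogonal projector for mode n
        B = np.moveaxis(np.tensordot(P, B, axes=(1, n)), 0, n)
    return B
```

With full ranks the approximation is exact; truncation only removes energy, so the norm never increases.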

Experiments
- Data: USF gait challenge data sets V.1.7, covering different conditions (surface, shoe, view)
- Sample size: 64×44×20
- Best results:
- Performance measure: cumulative match characteristic (CMC) curves
- Results: better overall recognition rate than the baseline algorithm

Identification performance
[Table: identification rate P_I (%) at rank 1 and rank 5, Baseline vs. MPCA, for probes A (GAL), B (GBR), C (GBL), D (CAR), E (CBR), F (CAL), G (CBL), and their average]

MPCA CMC curves
[Figure: cumulative match characteristic curves for MPCA]

Conclusions
- MPCA: a multilinear extension of PCA
- Application of MPCA: EigenTensorGait
- Half gait cycles serve as gait samples
- Best rank approximation reduces noise
- Temporal normalization by interpolation
- Future work: applying MPCA to other problems; other multilinear extensions

Related work
Haiping Lu, K.N. Plataniotis and A.N. Venetsanopoulos, "Gait Recognition through MPCA plus LDA," in Proc. Biometrics Symposium 2006 (BSYM 2006), Baltimore, US, September 2006.

Contact information
Haiping Lu
Academic website: