
Structured Sparse Principal Component Analysis
Reading Group Presenter: Peng Zhang, Cognitive Radio Institute
Friday, October 01, 2010
Authors: Rodolphe Jenatton, Guillaume Obozinski, Francis Bach

Outline
■ Introduction (from an imaging perspective)
  □ Principal Component Analysis (PCA)
  □ Sparse PCA (SPCA)
  □ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

Introduction (from an imaging perspective)
■ The face recognition problem
  □ A database contains a huge number of face images
  □ How can a computer recognize different faces using this database?
■ The challenge
  □ Huge amount of data
  □ Computational complexity
■ The trick
  □ Represent each face as a weighted combination of elements from a "face dictionary"
    ▪ Similar to a code book in data compression
    ▪ Example: a 200 × 200 pixel face can be represented by 100 coefficients using the "face dictionary"
■ The solution
  □ Principal component analysis (PCA)

PCA
■ PCA
  □ A compression method
  □ Given a large set of sample vectors {x}
  □ Compute the second-moment statistics (the covariance matrix) of the samples
  □ Eigen-decomposition of the covariance yields the "dictionary" and the "energy" of the dictionary codes
    ▪ Eigenvectors {v} form the "dictionary"
    ▪ Eigenvalues {d} give the "energy" of the dictionary elements
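A minimal NumPy sketch of this eigen-dictionary construction (the array names and sizes are illustrative, not from the slides):

    import numpy as np

    # X: n samples, each a flattened image with p pixels (illustrative sizes)
    n, p = 400, 32 * 32
    X = np.random.rand(n, p)        # stand-in for a face database

    # Center the data and form the covariance (second-moment statistics)
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / n               # p x p covariance matrix

    # Eigen-decomposition: eigenvectors = "dictionary", eigenvalues = "energy"
    d, V = np.linalg.eigh(C)        # eigh, since C is symmetric
    order = np.argsort(d)[::-1]     # sort by decreasing energy
    d, V = d[order], V[:, order]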

PCA
■ The original signal can be represented using only part of the dictionary
  ▪ Data is compressed: fewer elements are stored
■ Meaning of a "dictionary" element v:
  □ Its entries are the weights of the corresponding elements of x
■ The problem with PCA for face recognition: the "dictionary" has no physical meaning
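Continuing the sketch above, compression keeps only the top-k dictionary elements and reconstructs the signal from those k coefficients (k is an illustrative choice):

    k = 100                          # keep only the top-k dictionary elements
    Vk = V[:, :k]                    # p x k truncated dictionary

    x = Xc[0]                        # one (centered) face
    coeffs = Vk.T @ x                # k coefficients instead of p pixels
    x_hat = Vk @ coeffs              # reconstruction from the compressed code

    err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
    print(f"relative reconstruction error with {k} coefficients: {err:.3f}")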

PCA
■ Face recognition
[Figure: face samples → PCA → the "dictionary" of eigen-faces]
These eigen-faces can reconstruct the original faces perfectly, but they are not meaningful images in real life.

Structured SPCA
■ The SPCA goal:
  □ Make the dictionary more interpretable
  □ The "sparse" solution: limit the number of nonzero entries
[Figure: non-sparse eigen-faces from PCA vs. sparse eigen-faces from SPCA]
But the eigen-faces are still meaningless most of the time.
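For a concrete sense of the sparse variant, here is a hedged illustration using scikit-learn's SparsePCA (the component count and penalty weight alpha are illustrative; the slides do not name an implementation):

    import numpy as np
    from sklearn.decomposition import SparsePCA

    X = np.random.rand(400, 32 * 32)   # stand-in for face data

    # alpha controls the l1 penalty: larger alpha -> sparser "eigen-faces"
    spca = SparsePCA(n_components=10, alpha=1.0, random_state=0)
    U = spca.fit_transform(X)          # coefficients per sample
    V = spca.components_               # sparse dictionary (10 x 1024)

    print("fraction of nonzeros per component:",
          (V != 0).mean(axis=1).round(3))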

Structured SPCA
■ The new idea, SSPCA:
  □ Eigen-faces become meaningful when structured constraints are imposed
  □ Meaningful areas of the face are constrained to lie within "grids"
[Figure: eigen-faces from SSPCA]

Structured SPCA
■ This paper's contributions:
  □ Adding a "structure" constraint makes the dictionary more meaningful
  □ An explanation of how the constraint works
  □ A meaningful dictionary is closer to the "true" dictionary
  □ A meaningful dictionary is more robust against noise
  □ A meaningful dictionary is more accurate in face recognition

Outline
■ Introduction
  □ Principal Component Analysis (PCA)
  □ Sparse PCA (SPCA)
  □ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

Problem Statement
■ From SPCA to SSPCA
  □ The optimization problem (a hedged reconstruction follows below)
  □ X is the sample matrix, U is the coefficient matrix, V is the dictionary
  □ ||·|| and Ω(·) are different types of norms
  □ The trick in SPCA
    ▪ The l1 norm forces the dictionary to be sparse
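The slide's formula was an image; a standard form of the SPCA/SSPCA objective, consistent with the paper by Jenatton, Obozinski, and Bach (a hedged reconstruction, not a verbatim copy of the slide), is

\[
\min_{U,\,V}\ \frac{1}{2np}\,\bigl\| X - U V^{\top} \bigr\|_F^2
\;+\; \lambda \sum_{k=1}^{r} \Omega\bigl(v^{k}\bigr)
\quad \text{s.t.}\quad \bigl\| u^{k} \bigr\|_2 \le 1,\ k = 1,\dots,r,
\]

where taking Ω to be the l1 norm gives SPCA, and replacing it with a structured norm gives SSPCA.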

Problem Statement
■ Structured SPCA, however, replaces the l1 penalty with a mixed l1/l2 norm (see the formula below).
■ Right now it's hard for me to understand the G and d.
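For reference, the structured norm in this line of work takes the form (again a hedged reconstruction; the slide's formula was an image)

\[
\Omega(v) \;=\; \sum_{G \in \mathcal{G}} \bigl\| d^{G} \circ v \bigr\|_2 ,
\]

where G ranges over a collection 𝒢 of possibly overlapping groups of variables, d^G is a weight vector supported on group G, and ∘ denotes the elementwise product. The groups 𝒢 encode which nonzero patterns are allowed, and the weights d set how strongly each group is penalized.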

Problem Statement
■ In short, the norm constraints have the following effects:
  □ The dictionary acquires structure
  □ All nonzeros in each dictionary element are confined to a grid-shaped region (a sketch follows)
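A small sketch of grid-structured groups on an image and of evaluating the group norm above (the half-plane groups and uniform weights are illustrative assumptions, not the paper's exact construction):

    import numpy as np

    h, w = 16, 16                      # illustrative image grid
    V2 = np.zeros((h, w))
    V2[4:9, 4:9] = 1.0                 # nonzeros confined to a rectangular region
    v = V2.ravel()

    # Groups: horizontal and vertical half-planes; combining such groups
    # lets the penalty carve out rectangular supports
    idx = np.arange(h * w).reshape(h, w)
    groups = []
    for i in range(1, h):
        groups += [idx[:i, :].ravel(), idx[i:, :].ravel()]
    for j in range(1, w):
        groups += [idx[:, :j].ravel(), idx[:, j:].ravel()]

    # Omega(v): sum over groups of the (uniformly weighted) l2 norm on the group
    omega = sum(np.linalg.norm(v[g]) for g in groups)
    print(f"Omega(v) = {omega:.2f} over {len(groups)} groups")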

Outline
■ Introduction
  □ Principal Component Analysis (PCA)
  □ Sparse PCA (SPCA)
  □ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

The SSPCA Algorithm
■ Making the dictionary sparser
  □ The norm and the new SSPCA problem (the formulas on this slide were images and are not reproduced in the transcript)
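One standard way in this line of work to make each dictionary element sparser (a hedged note, not the slide's exact formula): add the singleton groups to 𝒢, which is equivalent to mixing the structured penalty with a plain l1 term,

\[
\Omega'(v) \;=\; \sum_{G \in \mathcal{G}} \bigl\| d^{G} \circ v \bigr\|_2 \;+\; \mu\,\| v \|_1 .
\]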

The SSPCA Algorithm
■ Method: solve a sequence of convex problems
  □ The objective is convex in U with V fixed, and in V with U fixed
  □ Alternating between the two factors therefore yields a sequence of convex subproblems (a skeleton follows)
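A minimal alternating-minimization skeleton. For brevity it uses a plain l1 penalty on V with proximal-gradient (ISTA) steps instead of the structured norm, so it is an illustrative stand-in rather than the paper's exact algorithm:

    import numpy as np

    def soft_threshold(Z, t):
        """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
        return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

    def sparse_pca_als(X, r=10, lam=0.1, n_iter=50, inner=20):
        n, p = X.shape
        rng = np.random.default_rng(0)
        U = rng.standard_normal((n, r))
        V = rng.standard_normal((p, r))
        for _ in range(n_iter):
            # U-step: least squares with V fixed (convex); column normalization
            # loosely mirrors the ||u^k||_2 <= 1 constraint
            U = X @ V @ np.linalg.pinv(V.T @ V)
            U /= np.maximum(np.linalg.norm(U, axis=0), 1e-12)
            # V-step: l1-penalized least squares with U fixed (convex),
            # solved by a few ISTA iterations
            L = np.linalg.norm(U.T @ U, 2)       # Lipschitz constant of the gradient
            for _ in range(inner):
                grad = V @ (U.T @ U) - X.T @ U   # gradient of 0.5 ||X - U V^T||_F^2
                V = soft_threshold(V - grad / L, lam / L)
        return U, V

    # Illustrative run on random data
    X = np.random.rand(200, 256)
    U, V = sparse_pca_als(X - X.mean(axis=0))
    print("dictionary sparsity:", (V == 0).mean().round(3))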

Excerpt from the Authors' Slides
■ Excerpt from the authors' slides (slide image not reproduced in the transcript)

Excerpt from the Authors' Slides (continued)

Outline
■ Introduction
  □ Principal Component Analysis (PCA)
  □ Sparse PCA (SPCA)
  □ Structured Sparse PCA (SSPCA)
■ Problem Statement
■ The SSPCA Algorithm
■ Experiments
■ Conclusion and Other Thoughts

Conclusion and Other Thoughts
■ Conclusion
  □ This paper shows how to use SSPCA
  □ SSPCA achieves better performance in denoising, face recognition, and classification
■ Other thoughts
  □ For communication signals, the meaningful dictionary is usually the Fourier dictionary
  □ But the Fourier dictionary may not fit some transient or time-variant signals
  □ How can we manipulate G, d, and the norms to impose constraints suited to our needs?

THANK YOU!