HUMAN ACTION RECOGNITION IN TEMPORAL-VECTOR TRAJECTORY LEARNING FRAMEWORK
Chin-Hsien Fang (方競賢), Ju-Chin Chen (陳洳瑾), Chien-Chung Tseng (曾建中), and Jenn-Jier James Lien (連震杰)
Department of Computer Science and Information Engineering, National Cheng Kung University

+ Motivation
+ System flowchart
+ Training process
+ Testing process
+ Experimental results
+ Conclusions

+ Traditional manifold classification methods (e.g., LDA, LSDA) use only spatial information
+ The input data, however, are continuous video sequences
+ Temporal information should therefore be considered

[System flowchart: ASM feature extraction maps each h×w input frame to a d-dimensional LPP projection; a temporal window of 2t+1 frames is stacked into a d×(2t+1) temporal vector. The same pipeline is shown for both the training and testing paths.]

[Training process: LPP dimensionality reduction → temporal-vector construction → metric learning.]

+ Why dimensionality reduction?
  – To reduce the computational cost
+ Why LPP (Locality Preserving Projections)?
  – It can handle nonlinear data with a linear transformation matrix
  – The local structure of the data is preserved

LPP tries to preserve the local structure of the data while reducing its dimensionality.

Objective function:

$$\min_{\mathbf{a}} \ \mathbf{a}^{\top} X L X^{\top} \mathbf{a}$$

subject to

$$\mathbf{a}^{\top} X D X^{\top} \mathbf{a} = 1,$$

where $L = D - W$ is the Laplacian matrix, $D$ is the diagonal matrix with $D_{ii} = \sum_j W_{ij}$, and $W$ is the weight matrix.
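A minimal numpy sketch of LPP under the formulation above (the k-nearest-neighbor graph, heat-kernel weights, and the small regularization term are illustrative assumptions, not taken from the slides):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, d, k=5, sigma=1.0):
    """Locality Preserving Projections.
    X: (D, n) data matrix, one sample per column.
    Returns A: (D, d) so that Y = A.T @ X is the d-dimensional embedding."""
    n = X.shape[1]
    dist = cdist(X.T, X.T)                         # pairwise Euclidean distances
    W = np.zeros((n, n))
    for i in range(n):                             # k-nearest-neighbor graph
        for j in np.argsort(dist[i])[1:k + 1]:
            W[i, j] = np.exp(-dist[i, j] ** 2 / sigma)  # heat-kernel weight
    W = np.maximum(W, W.T)                         # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                      # Laplacian matrix L = D - W
    # Generalized eigenproblem  X L X^T a = lambda X D X^T a ;
    # the eigenvectors with the smallest eigenvalues preserve locality.
    reg = 1e-6 * np.eye(X.shape[0])                # keeps X D X^T positive definite
    _, vecs = eigh(X @ L @ X.T, X @ D @ X.T + reg)
    return vecs[:, :d]
```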

+ Three kinds of temporal information:
  1. LTM (Location temporal motion of Mahalanobis distance)
  2. DTM (Difference temporal motion of Mahalanobis distance)
  3. TTM (Trajectory temporal motion of Mahalanobis distance)

LTM. Given an input sequence, each frame $\mathbf{x}_i$ is projected by LPP to $\mathbf{y}_i = A^{\top}\mathbf{x}_i$ (the columns of $A$ are the projection vectors $\mathbf{a}$ above), and the LTM vector stacks the locations within a temporal window of 2t+1 frames:

$$\mathbf{z}_i = [\,\mathbf{y}_{i-t}^{\top},\ \dots,\ \mathbf{y}_{i}^{\top},\ \dots,\ \mathbf{y}_{i+t}^{\top}\,]^{\top}$$

DTM. The DTM vector stacks the differences between consecutive projected locations within the window:

$$\mathbf{z}_i = [\,(\mathbf{y}_{i-t+1}-\mathbf{y}_{i-t})^{\top},\ \dots,\ (\mathbf{y}_{i+t}-\mathbf{y}_{i+t-1})^{\top}\,]^{\top}$$

TTM. The TTM vector stacks the displacement of each location in the window from the window's center frame:

$$\mathbf{z}_i = [\,(\mathbf{y}_{i-t}-\mathbf{y}_{i})^{\top},\ \dots,\ (\mathbf{y}_{i+t}-\mathbf{y}_{i})^{\top}\,]^{\top}$$
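A minimal sketch of the three temporal-vector constructions as reconstructed above (the stacking order and window handling are assumptions; the slides' exact equations did not survive extraction):

```python
import numpy as np

def temporal_vectors(Y, t, kind="LTM"):
    """Build temporal vectors from an LPP-projected sequence.
    Y: (n, d) array, one d-dimensional projected frame per row.
    Returns one stacked vector per valid center frame i (t <= i < n - t)."""
    n, d = Y.shape
    out = []
    for i in range(t, n - t):
        win = Y[i - t:i + t + 1]                 # 2t+1 frames around frame i
        if kind == "LTM":                        # raw locations in the window
            z = win.reshape(-1)
        elif kind == "DTM":                      # consecutive-frame differences
            z = np.diff(win, axis=0).reshape(-1)
        elif kind == "TTM":                      # displacements from the center
            z = (win - Y[i]).reshape(-1)
        out.append(z)
    return np.array(out)
```

For LTM the stacked vector has dimension d(2t+1), matching the d×(2t+1) box in the system flowchart; DTM yields 2t·d dimensions, and TTM contains a zero block for the center frame.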

+ Why the Mahalanobis distance $d_M(\mathbf{z}_i, \mathbf{z}_j) = (\mathbf{z}_i - \mathbf{z}_j)^{\top} M (\mathbf{z}_i - \mathbf{z}_j)$?
  1. It preserves the relations (correlations) among the data
  2. It does not depend on the scale of the data

[Figure: a point with its same-class target neighbors and differently labeled impostors (labeled y_i, y_j, y_l on the slide), shown in the LPP+Temporal space before metric learning and in the LME space after it.]

LMNN formulation. Minimize:

$$\sum_{i,\,j \rightarrow i} d_M(\mathbf{z}_i, \mathbf{z}_j) \;+\; c \sum_{i,\,j \rightarrow i,\,l} \xi_{ijl}$$

Subject to:
(i) $d_M(\mathbf{z}_i, \mathbf{z}_l) - d_M(\mathbf{z}_i, \mathbf{z}_j) \geq 1 - \xi_{ijl}$
(ii) $\xi_{ijl} \geq 0$
(iii) $M \succeq 0$ (M has to be positive semi-definite)

where $j \rightarrow i$ denotes that $\mathbf{z}_j$ is a target neighbor of $\mathbf{z}_i$ (same class) and $\mathbf{z}_l$ ranges over differently labeled points.
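A minimal projected-gradient sketch of the LMNN program above (the step size lr, trade-off c, and neighbor count k are illustrative assumptions; Weinberger et al. use a dedicated semidefinite solver rather than this loop):

```python
import numpy as np

def lmnn_step(Z, labels, M, k=3, c=1.0, lr=1e-3):
    """One projected-gradient step on the LMNN objective.
    Z: (n, D) temporal vectors; labels: (n,) class labels.
    M: (D, D) current Mahalanobis matrix. Returns the updated M."""
    n = len(Z)
    def d2(i, j):                                  # squared Mahalanobis distance
        v = Z[i] - Z[j]
        return v @ M @ v
    grad = np.zeros_like(M)
    for i in range(n):
        same = [j for j in range(n) if j != i and labels[j] == labels[i]]
        # k target neighbors: same-class points closest under the current metric
        targets = sorted(same, key=lambda j: d2(i, j))[:k]
        for j in targets:
            vij = np.outer(Z[i] - Z[j], Z[i] - Z[j])
            grad += vij                            # pull target neighbors closer
            for l in range(n):
                if labels[l] == labels[i]:
                    continue
                if d2(i, l) < d2(i, j) + 1.0:      # impostor inside the margin
                    vil = np.outer(Z[i] - Z[l], Z[i] - Z[l])
                    grad += c * (vij - vil)        # push the impostor away
    M = M - lr * grad
    w, V = np.linalg.eigh(M)                       # project onto the PSD cone
    return (V * np.maximum(w, 0.0)) @ V.T
```

Starting from M = I and iterating lmnn_step until the objective stops decreasing gives the learned metric.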

[Testing process: LPP projection → temporal-vector construction → learned Mahalanobis metric → K-NN classification.]

Each test temporal vector is compared with the training vectors under the learned metric. With K = 5 (the number of nearest neighbors), the winner takes all: the test sample is labeled with the majority class among its five nearest training samples.
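A sketch of the winner-takes-all vote under the learned metric (array shapes are assumptions):

```python
import numpy as np
from collections import Counter

def knn_classify(z, Z_train, labels, M, k=5):
    """Label test vector z by majority vote among its k nearest
    training temporal vectors under the learned Mahalanobis metric M."""
    diffs = Z_train - z                               # (n, D) differences
    d2 = np.einsum('nd,de,ne->n', diffs, M, diffs)    # squared Mahalanobis distances
    nearest = np.argsort(d2)[:k]                      # indices of the k nearest
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```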


+ Our TVTL framework makes clear progress over traditional methods such as LSDA
+ Temporal information does have a positive influence on recognition
+ DTM and TTM are better than LTM because they consider the correlation of the data across frames