Zhu Han, University of Houston. Thanks to Dr. Hung Nguyen for his slides.


Signal Processing and Networking for Big Data Applications
Lecture 19: Tensor Basics
Zhu Han, University of Houston

Outline
1. Basic concepts
2. Tensor operations
3. Tensor analysis
4. Applications
5. Tensor Voting
6. Conclusions

1. What is a tensor? A tensor is a generalization of an n-dimensional array. A vector is a special case of a tensor (a 1st-order tensor).

A matrix is a special case of a tensor (a 2nd-order tensor).

A 3rd-order tensor corresponds to a multi-dimensional array in most programming languages.
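As a minimal illustration (using NumPy, not from the original slides), tensors of increasing order are simply arrays with more indices:

```python
import numpy as np

# Tensors of increasing order, represented as NumPy arrays.
v = np.array([1.0, 2.0, 3.0])         # vector: 1st-order tensor
M = np.arange(6.0).reshape(2, 3)      # matrix: 2nd-order tensor
X = np.arange(24.0).reshape(2, 3, 4)  # 3rd-order tensor, shape 2 x 3 x 4

print(v.ndim, M.ndim, X.ndim)  # order of each tensor: 1 2 3
```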

Tensors of arbitrary order

Dynamic Data Model: for example, network traffic as a (time, source, destination, port) tensor, or publication data as a (time, author, keyword) tensor.

Two important points: traditional matrix-based data analysis is inherently two-dimensional, which limits its applicability to multi-dimensional data.

2. Tensor operations: basic calculus

Vectorization
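A small sketch of vectorization (the example matrix is illustrative): vec(X) stacks the columns of X into one long vector, which NumPy expresses with column-major (Fortran) ordering:

```python
import numpy as np

X = np.array([[1, 3],
              [2, 4]])

# vec(X): stack the columns of X on top of each other (column-major order).
vec_x = X.flatten(order="F")
print(vec_x)  # [1 2 3 4]
```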

Matricization X(d): unfold a tensor into a matrix along mode d. (Figure example: a 2 x 2 x 2 tensor with frontal slices [[1, 3], [2, 4]] and [[5, 7], [6, 8]].)
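A sketch of mode-d matricization in NumPy, following the common convention that the mode-d fibers become the columns of X(d); the 2 x 2 x 2 example tensor is the one suggested by the slide's figure:

```python
import numpy as np

# 2 x 2 x 2 tensor with frontal slices [[1, 3], [2, 4]] and [[5, 7], [6, 8]]
# (indexing X[i, j, k]: i = row, j = column, k = frontal slice).
X = np.zeros((2, 2, 2))
X[:, :, 0] = [[1, 3], [2, 4]]
X[:, :, 1] = [[5, 7], [6, 8]]

def unfold(T, mode):
    """Mode-d matricization: the mode-d fibers become the columns of T_(d)."""
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order="F")

print(unfold(X, 0))
# [[1. 3. 5. 7.]
#  [2. 4. 6. 8.]]
```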

Multiply a tensor with a matrix (the mode-n product): for example, aggregating the source and destination modes of a (source, destination, port) traffic tensor into source groups and destination groups.
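A sketch of the mode-n product in NumPy; the traffic-tensor sizes and the grouping matrix G are hypothetical, chosen only to illustrate mapping sources into source groups:

```python
import numpy as np

def mode_n_product(T, U, mode):
    """Multiply tensor T by matrix U along the given mode (T x_mode U)."""
    return np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)

# Hypothetical example: a 4 x 3 x 2 (source, destination, port) traffic
# tensor; G maps the 4 sources into 2 source groups by summing members.
T = np.ones((4, 3, 2))
G = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1]], dtype=float)

Y = mode_n_product(T, G, mode=0)
print(Y.shape)  # (2, 3, 2); each entry sums two sources, giving 2.0
```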

3. Tensor analysis. Tensor decomposition generalizes the concept of low rank from matrices to tensors. Result: the resulting tensor has just a few nonzero columns in each lateral slice (illustrated on a term x doc x author tensor).

Reminder: SVD. A (m x n) = U Σ Vᵀ, where U is m x r, Σ is an r x r diagonal matrix of singular values, and V is n x r.

Reminder: SVD as a sum of rank-one terms: A ≈ σ1 u1 v1ᵀ + σ2 u2 v2ᵀ + …
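The rank-one expansion can be checked numerically; this sketch builds an exactly rank-2 matrix and recovers it from its two leading singular triplets:

```python
import numpy as np

# Rank-k approximation from the SVD: A ~ sum_i sigma_i * u_i * v_i^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # rank 2 exactly

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # sum of the top-k rank-one terms

print(np.allclose(A, A_k))  # True: two rank-one terms recover a rank-2 matrix
```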

Rank of a tensor

Goal: extension to >= 3 modes. An I x J x K tensor is written as a sum of R rank-one terms, with factor matrices A (I x R), B (J x R), C (K x R); equivalently, the factor matrices multiply an R x R x R superdiagonal core.

Tucker Decomposition, intuition: an I x J x K tensor is approximated by a core G (R x S x T) multiplied by factor matrices A (I x R), B (J x S), C (K x T). Example, author x keyword x conference: A maps author to author-group, B keyword to keyword-group, C conference to conference-group, and G describes how the groups relate to each other.
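A numerical sketch of the Tucker model (all sizes are illustrative): the full tensor is the core G multiplied by a factor matrix along each mode, which one line of einsum expresses directly:

```python
import numpy as np

# Tucker reconstruction sketch: core G (R x S x T) with factor matrices
# A (I x R), B (J x S), C (K x T). Sizes below are illustrative only.
I, J, K = 4, 5, 6
R, S, T = 2, 3, 2
rng = np.random.default_rng(1)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, S))
C = rng.standard_normal((K, T))
G = rng.standard_normal((R, S, T))

# X[i, j, k] = sum_{r,s,t} G[r, s, t] * A[i, r] * B[j, s] * C[k, t]
X = np.einsum("rst,ir,js,kt->ijk", G, A, B, C)
print(X.shape)  # (4, 5, 6)
```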

Tucker 3 Decomposition

PARAFAC Decomposition
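A sketch of the PARAFAC/CP model with hypothetical factor matrices: the tensor is a sum of R rank-one outer products, computed here both with einsum and term by term:

```python
import numpy as np

# CP / PARAFAC model sketch: X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r],
# i.e. a sum of R rank-one terms a_r (outer) b_r (outer) c_r.
I, J, K, R = 3, 4, 5, 2
rng = np.random.default_rng(2)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

X = np.einsum("ir,jr,kr->ijk", A, B, C)

# The same tensor, built term by term from R rank-one outer products:
X_sum = sum(np.multiply.outer(np.outer(A[:, r], B[:, r]), C[:, r])
            for r in range(R))
print(np.allclose(X, X_sum))  # True
```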

In the presence of missing data: tensor completion.
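As a minimal completion sketch (a simple matrix surrogate, not one of the tensor-completion algorithms from the literature): impute the missing entries of an exactly rank-1 matrix by alternating a truncated SVD with re-filling the observed entries:

```python
import numpy as np

# Minimal completion sketch (a matrix surrogate, not a full tensor method):
# alternate a rank-1 truncated SVD with re-filling the observed entries.
rng = np.random.default_rng(3)
M = np.outer(rng.standard_normal(6), rng.standard_normal(8))  # rank-1 truth
mask = rng.random(M.shape) < 0.7   # ~70% of entries are observed

X = np.where(mask, M, 0.0)         # initialize missing entries with 0
for _ in range(200):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_low = s[0] * np.outer(U[:, 0], Vt[0])  # best rank-1 approximation
    X = np.where(mask, M, X_low)             # keep observed entries fixed

print(float(np.linalg.norm(X - M)))  # residual lives on the missing entries
```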

4. Applications. Goal: to differentiate between left- and right-hand stimulation.

In the presence of missing data: tensor completion.

Surveillance: the original video frames are decomposed into a low-rank component plus a sparse component.

Analyzing Publication Data: Doc x Doc x Similarity Representation

Traffic engineering: network traffic modeled as a 3rd-order tensor over IP source, IP destination, and destination port.

5. Tensor Voting
Tensor Voting Framework
Normal space
Tensor inference
Token refinement
Token decomposition
Results and conclusion

Tensor Voting Framework
Objective: infer "hidden" objects, i.e., gaps and broken parts.
Gestalt principles: the presence of each input token implies a hypothesis that structure passes through it. We are inclined to infer the structures (marked by red lines in the figure) by proximity, closure, and continuity.

Tensor Voting Framework: Normal Space
Normal space: encodes structure information; spanned by normal vectors.
Tensor: describes the linear relation of vectors; built from outer products.
Structure types in 3D: surfaces (stick tensor, surface element), curves (plate tensor, curve element), volumes (ball tensor, point element).
In the figure: white arrows are normal vectors, blue regions are the normal space, and red shapes are the structure elements.
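The stick/plate/ball split can be read off the eigenvalues of the tensor; a minimal sketch for a pure stick tensor built from one surface normal:

```python
import numpy as np

# A second-order tensor built from a single surface normal is a stick
# tensor; its eigenvalues reveal the structure type via the saliencies.
n = np.array([0.0, 0.0, 1.0])  # unit surface normal
T = np.outer(n, n)             # stick tensor (rank 1)

lam = np.sort(np.linalg.eigvalsh(T))[::-1]  # eigenvalues, descending
stick = lam[0] - lam[1]  # surface saliency
plate = lam[1] - lam[2]  # curve saliency
ball = lam[2]            # point saliency (no orientation preference)
print(stick, plate, ball)  # 1.0 0.0 0.0: a pure surface element
```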

Tensor Voting Framework: Tensor Inference
Consider a voter point p on a surface; its normal is a single vector with a saliency (magnitude). This information propagates to a neighboring votee point x. The most likely smooth path between them is the arc of the osculating circle, which determines the tensor (structure information) received at x.

Tensor Voting Framework: Tensor Inference
The voter projects structure information via its tensor, weighted by a decay function (DF), to neighboring votees; each votee sums up all the tensors it receives.
Voting procedure in 2D with the stick vote (figure: voter P1 with normal NP1, votee Pk, arc length s, distance l, angle θ):
Decompose the voter's tensor into stick tensors.
Cast each as a fundamental stick vote.
Decompose the voted tensors at the votee to reconstruct the same normal space as the voter's.
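A rough sketch of the fundamental 2D stick vote under the osculating-circle construction; the decay form and the 45-degree cutoff follow common tensor-voting formulations, but exact constants vary between papers:

```python
import numpy as np

def stick_vote(voter, normal, votee, sigma=1.0, c=1.0):
    """2D stick vote cast by `voter` (with unit normal) at `votee`.

    Sketch of the osculating-circle construction: arc length s and
    curvature kappa set the decay, and the received normal is the voter
    normal rotated by twice the angle theta measured from the tangent.
    """
    d = votee - voter
    l = np.linalg.norm(d)
    if l == 0:
        return np.outer(normal, normal)  # the voter keeps its own tensor
    tangent = np.array([normal[1], -normal[0]])
    theta = np.arctan2(d @ normal, d @ tangent)
    if abs(theta) > np.pi / 4:           # common 45-degree cutoff: no vote
        return np.zeros((2, 2))
    s = theta * l / np.sin(theta) if theta != 0 else l
    kappa = 2.0 * np.sin(theta) / l
    decay = np.exp(-(s**2 + c * kappa**2) / sigma**2)
    rot = 2.0 * theta
    R = np.array([[np.cos(rot), -np.sin(rot)],
                  [np.sin(rot),  np.cos(rot)]])
    n_new = R @ normal
    return decay * np.outer(n_new, n_new)

# A votee straight along the voter's tangent receives a vote whose normal
# stays parallel to the voter's normal, scaled only by the distance decay.
V = stick_vote(np.zeros(2), np.array([0.0, 1.0]), np.array([1.0, 0.0]))
print(np.round(V, 4))
```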

Tensor Voting Framework: Token Refinement
Problem: prior knowledge of the structure type (normal space) and its saliencies is unknown. A token refinement procedure is thus needed:
Initialize each token with a unit ball tensor (no direction preference).
Perform tensor voting between tokens.
Decompose the resulting tensor at each token to obtain its direction preference.
(Figures: initial ball tensors in 2D; after token refinement in 2D.)

Inference Algorithm
1. Input the initial trace with missing parts.
2. Perform token refinement.
3. Mark the sparse voting region to define potential votees.
4. Perform tensor voting to infer structures.
5. Decompose the results and add the determined votees to the previous trace.
6. Re-input the thinned trace and repeat from step 2 until enough iterations have run.
7. Output the final trace result.

Results and Conclusions
Inference results (figure: number of pixels vs. σ values): the polluted trace used as algorithm input, tensor voting with σ = 1, and tensor voting with σ = 2, compared with the victim method.

Results and Conclusions
The approach efficiently infers a human mobility trace when the observed location data contain missing parts. No user instructions are required to identify which parts to infer: discovering missing positions and performing the inference are done automatically. A sparse per-votee implementation scheme reduces the computation load, and the method achieves relatively high accuracy.

6. Conclusions
A tensor is a multidimensional array. Tensor decomposition (factorization) can be considered a higher-order generalization of the matrix SVD or PCA, with wide applications in data reconstruction, cluster analysis, and compression.