Introduction to tensor, tensor factorization and its applications

Introduction to tensor, tensor factorization and its applications Mu Li iPAL Group Meeting Sept. 17, 2010

Outline
Basic concepts about tensors
1. What is a tensor? Why tensors and tensor factorization?
2. Tensor multiplication
3. Tensor rank
Tensor factorization
1. CANDECOMP/PARAFAC (CP) factorization
2. Tucker factorization
Applications of tensor factorization
Conclusion

What's a tensor? Why tensors and tensor factorization? Definition: a tensor is a multidimensional array; it generalizes a matrix to more than two indices. Tensors arise naturally in daily life. To facilitate mining information from tensors and to ease their processing and storage, tensor factorization is often needed. Three-way tensor:

A tensor is a multidimensional array

Fibers and slices
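Fibers (all indices fixed but one) and slices (all indices fixed but two) are easy to see in code. A small sketch with numpy, using a hypothetical 3x4x2 tensor:

```python
import numpy as np

# Hypothetical 3x4x2 tensor used to illustrate fibers and slices.
X = np.arange(24).reshape(3, 4, 2)

# A fiber fixes every index but one:
column_fiber = X[:, 1, 0]      # mode-1 fiber x(:, j, k), shape (3,)
row_fiber    = X[0, :, 1]      # mode-2 fiber x(i, :, k), shape (4,)
tube_fiber   = X[0, 1, :]      # mode-3 fiber x(i, j, :), shape (2,)

# A slice fixes every index but two:
frontal_slice    = X[:, :, 0]  # shape (3, 4)
lateral_slice    = X[:, 1, :]  # shape (3, 2)
horizontal_slice = X[0, :, :]  # shape (4, 2)
```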

Tensor unfoldings: matricization and vectorization Matricization: convert a tensor to a matrix; the mode-n unfolding X_(n) arranges the mode-n fibers as the columns of a matrix. Vectorization: convert a tensor to a vector by stacking all of its entries.
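Both operations can be sketched in a few lines of numpy (the `unfold` helper is an illustrative implementation, not a library function):

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: move axis `mode` to the front, then flatten the
    remaining axes, so the mode-n fibers become the columns."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

X = np.arange(24).reshape(3, 4, 2)
X0 = unfold(X, 0)   # shape (3, 8)
X1 = unfold(X, 1)   # shape (4, 6)
X2 = unfold(X, 2)   # shape (2, 12)

# Vectorization: stack all entries into one long vector.
x = X.reshape(-1)   # shape (24,)
```

Note that the column ordering of an unfolding is a matter of convention; any convention works as long as it is used consistently in the factorization formulas.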

Tensor multiplication: the n-mode product with a matrix Definition: the n-mode product Y = X ×_n U multiplies every mode-n fiber of X by the matrix U; in unfolded form, Y_(n) = U X_(n).
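A minimal numpy sketch of the n-mode product via unfold-multiply-fold (the helper name `mode_n_product` is my own):

```python
import numpy as np

def mode_n_product(X, U, mode):
    """n-mode product X x_n U: multiply every mode-n fiber of X by U.
    Implemented as U @ X_(n) followed by folding the result back."""
    Xn = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)  # unfold
    Yn = U @ Xn                                              # multiply
    # fold: restore the tensor shape, with mode `mode` resized to U.shape[0]
    rest = [s for i, s in enumerate(X.shape) if i != mode]
    Y = Yn.reshape([U.shape[0]] + rest)
    return np.moveaxis(Y, 0, mode)

X = np.random.rand(3, 4, 2)
U = np.random.rand(5, 4)         # acts on mode 1 (size 4)
Y = mode_n_product(X, U, 1)
print(Y.shape)                   # (3, 5, 2)
```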

Tensor multiplication: the n-mode product with a vector Definition: the n-mode product of X with a vector v contracts mode n of X against v. Note: multiplying by a vector reduces the order of the tensor by one.
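The order reduction is visible directly in a small numpy sketch:

```python
import numpy as np

# Multiplying mode n by a vector contracts that mode away:
# the result has one fewer dimension than the input tensor.
X = np.random.rand(3, 4, 2)
v = np.random.rand(4)

Y = np.einsum('ijk,j->ik', X, v)   # contract mode 1 with v
print(Y.shape)                     # (3, 2): order drops from 3 to 2
```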

Rank-one tensors and tensor rank A rank-one three-way tensor is the outer product of three vectors: X = a ∘ b ∘ c, i.e., x_ijk = a_i b_j c_k. Tensor rank: the smallest number of rank-one tensors whose sum generates the tensor. Differences from matrix rank: 1. the rank of a tensor can differ over R and C. 2. Determining tensor rank is NP-hard; no straightforward algorithm exists.
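A rank-one tensor is easy to build explicitly; a small numpy sketch with arbitrary example vectors:

```python
import numpy as np

# A rank-one three-way tensor is the outer product of three vectors:
# X[i, j, k] = a[i] * b[j] * c[k].
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, -1.0])
c = np.array([2.0, 0.5])

X = np.einsum('i,j,k->ijk', a, b, c)
print(X.shape)                              # (3, 2, 2)

# Every slice of a rank-one tensor is a rank-one matrix:
print(np.linalg.matrix_rank(X[:, :, 0]))    # 1
```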

Tensor factorization: CANDECOMP/PARAFAC (CP) factorization Tensor factorization is an extension of matrix SVD and PCA. CP factorization expresses a tensor as a sum of R rank-one tensors. Uniqueness: the CP factorization of a higher-order tensor is unique under mild conditions (up to permutation and scaling of the components). How to compute: Alternating Least Squares (ALS): fix all but one factor matrix, solve a linear least-squares problem for the remaining one, and cycle through the modes until convergence.
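The ALS scheme can be sketched in numpy. This is a minimal, illustrative implementation, not a production solver: the `unfold` and `khatri_rao` helpers are my own, and real implementations add column normalization and convergence checks.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding with the mode-n fibers as columns."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x R) and B (J x R)."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, R, n_iter=200):
    """Minimal ALS sketch for a rank-R CP factorization of a 3-way tensor.
    Each step fixes two factor matrices and solves a linear least-squares
    problem for the third, cycling through the modes."""
    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((s, R)) for s in X.shape)
    for _ in range(n_iter):
        A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

On an exactly low-rank tensor, this sketch typically recovers the factors up to the permutation and scaling ambiguity mentioned above.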

Differences between matrix SVD and tensor CP Lower-rank approximation behaves differently for matrices and higher-order tensors. Matrix: truncating the SVD after k terms yields the best rank-k approximation (Eckart–Young). Not true for higher-order tensors: truncating a rank-R CP factorization does not, in general, give the best lower-rank approximation, and a best rank-k approximation may not even exist.
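The matrix side of this contrast can be verified numerically: for a truncated SVD, the spectral-norm error equals the first discarded singular value. A small sketch with a random matrix:

```python
import numpy as np

# Eckart-Young: truncating the SVD gives the best rank-k approximation,
# with spectral-norm error equal to the (k+1)-th singular value.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 5))

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2
Mk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err = np.linalg.norm(M - Mk, 2)     # spectral norm of the residual
print(np.isclose(err, s[k]))        # True
```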

Tensor factorization: Tucker factorization For a three-way tensor, Tucker factorization has three types: Tucker3 (a core tensor multiplied by a factor matrix in each of the three modes), Tucker2 (one factor matrix fixed to the identity), Tucker1 (two factor matrices fixed to the identity).

Three types of Tucker factorization

Uniqueness: unlike CP, the Tucker factorization is not unique. How to compute: Higher-Order SVD (HOSVD): for each mode n, take the factor matrix to be the leading R_n left singular vectors of the mode-n unfolding X_(n).
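A minimal sketch of the truncated HOSVD in numpy (the `unfold` and `hosvd` helpers are illustrative, not library functions):

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding with the mode-n fibers as columns."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def hosvd(X, ranks):
    """Truncated HOSVD for a 3-way tensor: for each mode n, the factor
    matrix holds the leading R_n left singular vectors of the mode-n
    unfolding; the core is X multiplied by the transposed factors."""
    factors = []
    for n, Rn in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(X, n), full_matrices=False)
        factors.append(U[:, :Rn])
    # Core G = X x_1 A^T x_2 B^T x_3 C^T
    G = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
    return G, factors

X = np.random.rand(4, 5, 3)
G, (A, B, C) = hosvd(X, (2, 2, 2))
print(G.shape)                      # (2, 2, 2)
# Reconstruction: X is approximated by G x_1 A x_2 B x_3 C
X_hat = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
```

With full ranks the reconstruction is exact; truncating the ranks gives a (generally non-optimal) low-multilinear-rank approximation.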

Applications of Tensor factorization A simple application of CP:

Apply CP to reconstruct a MATLAB logo from noisy data

Apply Tucker3 to reconstruct data from noisy observations

Apply Tucker3 to perform cluster analysis

Conclusion A tensor is a multidimensional array, an extension of a matrix, that arises frequently in daily life (e.g., video, microarray data, EEG data). Tensor factorization can be viewed as a higher-order generalization of matrix SVD or PCA, but there are important differences, such as the NP-hardness of determining higher-order tensor rank and the non-optimality of truncated higher-order factorizations. Many other tensor factorizations exist, such as block-oriented decompositions, DEDICOM, and CANDELINC. Tensor factorizations have wide applications in data reconstruction, cluster analysis, compression, etc.

References
Kolda, Bader, Tensor decompositions and applications.
Martin, An overview of multilinear algebra and tensor decompositions.
Cichocki et al., Nonnegative matrix and tensor factorizations.