Slide 1: Algorithm Development with Higher Order SVD
Algorithms Meeting, September 16, 2004
Adam Goodenough
Slide 2: Tensors
A tensor is an N-mode generalization of a matrix. Notation: $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$, where $I_n$ is the length of the tensor in mode (i.e. dimension) $n$. A tensor has order $N$.
Two basic operations: “unfolding” ($T_{(n)}$) and the “mode-$n$ product of a tensor and a matrix” ($\mathcal{T} \times_n M$).
Slide 3: Example
[Figure: a mode-3 tensor with dimensions $I_1 \times I_2 \times I_3$, drawn as a cube of numbered entries]
Slide 4: Unfolding
Unfolding flattens an N-mode tensor into a 2-mode tensor (a matrix). The operation $T_{(n)}$ unfolds the tensor along mode $n$. Each column of the resulting matrix is composed of the elements $t_{i_1 i_2 \ldots i_n \ldots i_N}$ where $i_n$ varies and the remaining indices are held constant (for that column).
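A minimal NumPy sketch of unfolding (the helper name `unfold` is my own, and the ordering of the flattened modes is one of several published conventions; note the code uses 0-based mode numbers):

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding T_(n), 0-based: bring mode n to the front,
    then flatten the remaining modes into the columns."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

# A 3x4x2 mode-3 tensor: each unfolding is a matrix whose row
# count is the length of the chosen mode.
T = np.arange(24).reshape(3, 4, 2)
print(unfold(T, 0).shape)  # (3, 8)
print(unfold(T, 1).shape)  # (4, 6)
print(unfold(T, 2).shape)  # (2, 12)
```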
Slide 5: Example - Unfolding
[Figure: mode-1 unfolding of the example tensor]
Slide 6: Example - Unfolding
[Figure: mode-2 unfolding of the example tensor]
Slide 7: Example - Unfolding
[Figure: mode-3 unfolding of the example tensor]
Slide 8: Tensor – Matrix Product
The operation $\mathcal{T} \times_n M$ denotes the product of an $N$th-order tensor with a matrix (a 2nd-order tensor). Elementwise, $(\mathcal{T} \times_n M)_{i_1 \ldots i_{n-1}\, j\, i_{n+1} \ldots i_N} = \sum_{i_n} t_{i_1 \ldots i_n \ldots i_N}\, m_{j\, i_n}$.
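The product can be implemented by unfolding, multiplying, and folding back. A sketch assuming the `unfold` convention above (all names are illustrative):

```python
import numpy as np

def unfold(T, n):
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def fold(M, n, shape):
    # Inverse of unfold: reshape, then move mode n back into place.
    rest = [s for i, s in enumerate(shape) if i != n]
    return np.moveaxis(M.reshape([shape[n]] + rest), 0, n)

def mode_n_product(T, M, n):
    # T x_n M: apply M to every mode-n fiber of T; mode n's
    # length changes from T.shape[n] to M.shape[0].
    new_shape = list(T.shape)
    new_shape[n] = M.shape[0]
    return fold(M @ unfold(T, n), n, new_shape)

T = np.random.rand(3, 4, 2)
M = np.random.rand(5, 4)                  # acts on mode 1: 4 -> 5
print(mode_n_product(T, M, 1).shape)      # (3, 5, 2)
```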
Slide 9: Example - Product
[Figure: the example mode-3 tensor multiplied by a matrix along one mode]
Slide 10: Matrix PCA
Two standard ways of performing Principal Component Analysis: find the eigenvalues/eigenvectors of the covariance matrix, or take the SVD of the raw data. The basis vectors are weighted by the eigenvalues/singular values. The SVD approach can be easily extended to tensors.
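A quick numerical check of the equivalence (illustrative only; on centered data the covariance eigenvalues equal the squared singular values divided by $n-1$):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))    # rows are observations
Xc = X - X.mean(axis=0)              # PCA assumes centered data

# Way 1: eigenvalues of the covariance matrix (ascending order).
eigvals = np.linalg.eigh(Xc.T @ Xc / (len(Xc) - 1))[0]

# Way 2: singular values of the centered raw data.
s = np.linalg.svd(Xc, compute_uv=False)

# Same weights on the basis vectors: lambda_i = s_i^2 / (n - 1).
print(np.allclose(np.sort(s**2 / (len(Xc) - 1)), eigvals))  # True
```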
Slide 11: Matrix Decomposition
[Figure: the SVD of an $I_1 \times I_2$ data matrix: Data Matrix $= U\,S\,V^T$, with $U$ of size $I_1 \times J_1$, $S$ of size $J_1 \times J_2$, and $V^T$ of size $J_2 \times I_2$]
Slide 12: Matrix Decomposition
[Figure: the same $I_1 \times I_2$ data matrix, highlighted as itself being a tensor (“A Tensor!”, i.e. a matrix is an order-2 tensor)]
Slide 13: Matrix Decomposition
[Figure: the same SVD redrawn with the factors $U$ ($I_1 \times J_1$) and $V$ ($I_2 \times J_2$) applied to the core matrix $S$]
Slide 14: Matrix Decomposition
[Figure: the matrix SVD rewritten in mode-product form, Data Matrix $= S \times_1 U_1 \times_2 U_2$, with $U_1 = U$ and $U_2 = V$]
Slide 15: Matrix SVD – Tensor SVD
[Figure: the matrix SVD alongside its higher-order (tensor) generalization]
Slide 16: HOSVD
Given a tensor $\mathcal{T}$ of order $N$: $\mathcal{T} = \mathcal{Z} \times_1 U_1 \times_2 U_2 \cdots \times_N U_N$.
Each $U_n$ is obtained from the SVD of the unfolding of $\mathcal{T}$ along mode $n$. $\mathcal{Z}$ is a full tensor (i.e. not equivalent to the diagonal $S$ matrix).
Slide 17: HOSVD
The core tensor $\mathcal{Z}$ is found by $\mathcal{Z} = \mathcal{T} \times_1 U_1^T \times_2 U_2^T \cdots \times_N U_N^T$. $\mathcal{Z}$ has dimensionality equivalent to $\mathcal{T}$, i.e. given $\mathcal{T} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$, then $\mathcal{Z} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$.
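Putting the two slides together, a hedged end-to-end sketch of HOSVD (per-mode SVDs of the unfoldings, then the core-tensor formula above; helper names are my own):

```python
import numpy as np

def unfold(T, n):
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_n_product(T, M, n):
    out = np.tensordot(M, np.moveaxis(T, n, 0), axes=(1, 0))
    return np.moveaxis(out, 0, n)

def hosvd(T):
    # Each U_n = left singular vectors of the mode-n unfolding.
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
          for n in range(T.ndim)]
    # Core tensor Z = T x_1 U_1^T x_2 U_2^T ... x_N U_N^T:
    # full (not diagonal), same dimensionality as T.
    Z = T
    for n, U in enumerate(Us):
        Z = mode_n_product(Z, U.T, n)
    return Z, Us

T = np.random.rand(5, 4, 3)
Z, Us = hosvd(T)
recon = Z
for n, U in enumerate(Us):                # Z x_1 U_1 ... x_N U_N
    recon = mode_n_product(recon, U, n)
print(Z.shape, np.allclose(recon, T))     # (5, 4, 3) True
```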
Slide 18: Algorithm Applications
Three general areas: prediction, detection, and compression. All algorithms work by building training data sets in which parameters (the mode indices) are varied.
Slide 19: Applications - Prediction
TensorTextures (Vasilescu and Terzopoulos, 2004)
Slide 20: Applications - Prediction
Data set: $\mathcal{D} \in \mathbb{R}^{T \times I \times V}$.
Rasterized texture images: $T = 240 \times 320 \times 3 = 230{,}400$ texels.
Illumination orientations: $I = 21$. View orientations: $V = 37$. A total of 777 images were taken ($21 \times 37$).
Slide 21: Tangent
Normal PCA of this data set would perform SVD on the matrix of all observations. This matrix is equivalent to the mode-1 (texel) unfolding of the tensor, i.e. $D_{(texel)} \in \mathbb{R}^{T \times (I \cdot V)}$, a $230{,}400 \times 777$ matrix.
Slide 22: Tangent
This means that the SVD of the data set is related to $U_{texel}$: $D_{(texel)} = U_{texel}\,R$. It turns out that $R$ is constructed from the $\mathcal{Z}$, $U_{illum}$, and $U_{view}$ terms (using the Kronecker product). We end up having explicit control over the eigenvalues (compression).
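A small numerical check of that claim, on a toy stand-in for the texel x illumination x view tensor (the factor order in the Kronecker product matches the unfolding convention of the earlier sketches and may flip under other conventions):

```python
import numpy as np

def unfold(T, n):
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_n_product(T, M, n):
    out = np.tensordot(M, np.moveaxis(T, n, 0), axes=(1, 0))
    return np.moveaxis(out, 0, n)

def hosvd(T):
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
          for n in range(T.ndim)]
    Z = T
    for n, U in enumerate(Us):
        Z = mode_n_product(Z, U.T, n)
    return Z, Us

D = np.random.rand(30, 21, 37)        # (texel, illum, view) toy data
Z, (U_texel, U_illum, U_view) = hosvd(D)

# SVD of the observation matrix relates to U_texel:
# D_(texel) = U_texel @ R, with R built from Z, U_illum, U_view.
R = unfold(Z, 0) @ np.kron(U_illum, U_view).T
print(np.allclose(unfold(D, 0), U_texel @ R))   # True
```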
Slide 23: Applications - Prediction
The illumination and view parameters $i, v$ are computed based on nearest neighbors.
Slide 24: Applications - Prediction
MODTRAN (MakeADB)
Slide 25: Applications - Detection
TensorFaces (Vasilescu and Terzopoulos, 2002)
Slide 26: Applications - Detection
A vast improvement over PCA-based techniques.
Slide 27: Applications - Detection
Attempts on plume data.
Slide 28: Applications - Compression
HOSVD allows for explicit control over dimensionality reduction.
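A sketch of that control: truncate each mode's basis independently (the rank choices below are arbitrary; the helpers repeat the earlier sketches):

```python
import numpy as np

def unfold(T, n):
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_n_product(T, M, n):
    out = np.tensordot(M, np.moveaxis(T, n, 0), axes=(1, 0))
    return np.moveaxis(out, 0, n)

def truncated_hosvd(T, ranks):
    # Keep only the leading ranks[n] singular vectors in each mode:
    # explicit, per-mode control over the dimensionality reduction.
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
          for n, r in enumerate(ranks)]
    Z = T
    for n, U in enumerate(Us):
        Z = mode_n_product(Z, U.T, n)
    return Z, Us

T = np.random.rand(20, 20, 20)
Z, Us = truncated_hosvd(T, ranks=(10, 5, 20))   # compress modes 0, 1
approx = Z
for n, U in enumerate(Us):
    approx = mode_n_product(approx, U, n)
rel_err = np.linalg.norm(T - approx) / np.linalg.norm(T)
print(Z.shape, rel_err)   # (10, 5, 20) and the relative error
```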
Slide 29: Applications - Compression
Compression can be done so that reconstruction results are better perceptually (though usually worse in RMSE).
[Figure: side-by-side reconstructions labeled PCA and HOSVD]
Slide 30: Applications - Compression
Can having explicit control over dimensionality reduction allow us to perform invariant-based detection algorithms better? The goal is to use photon mapping to generate spectral images of underwater targets under varying environmental conditions, and to test traditional PCA-based invariant algorithms against HOSVD-based invariant algorithms.