Estimation Techniques for High Resolution and Multi-Dimensional Array Signal Processing
Electronic Measurements and Signal Processing Group (EMS), Fraunhofer IIS and TU Ilmenau
Laboratory of Array Signal Processing (LASP), University of Brasília (UnB)
Prof. João Paulo C. Lustosa da Costa
Content of the intensive course (2)
Multidimensional Array Signal Processing
- Tensor operators
- Tensor decompositions
  - Parallel factor analysis (PARAFAC)
  - Kruskal’s rank
- Tensor models
- Higher order singular value decomposition (HOSVD)
- Merging dimensions
- Least squares Khatri-Rao factorization (LSKRF)
- Tensor based techniques
  - Tensor based model order selection
Multi-dimensional array signal processing (1): the term “tensor”
Mathematics: W. Voigt, abstract definition (a multilinear mapping). Physics: M. Grossmann and A. Einstein, description of covariant and contravariant indices. Intuitive description: “A tensor of order p is a collection of elements that are referenced by p indices.” Scalars are tensors of order zero, vectors of order one, matrices of order two; 3rd order, 4th order, and higher order tensors are also called “multi-way arrays”.
Multi-dimensional array signal processing (2)
Advantages: increased identifiability, separation without imposing additional constraints, and improved accuracy (tensor gain).
[Figure: RX is a uniform rectangular array (URA) with elements indexed by (m1, m2) = (1,1), …, (3,3) and snapshots n = 1, 2, 3.]
Stacking the measurements into a 9 x 3 matrix, the maximum rank is 3: at most 3 sources can be resolved.
Multi-dimensional array signal processing (3)
Advantages: increased identifiability, separation without imposing additional constraints, and improved accuracy (tensor gain).
[Figure: the same 3 x 3 URA (RX) with snapshots n = 1, 2, 3, kept as a tensor.]
Keeping the measurements as a 3 x 3 x 3 tensor, the maximum rank is 5: at most 5 sources can be resolved.
J. B. Kruskal, “Rank, decomposition, and uniqueness for 3-way and N-way arrays,” Multiway Data Analysis, pp. 7–18, 1989.
Multi-dimensional array signal processing (4)
Advantages: increased identifiability, separation without imposing additional constraints, and improved accuracy (tensor gain).
For the matrix model, unrealistic assumptions such as orthogonality (PCA) or statistical independence (ICA) must be made. For the tensor model, the separation is unique up to scaling and permutation ambiguities.
[Figure: a third order tensor written as a sum of rank one terms.]
Multi-dimensional array signal processing (5)
Advantages: increased identifiability, separation without imposing additional constraints, and improved accuracy (tensor gain).
Applications of tensor based techniques:
- Array interpolation due to imperfections
- Estimation of the number of sources d, also known as model order selection: multi-dimensional schemes give better accuracy
- Prewhitening: multi-dimensional schemes give better accuracy and lower complexity
- Parameter estimation: drastic reduction of the computational complexity, since multidimensional searches are decomposed into several one dimensional searches
Multi-dimensional array signal processing (6)
Advantage: insights due to the tensor notation. There is an analogy between the scalar representation and the matrix representation: matrix equations are more compact, which yields insights and eases manipulations. Example: the DFT. The data is the same in all representations, but the matrix representation is more compact. For tensors, the representation is even more compact, since more dimensions are represented in the same equation.
Tensor operators (1)
Given two tensors, the following operations are defined:
- Outer product: the order of the resulting tensor is the sum of the orders of both tensors.
- Inner product: the n-th dimension of both tensors is suppressed.
- Higher order norm.
- Concatenation: all dimensions of both tensors must be equal, except the n-th dimension along which they are concatenated.
Tensor operators (2)
Example of the inner product between two tensors: the sample covariance tensor of a 3rd order data tensor. Comparing it with the sample covariance matrix computed from the unfoldings of the tensor, the tensor notation makes the multidimensional structure explicit, while the matrix notation does not show the natural multidimensional structure of the data. Moreover, the sample covariance matrix is a multimode unfolding of the sample covariance tensor.
Tensor operators (3)
Example of the outer product. The outer product between two vectors yields a rank one matrix; the outer product between three vectors yields a rank one tensor.
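A minimal NumPy sketch of both statements (the vectors are arbitrary illustrations):

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0])

M = np.outer(a, b)                    # outer product of two vectors: rank one matrix
T = np.einsum('i,j,k->ijk', a, b, c)  # outer product of three vectors: rank one tensor

print(np.linalg.matrix_rank(M))  # 1
print(T.shape)                   # (2, 3, 2)
```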
Tensor operators (4)
Unfoldings of a tensor: the “1-mode vectors”, the “2-mode vectors”, and the “3-mode vectors”. The number of unfoldings equals the order of the tensor.
Tensor operators (5)
Unfolding of a third order tensor.
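A sketch of the unfoldings in NumPy, using one common convention (the column ordering of the n-mode vectors differs between authors):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: the n-mode vectors of T become the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

T = np.arange(24).reshape(2, 3, 4)
# One unfolding per dimension: the number of unfoldings equals the order.
print(unfold(T, 0).shape, unfold(T, 1).shape, unfold(T, 2).shape)
# (2, 12) (3, 8) (4, 6)
```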
Tensor operators (6)
Unfolding of a tensor.
[Figure: RX is a 3 x 3 uniform rectangular array (URA) with dimensions m1, m2 and snapshots n = 1, 2, 3.]
Experts without knowledge of tensors usually use only a single unfolding, i.e., a single projection of the tensor. However, by exploiting all unfoldings, the tensor gain is achieved.
Tensor operators (7)
N-mode product. Let a tensor and a matrix be given; the mode-n product can be written in tensor, scalar, and matrix representation, i.e., all the n-mode vectors are multiplied from the left by the matrix. Illustrative examples are the 1-mode, 2-mode, and 3-mode products between a third order tensor and a matrix.
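A sketch of the mode-n product in NumPy; `unfold` and `fold` follow one common convention, and the “multiply every n-mode vector from the left” definition is checked against an equivalent `einsum`:

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a target tensor of the given shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def mode_n_product(T, U, mode):
    """Multiply all n-mode vectors of T from the left by the matrix U."""
    shape = list(T.shape)
    shape[mode] = U.shape[0]
    return fold(U @ unfold(T, mode), mode, shape)

T = np.random.rand(2, 3, 4)
U = np.random.rand(5, 3)
Y = mode_n_product(T, U, 1)    # product along the second dimension (0-based mode 1)
print(Y.shape)                 # (2, 5, 4)
print(np.allclose(Y, np.einsum('mj,ijk->imk', U, T)))  # True
```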
Tensor operators (8)
The Khatri-Rao product between two matrices is the column-wise Kronecker product: the i-th column of the result is the Kronecker product of the i-th columns of the two factors.
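A NumPy sketch of this column-wise Kronecker definition:

```python
import numpy as np

def khatri_rao(A, B):
    """Khatri-Rao product: the i-th column is kron(A[:, i], B[:, i])."""
    assert A.shape[1] == B.shape[1], "both factors need the same number of columns"
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)

A = np.random.rand(4, 3)
B = np.random.rand(5, 3)
F = khatri_rao(A, B)
print(F.shape)                                          # (20, 3)
print(np.allclose(F[:, 1], np.kron(A[:, 1], B[:, 1])))  # True
```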
Parallel Factor Analysis (1)
Unfolding of a rank one tensor. The outer product between three vectors is a rank one tensor, and each of its unfoldings is a rank one matrix.
Parallel Factor Analysis (2)
Unfolding of a rank one tensor, continued: similar expressions hold for the other unfoldings.
Parallel Factor Analysis (3)
Unfolding of a rank d tensor. A rank d tensor can be decomposed into a sum of d rank one tensors. Each unfolding can then be written as one factor matrix times the transposed Khatri-Rao product of the other two factor matrices.
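These unfolding identities can be checked numerically (a sketch; with the C-ordered unfolding used here the Khatri-Rao factors appear in the order shown, while texts using the other column ordering write them swapped):

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)

rng = np.random.default_rng(0)
d = 2                                   # tensor rank
A, B, C = rng.random((4, d)), rng.random((5, d)), rng.random((3, d))

# Rank-d tensor as a sum of d rank one (outer product) terms.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

print(np.allclose(unfold(T, 0), A @ khatri_rao(B, C).T))  # True
print(np.allclose(unfold(T, 1), B @ khatri_rao(A, C).T))  # True
print(np.allclose(unfold(T, 2), C @ khatri_rao(A, B).T))  # True
```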
Parallel Factor Analysis (4)
Parallel Factor Analysis (PARAFAC): given the tensor and the model order d, we want to estimate the factor matrices. Applications of the PARAFAC model: wireless communications (detection, estimation), harmonic retrieval, radar, exploratory data analysis, source separation, spectroscopy, pattern recognition, biomedical signals, chemistry, psychometrics, …
Parallel Factor Analysis (5)
Alternating Least Squares (ALS). ALS is a simple technique to estimate the factor matrices by exploiting the unfolding equations of the tensor. These equations can be rewritten using pseudo-inverses: fixing two factor matrices, the third is obtained as a least squares solution. Note that ALS must be initialized with at least two factor matrices, for instance B and C, which can be initialized with random elements. A good initialization leads to fast convergence.
Kruskal’s rank
Uniqueness of the PARAFAC decomposition. The Kruskal rank of a matrix is the largest number k such that every set of k columns is linearly independent. The PARAFAC decomposition of a rank d tensor with factor matrices A, B, and C is unique if
k_A + k_B + k_C >= 2d + 2,
where k_A, k_B, and k_C denote the Kruskal ranks of the factor matrices. The decomposition is unique only up to scaling ambiguities (columns can be rescaled as long as the scalings compensate across the factors) and permutation ambiguities (the rank one terms can be reordered).
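For small matrices the Kruskal rank can be computed by brute force (a sketch; the example matrix is a hypothetical illustration):

```python
import numpy as np
from itertools import combinations

def kruskal_rank(A, tol=1e-10):
    """Largest k such that every set of k columns is linearly independent.
    Exponential cost: only sensible for small matrices."""
    n = A.shape[1]
    for k in range(n, 0, -1):
        if all(np.linalg.matrix_rank(A[:, list(cols)], tol=tol) == k
               for cols in combinations(range(n), k)):
            return k
    return 0

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])   # third column is the sum of the first two
print(kruskal_rank(A))  # 2: every pair is independent, the full triple is not
```

The uniqueness condition k_A + k_B + k_C >= 2d + 2 can then be checked directly from the three factor matrices.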
Tensor models (1)
Scalar representation of the harmonic retrieval data model; mapping between the spatial frequencies and the directions of arrival (azimuth and elevation).
F. Roemer, Advances in subspace-based parameter estimation: Tensor-ESPRIT-type methods and non-circular sources, Master’s thesis, TU Ilmenau, 2006.
Tensor models (2)
Matrix representation of the data model, and tensor representation using n-mode products with the identity tensor, which is defined as a tensor with 1’s on the superdiagonal and 0’s everywhere else. Another, equivalent representation is also possible.
Tensor models (3)
Tensor models (4)
HOSVD (1)
Complete HOSVD, economic HOSVD, and low rank approximation.
HOSVD (2)
The HOSVD is computed by applying one SVD to each unfolding, which for a tensor of order R+1 means R+1 SVDs. Example: the HOSVD of a 3rd order tensor. For the approximation to rank d, d columns of each factor matrix are selected, for r = 1, …, R+1, together with the corresponding vectors of the core tensor. The HOSVD allows denoising in all dimensions of the tensor.
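A sketch of the (optionally truncated) HOSVD for a third order tensor: one SVD per unfolding yields the factor matrices, and the core tensor follows from mode-n products with their conjugate transposes. With full ranks the reconstruction is exact; truncating each mode gives the low rank approximation used for denoising.

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_n_product(T, U, mode):
    return np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)

def hosvd(T, ranks):
    """One SVD per unfolding; keep the dominant left singular vectors."""
    Us = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        Us.append(U[:, :r])
    S = T
    for mode, U in enumerate(Us):       # core tensor
        S = mode_n_product(S, U.conj().T, mode)
    return S, Us

T = np.random.rand(4, 5, 6)
S, Us = hosvd(T, ranks=(4, 5, 6))       # complete (untruncated) HOSVD
R = S
for mode, U in enumerate(Us):           # reconstruct from core + factors
    R = mode_n_product(R, U, mode)
print(np.allclose(R, T))  # True
```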
HOSVD versus PARAFAC-ALS
HOSVD:
- The core tensor is full, i.e., it is seldom diagonal.
- Direct and easy to compute via SVDs.
- The factor matrices are tall or square.
- Used for denoising or data compression.
PARAFAC-ALS:
- The core tensor is diagonal.
- The factor matrices are computed iteratively.
- The factor matrices can be fat, i.e., the underdetermined case is allowed.
- The tensor rank and the natural sources of the data are revealed.
Merging dimensions
Without merging, each array dimension of the measurement tensor is kept separate. By stacking, two (or more) dimensions can be merged into one; in both cases the same underlying data model is assumed, and the merged steering vectors have a Kronecker structure.
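The Kronecker structure of a merged dimension can be illustrated in NumPy (a sketch with hypothetical vectors for the two array dimensions):

```python
import numpy as np

a1 = np.array([1.0, 2.0, 3.0])   # response along the first array dimension
a2 = np.array([4.0, 5.0])        # response along the second array dimension

# Merging the two dimensions stacks the rank one matrix a1 a2^T into a
# single vector, which is exactly a Kronecker product.
merged = np.kron(a1, a2)
print(merged.shape)                                       # (6,)
print(np.allclose(merged, np.outer(a1, a2).reshape(-1)))  # True
```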
Least Squares Khatri-Rao Factorization
Given a matrix with Khatri-Rao structure, we want to recover its two factor matrices. Reshaping each merged column, the result should be a rank one matrix, so we can apply the SVD based rank one approximation to split it into the two corresponding factor columns.
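A sketch of the LSKRF, assuming the merged matrix has columns kron(a_i, b_i): each column is reshaped into a (possibly noisy) rank one matrix and split by a truncated SVD.

```python
import numpy as np

def lskrf(F, M, N):
    """Recover A (M x d) and B (N x d) from F with columns ≈ kron(a_i, b_i)."""
    d = F.shape[1]
    A, B = np.zeros((M, d)), np.zeros((N, d))
    for i in range(d):
        Fi = F[:, i].reshape(M, N)          # ≈ a_i b_i^T, a rank one matrix
        U, s, Vt = np.linalg.svd(Fi)
        A[:, i] = U[:, 0] * np.sqrt(s[0])   # rank one approximation, with the
        B[:, i] = Vt[0, :] * np.sqrt(s[0])  # singular value split evenly
    return A, B

# Verify on an exact Khatri-Rao product.
rng = np.random.default_rng(0)
A0 = rng.standard_normal((4, 3))
B0 = rng.standard_normal((5, 3))
F = np.stack([np.kron(A0[:, i], B0[:, i]) for i in range(3)], axis=1)

A, B = lskrf(F, 4, 5)
F_hat = np.stack([np.kron(A[:, i], B[:, i]) for i in range(3)], axis=1)
print(np.allclose(F, F_hat))  # True (factors match up to column-wise scaling)
```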
Tensor based Model Order Selection (1)
Model order estimation for tensors via the global eigenvalues.
Tensor based Model Order Selection (2)
Comparison between the global eigenvalues and the eigenvalues of a single unfolding.
Tensor based Model Order Selection (3)
Comparison between tensor based and matrix based model order selection schemes.