SVD: Physical Interpretation and Applications

SVD: Physical Interpretation and Applications
Adeel Razi
Wireless Communications Lab (WCL), Dept. of Electrical Engineering & Telecommunications, University of New South Wales
"Anyone who has never made a mistake has never tried anything new" -- Einstein

The plan today
Singular Value Decomposition: basic intuition
Formal definition
SVD and multiuser MIMO
Other applications

Geometric analysis of linear transformations
We want to know what a linear transformation A does. We need some simple and "comprehensible" representation of the matrix of A.
Let's look at what A does to some vectors. Since A(αv) = αA(v), it is enough to look at vectors v of unit length.

The geometry of linear transformations
A linear (non-singular) transform A always takes hyper-spheres to hyper-ellipses.

The geometry of linear transformations
Thus, one good way to understand what A does is to find which vectors are mapped to the "main axes" of the ellipsoid.

Geometric analysis of linear transformations
If we are lucky: A = V Λ V^T with V orthogonal (true if A is symmetric). The eigenvectors of A are then the axes of the ellipse.

Symmetric matrix: eigen decomposition
In this case A is just a scaling matrix. The eigen decomposition of A tells us which orthogonal axes it scales, and by how much. (Figure: the unit circle is scaled by λ1 and λ2 along the two eigenvector directions.)
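
A minimal NumPy sketch of this idea (not from the slides; the 2x2 matrix is an arbitrary example): eigh returns the orthogonal axes V and the scale factors λ, and A is recovered as V Λ V^T.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])               # a symmetric 2x2 matrix (arbitrary example)

lam, V = np.linalg.eigh(A)               # eigenvalues (ascending) and orthonormal eigenvectors
A_rebuilt = V @ np.diag(lam) @ V.T       # A = V Lambda V^T

print(lam)                               # the scale factors along the orthogonal axes
print(np.allclose(A, A_rebuilt))         # True
print(np.allclose(V.T @ V, np.eye(2)))   # True: V is orthogonal
```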

General linear transformations: SVD
In general A will also contain rotations, not just scales. (Figure: the unit circle is rotated and then stretched into an ellipse with semi-axes σ1 and σ2.)

General linear transformations: SVD
A = U Σ V^T, where the columns of U and V are orthonormal and Σ scales by σ1 and σ2. (Figure: the unit circle mapped in three steps by V^T, Σ and U.)

SVD more formally
SVD exists for any matrix. Formal definition: for square matrices A ∈ R^(n×n), there exist orthogonal matrices U, V ∈ R^(n×n) and a diagonal matrix Σ such that all the diagonal values σi of Σ are non-negative and A = U Σ V^T.

SVD more formally
The diagonal values of Σ (σ1, …, σn) are called the singular values. It is customary to sort them: σ1 ≥ σ2 ≥ … ≥ σn.
The columns of U (u1, …, un) are called the left singular vectors; they are the axes of the ellipsoid.
The columns of V (v1, …, vn) are called the right singular vectors; they are the pre-images of the axes of the ellipsoid.

Singular values and eigenvalues: relation
For an m×n matrix A of rank r there exists a factorization (Singular Value Decomposition, SVD) A = U Σ V^T, where U is m×m, Σ is m×n, and V is n×n.
The columns of U are orthogonal eigenvectors of A A^T. The columns of V are orthogonal eigenvectors of A^T A. The non-zero eigenvalues λ1, …, λr of A A^T are also the eigenvalues of A^T A, and the singular values are σi = sqrt(λi).
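
A minimal NumPy check of this relation (the random 4x3 matrix is just an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))               # a generic 4x3 matrix

U, s, Vt = np.linalg.svd(A)                   # A = U @ Sigma @ Vt
lam = np.linalg.eigvalsh(A.T @ A)[::-1]       # eigenvalues of A^T A, sorted descending

print(np.allclose(s**2, lam))                 # True: sigma_i^2 = lambda_i
print(np.allclose(A, U[:, :3] @ np.diag(s) @ Vt))   # True: reconstruct A from the factors
```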

Matrix rank
The rank of A is the number of non-zero singular values. (Figure: A = U Σ V^T, with Σ an m×n diagonal matrix holding σ1, σ2, …, σn.)
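
In floating point the "zero" singular values are only approximately zero, so in practice rank is counted against a small tolerance. A minimal sketch (assumption: NumPy, arbitrary example matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],               # = 2 x first row, so A is rank deficient
              [1.0, 0.0, 1.0]])

s = np.linalg.svd(A, compute_uv=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]   # a common tolerance choice
rank = int(np.sum(s > tol))

print(s, rank)                               # smallest singular value ~ 0, rank = 2
print(np.linalg.matrix_rank(A))              # NumPy's built-in agrees: 2
```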

Beamforming for multiuser MIMO systems
MIMO systems are generally not transparent to the user, and a smart antenna alone is not suitable for solving the Non-Line-of-Sight (NLOS) problem. The best solution: Singular Value Decomposition (SVD).
(Figure: a 4×4 multiuser MIMO link through two scattering clusters, showing desired beams, undesired spread beams, and interference.)

SVD and MIMO
The MIMO channel can be modeled as a complex matrix H. The MIMO channel capacity is determined by H, and channel state information (CSI) is needed at the transmitter. (Figure: the MIMO channel and its SVD.)

Beamforming for MIMO systems
The goal of beamforming is to diagonalize the channel matrix, so that performance approaches the Shannon capacity.

Beamforming for MIMO systems
Let the channel matrix be H. A pre-processing matrix is applied to the transmitted signal at the transmitter and a post-processing matrix to the received signal at the receiver.

Beamforming for MIMO systems
By applying the SVD, H can be decomposed as H = U Σ V^H. With V used for array processing at the transmitter and U^H for array processing at the receiver, the data streams arrive orthogonally, without interference between streams.
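
A minimal NumPy sketch of this diagonalization (assumptions: a random 3x3 complex channel and noiseless transmission, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))   # complex MIMO channel

U, s, Vh = np.linalg.svd(H)              # H = U @ diag(s) @ Vh
x = np.array([1.0, -0.5, 0.25])          # one symbol per data stream

tx = Vh.conj().T @ x                     # pre-processing at the transmitter (multiply by V)
rx = H @ tx                              # propagation through the channel (noise omitted)
y = U.conj().T @ rx                      # post-processing at the receiver (multiply by U^H)

print(np.allclose(y, s * x))             # True: each stream only sees its own gain sigma_i
```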

Matrix inverse and solving linear systems
Matrix inverse: if A = U Σ V^T is square and non-singular, then A^(-1) = V Σ^(-1) U^T. So, to solve Ax = b, take x = V Σ^(-1) U^T b.
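
A minimal NumPy sketch of solving a square system this way (the 2x2 system is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
b = np.array([1.0, 2.0])

U, s, Vt = np.linalg.svd(A)
x = Vt.T @ ((U.T @ b) / s)               # x = V Sigma^-1 U^T b

print(x)
print(np.allclose(A @ x, b))             # True: same result as np.linalg.solve(A, b)
```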

Solving least-squares systems
We try to solve Ax = b when A is rectangular, with more equations than unknowns, seeking a solution in the least-squares sense: minimize ||Ax - b||.

Solving least-squares systems
When A is full-rank, the solution satisfies A^T A x = A^T b (the normal equations), so x = (A^T A)^(-1) A^T b. Writing A = U Σ V^T, the matrix A^T A = V Σ^T Σ V^T has diagonal values σ1², …, σn², and A^T = V Σ^T U^T carries the singular values σ1, …, σn.

Solving least-squares systems
Substituting the SVD into the normal equations: x = (A^T A)^(-1) A^T b = V diag(1/σ1², …, 1/σn²) V^T · V Σ^T U^T b = V diag(1/σ1, …, 1/σn) U^T b.

Pseudoinverse
The matrix we found is called the pseudoinverse of A, written A^+. Definition using the reduced SVD: A^+ = V diag(1/σ1, …, 1/σn) U^T; if some of the σi are zero, put zero instead of 1/σi.
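
A minimal NumPy sketch of this construction (the 3x2 matrix is an arbitrary example), checked against NumPy's built-in pseudoinverse:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])                       # 3x2 matrix of rank 2

U, s, Vt = np.linalg.svd(A, full_matrices=False) # reduced SVD
s_inv = np.array([1.0 / si if si > 1e-12 else 0.0 for si in s])   # zeros stay zero
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

print(A_pinv.shape)                              # (2, 3): A is m x n, A+ is n x m
print(np.allclose(A_pinv, np.linalg.pinv(A)))    # True
```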

Pseudoinverse
The pseudoinverse A^+ exists for any matrix A. Its properties: if A is m×n then A^+ is n×m, and it acts a little like a real inverse: A A^+ A = A and A^+ A A^+ = A^+.

Solving least-squares systems
When A is not full-rank, A^T A is singular and there are multiple solutions to the normal equations; thus, there are multiple solutions to the least-squares problem. The SVD approach still works! In this case it finds the minimal-norm solution x = A^+ b.
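
A minimal NumPy sketch (the rank-1 matrix is chosen only for illustration): adding a null-space vector leaves the residual unchanged, but the pseudoinverse solution has the smallest norm.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [1.0, 1.0]])                        # rank 1: A^T A is singular
b = np.array([1.0, 2.0, 3.0])

x_min = np.linalg.pinv(A) @ b                     # minimal-norm least-squares solution
x_other = x_min + np.array([1.0, -1.0])           # add a null-space vector of A

print(np.allclose(A @ x_min, A @ x_other))                 # True: same residual
print(np.linalg.norm(x_min), np.linalg.norm(x_other))      # the pinv solution is shorter
```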

Thank you!
"The important thing is not to stop questioning" -- Einstein

Backup Slides

Principal components
Eigenvectors that correspond to large eigenvalues are the directions in which the data has strong components (= large variance). If the eigenvalues are all more or less the same, there is no preferable direction. Note: the eigenvalues of the covariance (scatter) matrix are always non-negative. Think why…

Principal components
If there is no preferable direction, S looks like a multiple of the identity and any vector is an eigenvector. If there is a clear preferable direction, S looks like diag(λ, μ) in the eigenvector basis, where μ is close to zero, much smaller than λ.
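
A minimal NumPy sketch (synthetic 2-D data stretched along one axis, chosen for illustration) showing the eigenvalues and the preferable direction of the covariance matrix S:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 2)) @ np.diag([3.0, 0.3])   # strong spread along the first axis
X = X - X.mean(axis=0)                                    # center the data

S = (X.T @ X) / (len(X) - 1)          # sample covariance (scatter) matrix
lam, V = np.linalg.eigh(S)            # eigenvalues ascending, always non-negative

print(lam)          # one eigenvalue much larger than the other: a clear preferable direction
print(V[:, -1])     # eigenvector of the largest eigenvalue, roughly (+/-1, 0)
```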

SVD and MIMO
Consider a 2×2 MIMO case. Every input is a vector in the 2-dimensional space spanned by (T1, T2). The matrix H rotates the basis vectors (T1, T2) into a new set of vectors (R1, R2), so the output is a vector in the space of (R1, R2). The new rotated vectors might not be orthogonal, or they might not be independent of each other. (Figure: the (T1, T2) basis mapped by H onto (R1, R2).)

SVD and MIMO
If H is a full-rank matrix, R1 and R2 will be linearly independent (and can be taken as new basis vectors), where R1 = h11·T1 + h12·T2 and R2 = h21·T1 + h22·T2.

SVD and MIMO: example
Example: T1 + T2 = R1 = R2, i.e. both receive antennas see the same combination of the transmitted signals. The singular directions in the transmit space are v1 = 0.7T1 + 0.7T2 and v2 = 0.7T1 - 0.7T2, and the received signal will only lie on a single line: the channel is rank 1.
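
A minimal NumPy check of this example (assumption: all channel coefficients equal to 1, matching T1 + T2 = R1 = R2):

```python
import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, 1.0]])            # h11 = h12 = h21 = h22 = 1

U, s, Vh = np.linalg.svd(H)
print(s)                              # [2. 0.]: only one non-zero singular value (rank 1)
print(Vh)                             # rows ~ +/-[0.707, 0.707] and +/-[0.707, -0.707]
```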