Maths for Signals and Systems: Linear Algebra in Engineering. Lecture 18, Friday 18th November 2016. DR TANIA STATHAKI, READER (ASSOCIATE PROFESSOR) IN SIGNAL PROCESSING, IMPERIAL COLLEGE LONDON.



Mathematics for Signals and Systems
In this set of lectures we will talk about two topics:
Linear transformations
A summary of decompositions and matrices

Linear transformations
Consider mathematical quantities (parameters, functions, vectors, and so on) denoted by u and v. A transformation T is an operator applied to such quantities, giving T(u), T(v). A linear transformation possesses the following two properties:
T(u + v) = T(u) + T(v)
T(cv) = c T(v), where c is a scalar.
Combining the two conditions gives T(c1 u + c2 v) = c1 T(u) + c2 T(v).
A linear transformation always maps the zero vector to zero.
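Under the assumption T(v) = Av for some matrix A (the 2×2 matrix below is an arbitrary example, not from the slides), the two properties can be checked numerically — a minimal NumPy sketch:

```python
import numpy as np

# Arbitrary example matrix; T(v) = A v is the transformation under test.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def T(v):
    return A @ v

u = np.array([1.0, -2.0])
v = np.array([4.0, 0.5])
c = 3.0

additive = np.allclose(T(u + v), T(u) + T(v))       # T(u+v) = T(u) + T(v)
homogeneous = np.allclose(T(c * v), c * T(v))       # T(cv) = c T(v)
maps_zero_to_zero = np.allclose(T(np.zeros(2)), 0)  # T(0) = 0
```

Any matrix A would pass these checks: matrix multiplication distributes over vector addition and commutes with scalar multiplication.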

Examples of transformations
Is the transformation T: R^2 → R^2, which projects any vector of the 2-D plane onto a specific straight line, a linear transformation?
Is the transformation T: R^2 → R^2, which shifts the entire plane by a vector v0, a linear transformation?
Is the transformation T: R^3 → R, which takes a vector as input and produces its length as output, a linear transformation?
Is the transformation T: R^2 → R^2, which rotates a vector by 45°, a linear transformation?
Is the transformation T(v) = Av, where A is a matrix, a linear transformation?
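Two of these questions can be settled numerically (a NumPy sketch; v0, the rotation angle handling, and the test vectors are arbitrary choices): a shift of the plane is not linear, while a 45° rotation is.

```python
import numpy as np

v0 = np.array([1.0, 2.0])                 # arbitrary nonzero shift
shift = lambda v: v + v0

th = np.pi / 4                            # rotation by 45 degrees
R = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])
rotate = lambda v: R @ v

u, v, c = np.array([1.0, 0.0]), np.array([0.0, 2.0]), 3.0

# A shift fails the necessary condition T(0) = 0, so it cannot be linear.
shift_maps_zero_to_zero = np.allclose(shift(np.zeros(2)), 0)

# Rotation is multiplication by a matrix, so both properties hold.
rotation_is_linear = (np.allclose(rotate(u + v), rotate(u) + rotate(v))
                      and np.allclose(rotate(c * v), c * rotate(v)))
```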

Examples of transformations
Consider a transformation T: R^3 → R^2. If T(v) = Av, then A is a matrix of size 2×3.
If we know the outputs of the transformation applied to a set of vectors v1, v2, …, vn which form a basis of some space, then we know its output for any vector that belongs to that space.
Recall: the coordinates of a vector depend on the chosen basis. Most of the time when we talk about coordinates we have in mind the "standard" basis, which consists of the columns (rows) of the identity matrix. Another popular basis consists of the eigenvectors of a matrix.

Examples of transformations: Projections
Consider the matrix A that represents a linear transformation T. Most of the time the required transformation is of the form T: R^n → R^m. We need to choose two bases: one for R^n, denoted by v1, v2, …, vn, and one for R^m, denoted by w1, w2, …, wm. We are looking for a transformation which, applied to a vector described in the input coordinates, produces the output coordinates.
Consider R^2 and the transformation which projects any vector onto a given straight line. As basis for R^2 we take, instead of the "standard" vectors [1 0]^T and [0 1]^T, a pair adapted to the problem: one basis vector lies on the line and the other is perpendicular to it.

Examples of transformations: Projections cont.
We use the same basis for R^2 both before and after the transformation, i.e., w1 = v1 and w2 = v2. Any vector v in R^2 can be written as v = c1 v1 + c2 v2. We are looking for T(·) such that T(v1) = v1 and T(v2) = 0. Then
T(v) = c1 T(v1) + c2 T(v2) = c1 v1,
or, in coordinates,
[1 0; 0 0] [c1; c2] = [c1; 0].
The matrix in this case is the diagonal matrix Λ = diag(1, 0). This is the "good" matrix.
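This change of basis can be verified numerically (a NumPy sketch; the 45° line is an arbitrary concrete choice of the projection line): expressed in the basis {v1 on the line, v2 perpendicular to it}, the projection matrix becomes diag(1, 0).

```python
import numpy as np

# v1 lies on the 45-degree line, v2 is perpendicular to it.
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([-1.0, 1.0]) / np.sqrt(2)
B = np.column_stack([v1, v2])      # change-of-basis matrix

# Projection matrix in the standard basis: P = a a^T / (a^T a).
a = np.array([[1.0], [1.0]])
P = (a @ a.T) / (a.T @ a)

# The same transformation expressed in the basis {v1, v2}.
P_in_basis = np.linalg.inv(B) @ P @ B
```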

Examples of transformations: Projections cont.
Now consider as basis for R^2 the "standard" basis, v1 = [1 0]^T and v2 = [0 1]^T, and projection onto the 45° line, whose direction is a = [1 1]^T. In this example the required matrix is
P = a a^T / (a^T a) = [1/2 1/2; 1/2 1/2].
Here we didn't choose the "best" basis; we chose the "handiest" basis.
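The formula P = a a^T / (a^T a) can be checked directly (a NumPy sketch using a = [1 1]^T from the slide):

```python
import numpy as np

a = np.array([[1.0], [1.0]])        # direction of the 45-degree line
P = (a @ a.T) / (a.T @ a)           # projection matrix onto the line

idempotent = np.allclose(P @ P, P)  # P^2 = P
symmetric = np.allclose(P, P.T)     # P = P^T
p = P @ np.array([1.0, 0.0])        # project the first standard basis vector
```

Projecting [1 0]^T gives [1/2 1/2]^T, the closest point to it on the line.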

Examples of transformations: Derivative of a function
Consider the transformation that takes the derivative of a function (the derivative is a linear transformation!):
T = d(·)/dx
Take the input c1 + c2 x + c3 x^2; its basis consists of the functions 1, x, x^2. The output is c2 + 2 c3 x; its basis consists of the functions 1, x. We are looking for a matrix A such that A [c1; c2; c3] = [c2; 2c3]. This is
A = [0 1 0; 0 0 2].
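The derivative matrix can be checked on a concrete polynomial (a NumPy sketch; the coefficient values are an arbitrary example):

```python
import numpy as np

# Derivative as a matrix acting on coefficient vectors, in the bases
# {1, x, x^2} (input) and {1, x} (output).
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

coeffs = np.array([5.0, 3.0, 4.0])  # p(x) = 5 + 3x + 4x^2
dcoeffs = A @ coeffs                # p'(x) = 3 + 8x
```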

Summary of Decompositions: LU Decomposition
What is it? A = LU, where A is a square matrix, L is a lower triangular matrix with 1s on the diagonal, and U is an upper triangular matrix with the pivots of A on the diagonal.
When does it exist? If the matrix is invertible (the determinant is not 0), then a pure LU decomposition exists if and only if the leading principal minors are not 0. The leading principal minors are the "north-west" determinants. Example: the matrix A = [0 1; 1 1] does not have an LU decomposition although it is invertible. If the matrix is not invertible (the determinant is 0), the minor condition no longer settles whether a pure LU decomposition exists.
LU decomposition works for rectangular matrices as well, with slight modifications/extensions.
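Elimination without row exchanges makes the failure mode concrete — a minimal sketch (Doolittle form, not a production routine): it breaks down exactly when a zero pivot appears, as for the invertible matrix [0 1; 1 1] above.

```python
import numpy as np

def lu_no_pivot(A):
    """LU factorization without row exchanges; raises on a zero pivot."""
    U = A.astype(float).copy()
    n = U.shape[0]
    L = np.eye(n)
    for k in range(n - 1):
        if U[k, k] == 0:
            raise ValueError("zero pivot: no pure LU decomposition")
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # multiplier stored in L
            U[i, :] -= L[i, k] * U[k, :]     # eliminate below the pivot
    return L, U

L, U = lu_no_pivot(np.array([[2.0, 1.0], [8.0, 7.0]]))   # succeeds

try:
    lu_no_pivot(np.array([[0.0, 1.0], [1.0, 1.0]]))      # invertible, no pure LU
    has_pure_lu = True
except ValueError:
    has_pure_lu = False
```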

LDU Decomposition
What is it? A = LDU, where A is a square matrix, L is a lower triangular matrix with 1s on the diagonal, D is a diagonal matrix with the pivots of A on the diagonal, and U is an upper triangular matrix with 1s on the diagonal.
When does it exist? Under the same requirements as the pure LU decomposition.
Example:
A = LU = [2 1; 8 7] = [1 0; 4 1] [2 1; 0 3]
The above can be written as
A = LDU = [2 1; 8 7] = [1 0; 4 1] [2 0; 0 3] [1 1/2; 0 1]
LDU decomposition works for rectangular matrices as well, with slight modifications/extensions.
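The D factor is obtained by pulling the pivots out of U; a quick NumPy check of the worked example:

```python
import numpy as np

L = np.array([[1.0, 0.0], [4.0, 1.0]])
U = np.array([[2.0, 1.0], [0.0, 3.0]])
A = L @ U                                # [[2, 1], [8, 7]]

pivots = np.diag(U)                      # 2 and 3
D = np.diag(pivots)                      # diagonal matrix of pivots
U_unit = np.diag(1.0 / pivots) @ U       # each row of U scaled by its pivot
```

U_unit is upper triangular with 1s on the diagonal, and L D U_unit reproduces A.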

A = ER
What is it? A = ER, where A is any matrix of dimension m×n, E is a square invertible matrix of dimension m×m, and
R = [I_{r×r}  F_{r×(n−r)}; 0_{(m−r)×r}  0_{(m−r)×(n−r)}],
where F is some r×(n−r) matrix and r is the rank of A.
When does it exist? Always.

A = QR for square matrices
What is it? A = QR, where A is any matrix of dimension n×n and Q is a square invertible matrix of dimension n×n with orthonormal columns. The columns of Q are derived from the columns of A. If A is not invertible, the first r columns of Q are derived from the r independent columns of A, where r is the rank of A; the last (n−r) columns of Q are chosen to be orthogonal to the first r. R is upper triangular if A is invertible; if A is NOT invertible, the last (n−r) rows of R are filled with zeros. (Look at Problem Sheet 6, Question 6, Matrix B.)
When does it exist? Always. QR decomposition exists for rectangular matrices as well; I haven't taught this case.
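For an invertible A, the columns of Q come from Gram–Schmidt applied to the columns of A — a sketch (numpy.linalg.qr is the standard route in practice; the example matrix is arbitrary):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR via classical Gram-Schmidt (assumes independent columns)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component along earlier q_i
            v -= R[i, j] * Q[:, i]        # remove it
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalise what is left
    return Q, R

A = np.array([[1.0, 1.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
```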

A = SΛS^−1
What is it? A = SΛS^−1, where A is a square matrix of dimension n×n, the columns of S are the eigenvectors of A, and Λ is the diagonal matrix of its eigenvalues.
When does it exist? When A has n linearly independent eigenvectors.
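A quick NumPy check on an arbitrary matrix with distinct eigenvalues (distinct eigenvalues guarantee n independent eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])             # distinct eigenvalues 4 and 2

evals, S = np.linalg.eig(A)            # columns of S are eigenvectors
Lam = np.diag(evals)
A_rebuilt = S @ Lam @ np.linalg.inv(S)
```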

A = QΛQ^T
What is it? A = QΛQ^T, where A is a real, symmetric matrix of dimension n×n, Q is an orthogonal matrix whose columns are eigenvectors of A, and Λ is the diagonal matrix of its (real) eigenvalues. This decomposition is the so-called Spectral Theorem.
When does it exist? When A is real and symmetric.
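The spectral theorem can be checked numerically with numpy.linalg.eigh, which handles the real symmetric case (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # real and symmetric

evals, Q = np.linalg.eigh(A)           # real eigenvalues, orthonormal Q
A_rebuilt = Q @ np.diag(evals) @ Q.T
```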

The Singular Value Decomposition A = UΣV^T
What is it? A = UΣV^T, where A is any matrix of dimension m×n, U is an orthogonal matrix of dimension m×m whose columns are the eigenvectors of AA^T, Σ is an m×n matrix with the singular values of A on its main diagonal, and V is an orthogonal matrix of dimension n×n whose columns are the eigenvectors of A^T A. The squares of the singular values of A are the eigenvalues of both AA^T and A^T A.
When does it exist? Always.
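The relation between the singular values of A and the eigenvalues of A^T A can be checked directly (a NumPy sketch on an arbitrary 3×2 matrix):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0],
              [0.0, 0.0]])                            # arbitrary 3x2 example

U, s, Vt = np.linalg.svd(A)                           # U: 3x3, Vt: 2x2
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # descending order

# Rebuild A from the thin part of the factorization.
A_rebuilt = U[:, :2] @ np.diag(s) @ Vt
```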

Summary of matrices
Projection matrix P onto a subspace S:
p = Pb is the closest point to b in S.
P^2 = P = P^T. The condition P^2 = P makes a matrix a projection; together with P = P^T it characterises an orthogonal projection.
Eigenvalues: 0 or 1. Eigenvectors lie in S or in S^⊥.
If the columns of a matrix A are a basis for S, then P = A (A^T A)^−1 A^T.
Orthogonal matrix Q:
A square matrix with orthonormal columns; Q^−1 = Q^T.
Eigenvalues: |λ| = 1. Eigenvectors are orthogonal.
Preserves lengths and angles, i.e., ||Qx|| = ||x||.
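The listed properties can be verified on small examples (a NumPy sketch; the matrix A and the rotation angle are arbitrary choices):

```python
import numpy as np

# Projection onto the column space of A (the columns are a basis for S).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

proj_evals = np.sort(np.linalg.eigvalsh(P))   # should be 0, 1, 1

# Orthogonal matrix: rotation by an arbitrary angle.
th = 0.3
Q = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])
x = np.array([3.0, 4.0])
```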

Also recall:
Symmetric (or Hermitian) matrices: their eigenvalues are real.
Positive definite and positive semi-definite matrices: their eigenvalues are positive or non-negative, respectively.
The matrices A^T A and AA^T: they have identical nonzero eigenvalues, all of them non-negative.
Square matrices. Rectangular matrices.