Orthogonal Matrices and Spectral Representation


In Section 4.3 we saw that an n × n matrix A is similar to a diagonal matrix if and only if it has n linearly independent eigenvectors. In this section we investigate the question 'When is a matrix similar to a triangular matrix?' To answer it we need some additional concepts. We will see that the answer covers both real and complex square matrices, whether or not they are diagonalizable. In addition we will obtain a valuable result for symmetric matrices, which we use in two applications.

Some review:
The dot product of a pair of vectors x and y in R^n is x · y = x1y1 + x2y2 + ... + xnyn.
The complex dot product of a pair of vectors x and y in C^n is x · y = x̄1y1 + x̄2y2 + ... + x̄nyn, where the bar denotes complex conjugation.
Length of a vector: ||x|| = √(x · x). (The length of x is also called the norm of x.)
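These definitions can be sketched in a few lines of Python. (The code and the small test vectors are our own illustration, not part of the original slides.)

```python
import math

def dot(x, y):
    # real dot product: sum of componentwise products
    return sum(xi * yi for xi, yi in zip(x, y))

def cdot(x, y):
    # complex dot product: conjugate the entries of x, then sum products
    return sum(xi.conjugate() * yi for xi, yi in zip(x, y))

def norm(x):
    # length (norm) of x: square root of x dotted with itself
    return math.sqrt(sum(abs(xi) ** 2 for xi in x))

print(dot([1, 2], [3, 4]))     # 11
print(cdot([1j, 1], [1j, 1]))  # (2+0j)
print(norm([3, 4]))            # 5.0
```

Note that for real vectors the complex dot product reduces to the ordinary one, since conjugation leaves real numbers unchanged.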

Notation and Definitions: To incorporate both the real and complex cases it is convenient to use the conjugate transpose of a matrix, which was introduced in the exercises of Section 1.1. The conjugate transpose of a column vector x is denoted x* and is the row vector whose entries are the complex conjugates of the entries of x. With this notation, the dot product of x and y is given by the row-by-column product x*y, and the norm of a vector x in R^n or C^n can be expressed as ||x|| = √(x*x).

Definition: A set S of n × 1 vectors in R^n or C^n is called an orthogonal set provided none of the vectors is the zero vector and each pair of distinct vectors in S is orthogonal; that is, for vectors x ≠ y in S, x*y = 0.

Example: The columns of any identity matrix form an orthogonal set.
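A quick check of the orthogonal-set definition, using the columns of the 3 × 3 identity matrix as in the example above. (The helper names here are our own.)

```python
def star(x, y):
    # x*y: row-by-column product of the conjugate transpose of x with y
    return sum(xi.conjugate() * yi for xi, yi in zip(x, y))

def is_orthogonal_set(S):
    # no zero vectors, and every pair of distinct vectors satisfies x*y = 0
    if any(all(c == 0 for c in v) for v in S):
        return False
    return all(star(S[i], S[j]) == 0
               for i in range(len(S)) for j in range(len(S)) if i != j)

# columns of the 3x3 identity matrix
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(is_orthogonal_set(I3))             # True
print(is_orthogonal_set([[1, 1], [1, 0]]))  # False: (1,1)*(1,0) = 1, not 0
```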

Definition: An orthogonal set of vectors S is called an orthonormal set provided each of its vectors has length (norm) 1; that is, x*x = 1 for each x in S.

If S is an orthogonal set, you can turn it into an orthonormal set by replacing each vector x by its corresponding unit vector x/||x||.

Orthogonal and orthonormal sets of vectors have an important property: an orthogonal or orthonormal set of vectors is linearly independent. (How would you prove this? Hint: apply x* for each x in S to a linear combination of the vectors that equals the zero vector.)
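The normalization step can be sketched as follows; the orthogonal set {(1, 1), (1, -1)} used here is a standard example chosen for illustration.

```python
import math

def unit(x):
    # replace x by x/||x||, the unit vector in the same direction
    n = math.sqrt(sum(c * c for c in x))
    return [c / n for c in x]

S = [[1, 1], [1, -1]]        # orthogonal, but each vector has norm sqrt(2)
T = [unit(x) for x in S]     # now orthonormal

# each vector in T has norm 1 (up to rounding)
print([round(sum(c * c for c in v), 10) for v in T])  # [1.0, 1.0]
```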

Next we define an important class of matrices whose rows and columns form orthonormal sets.

Definition: A real (complex) n × n matrix Q whose columns form an orthonormal set is called an orthogonal (unitary) matrix.

To combine the real and complex cases we will use the term unitary, since every real matrix is a complex matrix whose entries have imaginary parts equal to zero.
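As an illustrative check (the 2 × 2 rotation matrix below is our own example), the columns of a rotation matrix form an orthonormal set, so the matrix is orthogonal:

```python
import math

t = math.pi / 6  # rotation by 30 degrees
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

# extract the columns of Q
cols = [[Q[0][j], Q[1][j]] for j in range(2)]
dot = lambda x, y: sum(a * b for a, b in zip(x, y))

print(round(dot(cols[0], cols[0]), 10))  # 1.0  (each column has norm 1)
print(round(dot(cols[0], cols[1]), 10))  # 0.0  (distinct columns are orthogonal)
```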

Recall that one goal of this chapter is to determine the eigenvalues of a matrix A. As we stated in Section 4.3, one way to do this is to find a similarity transformation that results in a diagonal or triangular matrix, so that we can in effect 'read off' the eigenvalues as the diagonal entries. Unitary similarity transformations are just the right tool, as shown by the following important result, known as Schur's Theorem.

Schur's Theorem: An n × n complex matrix A is unitarily similar to an upper triangular matrix.

We will not prove Schur's Theorem, but it tells us that we can determine all the eigenvalues of A by finding an appropriate unitary matrix P so that P⁻¹AP = P*AP is upper triangular. The following special case is useful in a variety of applications, two of which we investigate later in this section.

Theorem: Every Hermitian (symmetric) matrix is unitarily (orthogonally) similar to a diagonal matrix.

Recall that a real matrix is symmetric if A^T = A, and a complex matrix is Hermitian if A* = A. An important consequence of this theorem is that every symmetric matrix is diagonalizable. In fact, we know even more from previous results: if A is symmetric then all of its eigenvalues are real numbers, and we can find a corresponding set of eigenvectors that forms an orthonormal set. Combining these ideas, we can show that eigenvalues and eigenvectors are the fundamental building blocks of a symmetric matrix. (They are like LEGO bricks!)
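The 'building blocks' idea is the spectral representation A = λ1·p1p1^T + λ2·p2p2^T. A minimal sketch, using a hand-picked symmetric matrix whose eigenpairs are easy to verify (our own example, not from the slides):

```python
import math

# The symmetric matrix A = [[2, 1], [1, 2]] has eigenpairs
# (3, (1, 1)/sqrt(2)) and (1, (1, -1)/sqrt(2)).
l1, l2 = 3, 1
s = 1 / math.sqrt(2)
p1, p2 = [s, s], [s, -s]

# rebuild A from its spectral representation A = l1*p1*p1^T + l2*p2*p2^T
A = [[l1 * p1[i] * p1[j] + l2 * p2[i] * p2[j] for j in range(2)]
     for i in range(2)]
print([[round(a, 10) for a in row] for row in A])  # [[2.0, 1.0], [1.0, 2.0]]
```

The sum of the rank-one pieces λi·pipi^T recovers A exactly, which is what 'building blocks' means here.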

Application: The Image of the Unit Circle under a Symmetric Matrix

Previously we showed that the image of the unit circle in R^2 under a diagonal matrix is an ellipse (see Section 1.5). Here we investigate the image of the unit circle under a matrix transformation whose associated matrix A is symmetric. We will show the fundamental role the eigenpairs of A play in determining both the size and orientation of the image.

Suppose A is 2 × 2 and symmetric. Then A is orthogonally similar to a diagonal matrix, P^T A P = D, and so A = P D P^T. Now we apply the transformation defined by the matrix A step by step.

Since A = P D P^T, applying the transformation defined by A step by step to the unit circle gives the following result:
- P^T takes the unit circle to itself; each point c is just rotated (or reflected).
- D takes this unit circle to an ellipse in standard position.
- P rotates or reflects the ellipse.
Thus the image of the unit circle under a symmetric matrix is an ellipse centered at the origin, but possibly with its axes not parallel to the coordinate axes.

Next we show the role of the eigenvalues and eigenvectors of the symmetric matrix A. Since A is symmetric, A = P D P^T, and the eigenpairs of A are (λ1, p1) and (λ2, p2), where the eigenvectors p1 and p2 are the columns of the orthogonal matrix P.
- Graph the eigenvectors of A in the unit circle.
- When P^T is applied to the unit circle, the images of the eigenvectors are the unit vectors i and j.
- When D is applied to the unit circle, we get an ellipse whose major and minor axes lie in the coordinate directions, scaled by the eigenvalues.
- When P is applied to this ellipse, a rotation or reflection is performed, and the axes of the ellipse end up in the directions of the eigenvectors of A.
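A small numeric check of this picture, using our own example A = [[2, 1], [1, 2]], whose eigenvalues are 3 and 1 with eigenvector directions (1, 1) and (1, -1): the unit eigenvector in direction (1, 1) maps to a point at distance 3 from the origin, on the semi-major axis of the image ellipse.

```python
import math

def apply_A(v):
    # multiply the symmetric matrix A = [[2, 1], [1, 2]] by the vector v
    return [2 * v[0] + 1 * v[1], 1 * v[0] + 2 * v[1]]

# unit eigenvector in direction (1, 1), with eigenvalue 3
u1 = [1 / math.sqrt(2), 1 / math.sqrt(2)]
img = apply_A(u1)

# the eigenvector lands on the ellipse axis, stretched by its eigenvalue
print(round(math.hypot(*img), 10))  # 3.0
```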

The eigenvalues determine the stretching of the axes, and the eigenvectors determine the orientation of the images of the axes. So the eigenpairs completely determine the image. This is demonstrated in MATLAB using the routine mapcirc.