Symmetric Matrices and Quadratic Forms

DIAGONALIZATION OF SYMMETRIC MATRICES

SYMMETRIC MATRIX

A symmetric matrix is a matrix $A$ such that $A^T = A$. Such a matrix is necessarily square. Its main diagonal entries are arbitrary, but its other entries occur in pairs, on opposite sides of the main diagonal.
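The defining condition is easy to test numerically. Below is a minimal NumPy sketch (the matrix is an arbitrary illustration, not one from the text):

```python
import numpy as np

# An arbitrary symmetric matrix: entries mirror across the main diagonal.
A = np.array([[1., 2., 3.],
              [2., 5., 4.],
              [3., 4., 9.]])

# A matrix is symmetric exactly when it equals its transpose.
print(np.array_equal(A, A.T))  # True
```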

SYMMETRIC MATRIX

Theorem 1: If A is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.

Proof: Let v1 and v2 be eigenvectors that correspond to distinct eigenvalues, say, $\lambda_1$ and $\lambda_2$. To show that $\mathbf{v}_1 \cdot \mathbf{v}_2 = 0$, compute

$\lambda_1 \mathbf{v}_1 \cdot \mathbf{v}_2 = (\lambda_1 \mathbf{v}_1)^T \mathbf{v}_2 = (A\mathbf{v}_1)^T \mathbf{v}_2$   (since v1 is an eigenvector)
$= \mathbf{v}_1^T A^T \mathbf{v}_2 = \mathbf{v}_1^T (A\mathbf{v}_2)$   (since $A^T = A$)
$= \mathbf{v}_1^T (\lambda_2 \mathbf{v}_2) = \lambda_2 \mathbf{v}_1 \cdot \mathbf{v}_2$   (since v2 is an eigenvector)
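Theorem 1 can also be confirmed numerically. The sketch below uses an arbitrary 2x2 symmetric matrix (not one from the text) and NumPy's np.linalg.eigh, a routine designed for symmetric matrices:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])  # symmetric, with distinct eigenvalues

# eigh returns real eigenvalues and eigenvectors in the columns of a matrix.
eigvals, eigvecs = np.linalg.eigh(A)
v1, v2 = eigvecs[:, 0], eigvecs[:, 1]

# Eigenvectors from different eigenspaces are orthogonal (Theorem 1).
print(np.isclose(v1 @ v2, 0.0))  # True
```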

SYMMETRIC MATRIX

Hence $(\lambda_1 - \lambda_2)\,\mathbf{v}_1 \cdot \mathbf{v}_2 = 0$. But $\lambda_1 - \lambda_2 \neq 0$, so $\mathbf{v}_1 \cdot \mathbf{v}_2 = 0$.

An $n \times n$ matrix A is said to be orthogonally diagonalizable if there are an orthogonal matrix P (with $P^{-1} = P^T$) and a diagonal matrix D such that

$A = PDP^T = PDP^{-1}$ ----(1)

Such a diagonalization requires n linearly independent and orthonormal eigenvectors. When is this possible? If A is orthogonally diagonalizable as in (1), then

$A^T = (PDP^T)^T = (P^T)^T D^T P^T = PDP^T = A$

SYMMETRIC MATRIX

Thus A is symmetric.

Theorem 2: An $n \times n$ matrix A is orthogonally diagonalizable if and only if A is a symmetric matrix.

Example 1: Orthogonally diagonalize the matrix

$A = \begin{bmatrix} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{bmatrix}$,

whose characteristic equation is

$0 = -\lambda^3 + 12\lambda^2 - 21\lambda - 98 = -(\lambda - 7)^2(\lambda + 2)$

SYMMETRIC MATRIX

Solution: The usual calculations produce bases for the eigenspaces:

$\lambda = 7:\ \mathbf{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix},\ \mathbf{v}_2 = \begin{bmatrix} -1/2 \\ 1 \\ 0 \end{bmatrix};\qquad \lambda = -2:\ \mathbf{v}_3 = \begin{bmatrix} -1 \\ -1/2 \\ 1 \end{bmatrix}$

Although v1 and v2 are linearly independent, they are not orthogonal. The projection of v2 onto v1 is

$\dfrac{\mathbf{v}_2 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 = -\dfrac{1}{4}\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} -1/4 \\ 0 \\ -1/4 \end{bmatrix}$

SYMMETRIC MATRIX

The component of v2 orthogonal to v1 is

$\mathbf{z}_2 = \mathbf{v}_2 - \dfrac{\mathbf{v}_2 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 = \begin{bmatrix} -1/4 \\ 1 \\ 1/4 \end{bmatrix}$

Then {v1, z2} is an orthogonal set in the eigenspace for $\lambda = 7$. (Note that z2 is a linear combination of the eigenvectors v1 and v2, so z2 is in the eigenspace.)
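This Gram-Schmidt step is easy to reproduce in code; a minimal sketch using the vectors computed above:

```python
import numpy as np

v1 = np.array([1., 0., 1.])
v2 = np.array([-0.5, 1., 0.])

# Project v2 onto v1, then subtract to get the orthogonal component z2.
proj = (v2 @ v1) / (v1 @ v1) * v1  # [-1/4, 0, -1/4]
z2 = v2 - proj                     # [-1/4, 1, 1/4]

print(z2)
print(np.isclose(z2 @ v1, 0.0))  # True: z2 is orthogonal to v1
```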

SYMMETRIC MATRIX

Since the eigenspace is two-dimensional (with basis v1, v2), the orthogonal set {v1, z2} is an orthogonal basis for the eigenspace, by the Basis Theorem. Normalize v1 and z2 to obtain the following orthonormal basis for the eigenspace for $\lambda = 7$:

$\mathbf{u}_1 = \begin{bmatrix} 1/\sqrt{2} \\ 0 \\ 1/\sqrt{2} \end{bmatrix},\qquad \mathbf{u}_2 = \begin{bmatrix} -1/\sqrt{18} \\ 4/\sqrt{18} \\ 1/\sqrt{18} \end{bmatrix}$

SYMMETRIC MATRIX

An orthonormal basis for the eigenspace for $\lambda = -2$ is

$\mathbf{u}_3 = \dfrac{2}{3}\,\mathbf{v}_3 = \begin{bmatrix} -2/3 \\ -1/3 \\ 2/3 \end{bmatrix}$

By Theorem 1, u3 is orthogonal to the other eigenvectors u1 and u2. Hence {u1, u2, u3} is an orthonormal set.

SYMMETRIC MATRIX

Let

$P = \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{2} & -1/\sqrt{18} & -2/3 \\ 0 & 4/\sqrt{18} & -1/3 \\ 1/\sqrt{2} & 1/\sqrt{18} & 2/3 \end{bmatrix},\qquad D = \begin{bmatrix} 7 & 0 & 0 \\ 0 & 7 & 0 \\ 0 & 0 & -2 \end{bmatrix}$

Then P orthogonally diagonalizes A, and $A = PDP^{-1}$.
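The result of Example 1, and Theorem 2 along with it, can be verified numerically; a sketch using the vectors just computed:

```python
import numpy as np

A = np.array([[ 3., -2.,  4.],
              [-2.,  6.,  2.],
              [ 4.,  2.,  3.]])

u1 = np.array([1., 0., 1.]) / np.sqrt(2)
u2 = np.array([-1., 4., 1.]) / np.sqrt(18)
u3 = np.array([-2., -1., 2.]) / 3.

P = np.column_stack([u1, u2, u3])
D = np.diag([7., 7., -2.])

print(np.allclose(P.T @ P, np.eye(3)))  # True: P is orthogonal
print(np.allclose(P @ D @ P.T, A))      # True: A = PDP^{-1}, since P^{-1} = P^T
```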

THE SPECTRAL THEOREM

The set of eigenvalues of a matrix A is sometimes called the spectrum of A, and the following description of the eigenvalues is called a spectral theorem.

Theorem 3: The Spectral Theorem for Symmetric Matrices. An $n \times n$ symmetric matrix A has the following properties:

A has n real eigenvalues, counting multiplicities.

THE SPECTRAL THEOREM

The dimension of the eigenspace for each eigenvalue λ equals the multiplicity of λ as a root of the characteristic equation.

The eigenspaces are mutually orthogonal, in the sense that eigenvectors corresponding to different eigenvalues are orthogonal.

A is orthogonally diagonalizable.
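The statement about eigenspace dimensions can be illustrated with the matrix from Example 1, where $\lambda = 7$ is a double root of the characteristic equation; a minimal sketch:

```python
import numpy as np

A = np.array([[ 3., -2.,  4.],
              [-2.,  6.,  2.],
              [ 4.,  2.,  3.]])

lam = np.linalg.eigvalsh(A)  # real eigenvalues: [-2., 7., 7.]

# Algebraic multiplicity of 7 vs. the dimension of its eigenspace,
# computed as n - rank(A - 7I).
mult = np.sum(np.isclose(lam, 7.0))
dim = 3 - np.linalg.matrix_rank(A - 7.0 * np.eye(3))
print(mult == dim)  # True, as the Spectral Theorem guarantees
```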

SPECTRAL DECOMPOSITION

Suppose $A = PDP^{-1}$, where the columns of P are orthonormal eigenvectors u1, …, un of A and the corresponding eigenvalues λ1, …, λn are in the diagonal matrix D. Then, since $P^{-1} = P^T$,

$A = PDP^T = \begin{bmatrix} \mathbf{u}_1 & \cdots & \mathbf{u}_n \end{bmatrix} \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix} \begin{bmatrix} \mathbf{u}_1^T \\ \vdots \\ \mathbf{u}_n^T \end{bmatrix} = \begin{bmatrix} \lambda_1\mathbf{u}_1 & \cdots & \lambda_n\mathbf{u}_n \end{bmatrix} \begin{bmatrix} \mathbf{u}_1^T \\ \vdots \\ \mathbf{u}_n^T \end{bmatrix}$

SPECTRAL DECOMPOSITION

Using the column-row expansion of a product, we can write

$A = \lambda_1 \mathbf{u}_1 \mathbf{u}_1^T + \lambda_2 \mathbf{u}_2 \mathbf{u}_2^T + \cdots + \lambda_n \mathbf{u}_n \mathbf{u}_n^T$ ----(2)

This representation of A is called a spectral decomposition of A because it breaks up A into pieces determined by the spectrum (eigenvalues) of A. Each term in (2) is an $n \times n$ matrix of rank 1. For example, every column of $\lambda_1 \mathbf{u}_1 \mathbf{u}_1^T$ is a multiple of u1. Each matrix $\mathbf{u}_j \mathbf{u}_j^T$ is a projection matrix in the sense that for each x in $\mathbb{R}^n$, the vector $(\mathbf{u}_j \mathbf{u}_j^T)\mathbf{x}$ is the orthogonal projection of x onto the subspace spanned by uj.
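Equation (2) translates directly into code. A minimal sketch, reusing the matrix from Example 1, that rebuilds A from rank-1 pieces and checks the projection property:

```python
import numpy as np

A = np.array([[ 3., -2.,  4.],
              [-2.,  6.,  2.],
              [ 4.,  2.,  3.]])

lam, U = np.linalg.eigh(A)  # orthonormal eigenvectors in the columns of U

# Spectral decomposition (2): A as a sum of rank-1 matrices lam_j * u_j u_j^T.
A_rebuilt = sum(lam[j] * np.outer(U[:, j], U[:, j]) for j in range(3))
print(np.allclose(A_rebuilt, A))  # True

# Each u_j u_j^T orthogonally projects onto the line spanned by u_j.
x = np.array([1., 2., 3.])
P0 = np.outer(U[:, 0], U[:, 0])
print(np.allclose(P0 @ x, (x @ U[:, 0]) * U[:, 0]))  # True
```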

SPECTRAL DECOMPOSITION

Example 2: Construct a spectral decomposition of the matrix A that has the orthogonal diagonalization

$A = \begin{bmatrix} 7 & 2 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 2/\sqrt{5} & -1/\sqrt{5} \\ 1/\sqrt{5} & 2/\sqrt{5} \end{bmatrix} \begin{bmatrix} 8 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 2/\sqrt{5} & 1/\sqrt{5} \\ -1/\sqrt{5} & 2/\sqrt{5} \end{bmatrix}$

Solution: Denote the columns of P by u1 and u2. Then

$A = 8\,\mathbf{u}_1\mathbf{u}_1^T + 3\,\mathbf{u}_2\mathbf{u}_2^T$

SPECTRAL DECOMPOSITION

To verify the decomposition of A, compute

$\mathbf{u}_1\mathbf{u}_1^T = \begin{bmatrix} 2/\sqrt{5} \\ 1/\sqrt{5} \end{bmatrix} \begin{bmatrix} 2/\sqrt{5} & 1/\sqrt{5} \end{bmatrix} = \begin{bmatrix} 4/5 & 2/5 \\ 2/5 & 1/5 \end{bmatrix}$

and

$\mathbf{u}_2\mathbf{u}_2^T = \begin{bmatrix} -1/\sqrt{5} \\ 2/\sqrt{5} \end{bmatrix} \begin{bmatrix} -1/\sqrt{5} & 2/\sqrt{5} \end{bmatrix} = \begin{bmatrix} 1/5 & -2/5 \\ -2/5 & 4/5 \end{bmatrix}$

so that

$8\,\mathbf{u}_1\mathbf{u}_1^T + 3\,\mathbf{u}_2\mathbf{u}_2^T = \begin{bmatrix} 32/5 + 3/5 & 16/5 - 6/5 \\ 16/5 - 6/5 & 8/5 + 12/5 \end{bmatrix} = \begin{bmatrix} 7 & 2 \\ 2 & 4 \end{bmatrix} = A$
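The same check takes a few lines in NumPy; a minimal sketch with the vectors from Example 2:

```python
import numpy as np

u1 = np.array([2., 1.]) / np.sqrt(5)
u2 = np.array([-1., 2.]) / np.sqrt(5)

# Summing the weighted rank-1 projections recovers A.
A = 8 * np.outer(u1, u1) + 3 * np.outer(u2, u2)
print(A)  # [[7. 2.], [2. 4.]]
```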