ENGG2013 Unit 19 The principal axes theorem


ENGG2013 Unit 19: The principal axes theorem. March 2011.

Outline
- Special matrices: symmetric, skew-symmetric, orthogonal
- Principal axes theorem
- Application to conic sections
kshum ENGG2013

Diagonalizable?
A square matrix M is called diagonalizable if we can find an invertible matrix, say P, such that the product P^{-1} M P is a diagonal matrix. Some matrices cannot be diagonalized.
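The definition can be checked numerically. The sketch below (assuming numpy is available; the matrix M is a made-up example, not from the slides) builds P from the eigenvectors of M and verifies that P^{-1} M P comes out diagonal.

```python
import numpy as np

# A diagonalizable example: its eigenvector matrix P is invertible,
# so P^{-1} M P is diagonal.  (M is an illustrative choice.)
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(M)          # columns of P are eigenvectors
D = np.linalg.inv(P) @ M @ P           # should equal diag(eigvals)

off_diagonal = D - np.diag(np.diag(D)) # the off-diagonal part is ~0
```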

Theorem An nn matrix M is diagonalizable if and only if we can find n linear independent eigenvectors of M. Proof: For concreteness, let’s just consider the 33 case. The three columns are linearly independent because the matrix is invertible \begin{bmatrix} x_1 & x_2 & x_3\\ y_1 & y_2 & y_3\\ z_1 & z_2 & z_3\\ \end{bmatrix}^{-1}\begin{bmatrix} a & b & c\\ d & e & f\\ g & h & i \end{bmatrix} \begin{bmatrix} \end{bmatrix}=\begin{bmatrix} \lambda_1 & & \\ & \lambda_2& \\ & & \lambda_3 \end{bmatrix} by definition kshum ENGG2013

Proof continued
Multiplying both sides on the left by P, the condition is equivalent to
\begin{bmatrix} a & b & c\\ d & e & f\\ g & h & i \end{bmatrix}\begin{bmatrix} x_1 & x_2 & x_3\\ y_1 & y_2 & y_3\\ z_1 & z_2 & z_3 \end{bmatrix}=\begin{bmatrix} x_1 & x_2 & x_3\\ y_1 & y_2 & y_3\\ z_1 & z_2 & z_3 \end{bmatrix}\begin{bmatrix} \lambda_1 & & \\ & \lambda_2 & \\ & & \lambda_3 \end{bmatrix}
which says precisely that each column of P is an eigenvector of M: M x_k = \lambda_k x_k for k = 1, 2, 3.

Complex eigenvalues
There are some matrices whose eigenvalues are complex numbers. For example, the matrix
\frac{1}{\sqrt{2}}\begin{bmatrix} 1 & -1\\ 1 & 1 \end{bmatrix}
which represents rotation by 45 degrees counter-clockwise, has the complex eigenvalues \frac{1 \pm i}{\sqrt{2}}.
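A quick numerical check (assuming numpy; a sketch, not part of the original slides) confirms that the 45-degree rotation matrix has complex eigenvalues of absolute value 1.

```python
import numpy as np

# Rotation by 45 degrees counter-clockwise; its eigenvalues are the
# complex numbers cos(45°) ± i sin(45°), not real numbers.
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
eigvals = np.linalg.eigvals(R)   # complex conjugate pair e^{±i pi/4}
```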

Theorem If an nn matrix M has n distinct eigenvalues, then M is diagonalizable \begin{bmatrix} a & b & c\\ d & e & f\\ g & h & i \end{bmatrix}\begin{bmatrix} x_k\\ y_k\\ z_k\\ \end{bmatrix}= \lambda_k \begin{bmatrix} \end{bmatrix} \text{ for }k=1,2,3 The converse is false: There is some diagonalizable matrix with repeated eigenvalues. kshum ENGG2013

Matrices in special form
- Symmetric: A^T = A.
- Skew-symmetric: A^T = -A.
- Orthogonal: A^T = A^{-1}, or equivalently A^T A = I.
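The three defining conditions translate directly into checks on small matrices. The example matrices below are illustrative choices, not taken from the slides; numpy is assumed.

```python
import numpy as np

# Illustrative (made-up) examples of the three special forms.
S = np.array([[1.0, 2.0], [2.0, 3.0]])    # symmetric: S^T = S
K = np.array([[0.0, 5.0], [-5.0, 0.0]])   # skew-symmetric: K^T = -K
Q = np.array([[0.0, 1.0], [1.0, 0.0]])    # orthogonal (also symmetric): Q^T Q = I

is_symmetric = np.array_equal(S.T, S)
is_skew_symmetric = np.array_equal(K.T, -K)
is_orthogonal = np.allclose(Q.T @ Q, np.eye(2))
```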

Orthogonal matrix
A matrix M is called orthogonal if M^T M = I. The (k,k) entry of the product M^T M is the dot product of the k-th column of M with itself, so each column of M has norm 1 (dot product = 1):
\begin{bmatrix} m_{11} & m_{21} & m_{31} & \cdots & m_{n1}\\ m_{12} & m_{22} & m_{32} & \cdots & m_{n2}\\ m_{13} & m_{23} & m_{33} & \cdots & m_{n3}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ m_{1n} & m_{2n} & m_{3n} & \cdots & m_{nn} \end{bmatrix}\begin{bmatrix} m_{11} & m_{12} & m_{13} & \cdots & m_{1n}\\ m_{21} & m_{22} & m_{23} & \cdots & m_{2n}\\ m_{31} & m_{32} & m_{33} & \cdots & m_{3n}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ m_{n1} & m_{n2} & m_{n3} & \cdots & m_{nn} \end{bmatrix} = I

Orthogonal matrix (continued)
For an orthogonal matrix M (that is, M^T M = I), the off-diagonal (j,k) entry of M^T M is the dot product of the j-th and k-th columns of M, and it equals 0 whenever j ≠ k. So any two distinct columns of M are orthogonal (dot product = 0).

Principal axes theorem
Given any n×n symmetric matrix A, we have:
- The eigenvalues of A are real.
- A is diagonalizable.
- We can pick n mutually perpendicular (i.e., orthogonal) eigenvectors.
Proof omitted. See http://en.wikipedia.org/wiki/Principal_axis_theorem
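This is exactly the guarantee behind numpy.linalg.eigh, which is specialized for symmetric matrices: it returns real eigenvalues and an orthogonal matrix of eigenvectors. A sketch with a made-up symmetric matrix (assuming numpy):

```python
import numpy as np

# eigh is for symmetric (Hermitian) matrices: real eigenvalues in
# ascending order, and an orthogonal matrix Q of eigenvectors, as the
# principal axes theorem guarantees.  A is an illustrative example.
A = np.array([[6.0, 2.0],
              [2.0, 3.0]])
eigvals, Q = np.linalg.eigh(A)

cols_orthonormal = np.allclose(Q.T @ Q, np.eye(2))  # perpendicular unit eigenvectors
reconstructed = Q @ np.diag(eigvals) @ Q.T          # A = Q D Q^T
```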

Two sufficient conditions for diagonalizability
Either of the following is sufficient for a matrix to be diagonalizable:
- it has n distinct eigenvalues;
- it is symmetric, skew-symmetric, or orthogonal.


Similarity
Definition: We say that two n×n matrices A and B are similar if we can find an invertible matrix S such that B = S^{-1} A S.
Example: \begin{bmatrix} -1 & 0\\ 2/3 & 1 \end{bmatrix} and \begin{bmatrix} -1 & 0\\ 0 & 1 \end{bmatrix} are similar, because
\begin{bmatrix} 3 & 0\\ -1 & 1 \end{bmatrix}^{-1} \begin{bmatrix} -1 & 0\\ 2/3 & 1 \end{bmatrix} \begin{bmatrix} 3 & 0\\ -1 & 1 \end{bmatrix} = \begin{bmatrix} -1 & 0\\ 0 & 1 \end{bmatrix}
The notion of diagonalization can be phrased in terms of similarity: a matrix A is diagonalizable if and only if A is similar to a diagonal matrix.
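The similarity example can be verified numerically. The sketch below (assuming numpy) conjugates the matrix A from the example by S and checks that the product S^{-1} A S is the diagonal matrix diag(-1, 1).

```python
import numpy as np

# Conjugating A by S: the result is diagonal, so A is similar to
# (indeed, diagonalized into) diag(-1, 1).
S = np.array([[3.0, 0.0],
              [-1.0, 1.0]])
A = np.array([[-1.0, 0.0],
              [2.0 / 3.0, 1.0]])
B = np.linalg.inv(S) @ A @ S
```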

More examples
\begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 9 \end{bmatrix} is similar to \begin{bmatrix} 1 & 3 & 2\\ 7 & 9 & 8\\ 4 & 6 & 5 \end{bmatrix}, because
\begin{bmatrix} 1 & 0 & 0\\ 0 & 0 & 1\\ 0 & 1 & 0 \end{bmatrix}^{-1}\begin{bmatrix} 1 & 2 & 3\\ 4 & 5 & 6\\ 7 & 8 & 9 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0\\ 0 & 0 & 1\\ 0 & 1 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 3 & 2\\ 7 & 9 & 8\\ 4 & 6 & 5 \end{bmatrix}
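Conjugation by a permutation matrix can likewise be checked in a few lines (assuming numpy): conjugating by the matrix that swaps indices 2 and 3 swaps rows 2,3 and columns 2,3 simultaneously.

```python
import numpy as np

# P swaps the second and third standard basis vectors, so
# P^{-1} M P swaps rows 2,3 and columns 2,3 of M at the same time.
P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])
M = np.arange(1, 10).reshape(3, 3)     # [[1,2,3],[4,5,6],[7,8,9]]
result = np.linalg.inv(P) @ M @ P
```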

Application to conic sections
- Ellipse: x^2/a + y^2/b = 1.
- Hyperbola: x^2/a - y^2/b = 1.
- Parabola: y = ax^2.

Application to conic sections
Is x^2 - 4xy + 2y^2 = 1 an ellipse, or a hyperbola?
Rewrite using a symmetric matrix:
\begin{bmatrix} x & y \end{bmatrix}\begin{bmatrix} 1 & -2\\ -2 & 2 \end{bmatrix}\begin{bmatrix} x\\ y \end{bmatrix} = 1
Find the characteristic polynomial: \det\begin{bmatrix} 1-\lambda & -2\\ -2 & 2-\lambda \end{bmatrix} = \lambda^2 - 3\lambda - 2.
Solve for the eigenvalues: \lambda = \frac{3 \pm \sqrt{17}}{2}, one positive and one negative.
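The classification step can be reproduced numerically (a sketch assuming numpy): the symmetric matrix of the quadratic form x^2 - 4xy + 2y^2 has one positive and one negative eigenvalue, so the conic is a hyperbola.

```python
import numpy as np

# Symmetric matrix of the quadratic form x^2 - 4xy + 2y^2.
# Eigenvalue signs classify the conic: one positive and one
# negative eigenvalue means a hyperbola.
A = np.array([[1.0, -2.0],
              [-2.0, 2.0]])
eigvals = np.linalg.eigvalsh(A)   # ascending: (3 - sqrt(17))/2, (3 + sqrt(17))/2
```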

Application to conic sections
Diagonalize: by the principal axes theorem we can pick an orthogonal matrix P with
\mathbf{P}^{-1} \begin{bmatrix} 1 & -2\\ -2 & 2 \end{bmatrix} \mathbf{P} = \begin{bmatrix} \lambda_1 & 0\\ 0 & \lambda_2 \end{bmatrix}
Change coordinates to \begin{bmatrix} u\\ v \end{bmatrix} = \mathbf{P}^{-1}\begin{bmatrix} x\\ y \end{bmatrix}; the equation becomes \lambda_1 u^2 + \lambda_2 v^2 = 1. One eigenvalue is positive and the other is negative, so this is a hyperbola.

Plot: x^2 - 4xy + 2y^2 = 1, a hyperbola.

2x^2 + 2xy + 2y^2 = 1
Rewrite using a symmetric matrix:
\begin{bmatrix} x & y \end{bmatrix}\begin{bmatrix} 2 & 1\\ 1 & 2 \end{bmatrix}\begin{bmatrix} x\\ y \end{bmatrix} = 1
Compute the characteristic polynomial: (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3.
Find the eigenvalues: \lambda = 1 and \lambda = 3, both positive.

2x^2 + 2xy + 2y^2 = 1
Diagonalize: the columns of P are eigenvectors, normalized to norm 1, e.g.
\mathbf{P} = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1\\ -1 & 1 \end{bmatrix}
Change of variables: with \begin{bmatrix} u\\ v \end{bmatrix} = \mathbf{P}^{-1}\begin{bmatrix} x\\ y \end{bmatrix}, the equation becomes u^2 + 3v^2 = 1, an ellipse.
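The same computation can be done numerically (assuming numpy): eigh returns the eigenvalues 1 and 3 in ascending order, with orthonormal eigenvectors as the columns of Q.

```python
import numpy as np

# Symmetric matrix of the quadratic form 2x^2 + 2xy + 2y^2.
# Both eigenvalues are positive, so the conic is an ellipse,
# and the columns of Q point along its principal axes.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(A)
```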

Plot: 2x^2 + 2xy + 2y^2 = 1, an ellipse with principal axes along the u and v directions.