Diagonalization Hung-yi Lee

Review
If Av = λv, where v is a nonzero vector and λ is a scalar, then v is an eigenvector of A and λ is an eigenvalue of A that corresponds to v.
The eigenvectors corresponding to λ are the nonzero solutions of (A − λIₙ)v = 0.
Eigenspace of λ: the eigenvectors corresponding to λ together with 0, i.e., Null(A − λIₙ).
A scalar t is an eigenvalue of A if and only if det(A − tIₙ) = 0.
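A quick numerical check of these definitions, as a minimal sketch in NumPy (the matrix here is an arbitrary illustration, not one from the lecture):

```python
import numpy as np

# Arbitrary illustrative matrix (not from the lecture).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigvals[i] is an eigenvalue of A; eigvecs[:, i] is a corresponding eigenvector.
eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # A v = lambda v (up to floating-point error) ...
    assert np.allclose(A @ v, lam * v)
    # ... and v lies in Null(A - lambda I_n).
    assert np.allclose((A - lam * np.eye(2)) @ v, 0)

print(eigvals)  # the eigenvalues 3 and 1 (order may vary)
```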

Review
The characteristic polynomial of A is det(A − tIₙ).
Factorization (up to a sign): det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ, where mᵢ is the multiplicity of the eigenvalue λᵢ.
Eigenvalues: λ₁, λ₂, …, λₖ.
Eigenspace dimensions: d₁ ≤ m₁, d₂ ≤ m₂, …, dₖ ≤ mₖ. The dimension of each eigenspace is at most the multiplicity of its eigenvalue.
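The multiplicities and eigenspace dimensions can be inspected symbolically; a sketch with SymPy, using an arbitrary matrix with a repeated eigenvalue (not from the lecture):

```python
from sympy import Matrix, symbols, factor

t = symbols('t')

# Arbitrary example with a repeated eigenvalue (not from the lecture).
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

# Characteristic polynomial det(A - t*I_n) and its factorization.
p = (A - t * Matrix.eye(3)).det()
print(factor(p))   # -(t - 3)*(t - 2)**2

# For each eigenvalue: lambda_i, multiplicity m_i, eigenspace dimension d_i.
for lam, mult, basis in A.eigenvects():
    print(lam, mult, len(basis))   # here d_i = 1 < m_i = 2 for the eigenvalue 2
```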

Outline
An n×n matrix A is called diagonalizable if A = PDP⁻¹, where D is an n×n diagonal matrix and P is an n×n invertible matrix.
Questions: Is a given matrix A diagonalizable? If yes, how do we find D and P?
Application 1: growth (powers of a matrix). Application 2: linear operators.
Reference: Textbook, Section 5.3

Diagonalizable
An n×n matrix A is called diagonalizable if A = PDP⁻¹, where D is an n×n diagonal matrix and P is an n×n invertible matrix.
Not all matrices are diagonalizable. Consider a nonzero matrix A with A² = 0 (for example, a 2×2 matrix whose only nonzero entry is a 1 above the diagonal). If A = PDP⁻¹ for some invertible P and diagonal D, then A² = PD²P⁻¹ = 0, so D² = 0. Because D is diagonal, D² = 0 forces D = 0, and hence A = PDP⁻¹ = 0, a contradiction. So such an A is not diagonalizable.
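A concrete check, assuming the standard 2×2 nilpotent matrix as a stand-in for the slide's example (the transcript does not show the matrix itself); SymPy's exact arithmetic confirms it cannot be diagonalized:

```python
from sympy import Matrix

# Assumed stand-in: nonzero A with A*A = 0.
A = Matrix([[0, 1],
            [0, 0]])
print(A * A)                  # Matrix([[0, 0], [0, 0]])

# Only one independent eigenvector exists for the eigenvalue 0 (multiplicity 2),
# so no invertible P of eigenvectors can be formed.
print(A.eigenvects())         # [(0, 2, [Matrix([[1], [0]])])]
print(A.is_diagonalizable())  # False
```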

Diagonalizable
Write P = [p₁ ⋯ pₙ] (columns) and D = diag(d₁, …, dₙ).
If A is diagonalizable with A = PDP⁻¹, then AP = PD.
AP = [Ap₁ ⋯ Apₙ]
PD = [P(d₁e₁) ⋯ P(dₙeₙ)] = [d₁Pe₁ ⋯ dₙPeₙ] = [d₁p₁ ⋯ dₙpₙ]
Comparing columns, Apᵢ = dᵢpᵢ: each column pᵢ of P is an eigenvector of A corresponding to the eigenvalue dᵢ.
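The column-by-column identity Apᵢ = dᵢpᵢ is easy to see numerically; a sketch with an arbitrary invertible P and diagonal D (not taken from the lecture):

```python
import numpy as np

# Arbitrary invertible P and diagonal D (not from the lecture).
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])
D = np.diag([4.0, 2.0])
A = P @ D @ np.linalg.inv(P)

# AP = PD column by column: A p_i = d_i p_i.
for i in range(2):
    p_i, d_i = P[:, i], D[i, i]
    assert np.allclose(A @ p_i, d_i * p_i)
print("each column of P is an eigenvector of A for the matching diagonal entry of D")
```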

Diagonalizable
With P = [p₁ ⋯ pₙ] and D = diag(d₁, …, dₙ) as above, A = PDP⁻¹ means each pᵢ is an eigenvector of A corresponding to the eigenvalue dᵢ. Therefore:
A is diagonalizable
⟺ there are n eigenvectors of A that form an invertible matrix
⟺ A has n linearly independent eigenvectors
⟺ the eigenvectors of A can form a basis for Rⁿ.

Diagonalizable
How to diagonalize a matrix A (A = PDP⁻¹):
Step 1: Find n linearly independent eigenvectors of A, if possible, and use them as the columns of an invertible matrix P.
Step 2: The eigenvalues corresponding to the eigenvectors in P, in the same order, form the diagonal matrix D.
Is the diagonalization unique? No. For instance, the eigenvalues can repeat in Step 2, and the eigenvectors in Step 1 can be chosen and ordered in different ways.
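A sketch of the two steps using SymPy, whose diagonalize() performs exactly this construction (the matrix is an arbitrary diagonalizable example, not from the lecture):

```python
from sympy import Matrix

# Arbitrary diagonalizable example (not from the lecture).
A = Matrix([[4, 1],
            [2, 3]])

# Step 1: the columns of P are n linearly independent eigenvectors of A.
# Step 2: the diagonal of D holds the corresponding eigenvalues, in the same order.
P, D = A.diagonalize()
print(P, D)

# A = P D P^{-1} holds exactly.
assert A == P * D * P.inv()
```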

Diagonalizable
A set of eigenvectors that correspond to distinct eigenvalues is linearly independent.
det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ
Eigenvalues: λ₁, λ₂, …, λₖ
Eigenspace dimensions: d₁ ≤ m₁, d₂ ≤ m₂, …, dₖ ≤ mₖ
Eigenvectors chosen from the eigenspaces of distinct eigenvalues are independent, as illustrated in the sketch below.
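A numerical illustration of this independence, as a sketch with an arbitrary matrix having three distinct eigenvalues (not from the lecture):

```python
import numpy as np

# Arbitrary upper-triangular matrix with distinct eigenvalues 1, 2, 3.
A = np.diag([1.0, 2.0, 3.0]) + np.triu(np.ones((3, 3)), k=1)

eigvals, eigvecs = np.linalg.eig(A)
print(np.round(eigvals, 6))            # 1, 2, 3 (order may vary)

# The eigenvectors collected as columns form a full-rank, hence independent, set.
print(np.linalg.matrix_rank(eigvecs))  # 3
```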

Diagonalizable
A set of eigenvectors that correspond to distinct eigenvalues is linearly independent.
Proof sketch: let v₁, v₂, …, vₘ be eigenvectors corresponding to the distinct eigenvalues λ₁, λ₂, …, λₘ, and assume the set is dependent. Take the smallest k such that vₖ is a linear combination of the earlier (independent) vectors:
vₖ = c₁v₁ + c₂v₂ + ⋯ + cₖ₋₁vₖ₋₁
Applying A to both sides gives Avₖ = c₁Av₁ + ⋯ + cₖ₋₁Avₖ₋₁, i.e., λₖvₖ = c₁λ₁v₁ + c₂λ₂v₂ + ⋯ + cₖ₋₁λₖ₋₁vₖ₋₁.
Multiplying the first equation by λₖ gives λₖvₖ = c₁λₖv₁ + c₂λₖv₂ + ⋯ + cₖ₋₁λₖvₖ₋₁.
Subtracting: 0 = c₁(λ₁ − λₖ)v₁ + c₂(λ₂ − λₖ)v₂ + ⋯ + cₖ₋₁(λₖ₋₁ − λₖ)vₖ₋₁.
Since vₖ ≠ 0, the coefficients c₁, …, cₖ₋₁ are not all zero; but v₁, …, vₖ₋₁ are independent, so cᵢ(λᵢ − λₖ) = 0 for every i, and any i with cᵢ ≠ 0 would force λᵢ = λₖ, the same eigenvalue twice. That contradicts the eigenvalues being distinct, so the set is independent.

Independent Eigenvectors
det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ
Eigenvalues: λ₁, λ₂, …, λₖ
Eigenspaces: a basis for the eigenspace of λ₁, a basis for the eigenspace of λ₂, …, a basis for the eigenspace of λₖ.
Collecting these bases together yields a set of independent eigenvectors, and you can't find more independent eigenvectors than that.

Diagonalizable - Example
Diagonalize a given matrix A (shown on the slide).
Its characteristic polynomial is (t + 1)²(t − 3), so the eigenvalues are 3 and −1.
For the eigenvalue 3, find a basis B₁ of its eigenspace; for the eigenvalue −1, find a basis B₂ of its eigenspace.
If B₁ ∪ B₂ contains three independent eigenvectors, then A = PDP⁻¹, where the columns of P are these eigenvectors and the diagonal entries of D are the corresponding eigenvalues.
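The slide's matrix itself is not recoverable from this transcript, so the sketch below builds a stand-in with the same characteristic polynomial (t + 1)²(t − 3) and diagonalizes it with SymPy:

```python
from sympy import Matrix, symbols, factor

t = symbols('t')

# Stand-in with eigenvalues 3, -1, -1 (the slide's actual matrix is not shown here).
P0 = Matrix([[1, 1, 0],
             [0, 1, 1],
             [1, 0, 1]])
A = P0 * Matrix.diag(3, -1, -1) * P0.inv()

print(factor((A - t * Matrix.eye(3)).det()))   # -(t - 3)*(t + 1)**2

P, D = A.diagonalize()   # columns of P: eigenvectors; diagonal of D: eigenvalues
assert A == P * D * P.inv()
print(D)
```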

Application of Diagonalization
If A is diagonalizable, A = PDP⁻¹ and therefore Aᵐ = PDᵐP⁻¹.
Example: at each time step a student either studies ("Study") or browses Facebook ("FB"). From Study, 85% stay in Study and 15% switch to FB; from FB, 3% switch to Study and 97% stay in FB. Starting from Study, the distribution is (0.85, 0.15) after one step, (0.727, 0.273) after two steps, and so on.

The transition matrix A (columns indexed by the "from" state, rows by the "to" state, in the order Study, FB) is
A = [[0.85, 0.03], [0.15, 0.97]]
Starting from Study, the distribution evolves as (1, 0) → A(1, 0) = (0.85, 0.15) → A²(1, 0) = (0.727, 0.273) → …
Since A = PDP⁻¹, the distribution after m steps is Aᵐ(1, 0) with Aᵐ = PDᵐP⁻¹.
Note: stochastic matrices need not be diagonalizable; see http://mathoverflow.net/questions/51887/non-diagonalizable-doubly-stochastic-matrices
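A sketch of this computation in NumPy, with the transition matrix read off the diagram above (column j holds the probabilities of moving out of state j):

```python
import numpy as np

# Columns: "from Study", "from FB"; rows: "to Study", "to FB".
A = np.array([[0.85, 0.03],
              [0.15, 0.97]])

# Diagonalize numerically: columns of P are eigenvectors, eigvals the eigenvalues.
eigvals, P = np.linalg.eig(A)
assert np.allclose(A, P @ np.diag(eigvals) @ np.linalg.inv(P))

def A_power(m):
    """A**m computed as P D**m P^{-1}."""
    return P @ np.diag(eigvals ** m) @ np.linalg.inv(P)

x0 = np.array([1.0, 0.0])   # start in "Study"
print(A_power(1) @ x0)      # ~[0.85  0.15]
print(A_power(2) @ x0)      # ~[0.727 0.273]
```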

Diagonalizable
Diagonalize a given matrix (worked on the slide): row-reduce A − λIₙ to its reduced row echelon form (RREF) to find a basis of each eigenspace; the resulting eigenvectors form an invertible P.

Application of Diagonalization
When m → ∞, Aᵐ → [[1/6, 1/6], [5/6, 5/6]].
Both columns are the same, so in the long run the initial condition does not influence the result: the distribution approaches (1/6, 5/6) regardless of the starting state.
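A quick check of this limit (a sketch; 1/6 ≈ 0.1667 and 5/6 ≈ 0.8333):

```python
import numpy as np

A = np.array([[0.85, 0.03],
              [0.15, 0.97]])

# For large m, A**m has both columns equal to the steady state (1/6, 5/6).
print(np.linalg.matrix_power(A, 200))
# ~[[0.1667 0.1667]
#   [0.8333 0.8333]]

# Any starting distribution ends up at the same place:
print(np.linalg.matrix_power(A, 200) @ np.array([0.3, 0.7]))  # ~[0.1667 0.8333]
```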

Test for a Diagonalizable Matrix
An n×n matrix A is diagonalizable if and only if both of the following conditions are met:
1. The characteristic polynomial of A factors into a product of linear factors: det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ.
2. For each eigenvalue λ of A, the multiplicity of λ equals the dimension of the corresponding eigenspace.
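This test can be carried out directly with SymPy's exact eigen-decomposition; a sketch (the helper name is ours, and the example matrix is the one from Example 1 below):

```python
from sympy import Matrix

def is_diagonalizable_over_R(A: Matrix) -> bool:
    """The slide's test, sketched over the reals: the characteristic polynomial
    must split into real linear factors, and each eigenvalue's multiplicity
    must equal the dimension of its eigenspace."""
    total = 0
    for lam, mult, basis in A.eigenvects():
        if not lam.is_real:        # a non-real root: condition 1 fails over R
            return False
        if len(basis) != mult:     # d_i < m_i: condition 2 fails
            return False
        total += mult
    return total == A.rows

# Example 1's standard matrix (introduced below) passes the test.
A = Matrix([[8, 9, 0],
            [-6, -7, 0],
            [3, 3, -1]])
print(is_diagonalizable_over_R(A))   # True
print(A.is_diagonalizable())         # True (SymPy's built-in check agrees here)
```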

Independent Eigenvectors
An n×n matrix A is diagonalizable
⟺ the eigenvectors of A can form a basis for Rⁿ
⟺ det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ and the eigenspace dimensions satisfy d₁ = m₁, d₂ = m₂, …, dₖ = mₖ.

This lecture
(Diagram) A linear operator T sends v to T(v). With a properly selected basis B, the coordinate vector [v]_B is sent to [T(v)]_B by the matrix [T]_B, which is simple (diagonal); the matrices B and B⁻¹ convert between v and [v]_B.
Reference: Chapter 5.4

Independent Eigenvectors
Review: an n×n matrix A is diagonalizable (A = PDP⁻¹, with the eigenvectors of A as the columns of P and the corresponding eigenvalues on the diagonal of D)
⟺ the eigenvectors of A can form a basis for Rⁿ.
det(A − tIₙ) = (t − λ₁)^m₁ (t − λ₂)^m₂ ⋯ (t − λₖ)^mₖ; eigenvalues λ₁, λ₂, …, λₖ; collecting a basis of each eigenspace gives the independent eigenvectors.

Diagonalization of Linear Operator
Example 1: T(x₁, x₂, x₃) = (8x₁ + 9x₂, −6x₁ − 7x₂, 3x₁ + 3x₂ − x₃)
The standard matrix is A = [[8, 9, 0], [−6, −7, 0], [3, 3, −1]].
The characteristic polynomial det(A − tI₃) is (t + 1)²(t − 2), so the eigenvalues are −1 and 2.
Eigenvalue −1: eigenspace basis B₁ = {(−1, 1, 0), (0, 0, 1)}
Eigenvalue 2: eigenspace basis B₂ = {(3, −2, 1)}
B₁ ∪ B₂ is a basis of R³, so T is diagonalizable.
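The claims in Example 1 can be verified exactly with SymPy (a sketch):

```python
from sympy import Matrix, symbols, factor

t = symbols('t')

A = Matrix([[8, 9, 0],
            [-6, -7, 0],
            [3, 3, -1]])

print(factor((A - t * Matrix.eye(3)).det()))   # -(t - 2)*(t + 1)**2

# Eigenvalue -1 has a 2-dimensional eigenspace, eigenvalue 2 a 1-dimensional one.
for lam, mult, basis in A.eigenvects():
    print(lam, mult, [list(v) for v in basis])
```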

Diagonalization of Linear Operator
Example 2: T(x₁, x₂, x₃) = (−x₁ + x₂ + 2x₃, x₁ − x₂, 0)
The standard matrix is A = [[−1, 1, 2], [1, −1, 0], [0, 0, 0]].
The characteristic polynomial det(A − tI₃) is t²(t + 2), so the eigenvalues are 0 and −2.
The reduced row echelon form of A − 0I₃ = A is [[1, −1, 0], [0, 0, 1], [0, 0, 0]], so the eigenspace corresponding to the eigenvalue 0 has dimension 1 < 2 = the algebraic multiplicity of the eigenvalue 0.
Therefore T is not diagonalizable.
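The corresponding check for Example 2, again a SymPy sketch:

```python
from sympy import Matrix, symbols, factor

t = symbols('t')

A = Matrix([[-1, 1, 2],
            [1, -1, 0],
            [0, 0, 0]])

print(factor((A - t * Matrix.eye(3)).det()))   # -t**2*(t + 2)

print(A.rref()[0])            # RREF of A - 0*I = A: two pivots, so Null(A) has dimension 1
print(len(A.nullspace()))     # 1  < 2 = multiplicity of the eigenvalue 0
print(A.is_diagonalizable())  # False
```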

Diagonalization of Linear Operator
If a linear operator T is diagonalizable, the eigenvectors form the good coordinate system: take the basis B of eigenvectors, so that B (as a matrix) equals P and B⁻¹ equals P⁻¹. Then [T]_B = D is simple (diagonal) and sends [v]_B to [T(v)]_B, while in standard coordinates A = PDP⁻¹ sends v to T(v).

Diagonalization of Linear Operator
Back to Example 1: T(x₁, x₂, x₃) = (8x₁ + 9x₂, −6x₁ − 7x₂, 3x₁ + 3x₂ − x₃), with B₁ = {(−1, 1, 0), (0, 0, 1)} for the eigenvalue −1 and B₂ = {(3, −2, 1)} for the eigenvalue 2.
Taking the eigenvectors of B₁ ∪ B₂ as the columns of P = [[−1, 0, 3], [1, 0, −2], [0, 1, 1]], the matrix of T with respect to this basis is
[T]_B = D = [[−1, 0, 0], [0, −1, 0], [0, 0, 2]] = P⁻¹AP, where A = [[8, 9, 0], [−6, −7, 0], [3, 3, −1]] is the standard matrix.
[T]_B sends [v]_B to [T(v)]_B, while A sends v to T(v).
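Finally, a sketch verifying that this change of basis really diagonalizes the standard matrix:

```python
from sympy import Matrix

A = Matrix([[8, 9, 0],
            [-6, -7, 0],
            [3, 3, -1]])

# Columns of P: the eigenvectors in B1 and B2, in that order.
P = Matrix([[-1, 0, 3],
            [1, 0, -2],
            [0, 1, 1]])

D = P.inv() * A * P
print(D)   # Matrix([[-1, 0, 0], [0, -1, 0], [0, 0, 2]])
```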