CHAPTER SIX: Eigenvalues
Outline:
- Systems of linear ODEs (omitted)
- Diagonalization
- Hermitian matrices
- Quadratic forms
- Positive definite matrices
Motivation:
- To simplify a linear dynamical system as much as possible.
- To understand the characteristics of a linear system, e.g., the long-term behavior of its dynamics.
Example: In a town, each year 30% of married women get divorced and 20% of single women get married. In the 1st year there are 8000 married women and 2000 single women, and the total number of women remains constant. Let w_i = (m_i, s_i)^T be the numbers of women at year i, where m_i and s_i represent married and single women respectively. Then w_{i+1} = A w_i with A = [0.7 0.2; 0.3 0.8] and w_1 = (8000, 2000)^T.
Computing the iterates, one finds that w_i converges to (4000, 6000)^T. Question: Why does w_i converge? Why does it converge to the same limit even when the initial condition is different?
Ans: Choose a basis of eigenvectors of A, e.g. x_1 = (2, 3)^T with A x_1 = x_1 and x_2 = (1, -1)^T with A x_2 = 0.5 x_2. Given an initial w_1 = c_1 x_1 + c_2 x_2 for some scalars c_1, c_2 (for example, w_1 = (8000, 2000)^T = 2000 x_1 + 4000 x_2), we get w_i = A^(i-1) w_1 = c_1 x_1 + c_2 (0.5)^(i-1) x_2 → c_1 x_1. Question: How does one know to choose such a basis?
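The convergence above can be checked numerically; a minimal sketch, assuming the transition matrix reconstructed from the stated percentages:

```python
import numpy as np

# Transition matrix reconstructed from the stated rates: each year 30% of
# married women divorce and 20% of single women marry.
A = np.array([[0.7, 0.2],
              [0.3, 0.8]])
w = np.array([8000.0, 2000.0])   # (married, single) in year 1

for _ in range(100):             # iterate w_{i+1} = A w_i
    w = A @ w

# The iterates settle on the eigenvector for eigenvalue 1, scaled so the
# total population of 10000 is preserved.
print(w)   # → [4000. 6000.]
```

The component along the second eigenvector decays like (0.5)^i, which is why every starting vector with the same total converges to the same limit.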
Def: Let A ∈ C^(n×n). A scalar λ is said to be an eigenvalue or characteristic value of A if there exists a nonzero vector x such that Ax = λx. The vector x is said to be an eigenvector or characteristic vector belonging to λ. The pair (λ, x) is called an eigenpair of A. Question: Given A, how do we compute eigenvalues and eigenvectors?
(λ, x) is an eigenpair of A ⇔ (A − λI)x = 0 for some x ≠ 0 ⇔ A − λI is singular ⇔ det(A − λI) = 0. Note that p(λ) = det(A − λI) is a polynomial, called the characteristic polynomial of A, of degree n in λ. Thus, by the Fundamental Theorem of Algebra, A has exactly n eigenvalues, counting multiplicities. Any nonzero x in N(A − λI) is an eigenvector associated with the eigenvalue λ, while N(A − λI) is the eigenspace of A for λ.
Example: Let A be the given matrix; 2 and 3 are the eigenvalues of A. To find the eigenspace of 2 (i.e., N(A − 2I)), solve (A − 2I)x = 0.
To find the eigenspace of 3 (i.e., N(A − 3I)), solve (A − 3I)x = 0 in the same way.
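The eigenpair computation can be sketched numerically. The slide's matrix did not survive extraction, so the matrix below is a stand-in (an assumption) that also has eigenvalues 2 and 3:

```python
import numpy as np

# Stand-in matrix with eigenvalues 2 and 3 (not the original slide matrix).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Eigenvalues = roots of det(A - lambda*I); numpy computes them directly.
evals, evecs = np.linalg.eig(A)

# Eigenspace of 2 = N(A - 2I): pick the eigenvector for the smaller eigenvalue
# and check (A - 2I)x = 0, i.e. A x = 2 x.
v = evecs[:, np.argmin(evals)]
print(np.allclose(A @ v, 2.0 * v))   # True
```

Each column of `evecs` spans the eigenspace of the corresponding entry of `evals`, which is exactly the by-hand procedure of solving (A − λI)x = 0.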
Let A ∈ C^(n×n). Then λ is an eigenvalue of A ⇔ (A − λI)x = 0 has a nontrivial solution ⇔ A − λI is singular ⇔ A − λI loses rank.
Let A ∈ R^(n×n). If λ is an eigenvalue of A with eigenvector x, then taking complex conjugates in Ax = λx gives A x̄ = λ̄ x̄. This means that (λ̄, x̄) is also an eigenpair of A.
Let det(A − λI) = (λ_1 − λ)(λ_2 − λ)···(λ_n − λ), where λ_1, …, λ_n are the eigenvalues of A. (i) Letting λ = 0 gives det(A) = λ_1 λ_2 ··· λ_n. (ii) Comparing the coefficients of λ^(n−1) on both sides, we have tr(A) = λ_1 + λ_2 + ··· + λ_n.
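Both identities are easy to check numerically; a quick sketch on an arbitrarily chosen matrix (an assumption, not from the slides):

```python
import numpy as np

# Check det(A) = product of eigenvalues and tr(A) = sum of eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals = np.linalg.eigvals(A)

print(np.isclose(np.prod(evals), np.linalg.det(A)))   # True
print(np.isclose(np.sum(evals),  np.trace(A)))        # True
```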
Theorem 6.1.1: Let B = S^(−1)AS. Then det(B − λI) = det(A − λI), and consequently A and B have the same eigenvalues. Pf: Let B = S^(−1)AS for some nonsingular matrix S. Then B − λI = S^(−1)(A − λI)S, so det(B − λI) = det(S^(−1)) det(A − λI) det(S) = det(A − λI).
Diagonalization. Goal: Given A ∈ C^(n×n), find a nonsingular matrix S such that S^(−1)AS is a diagonal matrix. Question 1: Are all matrices diagonalizable? Question 2: What kinds of A are diagonalizable? Question 3: How do we find S if A is diagonalizable?
NOT all matrices are diagonalizable. e.g., let A be the matrix shown on the slide. If A were diagonalizable, there would exist a nonsingular matrix S with S^(−1)AS diagonal, which leads to a contradiction for this A.
To answer Q2, suppose A is diagonalizable: there exists a nonsingular matrix S with S^(−1)AS = D = diag(λ_1, …, λ_n). Let s_i denote the i-th column of S. Then AS = SD gives A s_i = λ_i s_i, i.e., (λ_i, s_i) is an eigenpair of A for i = 1, …, n. This gives a condition for diagonalizability and a way to find S.
Theorem 6.3.2: Let A ∈ C^(n×n). A is diagonalizable ⇔ A has n linearly independent eigenvectors. Note: a similarity transformation is a change of coordinates; diagonalization means changing to a coordinate system of eigenvectors.
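Q3 can be sketched in a few lines: the diagonalizing S is simply the matrix of eigenvectors. The matrix here is a stand-in chosen for illustration:

```python
import numpy as np

# Diagonalize A: columns of S are eigenvectors, D = S^{-1} A S is diagonal.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals, S = np.linalg.eig(A)          # A S = S diag(evals)

D = np.linalg.inv(S) @ A @ S
print(np.allclose(D, np.diag(evals)))   # True
```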
Theorem 6.3.1: If λ_1, …, λ_k are distinct eigenvalues of an n×n matrix A with corresponding eigenvectors x_1, …, x_k, then x_1, …, x_k are linearly independent. Pf: Suppose not, and take a minimal linearly dependent subset, say c_1 x_1 + ··· + c_m x_m = 0 with all c_i nonzero. Applying A, and subtracting λ_m times the original relation, gives c_1(λ_1 − λ_m) x_1 + ··· + c_(m−1)(λ_(m−1) − λ_m) x_(m−1) = 0. Since the λ_i are distinct, this is a smaller nontrivial dependence, a contradiction. Hence x_1, …, x_k are linearly independent.
Remarks: Let S^(−1)AS = D = diag(λ_1, …, λ_n), and S = (s_1, …, s_n). (i) (λ_i, s_i) is an eigenpair of A for i = 1, …, n. (ii) The diagonalizing matrix S is not unique, because its columns can be reordered or multiplied by any nonzero scalar. (iii) If A has n distinct eigenvalues, A is diagonalizable. If the eigenvalues are not distinct, then A may or may not be diagonalizable, depending on whether or not A has n linearly independent eigenvectors. (iv) A = SDS^(−1), so A^k = S D^k S^(−1).
Example: For the given A, compute the eigenpairs, let S have the eigenvectors as columns, and verify that S^(−1)AS is diagonal.
Def: If an n×n matrix A has fewer than n linearly independent eigenvectors, we say that A is defective. e.g., (i) the first matrix shown is defective; (ii) the second matrix shown is defective.
Example: A and B both have the same eigenvalues, with 2 as a repeated eigenvalue. Nullity(A − 2I) = 1, so the eigenspace associated with λ = 2 is only one-dimensional, and A is NOT diagonalizable. However, Nullity(B − 2I) = 2, so B is diagonalizable.
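The nullity test can be run directly. The slide's A and B were lost, so these matrices are stand-ins exhibiting the same phenomenon:

```python
import numpy as np

A = np.array([[2.0, 1.0],    # repeated eigenvalue 2, but nullity(A - 2I) = 1
              [0.0, 2.0]])
B = np.array([[2.0, 0.0],    # repeated eigenvalue 2, nullity(B - 2I) = 2
              [0.0, 2.0]])

def nullity(M, tol=1e-10):
    """Dimension of the null space, via rank-nullity."""
    return M.shape[0] - np.linalg.matrix_rank(M, tol=tol)

print(nullity(A - 2 * np.eye(2)))  # 1 -> A is defective (not diagonalizable)
print(nullity(B - 2 * np.eye(2)))  # 2 -> B is diagonalizable
```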
Question: Is the following matrix diagonalizable?
The Exponential of a Matrix. Motivation: The general solution of the scalar ODE y′ = ay is y(t) = c e^(at). Likewise, the unique solution of y′(t) = A y(t), y(0) = y_0 is y(t) = e^(At) y_0. Question: What is e^(At), and how does one compute it?
Note that e^x = 1 + x + x²/2! + x³/3! + ···. Define e^A = I + A + A²/2! + A³/3! + ···.
Suppose A is diagonalizable with A = SDS^(−1), D = diag(λ_1, …, λ_n). Then e^A = S e^D S^(−1) = S diag(e^(λ_1), …, e^(λ_n)) S^(−1), and similarly e^(At) = S diag(e^(λ_1 t), …, e^(λ_n t)) S^(−1).
Example: Compute e^(At) for the given A. Sol: The eigenvalues of A, with their eigenvectors, give S and D; then e^(At) = S e^(Dt) S^(−1).
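The diagonalization formula can be cross-checked against the defining power series. The matrix is a stand-in, since the slide's example did not survive:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Method 1: e^A = S e^D S^{-1} via the eigendecomposition.
evals, S = np.linalg.eig(A)
expA_diag = S @ np.diag(np.exp(evals)) @ np.linalg.inv(S)

# Method 2: truncated power series e^A = sum_k A^k / k!.
expA_series = np.zeros_like(A)
term = np.eye(2)
for k in range(1, 30):
    expA_series += term
    term = term @ A / k

print(np.allclose(expA_diag, expA_series))   # True
```

In practice the series is a poor numerical algorithm; the point here is only that both definitions agree for a diagonalizable A.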
Hermitian matrices: Let A ∈ C^(m×n). Then A can be written as A = B + iC, where B, C ∈ R^(m×n).
Let A = B + iC as above. Then the conjugate transpose of A is A^H = (Ā)^T = B^T − iC^T.
Def: A matrix A is said to be Hermitian if A^H = A. A is said to be skew-Hermitian if A^H = −A. A is said to be unitary if A^H A = I (→ its column vectors form an orthonormal set in C^n).
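These definitions are direct to check in code; the matrix below is an illustrative Hermitian example (an assumption, not from the slides):

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

print(np.allclose(A, A.conj().T))        # True: A is Hermitian (A^H = A)

# eigh is numpy's solver for Hermitian matrices; it returns real eigenvalues
# and a unitary matrix of eigenvectors.
evals, U = np.linalg.eigh(A)
print(np.allclose(evals.imag, 0))            # eigenvalues are real
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
```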
Theorem 6.4.1: Let A be Hermitian. Then (i) the eigenvalues of A are real; (ii) eigenvectors belonging to distinct eigenvalues are orthogonal. Pf: (i) Let (λ, x) be an eigenpair of A. Then x^H A x = λ x^H x, and (x^H A x)^H = x^H A x shows x^H A x is real, so λ = x^H A x / x^H x is real. (ii) Let (λ_1, x_1) and (λ_2, x_2) be two eigenpairs of A with λ_1 ≠ λ_2. Then λ_1 x_2^H x_1 = x_2^H A x_1 = (A x_2)^H x_1 = λ_2 x_2^H x_1, so (λ_1 − λ_2) x_2^H x_1 = 0 and hence x_2^H x_1 = 0.
Theorem: Let A be skew-Hermitian. Then every eigenvalue of A is pure imaginary. Pf: Let (λ, x) be an eigenpair of A. Then (x^H A x)^H = x^H A^H x = −x^H A x, so x^H A x = λ x^H x is pure imaginary, and hence λ is pure imaginary.
Theorem 6.4.3 (Schur's Theorem): Let A ∈ C^(n×n). Then there exists a unitary matrix U such that U^H A U is upper triangular. Pf: Let (λ_1, x_1) be an eigenpair of A with ‖x_1‖ = 1. Choose x_2, …, x_n such that U_1 = (x_1, x_2, …, x_n) is unitary. Then the first column of U_1^H A U_1 is (λ_1, 0, …, 0)^T. Choose a unitary matrix for the remaining (n−1)×(n−1) block in the same way, and continue this process; we obtain the theorem.
Theorem 6.4.4 (Spectral Theorem): If A is Hermitian, then there exists a unitary matrix U that diagonalizes A. Pf: By the previous theorem, there is a unitary U with U^H A U = T, where T is upper triangular. Then T^H = (U^H A U)^H = U^H A U = T, so T is Hermitian; a Hermitian upper triangular matrix is a diagonal matrix.
Cor: Let A be a real symmetric matrix. Then (i) the eigenvalues of A are real; (ii) there exists an orthogonal matrix U such that U^T A U is a diagonal matrix. Remark: If A is Hermitian, then, by Th 6.4.4, A = U D U^H, i.e., A has a complete orthonormal eigenbasis.
Example: Find an orthogonal matrix U that diagonalizes A. Sol: (i) Find the eigenvalues of A. (ii) Find a basis of each eigenspace. (iii) Orthonormalize each basis by the Gram–Schmidt process. The columns of the resulting U form an orthonormal eigenbasis (WHY?).
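The three-step recipe above is what `eigh` performs internally for a symmetric matrix; a sketch on a stand-in example:

```python
import numpy as np

# Orthogonally diagonalize a real symmetric matrix: U^T A U = D.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

evals, U = np.linalg.eigh(A)     # columns of U: orthonormal eigenbasis

print(np.allclose(U.T @ U, np.eye(2)))           # True: U is orthogonal
print(np.allclose(U.T @ A @ U, np.diag(evals)))  # True: diagonalized
```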
Note: If A has an orthonormal eigenbasis, then A = U D U^H, where U is unitary and D is diagonal. Question: In addition to Hermitian matrices, are there any other matrices possessing an orthonormal eigenbasis?
Def: A is said to be normal if A A^H = A^H A. Remark: Hermitian, skew-Hermitian and unitary matrices are all normal.
Theorem 6.4.6: A is normal ⇔ A possesses an orthonormal eigenbasis. Pf: (⇐) has been proved. (⇒) By Schur's Theorem, there is a unitary U such that U^H A U = T is upper triangular, and T is also normal. Comparing the diagonal elements of T T^H and T^H T shows that T is diagonal, so A has an orthonormal eigenbasis (WHY?).
Singular Value Decomposition (SVD): Theorem: Let A ∈ C^(m×n) with rank(A) = r. Then there exist unitary matrices U ∈ C^(m×m) and V ∈ C^(n×n) with A = U Σ V^H, where Σ ∈ R^(m×n) has diagonal entries σ_1 ≥ σ_2 ≥ ··· ≥ σ_r > 0 and zeros elsewhere.
Remark: In the SVD A = U Σ V^H, the scalars σ_1, …, σ_r are called the singular values of A, the columns of U are called left singular vectors of A, and the columns of V are called right singular vectors of A.
Pf: Note that A^H A is Hermitian and positive semidefinite, so there is a unitary matrix V with V^H (A^H A) V = diag(σ_1², …, σ_r², 0, …, 0), where σ_1 ≥ ··· ≥ σ_r > 0. (1) Partition V = (V_1, V_2), where V_1 holds the first r columns. (2) Define U_1 = A V_1 Σ_1^(−1) with Σ_1 = diag(σ_1, …, σ_r); its columns are orthonormal. (3) Extend U_1 to a unitary U = (U_1, U_2). Then U^H A V = Σ.
Remark: In the SVD A = U Σ V^H, the singular values of A are unique, while U and V are not unique. The columns of U form an orthonormal eigenbasis for A A^H; the columns of V form an orthonormal eigenbasis for A^H A; and the columns of V beyond the r-th form an orthonormal basis for N(A).
rank(A) = number of nonzero singular values, but rank(A) ≠ number of nonzero eigenvalues in general; for example, a nonzero nilpotent matrix such as [0 1; 0 0] has rank 1 but only the eigenvalue 0.
Example: Find the SVD of the given A. Sol: An orthonormal eigenbasis associated with A^T A gives V and the singular values. A set of candidates for the columns of U are u_i = (1/σ_i) A v_i; extend them so that U is orthogonal. Thus A = U Σ V^T.
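A numerical SVD, verifying the factorization and the rank property from the previous slide; the matrix is a stand-in:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)       # A = U @ Sigma @ Vt
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))   # True
print(int(np.sum(s > 1e-10)))           # 1 nonzero singular value = rank(A)
```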
Lemma 6.5.2: Let Q be orthogonal. Then ‖Qx‖ = ‖x‖. Pf: ‖Qx‖² = (Qx)^T (Qx) = x^T Q^T Q x = x^T x = ‖x‖².
Cor: Let A = U Σ V^T be the SVD of A. Then ‖A‖₂ = σ_1 and ‖A‖_F = (σ_1² + ··· + σ_r²)^(1/2).
We'll state the next result without proof. Theorem 6.5.3: Hypotheses: (1) A = U Σ V^T is the SVD of A; (2) A_k = σ_1 u_1 v_1^T + ··· + σ_k u_k v_k^T. Conclusion: A_k is the closest matrix of rank k to A.
Application: Digital Image Processing (the rank-k approximation is especially efficient for a matrix which has low rank).
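The compression idea can be sketched on a small random "image"; the 8×8 matrix is an assumption standing in for real image data:

```python
import numpy as np

# Best rank-k approximation A_k = sum_{i<=k} s_i u_i v_i^T (Theorem 6.5.3).
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))

U, s, Vt = np.linalg.svd(A)
k = 3
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# In the 2-norm, the approximation error is the first discarded singular value.
err = np.linalg.norm(A - A_k, 2)
print(np.isclose(err, s[k]))   # True
```

Storing U[:, :k], s[:k] and Vt[:k, :] takes k(m + n + 1) numbers instead of mn, which is the saving exploited in image compression.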
Quadratic Forms: to classify the type of a quadratic surface (curve). Optimization: an application to calculus.
Def: A quadratic equation in two variables x and y is an equation of the form ax² + 2bxy + cy² + dx + ey + f = 0.
Standard forms of conic sections: (i) circle: x² + y² = r²; (ii) ellipse: x²/α² + y²/β² = 1; (iii) hyperbola: x²/α² − y²/β² = 1; (iv) parabola: y² = αx or x² = αy. Note: Is there any difference between the eigenvalues of the matrix A of the quadratic form in each case?
Goal: Try to transform the quadratic equation into standard form by a suitable translation and rotation.
Example (no xy term): The eigenvalues of the quadratic part are 9 and 4, both positive → ellipse; completing the square in x and y brings the equation to standard form.
Example (with xy term): Write the quadratic part as x^T A x with A symmetric. By direct computation, find an orthogonal U such that U^T A U is diagonal (why does such a U exist? A is symmetric). Let x = Uy; the original equation then becomes one with no cross term, which can be brought to standard form.
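The rotation-to-principal-axes step can be sketched on an assumed example form, 2x² + 2xy + 2y², i.e. A = [[2, 1], [1, 2]]:

```python
import numpy as np

# Remove the xy cross term of x^T A x by rotating to the eigenbasis of A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

evals, U = np.linalg.eigh(A)     # U orthogonal: U^T A U = diag(evals)

# In the rotated coordinates y = U^T x the form is
# evals[0]*y1^2 + evals[1]*y2^2, with no cross term.
D = U.T @ A @ U
print(np.allclose(D, np.diag(evals)))   # True
print(evals)                            # [1. 3.] -> both positive: an ellipse
```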
Example: Proceeding as above, diagonalize the quadratic part, let x = Uy, and then complete the square (translate) to obtain the standard form.
Optimization: Let f : R^n → R be twice continuously differentiable. It is known from Taylor's Theorem of calculus that f(x_0 + h) = f(x_0) + ∇f(x_0)^T h + (1/2) h^T H(x_0) h + o(‖h‖²), where H(x_0) = [∂²f/∂x_i∂x_j (x_0)] is the Hessian matrix. If x_0 is a local extremum, then ∇f(x_0) = 0. If ∇f(x_0) = 0 and H(x_0) > 0, then x_0 is a local minimum.
Def: A real symmetric matrix A is said to be (i) positive definite, denoted by A > 0, if x^T A x > 0 for all x ≠ 0; (ii) negative definite, denoted by A < 0, if x^T A x < 0 for all x ≠ 0; (iii) positive semidefinite, denoted by A ≥ 0, if x^T A x ≥ 0 for all x; (iv) negative semidefinite, denoted by A ≤ 0, if x^T A x ≤ 0 for all x. Example: a matrix whose quadratic form takes both positive and negative values is indefinite. Question: Given a real symmetric matrix, how can its definiteness be determined efficiently?
Theorem 6.5.1: Let A be real symmetric. Then A > 0 ⇔ all eigenvalues of A are positive. Pf: (⇒) Let (λ, x) be an eigenpair of A with ‖x‖ = 1; then λ = x^T A x > 0. (⇐) Suppose all eigenvalues λ_i are positive. Let {u_1, …, u_n} be an orthonormal eigenbasis of A (why can we assume this? A is symmetric). Write x = Σ c_i u_i; then x^T A x = Σ λ_i c_i² > 0 for x ≠ 0.
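The eigenvalue criterion gives a simple definiteness test; the matrices below are illustrative stand-ins:

```python
import numpy as np

# A real symmetric matrix is positive definite iff all eigenvalues are > 0.
def is_positive_definite(A):
    """Eigenvalue test for a real symmetric matrix A."""
    return bool(np.all(np.linalg.eigvalsh(A) > 0))

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3 -> positive definite
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # eigenvalues -1 and 3 -> indefinite

print(is_positive_definite(A))  # True
print(is_positive_definite(B))  # False
```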
Example: Find the local extrema of the given f. Sol: Solve ∇f = 0 for the critical points, then examine the Hessian at each one. Thus f has a local maximum at the critical point where the Hessian is negative definite, while the critical points with indefinite Hessians are saddle points.
Positive Definite Matrices: Property I: If P > 0, then P is nonsingular. Property II: If A > 0, then all the leading principal submatrices of A are positive definite.
Property III: A positive definite P can be reduced to upper triangular form using only row operations of type III, and the pivot elements will all be positive. Sketch of the proof: The leading principal minors are positive (Property II), and the determinant is invariant under row operations of type III, so each pivot is a ratio of successive positive leading principal minors. Continue this process; the property can be proved.
Property IV: Let A > 0. Then (i) A can be decomposed as A = LU, where L is lower triangular and U is upper triangular; (ii) A can be decomposed as A = LDU, where L is lower triangular and U is upper triangular, both with all diagonal elements equal to 1, and D is a diagonal matrix. Pf: By Gaussian elimination and the fact that the product of two lower (upper) triangular matrices is lower (upper) triangular.
Example: Carrying out the elimination gives A = LU; factoring the diagonal of U out as D also gives A = LDU.
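Property IV can be carried out directly by elimination without pivoting (valid for A > 0, since all pivots are positive); the matrix is a stand-in:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# Gaussian elimination with only type-III row operations: A = LU.
n = A.shape[0]
U = A.copy()
L = np.eye(n)
for j in range(n - 1):
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]     # multiplier stored in L
        U[i, :] -= L[i, j] * U[j, :]    # row operation III

# Factor the diagonal out of U to get the LDU form.
D = np.diag(np.diag(U))
U1 = np.linalg.inv(D) @ U               # unit upper triangular

print(np.allclose(A, L @ U))            # True: A = LU
print(np.allclose(A, L @ D @ U1))       # True: A = LDU
```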
Property V (uniqueness): Let A > 0, and suppose A = L_1 D_1 U_1 = L_2 D_2 U_2 are two LDU factorizations. Then L_1 = L_2, D_1 = D_2 and U_1 = U_2. Pf: L_2^(−1) L_1 = D_2 U_2 U_1^(−1) D_1^(−1); the LHS is lower triangular with diagonal elements 1 and the RHS is upper triangular, so both sides equal I; matching the remaining factors gives the claim.
Property VI: Let A > 0 be symmetric. Then A can be factored into A = L D L^T, where D is a diagonal matrix and L is lower triangular with 1's along the diagonal. Pf: A = LDU and A = A^T = U^T D L^T; since the LDU representation is unique, U = L^T.
Property VII (Cholesky decomposition): Let A > 0 be symmetric. Then A can be factored into A = L L^T, where L is lower triangular with positive diagonal. Hint: A = L_1 D L_1^T = (L_1 D^(1/2))(L_1 D^(1/2))^T.
Example: We have seen that A = L D L^T. Note that D has positive diagonal entries. Define L_1 = L D^(1/2); we then have the Cholesky decomposition A = L_1 L_1^T.
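NumPy computes the Cholesky factor directly; a sketch on the same kind of stand-in matrix:

```python
import numpy as np

# Cholesky decomposition A = L L^T, L lower triangular with positive diagonal.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

L = np.linalg.cholesky(A)   # raises LinAlgError if A is not positive definite

print(np.allclose(A, L @ L.T))         # True
print(bool(np.all(np.diag(L) > 0)))    # True
```

Because `cholesky` fails exactly when A is not positive definite, attempting it is itself a practical definiteness test.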
Theorem 6.6.1: Let A be real symmetric. Then the following are equivalent:
(i) A > 0.
(ii) All the leading principal submatrices of A have positive determinants.
(iii) A ~ U using only elementary row operations of type III, where U is an upper triangular matrix, and the pivots are all positive.
(iv) A has a Cholesky decomposition L L^T.
(v) A can be factored into B^T B for some nonsingular matrix B.
Pf: We have shown that (i) ⇒ (ii) ⇒ (iii) ⇒ (iv). In addition, (iv) ⇒ (v) (take B = L^T), and (v) ⇒ (i) is trivial, since x^T B^T B x = ‖Bx‖² > 0 for x ≠ 0.
Householder Transformation: Def: Let v ∈ R^n, v ≠ 0. Then the matrix Q = I − (2 / v^T v) v v^T is called a Householder transformation. Geometric interpretation: Q reflects each vector across the hyperplane orthogonal to v. Q is symmetric (Q^T = Q), and Q is orthogonal (Q^T Q = I). What are the eigenvalues, eigenvectors and determinant of Q?
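The stated properties of Q, and a hint toward the closing question, can be verified numerically; v here is an arbitrary illustrative vector:

```python
import numpy as np

# Householder transformation Q = I - 2 v v^T / (v^T v).
def householder(v):
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / (v @ v)

v = np.array([1.0, 2.0, 2.0])
Q = householder(v)

print(np.allclose(Q, Q.T))            # True: symmetric
print(np.allclose(Q @ Q, np.eye(3)))  # True: orthogonal, and its own inverse
print(np.allclose(Q @ v, -v))         # True: v is an eigenvector, eigenvalue -1
```

Vectors orthogonal to v are fixed by Q (eigenvalue 1), so det(Q) = −1, consistent with Q being a reflection.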
Given A, find a QR factorization: choose Householder transformations H_1, …, H_k so that H_k ··· H_1 A = R is upper triangular; then A = QR with Q = H_1 ··· H_k orthogonal.
Theorem: Let A ∈ R^(n×n) and let A = U Σ V^T be an SVD of A with σ_1 ≥ ··· ≥ σ_n. Then ‖A‖₂ = σ_1. Pf: ‖Ax‖ = ‖U Σ V^T x‖ = ‖Σ V^T x‖, which over unit vectors x is maximized at x = v_1 with value σ_1. Cor: Let A be nonsingular with singular values σ_1 ≥ ··· ≥ σ_n > 0. Then ‖A^(−1)‖₂ = 1/σ_n and ‖A‖₂ ‖A^(−1)‖₂ = σ_1/σ_n.
Application: In solving Ax = b, what is the effect on the solution when b contains measurement error?
κ(A) = ‖A‖ ‖A^(−1)‖ is said to be the condition number of A. If A is orthogonal, then κ(A) = 1. This means that, for a given relative error in b, the relative deviation of the associated solution of Ax = b is smallest when A is orthogonal.
Example: A is close to singular. Note that x is the solution of Ax = b, while x′ is the solution for a slightly perturbed right-hand side b′. What does this mean? A small deviation in b results in a large deviation in x. Similarly, a small deviation in x results in a large deviation in b. This is the reason why we use orthogonal factorizations in numerically solving Ax = b.
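The ill-conditioning phenomenon can be reproduced with a nearly singular stand-in matrix (an assumption, since the slide's example is lost):

```python
import numpy as np

# A nearly singular system amplifies tiny perturbations of b.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b  = np.array([2.0, 2.0001])
db = np.array([0.0, 0.0001])        # tiny measurement error in b

x  = np.linalg.solve(A, b)
x2 = np.linalg.solve(A, b + db)

s = np.linalg.svd(A, compute_uv=False)
print(s[0] / s[-1])                  # condition number sigma_1/sigma_n (large)
print(np.linalg.norm(x2 - x))        # large change in x from a tiny change in b
```

Here a perturbation of size 1e-4 in b moves the solution by order 1, in line with the condition number of roughly 4e4.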