Maths for Signals and Systems: Linear Algebra in Engineering. Lectures 10-12, Tuesday 1st and Friday 4th November 2016. Dr Tania Stathaki, Reader (Associate Professor) in Signal Processing, Imperial College London.
In this set of lectures we will talk about:
- Eigenvectors and eigenvalues
- Matrix diagonalization
- Applications of matrix diagonalization
- Stochastic matrices
Eigenvectors and eigenvalues
Consider a matrix A and a vector x. The operation Ax produces a vector y in some direction. I am interested in the vectors y which lie in the same direction as x; in that case Ax = λx, with λ a scalar. When this relationship holds, x is called an eigenvector and λ an eigenvalue of the matrix A. If A is singular, then λ = 0 is an eigenvalue. Problem: How do we find the eigenvectors and eigenvalues of a matrix?
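As a quick numerical check (not part of the slides), the defining relation Ax = λx can be verified with numpy for an arbitrary example matrix; `numpy.linalg.eig` returns the eigenvalues and one eigenvector per column:

```python
import numpy as np

# Sketch: verify A x = lambda x for each eigenpair that numpy finds.
# The matrix A below is an arbitrary illustration, not from the slides.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors

for i in range(len(eigvals)):
    x = eigvecs[:, i]
    lam = eigvals[i]
    # A x and lambda x must coincide: x keeps its direction under A
    assert np.allclose(A @ x, lam * x)
```

For this particular symmetric choice the two eigenvalues come out as 1 and 3.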
Eigenvectors and eigenvalues of a projection matrix
Problem: What are the eigenvectors x and eigenvalues λ of a projection matrix P? In the figure, consider the matrix P which projects a vector b onto a vector a. Question: Is b an eigenvector of P? Answer: No, because b and Pb lie in different directions. Question: Which vectors are eigenvectors of P? Answer: Vectors x which already lie on the projection plane. In that case Px = x, and therefore x is an eigenvector with eigenvalue 1.
Eigenvectors and eigenvalues of a projection matrix (cont.)
The eigenvectors of P are vectors x which already lie on the projection plane. In that case Px = x, and therefore x is an eigenvector with eigenvalue 1. We can find 2 independent eigenvectors of P which lie on the projection plane, both associated with the eigenvalue 1. Problem: In 3D space we can find 3 independent vectors. Can you find a third eigenvector of P, perpendicular to the eigenvectors of P that lie on the projection plane? Answer: Yes: any vector e perpendicular to the plane. In that case Pe = 0 = 0e. Therefore, the eigenvalues of P are λ = 0 and λ = 1.
Eigenvectors and eigenvalues of a permutation matrix
Consider the permutation matrix A = [[0, 1], [1, 0]]. Problem: Can you give an eigenvector of this matrix? In other words, can you think of a vector that, after being permuted, is still a multiple of itself? Answer: Yes: the vector [1, 1]^T, with corresponding eigenvalue λ = 1. Furthermore, the vector [-1, 1]^T, with eigenvalue λ = -1. An n×n matrix has n eigenvalues, and it is not always easy to find them. However, the sum of the eigenvalues equals the trace of the matrix (the sum of its diagonal elements), and the product of the eigenvalues equals the determinant of the matrix. Therefore, in the example above, once I have found the eigenvalue λ = 1, I know that the other eigenvalue must be λ = -1.
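The permutation example and the trace/determinant identities can be checked numerically (a sketch, using the 2×2 permutation matrix just discussed):

```python
import numpy as np

# Sketch: the 2x2 permutation matrix with eigenpairs (1, [1,1]) and (-1, [-1,1]);
# the trace and determinant identities from the slide are checked numerically.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

assert np.allclose(A @ np.array([1.0, 1.0]),  1 * np.array([1.0, 1.0]))
assert np.allclose(A @ np.array([-1.0, 1.0]), -1 * np.array([-1.0, 1.0]))

w = np.linalg.eigvals(A)
assert np.isclose(w.sum(), np.trace(A))          # sum of eigenvalues = trace
assert np.isclose(w.prod(), np.linalg.det(A))    # product of eigenvalues = determinant
```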
Problem: Solve Ax = λx.
Consider an eigenvector x of a matrix A. Then Ax = λx ⇒ Ax − λx = 0 (where 0 is the zero vector), and therefore (A − λI)x = 0. In order for this set of equations to have a non-zero solution, the nullspace of (A − λI) must be non-zero, i.e., the matrix (A − λI) must be singular. Therefore, det(A − λI) = 0. I now have an equation for λ, called the characteristic equation (or eigenvalue equation); the eigenvalues are the roots of this equation. I might get repeated λ's; this can cause problems, which I will deal with later. After finding a λ, I can find the corresponding x from (A − λI)x = 0; in effect, I look for the nullspace of (A − λI).
The eigenvalues of A^T are obtained from the equation det(A^T − λI) = 0. But det(A^T − λI) = det(A^T − λI^T) = det((A − λI)^T) = det(A − λI). Therefore, the eigenvalues of A^T are the same as the eigenvalues of A.
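Both facts — the characteristic polynomial and the equality of the spectra of A and A^T — can be checked on an arbitrary (non-symmetric) example matrix:

```python
import numpy as np

# Sketch: det(A - lambda I) = 0 gives the eigenvalues, and since det(M) = det(M^T),
# A and A^T share the same characteristic polynomial and hence the same eigenvalues.
# The non-symmetric matrix below is an arbitrary illustration.
A = np.array([[1.0, 2.0],
              [0.5, 3.0]])

w_A  = np.sort(np.linalg.eigvals(A))
w_AT = np.sort(np.linalg.eigvals(A.T))
assert np.allclose(w_A, w_AT)

# for a 2x2 matrix the characteristic polynomial is lambda^2 - trace*lambda + det
lam = w_A[0]
assert np.isclose(lam**2 - np.trace(A) * lam + np.linalg.det(A), 0.0)
```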
Solve Ax = λx: an example.
Consider the matrix A = [[3, 1], [1, 3]]. A symmetric matrix always has real eigenvalues; furthermore, the eigenvectors of a symmetric matrix can be chosen to be orthogonal.
det(A − λI) = (3 − λ)^2 − 1 = 0 ⇒ 3 − λ = ±1 ⇒ λ = 3 ± 1 ⇒ λ1 = 4, λ2 = 2.
Equivalently, det(A − λI) = λ^2 − 6λ + 8 = 0. Note that 6 = λ1 + λ2 and 8 = det(A) = λ1·λ2.
Find the eigenvector for λ1 = 4: A − 4I = [[−1, 1], [1, −1]], and [[−1, 1], [1, −1]] [x, y]^T = 0 ⇒ x = y.
Find the eigenvector for λ2 = 2: A − 2I = [[1, 1], [1, 1]], and [[1, 1], [1, 1]] [x, y]^T = 0 ⇒ x = −y.
Notice that we obtain families of eigenvectors, not single eigenvectors.
Compare the two matrices given previously
Consider the matrix A = [[3, 1], [1, 3]]. As shown, it has eigenvectors [x, x]^T and [−x, x]^T with eigenvalues λ1 = 4 and λ2 = 2. Consider the matrix B = [[0, 1], [1, 0]], also with eigenvectors [x, x]^T and [−x, x]^T, and eigenvalues λ1 = 1 and λ2 = −1. We observe that A = B + 3I. The eigenvalues of A are obtained from the eigenvalues of B by increasing them by 3, and the eigenvectors of A and B are the same.
Generalization of the above observation
Consider the matrix A = B + cI, and an eigenvector x of B with eigenvalue λ. Then Bx = λx, and therefore
Ax = (B + cI)x = Bx + cIx = Bx + cx = λx + cx = (λ + c)x.
A has the same eigenvectors as B, with eigenvalues λ + c. Note, however, that there are no comparable properties that give the eigenvalues of A + B or AB from the eigenvalues of A and B individually.
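The shift property A = B + cI can be confirmed numerically with the two matrices compared above:

```python
import numpy as np

# Sketch of the A = B + 3I observation: B is the permutation matrix from the
# earlier slide; adding 3I shifts every eigenvalue by 3 and leaves the
# eigenvectors unchanged.
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
A = B + 3 * np.eye(2)                # equals [[3, 1], [1, 3]]

wB = np.sort(np.linalg.eigvals(B))   # [-1, 1]
wA = np.sort(np.linalg.eigvals(A))   # [ 2, 4]
assert np.allclose(wA, wB + 3)

# x = [1, 1] is an eigenvector of both B (lambda = 1) and A (lambda = 4)
x = np.array([1.0, 1.0])
assert np.allclose(B @ x, 1 * x)
assert np.allclose(A @ x, 4 * x)
```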
Example: Take the matrix that rotates every vector by 90°.
This is Q = [[cos 90°, −sin 90°], [sin 90°, cos 90°]] = [[0, −1], [1, 0]], so λ1 + λ2 = 0 and det(Q) = λ1·λ2 = 1. What vector can be parallel to itself after rotation? det(Q − λI) = det([[−λ, −1], [1, −λ]]) = λ^2 + 1 = 0 ⇒ λ = ±i. Here we have a skew-symmetric (or anti-symmetric) matrix with Q^T = Q^{−1} = −Q, and we observe that the eigenvalues are complex. Complex eigenvalues always come in complex conjugate pairs when the associated matrix is real.
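numpy reproduces the complex conjugate pair ±i for this rotation matrix (a sketch):

```python
import numpy as np

# Sketch: the 90-degree rotation matrix has no real eigenvectors; numpy returns
# the complex-conjugate eigenvalue pair +i, -i predicted on the slide.
Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])

w = np.linalg.eigvals(Q)
assert np.allclose(w.real, 0.0)                   # purely imaginary eigenvalues
assert np.allclose(np.sort(w.imag), [-1.0, 1.0])  # the pair is +i and -i

# Q is orthogonal and skew-symmetric: Q^T = Q^{-1} = -Q
assert np.allclose(Q.T, np.linalg.inv(Q))
assert np.allclose(Q.T, -Q)
```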
Example: Consider A = [[3, 1], [0, 3]]. Then λ1 + λ2 = 6 and det(A) = λ1·λ2 = 9.
det(A − λI) = det([[3 − λ, 1], [0, 3 − λ]]) = (3 − λ)^2 = 0 ⇒ λ1,2 = 3.
The eigenvalues of a triangular matrix are the values on its diagonal. For this particular case, [[3, 1], [0, 3]] [x, y]^T = [3x, 3y]^T ⇒ y = 0, while x can be any number: the repeated eigenvalue comes with only one independent eigenvector.
Matrix diagonalization: the case of independent eigenvectors
Suppose we have n independent eigenvectors of a matrix A. We call them x_i, with associated eigenvalues λ_i, and we put them in the columns of a matrix S. We form:
AS = A [x1 x2 … xn] = [λ1 x1, λ2 x2, …, λn xn] = [x1 x2 … xn] Λ = SΛ, where Λ = diag(λ1, λ2, …, λn).
Therefore AS = SΛ ⇒ S^{−1} A S = Λ, or A = S Λ S^{−1}.
This formulation of A is very important in mathematics and engineering and is called matrix diagonalization.
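The factorization can be checked numerically (a sketch, reusing the symmetric matrix from the earlier example; `eig` returns the eigenvector matrix S column by column):

```python
import numpy as np

# Sketch of A = S Lambda S^{-1}: stack the eigenvectors as columns of S,
# then check A S = S Lambda and the reconstruction of A.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
w, S = np.linalg.eig(A)
Lam = np.diag(w)

assert np.allclose(A @ S, S @ Lam)                       # A S = S Lambda
assert np.allclose(A, S @ Lam @ np.linalg.inv(S))        # A = S Lambda S^{-1}
```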
Matrix diagonalization: eigenvalues of A^k
If Ax = λx, then A^2 x = λAx = λ^2 x. Therefore, the eigenvalues of A^2 are λ^2. The eigenvectors of A^2 remain the same, since A^2 = (SΛS^{−1})(SΛS^{−1}) = SΛ^2 S^{−1}. In general, A^k = SΛ^k S^{−1}, and lim_{k→∞} A^k = 0 if all the eigenvalues of A satisfy |λ_i| < 1.
A matrix has n independent eigenvectors, and is therefore diagonalizable, if all its eigenvalues are different. If there are repeated eigenvalues, the matrix may or may not have n independent eigenvectors. As an example, consider the identity matrix: its eigenvalues are all equal to 1, yet its columns (or rows) are n independent eigenvectors. On the other hand, consider A = [[1, c], [0, 1]] with c ≠ 0: this is a standard textbook example of a matrix which cannot be diagonalized. Both its eigenvalues equal 1, but its eigenvectors are only of the form [x, 0]^T.
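A short numerical sketch of the powers property, using an arbitrary matrix whose eigenvalues (0.9 and 0.3) are both below 1 in magnitude, so that high powers shrink toward zero:

```python
import numpy as np

# Sketch: A^k = S Lambda^k S^{-1}, and A^k -> 0 when every |lambda_i| < 1.
A = np.array([[0.6, 0.3],
              [0.3, 0.6]])            # eigenvalues 0.9 and 0.3 (arbitrary example)
w, S = np.linalg.eig(A)

A5 = np.linalg.matrix_power(A, 5)
assert np.allclose(A5, S @ np.diag(w**5) @ np.linalg.inv(S))

# eigenvalues of A^2 are lambda^2 (same eigenvectors)
assert np.allclose(np.sort(np.linalg.eigvals(A @ A)), np.sort(w**2))

# since |0.9| < 1 and |0.3| < 1, high powers shrink toward the zero matrix
assert np.abs(np.linalg.matrix_power(A, 200)).max() < 1e-8
```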
Application of matrix diagonalization: a first order system which evolves with time
Consider a system that follows an equation of the form u_{k+1} = A u_k, where u_k is the vector of system parameters which evolve with time. The eigenvalues of A fully characterize the behavior of the system. I start with a given vector u_0; then u_1 = A u_0, u_2 = A^2 u_0, and in general u_k = A^k u_0.
In order to solve this system I assume that I can write u_0 = c_1 x_1 + c_2 x_2 + … + c_n x_n, where the x_i are the eigenvectors of the matrix A. This is a standard approach to the solution of this type of problem. Then
A u_0 = c_1 A x_1 + c_2 A x_2 + … + c_n A x_n = c_1 λ_1 x_1 + c_2 λ_2 x_2 + … + c_n λ_n x_n
A^100 u_0 = c_1 λ_1^100 x_1 + c_2 λ_2^100 x_2 + … + c_n λ_n^100 x_n = S Λ^100 c,
where c is the column vector containing the coefficients c_i, and S, Λ are the matrices defined previously.
Application of matrix diagonalization cont. Fibonacci example: converting a second order scalar problem into a first order system
I take two numbers, which I denote by F_0 = 0 and F_1 = 1. The Fibonacci sequence is generated from these two initial numbers by the relationship F_k = F_{k−1} + F_{k−2}; the sequence is 0, 1, 1, 2, 3, 5, 8, 13 and so on. How can I get a closed form formula for the 100th (or any) Fibonacci number? I will present a standard approach to this problem. I define the vector u_k = [F_{k+1}, F_k]^T and the extra (trivial) equation F_{k+1} = F_{k+1}. By merging the two equations I obtain the matrix form:
u_{k+1} = [F_{k+2}, F_{k+1}]^T = [[1, 1], [1, 0]] [F_{k+1}, F_k]^T = A u_k.
I have thus converted the second order scalar problem into a first order matrix problem.
Application of matrix diagonalization cont. Fibonacci example cont.
The eigenvalues of A are obtained from det([[1 − λ, 1], [1, −λ]]) = −(1 − λ)λ − 1 = 0 ⇒ λ^2 − λ − 1 = 0. Observe the analogy between λ^2 − λ − 1 = 0 and F_k − F_{k−1} − F_{k−2} = 0. The roots are λ_{1,2} = (1 ± √5)/2; the eigenvalues add up to 1, and since they are distinct the matrix A is diagonalizable. It can be shown that the eigenvectors are x_1 = [λ_1, 1]^T and x_2 = [λ_2, 1]^T.
How can I get a formula for the 100th Fibonacci number? u_{k+1} = A u_k = A^2 u_{k−1}, etc., and therefore u_100 = A^100 u_0 = S Λ^100 S^{−1} u_0, with
Λ^100 = diag(λ_1^100, λ_2^100), S = [[λ_1, λ_2], [1, 1]], S^{−1} = (1/(λ_1 − λ_2)) [[1, −λ_2], [−1, λ_1]].
With u_0 = [F_1, F_0]^T = [1, 0]^T, carrying out the multiplication gives
F_100 = (λ_1^100 − λ_2^100)/(λ_1 − λ_2) = (λ_1^100 − λ_2^100)/√5.
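The derivation above can be checked by computing F_100 twice — once with exact integer matrix-vector products and once with the eigenvalue (Binet-style) formula:

```python
# Sketch: the 100th Fibonacci number from A = [[1, 1], [1, 0]], once by exact
# integer matrix-vector products and once via the eigenvalues (1 +- sqrt(5))/2.

def mat_vec(M, v):
    """2x2 integer matrix times a 2-vector."""
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

A = ((1, 1), (1, 0))
u = (1, 0)                      # u_0 = (F_1, F_0)
for _ in range(100):
    u = mat_vec(A, u)           # u_k = (F_{k+1}, F_k)
F100 = u[1]

# cross-check with the plain recurrence F_k = F_{k-1} + F_{k-2}
a, b = 0, 1
for _ in range(100):
    a, b = b, a + b
assert F100 == a == 354224848179261915075

# the closed form from the eigenvalues: F_k = (lam1^k - lam2^k) / (lam1 - lam2)
lam1 = (1 + 5 ** 0.5) / 2
lam2 = (1 - 5 ** 0.5) / 2
estimate = (lam1 ** 100 - lam2 ** 100) / (lam1 - lam2)
assert abs(estimate - F100) / F100 < 1e-9
```

The floating-point estimate agrees with the exact integer to about machine precision, and since |λ_2| < 1 the second term is negligible for large k.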
Applications of matrix diagonalization cont. First order differential equations du/dt = Au
Problem: Solve the system of differential equations du_1(t)/dt = −u_1(t) + 2u_2(t), du_2(t)/dt = u_1(t) − 2u_2(t).
Solution: We convert the system into the matrix form [du_1(t)/dt, du_2(t)/dt]^T = A [u_1(t), u_2(t)]^T, setting u(t) = [u_1(t), u_2(t)]^T. We impose the initial condition u(0) = [1, 0]^T. The system's matrix is A = [[−1, 2], [1, −2]]. The matrix A is singular, so one of the eigenvalues is zero; from the trace we then conclude that the eigenvalues are λ_1 = 0, λ_2 = −3. The behavior of the solution depends exclusively on the eigenvalues of A. We can easily show that the eigenvectors are x_1 = [2, 1]^T and x_2 = [1, −1]^T.
Applications of matrix diagonalization cont. First order differential equations du/dt = Au
The solution of the above system of equations is of the form u(t) = c_1 e^{λ_1 t} x_1 + c_2 e^{λ_2 t} x_2.
Problem: Verify the above by plugging c_i e^{λ_i t} x_i into the equation du/dt = Au.
Let us find the solution: u(t) = c_1 e^{0t} [2, 1]^T + c_2 e^{−3t} [1, −1]^T, where c_1, c_2 come from the initial conditions:
c_1 [2, 1]^T + c_2 [1, −1]^T = u(0) = [1, 0]^T ⇒ c_1 = 1/3, c_2 = 1/3.
The steady state of the system is u(∞) = c_1 x_1 = [2/3, 1/3]^T. Stability is achieved if the real parts of the eigenvalues are negative. Note that if complex eigenvalues exist, they appear in conjugate pairs.
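A sketch evaluating the closed-form solution above (assuming, as in the worked computation, the initial condition u(0) = [1, 0]^T): it checks the initial condition, the steady state [2/3, 1/3], and that the formula satisfies du/dt = Au via a finite-difference probe.

```python
import math

# Sketch: u(t) = (1/3)[2, 1] + (1/3) e^{-3t} [1, -1], the closed-form solution
# derived on the slide (assumed initial condition u(0) = [1, 0]).

def u(t):
    decay = math.exp(-3.0 * t)
    return (2.0 / 3.0 + decay / 3.0,
            1.0 / 3.0 - decay / 3.0)

assert all(abs(a - b) < 1e-12 for a, b in zip(u(0.0), (1.0, 0.0)))   # u(0) = [1, 0]
assert all(abs(a - b) < 1e-6  for a, b in zip(u(10.0), (2/3, 1/3)))  # steady state

# finite-difference check that du/dt = A u for A = [[-1, 2], [1, -2]]
h, t = 1e-6, 0.5
du = [(x1 - x0) / h for x0, x1 in zip(u(t), u(t + h))]
Au = (-1 * u(t)[0] + 2 * u(t)[1],
       1 * u(t)[0] - 2 * u(t)[1])
assert all(abs(a - b) < 1e-4 for a, b in zip(du, Au))
```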
Applications of matrix diagonalization cont. First order differential equations du/dt = Au
Stability is achieved if the real parts of all the eigenvalues are negative. We have a steady state if at least one eigenvalue is 0 and the remaining eigenvalues have negative real parts. The system is unstable if at least one eigenvalue has a positive real part. For stability the trace of the system's matrix must be negative; the reverse is not true, since a negative trace does not guarantee stability.
Applications of matrix diagonalization cont. First order differential equations du/dt = Au
Consider a first-order differential equation du/dt = Au, and recall the matrix S which contains the eigenvectors of A, defined previously. I set u(t) = S v(t), and the differential equation becomes:
S dv(t)/dt = A S v(t) ⇒ dv(t)/dt = S^{−1} A S v(t) = Λ v(t).
This is an interesting result which can be found in various engineering applications. I start from a system of equations du(t)/dt = A u(t) which are coupled (or dependent, or correlated), and I end up with a set of equations which are decoupled and easier to solve. From the relationship dv(t)/dt = Λ v(t) we get v(t) = e^{Λt} v(0), and hence u(t) = S e^{Λt} S^{−1} u(0), with e^{At} = S e^{Λt} S^{−1}.
Question: What is the exponential of a matrix? See next slide.
Applications of matrix diagonalization cont. Second order homogeneous differential equations
How do I change the second order homogeneous differential equation y''(t) + b y'(t) + k y(t) = 0 into two first order ones? I use the equation above and the trivial additional equation y'(t) = y'(t). I define u(t) = [y'(t), y(t)]^T, and from the two available equations I form the matrix representation shown below:
u'(t) = [y''(t), y'(t)]^T = [[−b, −k], [1, 0]] [y'(t), y(t)]^T, i.e., u'(t) = [[−b, −k], [1, 0]] u(t).
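A numerical sketch of this companion-matrix construction (the coefficient values b = 3, k = 2 are an arbitrary illustration): the eigenvalues of [[−b, −k], [1, 0]] are exactly the roots of the scalar characteristic equation s² + bs + k = 0.

```python
import numpy as np

# Sketch: for y'' + b y' + k y = 0, the companion matrix is [[-b, -k], [1, 0]];
# its eigenvalues coincide with the roots of s^2 + b s + k = 0.
b, k = 3.0, 2.0                       # arbitrary example coefficients
A = np.array([[-b,  -k],
              [1.0, 0.0]])

w = np.sort(np.linalg.eigvals(A))
roots = np.sort(np.roots([1.0, b, k]))    # roots of s^2 + 3s + 2, i.e. -2 and -1
assert np.allclose(w, roots)
assert np.allclose(w, [-2.0, -1.0])
```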
Applications of matrix diagonalization cont. Higher order homogeneous differential equations
How do I change the nth order homogeneous differential equation y^(n)(t) + a_1 y^(n−1)(t) + … + a_n y(t) = 0 into n first order ones? I define u(t) = [y^(n−1)(t), …, y'(t), y(t)]^T and use the n − 1 trivial additional equations y^(i)(t) = y^(i)(t), i = 1, …, n − 1, to obtain the matrix representation below:
u'(t) = [y^(n)(t), y^(n−1)(t), …, y''(t), y'(t)]^T = [[−a_1, −a_2, …, −a_{n−1}, −a_n], [1, 0, …, 0, 0], [0, 1, …, 0, 0], …, [0, 0, …, 1, 0]] u(t).
As previously analysed, the solution of this system is explicitly tied to the eigenvalues of the system's matrix.
Diagonal matrix exponentials
The exponential e^{Λt} of a diagonal matrix Λ = diag(λ_1, λ_2, …, λ_n) is given by:
e^{Λt} = diag(e^{λ_1 t}, e^{λ_2 t}, …, e^{λ_n t}).
Furthermore, Λ^t = diag(λ_1^t, λ_2^t, …, λ_n^t). As we already showed,
lim_{t→∞} e^{Λt} = 0 if Re(λ_i) < 0 for all i, and lim_{t→∞} Λ^t = 0 if |λ_i| < 1 for all i.
Matrix exponentials e^{At}
Consider the exponential e^{At}. The Taylor series expansion is e^x = Σ_{n=0}^∞ x^n/n!. Similarly,
e^{At} = I + At + (At)^2/2! + (At)^3/3! + … + (At)^n/n! + …
e^{At} = I + SΛS^{−1} t + SΛ^2 S^{−1} t^2/2! + SΛ^3 S^{−1} t^3/3! + … = S e^{Λt} S^{−1}.
The assumption built into the above formula is that A must be diagonalizable. Furthermore, note that 1/(1 − x) = Σ_{n=0}^∞ x^n. For matrices we have
(I − At)^{−1} = I + At + (At)^2 + (At)^3 + …
This sum converges if the eigenvalues of the matrix At are of magnitude less than 1, i.e., |λ(At)| < 1. This statement is proven using diagonalization.
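The identity e^{At} = S e^{Λt} S^{−1} can be cross-checked against a truncated Taylor series (a sketch, reusing the singular matrix from the differential-equation example):

```python
import numpy as np

# Sketch: e^{At} = S e^{Lambda t} S^{-1} for a diagonalizable A, compared with
# the truncated Taylor series I + At + (At)^2/2! + ... (20 terms suffice here).
A = np.array([[-1.0,  2.0],
              [ 1.0, -2.0]])
t = 0.5
w, S = np.linalg.eig(A)

expAt_eig = S @ np.diag(np.exp(w * t)) @ np.linalg.inv(S)

expAt_series = np.eye(2)
term = np.eye(2)
for n in range(1, 20):
    term = term @ (A * t) / n          # term is now (At)^n / n!
    expAt_series = expAt_series + term

assert np.allclose(expAt_eig, expAt_series)
```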
Stochastic matrices
Consider a matrix A with the following properties:
- It is square.
- All entries are nonnegative and real.
- The elements of each column, or each row, or both, add up to 1.
A matrix with these properties (all of whose entries are therefore at most 1) is called a stochastic matrix. Stochastic matrices are also called Markov, probability, transition, or substitution matrices, and their entries usually represent probabilities. Stochastic matrices are widely used in probability theory, statistics, mathematical finance and linear algebra, as well as in computer science.
Stochastic matrices. Types.
There are several types of stochastic matrices:
- A right stochastic matrix is a matrix of nonnegative real entries, with each row's elements summing to 1.
- A left stochastic matrix is a matrix of nonnegative real entries, with each column's elements summing to 1.
- A doubly stochastic matrix is a matrix of nonnegative real entries, with each row's and each column's elements summing to 1.
A stochastic matrix often describes a so-called Markov chain X_t over a finite state space S. Generally, an n×n stochastic matrix is related to n "states". If the probability of moving from state i to state j is P(j|i) = P_ij, the stochastic matrix P is given by using P_ij as the ith row and jth column element:
P = [[P_11, P_12, …, P_1n], [P_21, P_22, …, P_2n], …, [P_n1, P_n2, …, P_nn]].
Depending on the particular problem, the above matrix can be formulated in such a way that it is either right or left stochastic.
Products of stochastic matrices. Stochastic vectors.
An example of a left stochastic matrix is A = [[0.5, 0.3], [0.5, 0.7]] (each column sums to 1). You can prove that if A and B are stochastic matrices of the same type, then AB is also a stochastic matrix of that type. Consider two left stochastic matrices A and B with elements a_ij and b_ij respectively, and let C = AB with elements c_ij. The sum of the elements of the jth column of C is
Σ_{i=1}^n c_ij = Σ_{i=1}^n Σ_{k=1}^n a_ik b_kj = Σ_{k=1}^n Σ_{i=1}^n a_ik b_kj = Σ_{k=1}^n b_kj (Σ_{i=1}^n a_ik) = Σ_{k=1}^n b_kj · 1 = 1.
Based on the above, any power of a stochastic matrix is a stochastic matrix. Furthermore, a vector with real, nonnegative entries p_i which add up to 1 is called a stochastic vector. For a stochastic matrix, every column (or row, or both) is a stochastic vector. I am interested in the eigenvalues and eigenvectors of a stochastic matrix.
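The column-sum argument above is easy to verify numerically (a sketch; the matrix entries are arbitrary illustrations):

```python
import numpy as np

# Sketch: the product of two left stochastic matrices (columns summing to 1)
# is again left stochastic, and so is any power of such a matrix.
A = np.array([[0.5, 0.3],
              [0.5, 0.7]])
B = np.array([[0.9, 0.2],
              [0.1, 0.8]])

for M in (A, B, A @ B, np.linalg.matrix_power(A, 5)):
    assert np.all(M >= 0)                      # nonnegative entries
    assert np.allclose(M.sum(axis=0), 1.0)     # each column sums to 1
```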
Stochastic matrices and their eigenvalues
I would like to prove that λ = 1 is always an eigenvalue of a stochastic matrix. Consider again a left stochastic matrix A. Since the elements of each column of A add up to 1, the elements of each row of A^T add up to 1. Therefore,
A^T [1, 1, …, 1]^T = [1, 1, …, 1]^T = 1 · [1, 1, …, 1]^T,
so 1 is an eigenvalue of A^T. As already shown, the eigenvalues of A and A^T are the same, which implies that 1 is also an eigenvalue of A. Since det(A − I) = 0, the matrix A − I is singular, which means that there is a vector x for which (A − I)x = 0 ⇒ Ax = x. A vector of the nullspace of A − I is the eigenvector of A that corresponds to the eigenvalue λ = 1.
Stochastic matrices and their eigenvalues cont.
I would now like to prove that the eigenvalues of a stochastic matrix have magnitude smaller than or equal to 1. Assume that v = [v_1, v_2, …, v_n]^T is an eigenvector of a right stochastic matrix A with an associated eigenvalue |λ| > 1. Then Av = λv implies Σ_{j=1}^n A_ij v_j = λ v_i. Writing v_max for the largest magnitude |v_j|, we see that
|Σ_{j=1}^n A_ij v_j| ≤ Σ_{j=1}^n A_ij |v_j| ≤ v_max Σ_{j=1}^n A_ij = v_max,
i.e., all the elements of the vector Av have magnitude at most v_max. But if |λ| > 1, at least one element of the vector λv (which is equal to Av) would have magnitude |λ| v_max > v_max. Based on the above, the assumption of an eigenvalue of magnitude larger than 1 cannot be valid. For left stochastic matrices we use A^T in the proof above, together with the fact that the eigenvalues of A and A^T are the same.
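Both spectral facts — λ = 1 is present, and no eigenvalue exceeds magnitude 1 — can be checked on a small example (a sketch; the 3×3 left stochastic matrix below is an arbitrary illustration):

```python
import numpy as np

# Sketch: a left stochastic matrix has lambda = 1 (the all-ones vector is an
# eigenvector of A^T), and every eigenvalue has magnitude at most 1.
A = np.array([[0.5, 0.2, 0.1],
              [0.3, 0.7, 0.4],
              [0.2, 0.1, 0.5]])
assert np.allclose(A.sum(axis=0), 1.0)        # left stochastic: columns sum to 1

ones = np.ones(3)
assert np.allclose(A.T @ ones, 1.0 * ones)    # so 1 is an eigenvalue of A^T (and of A)

w = np.linalg.eigvals(A)
assert np.any(np.isclose(w, 1.0))             # lambda = 1 appears
assert np.all(np.abs(w) <= 1.0 + 1e-12)       # all magnitudes at most 1
```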
An application of stochastic matrices: first order systems
Consider again the first order system described by an equation of the form u_k = A^k u_0, where A is now a stochastic matrix. Previously, we managed to write u_k = A^k u_0 = c_1 λ_1^k x_1 + c_2 λ_2^k x_2 + …, where λ_i and x_i are the eigenvalues and eigenvectors of the matrix A, respectively. Note that the above relationship requires a complete set of eigenvectors. If λ_1 = 1 and |λ_i| < 1 for i > 1, then the steady state of the system is c_1 x_1 (where c_1 is part of the expansion of the initial condition u_0).
I will use an example where A is a 2×2 matrix. Generally, an n×n stochastic matrix is related to n "states". Assume that the 2 "states" are 2 UK cities; I take London and Oxford. I am interested in the populations of the two cities and how they evolve, and I assume that people who inhabit these two cities move between them only:
[u_ox, u_lon]^T at t = k+1 equals A [u_ox, u_lon]^T at t = k.
The column elements of the matrix A are positive and add up to 1 because they represent probabilities.
Application of stochastic matrices (cont.)
Starting from an initial population vector [u_ox, u_lon]^T at k = 0, one application of the matrix gives the populations at k = 1. What is the population of the two cities after a long time? The eigenvalues of the system's matrix are λ_1 = 1 and λ_2 = 0.7. (Notice that once λ_1 = 1 is known, the second eigenvalue is found from the trace of the matrix.) The eigenvectors of this matrix are x_1 and x_2 = [−1, 1]^T, so
[u_ox, u_lon]^T at step k equals c_1 · 1^k · x_1 + c_2 · (0.7)^k · [−1, 1]^T,
where c_1 and c_2 are found from the initial condition. As k → ∞ the second term vanishes, and the populations tend to the steady state c_1 x_1.
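A sketch of the two-city model: the slide's specific transition matrix is not reproduced in this text, so the matrix and initial populations below are assumed illustrations, chosen to be consistent with the stated eigenvalues 1 and 0.7 and the eigenvector [−1, 1]^T.

```python
import numpy as np

# Sketch (assumed example): a left stochastic transition matrix with
# eigenvalues 1 and 0.7. Repeated application drives any starting population
# toward the lambda = 1 eigenvector direction.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
w = np.sort(np.linalg.eigvals(A))
assert np.allclose(w, [0.7, 1.0])

u = np.array([0.0, 1000.0])            # assumed initial populations
for _ in range(200):
    u = A @ u

assert np.isclose(u.sum(), 1000.0)             # total population is conserved
steady = 1000.0 * np.array([2.0, 1.0]) / 3.0   # lambda = 1 eigenvector, scaled
assert np.allclose(u, steady)
```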
Application of stochastic matrices (cont.)
Stochastic models facilitate the modeling of various real-life engineering applications. An example is the modeling of the movement of people without gain or loss: the total number of people is conserved.
Symmetric matrices
In this lecture we will be interested in symmetric matrices. For real matrices, symmetry is defined as A = A^T. For complex matrices, the analogous property is (A^*)^T = A; a matrix which possesses this property is called Hermitian, and we can also use the symbol A^H = (A^*)^T. We will prove that the eigenvalues of a symmetric matrix are real. The eigenvectors of a symmetric matrix can be chosen to be orthogonal; if we also choose them to have magnitude 1, then the eigenvectors form an orthonormal set. Moreover, the eigenvectors of a symmetric matrix that correspond to different eigenvalues are automatically orthogonal (the proof is given in a subsequent slide). For a general matrix with independent eigenvectors we have A = SΛS^{−1}; for a symmetric matrix with orthonormal eigenvectors, collected in the matrix Q, we have A = QΛQ^{−1} = QΛQ^T.
Real symmetric matrices
Problem: Prove that the complex eigenvalues of a real matrix occur in complex conjugate pairs.
Solution: Consider Ax = λx. If we take complex conjugates on both sides, we get (Ax)^* = (λx)^* ⇒ A^* x^* = λ^* x^*. If A is real, then A x^* = λ^* x^*. Therefore, if λ is an eigenvalue of A with corresponding eigenvector x, then λ^* is also an eigenvalue of A, with corresponding eigenvector x^*.
Real symmetric matrices cont.
Problem: Prove that the eigenvalues of a symmetric matrix are real.
Solution: We proved that if A is real, then A x^* = λ^* x^*. If we take transposes on both sides, we get x^{*T} A^T = λ^* x^{*T}, and since A^T = A, x^{*T} A = λ^* x^{*T}. We now multiply both sides from the right by x and get x^{*T} A x = λ^* x^{*T} x. We take now Ax = λx and multiply both sides from the left by x^{*T}, which gives x^{*T} A x = λ x^{*T} x. From the above we see that λ x^{*T} x = λ^* x^{*T} x, and since x^{*T} x ≠ 0, we conclude that λ = λ^*.
Real symmetric matrices cont.
Problem: Prove that the eigenvectors of a symmetric matrix which correspond to different eigenvalues are always perpendicular.
Solution: Suppose that Ax = λ_1 x and Ay = λ_2 y with λ_1 ≠ λ_2. Then
λ_1 x^T y = (λ_1 x)^T y = (Ax)^T y = x^T A^T y = x^T A y = x^T λ_2 y = λ_2 x^T y.
The conditions λ_1 x^T y = λ_2 x^T y and λ_1 ≠ λ_2 give x^T y = 0: the eigenvectors x and y are perpendicular.
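All three properties of real symmetric matrices can be confirmed at once with `numpy.linalg.eigh`, which is specialized for symmetric/Hermitian input and returns an orthonormal eigenvector basis (a sketch, reusing the earlier 2×2 example):

```python
import numpy as np

# Sketch: for a real symmetric matrix, eigh returns real eigenvalues and an
# orthonormal eigenvector matrix Q with A = Q Lambda Q^T; eigenvectors of
# distinct eigenvalues are perpendicular.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
w, Q = np.linalg.eigh(A)

assert np.all(np.isreal(w))                          # real eigenvalues
assert np.allclose(Q.T @ Q, np.eye(2))               # orthonormal columns
assert np.allclose(A, Q @ np.diag(w) @ Q.T)          # A = Q Lambda Q^T
assert np.isclose(Q[:, 0] @ Q[:, 1], 0.0)            # distinct-lambda eigenvectors perpendicular
```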
Complex matrices. Complex symmetric matrices.
Let us find which complex matrices have real eigenvalues and orthogonal eigenvectors. Consider Ax = λx, with A possibly complex. If we take complex conjugates on both sides, we get (Ax)^* = (λx)^* ⇒ A^* x^* = λ^* x^*. If we take transposes on both sides, we get x^{*T} (A^*)^T = λ^* x^{*T}. We now multiply both sides from the right by x and get x^{*T} (A^*)^T x = λ^* x^{*T} x. We take now Ax = λx and multiply both sides from the left by x^{*T}, which gives x^{*T} A x = λ x^{*T} x. From the above we see that if (A^*)^T = A, then λ x^{*T} x = λ^* x^{*T} x, and since x^{*T} x ≠ 0, we conclude that λ = λ^*. A matrix with (A^*)^T = A is called Hermitian.
Complex vectors and matrices
Consider a complex column vector z = [z_1, z_2, …, z_n]^T. Its squared length is z^{*T} z = Σ_{i=1}^n |z_i|^2. As already mentioned, when we both transpose and conjugate we can use the symbol z^H = z^{*T} (Hermitian transpose). The inner product of two complex vectors is y^{*T} x = y^H x. For complex matrices, symmetry is defined as (A^*)^T = A; as already mentioned, these are called Hermitian matrices. They have real eigenvalues and perpendicular eigenvectors. If the eigenvectors q_i are complex, we check their lengths using q_i^{*T} q_i, and orthonormality of the eigenvector matrix reads Q^{*T} Q = I.
Example: Consider the matrix A = [[2, 3 + i], [3 − i, 5]]. The eigenvalues are found from:
(2 − λ)(5 − λ) − (3 + i)(3 − i) = 0 ⇒ λ^2 − 7λ + 10 − (9 − 3i + 3i − i^2) = 0 ⇒ λ^2 − 7λ = 0 ⇒ λ(λ − 7) = 0.
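The Hermitian example above can be checked directly — the eigenvalues come out real (0 and 7), matching λ(λ − 7) = 0 (a sketch):

```python
import numpy as np

# Sketch: the Hermitian matrix from the slide; eigvalsh returns its
# (necessarily real) eigenvalues in ascending order.
A = np.array([[2.0,        3.0 + 1.0j],
              [3.0 - 1.0j, 5.0       ]])
assert np.allclose(A, A.conj().T)       # Hermitian: conjugate transpose equals A

w = np.linalg.eigvalsh(A)
assert np.allclose(w, [0.0, 7.0])       # roots of lambda(lambda - 7) = 0
```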
Eigenvalue signs
We proved that:
- The eigenvalues of a symmetric matrix, whether real symmetric or complex Hermitian, are real.
- The eigenvectors of a symmetric matrix can be chosen to be orthogonal.
- The eigenvectors of a symmetric matrix that correspond to different eigenvalues are orthogonal.
Do not forget the definition of symmetry for complex matrices, (A^*)^T = A. It can also be proven that the signs of the pivots of a symmetric matrix are the same as the signs of its eigenvalues. Just to remind you: product of pivots = product of eigenvalues = determinant.