Presentation transcript:

Day 2 Eigenvectors. In this shear mapping of the Mona Lisa, the picture was deformed in such a way that its central vertical axis (red vector) has not changed direction, but the diagonal vector (blue) has changed direction. Hence the red vector is an eigenvector of the transformation and the blue vector is not. Since the red vector was neither stretched nor compressed, its eigenvalue is 1. All vectors with the same vertical direction (i.e., parallel to this vector) are also eigenvectors, with the same eigenvalue. Together with the zero vector, they form the eigenspace for this eigenvalue.

Similar Matrices. Two square matrices A and B that are related by B = X⁻¹AX, where X is a square nonsingular matrix, are said to be similar. A transformation of the form X⁻¹AX is called a similarity transformation, or conjugation by X.

S⁻¹AS = Λ. Suppose an n × n matrix A has n independent eigenvectors. Put those vectors in the columns of a matrix called S (the eigenvector matrix). If you multiply:

AS = A[x₁ x₂ … xₙ] = [λ₁x₁ λ₂x₂ … λₙxₙ] = [x₁ x₂ … xₙ]Λ, where Λ = diag(λ₁, λ₂, …, λₙ).

Hence AS = SΛ, or S⁻¹AS = Λ. Λ (capital lambda) is the eigenvalue matrix, with the eigenvalues on its diagonal.
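A minimal numerical sketch of this construction. The 2 × 2 matrix here is an arbitrary example chosen for illustration (the slides' own matrices are not reproduced in the transcript):

```python
import numpy as np

# An arbitrary 2x2 example matrix with two independent eigenvectors.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix S whose columns
# are the eigenvectors x_1, ..., x_n.
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)  # the eigenvalue matrix, capital lambda

# AS = S Lam, equivalently S^{-1} A S = Lam
assert np.allclose(A @ S, S @ Lam)
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)
```

For this example the eigenvalues come out as 2 and 5, and both identities hold to floating-point precision.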

S⁻¹AS = Λ. Recall from our discussion of similar matrices that A and Λ are similar matrices: they represent the same transformation with respect to a different basis. We will use this to find out about powers (and later exponentials) of a matrix.

Diagonal Matrix D. Theorem 7.4: An n × n matrix A is similar to a diagonal matrix if and only if A has n linearly independent eigenvectors. The elements on the diagonal are the eigenvalues of A.

Diagonalization problem 1: Diagonalize the matrix if possible.

Diagonalization problem 1 solution. Note: A = SΛS⁻¹.

Diagonalize the matrix if possible

When can we diagonalize a matrix? If a matrix has n independent eigenvectors, then we can diagonalize it (Th. 7.4). This is always possible when there are n distinct eigenvalues (Th. 7.5). If there are repeated eigenvalues, we may or may not find n independent eigenvectors.
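The repeated-eigenvalue case can be sketched with two illustrative matrices (my own examples, not from the slides): both have λ = 2 twice, but 2I still has two independent eigenvectors, while the matrix [[2, 1], [0, 2]] has only one, so it cannot be diagonalized. The dimension of each eigenspace is n − rank(A − λI):

```python
import numpy as np

# Both matrices have the repeated eigenvalue lambda = 2.
A_diag = np.array([[2.0, 0.0],
                   [0.0, 2.0]])      # 2I: still has 2 independent eigenvectors
A_defective = np.array([[2.0, 1.0],
                        [0.0, 2.0]]) # only 1 independent eigenvector

def geometric_multiplicity(A, lam):
    """Dimension of the eigenspace for lam: n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

print(geometric_multiplicity(A_diag, 2.0))       # 2 -> diagonalizable
print(geometric_multiplicity(A_defective, 2.0))  # 1 -> not diagonalizable
```

When the geometric multiplicities of all eigenvalues sum to n, the matrix is diagonalizable; otherwise it is defective.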

What are the eigenvalues and eigenvectors of A², A³? If Ax = λx, multiply both sides of the equation by A: A²x = Aλx = λAx = λ²x. This tells us that the eigenvectors of A satisfy A²x = λ²x. If we square A, then we square the eigenvalues of A, but the eigenvectors of A² are the same as the eigenvectors of A.

We can get the same information from the formula S⁻¹AS = Λ: S⁻¹AS = Λ, so AS = SΛ, so A = SΛS⁻¹. Therefore A² = SΛS⁻¹SΛS⁻¹ = SΛ²S⁻¹. Note this is telling us the same information that we found before.

Raise a matrix to a power: applications. Transition matrices from one state to another. Waiting time / queuing theory. Studying long-term behavior. Markov matrices.

Raise a matrix to a power. In a previous slide we were able to diagonalize the matrix. Find A⁴.

Aᵏ = SΛᵏS⁻¹. To find A⁴, multiply SΛ⁴S⁻¹. We usually do not use this equation in this form; however, it demonstrates that diagonalization tells us about powers of a matrix. Note: we found the eigenvalues and eigenvectors on slide 7.
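Since the slide's own matrix is not reproduced in the transcript, here is the same computation on an assumed example matrix. Raising Λ to a power just raises each diagonal eigenvalue to that power:

```python
import numpy as np

# Assumed example matrix (the slide's matrix is not in the transcript).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)

# A^4 = S Lam^4 S^{-1}: power the eigenvalues, keep the eigenvectors.
A4_via_diag = S @ np.diag(eigvals ** 4) @ np.linalg.inv(S)

# Check against direct repeated multiplication.
assert np.allclose(A4_via_diag, np.linalg.matrix_power(A, 4))
```

For large k this is far cheaper than k matrix multiplications, which is why diagonalization is the standard tool for studying powers of a matrix.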

What eigenvectors and eigenvalues tell us about the powers of a matrix. When do the powers of a matrix go to zero, Aᵏ → 0? What would be true about A? Recall: A = SΛS⁻¹, so Aᵏ = SΛᵏS⁻¹. A matrix whose powers go to zero as k → ∞ is called stable.

A system that approaches zero as t → ∞ is said to be stable. A difference equation is stable if the absolute value of each eigenvalue is less than one. Recall: the eigenvalues may be complex. Note: this approach only works if there are n independent eigenvectors.
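A quick numerical check of the stability criterion, using an assumed example matrix whose eigenvalues (0.6 and 0.3) both lie inside the unit circle:

```python
import numpy as np

# Assumed example: eigenvalues are 0.6 and 0.3, both with |lambda| < 1,
# so the powers of A should decay to the zero matrix.
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])

eigvals = np.linalg.eigvals(A)
assert np.all(np.abs(eigvals) < 1)  # the stability criterion

# A^50 is essentially the zero matrix.
assert np.allclose(np.linalg.matrix_power(A, 50), 0)
```

If any eigenvalue had absolute value greater than one, the corresponding term c λᵏ x would blow up instead.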

A system that approaches a (nonzero) constant as t → ∞ is said to reach a steady state. What are the eigenvalues for a matrix that describes a transformation that will reach a steady state regardless of the initial conditions? What assumptions are built into this?

A system that approaches a (nonzero) constant as t → ∞ is said to reach a steady state. What are the eigenvalues for a matrix that describes a transformation that will reach a steady state regardless of the initial conditions? One or more eigenvalues equal to 1, while the others have absolute value less than 1. What assumptions are built into this? n independent eigenvectors.
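A hedged sketch of this steady-state behavior, using an assumed 2 × 2 column-stochastic (Markov) matrix: its eigenvalues are exactly 1 and 0.7, so repeated application settles onto the λ = 1 eigenvector no matter where we start:

```python
import numpy as np

# Assumed example: a column-stochastic Markov matrix with
# eigenvalues 1.0 and 0.7.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

eigvals = np.linalg.eigvals(A)
assert np.isclose(np.max(np.abs(eigvals)), 1.0)

u0 = np.array([1.0, 0.0])
steady = np.linalg.matrix_power(A, 200) @ u0

# The limit is the lambda = 1 eigenvector scaled to have the same
# total as u0: here (2/3, 1/3).
assert np.allclose(steady, [2/3, 1/3])
```

The 0.7ᵏ component dies out, which is exactly the "other eigenvalues have absolute value less than 1" condition above.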

How does this usually get applied? uₖ₊₁ = Auₖ with starting condition u₀: u₁ = Au₀, u₂ = A²u₀, u₃ = A³u₀, …, uₖ = Aᵏu₀. To solve, write u₀ as a combination of eigenvectors (the x's are eigenvectors): u₀ = c₁x₁ + c₂x₂ + … + cₙxₙ. Multiply both sides of the equation by A: Au₀ = c₁λ₁x₁ + c₂λ₂x₂ + … + cₙλₙxₙ. Repeating k times: Aᵏu₀ = c₁λ₁ᵏx₁ + c₂λ₂ᵏx₂ + … + cₙλₙᵏxₙ.

Key formula for difference equations: uₖ = Aᵏu₀ = c₁λ₁ᵏx₁ + … + cₙλₙᵏxₙ. Note: this formula is based on having n independent eigenvectors. If that is not true, then this approach cannot be used.

The Fibonacci Sequence. The sequence 0, 1, 1, 2, 3, 5, 8, … is called the Fibonacci sequence. Each term is found by adding the two previous terms. Equation: y(k+2) = y(k+1) + y(k). How fast is this sequence growing? How could we approximate the 100th term of the sequence?

We need to change this into matrix language. Equation: y(n+2) = y(n+1) + y(n). Trick: create another equation, y(n+1) = y(n+1). This system can be expressed as the matrix system u(n+1) = Au(n), where u(n) = [y(n+1), y(n)]ᵀ and A = [[1, 1], [1, 0]].

Find the determinant of A − λI:

det [[1−λ, 1], [1, −λ]] = λ² − λ − 1 = 0. By the quadratic formula, λ₁ = ½(1 + √5) ≈ 1.618 and λ₂ = ½(1 − √5) ≈ −0.618. What do the eigenvalues tell us about the growth of the Fibonacci sequence?
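This characteristic polynomial can be confirmed numerically from the Fibonacci matrix itself; the eigenvalues satisfy λ₁ + λ₂ = 1 (the trace) and λ₁λ₂ = −1 (the determinant):

```python
import numpy as np

# The Fibonacci companion matrix: [y(n+2), y(n+1)] = A [y(n+1), y(n)].
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

lam2, lam1 = np.sort(np.linalg.eigvals(A))

assert np.isclose(lam1, (1 + np.sqrt(5)) / 2)  # ~ 1.618, the golden ratio
assert np.isclose(lam2, (1 - np.sqrt(5)) / 2)  # ~ -0.618

# The dominant eigenvalue ~1.618 is the growth factor of the sequence:
# each Fibonacci number is roughly 1.618 times the previous one.
```

The dominant eigenvalue being larger than 1 in absolute value is what makes the sequence grow, and it grows by a factor of about 1.618 per step.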

Fibonacci solution. λ² − λ − 1 = 0. Find the eigenvectors. Recall: λ² − λ − 1 = 0 for these two values.

Fibonacci solution. To find the eigenvectors, recall: λ² − λ − 1 = 0 for these two values.

Fibonacci solution. Use our key equation: uₖ = c₁λ₁ᵏx₁ + c₂λ₂ᵏx₂. Plug in λ₁ ≈ 1.618 and λ₂ ≈ −0.618 and the corresponding eigenvectors for x₁ and x₂. Then use the initial conditions to find c₁ and c₂.

Fibonacci solution. A¹⁰⁰u₀ = c₁(½(1 + √5))¹⁰⁰x₁ + c₂(½(1 − √5))¹⁰⁰x₂. x₁ and x₂ are the eigenvectors; by inspection they are x₁ = [λ₁, 1]ᵀ and x₂ = [λ₂, 1]ᵀ. Recall: λ² − λ − 1 = 0 for these two values.

Fibonacci solution. Aᵏu₀ = c₁λ₁ᵏx₁ + c₂λ₂ᵏx₂, with x₁ = [λ₁, 1]ᵀ, x₂ = [λ₂, 1]ᵀ, λ₁ = ½(1 + √5), λ₂ = ½(1 − √5). Now use the initial condition u₀ = [1, 0]ᵀ to find c₁ and c₂: u₀ = c₁x₁ + c₂x₂ gives λ₁c₁ + λ₂c₂ = 1 and c₁ + c₂ = 0, so c₁ = √5/5 and c₂ = −√5/5.
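Reading off the second component of uₖ gives the Binet-style closed form Fₖ = (λ₁ᵏ − λ₂ᵏ)/√5, which we can sanity-check against stepping the recurrence directly:

```python
import numpy as np

# Closed form from the eigenvector expansion: c1 = -c2 = 1/sqrt(5),
# so F_k = (lam1^k - lam2^k) / sqrt(5).
lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2

def fib_closed(k):
    return (lam1 ** k - lam2 ** k) / np.sqrt(5)

# Step the recurrence with exact integers: a = F_30 after 30 steps.
a, b = 0, 1
for _ in range(30):
    a, b = b, a + b

assert round(fib_closed(30)) == a  # both give F_30 = 832040

# The 100th term is then a single evaluation (accurate to float precision).
print(f"F_100 is roughly {fib_closed(100):.3e}")
```

Since |λ₂| < 1, the λ₂ᵏ term vanishes rapidly and Fₖ is essentially λ₁ᵏ/√5, which answers the slide's question about approximating the 100th term.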

Homework wkst 7.2 p all Customer: "How much is a large order of Fibonachos?" Cashier: "It's the price of a small order plus the price of a medium order."
