
The QR iteration for eigenvalues

The intention of the algorithm is to perform a sequence of similarity transformations on a real matrix so that the limit is a triangular matrix. If this were possible, the eigenvalues would be exactly the diagonal elements of the limit.

But it may not be possible, since:

Real matrices may have complex eigenvalues, and
All of the arithmetic in the algorithm is real.

A sequence of real numbers cannot converge to anything other than real numbers. That is, it is impossible for the limit to have entries with non-zero imaginary parts. If any eigenvalues have non-zero imaginary parts, the sequence will not converge to them.
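To see the obstruction concretely, here is a small NumPy sketch (the rotation-like 2 by 2 example matrix is my own choice, not from the slides). Its eigenvalues are the pair ±i, and the real QR iterates never drive the subdiagonal entry toward zero:

```python
import numpy as np

# A real matrix with eigenvalues +-i. A real QR iterate can never
# converge to a triangular limit here, because the diagonal of that
# limit would have to hold non-real numbers.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(A))   # the complex conjugate pair +-i

for _ in range(100):          # unshifted QR steps: A <- R Q
    Q, R = np.linalg.qr(A)
    A = R @ Q

print(A[1, 0])                # magnitude stays 1; no convergence
```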

Are we dead? Nope, but we have to modify our expectations.

Instead of the limit being an upper triangular matrix, it is block upper triangular. The blocks are 2 by 2, and the eigenvalues we want are the complex conjugate pairs of eigenvalues of the blocks. This actually presents no major trouble.
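When the iteration settles to a 2 by 2 diagonal block, its complex conjugate eigenvalue pair can be read off from the block's trace and determinant via the quadratic formula. A minimal sketch (the helper name and the example block are my own, not from the slides):

```python
import numpy as np

def block_eigs(B):
    """Eigenvalues of a real 2x2 block from trace t and determinant d:
    lambda = (t +- sqrt(t^2 - 4d)) / 2, a complex conjugate pair
    whenever the discriminant t^2 - 4d is negative."""
    t = B[0, 0] + B[1, 1]
    d = B[0, 0] * B[1, 1] - B[0, 1] * B[1, 0]
    s = np.sqrt(complex(t * t - 4.0 * d))
    return (t + s) / 2.0, (t - s) / 2.0

# trace 2, determinant 5, discriminant -16: eigenvalues 1 + 2i, 1 - 2i
B = np.array([[3.0, -2.0],
              [4.0, -1.0]])
print(block_eigs(B))
```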

So this is the algorithm in a mathematical form (as opposed to a form representing what happens in storage):

0. Set A_1 = A
For k = 1, 2, …
  1. Do a QR factorization of A_k: A_k = Q_k R_k
  2. Set A_{k+1} = R_k Q_k

This is the algorithm in a programming form:

For k = 1, 2, …
  1. Do a QR factorization of A: A → QR
  2. Set A ← RQ
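The programming form above can be sketched in a few lines of NumPy (a minimal, unshifted version; the symmetric test matrix and the iteration count are my own choices, not from the slides). A symmetric matrix has real eigenvalues, so the iterates tend to a nearly diagonal matrix whose diagonal holds the eigenvalues:

```python
import numpy as np

def qr_iteration(A, steps=200):
    """Unshifted QR iteration: repeatedly factor A = QR and set A <- RQ.
    Each step is an orthogonal similarity, so eigenvalues are preserved."""
    A = np.array(A, dtype=float)   # work on a copy
    for _ in range(steps):
        Q, R = np.linalg.qr(A)
        A = R @ Q
    return A

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
T = qr_iteration(A)
print(np.sort(np.diag(T)))            # close to the eigenvalues of A
print(np.sort(np.linalg.eigvals(A)))  # reference values
```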

Since A_k = Q_k R_k,

  Q_k^T A_k = Q_k^T Q_k R_k = R_k,

but then

  A_{k+1} = R_k Q_k = Q_k^T A_k Q_k,

and since Q_k is orthogonal, Q_k^T = Q_k^{-1}, so

  A_{k+1} = Q_k^{-1} A_k Q_k.
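The algebra above is easy to check numerically. A small NumPy sketch (the 3 by 3 symmetric test matrix is my own choice) verifying that one QR step R Q equals the orthogonal similarity Q^T A Q, and hence preserves the eigenvalues:

```python
import numpy as np

# Verify the identity behind one QR step: if A = Q R, then
# R Q = Q^T A Q, an orthogonal similarity transformation of A.
A = np.array([[5.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = np.linalg.qr(A)
A_next = R @ Q

print(np.allclose(A_next, Q.T @ A @ Q))   # True: R Q = Q^T A Q
print(np.allclose(np.linalg.eigvalsh(A),
                  np.linalg.eigvalsh(A_next)))   # True: same eigenvalues
```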

So A_{k+1} is similar to A_k, which is similar to A_{k-1}, which is similar to A_{k-2}, …, which is similar to A_1 = A. We have a sequence of similar matrices A_1, A_2, A_3, … tending to a block triangular matrix whose eigenvalues are easy to obtain.

Not only are the matrices in the sequence similar, they are orthogonally similar: the similarity transformation is orthogonal. Since orthogonal matrices preserve lengths, this means:

The matrices of the sequence do not get very large or very small, and
The computations are done more accurately.
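The length-preservation claim can be checked directly: orthogonal similarity leaves the Frobenius norm unchanged, so the iterates never blow up or shrink. A sketch (the random test matrix and step count are my own choices):

```python
import numpy as np

# Orthogonal similarity preserves the Frobenius norm, so all the
# iterates A_1, A_2, ... have the same "size".
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
norm0 = np.linalg.norm(A, 'fro')

for _ in range(50):
    Q, R = np.linalg.qr(A)
    A = R @ Q

print(np.isclose(np.linalg.norm(A, 'fro'), norm0))   # True
```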

Let’s see the algorithm in action. The sizes of the entries will be indicated by color. Since what will be interesting is seeing the subdiagonal components get smaller, we will use a logarithmic scale that emphasizes small numbers.

1. (Unshifted) QR
2. Corner-shifted QR
3. Double-shift QR