cs542g-term1-2006: The Symmetric Eigenproblem (lecture slides transcript)

Slide 1: Notes

- In assignment 1, problem 2: smoothness = number of times differentiable.

Slide 2: The Symmetric Eigenproblem

- Assume A is symmetric and real.
- Find an orthogonal matrix V and a diagonal matrix D such that AV = VD.
  - The diagonal entries of D are the eigenvalues; the corresponding columns of V are the eigenvectors.
- Equivalently: A = V D V^T, or V^T A V = D.
- There are a few strategies; there are more if you only care about a few eigenpairs rather than the complete set.
- Also: finding the eigenvalues of an n×n matrix is equivalent to solving a degree-n polynomial. There is no "analytic" solution in general for n ≥ 5, so general algorithms are iterative.
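The definitions above can be checked numerically. This is a minimal sketch using NumPy (my choice of tool; the slides don't prescribe one): `eigh` is the symmetric/Hermitian eigensolver, and the test matrix is an arbitrary random symmetric one.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                       # a random real symmetric matrix

w, V = np.linalg.eigh(A)                # eigenvalues w, orthogonal eigenvectors V
D = np.diag(w)

assert np.allclose(A @ V, V @ D)        # AV = VD
assert np.allclose(A, V @ D @ V.T)      # A = V D V^T
assert np.allclose(V.T @ A @ V, D)      # V^T A V = D
assert np.allclose(V.T @ V, np.eye(4))  # V is orthogonal
```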

Slide 3: The Jacobi Algorithm

- One of the oldest numerical methods, but still of interest.
- Start with an initial guess at V (e.g. V = I); set A = V^T A V.
- For k = 1, 2, ...:
  - If A is close enough to diagonal (Frobenius norm of the off-diagonal part tiny relative to A), stop.
  - Find a Givens rotation Q that solves a 2×2 subproblem, either zeroing the maximum off-diagonal entry or sweeping in order through the matrix.
  - Update V = VQ and A = Q^T A Q.
- Typically O(log n) sweeps, with O(n^2) rotations each.
- Quadratic convergence in the limit.
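A minimal sketch of the cyclic-sweep variant described above (the function name and looping details are my own, and this builds full n×n rotation matrices for clarity rather than speed):

```python
import numpy as np

def jacobi_eigen(A, tol=1e-12, max_sweeps=50):
    """Cyclic Jacobi: repeatedly zero off-diagonal entries with Givens rotations."""
    A = np.array(A, dtype=float)   # work on a copy
    n = A.shape[0]
    V = np.eye(n)                  # initial guess V = I
    for _ in range(max_sweeps):
        off = np.sqrt(max(np.sum(A * A) - np.sum(np.diag(A) ** 2), 0.0))
        if off <= tol * np.linalg.norm(A):   # off-diagonal tiny relative to A
            break
        for p in range(n - 1):               # sweep in order through the matrix
            for q in range(p + 1, n):
                if A[p, q] == 0.0:
                    continue
                # Givens rotation solving the 2x2 subproblem: zero A[p, q].
                theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                Q = np.eye(n)
                Q[p, p] = Q[q, q] = c
                Q[p, q], Q[q, p] = s, -s
                A = Q.T @ A @ Q              # A <- Q^T A Q
                V = V @ Q                    # V <- V Q
    return np.diag(A), V
```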

Slide 4: The Power Method

- Start with some random vector v with ||v||_2 = 1.
- Iterate v = (Av) / ||Av||.
- What happens? How fast?
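A sketch answering the slide's questions in code: v converges (generically) to the eigenvector of the largest-magnitude eigenvalue, with the error shrinking by a factor of about |λ2/λ1| per iteration. Function name and defaults are my own.

```python
import numpy as np

def power_method(A, iters=200, seed=0):
    """Power method: returns a Rayleigh-quotient eigenvalue estimate and the vector."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)        # start with a random unit vector
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)    # v = (Av) / ||Av||
    return v @ A @ v, v           # Rayleigh quotient estimate of the eigenvalue
```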

Slide 5: Shift and Invert (Rayleigh Iteration)

- Say we know the eigenvalue we want is approximately σ.
- The matrix (A - σI)^(-1) has the same eigenvectors as A, but its eigenvalues are 1/(λ_i - σ).
- Use (A - σI)^(-1) in place of A in the power method.
- Even better, update the guess each iteration with the Rayleigh quotient: σ = v^T A v.
- This gives cubic convergence! (The number of significant digits triples each iteration when converging.)
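A minimal sketch of Rayleigh quotient iteration as described above (function name and defaults are my own; each step solves with A - σI instead of explicitly inverting it):

```python
import numpy as np

def rayleigh_iteration(A, sigma, iters=10, seed=1):
    """Shift-and-invert power method, with sigma re-estimated each step."""
    n = A.shape[0]
    v = np.random.default_rng(seed).standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        try:
            # One power-method step on (A - sigma I)^(-1), via a linear solve.
            v = np.linalg.solve(A - sigma * np.eye(n), v)
        except np.linalg.LinAlgError:
            break   # sigma hit an eigenvalue exactly: v is already converged
        v /= np.linalg.norm(v)
        sigma = v @ A @ v   # update the eigenvalue guess (Rayleigh quotient)
    return sigma, v
```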

Slide 6: Maximality and Orthogonality

- Unit eigenvectors v_1 of the maximum-magnitude eigenvalue satisfy v_1 = argmax over ||v|| = 1 of |v^T A v|.
- Unit eigenvectors v_k of the k'th eigenvalue satisfy the same maximization restricted to unit vectors orthogonal to v_1, ..., v_(k-1).
- Can pick the eigenpairs off one by one this way, or...
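A numerical spot-check of the maximality claim, specialized to the largest eigenvalue: over random unit vectors u, the Rayleigh quotient u^T A u never exceeds the value attained by the corresponding eigenvector. The matrix, seed, and sample count are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                 # random symmetric test matrix
w, V = np.linalg.eigh(A)
v1 = V[:, -1]                     # unit eigenvector of the largest eigenvalue
best = v1 @ A @ v1                # equals w[-1]
for _ in range(1000):
    u = rng.standard_normal(5)
    u /= np.linalg.norm(u)        # random unit vector
    assert u @ A @ u <= best + 1e-12
```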

Slide 7: Orthogonal Iteration

- Solve for lots (or all) of the eigenvectors simultaneously.
- Start with an initial guess V with orthonormal columns.
- For k = 1, 2, ...:
  - Z = AV
  - VR = Z (QR decomposition: orthogonalize Z)
- Easy, but slow: linear convergence, and nearby eigenvalues slow things down a lot.
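A minimal sketch of the two-line loop above, i.e. a block power method with QR re-orthogonalization each step (function name and defaults are my own):

```python
import numpy as np

def orthogonal_iteration(A, k, iters=500, seed=3):
    """Iterate Z = AV, then VR = Z (QR) to track the dominant k-dim subspace."""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))  # initial guess V
    for _ in range(iters):
        Z = A @ V
        V, _ = np.linalg.qr(Z)   # VR = Z: orthogonalize Z
    return V
```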

Slide 8: Rayleigh-Ritz

- Before going further with the full problem, consider an intermediate problem: find a subset of the eigenpairs, e.g. the largest k or the smallest k.
- Suppose we have an orthogonal n×k estimate V of the eigenvectors.
- Simple Rayleigh estimate of the eigenvalues: diag(V^T A V).
- Rayleigh-Ritz approach: solve the k×k eigenproblem for V^T A V, then use those eigenvalues (the Ritz values) and the associated orthogonal combinations of the columns of V.
- Note: this is another instance of "assume the solution lies in the span of a few basis vectors, then solve a reduced-dimension problem."
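The Rayleigh-Ritz step above fits in a few lines. A sketch (function name is my own; `eigh` solves the small k×k problem):

```python
import numpy as np

def rayleigh_ritz(A, V):
    """Given an orthonormal n-by-k basis V, return Ritz values and Ritz vectors."""
    H = V.T @ A @ V                    # the small k-by-k projected problem
    ritz_vals, Y = np.linalg.eigh(H)   # its eigenvalues are the Ritz values
    return ritz_vals, V @ Y            # orthogonal combinations of columns of V
```

If V exactly spans an invariant subspace, the Ritz pairs are exact eigenpairs; otherwise they are the best approximations available within span(V).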

Slide 9: Solving the Full Problem

- Orthogonal iteration works, but it's slow.
- First speed-up: make A tridiagonal via a sequence of symmetric Householder reflections. Then Z = AV runs in O(n^2) instead of O(n^3).
- Other ingredients:
  - Shifting: if we shift A by an exact eigenvalue, forming A - λI, we get an exact eigenvector out of QR (improves on linear convergence).
  - Deflation (division into blocks): once an off-diagonal entry is almost zero, the problem separates into decoupled blocks that can be solved independently.
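A small numerical illustration of the shifting ingredient, using plain inverse iteration as a stand-in for the shift inside the QR algorithm (my simplification, not the slide's method): with a shift σ essentially equal to an eigenvalue, a couple of solves with A - σI already lock onto the corresponding eigenvector.

```python
import numpy as np

B = np.random.default_rng(4).standard_normal((5, 5))
A = (B + B.T) / 2                        # random symmetric test matrix
w = np.linalg.eigvalsh(A)
sigma = w[0] + 1e-9                      # a (nearly) exact shift
x = np.random.default_rng(6).standard_normal(5)
for _ in range(2):                       # two inverse-iteration steps suffice here
    x = np.linalg.solve(A - sigma * np.eye(5), x)
    x /= np.linalg.norm(x)
rho = x @ A @ x                          # Rayleigh quotient of the result
assert np.linalg.norm(A @ x - rho * x) < 1e-7   # x is (numerically) an eigenvector
assert abs(rho - w[0]) < 1e-7                   # for the eigenvalue we shifted to
```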