Introduction to Numerical Analysis I MATH/CMPSC 455 Conjugate Gradient Methods

A-ORTHOGONAL BASIS

The vectors $e_1, e_2, \dots, e_n$ form a basis of $\mathbb{R}^n$, where $e_i$ is the $i$-th row of the identity matrix. They are orthogonal in the following sense: $e_i^T e_j = 0$ for $i \neq j$. They are linearly independent, and form a basis.

Introduce a set of nonzero vectors $d_0, d_1, \dots, d_{n-1}$. They satisfy the following condition: $d_i^T A d_j = 0$ for $i \neq j$. We say they are A-orthogonal, or conjugate w.r.t. $A$.
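As a quick illustration (a minimal NumPy sketch, not from the slides; the helper name `a_orthogonalize` is ours), one way to produce an A-orthogonal set is Gram-Schmidt in the A-inner product $\langle u, v \rangle_A = u^T A v$:

```python
import numpy as np

def a_orthogonalize(A, V):
    """Make the columns of V pairwise A-orthogonal (conjugate w.r.t. A)
    by Gram-Schmidt in the A-inner product <u, v>_A = u^T A v."""
    D = []
    for v in V.T:
        d = v.astype(float).copy()
        for dj in D:
            d -= ((dj @ A @ d) / (dj @ A @ dj)) * dj   # remove A-component along dj
        D.append(d)
    return np.column_stack(D)

A = np.array([[4.0, 1.0], [1.0, 3.0]])     # symmetric positive definite
D = a_orthogonalize(A, np.eye(2))
print(D[:, 0] @ A @ D[:, 1])               # ~0: the two directions are conjugate
```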

CONJUGATE DIRECTION METHOD

Given an initial guess $x_0$ and A-orthogonal directions $d_0, \dots, d_{n-1}$, iterate
$$x_{k+1} = x_k + \alpha_k d_k, \qquad \alpha_k = \frac{d_k^T r_k}{d_k^T A d_k}, \qquad r_k = b - A x_k.$$

Theorem: For any initial guess $x_0$, the sequence generated by the above iterative method converges to the solution of the linear system in at most $n$ iterations.

Question: How to find an A-orthogonal basis?
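A sketch of this iteration (assuming NumPy and the `a_orthogonalize` helper from the previous sketch; variable names are ours):

```python
import numpy as np

def conjugate_directions(A, b, D, x0=None):
    """Take one exact line-search step along each A-orthogonal column
    of D; in exact arithmetic this reaches the solution in <= n steps."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    for d in D.T:
        r = b - A @ x                        # current residual
        alpha = (d @ r) / (d @ A @ d)        # exact line search along d
        x = x + alpha * d
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
D = a_orthogonalize(A, np.eye(2))            # directions from the sketch above
print(conjugate_directions(A, b, D))         # agrees with np.linalg.solve(A, b)
```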

CONJUGATE GRADIENT METHOD

Answer: Each conjugate direction is chosen to be a linear combination of the current residual and the previous direction:
$$d_k = r_k + \beta_k d_{k-1}, \qquad \beta_k = -\frac{r_k^T A d_{k-1}}{d_{k-1}^T A d_{k-1}},$$
where $\beta_k$ is determined by requiring $d_k$ to be A-orthogonal to $d_{k-1}$.

Conjugate Gradient Method: the conjugate direction method on this particular basis.

CG (ORIGINAL VERSION)

$x_0$ = initial guess, $\; r_0 = b - A x_0$, $\; d_0 = r_0$
While $r_k \neq 0$
$\quad \alpha_k = \dfrac{d_k^T r_k}{d_k^T A d_k}$
$\quad x_{k+1} = x_k + \alpha_k d_k$
$\quad r_{k+1} = b - A x_{k+1}$
$\quad \beta_{k+1} = -\dfrac{r_{k+1}^T A d_k}{d_k^T A d_k}$
$\quad d_{k+1} = r_{k+1} + \beta_{k+1} d_k$
End While
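A runnable NumPy transcription of this loop (a sketch based on the reconstruction above; the function name `cg_original` and the tolerance-based stopping test are ours):

```python
import numpy as np

def cg_original(A, b, x0=None, tol=1e-10, max_iter=None):
    """Original-form CG sketch: the residual is recomputed from b - A x
    and beta uses an explicit A-product, as on the slide above."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    r = b - A @ x
    d = r.copy()
    if max_iter is None:
        max_iter = len(b)                  # n steps suffice in exact arithmetic
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (d @ r) / (d @ Ad)         # exact line search along d
        x = x + alpha * d
        r = b - A @ x                      # residual recomputed directly
        beta = -(r @ Ad) / (d @ Ad)        # keeps d_{k+1} A-orthogonal to d_k
        d = r + beta * d
    return x
```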

Theorem: Let $A$ be a symmetric positive-definite matrix. In the Conjugate Gradient Method, the residuals are pairwise orthogonal, $r_i^T r_j = 0$ for $i \neq j$, and the search directions are pairwise A-orthogonal, $d_i^T A d_j = 0$ for $i \neq j$.
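A quick numerical check of this (our own illustration, not from the slides), running the original-version loop on a random SPD system and inspecting the pairwise inner products of the residuals:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)            # random symmetric positive-definite matrix
b = rng.standard_normal(5)

x = np.zeros(5)
r = b - A @ x
d = r.copy()
residuals = [r.copy()]
for _ in range(5):
    Ad = A @ d
    alpha = (d @ r) / (d @ Ad)
    x = x + alpha * d
    r = b - A @ x
    beta = -(r @ Ad) / (d @ Ad)
    d = r + beta * d
    residuals.append(r.copy())

R = np.array(residuals)
G = R @ R.T                            # Gram matrix of the residuals
print(np.max(np.abs(G - np.diag(np.diag(G)))))   # off-diagonal entries ~0
```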

CG (PRACTICAL VERSION)

$x_0$ = initial guess, $\; r_0 = b - A x_0$, $\; d_0 = r_0$
While $r_k \neq 0$
$\quad \alpha_k = \dfrac{r_k^T r_k}{d_k^T A d_k}$
$\quad x_{k+1} = x_k + \alpha_k d_k$
$\quad r_{k+1} = r_k - \alpha_k A d_k$
$\quad \beta_{k+1} = \dfrac{r_{k+1}^T r_{k+1}}{r_k^T r_k}$
$\quad d_{k+1} = r_{k+1} + \beta_{k+1} d_k$
End While

This version needs only one matrix-vector product $A d_k$ per iteration: the residual is updated recursively rather than recomputed from $b - A x_{k+1}$, and $\beta_{k+1}$ uses residual norms alone.
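A self-contained NumPy sketch of the practical version (the name `cg` and the tolerance-based stopping test are our assumptions):

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Practical CG sketch: one matrix-vector product per iteration;
    the residual is updated recursively, beta from residual norms only."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    r = b - A @ x
    d = r.copy()
    rs = r @ r
    if max_iter is None:
        max_iter = len(b)              # n steps suffice in exact arithmetic
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ad = A @ d                     # the single A-product this iteration
        alpha = rs / (d @ Ad)          # alpha_k = r^T r / d^T A d
        x = x + alpha * d
        r = r - alpha * Ad             # recursive residual update
        rs_new = r @ r
        d = r + (rs_new / rs) * d      # beta_{k+1} = rs_new / rs
        rs = rs_new
    return x
```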

Example:
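A small illustrative run (the data here is our own, chosen for demonstration), using the `cg` sketch defined above:

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # illustrative SPD matrix
b = np.array([1.0, 2.0])

x = cg(A, b)                              # converges in 2 iterations for n = 2
print(x)                                  # [0.0909..., 0.6363...] = [1/11, 7/11]
print(np.allclose(A @ x, b))              # True
```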