QR Factorization – Direct Method to Solve Linear Systems, Problems that Generate Singular Matrices, Modified Gram-Schmidt Algorithm, QR Pivoting

Presentation transcript:

Outline
QR Factorization – a direct method to solve linear systems
– Problems that generate singular matrices
– Modified Gram-Schmidt Algorithm
– QR Pivoting: the matrix must be singular; move the zero column to the end
– Minimization viewpoint → link to iterative non-stationary methods (Krylov subspace)

LU Factorization Fails: Singular Example
[Figure: circuit with nodes v1, v2, v3, v4 and two unit current sources]
The resulting nodal matrix is SINGULAR, but a solution exists!

LU Factorization Fails: Singular Example, One Step of GE
The resulting nodal matrix is SINGULAR, but a solution exists!
Solution (from picture): v4 = -1, v3 = -2, v2 = anything you want (→ ∞ solutions), v1 = v…

QR Factorization – Singular Example
Recall the weighted-sum-of-columns view of a system of equations.
M is singular, but b is in the span of the columns of M.
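In standard notation, writing M_i for the i-th column of M (an assumption about the lecture's notation), the column view referred to here is:

\[ Mx \;=\; x_1 M_1 + x_2 M_2 + \cdots + x_N M_N \;=\; b. \]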

QR Factorization – Key Idea
If M has orthogonal columns: multiply the weighted-columns equation by the i-th column, then simplify using orthogonality.
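In the same column notation, the two steps the slide names are:

\[ M_i^T b \;=\; M_i^T\Big(\sum_{j=1}^{N} x_j M_j\Big) \;=\; x_i\, M_i^T M_i \qquad\Longrightarrow\qquad x_i \;=\; \frac{M_i^T b}{M_i^T M_i}, \]

since orthogonality makes every term with j ≠ i vanish.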

QR Factorization – M Orthonormal
M is orthonormal if its columns are mutually orthogonal and each has unit length, i.e. M^T M = I.
[Figure: picture for the two-dimensional case, non-orthogonal case vs. orthogonal case]

QR Factorization – Key Idea
How do we perform the conversion to orthogonal columns?

QR Factorization – Projection Formula
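A standard way to write the projection step used in Gram-Schmidt, presumably what this slide shows, is to remove from a later column M_j its component along an earlier column M_i:

\[ M_j \;\leftarrow\; M_j \;-\; \frac{M_i^T M_j}{M_i^T M_i}\, M_i. \]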

QR Factorization – Normalization
The formulas simplify if we normalize.
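With normalized columns the denominators drop out; a sketch of the simplification, writing q_i for the normalized i-th column (notation assumed):

\[ q_i \;=\; \frac{M_i}{\|M_i\|_2}, \qquad b \;=\; \sum_i y_i\, q_i \;\Longrightarrow\; y_i \;=\; q_i^T b, \qquad M_j \;\leftarrow\; M_j - (q_i^T M_j)\, q_i. \]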

QR Factorization – 2 x 2 Case
Mx = b → Qy = b, where Mx = Qy.
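For the 2 x 2 case this is the standard construction (notation assumed): express the columns of M in terms of the orthogonalized columns of Q, collecting the coefficients in an upper triangular R:

\[ \begin{bmatrix} M_1 & M_2 \end{bmatrix} \;=\; \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} r_{11} & r_{12} \\ 0 & r_{22} \end{bmatrix}, \qquad y \;=\; Rx. \]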

QR Factorization – 2 x 2 Case
Two-step solve, given Q and R.
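The usual two steps are:

\[ \text{1)}\;\; Qy = b \;\Rightarrow\; y = Q^T b, \qquad \text{2)}\;\; Rx = y \;\;\text{by back substitution.} \]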

QR Factorization – General Case
To ensure the third column is orthogonal.

In general, one must solve an N x N dense linear system for the coefficients.

QR Factorization – General Case
To orthogonalize the N-th vector:
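Written out (with q_j denoting the already-orthogonalized columns, again an assumption about the slide's symbols), the N-th column is orthogonalized against all previous ones:

\[ M_N' \;=\; M_N \;-\; \sum_{j=1}^{N-1} \frac{q_j^T M_N}{q_j^T q_j}\, q_j. \]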

QR Factorization – General Case: Modified Gram-Schmidt Algorithm
To ensure the third column is orthogonal.

QR Factorization – Modified Gram-Schmidt Algorithm (source-column oriented approach)
For i = 1 to N              "For each source column"
    Normalize the source column
    For j = i+1 to N        "For each target column right of the source"
        Project the source column out of the target column
    end
end
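A minimal runnable Python sketch of this source-column oriented Modified Gram-Schmidt (the function name and the use of NumPy are illustrative choices, not from the lecture):

import numpy as np

def mgs_source_oriented(M):
    """Modified Gram-Schmidt, source-column oriented: each normalized
    source column is immediately projected out of every column to its
    right, so R is produced one row at a time."""
    Q = np.array(M, dtype=float)          # work on a copy of the columns
    n = Q.shape[1]
    R = np.zeros((n, n))
    for i in range(n):                    # for each source column
        R[i, i] = np.linalg.norm(Q[:, i])
        Q[:, i] /= R[i, i]                # normalize (a zero column signals a singular matrix)
        for j in range(i + 1, n):         # for each target column right of the source
            R[i, j] = Q[:, i] @ Q[:, j]
            Q[:, j] -= R[i, j] * Q[:, i]
    return Q, R

For a nonsingular test matrix, Q @ R reproduces M and Q.T @ Q is the identity up to rounding.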

QR Factorization – By picture

QR Factorization – Matrix-Vector Product View
Suppose only matrix-vector products were available. Then it is more convenient to use another approach.

QR Factorization – Modified Gram-Schmidt Algorithm (target-column oriented approach)
For i = 1 to N              "For each target column"
    For j = 1 to i-1        "For each source column left of the target"
        Project the source column out of the target column
    end
    Normalize
end
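A matching Python sketch for the target-column oriented ordering (again with illustrative names): it touches one new column of M at a time, which is what makes it convenient when columns only become available through matrix-vector products.

import numpy as np

def mgs_target_oriented(M):
    """Modified Gram-Schmidt, target-column oriented: each new target
    column is orthogonalized against all previously finished columns
    and then normalized, so R is produced one column at a time."""
    M = np.asarray(M, dtype=float)
    m, n = M.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):                    # for each target column
        v = M[:, i].copy()
        for j in range(i):                # for each source column left of the target
            R[j, i] = Q[:, j] @ v         # uses the updated v (modified, not classical, GS)
            v -= R[j, i] * Q[:, j]
        R[i, i] = np.linalg.norm(v)
        Q[:, i] = v / R[i, i]             # normalize (zero norm would mean a singular matrix)
    return Q, R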

QR Factorization
The entries of the upper triangular factor:

\[ R \;=\; \begin{bmatrix} r_{11} & r_{12} & r_{13} & r_{14} \\ & r_{22} & r_{23} & r_{24} \\ & & r_{33} & r_{34} \\ & & & r_{44} \end{bmatrix} \]

The source-column oriented approach fills R one row at a time; the target-column oriented approach fills it one column at a time.

QR Factorization – Zero Column
What if a column becomes zero? The matrix MUST BE singular!
1) Do not try to normalize the column.
2) Do not use the column as a source for orthogonalization.
3) Perform backward substitution as well as possible.

QR Factorization – Zero Column: Resulting QR Factorization

QR Factorization – Zero Column
Recall the weighted-sum-of-columns view of a system of equations.
M is singular, but b is in the span of the columns of M.

Reasons for QR Factorization
QR factorization to solve Mx = b
– Mx = b → QRx = b → Rx = Q^T b, where Q is orthogonal and R is upper triangular; O(N^3), as for GE; nice for singular matrices
– Least-squares problem Mx = b, where M is m x n and m > n
Pointer to Krylov-subspace methods
– through the minimization point of view
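As a concrete illustration of the first bullet (a hypothetical 2 x 2 system, using NumPy's built-in QR rather than the Gram-Schmidt sketches above):

import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

Q, R = np.linalg.qr(M)        # M = Q R, Q orthogonal, R upper triangular
y = Q.T @ b                   # step 1: y = Q^T b
x = np.linalg.solve(R, y)     # step 2: back-substitute R x = y
print(x)                      # agrees with np.linalg.solve(M, b)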

QR Factorization – Minimization View
Minimization is more general!
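Presumably the minimization meant here is the residual-norm problem

\[ \min_{x} \;\; \|\, b - Mx \,\|_2^2, \]

which reduces to Mx = b when M is square and nonsingular, and covers least squares when it is not.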

QR Factorization – Minimization View: One-Dimensional Minimization
One-dimensional minimization; normalization.
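Restricting x to a multiple of the first column, the one-dimensional problem and its solution are:

\[ \min_{\alpha}\;\|b - \alpha M_1\|_2^2 \;=\; \min_{\alpha}\;\big( b^T b - 2\alpha\, M_1^T b + \alpha^2\, M_1^T M_1 \big) \qquad\Longrightarrow\qquad \alpha \;=\; \frac{M_1^T b}{M_1^T M_1}, \]

and after normalization (so that \|M_1\|_2 = 1) this is simply α = M_1^T b.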

QR Factorization – Minimization View: One-Dimensional Minimization, Picture
One-dimensional minimization yields the same result as projection onto the column!

QR Factorization – Minimization View: Two-Dimensional Minimization
Residual minimization; coupling term.
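Expanding the two-column residual makes the coupling term explicit:

\[ \|b - \alpha_1 M_1 - \alpha_2 M_2\|_2^2 \;=\; b^T b - 2\alpha_1 M_1^T b - 2\alpha_2 M_2^T b + \alpha_1^2\, M_1^T M_1 + \alpha_2^2\, M_2^T M_2 + 2\,\alpha_1\alpha_2\, M_1^T M_2. \]

The last term couples α_1 and α_2: unless M_1^T M_2 = 0, the two one-dimensional minimizations cannot be done independently.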

QR Factorization – Minimization View: Two-Dimensional Minimization
Residual minimization; coupling term.
To eliminate the coupling term, we change the search directions!

QR Factorization – Minimization View: Two-Dimensional Minimization
More general search directions; coupling term.

QR Factorization – Minimization View: Two-Dimensional Minimization
More general search directions.
Goal: find a set of search directions such that the coupling terms vanish. In this case the minimization decouples! p_i and p_j are then called M^T M-orthogonal.
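Writing x = Σ_i α_i p_i, the condition and the decoupled residual it produces are:

\[ p_i^T M^T M\, p_j = 0 \;\;(i \neq j) \qquad\Longrightarrow\qquad \Big\|\,b - M\sum_i \alpha_i p_i\Big\|_2^2 \;=\; b^T b + \sum_i \Big( \alpha_i^2\, p_i^T M^T M\, p_i - 2\,\alpha_i\, p_i^T M^T b \Big), \]

so each α_i can be minimized on its own.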

QR Factorization – Minimization View: Forming M^T M-Orthogonal Minimization Directions
The i-th search direction equals the M^T M-orthogonalized i-th unit vector; use the previously orthogonalized search directions.
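A natural way to write this, assuming e_i denotes the i-th unit vector, is Gram-Schmidt in the M^T M inner product:

\[ p_i \;=\; e_i \;-\; \sum_{j<i} \frac{p_j^T M^T M\, e_i}{p_j^T M^T M\, p_j}\; p_j. \]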

QR Factorization – Minimization View: Minimizing in the Search Direction
When the search directions p_j are M^T M-orthogonal, the residual minimization becomes:
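That is (the standard result for M^T M-orthogonal directions, notation as above), each coefficient is obtained independently:

\[ \alpha_i \;=\; \frac{p_i^T M^T b}{p_i^T M^T M\, p_i} \;=\; \frac{(M p_i)^T b}{\|M p_i\|_2^2}. \]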

QR Factorization – Minimization View: Minimization Algorithm
For i = 1 to N              "For each target column"
    For j = 1 to i-1        "For each source column left of the target"
        Orthogonalize the search direction
    end
    Normalize
end
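A compact Python sketch of this algorithm (illustrative names; it builds M^T M-orthogonal directions from the unit vectors and minimizes the residual along each one, as in the formulas above):

import numpy as np

def minimization_solve(M, b):
    """Sketch of the minimization algorithm: build M^T M-orthogonal
    search directions from the unit vectors, then minimize the residual
    ||b - M x|| along each direction in turn."""
    M = np.asarray(M, dtype=float)
    b = np.asarray(b, dtype=float)
    n = M.shape[1]
    directions = []
    x = np.zeros(n)
    for i in range(n):                        # for each target column / direction
        p = np.zeros(n)
        p[i] = 1.0                            # start from the unit vector e_i
        for q in directions:                  # orthogonalize against previous directions
            p -= ((M @ q) @ (M @ p)) / ((M @ q) @ (M @ q)) * q
        p /= np.linalg.norm(M @ p)            # normalize so that ||M p|| = 1
        alpha = (M @ p) @ b                   # one-dimensional minimization along p
        x += alpha * p
        directions.append(p)
    return x

For a nonsingular M this returns the solution of Mx = b; stopping the loop early gives the best residual achievable within the directions built so far, which is the link to the iterative (Krylov-subspace) point of view.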

Intuitive Summary
QR factorization (direct) ↔ Minimization view (iterative)
Compose the vector x along search directions:
– Direct: composition along the Q_i (orthonormalized columns of M) → need to factorize M
– Iterative: composition along certain search directions → you can stop halfway
About the search directions:
– Chosen so that it is easy to do the minimization (decoupling) → the p_j are M^T M-orthogonal
– Each step: try to minimize the residual

Compare Minimization and QR
[Diagram comparing the two approaches; labels include M, M^T M, and Orthonormal]

Summary
Iterative methods overview
– Stationary
– Non-stationary
QR factorization to solve Mx = b
– Modified Gram-Schmidt Algorithm
– QR Pivoting
– Minimization view of QR: basic minimization approach, orthogonalized search directions, pointer to Krylov-subspace methods

Forward Difference Formula of order O(h^2) for numerical differentiation: y' = e^{-x} sin(x)

Thank you