Chapter 6: Orthogonality and Least Squares — 6.3 ORTHOGONAL PROJECTIONS © 2012 Pearson Education, Inc.

ORTHOGONAL PROJECTIONS

• The orthogonal projection of a point in $\mathbb{R}^2$ onto a line through the origin has an important analogue in $\mathbb{R}^n$.
• Given a vector $y$ and a subspace $W$ of $\mathbb{R}^n$, there is a vector $\hat{y}$ in $W$ such that (1) $\hat{y}$ is the unique vector in $W$ for which $y - \hat{y}$ is orthogonal to $W$, and (2) $\hat{y}$ is the unique vector in $W$ closest to $y$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• These two properties of $\hat{y}$ provide the key to finding the least-squares solutions of linear systems.
• Theorem 8: Let $W$ be a subspace of $\mathbb{R}^n$. Then each $y$ in $\mathbb{R}^n$ can be written uniquely in the form

$y = \hat{y} + z$ ----(1)

where $\hat{y}$ is in $W$ and $z$ is in $W^{\perp}$.
• In fact, if $\{u_1, \ldots, u_p\}$ is any orthogonal basis of $W$, then

$\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \cdots + \dfrac{y \cdot u_p}{u_p \cdot u_p}\,u_p$ ----(2)

and $z = y - \hat{y}$.
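• Formula (2) translates directly into code. Below is a minimal sketch in Python with NumPy; the helper name project_onto_basis is illustrative, not from the slides, and it assumes the supplied basis is already orthogonal.

```python
import numpy as np

def project_onto_basis(y, basis):
    """Orthogonal projection of y onto W = Span(basis).

    Implements formula (2): assumes `basis` is an orthogonal
    (not necessarily orthonormal) set of nonzero vectors.
    """
    y_hat = np.zeros_like(y, dtype=float)
    for u in basis:
        y_hat += (y @ u) / (u @ u) * u  # component of y along u
    return y_hat
```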

THE ORTHOGONAL DECOMPOSITION THEOREM

• The vector $\hat{y}$ in (1) is called the orthogonal projection of $y$ onto $W$ and is often written as $\mathrm{proj}_W y$.
• Proof: Let $\{u_1, \ldots, u_p\}$ be any orthogonal basis for $W$, and define $\hat{y}$ by (2).
• Then $\hat{y}$ is in $W$ because $\hat{y}$ is a linear combination of the basis vectors $u_1, \ldots, u_p$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• Let $z = y - \hat{y}$.
• Since $u_1$ is orthogonal to $u_2, \ldots, u_p$, it follows from (2) that

$z \cdot u_1 = (y - \hat{y}) \cdot u_1 = y \cdot u_1 - \dfrac{y \cdot u_1}{u_1 \cdot u_1}(u_1 \cdot u_1) - 0 - \cdots - 0 = 0$

• Thus $z$ is orthogonal to $u_1$.
• Similarly, $z$ is orthogonal to each $u_j$ in the basis for $W$.
• Hence $z$ is orthogonal to every vector in $W$.
• That is, $z$ is in $W^{\perp}$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• To show that the decomposition in (1) is unique, suppose $y$ can also be written as $y = \hat{y}_1 + z_1$, with $\hat{y}_1$ in $W$ and $z_1$ in $W^{\perp}$.
• Then $\hat{y} + z = \hat{y}_1 + z_1$ (since both sides equal $y$), and so

$\hat{y} - \hat{y}_1 = z_1 - z$

• This equality shows that the vector $v = \hat{y} - \hat{y}_1$ is in $W$ and in $W^{\perp}$ (because $z_1$ and $z$ are both in $W^{\perp}$, and $W^{\perp}$ is a subspace).
• Hence $v \cdot v = 0$, which shows that $v = 0$.
• This proves that $\hat{y} = \hat{y}_1$ and also $z_1 = z$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• The uniqueness of the decomposition (1) shows that the orthogonal projection $\hat{y}$ depends only on $W$ and not on the particular basis used in (2).
• Example 1: Let $u_1 = (2, 5, -1)$, $u_2 = (-2, 1, 1)$, $y = (1, 2, 3)$, and $W = \mathrm{Span}\{u_1, u_2\}$. Observe that $\{u_1, u_2\}$ is an orthogonal basis for $W$. Write $y$ as the sum of a vector in $W$ and a vector orthogonal to $W$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• Solution: The orthogonal projection of $y$ onto $W$ is

$\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1}\,u_1 + \dfrac{y \cdot u_2}{u_2 \cdot u_2}\,u_2 = \dfrac{9}{30}(2, 5, -1) + \dfrac{3}{6}(-2, 1, 1) = \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right)$

• Also

$y - \hat{y} = (1, 2, 3) - \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right) = \left(\dfrac{7}{5},\ 0,\ \dfrac{14}{5}\right)$

THE ORTHOGONAL DECOMPOSITION THEOREM

• Theorem 8 ensures that $y - \hat{y}$ is in $W^{\perp}$.
• To check the calculations, verify that $y - \hat{y}$ is orthogonal to both $u_1$ and $u_2$ and hence to all of $W$.
• The desired decomposition of $y$ is

$y = (1, 2, 3) = \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right) + \left(\dfrac{7}{5},\ 0,\ \dfrac{14}{5}\right)$
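• As a quick numeric check of Example 1, here is a sketch that reuses the illustrative project_onto_basis helper defined above:

```python
import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

y_hat = project_onto_basis(y, [u1, u2])
z = y - y_hat

print(y_hat)           # [-0.4  2.   0.2], i.e. (-2/5, 2, 1/5)
print(z)               # [ 1.4  0.   2.8], i.e. (7/5, 0, 14/5)
print(z @ u1, z @ u2)  # ~0 up to rounding: z is orthogonal to W
```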

PROPERTIES OF ORTHOGONAL PROJECTIONS

• If $\{u_1, \ldots, u_p\}$ is an orthogonal basis for $W$ and if $y$ happens to be in $W$, then the formula for $\mathrm{proj}_W y$ is exactly the same as the representation of $y$ given in Theorem 5 in Section 6.2.
• In this case, $\mathrm{proj}_W y = y$: if $y$ is in $W = \mathrm{Span}\{u_1, \ldots, u_p\}$, then $\mathrm{proj}_W y = y$.

THE BEST APPROXIMATION THEOREM

• Theorem 9: Let $W$ be a subspace of $\mathbb{R}^n$, let $y$ be any vector in $\mathbb{R}^n$, and let $\hat{y}$ be the orthogonal projection of $y$ onto $W$. Then $\hat{y}$ is the closest point in $W$ to $y$, in the sense that

$\|y - \hat{y}\| < \|y - v\|$ ----(3)

for all $v$ in $W$ distinct from $\hat{y}$.
• The vector $\hat{y}$ in Theorem 9 is called the best approximation to $y$ by elements of $W$.
• The distance from $y$ to $v$, given by $\|y - v\|$, can be regarded as the “error” of using $v$ in place of $y$.
• Theorem 9 says that this error is minimized when $v = \hat{y}$.

THE BEST APPROXIMATION THEOREM

• Inequality (3) leads to a new proof that $\hat{y}$ does not depend on the particular orthogonal basis used to compute it.
• If a different orthogonal basis for $W$ were used to construct an orthogonal projection of $y$, then this projection would also be the closest point in $W$ to $y$, namely, $\hat{y}$.

THE BEST APPROXIMATION THEOREM

• Proof: Take $v$ in $W$ distinct from $\hat{y}$.
• Then $\hat{y} - v$ is in $W$.
• By the Orthogonal Decomposition Theorem, $y - \hat{y}$ is orthogonal to $W$.
• In particular, $y - \hat{y}$ is orthogonal to $\hat{y} - v$ (which is in $W$).

THE BEST APPROXIMATION THEOREM

• Since $y - v = (y - \hat{y}) + (\hat{y} - v)$, the Pythagorean Theorem gives

$\|y - v\|^2 = \|y - \hat{y}\|^2 + \|\hat{y} - v\|^2$

(On the original slide, these three lengths label the sides of a right triangle.)
• Now $\|\hat{y} - v\|^2 > 0$ because $\hat{y} \neq v$, and so inequality (3) follows immediately.
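• Theorem 9 is also easy to probe numerically. The sketch below (again assuming the illustrative project_onto_basis helper and the Example 1 data) checks that no point of $W$ is closer to $y$ than $\hat{y}$:

```python
import numpy as np

rng = np.random.default_rng(0)
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

y_hat = project_onto_basis(y, [u1, u2])
best = np.linalg.norm(y - y_hat)

# Any point v = a*u1 + b*u2 of W should be at least as far from y.
for _ in range(1000):
    a, b = rng.normal(size=2)
    v = a * u1 + b * u2
    assert best <= np.linalg.norm(y - v) + 1e-12
```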

PROPERTIES OF ORTHOGONAL PROJECTIONS

• Example 2: The distance from a point $y$ in $\mathbb{R}^n$ to a subspace $W$ is defined as the distance from $y$ to the nearest point in $W$. Find the distance from $y$ to $W = \mathrm{Span}\{u_1, u_2\}$, where

$y = (-1, -5, 10)$, $u_1 = (5, -2, 1)$, $u_2 = (1, 2, -1)$

• Solution: By the Best Approximation Theorem, the distance from $y$ to $W$ is $\|y - \hat{y}\|$, where $\hat{y} = \mathrm{proj}_W y$.

PROPERTIES OF ORTHOGONAL PROJECTIONS

• Since $\{u_1, u_2\}$ is an orthogonal basis for $W$,

$\hat{y} = \dfrac{15}{30}\,u_1 + \dfrac{-21}{6}\,u_2 = \dfrac{1}{2}(5, -2, 1) - \dfrac{7}{2}(1, 2, -1) = (-1, -8, 4)$

$y - \hat{y} = (-1, -5, 10) - (-1, -8, 4) = (0, 3, 6)$

• The distance from $y$ to $W$ is $\|y - \hat{y}\| = \sqrt{45} = 3\sqrt{5}$.
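• A short numeric check of Example 2 (a sketch; np.linalg.norm computes the Euclidean length):

```python
import numpy as np

y = np.array([-1.0, -5.0, 10.0])
u1 = np.array([5.0, -2.0, 1.0])
u2 = np.array([1.0, 2.0, -1.0])

y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
print(y_hat)                      # [-1. -8.  4.]
print(np.linalg.norm(y - y_hat))  # 6.7082... = 3*sqrt(5)
```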

PROPERTIES OF ORTHOGONAL PROJECTIONS

• Theorem 10: If $\{u_1, \ldots, u_p\}$ is an orthonormal basis for a subspace $W$ of $\mathbb{R}^n$, then

$\mathrm{proj}_W y = (y \cdot u_1)u_1 + (y \cdot u_2)u_2 + \cdots + (y \cdot u_p)u_p$ ----(4)

If $U = [\,u_1 \; u_2 \; \cdots \; u_p\,]$, then for all $y$ in $\mathbb{R}^n$

$\mathrm{proj}_W y = UU^T y$ ----(5)

• Proof: Formula (4) follows immediately from (2) in Theorem 8.

PROPERTIES OF ORTHOGONAL PROJECTIONS

• Also, (4) shows that $\mathrm{proj}_W y$ is a linear combination of the columns of $U$ using the weights $y \cdot u_1,\ y \cdot u_2,\ \ldots,\ y \cdot u_p$.
• The weights can be written as $u_1^T y,\ u_2^T y,\ \ldots,\ u_p^T y$, showing that they are the entries in $U^T y$ and justifying (5).
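• Formula (5) packages the whole projection into the single matrix $UU^T$. A minimal sketch, using the Example 1 basis normalized to unit length so that Theorem 10 applies:

```python
import numpy as np

u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])

# Columns of U must be orthonormal for proj_W y = U U^T y to hold.
U = np.column_stack([u1 / np.linalg.norm(u1),
                     u2 / np.linalg.norm(u2)])

P = U @ U.T                   # the projection matrix onto W
y = np.array([1.0, 2.0, 3.0])
print(P @ y)                  # [-0.4  2.   0.2] -- matches Example 1
print(np.allclose(P @ P, P))  # True: projecting twice changes nothing
```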