6.3 Orthogonal Projections (Chapter 6: Orthogonality and Least Squares). © 2016 Pearson Education, Inc.

THE ORTHOGONAL DECOMPOSITION THEOREM

• Theorem 8: Let W be a subspace of $\mathbb{R}^n$. Then each y in $\mathbb{R}^n$ can be written uniquely in the form

$y = \hat{y} + z \qquad (1)$

where $\hat{y}$ is in W and z is in $W^\perp$. In fact, if $\{u_1,\dots,u_p\}$ is any orthogonal basis of W, then

$\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1}u_1 + \dots + \dfrac{y \cdot u_p}{u_p \cdot u_p}u_p \qquad (2)$

and $z = y - \hat{y}$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• The vector $\hat{y}$ in (1) is called the orthogonal projection of y onto W and is often written as $\mathrm{proj}_W\, y$.
• Proof: Let $\{u_1,\dots,u_p\}$ be any orthogonal basis for W, and define $\hat{y}$ by (2).
• Then $\hat{y}$ is in W because $\hat{y}$ is a linear combination of the basis vectors $u_1,\dots,u_p$.

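Formula (2) is easy to check numerically. The sketch below (NumPy assumed available; the basis vectors are illustrative choices, not from the text) computes $\hat{y}$ from an orthogonal basis and confirms that $z = y - \hat{y}$ is orthogonal to W:

```python
import numpy as np

# Illustrative orthogonal (not orthonormal) basis for a plane W in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 4.0, 5.0])

# Formula (2): y_hat = sum over i of (y . u_i / u_i . u_i) u_i
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat  # the component of y orthogonal to W

# By Theorem 8, z should be orthogonal to every basis vector of W.
print(y_hat, z @ u1, z @ u2)
```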
THE ORTHOGONAL DECOMPOSITION THEOREM

• Let $z = y - \hat{y}$. Since $u_1$ is orthogonal to $u_2,\dots,u_p$, we have $z \cdot u_1 = y \cdot u_1 - \dfrac{y \cdot u_1}{u_1 \cdot u_1}(u_1 \cdot u_1) = 0$.
• The same argument shows that $z \cdot u_j = 0$ for each j, so z is orthogonal to every vector in W; that is, z is in $W^\perp$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• To show that the decomposition is unique, suppose y can also be written as $y = \hat{y}_1 + z_1$, with $\hat{y}_1$ in W and $z_1$ in $W^\perp$.
• Then $\hat{y} + z = \hat{y}_1 + z_1$, so $\hat{y} - \hat{y}_1 = z_1 - z$.
• This vector is in both W and $W^\perp$, hence is orthogonal to itself, so $\hat{y} - \hat{y}_1 = 0$. Thus $\hat{y} = \hat{y}_1$ and $z = z_1$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• The uniqueness of the decomposition (1) shows that the orthogonal projection $\mathrm{proj}_W\, y$ depends only on W and not on the particular basis used in (2).
• Example 1: Let $u_1 = (2, 5, -1)$, $u_2 = (-2, 1, 1)$, $y = (1, 2, 3)$, and $W = \mathrm{Span}\{u_1, u_2\}$. Observe that $\{u_1, u_2\}$ is an orthogonal basis for W. Write y as the sum of a vector in W and a vector orthogonal to W.

THE ORTHOGONAL DECOMPOSITION THEOREM

• Solution: The orthogonal projection of y onto W is
$\hat{y} = \dfrac{y \cdot u_1}{u_1 \cdot u_1}u_1 + \dfrac{y \cdot u_2}{u_2 \cdot u_2}u_2 = \dfrac{9}{30}u_1 + \dfrac{3}{6}u_2 = \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right)$
• Also, $y - \hat{y} = (1, 2, 3) - \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right) = \left(\dfrac{7}{5},\ 0,\ \dfrac{14}{5}\right)$.

THE ORTHOGONAL DECOMPOSITION THEOREM

• Theorem 8 ensures that $y - \hat{y}$ is in $W^\perp$.
• To check the calculations, verify that $y - \hat{y}$ is orthogonal to both $u_1$ and $u_2$, and hence to all of W.
• The desired decomposition of y is $y = \hat{y} + (y - \hat{y}) = \left(-\dfrac{2}{5},\ 2,\ \dfrac{1}{5}\right) + \left(\dfrac{7}{5},\ 0,\ \dfrac{14}{5}\right)$.

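The arithmetic in Example 1 can be verified in a few lines. The vectors below are the ones used in this example as it appears in Lay's text ($u_1 = (2, 5, -1)$, $u_2 = (-2, 1, 1)$, $y = (1, 2, 3)$); NumPy is assumed available:

```python
import numpy as np

# Vectors from Example 1 (as given in Lay's text).
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

# Formula (2) with p = 2 basis vectors.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

print(y_hat)  # components: (-2/5, 2, 1/5)
print(z)      # components: (7/5, 0, 14/5)
```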
PROPERTIES OF ORTHOGONAL PROJECTIONS

• If $\{u_1,\dots,u_p\}$ is an orthogonal basis for W and if y happens to be in W, then the formula for $\mathrm{proj}_W\, y$ is exactly the same as the representation of y given in Theorem 5 of Section 6.2.
• In this case, $\mathrm{proj}_W\, y = y$.
• If y is in $W = \mathrm{Span}\{u_1,\dots,u_p\}$, then $\mathrm{proj}_W\, y = y$.

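This fixed-point property is easy to sanity-check. A minimal sketch with an illustrative orthogonal basis (not from the text):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])

def proj_W(y):
    """Orthogonal projection onto W = Span{u1, u2}, via formula (2)."""
    return (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2

# A vector already in W is left unchanged by the projection.
y_in_W = 2 * u1 - 3 * u2
print(np.allclose(proj_W(y_in_W), y_in_W))  # True
```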
THE BEST APPROXIMATION THEOREM

• Theorem 9 (The Best Approximation Theorem): Let W be a subspace of $\mathbb{R}^n$, let y be any vector in $\mathbb{R}^n$, and let $\hat{y}$ be the orthogonal projection of y onto W. Then $\hat{y}$ is the closest point in W to y, in the sense that
$\|y - \hat{y}\| < \|y - v\| \qquad (3)$
for all v in W distinct from $\hat{y}$.

THE BEST APPROXIMATION THEOREM

• Inequality (3) leads to a new proof that $\hat{y}$ does not depend on the particular orthogonal basis used to compute it.
• If a different orthogonal basis for W were used to construct an orthogonal projection of y, then this projection would also be the closest point in W to y, namely $\hat{y}$.

THE BEST APPROXIMATION THEOREM

• Proof: Take v in W distinct from $\hat{y}$.
• Then $\hat{y} - v$ is in W.
• By the Orthogonal Decomposition Theorem, $y - \hat{y}$ is orthogonal to W.
• In particular, $y - \hat{y}$ is orthogonal to $\hat{y} - v$ (which is in W).

THE BEST APPROXIMATION THEOREM

• Since $y - v = (y - \hat{y}) + (\hat{y} - v)$, the Pythagorean Theorem gives
$\|y - v\|^2 = \|y - \hat{y}\|^2 + \|\hat{y} - v\|^2$
• (The vectors $y - \hat{y}$ and $\hat{y} - v$ are the legs of a right triangle with hypotenuse $y - v$.)
• Now $\|\hat{y} - v\|^2 > 0$ because $\hat{y} \neq v$, and so inequality (3) follows immediately.

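Inequality (3) can be probed numerically: project y onto W, then compare the resulting distance against many other points of W. A sketch with illustrative vectors (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 4.0, 5.0])

# Orthogonal projection of y onto W = Span{u1, u2}.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
best = np.linalg.norm(y - y_hat)

# Every other point v = a*u1 + b*u2 of W is at least as far from y.
for a, b in rng.uniform(-10.0, 10.0, size=(1000, 2)):
    v = a * u1 + b * u2
    assert np.linalg.norm(y - v) >= best - 1e-9

print(best)  # distance from y to W; here 5.0
```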
PROPERTIES OF ORTHOGONAL PROJECTIONS

• Example: Find the distance from $y = (-1, -5, 10)$ to $W = \mathrm{Span}\{u_1, u_2\}$, where $u_1 = (5, -2, 1)$ and $u_2 = (1, 2, -1)$.
• Solution: By the Best Approximation Theorem, the distance from y to W is $\|y - \hat{y}\|$, where $\hat{y} = \mathrm{proj}_W\, y$.
• Since $\{u_1, u_2\}$ is an orthogonal basis for W,
$\hat{y} = \dfrac{15}{30}u_1 + \dfrac{-21}{6}u_2 = \dfrac{1}{2}u_1 - \dfrac{7}{2}u_2 = (-1, -8, 4)$
• Then $y - \hat{y} = (0, 3, 6)$, and the distance from y to W is $\|y - \hat{y}\| = \sqrt{9 + 36} = \sqrt{45} = 3\sqrt{5}$.

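The distance computation can be checked the same way. The vectors below are the ones used in this distance example as it appears in Lay's text ($y = (-1, -5, 10)$, $u_1 = (5, -2, 1)$, $u_2 = (1, 2, -1)$):

```python
import numpy as np

# Vectors from the distance example (as given in Lay's text).
u1 = np.array([5.0, -2.0, 1.0])
u2 = np.array([1.0, 2.0, -1.0])
y = np.array([-1.0, -5.0, 10.0])

# Project y onto W = Span{u1, u2}, then measure the residual.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
dist = np.linalg.norm(y - y_hat)

print(y_hat)               # the closest point in W to y
print(dist, 3 * 5 ** 0.5)  # distance equals 3*sqrt(5)
```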
PROPERTIES OF ORTHOGONAL PROJECTIONS

• Theorem 10: If $\{u_1,\dots,u_p\}$ is an orthonormal basis for a subspace W of $\mathbb{R}^n$, then
$\mathrm{proj}_W\, y = (y \cdot u_1)u_1 + (y \cdot u_2)u_2 + \dots + (y \cdot u_p)u_p$
• If $U = [\,u_1\ u_2\ \dots\ u_p\,]$, then $\mathrm{proj}_W\, y = UU^T y$ for all y in $\mathbb{R}^n$.

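When the basis is orthonormal, the projection reduces to matrix multiplication: $\mathrm{proj}_W\, y = UU^T y$, where the columns of U are the basis vectors. A sketch with an illustrative orthonormal basis (not from the text):

```python
import numpy as np

# Illustrative orthonormal basis for a plane W in R^3.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)
U = np.column_stack([u1, u2])
y = np.array([3.0, 4.0, 5.0])

# Matrix form U @ U.T @ y agrees with the sum of (y . u_i) u_i.
proj_matrix_form = U @ U.T @ y
proj_sum_form = (y @ u1) * u1 + (y @ u2) * u2

print(proj_matrix_form)  # equals proj_sum_form
```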