Linear Algebra Lecture 40

Segment VI Orthogonality and Least Squares

Orthogonal Projections

Orthogonal Projection

The orthogonal projection of a point in R^2 onto a line through the origin has an important analogue in R^n.

Continued

Given a vector y and a subspace W in R^n, there is a vector ŷ in W such that (1) ŷ is the unique vector in W for which y − ŷ is orthogonal to W, and (2) ŷ is the unique vector in W closest to y.

Continued

Figure: a vector y, its orthogonal projection ŷ in W, and the component y − ŷ orthogonal to W.

Continued

We observe that whenever a vector y is written as a linear combination of vectors u1, …, un in a basis of R^n, the terms in the sum can be grouped into two parts, so that y can be written as

y = z1 + z2,

where z1 is a linear combination of some of the ui and z2 is a linear combination of the rest. This idea is particularly useful when {u1, …, un} is an orthogonal basis.

Example 1

Let {u1, …, u5} be an orthogonal basis for R^5 and let y = c1u1 + c2u2 + c3u3 + c4u4 + c5u5. Consider the subspace W = Span{u1, u2}, and write y as the sum of a vector z1 in W and a vector z2 in W⊥.
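A small numerical sketch of this grouping in NumPy (the basis and the coefficients c1, …, c5 below are made up for illustration; they are not the slide's data):

```python
import numpy as np

# Made-up orthogonal (in fact orthonormal) basis of R^5: the columns
# of Q from a QR factorization of a random matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
u1, u2, u3, u4, u5 = (Q[:, i] for i in range(5))

# y written in this basis with made-up coefficients c1..c5.
c1, c2, c3, c4, c5 = 3.0, -1.0, 2.0, 0.5, 4.0
y = c1*u1 + c2*u2 + c3*u3 + c4*u4 + c5*u5

z1 = c1*u1 + c2*u2            # z1 in W = Span{u1, u2}
z2 = c3*u3 + c4*u4 + c5*u5    # z2 in W-perp

print(np.allclose(y, z1 + z2))   # True: y = z1 + z2
print(np.isclose(z1 @ z2, 0.0))  # True: z1 is orthogonal to z2
```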

The Orthogonal Decomposition Theorem

Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form

y = ŷ + z,

where ŷ is in W and z is in W⊥.

Continued

In fact, if {u1, …, up} is any orthogonal basis of W, then

ŷ = [(y · u1)/(u1 · u1)] u1 + … + [(y · up)/(up · up)] up,

and z = y − ŷ. The vector ŷ is called the orthogonal projection of y onto W and is often written proj_W y.
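A minimal sketch of this formula in NumPy (the function name proj_W is my own; it accepts any list of mutually orthogonal, nonzero vectors):

```python
import numpy as np

def proj_W(y, basis):
    """Orthogonal projection of y onto W = Span(basis).
    `basis` must be mutually orthogonal, nonzero vectors; each term is
    (y . u_i)/(u_i . u_i) * u_i, as in the formula above."""
    y_hat = np.zeros_like(y, dtype=float)
    for u_i in basis:
        y_hat += (np.dot(y, u_i) / np.dot(u_i, u_i)) * u_i
    return y_hat

# Quick check with made-up orthogonal vectors in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([-1.0, 1.0, 0.0])     # u1 . u2 = 0
y  = np.array([2.0, 3.0, 4.0])
y_hat = proj_W(y, [u1, u2])
z = y - y_hat
print(np.isclose(z @ u1, 0.0) and np.isclose(z @ u2, 0.0))  # True
```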

Continued

Figure: the orthogonal projection of y onto W.

Example 2

Let u1, u2, and y be the given vectors. Observe that {u1, u2} is an orthogonal basis for W = Span{u1, u2}. Write y as the sum of a vector in W and a vector orthogonal to W.

The Best Approximation Theorem

Let W be a subspace of R^n, y any vector in R^n, and ŷ the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that

‖y − ŷ‖ < ‖y − v‖

for all v in W distinct from ŷ.

Continued

The vector ŷ in this theorem is called the best approximation to y by elements of W.
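A numerical illustration of the theorem (the orthogonal vectors u1, u2 are made up; any point v in W other than ŷ is farther from y):

```python
import numpy as np

u1 = np.array([1.0, 0.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, -1.0, 0.0])   # u1 . u2 = 0
rng = np.random.default_rng(1)
y = rng.standard_normal(4)

# Orthogonal projection of y onto W = Span{u1, u2}.
y_hat = (y @ u1)/(u1 @ u1)*u1 + (y @ u2)/(u2 @ u2)*u2

# ||y - y_hat|| <= ||y - v|| for random points v in W.
for _ in range(5):
    a, b = rng.standard_normal(2)
    v = a*u1 + b*u2
    assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - v) + 1e-12
print("y_hat is closest among all sampled v in W")
```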

Example 3

If y, u1, and u2 are the given vectors (with u1 and u2 orthogonal) and W = Span{u1, u2}, then the closest point in W to y is ŷ = proj_W y, computed with the formula above.

Example 4

The distance from a point y in R^n to a subspace W is defined as the distance from y to the nearest point in W.

Continued

Find the distance from y to W = Span{u1, u2}, where y, u1, and u2 are the given vectors.
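By the Best Approximation Theorem, the distance is ‖y − ŷ‖ with ŷ = proj_W y. A sketch of the computation (the vectors below are made up, since the slide's values are not in the transcript):

```python
import numpy as np

u1 = np.array([5.0, -2.0, 1.0])
u2 = np.array([1.0, 2.0, -1.0])      # u1 . u2 = 5 - 4 - 1 = 0
y  = np.array([-1.0, -5.0, 10.0])

# Distance from y to W = Span{u1, u2} is ||y - proj_W y||.
y_hat = (y @ u1)/(u1 @ u1)*u1 + (y @ u2)/(u2 @ u2)*u2
print(np.linalg.norm(y - y_hat))
```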

Theorem

If {u1, …, up} is an orthonormal basis for a subspace W of R^n, then

proj_W y = (y · u1)u1 + (y · u2)u2 + … + (y · up)up.

If U = [u1 u2 … up], then proj_W y = UUᵀy for all y in R^n.
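A sketch checking that the two expressions agree (the orthonormal vectors are made up for illustration):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # unit vector
u2 = np.array([0.0, 0.0, 1.0])                # unit, orthogonal to u1
U = np.column_stack([u1, u2])

y = np.array([2.0, 4.0, 6.0])
y_hat_sum = (y @ u1)*u1 + (y @ u2)*u2   # (y.u1)u1 + (y.u2)u2
y_hat_mat = U @ U.T @ y                 # U U^T y
print(np.allclose(y_hat_sum, y_hat_mat))  # True: same projection
```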

Example 5

Let y, u1, and u2 be the given vectors, with W = Span{u1, u2}. Use the fact that u1 and u2 are orthogonal to compute proj_W y.

Solution

In this case y happens to be a linear combination of u1 and u2, so y is in W. The closest point in W to y is y itself: proj_W y = y.
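A sketch of this special case (vectors made up): when y is built from u1 and u2, the projection formula returns y itself.

```python
import numpy as np

u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([1.0, 0.0, -1.0])   # u1 . u2 = 0
y  = 2*u1 + 3*u2                  # y lies in W by construction

y_hat = (y @ u1)/(u1 @ u1)*u1 + (y @ u2)/(u2 @ u2)*u2
print(np.allclose(y_hat, y))      # True: proj_W y = y when y is in W
```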

Linear Algebra Lecture 40