Orthogonal Projections

Prepared by Vince Zaccone for Campus Learning Assistance Services at UCSB

Orthogonal Projection

For a vector u in ℝⁿ we would like to decompose any other vector y in ℝⁿ into the sum of two vectors, one a multiple of u, and the other orthogonal to u. That is, we wish to write:

y = ŷ + z

where ŷ = αu for some scalar α, and z is a vector orthogonal to u. To get a formula for ŷ, set u∙z = u∙(y − αu) = 0 and rearrange: this gives α = (y∙u)/(u∙u), so that

ŷ = ((y∙u)/(u∙u)) u
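As an aside (not from the slides), here is a minimal NumPy sketch of this decomposition; the function names proj and decompose are my own choices, used only for illustration:

import numpy as np

def proj(y, u):
    """Orthogonal projection of y onto u: ((y.u)/(u.u)) u."""
    return (np.dot(y, u) / np.dot(u, u)) * u

def decompose(y, u):
    """Split y into y_hat (a multiple of u) and z (orthogonal to u)."""
    y_hat = proj(y, u)
    return y_hat, y - y_hat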

Orthogonal Projection

Another version of the formula writes the projection in terms of the unit vector û = u/‖u‖ in the direction of u:

ŷ = (y∙(u/‖u‖)) (u/‖u‖)

That is, ŷ is the scalar y∙û times the unit vector û.
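A quick numerical demonstration (a sketch, using arbitrary random vectors rather than anything from the slides) that this form agrees with the previous one:

import numpy as np

# Hypothetical random vectors, just to check the two forms agree.
rng = np.random.default_rng(0)
y = rng.standard_normal(3)
u = rng.standard_normal(3)

u_hat = u / np.linalg.norm(u)   # unit vector in the direction of u
print(np.allclose(np.dot(y, u_hat) * u_hat,
                  (np.dot(y, u) / np.dot(u, u)) * u))  # True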

Orthogonal Projection

Here is an example using vectors in ℝ²:

y = (4, 7), u = (2, 1)

Find the orthogonal projection of y onto u.

ŷ = ((y∙u)/(u∙u)) u = (15/5) u = (6, 3)

Subtracting yields the vector z, which is orthogonal to u:

z = y − ŷ = (4, 7) − (6, 3) = (−2, 4)

Check this by finding that z∙u = (−2)(2) + (4)(1) = 0.
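Using the formula from the earlier sketch (rewritten here so it runs on its own), we can verify these numbers:

import numpy as np

y = np.array([4.0, 7.0])
u = np.array([2.0, 1.0])

y_hat = (np.dot(y, u) / np.dot(u, u)) * u
z = y - y_hat

print(y_hat)          # [6. 3.]
print(z)              # [-2.  4.]
print(np.dot(z, u))   # 0.0, confirming z is orthogonal to u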

Orthogonal Projection

One important consequence of the previous calculation is that we have manufactured an orthogonal basis for the vector space ℝ². This idea can be very useful in a variety of situations. The original set of vectors {y, u} was a basis for ℝ², but the vectors were not orthogonal. To find an orthogonal basis, we simply found the projection of one vector onto the other, then subtracted it, leaving an orthogonal vector. We can go a step further and find an orthonormal basis by simply dividing each vector by its magnitude.

Original basis: {(4, 7), (2, 1)}
Orthogonal basis: {(2, 1), (−2, 4)}
Orthonormal basis: {(2/√5, 1/√5), (−1/√5, 2/√5)}

In this example we only needed two basis vectors (ℝ² is 2-dimensional), but if we are dealing with a larger space this process can be repeated to find as many vectors as necessary.
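Continuing the sketches above, normalizing the orthogonal basis from the example is a one-liner per vector:

import numpy as np

u = np.array([2.0, 1.0])
z = np.array([-2.0, 4.0])

e1 = u / np.linalg.norm(u)   # ( 2/sqrt(5), 1/sqrt(5))
e2 = z / np.linalg.norm(z)   # (-1/sqrt(5), 2/sqrt(5))

print(np.dot(e1, e2))                          # 0.0: still orthogonal
print(np.linalg.norm(e1), np.linalg.norm(e2))  # 1.0 1.0: unit length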

Orthogonal Projection

The process of constructing an orthonormal basis in the way we have described is called the Gram-Schmidt process. Here is another example, this time with vectors in ℝ³. The given set of vectors {v1, v2, v3} forms a basis for ℝ³. Use the Gram-Schmidt process to find an orthonormal basis for ℝ³.

Step 1 is to find the projection of v2 onto v1 and subtract it from v2, leaving a new vector that is orthogonal to v1. I will call this new vector v2*:

v2* = v2 − ((v2∙v1)/(v1∙v1)) v1

We can scale this vector to avoid fractions, since scaling does not affect its orthogonality to v1.
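In code, step 1 is just the projection-and-subtract pattern from the first sketch. The vectors below are hypothetical stand-ins, since the slide's specific example vectors do not appear in this transcript:

import numpy as np

v1 = np.array([1.0, 1.0, 0.0])   # hypothetical, not the slide's vectors
v2 = np.array([1.0, 0.0, 1.0])

v2_star = v2 - (np.dot(v2, v1) / np.dot(v1, v1)) * v1
print(np.dot(v2_star, v1))  # 0.0: v2* is orthogonal to v1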

Orthogonal Projection

We now have two orthogonal vectors, v1 and v2*. To manufacture the third one we project v3 onto both of these, and subtract those projections to obtain a vector that is orthogonal to both v1 and v2*. Call this one v3*:

v3* = v3 − ((v3∙v1)/(v1∙v1)) v1 − ((v3∙v2*)/(v2*∙v2*)) v2*

We can scale these vectors however we want, so for convenience we can scale each one to clear any fractions, giving the orthogonal set {v1, v2*, v3*}.
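Here is a minimal NumPy sketch of the whole procedure. The input vectors are again hypothetical stand-ins, and this version normalizes each vector as it goes rather than scaling to avoid fractions as the slides do; the resulting orthonormal basis is the same up to sign:

import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis with the same span as `vectors`."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of v onto each basis vector found so far.
        for q in basis:
            w -= np.dot(w, q) * q  # q has unit length, so q.q = 1
        basis.append(w / np.linalg.norm(w))
    return basis

# Hypothetical basis for R^3 (not the slide's example):
q1, q2, q3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(np.dot(q1, q2), np.dot(q1, q3), np.dot(q2, q3))  # all ~0 (up to rounding)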

Orthogonal Projection

The last step is to normalize the vectors: simply find the length of each vector and divide by it, obtaining a vector of length 1 that points in the same direction. The result is an orthonormal set of vectors that spans ℝ³. If we make a matrix U with these vectors as its columns, we get a very convenient orthogonal (orthonormal) matrix with the property that Uᵀ = U⁻¹.
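A quick numerical check of this property, using a sample orthonormal basis (the Gram-Schmidt output of the hypothetical vectors above, not the slide's example):

import numpy as np

# Columns of U form an orthonormal basis of R^3.
U = np.column_stack([
    np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
    np.array([1.0, -1.0, 2.0]) / np.sqrt(6),
    np.array([-1.0, 1.0, 1.0]) / np.sqrt(3),
])

print(np.allclose(U.T @ U, np.eye(3)))     # True: columns are orthonormal
print(np.allclose(U.T, np.linalg.inv(U)))  # True: U^T = U^{-1}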