Orthogonal Projections
Prepared by Vince Zaccone for Campus Learning Assistance Services at UCSB

Orthogonal Projection

For a vector u in ℝⁿ, we would like to decompose any other vector y in ℝⁿ into the sum of two vectors, one a multiple of u and the other orthogonal to u. That is, we wish to write

    y = ŷ + z,  where ŷ = αu

for some scalar α, and z is a vector orthogonal to u.

Here is a formula for the projection:

    ŷ = proj_u y = ((y · u)/(u · u)) u

To get this, just set z · u = 0 and rearrange. Another version of the formula uses the unit vector û in the direction of u:

    ŷ = (y · û) û,  where û = u/‖u‖
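As a concrete illustration (not part of the original slides), here is a minimal numpy sketch of this decomposition. The vectors y and u are hypothetical stand-ins for the slide's example; the sketch computes ŷ with both versions of the formula and checks that they agree.

```python
import numpy as np

# Hypothetical vectors (the slide's specific numbers are not reproduced here).
y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

# Version 1:  y_hat = (y·u / u·u) u
y_hat = (np.dot(y, u) / np.dot(u, u)) * u

# Version 2: using the unit vector in the direction of u
u_hat = u / np.linalg.norm(u)
y_hat_v2 = np.dot(y, u_hat) * u_hat

print(y_hat)                         # the component of y along u, e.g. [8. 4.]
print(np.allclose(y_hat, y_hat_v2))  # True: both versions of the formula agree
```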

Orthogonal Projection

Here is an example using vectors in ℝ². Find the orthogonal projection of y onto u:

    ŷ = ((y · u)/(u · u)) u

Subtracting this projection from y yields the vector z = y − ŷ, which is orthogonal to u. Check this by verifying that z · u = 0.
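Continuing the hypothetical sketch above, the orthogonal component is found by subtraction, and the check z · u = 0 can be done numerically:

```python
# Continuing the sketch above: subtract the projection to get z.
z = y - y_hat

print(z)                              # the component of y orthogonal to u
print(np.isclose(np.dot(z, u), 0.0))  # True: z is orthogonal to u
```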

Orthogonal Projection

One important consequence of the previous calculation is that we have manufactured an orthogonal basis for the vector space ℝ². This idea can be very useful in a variety of situations.

The original set of vectors {y, u} was a basis for ℝ², but the vectors were not orthogonal. To find an orthogonal basis, we simply found the projection of one vector onto the other, then subtracted it, leaving an orthogonal vector. We can go a step further and find an orthonormal basis by dividing each vector by its magnitude.

    Original basis: {y, u}    Orthogonal basis: {u, z}    Orthonormal basis: {u/‖u‖, z/‖z‖}

In this example we only needed two basis vectors (ℝ² is 2-dimensional), but if we are dealing with a larger space this process can be repeated to find as many vectors as necessary.
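A short sketch of this last step, again using the hypothetical vectors from the code above: dividing each vector of the orthogonal basis {u, z} by its length gives an orthonormal basis of ℝ².

```python
# Normalize the orthogonal basis {u, z} to get an orthonormal basis of R^2.
e1 = u / np.linalg.norm(u)
e2 = z / np.linalg.norm(z)

print(np.dot(e1, e2))                          # ~0: still orthogonal
print(np.linalg.norm(e1), np.linalg.norm(e2))  # both 1.0: unit length
```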

Orthogonal Projection

The process of constructing an orthonormal basis in the way we have described is called the Gram-Schmidt process. Here is another example, this time with vectors in ℝ³. The given set of vectors {v₁, v₂, v₃} forms a basis for ℝ³. Use the Gram-Schmidt process to find an orthonormal basis for ℝ³.

Step 1 is to find the projection of v₂ onto v₁ and subtract it from v₂, leaving a new vector that is orthogonal to v₁. Call this new vector v₂*:

    v₂* = v₂ − ((v₂ · v₁)/(v₁ · v₁)) v₁
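Here is a minimal sketch of Step 1 in numpy. The vectors v1, v2, v3 are hypothetical, since the slide's specific basis is not reproduced in this transcript.

```python
import numpy as np

def proj(x, onto):
    """Orthogonal projection of x onto the vector `onto`."""
    return (np.dot(x, onto) / np.dot(onto, onto)) * onto

# Hypothetical basis of R^3 (stand-ins for the slide's vectors).
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
v3 = np.array([0.0, 1.0, 1.0])

# Step 1: make v2 orthogonal to v1.
v2_star = v2 - proj(v2, v1)

print(np.dot(v2_star, v1))  # ~0: v2* is orthogonal to v1
```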

Orthogonal Projection

We now have two orthogonal vectors, v₁ and v₂*. To manufacture the third one, we project v₃ onto both of these and subtract those projections, obtaining a vector that is orthogonal to both v₁ and v₂*. Call this one v₃*:

    v₃* = v₃ − ((v₃ · v₁)/(v₁ · v₁)) v₁ − ((v₃ · v₂*)/(v₂* · v₂*)) v₂*

We can scale these vectors however we want, so for convenience we can replace v₁, v₂*, and v₃* by any convenient scalar multiples and still have an orthogonal set.
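Continuing the sketch above, v₃* is obtained by subtracting both projections and can then be checked against v₁ and v₂*:

```python
# Step 2: make v3 orthogonal to both v1 and v2*.
v3_star = v3 - proj(v3, v1) - proj(v3, v2_star)

print(np.dot(v3_star, v1))       # ~0: orthogonal to v1
print(np.dot(v3_star, v2_star))  # ~0: orthogonal to v2*
```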

Orthogonal Projection

The last step is to normalize the vectors. Simply find the length of each vector and divide by it, obtaining a vector of length 1 that points in the same direction. The result is an orthonormal set of vectors that spans ℝ³:

    { v₁/‖v₁‖,  v₂*/‖v₂*‖,  v₃*/‖v₃*‖ }
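Putting the whole procedure together, here is a minimal general Gram-Schmidt routine (a sketch, not the slides' own code) that orthogonalizes and then normalizes any list of linearly independent vectors, applied to the hypothetical v1, v2, v3 from the earlier sketch:

```python
def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given
    linearly independent vectors (minimal sketch)."""
    orthonormal = []
    for v in vectors:
        w = v.astype(float)
        for e in orthonormal:
            w = w - np.dot(w, e) * e   # subtract projection onto each previous unit vector
        orthonormal.append(w / np.linalg.norm(w))
    return orthonormal

basis = gram_schmidt([v1, v2, v3])
B = np.array(basis)
print(np.round(B @ B.T, 6))  # identity matrix: the basis is orthonormal
```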