3.VI. Projection
3.VI.1. Orthogonal Projection Into a Line
3.VI.2. Gram-Schmidt Orthogonalization
3.VI.3. Projection Into a Subspace

3.VI.1 and 3.VI.2 deal only with inner product spaces; 3.VI.3 is applicable to any direct sum space.
3.VI.1. Orthogonal Projection Into a Line

Definition 1.1 (Orthogonal Projection): The orthogonal projection of v into the line spanned by a nonzero s is the vector
  proj_[s](v) = ((v · s) / (s · s)) s.

Example 1.3: The orthogonal projection of the vector (2 3)^T into the line y = 2x: the line is spanned by s = (1 2)^T, so
  proj_[s](v) = ((2·1 + 3·2) / (1 + 4)) (1 2)^T = (8/5)(1 2)^T = (8/5 16/5)^T.
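A minimal numerical sketch of Definition 1.1 (NumPy; not part of the original slides):

```python
import numpy as np

def project_onto_line(v, s):
    """Orthogonal projection of v into the line spanned by a nonzero s."""
    v, s = np.asarray(v, dtype=float), np.asarray(s, dtype=float)
    return (v @ s) / (s @ s) * s

# Example 1.3: project (2, 3)^T into the line y = 2x, spanned by (1, 2)^T.
print(project_onto_line([2, 3], [1, 2]))  # [1.6 3.2], i.e. (8/5, 16/5)^T
```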
Example 1.4: The orthogonal projection of a general vector in R^3 into the y-axis:
  proj_[e_2]((x y z)^T) = (0 y 0)^T.

Example 1.5 (Project = discard orthogonal components): A railroad car left on an east-west track without its brake is pushed by a wind blowing toward the northeast at fifteen miles per hour; what speed will the car reach? Projecting the wind's velocity onto the east-west direction discards the northward component, leaving an eastward speed of 15/√2 ≈ 10.6 miles per hour.
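The same computation as a short NumPy sketch (axes chosen here with east as +x and north as +y, an assumption for illustration):

```python
import numpy as np

# Example 1.5: wind of 15 mph toward the northeast; the track runs east-west.
wind = (15 / np.sqrt(2)) * np.array([1.0, 1.0])  # NE direction, magnitude 15
track = np.array([1.0, 0.0])                     # east-west direction
car = (wind @ track) / (track @ track) * track   # projection onto the track
print(np.linalg.norm(car))                       # ~10.61 mph: crosswind component discarded
```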
Example 1.6 (Nearest Point): A submarine is tracking a ship moving along the line y = 3x + 2. Torpedo range is one-half mile. Can the sub stay where it is, at the origin on the chart, or must it move to reach a place where the ship will pass within range?
The ship's path is parallel to the vector s = (1 3)^T. Writing the path as (0 2)^T + t·s, the point p of closest approach to the origin is where p ⊥ s, giving t = −3/5 and p = (−3/5 1/5)^T, at distance √10/5 ≈ 0.63 mile. (Out of range.)
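The nearest-point computation as a sketch (NumPy; added for illustration):

```python
import numpy as np

q = np.array([0.0, 2.0])     # a point on the ship's path y = 3x + 2
s = np.array([1.0, 3.0])     # direction of the path
# The nearest point p = q + t*s satisfies p . s = 0 (p is orthogonal to the path).
t = -(q @ s) / (s @ s)
p = q + t * s
print(p, np.linalg.norm(p))  # [-0.6 0.2], ~0.632 mile > 0.5 mile => out of range
```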
Exercises 3.VI.1.
1. Consider the function mapping the plane to itself that takes a vector to its projection into the line y = x.
 (a) Produce a matrix that describes the function's action.
 (b) Show that this map can also be obtained by first rotating everything in the plane π/4 radians clockwise, then projecting into the x-axis, and then rotating π/4 radians counterclockwise.
3.VI.2. Gram-Schmidt Orthogonalization

Given a vector s, any vector v in an inner product space can be decomposed as
  v = proj_[s](v) + (v − proj_[s](v)),  where v − proj_[s](v) is orthogonal to s.

Definition 2.1 (Mutually Orthogonal Vectors): Vectors v_1, …, v_k ∈ R^n are mutually orthogonal if v_i · v_j = 0 for all i ≠ j.

Theorem 2.2: A set of mutually orthogonal nonzero vectors is linearly independent.
Proof: If c_1 v_1 + … + c_k v_k = 0, then dotting both sides with v_j leaves c_j (v_j · v_j) = 0, so c_j = 0 for every j.

Corollary 2.3: A set of k mutually orthogonal nonzero vectors in V^k is a basis for the space.

Definition 2.5 (Orthogonal Basis): An orthogonal basis for a vector space is a basis of mutually orthogonal vectors.
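A quick numerical check of Theorem 2.2 (the vectors are an assumed illustrative set, not from the slides):

```python
import numpy as np

# Mutually orthogonal, nonzero vectors in R^3.
v1, v2, v3 = np.array([1., 1., 0.]), np.array([1., -1., 0.]), np.array([0., 0., 2.])
assert v1 @ v2 == v1 @ v3 == v2 @ v3 == 0

# Dotting c1*v1 + c2*v2 + c3*v3 = 0 with v_j isolates c_j*(v_j . v_j), forcing c_j = 0;
# equivalently, the matrix with these columns has full rank.
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 3 => linearly independent
```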
Example 2.6: Turn a given basis into an orthogonal basis for R^3 (by the scheme of Theorem 2.7 below).
Theorem 2.7 (Gram-Schmidt Orthogonalization): If ⟨β_1, …, β_k⟩ is a basis for a subspace of R^n, then the κ_j's calculated by the scheme
  κ_1 = β_1
  κ_2 = β_2 − proj_[κ_1](β_2)
  κ_3 = β_3 − proj_[κ_1](β_3) − proj_[κ_2](β_3)
  …
  κ_k = β_k − proj_[κ_1](β_k) − … − proj_[κ_{k−1}](β_k)
form an orthogonal basis for the same subspace.
Proof: Induct on m ≥ 2. Suppose κ_1, …, κ_{m−1} are mutually orthogonal and nonzero. For each i < m, the subtracted projections along κ_j with j ≠ i are orthogonal to κ_i, so
  κ_m · κ_i = β_m · κ_i − ((β_m · κ_i)/(κ_i · κ_i))(κ_i · κ_i) = 0,
and κ_m ≠ 0 since β_m is not in the span of β_1, …, β_{m−1}. QED
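A sketch of the scheme in code (NumPy; the sample basis is an assumption for illustration):

```python
import numpy as np

def gram_schmidt(basis):
    """Theorem 2.7: kappa_j = beta_j minus its projections onto the earlier kappas."""
    kappas = []
    for b in [np.asarray(b, dtype=float) for b in basis]:
        k = b.copy()
        for prev in kappas:
            k -= (b @ prev) / (prev @ prev) * prev  # subtract proj_[prev](b)
        kappas.append(k)
    return kappas

ks = gram_schmidt([[1, 1, 1], [0, 2, 0], [1, 0, 3]])
# Every pairwise dot product vanishes:
print([round(float(ks[i] @ ks[j]), 10) for i in range(3) for j in range(i + 1, 3)])  # [0.0, 0.0, 0.0]
```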
If each κ_j is furthermore normalized to unit length, the basis is called orthonormal. With unit-length κ's the divisions drop out and the Gram-Schmidt scheme simplifies to
  κ_j = normalize( β_j − (β_j · κ_1) κ_1 − … − (β_j · κ_{j−1}) κ_{j−1} ).
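The orthonormal variant as a sketch (again an illustrative assumption, not slide code):

```python
import numpy as np

def gram_schmidt_orthonormal(basis):
    """Simplified scheme: the kappas are unit vectors, so no division is needed."""
    kappas = []
    for b in [np.asarray(b, dtype=float) for b in basis]:
        for k in kappas:
            b = b - (b @ k) * k          # subtract the component along each unit kappa
        kappas.append(b / np.linalg.norm(b))
    return kappas
```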
Exercises 3.VI.2.
1. Perform the Gram-Schmidt process on a given basis for R^3.
2. Show that the columns of an n × n matrix form an orthonormal set if and only if the inverse of the matrix is its transpose. Produce such a matrix.
3.VI.3. Projection Into a Subspace

Definition 3.1: For any direct sum V = M ⊕ N and any v ∈ V, write v = m + n with m ∈ M and n ∈ N. The projection of v into M along N is defined as
  proj_{M,N}(v) = m.
Reminder: M and N need not be orthogonal; there need not even be an inner product defined.
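A hedged sketch of Definition 3.1 for V = R^n (the subspaces in the example are assumptions for illustration):

```python
import numpy as np

def proj_along(v, M_basis, N_basis):
    """proj_{M,N}(v): express v in the concatenated basis of M and N, keep the M-part."""
    B = np.column_stack(M_basis + N_basis)   # columns form a basis of the whole space
    c = np.linalg.solve(B, v)                # coordinates of v in that basis
    k = len(M_basis)
    return np.column_stack(M_basis) @ c[:k]  # m, the M-component of v

# R^2 = M (+) N with M = span{(1,0)}, N = span{(1,1)} (not orthogonal).
v = np.array([3.0, 2.0])
print(proj_along(v, [np.array([1.0, 0.0])], [np.array([1.0, 1.0])]))  # [1. 0.]
```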
Example 3.2: The space M_{2×2} of 2×2 matrices is the direct sum of two subspaces M and N.
Task: Find proj_{M,N}(A) for a given matrix A.
Solution: Let B_M and B_N be bases for M and N; their concatenation is a basis for M_{2×2}. Represent A in that basis and keep only the components along B_M.
Example 3.3: Both subscripts on proj_{M,N}(v) are significant. Consider a vector v ∈ R^3, a subspace M, and two different complements N and L, so that R^3 = M ⊕ N = M ⊕ L.
Task: Find proj_{M,N}(v) and proj_{M,L}(v).
Solution: For each direct sum, express v in a basis adapted to that decomposition and keep the M-component; the two projections come out different.
Note: The concatenated basis B_{M,L} is orthogonal but B_{M,N} is not, so proj_{M,L}(v) is the orthogonal projection of v into M while proj_{M,N}(v) is not.
Definition 3.4 (Orthogonal Complement): The orthogonal complement of a subspace M of R^n is
  M^⊥ = { v ∈ R^n | v is perpendicular to all vectors in M }  (read "M perp").
The orthogonal projection proj_M(v) of a vector is its projection into M along M^⊥.

Example 3.5: In R^3, find the orthogonal complement of a plane P through the origin.
Solution: Parametrize P to get a natural two-vector basis; then P^⊥ is the set of vectors perpendicular to both basis vectors, that is, the solution set of the corresponding homogeneous system: the line spanned by the plane's normal vector.
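A sketch of the computation for an illustrative plane, 3x + 2y − z = 0 (the plane is an assumption, not taken from the text):

```python
import numpy as np

def orthogonal_complement(basis_rows):
    """Rows spanning M-perp, computed as the null space of the matrix whose rows span M."""
    A = np.array(basis_rows, dtype=float)
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    return Vt[rank:]          # remaining right-singular vectors span the null space

# Plane 3x + 2y - z = 0, parametrized by x and y: basis (1,0,3) and (0,1,2).
print(orthogonal_complement([[1, 0, 3], [0, 1, 2]]))  # one row, proportional to (3, 2, -1)
```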
Lemma 3.7: Let M be a subspace of R^n. Then M^⊥ is also a subspace and R^n = M ⊕ M^⊥. Hence, for every v ∈ R^n, v − proj_M(v) is perpendicular to all vectors in M.
Proof: Construct bases using Gram-Schmidt orthogonalization.

Theorem 3.8: Let v be a vector in R^n and let M be a subspace of R^n with basis ⟨β_1, …, β_k⟩. If A is the matrix whose columns are the β's, then proj_M(v) = c_1 β_1 + … + c_k β_k, where the coefficients c_i are the entries of the vector (A^T A)^{−1} A^T v. That is,
  proj_M(v) = A (A^T A)^{−1} A^T v.
Proof: Write proj_M(v) = A c, where c is a column vector of coefficients. By Lemma 3.7, v − A c is perpendicular to each column of A, so A^T (v − A c) = 0. Hence A^T A c = A^T v, and since A has full column rank, c = (A^T A)^{−1} A^T v.
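Theorem 3.8 in code (NumPy; the subspace and test vector are assumptions for illustration):

```python
import numpy as np

def proj_matrix(A):
    """Orthogonal projector onto the column space of A (A assumed full column rank):
    P = A (A^T A)^{-1} A^T, per Theorem 3.8."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [3.0, 2.0]])                  # columns: a basis for a plane in R^3
P = proj_matrix(A)
v = np.array([1.0, 1.0, 1.0])
print(P @ v)                                # proj_M(v)
print(np.allclose(A.T @ (v - P @ v), 0))    # residual is perpendicular to M (Lemma 3.7)
print(np.allclose(P @ P, P))                # projecting twice changes nothing
```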
Example 3.9: To orthogonally project a vector v into a subspace M, take A with a basis of M as its columns; from Theorem 3.8 we get proj_M(v) = A (A^T A)^{−1} A^T v.
Exercises 3.VI.3.
1. Project v into M along N, for given v, M, and N.
2. Find M^⊥ for a given subspace M.
3. Define a projection to be a linear transformation t : V → V with the property that repeating the projection does nothing more than the projection alone: (t ∘ t)(v) = t(v) for all v ∈ V.
 (a) Show that for any such t there is a basis B = ⟨β_1, …, β_n⟩ for V such that
  t(β_i) = β_i for i = 1, …, r  and  t(β_i) = 0 for i = r+1, …, n,
 where r is the rank of t.
 (b) Conclude that every projection has a block partial-identity representation:
  Rep_{B,B}(t) = [ I_r  Z ; Z  Z ]  (an r×r identity block, zero blocks elsewhere).
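A small numerical illustration of the idempotent property and the block partial-identity form (the map t(x, y) = (x + y, 0) is an assumed example):

```python
import numpy as np

T = np.array([[1.0, 1.0],
              [0.0, 0.0]])          # t(x, y) = (x + y, 0); idempotent but not orthogonal
print(np.allclose(T @ T, T))        # True: (t o t) = t

# In the basis B = <(1,0), (-1,1)>, t fixes the first vector and annihilates the second,
# so its representation is the block partial-identity diag(1, 0).
B = np.column_stack([[1.0, 0.0], [-1.0, 1.0]])
print(np.linalg.inv(B) @ T @ B)     # [[1. 0.] [0. 0.]]
```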