Chapter 5 Inner Product Spaces

Length and Dot Product in R^n
The length of a vector v = (v1, v2, ..., vn) in R^n is ||v|| = sqrt(v1^2 + v2^2 + ... + vn^2).
Notes: The length of a vector is also called its norm. A vector of length 1 is called a unit vector.
A standard unit vector in R^n: e_i = (0, ..., 0, 1, 0, ..., 0), with the 1 in the i-th position.
Ex: the standard unit vectors in R^2: e1 = (1, 0), e2 = (0, 1)
    the standard unit vectors in R^3: e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1)
Notes: If v is nonzero, then u = v / ||v|| is a unit vector in the direction of v. The process of finding this unit vector is called normalizing the vector v.
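The normalizing step can be sketched in plain Python (the helper names `norm` and `normalize` are mine, not from the text):

```python
import math

def norm(v):
    """Euclidean length ||v|| = sqrt(v1^2 + ... + vn^2)."""
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    """Unit vector v / ||v|| in the direction of v (v must be nonzero)."""
    length = norm(v)
    return [x / length for x in v]

u = normalize([3.0, 4.0])
print(u)        # [0.6, 0.8]
print(norm(u))  # 1.0 (up to floating-point rounding)
```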
The distance between two vectors u and v in R^n is d(u, v) = ||u - v||.
Notes: (Properties of distance)
(1) d(u, v) >= 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)
Euclidean n-space: R^n was defined to be the set of all ordered n-tuples of real numbers. When R^n is combined with the standard operations of vector addition, scalar multiplication, vector length, and the dot product, the resulting vector space is called Euclidean n-space.
Dot product and matrix multiplication: If a vector in R^n is represented as an n×1 column matrix, then the dot product can be written as a matrix product, u · v = u^T v (a 1×1 matrix).
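A minimal sketch of this identity in Python, representing vectors both as flat lists and as n×1 column matrices (the helpers `dot` and `matmul` are my names, not from the text):

```python
def dot(u, v):
    """Dot product u . v = u1*v1 + ... + un*vn."""
    return sum(a * b for a, b in zip(u, v))

def matmul(A, B):
    """Multiply an m x n matrix by an n x p matrix (lists of rows)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

u = [1.0, 2.0, 3.0]
v = [4.0, -5.0, 6.0]
u_row = [u]                  # u^T, a 1 x 3 row matrix
v_col = [[x] for x in v]     # v as a 3 x 1 column matrix
print(dot(u, v))             # 12.0
print(matmul(u_row, v_col))  # [[12.0]] -- the same number as a 1 x 1 matrix
```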
The angle θ between two nonzero vectors u and v in R^n is given by cos θ = (u · v) / (||u|| ||v||), 0 <= θ <= π.
Note: The angle between the zero vector and another vector is not defined.
Two vectors u and v in R^n are orthogonal if u · v = 0.
Note: The vector 0 is said to be orthogonal to every vector.
Triangle inequality: ||u + v|| <= ||u|| + ||v||.
Note: Equality occurs in the triangle inequality if and only if the vectors u and v have the same direction.
Inner Product Spaces
An inner product on a vector space V is a function that associates a real number <u, v> with each pair of vectors u and v and satisfies the following axioms:
(1) <u, v> = <v, u>
(2) <u, v + w> = <u, v> + <u, w>
(3) c<u, v> = <cu, v>
(4) <v, v> >= 0, and <v, v> = 0 if and only if v = 0
Note: The dot product is the Euclidean inner product on R^n.
Note: A vector space V together with an inner product is called an inner product space.
Vector space: (V, vector addition, scalar multiplication)
Inner product space: (V, vector addition, scalar multiplication, <·, ·>)
Properties of norm:
(1) ||v|| >= 0
(2) ||v|| = 0 if and only if v = 0
(3) ||cv|| = |c| ||v||
Properties of distance:
(1) d(u, v) >= 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)
The orthogonal projection of u onto v: proj_v u = (<u, v> / <v, v>) v.
Note: If v is a unit vector, then <v, v> = 1, and the formula for the orthogonal projection of u onto v takes the simpler form proj_v u = <u, v> v.
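A minimal Python sketch of orthogonal projection, using the dot product as the inner product (the helper names are mine):

```python
def dot(u, v):
    """Dot product, serving as the inner product <u, v> on R^n."""
    return sum(a * b for a, b in zip(u, v))

def proj(u, v):
    """Orthogonal projection of u onto v: (<u, v> / <v, v>) v."""
    c = dot(u, v) / dot(v, v)
    return [c * x for x in v]

u = [6.0, 7.0]
v = [1.0, 4.0]
print(proj(u, v))  # (34/17) v = 2v = [2.0, 8.0]
# If v is a unit vector, dot(v, v) == 1 and proj reduces to <u, v> v.
```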
Orthonormal Bases: Gram-Schmidt Process
A set S of vectors in an inner product space V is orthogonal if every pair of distinct vectors in S is orthogonal. If, in addition, each vector in S is a unit vector, then S is orthonormal.
Note: If S is also a basis, then it is called an orthogonal basis or an orthonormal basis.
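The Gram-Schmidt process turns any basis {v1, ..., vn} into an orthogonal basis {w1, ..., wn} by setting w1 = v1 and w_i = v_i - sum over j < i of (<v_i, w_j> / <w_j, w_j>) w_j. A minimal Python sketch under the dot product (function names are mine, not from the text):

```python
def dot(u, v):
    """Dot product, serving as the inner product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalize a basis: subtract from each v_i its projections
    onto the previously constructed orthogonal vectors w_j."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = dot(v, b) / dot(b, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        basis.append(w)
    return basis

B = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 2.0]])
# Every pair of distinct vectors in B is orthogonal (dot products ~ 0):
print(dot(B[0], B[1]), dot(B[0], B[2]), dot(B[1], B[2]))
```

Normalizing each w_i (dividing it by its length) then yields an orthonormal basis.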
Mathematical Models and Least Squares Analysis
Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, written W⊥ (read "W perp").
Notes: W⊥ is a subspace of V, and W ∩ W⊥ = {0}.
Notes:
(1) Among all the scalar multiples of a vector u, the orthogonal projection proj_u v is the one that is closest to v.
(2) Among all the vectors in the subspace W, the vector proj_W v is the closest vector to v.
The four fundamental subspaces of the matrix A:
N(A): nullspace of A
N(A^T): nullspace of A^T
R(A): column space of A
R(A^T): column space of A^T
Least squares problem: (A system of linear equations Ax = b)
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, how do we find the "best possible" solution, that is, the value of x for which the difference between Ax and b is smallest?
Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in R^n that minimizes ||Ax - b|| with respect to the Euclidean inner product on R^n. Such a vector is called a least squares solution of Ax = b.
A^T A x = A^T b (the normal equations of the least squares problem Ax = b)
Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system.
Thm: For any linear system Ax = b, the associated normal system is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.
Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by x = (A^T A)^(-1) A^T b. Moreover, if W is the column space of A, then the orthogonal projection of b on W is proj_W b = A (A^T A)^(-1) A^T b.
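A small worked example of the normal equations: fitting a line y = c0 + c1*t to three points that no single line passes through. The data, helper names, and use of Cramer's rule for the 2×2 solve are my illustrative choices, not from the text:

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def solve2(M, y):
    """Solve a 2 x 2 system M x = y by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(y[0] * M[1][1] - M[0][1] * y[1]) / det,
            (M[0][0] * y[1] - y[0] * M[1][0]) / det]

# Fit y = c0 + c1*t to the points (1, 2), (2, 3), (3, 5).
# The system A c = b is inconsistent, so we solve the normal
# equations A^T A c = A^T b instead.
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
b = [[2.0], [3.0], [5.0]]
At = transpose(A)
AtA = matmul(At, A)                       # [[3, 6], [6, 14]]
Atb = [row[0] for row in matmul(At, b)]   # [10, 23]
c = solve2(AtA, Atb)
print(c)  # approximately [0.3333, 1.5] -> best-fit line y = 1/3 + 1.5 t
```

Since the columns of A here are linearly independent, this least squares solution is unique, matching the theorem above.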
Applications of Inner Product Spaces
Note: C[a, b] is the inner product space of all continuous functions on [a, b], with inner product <f, g> = integral from a to b of f(x) g(x) dx.
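This inner product on C[a, b] can be approximated numerically; a sketch using the midpoint rule (the helper `inner` and the choice of quadrature are mine). It illustrates that sin and cos are orthogonal in C[-pi, pi]:

```python
import math

def inner(f, g, a, b, n=20000):
    """Approximate <f, g> = integral_a^b f(x) g(x) dx (midpoint rule)."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
               for i in range(n)) * h

# sin and cos are orthogonal in C[-pi, pi]: <sin, cos> = 0
print(inner(math.sin, math.cos, -math.pi, math.pi))  # close to 0
# ||sin||^2 = <sin, sin> = pi on [-pi, pi]
print(inner(math.sin, math.sin, -math.pi, math.pi))  # close to pi
```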