
Lecture 11: Inner Product Spaces
Last Time
- Coordinates and Change of Basis
- Applications
- Length and Dot Product in R^n
Elementary Linear Algebra, R. Larsen et al. (5th Edition), TKUEE 翁慶昌, NTUEE SCC_11_2007

Lecture 11: Inner Product Spaces
Today
- Length and Dot Product in R^n (Continued)
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Applications
Reading Assignment: Secs
Next Time
- Inner Product Space Applications
- Introduction to Linear Transformations
- The Kernel and Range of a Linear Transformation
- Matrices for Linear Transformations
Reading Assignment: Secs

What Have You Actually Learned about the Dot Product So Far?

Today
- Length and Dot Product in R^n (Continued)
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Applications

Length and Dot Product in R^n (Continued)
Thm 5.4: (The Cauchy-Schwarz inequality)
If u and v are vectors in R^n, then |u·v| ≤ ‖u‖ ‖v‖
(|u·v| denotes the absolute value of u·v)
Ex 7: (An example of the Cauchy-Schwarz inequality)
Verify the Cauchy-Schwarz inequality for u = (1, -1, 3) and v = (2, 0, -1)
Sol: u·v = (1)(2) + (-1)(0) + (3)(-1) = -1, so |u·v| = 1
‖u‖ ‖v‖ = √11 · √5 = √55 ≈ 7.42, and 1 ≤ √55, so the inequality holds
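A quick numerical check of Ex 7 (a NumPy sketch added alongside this transcript, not part of the original slides):

```python
import numpy as np

u = np.array([1.0, -1.0, 3.0])
v = np.array([2.0, 0.0, -1.0])

lhs = abs(np.dot(u, v))                      # |u.v| = |-1| = 1
rhs = np.linalg.norm(u) * np.linalg.norm(v)  # ||u|| ||v|| = sqrt(11)*sqrt(5)
print(lhs, rhs, lhs <= rhs)                  # 1.0 7.416... True
```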

The angle θ between two nonzero vectors u and v in R^n:
cos θ = (u·v) / (‖u‖ ‖v‖), 0 ≤ θ ≤ π
θ = π: opposite direction; θ = π/2 (u·v = 0): orthogonal; θ = 0: same direction
Note: The angle between the zero vector and another vector is not defined.

Ex 8: (Finding the angle between two vectors)
Sol: cos θ = (u·v) / (‖u‖ ‖v‖) = -1, so θ = π: u and v have opposite directions.
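Since the slide's vectors did not survive the transcript, the sketch below uses a hypothetical pair chosen to reproduce the stated conclusion (θ = π, opposite directions):

```python
import numpy as np

# Hypothetical vectors (not from the slide) whose angle comes out to pi.
u = np.array([-4.0, 0.0, 2.0, -2.0])
v = np.array([ 2.0, 0.0, -1.0, 1.0])

cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against round-off
print(theta, np.pi)  # theta equals pi: opposite directions
```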

Orthogonal vectors:
Two vectors u and v in R^n are orthogonal if u·v = 0
Note: The vector 0 is said to be orthogonal to every vector.

Ex 10: (Finding orthogonal vectors)
Determine all vectors in R^2 that are orthogonal to u = (4, 2).
Sol: Let v = (v1, v2). Then u·v = 4v1 + 2v2 = 0, so v2 = -2v1.
Thus every vector orthogonal to u has the form v = t(1, -2), where t is any real number.

Thm 5.5: (The triangle inequality)
If u and v are vectors in R^n, then ‖u + v‖ ≤ ‖u‖ + ‖v‖
Pf: ‖u + v‖² = (u + v)·(u + v) = ‖u‖² + 2(u·v) + ‖v‖²
≤ ‖u‖² + 2|u·v| + ‖v‖² ≤ ‖u‖² + 2‖u‖ ‖v‖ + ‖v‖² (Cauchy-Schwarz) = (‖u‖ + ‖v‖)²,
and taking square roots gives the result.
Note: Equality occurs in the triangle inequality if and only if the vectors u and v have the same direction.

Thm 5.6: (The Pythagorean theorem)
If u and v are vectors in R^n, then u and v are orthogonal if and only if
‖u + v‖² = ‖u‖² + ‖v‖²

Dot product and matrix multiplication:
(A vector v = (v1, v2, ..., vn) in R^n is represented as an n×1 column matrix)
u·v = uᵀv
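The identity u·v = uᵀv is easy to confirm in code (an added NumPy sketch):

```python
import numpy as np

u = np.array([[1.0], [-1.0], [3.0]])  # vectors as n x 1 column matrices
v = np.array([[2.0], [0.0], [-1.0]])

# u^T v is a 1 x 1 matrix whose single entry is the dot product u.v
print((u.T @ v).item())               # -1.0
print(np.dot(u.ravel(), v.ravel()))   # -1.0, the same value
```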

Keywords in Section 5.1: length: 長度 norm: 範數 unit vector: 單位向量 standard unit vector : 標準單位向量 normalizing: 單範化 distance: 距離 dot product: 點積 Euclidean n-space: 歐基里德 n 維空間 Cauchy-Schwarz inequality: 科西 - 舒瓦茲不等式 angle: 夾角 triangle inequality: 三角不等式 Pythagorean theorem: 畢氏定理

Today
- Length and Dot Product in R^n (Continued)
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Applications

Inner Product Spaces
Inner product: Let u, v, and w be vectors in a vector space V, and let c be any scalar. An inner product on V is a function that associates a real number ⟨u, v⟩ with each pair of vectors u and v and satisfies the following axioms.
(1) ⟨u, v⟩ = ⟨v, u⟩
(2) ⟨u, v + w⟩ = ⟨u, v⟩ + ⟨u, w⟩
(3) c⟨u, v⟩ = ⟨cu, v⟩
(4) ⟨v, v⟩ ≥ 0, and ⟨v, v⟩ = 0 if and only if v = 0

Note: A vector space V with an inner product is called an inner product space.
Vector space: (V, +, scalar multiplication)
Inner product space: (V, +, scalar multiplication, ⟨ , ⟩)

Ex 1: (The Euclidean inner product for R^n)
Show that the dot product in R^n, ⟨u, v⟩ = u·v, satisfies the four axioms of an inner product.
Sol: By Theorem 5.3, this dot product satisfies the required four axioms. Thus it is an inner product on R^n.

Ex 2: (A different inner product for R^n)
Show that the function ⟨u, v⟩ = u1v1 + 2u2v2 defines an inner product on R^2, where u = (u1, u2) and v = (v1, v2).
Sol: Axioms (1)-(3) follow from the commutativity and distributivity of real-number arithmetic, and ⟨v, v⟩ = v1² + 2v2² ≥ 0, with equality if and only if v = 0.
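A spot check of the four axioms for the inner product of Ex 2 on random vectors (an added NumPy sketch; the helper name `inner` and the random data are illustrative, and passing these checks is a sanity test, not a proof):

```python
import numpy as np

def inner(u, v):
    """<u, v> = u1*v1 + 2*u2*v2, the weighted inner product of Ex 2."""
    return u[0] * v[0] + 2.0 * u[1] * v[1]

rng = np.random.default_rng(0)
u, v, w = rng.normal(size=(3, 2))  # three random vectors in R^2
c = 1.7

assert np.isclose(inner(u, v), inner(v, u))                       # axiom 1: symmetry
assert np.isclose(inner(u, v + w), inner(u, v) + inner(u, w))     # axiom 2: additivity
assert np.isclose(c * inner(u, v), inner(c * u, v))               # axiom 3: homogeneity
assert inner(u, u) >= 0 and inner(np.zeros(2), np.zeros(2)) == 0  # axiom 4
print("all axiom spot checks passed")
```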

Note: (An inner product on R^n)
⟨u, v⟩ = c1 u1v1 + c2 u2v2 + ... + cn unvn, where each ci > 0

Ex 3: (A function that is not an inner product)
Show that the following function is not an inner product on R^3:
⟨u, v⟩ = u1v1 - 2u2v2 + u3v3
Sol: Let v = (1, 2, 1). Then ⟨v, v⟩ = 1 - 8 + 1 = -6 < 0, so Axiom 4 is not satisfied. Thus this function is not an inner product on R^3.

Thm 5.7: (Properties of inner products)
Let u, v, and w be vectors in an inner product space V, and let c be any real number.
(1) ⟨u, 0⟩ = ⟨0, v⟩ = 0
(2) ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩
(3) ⟨u, cv⟩ = c⟨u, v⟩
Norm (length) of u: ‖u‖ = √⟨u, u⟩
Note: ‖u‖² = ⟨u, u⟩

Distance between u and v: d(u, v) = ‖u - v‖
Angle between two nonzero vectors u and v: cos θ = ⟨u, v⟩ / (‖u‖ ‖v‖), 0 ≤ θ ≤ π
Orthogonal: u and v are orthogonal if ⟨u, v⟩ = 0.

Notes:
(1) If ‖v‖ = 1, then v is called a unit vector.
(2) For v ≠ 0 (the zero vector is not a unit vector), v/‖v‖ is the unit vector in the direction of v.

Ex 6: (Finding inner products)
In P2, ⟨p, q⟩ = a0b0 + a1b1 + a2b2 is an inner product. Let p(x) = 1 - 2x² and q(x) = 4 - 2x + x².
Sol: ⟨p, q⟩ = (1)(4) + (0)(-2) + (-2)(1) = 2
‖q‖ = √⟨q, q⟩ = √(16 + 4 + 1) = √21
d(p, q) = ‖p - q‖ = √(9 + 4 + 9) = √22, since (p - q)(x) = -3 + 2x - 3x²

Properties of norm:
(1) ‖u‖ ≥ 0
(2) ‖u‖ = 0 if and only if u = 0
(3) ‖cu‖ = |c| ‖u‖
Properties of distance:
(1) d(u, v) ≥ 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)

Thm 5.8: Let u and v be vectors in an inner product space V.
(1) Cauchy-Schwarz inequality: |⟨u, v⟩| ≤ ‖u‖ ‖v‖ (Theorem 5.4)
(2) Triangle inequality: ‖u + v‖ ≤ ‖u‖ + ‖v‖ (Theorem 5.5)
(3) Pythagorean theorem: u and v are orthogonal if and only if ‖u + v‖² = ‖u‖² + ‖v‖² (Theorem 5.6)

Orthogonal projections in inner product spaces:
Let u and v be two vectors in an inner product space V, such that v ≠ 0. Then the orthogonal projection of u onto v is given by
proj_v u = (⟨u, v⟩ / ⟨v, v⟩) v
Note: If v is a unit vector, then ⟨v, v⟩ = ‖v‖² = 1, and the formula for the orthogonal projection of u onto v takes the simpler form proj_v u = ⟨u, v⟩ v.

Ex 10: (Finding an orthogonal projection in R^3)
Use the Euclidean inner product in R^3 to find the orthogonal projection of u = (6, 2, 4) onto v = (1, 2, 0).
Sol: u·v = (6)(1) + (2)(2) + (4)(0) = 10 and v·v = 1 + 4 + 0 = 5, so
proj_v u = (u·v / v·v) v = 2(1, 2, 0) = (2, 4, 0)
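Ex 10 in code (an added NumPy sketch; `proj` is an illustrative helper, not slide notation):

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto a nonzero vector v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([6.0, 2.0, 4.0])
v = np.array([1.0, 2.0, 0.0])
p = proj(u, v)
print(p)                 # [2. 4. 0.], i.e. (10/5) v
print(np.dot(u - p, v))  # 0.0: the residual u - p is orthogonal to v
```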

Thm 5.9: (Orthogonal projection and distance)
Let u and v be two vectors in an inner product space V, such that v ≠ 0. Then
d(u, proj_v u) < d(u, cv), for all c ≠ ⟨u, v⟩ / ⟨v, v⟩

Keywords in Section 5.2: inner product: 內積 inner product space: 內積空間 norm: 範數 distance: 距離 angle: 夾角 orthogonal: 正交 unit vector: 單位向量 normalizing: 單範化 Cauchy – Schwarz inequality: 科西 - 舒瓦茲不等式 triangle inequality: 三角不等式 Pythagorean theorem: 畢氏定理 orthogonal projection: 正交投影

Today
- Length and Dot Product in R^n (Continued)
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Applications

Orthonormal Bases: Gram-Schmidt Process
Orthogonal: A set S of vectors in an inner product space V is called an orthogonal set if every pair of vectors in the set is orthogonal.
Orthonormal: An orthogonal set in which each vector is a unit vector is called orthonormal.
Note: If S is also a basis, then it is called an orthogonal basis or an orthonormal basis, respectively.

Ex 1: (A nonstandard orthonormal basis for R^3)
Show that the following set is an orthonormal basis for R^3:
S = {v1, v2, v3} = {(1/√2, 1/√2, 0), (-√2/6, √2/6, 2√2/3), (2/3, -2/3, 1/3)}
Sol: First show that the three vectors are mutually orthogonal:
v1·v2 = -1/6 + 1/6 + 0 = 0, v1·v3 = 2/(3√2) - 2/(3√2) + 0 = 0, v2·v3 = -√2/9 - √2/9 + 2√2/9 = 0

Next show that each vector is of length 1:
‖v1‖ = √(1/2 + 1/2) = 1, ‖v2‖ = √(1/18 + 1/18 + 8/9) = 1, ‖v3‖ = √(4/9 + 4/9 + 1/9) = 1
Thus S is an orthonormal set, and since its vectors are linearly independent (Thm 5.10), S is an orthonormal basis for R^3.
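A one-line orthonormality test for Ex 1 (an added NumPy sketch, using the basis vectors as reconstructed above): a set of rows is orthonormal exactly when S Sᵀ = I.

```python
import numpy as np

S = np.array([
    [1/np.sqrt(2),   1/np.sqrt(2),  0],
    [-np.sqrt(2)/6,  np.sqrt(2)/6,  2*np.sqrt(2)/3],
    [2/3,           -2/3,           1/3],
])

# Rows are unit length and mutually orthogonal iff S S^T is the identity.
print(np.allclose(S @ S.T, np.eye(3)))  # True
```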

Ex 2: (An orthonormal basis for P2)
In P2, with the inner product ⟨p, q⟩ = a0b0 + a1b1 + a2b2, the standard basis B = {1, x, x²} is orthonormal.
Sol: Writing p(x) = 1, q(x) = x, and r(x) = x², we have ⟨p, q⟩ = ⟨p, r⟩ = ⟨q, r⟩ = 0 and ‖p‖ = ‖q‖ = ‖r‖ = 1. Thus B is an orthonormal basis for P2.

Thm 5.10: (Orthogonal sets are linearly independent)
If S = {v1, v2, ..., vn} is an orthogonal set of nonzero vectors in an inner product space V, then S is linearly independent.
Pf: Suppose c1v1 + c2v2 + ... + cnvn = 0. Taking the inner product of both sides with vi gives ci⟨vi, vi⟩ = 0, since all other terms vanish by orthogonality. Because vi ≠ 0 we have ⟨vi, vi⟩ > 0, so ci = 0 for every i.

Corollary to Thm 5.10: If V is an inner product space of dimension n, then any orthogonal set of n nonzero vectors is a basis for V.

Ex 4: (Using orthogonality to test for a basis)
Show that the following set is a basis for R^4:
S = {(2, 3, 2, -2), (1, 0, 0, 1), (-1, 0, 2, 1), (-1, 2, -1, 1)}
Sol: The four vectors are nonzero and mutually orthogonal (each pairwise dot product is 0), so S is a basis for R^4 (by the Corollary to Theorem 5.10).

Thm 5.11: (Coordinates relative to an orthonormal basis)
If B = {v1, v2, ..., vn} is an orthonormal basis for an inner product space V, then the coordinate representation of a vector w with respect to B is
w = ⟨w, v1⟩v1 + ⟨w, v2⟩v2 + ... + ⟨w, vn⟩vn
Pf: B is a basis for V, so w = c1v1 + c2v2 + ... + cnvn (a unique representation). Since B is orthonormal, ⟨w, vi⟩ = ci⟨vi, vi⟩ = ci.

Note: If B = {v1, v2, ..., vn} is an orthonormal basis for V and w ∈ V, then the corresponding coordinate matrix of w relative to B is
[w]_B = [⟨w, v1⟩ ⟨w, v2⟩ ... ⟨w, vn⟩]ᵀ

Ex 5: (Representing vectors relative to an orthonormal basis)
Find the coordinates of w = (5, -5, 2) relative to the following orthonormal basis for R^3:
B = {(3/5, 4/5, 0), (-4/5, 3/5, 0), (0, 0, 1)}
Sol: ⟨w, v1⟩ = 3 - 4 + 0 = -1, ⟨w, v2⟩ = -4 - 3 + 0 = -7, ⟨w, v3⟩ = 2,
so [w]_B = [-1 -7 2]ᵀ.
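Ex 5 in code (an added NumPy sketch, using the basis as reconstructed above): with orthonormal basis vectors as rows, the coordinates are just dot products.

```python
import numpy as np

B = np.array([              # rows are the orthonormal basis vectors
    [3/5,  4/5, 0],
    [-4/5, 3/5, 0],
    [0,    0,   1],
])
w = np.array([5.0, -5.0, 2.0])

coords = B @ w                     # i-th entry is <w, v_i>
print(coords)                      # [-1. -7.  2.]
print(np.allclose(coords @ B, w))  # True: the sum of c_i v_i recovers w
```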

Gram-Schmidt orthonormalization process:
Let B = {v1, v2, ..., vn} be a basis for an inner product space V. Let
w1 = v1
w2 = v2 - (⟨v2, w1⟩/⟨w1, w1⟩) w1
w3 = v3 - (⟨v3, w1⟩/⟨w1, w1⟩) w1 - (⟨v3, w2⟩/⟨w2, w2⟩) w2
...
wn = vn - (⟨vn, w1⟩/⟨w1, w1⟩) w1 - ... - (⟨vn, w(n-1)⟩/⟨w(n-1), w(n-1)⟩) w(n-1)
Then B' = {w1, w2, ..., wn} is an orthogonal basis, and normalizing each wi gives the orthonormal basis B'' = {w1/‖w1‖, w2/‖w2‖, ..., wn/‖wn‖}.
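A direct implementation of the process (an added Python sketch of classical Gram-Schmidt; for serious numerical work, np.linalg.qr is the more stable choice):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given row vectors."""
    ortho = []
    for v in vectors:
        w = v.astype(float)
        for u in ortho:               # subtract projections onto earlier vectors
            w = w - np.dot(w, u) * u  # u is already unit length
        norm = np.linalg.norm(w)
        if norm > 1e-12:              # skip linearly dependent vectors
            ortho.append(w / norm)
    return np.array(ortho)

B = gram_schmidt(np.array([[1.0, 1.0], [0.0, 1.0]]))
print(B)                                # rows (1,1)/sqrt(2) and (-1,1)/sqrt(2)
print(np.allclose(B @ B.T, np.eye(2)))  # True: the result is orthonormal
```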

Ex 7: (Applying the Gram-Schmidt orthonormalization process)
Apply the Gram-Schmidt process to the following basis:
B = {v1, v2} = {(1, 1), (0, 1)}
Sol: w1 = v1 = (1, 1)
w2 = v2 - (⟨v2, w1⟩/⟨w1, w1⟩) w1 = (0, 1) - (1/2)(1, 1) = (-1/2, 1/2)

Orthogonal basis: B' = {(1, 1), (-1/2, 1/2)}
Orthonormal basis: B'' = {(1/√2, 1/√2), (-1/√2, 1/√2)}

Ex 10: (Alternative form of the Gram-Schmidt orthonormalization process)
Find an orthonormal basis for the solution space of the homogeneous system of linear equations.
Sol: Solve the system by Gauss-Jordan elimination to obtain a basis for the solution space, then apply the Gram-Schmidt process to that basis.

Thus one basis for the solution space is read off from the free variables; applying the Gram-Schmidt process to it yields an orthogonal basis, and normalizing yields an orthonormal basis.

Keywords in Section 5.3: orthogonal set: 正交集合 orthonormal set: 單範正交集合 orthogonal basis: 正交基底 orthonormal basis: 單範正交基底 linearly independent: 線性獨立 Gram-Schmidt Process: Gram-Schmidt 過程

Today
- Length and Dot Product in R^n (Continued)
- Inner Product Spaces
- Orthonormal Bases: Gram-Schmidt Process
- Mathematical Models and Least Squares Analysis
- Applications

Mathematical Models and Least Squares Analysis
Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, written W⊥ (read "W perp").
Orthogonal complement of W: W⊥ = {v ∈ V : ⟨v, w⟩ = 0 for all w ∈ W}
Notes: W⊥ is a subspace of V; {0}⊥ = V and V⊥ = {0}.


Direct sum: Let W1 and W2 be two subspaces of R^n. If each vector x ∈ R^n can be uniquely written as a sum of a vector w1 from W1 and a vector w2 from W2, x = w1 + w2, then R^n is the direct sum of W1 and W2, and you can write R^n = W1 ⊕ W2.
Thm 5.13: (Properties of orthogonal subspaces)
Let W be a subspace of R^n. Then the following properties are true.
(1) dim(W) + dim(W⊥) = n
(2) R^n = W ⊕ W⊥
(3) (W⊥)⊥ = W

Thm 5.14: (Projection onto a subspace)
If {u1, u2, ..., ut} is an orthonormal basis for the subspace S of V, and v ∈ V, then
proj_S v = ⟨v, u1⟩u1 + ⟨v, u2⟩u2 + ... + ⟨v, ut⟩ut

Ex 5: (Projection onto a subspace)
Find the projection of the vector v = (1, 1, 3) onto the subspace W = span{w1, w2}, where w1 = (0, 3, 1) and w2 = (2, 0, 0).
Sol: {w1, w2} is an orthogonal basis for W; normalizing gives the orthonormal basis {u1, u2} = {(0, 3/√10, 1/√10), (1, 0, 0)}.
proj_W v = ⟨v, u1⟩u1 + ⟨v, u2⟩u2 = (6/√10)(0, 3/√10, 1/√10) + (1)(1, 0, 0) = (1, 9/5, 3/5)

Finding proj_W v by the other method: since {w1, w2} is orthogonal,
proj_W v = (⟨v, w1⟩/⟨w1, w1⟩)w1 + (⟨v, w2⟩/⟨w2, w2⟩)w2 = (6/10)(0, 3, 1) + (2/4)(2, 0, 0) = (1, 9/5, 3/5)
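Ex 5 in code (an added NumPy sketch; v, w1, and w2 follow the reconstruction above and are assumptions rather than verbatim slide data):

```python
import numpy as np

v  = np.array([1.0, 1.0, 3.0])
w1 = np.array([0.0, 3.0, 1.0])  # assumed spanning set of W, per the example above
w2 = np.array([2.0, 0.0, 0.0])

# {w1, w2} is orthogonal, so proj_W v is a sum of one-dimensional projections.
proj = (v @ w1 / (w1 @ w1)) * w1 + (v @ w2 / (w2 @ w2)) * w2
print(proj)                              # [1.  1.8 0.6] = (1, 9/5, 3/5)
print(np.isclose((v - proj) @ w1, 0.0),  # residual is orthogonal to W
      np.isclose((v - proj) @ w2, 0.0))
```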

Thm 5.15: (Orthogonal projection and distance)
Let W be a subspace of an inner product space V, and v ∈ V. Then for all w ∈ W with w ≠ proj_W v,
‖v - proj_W v‖ < ‖v - w‖
(proj_W v is the best approximation to v from W)

Pf: Write v - w = (v - proj_W v) + (proj_W v - w). The first term is orthogonal to W and the second lies in W, so by the Pythagorean theorem
‖v - w‖² = ‖v - proj_W v‖² + ‖proj_W v - w‖² > ‖v - proj_W v‖², since w ≠ proj_W v.

Notes:
(1) Among all the scalar multiples of a vector u, the orthogonal projection of v onto u is the one that is closest to v.
(2) Among all the vectors in the subspace W, the vector proj_W v is the closest vector to v.

Thm 5.16: (Fundamental subspaces of a matrix)
If A is an m×n matrix, then
(1) (R(A))⊥ = N(Aᵀ)
(2) (R(Aᵀ))⊥ = N(A)
(3) R(A) ⊕ N(Aᵀ) = R^m
(4) R(Aᵀ) ⊕ N(A) = R^n

Ex 6: (Fundamental subspaces)
Find the four fundamental subspaces of the matrix A (given in reduced row-echelon form).
Sol: The nonzero rows form a basis for the row space R(Aᵀ); the pivot columns of A form a basis for the column space R(A); solving Ax = 0 gives a basis for the nullspace N(A); and solving Aᵀy = 0 gives a basis for N(Aᵀ).

Check: each basis vector of N(A) is orthogonal to each basis vector of R(Aᵀ), and each basis vector of N(Aᵀ) is orthogonal to each basis vector of R(A).
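Orthonormal bases for all four fundamental subspaces can be read off the SVD (an added NumPy sketch; the matrix A below is hypothetical, since the slide's matrix did not survive the transcript):

```python
import numpy as np

def fundamental_subspaces(A, tol=1e-10):
    """Orthonormal bases for R(A), N(A^T), R(A^T), and N(A) via the SVD."""
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))      # numerical rank
    return {
        "col(A)":    U[:, :r],    # column space
        "null(A^T)": U[:, r:],    # left nullspace = orthogonal complement of col(A)
        "row(A)":    Vt[:r, :].T, # row space = col(A^T)
        "null(A)":   Vt[r:, :].T, # nullspace = orthogonal complement of row(A)
    }

A = np.array([[1.0, 2.0, 0.0],    # hypothetical 2 x 3 matrix of rank 1
              [2.0, 4.0, 0.0]])
sub = fundamental_subspaces(A)
# Check Thm 5.16: col(A) is orthogonal to null(A^T), and row(A) to null(A).
print(np.allclose(sub["col(A)"].T @ sub["null(A^T)"], 0))  # True
print(np.allclose(sub["row(A)"].T @ sub["null(A)"], 0))    # True
```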

Ex 3: Let W be the subspace of R^4 spanned by the given vectors.
(a) Find a basis for W.
(b) Find a basis for the orthogonal complement of W.
Sol: Form the matrix whose rows are the spanning vectors and reduce it to reduced row-echelon form.

The nonzero rows of the reduced matrix give a basis for W, and the solution space of the corresponding homogeneous system gives a basis for W⊥.
Notes: dim(W) + dim(W⊥) = 4 and W ⊕ W⊥ = R^4.

Least squares problem: Ax = b (a system of m linear equations in n unknowns)
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, how do we find the "best possible" solution, that is, the value of x for which the difference between Ax and b is as small as possible?

Least squares solution: Given a system Ax = b of m linear equations in n unknowns, the least squares problem is to find a vector x in R^n that minimizes ‖b - Ax‖ with respect to the Euclidean inner product on R^m. Such a vector is called a least squares solution of Ax = b.

AᵀAx = Aᵀb (the normal system associated with Ax = b)

Note: The problem of finding the least squares solution of Ax = b is equivalent to the problem of finding an exact solution of the associated normal system.
Thm: For any linear system Ax = b, the associated normal system AᵀAx = Aᵀb is consistent, and all solutions of the normal system are least squares solutions of Ax = b. Moreover, if W is the column space of A, and x is any least squares solution of Ax = b, then the orthogonal projection of b on W is proj_W b = Ax.

Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least squares solution. This solution is given by
x = (AᵀA)⁻¹Aᵀb
Moreover, if W is the column space of A, then the orthogonal projection of b on W is
proj_W b = Ax = A(AᵀA)⁻¹Aᵀb

Ex 7: (Solving the normal equations)
Find the least squares solution of the following system Ax = b, and find the orthogonal projection of b on the column space of A.

Sol: Form the associated normal system AᵀAx = Aᵀb and solve it by Gaussian elimination.

Solving the normal system gives the least squares solution x of Ax = b; the orthogonal projection of b on the column space of A is then proj_W b = Ax.
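The whole procedure in code (an added NumPy sketch; the overdetermined system below is hypothetical, since the slide's A and b did not survive the transcript):

```python
import numpy as np

A = np.array([[1.0, 1.0],   # 3 equations, 2 unknowns: an inconsistent system
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 3.0])

# Normal equations A^T A x = A^T b; the columns of A are independent,
# so A^T A is invertible and the least squares solution is unique.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                     # [-5/3, 3/2]

x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)  # library routine, same answer
print(np.allclose(x_hat, x_lstsq))               # True
print(A @ x_hat)                                 # orthogonal projection of b on col(A)
```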

Keywords in Section 5.4: orthogonal to W: 正交於 W orthogonal complement: 正交補集 direct sum: 直和 projection onto a subspace: 在子空間的投影 fundamental subspaces: 基本子空間 least squares problem: 最小平方問題 normal equations: 一般方程式