1 MAC 2103 Module 11 Inner Product Spaces II

2 Rev.F09 Learning Objectives
Upon completing this module, you should be able to:
1. Construct an orthonormal set of vectors from an orthogonal set of vectors.
2. Find the coordinate vector with respect to a given orthonormal basis.
3. Construct an orthogonal basis from a nonstandard basis in ℜⁿ using the Gram-Schmidt process.
4. Find the least squares solution to a linear system Ax = b.
5. Find the orthogonal projection of b on col(A).
6. Obtain the best approximation to b in col(A).

3 Rev.F09 Inner Product Spaces II
The major topics in this module: Orthogonal Bases, the Gram-Schmidt Process, Least Squares, and Best Approximation.

4 Rev.F09 How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors?
We learned in the previous module that two vectors u and v in an inner product space V are orthogonal to each other iff <u, v> = 0. To obtain an orthonormal set, we normalize each of the vectors in the orthogonal set. How do we normalize the vectors? Divide each vector by its own norm, making each of them a unit vector.

5 Rev.F09 How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors? (Cont.)
Example 1: Find the orthonormal set of vectors obtained from the following set: let S = {v1, v2}, where v1 = (5, 0) and v2 = (0, -3).
Step 1: Verify that the vectors are mutually orthogonal with respect to the Euclidean inner product on ℜ²: <v1, v2> = (5)(0) + (0)(-3) = 0.
Step 2: Find the norm of each vector: ||v1|| = 5 and ||v2|| = 3.

6 Rev.F09 How to Construct an Orthonormal Set of Vectors from an Orthogonal Set of Vectors? (Cont.)
Step 3: Normalize the vectors in the orthogonal set: q1 = v1/||v1|| = (1, 0) and q2 = v2/||v2|| = (0, -1).
Step 4: Verify that the resulting set {q1, q2} is orthonormal by showing that <q1, q2> = 0 and ||q1|| = ||q2|| = 1.
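The arithmetic in Example 1 is easy to check numerically; here is a minimal NumPy sketch (not part of the original slides) that walks through the same four steps:

    import numpy as np

    v1 = np.array([5.0, 0.0])
    v2 = np.array([0.0, -3.0])

    # Step 1: orthogonality under the Euclidean inner product
    print(np.dot(v1, v2))                              # 0.0

    # Step 2: norms
    n1, n2 = np.linalg.norm(v1), np.linalg.norm(v2)    # 5.0 and 3.0

    # Step 3: normalize to get unit vectors
    q1, q2 = v1 / n1, v2 / n2                          # (1, 0) and (0, -1)

    # Step 4: verify orthonormality
    print(np.dot(q1, q2))                              # 0.0
    print(np.linalg.norm(q1), np.linalg.norm(q2))      # 1.0 1.0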

7 Rev.F09 Orthonormal Set, Orthonormal Basis, and Orthogonal Basis
Orthonormal set: an orthogonal set in which each vector is a unit vector.
Orthonormal basis: a basis consisting of orthonormal vectors in an inner product space. Example: the standard basis for ℜⁿ.
Orthogonal basis: a basis consisting of orthogonal vectors in an inner product space.
Note that if S is an orthogonal set of nonzero vectors, then S is a linearly independent set.

8 Rev.F09 Orthonormal Set, Orthonormal Basis, and Orthogonal Basis (Cont.)
If S = {v1, v2, …, vn} is an orthogonal basis of W, then for any w ∈ W,
w = c1 v1 + c2 v2 + … + cn vn, where ci = <w, vi> / ||vi||² for i = 1, 2, …, n,
and the ci are called the Fourier coefficients. So the coordinate vector of w is (w)S = (c1, c2, …, cn).

9 Rev.F09 Orthonormal Set, Orthonormal Basis, and Orthogonal Basis (Cont.)
If S = {q1, q2, …, qn} is an orthonormal basis of W, then for any w ∈ W,
w = c1 q1 + c2 q2 + … + cn qn, where ci = <w, qi> for i = 1, 2, …, n,
are the Fourier coefficients. So the coordinate vector of w is (w)S = (<w, q1>, <w, q2>, …, <w, qn>).

10 Rev.F09 How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis?
Example 2: Compute the Fourier coefficients and determine the coordinate vector of u = (10, 3) relative to the orthogonal basis of Example 1. From Example 1, we have v1 = (5, 0), v2 = (0, -3), ||v1|| = 5, and ||v2|| = 3. In this case, the coefficients are:
c1 = <u, v1> / ||v1||² = 50/25 = 2 and c2 = <u, v2> / ||v2||² = -9/9 = -1.

11 Rev.F09 How to Find the Coordinate Vector with Respect to a Given Orthogonal Basis? (Cont.)
So the coordinate vector of u is (u)S = (2, -1). A nice advantage of working with an orthogonal basis is that the coefficients in the basis representation of a vector are immediately available; they are the Fourier coefficients.
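The same coefficients can be checked numerically. A short sketch (an addition, not from the slides), reusing the orthogonal basis of Example 1:

    import numpy as np

    v1, v2 = np.array([5.0, 0.0]), np.array([0.0, -3.0])   # orthogonal basis from Example 1
    u = np.array([10.0, 3.0])

    # Fourier coefficients relative to the orthogonal basis {v1, v2}
    c1 = np.dot(u, v1) / np.dot(v1, v1)    # 50 / 25 = 2.0
    c2 = np.dot(u, v2) / np.dot(v2, v2)    # -9 / 9  = -1.0

    # Reconstructing u from its coordinate vector (2, -1) recovers (10, 3)
    print(c1 * v1 + c2 * v2)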

12 Rev.F09 How to Find the Coordinate Vector with Respect to a Given Orthonormal Basis?
Example 3: Find the coordinates of w = (2, 3) relative to an orthonormal basis for ℜ², B = {v1, v2}. Since B is orthonormal, the coordinates are simply the inner products: (w)B = (<w, v1>, <w, v2>).
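The basis vectors of this example appeared only as an image on the original slide, so the sketch below uses an illustrative orthonormal basis of ℜ² (a 45-degree rotation of the standard basis, an assumption) rather than the slide's basis:

    import numpy as np

    w = np.array([2.0, 3.0])

    # Hypothetical orthonormal basis of R^2, standing in for the basis shown on the slide
    b1 = np.array([1.0, 1.0]) / np.sqrt(2.0)
    b2 = np.array([-1.0, 1.0]) / np.sqrt(2.0)

    # For an orthonormal basis, the coordinates are just the inner products
    coords = np.array([np.dot(w, b1), np.dot(w, b2)])
    print(coords)                              # approximately [3.536, 0.707]
    print(coords[0] * b1 + coords[1] * b2)     # recovers (2, 3)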

13 Rev.F09 The Gram-Schmidt Process
Let S = {u1, u2, …, um} with nonzero ui ∈ ℜⁿ for i = 1, 2, …, m. S does not have to be a linearly independent set; it might be the set of columns of an n x m matrix A = [u1 u2 … um].
The Gram-Schmidt algorithm applied to S = {u1, u2, …, um}:
1. Let v1 = u1.
2. For k = 2, 3, …, m, let vk = uk - (<uk, v1>/||v1||²)v1 - … - (<uk, vk-1>/||vk-1||²)vk-1, summing only over the nonzero vectors kept so far. If vk = 0, we discard it, since uk is linearly dependent on the preceding vectors.

14 Rev.F09 The Gram-Schmidt Process (Cont.)
3. We then have r ≤ m orthogonal, linearly independent vectors in B = {v1, v2, …, vr}. Then span(S) = span(B) = W, an r-dimensional subspace of ℜⁿ, and B is an orthogonal basis for W. If A = [u1 u2 … um], then W = col(A). An orthonormal basis for W is {q1, q2, …, qr}, where qi = vi / ||vi|| for i = 1, 2, …, r.
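A compact implementation of the algorithm above, including the discard-if-zero step, might look as follows (a sketch; the tolerance value is an assumption, not from the slides):

    import numpy as np

    def gram_schmidt(vectors, tol=1e-12):
        """Return an orthogonal basis for span(vectors), discarding dependent inputs."""
        basis = []
        for u in vectors:
            v = np.array(u, dtype=float)
            # subtract the projection of u onto each orthogonal vector already kept
            for b in basis:
                v = v - (np.dot(v, b) / np.dot(b, b)) * b
            if np.linalg.norm(v) > tol:     # v = 0 means u was linearly dependent
                basis.append(v)
        return basis

    def normalize(basis):
        """Turn an orthogonal basis into an orthonormal one."""
        return [v / np.linalg.norm(v) for v in basis]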

15 Rev.F09 How to Construct an Orthonormal Basis from a Nonstandard Basis in ℜⁿ Using the Gram-Schmidt Process?
Let ℜⁿ be the usual Euclidean inner product space of dimension n, and let {u1, u2, …, un} be a nonstandard basis in ℜⁿ.
Step 1: Use the Gram-Schmidt process to construct an orthogonal basis {v1, v2, …, vn} from the basis vectors {u1, u2, …, un}.
Step 2: Normalize the orthogonal basis vectors to obtain the orthonormal basis {q1, q2, …, qn}.

16 Rev.F09 How to Construct an Orthonormal Basis from a Nonstandard Basis in ℜⁿ Using the Gram-Schmidt Process? (Cont.)
Example 4: The given set {u1, u2, u3} is a nonstandard basis in ℜ³.
Step 1: Apply the Gram-Schmidt process to the basis vectors u1, u2, u3.

17 Rev.F09 How to Construct an Orthonormal Basis from a Nonstandard Basis in ℜⁿ Using the Gram-Schmidt Process? (Cont.)
The resulting vectors v1, v2, and v3 form an orthogonal basis for ℜ³.

18 Rev.F09 How to Construct an Orthonormal Basis from a Nonstandard Basis in ℜⁿ Using the Gram-Schmidt Process? (Cont.)
Step 2: Normalize the orthogonal basis vectors to obtain the orthonormal basis B = {q1, q2, q3}. Compute the norms ||v1||, ||v2||, and ||v3||; then an orthonormal basis is B = {q1, q2, q3}, where qi = vi / ||vi||.
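The vectors of Example 4 appeared only as images on the original slides, so the run below uses an illustrative nonstandard basis of ℜ³ (an assumption, not the slide's data) to show how Steps 1 and 2 go through the gram_schmidt and normalize helpers sketched earlier (NumPy imported there):

    # Hypothetical nonstandard basis of R^3, standing in for the vectors on the slide
    u1, u2, u3 = [1.0, 1.0, 1.0], [0.0, 1.0, 1.0], [0.0, 0.0, 1.0]

    V = gram_schmidt([u1, u2, u3])    # Step 1: orthogonal basis {v1, v2, v3}
    Q = normalize(V)                  # Step 2: orthonormal basis {q1, q2, q3}

    for q in Q:
        print(q, np.linalg.norm(q))   # each norm prints as 1.0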

19 Rev.F09 How to Find the Least Squares Solution?
The linear system Ax = b always has the associated normal system AᵀAx = Aᵀb, which is consistent and has one or more solutions. Any solution of the normal system is a least squares solution of Ax = b. Moreover, the orthogonal projection of b on W = col(A) is Ax, where x is a least squares solution.
Example 5: Find the least squares solution of a linear system Ax = b in which A has two linearly independent column vectors. Then AᵀA is invertible, and there is a unique solution to AᵀAx = Aᵀb, which will be our least squares solution to Ax = b.

20 Rev.F09 How to Find the Least Squares Solution? (Cont.)
We first compute AᵀA and Aᵀb, and then write down the normal system AᵀAx = Aᵀb for this case.

21 Rev.F09 How to Find the Least Squares Solution? (Cont.)
By Gaussian elimination, we reduce the augmented matrix of the normal system to row-echelon form. The resulting solution of AᵀAx = Aᵀb is our unique least squares solution to Ax = b.
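The matrices of Example 5 were images in the original slides, so the sketch below uses a small illustrative system (an assumption, not the slide's data) to show the normal-equation computation:

    import numpy as np

    # Hypothetical inconsistent system with two independent columns
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([6.0, 0.0, 0.0])

    # Normal system: (A^T A) x = A^T b
    AtA = A.T @ A
    Atb = A.T @ b
    x_hat = np.linalg.solve(AtA, Atb)    # unique least squares solution
    print(x_hat)                         # [ 5. -3.]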

22 Rev.F09 How to Find the Orthogonal Projection and Obtain the Best Approximation?
The vector Ax, where x is the least squares solution, is the orthogonal projection of b on W = col(A). It is the best approximation to b in W, since the distance ||b - Ax|| is the minimum over all vectors in W: ||b - Ax|| ≤ ||b - w|| for every w in W.
Example 6: From the previous example, the orthogonal projection of b on W = col(A) is Ax. Thus, we have obtained the best approximation to b in W.
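Continuing the sketch above (with the same assumed A, b, and x_hat), the projection is A x_hat, and the residual b - A x_hat is orthogonal to col(A), which is what makes it the best approximation:

    proj = A @ x_hat                   # orthogonal projection of b onto col(A)
    residual = b - proj                # equals (1, -2, 1) for the data above
    print(A.T @ residual)              # ~[0. 0.]: the residual is orthogonal to col(A)
    print(np.linalg.norm(residual))    # the minimum distance from b to col(A)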

23 Rev.F09 What have we learned?
We have learned to:
1. Construct an orthonormal set of vectors from an orthogonal set of vectors.
2. Find the coordinate vector with respect to a given orthonormal basis.
3. Construct an orthogonal basis from a nonstandard basis in ℜⁿ using the Gram-Schmidt process.
4. Find the least squares solution to a linear system Ax = b.
5. Find the orthogonal projection of b on col(A).
6. Obtain the best approximation to b in col(A).

24 Rev.F09 Credit
Some of these slides have been adapted/modified in part/whole from the following textbook:
Anton, Howard: Elementary Linear Algebra with Applications, 9th Edition.