
CHAPTER FIVE Orthogonality
Why orthogonal? Least squares problem. Accuracy of numerical computation.

Least squares problem
Motivation: curve fitting.
Problem formulation: Given $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$, find $\hat{x} \in \mathbb{R}^n$ such that $\|b - A\hat{x}\|_2 \le \|b - Ax\|_2$ for all $x \in \mathbb{R}^n$.

Outline
- Orthogonal subspaces
- Inner product spaces
- Least squares problems
- Orthonormal sets
- Gram-Schmidt orthogonalization process

The scalar product in $\mathbb{R}^n$
Def: Let $x, y \in \mathbb{R}^n$. The scalar product (or inner product) of $x$ and $y$ is defined to be $x^T y = x_1 y_1 + x_2 y_2 + \cdots + x_n y_n$. The norm of $x$ is defined as $\|x\| = (x^T x)^{1/2}$.

Theorem 5.1.1: Let $x, y$ be nonzero vectors in $\mathbb{R}^2$ or $\mathbb{R}^3$, and let $\theta$ be the angle between them. Then $x^T y = \|x\|\,\|y\| \cos\theta$.
pf: By the law of cosines, $\|y - x\|^2 = \|x\|^2 + \|y\|^2 - 2\|x\|\,\|y\|\cos\theta$; expanding the left side as $\|y\|^2 - 2\,x^T y + \|x\|^2$ gives the result.
Note: If $\theta$ is the angle between $x$ and $y$, then $\cos\theta = \dfrac{x^T y}{\|x\|\,\|y\|}$. Thus $x^T y = 0$ iff $\theta = \pi/2$.
Def: $x$ and $y$ are orthogonal ($x \perp y$) if $x^T y = 0$.

Cor (Cauchy-Schwarz inequality): Let $x, y \in \mathbb{R}^n$. Then $|x^T y| \le \|x\|\,\|y\|$. Moreover, equality holds iff $x = \alpha y$ or $y = \alpha x$ for some scalar $\alpha$.

Scalar and vector projections
Let $x, y \in \mathbb{R}^n$ with $y \ne 0$. Then the quantity $\alpha = \dfrac{x^T y}{\|y\|}$ is the scalar projection of $x$ onto $y$, and the vector $p = \alpha\,\dfrac{y}{\|y\|} = \dfrac{x^T y}{y^T y}\,y$ is called the vector projection of $x$ onto $y$. If $\|y\| = 1$, then the scalar projection of $x$ onto $y$ is $x^T y$ and the vector projection of $x$ onto $y$ is $(x^T y)\,y$.

Example: Find the point on a given line through the origin that is closest to the point $(1, 4)$.
Sol: Take a vector $w$ that lies along the line. The desired point is the vector projection of $v = (1, 4)^T$ onto $w$, namely $p = \dfrac{v^T w}{w^T w}\,w$.

Example: Find the equation of the plane passing through a given point $P_0 = (x_0, y_0, z_0)$ and normal to a given vector $N = (a, b, c)^T$.
Sol: A point $P = (x, y, z)$ lies on the plane iff $N^T (P - P_0) = 0$, i.e. $a(x - x_0) + b(y - y_0) + c(z - z_0) = 0$.

Example: Find the distance from a point $P_1$ to the plane $ax + by + cz = d$.
Sol: A normal vector to the plane is $N = (a, b, c)^T$. Pick any point $P_0$ on the plane; the desired distance is the absolute scalar projection of $v = P_1 - P_0$ onto $N$, namely $\dfrac{|v^T N|}{\|N\|}$.
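A minimal numpy sketch of these projection and distance formulas; the line direction, plane, and points below are illustrative stand-ins for the slides' lost data:

    import numpy as np

    # Closest point on the line span(w) to v: the vector projection
    v = np.array([1.0, 4.0])            # the point (1, 4)
    w = np.array([1.0, 2.0])            # assumed direction of the line
    p = (v @ w) / (w @ w) * w
    print(p)                            # [1.8 3.6]

    # Distance from P1 to the plane 2x - y + 2z = 4
    N = np.array([2.0, -1.0, 2.0])      # normal vector (a, b, c)
    P0 = np.array([2.0, 0.0, 0.0])      # a point satisfying the plane equation
    P1 = np.array([1.0, 1.0, 1.0])
    dist = abs(N @ (P1 - P0)) / np.linalg.norm(N)
    print(dist)                         # 0.333...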

Orthogonal subspaces
Def: Let $X, Y$ be two subspaces of $\mathbb{R}^n$. We say that $X \perp Y$ if $x^T y = 0$ for every $x \in X$ and every $y \in Y$.
Example: In $\mathbb{R}^3$, the $x$-axis is orthogonal to the $y$-axis, but the $xy$-plane is not orthogonal to the $yz$-plane (they share the $y$-axis).

Def: Let $Y$ be a subspace of $\mathbb{R}^n$. Then the orthogonal complement of $Y$ is defined as $Y^\perp = \{x \in \mathbb{R}^n : x^T y = 0 \text{ for every } y \in Y\}$.
Example: In $\mathbb{R}^3$, the orthogonal complement of the $z$-axis is the $xy$-plane.

Lemma: (i) If two subspaces satisfy $X \perp Y$, then $X \cap Y = \{0\}$. (ii) If $Y$ is a subspace of $\mathbb{R}^n$, then $Y^\perp$ is also a subspace of $\mathbb{R}^n$.
pf: (i) If $v \in X \cap Y$, then $v^T v = 0$, so $v = 0$. (ii) Let $x_1, x_2 \in Y^\perp$ and let $\alpha, \beta$ be scalars; for every $y \in Y$, $(\alpha x_1 + \beta x_2)^T y = \alpha\,x_1^T y + \beta\,x_2^T y = 0$, so $\alpha x_1 + \beta x_2 \in Y^\perp$.

Four fundamental subspaces
Let $A \in \mathbb{R}^{m \times n}$. Its four fundamental subspaces are the column space $R(A) = \{b \in \mathbb{R}^m : b = Ax \text{ for some } x \in \mathbb{R}^n\}$, the null space $N(A) = \{x \in \mathbb{R}^n : Ax = 0\}$, the row space $R(A^T)$, and $N(A^T)$. It will be shown later that $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.

Theorem 5.1.2: Let $A \in \mathbb{R}^{m \times n}$. Then $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$.
pf: Let $x \in N(A)$, so $Ax = 0$, i.e. every row of $A$ is orthogonal to $x$. Any $y \in R(A^T)$ has the form $y = A^T z$ for some $z$, so $y^T x = z^T A x = 0$; hence $N(A) \subseteq R(A^T)^\perp$. Also, if $x \in R(A^T)^\perp$, then in particular $x$ is orthogonal to every row of $A$, so $Ax = 0$ and $x \in N(A)$. Similarly, applying the argument to $A^T$ gives $N(A^T) = R(A)^\perp$.

Example: Let $A = \begin{pmatrix} 1 & 0 \\ 2 & 0 \end{pmatrix}$. Clearly, $R(A^T) = \text{span}\{(1, 0)^T\}$ and $N(A) = \text{span}\{(0, 1)^T\}$, so $N(A) = R(A^T)^\perp$; likewise $R(A) = \text{span}\{(1, 2)^T\}$ and $N(A^T) = \text{span}\{(-2, 1)^T\} = R(A)^\perp$.

Theorem 5.2.2: Let $S$ be a subspace of $\mathbb{R}^n$. Then
(i) $\dim S + \dim S^\perp = n$;
(ii) if $\{x_1, \ldots, x_r\}$ is a basis for $S$ and $\{x_{r+1}, \ldots, x_n\}$ is a basis for $S^\perp$, then $\{x_1, \ldots, x_n\}$ is a basis for $\mathbb{R}^n$.
pf: If $S = \{0\}$, the result follows trivially. Suppose $\dim S = r > 0$. Let $X \in \mathbb{R}^{r \times n}$ have the basis vectors of $S$ as its rows; then $S = R(X^T)$ and $S^\perp = N(X)$, so $\dim S^\perp = n - r$.

To show that $\{x_1, \ldots, x_n\}$ is a basis for $\mathbb{R}^n$, it remains to show their independence. Let $c_1 x_1 + \cdots + c_n x_n = 0$. Then $u = c_1 x_1 + \cdots + c_r x_r \in S$ and $v = c_{r+1} x_{r+1} + \cdots + c_n x_n \in S^\perp$ satisfy $u = -v \in S \cap S^\perp = \{0\}$, so independence of $\{x_1, \ldots, x_r\}$ gives $c_1 = \cdots = c_r = 0$. Similarly, $c_{r+1} = \cdots = c_n = 0$. This completes the proof.

Def: Let $U, V$ be two subspaces of $\mathbb{R}^n$. We say that $W$ is a direct sum of $U$ and $V$, denoted by $W = U \oplus V$, if each $w \in W$ can be written uniquely as a sum $u + v$, where $u \in U$ and $v \in V$.
Example: Let $U = \text{span}\{(1, 0)^T\}$ and $V = \text{span}\{(1, 1)^T\}$. Then $\mathbb{R}^2 = U \oplus V$; but with $V' = \text{span}\{(1, 0)^T, (1, 1)^T\}$, the sum $U + V' = \mathbb{R}^2$ is not direct, since the representations are not unique.

Theorem 5.2.3: If $S$ is a subspace of $\mathbb{R}^n$, then $\mathbb{R}^n = S \oplus S^\perp$.
pf: By Theorem 5.2.2, each $x \in \mathbb{R}^n$ can be written as $x = u + v$ with $u \in S$ and $v \in S^\perp$. To show uniqueness, suppose $x = u_1 + v_1 = u_2 + v_2$ where $u_i \in S$ and $v_i \in S^\perp$. Then $u_1 - u_2 = v_2 - v_1 \in S \cap S^\perp = \{0\}$.

Theorem 5.2.4: If $S$ is a subspace of $\mathbb{R}^n$, then $(S^\perp)^\perp = S$.
pf: Let $x \in S$; then $x \perp y$ for every $y \in S^\perp$, so $S \subseteq (S^\perp)^\perp$. Conversely, if $z \in (S^\perp)^\perp$, write $z = u + v$ with $u \in S$ and $v \in S^\perp$. Then $v^T z = 0$ and $v^T u = 0$, hence $v^T v = 0$ (why?), so $v = 0$ and $z = u \in S$.

Remark: Let $A \in \mathbb{R}^{m \times n}$. Then $\mathbb{R}^n = R(A^T) \oplus N(A)$ and $\mathbb{R}^m = R(A) \oplus N(A^T)$, i.e. each $x \in \mathbb{R}^n$ splits uniquely as $x = x_r + x_0$ with $x_r \in R(A^T)$, $x_0 \in N(A)$, and $Ax = Ax_r$. Since $\dim R(A^T) = \dim R(A)$, the restrictions of $A$ to $R(A^T)$ and of $A^T$ to $R(A)$ are bijections.

Let $\hat{A} : R(A^T) \to R(A)$ denote the restriction of $A$ to the row space; $\hat{A}$ is a bijection.

Cor 5.2.5: Let $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$. Then either (i) there is a vector $x \in \mathbb{R}^n$ such that $Ax = b$, or (ii) there is a vector $y \in \mathbb{R}^m$ such that $A^T y = 0$ and $y^T b \ne 0$.
pf: Either $b \in R(A)$ or $b \notin R(A)$. Note that $\mathbb{R}^m = R(A) \oplus N(A^T)$; write $b = u + y$ with $u \in R(A)$ and $y \in N(A^T)$. If $b \notin R(A)$, then $y \ne 0$ and $y^T b = y^T y \ne 0$.

Example: Let $A$ be a given matrix. Find bases for $R(A^T)$, $N(A)$, $R(A)$, and $N(A^T)$. The basic idea is that the row space and the solution set of $Ax = 0$ are invariant under row operations.
Sol: (i) Row reduce $A$ to echelon form $U$; the nonzero rows of $U$ form a basis for $R(A^T)$ (why?). (ii) Solving $Ux = 0$ by back substitution yields a basis for $N(A)$ (why?). (iii) Similarly, row reducing $A^T$ yields bases for $R(A)$ and $N(A^T)$. (iv) Clearly, $N(A) = R(A^T)^\perp$ and $N(A^T) = R(A)^\perp$ can then be checked directly, as in the sketch below.
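A sketch using scipy (its orth and null_space routines) to compute orthonormal bases of the four fundamental subspaces for an illustrative matrix and verify Theorem 5.1.2:

    import numpy as np
    from scipy.linalg import orth, null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])    # illustrative rank-1 matrix

    col = orth(A)              # orthonormal basis for R(A)
    row = orth(A.T)            # orthonormal basis for R(A^T)
    nul = null_space(A)        # orthonormal basis for N(A)
    lnul = null_space(A.T)     # orthonormal basis for N(A^T)

    print(np.allclose(row.T @ nul, 0))    # True: N(A) = R(A^T)_perp
    print(np.allclose(col.T @ lnul, 0))   # True: N(A^T) = R(A)_perp
    print(row.shape[1] + nul.shape[1])    # 3 = n (Theorem 5.2.2)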

Example: Let $A$ be a given matrix. (i) Find $R(A^T)$ and $N(A)$. (ii) Show that the mapping $\hat{A} : R(A^T) \to R(A)$ is a bijection, and (iii) find its inverse. (iv) What is the matrix representation for $\hat{A}$?

Inner product spaces
A tool to measure the orthogonality of two vectors in a general vector space.

Def: An inner product on a vector space $V$ is a function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{R}$ satisfying the following conditions:
(i) $\langle x, x \rangle \ge 0$, with equality iff $x = 0$;
(ii) $\langle x, y \rangle = \langle y, x \rangle$ for all $x, y \in V$;
(iii) $\langle \alpha x + \beta y, z \rangle = \alpha \langle x, z \rangle + \beta \langle y, z \rangle$ for all $x, y, z \in V$ and all scalars $\alpha, \beta$.

Example:
(i) Let $x, y \in \mathbb{R}^n$. Then $\langle x, y \rangle = x^T y$ is an inner product on $\mathbb{R}^n$.
(ii) Let $A, B \in \mathbb{R}^{m \times n}$. Then $\langle A, B \rangle = \sum_{i=1}^{m} \sum_{j=1}^{n} a_{ij} b_{ij}$ is an inner product on $\mathbb{R}^{m \times n}$.
(iii) Let $f, g \in C[a, b]$. Then $\langle f, g \rangle = \int_a^b f(x)\,g(x)\,dx$ is an inner product on $C[a, b]$.
(iv) Let $w$ be a positive function and let $x_1, \ldots, x_n$ be distinct real numbers. Then $\langle p, q \rangle = \sum_{i=1}^{n} w(x_i)\,p(x_i)\,q(x_i)$ is an inner product on the space of polynomials of degree less than $n$.

Def: Let $\langle \cdot, \cdot \rangle$ be an inner product on a vector space $V$ and let $u, v \in V$. We say $u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$. The length or norm of $v$ is $\|v\| = \sqrt{\langle v, v \rangle}$.

Theorem 5.3.1 (the Pythagorean law): If $u \perp v$, then $\|u + v\|^2 = \|u\|^2 + \|v\|^2$.
pf: $\|u + v\|^2 = \langle u + v, u + v \rangle = \|u\|^2 + 2\langle u, v \rangle + \|v\|^2 = \|u\|^2 + \|v\|^2$.

Example: Consider $C[-1, 1]$ with inner product $\langle f, g \rangle = \int_{-1}^{1} f(x)\,g(x)\,dx$.
(i) $\langle 1, x \rangle = \int_{-1}^{1} x\,dx = 0$, so $1 \perp x$.
(ii) $\|1\|^2 = \int_{-1}^{1} 1\,dx = 2$.
(iii) $\|x\|^2 = \int_{-1}^{1} x^2\,dx = 2/3$.
(iv) (Pythagorean law) $\|1 + x\|^2 = \|1\|^2 + \|x\|^2 = 8/3$, or directly $\int_{-1}^{1} (1 + x)^2\,dx = 8/3$.

Example: Consider $C[-\pi, \pi]$ with inner product $\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\,g(x)\,dx$. It can be shown that
(i) $\langle \cos x, \sin x \rangle = 0$;
(ii) $\|\cos x\| = \|\sin x\| = 1$;
(iii) $\|1/\sqrt{2}\| = 1$.
Thus $\{1/\sqrt{2}, \cos x, \sin x\}$ is an orthonormal set.
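These integrals can be verified numerically; a small sketch using scipy's quad (an illustration, not part of the original slides):

    import numpy as np
    from scipy.integrate import quad

    # Inner product <f, g> = (1/pi) * integral_{-pi}^{pi} f(x) g(x) dx
    def ip(f, g):
        val, _ = quad(lambda x: f(x) * g(x), -np.pi, np.pi)
        return val / np.pi

    half = lambda x: 1.0 / np.sqrt(2.0)
    print(round(ip(np.cos, np.sin), 10))   # 0.0 -> cos x and sin x orthogonal
    print(round(ip(np.cos, np.cos), 10))   # 1.0 -> ||cos x|| = 1
    print(round(ip(np.sin, np.sin), 10))   # 1.0 -> ||sin x|| = 1
    print(round(ip(half, half), 10))       # 1.0 -> ||1/sqrt(2)|| = 1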

Example: Let $V = C[0, 1]$ with $\langle f, g \rangle = \int_0^1 f(x)\,g(x)\,dx$, and let $f(x) = 1$, $g(x) = x$. Then $\langle f, g \rangle = 1/2 \ne 0$, so $f$ is not orthogonal to $g$ here, although $1 \perp x$ in $C[-1, 1]$: orthogonality depends on the inner product.

Def: Let $u, v$ be two vectors in an inner product space with $v \ne 0$. Then the scalar projection of $u$ onto $v$ is defined as $\alpha = \dfrac{\langle u, v \rangle}{\|v\|}$. The vector projection of $u$ onto $v$ is $p = \alpha\,\dfrac{v}{\|v\|} = \dfrac{\langle u, v \rangle}{\langle v, v \rangle}\,v$.

Lemma: Let $p$ be the vector projection of $u$ onto $v \ne 0$. Then $u - p \perp p$, and $u = p$ iff $u = \alpha v$ for some scalar $\alpha$.
pf: $\langle u - p, v \rangle = \langle u, v \rangle - \dfrac{\langle u, v \rangle}{\langle v, v \rangle}\,\langle v, v \rangle = 0$, and $p$ is a scalar multiple of $v$.

Theorem 5.3.2 (Cauchy-Schwarz inequality): Let $u, v$ be two vectors in an inner product space. Then $|\langle u, v \rangle| \le \|u\|\,\|v\|$. Moreover, equality holds iff $u$ and $v$ are linearly dependent.
pf: If $v = 0$, both sides are zero. If $v \ne 0$, let $p$ be the vector projection of $u$ onto $v$; since $u - p \perp p$, the Pythagorean law gives $\|u\|^2 = \|p\|^2 + \|u - p\|^2 \ge \|p\|^2 = \dfrac{\langle u, v \rangle^2}{\|v\|^2}$. Equality holds iff $u = p$, i.e., equality holds iff $u$ and $v$ are linearly dependent.

Note: From the Cauchy-Schwarz inequality, $-1 \le \dfrac{\langle u, v \rangle}{\|u\|\,\|v\|} \le 1$. Thus, we can define $\theta = \cos^{-1} \dfrac{\langle u, v \rangle}{\|u\|\,\|v\|}$ as the angle between the two vectors.

Def: Let $V$ be a vector space. A function $\|\cdot\| : V \to \mathbb{R}$ is said to be a norm if it satisfies:
(i) $\|v\| \ge 0$, with equality iff $v = 0$;
(ii) $\|\alpha v\| = |\alpha|\,\|v\|$ for every scalar $\alpha$;
(iii) $\|u + v\| \le \|u\| + \|v\|$ for all $u, v \in V$.

Theorem 5.3.3: If $V$ is an inner product space, then $\|v\| = \sqrt{\langle v, v \rangle}$ is a norm on $V$.
pf: (i) and (ii) are trivial; (iii) follows from the Cauchy-Schwarz inequality.
Def: The distance between $u$ and $v$ is defined as $d(u, v) = \|u - v\|$.

Example: Let $x \in \mathbb{R}^n$. Then $\|x\|_\infty = \max_{1 \le i \le n} |x_i|$ is a norm, and $\|x\|_p = \left( \sum_{i=1}^{n} |x_i|^p \right)^{1/p}$ is a norm for any $p \ge 1$. In particular, $\|x\|_2$ is the Euclidean norm.

Example: Let. Then

Example: Let $x = (1, 1)^T$. Then $\|x\|_1 = 2$, $\|x\|_2 = \sqrt{2}$, and $\|x\|_\infty = 1$. Thus, different norms assign different lengths to the same vector. However, $\|x\|_\infty \le \|x\|_2 \le \|x\|_1$ always holds. (Why?)

Example: Let $u = (1, 2)^T$ and $v = (4, 6)^T$. Then $d(u, v) = \|u - v\|_2 = \|(-3, -4)^T\|_2 = 5$.

Least squares problem
A typical example: Given data $(x_1, y_1), \ldots, (x_m, y_m)$, find the best line $y = c_0 + c_1 x$ to fit the data, i.e., find $c_0, c_1$ such that $\sum_{i=1}^{m} \big( y_i - (c_0 + c_1 x_i) \big)^2$ is minimum.
Geometrical meaning: minimize the total squared vertical distance between the data points and the line.

Least squares problem: Given $A \in \mathbb{R}^{m \times n}$, $m > n$, and $b \in \mathbb{R}^m$, the equation $Ax = b$ may not have solutions. The objective of the least squares problem is to find $\hat{x}$ such that $\|b - A\hat{x}\|_2$ has minimum value, i.e., find $\hat{x}$ satisfying $\|b - A\hat{x}\|_2 \le \|b - Ax\|_2$ for all $x \in \mathbb{R}^n$.

Preview of the results: It will be shown that $\hat{x}$ is a least squares solution iff $A^T A \hat{x} = A^T b$. Moreover, $\hat{x} = (A^T A)^{-1} A^T b$ is unique if the columns of $A$ are linearly independent.

Theorem 5.4.1:
H. Let $S$ be a subspace of $\mathbb{R}^m$, let $b \in \mathbb{R}^m$, and write $b = p + z$ where $p \in S$ and $z \in S^\perp$.
C. (i) $\|b - y\| > \|b - p\|$ for all $y \in S$ with $y \ne p$; (ii) $p$ is the unique element of $S$ closest to $b$.
pf: For $y \in S$, $\|b - y\|^2 = \|(b - p) + (p - y)\|^2 = \|b - p\|^2 + \|p - y\|^2$, where $b - p = z \in S^\perp$ and $p - y \in S$. If $y \ne p$, then $\|b - y\| > \|b - p\|$. Since the expression $b = p + z$ is unique, result (i) is then proved. (ii) follows directly from (i) by noting that the minimizer over $S$ is exactly $p$.

Question: How to find $\hat{x}$ which solves $\min_x \|b - Ax\|_2$?
Answer: Let $S = R(A)$. From the previous theorem, we know that $A\hat{x}$ must be the projection $p$ of $b$ onto $S$, so $b - A\hat{x} \in R(A)^\perp = N(A^T)$. Hence $A^T (b - A\hat{x}) = 0$, i.e., $A^T A \hat{x} = A^T b$ (the normal equations).

Theorem 5.4.2: Let $A \in \mathbb{R}^{m \times n}$ with rank$(A) = n$ and $b \in \mathbb{R}^m$. Then the normal equations $A^T A x = A^T b$ have a unique solution $\hat{x} = (A^T A)^{-1} A^T b$, and $\hat{x}$ is the unique least squares solution to $Ax = b$.
pf: Clearly, $A^T A$ is nonsingular (why?), so $\hat{x}$ is the unique solution to the normal equations, and therefore $\hat{x}$ is the unique solution to the least squares problem (why?). ($A$ has linearly independent columns.)

Note: The projection vector $p = A\hat{x} = A (A^T A)^{-1} A^T b$ is the element of $R(A)$ that is closest to $b$ in the least squares sense. Thus, the matrix $P = A (A^T A)^{-1} A^T$ is called the projection matrix (it projects any vector of $\mathbb{R}^m$ onto $R(A)$).
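A minimal numpy sketch of the normal equations and the projection matrix, using illustrative data:

    import numpy as np

    # Illustrative inconsistent system A x = b
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    # Least squares solution from the normal equations A^T A x = A^T b
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)

    # Projection matrix P = A (A^T A)^{-1} A^T and projection p = A x_hat
    P = A @ np.linalg.inv(A.T @ A) @ A.T
    p = A @ x_hat
    print(x_hat)                           # [0.6666..., 0.5]
    print(np.allclose(P @ b, p))           # True
    print(np.allclose(A.T @ (b - p), 0))   # True: residual orthogonal to R(A)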

Example: Suppose a spring obeys Hooke's law $F = kx$, and a series of data $(x_1, F_1), \ldots, (x_m, F_m)$ are taken (with measurement error). How to determine $k$?
Sol: Note that the system $x_i\,k = F_i$, $i = 1, \ldots, m$, is inconsistent. With $A = (x_1, \ldots, x_m)^T$ and $b = (F_1, \ldots, F_m)^T$, the normal equation is $\left( \sum_i x_i^2 \right) k = \sum_i x_i F_i$, so the least squares solution is $\hat{k} = \dfrac{\sum_i x_i F_i}{\sum_i x_i^2}$.

Example: Given data $(x_1, y_1), \ldots, (x_m, y_m)$, find the best least squares fit by a linear function.
Sol: Let the desired linear function be $y = c_0 + c_1 x$. The problem becomes to find the least squares solution of
$\begin{pmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_m \end{pmatrix} \begin{pmatrix} c_0 \\ c_1 \end{pmatrix} = \begin{pmatrix} y_1 \\ \vdots \\ y_m \end{pmatrix}$.
The least squares solution $(\hat{c}_0, \hat{c}_1)$ gives the best linear least squares fit $y = \hat{c}_0 + \hat{c}_1 x$.

Example: Find the best quadratic least squares fit to data $(x_1, y_1), \ldots, (x_m, y_m)$.
Sol: Let the desired quadratic function be $y = c_0 + c_1 x + c_2 x^2$. The problem becomes to find the least squares solution of $Vc = y$, where row $i$ of $V$ is $(1, x_i, x_i^2)$. The least squares solution $\hat{c}$ gives the best quadratic least squares fit $y = \hat{c}_0 + \hat{c}_1 x + \hat{c}_2 x^2$.
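Both fits (and the spring example above) reduce to least squares on a Vandermonde-type system; a numpy sketch with made-up data:

    import numpy as np

    # Made-up measurements (the slides' data were not preserved)
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.1, 1.9, 3.2, 3.8])

    # Best line y = c0 + c1 x: least squares on the Vandermonde system
    V1 = np.vander(x, 2, increasing=True)    # columns [1, x]
    c_lin, *_ = np.linalg.lstsq(V1, y, rcond=None)

    # Best parabola y = c0 + c1 x + c2 x^2
    V2 = np.vander(x, 3, increasing=True)    # columns [1, x, x^2]
    c_quad, *_ = np.linalg.lstsq(V2, y, rcond=None)

    print(c_lin)     # coefficients of the least squares line
    print(c_quad)    # coefficients of the least squares parabola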

Orthonormal sets
- Simplify the least squares solution (avoid computing an inverse)
- Numerical computational stability

Def: $\{v_1, \ldots, v_n\}$ is said to be an orthogonal set in an inner product space if $\langle v_i, v_j \rangle = 0$ whenever $i \ne j$. Moreover, if $\|v_i\| = 1$ for each $i$, then the set is said to be orthonormal.

Example: $\{(1, 1)^T, (1, -1)^T\}$ is an orthogonal set but not orthonormal. However, $\left\{ \frac{1}{\sqrt{2}}(1, 1)^T, \frac{1}{\sqrt{2}}(1, -1)^T \right\}$ is orthonormal.

Theorem 5.5.1: Let $\{v_1, \ldots, v_n\}$ be an orthogonal set of nonzero vectors in an inner product space. Then they are linearly independent.
pf: Suppose $c_1 v_1 + \cdots + c_n v_n = 0$. Taking the inner product with $v_j$ gives $c_j \|v_j\|^2 = 0$, so $c_j = 0$ for every $j$; hence $\{v_1, \ldots, v_n\}$ is linearly independent.

Example: $\{1/\sqrt{2}, \cos x, \sin x, \cos 2x, \sin 2x, \ldots\}$ is an orthonormal set of $C[-\pi, \pi]$ with inner product $\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
Note: Now you know the meaning when one says that two signals (e.g., sinusoids of different frequencies) are orthogonal.

Theorem 5.5.2: Let $\{u_1, \ldots, u_n\}$ be an orthonormal basis for an inner product space $V$. If $v = \sum_{i=1}^{n} c_i u_i$, then $c_i = \langle v, u_i \rangle$.
pf: $\langle v, u_j \rangle = \left\langle \sum_i c_i u_i,\, u_j \right\rangle = \sum_i c_i \langle u_i, u_j \rangle = c_j$.

Cor: Let $\{u_1, \ldots, u_n\}$ be an orthonormal basis for an inner product space. If $u = \sum_i a_i u_i$ and $v = \sum_i b_i u_i$, then $\langle u, v \rangle = \sum_{i=1}^{n} a_i b_i$.
pf: $\langle u, v \rangle = \left\langle \sum_i a_i u_i,\, v \right\rangle = \sum_i a_i \langle u_i, v \rangle = \sum_i a_i b_i$.

Cor (Parseval's formula): If $\{u_1, \ldots, u_n\}$ is an orthonormal basis for an inner product space and $v = \sum_i c_i u_i$, then $\|v\|^2 = \sum_{i=1}^{n} c_i^2$.
pf: Direct from the previous corollary with $u = v$.

Example: $u_1 = \frac{1}{\sqrt{2}}(1, 1)^T$ and $u_2 = \frac{1}{\sqrt{2}}(1, -1)^T$ form an orthonormal basis for $\mathbb{R}^2$. If $v = (3, 1)^T$, then $\langle v, u_1 \rangle = 2\sqrt{2}$ and $\langle v, u_2 \rangle = \sqrt{2}$, and (Parseval) $\|v\|^2 = (2\sqrt{2})^2 + (\sqrt{2})^2 = 10$.

Example: Determine $\int_{-\pi}^{\pi} (\sin x + \cos x)^2\,dx$ without computing antiderivatives.
Sol: $\sin x$ and $\cos x$ form an orthonormal set of $C[-\pi, \pi]$ under $\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f g\,dx$, so by Parseval $\|\sin x + \cos x\|^2 = 1^2 + 1^2 = 2$, and the integral equals $2\pi$.

Def: $Q \in \mathbb{R}^{n \times n}$ is said to be an orthogonal matrix if the column vectors of $Q$ form an orthonormal set in $\mathbb{R}^n$.
Example: The rotation matrix $\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$ and the elementary reflection matrix $\begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix}$ are orthogonal matrices.

Properties of an orthogonal matrix: Let $Q \in \mathbb{R}^{n \times n}$ be orthogonal. Then:
- $Q^T Q = I$, i.e., $Q^{-1} = Q^T$
- The column vectors of $Q$ form an orthonormal basis for $\mathbb{R}^n$
- $Q$ preserves inner products: $(Qx)^T (Qy) = x^T y$; hence it preserves norms and angles.

Note: Let the columns of $Q \in \mathbb{R}^{m \times n}$ form an orthonormal set of $\mathbb{R}^m$. Then $Q^T Q = I$, and the least squares solution to $Qx = b$ is $\hat{x} = Q^T b$. This avoids computing a matrix inverse.

Cor 5.5.9: Let $S$ be a nonzero subspace of $\mathbb{R}^m$ and let $\{u_1, \ldots, u_k\}$ be an orthonormal basis for $S$. If $b \in \mathbb{R}^m$, then the projection of $b$ onto $S$ is $p = \sum_{i=1}^{k} (u_i^T b)\,u_i = U U^T b$, where $U = (u_1, \ldots, u_k)$.
pf: $p = U\hat{x}$ with $\hat{x} = (U^T U)^{-1} U^T b = U^T b$.

Note: Let the columns $u_1, \ldots, u_k$ of $U$ be an orthonormal set. The projection of $b$ onto $R(U)$ is the sum of the projections of $b$ onto each $u_i$.
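A small numpy illustration of the no-inverse least squares solution and of the projection as a sum of one-dimensional projections; the subspace below is an arbitrary example:

    import numpy as np

    # Two orthonormal vectors spanning a subspace S of R^3
    u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
    u2 = np.array([0.0, 0.0, 1.0])
    U = np.column_stack([u1, u2])
    b = np.array([1.0, 2.0, 3.0])

    x_hat = U.T @ b    # least squares solution of U x = b, no inverse needed
    p = U @ x_hat      # projection of b onto S
    print(np.allclose(p, (u1 @ b) * u1 + (u2 @ b) * u2))   # True
    print(p)           # [1.5 1.5 3. ]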

Example: Let $S$ be the subspace of $\mathbb{R}^3$ spanned by two given orthonormal vectors $u_1, u_2$. Find the vector in $S$ that is closest to a given $b$.
Sol: Clearly $\{u_1, u_2\}$ is a basis for $S$. Let $p = (u_1^T b)\,u_1 + (u_2^T b)\,u_2$. Thus $p$ is the desired vector.
Hw: Try $b \in S$. What is $p$ then?

Approximation of functions
Example: Find the best least squares approximation to $e^x$ on $[0, 1]$ by a linear function, using $\langle f, g \rangle = \int_0^1 f(x)\,g(x)\,dx$.
Sol: (i) Clearly $\{1, x\}$ spans the linear functions, but it is not orthonormal. (ii) Seek a function of the form $c_1 u_1 + c_2 u_2$. By calculation (Gram-Schmidt), $u_1 = 1$ and $u_2 = \sqrt{3}\,(2x - 1)$ form an orthonormal set of $C[0, 1]$. (iii) Thus the projection $p = \langle e^x, u_1 \rangle\,u_1 + \langle e^x, u_2 \rangle\,u_2$ is the best linear least squares approximation to $e^x$ on $[0, 1]$.
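A sketch computing the two projection coefficients numerically with scipy's quad, assuming the $e^x$ target above:

    import numpy as np
    from scipy.integrate import quad

    # <f, g> = integral_0^1 f(x) g(x) dx
    def ip(f, g):
        return quad(lambda x: f(x) * g(x), 0.0, 1.0)[0]

    u1 = lambda x: 1.0
    u2 = lambda x: np.sqrt(3.0) * (2.0 * x - 1.0)

    c1 = ip(np.exp, u1)    # <e^x, u1> = e - 1
    c2 = ip(np.exp, u2)    # <e^x, u2>
    p = lambda x: c1 * u1(x) + c2 * u2(x)   # best linear approximation
    print(c1, c2)
    print(p(0.0), p(1.0))  # endpoint values of the fitted line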

Approximation by trigonometric polynomials
FACT: $\{1/\sqrt{2}, \cos x, \sin x, \cos 2x, \sin 2x, \ldots, \cos nx, \sin nx\}$ forms an orthonormal set in $C[-\pi, \pi]$ with respect to the inner product $\langle f, g \rangle = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\,g(x)\,dx$.
Problem: Given a periodic function $f$, find a trigonometric polynomial of degree $n$, $t_n(x) = \frac{a_0}{2} + \sum_{k=1}^{n} (a_k \cos kx + b_k \sin kx)$, which is a best least squares approximation to $f$.

Sol: It suffices to find the projection of $f$ onto the subspace spanned by the orthonormal set above. The best approximation of $f$ has coefficients $a_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos kx\,dx$ and $b_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin kx\,dx$, the Fourier coefficients of $f$.
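A sketch computing these Fourier coefficients numerically with scipy's quad; the test function $f(x) = x$ is an illustration with known coefficients:

    import numpy as np
    from scipy.integrate import quad

    def fourier_coeffs(f, n):
        # a_k = (1/pi) * int f(x) cos(kx) dx, b_k likewise with sin(kx)
        a = [quad(lambda x, k=k: f(x) * np.cos(k * x), -np.pi, np.pi)[0] / np.pi
             for k in range(n + 1)]
        b = [quad(lambda x, k=k: f(x) * np.sin(k * x), -np.pi, np.pi)[0] / np.pi
             for k in range(1, n + 1)]
        return a, b    # t_n(x) = a[0]/2 + sum_k a[k] cos(kx) + b[k] sin(kx)

    # For f(x) = x the known coefficients are a_k = 0, b_k = 2(-1)^(k+1)/k
    a, b = fourier_coeffs(lambda x: x, 3)
    print(np.round(a, 6))    # [0. 0. 0. 0.]
    print(np.round(b, 6))    # [ 2. -1.  0.666667]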

Example: Consider $C[-\pi, \pi]$ with the inner product above. (i) Check that $1/\sqrt{2}$, $\cos x$, and $\sin x$ are orthonormal. (ii) Let $f$ be a given periodic function and compute $a_0, a_1, b_1$; similarly compute the higher coefficients. (iii) Form the trigonometric polynomial $t_n$. (iv) Compare $t_n$ with $f$.

Gram-Schmidt orthogonalization process
Question: Given a set of linearly independent vectors, how to transform them into orthonormal ones while preserving the spanning set?

Given linearly independent $x_1, x_2, \ldots$: set $u_1 = \dfrac{x_1}{\|x_1\|}$; clearly span$\{u_1\}$ = span$\{x_1\}$. Next, let $p_1$ be the projection of $x_2$ onto $u_1$ and set $u_2 = \dfrac{x_2 - p_1}{\|x_2 - p_1\|}$; clearly $u_2 \perp u_1$ and span$\{u_1, u_2\}$ = span$\{x_1, x_2\}$. Similarly, subtracting from $x_3$ its projections onto $u_1$ and $u_2$ and normalizing gives $u_3$; clearly the pattern continues. We have the next result.

Theorem 5.6.1 (the Gram-Schmidt process):
H. (i) Let $\{x_1, \ldots, x_n\}$ be a basis for an inner product space $V$. (ii) Define $u_1 = \dfrac{x_1}{\|x_1\|}$ and, for $k = 1, \ldots, n - 1$, $u_{k+1} = \dfrac{x_{k+1} - p_k}{\|x_{k+1} - p_k\|}$, where $p_k = \sum_{i=1}^{k} \langle x_{k+1}, u_i \rangle\,u_i$.
C. $\{u_1, \ldots, u_n\}$ is an orthonormal basis for $V$.
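A direct Python implementation of this process (the classical variant, not a numerically hardened one):

    import numpy as np

    def gram_schmidt(X):
        # Orthonormalize the columns of X (assumed linearly independent)
        m, n = X.shape
        U = np.zeros((m, n))
        for k in range(n):
            v = X[:, k].copy()
            for i in range(k):
                v -= (U[:, i] @ X[:, k]) * U[:, i]   # subtract projections
            U[:, k] = v / np.linalg.norm(v)          # normalize
        return U

    X = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])
    U = gram_schmidt(X)
    print(np.allclose(U.T @ U, np.eye(2)))   # True: columns orthonormal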

Example: Find an orthonormal basis for $P_3$ with inner product given by $\langle p, q \rangle = \sum_{i=1}^{3} p(x_i)\,q(x_i)$, where $x_1 = -1$, $x_2 = 0$, $x_3 = 1$.
Sol: Starting with the basis $\{1, x, x^2\}$: $u_1 = \dfrac{1}{\sqrt{3}}$; since $\langle x, u_1 \rangle = 0$, $u_2 = \dfrac{x}{\|x\|} = \dfrac{x}{\sqrt{2}}$; subtracting from $x^2$ its projection $\frac{2}{3}$ onto span$\{u_1\}$ gives $u_3 = \sqrt{\tfrac{3}{2}} \left( x^2 - \tfrac{2}{3} \right)$.

QR decomposition
Given $A = (a_1, \ldots, a_n) \in \mathbb{R}^{m \times n}$ with linearly independent columns, apply Gram-Schmidt:
$r_{11} = \|a_1\|$, $q_1 = a_1 / r_{11}$ (1)
$r_{ik} = q_i^T a_k$ for $i < k$, and $p_k = \sum_{i=1}^{k-1} r_{ik}\,q_i$ (2)
$r_{kk} = \|a_k - p_k\|$, $q_k = (a_k - p_k) / r_{kk}$ (3)
Define $Q = (q_1, \ldots, q_n)$ and $R = (r_{ik})$. Then $A = QR$, where $Q$ has orthonormal columns and $R$ is upper triangular.

To solve $Ax = b$ with $A = QR$: the normal equations $A^T A x = A^T b$ become $R^T Q^T Q R x = R^T Q^T b$, i.e., $Rx = Q^T b$. Then the system can be solved by back substitution without finding an inverse (exactly so if $A$ is square).
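A sketch of this QR-based solve using numpy's qr and scipy's triangular solver, with the same illustrative data as in the normal equations sketch above:

    import numpy as np
    from scipy.linalg import solve_triangular

    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    Q, R = np.linalg.qr(A)                 # reduced QR: Q^T Q = I, R upper triangular
    x_hat = solve_triangular(R, Q.T @ b)   # back substitution for R x = Q^T b
    print(x_hat)                           # same as the normal equation solution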

Example: Solve a given system $Ax = b$ by QR. By direct calculation, $A = QR$; the solution can then be obtained from $Rx = Q^T b$ by back substitution.