5.1 Orthogonality

Definitions

- A set of vectors is called an orthogonal set if all pairs of distinct vectors in the set are orthogonal.
- An orthonormal set is an orthogonal set of unit vectors.
- An orthogonal (orthonormal) basis for a subspace W of R^n is a basis for W that is an orthogonal (orthonormal) set.
- An orthogonal matrix is a square matrix whose columns form an orthonormal set.
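To make the definitions concrete, here is a minimal NumPy sketch that tests a set of vectors for orthogonality and orthonormality; the vectors are illustrative, not the ones from the slides' examples:

```python
import numpy as np

def is_orthogonal_set(vectors, tol=1e-10):
    """True if every pair of distinct vectors has (near-)zero dot product."""
    return all(abs(np.dot(vectors[i], vectors[j])) < tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

def is_orthonormal_set(vectors, tol=1e-10):
    """An orthonormal set is an orthogonal set of unit vectors."""
    return (is_orthogonal_set(vectors, tol)
            and all(abs(np.linalg.norm(v) - 1) < tol for v in vectors))

# Illustrative vectors: orthogonal but not orthonormal.
v1, v2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
print(is_orthogonal_set([v1, v2]))   # True
print(is_orthonormal_set([v1, v2]))  # False: each vector has length sqrt(2)
```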

Examples

1) Is the following set of vectors orthogonal? Is it orthonormal?
2) Find an orthogonal basis and an orthonormal basis for the subspace W of R^n.

Theorems

- Every orthogonal set of nonzero vectors is linearly independent.
- Let {v1, v2, ..., vk} be an orthogonal basis for a subspace W of R^n and let w be any vector in W. Then the unique scalars c1, c2, ..., ck such that w = c1v1 + c2v2 + ... + ckvk are given by ci = (w · vi) / (vi · vi).

Proof: To find ci, take the dot product of both sides with vi:
w · vi = (c1v1 + c2v2 + ... + ckvk) · vi = ci (vi · vi),
since vj · vi = 0 for j ≠ i. Solving for ci gives ci = (w · vi) / (vi · vi).
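A short sketch of the coefficient formula, using an illustrative orthogonal basis of R^3 (not the basis from the slides):

```python
import numpy as np

def coords_in_orthogonal_basis(w, basis):
    """Coefficients c_i = (w . v_i) / (v_i . v_i) for w in span(basis)."""
    return [np.dot(w, v) / np.dot(v, v) for v in basis]

# Illustrative orthogonal basis of R^3.
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, -1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]
w = np.array([3.0, 1.0, 2.0])

c = coords_in_orthogonal_basis(w, basis)
print(c)                                         # [2.0, 1.0, 2.0]
print(sum(ci * vi for ci, vi in zip(c, basis)))  # reconstructs w = [3, 1, 2]
```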

Examples

3) Take the orthogonal basis for the subspace W found in the previous example, pick a vector in W, and express it in terms of the basis vectors.
4) Is the following matrix orthogonal? If it is orthogonal, find its inverse and its transpose.

Theorems on Orthogonal Matrices

The following statements are equivalent for a square matrix A:
- A is orthogonal
- A^-1 = A^T
- ||Av|| = ||v|| for every v in R^n
- Av1 · Av2 = v1 · v2 for every v1, v2 in R^n

Let A be an orthogonal matrix. Then:
- its rows form an orthonormal set
- A^-1 is also orthogonal
- |det(A)| = 1
- |λ| = 1 for every eigenvalue λ of A
- if B is also an orthogonal matrix, then so is AB
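These properties are easy to check numerically. A minimal sketch using a rotation matrix, the standard example of an orthogonal matrix:

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal for any angle theta.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A.T @ A, np.eye(2)))                 # A^-1 = A^T
v = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(A @ v),
                 np.linalg.norm(v)))                   # lengths preserved
print(np.isclose(abs(np.linalg.det(A)), 1.0))          # |det(A)| = 1
print(np.allclose(np.abs(np.linalg.eigvals(A)), 1.0))  # |lambda| = 1
```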

5.2 Orthogonal Complements and Orthogonal Projections

Orthogonal Complements

Recall that a normal vector n to a plane is orthogonal to every vector in that plane. If the plane passes through the origin, then it is a subspace W of R^3, and span(n) is also a subspace of R^3. Every vector in span(n) is orthogonal to every vector in W, so span(n) is called the orthogonal complement of W.

Definition: A vector v is orthogonal to a subspace W of R^n if it is orthogonal to every vector in W. The set of all vectors that are orthogonal to W is called the orthogonal complement of W, denoted W⊥ (read "W perp").

Reference: http://www.math.tamu.edu/~yvorobet/MATH304-2011C/Lect3-02web.pdf

Example

1) Find the orthogonal complement of the subspace W of R^3.

Theorems

Let W be a subspace of R^n. Then:
- W⊥ is a subspace of R^n
- (W⊥)⊥ = W
- W ∩ W⊥ = {0}
- if W = span(w1, w2, ..., wk), then v is in W⊥ if and only if v · wi = 0 for all i = 1, ..., k

Let A be an m x n matrix. Then (row(A))⊥ = null(A) and (col(A))⊥ = null(A^T).

Proof?
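The relation (row(A))⊥ = null(A) can be checked numerically. A sketch using scipy.linalg.null_space with an illustrative matrix:

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative 2x3 matrix of rank 2, so null(A) is one-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

N = null_space(A)             # columns of N span null(A)
# Every row of A is orthogonal to every null-space vector:
print(np.allclose(A @ N, 0))  # True: null(A) is the complement of row(A)
```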

Example

2) Use the previous theorem to find the orthogonal complement of the subspace W of R^3.

Orthogonal Projections

[Figure: the vector u drawn with its component w1 along v and its component w2 orthogonal to v]

Let u and v be nonzero vectors, and write u = w1 + w2 where w1 is parallel to v and w2 is orthogonal to v.
- w1 is called the vector component of u along v (or the projection of u onto v), denoted proj_v u, and proj_v u = ((u · v) / (v · v)) v.
- w2 is called the vector component of u orthogonal to v, so w2 = u − proj_v u.

Orthogonal Projections

Let W be a subspace of R^n with an orthogonal basis {u1, u2, ..., uk}. The orthogonal projection of v onto W is defined as
proj_W v = proj_u1 v + proj_u2 v + ... + proj_uk v.
The component of v orthogonal to W is the vector
perp_W v = v − proj_W v.

Theorem: Let W be a subspace of R^n and let v be any vector in R^n. Then there are unique vectors w1 in W and w2 in W⊥ such that v = w1 + w2, namely w1 = proj_W v and w2 = perp_W v.
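A minimal NumPy sketch of both kinds of projection. The subspace W here is illustrative (the slides' W is not reproduced), though the vector v matches the one in the example below:

```python
import numpy as np

def proj(u, v):
    """Projection of u onto the line spanned by v."""
    return (np.dot(u, v) / np.dot(v, v)) * v

def proj_W(v, orthogonal_basis):
    """Orthogonal projection of v onto W, given an ORTHOGONAL basis of W."""
    return sum(proj(v, u) for u in orthogonal_basis)

# Illustrative subspace W = span{u1, u2} in R^3, with u1 orthogonal to u2.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([1.0, 0.0, -1.0])
v  = np.array([1.0, -1.0, 2.0])

w1 = proj_W(v, [u1, u2])   # component of v in W
w2 = v - w1                # perp_W v, the component orthogonal to W
print(w1, w2)              # [1. 0. 2.] [ 0. -1.  0.]
print(np.dot(w2, u1), np.dot(w2, u2))  # both 0: w2 lies in W-perp
```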

Examples 3) Find the orthogonal projection of v = [ 1, -1, 2 ] onto W and the component of v orthogonal to W.

5.3 The Gram-Schmidt Process and the QR Factorization

The Gram-Schmidt Process

Goal: construct an orthogonal (orthonormal) basis for any subspace of R^n. We start with any basis {x1, x2, ..., xk} and "orthogonalize" one vector at a time, replacing each xi by its component orthogonal to span(x1, x2, ..., x(i-1)).

Let {x1, x2, ..., xk} be a basis for a subspace W. Choose the vectors
v1 = x1,
v2 = x2 − proj_v1 x2,
v3 = x3 − proj_v1 x3 − proj_v2 x3,
and so on. Then {v1, v2, ..., vk} is an orthogonal basis for W. Normalizing each vector in the basis gives an orthonormal basis.
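A compact implementation of the process; the starting basis is illustrative, not the one from the examples below:

```python
import numpy as np

def gram_schmidt(X):
    """Return a matrix whose columns are an orthogonal basis for col(X)."""
    V = []
    for x in X.T:                 # columns of X, one at a time
        v = x.astype(float)       # start from x_i ...
        for u in V:               # ... and subtract proj_{v_j} x_i
            v = v - (np.dot(x, u) / np.dot(u, u)) * u
        V.append(v)
    return np.column_stack(V)

# Illustrative basis of R^3 (as columns x1, x2, x3).
X = np.column_stack([[1.0, 1.0, 0.0],
                     [1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0]])

V = gram_schmidt(X)
print(np.round(V.T @ V, 6))          # diagonal: the columns are orthogonal
Q = V / np.linalg.norm(V, axis=0)    # normalize to get an orthonormal basis
```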

Examples

1) Use the following basis to find an orthonormal basis for R^2.
2) Find an orthogonal basis for R^3 that contains the given vector.

The QR Factorization

If A is an m x n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m x n matrix with orthonormal columns and R is an invertible upper triangular matrix. In fact, the columns of Q form an orthonormal basis for the column space of A and can be constructed from the columns of A by the Gram-Schmidt process.

Note: Since the columns of Q are orthonormal, Q^T Q = I, and multiplying A = QR on the left by Q^T gives R = Q^T A. (When A is square, Q is an orthogonal matrix and Q^-1 = Q^T.)
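These facts can be verified directly with np.linalg.qr, which computes the (reduced) factorization; the matrix here is illustrative:

```python
import numpy as np

# Illustrative 3x2 matrix with linearly independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3x2, R is 2x2
print(np.allclose(Q.T @ Q, np.eye(2)))   # columns of Q are orthonormal
print(np.allclose(Q @ R, A))             # A = QR
print(np.allclose(R, Q.T @ A))           # R = Q^T A, as in the note above
print(np.allclose(R, np.triu(R)))        # R is upper triangular
```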

Examples 3) Find a QR factorization for the following matrices.

5.4 Orthogonal Diagonalization of Symmetric Matrices

Example

1) Diagonalize the matrix.

Recall: A square matrix A is symmetric if A^T = A. A square matrix A is diagonalizable if there exist an invertible matrix P and a diagonal matrix D such that P^-1 A P = D.

Orthogonal Diagonalization

Definition: A square matrix A is orthogonally diagonalizable if there exist an orthogonal matrix Q and a diagonal matrix D such that Q^-1 A Q = D. Note that Q^-1 = Q^T, so equivalently Q^T A Q = D.

Theorems

- If A is orthogonally diagonalizable, then A is symmetric.
- If A is a real symmetric matrix, then the eigenvalues of A are real.
- If A is a symmetric matrix, then any two eigenvectors corresponding to distinct eigenvalues of A are orthogonal.
- A square matrix A is orthogonally diagonalizable if and only if it is symmetric.
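A brief numerical illustration with an illustrative symmetric matrix; np.linalg.eigh is designed for symmetric matrices and returns orthonormal eigenvectors:

```python
import numpy as np

# Illustrative symmetric matrix (not the one from the slides' examples).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, Q = np.linalg.eigh(A)    # eigenvalues [1, 3] and orthonormal eigenvectors
D = np.diag(lam)
print(np.allclose(Q.T @ Q, np.eye(2)))   # Q is orthogonal
print(np.allclose(Q.T @ A @ Q, D))       # Q^T A Q = D
```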

Example 2) Orthogonally diagonalize the matrix and write A in terms of matrices Q and D.

Theorem

If A is orthogonally diagonalizable with Q^T A Q = D, then A can be written as
A = λ1 q1 q1^T + λ2 q2 q2^T + ... + λn qn qn^T,
where qi is the i-th (orthonormal) column of Q and λi is the corresponding eigenvalue. This fact lets us construct the matrix A from given eigenvalues and orthonormal eigenvectors.

Example 3) Find a 2 x 2 matrix that has eigenvalues 2 and 7, with corresponding eigenvectors
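The slides' eigenvectors for Example 3 are not reproduced above, so the sketch below uses hypothetical orthonormal eigenvectors of my own choosing together with the stated eigenvalues 2 and 7, to illustrate the construction A = λ1 q1 q1^T + λ2 q2 q2^T:

```python
import numpy as np

# Hypothetical orthonormal eigenvectors (my own choice, not the slides').
q1 = np.array([1.0, 1.0]) / np.sqrt(2)
q2 = np.array([1.0, -1.0]) / np.sqrt(2)

A = 2 * np.outer(q1, q1) + 7 * np.outer(q2, q2)
print(A)                      # [[ 4.5 -2.5]
                              #  [-2.5  4.5]]
print(np.linalg.eigvalsh(A))  # [2., 7.]: A has the required eigenvalues
```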

5.5 Applications

Quadratic Forms

A quadratic form in x and y has the form f(x, y) = ax^2 + by^2 + cxy. A quadratic form in x, y, and z has the form f(x, y, z) = ax^2 + by^2 + cz^2 + dxy + exz + fyz. Each can be written in matrix form as f = x^T A x, where x is the variable (column) matrix and A is a symmetric matrix of coefficients.

Quadratic Forms

A quadratic form in n variables is a function f : R^n → R of the form f(x) = x^T A x, where A is a symmetric n x n matrix and x is in R^n. A is called the matrix associated with f.
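Evaluating x^T A x is a one-liner; a tiny sketch with illustrative coefficients (note that each cross-product coefficient is split in half across the two off-diagonal entries of A):

```python
import numpy as np

# f(x) = x^T A x for a symmetric A; here f = x1^2 + 4*x1*x2 + 3*x2^2.
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])
x = np.array([1.0, 1.0])
print(x @ A @ x)   # 8.0, matching 1 + 4 + 3 at (x1, x2) = (1, 1)
```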

The Principal Axes Theorem

Every quadratic form can be diagonalized. In fact, if A is a symmetric n x n matrix and Q is an orthogonal matrix such that Q^T A Q = D, then the change of variable x = Qy transforms the quadratic form x^T A x into
y^T D y = λ1 y1^2 + λ2 y2^2 + ... + λn yn^2,
which has no cross-product terms.

Example: Find a change of variable that transforms the quadratic form into one with no cross-product terms.
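A sketch of the theorem in action, diagonalizing an illustrative quadratic form (not the one from the example):

```python
import numpy as np

# Illustrative form f(x1, x2) = 5*x1^2 + 4*x1*x2 + 5*x2^2, with matrix:
A = np.array([[5.0, 2.0],
              [2.0, 5.0]])

lam, Q = np.linalg.eigh(A)    # orthogonal Q with Q^T A Q = diag(lam)
print(lam)                    # [3., 7.]
# Under the change of variable x = Qy, the form becomes 3*y1^2 + 7*y2^2,
# with no cross-product term:
D = Q.T @ A @ Q
print(np.round(D, 10))        # diag(3, 7)
```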