Orthogonal Matrices & Symmetric Matrices


Orthogonal Matrices & Symmetric Matrices Hung-yi Lee

Announcement (If you already have full marks on all three homeworks, please ignore the following message.) We have a bonus homework. Full marks on the three homeworks total 300 points; a fully correct bonus homework can add up to 50 more, but the total is capped at 300. The TA will explain it in the second class session.

Outline
Orthogonal Matrices
Symmetric Matrices
Reference: Chapter 7.5

Norm-preserving A linear operator T is norm-preserving if ‖T(u)‖ = ‖u‖ for all u. Example: the linear operator T on R^2 that rotates a vector by angle θ. Is T norm-preserving? Example: the linear operator T that reflects a vector about the x-axis, with standard matrix A = [ 1 0 ; 0 −1 ]. Is T norm-preserving? What do these two operators have in common?

Norm-preserving A linear operator T is norm-preserving if ‖T(u)‖ = ‖u‖ for all u. Example: the linear operator T that projects a vector onto the x-axis, with standard matrix A = [ 1 0 ; 0 0 ]. Is T norm-preserving? Example: a linear operator U on R^n that has an eigenvalue λ ≠ ±1 is not norm-preserving, since for the corresponding eigenvector v, ‖U(v)‖ = ‖λv‖ = |λ|·‖v‖ ≠ ‖v‖.
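As a sanity check, the three example operators can be tested numerically. This is a sketch, not part of the slides, in plain Python with no external libraries:

```python
import math

def apply(A, u):
    # multiply matrix A (list of rows) by vector u
    return [sum(A[i][j] * u[j] for j in range(len(u))) for i in range(len(A))]

def norm(u):
    return math.sqrt(sum(x * x for x in u))

theta = 0.7  # arbitrary rotation angle
rotation   = [[math.cos(theta), -math.sin(theta)],
              [math.sin(theta),  math.cos(theta)]]
reflection = [[1.0, 0.0], [0.0, -1.0]]   # reflection about the x-axis
projection = [[1.0, 0.0], [0.0,  0.0]]   # projection onto the x-axis

u = [3.0, 4.0]                           # ||u|| = 5
print(math.isclose(norm(apply(rotation, u)), norm(u)))    # True
print(math.isclose(norm(apply(reflection, u)), norm(u)))  # True
print(math.isclose(norm(apply(projection, u)), norm(u)))  # False: norm shrinks to 3
```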

Orthogonal Matrix An n×n matrix Q is called an orthogonal matrix (or simply orthogonal) if the columns of Q form an orthonormal basis for R^n: each column is a unit vector, and distinct columns are orthogonal. Orthogonal operator: a linear operator whose standard matrix is an orthogonal matrix.

Norm-preserving Necessary conditions: if a linear operator Q is norm-preserving, its standard matrix must be an orthogonal matrix. (Curiously, the standard term is "orthogonal matrix," not "orthonormal matrix," even though the columns are orthonormal.) Each column is a unit vector: ‖q_j‖ = ‖Q e_j‖ = ‖e_j‖ = 1. Distinct columns q_i and q_j are orthogonal, by the Pythagorean theorem: ‖q_i + q_j‖^2 = ‖Q e_i + Q e_j‖^2 = ‖Q(e_i + e_j)‖^2 = ‖e_i + e_j‖^2 = 2 = ‖q_i‖^2 + ‖q_j‖^2.

Orthogonal Matrix The following statements are equivalent; (b) and (c) give a convenient way to check that a matrix is orthogonal.
(a) Q is an orthogonal matrix
(b) Q^T Q = I_n
(c) Q is invertible, and Q^{-1} = Q^T (simple inverse)
(d) Qu · Qv = u · v for any u and v (Q preserves dot products)
(e) ‖Qu‖ = ‖u‖ for any u (Q preserves norms)
Proof. (a) ⟹ (b): with Q = [ q_1 q_2 … q_n ], q_i · q_i = 1 = [Q^T Q]_ii for all i, and q_i · q_j = 0 = [Q^T Q]_ij for all i ≠ j, so Q^T Q = I_n. (b) ⟹ (c): by the definition of invertible matrices. (c) ⟹ (d): for all u, v ∈ R^n, Qu · Qv = u · Q^T Q v = u · Q^{-1} Q v = u · v. (d) ⟹ (e): for all u ∈ R^n, ‖Qu‖ = (Qu · Qu)^{1/2} = (u · u)^{1/2} = ‖u‖. (e) ⟹ (a): the necessary conditions on the previous slide. Hence norm-preserving ⟺ orthogonal matrix.
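Conditions (b) and (d) are easy to confirm numerically for a sample orthogonal matrix such as a rotation. A sketch in plain Python (not from the slides):

```python
import math

def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

t = 1.2  # any angle: every rotation matrix is orthogonal
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

# (b) Q^T Q = I_n
QtQ = matmul(transpose(Q), Q)
identity_ok = all(math.isclose(QtQ[i][j], 1.0 if i == j else 0.0, abs_tol=1e-12)
                  for i in range(2) for j in range(2))

# (d) Qu . Qv = u . v
u, v = [1.0, 2.0], [3.0, -1.0]
Qu = [dot(row, u) for row in Q]
Qv = [dot(row, v) for row in Q]
print(identity_ok, math.isclose(dot(Qu, Qv), dot(u, v)))  # True True
```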

Orthogonal Matrix Rows and columns: Q is orthogonal if and only if Q^T is orthogonal. Let P and Q be n×n orthogonal matrices. Then
(a) det Q = ±1
(b) PQ is an orthogonal matrix (check via (PQ)^{-1} = (PQ)^T)
(c) Q^{-1} is an orthogonal matrix (check via Q^{-1} = Q^T)
(d) Q^T is an orthogonal matrix
Proof. (a) Q Q^T = I_n ⟹ 1 = det(I_n) = det(Q Q^T) = det(Q) det(Q^T) = det(Q)^2 ⟹ det(Q) = ±1. (b) (PQ)^T = Q^T P^T = Q^{-1} P^{-1} = (PQ)^{-1}.

Orthogonal Operator Applying the properties of orthogonal matrices to orthogonal operators: T is an orthogonal operator ⟺ T(u) · T(v) = u · v for all u and v (preserves dot products) ⟺ ‖T(u)‖ = ‖u‖ for all u (preserves norms). If T and U are orthogonal operators, then TU and T^{-1} are orthogonal operators.

Example: Find an orthogonal (norm-preserving) operator T on R^3 such that T(v) = e_2, where v = (1/√2, 0, 1/√2) and e_2 = (0, 1, 0). Let A be the standard matrix of T, and find A^{-1} first: A v = e_2 ⟹ v = A^{-1} e_2, so the second column of A^{-1} is v. Because A^{-1} = A^T, the inverse is also orthogonal:
A^{-1} = [ * 1/√2 * ; * 0 * ; * 1/√2 * ]
Complete the remaining columns to an orthonormal basis, e.g.
A^{-1} = [ 1/√2 1/√2 0 ; 0 0 1 ; −1/√2 1/√2 0 ]
Then A = (A^{-1})^T = [ 1/√2 0 −1/√2 ; 1/√2 0 1/√2 ; 0 1 0 ]
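A quick check (sketch, plain Python) that the constructed A really is orthogonal and sends v to e_2:

```python
import math

s = 1 / math.sqrt(2)
A = [[s,   0.0, -s ],
     [s,   0.0,  s ],
     [0.0, 1.0,  0.0]]
v = [s, 0.0, s]

# A v should equal e2 = (0, 1, 0)
Av = [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]
print([round(x, 10) for x in Av])  # [0.0, 1.0, 0.0]

# the columns of A are orthonormal, i.e. A^T A = I_3
cols_ok = all(
    math.isclose(sum(A[k][i] * A[k][j] for k in range(3)),
                 1.0 if i == j else 0.0, abs_tol=1e-12)
    for i in range(3) for j in range(3))
print(cols_ok)  # True
```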

Conclusion Orthogonal Matrix (Operator) Columns and rows are orthonormal vectors. Preserves norms and dot products. Its inverse is equal to its transpose.

Outline
Orthogonal Matrices
Symmetric Matrices
Reference: Chapter 7.5

Eigenvalues are real The eigenvalues of a symmetric matrix are always real. (In general a real matrix can have complex eigenvalues, since the complex roots of a real-coefficient polynomial come in conjugate pairs.) Consider a 2×2 symmetric matrix A = [ a b ; b c ]. Its characteristic polynomial is t^2 − (a + c)t + (ac − b^2), with discriminant (a + c)^2 − 4(ac − b^2) = (a − c)^2 + 4b^2 ≥ 0, so both roots are real. How about more general cases? The same conclusion holds: n×n symmetric matrices always have real eigenvalues.

Orthogonal Eigenvectors A is symmetric. Factor its characteristic polynomial: det(A − t I_n) = ±(t − λ_1)^{m_1} (t − λ_2)^{m_2} … (t − λ_k)^{m_k}. Eigenvalues: λ_1, λ_2, …, λ_k. Eigenspaces (of dimension d_i): d_1 ≤ m_1, d_2 ≤ m_2, …, d_k ≤ m_k. Eigenvectors from different eigenspaces are always independent; when A is symmetric, they are moreover orthogonal.

Orthogonal Eigenvectors A is symmetric. If u and v are eigenvectors corresponding to eigenvalues λ and μ (λ ≠ μ), then u and v are orthogonal. Proof sketch: λ(u · v) = (Au) · v = u · (A^T v) = u · (Av) = μ(u · v), so (λ − μ)(u · v) = 0, and since λ ≠ μ, u · v = 0.

Diagonalization A is symmetric ⟺ P^T A P = D for some orthogonal matrix P and diagonal matrix D. Since P is orthogonal, P^{-1} = P^T (simple), so P^T A P = D ⟺ P^{-1} A P = D ⟺ A = P D P^{-1} = P D P^T. This is diagonalization: P consists of eigenvectors of A, and the diagonal entries of D are the corresponding eigenvalues.

Diagonalization Example A = P D P^T = P D P^{-1}, i.e. P^T A P = D. A has eigenvalues λ_1 = 6 and λ_2 = 1, with corresponding eigenspaces E_1 = Span{[1 2]^T} and E_2 = Span{[−2 1]^T} (orthogonal) ⟹ B_1 = {[1 2]^T/√5} and B_2 = {[−2 1]^T/√5}, so P = [ 1/√5 −2/√5 ; 2/√5 1/√5 ] and D = [ 6 0 ; 0 1 ].
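The matrix A itself appeared only as an image in the slides, but it is determined by the eigendata above via A = P D P^T; multiplying that product out (sketch, plain Python) recovers it:

```python
import math

r5 = math.sqrt(5)
# orthonormal eigenvector columns and eigenvalues from the example
P = [[1 / r5, -2 / r5],
     [2 / r5,  1 / r5]]
D = [[6.0, 0.0],
     [0.0, 1.0]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pt = [[P[j][i] for j in range(2)] for i in range(2)]  # P^T
A = matmul(matmul(P, D), Pt)                          # A = P D P^T
print([[round(x, 10) for x in row] for row in A])     # [[2.0, 2.0], [2.0, 5.0]]
```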

Example of Diagonalization of a Symmetric Matrix A = P D P^{-1} = P D P^T, where P is an orthogonal matrix.
λ_1 = 2. Eigenspace: Span{(−1, 1, 0), (−1, 0, 1)} (independent but not orthogonal). Gram-Schmidt and normalization ⟹ Span{(−1/√2, 1/√2, 0), (1/√6, 1/√6, −2/√6)}.
λ_2 = 8. Eigenspace: Span{(1, 1, 1)}. Normalization ⟹ Span{(1/√3, 1/√3, 1/√3)}.
P = [ −1/√2 1/√6 1/√3 ; 1/√2 1/√6 1/√3 ; 0 −2/√6 1/√3 ], D = [ 2 0 0 ; 0 2 0 ; 0 0 8 ]

Diagonalization A is symmetric ⟺ P^T A P = D with P an orthogonal matrix, i.e. A = P D P^T; P consists of eigenvectors, and the diagonal of D holds the eigenvalues. Finding an orthonormal basis consisting of eigenvectors of A: (1) Compute all distinct eigenvalues λ_1, λ_2, …, λ_k of A. (2) Determine the corresponding eigenspaces E_1, E_2, …, E_k. (3) Get an orthonormal basis B_i for each E_i. (4) B = B_1 ∪ B_2 ∪ … ∪ B_k is an orthonormal basis of eigenvectors of A.
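Step (3) is Gram-Schmidt plus normalization. A minimal plain-Python sketch (the helper name `gram_schmidt` is my own), applied to the λ = 2 eigenspace from the earlier 3×3 example; note the second output vector may differ from the slide's by a sign, and either sign is a valid unit vector:

```python
import math

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (step 3 of the recipe)."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            c = sum(x * y for x, y in zip(w, b))   # component of w along b
            w = [x - c * y for x, y in zip(w, b)]  # subtract the projection
        n = math.sqrt(sum(x * x for x in w))
        basis.append([x / n for x in w])           # normalize
    return basis

# eigenspace basis for lambda = 2: independent but not orthogonal
b1, b2 = gram_schmidt([[-1.0, 1.0, 0.0], [-1.0, 0.0, 1.0]])
dot12 = sum(x * y for x, y in zip(b1, b2))
print(math.isclose(dot12, 0.0, abs_tol=1e-12))    # True: b1 and b2 are orthogonal
print(math.isclose(sum(x * x for x in b2), 1.0))  # True: b2 is a unit vector
```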

Diagonalization of Symmetric Matrix Why eigenvectors form a good coordinate system: if {v_1, v_2, …, v_k} is an orthonormal basis, then for u = c_1 v_1 + c_2 v_2 + … + c_k v_k the coefficients are simply c_i = u · v_i. When A is symmetric and A = P D P^{-1} with P properly selected (orthogonal), the change into B-coordinates is simple, [v]_B = P^{-1} v = P^T v; applying the operator in B-coordinates is simple (multiply by the diagonal D); and P maps back to standard coordinates.

Spectral Decomposition A = P D P^T, where the columns of P = [ u_1 u_2 … u_n ] form an orthonormal basis and D = diag[ λ_1 λ_2 … λ_n ]. Then A = P [ λ_1 e_1 λ_2 e_2 … λ_n e_n ] P^T = [ λ_1 P e_1 λ_2 P e_2 … λ_n P e_n ] P^T = [ λ_1 u_1 λ_2 u_2 … λ_n u_n ] P^T = λ_1 u_1 u_1^T + λ_2 u_2 u_2^T + … + λ_n u_n u_n^T = λ_1 P_1 + λ_2 P_2 + … + λ_n P_n, where each P_i = u_i u_i^T is symmetric.


Spectral Decomposition Example A = [ 3 −4 ; −4 −3 ]. Find the spectral decomposition. The eigenvalues are λ_1 = 5 and λ_2 = −5. An orthonormal basis consisting of eigenvectors of A is B = { u_1, u_2 } with u_1 = [−2/√5, 1/√5]^T and u_2 = [1/√5, 2/√5]^T. P_1 = u_1 u_1^T = [ 4/5 −2/5 ; −2/5 1/5 ], P_2 = u_2 u_2^T = [ 1/5 2/5 ; 2/5 4/5 ], and A = λ_1 P_1 + λ_2 P_2 = 5 P_1 − 5 P_2.
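A numerical check (sketch, plain Python) that 5 P_1 − 5 P_2 really reproduces A:

```python
import math

A  = [[3.0, -4.0], [-4.0, -3.0]]
r5 = math.sqrt(5)
u1 = [-2 / r5, 1 / r5]   # eigenvector for lambda_1 =  5
u2 = [ 1 / r5, 2 / r5]   # eigenvector for lambda_2 = -5

def outer(u):
    # the rank-1 matrix u u^T
    return [[x * y for y in u] for x in u]

P1, P2 = outer(u1), outer(u2)
recon = [[5 * P1[i][j] - 5 * P2[i][j] for j in range(2)] for i in range(2)]
print(all(math.isclose(recon[i][j], A[i][j], abs_tol=1e-12)
          for i in range(2) for j in range(2)))  # True
```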

Conclusion Any symmetric matrix has only real eigenvalues; has orthogonal eigenvectors (for distinct eigenvalues); and is always diagonalizable, by an orthogonal matrix: A is symmetric ⟺ P^T A P = D, i.e. A = P D P^T, with P orthogonal and D diagonal.

Appendix

Diagonalization By induction on n. The case n = 1 is obvious. Assume the result holds for n×n matrices, and consider A ∈ R^{(n+1)×(n+1)}. A has an eigenvector b_1 ∈ R^{n+1} corresponding to a real eigenvalue λ, so there exists an orthonormal basis B = {b_1, b_2, …, b_{n+1}} by the Extension Theorem and the Gram-Schmidt process.

Since S = S^T ∈ R^{n×n}, there exist an orthogonal C ∈ R^{n×n} and a diagonal L ∈ R^{n×n} such that C^T S C = L, by the induction hypothesis.

Example: the reflection operator T about a line L passing through the origin. Question: Is T an orthogonal operator? (An easier question first: is T orthogonal if L is the x-axis?) Let b_1 be a unit vector along L and b_2 a unit vector perpendicular to L. Then B = {b_1, b_2} is an orthonormal basis of R^2, and P = [ b_1 b_2 ] is an orthogonal matrix. Since T(b_1) = b_1 and T(b_2) = −b_2, [T]_B = diag[1, −1] is an orthogonal matrix. Let the standard matrix of T be Q. Then [T]_B = P^{-1} Q P, or Q = P [T]_B P^{-1}, a product of orthogonal matrices ⟹ Q is an orthogonal matrix ⟹ T is an orthogonal operator.