Matrices: CS485/685 Computer Vision, Dr. George Bebis

Matrix Operations
Matrix addition/subtraction: the matrices must be of the same size.
Matrix multiplication: an m x n matrix can multiply a q x p matrix only when n = q; the result is m x p.
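
A minimal numpy sketch (illustrative shapes, not from the slides) that checks the size rule:

import numpy as np

A = np.arange(6).reshape(2, 3)     # m x n = 2 x 3
B = np.arange(12).reshape(3, 4)    # q x p = 3 x 4; n = q = 3, so A @ B is defined

C = A @ B
print(C.shape)                     # (2, 4), i.e., m x p
# A @ A would raise a ValueError: inner dimensions (3 and 2) do not match.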

Identity Matrix
The identity matrix I is the square matrix with 1s on the diagonal and 0s everywhere else; it satisfies AI = IA = A.

Matrix Transpose
The transpose A^T is obtained by interchanging rows and columns: (A^T)_ij = A_ji. Properties: (A^T)^T = A and (AB)^T = B^T A^T.

Symmetric Matrices
A is symmetric if A = A^T (A must be square). Example: [1 2; 2 3].

Determinants
2 x 2: det [a b; c d] = ad - bc
3 x 3: det(A) = a_11(a_22 a_33 - a_23 a_32) - a_12(a_21 a_33 - a_23 a_31) + a_13(a_21 a_32 - a_22 a_31)
n x n: computed by cofactor expansion along any row or column.
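
A quick numerical check of the 2 x 2 formula (a numpy sketch with illustrative values):

import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 4.0]])

by_formula = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]   # ad - bc = 12 - 2 = 10
by_numpy = np.linalg.det(A)
print(by_formula, by_numpy)                          # 10.0 and ~10.0 (round-off)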

Determinants (cont’d)
For a diagonal matrix, the determinant is the product of the diagonal elements: det(diag(d_1, ..., d_n)) = d_1 d_2 ... d_n.

Matrix Inverse
The inverse A^{-1} of a matrix A has the property: AA^{-1} = A^{-1}A = I.
A^{-1} exists only if det(A) ≠ 0.
Terminology:
–Singular matrix: A^{-1} does not exist
–Ill-conditioned matrix: A is close to being singular
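
A minimal sketch of the inverse and the singularity check (illustrative values):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

if abs(np.linalg.det(A)) > 1e-12:              # non-singular: the inverse exists
    A_inv = np.linalg.inv(A)
    print(np.allclose(A @ A_inv, np.eye(2)))   # True: AA^{-1} = I
else:
    print("A is singular; no inverse exists.")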

Matrix Inverse (cont’d)
Properties of the inverse:
(A^{-1})^{-1} = A
(AB)^{-1} = B^{-1} A^{-1}
(A^T)^{-1} = (A^{-1})^T

Pseudo-inverse
The pseudo-inverse A^+ of a matrix A (which can be non-square, e.g., m x n) is given by A^+ = (A^T A)^{-1} A^T, assuming A^T A is invertible. It can be shown that A^+ A = I.
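
A sketch comparing the formula above with numpy's SVD-based pseudo-inverse (illustrative 3 x 2 matrix with full column rank, so A^T A is invertible):

import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                     # m x n = 3 x 2

A_plus = np.linalg.inv(A.T @ A) @ A.T          # A^+ = (A^T A)^{-1} A^T
print(np.allclose(A_plus @ A, np.eye(2)))      # True: A^+ A = I
print(np.allclose(A_plus, np.linalg.pinv(A)))  # True: same result as np.linalg.pinv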

Matrix trace
The trace tr(A) is the sum of the diagonal elements of A. Properties:
tr(A + B) = tr(A) + tr(B)
tr(AB) = tr(BA)
tr(A) = tr(A^T)
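
The properties are easy to verify numerically (a sketch with random matrices):

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))
B = rng.random((3, 3))

print(np.isclose(np.trace(A @ B), np.trace(B @ A)))            # True: tr(AB) = tr(BA)
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True
print(np.isclose(np.trace(A), np.trace(A.T)))                  # True: tr(A) = tr(A^T)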

Rank of matrix
Equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant. Example: a 4 x 4 matrix whose largest square sub-matrix with non-zero determinant is 3 x 3 has rank 3.

Rank of matrix (cont’d)
Alternative definition: the maximum number of linearly independent columns (or rows) of A. Example: if one row of a 4 x 4 matrix is a linear combination of the other three rows, the rank is not 4.
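
A sketch of the rank computation (illustrative matrix with one dependent row):

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # 2 x (row 1): linearly dependent
              [0.0, 1.0, 1.0]])

print(np.linalg.matrix_rank(A))   # 2, not 3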

Rank and singular matrices
An n x n matrix A is singular (i.e., A^{-1} does not exist) if and only if rank(A) < n, which is equivalent to det(A) = 0.

Orthogonal matrices
A is orthogonal if its column vectors are pairwise orthogonal: a_i^T a_j = 0 for i ≠ j. Notation: A = [a_1 a_2 ... a_n]. Example: the columns of [1 1; 1 -1] are orthogonal (their inner product is 0), though not of unit length.

Orthonormal matrices
A is orthonormal if A^T A = A A^T = I. Note that if A is orthonormal, it is easy to find its inverse: A^{-1} = A^T. Property: multiplication by an orthonormal matrix preserves lengths, ||Ax|| = ||x||.
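
A sketch using a 2-D rotation, a standard example of an orthonormal matrix:

import numpy as np

theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))            # True: Q^T Q = I, so Q^{-1} = Q^T

x = np.array([3.0, 4.0])
print(np.linalg.norm(Q @ x), np.linalg.norm(x))   # both 5.0: lengths are preserved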

Eigenvalues and Eigenvectors
A non-zero vector v is an eigenvector of matrix A, with eigenvalue λ, if Av = λv. Interpretation: the linear transformation implied by A cannot change the direction of an eigenvector v, only its magnitude.

Computing λ and v
To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial: det(A - λI) = 0. For each eigenvalue λ_i, the corresponding eigenvector v_i is a non-trivial solution of (A - λ_i I)v = 0. Example (illustrative): for A = [2 1; 1 2], det(A - λI) = (2 - λ)^2 - 1 = (λ - 1)(λ - 3), so λ = 1 and λ = 3.
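
In practice the eigenpairs are computed numerically; a minimal numpy sketch for the example above:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)            # columns of vecs are the eigenvectors
print(vals)                              # eigenvalues 3 and 1 (order may vary)

v = vecs[:, 0]
print(np.allclose(A @ v, vals[0] * v))   # True: Av = lambda v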

Properties
Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n).
Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any k ≠ 0).
Suppose λ_1, λ_2, ..., λ_n are the eigenvalues of A; then:
tr(A) = λ_1 + λ_2 + ... + λ_n
det(A) = λ_1 λ_2 ... λ_n
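
Both identities can be checked numerically (a sketch with illustrative values):

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # eigenvalues are 5 and 2

vals = np.linalg.eigvals(A)
print(np.isclose(vals.sum(), np.trace(A)))        # True: sum = trace = 7
print(np.isclose(vals.prod(), np.linalg.det(A)))  # True: product = determinant = 10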

Properties (cont’d)
A symmetric matrix A is positive definite if x^T A x > 0 for every non-zero x; equivalently, all of its eigenvalues are positive.
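
A sketch of the positive-definiteness check via eigenvalues (illustrative symmetric matrix):

import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])       # symmetric; eigenvalues are 1 and 3

vals = np.linalg.eigvalsh(A)      # eigenvalues of a symmetric matrix
print(bool((vals > 0).all()))     # True: A is positive definite

x = np.array([1.0, 5.0])
print(x @ A @ x > 0)              # True (x^T A x = 42 here)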

Matrix diagonalization
Given A, find P such that P^{-1}AP is diagonal (i.e., P diagonalizes A). Take P = [v_1 v_2 ... v_n], where v_1, v_2, ..., v_n are the eigenvectors of A; then P^{-1}AP = D = diag(λ_1, λ_2, ..., λ_n).

Matrix diagonalization (cont’d)
Example (illustrative): for A = [2 1; 1 2], the eigenpairs are λ_1 = 3 with v_1 = [1, 1]^T and λ_2 = 1 with v_2 = [1, -1]^T, so P = [v_1 v_2] gives P^{-1}AP = diag(3, 1).
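
The same computation in numpy (a minimal sketch):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, P = np.linalg.eig(A)             # columns of P are the eigenvectors
D = np.linalg.inv(P) @ A @ P           # P^{-1} A P

print(np.allclose(D, np.diag(vals)))   # True: D = diag(lambda_1, lambda_2)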

Matrix diagonalization (cont’d)
Are all n x n matrices diagonalizable? Only those for which P^{-1} exists (i.e., A must have n linearly independent eigenvectors, so that rank(P) = n). If A has n distinct eigenvalues λ_1, λ_2, ..., λ_n, then the corresponding eigenvectors v_1, v_2, ..., v_n form a basis: they are (1) linearly independent and (2) span R^n.

Diagonalization  Decomposition Let us assume that A is diagonalizable, then:

Decomposition: symmetric matrices
The eigenvalues of a symmetric matrix are all real, and eigenvectors corresponding to distinct eigenvalues are orthogonal. The eigenvectors can therefore be chosen orthonormal, so P^{-1} = P^T and the decomposition becomes A = PDP^T.
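
A sketch of the symmetric decomposition using numpy's symmetric-matrix routine:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                  # symmetric

vals, P = np.linalg.eigh(A)                 # eigh is specialized for symmetric matrices
print(np.allclose(P.T, np.linalg.inv(P)))   # True: P^{-1} = P^T

D = np.diag(vals)
print(np.allclose(A, P @ D @ P.T))          # True: A = P D P^T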