CS479/679 Pattern Recognition Dr. George Bebis


Linear Algebra Review

n-dimensional Vector An n-dimensional vector v is denoted as the column vector v = (x1, x2, . . . , xn)T; its transpose vT is the row vector vT = (x1, x2, . . . , xn).

Inner (or dot) product Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as the scalar: v · w = vTw = x1y1 + x2y2 + . . . + xnyn.
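A minimal numerical sketch of the dot product using numpy (the vectors are illustrative, not from the slides):

    import numpy as np

    v = np.array([2.0, 1.0, 3.0])
    w = np.array([1.0, 0.0, 2.0])

    dot = v @ w        # x1*y1 + x2*y2 + x3*y3 = 2 + 0 + 6
    print(dot)         # 8.0 (a scalar)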

Orthogonal / Orthonormal vectors A set of vectors x1, x2, . . . , xn is orthogonal if xiTxj = 0 for all i ≠ j. The set is orthonormal if, in addition, every vector has unit length, i.e., xiTxj = 0 for i ≠ j and xiTxi = 1.

Linear combinations A vector v is a linear combination of the vectors v1, ..., vk if: v = c1v1 + c2v2 + . . . + ckvk, where c1, ..., ck are constants. Example: any vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1).

Space spanning A set of vectors S = (v1, v2, . . . , vk) spans some space W if every vector in W can be written as a linear combination of the vectors in S. For example, the unit vectors i, j, and k span R3.

Linear dependence A set of vectors v1, ..., vk is linearly dependent if at least one of them, say vj, is a linear combination of the others: vj = c1v1 + . . . + cj-1vj-1 + cj+1vj+1 + . . . + ckvk (i.e., vj does not appear on the right side).

Linear independence A set of vectors v1, ..., vk is linearly independent if no vector in the set can be represented as a linear combination of the remaining vectors, i.e., c1v1 + c2v2 + . . . + ckvk = 0 implies c1 = c2 = . . . = ck = 0. Example: two non-parallel vectors v1, v2 are independent, since c1v1 + c2v2 = 0 forces c1 = c2 = 0.
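One numerical way to test independence (a sketch, not from the slides) is to stack the vectors as columns and compare the matrix rank to the number of vectors:

    import numpy as np

    v1 = np.array([1.0, 0.0, 0.0])     # illustrative vectors
    v2 = np.array([0.0, 1.0, 0.0])
    v3 = np.array([1.0, 1.0, 0.0])     # v3 = v1 + v2, so the set is dependent

    A = np.column_stack([v1, v2, v3])
    independent = np.linalg.matrix_rank(A) == A.shape[1]
    print(independent)                 # False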

Vector basis A set of vectors (v1, ..., vk) forms a basis in some vector space W if: (1) (v1, ..., vk) are linearly independent and (2) (v1, ..., vk) span W. Standard bases: in R2, the vectors (1, 0) and (0, 1); in R3, the unit vectors i, j, and k; in Rn, the vectors e1, . . . , en, where ei has a 1 in position i and 0s elsewhere.

Matrix Operations Matrix addition/subtraction: add/subtract corresponding elements; the matrices must be of the same size. Matrix multiplication: the product of an m x n matrix A and a q x p matrix B is defined only under the condition n = q, and the result AB is m x p.
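A sketch of these size rules in numpy (the shapes are illustrative):

    import numpy as np

    A = np.ones((2, 3))     # m x n with m=2, n=3
    B = np.ones((3, 4))     # q x p with q=3, p=4

    C = A + A               # addition: same size required
    D = A @ B               # multiplication: defined since n = q
    print(D.shape)          # (2, 4), i.e., m x p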

Identity Matrix The identity matrix I is the square matrix with 1s on the main diagonal and 0s elsewhere; it satisfies AI = IA = A.

Matrix Transpose The transpose AT is obtained by interchanging the rows and columns of A, i.e., (AT)ij = Aji. Useful properties: (A + B)T = AT + BT and (AB)T = BTAT.

Symmetric Matrices A matrix A is symmetric if A = AT, i.e., aij = aji for all i, j. Example: A = [1 2; 2 3] is symmetric since a12 = a21 = 2.

Determinants 2 x 2: det(A) = a11a22 - a12a21. 3 x 3 and, in general, n x n: det(A) = Σi (-1)i+k aik Mik (expanded along the kth column; the 1st column is the case k = 1), where the minor Mik is the determinant of the sub-matrix obtained by deleting row i and column k. Properties: det(AB) = det(A)det(B), det(AT) = det(A), and det(A) ≠ 0 if and only if A is invertible.
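A numerical check of the 2 x 2 formula and the product property, using numpy (illustrative values):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    print(np.linalg.det(A))    # a11*a22 - a12*a21 = 1*4 - 2*3 = -2

    B = np.array([[0.0, 1.0],
                  [1.0, 1.0]])
    # det(AB) = det(A) det(B)
    print(np.isclose(np.linalg.det(A @ B),
                     np.linalg.det(A) * np.linalg.det(B)))   # True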

Matrix Inverse The inverse of a matrix A, denoted as A-1, has the property: AA-1 = A-1A = I. A-1 exists only if A is square and det(A) ≠ 0. Terminology: Singular matrix: A-1 does not exist (det(A) = 0). Ill-conditioned matrix: A is “close” to being singular.

Matrix Inverse (cont’d) Properties of the inverse: (A-1)-1 = A, (AB)-1 = B-1A-1, and (AT)-1 = (A-1)T.
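A sketch in numpy that tests the singularity condition before inverting (the matrix is illustrative):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])

    if not np.isclose(np.linalg.det(A), 0.0):      # singular check
        A_inv = np.linalg.inv(A)
        print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^-1 = I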

Matrix trace The trace of A is the sum of its diagonal elements: tr(A) = Σi aii. Properties: tr(A + B) = tr(A) + tr(B) and tr(AB) = tr(BA).
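A quick numpy check of the property tr(AB) = tr(BA) (illustrative matrices):

    import numpy as np

    A = np.arange(4.0).reshape(2, 2)          # [[0, 1], [2, 3]]
    B = np.array([[1.0, 2.0],
                  [0.0, 1.0]])
    print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # True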

Rank of matrix The rank of A is equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant. Example: a matrix whose largest square sub-matrix with non-zero determinant is 3 x 3 has rank 3.

Rank of matrix (cont’d) Alternative definition: the rank is the maximum number of linearly independent columns (or rows) of A. Example: in a matrix with 4 columns where one column is a linear combination of the others, the rank is less than 4 (i.e., the rank is not 4!).
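A sketch of the rank computation in numpy (the matrix is illustrative; one column is a dependent combination of the others):

    import numpy as np

    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 0.0]])     # col3 = col1 + col2
    print(np.linalg.matrix_rank(A))     # 2, not 3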


Eigenvalues and Eigenvectors A non-zero vector v is an eigenvector of matrix A, and λ is the corresponding eigenvalue of A, if: Av = λv. Geometric interpretation: the linear transformation implied by A cannot change the direction of an eigenvector v, only its magnitude.

Computing λ and v To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial: det(A - λI) = 0. For each eigenvalue λ, the corresponding eigenvectors v are found by solving (A - λI)v = 0.
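A numerical sketch using numpy (illustrative matrix; np.linalg.eig returns the eigenvalues and the eigenvectors as columns):

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])
    eigvals, eigvecs = np.linalg.eig(A)

    # verify A v = lambda v for the first eigenpair
    v, lam = eigvecs[:, 0], eigvals[0]
    print(np.allclose(A @ v, lam * v))   # True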

Properties of λ and v Eigenvalues and eigenvectors are only defined for square matrices. Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any k ≠ 0). Suppose λ1, λ2, ..., λn are the eigenvalues of A; then: Σi λi = tr(A) and Πi λi = det(A).

Matrix diagonalization Given an n x n matrix A, find P such that: P-1AP = Λ, where Λ is diagonal. Solution: Set P = [v1 v2 . . . vn], where v1, v2, . . . , vn are the eigenvectors of A; then Λ = diag(λ1, λ2, . . . , λn), the diagonal matrix of the corresponding eigenvalues.

Matrix diagonalization (cont’d) Example: a 2 x 2 case can be worked and verified numerically, as in the sketch below.
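A minimal sketch of the diagonalization check in numpy, assuming an illustrative 2 x 2 matrix with distinct eigenvalues (not the original slide’s example):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])           # eigenvalues 5 and 2
    eigvals, P = np.linalg.eig(A)        # columns of P are eigenvectors

    Lambda = np.linalg.inv(P) @ A @ P    # P^-1 A P
    print(np.allclose(Lambda, np.diag(eigvals)))   # True: Lambda is diagonal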

Matrix diagonalization (cont’d) If A is diagonalizable, then the corresponding eigenvectors v1, v2, . . . , vn form a basis in Rn.

Are all n x n matrices diagonalizable (P-1AP = Λ)? An n x n matrix A is diagonalizable if and only if it has n linearly independent eigenvectors, i.e., if P-1 exists, that is, rank(P) = n. Theorem: If the eigenvalues of A are all distinct, their corresponding eigenvectors are linearly independent (i.e., A is diagonalizable).

Are all n x n matrices diagonalizable (P-1AP = Λ)? (cont’d) Example: a matrix with repeated eigenvalues λ1 = λ2 = 1 and λ3 = 2 can be non-diagonalizable (fewer than three independent eigenvectors), while a matrix with repeated eigenvalues λ1 = λ2 = 0 and λ3 = -2 can still be diagonalizable; repeated eigenvalues alone do not decide diagonalizability.

Matrix decomposition If A is diagonalizable, then A can be decomposed as follows: A = PΛP-1, where the columns of P are the eigenvectors of A and Λ is the diagonal matrix of eigenvalues.

Special case: symmetric matrices The eigenvalues of a symmetric matrix are real and its eigenvectors are orthogonal. Hence P can be chosen orthonormal, so that P-1 = PT and the decomposition becomes A = PDPT, where D is the diagonal matrix of eigenvalues.
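A closing sketch for the symmetric case (np.linalg.eigh is numpy’s routine for symmetric matrices and returns orthonormal eigenvectors; the matrix is illustrative):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])           # symmetric: A = A^T
    eigvals, P = np.linalg.eigh(A)       # real eigenvalues, orthonormal eigenvectors

    print(np.allclose(P.T, np.linalg.inv(P)))           # True: P^-1 = P^T
    print(np.allclose(A, P @ np.diag(eigvals) @ P.T))   # True: A = P D P^T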