Linear Algebra Review. CS479/679 Pattern Recognition, Dr. George Bebis.


n-dimensional Vector. An n-dimensional vector v is denoted as a column of its components, v = (x_1, x_2, ..., x_n)^T; its transpose v^T is the row vector (x_1, x_2, ..., x_n).

Inner (or dot) product. Given v^T = (x_1, x_2, ..., x_n) and w^T = (y_1, y_2, ..., y_n), their dot product is defined as v^T w = x_1 y_1 + x_2 y_2 + ... + x_n y_n (a scalar).
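As a quick sanity check, the dot product can be computed with NumPy (a sketch; the vectors here are arbitrary illustrative values, not from the slides):

```python
import numpy as np

v = np.array([2.0, 1.0, 3.0])
w = np.array([1.0, 0.0, 4.0])

# Inner product: sum of elementwise products, yielding a scalar
dot = v @ w          # equivalently np.dot(v, w)
print(dot)           # 2*1 + 1*0 + 3*4 = 14.0
```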

Orthogonal / Orthonormal vectors. A set of vectors x_1, x_2, ..., x_n is orthogonal if x_i^T x_j = 0 for all i ≠ j. It is orthonormal if, in addition, each vector has unit length: x_i^T x_j = 1 when i = j and 0 otherwise.
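A minimal check of these definitions with NumPy (using the standard basis vectors of R^3 as an example):

```python
import numpy as np

# The standard basis vectors of R^3 are orthonormal:
# pairwise dot products are 0, and each vector has unit length.
i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])

print(i @ j)                 # 0.0 -> orthogonal
print(np.linalg.norm(i))     # 1.0 -> unit length
```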

Linear combinations. A vector v is a linear combination of the vectors v_1, ..., v_k if v = c_1 v_1 + c_2 v_2 + ... + c_k v_k, where c_1, ..., c_k are constants. Example: every vector in R^3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1).

Space spanning. A set of vectors S = (v_1, v_2, ..., v_k) spans a space W if every vector in W can be written as a linear combination of the vectors in S. Example: the unit vectors i, j, and k span R^3.

Linear dependence. A set of vectors v_1, ..., v_k is linearly dependent if at least one of them, say v_j, is a linear combination of the others: v_j = sum over i ≠ j of c_i v_i (i.e., v_j does not appear on the right-hand side).

Linear independence. A set of vectors v_1, ..., v_k is linearly independent if no vector in the set can be represented as a linear combination of the remaining vectors; equivalently, c_1 v_1 + ... + c_k v_k = 0 only when c_1 = ... = c_k = 0.
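One practical way to test independence is to stack the vectors as columns and check the matrix rank (a sketch; the vectors here are made-up examples):

```python
import numpy as np

# Vectors are linearly independent iff the matrix whose
# columns they form has full column rank.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])   # v3 = v1 + v2, so the set is dependent

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # 2 < 3 columns -> linearly dependent
```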

Vector basis. A set of vectors (v_1, ..., v_k) forms a basis of a vector space W if (1) v_1, ..., v_k are linearly independent and (2) v_1, ..., v_k span W. Standard bases exist for R^2, R^3, and in general R^n (the unit vectors along each coordinate axis).

Matrix Operations. Matrix addition/subtraction: the matrices must be of the same size. Matrix multiplication: an m x n matrix A can multiply a q x p matrix B only if n = q; the product AB is then m x p.

Identity Matrix. The identity matrix I is a square matrix with ones on the diagonal and zeros elsewhere; it satisfies AI = IA = A.

Matrix Transpose. The transpose A^T is obtained by interchanging the rows and columns of A: (A^T)_ij = A_ji.

Symmetric Matrices. A matrix A is symmetric if A = A^T. Example: A = [[1, 2], [2, 3]].

Determinants. For a 2 x 2 matrix, det [[a, b], [c, d]] = ad - bc. For a 3 x 3 matrix, the determinant is computed by cofactor expansion along a row or column, and the same expansion extends recursively to n x n matrices. Properties: det(AB) = det(A) det(B), det(A^T) = det(A), and A is invertible iff det(A) ≠ 0.
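The 2 x 2 formula and the product property can be verified numerically (a sketch with arbitrary example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# 2x2 determinant: ad - bc
print(np.linalg.det(A))          # 1*4 - 2*3 = -2
# Product property: det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))
```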

Matrix Inverse. The inverse A^-1 of a matrix A has the property AA^-1 = A^-1 A = I. A^-1 exists only if det(A) ≠ 0. Terminology: a singular matrix is one for which A^-1 does not exist; an ill-conditioned matrix is one that is "close" to being singular.
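A short sketch of both cases, invertible and singular, using NumPy (the matrices are illustrative examples):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # det = 3, so A is invertible
Ainv = np.linalg.inv(A)
print(np.allclose(A @ Ainv, np.eye(2)))   # A A^-1 = I -> True

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # det = 0: second row = 2 * first row
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("singular: no inverse exists")
```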

Matrix Inverse (cont'd). Properties of the inverse: (A^-1)^-1 = A, (AB)^-1 = B^-1 A^-1, and (A^T)^-1 = (A^-1)^T.

Matrix trace. The trace of a square matrix is the sum of its diagonal elements: tr(A) = a_11 + a_22 + ... + a_nn. Properties: tr(A + B) = tr(A) + tr(B), tr(AB) = tr(BA), and tr(A) equals the sum of the eigenvalues of A.
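The additivity and cyclic properties are easy to confirm numerically (a sketch using random example matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))
B = rng.random((3, 3))

# tr(A + B) = tr(A) + tr(B), and tr(AB) = tr(BA)
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))
```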

Rank of matrix. The rank of A equals the dimension of the largest square sub-matrix of A that has a non-zero determinant. Example: a matrix whose largest non-singular square sub-matrix is 3 x 3 has rank 3.

Rank of matrix (cont'd). Alternative definition: the rank is the maximum number of linearly independent columns (or rows) of A. For example, a 4 x 4 matrix in which one row is a linear combination of the others has rank at most 3, not 4.

Rank of matrix (cont'd). Useful facts: rank(A) ≤ min(m, n) for an m x n matrix, and an n x n matrix is invertible iff rank(A) = n.

Eigenvalues and Eigenvectors. A non-zero vector v is an eigenvector of matrix A, with eigenvalue λ, if Av = λv; i.e., the linear transformation implied by A maps an eigenvector onto a scalar multiple of itself, preserving its direction (up to sign) and changing only its magnitude.

Computing λ and v. To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial det(A - λI) = 0. Example: for A = [[2, 1], [1, 2]], det(A - λI) = (2 - λ)^2 - 1 = 0, giving λ = 1 and λ = 3.
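The same computation can be sketched with NumPy's eigenvalue solver (using the 2 x 2 matrix A = [[2, 1], [1, 2]] as an example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of det(A - lambda*I) = (2-l)^2 - 1 = 0
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))              # [1. 3.]

# Each column of vecs satisfies A v = lambda v
v = vecs[:, 0]
print(np.allclose(A @ v, vals[0] * v))   # True
```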

Properties. Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n). Eigenvectors are not unique: if v is an eigenvector, so is kv for any non-zero scalar k. If λ_1, λ_2, ..., λ_n are the eigenvalues of A, then det(A) = λ_1 λ_2 ... λ_n and tr(A) = λ_1 + λ_2 + ... + λ_n.

Matrix diagonalization. Given an n x n matrix A, find P such that P^-1 A P = Λ, where Λ is diagonal. Take P = [v_1 v_2 ... v_n], whose columns v_1, v_2, ..., v_n are the eigenvectors of A; then AP = PΛ with Λ = diag(λ_1, ..., λ_n).
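A sketch of the diagonalization P^-1 A P = Λ using NumPy (the matrix is an illustrative example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, P = np.linalg.eig(A)        # columns of P are eigenvectors of A

# P^-1 A P should be diagonal, with the eigenvalues on the diagonal
Lam = np.linalg.inv(P) @ A @ P
print(np.allclose(Lam, np.diag(vals)))   # True
```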

Matrix diagonalization (cont'd). Example: for A = [[2, 1], [1, 2]], the eigenvectors (1, 1)^T and (1, -1)^T (with eigenvalues 3 and 1) give P = [[1, 1], [1, -1]] and P^-1 A P = diag(3, 1).

Diagonalization is possible only if P^-1 exists, i.e., P must have n linearly independent eigenvectors as its columns, so that rank(P) = n. If A is diagonalizable, the corresponding eigenvectors v_1, v_2, ..., v_n form a basis of R^n. Not all n x n matrices are diagonalizable: some do not have n linearly independent eigenvectors.

Matrix decomposition. If A is diagonalizable, then A can be decomposed as A = P Λ P^-1.
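The decomposition can be verified by reassembling A from its eigenvalues and eigenvectors (a sketch with an arbitrary diagonalizable example matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, P = np.linalg.eig(A)

# Reassemble A from its eigen-decomposition: A = P Lambda P^-1
A_rebuilt = P @ np.diag(vals) @ np.linalg.inv(P)
print(np.allclose(A_rebuilt, A))   # True
```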

Special case: symmetric matrices. The eigenvalues of a symmetric matrix are real and its eigenvectors are orthogonal. P can therefore be chosen orthonormal, so P^-1 = P^T and A = P Λ P^T.
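These properties can be checked with NumPy's symmetric eigensolver (a sketch; the matrix is an illustrative example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric: A == A.T

vals, P = np.linalg.eigh(A)       # eigh is specialized for symmetric matrices
print(np.all(np.isreal(vals)))                 # real eigenvalues -> True
print(np.allclose(P.T @ P, np.eye(2)))         # orthonormal eigenvectors: P^-1 = P^T
print(np.allclose(P @ np.diag(vals) @ P.T, A)) # A = P Lambda P^T -> True
```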