Linear Algebra Review, by Tim K. Marks (UCSD). Borrows heavily from Jana Kosecka and from Virginia de Sa's (UCSD) Cogsci 108F Linear Algebra review.

Vectors: The length of x, a.k.a. the norm or 2-norm of x, is ||x|| = sqrt(x_1^2 + x_2^2 + ... + x_n^2). For example, the vector x = (3, 4) has norm ||x|| = sqrt(9 + 16) = 5.
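The norm formula is easy to check in code. A minimal Python sketch (illustrative only, not from the slides; pure standard library):

```python
import math

def norm(x):
    # 2-norm: square root of the sum of squared components
    return math.sqrt(sum(xi * xi for xi in x))

print(norm([3, 4]))  # 5.0
```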

Good Review Materials (Gonzalez & Woods review materials): Chapt. 1: Linear Algebra Review; Chapt. 2: Probability, Random Variables, Random Vectors

Online vector addition demo:

Vector Addition (diagram: vectors u and v placed head to tail, with their sum u+v)

Vector Subtraction (diagram: vectors u and v, with their difference u-v)

Example (on board)

Inner product (dot product) of two vectors: u . v = u_1 v_1 + u_2 v_2 + ... + u_n v_n

Inner (dot) Product (diagram: vectors u and v separated by angle θ): u . v = ||u|| ||v|| cos θ. The inner product is a SCALAR.
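The componentwise form of the inner product can be sketched in a few lines of Python (illustrative, not from the slides):

```python
def dot(u, v):
    # inner product: sum of elementwise products; the result is a scalar
    return sum(ui * vi for ui, vi in zip(u, v))

print(dot([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```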

Transpose of a Matrix: The transpose A^T is formed by swapping rows and columns, (A^T)_ij = A_ji. If A^T = A, we say A is symmetric. Example of a symmetric matrix: [[1, 2], [2, 5]].

Matrix Product: A and B must have compatible dimensions: the number of columns of A must equal the number of rows of B. In Matlab: >> A*B. Matrix multiplication is not commutative: in general, AB ≠ BA.
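A small pure-Python sketch of matrix multiplication (illustrative, not from the slides), which also demonstrates non-commutativity:

```python
def matmul(A, B):
    # (m x n) times (n x p): columns of A must match rows of B
    assert len(A[0]) == len(B), "incompatible dimensions"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]] -- so AB != BA
```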

Matrix Sum: (A + B)_ij = A_ij + B_ij. A and B must have the same dimensions.

Determinant of a Matrix: A must be square. For a 2 × 2 matrix, det([[a, b], [c, d]]) = ad - bc.

Determinant in Matlab: >> det(A)

Inverse of a Matrix: If A is a square matrix, the inverse of A, called A^-1, satisfies A A^-1 = I and A^-1 A = I, where I, the identity matrix, is a diagonal matrix with all 1's on the diagonal.

Inverse of a 2D Matrix: [[a, b], [c, d]]^-1 = (1 / (ad - bc)) [[d, -b], [-c, a]], provided ad - bc ≠ 0.
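The closed-form 2 × 2 inverse can be sketched directly in Python (illustrative, not from the slides):

```python
def inv2(A):
    # closed-form inverse of a 2 x 2 matrix [[a, b], [c, d]]
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "singular matrix: no inverse"
    s = 1.0 / det
    return [[d * s, -b * s], [-c * s, a * s]]

A = [[2, 1], [1, 1]]  # det = 2*1 - 1*1 = 1
print(inv2(A))  # [[1.0, -1.0], [-1.0, 2.0]]
```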

Inverses in Matlab: >> inv(A)

Trace of a matrix: The trace of a square matrix is the sum of its diagonal elements, tr(A) = A_11 + A_22 + ... + A_nn. Example (on board)

Matrix Transformation: Scale. A square, diagonal matrix scales each dimension by the corresponding diagonal element. Example: [[2, 0], [0, 3]] maps (x, y) to (2x, 3y).
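Applying a diagonal scale matrix to a vector can be sketched in Python (illustrative, not from the slides):

```python
def apply(M, x):
    # multiply matrix M by column vector x
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

S = [[2, 0], [0, 3]]  # scales the first coordinate by 2, the second by 3
print(apply(S, [1, 1]))   # [2, 3]
print(apply(S, [5, -2]))  # [10, -6]
```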

Linear Independence: A set of vectors is linearly dependent if one of the vectors can be expressed as a linear combination of the other vectors. Example: (1, 0), (0, 1), and (1, 1) are linearly dependent, since (1, 1) = (1, 0) + (0, 1). A set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the other vectors. Example: (1, 0) and (0, 1) are linearly independent.

Rank of a matrix: The rank of a matrix is the number of linearly independent columns of the matrix. Examples: the 3 × 3 identity matrix has rank 3; a 3 × 3 matrix whose third column is the sum of the first two has rank 2. Note: the rank of a matrix is also the number of linearly independent rows of the matrix.

Singular Matrix: All of the following conditions are equivalent. We say a square (n × n) matrix is singular if any one of these conditions (and hence all of them) is satisfied. – The columns are linearly dependent – The rows are linearly dependent – The determinant = 0 – The matrix is not invertible – The matrix is not full rank (i.e., rank < n)
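For a 2 × 2 matrix the equivalence is easy to see concretely: dependent rows force the determinant to zero. A quick Python check (illustrative, not from the slides):

```python
def det2(A):
    # determinant of a 2 x 2 matrix [[a, b], [c, d]]
    (a, b), (c, d) = A
    return a * d - b * c

A = [[1, 2], [2, 4]]  # second row is twice the first: rows are dependent
print(det2(A))  # 0, so A is singular: not invertible, rank < 2
```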

Linear Spaces: A linear space is the set of all vectors that can be expressed as a linear combination of a set of basis vectors. We say this space is the span of the basis vectors. – Example: R^3, 3-dimensional Euclidean space, is spanned by each of the following two bases:

Linear Subspaces: A linear subspace is the space spanned by a subset of the vectors in a linear space. – The space spanned by the following vectors is a two-dimensional subspace of R^3. What does it look like? – The space spanned by the following vectors is a two-dimensional subspace of R^3.

Orthogonal and Orthonormal Bases: n linearly independent real vectors span R^n, n-dimensional Euclidean space. They form a basis for the space. – An orthogonal basis a_1, …, a_n satisfies a_i . a_j = 0 if i ≠ j – An orthonormal basis a_1, …, a_n satisfies a_i . a_j = 0 if i ≠ j and a_i . a_j = 1 if i = j – Examples.

Orthonormal Matrices: A square matrix is orthonormal (also called unitary) if its columns are orthonormal vectors. – A matrix A is orthonormal iff A A^T = I. If A is orthonormal, A^-1 = A^T, so A A^T = A^T A = I. – A rotation matrix is an orthonormal matrix with determinant = 1. It is also possible for an orthonormal matrix to have determinant = –1. This is a rotation plus a flip (reflection).
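The defining property A A^T = I is easy to verify numerically. A Python sketch (illustrative, not from the slides), using a determinant −1 example (a flip):

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# an orthonormal matrix with determinant -1 (a reflection that swaps x and y)
A = [[0.0, 1.0], [1.0, 0.0]]
print(matmul(A, transpose(A)))  # [[1.0, 0.0], [0.0, 1.0]] -> A^-1 = A^T
```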

Matrix Transformation: Rotation by an Angle θ. 2D rotation matrix: R(θ) = [[cos θ, -sin θ], [sin θ, cos θ]].
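Applying the 2D rotation matrix to a vector can be sketched in Python (illustrative, not from the slides):

```python
import math

def rotate(v, theta):
    # 2D rotation: [x', y'] = [x cos(t) - y sin(t), x sin(t) + y cos(t)]
    c, s = math.cos(theta), math.sin(theta)
    x, y = v
    return [c * x - s * y, s * x + c * y]

# rotating (1, 0) by 90 degrees gives (0, 1), up to floating-point rounding
v = rotate([1, 0], math.pi / 2)
print([round(x, 10) for x in v])  # [0.0, 1.0]
```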

Some Properties of Eigenvalues and Eigenvectors: – If λ_1, …, λ_n are distinct eigenvalues of a matrix, then the corresponding eigenvectors e_1, …, e_n are linearly independent. – If e_1 is an eigenvector of a matrix with corresponding eigenvalue λ_1, then any nonzero scalar multiple of e_1 is also an eigenvector with eigenvalue λ_1. – A real, symmetric square matrix has real eigenvalues, with orthogonal eigenvectors (can be chosen to be orthonormal).
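For a real symmetric 2 × 2 matrix the eigenvalues follow from the quadratic formula applied to the characteristic polynomial, and they are indeed real. A Python sketch of this special case (illustrative, not from the slides):

```python
import math

def eig_sym2(A):
    # eigenvalues of a symmetric 2x2 matrix [[a, b], [b, d]]:
    # lambda = (a + d)/2 +- sqrt(((a - d)/2)^2 + b^2)
    # the quantity under the square root is nonnegative, so both roots are real
    (a, b), (_, d) = A
    m = (a + d) / 2.0
    r = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return m + r, m - r

print(eig_sym2([[2, 1], [1, 2]]))  # (3.0, 1.0)
```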

SVD: Singular Value Decomposition. Any matrix A (m × n) can be written as the product of three matrices: A = U D V^T, where – U is an m × m orthonormal matrix – D is an m × n diagonal matrix. Its diagonal elements, σ_1, σ_2, …, are called the singular values of A, and satisfy σ_1 ≥ σ_2 ≥ … ≥ 0. – V is an n × n orthonormal matrix. (diagram: the shapes of A = U D V^T when m > n)

SVD in Matlab: >> [U,D,V] = svd(A) returns the three factors U, D, and V directly.

Some Properties of SVD: – The rank of matrix A is equal to the number of nonzero singular values σ_i – A square (n × n) matrix A is singular if and only if at least one of its singular values σ_1, …, σ_n is zero.
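One way to see where the singular values come from: they are the square roots of the eigenvalues of A^T A. A 2 × 2 Python sketch (illustrative, not from the slides; real code would call a library SVD routine):

```python
import math

def singular_values2(A):
    # singular values of a 2x2 matrix A are the square roots of the
    # eigenvalues of the symmetric matrix A^T A = [[p, q], [q, r]]
    (a, b), (c, d) = A
    p = a * a + c * c
    q = a * b + c * d
    r = b * b + d * d
    m = (p + r) / 2.0
    s = math.sqrt(((p - r) / 2.0) ** 2 + q * q)
    # clamp at 0 to guard against tiny negative rounding error
    return math.sqrt(m + s), math.sqrt(max(m - s, 0.0))

print(singular_values2([[3, 0], [0, 2]]))  # (3.0, 2.0)
```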

Geometric Interpretation of SVD: If A is a square (n × n) matrix, – U is a unitary matrix: rotation (possibly plus flip) – D is a scale matrix – V (and thus V^T) is a unitary matrix. Punchline: An arbitrary n-D linear transformation is equivalent to a rotation (plus perhaps a flip), followed by a scale transformation, followed by a rotation. Advanced: y = Ax = U D V^T x – V^T expresses x in terms of the basis V – D rescales each coordinate (each dimension) – The new coordinates are the coordinates of y in terms of the basis U