Linear Algebra Review
CS479/679 Pattern Recognition
Dr. George Bebis
n-dimensional Vector
An n-dimensional vector v is denoted as a column:
$v = (x_1, x_2, \dots, x_n)^T$
Its transpose $v^T$ is denoted as the row vector $(x_1, x_2, \dots, x_n)$.
Inner (or dot) product
Given $v^T = (x_1, x_2, \dots, x_n)$ and $w^T = (y_1, y_2, \dots, y_n)$, their dot product is defined as follows:
$v \cdot w = v^T w = \sum_{i=1}^{n} x_i y_i$  (a scalar)
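A minimal NumPy sketch (not part of the original slides) illustrating the dot product; the vectors are arbitrary example values.

```python
import numpy as np

# Two example 3-dimensional vectors (values chosen only for illustration)
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 2.0])

# Dot product: sum of elementwise products, i.e. v^T w
dot = v @ w            # equivalently np.dot(v, w)
print(dot)             # 1*4 + 2*(-1) + 3*2 = 8.0
```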
Orthogonal / Orthonormal vectors
A set of vectors $x_1, x_2, \dots, x_n$ is orthogonal if $x_i^T x_j = 0$ for all $i \neq j$.
A set of vectors $x_1, x_2, \dots, x_n$ is orthonormal if, in addition, each vector has unit length: $x_i^T x_j = 0$ for $i \neq j$ and $x_i^T x_i = 1$.
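A short NumPy check (added here for illustration): stacking vectors as the columns of a matrix X, orthonormality of the set is equivalent to $X^T X = I$. The standard basis of $R^3$ is used as the example.

```python
import numpy as np

# Standard basis vectors of R^3, stacked as columns of a matrix
X = np.eye(3)

# For an orthonormal set, X^T X equals the identity matrix
print(np.allclose(X.T @ X, np.eye(3)))   # True
```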
Linear combinations
A vector v is a linear combination of the vectors $v_1, \dots, v_k$ if:
$v = c_1 v_1 + c_2 v_2 + \dots + c_k v_k$
where $c_1, \dots, c_k$ are constants.
Example: every vector in $R^3$ can be expressed as a linear combination of the unit vectors $i = (1, 0, 0)$, $j = (0, 1, 0)$, and $k = (0, 0, 1)$.
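A quick NumPy illustration (not from the slides): the coefficients of a vector in the standard basis of $R^3$ are simply its components. The vector and coefficients below are arbitrary examples.

```python
import numpy as np

# Unit vectors of R^3
i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])
k = np.array([0.0, 0.0, 1.0])

# An arbitrary example vector and its coefficients in the standard basis
v = np.array([2.0, -3.0, 5.0])
reconstructed = 2.0 * i + (-3.0) * j + 5.0 * k
print(np.allclose(v, reconstructed))   # True
```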
Space spanning
A set of vectors $S = \{v_1, v_2, \dots, v_k\}$ spans a space W if every vector in W can be written as a linear combination of the vectors in S.
– Example: the unit vectors i, j, and k span $R^3$.
Linear dependence
A set of vectors $v_1, \dots, v_k$ is linearly dependent if at least one of them is a linear combination of the others:
$v_j = \sum_{i \neq j} c_i v_i$  (i.e., $v_j$ does not appear on the right-hand side)
Linear independence
A set of vectors $v_1, \dots, v_k$ is linearly independent if no vector can be represented as a linear combination of the remaining vectors, i.e.:
$c_1 v_1 + c_2 v_2 + \dots + c_k v_k = 0$ only when $c_1 = c_2 = \dots = c_k = 0$
Example: the unit vectors i, j, and k are linearly independent.
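A NumPy sketch (added for illustration): stacking candidate vectors as the columns of a matrix, linear independence can be tested by comparing the matrix rank with the number of columns. The matrix below is an arbitrary example with a dependent column.

```python
import numpy as np

# Stack candidate vectors as columns of a matrix;
# here the third column equals the first plus the second
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# Columns are linearly independent iff the rank equals the number of columns
rank = np.linalg.matrix_rank(V)
print(rank, rank == V.shape[1])   # 2 False -> linearly dependent
```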
Vector basis
A set of vectors $(v_1, \dots, v_k)$ forms a basis of a vector space W if:
(1) $(v_1, \dots, v_k)$ are linearly independent
(2) $(v_1, \dots, v_k)$ span W
Standard bases: $R^2$: $e_1 = (1, 0)$, $e_2 = (0, 1)$; $R^3$: i, j, k; $R^n$: $e_1, e_2, \dots, e_n$.
Matrix Operations
Matrix addition/subtraction
– Matrices must be of the same size.
Matrix multiplication
– An m x n matrix A can be multiplied by a q x p matrix B only if n = q; the product AB is m x p.
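A NumPy sketch (illustrative only) of the size condition for matrix multiplication; the shapes below are arbitrary examples.

```python
import numpy as np

# A is 2x3 (m x n), B is 3x4 (q x p); multiplication requires n == q
A = np.arange(6).reshape(2, 3).astype(float)
B = np.arange(12).reshape(3, 4).astype(float)

C = A @ B          # result is m x p = 2 x 4
print(C.shape)     # (2, 4)
```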
Identity Matrix
The identity matrix I has 1s on the diagonal and 0s elsewhere; for any matrix A of compatible size, $AI = IA = A$.
Matrix Transpose
The transpose $A^T$ is obtained by interchanging rows and columns: $(A^T)_{ij} = A_{ji}$.
Property: $(AB)^T = B^T A^T$.
Symmetric Matrices
A matrix A is symmetric if $A = A^T$, i.e., $a_{ij} = a_{ji}$.
Example: $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$ is symmetric.
Determinants
2 x 2: $\det \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = a_{11}a_{22} - a_{12}a_{21}$
3 x 3 and n x n: computed by cofactor expansion along a row (or column), e.g. $\det(A) = \sum_{j=1}^{n} (-1)^{1+j} a_{1j} \det(M_{1j})$, where $M_{1j}$ is the sub-matrix obtained by deleting row 1 and column j.
Properties: $\det(AB) = \det(A)\det(B)$, $\det(A^T) = \det(A)$, and $\det(A) = 0$ if two rows (or columns) are identical.
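A small NumPy check (added for illustration) of the 2 x 2 determinant formula; the matrix is an arbitrary example.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# 2x2 determinant: a11*a22 - a12*a21 = 1*4 - 2*3 = -2
print(np.linalg.det(A))   # -2.0 (up to floating-point error)
```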
Matrix Inverse
The inverse $A^{-1}$ of a matrix A has the property: $AA^{-1} = A^{-1}A = I$.
$A^{-1}$ exists only if $\det(A) \neq 0$.
Terminology
– Singular matrix: $A^{-1}$ does not exist
– Ill-conditioned matrix: A is “close” to being singular
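A NumPy sketch (illustrative, not from the slides): invert a matrix after checking that its determinant is non-zero. The matrix and the tolerance 1e-12 are arbitrary choices for the example.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Invertible only when det(A) != 0 (tolerance chosen arbitrarily for the example)
if abs(np.linalg.det(A)) > 1e-12:
    A_inv = np.linalg.inv(A)
    print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^{-1} = I
```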
Matrix Inverse (cont’d)
Properties of the inverse:
$(A^{-1})^{-1} = A$
$(AB)^{-1} = B^{-1}A^{-1}$
$(A^T)^{-1} = (A^{-1})^T$
Matrix trace
The trace of a square matrix is the sum of its diagonal elements: $tr(A) = \sum_{i=1}^{n} a_{ii}$.
Properties: $tr(A + B) = tr(A) + tr(B)$, $tr(AB) = tr(BA)$, and $tr(A)$ equals the sum of the eigenvalues of A.
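A NumPy sketch (added for illustration) showing the trace and its relation to the eigenvalues; the matrix is an arbitrary example.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Trace: sum of the diagonal entries (2 + 3 = 5)
print(np.trace(A))                          # 5.0
# The trace also equals the sum of the eigenvalues (here 2 and 3)
print(np.sum(np.linalg.eigvals(A)).real)    # 5.0
```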
Rank of matrix
Equal to the size of the largest square sub-matrix of A that has a non-zero determinant.
Example: the 3 x 3 identity matrix has rank 3.
Rank of matrix (cont’d)
Alternative definition: the maximum number of linearly independent columns (or rows) of A.
Example: if a matrix has 4 columns but only 3 of them are linearly independent, its rank is 3, not 4.
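A NumPy sketch (illustrative only) of the column-based definition of rank; the matrix below is an arbitrary example with one dependent column.

```python
import numpy as np

# A 3x4 matrix whose fourth column is the sum of the first two,
# so only 3 of the 4 columns are linearly independent
A = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])

print(np.linalg.matrix_rank(A))   # 3 (not 4)
```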
Rank of matrix (cont’d)
An n x n matrix A is non-singular (i.e., $A^{-1}$ exists) if and only if rank(A) = n.
Eigenvalues and Eigenvectors
The vector v is an eigenvector of matrix A and λ is an eigenvalue of A if:
$Av = \lambda v$  (assume non-zero v)
i.e., the linear transformation implied by A cannot change the direction of the eigenvectors v, only their magnitude.
Computing λ and v
To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial:
$\det(A - \lambda I) = 0$
For each eigenvalue λ, the corresponding eigenvectors are the non-zero solutions of $(A - \lambda I)v = 0$.
Example: for $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$, $\det(A - \lambda I) = (2-\lambda)^2 - 1 = 0$ gives $\lambda_1 = 3$ (eigenvector $(1, 1)^T$) and $\lambda_2 = 1$ (eigenvector $(1, -1)^T$).
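A NumPy sketch (not from the slides) computing eigenvalues and eigenvectors numerically, using the same example matrix as above.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                                # 3 and 1 (order may vary)

# Verify A v = lambda v for the first eigenpair
v0 = eigvecs[:, 0]
print(np.allclose(A @ v0, eigvals[0] * v0))   # True
```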
Properties
Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n).
Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any non-zero scalar k).
Suppose $\lambda_1, \lambda_2, \dots, \lambda_n$ are the eigenvalues of A; then:
$\det(A) = \prod_{i=1}^{n} \lambda_i$ and $tr(A) = \sum_{i=1}^{n} \lambda_i$
Matrix diagonalization
Given an n x n matrix A, find P such that: $P^{-1}AP = \Lambda$, where Λ is diagonal.
Take $P = [v_1\ v_2\ \dots\ v_n]$, where $v_1, v_2, \dots, v_n$ are the eigenvectors of A; then $\Lambda = diag(\lambda_1, \lambda_2, \dots, \lambda_n)$.
Matrix diagonalization (cont’d)
Example: for $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$ with eigenpairs $\lambda_1 = 3$, $v_1 = (1, 1)^T$ and $\lambda_2 = 1$, $v_2 = (1, -1)^T$:
$P = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$ and $P^{-1}AP = \begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix}$.
A can be diagonalized ($P^{-1}AP = \Lambda$) only if $P^{-1}$ exists (i.e., P must have n linearly independent eigenvectors, that is, rank(P) = n).
If A is diagonalizable, then the corresponding eigenvectors $v_1, v_2, \dots, v_n$ form a basis of $R^n$.
Are all n x n matrices diagonalizable? No: only those with n linearly independent eigenvectors.
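A NumPy sketch (added for illustration) of diagonalization, reusing the same example matrix: the columns of P are the eigenvectors, and $P^{-1}AP$ is diagonal.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of P are the eigenvectors of A
eigvals, P = np.linalg.eig(A)

# P^{-1} A P should be the diagonal matrix of eigenvalues
Lambda = np.linalg.inv(P) @ A @ P
print(np.allclose(Lambda, np.diag(eigvals)))   # True
```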
Matrix decomposition
Let us assume that A is diagonalizable; then A can be decomposed as follows:
$A = P \Lambda P^{-1}$
Special case: symmetric matrices
The eigenvalues of a symmetric matrix are real, and its eigenvectors are orthogonal (they can be chosen orthonormal).
In this case P is an orthogonal matrix, so $P^{-1} = P^T$ and $A = P \Lambda P^T = \sum_{i=1}^{n} \lambda_i v_i v_i^T$.
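A NumPy sketch (illustrative only) for the symmetric case, again with the same example matrix: np.linalg.eigh returns real eigenvalues and orthonormal eigenvectors, so $P^{-1} = P^T$ and $A = P \Lambda P^T$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric

# eigh is specialized for symmetric matrices: real eigenvalues, orthonormal eigenvectors
eigvals, P = np.linalg.eigh(A)

print(np.allclose(P.T @ P, np.eye(2)))              # True: P^{-1} = P^T
print(np.allclose(P @ np.diag(eigvals) @ P.T, A))   # True: A = P Lambda P^T
```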