Introduction to Matrices
Douglas N. Greve

Why Matrices?
- Simplifies notation
- Simplifies concepts
- Grounding for the general linear model (GLM)
- Simplifies implementation (e.g., MATLAB, Octave)
- Optimization

What is the matrix?
- A matrix is a table of numbers, like a spreadsheet
- Each entry is called an "element"
- There is a value in each element (no empty cells)
- Elements can be whole numbers, positive or negative, fractions, real, imaginary, or symbolic variables
- Example: a 2x3 matrix M whose elements include 2, 4, y, 3.3i, and -7.4
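
A minimal sketch (not part of the original slides) of such a table of numbers in Python with NumPy; the symbolic element y and one unrecoverable entry from the slide are replaced with placeholder numbers, since NumPy arrays hold numeric values:

    import numpy as np

    # 2x3 matrix: 2 rows, 3 columns. The imaginary entry 3.3i makes the
    # whole array complex-valued. The slide's symbolic "y" is stood in
    # for by an ordinary number here.
    M = np.array([[2.0, 4.0,  1.0],
                  [0.5, 3.3j, -7.4]])

    print(M.shape)   # (2, 3) -> rows x columns
    print(M.dtype)   # complex128, because one element is imaginary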

Parts of Matrices
- Size/Dimensions
- Elements
- Groupings of Elements:
  - Diagonals
  - Triangles
  - Vectors

Matrix Size/Dimension
- Number of Rows
- Number of Columns
- Written Rows-by-Columns (Rows x Cols)
- Examples: the matrix M above is 2x3; a single row such as [2 4 y] is 1x3; a single column is 2x1; a single element is 1x1

Scalar versus Matrix
- A scalar is just a number or value
- It has no dimension or size
- Example: a scalar equal to -1 has no brackets and no dimension; it is not the same as the 1x1 matrix [-1]

Matrix Element
- Indicated by its row and column number
- Each element is a scalar value
- Notation: M(i,j), M(row,col), or M_ij
- In the 2x3 example above, the element positions are (1,1) (1,2) (1,3) in the first row and (2,1) (2,2) (2,3) in the second, so M(2,3) = -7.4
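
A small NumPy sketch of element access (placeholder values stand in for the slide's unrecoverable entries); note that NumPy counts rows and columns from 0 rather than 1:

    import numpy as np

    M = np.array([[2.0, 4.0, 1.0],
                  [0.5, 3.3, -7.4]])

    # NumPy indexes from 0, so the slide's M(2,3) (row 2, column 3)
    # is written M[1, 2] here.
    print(M[1, 2])   # -7.4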

Diagonals
- An element is on a diagonal when Row = Column + Offset
- Row = Column (Main Diagonal)
- Row = Column - 1 (First Upper Diagonal)
- Row = Column - 2 (Second Upper Diagonal)
- Row = Column + 1 (First Lower Diagonal)
- Row = Column + 2 (Second Lower Diagonal)
- Everything except the main diagonal is an Off-Diagonal

Triangles
- Upper Triangle: the elements above the main diagonal
- Lower Triangle: the elements below the main diagonal
- (Illustrated on a 3x3 example)

Vector
- A special type of matrix: the number of rows or columns = 1
- Often given lower-case variable names
- A 1x3 matrix such as [2 4 y] is a Row Vector
- A 2x1 matrix is a Column Vector

A matrix consists of vectors
- A 2x3 matrix can be viewed as two 1x3 row vectors, or as three 2x1 column vectors

Special Matrices
- Square
- Diagonal
- Identity
- Triangular
- Symmetric
- Toeplitz

Square Matrix
- Number of Rows = Number of Columns
- Examples: any 2x2 matrix; a 1x1 matrix is also square

Diagonal Matrix
- All off-diagonal elements = 0
- M(i,j) = 0 if i != j
- (3x3 example)

Identity Matrix
- Main diagonal is all 1s
- All off-diagonal elements = 0
- Square
- Symbol: "I"
- 3x3 example: I = [1 0 0; 0 1 0; 0 0 1]
- Any matrix multiplied by the identity is itself
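
A quick NumPy check (illustrative values only) that multiplying by the identity leaves a matrix unchanged:

    import numpy as np

    I = np.eye(3)                    # 3x3 identity matrix
    M = np.array([[2., 4., 6.],
                  [1., 3., 5.],
                  [7., 8., 9.]])

    # Multiplying by the identity returns the original matrix.
    print(np.allclose(M @ I, M))     # True
    print(np.allclose(I @ M, M))     # True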

Triangular Matrices
- Upper Triangular: all values below the main diagonal are 0
- Lower Triangular: all values above the main diagonal are 0
- (3x3 examples)

Symmetric Matrix
- M(row,col) = M(col,row)
- Reflects across the main diagonal
- Must be square
- (3x3 example)

Toeplitz Matrix
- All elements on a given diagonal are the same
- (2x2 and larger examples)
- A causal, time-invariant linear system is represented by an upper triangular Toeplitz matrix
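
As an illustrative sketch (values assumed, not from the slides), SciPy can build a Toeplitz matrix from its first column and first row:

    import numpy as np
    from scipy.linalg import toeplitz

    # First argument is the first column, second is the first row;
    # every diagonal then holds a single repeated value.
    T = toeplitz([1, 0, 0, 0], [1, 2, 3, 4])
    print(T)
    # [[1 2 3 4]
    #  [0 1 2 3]
    #  [0 0 1 2]
    #  [0 0 0 1]]   <- an upper triangular Toeplitz matrix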

Ones Matrix
- All elements are 1
- Any number of rows or columns

Zeros Matrix
- All elements are 0
- Any number of rows or columns

Matrix Operations
- Addition/Subtraction
- Vector Multiplication
- Matrix Multiplication
- Multiplication by a scalar
- Transpose
- Trace
- Inverse
- Pseudo-Inverse

Matrix Addition/Subtraction
- Element-by-element addition/subtraction
- The sizes of the two matrices must be the same
- C(row,col) = A(row,col) +/- B(row,col)
- Example with 2x3 matrices: C = A + B (your first matrix equation!)
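
A minimal NumPy sketch of element-by-element addition and subtraction (the values are placeholders, not the slide's):

    import numpy as np

    A = np.array([[2, 4, 6],
                  [1, 3, 5]])
    B = np.array([[0, 4, 2],
                  [1, 5, 6]])

    C = A + B        # element-by-element; the shapes must match
    D = A - B
    print(C)
    # [[ 2  8  8]
    #  [ 2  8 11]]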

Vector Multiplication
- One row vector times one column vector of the same length; the result is 1x1
- Also called the Inner Product, Dot Product, or "Scalar" Product
- Like a correlation: "multiply and accumulate"
- z = x*y = Σ x_i*y_i
- Example: the 1x3 row vector x = [2 4 6] times a 3x1 column vector y gives a single number (46 in the slide's example)
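
A NumPy sketch of the inner product. The slide's x = [2 4 6] is kept; its y is not recoverable from the transcript, so the values below are assumed (chosen so the result matches the slide's 46):

    import numpy as np

    x = np.array([2, 4, 6])      # 1x3 row vector
    y = np.array([1, 5, 4])      # 3x1 column vector (assumed values)

    # Multiply element-by-element and accumulate.
    z = np.dot(x, y)             # 2*1 + 4*5 + 6*4
    print(z)                     # 46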

Matrix Multiplication
- A series of vector multiplications (inner products)
- The "inner dimensions" must be the same: (2x3)*(3x4) = (2x4)
- c_ij = (row i of A) * (column j of B)
- Example with A (2x3) and B (3x4):

  A = [ 2 4 6        B = [ 0 4 2 8
        1 3 5 ]             1 5 6 2
                            3 7 9 3 ]

  C = A*B is 2x4, computed element by element:
  c11 = 2*0 + 4*1 + 6*3 = 22
  c12 = 2*4 + 4*5 + 6*7 = 70
  c13 = 2*2 + 4*6 + 6*9 = 82
  c14 = 2*8 + 4*2 + 6*3 = 42
  c21 = 1*0 + 3*1 + 5*3 = 18
  c22 = 1*4 + 3*5 + 5*7 = 54
  (the remaining elements follow the same pattern)
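
A NumPy check of the worked example above. A's second row is reconstructed as [1 3 5] from the worked sums (the transcript is garbled there), so treat these values as a best-effort reading of the slide:

    import numpy as np

    A = np.array([[2, 4, 6],
                  [1, 3, 5]])              # 2x3
    B = np.array([[0, 4, 2, 8],
                  [1, 5, 6, 2],
                  [3, 7, 9, 3]])           # 3x4

    C = A @ B                              # inner dimensions match: (2x3)*(3x4) -> (2x4)
    print(C)
    # [[22 70 82 42]
    #  [18 54 65 29]]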

Outer Product (Vector Multiplication)
- One row vector and one column vector of the same length (N)
- The result is an NxN matrix
- A special case of matrix multiplication: z = y*x, where y is 3x1 and x = [2 4 6] is 1x3, giving a 3x3 matrix
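
A NumPy sketch of the outer product, again with an assumed y:

    import numpy as np

    x = np.array([2, 4, 6])      # 1x3 row vector
    y = np.array([1, 5, 4])      # 3x1 column vector (assumed values)

    Z = np.outer(y, x)           # y*x -> 3x3 matrix
    print(Z)
    # [[ 2  4  6]
    #  [10 20 30]
    #  [ 8 16 24]]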

Multiplication with a scalar
- Multiply each element of the matrix by the scalar value
- Example: multiplying a 2x3 matrix A by the scalar -1 negates every element
- The scalar itself has no brackets and no dimension; it is not a 1x1 matrix

Matrix Transpose
- "Reflect" across the main diagonal: B(row,col) = A(col,row)
- Notation: B = A^T = A^t = A'
- A 2x3 matrix transposes to a 3x2 matrix
- If A = A', then A is symmetric (and must be square)

Trace
- The sum of the diagonal elements: Trace = Σ M(i,i)
- In the slide's example matrix, Trace = 14
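
A NumPy sketch of the trace; the slide's matrix is not recoverable from the transcript, so the placeholder below is chosen to give the same result of 14:

    import numpy as np

    M = np.array([[2, 4, 6],
                  [1, 3, 5],
                  [7, 8, 9]])       # placeholder values

    print(np.trace(M))              # 2 + 3 + 9 = 14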

Determinant
- A property of a square matrix
- Complicated in general, simple for a 2x2:
- For C = [c11 c12; c21 c22], det(C) = c11*c22 - c12*c21

Matrix Inverse
- If C*A = I, then A = C^-1 (the inverse of C)
- C must be square
- Complicated in general, simple for a 2x2:
- With det(C) = c11*c22 - c12*c21,
  C^-1 = (1/det(C)) * [ c22 -c12; -c21 c11 ]
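
A NumPy sketch of the 2x2 inverse formula, checked against the built-in inverse (placeholder values):

    import numpy as np

    C = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    det  = C[0, 0]*C[1, 1] - C[0, 1]*C[1, 0]       # c11*c22 - c12*c21
    Cinv = np.array([[ C[1, 1], -C[0, 1]],
                     [-C[1, 0],  C[0, 0]]]) / det

    print(np.allclose(Cinv, np.linalg.inv(C)))     # True
    print(np.allclose(C @ Cinv, np.eye(2)))        # True: C * C^-1 = I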

Invertibility
- IMPORTANT!!! Not all matrices are invertible
- Example: for C = [1.0 2.0; 0.5 1.0], det(C) = 1.0*1.0 - 2.0*0.5 = 1 - 1 = 0
- When det = 0, C^-1 does not exist: the matrix is "Singular"

Singularity and "Ill-Conditioned"
- In the example above, det(C) = 1.0*1.0 - 2.0*0.5 = 1 - 1 = 0
- Column 2 = twice Column 1: Linear Dependence
- Ill-Conditioned: the determinant is "close" to 0
- Relates to the efficiency of a GLM
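
A NumPy sketch of the singular example and of a nearly dependent (ill-conditioned) variant; the perturbed value 1.01 is an assumption for illustration:

    import numpy as np

    C = np.array([[1.0, 2.0],
                  [0.5, 1.0]])
    print(np.linalg.det(C))        # 0 (up to round-off): singular
    # np.linalg.inv(C) would raise a LinAlgError here.

    C2 = np.array([[1.0, 2.00],
                   [0.5, 1.01]])   # column 2 is only nearly twice column 1
    print(np.linalg.cond(C2))      # large condition number: ill-conditioned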

Pseudo-Inverse
- Can be used for non-square matrices
- The same rules on invertibility apply
- Gives the least-mean-square (LMS) solution for an over-determined system
- If X has more rows than columns: X+ = (X'*X)^-1 * X'
  Check: X+ * X = (X'*X)^-1 * X' * X = I
- If X has more columns than rows: X+ = X' * (X*X')^-1
  Check: X * X+ = X * X' * (X*X')^-1 = I
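
A NumPy sketch for the tall (more rows than columns) case, with an assumed design matrix, checking the formula against NumPy's built-in pseudo-inverse:

    import numpy as np

    X = np.array([[1.0, 0.1],
                  [1.0, 0.5],
                  [1.0, 0.9]])                       # more rows than columns

    Xplus = np.linalg.inv(X.T @ X) @ X.T             # (X'X)^-1 X'
    print(np.allclose(Xplus @ X, np.eye(2)))         # True: X+ * X = I
    print(np.allclose(Xplus, np.linalg.pinv(X)))     # True: matches np.linalg.pinv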

Matrix Applications

Sum/Length/Average of a List
- Let the list be y = [1; 2; 3] (3x1), and let x = [1; 1; 1] (3x1) be the "ones vector", so x' = [1 1 1] (1x3)
- length(y) = x'*x = 1*1 + 1*1 + 1*1 = 3
- sum(y) = x'*y = 1*1 + 1*2 + 1*3 = 6
- Average: μ = (x'*x)^-1 * (x'*y) = (3^-1)*6 = 6/3 = 2 [Note: this is a GLM]
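
The same computation as a short NumPy sketch:

    import numpy as np

    y = np.array([1.0, 2.0, 3.0])      # the list
    x = np.ones(3)                     # the "ones vector"

    length = x @ x                     # 3
    total  = x @ y                     # 6
    avg    = (x @ y) / (x @ x)         # (x'x)^-1 (x'y) = 2, a one-parameter GLM
    print(length, total, avg)          # 3.0 6.0 2.0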

Selective Average
- Replace the ones vector with x = [0; 1; 1] to include only some elements of y = [1; 2; 3]
- length = x'*x = 0*0 + 1*1 + 1*1 = 2
- sum = x'*y = 0*1 + 1*2 + 1*3 = 5
- Average: μ = (x'*x)^-1 * (x'*y) = (2^-1)*5 = 5/2 = 2.5

Sum of the Squares of a List
- y = [1; 2; 3] (3x1), so y' = [1 2 3] (1x3)
- SS(y) = y'*y = 1*1 + 2*2 + 3*3 = 14
- SS = Σ y_i^2

Variance of a List
- With y = [1; 2; 3] (3x1), x the ones vector (3x1), and mean μ:
- Residual: e = y - μ*x
- σ² = (e'*e)/DOF
- DOF = x'*x - 1 = #Rows of X - #Cols of X
- Equivalent to σ² = Σ(y - μ)² / (N - 1)
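
A NumPy sketch of the variance computed through the residuals:

    import numpy as np

    y = np.array([1.0, 2.0, 3.0])
    x = np.ones(3)

    mu  = (x @ y) / (x @ x)            # mean = 2.0
    e   = y - mu * x                   # residuals
    dof = x @ x - 1                    # rows of X - columns of X = 2
    var = (e @ e) / dof                # 1.0
    print(var, np.var(y, ddof=1))      # both 1.0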

Orthogonality
- Two vectors are orthogonal when their inner product is 0: z = x*y = Σ x_i*y_i = [0]
- A property of the two vectors
- Related to uncorrelated/independent, which is a statistical property

Fitting a Line
- Dependent Variable (the Measurement): e.g., HRF Amplitude, Thickness
- Independent Variable: e.g., Age, IQ, Height, Weight
- (Plot: Thickness versus Age, with Subject 1 at (x1, y1) and Subject 2 at (x2, y2))

Linear Model
- Intercept: b ("intercept" = "offset"); Slope: m
- System of Linear Equations:
  y1 = 1*b + x1*m
  y2 = 1*b + x2*m
- Two equations, two unknowns
- Solve by substitution:
  1. Solve the 1st equation for b
  2. Substitute into the 2nd equation
  3. Solve the 2nd equation for m
  4. Substitute m back into the equation for b
- Result: m = (y2-y1)/(x2-x1) and b = (x2*y1 - x1*y2)/(x2-x1)

Matrix Formulation/GLM
- The same two equations in matrix form:
  [y1; y2] = [1 x1; 1 x2] * [b; m]
- y = X*β
- β is the parameter vector (2x1); X is the design matrix (2x2)

Matrix Formulation/GLM
- y = X*β, with design matrix X = [1 x1; 1 x2]
- # of Columns of X = # of Parameters
- Each parameter has a column
- Each column means something: a parameter takes its meaning from its column

GLM Solution
- y = X*β; multiply both sides by X^-1 to get β = X^-1*y
- With X = [1 x1; 1 x2]: det(X) = x2 - x1 and X^-1 = (1/(x2-x1)) * [x2 -x1; -1 1]
- Non-invertible if x1 = x2
- Ill-conditioned if x1 is near x2: sensitive to noise
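
A NumPy sketch of solving the two-point case; the age/thickness numbers are assumptions for illustration:

    import numpy as np

    x1, y1 = 20.0, 3.0                 # assumed (Age, Thickness) for Subject 1
    x2, y2 = 60.0, 2.0                 # assumed (Age, Thickness) for Subject 2

    X = np.array([[1.0, x1],
                  [1.0, x2]])          # design matrix: intercept column, slope column
    y = np.array([y1, y2])

    beta = np.linalg.inv(X) @ y        # beta = X^-1 * y
    print(beta)                        # [b, m]: intercept and slope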

Non-invertibility
- Example: x1 = x2 = 0.1, so X = [1 0.1; 1 0.1] and X is not invertible
- We want to find β in y = X*β, but β is not unique: different parameter vectors (e.g., one with intercept 1.2 and one with intercept 2.2) produce exactly the same y

More than Two Points
- Three equations, two unknowns:
  y1 = 1*b + x1*m
  y2 = 1*b + x2*m
  y3 = 1*b + x3*m
- Over-determined; X still has two columns with the same meaning, but it is not square
- y = X*β; multiply both sides by the pseudo-inverse X+: β = X+*y = (X'X)^-1 X'y (the LMS solution)
- The same invertibility concerns apply
- Can now measure noise (DOF != 0)
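
A NumPy sketch of the over-determined fit with three (assumed) data points, using the pseudo-inverse:

    import numpy as np

    x = np.array([20.0, 40.0, 60.0])             # assumed Ages
    y = np.array([ 3.0,  2.6,  2.1])             # assumed Thicknesses

    X = np.column_stack([np.ones_like(x), x])    # design matrix [1, x]
    beta = np.linalg.pinv(X) @ y                 # X+ * y = (X'X)^-1 X' y (LMS)
    print(beta)                                  # [b, m]

    e   = y - X @ beta                           # residuals = measurement noise
    dof = X.shape[0] - X.shape[1]                # rows - columns = 1
    print((e @ e) / dof)                         # noise variance estimate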

Matrix Decomposition/Factorization
- Break up a single matrix into several matrices that are multiplied together:
- Eigen System
- Singular Value Decomposition (SVD)
- Principal Component Analysis (PCA)
- Cholesky Decomposition: the "square root" of a matrix

Eigen Decomposition
- A = Q*Λ*Q^-1, and A*v = λ*v
- A: a square matrix
- v: an eigenvector
- λ: an eigenvalue
- Q: the matrix of eigenvectors
- Λ: the diagonal matrix of eigenvalues
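
A NumPy sketch of the eigendecomposition (placeholder matrix), checking both defining equations:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                   # placeholder square matrix

    lam, Q = np.linalg.eig(A)                    # eigenvalues and eigenvectors
    Lam = np.diag(lam)                           # diagonal matrix of eigenvalues

    print(np.allclose(A, Q @ Lam @ np.linalg.inv(Q)))    # A = Q*Lam*Q^-1
    print(np.allclose(A @ Q[:, 0], lam[0] * Q[:, 0]))    # A*v = lambda*v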