Principal Components: What matters most?
DRAFT: Copyright GA Tagliarini, PhD, 8/21/2019

Basic Statistics

Assume x1, x2, …, xn represent a distribution x of samples of a random variable. The expected value or mean of the distribution x, written E{x} or m, is given by

E{x} = m = (x1 + x2 + … + xn)/n

The population variance E{(x − m)²} = σ², the mean of the squared deviations from the mean, is given by

σ² = [(x1 − m)² + (x2 − m)² + … + (xn − m)²]/n
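The two formulas above can be sketched directly in code; the sample values here are hypothetical, chosen only to make the arithmetic easy to check by hand:

```python
# Population mean and variance of a sample list, per the formulas above.
xs = [2.0, 4.0, 6.0, 8.0]                  # hypothetical sample values

n = len(xs)
m = sum(xs) / n                            # E{x} = (x1 + ... + xn)/n
var = sum((x - m) ** 2 for x in xs) / n    # sigma^2: mean squared deviation

print(m)    # 5.0
print(var)  # 5.0
```

Note the divisor n (population variance), matching the slide; the sample variance would divide by n − 1 instead.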
Generalization to Vectors

Assume d-dimensional vectors xi = (xi1, …, xid)T whose components sample random distributions. The mean vector and covariance matrix are

mx = E{x} = (x1 + x2 + … + xn)/n

Cx = E{(x − mx)(x − mx)T} = [(x1 − mx)(x1 − mx)T + … + (xn − mx)(xn − mx)T]/n
Interpretation
- The mean vector is d-dimensional
- The covariance matrix is d × d:
  - Symmetric
  - Real valued
  - Main diagonal entries are the variances of the d dimensions
  - The off-diagonal entry in row i, column j is the covariance of dimensions i and j
Example: Finding a Covariance Matrix
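The worked example can be reproduced numerically; the data matrix X below is hypothetical (four 2-D samples), but the steps follow the definitions above:

```python
import numpy as np

# Hypothetical samples: one row per sample, one column per dimension (d = 2).
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 5.0],
              [4.0, 6.0]])

mx = X.mean(axis=0)            # mean vector, d-dimensional
D = X - mx                     # deviations from the mean
Cx = D.T @ D / X.shape[0]      # population covariance matrix, d x d

print(Cx)                      # symmetric; variances on the main diagonal
print(np.allclose(Cx, Cx.T))   # True
```

For these samples Cx = [[1.25, 1.75], [1.75, 2.5]]: the diagonal holds the per-dimension variances and the off-diagonal entry is the covariance of the two dimensions.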
Eigenvectors and Eigenvalues

Suppose A is a d × d matrix, λ is a scalar, and e ≠ 0 is a d-dimensional vector. If Ae = λe, then e is an eigenvector of A corresponding to the eigenvalue λ. If A is real and symmetric, one can always find d unit-length, mutually orthogonal (orthonormal) eigenvectors for A.
Characteristic Polynomial

Since eigenvalues and the corresponding eigenvectors satisfy Ae = λe, one can find an eigenvector corresponding to λ by solving the homogeneous linear system (λI − A)e = 0. To find λ itself, construct the characteristic polynomial p(λ) = det(λI − A), where det(·) is the determinant operator, and solve p(λ) = 0.
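A minimal sketch of this recipe, using a small hypothetical symmetric matrix A (np.poly returns the coefficients of det(λI − A), and the eigenvalues are its roots):

```python
import numpy as np

# A hypothetical real symmetric matrix (e.g., a small covariance matrix).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)          # coefficients of p(lambda) = det(lambda*I - A)
eigvals = np.roots(coeffs)   # eigenvalues of A are the roots of p

print(coeffs)                # [1., -4., 3.] -> p(l) = l^2 - 4l + 3
print(np.sort(eigvals))      # eigenvalues 1 and 3
```

Here p(λ) = (λ − 2)² − 1 = λ² − 4λ + 3, so the eigenvalues are 1 and 3; in practice np.linalg.eig computes them directly without forming the polynomial.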
Example: Finding Eigenvalues
Finding Eigenvectors for the Eigenvalues: λ1 and e1
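One way to carry out this step numerically: the solutions of the homogeneous system (λI − A)e = 0 form the null space of (λI − A), which the SVD exposes as the right singular vector with a (near-)zero singular value. The matrix A and eigenvalue below are hypothetical stand-ins for the slide's example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                          # an eigenvalue of A

# Null space of (lam*I - A): the right singular vector whose singular
# value is ~0 spans the eigenspace for lam.
M = lam * np.eye(2) - A
_, s, Vt = np.linalg.svd(M)
e = Vt[-1]                         # unit-length eigenvector

print(np.allclose(A @ e, lam * e)) # True: Ae = lambda*e
```

The SVD route yields a unit-length vector directly (up to sign), which is convenient when assembling the orthonormal eigenvector set for a symmetric matrix.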
Finding Eigenvectors for the Eigenvalues: λ2 and e2
Finding Eigenvectors for the Eigenvalues: λ3 and e3
A Similarity Transform
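The slide's worked transform can be sketched as follows (the matrix C is hypothetical): for a real symmetric matrix, the orthonormal eigenvectors, entered as the rows of E, give a similarity transform E C Eᵀ that is diagonal, with the eigenvalues on the diagonal.

```python
import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(C)   # eigh: for real symmetric matrices
E = vecs.T                       # orthonormal eigenvectors as rows

D = E @ C @ E.T                  # similarity transform diagonalizes C
print(np.round(D, 10))           # diagonal matrix of the eigenvalues
```

This is exactly the mechanism the Hotelling transform on the next slide exploits.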
Hotelling Transform
- Order the eigenvalues of Cx from largest to smallest, λ1 ≥ λ2 ≥ … ≥ λd, with corresponding eigenvectors e1, e2, …, ed
- Form matrix A by entering the eigenvectors as its rows
- Transform each sample: y = A(x − mx)
- Then my = E{y} = 0 and Cy = A Cx AT, which is diagonal
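The steps above can be sketched end to end; the data matrix X is hypothetical (the same four 2-D samples used earlier), and everything else follows the slide's recipe:

```python
import numpy as np

# Hotelling (principal components) transform.
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 5.0],
              [4.0, 6.0]])

mx = X.mean(axis=0)
D = X - mx
Cx = D.T @ D / X.shape[0]                # population covariance matrix

vals, vecs = np.linalg.eigh(Cx)          # eigenvalues in ascending order
order = np.argsort(vals)[::-1]           # reorder: largest eigenvalue first
A = vecs[:, order].T                     # eigenvectors as the rows of A

Y = (A @ D.T).T                          # y = A(x - mx) for every sample
Cy = A @ Cx @ A.T

print(np.allclose(Y.mean(axis=0), 0))    # True: my = 0
print(np.round(Cy, 10))                  # diagonal: dimensions decorrelated
```

The transformed samples have zero mean, and Cy is diagonal with the eigenvalues in decreasing order, so the first component of y carries the largest variance: what matters most.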