
1 Scientific Computing: Singular Value Decomposition (SVD)

2 SVD - Overview SVD is a way to decompose singular (or nearly singular) matrices, i.e. matrices that do not have inverses. This includes square matrices whose determinant is zero (or nearly zero), as well as all rectangular matrices. In fact, every real m-by-n matrix has an SVD, which is what makes it so useful in these degenerate cases.

3 SVD - Basics The SVD of an m-by-n matrix A is given by the formula A = U·D·V^T, where: U is an m-by-m matrix whose columns are the orthonormal eigenvectors of A·A^T (U is orthogonal); V^T is the transpose of the n-by-n matrix V whose columns are the orthonormal eigenvectors of A^T·A (V is orthogonal); D is an m-by-n diagonal matrix of the singular values, which are the square roots of the eigenvalues of A^T·A.

4 SVD In Matlab: [U,D,V] = svd(A,0) (the 0 requests the economy-size decomposition)
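
The Matlab call above maps to NumPy roughly as follows (a sketch; the example matrix is made up, NumPy returns V already transposed, and full_matrices=False plays the role of the 0 argument):

```python
import numpy as np

# A small rectangular example matrix (made up for illustration)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Economy-size SVD, analogous to Matlab's svd(A, 0); NumPy returns
# V transposed (Vt) and the singular values as a 1-D array s rather
# than a diagonal matrix D.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reassemble A = U * D * V^T from the factors
A_rebuilt = U @ np.diag(s) @ Vt
```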

5 The Algorithm Derivation of the SVD can be broken down into two major steps [2]: 1. Reduce the initial matrix to bidiagonal form using Householder transformations (reflections). 2. Diagonalize the resulting matrix using orthogonal transformations (rotations). Initial Matrix → Bidiagonal Form → Diagonal Form

6 Householder Transformations Recall: A Householder matrix is a reflection defined as H = I − 2·w·w^T, where w is a unit vector with ||w||2 = 1. We have the following properties: H = H^T, H^-1 = H^T, H^2 = I (identity matrix). If H is multiplied by another matrix (on the right or left), it produces a new matrix with zeroed-out elements in a selected row/column, depending on the values chosen for w.
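
The listed properties can be checked numerically. A minimal sketch (the vector x below is an arbitrary example, not taken from the slides):

```python
import numpy as np

def householder(w):
    """H = I - 2*w*w^T for a unit vector w (a reflection)."""
    w = w / np.linalg.norm(w)            # enforce ||w||_2 = 1
    return np.eye(len(w)) - 2.0 * np.outer(w, w)

# Choosing w proportional to x - alpha*e1 makes H map x onto alpha*e1,
# zeroing every entry of x below the first.
x = np.array([3.0, 4.0, 0.0])
alpha = -np.sign(x[0]) * np.linalg.norm(x)    # alpha = -5 here
w = x - alpha * np.array([1.0, 0.0, 0.0])
H = householder(w)
```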

7 Applying Householder To derive the bidiagonal matrix, we apply successive Householder matrices on the left (to zero columns) and on the right (to zero rows):

8 Application cont'd From here we see: H1·A = A1, A1·K1 = A2, H2·A2 = A3, ..., An·Kn = B (if m > n, the last step is Hm·Am = B). This can be rewritten in terms of A: A = H1^T·A1 = H1^T·A2·K1^T = H1^T·H2^T·A3·K1^T = ... = H1^T···Hm^T·B·Kn^T···K1^T = H1···Hm·B·Kn···K1 = HBK (the last equality uses Hi^T = Hi and Ki^T = Ki, since Householder matrices are symmetric).

9 Householder Calculation Columns: Recall that we zero out the column below the (k,k) entry as follows (note that there are m rows): Let wk = c·(0, ..., 0, a_kk − s, a_k+1,k, ..., a_mk)^T (a column vector of size m), where s = −sign(a_kk)·sqrt(a_kk^2 + a_k+1,k^2 + ... + a_mk^2) and c is chosen so that ||wk||2 = 1. Note: the first k−1 entries of wk are zero. Thus Hk = I − 2·wk·wk^T, which has the block form diag(I_k−1, H~), where I_k−1 is a (k−1)x(k−1) identity matrix.

10 Householder Calculation Rows: To zero out the row past the (k,k+1) entry: Let wk = c·(0, ..., 0, a_k,k+1 − s, a_k,k+2, ..., a_kn) (a row vector of size n), where s = −sign(a_k,k+1)·sqrt(a_k,k+1^2 + ... + a_kn^2) and c is chosen so that ||wk||2 = 1. Note: the first k entries of wk are zero. Thus Kk = I − 2·wk^T·wk, which has the block form diag(I_k, K~), where I_k is a kxk identity matrix.
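
Putting slides 6-10 together, the bidiagonalization step can be sketched as follows (a minimal illustration, not the blocked form used in production libraries; the example matrix is made up):

```python
import numpy as np

def house(x):
    """Householder matrix sending vector x to (-sign(x1)*||x||, 0, ..., 0)."""
    n = len(x)
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return np.eye(n)                     # nothing to zero
    v = x.astype(float).copy()
    v[0] += norm if v[0] >= 0 else -norm     # v = x + sign(x1)*||x||*e1
    return np.eye(n) - 2.0 * np.outer(v, v) / (v @ v)

def bidiagonalize(A):
    """Return orthogonal H, bidiagonal B, orthogonal K with A = H @ B @ K."""
    B = A.astype(float).copy()
    m, n = B.shape
    H, K = np.eye(m), np.eye(n)
    for k in range(n):
        if k < m - 1:                        # zero the column below (k, k)
            Hk = np.eye(m)
            Hk[k:, k:] = house(B[k:, k])
            B, H = Hk @ B, H @ Hk            # Hk is its own inverse
        if k < n - 2:                        # zero the row past (k, k+1)
            Kk = np.eye(n)
            Kk[k+1:, k+1:] = house(B[k, k+1:])
            B, K = B @ Kk, Kk @ K
    return H, B, K

A = np.array([[4.0, 3.0, 0.0],
              [2.0, 1.0, 2.0],
              [4.0, 4.0, 0.0],
              [1.0, 0.0, 1.0]])
H, B, K = bidiagonalize(A)
```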

11 Example To derive H1 for the given matrix A, we compute s, then w, and finally H1 = I − 2·w·w^T. (The worked matrices on slides 11-14 appeared as figures and are not reproduced in this transcript.)

12 Example cont'd Then we apply H1 to A, and compute K1 to zero the first row past the (1,2) entry.

13 Example cont'd Applying K1, we can start to see the bidiagonal form.

14 Example cont'd If we carry out this process one more time, we obtain the bidiagonal matrix.

15 The QR Algorithm As seen, the initial matrix is placed into bidiagonal form, which results in the following decomposition: A = HBK, with H = H1···Hm and K = Kn···K1. The next step takes B and converts it to the final diagonal form using successive rotation transformations (rotations are used since reflections would disrupt the bidiagonal form already obtained).

16 Givens Rotations A Givens rotation rotates the plane spanned by two coordinate axes and can be used to zero elements, similar to the Householder reflection. It is represented by a matrix G(i,j) that equals the identity except for four entries: g_ii = c, g_jj = c, g_ij = s, g_ji = −s, where c = cos(θ) and s = sin(θ) for some angle θ. Note: the multiplication G·A affects only rows i and j of A. Likewise, the multiplication A·G^T affects only columns i and j.

17 Givens Rotation The zeroing of an element is performed by computing c and s in the following system: [ c s ; −s c ] · [ a ; b ] = [ r ; 0 ], where b is the element being zeroed and a is the entry next to b in the preceding column/row. Solving gives r = sqrt(a^2 + b^2), c = a/r, s = b/r. This results in b being rotated into a (which becomes r), leaving a zero in b's position.
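
A sketch of this computation, using the c = a/r, s = b/r choice above:

```python
import numpy as np

def givens(a, b):
    """Return (c, s) so that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)          # sqrt(a^2 + b^2) without overflow
    if r == 0.0:
        return 1.0, 0.0         # b is already zero; use the identity
    return a / r, b / r

c, s = givens(3.0, 4.0)
G = np.array([[c, s],
              [-s, c]])
rotated = G @ np.array([3.0, 4.0])   # b rotated into a
```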

18 Givens Example In our previous example, we used Householder transformations to get a bidiagonal matrix. We can now use rotation matrices to zero out the off-diagonal terms. In Matlab, the complete decomposition is: [U,D,V] = svd(A,0)

19 SVD Applications Calculation of the inverse of A: since A = U·D·V^T with U and V orthogonal, for an mxn matrix A we define the (pseudo)inverse to be A+ = V·D^-1·U^T. To see why, solve Ax = b: [1] Given: U·D·V^T·x = b. [2] Multiply by U^T: D·V^T·x = U^T·b. [3] Multiply by D^-1 (*): V^T·x = D^-1·U^T·b. [4] Multiply by V: x = V·D^-1·U^T·b. (*) D^-1 exists only for nonzero singular values; zero singular values are left as zero in the pseudoinverse. Rearranging: x = A+·b.
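
The pseudoinverse V·D^-1·U^T can be formed directly from the economy SVD; a sketch, checked against NumPy's built-in pinv (the matrix is made up and has full column rank, so all singular values are nonzero):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])        # 3x2, rank 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T   # A+ = V * D^-1 * U^T
# For a full-column-rank A, A+ is a left inverse: A+ @ A = I
```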

20 SVD Applications cont'd SVD can tell us how close a square matrix A is to being singular. The ratio of the largest singular value to the smallest singular value is the condition number, c = σ_max / σ_min. A is singular if c is infinite; A is ill-conditioned if c is too large (how large is machine-dependent).
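
A sketch of the condition-number computation (the nearly singular matrix is made up; NumPy's cond uses the same σ_max/σ_min ratio by default):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])    # nearly singular: det = 1e-4

s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
c = s[0] / s[-1]                         # condition number sigma_max / sigma_min
```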

21 SVD Applications cont'd Data fitting: the pseudoinverse gives a numerically stable solution x = V·D^-1·U^T·b to the least-squares problem of minimizing ||Ax − b||.
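
The slide gives no details, but a standard instance is least-squares line fitting via the pseudoinverse; a sketch with made-up data points lying exactly on y = 2x + 1:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                        # noiseless data for illustration

A = np.column_stack([x, np.ones_like(x)])  # design matrix for y = m*x + q
U, s, Vt = np.linalg.svd(A, full_matrices=False)
slope, intercept = Vt.T @ np.diag(1.0 / s) @ U.T @ y   # x = V D^-1 U^T b
```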

22 SVD Applications cont'd Image processing: [U,W,V] = svd(A), NewImg = U(:,1)*W(1,1)*V(:,1)' — keeping only the largest singular value gives a rank-1 approximation of the image, which compresses it.
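
A NumPy sketch of the same rank-1 reconstruction, using a small synthetic matrix in place of real image data:

```python
import numpy as np

# Synthetic "image": a rank-1 matrix plus a small perturbation
img = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]) + 0.01 * np.eye(3)

U, w, Vt = np.linalg.svd(img)
# NewImg = U(:,1) * W(1,1) * V(:,1)' in Matlab notation:
new_img = w[0] * np.outer(U[:, 0], Vt[0, :])
```

By the Eckart-Young theorem, this truncation is the best rank-1 approximation in the 2-norm, with error equal to the second singular value.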

23 SVD Applications cont'd Digital Signal Processing (DSP): SVD is used as a method for noise reduction. Let a matrix A represent the noisy signal: – compute the SVD, – then discard the small singular values of A. It can be shown that the small singular values mainly represent the noise, and thus the rank-k matrix Ak represents a filtered signal with less noise.
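
The recipe above can be sketched as follows (the rank-1 "signal" and the noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = np.outer(rng.standard_normal(20), rng.standard_normal(15))  # rank 1
noisy = signal + 0.01 * rng.standard_normal(signal.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
k = 1                                   # keep only the k largest singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# A_k is closer to the clean signal than the noisy matrix was
```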

