Vector Norms and the related Matrix Norms

Properties of a vector norm $\|x\|$:
1. $\|x\| > 0$ for $x \neq 0$, and $\|x\| = 0$ only for $x = 0$;
2. $\|c\,x\| = |c|\,\|x\|$ for every scalar $c$;
3. $\|x + y\| \le \|x\| + \|y\|$ (triangle inequality).

Euclidean vector norm: $\|x\| = \sqrt{x^H x} = \left(\sum_{i=1}^{n} |x_i|^2\right)^{1/2}$.

Riemannian metric: $\|x\| = \sqrt{x^H M x}$, where $M$ is a positive-definite Hermitian matrix.
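As a concrete illustration, here is a minimal numerical sketch of both norms; the matrix $M$ below is my own example of a positive-definite Hermitian matrix, not one from the slides.

```python
import numpy as np

# Minimal sketch: Euclidean norm sqrt(x^H x) and a Riemannian-metric
# norm sqrt(x^H M x) with an illustrative positive-definite M.
x = np.array([3.0, 4.0])
euclidean = np.sqrt(x @ x)            # = np.linalg.norm(x) = 5.0

M = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # Hermitian (symmetric), positive definite
riemannian = np.sqrt(x @ M @ x)       # sqrt(x^H M x)
print(euclidean, riemannian)
```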

If $\|x\|$ is a vector norm on the $n$-dimensional vector space, and if $A$ is an $n \times n$ matrix, we define the related matrix norm as
$$\|A\| = \max_{x \neq 0} \frac{\|Ax\|}{\|x\|}.$$
Properties of the related matrix norm:
1. $\|A\| \ge 0$, with $\|A\| = 0$ only for $A = 0$;
2. $\|c\,A\| = |c|\,\|A\|$ for every scalar $c$;
3. $\|A + B\| \le \|A\| + \|B\|$;
4. $\|AB\| \le \|A\|\,\|B\|$;
5. $\|Ax\| \le \|A\|\,\|x\|$ for every vector $x$.
Any two such norms are equivalent: $m\,\|A\|' \le \|A\| \le M\,\|A\|'$ for some positive constants $m, M$ which are independent of $A$.
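A small sketch of the definition, assuming the Euclidean vector norm: it estimates $\max_{x \neq 0} \|Ax\|/\|x\|$ by sampling random unit vectors and compares the estimate with the exact induced 2-norm, which equals the largest singular value.

```python
import numpy as np

# Estimate the induced matrix norm ||A|| = max ||Ax|| / ||x|| by sampling
# random unit vectors, and compare with the exact induced 2-norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

samples = rng.standard_normal((4, 100_000))
samples /= np.linalg.norm(samples, axis=0)            # unit vectors in R^4
estimate = np.linalg.norm(A @ samples, axis=0).max()  # max ||Ax|| over samples

exact = np.linalg.norm(A, 2)    # induced 2-norm = largest singular value
print(estimate, exact)          # estimate <= exact, and close for many samples
```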

The Condition Number of a Matrix If $A$ is a nonsingular square matrix, we define the condition number
$$K(A) = \|A\|\,\|A^{-1}\|.$$
Interpretation: Let the unit sphere $\|x\| = 1$ be mapped by the transformation $y = Ax$ into some surface $S$. The condition number is the ratio of the largest to the smallest distances from the origin to points on $S$. Thus,
$$K(A) \ge \left|\frac{\lambda_1}{\lambda_n}\right|,$$
where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $A$ arranged so that $|\lambda_1| \ge |\lambda_2| \ge \dots \ge |\lambda_n|$. This follows from setting $x$ equal to eigenvectors belonging to $\lambda_1$ and $\lambda_n$, respectively.
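A quick numerical check of the inequality $K(A) \ge |\lambda_1/\lambda_n|$ in the 2-norm; the random test matrix is illustrative.

```python
import numpy as np

# Illustrative check that the condition number dominates the ratio of the
# largest to the smallest eigenvalue modulus (2-norm, random test matrix).
rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))

moduli = np.sort(np.abs(np.linalg.eigvals(A)))[::-1]   # |lambda_1| >= ... >= |lambda_n|
print(np.linalg.cond(A, 2), moduli[0] / moduli[-1])    # K(A) >= |lambda_1|/|lambda_n|
```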

By the previous definition, the largest distance from the origin to $S$ is
$$\max_{\|x\|=1} \|Ax\| = \|A\|.$$
But what is the minimum of $\|Ax\|$ over $\|x\| = 1$? Setting $y = Ax$, we have
$$\min_{\|x\|=1} \|Ax\| = \min_{y \neq 0} \frac{\|y\|}{\|A^{-1}y\|} = \frac{1}{\max_{y \neq 0} \|A^{-1}y\|/\|y\|} = \frac{1}{\|A^{-1}\|}.$$
Therefore the ratio of the largest to the smallest distance is
$$\frac{\|A\|}{1/\|A^{-1}\|} = \|A\|\,\|A^{-1}\|.$$
So the geometric interpretation agrees with the definition $K(A) = \|A\|\,\|A^{-1}\|$.
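The identity $\min_{\|x\|=1} \|Ax\| = 1/\|A^{-1}\|$ can also be checked numerically; this sketch samples random unit vectors in the 2-norm.

```python
import numpy as np

# Sketch of the identity min ||Ax|| over unit x = 1 / ||A^{-1}||,
# estimated over many random unit vectors (2-norm).
rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))

X = rng.standard_normal((3, 200_000))
X /= np.linalg.norm(X, axis=0)                      # unit vectors
min_est = np.linalg.norm(A @ X, axis=0).min()       # approximates min ||Ax||

exact = 1.0 / np.linalg.norm(np.linalg.inv(A), 2)   # = smallest singular value
print(min_est, exact)                               # min_est >= exact, and close
```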

Application of Condition Numbers Suppose that we are solving $Ax = b$, where the data $A$ and $b$ are not known exactly. What is the effect of errors $\delta A$ and $\delta b$ on the solution? Let
$$(A + \delta A)(x + \delta x) = b + \delta b.$$
Assume that $A$ and $A + \delta A$ are nonsingular, and that $\|A^{-1}\|\,\|\delta A\| < 1$. Define the error ratios:
$$\varepsilon_A = \frac{\|\delta A\|}{\|A\|}, \qquad \varepsilon_b = \frac{\|\delta b\|}{\|b\|}, \qquad \varepsilon_x = \frac{\|\delta x\|}{\|x\|}.$$

We try to estimate $\varepsilon_x$ as a function of $\varepsilon_A$ and $\varepsilon_b$. But subtracting $Ax = b$ from the perturbed system gives
$$A\,\delta x = \delta b - \delta A\,(x + \delta x),$$
whereas $\|b\| \le \|A\|\,\|x\|$ gives $\|x\| \ge \|b\|/\|A\|$. Therefore,
$$\|\delta x\| \le \|A^{-1}\|\,\|\delta b\| + \|A^{-1}\|\,\|\delta A\|\,\bigl(\|x\| + \|\delta x\|\bigr).$$
Multiplying by $\|A\|/\|A\|$ and division by $\|x\|$ yield
$$\varepsilon_x \le K(A)\,\varepsilon_b + K(A)\,\varepsilon_A\,(1 + \varepsilon_x).$$
Hence
$$\bigl(1 - K(A)\,\varepsilon_A\bigr)\,\varepsilon_x \le K(A)\,(\varepsilon_A + \varepsilon_b),$$
then
$$\varepsilon_x \le \frac{K(A)\,(\varepsilon_A + \varepsilon_b)}{1 - K(A)\,\varepsilon_A}.$$

Assuming $K(A)\,\varepsilon_A \ll 1$, we find
$$\varepsilon_x \lesssim K(A)\,(\varepsilon_A + \varepsilon_b),$$
or, to first order, $\varepsilon_x \le K(A)\,(\varepsilon_A + \varepsilon_b)$. If $K(A)$ is small (close to 1), then small relative errors in the data produce small relative errors in the solution. If $K(A)$ is large, then even tiny data errors may produce a large relative error in $x$: the system is ill-conditioned.
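A sketch that exercises the bound on a deliberately ill-conditioned $2 \times 2$ system; the matrix, perturbation sizes, and variable names are my own illustrative choices.

```python
import numpy as np

# Verify e_x <= K(A) (e_A + e_b) / (1 - K(A) e_A) on an ill-conditioned system.
rng = np.random.default_rng(1)
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])           # K(A) ~ 4e4
b = np.array([2.0, 2.0001])
x = np.linalg.solve(A, b)

dA = 1e-8 * rng.standard_normal(A.shape)
db = 1e-8 * rng.standard_normal(b.shape)
x_pert = np.linalg.solve(A + dA, b + db)

cond = np.linalg.cond(A, 2)
e_A = np.linalg.norm(dA, 2) / np.linalg.norm(A, 2)
e_b = np.linalg.norm(db) / np.linalg.norm(b)
e_x = np.linalg.norm(x_pert - x) / np.linalg.norm(x)

bound = cond * (e_A + e_b) / (1 - cond * e_A)
print(e_x, bound)   # e_x stays below the bound, but can approach it
```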

Perturbations of the spectrum Let $A$ be an $n \times n$ matrix with eigenvalues $\lambda_1, \dots, \lambda_n$ and with corresponding eigenvectors $u_1, \dots, u_n$. A small change $\delta A$ in the matrix produces changes $\delta\lambda_j$ in the eigenvalues and changes $\delta u_j$ in the eigenvectors. If $\lambda_1, \dots, \lambda_n$ are distinct, then $u_1, \dots, u_n$ are linearly independent and are unique, except for nonzero scalar multiples. We have $A u_j = \lambda_j u_j$ and
$$(A + \delta A)(u_j + \delta u_j) = (\lambda_j + \delta\lambda_j)(u_j + \delta u_j). \tag{1}$$
In this equation we consider $\delta A$ as given and
$$\delta\lambda_j,\ \delta u_j \qquad (j = 1, \dots, n) \tag{2}$$
as the unknowns. If $\delta A = 0$, then $\delta\lambda_j = 0$; but the perturbation equation (1) is satisfied by any $\delta u_j$ which is a multiple of $u_j$. To ensure $\delta u_j \to 0$ as $\delta A \to 0$, we shall normalize the perturbed eigenvector by the assumption that,

in the expansion of $u_j + \delta u_j$ in the basis $u_1, \dots, u_n$, the coefficient of $u_j$ remains equal to 1 when $A$ is replaced by $A + \delta A$. In other words, we shall require the expansions
$$\delta u_j = \sum_{k \neq j} \delta c_{jk}\, u_k. \tag{3}$$
The unknowns are now $\delta\lambda_j$ and the coefficients $\delta c_{jk}$ for $k \neq j$. If the components of the matrix $\delta A$ are very small, eqn. (1) becomes, to the first order,
$$A\,\delta u_j + \delta A\, u_j = \lambda_j\,\delta u_j + \delta\lambda_j\, u_j,$$
where the neglected terms $\delta A\,\delta u_j$, $\delta\lambda_j\,\delta u_j$ are of second order. Since $A u_j = \lambda_j u_j$,
$$(A - \lambda_j I)\,\delta u_j = \delta\lambda_j\, u_j - \delta A\, u_j. \tag{4}$$
To compute the unknowns we will use the "principle of biorthogonality": Let $u_1, \dots, u_n$ be eigenvectors corresponding to the eigenvalues $\lambda_1, \dots, \lambda_n$ of an $n \times n$ matrix $A$; assume the $\lambda_j$ are distinct. Let $v_1, \dots, v_n$ be eigenvectors corresponding to the eigenvalues $\bar\lambda_1, \dots, \bar\lambda_n$ of $A^H$ (the Hermitian transpose of $A$). Then
$$(u_j, v_k) = 0 \ \text{ for } j \neq k, \qquad \text{and} \qquad (u_j, v_j) \neq 0.$$
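A numerical illustration of the principle, assuming the inner product $(x, y) = y^H x$; the pairing of $v_j$ with $u_j$ uses the fact that the eigenvalues of $A^H$ are the conjugates of those of $A$.

```python
import numpy as np

# Biorthogonality check: for distinct eigenvalues, (u_j, v_k) = v_k^H u_j
# vanishes for j != k and is nonzero for j == k.
rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))

lam, U = np.linalg.eig(A)            # u_j = columns of U
mu, V = np.linalg.eig(A.conj().T)    # v_k = columns of V, mu = conj(lam) as a set
order = [int(np.argmin(np.abs(mu - l.conj()))) for l in lam]
V = V[:, order]                      # pair v_j with u_j

G = V.conj().T @ U                   # G[k, j] = v_k^H u_j = (u_j, v_k)
print(np.round(np.abs(G), 8))        # (numerically) diagonal, nonzero diagonal
```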

To solve eqn. (4) for $\delta\lambda_j$ and $\delta c_{jk}$, we will use the eigenvectors $v_k$ of $A^H$. By the normalization (3), the perturbation $\delta u_j$ is a combination of $u_k$ for $k \neq j$. Therefore, $(\delta u_j, v_j) = 0$. Now (4) yields
$$\bigl((A - \lambda_j I)\,\delta u_j,\ v_j\bigr) = \delta\lambda_j\,(u_j, v_j) - (\delta A\, u_j,\ v_j).$$
But
$$\bigl((A - \lambda_j I)\,\delta u_j,\ v_j\bigr) = \bigl(\delta u_j,\ (A^H - \bar\lambda_j I)\, v_j\bigr) = 0.$$
Therefore, since $(u_j, v_j) \neq 0$,
$$\delta\lambda_j = \frac{(\delta A\, u_j,\ v_j)}{(u_j,\ v_j)}. \tag{5}$$
To find $\delta c_{jk}$, take the inner product of eqn. (4) with $v_k$, for $k \neq j$:
$$(\lambda_k - \lambda_j)\,(\delta u_j,\ v_k) = \delta\lambda_j\,(u_j, v_k) - (\delta A\, u_j,\ v_k) = -(\delta A\, u_j,\ v_k),$$
since $(u_j, v_k) = 0$. But the normalization (3) gives $(\delta u_j, v_k) = \delta c_{jk}\,(u_k, v_k)$, so
$$\delta c_{jk} = \frac{(\delta A\, u_j,\ v_k)}{(\lambda_j - \lambda_k)\,(u_k, v_k)}. \tag{6}$$
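As a sanity check of formula (5), this sketch compares the predicted first-order shifts $\delta\lambda_j$ with the exact eigenvalues of $A + \delta A$; the random test matrices are illustrative, and the inner product is $(x, y) = y^H x$.

```python
import numpy as np

# Formula (5): d(lambda_j) = (dA u_j, v_j) / (u_j, v_j), compared against
# the exact eigenvalues of the perturbed matrix A + dA.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
dA = 1e-6 * rng.standard_normal((3, 3))

lam, U = np.linalg.eig(A)
mu, V = np.linalg.eig(A.conj().T)
order = [int(np.argmin(np.abs(mu - l.conj()))) for l in lam]
V = V[:, order]                      # v_j paired with u_j

d_lam = np.array([(V[:, j].conj() @ dA @ U[:, j]) / (V[:, j].conj() @ U[:, j])
                  for j in range(3)])
print(np.sort_complex(lam + d_lam))
print(np.sort_complex(np.linalg.eigvals(A + dA)))   # agree to O(||dA||^2)
```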

Example: $A = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$, where the $\lambda_j$ are distinct. Let $\delta A = \varepsilon B$, where $B = (b_{kj})$ is a fixed matrix and $\varepsilon$ is a small parameter. In this case, we take $u_j = v_j = e_j$ (the standard basis vectors) for the eigenvectors of $A$ and $A^H$. Eqn. (5) gives
$$\delta\lambda_j = \varepsilon\,(B e_j, e_j) = \varepsilon\, b_{jj}.$$
Eqn. (6) gives
$$\delta c_{jk} = \frac{\varepsilon\,(B e_j, e_k)}{\lambda_j - \lambda_k} = \frac{\varepsilon\, b_{kj}}{\lambda_j - \lambda_k} \qquad (k \neq j).$$
Now eqn. (3) gives
$$\delta u_j = \varepsilon \sum_{k \neq j} \frac{b_{kj}}{\lambda_j - \lambda_k}\, e_k,$$
which is the vector whose $j$th component is 0 and whose $k$th component is $\varepsilon\, b_{kj}/(\lambda_j - \lambda_k)$.
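A numeric check of this example, with my own illustrative values for $\varepsilon$ and $B$: for diagonal $A$, eqn. (5) predicts $\delta\lambda_j = \varepsilon\, b_{jj}$.

```python
import numpy as np

# For diagonal A with distinct eigenvalues, the first-order shifts are
# d(lambda_j) = eps * b_jj; compare with the exact eigenvalues.
eps = 1e-5
A = np.diag([1.0, 2.0, 4.0])
B = np.array([[ 0.3,  0.7, -0.2],
              [ 0.5, -0.1,  0.4],
              [-0.6,  0.2,  0.8]])

lam_exact = np.sort(np.linalg.eigvals(A + eps * B).real)
lam_first_order = np.sort(np.diag(A) + eps * np.diag(B))
print(lam_exact)
print(lam_first_order)   # agree to O(eps^2)
```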