CPSC 491 Xin Liu November 19, 2010

Norm
A norm is a generalization of Euclidean length to a vector space.
Definition: a norm is a function ||·|| : R^n → R that satisfies, for all vectors x, y and scalars α,
(1) ||x|| ≥ 0, and ||x|| = 0 only if x = 0
(2) ||x + y|| ≤ ||x|| + ||y||  (triangle inequality)
(3) ||αx|| = |α| ||x||

l_p-Norms
Definition: ||x||_p = (Σ_i |x_i|^p)^(1/p), for 1 ≤ p < ∞
Examples:
l_1-norm: ||x||_1 = Σ_i |x_i|
l_2-norm (Euclidean length): ||x||_2 = (Σ_i |x_i|^2)^(1/2)
l_∞-norm: ||x||_∞ = max_i |x_i|
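As a quick illustration (not part of the original slides), a minimal NumPy sketch computing these three norms for a concrete vector:

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

# l_1-norm: sum of absolute values
l1 = np.sum(np.abs(x))            # 8.0
# l_2-norm: Euclidean length
l2 = np.sqrt(np.sum(x ** 2))      # ~5.099
# l_inf-norm: largest absolute entry
linf = np.max(np.abs(x))          # 4.0

# The same values via numpy.linalg.norm
assert np.isclose(l1, np.linalg.norm(x, 1))
assert np.isclose(l2, np.linalg.norm(x, 2))
assert np.isclose(linf, np.linalg.norm(x, np.inf))
```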

Matrix Norms
Definition: a matrix norm is a function ||·|| : R^(m×n) → R that satisfies the same three conditions as a vector norm (nonnegativity, the triangle inequality, and scaling).
Frobenius norm: ||A||_F = (Σ_i Σ_j |a_ij|^2)^(1/2) = (tr(A^T A))^(1/2)
Matrix norms induced by vector norms (see the next slide)
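A minimal NumPy sketch (again, not from the original slides) checking the entrywise Frobenius formula against numpy.linalg.norm:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Frobenius norm: square root of the sum of squared entries
fro_manual = np.sqrt(np.sum(A ** 2))
# Equivalent trace formulation: sqrt(tr(A^T A))
fro_trace = np.sqrt(np.trace(A.T @ A))

assert np.isclose(fro_manual, np.linalg.norm(A, 'fro'))
assert np.isclose(fro_trace, np.linalg.norm(A, 'fro'))
```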

Induced Matrix Norms
Definition: the matrix norm induced by a vector norm ||·|| is
||A|| = sup_{x ≠ 0} ||Ax|| / ||x||,
i.e., the largest "amplification factor" that A applies to any vector, measured in that vector norm.
Examples: the matrix norms of A induced by the l_1, l_2, and l_∞ vector norms.
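The supremum can be approximated numerically by trying many directions; a rough NumPy sketch for the induced 2-norm (an illustration added here, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))

# Crude estimate of sup ||Ax||_2 / ||x||_2 by sampling random directions
best = 0.0
for _ in range(20000):
    x = rng.standard_normal(5)
    best = max(best, np.linalg.norm(A @ x) / np.linalg.norm(x))

exact = np.linalg.norm(A, 2)   # induced 2-norm of A
print(best, exact)             # the estimate approaches, but never exceeds, the exact value
```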

Induced Matrix Norms
The three standard cases have closed forms:
||A||_1 = max_j Σ_i |a_ij|   (maximum absolute column sum)
||A||_∞ = max_i Σ_j |a_ij|   (maximum absolute row sum)
||A||_2 = σ_1, the largest singular value of A
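A short NumPy check of these closed forms (added here as an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

max_col_sum = np.max(np.sum(np.abs(A), axis=0))   # ||A||_1
max_row_sum = np.max(np.sum(np.abs(A), axis=1))   # ||A||_inf
sigma_1 = np.linalg.svd(A, compute_uv=False)[0]   # ||A||_2

assert np.isclose(max_col_sum, np.linalg.norm(A, 1))
assert np.isclose(max_row_sum, np.linalg.norm(A, np.inf))
assert np.isclose(sigma_1, np.linalg.norm(A, 2))
```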

Induced Matrix Norms
The 1-norm of a matrix A:
Write A in terms of its columns, A = [a_1 a_2 … a_n].
On the sphere ||x||_1 = 1 we have
||Ax||_1 = ||Σ_j x_j a_j||_1 ≤ Σ_j |x_j| ||a_j||_1 ≤ max_j ||a_j||_1.
When x = e_j, the standard basis vector selecting the column with the largest 1-norm, the upper bound is attained. Therefore
||A||_1 = max_{1 ≤ j ≤ n} ||a_j||_1,
the maximum absolute column sum.
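The attainment step is easy to see numerically; a small sketch (not part of the original slides):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))

# Index of the column with the largest 1-norm
j = np.argmax(np.sum(np.abs(A), axis=0))
e_j = np.zeros(3)
e_j[j] = 1.0

# ||A e_j||_1 equals the induced 1-norm, so the bound is attained at e_j
assert np.isclose(np.linalg.norm(A @ e_j, 1), np.linalg.norm(A, 1))
```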

Properties of Matrix Norms
For (1) norms induced by vector norms and (2) the Frobenius norm, we have the following conclusions.
Bound on ||AB||: ||AB|| ≤ ||A|| ||B||  (submultiplicativity)
Invariance under unitary multiplication: for the induced 2-norm and the Frobenius norm specifically, if Q is a unitary matrix of the appropriate size, then ||QA|| = ||AQ|| = ||A||.
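A NumPy sketch checking both properties on random matrices (an added illustration; the orthogonal Q below comes from a QR factorization):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal matrix

# Submultiplicativity, here for the induced 2-norm and the Frobenius norm
assert np.linalg.norm(A @ B, 2) <= np.linalg.norm(A, 2) * np.linalg.norm(B, 2) + 1e-12
assert np.linalg.norm(A @ B, 'fro') <= np.linalg.norm(A, 'fro') * np.linalg.norm(B, 'fro') + 1e-12

# Unitary (orthogonal) invariance of the 2-norm and Frobenius norm
assert np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2))
assert np.isclose(np.linalg.norm(A @ Q, 'fro'), np.linalg.norm(A, 'fro'))
```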

Singular Value Decomposition
Definition: a singular value decomposition (SVD) of an m×n matrix A is a factorization
A = U Σ V^T,
where U (m×m) and V (n×n) are unitary and Σ (m×n) is diagonal.
The diagonal entries σ_1 ≥ σ_2 ≥ … ≥ σ_min(m,n) ≥ 0 of Σ are nonnegative and in nonincreasing order.
Existence and uniqueness: every matrix has an SVD; the singular values are uniquely determined, and the singular vectors are unique up to sign when the singular values are distinct.
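A minimal NumPy example of computing a full SVD and reconstructing A (not from the original slides):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))

# Full SVD: U is 5x5, s holds the singular values, Vt is 3x3
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Singular values are nonnegative and in nonincreasing order
assert np.all(s >= 0) and np.all(np.diff(s) <= 0)

# Rebuild the 5x3 diagonal Sigma and check A = U Sigma V^T
Sigma = np.zeros((5, 3))
np.fill_diagonal(Sigma, s)
assert np.allclose(U @ Sigma @ Vt, A)
```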

Geometric Interpretation
The image AS of the unit sphere S under any m×n matrix A is a hyperellipse.
The singular values σ_1, σ_2, …, σ_n of A are the lengths of the principal semiaxes of AS.
The left singular vectors {u_1, u_2, …, u_n} of A are oriented in the directions of the principal semiaxes of AS.
The right singular vectors {v_1, v_2, …, v_n} of A are the pre-images of the principal semiaxes of AS, i.e., A v_j = σ_j u_j.
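The relation A v_j = σ_j u_j is easy to verify numerically; a short sketch (added here as an illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A)

# Rows of Vt are the right singular vectors v_j; check A v_j = sigma_j u_j
for j in range(3):
    assert np.allclose(A @ Vt[j], s[j] * U[:, j])
```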

Reduced SVD
If m ≥ n, a full-rank matrix A has rank n, so exactly n singular values are nonzero. We can reduce the full decomposition by removing the zero rows of Σ and the corresponding columns of U:
A (m×n) = Û (m×n) Σ̂ (n×n) V^T (n×n),
where Û consists of the first n columns of U and Σ̂ is the top n×n block of Σ.
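In NumPy the reduced (thin) SVD corresponds to full_matrices=False; a small sketch (not from the original slides):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 3))   # m = 6 >= n = 3

# Reduced SVD: U_hat is 6x3, s has 3 entries, Vt is 3x3
U_hat, s, Vt = np.linalg.svd(A, full_matrices=False)
assert U_hat.shape == (6, 3) and s.shape == (3,) and Vt.shape == (3, 3)

# The reduced factors still reproduce A exactly
assert np.allclose(U_hat @ np.diag(s) @ Vt, A)
```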