Outline: Basic Theories on Subspaces; Subspace Projection


Outline: Basic Theories on Subspaces; Subspace Projection

Subspace: A Subset of a Space That Is Itself a Space
Closure under addition and scalar multiplication; the zero element belongs to the set; …
Representation of a subspace: the set of all linear combinations of a given set of vectors,
W = span{u1, u2, …, um} = {a1u1 + a2u2 + … + amum}
Other representations?
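As a quick sketch of the span representation, membership of a vector x in W = span{u1, u2, …, um} can be tested numerically: x lies in W exactly when the least-squares residual of solving for the combination coefficients is zero. The vectors u1, u2 below are assumed example vectors, not from the slides.

```python
import numpy as np

# Assumed example spanning set for W = span{u1, u2} in R^3.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
U = np.column_stack([u1, u2])

def in_span(U, x, tol=1e-10):
    # x is in span(U) iff the least-squares residual of U a = x vanishes.
    a, *_ = np.linalg.lstsq(U, x, rcond=None)
    return np.linalg.norm(U @ a - x) < tol

print(in_span(U, 2 * u1 - 3 * u2))            # a linear combination -> True
print(in_span(U, np.array([1.0, 0.0, 0.0])))  # not in the plane     -> False
```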

Subspace: Remove Redundant Vectors from the Set
W = span{u1, u2, …, um} = {a1u1 + a2u2 + … + amum}
Vector removal criterion: if some vectors are linear combinations of the others, the remaining vectors still span the same vector space.
Hence a linearly independent subset of the vectors spans the same subspace.
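The removal criterion can be checked numerically: dropping a vector that is a linear combination of the others leaves the rank, and hence the span, unchanged. The vectors below are an assumed example.

```python
import numpy as np

# Assumed example: u3 = u1 + u2 is redundant in the spanning set.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
u3 = u1 + u2                       # linearly dependent on u1, u2

full = np.column_stack([u1, u2, u3])
reduced = np.column_stack([u1, u2])

# Equal ranks: both sets span the same 2-dimensional subspace.
print(np.linalg.matrix_rank(full), np.linalg.matrix_rank(reduced))  # 2 2
```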

Subspace Properties
Orthogonality of subspaces: Si ⊥ Sj — any two vectors taken from Si and Sj are orthogonal to each other. Example: the x-y plane and the z axis.
Orthogonal complement: the set of vectors orthogonal to S, S⊥ = {x | x^T y = 0 for all y in S}.
The orthogonal complement is also a subspace. How to prove it?
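One way to make the orthogonal complement concrete is to compute it: S⊥ is the null space of B^T, where the columns of B span S, and the null space can be read off from the SVD. The matrix B below encodes the slide's x-y-plane example.

```python
import numpy as np

# S = span of the columns of B: the x-y plane in R^3 (the slide's example).
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# S-perp is the null space of B^T: right singular vectors with zero singular value.
_, s, Vt = np.linalg.svd(B.T)
rank = int(np.sum(s > 1e-12))
S_perp = Vt[rank:].T               # basis for the orthogonal complement (the z axis)

# Every complement vector is orthogonal to every vector in S.
print(np.allclose(B.T @ S_perp, 0))  # True
```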

Subspace Properties
Dimension of a subspace and its orthogonal complement: dim(S) + dim(S⊥) = dim(V). How to prove this? …
Orthogonal subspaces and the EVD: Null(A − λI), the set of vectors u such that (A − λI)u = 0, is the eigenspace corresponding to the eigenvalue λ.
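Both facts on this slide can be verified numerically: the dimension identity via ranks, and the eigenspace as the null space of A − λI. The matrices B and A below are assumed examples, not from the slides.

```python
import numpy as np

# Dimension identity: dim(S) + dim(S-perp) = dim(V) = n, for an assumed B.
B = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])
n = B.shape[0]
dim_S = np.linalg.matrix_rank(B)            # dim(S) for S = span(B)
dim_S_perp = n - np.linalg.matrix_rank(B.T) # nullity of B^T = dim(S-perp)
print(dim_S + dim_S_perp == n)              # True

# Eigenspace as Null(A - lambda*I): for this assumed A, lambda = 2 is repeated.
A = np.diag([2.0, 2.0, 5.0])
lam = 2.0
_, s, Vt = np.linalg.svd(A - lam * np.eye(3))
eigenspace = Vt[int(np.sum(s > 1e-12)):]    # rows span Null(A - lam*I)
print(eigenspace.shape[0])                  # 2: a 2-dimensional eigenspace
```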

Subspace Projection
The subspace spanned by the column vectors of A: S = span(A) = {Au}
Subspace projection of a vector x: the vector in S = span(A) closest to x
Least-squares formulation: min_u ||x − Au||²
Least-squares solution (for A with full column rank): u = (A^H A)^{-1} A^H x
Projection matrix: P_S = A(A^H A)^{-1} A^H
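The least-squares solution and the projection matrix from the slide can be sketched directly, assuming A has full column rank so that A^H A is invertible. The matrix A and vector x below are assumed examples (S is the x-y plane).

```python
import numpy as np

# Assumed example: S = span(A) is the x-y plane in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([3.0, 4.0, 5.0])

# Least-squares coefficients u = (A^H A)^{-1} A^H x (A real here, so A^H = A^T).
u = np.linalg.solve(A.T @ A, A.T @ x)
# Projection matrix P_S = A (A^H A)^{-1} A^H.
P = A @ np.linalg.solve(A.T @ A, A.T)

print(A @ u)   # [3. 4. 0.] -- the vector in S closest to x
print(P @ x)   # the same projection, via the projection matrix
```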

Subspace Projection
Projection matrix: P_S = A(A^H A)^{-1} A^H
Projection² = Projection (idempotence): P_S P_S = P_S
Projection onto the orthogonal complement: I − P_S = I − A(A^H A)^{-1} A^H
Properties: (I − P_S)(I − P_S) = (I − P_S), and (I − P_S) P_S = 0
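The three projector identities on this slide are easy to confirm numerically; the full-column-rank matrix A below is an assumed example.

```python
import numpy as np

# Assumed example matrix with full column rank.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
P = A @ np.linalg.solve(A.T @ A, A.T)   # P_S = A (A^T A)^{-1} A^T
Q = np.eye(3) - P                       # projector onto the orthogonal complement

print(np.allclose(P @ P, P))   # P_S P_S = P_S
print(np.allclose(Q @ Q, Q))   # (I - P_S)(I - P_S) = I - P_S
print(np.allclose(Q @ P, 0))   # (I - P_S) P_S = 0
```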

Subspace Projection
Distances between two subspaces — more discussion: what does this stand for? The angle between the two subspaces? An angle along every dimension? …
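One standard answer to the slide's question is the set of principal angles: with orthonormal bases Q1, Q2 for the two subspaces, the singular values of Q1^T Q2 are the cosines of the angles, one per dimension. This is a sketch of that approach, not the lecture's own definition; the subspaces below are assumed examples.

```python
import numpy as np

def principal_angles(A, B):
    # Orthonormalize each basis, then read angles off the SVD of Q1^T Q2.
    Q1, _ = np.linalg.qr(A)
    Q2, _ = np.linalg.qr(B)
    s = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Assumed example: the x-y plane vs. a plane tilted 45 degrees about the x axis.
S1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
S2 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
angles = principal_angles(S1, S2)
print(np.degrees(angles))   # one shared direction (0 deg) and one at 45 deg
```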

Column and Row Space
Column space Col(A): the subspace spanned by all columns of A
Row space Row(A): the subspace spanned by all rows of A
Null space: Null(A) = {x | Ax = 0}
Properties of the column and row spaces:
Null(A) = (Row(A))⊥
Null(A^H) = (Col(A))⊥
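The property Null(A) = (Row(A))⊥ can be checked by computing a null-space basis from the SVD and verifying it is orthogonal to every row; the matrix A below is an assumed example.

```python
import numpy as np

# Assumed example matrix with a 1-dimensional null space.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

# Null space from the SVD: right singular vectors with zero singular value.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[int(np.sum(s > 1e-12)):].T   # columns span Null(A)

# Every row of A is orthogonal to every null-space vector: Null(A) = Row(A)-perp.
print(np.allclose(A @ null_basis, 0))        # True
```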

Basic Results
Rank(A) + dim[Null(A)] = n
Let S = span(A). Then Rank(A) = dim(S) and dim[Null(A)] = dim(S⊥), so dim(S) + dim(S⊥) = n ⟹ Rank(A) + dim[Null(A)] = n
For a full-rank matrix A with QR decomposition A = QR, the columns of A and Q satisfy Span{a1, a2, …, ak} = Span{q1, q2, …, qk}
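Both results can be illustrated numerically: rank-nullity by counting near-zero singular values, and the QR span property by checking that the leading k columns of A and Q together still have rank k. The matrices below are assumed examples.

```python
import numpy as np

# Rank-nullity on an assumed rank-deficient 3x3 matrix (n = 3 columns).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
n = A.shape[1]
rank = np.linalg.matrix_rank(A)                       # 2
s = np.linalg.svd(A, compute_uv=False)
nullity = int(np.sum(s < 1e-10))                      # 1, counted independently
print(rank + nullity == n)                            # True

# QR span property on an assumed full-column-rank matrix B = QR.
B = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q, R = np.linalg.qr(B)
for k in (1, 2):
    # If the leading k columns of B and Q span the same space, stacking them
    # side by side cannot raise the rank above k.
    stacked = np.column_stack([B[:, :k], Q[:, :k]])
    print(np.linalg.matrix_rank(stacked) == k)        # True
```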