Lecture 11: Fundamental Theorems of Linear Algebra, Orthogonality and Projection (Shang-Hua Teng)

The Whole Picture
– Rank(A) = m = n: Ax = b has a unique solution
– Rank(A) = m < n: Ax = b has an (n - m)-dimensional family of solutions
– Rank(A) = n < m: Ax = b has 0 or 1 solution
– Rank(A) < n and Rank(A) < m: Ax = b has either no solution or an (n - rank(A))-dimensional family of solutions
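A minimal NumPy sketch of these rank cases (the matrix A and vector b below are illustrative choices of mine, not from the lecture):

```python
import numpy as np

# rank(A) = m = 2 < n = 3: a consistent system Ax = b has an
# (n - rank)-dimensional family of solutions, here 1-dimensional.
A = np.array([[1., 0., 2.],
              [0., 1., 3.]])
b = np.array([4., 5.])

print(np.linalg.matrix_rank(A))            # 2

# lstsq returns one particular solution; adding any nullspace vector
# to it gives another solution.
x_particular, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x_particular, b))    # True
```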

Basis and Dimension of a Vector Space
A basis for a vector space is a sequence of vectors such that
– The vectors are linearly independent
– The vectors span the space: every vector in the space can be expressed as a linear combination of these vectors

Basis for 2D and n-D
– (1, 0), (0, 1) is a basis for R^2; so is (1, 1), (-1, -2)
– The vectors v_1, v_2, …, v_n are a basis for R^n if and only if they are the columns of an n by n invertible matrix
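A quick check of this criterion in NumPy (the example vectors are the ones on the slide; the test itself is a sketch of mine):

```python
import numpy as np

# Columns (1, 1) and (-1, -2): they form a basis for R^2 exactly when
# the 2 by 2 matrix with these columns is invertible.
V = np.array([[1., -1.],
              [1., -2.]])

print(np.linalg.matrix_rank(V) == 2)   # True: the columns are a basis
print(np.linalg.det(V))                # -1.0, nonzero, so V is invertible

# Every vector in R^2 is then a unique combination of the columns.
b = np.array([3., 7.])
coeffs = np.linalg.solve(V, b)
print(np.allclose(V @ coeffs, b))      # True
```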

Column and Row Subspace
C(A): the space spanned by the columns of A
– A subspace of R^m
– The pivot columns of A are a basis for the column space
Row space: the space spanned by the rows of A
– A subspace of R^n
– The row space of A equals the column space of A^T, written C(A^T)
– The pivot rows of A are a basis for the row space
– The pivot rows of its echelon form R are a basis for the row space
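One way to find these bases in practice is row reduction; a small sketch using SymPy's rref (the matrix A is an arbitrary example of mine):

```python
import numpy as np
import sympy as sp

A = np.array([[1, 2, 0],
              [2, 4, 1],
              [3, 6, 1]])

# rref() returns the reduced row echelon form R and the pivot column
# indices. The corresponding columns of A are a basis for C(A); the
# nonzero rows of R are a basis for the row space C(A^T).
R, pivot_cols = sp.Matrix(A).rref()
print(pivot_cols)     # (0, 2): columns 0 and 2 of A span the column space
sp.pprint(R)          # echelon form; its pivot rows span the row space
```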

Important Property I: Uniqueness of Combination
If the vectors v_1, v_2, …, v_n are a basis for a vector space V, then for every vector v in V there is a unique way to write v as a combination of v_1, v_2, …, v_n.
– Suppose v = a_1 v_1 + a_2 v_2 + … + a_n v_n and v = b_1 v_1 + b_2 v_2 + … + b_n v_n
– Subtracting: 0 = (a_1 - b_1) v_1 + (a_2 - b_2) v_2 + … + (a_n - b_n) v_n
– Since the v_i are linearly independent, each a_i - b_i = 0, so a_i = b_i

Important Property II: Dimension and Size of Basis
If a vector space V has two bases
– v_1, v_2, …, v_m, with V = [v_1, v_2, …, v_m]
– w_1, w_2, …, w_n, with W = [w_1, w_2, …, w_n]
then m = n.
– Proof: assume n > m and write W = VA
– A is m by n with n > m, so Ax = 0 has a non-zero solution x
– Then Wx = VAx = 0, so the w_i are linearly dependent, a contradiction
The dimension of a vector space is the number of vectors in any basis
– By the above, the dimension of a vector space is well defined

Dimensions of the Four Subspaces (Fundamental Theorem of Linear Algebra, Part I)
– Row space C(A^T): dimension = rank(A)
– Column space C(A): dimension = rank(A)
– Nullspace N(A): dimension = n - rank(A)
– Left nullspace N(A^T): dimension = m - rank(A)
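These dimension counts can be verified numerically; a sketch using SciPy's null_space on an example matrix of my own:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 0., 1.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])   # m = 3, n = 4

r = np.linalg.matrix_rank(A)
print("rank(A) =", r)                             # 2
print("dim N(A)   =", null_space(A).shape[1])     # n - r = 2
print("dim N(A^T) =", null_space(A.T).shape[1])   # m - r = 1
```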

Orthogonality and Orthogonal Subspaces
– Two vectors v and w are orthogonal if v^T w = 0
– Two vector subspaces V and W are orthogonal if v^T w = 0 for every v in V and every w in W
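A tiny sketch of the definition (the vectors are my own example):

```python
import numpy as np

v = np.array([1., 2., -1.])
w = np.array([3., 0.,  3.])
print(np.dot(v, w))   # 0.0, so v and w are orthogonal
```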

Example: Orthogonal Subspaces in 5 Dimensions
(Figure on the slide: two orthogonal subspaces of R^5.)
– The sum of these two subspaces is all of R^5

Orthogonal Complement
– Suppose V is a vector subspace of a vector space W
– The orthogonal complement of V is the set of all vectors in W that are orthogonal to every vector in V
– The orthogonal complement is itself a vector subspace

Dimensions of the Four Subspaces (Fundamental Theorem of Linear Algebra, Part I)
– Row space C(A^T): dimension = rank(A)
– Column space C(A): dimension = rank(A)
– Nullspace N(A): dimension = n - rank(A)
– Left nullspace N(A^T): dimension = m - rank(A)

Orthogonality of the Four Subspaces (Fundamental Theorem of Linear Algebra, Part II)
– The nullspace N(A) is the orthogonal complement of the row space C(A^T) in R^n
– The left nullspace N(A^T) is the orthogonal complement of the column space C(A) in R^m
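Both complement statements can be checked numerically; a sketch with SciPy, reusing an example matrix of mine:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 0., 1.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])

N  = null_space(A)      # columns: orthonormal basis of N(A) in R^n
LN = null_space(A.T)    # columns: orthonormal basis of N(A^T) in R^m

# Every row of A (hence the whole row space) is orthogonal to N(A) ...
print(np.allclose(A @ N, 0))      # True
# ... and every column of A (hence C(A)) is orthogonal to N(A^T).
print(np.allclose(A.T @ LN, 0))   # True
```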

Proof
The nullspace is the orthogonal complement of the row space in R^n:
– If Ax = 0, then every row of A has dot product 0 with x, so x is orthogonal to every combination of the rows, that is, to the whole row space
– Conversely, if x is orthogonal to every row of A, then Ax = 0, so x is in the nullspace

The Whole Picture
(Figure: the four fundamental subspaces. In R^n: the row space C(A^T), dimension r, and the nullspace N(A), dimension n - r. In R^m: the column space C(A), dimension r, and the left nullspace N(A^T), dimension m - r. A vector x splits as x = x_r + x_n with A x_n = 0 and A x_r = A x = b.)
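A small sketch of the splitting x = x_r + x_n (again with an example matrix of mine; x_n is obtained by projecting x onto the nullspace):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 0., 1.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])
x = np.array([1., 1., 1., 1.])

N   = null_space(A)        # orthonormal basis of N(A)
x_n = N @ (N.T @ x)        # nullspace component of x
x_r = x - x_n              # row-space component of x

print(np.allclose(A @ x_n, 0))        # True: A x_n = 0
print(np.allclose(A @ x_r, A @ x))    # True: A x = A x_r = b
```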

Uniqueness of the Typical Solution
– Every vector b in the column space comes from one and only one vector x_r in the row space
– Proof: suppose x_r and y_r are both in the row space with A x_r = A y_r = b; then A x_r - A y_r = A(x_r - y_r) = 0
– So x_r - y_r is in both the row space and the nullspace, hence it must be 0
– This one-to-one matching reflects the equal dimension r of the row space and the column space

Deep Secret of Linear Algebra: the Pseudo-inverse
– Throwing away the two nullspaces, there is an r by r invertible matrix hiding inside A
– In some sense, from the row space to the column space, A is invertible
– It maps an r-dimensional subspace of R^n (the row space) onto an r-dimensional subspace of R^m (the column space)
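NumPy's pinv computes the Moore-Penrose pseudo-inverse, which realizes this idea; a sketch (example matrix mine) showing that for b in the column space, A_pinv @ b is the row-space solution:

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])
A_pinv = np.linalg.pinv(A)

b = A @ np.array([1., 0., 1., 0.])   # constructed to lie in C(A)
x = A_pinv @ b                       # lies in the row space C(A^T)
print(np.allclose(A @ x, b))         # True: A inverts cleanly between
                                     # the row space and the column space
```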

Invertible Matrices
– Any n linearly independent vectors in R^n must span R^n, so they are a basis
– Then Ax = b is always uniquely solvable
– A is invertible

Projection
Projection onto an axis: the x-axis is a vector subspace, and the projection of the point (a, b) onto it is (a, 0)

Projection onto an Arbitrary Line Passing through 0
(Figure on the slide: projecting the point (a, b) onto a line through the origin.)
For a vector b and a line through 0 in the direction of a vector u, the projection is p = (u^T b / u^T u) u, and the error b - p is perpendicular to the line.
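A sketch of this line-projection formula in NumPy (the helper name and sample numbers are mine):

```python
import numpy as np

def project_onto_line(u, b):
    """Projection of b onto the line through 0 in direction u: (u.b / u.u) u."""
    return (np.dot(u, b) / np.dot(u, u)) * u

u = np.array([3., 4.])    # direction of the line
b = np.array([2., 5.])    # vector to project
p = project_onto_line(u, b)
print(p)                                   # [3.12 4.16]
print(np.isclose(np.dot(u, b - p), 0))     # True: the error is
                                           # perpendicular to the line
```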

Projection onto a Plane

Projection onto a Subspace
Input:
1. A vector subspace V in R^m
2. A vector b in R^m
Desired output:
– The vector x in V that is closest to b
– Equivalently, the projection x of b onto V
– Equivalently, the vector x in V such that (b - x) is orthogonal to V

How to Describe a Vector Subspace V in R^m
– If dim(V) = n, then V has n basis vectors a_1, a_2, …, a_n
– They are independent
– V = C(A) where A = [a_1, a_2, …, a_n]

Projection onto a Subspace
Input:
1. n independent vectors a_1, a_2, …, a_n in R^m
2. A vector b in R^m
Desired output:
– The vector x in C([a_1, a_2, …, a_n]) that is closest to b
– Equivalently, the projection x of b onto C([a_1, a_2, …, a_n])
– Equivalently, the vector x in C([a_1, a_2, …, a_n]) such that (b - x) is orthogonal to C([a_1, a_2, …, a_n])
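One standard way to compute this projection is via the normal equations A^T A x_hat = A^T b; a sketch with example numbers of my own:

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])     # independent columns a_1, a_2 spanning V = C(A)
b = np.array([6., 0., 0.])

# Solve the normal equations, then form the projection p = A x_hat.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat

print(p)                                # [ 5.  2. -1.]: projection of b onto C(A)
print(np.allclose(A.T @ (b - p), 0))    # True: b - p is orthogonal to C(A)
```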

Think about this Picture
(Figure: the same four-subspaces picture as before. In R^n: the row space C(A^T), dimension r, and the nullspace N(A), dimension n - r. In R^m: the column space C(A), dimension r, and the left nullspace N(A^T), dimension m - r; x = x_r + x_n, A x_n = 0, A x_r = A x = b.)