Lecture 10 Dimensions, Independence, Basis and Complete Solution of Linear Systems Shang-Hua Teng

Linear Independence and Linear Combination. A set of vectors v1, v2, …, vn is linearly independent if and only if none of them can be expressed as a linear combination of the others.

Examples

Linear Independence and Null Space. Theorem/Definition: v1, v2, …, vn are linearly independent if and only if a1v1 + a2v2 + … + anvn = 0 only happens when all the a's are zero. The columns of a matrix A are linearly independent when the only solution to Ax = 0 is x = 0.

2D and 3D (figure: example vectors u, v, w)

How do we determine whether a set of vectors is independent? Make them the columns of a matrix, run elimination, and compute their null space.
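A minimal sketch of this procedure, assuming SymPy is available; the vectors v1, v2, v3 below are hypothetical examples, not from the lecture.

```python
# Test independence by making the vectors the columns of a matrix
# and computing its null space (SymPy sketch).
from sympy import Matrix

v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 2]   # hypothetical vectors; note v3 = v1 + v2

A = Matrix([v1, v2, v3]).T         # the vectors become the columns of A
special_solutions = A.nullspace()  # basis of the null space of A

# The columns are independent exactly when the only solution of Ax = 0 is x = 0,
# i.e. the null space has no basis vectors (equivalently rank(A) = number of columns).
print(len(special_solutions) == 0)  # False here, since v3 is a combination of v1 and v2
```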

Permute Rows and Continue Elimination (permute columns if necessary)

Theorem: If Ax = 0 has more unknowns than equations (n > m: more columns than rows), then it has nonzero solutions. There must be free variables.
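A quick numerical check of this theorem (a sketch assuming SymPy; the 2 by 3 matrix is a hypothetical example):

```python
# A x = 0 with more unknowns than equations must have free variables
# and therefore nonzero solutions (SymPy sketch).
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6]])            # 2 equations, 3 unknowns

rref_A, pivot_cols = A.rref()
print(A.cols - len(pivot_cols))    # 1 free variable
print(A.nullspace())               # one nonzero special solution: (1, -2, 1)^T
```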

Echelon Matrices (figure: an echelon matrix with its free variables marked)

Reduced Row Echelon Matrix R (figure: R with its free variables marked)

Computing the Reduced Row Echelon Matrix: eliminate to an echelon matrix, E1PA = U; divide each pivot row by its pivot; eliminate upward, E2E1PA = R.
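The same pipeline can be checked in code; this is a sketch assuming SymPy, whose rref() performs the downward elimination, pivot scaling, and upward elimination in one call. The 3 by 4 matrix is a hypothetical example.

```python
# Reduce a matrix A all the way to its reduced row echelon matrix R.
from sympy import Matrix

A = Matrix([[1, 2, 2, 4],
            [1, 2, 3, 6],
            [2, 4, 5, 10]])

R, pivot_cols = A.rref()   # eliminate to U, divide pivot rows by pivots, eliminate upward
print(R)                   # Matrix([[1, 2, 0, 0], [0, 0, 1, 2], [0, 0, 0, 0]])
print(pivot_cols)          # (0, 2): the pivot columns, so rank(A) = 2
```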

Example: Gauss-Jordan Method for Matrix Inverse. Start from [A I]. Elimination gives E1[A I] = [U E1]. In its reduced echelon matrix, A-1[A I] = [I A-1].
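A minimal sketch of the Gauss-Jordan idea, assuming SymPy; the 2 by 2 matrix is a hypothetical invertible example.

```python
# Row reduce the augmented matrix [A I]; when the left block becomes I,
# the right block is the inverse of A.
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])

augmented = Matrix.hstack(A, eye(2))   # [A I]
R, _ = augmented.rref()                # full elimination gives [I A^-1]

A_inv = R[:, 2:]                       # right block
print(A_inv)                           # Matrix([[3, -1], [-5, 2]])
print(A_inv == A.inv())                # True
```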

A Close Look at the Reduced Echelon Matrix. The last equation of Rx = 0 is redundant (0 = 0). The rank of A, rank(A), is the number of pivots.

What is the Rank of an Outer Product? (For nonzero vectors u and v, the outer product uvT has rank 1.)

Rank and Reduced Row Echelon Matrix. Theorem/Definition: rank(A) = number of independent rows = number of independent columns.

Dimension of the Column Space and Null Space. Let A be an m by n matrix. The dimension of the column space of A is equal to rank(A). The dimension of the null space of A is equal to the number of free variables, which is n - rank(A).
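These dimension counts are easy to verify numerically; a sketch assuming SymPy, with a hypothetical 4 by 3 matrix (m = 4, n = 3):

```python
# dim C(A) = rank(A) and dim N(A) = n - rank(A).
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1],
            [0, 1, 2]])

print(A.rank())               # 2
print(len(A.columnspace()))   # 2 basis vectors: dim C(A) = rank(A)
print(len(A.nullspace()))     # 1 special solution: dim N(A) = n - rank(A) = 3 - 2
```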

Rank and Reduced Row Echelon Matrix. The pivot columns are not combinations of earlier columns. (Figure: pivot columns and free columns, with the free variables marked.)

Reduced Echelon and Null Space Matrix: Special Solutions

Null Space Matrix. Ax = 0 has n - rank(A) free variables and special solutions. The nullspace matrix has n - rank(A) columns. The columns of the nullspace matrix are independent. The dimension of the null space is n - rank(A).

Complete Solution of Ax = 0. After column permutation, we can write R = [I F; 0 0], with r pivot columns and n - r free columns. The nullspace matrix is N = [-F; I], with the pivot variables on top and the free variables below. Moreover, RN = 0.
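A small sketch of this block structure, assuming SymPy; the rank r, the number of columns n, and the block F below are hypothetical values chosen for illustration.

```python
# Build R = [[I, F], [0, 0]] and the nullspace matrix N = [[-F], [I]],
# then check that R N = 0.
from sympy import Matrix, eye, zeros

r, n = 2, 4
F = Matrix([[2, 0],
            [0, 2]])                          # free-column block (hypothetical)

R = Matrix.vstack(Matrix.hstack(eye(r), F),   # r pivot columns, n - r free columns
                  zeros(1, n))                # one redundant zero row
N = Matrix.vstack(-F, eye(n - r))             # pivot variables on top, free variables below

print(R * N)                                  # the zero matrix: each column of N solves R x = 0
```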

Complete Solution to Ax = b. A is an m by n matrix, and b is an m-place vector. There may be a unique solution, infinitely many solutions, or no solution. Suppose Ax = b has more than one solution, say x1 and x2. Then Ax1 = b and Ax2 = b, so A(x1 - x2) = 0, and (x1 - x2) is in nullspace(A).

Complete Solution to Ax = b. Suppose we found a particular solution xp to Ax = b, i.e., Axp = b. Let F be the indexes of the free variables of Ax = 0, let xF be the column vector of free variables, and let N be the nullspace matrix of A. Then x = xp + NxF defines the complete set of solutions to Ax = b.

Example: Complete Solution to Ax = b

Augmented matrix [A b]. Elimination to obtain [R d].

Set the free variables to 0 to find a particular solution xp. Compute the nullspace matrix N. The complete solution is x = xp + NxF.
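The slide's example matrices appear only as images, so here is the same recipe on a hypothetical A and b (a sketch assuming SymPy):

```python
# Complete solution of A x = b: a particular solution plus the null space.
from sympy import Matrix

A = Matrix([[1, 2, 2, 4],
            [1, 2, 3, 6]])
b = Matrix([2, 3])

R, pivot_cols = Matrix.hstack(A, b).rref()   # eliminate the augmented matrix [A b] to [R d]
print(R)                                     # consistent system, pivots in columns 0 and 2

xp = Matrix([0, 0, 1, 0])                    # free variables x2 = x4 = 0; read x1 = 0, x3 = 1 from [R d]
N = Matrix.hstack(*A.nullspace())            # nullspace matrix: columns are the special solutions

print(A * xp)                                # equals b
print(A * N)                                 # zero matrix, so x = xp + N*xF solves A x = b for every xF
```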

Full Rank Matrices. Suppose A is an m by n matrix. A is full column rank if rank(A) = n: the columns of A are independent. A is full row rank if rank(A) = m: the rows of A are independent.

Full Column Rank Matrix. The columns are independent; all columns of A are pivot columns. There are no free variables or special solutions. The nullspace N(A) contains only the zero vector. If Ax = b has a solution (it might not), then it has only one solution. R consists of the n by n identity on top of m - n rows of zeros.

Full Row Rank Matrix. The rows are independent; all rows of A have pivots, and R has no zero rows. Ax = b has a solution for every right-hand side b. The column space is the whole space Rm. There are n - m special solutions in the null space of A.

The Whole Picture.
rank(A) = m = n: Ax = b has a unique solution.
rank(A) = m < n: Ax = b has an (n - m)-dimensional family of solutions.
rank(A) = n < m: Ax = b has 0 or 1 solution.
rank(A) < m and rank(A) < n: Ax = b has 0 solutions or an (n - rank(A))-dimensional family.

Basis and Dimension of a Vector Space. A basis for a vector space is a sequence of vectors such that: the vectors are linearly independent, and the vectors span the space, meaning every vector in the space can be expressed as a linear combination of these vectors.

Bases for 2D and n-D: (1,0), (0,1); also (1,1), (-1,-2). The vectors v1, v2, …, vn are a basis for Rn if and only if they are the columns of an n by n invertible matrix.
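The invertibility criterion can be checked directly; a sketch assuming SymPy, using the two pairs of vectors from the slide:

```python
# Vectors form a basis of R^n exactly when they are the columns
# of an invertible n by n matrix (nonzero determinant).
from sympy import Matrix

standard = Matrix([[1, 0],
                   [0, 1]])    # columns (1, 0) and (0, 1)
other    = Matrix([[1, -1],
                   [1, -2]])   # columns (1, 1) and (-1, -2)

print(standard.det() != 0)     # True: a basis of R^2
print(other.det() != 0)        # True: also a basis of R^2
```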

Column and Row Subspaces. C(A): the space spanned by the columns of A, a subspace of Rm. The pivot columns of A are a basis for its column space. Row space: the space spanned by the rows of A, a subspace of Rn. The row space of A is the same as the column space of AT, C(AT). The pivot rows of A are a basis for its row space; so are the pivot rows of its echelon matrix R.

Important Property I: Uniqueness of Combination. If the vectors v1, v2, …, vn are a basis for a vector space V, then for every vector v in V there is a unique way to write v as a combination of v1, v2, …, vn. Suppose v = a1v1 + a2v2 + … + anvn and v = b1v1 + b2v2 + … + bnvn. Then 0 = (a1 - b1)v1 + (a2 - b2)v2 + … + (an - bn)vn, and by independence every ai - bi = 0, so the two combinations are the same.

Important Property II: Dimension and Size of Basis. If a vector space V has two bases v1, v2, …, vm (V = [v1, v2, …, vm]) and w1, w2, …, wn (W = [w1, w2, …, wn]), then m = n. Proof: assume n > m and write W = VA. A is m by n, so Ax = 0 has a nonzero solution x; then VAx = 0 and Wx = 0, contradicting the independence of the w's. The dimension of a vector space is the number of vectors in every basis, so the dimension of a vector space is well defined.

Dimensions of the Four Subspaces (Fundamental Theorem of Linear Algebra, Part I). Row space C(AT): dimension = rank(A). Column space C(A): dimension = rank(A). Nullspace N(A): dimension = n - rank(A). Left nullspace N(AT): dimension = m - rank(A).
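A sketch (assuming SymPy) that verifies the four dimension counts on a hypothetical 3 by 3 matrix of rank 2:

```python
# Dimensions of the four fundamental subspaces of an m by n matrix A.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 1, 1]])          # m = 3, n = 3, rank 2 (row 2 is twice row 1)

print(len(A.rowspace()))         # dim C(A^T) = rank(A)     -> 2
print(len(A.columnspace()))      # dim C(A)   = rank(A)     -> 2
print(len(A.nullspace()))        # dim N(A)   = n - rank(A) -> 1
print(len(A.T.nullspace()))      # dim N(A^T) = m - rank(A) -> 1
```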