Basis and Dimension


Basis and Dimension
1. Basis
2. Dimension
3. Vector Spaces and Linear Systems
4. Combining Subspaces

Basis

Definition 1.1: Basis
A basis of a vector space V is an ordered set of linearly independent (hence nonzero) vectors that spans V. Notation: B = ⟨β1, …, βn⟩.

Example 1.2: A basis is both a minimal spanning set and a maximal (complete) linearly independent set. To verify that a two-vector set B is a basis for R2, check two things: B is L.I., and B spans R2.

Example: Reordering the vectors of a basis B for R2 gives another basis that differs from B only in order (a basis is an ordered set, so order matters).

Definition: Standard / Natural Basis for Rn
En = ⟨e1, …, en⟩, where the kth component of ei is 1 if k = i and 0 otherwise.
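As a small numerical sketch (not from the slides), the standard basis of Rⁿ can be built and checked with NumPy; the identity matrix has exactly the Kronecker-delta components described above:

```python
import numpy as np

n = 4
E = np.eye(n)  # row i is the standard basis vector e_i of R^n

# kth component of e_i is 1 when k == i and 0 otherwise
assert E[2, 2] == 1.0 and E[2, 0] == 0.0

# The standard basis is L.I. and spans R^n: n vectors, full rank
assert np.linalg.matrix_rank(E) == n
```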

Example: For the function space { a cos θ + b sin θ | a, b ∈ R }, a natural basis is ⟨cos θ, sin θ⟩. Another basis can be formed from independent linear combinations of these two functions; the proof is straightforward.

Rule: A set of linear combinations of an L.I. set is L.I. if each linear combination contains a vector that the others do not.

Example: For the function space of cubic polynomials P3, a natural basis is ⟨1, x, x², x³⟩. Other choices are possible, e.g. ⟨1, 1+x, 1+x+x², 1+x+x²+x³⟩; the proof is again straightforward.

Example 1.8: The trivial space { 0 } has only one basis, the empty one ⟨ ⟩.
Note: By convention, 0 does not count as a basis vector. (Any set of vectors containing 0 is linearly dependent.)

Example 1.9: The space of all finite-degree polynomials has a basis with infinitely many elements: ⟨1, x, x², …⟩.

Example 1.10: Solution Set of Homogeneous Systems
The solution set of a homogeneous linear system is the span of a set of vectors, one per free variable, and that set is a basis for the solution set. (Proof of L.I. is left as an exercise.)

Example: Matrices
Find a basis for a given subspace of the 2×2 matrices M2×2.
Solution: Use the membership condition to eliminate dependent entries, then split the general member into a linear combination; the matrices that appear form a basis. (Proof of L.I. is left as an exercise.)

Theorem: In any vector space, a subset is a basis if and only if each vector in the space can be expressed as a linear combination of elements of the subset in a unique way.
Proof: A basis spans, so every vector can be expressed as a linear combination of the basis vectors. If v = c1β1 + … + cnβn = d1β1 + … + dnβn, then (c1 − d1)β1 + … + (cn − dn)βn = 0, and linear independence forces ci = di for every i.
∴ L.I. ⇔ uniqueness.

Definition: Representation wrt a Basis
Let B = ⟨β1, …, βn⟩ be a basis of a vector space V and let v = c1β1 + … + cnβn. Then the representation of v wrt B is RepB(v) = (c1, …, cn)B (the subscript B is often omitted). The cj are called the coordinates (components) of v wrt B.

Example: In P3, a cubic polynomial is represented wrt the natural basis ⟨1, x, x², x³⟩ by its column of coefficients.
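Numerically, finding RepB(v) in Rⁿ means solving a linear system whose columns are the basis vectors. A minimal sketch with an assumed basis (the vectors below are illustrative, not the slide's):

```python
import numpy as np

# Assumed basis B = <(1,1), (1,-1)> for R^2, stored as columns
B = np.column_stack([[1.0, 1.0], [1.0, -1.0]])
v = np.array([3.0, 1.0])

# Coordinates c satisfy B @ c = v; uniqueness holds because B is a basis
c = np.linalg.solve(B, v)
assert np.allclose(B @ c, v)        # v = c1*beta1 + c2*beta2
assert np.allclose(c, [2.0, 1.0])   # Rep_B(v) = (2, 1)_B
```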

Exercises

1. Find a basis for each.
(a) The subspace { a2 x² + a1 x + a0 | a2 − 2a1 = a0 } of P2.
(b) The space of three-wide row vectors whose 1st and 2nd components add to zero.
(c) A given subspace of the 2×2 matrices.

2. A square matrix is symmetric if for all indices i and j, entry i, j equals entry j, i.
(a) Find a basis for the vector space of symmetric 2×2 matrices.
(b) Find a basis for the space of symmetric 3×3 matrices.
(c) Find a basis for the space of symmetric n×n matrices.

3. One of the exercises in the Subspaces subsection shows that a certain set is a vector space under given operations. Find a basis for it.

Dimension

To be proved: All bases for a vector space have the same number of elements.
→ Dimension ≡ number of vectors in a basis.
→ A basis is at once a minimal spanning set and a maximal L.I. set.

Definition: A vector space is finite-dimensional if it has a basis with only finitely many vectors.

Lemma: Exchange Lemma
Assume that B = ⟨β1, …, βn⟩ is a basis for a vector space, and that for the vector v the relationship v = c1β1 + … + cnβn holds with cj ≠ 0. Then exchanging βj for v yields another basis for the space.
Proof: See Hefferon p.120.
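The Exchange Lemma can be illustrated numerically (a sketch with assumed vectors, not from the slides): swapping v in for a basis vector whose coefficient is nonzero preserves full rank, while swapping it in for one whose coefficient is zero does not.

```python
import numpy as np

# Standard basis of R^3 and a vector v = 2*beta1 + 3*beta2 + 0*beta3
B = np.eye(3)
v = 2 * B[0] + 3 * B[1]

# Exchanging beta1 for v is allowed (c1 = 2 != 0): still a basis
B_new = np.vstack([v, B[1], B[2]])
assert np.linalg.matrix_rank(B_new) == 3

# Exchanging beta3 for v fails (c3 = 0): the result is dependent
B_bad = np.vstack([B[0], B[1], v])
assert np.linalg.matrix_rank(B_bad) == 2
```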

Theorem: In any finite-dimensional vector space, all of the bases have the same number of elements.
Proof: Let B = ⟨β1, …, βn⟩ be a basis of n elements. Any other basis D = ⟨δ1, …, δm⟩ must have m ≤ n. Write δ1 = c1β1 + … + cnβn with some ck ≠ 0. By the Exchange Lemma, D1 = ⟨β1, …, βk−1, δ1, βk+1, …, βn⟩ is a basis. Next, exchanging some βj in D1 for δ2 gives D2 = ⟨β1, …, βk−1, δ1, βk+1, …, βj−1, δ2, βj+1, …, βn⟩. Repeating the process n times results in a basis Dn = ⟨δ1, …, δn⟩ that spans V. If m > n, then δn+1 would be a linear combination of δ1, …, δn with at least one coefficient nonzero, which contradicts the assumption that D is L.I. Hence m = n.

Definition: Dimension
The dimension of a vector space is the number of vectors in any of its bases.

Example: Rn. Any basis for Rn has n vectors, since the standard basis En has n vectors. → Rn is n-dimensional.

Example: Pn. dim Pn = n+1, since its natural basis ⟨1, x, x², …, xⁿ⟩ has n+1 elements.

Example: A trivial space is 0-dimensional since its basis is empty.

Comments: All results in this book apply to finite-dimensional vector spaces. Most of them also apply to countably infinite-dimensional vector spaces. For uncountably infinite-dimensional vector spaces, e.g. Hilbert spaces, convergence must be taken into account.
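Concretely, the dimension of a span in Rⁿ equals the rank of the matrix whose rows are the given vectors. A sketch with an assumed set of vectors (not from the slides), two of which are redundant:

```python
import numpy as np

vecs = np.array([[1.0, 2.0, 0.0],
                 [0.0, 1.0, 1.0],
                 [1.0, 3.0, 1.0],   # = row 0 + row 1, adds nothing to the span
                 [2.0, 4.0, 0.0]])  # = 2 * row 0, adds nothing to the span

# dim(span) = number of vectors in a basis = rank of the matrix of rows
dim_span = np.linalg.matrix_rank(vecs)
assert dim_span == 2          # the span is a plane in R^3
assert np.linalg.matrix_rank(np.eye(3)) == 3   # dim R^3 = 3
```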

Corollary: No L.I. set can have a size greater than the dimension of the enclosing space.

Example: The only subspaces of R3 are:
3-D: R3 itself
2-D: planes through 0
1-D: lines through 0
0-D: { 0 }

Corollary: Any L.I. set can be expanded to make a basis.
Corollary: Any spanning set can be shrunk to a basis.
Corollary: In an n-dimensional space, a set of n vectors is L.I. iff it spans the space.

Remark: The statement "any infinite-dimensional vector space has a basis" is known to be equivalent to a statement called the Axiom of Choice. Mathematicians differ philosophically on whether to accept or reject this statement as an axiom on which to base mathematics (although the great majority seem to accept it).

Exercises

1. What is the dimension of the span of the set { cos²θ, sin²θ, cos 2θ, sin 2θ }? This span is a subspace of the space of all real-valued functions of one real variable.

2. Observe that, where S is a set, the functions f : S → R form a vector space under the natural operations: (f + g)(s) = f(s) + g(s) and (r f)(s) = r f(s). What is the dimension of the resulting space for each domain?
(a) S = {1}  (b) S = {1, 2}  (c) S = {1, 2, …, n}

3. Prove that if U and W are both three-dimensional subspaces of R5 then U ∩ W is non-trivial. Generalize.

Vector Spaces and Linear Systems

Definition: Row Space & Row Rank
The row space of a matrix is the span of the set of its rows. The row rank is the dimension of the row space, i.e. the number of L.I. rows.

Lemma: Row-equivalent matrices have the same row space and hence the same row rank.
Proof: Let A and B be row-equivalent matrices. Each row of B is a linear combination of the rows of A, so RowSpace(B) ⊆ RowSpace(A). Each row of A is a linear combination of the rows of B, so RowSpace(A) ⊆ RowSpace(B). Hence RowSpace(A) = RowSpace(B).

Lemma: The nonzero rows of an echelon form matrix make up a L.I. set.
Proof: This is just a re-statement of Lemma III.2.5, which states that, in an echelon form matrix, no nonzero row is a linear combination of the other rows.

Gaussian reduction ~ finding a basis for the row space.

Example: Reducing a matrix to echelon form leaves the nonzero rows (1 3 1), (0 1 0), (0 0 3), so a basis for its row space is { (1 3 1), (0 1 0), (0 0 3) }.
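The procedure "reduce, then keep the nonzero rows" can be sketched directly. This is a minimal hand-rolled Gaussian reduction on an assumed example matrix (the matrix is illustrative, not the slide's):

```python
import numpy as np

def row_space_basis(A, tol=1e-10):
    """Gaussian reduction; returns the nonzero rows of an echelon form of A."""
    A = A.astype(float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, rows) if abs(A[i, c]) > tol), None)
        if piv is None:
            continue
        A[[r, piv]] = A[[piv, r]]            # swap the pivot row up
        for i in range(r + 1, rows):         # eliminate entries below the pivot
            A[i] -= (A[i, c] / A[r, c]) * A[r]
        r += 1
    return A[:r]

A = np.array([[1.0, 3.0, 1.0],
              [2.0, 7.0, 2.0],
              [3.0, 10.0, 3.0]])  # row 3 = row 1 + row 2, so row rank is 2
basis = row_space_basis(A)
assert basis.shape[0] == np.linalg.matrix_rank(A) == 2
```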

Definition: Column Space & Column Rank
The column space of a matrix is the span of the set of its columns. The column rank is the dimension of the column space, i.e. the number of L.I. columns.

A linear system asks whether the vector of constants is a linear combination of the columns of the coefficient matrix.
~ A basis for the column space can be found by applying Gaussian reduction to the transpose of the coefficient matrix.

Definition: Transpose
The transpose of a matrix is the result of interchanging its rows and columns, i.e., the i, j entry of Aᵀ is the j, i entry of A.

Example: To find a basis for the column space of a matrix, transpose it, reduce the transpose to echelon form, and transpose the nonzero rows back into columns; those columns are a basis.
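An equivalent numerical sketch (assumed example matrix, not the slide's): pick out columns of A greedily, keeping each column that raises the rank. The kept columns are pivot columns of the original matrix and form a basis for the column space.

```python
import numpy as np

def column_space_basis(A, tol=1e-10):
    """Keep each column of A that increases the rank of the chosen set;
    the kept (pivot) columns form a basis for the column space."""
    chosen = []
    for j in range(A.shape[1]):
        candidate = chosen + [A[:, j]]
        if np.linalg.matrix_rank(np.column_stack(candidate), tol=tol) > len(chosen):
            chosen = candidate
    return np.column_stack(chosen)

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])  # column 3 = column 1 + column 2
C = column_space_basis(A)
assert C.shape[1] == 2                 # column rank is 2
assert np.allclose(C, A[:, :2])        # the first two columns are kept
```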

Example: To get a basis for a subspace described in another notation (e.g. polynomials), map each member to its row of coefficients; the original space T and the resulting space of rows S are isomorphic. Reduce the matrix of rows, then map the row-space basis back into the original notation.

Lemma: Row operations do not change the column rank.
Proof idea: This is the rationale for Gaussian reduction: row operations do not affect linear relationships among the columns.

Example: Reading bases from reduced echelon form.
Basis for the row space: the nonzero rows (the rows with leading entries).
Basis for the column space: the columns of the original matrix in the positions of the leading entries.

Theorem: The row rank and column rank of a matrix are equal.
Proof: Lemmas 3.3 & 3.10 → the row rank and column rank of a matrix equal those of its reduced echelon form, and in reduced echelon form: row rank = number of leading entries = column rank.

Definition: Rank
The rank of a matrix is its row rank or column rank.
→ dim(RowSpace) = dim(ColumnSpace)

Theorem: For linear systems with n unknowns and matrix of coefficients A, the following statements are equivalent.
(1) The rank of A is r.
(2) The space of solutions of the associated homogeneous system has dimension n − r.
Proof: rank A = r ⇔ the reduced echelon form has r nonzero rows (L.I. equations) ⇔ there are n − r free variables, each contributing one basis vector of the solution space.
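Both theorems can be checked numerically. A sketch on an assumed 2×3 system (n = 3 unknowns): row rank equals column rank, and the homogeneous solution space has dimension n − r. The null-space vector is read off from the SVD.

```python
import numpy as np

A = np.array([[1.0, 3.0, 1.0],
              [2.0, 7.0, 2.0]])   # assumed example; n = 3 unknowns
n = A.shape[1]
r = np.linalg.matrix_rank(A)

# row rank = column rank (rank of A equals rank of its transpose)
assert r == np.linalg.matrix_rank(A.T) == 2

# dim of the homogeneous solution space is n - r = 1; a basis vector for it
# is the right-singular vector belonging to the zero singular value
_, _, vh = np.linalg.svd(A)
x = vh[-1]
assert np.allclose(A @ x, 0)   # x solves the homogeneous system
```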

Corollary: Where the matrix A is n×n, the following statements are equivalent.
(1) The rank of A is n.
(2) A is nonsingular.
(3) The rows of A form a linearly independent set.
(4) The columns of A form a linearly independent set.
(5) Any linear system with matrix of coefficients A has one and only one solution.
Proof: Trivial (see Hefferon p.129).
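A quick numerical check of these equivalences on an assumed 2×2 matrix (illustrative only):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
n = A.shape[0]

assert np.linalg.matrix_rank(A) == n        # (1) full rank
assert abs(np.linalg.det(A)) > 1e-12        # (2) nonsingular
assert np.linalg.matrix_rank(A.T) == n      # (3)/(4) rows and columns L.I.

# (5) any system A x = b has exactly one solution
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
assert np.allclose(x, [1.0, 1.0])
```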

Exercises

1. Find a basis for the span of each given set of vectors.

2. (a) Show that a given set of column vectors is a subspace of R3.
(b) Find a basis for it.

Combining Subspaces

Definition: Sum of Vector Subspaces
Where W1, …, Wk are subspaces of a vector space, their sum is the span of their union:
W1 + … + Wk = span(W1 ∪ … ∪ Wk) = { w1 + … + wk | wi ∈ Wi }.

Example: R3 = x-axis + y-axis + z-axis.

Example: A sum of subspaces can be less than the entire space. Let L = { a + b x | a, b ∈ R } and C = { c x³ | c ∈ R } be subspaces of P4. Then L + C = { a + b x + c x³ | a, b, c ∈ R } ≠ P4.

Example: R3. A space can be described as a combination of subspaces in more than one way, e.g. R3 = xy-plane + yz-plane. Here the decomposition of a vector v as a sum of one vector from each summand is obviously not unique.

Definition: The concatenation of the sequences B1 = ⟨β1,1, …, β1,n1⟩, …, Bk = ⟨βk,1, …, βk,nk⟩ is their adjoinment: B1 ⌢ … ⌢ Bk = ⟨β1,1, …, β1,n1, β2,1, …, βk,nk⟩.

Lemma: Let V be a vector space that is the sum of some of its subspaces, V = W1 + … + Wk, and let B1, …, Bk be any bases for these subspaces. Then the following are equivalent.
(1) For every v ∈ V, the expression v = w1 + … + wk (with wi ∈ Wi) is unique.
(2) The concatenation B1 ⌢ … ⌢ Bk is a basis for V.
(3) The nonzero members of { w1, …, wk } (with wi ∈ Wi) form a linearly independent set: among nonzero vectors from different Wi's, every linear relationship is trivial.
Proof: See Hefferon, p.134.

Definition: Independent Set of Subspaces
A collection of subspaces { W1, …, Wk } is independent if no nonzero vector from any Wi is a linear combination of vectors from the other subspaces W1, …, Wi−1, Wi+1, …, Wk.

Definition: (Internal) Direct Sum
A vector space V is the direct sum of its subspaces W1, …, Wk if V = W1 + … + Wk and { W1, …, Wk } is independent. We write V = W1 ⊕ … ⊕ Wk.

Example: R3 = x-axis ⊕ y-axis ⊕ z-axis.

Corollary 4.13: The dimension of a direct sum is the sum of the dimensions of its summands.
Proof: Follows directly from Lemma 4.8.

Definition 4.14: Complement Subspaces
When a vector space is the direct sum of two of its subspaces, they are said to be complements.

Lemma 4.15: A vector space V is the direct sum of two of its subspaces W1 and W2 iff V = W1 + W2 and W1 ∩ W2 = { 0 }.
Proof (⇒): V = W1 ⊕ W2 → { W1, W2 } is independent → W1 ∩ W2 = { 0 }.
Proof (⇐): W1 ∩ W2 = { 0 } → { W1, W2 } is independent.
Caution: ⇐ fails if there are more than 2 subspaces. See Example 4.19 below.
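Lemma 4.15 and Corollary 4.13 suggest a numerical test for V = W1 ⊕ W2: the concatenated bases must form a basis of V, so the dimensions must add up to dim V and the stacked basis must have full rank. A sketch in R² with assumed subspaces:

```python
import numpy as np

def is_direct_sum(B1, B2, n):
    """Check V = W1 (+) W2 in R^n, where W_i = span of the rows of B_i:
    the concatenation of the bases must itself be a basis of R^n."""
    stacked = np.vstack([B1, B2])
    # full rank (spans) and dims add to n (forces W1 ∩ W2 = {0})
    return np.linalg.matrix_rank(stacked) == n and stacked.shape[0] == n

x_axis = np.array([[1.0, 0.0]])
y_axis = np.array([[0.0, 1.0]])
line_y_eq_x = np.array([[1.0, 1.0]])

assert is_direct_sum(x_axis, y_axis, 2)
assert is_direct_sum(x_axis, line_y_eq_x, 2)   # complements are not unique
assert not is_direct_sum(x_axis, x_axis, 2)    # W ∩ W != {0}
```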

Example 4.16: Direct Sum Decomposition is not Unique
In R2, the x-axis and the y-axis are complements, i.e., R2 = x-axis ⊕ y-axis. So are the lines y = x and y = 2x.

Example 4.17: The Complement to a Subspace is not Unique
In the space F = { a cos θ + b sin θ | a, b ∈ R }, the subspaces W1 = { a cos θ | a ∈ R } and W2 = { b sin θ | b ∈ R } are complements. Another complement of W1 is W3 = { b cos θ + b sin θ | b ∈ R }.

Example 4.18: In R3, the xy-plane and the yz-plane are not complements (their intersection, the y-axis, is non-trivial).

If there are more than two subspaces, then having pairwise trivial intersections is not enough to guarantee a unique decomposition.

Example 4.19: R3. Let W1 = x-axis, W2 = y-axis, and let W3 be a line through the origin in the xy-plane other than the axes. The pairwise intersections are all trivial, yet the decomposition of a vector of W1 + W2 + W3 is not unique. Reason: a vector from W3 can be a linear combination of vectors from W1 and W2.

Exercises 2.III.4.

1. Let W1, W2 be subspaces of a vector space.
(a) Assume that the set S1 spans W1 and that the set S2 spans W2. Can S1 ∪ S2 span W1 + W2? Must it?
(b) Assume that S1 is a linearly independent subset of W1 and that S2 is a linearly independent subset of W2. Can S1 ∪ S2 be a linearly independent subset of W1 + W2? Must it?

2. The example of the x-axis and the y-axis in R2 shows that W1 + W2 = V does not imply that W1 ∪ W2 = V. Can W1 + W2 = V and W1 ∪ W2 = V happen?

3. Let W1, W2, W3 be subspaces of a vector space. Prove that (W1 ∩ W2) + (W1 ∩ W3) ⊆ W1 ∩ (W2 + W3). Does the inclusion reverse?

Let V and W be vector spaces. Use Wikipedia to find out the meanings of:
the direct sum, V ⊕ W;
the direct product, V × W;
the tensor product, V ⊗ W.