2.III. Basis and Dimension
2.III.1. Basis
2.III.2. Dimension
2.III.3. Vector Spaces and Linear Systems
2.III.4. Combining Subspaces
2.III.1. Basis

Definition 1.1: Basis
A basis of a vector space V is an ordered set of linearly independent (non-zero) vectors that spans V.
Notation: B = ⟨β1, …, βn⟩

Example 1.2: A two-element set B = ⟨β1, β2⟩ is verified to be a basis for R2 by two checks (L.I. → minimal; span → complete):
B is L.I.: the only solution of c1β1 + c2β2 = 0 is c1 = c2 = 0.
B spans R2: every vector of R2 equals c1β1 + c2β2 for some scalars c1, c2.
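The two checks can be carried out mechanically. A minimal sketch using sympy, with stand-in vectors (the example's own vectors are not reproduced in these notes, so the ones below are assumptions): for a square matrix whose columns are the candidate vectors, both checks reduce to a nonzero determinant.

```python
import sympy as sp

# Stand-in candidate basis vectors for R^2 (hypothetical).
b1 = sp.Matrix([2, 4])
b2 = sp.Matrix([1, 1])

B = sp.Matrix.hstack(b1, b2)   # columns are the candidate vectors
# For a square matrix, "columns are L.I." and "columns span R^2"
# are both equivalent to det(B) != 0.
print(B.det())
```

A nonzero determinant certifies both linear independence and spanning at once, which is exactly the n-vectors-in-n-space equivalence of Corollary 2.12 below.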
Example 1.3: Reordering the vectors of B gives a basis for R2 that differs from B only in order; a basis is an ordered set, so the two are distinct bases.

Definition 1.5: Standard / Natural Basis for Rn
En = ⟨e1, …, en⟩, where the kth component of ei is 1 if k = i and 0 otherwise.
Example 1.6: For a function space, a natural basis can be written down directly; other bases are obtained by taking suitable linear combinations of it. Proof is straightforward.
Rule: A set of linear combinations of a L.I. set is itself L.I. if each combination contains a different vector.

Example 1.7: For the function space of cubic polynomials P3, a natural basis is ⟨1, x, x2, x3⟩. Other choices are possible. Proof is again straightforward.
Example 1.8: The trivial space {0} has only one basis, the empty one ⟨⟩.
Note: By convention, 0 does not count as a basis vector. (Any set of vectors containing 0 is linearly dependent.)

Example 1.9: The space of all finite-degree polynomials has a basis with infinitely many elements: ⟨1, x, x2, …⟩.

Example 1.10: Solution Set of Homogeneous Systems
The solution set of a homogeneous system is the span of the vectors attached to its free variables; those vectors form a basis (proof of L.I. is left as an exercise).
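Example 1.10's recipe can be sketched in sympy; the homogeneous system below is a stand-in, not the notes' own:

```python
import sympy as sp

# Hypothetical homogeneous system (a stand-in for the example's):
#   x + 2y -  z = 0
#  2x + 4y - 2z = 0
A = sp.Matrix([[1, 2, -1],
               [2, 4, -2]])

basis = A.nullspace()   # one basis vector per free variable
print([list(b) for b in basis])
```

With two free variables (y and z) the solution set is two-dimensional, and each returned vector solves the system.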
Example 1.11: Matrices
Find a basis for this subspace of M22.
Solution: parametrize the entries; the matrices multiplying each parameter form the basis (proof of L.I. is left as an exercise).

Theorem 1.12: In any vector space, a subset is a basis if and only if each vector in the space can be expressed as a linear combination of elements of the subset in a unique way.
Proof: A basis is by definition spanning → every vector can be expressed as a linear combination of the basis vectors. For uniqueness, let v = c1β1 + … + cnβn = d1β1 + … + dnβn. Then (c1 − d1)β1 + … + (cn − dn)βn = 0, and L.I. forces ci = di for every i. ∴ L.I. ⇒ uniqueness. Conversely, unique expressibility gives spanning (existence) and L.I. (the zero vector has only the trivial expression).
Definition 1.13: Representation wrt a Basis
Let B = ⟨β1, …, βn⟩ be a basis of vector space V and v = c1β1 + … + cnβn. Then the representation of v wrt B is RepB(v) = (c1, …, cn)B (the subscript B is often omitted). The cj are called the coordinates (components) of v wrt B.

Example 1.14: P3. The representation of a cubic polynomial wrt the natural basis ⟨1, x, x2, x3⟩ is the column of its coefficients.
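Computing RepB(v) amounts to solving a linear system whose columns are the basis vectors. A small sketch in sympy, with a hypothetical basis of R2:

```python
import sympy as sp

# Hypothetical basis of R^2, written as the columns of B.
B = sp.Matrix([[1, 1],
               [0, 2]])
v = sp.Matrix([3, 4])

# Rep_B(v) is the unique c with B*c = v (uniqueness by Theorem 1.12).
c = B.solve(v)
print(list(c))   # coordinates of v wrt B
```

Because B is a basis, Theorem 1.12 guarantees this system has exactly one solution, so `solve` never fails and never returns a family of solutions.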
Exercises 2.III.1
1. Find a basis for each.
(a) The subspace {a2x2 + a1x + a0 | a2 − 2a1 = a0} of P2.
(b) The space of three-wide row vectors whose 1st and 2nd components add to zero.
(c) This subspace of the 2×2 matrices.
2. A square matrix is symmetric if for all indices i and j, entry i, j equals entry j, i.
(a) Find a basis for the vector space of symmetric 2×2 matrices.
(b) Find a basis for the space of symmetric 3×3 matrices.
(c) Find a basis for the space of symmetric n×n matrices.
3. One of the exercises in the Subspaces subsection shows that the set is a vector space under these operations. Find a basis.
2.III.2. Dimension
To be proved: all bases for a vector space have the same number of elements.
→ Dimension = number of vectors in a basis.
→ Basis = minimal spanning set = maximal L.I. set.

Definition 2.1: A vector space is finite-dimensional if it has a basis with only finitely many vectors.

Lemma 2.2: Exchange Lemma
Assume that B = ⟨β1, …, βn⟩ is a basis for a vector space, and that for the vector v the relationship v = c1β1 + … + cnβn holds with cj ≠ 0. Then exchanging βj for v yields another basis for the space.
Proof: See Hefferon p.120.
Theorem 2.3: In any finite-dimensional vector space, all of the bases have the same number of elements.
Proof: Let B = ⟨β1, …, βn⟩ be a basis of n elements and let D = ⟨δ1, …, δm⟩ be any other basis. Write δ1 = c1β1 + … + cnβn with ck ≠ 0 for some k. By Lemma 2.2, D1 = ⟨β1, …, βk−1, δ1, βk+1, …, βn⟩ is a basis. Next, exchanging a suitable βj for δ2 begets the basis D2 = ⟨β1, …, βk−1, δ1, βk+1, …, βj−1, δ2, βj+1, …, βn⟩. Repeating the process n times results in a basis Dn = ⟨δ1, …, δn⟩ that spans V, so m ≥ n. If m > n, then δn+1 = c1δ1 + … + cnδn with at least one ck ≠ 0, which contradicts the assumption that D is L.I. Hence m = n.
Definition 2.4: Dimension
The dimension of a vector space is the number of vectors in any of its bases.
Example 2.5: Rn. Any basis for Rn has n vectors since the standard basis En has n vectors. → Rn is n-dimensional.
Example 2.6: Pn. dim Pn = n+1, since its natural basis ⟨1, x, x2, …, xn⟩ has n+1 elements.
Example 2.7: A trivial space is 0-dimensional since its basis is empty.
Comments: All results in this book are applicable to finite-dimensional vector spaces. Most of them are also applicable to countably infinite-dimensional vector spaces. For uncountably infinite-dimensional vector spaces, e.g., Hilbert spaces, convergence must be taken into account.
Corollary 2.8: No L.I. set can have a size greater than the dimension of the enclosing space.
Example 2.9: The only subspaces of R3 are:
3-D: R3 itself
2-D: planes thru 0
1-D: lines thru 0
0-D: {0}
Corollary 2.10: Any L.I. set can be expanded to make a basis.
Corollary 2.11: Any spanning set can be shrunk to a basis.
Corollary 2.12: In an n-D space, a set of n vectors is L.I. iff it spans the space.
Remark 2.13: The statement 'any infinite-dimensional vector space has a basis' is known to be equivalent to a statement called the Axiom of Choice. Mathematicians differ philosophically on whether to accept or reject this statement as an axiom on which to base mathematics (although the great majority seem to accept it).
Exercises 2.III.2.
1. What is the dimension of the span of the set {cos²θ, sin²θ, cos 2θ, sin 2θ}? This span is a subspace of the space of all real-valued functions of one real variable.
2. Observe that, where S is a set, the functions f : S → R form a vector space under the natural operations: (f + g)(s) = f(s) + g(s) and (r f)(s) = r f(s). What is the dimension of the space resulting for each domain?
(a) S = {1} (b) S = {1, 2} (c) S = {1, 2, …, n}
3. Prove that if U and W are both three-dimensional subspaces of R5 then U ∩ W is non-trivial. Generalize.
2.III.3. Vector Spaces and Linear Systems

Definition 3.1: Row Space & Row Rank
The row space of a matrix is the span of the set of its rows. The row rank is the dimension of the row space, the number of L.I. rows.
Example 3.2:
Lemma 3.3: Row-equivalent matrices have the same row space & hence the same row rank.
Proof: Let A & B be row-equivalent matrices.
Each row of B is a lin. comb. of rows of A → RowSpace(B) ⊆ RowSpace(A).
Each row of A is a lin. comb. of rows of B → RowSpace(A) ⊆ RowSpace(B).
Hence RowSpace(A) = RowSpace(B).
Lemma 3.4: The nonzero rows of an echelon form matrix make up a L.I. set.
Proof: This is just a re-statement of Lemma III.2.5, which states that, in an echelon form matrix, no nonzero row is a linear combination of the other rows.
Gaussian reduction ~ finding a basis for the row space.
Example 3.5: Reduce the matrix to echelon form; the nonzero rows give the basis {(1 3 1), (0 1 0), (0 0 3)} for the row space.
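This procedure is easy to check in sympy. The matrix below is a stand-in (Example 3.5's own matrix is not reproduced in these notes), chosen so its last row is dependent; the first rank-many rows of the reduced echelon form are a row-space basis:

```python
import sympy as sp

# Stand-in matrix; its fourth row is dependent: row4 = 2*row2 - row1.
M = sp.Matrix([[1, 3, 1],
               [1, 4, 1],
               [2, 6, 5],
               [1, 5, 1]])

R, pivots = M.rref()
r = M.rank()
row_basis = [R.row(i) for i in range(r)]   # nonzero rows of the rref
print(r, row_basis)
```

By Lemma 3.3 the rref has the same row space as M, and by Lemma 3.4 its nonzero rows are L.I., so they are a basis.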
Definition 3.6: Column Space & Column Rank
The column space of a matrix is the span of the set of its columns. The column rank is the dimension of the column space, the number of L.I. columns.
A linear system is equivalent to a linear combination of the columns of the corresponding coefficient matrix.
~ A basis for the column space can be found by applying Gaussian reduction to the transpose of the corresponding coefficient matrix.

Definition 3.8: Transpose
The transpose of a matrix is the result of interchanging the rows and columns of that matrix, i.e., entry i, j of the transpose equals entry j, i of the original.
Example 3.7: To find a basis for the column space of a matrix, reduce its transpose to echelon form; the nonzero rows, transposed back into columns, are the basis.
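A sketch of the transpose method in sympy, on a hypothetical matrix (Example 3.7's entries are not reproduced in these notes):

```python
import sympy as sp

# Hypothetical matrix standing in for Example 3.7's.
A = sp.Matrix([[1, 2, 0],
               [2, 4, 1],
               [3, 6, 2]])

# Reduce the transpose; nonzero rows, transposed back, span ColSpace(A).
R, _ = A.T.rref()
col_basis = [R.row(i).T for i in range(A.rank())]
print(col_basis)
```

Here the second column is twice the first, so the column rank is 2 and the method returns two basis columns.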
Example 3.9: Get a basis for a space of matrices.
Solution: Represent each matrix as a column vector; the two vector spaces T & S are isomorphic, so reducing the column-vector versions yields a basis.
Lemma 3.10: Row operations do not change the column rank.
Proof: This is the rationale for Gaussian reduction: row operations do not affect linear relationships among the columns.
Example: Reading a basis from the reduced echelon form:
Basis for the row space: the rows with leading entries.
Basis for the column space: the columns containing leading entries.
Theorem 3.11: The row rank and column rank of a matrix are equal.
Proof: By Lemmas 3.3 & 3.10, the row rank and column rank of a matrix equal those of its echelon form. In reduced echelon form: row rank = number of leading entries = column rank.

Definition 3.12: Rank
The rank of a matrix is its row rank or column rank. → dim(RowSpace) = dim(ColumnSpace).

Theorem 3.13: For linear systems with n unknowns and matrix of coefficients A, the following statements are equivalent.
(1) the rank of A is r
(2) the space of solutions of the associated homogeneous system has dimension n − r
Proof: rank A = r ⇔ the reduced echelon form has r non-zero rows (L.I. equations) ⇔ there are n − r free variables.
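Theorem 3.13 can be checked numerically: the rank plus the dimension of the homogeneous solution space equals the number of unknowns. A sketch in sympy with a hypothetical 3×4 coefficient matrix:

```python
import sympy as sp

# Hypothetical 3x4 coefficient matrix, n = 4 unknowns (row3 = row1 + row2).
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 0],
               [1, 3, 1, 1]])

r = A.rank()                  # row rank = column rank (Theorem 3.11)
null_basis = A.nullspace()    # basis of the homogeneous solution space
print(r, len(null_basis))     # r + (n - r) = n
```

With rank 2 and n = 4 unknowns, the solution space has dimension 4 − 2 = 2, one basis vector per free variable.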
Corollary 3.15: Where the matrix A is n×n, the following statements are equivalent.
(1) the rank of A is n
(2) A is nonsingular
(3) the rows of A form a linearly independent set
(4) the columns of A form a linearly independent set
(5) any linear system with matrix of coefficients A has one and only one solution
Proof: Trivial (see Hefferon p.129).
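A quick sympy check of these equivalences on a hypothetical nonsingular 2×2 matrix:

```python
import sympy as sp

# Hypothetical nonsingular 2x2 matrix.
A = sp.Matrix([[2, 1],
               [1, 1]])

b = sp.Matrix([3, 2])
x = A.solve(b)       # exists and is unique because A is nonsingular
print(A.rank(), A.det() != 0, list(x))
```

Statement (1) appears as full rank, (2) as a nonzero determinant, and (5) as the solver returning exactly one solution vector.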
Exercises 2.III.3.
1. Find a basis for the span of each set.
(a) (b)
2. (a) Show that the following set of column vectors is a subspace of R3. (b) Find a basis.
2.III.4. Combining Subspaces

Definition 4.1: Sum of Vector Subspaces
Where W1, …, Wk are subspaces of a vector space, their sum is the span of their union: W1 + … + Wk = span(W1 ∪ … ∪ Wk).

Example 4.2: R3 = x-axis + y-axis + z-axis.

Example 4.3: A sum of subspaces can be less than the entire space. Let L = {a + bx | a, b ∈ R} and C = {cx3 | c ∈ R} be subspaces of P4. Then L + C = {a + bx + cx3 | a, b, c ∈ R} ≠ P4.
Example 4.4: R3. A space can be described as a combination of subspaces in more than one way: R3 = xy-plane + yz-plane. A decomposition v = w1 + w2 with w1 in the xy-plane and w2 in the yz-plane is obviously not unique, since the y-component can be split between the two planes arbitrarily.

Definition 4.7: The concatenation of the sequences B1, …, Bk is their adjoinment into a single sequence, B1 ⌢ B2 ⌢ … ⌢ Bk.
Lemma 4.8: Let V be a vector space that is the sum of some of its subspaces, V = W1 + … + Wk. Let B1, …, Bk be any bases for these subspaces. Then the following are equivalent.
(1) For every v ∈ V, the expression v = w1 + … + wk (with wi ∈ Wi) is unique.
(2) The concatenation B1 ⌢ … ⌢ Bk is a basis for V.
(3) The nonzero members of {w1, …, wk} (with wi ∈ Wi) form a linearly independent set: among nonzero vectors from different Wi's, every linear relationship is trivial.
Proof: See Hefferon, p.134.
Definition 4.9: Independent Set of Subspaces
A collection of subspaces {W1, …, Wk} is independent if no nonzero vector from any Wi is a linear combination of vectors from the other subspaces W1, …, Wi−1, Wi+1, …, Wk.

Definition 4.10: (Internal) Direct Sum
A vector space V is the direct sum of its subspaces W1, …, Wk if V = W1 + … + Wk and {W1, …, Wk} is independent. We write V = W1 ⊕ … ⊕ Wk.

Example 4.11: R3 = x-axis ⊕ y-axis ⊕ z-axis.
Example 4.12:
Corollary 4.13: The dimension of a direct sum is the sum of the dimensions of its summands.
Proof: Follows directly from Lemma 4.8.

Definition 4.14: Complement Subspaces
When a vector space is the direct sum of two of its subspaces, they are said to be complements.

Lemma 4.15: A vector space V is the direct sum of two of its subspaces W1 and W2 iff V = W1 + W2 and W1 ∩ W2 = {0}.
Proof (→): V = W1 ⊕ W2 → {W1, W2} is independent → W1 ∩ W2 = {0}.
Proof (←): W1 ∩ W2 = {0} → {W1, W2} is independent.
Caution: this fails if there are more than 2 subspaces. See Example 4.19 below.
Example 4.16: Direct Sum Decomposition is not Unique
In R2, the x-axis and the y-axis are complements, i.e., R2 = x-axis ⊕ y-axis. So are the lines y = x and y = 2x.

Example 4.17: Complement to a Subspace is not Unique
In the space F = {a cos θ + b sin θ | a, b ∈ R}, the subspaces W1 = {a cos θ | a ∈ R} and W2 = {b sin θ | b ∈ R} are complements. Another complement of W1 is W3 = {b cos θ + b sin θ | b ∈ R}.

Example 4.18: In R3, the xy-plane and the yz-plane are not complements.
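The complement check of Lemma 4.15 for Example 4.16's lines y = x and y = 2x can be sketched in sympy: stack one spanning vector from each line into a matrix, and rank 2 certifies both that the sum is all of R2 and that the intersection is trivial.

```python
import sympy as sp

# One spanning vector from each line of Example 4.16.
w1 = sp.Matrix([1, 1])   # spans the line y = x
w2 = sp.Matrix([1, 2])   # spans the line y = 2x

M = sp.Matrix.hstack(w1, w2)
# rank 2: the lines sum to R^2 and intersect only in {0}.
print(M.rank())
```

The same test with w2 replaced by a second vector on y = x would give rank 1, showing a line is not its own complement.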
If there are more than two subspaces, then having pairwise trivial intersections is not enough to guarantee a unique decomposition.
Example 4.19: R3
Let W1 = x-axis, W2 = y-axis, and let W3 be a third line through the origin in the xy-plane.
→ The decomposition is not unique.
Reason: a vector from W3 can be a L.C. of vectors from W1 & W2.
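A concrete sketch of the failure, taking W3 = span{(1, 1, 0)} as a stand-in for the example's third subspace (its exact definition is not preserved in these notes):

```python
import sympy as sp

# W1 = x-axis, W2 = y-axis; W3 = span{(1, 1, 0)} is a stand-in third line.
v = sp.Matrix([1, 1, 0])

# Two different decompositions v = w1 + w2 + w3 with wi in Wi:
d1 = [sp.Matrix([1, 0, 0]), sp.Matrix([0, 1, 0]), sp.zeros(3, 1)]
d2 = [sp.zeros(3, 1), sp.zeros(3, 1), sp.Matrix([1, 1, 0])]
print(sum(d1, sp.zeros(3, 1)) == v, sum(d2, sp.zeros(3, 1)) == v)
```

All three pairwise intersections are {0}, yet v has two distinct decompositions, so {W1, W2, W3} is not independent and the sum is not direct.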
Exercises 2.III.4.
1. Let W1, W2 be subspaces of a vector space.
(a) Assume that the set S1 spans W1, and that the set S2 spans W2. Can S1 ∪ S2 span W1 + W2? Must it?
(b) Assume that S1 is a linearly independent subset of W1 and that S2 is a linearly independent subset of W2. Can S1 ∪ S2 be a linearly independent subset of W1 + W2? Must it?
2. The example of the x-axis and the y-axis in R2 shows that W1 ⊕ W2 = V does not imply that W1 ∪ W2 = V. Can W1 ⊕ W2 = V and W1 ∪ W2 = V happen?
3. Let W1, W2, W3 be subspaces of a vector space. Prove that (W1 ∩ W2) + (W1 ∩ W3) ⊆ W1 ∩ (W2 + W3). Does the inclusion reverse?
Let V and W be vector spaces. Use Wikipedia to find out the meanings of:
Direct sum, V ⊕ W.
Direct product, V × W.
Tensor product, V ⊗ W.