Chapter 1, Section 1.7: Linear Independence and Nonsingular Matrices
Zero Vector in ℝⁿ
A vector in ℝⁿ in which every entry is zero is called the zero vector; it is denoted θ (or 0):
θ = (0, …, 0)

Linearly Independent Vectors
A set of vectors v₁, v₂, …, vₚ in ℝⁿ is said to be linearly independent if the only solution to the homogeneous system of equations
x₁v₁ + x₂v₂ + ⋯ + xₚvₚ = θ
is x₁ = 0, x₂ = 0, …, xₚ = 0. Vectors that are not linearly independent (i.e., the system has a nonzero solution) are said to be linearly dependent.

Recall that a homogeneous system of equations always has the zero solution, but it has infinitely many solutions (hence a nonzero one) exactly when the general solution contains an independent (free) variable. If an independent variable exists, the vectors are linearly dependent; if the system has no independent variable, the set is linearly independent.

If a set of vectors is linearly dependent, it is possible to express one of the vectors as a linear combination of the others (i.e., that vector gives you no new algebraic information):
vᵢ = s₁v₁ + s₂v₂ + ⋯ + sᵢ₋₁vᵢ₋₁ + sᵢ₊₁vᵢ₊₁ + ⋯ + sₚvₚ
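The test described above (row reduce and look for an independent variable) can be sketched in code. This is a minimal illustration, not part of the slides: it counts pivots with exact `Fraction` arithmetic, and the set is independent exactly when every variable is dependent, i.e. every column has a pivot.

```python
from fractions import Fraction

def rank(rows):
    """Row reduce a matrix (list of rows) and count its pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0]) if m else 0):
        # find a row at or below `pivots` with a nonzero entry in this column
        pivot_row = next((r for r in range(pivots, len(m)) if m[r][col] != 0), None)
        if pivot_row is None:
            continue  # no pivot in this column: the variable is independent
        m[pivots], m[pivot_row] = m[pivot_row], m[pivots]
        for r in range(len(m)):
            if r != pivots and m[r][col] != 0:
                factor = m[r][col] / m[pivots][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivots])]
        pivots += 1
    return pivots

def independent(vectors):
    """Independent iff the matrix whose columns are the vectors
    has a pivot in every column (no independent variables)."""
    matrix = list(zip(*vectors))  # the vectors become columns
    return rank(matrix) == len(vectors)
```

For example, `independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]])` is `True`, while `independent([[1, -2, -1, 3], [2, -4, -2, 6]])` is `False` because the second vector is twice the first.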
Example: Determine whether the following sets of vectors are linearly dependent or independent. If they are linearly dependent, show how one vector is a linear combination of the others.

First set: three vectors v₁, v₂, v₃ in ℝ³. Form the system x₁v₁ + x₂v₂ + x₃v₃ = θ, form the corresponding matrix, and row reduce. The matrix row reduces to the identity, so the system has 3 dependent variables and no independent variables (i.e., x₁ = 0, x₂ = 0, x₃ = 0); the set is therefore linearly independent.

Second set: the vectors v₁ = (1, −2, −1, 3), v₂ = (2, −4, −2, 6), v₃ = (−3, 6, 3, −9), and a fourth vector v₄ in ℝ⁴. Form the system x₁v₁ + x₂v₂ + x₃v₃ + x₄v₄ = θ, form the matrix, and row reduce. The general solution is
x₁ = −2x₂ + 3x₃, x₄ = 0
This system has the independent variables x₂ and x₃, so the vectors are linearly dependent. To write one vector in terms of the others, pick a particular solution, say
x₁ = −5, x₂ = 1, x₃ = −1, x₄ = 0
Then −5v₁ + v₂ − v₃ = θ, so
v₃ = −5v₁ + v₂
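The dependence relation in the second example can be checked with ordinary arithmetic. The entries of v₃ below are an assumption consistent with the particular solution on the slide, not values stated explicitly in it:

```python
v1 = [1, -2, -1, 3]
v2 = [2, -4, -2, 6]
v3 = [-3, 6, 3, -9]   # assumed entries, chosen to satisfy v3 = -5*v1 + v2

# particular solution x1 = -5, x2 = 1, x3 = -1 gives -5*v1 + v2 - v3 = theta,
# so v3 should equal -5*v1 + v2 entry by entry
combo = [-5 * a + b for a, b in zip(v1, v2)]
print(combo == v3)
```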
Without doing any calculations, why is the set of vectors to the right linearly dependent?
The set contains the zero vector. If the zero vector is in the set, let the variable corresponding to the zero vector be 1 and all the others be 0; this is a nonzero solution to the homogeneous system.

Without doing any calculations, why is the set of vectors to the right linearly dependent?
The set consists of four vectors in ℝ³ (one of them is (2, −8, 5)). The corresponding system of equations can have at most 3 dependent variables, so there must be at least 1 independent variable. Whenever you have more variables (vectors) than equations (entries in a vector), the set must be linearly dependent.

It is useful to notice these details in order to avoid either a great deal of calculation by hand or a bunch of unnecessary typing into your calculator or computer!
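Both shortcuts above can be sketched as quick pre-checks (an illustration, not from the slides), applied before doing any row reduction:

```python
def obviously_dependent(vectors):
    """Two checks that settle dependence without row reduction:
    the set contains the zero vector, or there are more vectors
    than entries per vector (more variables than equations)."""
    n = len(vectors[0])  # number of entries per vector (equations)
    contains_zero = any(all(entry == 0 for entry in v) for v in vectors)
    too_many = len(vectors) > n
    return contains_zero or too_many

print(obviously_dependent([[1, 2], [0, 0], [3, 4]]))                 # True: zero vector
print(obviously_dependent([[1, 0, 0], [0, 1, 0], [0, 0, 1], [2, -8, 5]]))  # True: 4 vectors in R^3
```

If neither check fires, the function proves nothing: the set may still be dependent, and row reduction is then required.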
Unit Vectors in ℝⁿ
The unit vectors in ℝⁿ are e₁, e₂, …, eₙ, where eᵢ is a vector with n entries having a 1 in the ith position and 0 in every other entry:
e₁ = (1, 0, …, 0), e₂ = (0, 1, 0, …, 0), …, eₙ = (0, 0, …, 1)

Nonsingular and Singular Matrices
A square n×n matrix A is called nonsingular if the only solution to the matrix equation Ax = θ is x = θ (i.e., x is the zero vector). If a nonzero solution for x exists, we call the matrix A singular. Again, Ax = θ represents a homogeneous system of equations, and the only way to have a nonzero solution is to have an independent variable in the general solution.

If a square matrix can be row reduced to the identity matrix, then the matrix is nonsingular. If the row-reduced matrix has a row of all zeros, it is singular.
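The identity-matrix criterion above can be sketched directly (an illustration, not from the slides): run elimination with exact `Fraction` arithmetic, and report singular the moment a pivot column comes up empty, since that forces a row of zeros in the reduced matrix.

```python
from fractions import Fraction

def nonsingular(a):
    """A square matrix is nonsingular iff it row reduces to the identity,
    i.e. elimination finds a nonzero pivot in every column."""
    n = len(a)
    m = [[Fraction(x) for x in row] for row in a]
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return False  # no pivot: the reduced matrix has a row of zeros
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return True

print(nonsingular([[0, 1], [1, 0]]))  # True: row reduces to the identity
print(nonsingular([[1, 2], [2, 4]]))  # False: second row is twice the first
```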
Nonsingular Matrices and Linear Independence
A square n×n matrix A = [A₁, A₂, …, Aₙ] is nonsingular if and only if the set of column vectors A₁, A₂, …, Aₙ of the matrix A forms a linearly independent set. This is because a matrix–vector product is a linear combination of the columns:
Ax = θ if and only if x₁(a₁₁, …, aₙ₁) + ⋯ + xₙ(a₁ₙ, …, aₙₙ) = θ

Example: Find all values of a so that the set of vectors (a, 2), (9, a+3) is linearly dependent. We begin by forming the matrix with the given vectors as columns and row reducing it:
[a 9; 2 a+3] → (R1↔R2) → [2 a+3; a 9] → (½R1) → [1 (a+3)/2; a 9] → (−aR1+R2) → [1 (a+3)/2; 0 9−a(a+3)/2]
There will be an independent variable for this system of equations exactly when the second row is zero. Set the entry in the second row, second column equal to zero and solve:
9 − a(a+3)/2 = 0
18 = a² + 3a
0 = a² + 3a − 18
0 = (a+6)(a−3)
If a = −6 or a = 3, the vectors will be linearly dependent.
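The two roots found above can be double-checked numerically. This sketch (not from the slides) tests the condition a² + 3a − 18 = 0, which is the (2,2) entry 9 − a(a+3)/2 cleared of fractions:

```python
# The columns (a, 2) and (9, a+3) are dependent exactly when the
# eliminated (2,2) entry vanishes: 9 - a*(a+3)/2 = 0, i.e. a^2 + 3a - 18 = 0.
def dependent_for(a):
    return a * a + 3 * a - 18 == 0

print(sorted(a for a in range(-20, 21) if dependent_for(a)))  # [-6, 3]
```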
Example: Determine a relation among a, b, c, and d so that the matrix [a b; c d] is singular. Again we want to row reduce, but the sequence is not quite the Gauss–Jordan method since we do not know the values of the entries (the steps below assume a and c are nonzero; the cases where a or c is zero can be checked directly and satisfy the same condition):
[a b; c d] → (cR1, aR2) → [ac bc; ac ad] → (−R1+R2) → [ac bc; 0 ad−bc]
This system will have an independent variable exactly when the second row is all zeros, so set the entry in the second row, second column equal to zero:
ad − bc = 0
From this we have characterized all 2×2 singular matrices: a matrix [a b; c d] is singular if and only if ad − bc = 0, and nonsingular if and only if ad − bc ≠ 0.
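The 2×2 characterization above reduces to a one-line test (an illustration, not from the slides):

```python
def singular_2x2(a, b, c, d):
    """The matrix [[a, b], [c, d]] is singular iff a*d - b*c == 0."""
    return a * d - b * c == 0

print(singular_2x2(1, 2, 2, 4))  # True: columns (1, 2) and (2, 4) are dependent
print(singular_2x2(1, 9, 2, 4))  # False: ad - bc = 4 - 18 = -14
```

The quantity ad − bc is the determinant of the 2×2 matrix, a topic developed later in the text.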