ENGG2013 Unit 15 Rank, determinant, dimension, and the links between them Mar, 2011.

Review on “rank”. The row-rank of a matrix counts the maximum number of linearly independent rows. The column-rank of a matrix counts the maximum number of linearly independent columns. One application: given a large system of linear equations, count the number of essentially different equations. The number of essentially different equations is precisely the row-rank of the augmented matrix. kshum ENGG2013
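As a quick sketch (not from the lecture slides; the matrix below is a made-up example), numpy's `matrix_rank` computes exactly this count of essentially different equations:

```python
import numpy as np

# Hypothetical system of four equations in three unknowns; two of the
# equations are combinations of the others, so only two are
# "essentially different".
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # 2 x equation 1
              [1.0, 0.0, 1.0],
              [2.0, 2.0, 4.0]])   # equation 1 + equation 3
b = np.array([6.0, 12.0, 2.0, 8.0])

# Row-rank of the augmented matrix [A | b] = number of essentially
# different equations.
augmented = np.column_stack([A, b])
print(np.linalg.matrix_rank(augmented))  # 2
```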

Evaluating the row-rank by definition: test each subset of rows for linear independence. In the example, every single row and certain pairs of rows are linearly independent, but the larger subsets of rows are linearly dependent, so the row-rank = 2.

Calculation of row-rank via RREF: apply row reductions to bring the matrix to reduced row echelon form. Because row reductions do not affect the number of linearly independent rows, the row-rank can be read off as the number of nonzero rows: row-rank = 2.
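The RREF procedure can be sketched in a few lines of numpy (a minimal Gaussian-elimination sketch, not the lecture's worked example; the test matrix is a stand-in):

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduce A to reduced row echelon form by Gaussian elimination.
    Returns the reduced matrix and the number of pivots (= the row-rank)."""
    A = A.astype(float).copy()
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        # Pick the largest entry in this column at or below pivot_row.
        candidate = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
        if abs(A[candidate, col]) < tol:
            continue                                     # no pivot here
        A[[pivot_row, candidate]] = A[[candidate, pivot_row]]  # swap rows
        A[pivot_row] /= A[pivot_row, col]                # scale pivot to 1
        for r in range(rows):                            # clear the column
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A, pivot_row

# One row is a multiple of another, so the row-rank is 2.
R, rank = rref(np.array([[1.0, 2.0, 3.0],
                         [2.0, 4.0, 6.0],
                         [1.0, 1.0, 1.0]]))
print(rank)  # 2
```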

Calculation of column-rank by definition: list all combinations of columns and test each for linear independence. In the example, every single column and every pair of columns is linearly independent, while every set of three or more columns is linearly dependent, so the column-rank = 2.

Theorem. Given any matrix, its row-rank and column-rank are equal. In view of this property, we can simply say the “rank of a matrix”; it means either the row-rank or the column-rank.
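A quick numerical sanity check of the theorem (an illustrative sketch with a random matrix, not a proof): the rank of A, computed from its columns, equals the rank of its transpose, whose columns are the rows of A.

```python
import numpy as np

rng = np.random.default_rng(0)
# A 5x7 matrix of rank at most 3, built as a product of thin factors.
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 7))

row_rank = np.linalg.matrix_rank(A.T)  # independent rows of A = columns of A.T
col_rank = np.linalg.matrix_rank(A)
print(row_rank == col_rank)  # True
```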

Why row-rank = column-rank? If some column vectors are linearly dependent, they remain linearly dependent after any elementary row operation: a row operation multiplies the matrix on the left by an invertible matrix E, so Ac = 0 implies (EA)c = 0. For example, if the third column is the sum of the first two, this remains true after any row operation.

Why row-rank = column-rank? Elementary row operations do not change the column-rank. Applying the same argument to the transpose of the matrix, we conclude that elementary column operations do not change the row-rank either.

Why row-rank = column-rank? Apply row reductions: the row-rank and column-rank do not change. Then apply column reductions: again the row-rank and column-rank do not change. The result is a “normal form” whose top-left corner is an identity matrix and whose remaining entries are zero. The row-rank and column-rank of this normal form are both equal to the size of the identity submatrix, and are therefore equal.

DISCRIMINANT, DETERMINANT AND RANK

Discriminant of a quadratic equation y = ax^2 + bx + c. The discriminant of ax^2 + bx + c is b^2 – 4ac. It determines whether the roots are distinct or not.

Discriminant measures the separation of the roots. For y = x^2 + bx + c, let the roots be α and β, so that y = (x – α)(x – β). Then the discriminant equals (α – β)^2. The discriminant is zero exactly when the two roots coincide.
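The identity b^2 – 4c = (α – β)^2 for a monic quadratic can be checked numerically (a small sketch with made-up roots):

```python
# For a monic quadratic with roots alpha and beta:
# x^2 + bx + c = (x - alpha)(x - beta), so b = -(alpha + beta), c = alpha*beta.
alpha, beta = 5.0, 2.0
b = -(alpha + beta)
c = alpha * beta

discriminant = b**2 - 4 * c
print(discriminant, (alpha - beta)**2)  # 9.0 9.0
```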

Discriminant is invariant under translation. If we substitute u = x – t into y = ax^2 + bx + c (t any real constant), then the discriminant of a(u+t)^2 + b(u+t) + c, as a polynomial in u, is the same as before.

Determinant of a square matrix. The determinant of a square matrix determines whether the matrix is invertible or not. Zero determinant: not invertible. Non-zero determinant: invertible.

Determinant measures area. A 2×2 determinant measures the area of a parallelogram. A 3×3 determinant measures the volume of a parallelepiped. An n×n determinant measures the “volume” of some “parallelepiped” in n dimensions. The determinant is zero exactly when the column vectors lie in some lower-dimensional subspace.
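For the 2×2 case, a quick check (an illustrative sketch; the vectors are made up): the columns u = (3, 0) and v = (1, 2) span a parallelogram with base 3 and height 2, so its area is 6.

```python
import numpy as np

# Parallelogram spanned by columns u = (3, 0) and v = (1, 2).
M = np.array([[3.0, 1.0],
              [0.0, 2.0]])
print(abs(np.linalg.det(M)))  # 6.0 (up to floating-point rounding)
```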

Determinant is invariant under a shearing action. Shearing action = the third kind of elementary row or column operation, i.e. adding a multiple of one row (or column) to another.
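This invariance is easy to verify numerically (a sketch with an arbitrary 2×2 matrix): adding 5 times the first row to the second leaves the determinant unchanged.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])

# Shear matrix: left-multiplying by E adds 5 times row 0 to row 1
# (the third kind of elementary row operation). det E = 1.
E = np.array([[1.0, 0.0],
              [5.0, 1.0]])
print(np.isclose(np.linalg.det(E @ A), np.linalg.det(A)))  # True
```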

Rank of a rectangular matrix. The rank of a matrix counts the maximal number of linearly independent rows. It also counts the maximal number of linearly independent columns. It is an integer. If the matrix is m×n, then the rank is an integer between 0 and min(m, n).

Rank is invariant under row and column operations

Comparison between det and rank.
Determinant: a real number; defined for square matrices only. A non-zero det implies the existence of an inverse. When the det is zero, we only know that all the columns (or rows) together are linearly dependent, but we get no information about which subsets of columns (or rows) are linearly independent.
Rank: an integer; defined for any rectangular matrix. When applied to an n×n square matrix, rank = n implies the existence of an inverse.

Basis: Definition. For any given vector v in R^n, if there is one and only one choice of coefficients c1, c2, …, ck such that v = c1·v1 + c2·v2 + … + ck·vk, we say that these k vectors v1, v2, …, vk form a basis of R^n.
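Finding the unique coefficients amounts to solving a linear system whose columns are the basis vectors (a sketch with a made-up basis of R^3, not an example from the slides):

```python
import numpy as np

# Columns of B form a basis of R^3 (they are linearly independent,
# so B is invertible and the coefficients are unique).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 1.0])

# Unique coefficients c with B @ c = v, i.e. v = c1*b1 + c2*b2 + c3*b3.
c = np.linalg.solve(B, v)
print(np.allclose(B @ c, v))  # True
```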

Yet another interpretation of rank. Recall that a subspace W in R^n is a subset which is: closed under addition (the sum of any two vectors in W stays in W), and closed under scalar multiplication (any scalar multiple of a vector in W stays in W as well).

Closedness property of subspace W

Geometric picture. The subspace W is the plane x – 3y + z = 0; it is the plane generated, or spanned, by the vectors shown in the figure.

Basis and dimension. A basis of a subspace W is a set of linearly independent vectors which span W. A rigorous definition of the dimension is: dim(W) = the number of vectors in a basis of W.

Rank as dimension. In this context, the rank of a matrix is the dimension of the subspace spanned by the rows of the matrix: the least number of row vectors required to span the subspace spanned by the rows. The rank is also the dimension of the subspace spanned by the columns of the matrix: the least number of column vectors required to span the subspace spanned by the columns.

Example. For the plane x – 2y + z = 0, the rank = 2: the three row vectors lie on the same plane, and two of them are enough to describe the plane.

INTERPOLATION

Polynomial interpolation. Given n points, find a polynomial of degree n–1 which goes through these n points. Technical requirements: all x-coordinates must be distinct; the y-coordinates need not be distinct.

Lagrange interpolation. The Lagrange interpolating polynomial for four data points (x1, y1), …, (x4, y4): \begin{align*} f(x)&=y_1\frac{(x-x_2)(x-x_3)(x-x_4)}{(x_1-x_2)(x_1-x_3)(x_1-x_4)}+ y_2\frac{(x-x_1)(x-x_3)(x-x_4)}{(x_2-x_1)(x_2-x_3)(x_2-x_4)}+ \\ & y_3\frac{(x-x_1)(x-x_2)(x-x_4)}{(x_3-x_1)(x_3-x_2)(x_3-x_4)}+ y_4\frac{(x-x_1)(x-x_2)(x-x_3)}{(x_4-x_1)(x_4-x_2)(x_4-x_3)} \end{align*}
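The formula above translates directly into code (a minimal sketch; the data points are made up for illustration):

```python
def lagrange(points, x):
    """Evaluate the Lagrange interpolating polynomial through `points`
    (a list of (xi, yi) pairs with distinct xi) at the value x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        # Build the basis polynomial that is 1 at xi and 0 at every other xj.
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0), (3.0, 5.0)]
# The degree-3 interpolant reproduces every data point exactly.
print(all(abs(lagrange(pts, xi) - yi) < 1e-9 for xi, yi in pts))  # True
```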

Computing the coefficients by linear equations. We want to solve for coefficients c3, c2, c1 and c0 such that the cubic c3·x^3 + c2·x^2 + c1·x + c0 passes through all four data points, i.e. c3·xi^3 + c2·xi^2 + c1·xi + c0 = yi for i = 1, 2, 3, 4, or equivalently, as a matrix equation whose 4×4 coefficient matrix has rows (xi^3, xi^2, xi, 1).
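This system can be set up and solved with numpy's Vandermonde helper (a sketch using the same made-up data points as above):

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.0, 3.0, 2.0, 5.0])

# np.vander builds the matrix with rows (xi^3, xi^2, xi, 1),
# matching the coefficient order c3, c2, c1, c0.
V = np.vander(xs, 4)
c3, c2, c1, c0 = np.linalg.solve(V, ys)

p = lambda x: c3 * x**3 + c2 * x**2 + c1 * x + c0
print(np.allclose([p(x) for x in xs], ys))  # True
```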

The theoretical basis for polynomial interpolation. The determinant of a Vandermonde matrix is non-zero if all the xi’s are distinct. Hence we can always invert the matrix and solve the system of linear equations.
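The Vandermonde determinant has the well-known closed form det V = ∏_{i<j} (xj – xi), which is non-zero exactly when the xi’s are distinct. A quick numerical check (a sketch with made-up x-coordinates):

```python
import numpy as np
from itertools import combinations

xs = [1.0, 2.0, 4.0, 7.0]  # distinct x-coordinates
# Rows (1, xi, xi^2, xi^3); for this convention det V = prod of (xj - xi), i < j.
V = np.vander(xs, increasing=True)

product = 1.0
for (i, xi), (j, xj) in combinations(enumerate(xs), 2):
    product *= (xj - xi)

print(np.isclose(np.linalg.det(V), product))  # True
```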