The Matrix of a Linear Transformation (9/30/05)

Presentation transcript:

The Matrix of a Linear Transformation (9/30/05) We observed last time that every transformation T from R^n to R^m which is described by a matrix is linear. The converse is also true: every linear transformation T can be completely described by a unique matrix A, which is called the standard matrix for T.
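The reasoning behind this fact is short enough to sketch here: write any x in R^n in terms of the standard basis as x = x1 e1 + … + xn en and use the linearity of T:

```latex
T(\mathbf{x})
  = T(x_1\mathbf{e}_1 + \cdots + x_n\mathbf{e}_n)
  = x_1\,T(\mathbf{e}_1) + \cdots + x_n\,T(\mathbf{e}_n)
  = \bigl[\,T(\mathbf{e}_1)\;\cdots\;T(\mathbf{e}_n)\,\bigr]\,\mathbf{x}
  = A\,\mathbf{x}.
```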

An example Suppose T is a linear transformation from R^2 to R^3 which takes the vector e1 = (1, 0) to a given vector v1 in R^3 and takes the vector e2 = (0, 1) to a given vector v2 in R^3. Find a matrix description of T. How can we generalize this?
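As a concrete sketch, suppose (purely for illustration, these are not the slide's values) that T sends e1 to (1, 0, 2) and e2 to (3, -1, 4); then a matrix description of T falls out of these two images:

```python
import numpy as np

# Hypothetical images of the standard basis vectors under T
# (placeholder values chosen only for illustration).
T_e1 = np.array([1.0, 0.0, 2.0])   # image of e1 = (1, 0)
T_e2 = np.array([3.0, -1.0, 4.0])  # image of e2 = (0, 1)

# The standard matrix of T has T(e1) and T(e2) as its columns (3 x 2).
A = np.column_stack([T_e1, T_e2])

# Applying A to any x in R^2 now computes T(x) = x1*T(e1) + x2*T(e2).
x = np.array([2.0, 5.0])
print(A)
print(A @ x)   # = 2*T(e1) + 5*T(e2)
```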

Explicit description of the matrix A If T is a linear transformation from R^n to R^m, and if for each j, T takes the vector ej = (0, 0, …, 1, …, 0) (with the 1 in the j-th slot) in R^n to a vector vj in R^m, then the unique matrix A which describes T is simply the matrix whose columns are v1, v2, …, vn. That is, T's action on the ej's completely describes T.
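A small sketch of this recipe in code (the particular map T below is just an illustrative linear map from R^3 to R^2; any linear T works the same way):

```python
import numpy as np

def standard_matrix(T, n):
    """Build the standard matrix of a linear map T : R^n -> R^m by
    applying T to each standard basis vector e_j and using the images
    as the columns of the matrix."""
    columns = []
    for j in range(n):
        e_j = np.zeros(n)
        e_j[j] = 1.0                  # a 1 in the j-th slot, 0 elsewhere
        columns.append(T(e_j))
    return np.column_stack(columns)

# Illustrative linear map from R^3 to R^2.
T = lambda x: np.array([x[0] + 2 * x[1], 3 * x[2] - x[0]])
A = standard_matrix(T, 3)
print(A)                              # columns are T(e1), T(e2), T(e3)
```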

One-to-one and Onto A transformation T from R^n to R^m is called onto if every b in R^m is the image of at least one a in R^n. (Hence R^m is the range of T.) T is called one-to-one if every b in R^m is the image of at most one a in R^n. If T is linear, we can check whether it is one-to-one simply by checking which vectors a get sent to 0: T is one-to-one exactly when only the zero vector is sent to 0.
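To make that last check concrete, here is one way to carry it out with SymPy on an illustrative matrix (the matrix is made up; the point is the null-space test):

```python
from sympy import Matrix

# Illustrative standard matrix of a transformation T : R^3 -> R^3.
A = Matrix([[1, 2, 3],
            [0, 1, 4],
            [1, 3, 7]])      # third row = first row + second row

kernel = A.nullspace()        # basis for the vectors x with A*x = 0
print(kernel)                 # nonempty here, so some nonzero x is sent to 0
print("one-to-one:", len(kernel) == 0)   # False for this A
```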

Connections to Independence and Spanning If a linear transformation T from R^n to R^m is represented by the m by n matrix A, then T is onto if and only if the columns of A span R^m, and T is one-to-one if and only if the columns of A are linearly independent.
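Computationally, both criteria reduce to the rank of A, as this sketch with an illustrative 3 by 2 matrix shows:

```python
import numpy as np

# Illustrative 3 x 2 standard matrix, so T maps R^2 into R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)

# Columns span R^m exactly when rank == m        (T is onto).
# Columns are independent exactly when rank == n (T is one-to-one).
print("onto:", rank == m)          # False: two columns cannot span R^3
print("one-to-one:", rank == n)    # True: the two columns are independent
```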

An Application & Assignment for Monday Section 1.10 contains three applications, of which we will do one: linear difference equations. See pages 97-99. For Monday: Read Section 1.9. Do the Practice Problems and Exercises 1, 3, 5, 11, 13, 17, 19, 21, 23, 25, 27. Read pages 97-99 and do Exercise 9 on page 101.