Chapter 3 Determinants and Matrices


September 14: Determinants. 3.1 Determinants. Fundamental definition. Notes on each term in det A: each term contains exactly one element from each row and exactly one element from each column; the sign of the term is given by the parity of the permutation. Levi-Civita symbol: equal to +1 or −1 according to whether an even or odd number of interchanges of adjacent elements restores the indices to natural order, and 0 if any index repeats.
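The fundamental definition described above can be written out with the Levi-Civita symbol; this is the standard form consistent with the notes' description:

```latex
\det A \;=\; \sum_{i_1, i_2, \ldots, i_n} \varepsilon_{i_1 i_2 \cdots i_n}\,
a_{1 i_1}\, a_{2 i_2} \cdots a_{n i_n},
\qquad
\varepsilon_{i_1 \cdots i_n} =
\begin{cases}
+1 & (i_1,\ldots,i_n) \text{ an even permutation of } (1,\ldots,n),\\
-1 & \text{an odd permutation},\\
\phantom{+}0 & \text{any index repeated}.
\end{cases}
```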

Development by minors (an iterative procedure). Minor M(i,j): the reduced array obtained from A by removing the ith row and the jth column. Cofactor C(i,j): the signed determinant of the minor. The determinant may be expanded along any row or along any column. Useful properties of determinants: a common factor in a row (column) may be factored out; interchanging two rows (columns) changes the sign of the determinant; a multiple of one row (column) can be added to another row (column) without changing the determinant. These properties can be tested on the scalar triple product of three vectors (a 3 × 3 determinant).
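The cofactor and the row/column expansions referred to above take the standard form:

```latex
C_{ij} = (-1)^{i+j}\,\det M_{ij},
\qquad
\det A = \sum_{j=1}^{n} a_{ij}\, C_{ij} \;\;(\text{any fixed row } i)
       = \sum_{i=1}^{n} a_{ij}\, C_{ij} \;\;(\text{any fixed column } j),
```

where $M_{ij}$ is the array obtained from $A$ by deleting row $i$ and column $j$.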

Homogeneous linear equations: the determinant of the coefficient matrix must be zero for a nontrivial solution to exist. Inhomogeneous linear equations: when the determinant of the coefficient matrix is nonzero, there is a unique solution, given by Cramer's rule.
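The two cases above can be illustrated with a minimal pure-Python sketch for a 3 × 3 system (the helper names `det3` and `cramer3` and the example system are illustrative, not from the notes): a zero determinant signals that no unique solution exists, while a nonzero determinant lets Cramer's rule produce the solution.

```python
def det3(m):
    # 3x3 determinant, expanded along the first row
    a, b, c = m
    return (a[0] * (b[1]*c[2] - b[2]*c[1])
          - a[1] * (b[0]*c[2] - b[2]*c[0])
          + a[2] * (b[0]*c[1] - b[1]*c[0]))

def cramer3(A, y):
    """Solve A x = y by Cramer's rule; requires det A != 0."""
    d = det3(A)
    if d == 0:
        raise ValueError("determinant is zero: no unique solution")
    x = []
    for k in range(3):
        Ak = [row[:] for row in A]      # replace the kth column by y
        for i in range(3):
            Ak[i][k] = y[i]
        x.append(det3(Ak) / d)
    return x

A = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
y = [3.0, 5.0, 3.0]
x = cramer3(A, y)
```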

Linear independence of vectors: a set of vectors v1, …, vn is linearly independent if the only solution of c1 v1 + ⋯ + cn vn = 0 is c1 = ⋯ = cn = 0. Gram-Schmidt orthogonalization: starting from n linearly independent vectors, we can construct an orthonormal basis set.
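The Gram-Schmidt construction can be sketched in a few lines of plain Python (function names are illustrative; this sketch assumes real vectors that are exactly linearly independent):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormalize linearly independent real vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = list(v)
        for e in basis:                     # subtract projections onto earlier basis vectors
            c = dot(v, e)
            w = [wi - c * ei for wi, ei in zip(w, e)]
        norm = dot(w, w) ** 0.5
        basis.append([wi / norm for wi in w])
    return basis

e = gram_schmidt([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
```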

Read: Chapter 3: 1. Homework: 3.1.1, 3.1.2. Due: September 23.

September 16, 19: Matrices. 3.2 Matrices. Definition: a matrix is a rectangular array of numbers. Terminology: row, column, element (entry), dimension, row vector, column vector. Basic operations: addition, scalar multiplication, transpose. Elementary row (column) operations: row switching; row multiplication by a nonzero number; adding a multiple of a row to another row. Rank: the maximal number of linearly independent row (or column) vectors is called the row (or column) rank of the matrix. For any matrix, row rank equals column rank. Proof (sketch): 1) elementary row operations do not change the row rank; 2) elementary row operations do not change the column rank; 3) elementary row operations reduce the matrix to an echelon form, which has equal row and column ranks.
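The rank computation by elementary row operations can be sketched in plain Python (an illustrative implementation; the function name and tolerance are assumptions, not from the notes):

```python
def rank(M, tol=1e-12):
    """Rank via Gaussian elimination with partial pivoting."""
    A = [row[:] for row in M]
    rows, cols = len(A), len(A[0])
    r = 0                                   # number of pivots found so far
    for c in range(cols):
        if r == rows:
            break
        p = max(range(r, rows), key=lambda i: abs(A[i][c]))   # pivot row
        if abs(A[p][c]) < tol:
            continue                        # no pivot in this column
        A[r], A[p] = A[p], A[r]
        for i in range(r + 1, rows):        # eliminate below the pivot
            f = A[i][c] / A[r][c]
            A[i] = [aij - f * arj for aij, arj in zip(A[i], A[r])]
        r += 1
    return r

# second row is twice the first, so the rank is 2
r = rank([[1.0, 2.0, 3.0],
          [2.0, 4.0, 6.0],
          [1.0, 0.0, 1.0]])
```

Running `rank` on the transpose of the same matrix also gives 2, illustrating that row rank equals column rank.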

Matrix multiplication, in the view of row (or column) vectors: the kth row of C = AB is a linear combination of all rows of B, each weighted by an element from the kth row of A. Similarly (by taking the transpose), the kth column of C is a linear combination of all columns of A, each weighted by an element from the kth column of B.

Product theorem: det(AB) = (det A)(det B).
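The product theorem can be checked numerically; a small 2 × 2 sketch (helper names and the example matrices are illustrative):

```python
def det2(m):
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

def matmul(A, B):
    # C[i][j] = sum_k A[i][k] * B[k][j]
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [5.0, 6.0]]
lhs = det2(matmul(A, B))      # det(AB)
rhs = det2(A) * det2(B)       # det(A) det(B): the product theorem says these agree
```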

Direct product (Kronecker product). Diagonal matrices. Trace: the sum of the diagonal elements.

Matrix inversion. Gauss-Jordan method of matrix inversion: perform a series of elementary row operations, collected in a matrix ML, that reduces A to the unit matrix; ML A = 1 then implies ML = A⁻¹, so applying the same row operations to the unit matrix yields A⁻¹. Example:
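The Gauss-Jordan procedure can be sketched as: augment A with the unit matrix, row-reduce the left half to 1, and read off A⁻¹ from the right half (an illustrative implementation with partial pivoting; the function name is an assumption):

```python
def invert(M, tol=1e-12):
    """Gauss-Jordan inversion: row-reduce [M | 1] to [1 | M^-1]."""
    n = len(M)
    # augmented matrix [M | I]
    A = [list(map(float, M[i])) + [1.0 if j == i else 0.0 for j in range(n)]
         for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda i: abs(A[i][c]))   # partial pivoting
        if abs(A[p][c]) < tol:
            raise ValueError("matrix is singular")
        A[c], A[p] = A[p], A[c]
        piv = A[c][c]
        A[c] = [x / piv for x in A[c]]                     # normalize pivot row
        for i in range(n):                                 # clear the column
            if i != c:
                f = A[i][c]
                A[i] = [x - f * y for x, y in zip(A[i], A[c])]
    return [row[n:] for row in A]

Ainv = invert([[2.0, 1.0], [1.0, 1.0]])
```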

Read: Chapter 3: 2. Homework: 3.2.1, 3.2.31, 3.2.34, 3.2.36. Due: September 23.

September 21: Orthogonal matrices. 3.3 Orthogonal matrices. Change of basis (rotation of coordinate axes). Orthogonal (also called orthonormal) transformation: preserves the inner product between vectors. Orthogonality conditions, and other equivalent forms.
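The orthogonality conditions mentioned above take the standard equivalent forms (with the tilde denoting the transpose):

```latex
\sum_{k} a_{ki}\, a_{kj} = \delta_{ij},
\qquad
\tilde{A}A = A\tilde{A} = 1,
\qquad
\tilde{A} = A^{-1}.
```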

Orthogonal matrix: an orthogonal matrix preserves the inner product of vectors. The determinant of any orthogonal matrix is +1 (rotation) or −1 (rotation plus inversion). All 3 × 3 orthogonal matrices form the orthogonal group O(3). Its subgroup SO(3) (the special orthogonal group) consists of the orthogonal matrices with determinant +1. Similarity transformation: the matrix representation of an operator depends on the choice of basis vectors. Let operator A rotate a vector, and let B change the basis (coordinate transformation). Question: what is the representation of the operator in the new basis? It is A′ = BAB⁻¹. A′ and A are called similar matrices; they are representations of the same operator in different bases.

Read: Chapter 3: 3. Homework: 3.3.1, 3.3.8, 3.3.9, 3.3.10, 3.3.14. Due: September 30.

September 23, 26: Diagonalization of matrices. 3.4 Hermitian matrices and unitary matrices. Complex conjugate: A*. Adjoint: the conjugate transpose, A† (sometimes written A* in math books). Hermitian matrices: self-adjoint, A† = A; they reduce to symmetric matrices in real space. Unitary matrices: A† = A⁻¹; they reduce to orthogonal matrices in real space. Inner product: the inner product of vectors x and y is ⟨x|y⟩ = x†y. Unitary transformation: a unitary transformation preserves the inner product of complex vectors. Orthogonality conditions: U†U = UU† = 1.

3.5 Diagonalization of matrices. Example: diagonalization of the moment of inertia. Angular momentum of a rigid body rotating around the origin: consider first one mass element dm inside the rigid body; the actual angular momentum then takes the integration form. We can rotate the coordinates so that the moment-of-inertia matrix I is diagonalized in the new coordinate system. If the angular velocity ω is along a principal axis, the angular momentum is in the same direction as the angular velocity ω.
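The relations sketched above, in the usual form (single mass element, then the integral over the body):

```latex
\mathbf{L} = m\,\mathbf{r} \times (\boldsymbol{\omega} \times \mathbf{r})
           = m\bigl[r^{2}\boldsymbol{\omega} - (\mathbf{r}\cdot\boldsymbol{\omega})\,\mathbf{r}\bigr],
\qquad
I_{ij} = \int \bigl(r^{2}\delta_{ij} - x_i x_j\bigr)\, dm,
\qquad
\mathbf{L} = I\,\boldsymbol{\omega}.
```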

Eigenvalues and eigenvectors: for an operator A, if there is a nonzero vector x and a scalar λ such that Ax = λx, then x is called an eigenvector of A, and λ is called the corresponding eigenvalue. A only changes the "length" of its eigenvector x by the factor λ, without affecting its "direction". For nontrivial solutions we need det(A − λ1) = 0. This is called the secular equation, or characteristic equation. Example: calculate the eigenvalues and eigenvectors of a given matrix.
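For a 2 × 2 real symmetric matrix the secular equation is a quadratic and can be solved by hand; a minimal sketch (the function name and the example matrix are illustrative, not from the notes):

```python
import math

def eig2_sym(a, b, d):
    """Eigenvalues and normalized eigenvectors of the symmetric matrix [[a, b], [b, d]]."""
    # secular equation: lam^2 - (a + d) lam + (a d - b^2) = 0
    tr, det = a + d, a*d - b*b
    disc = math.sqrt(tr*tr/4.0 - det)       # real for symmetric matrices
    lams = (tr/2.0 + disc, tr/2.0 - disc)
    vecs = []
    for lam in lams:
        if abs(b) > 1e-12:
            v = (b, lam - a)                # solves (A - lam) v = 0
        else:                               # diagonal matrix: coordinate axes
            v = (1.0, 0.0) if abs(a - lam) < abs(d - lam) else (0.0, 1.0)
        n = math.hypot(v[0], v[1])
        vecs.append((v[0] / n, v[1] / n))
    return lams, vecs

lams, vecs = eig2_sym(2.0, 1.0, 2.0)        # [[2,1],[1,2]]
```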

Eigenvalues and eigenvectors of Hermitian matrices: The eigenvalues are real. The eigenvectors associated with different eigenvalues are orthogonal. Physicists like them. Proof:
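The two proofs sketched above run, in bracket notation:

```latex
A|x\rangle = \lambda|x\rangle
\;\Rightarrow\; \langle x|A|x\rangle = \lambda\langle x|x\rangle;
\quad
A^{\dagger} = A
\;\Rightarrow\; \langle x|A|x\rangle^{*} = \langle x|A^{\dagger}|x\rangle = \langle x|A|x\rangle
\;\Rightarrow\; \lambda^{*} = \lambda.
```

```latex
A|x\rangle = \lambda|x\rangle,\;\; A|y\rangle = \mu|y\rangle,\;\; \lambda \neq \mu
\;\Rightarrow\; \langle y|A|x\rangle = \lambda\langle y|x\rangle = \mu\langle y|x\rangle
\;\Rightarrow\; (\lambda - \mu)\langle y|x\rangle = 0
\;\Rightarrow\; \langle y|x\rangle = 0.
```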

Read: Chapter 3: 4-5. Homework: 3.4.4, 3.4.5, 3.4.7, 3.4.8, 3.5.6, 3.5.8, 3.5.9, 3.5.12, 3.5.27. Due: September 30.

September 28: Normal matrices. 3.6 Normal matrices. A normal matrix is one that commutes with its adjoint: [A, A†] = 0. 1) A and A† have the same eigenvectors, but with complex-conjugated eigenvalues. Proof: 2) The eigenvectors of a normal matrix belonging to different eigenvalues are orthogonal.

More about normal matrices: Hermitian matrices and unitary matrices are both normal. However, not every normal matrix is Hermitian or unitary. A normal matrix is Hermitian (self-adjoint) if and only if its eigenvalues are real. A normal matrix is unitary if and only if its eigenvalues have unit magnitude. Every normal matrix can be brought to diagonal form by a unitary transformation, and every matrix that can be diagonalized by a unitary transformation is normal. Proof:

Reading: spectral decomposition theorem. For any normal matrix A there exists a unitary matrix U such that A = UΛU†, where Λ is a diagonal matrix consisting of the eigenvalues of A, and the columns of U are the corresponding eigenvectors of A. More explicit form: a sum over eigenvalues times projectors onto the eigenvectors. This applies to functions of matrices: f(A) = U f(Λ) U†.
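The explicit form of the spectral decomposition can be sketched for a 2 × 2 real symmetric matrix, applying f eigenvalue by eigenvalue via the projectors onto the eigenvectors (an illustrative implementation; it assumes b ≠ 0, or distinct diagonal entries when b = 0):

```python
import math

def spectral_f(a, b, d, f):
    """f(A) = sum_k f(lam_k) |v_k><v_k| for the symmetric matrix A = [[a, b], [b, d]]."""
    tr, det = a + d, a*d - b*b
    disc = math.sqrt(tr*tr/4.0 - det)
    out = [[0.0, 0.0], [0.0, 0.0]]
    for lam in (tr/2.0 + disc, tr/2.0 - disc):
        # eigenvector for eigenvalue lam (sketch: degenerate b == 0, a == d not handled)
        v = (b, lam - a) if abs(b) > 1e-12 else (
            (1.0, 0.0) if abs(a - lam) < abs(d - lam) else (0.0, 1.0))
        n2 = v[0]*v[0] + v[1]*v[1]
        for i in range(2):
            for j in range(2):
                out[i][j] += f(lam) * v[i] * v[j] / n2   # f(lam) times the projector
    return out

R = spectral_f(2.0, 1.0, 2.0, math.sqrt)    # matrix square root of [[2,1],[1,2]]
```

Squaring R recovers the original matrix, which is the point of defining f(A) through the eigenvalues.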

Read: Chapter 3: 6. Homework: 3.6.3, 3.6.4, 3.6.6, 3.6.10, 3.6.11. Due: October 7.