Lecture 16 Cramer’s Rule, Eigenvalue and Eigenvector


Lecture 16 Cramer’s Rule, Eigenvalue and Eigenvector Shang-Hua Teng

Determinants and Linear Systems: Cramer’s Rule

Cramer’s Rule If det A ≠ 0, then Ax = b has the unique solution x_i = det(B_i) / det(A), i = 1, …, n, where B_i is A with its i-th column replaced by b.
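The rule can be sketched directly in NumPy; the 2×2 system below is an illustrative example, not one from the slides.

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's rule: x_i = det(B_i) / det(A),
    where B_i is A with column i replaced by b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.linalg.det(A)
    if np.isclose(d, 0.0):
        raise ValueError("det(A) = 0: Cramer's rule does not apply")
    x = np.empty(len(b))
    for i in range(len(b)):
        Bi = A.copy()
        Bi[:, i] = b              # replace column i with b
        x[i] = np.linalg.det(Bi) / d
    return x

A = [[2, 1], [1, 3]]
b = [3, 5]
print(cramer_solve(A, b))         # agrees with np.linalg.solve(A, b)
```

Cramer’s rule costs n + 1 determinant evaluations, so it is mainly of theoretical interest; Gaussian elimination is faster in practice.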

Cramer’s Rule for Inverse A^(-1) = C^T / det(A), where C is the matrix of cofactors of A. Proof: apply Cramer’s rule to each column of AX = I.
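A minimal sketch of the cofactor formula for the inverse, with an illustrative 2×2 matrix:

```python
import numpy as np

def cofactor_inverse(A):
    """Invert A via the cofactor formula A^{-1} = C^T / det(A),
    where C[i, j] = (-1)^(i+j) * det(minor_ij)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    d = np.linalg.det(A)
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / d                # C^T is the adjugate of A

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(cofactor_inverse(A))        # agrees with np.linalg.inv(A)
```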

Where Do Matrices Come From?

Computer Science Graphs: G = (V,E)

Internet Graph

View Internet Graph on Spheres

Graphs in Scientific Computing

Resource Allocation Graph

Road Map

Matrix Representation of Graphs Adjacency matrix:

Adjacency Matrix: [example graph on vertices 1–5]

Matrix of Graphs Adjacency Matrix: A(i, j) = 1 if the edge (i, j) exists; A(i, j) = 0 otherwise. [example graph]
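Building an adjacency matrix from an edge list is a one-liner per edge; the graph below is illustrative (the slide’s own figure is not reproduced here).

```python
import numpy as np

# Adjacency matrix of an undirected graph on vertices 0..4.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
n = 5
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1     # 1 if the edge exists, 0 otherwise

print(A)
```

For an undirected graph the matrix is symmetric, and its entries sum to twice the number of edges.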

Laplacian of Graphs [example graph on vertices 1–5]
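The graph Laplacian is L = D − A, where D is the diagonal degree matrix. A sketch on the same illustrative graph; note the connection to the lecture’s eigenvector theme: the all-ones vector is an eigenvector of L with eigenvalue 0.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
n = 5
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

D = np.diag(A.sum(axis=1))    # degree matrix
L = D - A                     # graph Laplacian

# Every row of L sums to zero, so the all-ones vector is an
# eigenvector with eigenvalue 0.
print(L @ np.ones(n))
```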

Matrix of Weighted Graphs Weighted Matrix: A(i, j) = w(i, j) if the edge (i, j) exists; A(i, j) = ∞ otherwise. [example weighted graph]
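The infinity convention (used for shortest-path computations) translates directly into NumPy; the edge weights below are illustrative.

```python
import numpy as np

# Weighted adjacency: A[i, j] = w(i, j) if the edge exists,
# infinity otherwise, with 0 on the diagonal.
weighted_edges = [(0, 1, 2.0), (0, 2, 5.0), (1, 2, 1.0), (2, 3, 3.0)]
n = 4
A = np.full((n, n), np.inf)
np.fill_diagonal(A, 0.0)
for i, j, w in weighted_edges:
    A[i, j] = A[j, i] = w

print(A)
```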

Random walks How long does it take to get completely lost?

Random walks Transition Matrix [example graph on vertices 1–6]

Markov Matrix Every entry is non-negative Every column adds to 1 A Markov matrix defines a Markov chain
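A random walk on a graph gives exactly such a column-stochastic Markov matrix: P(j, i) = 1/deg(i) for each neighbor j of i. Iterating the walk answers the “how long until completely lost?” question above, since the distribution converges to an eigenvector of P with eigenvalue 1 (the stationary distribution). The small graph here is illustrative.

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
P = A / A.sum(axis=0)         # normalize each column to sum to 1

# Power iteration: p converges to the stationary distribution,
# which for an undirected walk is proportional to vertex degrees.
p = np.ones(n) / n
for _ in range(200):
    p = P @ p
print(p)
```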

Other Matrices Projections Rotations Permutations Reflections
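Small NumPy instances of these matrix families, with illustrative vectors, showing the defining property of each:

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta
P = np.array([[0.0, 1.0], [1.0, 0.0]])            # permutation (swap)
proj = np.array([[1.0, 0.0], [0.0, 0.0]])         # projection onto x-axis

v = np.array([1.0, 2.0])
print(np.linalg.norm(R @ v))   # rotation preserves length
print(P @ v)                   # permutation swaps the coordinates
print(proj @ (proj @ v))       # projecting twice = projecting once
```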

Term-Document Matrix Index each document (by human or by computer). Each entry f_ij records counts, frequencies, weights, etc. Each document can be regarded as a point in m dimensions.

Document-Term Matrix Index each document (by human or by computer). Each entry f_ij records counts, frequencies, weights, etc. Each document can be regarded as a point in n dimensions.

Term Occurrence Matrix

            c1  c2  c3  c4  c5  m1  m2  m3  m4
human        1   0   0   1   0   0   0   0   0
interface    1   0   1   0   0   0   0   0   0
computer     1   1   0   0   0   0   0   0   0
user         0   1   1   0   1   0   0   0   0
system       0   1   1   2   0   0   0   0   0
response     0   1   0   0   1   0   0   0   0
time         0   1   0   0   1   0   0   0   0
EPS          0   0   1   1   0   0   0   0   0
survey       0   1   0   0   0   0   0   0   1
trees        0   0   0   0   0   1   1   1   0
graph        0   0   0   0   0   0   1   1   1
minors       0   0   0   0   0   0   0   1   1
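These counts follow the standard latent-semantic-indexing example (documents c1–c5 about human-computer interaction, m1–m4 about graph theory); the column placement below assumes that example. Each column is a document viewed as a point in 12-dimensional term space, so documents can be compared by cosine similarity.

```python
import numpy as np

terms = ["human", "interface", "computer", "user", "system",
         "response", "time", "EPS", "survey", "trees", "graph", "minors"]
F = np.array([                   # columns: c1..c5, m1..m4
    [1, 0, 0, 1, 0, 0, 0, 0, 0],   # human
    [1, 0, 1, 0, 0, 0, 0, 0, 0],   # interface
    [1, 1, 0, 0, 0, 0, 0, 0, 0],   # computer
    [0, 1, 1, 0, 1, 0, 0, 0, 0],   # user
    [0, 1, 1, 2, 0, 0, 0, 0, 0],   # system
    [0, 1, 0, 0, 1, 0, 0, 0, 0],   # response
    [0, 1, 0, 0, 1, 0, 0, 0, 0],   # time
    [0, 0, 1, 1, 0, 0, 0, 0, 0],   # EPS
    [0, 1, 0, 0, 0, 0, 0, 0, 1],   # survey
    [0, 0, 0, 0, 0, 1, 1, 1, 0],   # trees
    [0, 0, 0, 0, 0, 0, 1, 1, 1],   # graph
    [0, 0, 0, 0, 0, 0, 0, 1, 1],   # minors
])

# Cosine similarity between documents c2 and c3.
c2, c3 = F[:, 1], F[:, 2]
cos = c2 @ c3 / (np.linalg.norm(c2) * np.linalg.norm(c3))
print(round(cos, 3))
```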

Matrix in Image Processing
