Lecture 16 Cramer’s Rule, Eigenvalue and Eigenvector Shang-Hua Teng.

Determinants and Linear Systems: Cramer’s Rule

Cramer’s Rule: If det A is not zero, then Ax = b has the unique solution x_i = det(B_i) / det(A), where B_i is the matrix A with its i-th column replaced by b.
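
The rule on this slide can be sketched in pure Python; the helper names and the 2x2 example system are illustrative, not from the lecture.

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j+1:] for r in M[1:]])
               for j in range(len(M)))

def cramer_solve(A, b):
    """Solve Ax = b via x_i = det(B_i) / det(A), where B_i is A
    with its i-th column replaced by b. Requires det(A) != 0."""
    d = det(A)
    if d == 0:
        raise ValueError("det(A) = 0: Cramer's rule does not apply")
    n = len(A)
    x = []
    for i in range(n):
        # Build B_i by swapping column i of A for the vector b.
        B = [row[:i] + [b[k]] + row[i+1:] for k, row in enumerate(A)]
        x.append(det(B) / d)
    return x

print(cramer_solve([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

Cofactor expansion makes this O(n!), so it is a teaching sketch, not a practical solver.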

Cramer’s Rule for the Inverse: A^(-1) = adj(A) / det(A), i.e., (A^(-1))_ij = C_ji / det(A), where C_ji is the (j, i) cofactor of A. Proof: apply Cramer’s rule to each column of AX = I.
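
A minimal sketch of the inverse via the adjugate, A^(-1) = adj(A) / det(A); the function names and the 2x2 example are illustrative.

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j+1:] for r in M[1:]])
               for j in range(len(M)))

def inverse(A):
    """(A^-1)[i][j] = C_ji / det(A), where C_ji is the (j, i) cofactor:
    (-1)^(j+i) times the minor obtained by deleting row j and column i."""
    n, d = len(A), det(A)
    if d == 0:
        raise ValueError("matrix is singular")
    return [[(-1) ** (i + j) *
             det([r[:i] + r[i+1:] for k, r in enumerate(A) if k != j]) / d
             for j in range(n)] for i in range(n)]

print(inverse([[2, 1], [1, 3]]))  # [[0.6, -0.2], [-0.2, 0.4]]
```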

Where Do Matrices Come From?

Computer Science: Graphs G = (V, E)

Internet Graph

View Internet Graph on Spheres

Graphs in Scientific Computing

Resource Allocation Graph

Road Map

Matrix Representation of Graphs: the Adjacency Matrix

Adjacency Matrix:

Matrix of Graphs. Adjacency Matrix: A(i, j) = 1 if (i, j) is an edge; otherwise A(i, j) = 0.
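
The adjacency matrix just defined can be built in a few lines; the 4-node path graph is an illustrative choice, not from the lecture.

```python
def adjacency_matrix(n, edges):
    """A(i, j) = 1 if (i, j) is an edge, else 0 (undirected graph)."""
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = 1
        A[j][i] = 1  # undirected: the matrix is symmetric
    return A

A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3)])  # path on 4 nodes
```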

Laplacian of Graphs
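
The slide's Laplacian is standardly defined as L = D - A, the diagonal degree matrix minus the adjacency matrix; a minimal sketch, with an illustrative path graph:

```python
def laplacian(A):
    """Combinatorial graph Laplacian L = D - A: degrees on the
    diagonal, minus the adjacency matrix off the diagonal."""
    n = len(A)
    L = [[-A[i][j] for j in range(n)] for i in range(n)]
    for i in range(n):
        L[i][i] = sum(A[i])  # degree of node i
    return L

A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # path graph on 3 nodes
L = laplacian(A)
# Every row of L sums to 0, so the all-ones vector is an eigenvector
# with eigenvalue 0.
```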

Matrix of Weighted Graphs. Weighted Matrix: A(i, j) = w(i, j) if (i, j) is an edge; otherwise A(i, j) = ∞.
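
A sketch of this weighted matrix: w(i, j) on edges, infinity elsewhere; the example edges, weights, and the zero diagonal (a common shortest-path convention, not stated on the slide) are assumptions.

```python
INF = float("inf")

def weighted_matrix(n, weighted_edges):
    """A(i, j) = w(i, j) on edges, infinity otherwise."""
    A = [[INF] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 0.0  # assumed convention: zero cost from a node to itself
    for i, j, w in weighted_edges:
        A[i][j] = w
        A[j][i] = w  # undirected graph
    return A

A = weighted_matrix(3, [(0, 1, 2.5), (1, 2, 1.0)])
```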

Random walks How long does it take to get completely lost?

Random walks Transition Matrix

Markov Matrix: every entry is non-negative, and every column adds to 1. A Markov matrix defines a Markov chain.
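
Both properties on this slide can be checked in code for the random walk's transition matrix; a minimal sketch, where the triangle graph and the walk length are illustrative choices:

```python
def transition_matrix(A):
    """Column-stochastic transition matrix of a random walk on A:
    P[i][j] = probability of stepping to node i from node j,
    uniform over j's neighbors."""
    n = len(A)
    P = [[0.0] * n for _ in range(n)]
    for j in range(n):
        deg = sum(A[i][j] for i in range(n))  # degree of node j
        for i in range(n):
            P[i][j] = A[i][j] / deg
    return P

A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]  # triangle graph
P = transition_matrix(A)

# Iterate p <- P p: the walk "gets completely lost", i.e. approaches
# the stationary distribution (uniform here, by symmetry).
p = [1.0, 0.0, 0.0]
for _ in range(50):
    p = [sum(P[i][j] * p[j] for j in range(3)) for i in range(3)]
```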

Other Matrices Projections Rotations Permutations Reflections

Term-Document Matrix: index each document (by human or by computer); f_ij records counts, frequencies, weights, etc. Each document can be regarded as a point in m dimensions.

Document-Term Matrix: index each document (by human or by computer); f_ij records counts, frequencies, weights, etc. Each document can be regarded as a point in n dimensions.
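
A document-term count matrix as described above can be sketched in a few lines; the tiny corpus and the whitespace tokenization are illustrative assumptions, not the lecture's data.

```python
from collections import Counter

docs = ["human computer interface",
        "user interface survey",
        "trees graph minors"]

# Vocabulary: one column per distinct term, in sorted order.
vocab = sorted({w for d in docs for w in d.split()})

# F[i][j] = f_ij: number of times term j occurs in document i.
F = [[Counter(d.split())[t] for t in vocab] for d in docs]
# Each row is a document viewed as a point in len(vocab)-dimensional space.
```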

Term Occurrence Matrix

[Table: term-occurrence counts for documents c1–c5 and m1–m4 over the terms human, interface, computer, user, system, response, time, EPS, survey, trees, graph, minors.]

Matrix in Image Processing
