Lecture 4: The Gauß scheme
A linear system of equations. Matrix algebra deals essentially with linear systems; multiplicative elements make a system non-linear. Solving simple stoichiometric equations.

Solving a linear system. Division by a vector or a matrix is not defined! Two equations and four unknowns.

For a non-singular square matrix A the inverse is defined by A·A⁻¹ = A⁻¹·A = I. Singular matrices are those where some rows or columns can be expressed as a linear combination of others, for example r₂ = 2r₁ and r₃ = 2r₁ + r₂. Such rows or columns do not contain additional information; they are redundant. In terms of a linear combination of vectors: a matrix is singular if its determinant is zero (det A: the determinant of A), or equivalently if a combination k₁a₁ + k₂a₂ + … + kₙaₙ = 0 exists in which at least one of the parameters kᵢ is not zero.
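As a minimal sketch (the lecture shows no code; NumPy is an assumed tool here), the redundancy r₂ = 2r₁, r₃ = 2r₁ + r₂ can be checked numerically: the determinant is zero and the rank drops.

```python
import numpy as np

# Rows r2 = 2*r1 and r3 = 2*r1 + r2 are linear combinations of r1,
# so they add no new information and the matrix is singular.
r1 = np.array([1.0, 2.0, 3.0])
r2 = 2 * r1
r3 = 2 * r1 + r2
A = np.vstack([r1, r2, r3])

print(np.linalg.det(A))          # ~0: the determinant of a singular matrix
print(np.linalg.matrix_rank(A))  # 1: only one independent row
```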

(AB)⁻¹ = B⁻¹A⁻¹ ≠ A⁻¹B⁻¹. Determinant. The inverse of a 2×2 matrix. The inverse of a diagonal matrix. The inverse of a square matrix only exists if its determinant differs from zero; singular matrices do not have an inverse. The inverse can be calculated unequivocally by the Gauss-Jordan algorithm.
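A small NumPy check of the rule above (the random matrices are used purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))           # random 3x3 matrices are almost surely non-singular
B = rng.random((3, 3))

lhs = np.linalg.inv(A @ B)
print(np.allclose(lhs, np.linalg.inv(B) @ np.linalg.inv(A)))  # True:  (AB)^-1 = B^-1 A^-1
print(np.allclose(lhs, np.linalg.inv(A) @ np.linalg.inv(B)))  # False in general
```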

Solving a simple linear system
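The slide's actual numbers are not in the transcript, so the following NumPy sketch solves a stand-in 2 × 2 system by (LU-based) Gaussian elimination:

```python
import numpy as np

# Stand-in system:  2x + y = 5,  x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)   # uses an LU (Gauss-type) factorisation internally
print(x)                    # [1. 3.]
```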

The general solution of a linear system Ax = b is x = A⁻¹b (A⁻¹A = I, the identity matrix). This is only possible if A is not singular; if A is singular the system cannot be solved this way. Systems with a unique solution: the number of independent equations equals the number of unknowns, X is not singular, and the augmented matrix X_aug has the same rank as X. The rank of a matrix is the number of rows/columns of its largest non-singular square submatrix.

Consistent, rank(A) = rank(A:B) = n: a unique solution.
Consistent, rank(A) = rank(A:B) < n: an infinite number of solutions.
Inconsistent, rank(A) < rank(A:B): no solution.
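The rank test above can be sketched in NumPy as follows (the example systems are made up for illustration):

```python
import numpy as np

def classify(A, b):
    """Compare rank(A) with rank(A:b) to classify the system A x = b."""
    n = A.shape[1]
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rA < rAb:
        return "inconsistent: no solution"
    return "unique solution" if rA == n else "infinite number of solutions"

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                        # second row = 2 * first row
print(classify(A, np.array([3.0, 6.0])))          # infinite number of solutions
print(classify(A, np.array([3.0, 7.0])))          # inconsistent: no solution
print(classify(np.eye(2), np.array([1.0, 2.0])))  # unique solution
```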

We have only four equations but five unknowns; the system is underdetermined. The missing value is fixed by dividing the solution vector by its smallest entry, which gives the smallest solution in natural numbers.
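The slide's reaction is not given in the transcript, so the sketch below balances a stand-in equation (x₁ CO₂ + x₂ H₂O → x₃ C₆H₁₂O₆ + x₄ O₂) the same way: the null-space vector of the atom-balance matrix is divided by its smallest entry to obtain the smallest natural-number solution.

```python
import numpy as np

# Atom balances (rows: C, O, H; columns: CO2, H2O, C6H12O6, O2; products negative)
A = np.array([[1.0, 0.0,  -6.0,  0.0],   # carbon
              [2.0, 1.0,  -6.0, -2.0],   # oxygen
              [0.0, 2.0, -12.0,  0.0]])  # hydrogen

# 3 equations, 4 unknowns: the solution is the (one-dimensional) null space of A.
_, _, Vt = np.linalg.svd(A)
v = Vt[-1]                           # direction with A v = 0
v = v / v[np.argmin(np.abs(v))]      # divide by the smallest entry (the slide's recipe)
print(np.round(v))                   # [6. 6. 1. 6.] -> 6 CO2 + 6 H2O -> C6H12O6 + 6 O2
```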

Equality of the atoms involved, including information on the valences of the elements. We have 16 unknowns but, without experimental information, only 11 equations; such a system is underdetermined. A system with n unknowns needs at least n independent and non-contradictory equations for a unique solution. If the nᵢ and the aᵢ are both unknowns we have a non-linear situation: we either determine the nᵢ or the aᵢ, or mix the variables in such a way that no products of unknowns occur.

The matrix is singular because a₁, a₇, and a₁₀ do not contain new information. Matrix algebra helps to determine what information is needed for an unequivocal solution. From the knowledge of the salts we get n₁ to n₅.

We have six variables and six equations that are not contradictory and contain different information. The matrix is therefore not singular.

Linear models in biology: the logistic model of population growth (plot of N against t). K denotes the maximum possible density under resource limitation, the carrying capacity; r denotes the intrinsic population growth rate. If r > 1 the population grows, at r < 1 it shrinks. To fit the model we need four measurements.

(Plots of population size N against time t.) We have an overshoot: in the next time step the population should decrease below the carrying capacity K. Population growth is fastest at N = K/2.
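A short simulation of the overshoot, assuming one common discrete form of the logistic model (the slide's exact equation is not in the transcript): N(t+1) = N(t) + r·N(t)·(1 − N(t)/K).

```python
# Discrete logistic growth (assumed form): N <- N + r*N*(1 - N/K)
r, K = 1.9, 100.0        # made-up parameter values; a large r produces an overshoot
N = 10.0
for t in range(12):
    N = N + r * N * (1 - N / K)
    print(t + 1, round(N, 1))
# The trajectory overshoots K and then oscillates back towards it;
# the per-step increment r*N*(1 - N/K) is largest at N = K/2.
```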

The transition matrix. Assume a gene with four different alleles; each allele can mutate into another allele, and the mutation probabilities can be measured. The transition (probability) matrix collects the probabilities A→A, B→A, C→A, D→A and so on; each column A→A, A→B, A→C, A→D sums to Σ = 1. Given the initial allele frequencies, what are the frequencies in the next generation? The frequencies at time t+1 depend only on the frequencies at time t and not on earlier ones: a Markov process.
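A one-generation Markov step in NumPy; the transition probabilities below are hypothetical stand-ins for the measured values mentioned on the slide.

```python
import numpy as np

# Column-stochastic transition matrix: column j gives the fate of allele j,
# so each column (A->A, A->B, A->C, A->D, ...) sums to 1.
P = np.array([[0.90, 0.02, 0.03, 0.01],
              [0.04, 0.95, 0.02, 0.02],
              [0.03, 0.02, 0.94, 0.02],
              [0.03, 0.01, 0.01, 0.95]])

p0 = np.array([0.25, 0.25, 0.25, 0.25])   # initial allele frequencies of A, B, C, D
p1 = P @ p0                               # frequencies in the next generation
print(p1, p1.sum())                       # the new frequencies still sum to 1
```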

Does the mutation process result in stable allele frequencies? The stable state vector is an eigenvector of A: Au = λu, i.e. (A − λI)u = 0, with λ the eigenvalue, I the unit matrix, and u the eigenvector. The largest eigenvalue defines the stable state vector; every probability matrix has at least one eigenvalue equal to 1.
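Continuing the hypothetical transition matrix from the previous sketch, the stable allele frequencies can be read off as the eigenvector belonging to the eigenvalue 1:

```python
import numpy as np

P = np.array([[0.90, 0.02, 0.03, 0.01],   # same hypothetical transition matrix as above
              [0.04, 0.95, 0.02, 0.02],
              [0.03, 0.02, 0.94, 0.02],
              [0.03, 0.01, 0.01, 0.95]])

eigvals, eigvecs = np.linalg.eig(P)
i = np.argmin(np.abs(eigvals - 1.0))  # a probability matrix has an eigenvalue of 1
u = np.real(eigvecs[:, i])
u = u / u.sum()                       # rescale so the frequencies sum to 1
print(u)                              # the stable state vector
print(np.allclose(P @ u, u))          # True: applying P no longer changes the frequencies
```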

The insulin–glycogen system. At high blood glucose levels insulin stimulates glycogen synthesis and inhibits glycogen breakdown. The change in glycogen concentration ΔN can be modelled as the sum of a constant production g and a concentration-dependent breakdown −fN, i.e. ΔN = g − fN. At equilibrium we have ΔN = 0. The vector {−f, g} is the stationary state vector (the largest eigenvector) of the dispersion matrix and gives the equilibrium conditions (stationary point); the value −1 is the eigenvalue of this system. The symmetric square matrix D that contains the squared values is called the dispersion matrix. The glycogen concentration at equilibrium is N = g/f: the equilibrium concentration does not depend on the initial concentration.
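A tiny numeric check of the last statement, using made-up values for g and f: iterating the update N ← N + g − f·N from different starting concentrations always ends at the same equilibrium g/f.

```python
# Glycogen balance: constant production g, concentration-dependent breakdown f*N
g, f = 5.0, 0.25                  # made-up parameter values
for N0 in (0.0, 10.0, 100.0):     # three very different initial concentrations
    N = N0
    for _ in range(200):
        N = N + g - f * N         # change per step: g - f*N (zero at equilibrium)
    print(N0, "->", round(N, 3))  # all three end near g/f = 20.0
```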

Eigenvalues and eigenvectors. How can vector A be transformed into vector B? Multiplying a vector by a square matrix defines a new vector that points in a different direction: the matrix defines a transformation in space (image transformation), and X contains all the information necessary to transform the image. The vectors that do not change direction during the transformation are the eigenvectors. In general we define XU = λU, where U is an eigenvector and λ an eigenvalue of the square matrix X.
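A minimal NumPy illustration: the eigenvectors of a (made-up) transformation matrix X are exactly the directions that are only stretched, not rotated.

```python
import numpy as np

X = np.array([[2.0, 1.0],
              [1.0, 2.0]])                     # illustrative transformation matrix

eigvals, eigvecs = np.linalg.eig(X)
print(eigvals)                                  # eigenvalues 3 and 1 (order may vary)

u, lam = eigvecs[:, 0], eigvals[0]
print(np.allclose(X @ u, lam * u))              # True: X u = lambda u, direction unchanged
```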

A square matrix with n columns has n eigenvalues and n eigenvectors.

Some properties of eigenvectors. If Λ is the diagonal matrix of eigenvalues, then det A = det Λ: the product of all eigenvalues equals the determinant of the matrix. The determinant is zero if at least one of the eigenvalues is zero; in this case the matrix is singular. The eigenvectors of symmetric matrices are orthogonal. Eigenvectors do not change when a matrix is multiplied by a scalar k; the eigenvalues are then also multiplied by k. If A is triangular or diagonal, the eigenvalues of A are the diagonal entries of A.
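These properties can be verified numerically for a small symmetric example matrix (chosen only for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                          # symmetric example matrix
lam, U = np.linalg.eig(A)

print(np.isclose(np.prod(lam), np.linalg.det(A)))   # product of eigenvalues = det(A)
print(np.allclose(U.T @ U, np.eye(2)))              # eigenvectors of a symmetric matrix are orthogonal

k = 5.0                                             # scaling the matrix by k ...
lam_k, _ = np.linalg.eig(k * A)
print(np.allclose(np.sort(lam_k), np.sort(k * lam)))  # ... multiplies the eigenvalues by k
```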

Page Rank. Google sorts internet pages according to a ranking of websites based on the probability of being directed to that page. Assume a surfer jumps with probability d to a certain website A; with N sites in the world (30 to 50 billion), the probability of reaching A in this way is d/N. Assume further that we have four sites A, B, C, and D with links to A, that these sites have c_A, c_B, c_C, and c_D outgoing links, and that k_A, k_B, k_C, and k_D of those links point to A. If the probabilities of being on one of these sites are p_A, p_B, p_C, and p_D, the probability of reaching A from these sites is the sum of the terms pᵢ·kᵢ/cᵢ.

The total probability of reaching A is therefore p_A = d/N + (1 − d)·Σᵢ pᵢ·kᵢ/cᵢ. In reality this is a linear system with billions of equations! Google uses a fixed value of d = 0.15, so all that is needed is the number of links per website. In matrix form, with the probability matrix P and the rank vector u, internet pages are ranked according to their probability of being reached.
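A power-iteration sketch of this formula in NumPy. The link structure below is an assumption (the slide's figure is not in the transcript), and d = 0.15 is used as the random-jump probability as stated above.

```python
import numpy as np

d, n = 0.15, 4                       # d: probability of jumping to a random page
links = {0: [1, 2, 3],               # assumed link structure: A -> B, C, D
         1: [0, 2],                  #                         B -> A, C
         2: [0],                     #                         C -> A
         3: [0, 2]}                  #                         D -> A, C

# Column-stochastic probability matrix: P[j, i] = 1/c_i if page i links to page j
P = np.zeros((n, n))
for i, targets in links.items():
    for j in targets:
        P[j, i] = 1.0 / len(targets)

u = np.full(n, 1.0 / n)              # rank vector, start from equal probabilities
for _ in range(100):
    u = d / n + (1 - d) * (P @ u)    # the total-probability formula, applied to every page
print(u)                             # pages are ranked by their probability of being reached
```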

(Figure: link structure of the four sites A, B, C, and D. Photos: Larry Page (born 1973) and Sergey Brin (born 1973).)

Page Rank as an eigenvector problem: in reality the constant term d/N is very small, and the final page rank is given by the stationary state vector (the eigenvector of the largest eigenvalue).