12.215 Modern Navigation
Thomas Herring


Review of last class
Sextant measurements using the sun:
– We tracked the sun to find its highest elevation and the time at which this occurred. Our one cheat was using computer NTP to get time (we will use GPS to check the results).

Today's Class
Review of linear algebra. The class will be based on the book "Linear Algebra, Geodesy, and GPS", G. Strang and K. Borre, Wellesley-Cambridge Press, Wellesley, MA, 624 pp., 1997. The topics covered are those used later in the course. General areas are:
– Vectors and matrices
– Solving linear equations
– Vector spaces
– Eigenvectors and eigenvalues
– Rotation matrices

Basic vectors and matrices
Important basic concepts. Vectors: a column representing a set of n quantities.
– In two and three dimensions these can be visualized as arrows between points with different coordinates, with the vector itself usually having one end at the origin.
– The same concept can be applied to any n-dimensional vector.
– Vectors can be added and subtracted (head-to-tail) by adding and subtracting the individual components of the vectors.
– Linear combinations of vectors can be formed by scaling and addition. The result is another vector, e.g., cv + dw.
– (Often a bold symbol is used to denote a vector, and sometimes a line is drawn over the top.)
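As a concrete illustration (a minimal Python/NumPy sketch, not part of the original slides; the vectors and scalars are invented examples):

```python
import numpy as np

# Two 3-dimensional vectors (arbitrary example values)
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 0.0, -1.0])

# Addition and subtraction act component by component (head-to-tail rule)
print(v + w)   # [5. 2. 2.]
print(v - w)   # [-3.  2.  4.]

# A linear combination cv + dw is scaling followed by addition
c, d = 2.0, -1.0
print(c * v + d * w)   # [-2.  4.  7.]
```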

Lengths and dot products
The dot product or inner product of two vectors is defined as v·w = v1w1 + v2w2 + … + vnwn. The order of the dot product makes no difference. The length or norm of a vector is the square root of the dot product of the vector with itself: |v| = sqrt(v·v). A unit vector is one with unit length. If the dot product of two vectors is zero, they are said to be orthogonal (orthonormal if they are also unit vectors). The components of a 2-D unit vector are the cosine and sine of the angle the vector makes with the x-axis.
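A short NumPy sketch of these definitions (the example vectors are invented; their zero dot product makes them orthogonal):

```python
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([4.0, -3.0])

# The order of the dot product makes no difference
print(np.dot(v, w), np.dot(w, v))   # 0.0 0.0 -> v and w are orthogonal

# Length (norm) is the square root of the dot product with itself
print(np.sqrt(np.dot(v, v)), np.linalg.norm(v))   # 5.0 5.0

# Unit vector: components are cos and sin of the angle to the x-axis
u = v / np.linalg.norm(v)
print(u)   # [0.6 0.8]
```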

Angles between vectors
The cosine formula: cos θ = v·w / (|v| |w|). The Schwarz inequality: |v·w| ≤ |v| |w|, i.e., the dot product of any two vectors is less than or equal to the product of the lengths of the two vectors. A plane is represented by the vector that is normal to it. For a plane through the origin, all vectors in that plane must have zero dot product with the normal. This provides an equation for the plane.
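The same relations in a small NumPy sketch (example vectors invented):

```python
import numpy as np

v = np.array([1.0, 0.0, 0.0])
w = np.array([1.0, 1.0, 0.0])

# Cosine formula: cos(theta) = v.w / (|v| |w|)
cos_theta = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
print(np.degrees(np.arccos(cos_theta)))   # 45.0

# Schwarz inequality: |v.w| <= |v| |w|
assert abs(np.dot(v, w)) <= np.linalg.norm(v) * np.linalg.norm(w)
```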

Planes
If a plane does not contain the origin, it is specified by its normal direction together with a constant: the dot product of the normal and any point in the plane is that constant. If the normal to the plane is expressed as a unit vector, the constant is the closest distance of the plane to the origin. In n-dimensional space the concept of a plane is the same: it is defined by n·v = d. Any two non-collinear vectors define a plane.
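A small sketch of the n·v = d representation (the plane z = 2 is an invented example):

```python
import numpy as np

n = np.array([0.0, 0.0, 1.0])   # unit normal (here the z-axis)
d = 2.0                         # the plane z = 2

# Any point v in the plane satisfies n.v = d
v = np.array([5.0, -3.0, 2.0])
print(np.isclose(np.dot(n, v), d))   # True

# With a unit normal, n.p - d is the signed distance of p from the plane,
# and d itself is the closest distance of the plane to the origin
p = np.array([1.0, 1.0, 7.0])
print(np.dot(n, p) - d)   # 5.0
```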

Matrices and linear equations
Any set of linear equations (i.e., equations that do not contain powers or products of the unknowns) can be written in matrix form, with the coefficients of the linear equations being the elements of the matrix. The rows and columns of matrices are themselves vectors. A matrix represents a linear combination of the elements of a vector to form another vector of possibly different length: b = Ax, where each b_i is the sum over j of A_ij x_j.

Solving linear equations
If the x and b vectors are the same length, then given A and b it is often possible to find x (sometimes this is not possible, and sometimes x may not be unique). There are many methods for solving this type of system, but the earliest ones are by elimination, i.e., linear combinations are formed of the rows of the matrix A that eliminate one of the elements of x. The process is repeated until only one element of x remains (which is easily solved). Back substitution then allows all the components of x to be computed. This process is sometimes viewed as multiplying by elimination matrices.
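In practice the elimination and back substitution are done by a library routine; a minimal sketch with an invented 2x2 system:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# numpy.linalg.solve performs the elimination and back substitution
x = np.linalg.solve(A, b)
print(x)                       # [0.8 1.4]
print(np.allclose(A @ x, b))   # True
```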

Rules of matrix multiplication
The product of two matrices A (n rows and m columns) by B (r rows and c columns) is only possible if m = r. The resultant matrix has n rows and c columns. In general, AB does not equal BA, even when the matrices are square. Matrix multiplication forms the dot products of the rows of the first matrix with the columns of the second matrix. Matrix multiplication is associative, (AB)C = A(BC), but not commutative. A matrix A is invertible if a matrix A^-1 exists such that A^-1 A = I, where I is the identity (unit) matrix.
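These rules are easy to check numerically (matrices invented for the example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
C = np.array([[2.0, 0.0],
              [0.0, 2.0]])

# Not commutative: AB != BA even though both are square
print(A @ B)   # [[2. 1.] [4. 3.]]
print(B @ A)   # [[3. 4.] [1. 2.]]

# Associative: (AB)C = A(BC)
print(np.allclose((A @ B) @ C, A @ (B @ C)))   # True

# Invertible: A^-1 A = I
print(np.allclose(np.linalg.inv(A) @ A, np.eye(2)))   # True
```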

Factorization
In factorization a matrix A is written as A = LU, where L is a lower triangular matrix and U is an upper triangular matrix. The individual matrices L and U are not unique (L can be multiplied by a scalar and U divided by the same scalar without changing the product). Convention has the diagonal of L being 1's. Why factorize? Since the factors are triangular, one can substitute down the matrix (L) and up the matrix (U).

Characteristics of LU
When the rows of A start with zero, so do the corresponding rows of L; when the columns of A start with zero, so do the columns of U. Many estimation problems are "band-limited", i.e., only a small number of the elements around the diagonal are non-zero; the L and U matrices will also be band-limited, but the inverse of such a matrix is normally full. (Factorization saves time and space.) The slide linked to an SLU Matlab code (the same site also has code that pivots the matrix, which is a more stable approach).
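The slide points to Matlab code; an equivalent sketch in Python using SciPy (which pivots the rows, the more stable approach mentioned above; the matrix is an invented example):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

# SciPy factors A = P L U with a row-permutation matrix P (pivoting)
P, L, U = lu(A)
print(L)   # lower triangular with 1's on the diagonal
print(U)   # upper triangular
print(np.allclose(P @ L @ U, A))   # True
```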

Transpose
The transpose of a matrix is the matrix with rows and columns switched, usually denoted as A^T or sometimes A'. Some rules: (AB)^T = B^T A^T. A symmetric matrix is one for which A = A^T. The products A^T A and A A^T generate symmetric matrices. We will see these forms many times when we cover estimation and statistics.
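A quick numerical check of these rules (matrices invented):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [2.0, 5.0]])

# (AB)^T = B^T A^T
print(np.allclose((A @ B).T, B.T @ A.T))   # True

# A^T A is symmetric
S = A.T @ A
print(np.allclose(S, S.T))   # True
```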

Matrix rank
The rank of a matrix is the number of non-zero pivots in the matrix. Pivots are the numbers you divide by when doing Gaussian elimination or when solving the LU system. The rank is the number of independent rows in a matrix (i.e., rows that are not linear combinations of other rows). If the rank of a square matrix is less than its dimension, then the matrix is singular, and there are two cases:
– An infinite number of solutions
– No solutions
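A sketch of rank and singularity (matrices invented):

```python
import numpy as np

# The second row is twice the first: only one independent row
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(A))   # 1 -> rank < dimension, so A is singular

# A full-rank matrix for comparison
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])
print(np.linalg.matrix_rank(B))   # 2
```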

Vector spaces
An n×n matrix can be seen as defining a space made up of the n column vectors of the matrix. Subspaces are vector spaces that satisfy two requirements: if v and w are vectors in a subspace and c is a scalar, then v + w is in the subspace and cv is in the subspace (e.g., a plane is a subspace of 3-D space). For vectors that lie in a plane, all additions and scalings of these vectors lie in that same plane. Non-singular matrices allow transformation back and forth between two spaces.

The null space
An important case (especially in estimation) is the null space. This is the set of non-trivial vectors that satisfy the equation Ax = 0. Any vector x that satisfies this equation is collapsed to zero by the matrix, meaning that you cannot reverse the transformation. In estimation, vectors that are in the null space cannot be estimated, and an arbitrary component of them can be added to any solution. Generally, null spaces exist when the columns of a matrix are linearly dependent.
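A sketch using SciPy's null_space helper (the matrix is invented, with its third column the sum of the first two):

```python
import numpy as np
from scipy.linalg import null_space

# Columns are linearly dependent: col3 = col1 + col2
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

N = null_space(A)   # orthonormal basis for the null space
print(N.shape)                 # (3, 1): one null-space direction
print(np.allclose(A @ N, 0))   # True: these vectors are mapped to zero
```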

Eigenvectors and eigenvalues
This section applies to square matrices. Normally when a matrix multiplies a vector, the vector changes direction (i.e., the dot product is not just the product of the lengths of the vectors). There are specific vectors that, when multiplied by a specific matrix, do not change direction. These are called eigenvectors and satisfy the equation Ax = λx, where the scalar λ is the eigenvalue. By convention the eigenvectors are unit vectors. Normally the number of eigenvectors matches the dimension of the matrix. For symmetric matrices the eigenvectors are all orthogonal.

Equation for eigenvalues
To solve the eigenvalue problem we find the values of λ that make the determinant of (A − λI) zero: det(A − λI) = 0. Once a matrix is factored, the determinant is the product of the diagonal of the U matrix. The determinant generates an nth-order polynomial in λ, where n is the dimension of the matrix. The product of the eigenvalues is the determinant of the matrix. The sum of the eigenvalues is the trace of the matrix (i.e., the sum of the diagonal elements).
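A numerical sketch with an invented symmetric matrix (so the eigenvectors come out orthogonal):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Solves det(A - lambda I) = 0 numerically
vals, vecs = np.linalg.eig(A)
print(vals)   # [3. 1.] (ordering may vary)

# Product of eigenvalues = determinant; sum = trace
print(np.allclose(np.prod(vals), np.linalg.det(A)))   # True
print(np.allclose(np.sum(vals), np.trace(A)))         # True

# Each column of vecs is a unit eigenvector: A x = lambda x
x = vecs[:, 0]
print(np.allclose(A @ x, vals[0] * x))   # True
```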

Diagonalization of a matrix
If the eigenvectors are put in a column matrix S, then AS = SΛ, so S^-1 A S = Λ, where Λ is the diagonal matrix of eigenvalues. Since the eigenvectors of a symmetric matrix are orthogonal, S^-1 = S^T (most commonly in estimation problems we encounter symmetric matrices). Symmetric matrices with all positive eigenvalues are said to be positive definite.
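For symmetric matrices, NumPy's eigh returns orthonormal eigenvectors directly, so S^-1 = S^T can be checked (matrix invented):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, S = np.linalg.eigh(A)   # eigh is for symmetric matrices

print(np.allclose(S.T @ S, np.eye(2)))           # True: S^-1 = S^T
print(np.allclose(S.T @ A @ S, np.diag(vals)))   # True: A is diagonalized
print(np.all(vals > 0))   # True here -> A is positive definite
```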

Other matrices that we will encounter
Rotation matrices: these are matrices that rotate a vector but do not change its size. They can be composed of three rotations about the x, y, and z axes. The forms below are for a rigid-body rotation; the signs are flipped for coordinate-system rotations. The z-rotation follows the same pattern, with the signs in the positions they occupy for the x-rotation:

Rx(θ) = | 1    0       0    |
        | 0  cos θ  −sin θ  |
        | 0  sin θ   cos θ  |

Ry(θ) = |  cos θ  0  sin θ  |
        |   0     1    0    |
        | −sin θ  0  cos θ  |

Rz(θ) = | cos θ  −sin θ  0  |
        | sin θ   cos θ  0  |
        |   0       0    1  |

Rotation matrices
The form shown is for rotating a body; rotations of the coordinate system itself are of the same form except that the signs change places. Note: when coordinate systems are rotated, after the first rotation the axes change location, and hence the angle of the rotation about the new axes will change. In general, the order in which the rotations are applied will affect the result. If the direction cosines of a set of axes are known in the other frame, the rotation matrix can be written directly, with the columns being the axes of the X system expressed in the X' system.
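A sketch of the rigid-body rotation matrices above, showing that rotations preserve length and that their order matters (angles invented):

```python
import numpy as np

def rot_x(theta):
    """Rigid-body rotation about the x-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def rot_z(theta):
    """Rigid-body rotation about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c,  -s, 0.0],
                     [  s,   c, 0.0],
                     [0.0, 0.0, 1.0]])

v = np.array([1.0, 2.0, 3.0])

# Rotations do not change a vector's size
R = rot_z(np.radians(30)) @ rot_x(np.radians(45))
print(np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v)))   # True

# The order in which the rotations are applied affects the result
R2 = rot_x(np.radians(45)) @ rot_z(np.radians(30))
print(np.allclose(R, R2))   # False
```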

Direction cosines relationship
Given two coordinate systems X and X', to find in the X' system the coordinates of a vector given in the X system, a rotation matrix is constructed (it could be made up of multiplications by the three basic rotation matrices). In general the rotation matrix looks like

    | x'·x  x'·y  x'·z |
R = | y'·x  y'·y  y'·z |
    | z'·x  z'·y  z'·z |

where each element is the dot product (direction cosine) between a unit axis vector of X' and a unit axis vector of X.
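A sketch of building R from direction cosines, with rows being the X' axes expressed in X (one common convention, consistent with the matrix above; the 30-degree rotation is an invented example):

```python
import numpy as np

# Unit vectors of the X' axes expressed in the X frame
# (here X' is X rotated by 30 degrees about the z-axis)
t = np.radians(30)
xp = np.array([ np.cos(t), np.sin(t), 0.0])
yp = np.array([-np.sin(t), np.cos(t), 0.0])
zp = np.array([0.0, 0.0, 1.0])

# Rows of R are the direction cosines of the X' axes in X,
# so R maps X-frame coordinates into X'-frame coordinates
R = np.vstack([xp, yp, zp])

v_X = np.array([1.0, 0.0, 0.0])   # the x-axis of the X frame
print(R @ v_X)   # its coordinates in X': [cos 30, -sin 30, 0]
```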

Small angle rotations
If the rotations are small, then cos θ ≈ 1 and sin θ ≈ θ. In this case the rotation matrix reduces to

R ≈ |  1   −θz   θy |
    |  θz   1   −θx |
    | −θy   θx   1  |

Rotation-rate matrices are of this form, but after the rotation has persisted, the axes change direction by a large amount and the equations need to be integrated. (An example would be the effects of gravitational torques on the equatorial bulge.)
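A check that the first-order form approaches the exact rotation for small angles (1 mrad used as an invented test value):

```python
import numpy as np

def small_rotation(tx, ty, tz):
    """First-order rotation matrix: cos(t) ~ 1, sin(t) ~ t."""
    return np.array([[1.0, -tz,  ty],
                     [ tz, 1.0, -tx],
                     [-ty,  tx, 1.0]])

t = 1e-3   # 1 milliradian
c, s = np.cos(t), np.sin(t)
R_exact = np.array([[  c,  -s, 0.0],
                    [  s,   c, 0.0],
                    [0.0, 0.0, 1.0]])

# Largest element-wise error is of order t^2/2 ~ 5e-7
print(np.max(np.abs(small_rotation(0.0, 0.0, t) - R_exact)))
```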

Summary
General review of linear algebra:
– Vectors and matrices
– Solving linear equations
– Vector spaces
– Eigenvectors and eigenvalues
– Rotation matrices
These concepts will be used in later classes when we cover satellite motions and estimation.