Sect. 4.2: Orthogonal Transformations


Sect. 4.2: Orthogonal Transformations
For convenience, change notation: x → x1, y → x2, z → x3; x′ → x1′, y′ → x2′, z′ → x3′. Also: aij ≡ cosθij. In the new notation, the transformation equations between the primed & unprimed coords become:
x1′ = a11x1 + a12x2 + a13x3
x2′ = a21x1 + a22x2 + a23x3
x3′ = a31x1 + a32x2 + a33x3
Or: xi′ = ∑j aij xj (i,j = 1,2,3) (1)
(1) = An example of what mathematicians call a Linear (or Vector) Transformation.
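The linear transformation (1) can be sketched numerically. This is a minimal NumPy illustration, assuming a hypothetical rotation by 30° about the x3 axis as the concrete set of direction cosines aij:

```python
import numpy as np

# Hypothetical choice of a_ij: rotation by 30 deg about the x3 axis.
phi = np.radians(30.0)
A = np.array([[ np.cos(phi), np.sin(phi), 0.0],
              [-np.sin(phi), np.cos(phi), 0.0],
              [ 0.0,         0.0,         1.0]])

x = np.array([1.0, 0.0, 0.0])   # unprimed components (x1, x2, x3)

# x'_i = sum_j a_ij x_j  -- the linear transformation (1)
x_prime = A @ x
```

The `@` operator carries out exactly the sum over j for each i.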

Einstein Summation Convention
For convenience, another change of notation: if an index is repeated, summation over it is implied:
xi′ = ∑j aij xj (i,j = 1,2,3) → xi′ = aij xj (i,j = 1,2,3) ≡ Einstein summation convention
To avoid possible ambiguity when powers of an indexed quantity occur: ∑i(xi)² ≡ xixi
For the rest of the course, the summation convention is automatically assumed, unless stated otherwise.
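The summation convention maps directly onto NumPy's `einsum`, whose subscript strings are essentially index notation. A small sketch with an arbitrary toy matrix (my choice, not from the slides):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])       # a toy a_ij that swaps the two axes
x = np.array([2.0, 5.0])

# "x'_i = a_ij x_j": the repeated index j is summed over.
x_prime = np.einsum('ij,j->i', A, x)

# "sum_i (x_i)^2 = x_i x_i": the repeated index i is summed over.
mag_sq = np.einsum('i,i->', x, x)
```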

Linear Transformation: xi = aij xj (i,j = 1,2,3) (1) With aij  cosθij as derived, (1) is only a special case of a general linear transformation, since, as already discussed, the direction cosines cosθij are not all independent. Re-derive connections between them, use new notation. Both coord systems are Cartesian:  Square of magnitude of vector = sum of squares of components. Magnitude is invariant on transformation of coords:  xixi = xixi Using (1), this becomes: aijaikxjxk = xixi (i,j,k = 1,2,3)

aijaikxjxk = xixi (i,j,k = 1,2,3)
This can be valid if & only if: aijaik = δjk (j,k = 1,2,3)
Identical to the previous results for the orthogonality of the direction cosines.
A linear transformation xi′ = aij xj (i,j = 1,2,3) (1) satisfying
aijaik = δjk ≡ Orthogonality Condition
is an Orthogonal Transformation.
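In matrix language the orthogonality condition aijaik = δjk is (AᵀA)jk = δjk, i.e. AᵀA = 1, which in turn guarantees the invariance of the magnitude. A quick numerical check, again assuming a hypothetical rotation about x3 as the example matrix:

```python
import numpy as np

phi = np.radians(40.0)           # arbitrary angle (assumption for the demo)
A = np.array([[ np.cos(phi), np.sin(phi), 0.0],
              [-np.sin(phi), np.cos(phi), 0.0],
              [ 0.0,         0.0,         1.0]])

# a_ij a_ik = delta_jk  is exactly  (A^T A)_jk = (identity)_jk
orthogonality = A.T @ A

# Consequence: x'_i x'_i = x_i x_i for any vector x.
x = np.array([3.0, -1.0, 2.0])
x_prime = A @ x
```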

Linear (or Vector) Transformation: xi′ = aijxj (i,j = 1,2,3) (1)
Can arrange the direction cosines into a square matrix:
     a11 a12 a13
A ≡  a21 a22 a23
     a31 a32 a33
Consider the coordinates as components of column vectors:
     x1          x1′
r =  x2    r′ =  x2′
     x3          x3′
⇒ The coordinate transformation relation can be written: r′ = Ar
with A ≡ Transformation matrix or rotation matrix (or tensor)

Example: 2d Coordinate Rotation
Application to a 2d rotation through angle ϕ. See figure. Easy to show that:
x3′ = x3
x1′ = x1cosϕ + x2sinϕ = x1cosϕ + x2cos(ϕ − π/2)
x2′ = −x1sinϕ + x2cosϕ = x1cos(ϕ + π/2) + x2cosϕ

2d rotation. See fig: aij ≡ cosθij
a33 = cosθ33 = 1
a11 = cosθ11 = cosϕ
a22 = cosθ22 = cosϕ
a12 = cosθ12 = cos(ϕ − π/2) = sinϕ
a21 = cosθ21 = cos(ϕ + π/2) = −sinϕ
a31 = cosθ31 = cos(π/2) = 0, a32 = cosθ32 = cos(π/2) = 0
⇒ The transformation matrix has the form:
     a11 a12 0     cosϕ  sinϕ  0
A =  a21 a22 0  = −sinϕ  cosϕ  0
      0   0  1      0     0    1
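The same matrix can be assembled directly from the direction-cosine entries, which makes the identities cos(ϕ − π/2) = sinϕ and cos(ϕ + π/2) = −sinϕ explicit. A small sketch (the helper name `rotation_matrix_2d` is my own):

```python
import numpy as np

def rotation_matrix_2d(phi):
    """Build A entry-by-entry from a_ij = cos(theta_ij), as on the slide."""
    return np.array([[np.cos(phi),             np.cos(phi - np.pi / 2), 0.0],
                     [np.cos(phi + np.pi / 2), np.cos(phi),             0.0],
                     [0.0,                     0.0,                     1.0]])
```

Evaluating the cosines reproduces the familiar [[cosϕ, sinϕ, 0], [−sinϕ, cosϕ, 0], [0, 0, 1]] form.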

2d rotation. See fig: aij ≡ cosθij
Orthogonality condition: aijaik = δjk ⇒
a11a11 + a21a21 = 1, a12a12 + a22a22 = 1, a11a12 + a21a22 = 0
Use the expressions for the aij & get:
cos²ϕ + sin²ϕ = 1, sin²ϕ + cos²ϕ = 1, cosϕsinϕ − sinϕcosϕ = 0
⇒ Only one angle is needed to specify a 2d rotation.

Transformation matrix A ≡ a math operator that, acting on the unprimed system, transforms it to the primed system. Symbolically: r′ = Ar (1)
⇒ Matrix A, acting on the components of r in the unprimed system, yields the components of r in the primed system.
Assumption: The vector r itself is unchanged (in length & direction) by the operation with A: (r² = (r′)²)
NOTE: The same formal mathematics results from another interpretation of (1): A acts on r & changes it into a new vector r′. The components of the 2 vectors are related by (1). Which interpretation is meant depends on the context of the problem. Usually, for rigid body motion, we use the 1st interpretation. For a general transformation (1), the nature of A depends on which interpretation is used.
A acting on the coords: Passive transformation. A acting on the vector: Active transformation.

Example from Marion
In the unprimed system, point P is represented as (x1, x2, x3) = (2, 1, 3). In the primed system, the x2′ axis has been rotated from x2 toward x3 by a 30º angle, as in the figure. Find the rotation matrix A & the representation (x1′, x2′, x3′) of P in the primed system.

From the figure, using aij ≡ cosθij:
a11 = cosθ11 = cos(0º) = 1
a12 = cosθ12 = cos(90º) = 0
a13 = cosθ13 = cos(90º) = 0
a21 = cosθ21 = cos(90º) = 0
a22 = cosθ22 = cos(30º) = 0.866
a23 = cosθ23 = cos(90º − 30º) = cos(60º) = 0.5
a31 = cosθ31 = cos(90º) = 0
a32 = cosθ32 = cos(90º + 30º) = −0.5
a33 = cosθ33 = cos(30º) = 0.866
       1   0      0
⇒ A =  0   0.866  0.5
       0  −0.5    0.866

To find the new representation of P, apply r′ = Ar, or:
x1′ = a11x1 + a12x2 + a13x3
x2′ = a21x1 + a22x2 + a23x3
x3′ = a31x1 + a32x2 + a33x3
Using (x1, x2, x3) = (2, 1, 3):
⇒ x1′ = x1 = 2
x2′ = 0.866x2 + 0.5x3 = 2.37
x3′ = −0.5x2 + 0.866x3 = 2.10
⇒ (x1′, x2′, x3′) = (2, 2.37, 2.10)
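The arithmetic of this example is easy to verify numerically. A minimal sketch using the rotation matrix found above:

```python
import numpy as np

# Rotation matrix from the Marion example: x2' rotated from x2 toward x3 by 30 deg.
A = np.array([[1.0,  0.0,   0.0],
              [0.0,  0.866, 0.5],
              [0.0, -0.5,   0.866]])

P = np.array([2.0, 1.0, 3.0])   # (x1, x2, x3) in the unprimed system
P_prime = A @ P                  # r' = A r
```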

Useful Relations
Consider a general line segment, as in the figure. Let α, β, γ be the angles between the segment & the x1, x2, x3 axes.
Direction cosines of the line ≡ cosα, cosβ, cosγ
Manipulation, using the orthogonality relations from before, gives:
⇒ cos²α + cos²β + cos²γ = 1

Consider 2 line segments with direction cosines cosα, cosβ, cosγ & cosα′, cosβ′, cosγ′, as in the figure. For the angle θ between the segments, manipulation (trig) gives:
⇒ cosθ = cosαcosα′ + cosβcosβ′ + cosγcosγ′
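Both relations follow from the fact that the direction cosines of a segment are just the components of its unit vector. A short sketch (the helper `angle_between` is my own name):

```python
import numpy as np

def angle_between(u, v):
    """Angle from cos(theta) = cos(a)cos(a') + cos(b)cos(b') + cos(g)cos(g'),
    where the direction cosines are the components of the unit vectors."""
    cu = u / np.linalg.norm(u)   # (cos alpha,  cos beta,  cos gamma)
    cv = v / np.linalg.norm(v)   # (cos alpha', cos beta', cos gamma')
    return np.arccos(np.clip(cu @ cv, -1.0, 1.0))
```

For any segment, the squares of its direction cosines sum to 1, since the unit vector has unit length.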

Sect. 4.3: Formal (Math) Properties of the Transformation Matrix
For a while, (almost) pure math! Consider 2 successive orthogonal transformations B and A, acting on the unprimed coordinates:
r′ = Br followed by r″ = Ar′ = ABr
In component form, application of B followed by A gives (summation convention assumed, of course!):
xk′ = bkjxj , xi″ = aikxk′ = aikbkjxj (1) (i,j,k = 1,2,3)
Rewrite (1) as: xi″ = cijxj (2)
(2) has the form of an orthogonal transformation C ≡ AB, with the elements of the square matrix C given by cij ≡ aikbkj

Products
⇒ The product of 2 orthogonal transformations B (matrix elements bkj) & A (matrix elements aik) is another orthogonal transformation C = AB (matrix elements cij ≡ aikbkj). Proof that C is also orthogonal: see Prob. 1, p 180.
Can show (student exercise!): The product of orthogonal transformations is not commutative: BA ≠ AB
Define: D ≡ BA (matrix elements dij ≡ bikakj). Find, in general: dij ≠ cij.
⇒ The final coordinates depend on the order of application of A & B.
Can also show (student exercise!): Products of such transformations are associative: (AB)C = A(BC)
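Non-commutativity is easy to see with two concrete rotations about different axes. A sketch, assuming hypothetical 90° rotations about x3 and x1 (my choice of angles):

```python
import numpy as np

def rot_x3(phi):
    """Rotation of the coordinate axes by phi about x3."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x1(phi):
    """Rotation of the coordinate axes by phi about x1."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

A = rot_x3(np.radians(90))
B = rot_x1(np.radians(90))

C = A @ B   # C = AB: apply B first, then A
D = B @ A   # D = BA: apply A first, then B
```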

Note: The text now begins to use the vectors r & x interchangeably!
⇒ r′ = Ar ⇒ x′ = Ax can be represented in terms of matrices, with the coordinate vectors as column vectors:
x′ = Ax ⇒ xi′ = aijxj, or:
x1′     a11 a12 a13   x1
x2′  =  a21 a22 a23   x2
x3′     a31 a32 a33   x3
Addition of 2 transformation matrices: C = A + B ⇒ matrix elements: cij = aij + bij

Inverse
Define the inverse A⁻¹ of transformation A:
x′ = Ax (1), x ≡ A⁻¹x′ (2)
In terms of matrix elements, these are:
xk′ = akixi (1′), xi ≡ aij′xj′ (2′)
where the aij′ are the matrix elements of A⁻¹.
Combining (1′) & (2′): ⇒ xk′ = akiaij′xj′
Clearly, this can hold if & only if: akiaij′ = δjk (3)
Define the Unit Matrix:
     1 0 0
1 ≡  0 1 0
     0 0 1
The akiaij′ = δjk are clearly the matrix elements of 1.

Transpose
⇒ In terms of matrices, DEFINE A⁻¹ by: AA⁻¹ ≡ A⁻¹A ≡ 1
Proof that AA⁻¹ = A⁻¹A: p 146 of text.
1 ≡ the identity transformation, because: x = 1x and A = 1A
The matrix elements of A⁻¹ & of A are related by: aij′ = aji (4)
Proof of this: p 146-147 of text.
Define: Ã ≡ Transpose of A ≡ the matrix obtained from A by interchanging rows & columns. Clearly, (4) ⇒ A⁻¹ = Ã & thus: ÃA = AÃ = 1
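The key result A⁻¹ = Ã can be confirmed numerically for any orthogonal matrix. A sketch, assuming an arbitrary hypothetical rotation angle:

```python
import numpy as np

phi = np.radians(25.0)   # arbitrary angle (assumption for the demo)
A = np.array([[ np.cos(phi), np.sin(phi), 0.0],
              [-np.sin(phi), np.cos(phi), 0.0],
              [ 0.0,         0.0,         1.0]])

A_inv = np.linalg.inv(A)   # numerical inverse
A_tilde = A.T              # the transpose Ã: just swap rows & columns
```

For a large rotation matrix this is also the practical payoff: inverting an orthogonal matrix costs nothing beyond a transpose.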

Combine aij = aji with akiaij = δj,k  akiaji = δj,k (5) A-1 = Ã  For orthogonal matrices, the reciprocal is equal to the transpose. Combine aij = aji with akiaij = δj,k  akiaji = δj,k (5) (5): A restatement of the orthogonality relns for the aki ! Dimension of rectangular matrix, m rows, n columns  m  n. A, A-1, Ã : Square matrices with m = n. Column vector (1 column matrix) x, dimension m  1. Transpose x: dimension 1  m (one row matrix). Matrix multiplication: Product AB exists only if # columns of A = # rows of B: cij = aikbkj See text about multiplication of x & its transpose with A & Ã

Define: Symmetric Matrix ≡ a square matrix that is the same as its transpose: A = Ã ⇒ aij = aji
Antisymmetric Matrix ≡ a square matrix that is the negative of its transpose: A = −Ã ⇒ aij = −aji
Obviously, the diagonal elements of an antisymmetric matrix vanish: aii = 0

2 interpretations of the orthogonal transformation Ax = x′:
1) Transforming the coords. 2) Transforming the vector x.
How does an arbitrary vector F (column matrix) transform under transformation A? Obviously, G ≡ AF (some other vector).
If, in addition, the coord system is transformed under operation B, the components of G in the new system are given by: G′ ≡ BG ≡ BAF
Rewrite (using B⁻¹B = 1) as: G′ = BG = BAB⁻¹BF
Also, the components of F in the new system are given by: F′ ≡ BF

Combining gives: G′ = BAB⁻¹F′ where: F′ ≡ BF, G′ ≡ BG
⇒ If we define the operator A′ ≡ BAB⁻¹, we have: G′ ≡ A′F′ (the same form as G = AF, but expressed in the transformed coords)
⇒ The transformation of the operator A under the coord transformation B is given by: A′ ≡ BAB⁻¹ ≡ Similarity Transformation
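The similarity-transformation chain can be checked end to end: transform F and G into the new system and confirm that A′ relates them there. A sketch, assuming hypothetical choices of A (an operator) and B (the coordinate change):

```python
import numpy as np

def rot_x3(phi):
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x1(phi):
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

A = rot_x1(np.radians(30))            # the operator (assumed example)
B = rot_x3(np.radians(45))            # the coordinate transformation
A_prime = B @ A @ np.linalg.inv(B)    # A' = B A B^-1

F = np.array([1.0, 2.0, 3.0])
G = A @ F                             # G = A F in the old coordinates
F_prime = B @ F                       # F' = B F
G_prime = B @ G                       # G' = B G
```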

Properties of the determinant formed from the elements of an orthogonal transformation matrix: det(A) ≡ |A|
Some identities (no proofs): |AB| = |A||B|
From the orthogonality relation ÃA = AÃ = 1, get: |Ã||A| = |A||Ã| = 1
The determinant is unaffected by interchange of rows & columns: |Ã| = |A|
Using this with the above gives: |A|² = 1 ⇒ |A| = ± 1

The value of the determinant is invariant under a similarity transformation. Proof: Let A, B be orthogonal transformations. (Assumes: 1) B⁻¹ exists & 2) |B| ≠ 0.)
Similarity transformation: A′ ≡ BAB⁻¹
Multiply from the right by B: A′B = BAB⁻¹B = BA
Take determinants: |A′||B| = |B||A| (|B| = a number ≠ 0)
⇒ Divide by |B| on both sides & get: |A′| = |A|
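Both determinant facts (|A| = ±1 for orthogonal matrices, and invariance under similarity) can be confirmed numerically. A sketch with hypothetical choices: A a proper rotation (|A| = +1), B an orthogonal matrix built from a rotation and an axis swap (|B| = −1):

```python
import numpy as np

def rot_x3(phi):
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

A = rot_x3(np.radians(60))                     # proper rotation: det = +1
swap = np.array([[1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0],
                 [0.0, 1.0, 0.0]])             # axis interchange: det = -1
B = rot_x3(np.radians(-20)) @ swap             # still orthogonal, det = -1

A_prime = B @ A @ np.linalg.inv(B)             # similarity transformation
```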