CS 450: COMPUTER GRAPHICS LINEAR ALGEBRA REVIEW SPRING 2015 DR. MICHAEL J. REALE
INTRODUCTION
We're going to spend a little time on some important concepts from linear algebra. Some of it will seem a bit general and abstract, but I will endeavor to give you concrete examples. We've already covered vectors in general in a previous lecture. Here, we will discuss:
Euclidean space
Linear (in)dependence
Basis vectors
Matrices
Matrix determinant
EUCLIDEAN SPACE
Let's say we have a vector V with n components (e.g., V has 3 components: v_x, v_y, v_z) → our vector is an n-tuple. n-tuple = ordered list of n real numbers. The vectors we will be working with exist in n-dimensional real Euclidean space:
n-dimensional → how many components are in the vector
real → real numbers (not complex)
Euclidean space → set of all possible n-tuples (all possible points in an n-space), i.e., set of all possible vectors with n components
ℝ^n = n-dimensional real Euclidean space
EXAMPLE: 3D EUCLIDEAN SPACE
3 dimensions (x, y, z) → 3D Euclidean space = all possible 3D vectors (3D points). A 3D vector in 3D Euclidean space is in ℝ^3.
THINGS TO DO IN SPACE
For a vector in Euclidean space (in ℝ^n), you can do two things with it:
Add it to another vector
Multiply it by a scalar
In both cases, you end up with another vector in ℝ^n.
RULES IN EUCLIDEAN SPACE
There are several rules in Euclidean space (in fact, these rules actually DEFINE that we're in a Euclidean space). Given vectors u, v, and w, and scalars a and b:
u + v = v + u
(u + v) + w = u + (v + w)
There is a zero vector 0 with u + 0 = u
Every u has a negative -u with u + (-u) = 0
a(u + v) = a*u + a*v
(a + b)u = a*u + b*u
(ab)u = a(b*u)
1*u = u
DOT PRODUCT REVISITED
A general definition for the dot product (also called the inner product or scalar product):
u · v = u_0*v_0 + u_1*v_1 + … + u_{n-1}*v_{n-1}
Some of the rules for the dot product:
u · v = v · u
u · (v + w) = u · v + u · w
(a*u) · v = a(u · v)
u · u ≥ 0, with u · u = 0 only if u = 0
VECTOR PROJECTION USING THE DOT PRODUCT
Orthogonal = perpendicular. You can use the dot product to orthogonally project one vector onto another. The orthogonal projection (vector) w of a vector u onto a vector v is given by:
w = t*v, where t = (u · v) / (v · v)
t = scalar value
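The projection formula above can be sketched in plain Python (the function names here are my own, not from the course):

```python
def dot(u, v):
    """Dot product of two same-length vectors (lists of numbers)."""
    return sum(a * b for a, b in zip(u, v))

def project(u, v):
    """Orthogonally project u onto v: w = ((u . v) / (v . v)) * v."""
    t = dot(u, v) / dot(v, v)      # the scalar value t from the slide
    return [t * c for c in v]

# Project u = (3, 4) onto the x-axis direction v = (1, 0):
w = project([3, 4], [1, 0])       # -> [3.0, 0.0]
```

Projecting onto an axis direction simply keeps the component along that axis, which matches the geometric picture on the slide.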
ORTHOGONAL PROJECTION EXAMINED
Recall: v · v = ║v║²
Therefore: w = ((u · v) / ║v║²) * v
ORTHOGONAL PROJECTION EXAMINED FURTHER
Projection gives us an orthogonal decomposition of u, i.e., we can describe u in terms of two orthogonal vectors, w and (u - w): w ┴ (u - w). If v is already normalized → w = (u · v) * v, which means ║w║ = absolute value of the dot product between u and v.
NORM (LENGTH) REVISITED
The norm (or length) of the vector u is a non-negative number that can be expressed using the dot product:
║u║ = sqrt(u · u)
It too has some rules:
║u║ ≥ 0, and ║u║ = 0 only if u = 0
║a*u║ = |a| ║u║
║u + v║ ≤ ║u║ + ║v║ (triangle inequality)
CROSS PRODUCT REVISITED
The rules for the cross product may be found below:
u × v = -(v × u)
u × (v + w) = u × v + u × w
(a*u) × v = a(u × v)
u × u = 0
u × v is orthogonal to both u and v
║u × v║ = ║u║ ║v║ sin θ
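A quick plain-Python sanity check of two of these rules (anticommutativity, and orthogonality to both inputs); the helper names are my own:

```python
def cross(u, v):
    """Cross product of two 3D vectors."""
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u, v = [1, 2, 3], [4, 5, 6]
c = cross(u, v)                 # -> [-3, 6, -3]
# u x v is perpendicular to both inputs:
print(dot(c, u), dot(c, v))     # 0 0
# Anticommutativity: v x u = -(u x v)
print(cross(v, u))              # [3, -6, 3]
```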
LINEAR (IN)DEPENDENCE
LINEAR DEPENDENCE AND INDEPENDENCE
Let's say we have two vectors, u_0 and u_1. If I can just multiply a number (scalar) a by u_1 to get u_0, then the vectors are linearly DEPENDENT:
u_0 = a*u_1 → linearly DEPENDENT
LINEAR DEPENDENCE AND INDEPENDENCE
(Figure: an example of a linearly DEPENDENT pair of vectors next to a linearly INDEPENDENT pair.)
LINEAR DEPENDENCE AND INDEPENDENCE
Another way to look at this is to rearrange the equation and also multiply u_1 by its own scalar:
a_0*u_0 + a_1*u_1 = 0
If the ONLY way for this to be true is if a_0 and a_1 both equal zero → u_0 and u_1 are linearly INDEPENDENT (they can't cancel each other out). Otherwise → u_0 and u_1 are linearly DEPENDENT. With two vectors, this only happens when the vectors are PARALLEL to each other.
LINEAR DEPENDENCE AND INDEPENDENCE: DEFINED
Let's say now we have n vectors (u_0, u_1, …, u_{n-1}), each with its own scalar factor (a_0, a_1, …, a_{n-1}):
a_0*u_0 + a_1*u_1 + … + a_{n-1}*u_{n-1} = 0
If the ONLY way to make the above statement true is to set a_0 = a_1 = … = a_{n-1} = 0 → the vectors (u_0, u_1, …, u_{n-1}) are linearly INDEPENDENT. Otherwise, the vectors are linearly DEPENDENT → some or all of the vectors can cancel each other out, given the proper scaling factors.
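For two 2D vectors, this definition reduces to a single "are they parallel?" test. A minimal sketch (the helper name is hypothetical, not from the slides):

```python
def independent_2d(u0, u1, eps=1e-9):
    """Two 2D vectors are linearly independent iff the only combination
    a0*u0 + a1*u1 = 0 is the trivial one, i.e. iff they are not parallel.
    For 2D this reduces to one cross-product-like test."""
    return abs(u0[0]*u1[1] - u0[1]*u1[0]) > eps

print(independent_2d([1, 0], [0, 1]))   # True  (standard basis directions)
print(independent_2d([2, 4], [1, 2]))   # False (u0 = 2*u1, parallel)
```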
SIZE OF SPACE
How big a space is (i.e., how many dimensions n a space has) is determined by the maximum number of linearly independent vectors you can make. Example: ℝ^3 can have at most 3 vectors in a set that are linearly independent. Example set of linearly independent vectors for ℝ^3: (1,0,0), (0,1,0), (0,0,1) → …you can't come up with another one.
BASIS VECTORS
SPANNING SPACE AND BASIS VECTORS
If we have a set of n vectors (u_0, u_1, …, u_{n-1}) in ℝ^n AND:
The vectors are linearly independent
Any vector V in ℝ^n can be written as: V = v_0*u_0 + v_1*u_1 + … + v_{n-1}*u_{n-1}
…then the vectors (u_0, u_1, …, u_{n-1}) span Euclidean space ℝ^n. If only one set of v_i values will give you V → the u vectors are a basis in ℝ^n.
EXAMPLE OF A 2D BASIS
u_0 = (4,3), u_1 = (2,6)
Spans ℝ^2 → linearly independent, and can be used to make any vector in ℝ^2
Basis in ℝ^2 → only one combination of (v_0, v_1) will give you a given vector V
Example: V = v_0*u_0 + v_1*u_1
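One way to find the unique (v_0, v_1) for a given V in this basis is Cramer's rule for the 2x2 system; a small sketch using the basis above (the function name is my own):

```python
def coords_in_basis(V, u0, u1):
    """Solve V = v0*u0 + v1*u1 for the unique (v0, v1), via Cramer's rule."""
    d = u0[0]*u1[1] - u1[0]*u0[1]    # nonzero because the basis is independent
    v0 = (V[0]*u1[1] - u1[0]*V[1]) / d
    v1 = (u0[0]*V[1] - V[0]*u0[1]) / d
    return v0, v1

u0, u1 = (4, 3), (2, 6)
v0, v1 = coords_in_basis((8, 12), u0, u1)
# v0*u0 + v1*u1 reconstructs (8, 12), up to floating-point rounding.
```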
DESCRIBING A VECTOR
To completely describe a vector V, we would need to state:
The components v_i
The basis vectors u_i
However, if we're using the same basis vectors for all vectors, we can just use the components to describe the vector: V = (v_0, v_1, …, v_{n-1})
ORTHONORMAL BASIS
Orthonormal basis = basis where the vectors meet the following conditions:
Every basis vector has length equal to 1 → ║u_i║ = 1
Every pair of basis vectors must be orthogonal → the angle between them equals 90°
If the basis vectors are orthogonal BUT do not have unit length → orthogonal basis
STANDARD BASIS
Standard basis = basis where each basis vector u_i has components:
One for dimension i
Zero elsewhere
Standard basis vectors are denoted e_i. Example: 3D standard basis:
e_0 = (1,0,0), e_1 = (0,1,0), e_2 = (0,0,1)
ORTHONORMAL BASES AND THE DOT PRODUCT
Given a vector P and an orthonormal basis (u_0, …, u_{n-1}), you can get the components of P using the dot product:
p_i = P · u_i
Basically, you project the vector P onto each basis vector u_i → this gives you the distance along u_i. Example: P on the standard basis → p_i = P · e_i.
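A sketch of recovering components by dotting against each basis vector; the rotated orthonormal basis below is my own example, not from the slides:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# An orthonormal 2D basis (the standard basis rotated by an angle):
u0, u1 = [0.6, 0.8], [-0.8, 0.6]
P = [2.0, 1.0]

# The component along each basis vector is a single dot product:
p0, p1 = dot(P, u0), dot(P, u1)
# p0*u0 + p1*u1 reconstructs P (up to floating-point rounding).
```

This only works because the basis is orthonormal; for a general basis you have to solve a linear system instead (as in the earlier 2D basis example).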
INTRODUCTION TO MATRICES
ENTER THE MATRIX
Matrix = (p x q) 2D array of numbers (scalars); p = number of rows, q = number of columns. Used here to manipulate vectors and points → used to transform vectors/points. Given a matrix M, another notation for the matrix is [m_ij]. In computer graphics, most matrices will be 2x2, 3x3, or 4x4. In the slides that follow (for the most part):
Capital letters → matrices
Lowercase letters → scalar numbers
IDENTITY MATRIX
Identity matrix = square matrix with 1's on the diagonal and 0's everywhere else. Effectively the matrix equivalent of the number one → multiplying by the identity matrix gives you the same matrix back: M = I*M.
MATRIX ADDITION
To add two matrices, just add the corresponding components (same rules as with vectors):
S = M + N → s_ij = m_ij + n_ij
Both matrices must have the same dimensions! The resulting matrix has the same dimensions as the original matrices.
RULES OF MATRIX ADDITION
M + N = N + M
(M + N) + T = M + (N + T)
M + 0 = M
M + (-M) = 0
Note: 0 = matrix filled with zeros
MULTIPLYING A MATRIX BY A SCALAR
To multiply a matrix M by a scalar (single number) a, just multiply a by the individual components (again, same as with vectors):
a*M = [a*m_ij]
Not surprisingly, the resulting matrix is the same size as the original.
RULES OF SCALAR-MATRIX MULTIPLICATION
0*M = 0
1*M = M
a(b*M) = (ab)M
(a + b)M = a*M + b*M
a(M + N) = a*M + a*N
TRANSPOSE OF A MATRIX
Transpose of a matrix M = rows become columns and columns become rows. Notation: M^T → (M^T)_ij = m_ji. If M is (p x q) → M^T is (q x p).
RULES OF THE TRANSPOSE MATRIX
(M^T)^T = M
(M + N)^T = M^T + N^T
(a*M)^T = a*M^T
(MN)^T = N^T M^T
TRACE OF A MATRIX
Trace of a matrix = just the sum of the diagonal elements of a square matrix. Notation: tr(M) = m_00 + m_11 + … + m_{n-1,n-1}
MATRIX-MATRIX MULTIPLICATION
When multiplying two matrices M and N like this: T = MN
The size of M must be (p x q)
The size of N must be (q x r)
The result T will be (p x r)
ORDER MATTERS!!!
Example: (2 x 3) times (3 x 2) → (2 x 2)
MATRIX-MATRIX MULTIPLICATION
T = MN → for each value t_ij in T, take the dot product of row i of M and column j of N:
t_ij = m_i0*n_0j + m_i1*n_1j + … + m_{i,q-1}*n_{q-1,j}
EXAMPLE: MATRIX-MATRIX MULTIPLICATION
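As a concrete instance of the row-times-column rule, here is a plain-Python sketch using the (2 x 3) times (3 x 2) sizes mentioned earlier (the function name is my own):

```python
def matmul(M, N):
    """T = M N: t_ij is the dot product of row i of M and column j of N.
    M is (p x q), N is (q x r), and the result is (p x r)."""
    p, q, r = len(M), len(N), len(N[0])
    assert all(len(row) == q for row in M), "inner dimensions must match"
    return [[sum(M[i][k] * N[k][j] for k in range(q)) for j in range(r)]
            for i in range(p)]

M = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
N = [[7,  8],
     [9, 10],
     [11, 12]]           # 3 x 2
print(matmul(M, N))      # 2 x 2 result: [[58, 64], [139, 154]]
```

Note that matmul(N, M) is a 3 x 3 matrix, not the same thing at all: order matters.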
RULES OF MATRIX-MATRIX MULTIPLICATION
We will use this for combining transformations.
(MN)T = M(NT)
M(N + T) = MN + MT
IM = MI = M, where I = identity matrix
MN ≠ NM → this is true in general, even if the dimensions are the same!
MULTIPLYING A MATRIX BY A VECTOR
We will be using column vectors here. Column vector = (q x 1) matrix. Multiplying a (q x 1) vector v by a matrix M of size (p x q) gives us a new vector w = Mv of size (p x 1). For our transformations later, usually p = q, so that w has the same size as v.
w_i = dot product of v with row i of M
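The same row-by-row pattern for w = Mv can be sketched as follows; the 90° rotation matrix is my own example of a graphics-style transform:

```python
def matvec(M, v):
    """w = M v for a (p x q) matrix M and a length-q column vector v.
    Each w_i is the dot product of row i of M with v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

M = [[1, 0,  0],
     [0, 0, -1],
     [0, 1,  0]]              # 3x3 rotation by 90 degrees about the x-axis
print(matvec(M, [0, 1, 0]))   # the y-axis maps to the z-axis: [0, 0, 1]
```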
MATRIX DETERMINANT
DETERMINANT INTRODUCTION
The determinant of a matrix:
Is a scalar number
Is only defined for a square matrix (e.g., the matrix is p x p)
Is denoted as |M| or det(M)
We're going to concentrate here on determinants of 2x2 and 3x3 matrices. Computing determinants for larger square matrices is a kind of recursive procedure.
DETERMINANT FOR 2X2 AND 3X3
2x2: |M| = m_00*m_11 - m_01*m_10
3x3: |M| = m_00*m_11*m_22 + m_01*m_12*m_20 + m_02*m_10*m_21 - m_02*m_11*m_20 - m_01*m_10*m_22 - m_00*m_12*m_21
Pattern: diagonals going:
Upper-LEFT to lower-RIGHT → add
Upper-RIGHT to lower-LEFT → subtract
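A plain-Python sketch of the two formulas (names are mine):

```python
def det2(m):
    """2x2 determinant: upper-left diagonal minus upper-right diagonal."""
    return m[0][0]*m[1][1] - m[0][1]*m[1][0]

def det3(m):
    """3x3 determinant via the diagonal (Sarrus) pattern from the slide."""
    return (m[0][0]*m[1][1]*m[2][2] + m[0][1]*m[1][2]*m[2][0]
          + m[0][2]*m[1][0]*m[2][1] - m[0][2]*m[1][1]*m[2][0]
          - m[0][1]*m[1][0]*m[2][2] - m[0][0]*m[1][2]*m[2][1])

print(det2([[1, 2], [3, 4]]))                     # -2
print(det3([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))    # identity -> 1
```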
CLOSER LOOK AT 2X2 DETERMINANT
The 2x2 determinant m_00*m_11 - m_01*m_10 is the signed area of the parallelogram formed by the two column vectors: positive if the second column is counterclockwise from the first, negative otherwise.
3X3 DETERMINANT AND CROSS PRODUCT
If you replace:
The top row with the vectors e_x, e_y, e_z
The middle row with u_x, u_y, u_z
The bottom row with v_x, v_y, v_z
…you suddenly have Sarrus's scheme for computing the cross product u × v! NOTE: these are not exactly the same thing:
The cross product gives you a vector
The determinant gives you a scalar
However, the determinant and the cross product are related in some interesting ways.
ALTERNATE WAY TO COMPUTE 3X3 DETERMINANT
Another way to compute the determinant is to break the matrix up into its columns and use the cross product and dot product:
|M| = (m_,0 × m_,1) · m_,2
NOTE: the m_,n notation means the n-th column vector of M.
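A sketch checking the column form of the determinant (helper names are mine); for the matrix whose columns are [1,2,3], [4,5,6], [7,8,10], both this form and the diagonal formula give -3:

```python
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det3_from_columns(c0, c1, c2):
    """|M| = (m_,0 x m_,1) . m_,2, treating the columns of M as 3D vectors."""
    return dot(cross(c0, c1), c2)

print(det3_from_columns([1, 2, 3], [4, 5, 6], [7, 8, 10]))   # -3
```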
RULES OF THE DETERMINANT
Inverse of M → M^-1. Assuming we have a matrix M of size n x n:
|M^T| = |M|
|MN| = |M| |N|
|M^-1| = 1 / |M|
|a*M| = a^n |M|
SCALAR MULTIPLICATION AND THE DETERMINANT
If you multiply a scalar a by the whole matrix M → the determinant becomes a^n |M|. However, if you just multiply a by ONE row (or ONE column) → the determinant becomes a|M|.
ZERO DETERMINANT
If either:
Two rows (or two columns) have a cross product of zero (both going exactly the same way), OR
Any row (or any column) is entirely composed of zeros
…then |M| = 0. Note: if |M| = 0, then |M^-1| = 1/0 → so a zero determinant means that M^-1 does not exist.
ORIENTATION OF A BASIS
If each column of a matrix M is in fact a basis vector, then:
If the determinant |M| is POSITIVE → the basis is positively oriented → right-handed system
If the determinant |M| is NEGATIVE → the basis is negatively oriented → left-handed system
Example: standard basis → right-handed system
RELATIONSHIP TO AREA AND VOLUME
Two vectors u and v can define a parallelogram. Three vectors u, v, and w form a solid parallelepiped. We can use the scalar triple product to get its volume → the same as getting the determinant of the matrix with u, v, and w as columns!
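A sketch of the volume computation via the scalar triple product (the function name is my own); the sign of the triple product itself gives the orientation from the previous slide, and the absolute value gives the volume:

```python
def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def parallelepiped_volume(u, v, w):
    """Volume = |scalar triple product| = |det of the matrix with
    u, v, w as columns|."""
    return abs(dot(cross(u, v), w))

# A box stretched to 2 x 3 x 4 along the axes has volume 24:
print(parallelepiped_volume([2, 0, 0], [0, 3, 0], [0, 0, 4]))   # 24
```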