Lecture 13 Operations in Graphics and Geometric Modeling I: Projection, Rotation, and Reflection Shang-Hua Teng

Projection — Projecting onto an axis: a point (a, b) in the plane projects onto the x-axis, and the x-axis is a vector subspace of R^2.
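A quick worked instance (mine, consistent with the slide's figure): projecting the point (a, b) onto the x-axis keeps the first coordinate and zeroes the second, so p = (a, 0). In matrix form, p = [[1, 0], [0, 0]] (a, b)^T, so projection onto an axis is multiplication by a 0/1 diagonal matrix.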

Projection onto an Arbitrary Line Passing Through 0 — [figure: the point (a, b) projected onto a line through the origin]

Projection onto a Plane

Projection onto a Line — [figure: vector a, vector b at angle θ to a, and the projection p of b onto a]

Projection Matrix onto a Line — What matrix P has the property p = Pb?
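The transcript drops the slide's answer; the standard one is the rank-one projection matrix. Since p must be the multiple of a satisfying a^T (b − p) = 0, we get p = a (a^T b)/(a^T a), so P = (a a^T)/(a^T a). Note that P is symmetric and P^2 = P (projecting twice changes nothing).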

Properties of Projection onto a Line — p is the point in Span(a) that is closest to b.

Projection onto a Subspace
Input:
1. A vector subspace V in R^m
2. A vector b in R^m
Desired output (three equivalent descriptions):
– The vector p in V that is closest to b
– The projection p of b onto V
– The vector p in V such that (b − p) is orthogonal to V

How to Describe a Vector Subspace V in R^m
If dim(V) = n, then V has n basis vectors:
– a_1, a_2, …, a_n
– They are linearly independent
Then V = C(A), the column space of the matrix A = [a_1 a_2 … a_n].

Projection onto a Subspace (restated in matrix terms)
Input:
1. n independent vectors a_1, a_2, …, a_n in R^m; let A = [a_1 a_2 … a_n]
2. A vector b in R^m
Desired output (three equivalent descriptions):
– The vector p in C(A) that is closest to b
– The projection p of b onto C(A)
– The vector p in C(A) such that (b − p) is orthogonal to C(A)

Using the Orthogonality Condition
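The slide's equations are not in the transcript; the standard derivation they depict: require that the error be orthogonal to every column of A, i.e. A^T (b − A x̂) = 0. This gives the normal equations A^T A x̂ = A^T b, so x̂ = (A^T A)^{-1} A^T b (A^T A is invertible because the a_i are independent), and the projection is p = A x̂ = A (A^T A)^{-1} A^T b. The projection matrix is therefore P = A (A^T A)^{-1} A^T.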

Think about this Picture — [figure: the four fundamental subspaces; in R^n, the row space C(A^T) of dimension r and the nullspace N(A) of dimension n − r; in R^m, the column space C(A) of dimension r and the left nullspace N(A^T) of dimension m − r; b splits into its projection p in C(A) and b − p in N(A^T)]

Connection to Least Squares Approximation
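The equations on this slide are also missing from the transcript. The connection: the least-squares solution of an overdetermined system Ax = b minimizes ||Ax − b||, and the minimizer x̂ satisfies the same normal equations A^T A x̂ = A^T b, so the best-fit vector A x̂ is exactly the projection p of b onto C(A). A minimal NumPy sketch checking this (the matrix and vector are my own illustrative data):

import numpy as np

# A tall matrix with independent columns, and a target b outside C(A).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: x_hat = (A^T A)^{-1} A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Projection of b onto C(A) via P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ b

assert np.allclose(A @ x_hat, p)        # least-squares fit == projection
assert np.allclose(A.T @ (b - p), 0.0)  # residual is orthogonal to C(A)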

Rotation by an Angle θ
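The matrix itself is not in the transcript; the standard counterclockwise rotation of the plane by θ is

Q = [cos θ  −sin θ
     sin θ   cos θ]

so (x, y) maps to (x cos θ − y sin θ, x sin θ + y cos θ).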

Properties of the Rotation Matrix

Q is an orthonormal matrix: Q^T Q = I

Rotation Matrix in High Dimensions — Q is an orthonormal matrix: Q^T Q = I
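In higher dimensions, a basic rotation acts in one coordinate plane and fixes the rest. A standard example (a Givens rotation; the slides' matrices are not in the transcript): rotating R^3 by θ in the (x, y)-plane uses

Q = [cos θ  −sin θ  0
     sin θ   cos θ  0
       0       0    1]

and Q^T Q = I still holds, as it does for any product of such plane rotations.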

Reflection — [figure, spanning three slides: a vector b, a unit vector u, and the mirror across which b is reflected]

Reflection is Symmetric and Orthonormal — [figure: b, u, and the mirror]
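The formula is missing from the transcript; the standard reflection (Householder) matrix across the mirror with unit normal u is H = I − 2uu^T. It is symmetric by inspection, and orthonormal because H^T H = H^2 = I − 4uu^T + 4u(u^T u)u^T = I, using u^T u = 1.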

Orthonormal Vectors — Vectors q_1, …, q_n are orthonormal if q_i^T q_j = 0 whenever i ≠ j, and q_i^T q_i = 1 for every i (mutually perpendicular unit vectors).

Orthonormal Matrices — Q is orthonormal if Q^T Q = I; the columns of Q are orthonormal vectors. Theorem: For any vectors x and y, (Qx)^T (Qy) = x^T y; in particular ||Qx|| = ||x||, so Q preserves inner products and lengths.

Products of Orthonormal Matrices — Theorem: If Q and P are both orthonormal matrices, then QP is also an orthonormal matrix. Proof: (QP)^T (QP) = P^T Q^T Q P = P^T P = I.

Orthonormal Basis and Gram-Schmidt
Input: an m by n matrix A (with independent columns)
Desired output: a matrix Q such that
– C(A) = C(Q), and
– Q is orthonormal

Basic Idea
Suppose A = [a_1 … a_n].
If n = 1, then Q = [a_1 / ||a_1||].
If n = 2:
– q_1 = a_1 / ||a_1||
– Start with a_2 and subtract its projection along a_1: v_2 = a_2 − (q_1^T a_2) q_1
– Normalize: q_2 = v_2 / ||v_2||

Gram-Schmidt
Suppose A = [a_1 … a_n].
– q_1 = a_1 / ||a_1||
– For i = 2 to n:
    v_i = a_i − Σ_{j<i} (q_j^T a_i) q_j
    q_i = v_i / ||v_i||
What is the complexity? O(mn^2): each of the n columns is orthogonalized against up to n − 1 earlier q_j, and each projection costs O(m).
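A minimal NumPy sketch of the loop above (classical Gram-Schmidt; the function name and test matrix are mine, not from the slides):

import numpy as np

def gram_schmidt(A):
    # Returns Q with orthonormal columns and C(Q) = C(A).
    # Assumes the columns of A are linearly independent.
    m, n = A.shape
    Q = np.zeros((m, n))
    for i in range(n):
        v = A[:, i].copy()
        for j in range(i):                      # subtract the projections
            v -= (Q[:, j] @ A[:, i]) * Q[:, j]  # onto the earlier q_j
        Q[:, i] = v / np.linalg.norm(v)         # normalize
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q = gram_schmidt(A)
print(np.round(Q.T @ Q, 10))  # identity: the columns are orthonormal

In floating point, the modified Gram-Schmidt variant (or Householder QR) is preferred for numerical stability; the classical form above matches the slide's description.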

Theorem (QR-Decomposition): Suppose A = [a_1 … a_n] has independent columns and Gram-Schmidt produces Q = [q_1 … q_n]. Then there exists an upper triangular matrix R such that A = QR.
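Filling in the step the transcript omits: each a_i lies in the span of q_1, …, q_i, so writing a_i = Σ_{j ≤ i} (q_j^T a_i) q_j shows R is upper triangular with entries r_{ji} = q_j^T a_i; equivalently, R = Q^T A.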

Using QR to Find the Least Squares Approximation — Substitute A = QR into the normal equations A^T A x̂ = A^T b: R^T Q^T Q R x̂ = R^T Q^T b, so R x̂ = Q^T b, which can be solved by back substitution since R is upper triangular.
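A short NumPy sketch of this recipe (np.linalg.qr stands in for the Gram-Schmidt construction above; the data are my own):

import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

Q, R = np.linalg.qr(A)  # A = QR with R upper triangular
y = Q.T @ b             # right-hand side of R x_hat = Q^T b

# Back substitution: solve R x_hat = y from the last row up.
n = R.shape[1]
x_hat = np.zeros(n)
for i in range(n - 1, -1, -1):
    x_hat[i] = (y[i] - R[i, i + 1:] @ x_hat[i + 1:]) / R[i, i]

print(x_hat)
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True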