Lecture 13. Operations in Graphics and Geometric Modeling I: Projection, Rotation, and Reflection. Shang-Hua Teng

Projection. Projection onto an axis: the point (a, b) projects to (a, 0) on the x-axis; the x-axis is a vector subspace of R^2.

Projection onto an Arbitrary Line Passing through 0. [Figure: a point (a, b) and its projection onto a line through the origin.] A line through the origin is also a vector subspace, and any vector b can be projected onto it.
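As a minimal NumPy sketch of this projection (my own illustration with made-up example vectors, not material from the slides): the projection of b onto the line spanned by a is p = (a^T b / a^T a) a.

import numpy as np

a = np.array([3.0, 1.0])          # direction of the line through the origin
b = np.array([2.0, 4.0])          # the vector to be projected

p = (a @ b) / (a @ a) * a         # projection of b onto span{a}
print(p)                          # [3. 1.]
print(a @ (b - p))                # ~0: the error b - p is orthogonal to a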

Projection onto a Plane

Projection onto a Line. [Figure: vectors a, b, p, and q illustrating the projection p of b onto the line spanned by a.]

Projection Matrix onto a Line. What matrix P has the property p = Pb? [Figure: vectors a, b, p, and q as before.]
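One standard answer, shown here as a hedged sketch rather than the slide's own derivation, is P = a a^T / (a^T a):

import numpy as np

a = np.array([[3.0], [1.0]])            # column vector spanning the line
P = (a @ a.T) / (a.T @ a)               # projection matrix onto span{a}

b = np.array([[2.0], [4.0]])
p = P @ b                               # same projection as before
print(P)                                # rank-1 and symmetric
print(np.allclose(P @ P, P))            # True: projecting twice changes nothing

P is symmetric and satisfies P^2 = P, the two algebraic hallmarks of a projection matrix.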

Properties of Projection onto a Line. [Figure: a, b, p, q as before.] p is the point in Span(a) that is closest to b; equivalently, the error b - p is orthogonal to a.

Projection onto a Subspace
Input: a vector subspace V of R^m and a vector b in R^m.
Desired output: the vector p in V that is closest to b, i.e., the projection p of b onto V; equivalently, the vector p in V such that (b - p) is orthogonal to V.

How to Describe a Vector Subspace V in R^m. If dim(V) = n, then V has n basis vectors a1, a2, …, an; they are linearly independent, and V = C(A), the column space of A = [a1 a2 … an].

Projection onto a Subspace
Input: n independent vectors a1, a2, …, an in R^m and a vector b in R^m.
Desired output: the vector p in C([a1 a2 … an]) that is closest to b, i.e., the projection p of b onto C([a1 a2 … an]); equivalently, the vector p such that (b - p) is orthogonal to C([a1 a2 … an]).

Using the Orthogonality Condition. Write p = Ax for some x. Requiring b - Ax to be orthogonal to every column of A gives A^T (b - Ax) = 0, i.e., the normal equations A^T A x = A^T b. Since the columns of A are independent, A^T A is invertible, so x = (A^T A)^-1 A^T b and p = A (A^T A)^-1 A^T b; the projection matrix is P = A (A^T A)^-1 A^T.
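A minimal NumPy sketch of these formulas (illustrative matrices chosen by me; it assumes the columns of A are independent so that A^T A is invertible):

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])              # columns span a plane in R^3
b = np.array([6.0, 0.0, 0.0])

x = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations: A^T A x = A^T b
p = A @ x                               # projection of b onto C(A)
print(x, p)                             # x = [5, -3], p = [5, 2, -1]
print(A.T @ (b - p))                    # ~[0, 0]: b - p is orthogonal to C(A)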

Think about this Picture. [Figure: the four fundamental subspaces. In R^n, the row space C(A^T) has dimension r and the nullspace N(A) has dimension n - r; in R^m, the column space C(A) has dimension r and the left nullspace N(A^T) has dimension m - r. A vector x splits into x_r + x_n, and b splits into its projection p in C(A) plus b - p in N(A^T).]

Connection to Least Squares Approximation. The x from the normal equations is exactly the least squares solution: it minimizes ||Ax - b||, and p = Ax is the point of C(A) closest to b.
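For example (again my own sketch, using NumPy's built-in least squares routine rather than anything from the lecture), the normal equation solution and the least squares solution coincide:

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_normal = np.linalg.solve(A.T @ A, A.T @ b)       # normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)    # least squares solver
print(np.allclose(x_normal, x_lstsq))              # True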

Rotation. [Figure: rotating the plane about the origin through an angle θ.] The 2-by-2 rotation matrix is Q = [cos θ, -sin θ; sin θ, cos θ].
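A quick NumPy check of this matrix (the angle and vector are my illustrative choices):

import numpy as np

theta = np.pi / 4                              # rotate by 45 degrees
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])
print(Q @ x)                                   # [0.707..., 0.707...]
print(np.allclose(Q.T @ Q, np.eye(2)))         # True: Q^T Q = I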

Properties of the Rotation Matrix

Properties of the Rotation Matrix. Q is an orthonormal matrix: Q^T Q = I, so Q^-1 = Q^T, and multiplying by Q preserves lengths and angles.

Rotation Matrix in High Dimensions. Q is an orthonormal matrix: Q^T Q = I.
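One common way to build such a Q, sketched here under the assumption that the high dimensional rotations in question are rotations within a coordinate plane (Givens rotations), is to embed the 2-by-2 rotation block into an identity matrix; the helper name plane_rotation is mine, not the slide's:

import numpy as np

def plane_rotation(n, i, j, theta):
    # n-by-n rotation by theta in the (i, j) coordinate plane (Givens rotation)
    Q = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    Q[i, i], Q[j, j] = c, c
    Q[i, j], Q[j, i] = -s, s
    return Q

Q = plane_rotation(4, 0, 2, np.pi / 6)          # a rotation of R^4
print(np.allclose(Q.T @ Q, np.eye(4)))          # True: still orthonormal
print(np.allclose(np.linalg.det(Q), 1.0))       # True: a proper rotation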

Reflection. [Figure, developed over several slides: a vector b, a unit vector u, and the mirror line through the origin across which b is reflected.]

Reflection is Symmetric and Orthonormal. The reflection matrix equals its own transpose, and its transpose is its inverse, so reflecting twice returns the original vector. [Figure: b, u, and the mirror as before.]
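A hedged sketch of one standard construction, the Householder reflection H = I - 2 u u^T; I am assuming here that u is a unit vector normal to the mirror, which may differ from the slide's convention:

import numpy as np

u = np.array([0.0, 1.0])                    # unit normal to the mirror (here, the x-axis)
H = np.eye(2) - 2.0 * np.outer(u, u)        # Householder reflection: I - 2 u u^T

b = np.array([3.0, 2.0])
print(H @ b)                                # [3., -2.]: b reflected across the mirror
print(np.allclose(H, H.T))                  # True: symmetric
print(np.allclose(H.T @ H, np.eye(2)))      # True: orthonormal, so H @ H = I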

Orthonormal Vectors. Vectors q1, q2, …, qn are orthonormal if qi^T qj = 0 whenever i ≠ j and qi^T qi = 1 for every i; that is, they are mutually orthogonal unit vectors.

Orthonormal Matrices. Q is orthonormal if Q^T Q = I; equivalently, the columns of Q are orthonormal vectors. Theorem: for any vectors x and y, (Qx)^T (Qy) = x^T y, so in particular ||Qx|| = ||x||: multiplication by Q preserves inner products and lengths.
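A quick numerical check of the theorem (entirely illustrative; the orthonormal Q here is obtained from NumPy's QR factorization of a random matrix, not from the lecture):

import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))    # a random 5x5 orthonormal matrix

x = rng.standard_normal(5)
y = rng.standard_normal(5)
print(np.allclose((Q @ x) @ (Q @ y), x @ y))                   # inner products preserved
print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # lengths preserved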

Products of Orthonormal Matrices. Theorem: If Q and P are both orthonormal matrices, then QP is also an orthonormal matrix. Proof: (QP)^T (QP) = P^T Q^T Q P = P^T P = I.
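For instance (my own check, composing two plane rotations):

import numpy as np

def rot2(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

Q, P = rot2(0.3), rot2(1.1)
QP = Q @ P
print(np.allclose(QP.T @ QP, np.eye(2)))    # True: the product is orthonormal
print(np.allclose(QP, rot2(0.3 + 1.1)))     # composing rotations adds the angles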

Orthonormal Basis and Gram-Schmidt
Input: an m by n matrix A (with independent columns).
Desired output: a matrix Q such that C(A) = C(Q) and Q is orthonormal.

Basic Idea. Suppose A = [a1 … an]. If n = 1, then Q = [a1 / ||a1||]. Otherwise, start with a2, subtract its projection along a1, and normalize the result; continue in the same way for the later columns.

Gram-Schmidt. Suppose A = [a1 … an].
q1 = a1 / ||a1||
For i = 2 to n: set vi = ai - (q1^T ai) q1 - … - (q(i-1)^T ai) q(i-1), i.e., subtract from ai its projections onto q1, …, q(i-1); then qi = vi / ||vi||.
What is the complexity? O(mn^2).
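A minimal NumPy version of this loop, as a sketch only: it implements classical Gram-Schmidt, assumes the columns of A are independent, and in practice one would usually prefer the modified variant or a library QR routine for numerical stability.

import numpy as np

def gram_schmidt(A):
    # Return Q with orthonormal columns spanning C(A) (classical Gram-Schmidt).
    m, n = A.shape
    Q = np.zeros((m, n))
    for i in range(n):
        v = A[:, i].copy()
        for j in range(i):                      # subtract projections onto earlier q's
            v -= (Q[:, j] @ A[:, i]) * Q[:, j]
        Q[:, i] = v / np.linalg.norm(v)         # normalize
    return Q

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(2)))          # columns are orthonormal
R = Q.T @ A                                     # upper triangular
print(np.allclose(A, Q @ R))                    # True: A = QR

Computing R = Q^T A in the last lines also previews the next slide: R comes out upper triangular, so A = QR.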

Theorem (QR Decomposition). Suppose A = [a1 … an] has independent columns and let Q = [q1 … qn] be the output of Gram-Schmidt. Then there exists an upper triangular matrix R such that A = QR; in fact R = Q^T A.

Using QR to Find the Least Squares Approximation. Substituting A = QR into the normal equations A^T A x = A^T b gives R^T R x = R^T Q^T b; since R is invertible, this reduces to R x = Q^T b, which can be solved by back substitution because R is upper triangular.
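A sketch of that computation (illustrative values; Q and R come from np.linalg.qr, and the back substitution loop is written out explicitly to mirror the slide's remark):

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

Q, R = np.linalg.qr(A)          # reduced QR: Q is 3x2 orthonormal, R is 2x2 upper triangular
c = Q.T @ b                     # right-hand side of R x = Q^T b

x = np.zeros(2)                 # back substitution, last unknown first
for i in range(1, -1, -1):
    x[i] = (c[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]

print(x)                        # [5., -3.], matching the normal equations earlier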