Linear Algebra (Aljabar Linier) Week 10 Universitas Multimedia Nusantara Serpong, Tangerang Dr. Ananda Kusuma

Agenda
Orthogonality
– Orthogonality in R^n, orthogonal complements, orthogonal projections
– The Gram-Schmidt Process
– The QR Factorization
Approximating eigenvalues
– Orthogonal Diagonalization of Symmetric Matrices
Vector Spaces
– Vector spaces and subspaces
– Linear independence, basis, and dimension
– Change of basis
– Linear transformation: kernel and range
– The matrix of a linear transformation

The Gram-Schmidt Process and The QR Factorization

Constructing Orthogonal Vectors

The Gram-Schmidt Process

Example: Apply the Gram-Schmidt process to construct an orthonormal basis for the subspace W = span(x_1, x_2, x_3) of R^4, where x_1, x_2, x_3 are the given vectors.
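The specific vectors from the slide are not reproduced here, so the sketch below runs the classical Gram-Schmidt process on three hypothetical linearly independent vectors in R^4; the names x1, x2, x3 and their entries are placeholders, not the ones from the example.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: returns an orthonormal basis for span(vectors)."""
    ortho = []
    for x in vectors:
        v = x.astype(float)
        # Subtract the projection of x onto each previously found direction
        for q in ortho:
            v = v - (q @ x) * q
        norm = np.linalg.norm(v)
        if norm > 1e-12:          # skip vectors that are (numerically) dependent
            ortho.append(v / norm)
    return ortho

# Hypothetical vectors in R^4 (placeholders for x1, x2, x3 on the slide)
x1 = np.array([1, 1, 0, 0])
x2 = np.array([1, 0, 1, 0])
x3 = np.array([0, 1, 1, 1])

q1, q2, q3 = gram_schmidt([x1, x2, x3])
# The q_i are orthonormal: q_i . q_j = 0 for i != j and |q_i| = 1
print(np.round([q1 @ q2, q1 @ q3, q2 @ q3], 10))  # approximately [0, 0, 0]
```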

QR Factorization. Let A be an m×n matrix with linearly independent columns x_1, ..., x_n, and let q_1, ..., q_n be the orthonormal vectors the Gram-Schmidt process produces from them. The process has shown that for each i = 1, ..., n, x_i lies in span(q_1, ..., q_i), so x_i = r_1i q_1 + r_2i q_2 + ... + r_ii q_i for some scalars r_ji. Collecting these relations column by column gives A = QR, where Q = [q_1 ... q_n] has orthonormal columns and R = [r_ji] is upper triangular.

Example: Find a QR factorization of A. Procedure: use the Gram-Schmidt process to find an orthonormal basis for Col A; these vectors form the columns of Q. Since Q has orthonormal columns, Q^T Q = I, so if A = QR, then R = Q^T A.
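A minimal sketch of the procedure just described, using a hypothetical matrix A (the matrix from the slide's example is not reproduced): Q comes from Gram-Schmidt applied to the columns of A, and R = Q^T A. For comparison, numpy's built-in np.linalg.qr computes the same factorization, possibly up to signs of the columns.

```python
import numpy as np

# Hypothetical matrix with linearly independent columns (not the slide's A)
A = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])

# Gram-Schmidt on the columns of A gives the columns of Q
Q = np.zeros_like(A)
for i in range(A.shape[1]):
    v = A[:, i].copy()
    for j in range(i):
        v -= (Q[:, j] @ A[:, i]) * Q[:, j]
    Q[:, i] = v / np.linalg.norm(v)

# Since Q has orthonormal columns, Q^T Q = I, so A = QR gives R = Q^T A
R = Q.T @ A
print(np.round(R, 6))            # upper triangular
print(np.allclose(A, Q @ R))     # True: A = QR
```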

Approximating Eigenvalues. The idea is based on the following iteration: set A_1 = A and, for each k, factor A_k = Q_k R_k and define A_{k+1} = R_k Q_k. All the A_k are similar (A_{k+1} = Q_k^T A_k Q_k) and hence they have the same eigenvalues. Under certain conditions, the matrices A_k converge to a triangular matrix (the Schur form of A), where the eigenvalues are listed on the diagonal. Example: compute the eigenvalues of the given matrix this way.
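A sketch of the iteration described above (often called the QR algorithm), applied to a hypothetical symmetric matrix since the slide's example matrix is not reproduced; for a symmetric matrix the iterates approach a diagonal matrix whose entries approximate the eigenvalues.

```python
import numpy as np

def qr_eigenvalues(A, iterations=100):
    """Approximate eigenvalues by the QR iteration: A_{k+1} = R_k Q_k."""
    Ak = A.astype(float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)   # factor A_k = Q_k R_k
        Ak = R @ Q                # A_{k+1} = R_k Q_k is similar to A_k
    return np.diag(Ak)            # diagonal entries approximate the eigenvalues

# Hypothetical symmetric matrix (not the one from the slide)
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

print(np.sort(qr_eigenvalues(A)))
print(np.sort(np.linalg.eigvalsh(A)))   # reference values for comparison
```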

Orthogonal Diagonalization of Symmetric Matrices

Spectral Theorem: A real n×n matrix A is symmetric if and only if it is orthogonally diagonalizable, A = QDQ^T, with Q orthogonal and D diagonal. Writing Q = [q_1 ... q_n] and D = diag(λ_1, ..., λ_n) yields the spectral decomposition of A: A = λ_1 q_1 q_1^T + λ_2 q_2 q_2^T + ... + λ_n q_n q_n^T. Because each q_i q_i^T is the standard matrix of the orthogonal projection onto span(q_i), this expression is called the projection form of the Spectral Theorem.

Example: Find the spectral decomposition of the matrix.
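The matrix from the example slide is not reproduced here, so the sketch below computes the spectral decomposition of a hypothetical symmetric matrix: it orthogonally diagonalizes A with numpy's eigh and rebuilds A as the sum λ_1 q_1 q_1^T + ... + λ_n q_n q_n^T.

```python
import numpy as np

# Hypothetical symmetric matrix (placeholder for the one in the example)
A = np.array([[2., 1.],
              [1., 2.]])

# Orthogonal diagonalization A = Q D Q^T (eigh returns orthonormal eigenvectors)
eigenvalues, Q = np.linalg.eigh(A)

# Projection form: A = sum_i lambda_i * q_i q_i^T
terms = [lam * np.outer(Q[:, i], Q[:, i]) for i, lam in enumerate(eigenvalues)]
print(np.allclose(A, sum(terms)))   # True: the projections rebuild A

# Each q_i q_i^T is the orthogonal projection onto span(q_i): it is symmetric
# and idempotent (P @ P == P).
P = np.outer(Q[:, 0], Q[:, 0])
print(np.allclose(P @ P, P))        # True
```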

Vector Spaces & Subspaces

Vector Spaces. Definition: Let V be a set on which addition and scalar multiplication are defined. If the following axioms hold for all objects u, v, and w in V and all scalars c and d, then V is called a vector space and the objects in V are called vectors:
1. u + v is in V (closure under addition)
2. u + v = v + u
3. (u + v) + w = u + (v + w)
4. There is a zero vector 0 in V such that u + 0 = u
5. For each u in V there is a vector -u in V such that u + (-u) = 0
6. cu is in V (closure under scalar multiplication)
7. c(u + v) = cu + cv
8. (c + d)u = cu + du
9. c(du) = (cd)u
10. 1u = u
Note: the objects called vectors here are not only Euclidean vectors (as in previous lectures); they can also be matrices, functions, etc.

Examples:
– Let the set V be the points on a line through the origin, with the standard addition and scalar multiplication. Show that V is a vector space.
– Let the set V be the points on a line that does NOT go through the origin, with the standard addition and scalar multiplication. Show that V is not a vector space.
– Let n and m be fixed numbers and let M_nm represent the set of all n×m matrices. Let addition and scalar multiplication on M_nm be the standard matrix addition and standard matrix scalar multiplication. Show that M_nm is a vector space.
– Show that the set of real-valued functions on an interval, with the operations defined on the next slide, is a vector space.

Operations on real-valued functions
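This slide title refers to the standard pointwise operations (f + g)(x) = f(x) + g(x) and (cf)(x) = c f(x), under which real-valued functions form a vector space. The short sketch below illustrates them by representing functions as Python callables; the particular functions chosen are arbitrary examples, not taken from the slides.

```python
def add(f, g):
    """Pointwise sum of two real-valued functions: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(c, f):
    """Scalar multiple of a function: (c f)(x) = c * f(x)."""
    return lambda x: c * f(x)

# Arbitrary example functions
f = lambda x: x ** 2
g = lambda x: 3 * x + 1

h = add(f, scale(2.0, g))       # h(x) = x^2 + 2*(3x + 1)
print(h(1.0))                   # 1 + 2*4 = 9.0
```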

Subspaces. Theorem: A nonempty subset W of a vector space V is a subspace of V if and only if W is closed under addition and scalar multiplication. Note: Every vector space V has at least two subspaces, namely V itself and {0} (the zero space).

Examples:
– Let W be the set of diagonal matrices of size n×n. Is this a subspace of M_nn?
– Let P_n be the set of all polynomials of degree n or less. Is this a subspace of F[a, b], where F[a, b] is the set of real-valued functions on the interval [a, b]?
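As a numerical spot-check of the first question (not a proof), the sketch below verifies that sums and scalar multiples of some hypothetical diagonal n×n matrices are again diagonal, which is the closure property the subspace test requires.

```python
import numpy as np

def is_diagonal(M):
    """True if all off-diagonal entries of M are zero."""
    return np.allclose(M, np.diag(np.diag(M)))

n = 4
D1 = np.diag([1.0, 2.0, 3.0, 4.0])      # hypothetical diagonal matrices
D2 = np.diag([-1.0, 0.0, 5.0, 2.5])

print(is_diagonal(D1 + D2))             # closed under addition -> True
print(is_diagonal(3.0 * D1))            # closed under scalar multiplication -> True
print(is_diagonal(np.ones((n, n))))     # a non-diagonal matrix fails -> False
```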

Examples: Spanning Sets

Linear Independence, Basis, Dimension

Examples: Linear Independence
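The worked examples from this slide are not reproduced. As a general illustration, for vectors in R^n linear independence can be checked numerically by comparing the rank of the matrix whose columns are the vectors with the number of vectors; the vectors below are hypothetical.

```python
import numpy as np

def are_independent(vectors):
    """Vectors in R^n are linearly independent iff the matrix having them
    as columns has rank equal to the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1 = np.array([1., 0., 2.])
v2 = np.array([0., 1., 1.])
v3 = v1 + 2 * v2                       # deliberately dependent on v1 and v2

print(are_independent([v1, v2]))       # True
print(are_independent([v1, v2, v3]))   # False
```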

Examples: Basis

Examples: Coordinates. Remark: The most useful aspect of coordinate vectors is that they allow us to transfer information from a general vector space to R^n.
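A small sketch of the remark, using a hypothetical basis B of R^2 (not one from the slides): the coordinate vector [v]_B is found by solving the linear system whose coefficient matrix has the basis vectors as columns.

```python
import numpy as np

# Hypothetical basis B = {b1, b2} of R^2 (any two independent vectors work)
b1 = np.array([1., 1.])
b2 = np.array([1., -1.])
B = np.column_stack([b1, b2])

v = np.array([3., 1.])

# [v]_B solves B @ c = v, i.e. v = c1*b1 + c2*b2
coords = np.linalg.solve(B, v)
print(coords)                          # coordinate vector of v relative to B
print(np.allclose(B @ coords, v))      # True: the coordinates rebuild v
```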

Examples: Dimension

Change of Basis

Introduction
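The content of this introductory slide is not reproduced. A hedged sketch of the change-of-basis idea, with two hypothetical bases B and C of R^2: the change-of-basis matrix P from B to C converts B-coordinates into C-coordinates, and one way to compute it is column by column from the C-coordinates of the B basis vectors.

```python
import numpy as np

# Two hypothetical bases of R^2, given as the columns of B and C
B = np.column_stack([np.array([1., 0.]), np.array([1., 1.])])
C = np.column_stack([np.array([1., 1.]), np.array([-1., 1.])])

# Change-of-basis matrix from B to C: its i-th column is [b_i]_C,
# obtained by solving C @ x = b_i.  Compactly: P = C^{-1} B.
P_C_from_B = np.linalg.solve(C, B)

# Check on a vector: the C-coordinates of v equal P applied to its B-coordinates
v = np.array([2., 3.])
v_B = np.linalg.solve(B, v)
v_C = np.linalg.solve(C, v)
print(np.allclose(v_C, P_C_from_B @ v_B))   # True
```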

The End. Thank you for your attention!