Beyond Vectors Hung-yi Lee

Introduction
Many things can be considered as “vectors”. E.g. a function can be regarded as a vector. We can apply the concepts we learned to those “vectors”:
Linear combination
Span
Basis
Orthogonality
……
Reference: Chapter 6

Are they vectors?

A matrix
A linear transform
A polynomial

Are they vectors? Is any function a vector? What is the zero vector?

What is a vector?
If a set of objects V is a vector space, then the objects are “vectors”.
Vector space: there are operations called “addition” and “scalar multiplication”. For any u, v and w in V and any scalars a and b:
u + v and au are unique elements of V
The following axioms hold:
u + v = v + u, (u + v) + w = u + (v + w)
There is a “zero vector” 0 in V such that u + 0 = u
There is a −u in V such that u + (−u) = 0
1u = u, (ab)u = a(bu), a(u + v) = au + av, (a + b)u = au + bu
R^n is a vector space.
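
A minimal sketch (not from the slides) of how functions behave as “vectors”: pointwise addition and scalar multiplication satisfy the closure requirements above, and the everywhere-zero function plays the role of the zero vector.

```python
import math

def add(f, g):
    """ "Vector addition" of functions: (f + g)(t) = f(t) + g(t). """
    return lambda t: f(t) + g(t)

def scale(a, f):
    """ "Scalar multiplication": (a f)(t) = a * f(t). """
    return lambda t: a * f(t)

zero = lambda t: 0.0        # the "zero vector": 0 everywhere

u, v = math.sin, lambda t: t ** 2
w = add(scale(2.0, u), v)            # the linear combination 2 sin + t^2
print(w(1.0))                        # 2*sin(1) + 1
print(add(u, zero)(1.0) == u(1.0))   # u + 0 = u -> True
```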

Objects in Different Vector Spaces
In the vector space R^1: the points 1, 2, 3. In the vector space R^2: the points (1,0), (2,0), (3,0).

Objects in Different Vector Spaces
All the polynomials with degree less than or equal to 2 as a vector space.
All functions as a vector space: vectors with infinite dimensions.

Subspaces

Review: Subspace
A vector set V is called a subspace if it has the following three properties:
1. The zero vector 0 belongs to V
2. If u and w belong to V, then u + w belongs to V (closed under vector addition)
3. If u belongs to V, and c is a scalar, then cu belongs to V (closed under scalar multiplication)

Are they subspaces?
All the functions f with f(t0) = 0
All the matrices whose trace equals zero (see the numerical check below)
All the matrices of the form …
All the continuous functions
All the polynomials with degree n (not a subspace: it misses the zero polynomial and is not closed under addition)
All the polynomials with degree less than or equal to n
Notation: P is all polynomials; Pn is all polynomials with degree less than or equal to n.
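
As a quick illustration of the subspace test, here is a small numerical check (my own example, not from the slides) that the trace-zero 2×2 matrices contain the zero vector and are closed under addition and scalar multiplication:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, -1.0]])   # trace(A) = 0
B = np.array([[5.0, 0.0], [7.0, -5.0]])   # trace(B) = 0

print(np.trace(np.zeros((2, 2))))  # 0.0 -> the zero vector is in the set
print(np.trace(A + B))             # 0.0 -> closed under addition
print(np.trace(4.2 * A))           # 0.0 -> closed under scalar multiplication
```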

Linear Combination and Span

Matrices
A linear combination of the matrices in S with coefficients a, b, c. What is Span S? All 2×2 matrices whose trace equals zero.

Linear Combination and Span Polynomials

Linear Transformation

Linear transformation
A mapping (function) T is called linear if for all “vectors” u, v and scalars c:
Preserving vector addition: T(u + v) = T(u) + T(v)
Preserving scalar multiplication: T(cu) = cT(u)
Is matrix transpose linear? Input: m × n matrices, output: n × m matrices.
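
A small numerical sanity check (illustrative only) that transpose preserves vector addition and scalar multiplication:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # an arbitrary m x n matrix (m=2, n=3)
B = rng.standard_normal((2, 3))
c = 2.5

print(np.allclose((A + B).T, A.T + B.T))  # T(u + v) = T(u) + T(v)
print(np.allclose((c * A).T, c * A.T))    # T(cu) = c T(u)
```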

Linear transformation
Derivative: maps a function f to the function f′ (e.g. x^2 to 2x). Linear?
Integral from a to b: maps a function f to a scalar (e.g. x^2 to its integral from a to b). Linear?
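
A symbolic check, as a sketch using sympy, that the derivative preserves linear combinations (the integral case is analogous):

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
f = x ** 2
g = sp.sin(x)

lhs = sp.diff(a * f + b * g, x)                # D(a f + b g)
rhs = a * sp.diff(f, x) + b * sp.diff(g, x)    # a D(f) + b D(g)
print(sp.simplify(lhs - rhs))                  # 0 -> linear
```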

Null Space and Range
Null space: the null space of T is the set of all vectors v such that T(v) = 0. What is the null space of matrix transpose? (Only the zero matrix.)
Range: the range of T is the set of all images of T, that is, the set of all vectors T(v) for all v in the domain. What is the range of matrix transpose? (All n × m matrices.)

One-to-one and Onto
U: M_{m×n} → M_{n×m} defined by U(A) = A^T. Is U one-to-one? Yes. Is U onto? Yes.
D: C^∞ → C^∞ defined by D(f) = f′. Is D one-to-one? No (functions differing by a constant have the same derivative). Is D onto? Yes.
D: P3 → P3 defined by D(f) = f′. Is D one-to-one? No. Is D onto? No (the range is only P2).

Isomorphism (同構)
(The slide illustrates isomorphic structures in biology, graph theory, and chemistry.)

Isomorphism
Let V and W be vector spaces. A linear transformation T: V → W is called an isomorphism if it is one-to-one and onto (i.e., an invertible linear transformation). Then V and W are isomorphic.
Example 1: U: M_{m×n} → M_{n×m} defined by U(A) = A^T.
Example 2: T: P2 → R^3.
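
A sketch of Example 2's coordinate isomorphism T: P2 → R^3, sending a0 + a1 x + a2 x^2 to (a0, a1, a2); the helper names T and T_inv are mine, not from the lecture:

```python
import numpy as np

def T(coeffs):
    """P2 -> R^3: a polynomial is determined by its coefficients (a0, a1, a2)."""
    return np.array(coeffs, dtype=float)

def T_inv(v):
    """R^3 -> P2: rebuild the polynomial function from its coordinates."""
    a0, a1, a2 = v
    return lambda x: a0 + a1 * x + a2 * x ** 2

p = T_inv(T([1.0, -3.0, 2.0]))   # round trip for 1 - 3x + 2x^2
print(p(2.0))                     # 1 - 6 + 8 = 3.0
```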

Basis
A basis for a subspace V is a linearly independent generating set of V.

Independent Example
S = {x^2 − 3x + 2, 3x^2 − 5x, 2x − 3} is a subset of P2. Is it linearly independent? No.
A set of 2×2 matrices (shown on the slide) is a subset of the 2×2 matrices. Is it linearly independent? Yes: setting a linear combination with coefficients a, b, c to zero implies a = b = c = 0.

Independent Example
The infinite vector set {1, x, x^2, …, x^n, …}: is it linearly independent? Yes: Σᵢ cᵢ xⁱ = 0 implies cᵢ = 0 for all i.
S = {e^t, e^{2t}, e^{3t}}: is it linearly independent? Suppose ae^t + be^{2t} + ce^{3t} = 0. Differentiating once and twice gives ae^t + 2be^{2t} + 3ce^{3t} = 0 and ae^t + 4be^{2t} + 9ce^{3t} = 0. Evaluating all three equations at t = 0: a + b + c = 0, a + 2b + 3c = 0, a + 4b + 9c = 0, which forces a = b = c = 0. Yes.
If {v1, v2, ……, vk} are L.I. and T is an isomorphism, then {T(v1), T(v2), ……, T(vk)} are L.I.
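
A numerical companion to the derivation above: the coefficient matrix of the three equations is a Vandermonde matrix (nodes 1, 2, 3) and is invertible, so only the trivial solution exists:

```python
import numpy as np

M = np.array([[1, 1, 1],
              [1, 2, 3],
              [1, 4, 9]], dtype=float)

print(np.linalg.det(M))                 # 2.0 (nonzero Vandermonde determinant)
print(np.linalg.solve(M, np.zeros(3)))  # [0. 0. 0.] -> a = b = c = 0
```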

Basis Example
For the vector space of all 2×2 matrices, the basis (shown on the slide) consists of the four matrices each having a single entry 1 and the rest 0. Dim = 4.
S = {1, x, x^2, …, x^n, …} is a basis of P. Dim = infinite.

Vector Representation of Object
Coordinate transformation: choose a basis, then represent each object by its coordinates with respect to that basis.
Pn: basis {1, x, x^2, …, x^n}.

Matrix Representation of Linear Operator
Example: D (derivative): P2 → P2. Represent it as a matrix: taking the derivative of a polynomial corresponds to multiplying its coordinate vector by a matrix. In the basis {1, x, x^2}, D(1) = 0, D(x) = 1, D(x^2) = 2x, so the columns of the matrix are the coordinate vectors of these images. The resulting matrix is not invertible (see the sketch below).
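
A sketch of the derivative matrix just described, in the basis {1, x, x^2}; the rank computation confirms it is not invertible:

```python
import numpy as np

# Columns = coordinates of D(1), D(x), D(x^2) in the basis {1, x, x^2}.
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

p = np.array([2.0, -3.0, 1.0])      # coordinates of 2 - 3x + x^2
print(D @ p)                        # [-3. 2. 0.], i.e. -3 + 2x
print(np.linalg.matrix_rank(D))     # 2 < 3 -> not invertible
```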

Matrix Representation of Linear Operator
On a suitable space F of functions, taking the derivative again corresponds to multiplying the coordinate vector by a matrix; here the matrix is invertible, and its inverse represents the antiderivative.
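
A hedged sketch of the invertible case, assuming (my choice for concreteness, not fixed by the transcript) F = span{e^t, e^{2t}, e^{3t}}, where D(e^{kt}) = k e^{kt} makes the matrix diagonal:

```python
import numpy as np

# Assumed basis of F: {e^t, e^{2t}, e^{3t}}; D(e^{kt}) = k e^{kt}.
D = np.diag([1.0, 2.0, 3.0])
D_inv = np.linalg.inv(D)            # diag(1, 1/2, 1/3): the antiderivative

c = np.array([1.0, 1.0, 0.0])       # coordinates of e^t + e^{2t}
print(D_inv @ (D @ c))              # [1. 1. 0.]: antiderivative undoes D
```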

Eigenvalue and Eigenvector

Consider the derivative (a linear transformation; its input and output are functions). What is the “eigenvalue”? Every scalar is an eigenvalue of the derivative: for any scalar λ, the derivative of e^{λt} is λe^{λt}.
Consider the transpose (also a linear transformation; its input and output are matrices). Symmetric matrices (A^T = A) form the eigenspace of eigenvalue 1; skew-symmetric matrices (A^T = −A) form the eigenspace of eigenvalue −1.

Consider the transpose of 2×2 matrices: represent each 2×2 matrix as a vector, so that transpose becomes multiplication of that vector by a matrix. What are the eigenvalues?

Eigenvalue and Eigenvector
Consider the transpose of 2×2 matrices. Its matrix representation (below) has characteristic polynomial (t − 1)^3 (t + 1).
Eigenvalue 1: eigenspace = the symmetric matrices, Dim = 3.
Eigenvalue −1: eigenspace = the skew-symmetric matrices, Dim = 1.
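
A sketch of the computation: vectorizing a 2×2 matrix as (a, b, c, d) turns transpose into the permutation matrix below, whose eigenvalues match the slide:

```python
import numpy as np

# Matrix of U(A) = A^T on 2x2 matrices, in the basis {E11, E12, E21, E22}:
# transpose swaps the coordinates b and c of (a, b, c, d).
U = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1]], dtype=float)

print(np.sort(np.linalg.eigvals(U)))  # [-1. 1. 1. 1.]
# characteristic polynomial: det(tI - U) = (t - 1)^3 (t + 1)
```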

Inner Product

An inner product takes two “vectors” u and v and produces a scalar ⟨u, v⟩. For any vectors u, v and w, and any scalar a, the following axioms hold:
⟨u, v⟩ = ⟨v, u⟩
⟨u + w, v⟩ = ⟨u, v⟩ + ⟨w, v⟩
⟨au, v⟩ = a⟨u, v⟩
⟨u, u⟩ ≥ 0, and ⟨u, u⟩ = 0 exactly when u = 0
The dot product is a special case of an inner product. Can you define other inner products for ordinary vectors?
Norm (length): ‖v‖ = √⟨v, v⟩
Orthogonal: the inner product is zero.

Inner Product
Inner product of matrices (Frobenius inner product): multiply element-wise and sum the entries, ⟨A, B⟩ = Σᵢⱼ Aᵢⱼ Bᵢⱼ = trace(A^T B).
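
A quick numerical illustration of the Frobenius inner product and its equivalent trace form:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

print((A * B).sum())       # element-wise multiply, then sum: 5.0
print(np.trace(A.T @ B))   # equivalent trace form: 5.0
```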

Inner Product
Inner product for functions: ⟨f, g⟩ = ∫ f(t)g(t) dt over an interval. Can it be an inner product for general functions? For continuous functions yes; for general functions positive definiteness can fail (a function that is nonzero at only finitely many points still has ⟨f, f⟩ = 0).

Orthogonal/Orthonormal Basis

Orthogonal Basis
Gram-Schmidt Process: turn any basis into an orthogonal basis; after normalization, you get an orthonormal basis.
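
A minimal Gram-Schmidt sketch in numpy (my own implementation, not the lecture's code): subtract from each vector its projections onto the previously accepted directions, then normalize:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the input vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for q in basis:
            w -= (w @ q) * q          # remove the component along q
        norm = np.linalg.norm(w)
        if norm > 1e-12:              # skip linearly dependent vectors
            basis.append(w / norm)
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
print(np.round(Q @ Q.T, 6))  # identity matrix -> orthonormal
```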

Orthogonal/Orthonormal Basis
Find an orthogonal/orthonormal basis for P2. Define an inner product on P2 by ⟨f, g⟩ = ∫₋₁¹ f(x)g(x) dx, and start from the basis {1, x, x^2}.

Orthogonal/Orthonormal Basis
Applying the Gram-Schmidt process to {1, x, x^2} under this inner product gives the orthogonal basis {1, x, x^2 − 1/3}; normalizing each element then yields an orthonormal basis.
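
A symbolic verification, using the inner product ⟨f, g⟩ = ∫₋₁¹ f(x)g(x) dx stated above, that {1, x, x^2 − 1/3} is pairwise orthogonal:

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2 - sp.Rational(1, 3)]

def inner(f, g):
    # the inner product <f, g> = integral of f*g over [-1, 1]
    return sp.integrate(f * g, (x, -1, 1))

for i in range(3):
    for j in range(i + 1, 3):
        print(inner(basis[i], basis[j]))  # all 0 -> pairwise orthogonal
```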

Fourier Series
Expand a function in the orthogonal basis {1, cos t, sin t, cos 2t, sin 2t, …}; after normalization this becomes an orthonormal basis, and the coefficients a and b of the expansion are inner products with the basis functions.
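
A sketch of computing Fourier coefficients as normalized inner products, using a Riemann sum on [−π, π] and a square-wave test function (my example, not from the slides):

```python
import numpy as np

t = np.linspace(-np.pi, np.pi, 20000, endpoint=False)
f = np.sign(t)                 # square-wave test function on [-pi, pi)

def coeff(g):
    """Fourier coefficient <f, g> / <g, g>, approximated by a Riemann sum."""
    return np.sum(f * g) / np.sum(g * g)

for n in range(1, 4):
    print(n, coeff(np.sin(n * t)))   # ~ 4/(n*pi) for odd n, ~ 0 for even n
```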

Announcement (工商時間)
Next semester (first semester of academic year 105) I will offer “Machine Learning” (機器學習), introducing general machine learning methods, not only focusing on deep learning. EE students from the second year onward are ready to take it.
The semester after next (second semester of academic year 105) I will offer “Machine Learning and Having It Deep and Structured” (機器學習及其深層與結構化).