Signal Processing and Representation Theory Lecture 1.

Signal Processing and Representation Theory Lecture 1

Outline:
–Algebra Review
  –Numbers
  –Groups
  –Vector Spaces
  –Inner Product Spaces
  –Orthogonal / Unitary Operators
–Representation Theory

Algebra Review Numbers (Reals) Real numbers, ℝ, are the set of numbers that we express in decimal notation, possibly with infinitely many non-repeating digits.

Algebra Review Numbers (Reals) Example: π = 3.14159… Completeness: If a sequence of real numbers gets progressively "tighter" (i.e., is Cauchy), then it must converge to a real number. Size: The size of a real number a ∈ ℝ is the square root of its square norm: |a| = √(a·a) = √(a²).

Algebra Review Numbers (Complexes) Complex numbers, ℂ, are the set of numbers that we express as a+ib, where a,b ∈ ℝ and i = √(−1). Example: e^(iθ) = cos θ + i sin θ.
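Euler's formula above can be checked numerically with Python's standard cmath module; a minimal sketch (the angle θ is an arbitrary sample value):

```python
import cmath
import math

# Euler's formula: e^(i*theta) = cos(theta) + i*sin(theta)
theta = 0.7  # an arbitrary angle
lhs = cmath.exp(1j * theta)
rhs = complex(math.cos(theta), math.sin(theta))
assert abs(lhs - rhs) < 1e-12  # equal up to floating-point rounding
```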

Algebra Review Numbers (Complexes) Let p(x) = x^n + a_{n−1}x^{n−1} + … + a_1 x + a_0 be a polynomial with a_i ∈ ℂ. Algebraic Closure: p(x) must have a root x_0 in ℂ: p(x_0) = 0.

Algebra Review Numbers (Complexes) Conjugate: The conjugate of a complex number a+ib is conj(a+ib) = a−ib. Size: The size of a complex number a+ib ∈ ℂ is the square root of its square norm: |a+ib| = √((a+ib)(a−ib)) = √(a²+b²).
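A quick numerical illustration of the conjugate and the size, using Python's built-in complex type (the sample value 3+4i is arbitrary):

```python
import math

z = complex(3, 4)            # a+ib with a=3, b=4
conj = z.conjugate()         # a-ib
assert conj == complex(3, -4)

# |a+ib| = sqrt((a+ib)(a-ib)) = sqrt(a^2 + b^2); the product z*conj is real
size = math.sqrt((z * conj).real)
assert size == abs(z) == 5.0
```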

Algebra Review Groups A group G is a set with a composition rule + that takes two elements of the set and returns another element, satisfying:
–Associativity: (a+b)+c = a+(b+c) for all a,b,c ∈ G.
–Identity: There exists an identity element 0 ∈ G such that 0+a = a+0 = a for all a ∈ G.
–Inverse: For every a ∈ G there exists an element −a ∈ G such that a+(−a) = 0.
If the group satisfies a+b = b+a for all a,b ∈ G, then the group is called commutative or abelian.
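For a small finite group the axioms can be verified exhaustively; a sketch for the integers mod 5 under addition (a hypothetical example, not one from the slides):

```python
# Verify the group axioms for Z/5Z under addition mod 5.
n = 5
G = range(n)

def add(a, b):
    return (a + b) % n

# Associativity: (a+b)+c == a+(b+c)
assert all(add(add(a, b), c) == add(a, add(b, c)) for a in G for b in G for c in G)
# Identity element: 0
assert all(add(0, a) == a == add(a, 0) for a in G)
# Inverse: -a mod n
assert all(add(a, (-a) % n) == 0 for a in G)
# This group is also commutative (abelian).
assert all(add(a, b) == add(b, a) for a in G for b in G)
```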

Algebra Review Groups Examples:
–The integers, under addition, are a commutative group.
–The positive real numbers, under multiplication, are a commutative group.
–The set of complex numbers without 0, under multiplication, is a commutative group.
–Real/complex invertible matrices, under multiplication, are a non-commutative group.
–The rotation matrices, under multiplication, are a non-commutative group (except in 2D, where they are commutative).

Algebra Review (Real) Vector Spaces A real vector space is a set of objects that can be added together and scaled by real numbers. Formally: a real vector space V is a commutative group with a scaling operator (a,v) → av, a ∈ ℝ, v ∈ V, such that:
1. 1v = v for all v ∈ V.
2. a(v+w) = av+aw for all a ∈ ℝ, v,w ∈ V.
3. (a+b)v = av+bv for all a,b ∈ ℝ, v ∈ V.
4. (ab)v = a(bv) for all a,b ∈ ℝ, v ∈ V.

Algebra Review (Real) Vector Spaces Examples:
–The set of n-dimensional arrays with real coefficients is a vector space.
–The set of m×n matrices with real entries is a vector space.
–The sets of real-valued functions defined in 1D, 2D, 3D, … are all vector spaces.
–The sets of real-valued functions defined on the circle, disk, sphere, ball, … are all vector spaces.
–Etc.

Algebra Review (Complex) Vector Spaces A complex vector space is a set of objects that can be added together and scaled by complex numbers. Formally: a complex vector space V is a commutative group with a scaling operator (a,v) → av, a ∈ ℂ, v ∈ V, such that:
1. 1v = v for all v ∈ V.
2. a(v+w) = av+aw for all a ∈ ℂ, v,w ∈ V.
3. (a+b)v = av+bv for all a,b ∈ ℂ, v ∈ V.
4. (ab)v = a(bv) for all a,b ∈ ℂ, v ∈ V.

Algebra Review (Complex) Vector Spaces Examples:
–The set of n-dimensional arrays with complex coefficients is a vector space.
–The set of m×n matrices with complex entries is a vector space.
–The sets of complex-valued functions defined in 1D, 2D, 3D, … are all vector spaces.
–The sets of complex-valued functions defined on the circle, disk, sphere, ball, … are all vector spaces.
–Etc.

Algebra Review (Real) Inner Product Spaces A real inner product space is a real vector space V with a mapping ⟨·,·⟩: V×V → ℝ that takes a pair of vectors and returns a real number, satisfying:
–⟨u, v+w⟩ = ⟨u,v⟩ + ⟨u,w⟩ for all u,v,w ∈ V.
–⟨αu, v⟩ = α⟨u,v⟩ for all u,v ∈ V and all α ∈ ℝ.
–⟨u,v⟩ = ⟨v,u⟩ for all u,v ∈ V.
–⟨v,v⟩ ≥ 0 for all v ∈ V, and ⟨v,v⟩ = 0 if and only if v = 0.
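The four axioms can be spot-checked numerically for the standard dot product on ℝ³; a sketch with arbitrary random sample vectors:

```python
import random

def dot(v, w):
    # standard dot product on R^3
    return sum(vi * wi for vi, wi in zip(v, w))

random.seed(0)
u, v, w = ([random.uniform(-1, 1) for _ in range(3)] for _ in range(3))
alpha, tol = 2.5, 1e-12

# <u, v+w> = <u,v> + <u,w>
assert abs(dot(u, [a + b for a, b in zip(v, w)]) - (dot(u, v) + dot(u, w))) < tol
# <alpha*u, v> = alpha * <u,v>
assert abs(dot([alpha * a for a in u], v) - alpha * dot(u, v)) < tol
# <u,v> = <v,u>
assert dot(u, v) == dot(v, u)
# <v,v> >= 0, with equality only at v = 0
assert dot(v, v) > 0 and dot([0, 0, 0], [0, 0, 0]) == 0
```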

Algebra Review (Real) Inner Product Spaces Examples:
–The space of n-dimensional arrays with real coefficients is an inner product space. If v = (v_1,…,v_n) and w = (w_1,…,w_n), then: ⟨v,w⟩ = v_1w_1 + … + v_nw_n.
–If M is a symmetric matrix (M = M^t) whose eigenvalues are all positive, then the space of n-dimensional arrays with real coefficients is an inner product space with: ⟨v,w⟩_M = vMw^t.
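A sketch of the second example, with a small symmetric matrix whose eigenvalues are positive (the particular M is a hypothetical choice):

```python
# <v,w>_M = v M w^t with M symmetric and positive-definite.
M = [[2.0, 1.0],
     [1.0, 2.0]]   # symmetric; eigenvalues are 1 and 3, both positive

def ip_M(v, w):
    Mw = [sum(M[i][j] * w[j] for j in range(2)) for i in range(2)]  # M w^t
    return sum(v[i] * Mw[i] for i in range(2))                      # v (M w^t)

v, w = [1.0, 2.0], [3.0, -1.0]
assert ip_M(v, w) == ip_M(w, v)   # symmetry follows from M = M^t
assert ip_M(v, v) > 0             # positivity on a sample vector
```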

Algebra Review (Real) Inner Product Spaces Examples:
–The space of m×n matrices with real coefficients is an inner product space. If M and N are two m×n matrices, then: ⟨M,N⟩ = Trace(M^t N).
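A sketch of the trace inner product on 2×2 matrices, which also shows that it equals the sum of entrywise products:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def trace_ip(M, N):
    # <M, N> = Trace(M^t N)
    P = matmul(transpose(M), N)
    return sum(P[i][i] for i in range(len(P)))

M = [[1.0, 2.0], [3.0, 4.0]]
N = [[5.0, 6.0], [7.0, 8.0]]
# Trace(M^t N) is just the sum of the entrywise products M_ij * N_ij
entrywise = sum(M[i][j] * N[i][j] for i in range(2) for j in range(2))
assert trace_ip(M, N) == entrywise == 70.0
```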

Algebra Review (Real) Inner Product Spaces Examples:
–The spaces of real-valued functions defined in 1D, 2D, 3D, … are real inner product spaces. If f and g are two functions in 1D, then: ⟨f,g⟩ = ∫ f(x) g(x) dx.
–The spaces of real-valued functions defined on the circle, disk, sphere, ball, … are real inner product spaces. If f and g are two functions defined on the circle, then: ⟨f,g⟩ = ∫_0^{2π} f(θ) g(θ) dθ.
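The integral inner product can be approximated with a Riemann sum; a sketch on the circle showing that cos and sin are orthogonal (the sample count n is an arbitrary choice):

```python
import math

def circle_ip(f, g, n=10000):
    # Riemann-sum approximation of <f,g> = integral_0^{2pi} f(t) g(t) dt
    d = 2 * math.pi / n
    return sum(f(k * d) * g(k * d) for k in range(n)) * d

assert abs(circle_ip(math.cos, math.sin)) < 1e-6            # orthogonal
assert abs(circle_ip(math.cos, math.cos) - math.pi) < 1e-6  # <cos,cos> = pi
```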

Algebra Review (Complex) Inner Product Spaces A complex inner product space is a complex vector space V with a mapping ⟨·,·⟩: V×V → ℂ that takes a pair of vectors and returns a complex number, satisfying:
–⟨u, v+w⟩ = ⟨u,v⟩ + ⟨u,w⟩ for all u,v,w ∈ V.
–⟨αu, v⟩ = α⟨u,v⟩ for all u,v ∈ V and all α ∈ ℂ.
–⟨u,v⟩ = conj(⟨v,u⟩) for all u,v ∈ V.
–⟨v,v⟩ ≥ 0 for all v ∈ V, and ⟨v,v⟩ = 0 if and only if v = 0.

Algebra Review (Complex) Inner Product Spaces Examples:
–The space of n-dimensional arrays with complex coefficients is an inner product space. If v = (v_1,…,v_n) and w = (w_1,…,w_n), then: ⟨v,w⟩ = v_1 conj(w_1) + … + v_n conj(w_n).
–If M is a conjugate-symmetric matrix (M = conj(M^t)) whose eigenvalues are all positive, then the space of n-dimensional arrays with complex coefficients is an inner product space with: ⟨v,w⟩_M = vM conj(w)^t.

Algebra Review (Complex) Inner Product Spaces Examples:
–The space of m×n matrices with complex coefficients is an inner product space. If M and N are two m×n matrices, then: ⟨M,N⟩ = Trace(M conj(N)^t).

Algebra Review (Complex) Inner Product Spaces Examples:
–The spaces of complex-valued functions defined in 1D, 2D, 3D, … are complex inner product spaces. If f and g are two functions in 1D, then: ⟨f,g⟩ = ∫ f(x) conj(g(x)) dx.
–The spaces of complex-valued functions defined on the circle, disk, sphere, ball, … are complex inner product spaces. If f and g are two functions defined on the circle, then: ⟨f,g⟩ = ∫_0^{2π} f(θ) conj(g(θ)) dθ.

Algebra Review Inner Product Spaces If V_1, V_2 ⊆ V, then V is the direct sum of the subspaces V_1, V_2, written V = V_1 ⊕ V_2, if every vector v ∈ V can be written uniquely as v = v_1 + v_2 for some vectors v_1 ∈ V_1 and v_2 ∈ V_2.

Algebra Review Inner Product Spaces Example: If V is the vector space of 4-dimensional arrays, then V is the direct sum of the subspaces V_1, V_2 ⊆ V where:
–V_1 = {(x_1, x_2, 0, 0)}
–V_2 = {(0, 0, x_3, x_4)}

Algebra Review Orthogonal / Unitary Operators If V is a real / complex inner product space, then a linear map A: V → V is orthogonal / unitary if it preserves the inner product: ⟨v,w⟩ = ⟨Av,Aw⟩ for all v,w ∈ V.
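The defining property is easy to check numerically for a plane rotation; a sketch with arbitrary sample vectors and angle:

```python
import math

def rotate(theta, v):
    # 2D rotation by angle theta
    c, s = math.cos(theta), math.sin(theta)
    return [c * v[0] - s * v[1], s * v[0] + c * v[1]]

def dot(v, w):
    return v[0] * w[0] + v[1] * w[1]

v, w = [1.0, 2.0], [-3.0, 0.5]
theta = 1.234
# <v, w> = <Av, Aw>: rotations preserve the inner product
assert abs(dot(v, w) - dot(rotate(theta, v), rotate(theta, w))) < 1e-12
```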

Algebra Review Orthogonal / Unitary Operators Examples:
–If V is the space of real, two-dimensional vectors and A is any rotation or reflection, then A is orthogonal. For a rotation by angle θ: A = ( cos θ, −sin θ ; sin θ, cos θ ).
[Figure: vectors v_1, v_2 and their images A(v_1), A(v_2).]

Algebra Review Orthogonal / Unitary Operators Examples:
–If V is the space of real, three-dimensional vectors and A is any rotation or reflection, then A is orthogonal. For example, rotation by angle θ about the z-axis: A = ( cos θ, −sin θ, 0 ; sin θ, cos θ, 0 ; 0, 0, 1 ).

Algebra Review Orthogonal / Unitary Operators Examples:
–If V is the space of functions defined in 1D and A is any translation, A: f(x) ↦ f(x−a), then A is orthogonal.

Algebra Review Orthogonal / Unitary Operators Examples:
–If V is the space of functions defined on a circle and A is any rotation or reflection, A: f(θ) ↦ f(θ−φ), then A is orthogonal.

Algebra Review Orthogonal / Unitary Operators Examples:
–If V is the space of functions defined on a sphere and A is any rotation or reflection, A: f(p) ↦ f(R^{−1}p) for R a rotation or reflection of 3D space, then A is orthogonal.

Outline:
–Algebra Review
–Representation Theory
  –Orthogonal / Unitary Representations
  –Irreducible Representations
  –Why Do We Care?

Representation Theory Orthogonal / Unitary Representation An orthogonal / unitary representation of a group G onto an inner product space V is a map ρ that sends every element of G to an orthogonal / unitary transformation, subject to the conditions:
1. ρ(0)v = v for all v ∈ V, where 0 is the identity element of G.
2. ρ(gh)v = ρ(g)ρ(h)v for all g,h ∈ G and all v ∈ V.

Representation Theory Orthogonal / Unitary Representation Examples:
–If G is any group and V is any vector space, then the trivial map ρ(g)v = v is an orthogonal / unitary representation.
–If G is the group of rotations and reflections and V is any vector space, then ρ(g)v = v is again an orthogonal / unitary representation.

Representation Theory Orthogonal / Unitary Representation Examples:
–If G is the group of n×n orthogonal / unitary matrices and V is the space of n-dimensional arrays, then ρ(M)v = Mv is an orthogonal / unitary representation.

Representation Theory Orthogonal / Unitary Representation Examples:
–If G is the group of 2×2 rotation matrices and V is the vector space of 4-dimensional real / complex arrays, then ρ(M)(x_1, x_2, x_3, x_4) = (M(x_1, x_2), M(x_3, x_4)), applying M separately to the first and last pairs of coordinates, is an orthogonal / unitary representation.

Representation Theory Irreducible Representations A representation ρ of a group G onto a vector space V is irreducible if it cannot be broken up into smaller representation spaces. That is, if there exists W ⊆ V such that ρ(G)W ⊆ W, then either W = V or W = {0}.

Representation Theory Irreducible Representations If W ⊆ V is a sub-representation of G, and W^⊥ is the space of vectors perpendicular to W (⟨v,w⟩ = 0 for all v ∈ W^⊥ and w ∈ W), then V = W ⊕ W^⊥ and W^⊥ is also a sub-representation of V. For any g ∈ G, v ∈ W^⊥, and w ∈ W, we have: ⟨ρ(g)v, w⟩ = ⟨v, ρ(g^{−1})w⟩ = 0, since ρ(g^{−1})w ∈ W; hence ρ(g)v ∈ W^⊥. So if a representation is reducible, it can be broken up into the direct sum of two sub-representations.

Representation Theory Irreducible Representations Examples:
–If G is any group and V is any vector space with dimension larger than one, then the trivial representation ρ(g)v = v is not irreducible: every proper subspace is mapped back into itself.

Representation Theory Irreducible Representations Examples:
–If G is the group of 2×2 rotation matrices and V is the vector space of 4-dimensional real / complex arrays, then ρ(M)(x_1, x_2, x_3, x_4) = (M(x_1, x_2), M(x_3, x_4)) is not an irreducible representation, since it maps the subspace W = {(x_1, x_2, 0, 0)} back into itself.
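A sketch of this reducibility, assuming the representation acts block-diagonally, i.e. the 2×2 rotation is applied to (x_1,x_2) and to (x_3,x_4) separately (this concrete action is an assumption, chosen to be consistent with the invariant subspace W named on the slide):

```python
import math

def rho(theta, x):
    # assumed block-diagonal action: rotate (x1,x2) and (x3,x4) by theta
    c, s = math.cos(theta), math.sin(theta)
    return [c * x[0] - s * x[1], s * x[0] + c * x[1],
            c * x[2] - s * x[3], s * x[2] + c * x[3]]

w = [1.0, 2.0, 0.0, 0.0]   # a vector in W = {(x1, x2, 0, 0)}
image = rho(0.9, w)
# the image stays inside W: its last two coordinates remain zero
assert image[2] == 0.0 and image[3] == 0.0
```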

Representation Theory Why do we care?

Representation Theory Why We Care In shape matching we have to deal with the fact that rotations do not change the shape of a model. [Figure: a model and a rotated copy of the same model.]

Representation Theory Exhaustive Search If v_M is a spherical function representing model M and v_N is a spherical function representing model N, we want to find the minimum over all rotations T of: ‖v_M − T(v_N)‖².

Representation Theory Exhaustive Search If V is the space of spherical functions then we can consider the representation of the group of rotations on this space. By decomposing V into a direct sum of its irreducible representations, we get a better framework for finding the best rotation.

Representation Theory Exhaustive Search (Brute Force) Suppose that {v_1, …, v_n} is some orthogonal basis for V; then we can express the shape descriptors in terms of this basis:
v_M = a_1 v_1 + … + a_n v_n
v_N = b_1 v_1 + … + b_n v_n

Representation Theory Exhaustive Search (Brute Force) Then the dot-product of M and N at a rotation T is equal to: ⟨v_M, T(v_N)⟩ = Σ_i Σ_j a_i conj(b_j) ⟨v_i, T(v_j)⟩.

Representation Theory Exhaustive Search (Brute Force) So all n×n cross-multiplications ⟨v_i, T(v_j)⟩ are needed. [Figure: the n×n table of inner products between each basis function v_i and each rotated basis function T(v_j).]

Representation Theory Exhaustive Search (w/ Rep. Theory) Now suppose that we can decompose V into a collection of one-dimensional representations. That is, there exists an orthogonal basis {w_1, …, w_n} of functions such that T(w_i) ∈ w_i ℂ (each w_i is mapped to a complex multiple of itself) for all rotations T, and hence: ⟨w_i, T(w_j)⟩ = 0 for all i ≠ j.

Representation Theory Exhaustive Search (w/ Rep. Theory) Then we can express the shape descriptors in terms of this basis:
v_M = α_1 w_1 + … + α_n w_n
v_N = β_1 w_1 + … + β_n w_n

Representation Theory Exhaustive Search (w/ Rep. Theory) And the dot-product of M and N at a rotation T is equal to: ⟨v_M, T(v_N)⟩ = Σ_i α_i conj(β_i) ⟨w_i, T(w_i)⟩.

Representation Theory Exhaustive Search (w/ Rep. Theory) So only n multiplications are needed. [Figure: only the diagonal terms ⟨w_i, T(w_i)⟩ survive, one per basis function.]
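A numerical sketch of the saving, using the Fourier basis on the circle as an assumed concrete instance (the coefficients below are hypothetical): rotation by φ multiplies the k-th Fourier coefficient by a unit complex number, so the rotated correlation is a single sum over frequencies rather than n×n cross terms.

```python
import cmath
import math

alphas = {0: 1.0, 1: 0.5 - 0.25j, 2: 0.1j}   # assumed Fourier coefficients of v_M
betas  = {0: 0.3, 1: 0.2 + 0.1j, 2: -0.4}    # assumed Fourier coefficients of v_N
phi = 0.8                                    # rotation angle

def evaluate(coeffs, theta):
    # f(theta) = sum_k c_k e^{i k theta}
    return sum(c * cmath.exp(1j * k * theta) for k, c in coeffs.items())

# brute force: sample <v_M, T(v_N)> = (1/2pi) * integral f(t) conj(g(t - phi)) dt
n = 4096
d = 2 * math.pi / n
brute = sum(evaluate(alphas, k * d) * evaluate(betas, k * d - phi).conjugate()
            for k in range(n)) * d / (2 * math.pi)

# with representation theory: one multiplication per frequency,
# sum_k alpha_k * conj(beta_k) * e^{i k phi}
fast = sum(a * betas[k].conjugate() * cmath.exp(1j * k * phi)
           for k, a in alphas.items())

assert abs(brute - fast) < 1e-9
```

The two quantities agree because the exponentials are orthogonal on the circle, so all off-diagonal cross terms integrate to zero.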