Signal, Weight Vector Spaces and Linear Transformations

Notation
Vectors in ℝⁿ: x = [x1, x2, ..., xn]^T.
Generalized vectors: elements of any vector space (for example, polynomials or continuous functions).

Vector Space
A set X is a vector space over a field F if the following hold:
1. An operation called vector addition is defined such that if x ∈ X and y ∈ X, then x + y ∈ X.
2. x + y = y + x.
3. (x + y) + z = x + (y + z).
4. There is a unique vector 0 ∈ X, called the zero vector, such that x + 0 = x for all x ∈ X.
5. For each vector x ∈ X there is a unique vector in X, called (-x), such that x + (-x) = 0.

Vector Space (Cont.)
6. An operation called scalar multiplication is defined such that for all scalars a ∈ F and all vectors x ∈ X, ax ∈ X.
7. For any x ∈ X, 1x = x (for scalar 1).
8. For any two scalars a ∈ F and b ∈ F, and any x ∈ X, a(bx) = (ab)x.
9. (a + b)x = ax + bx.
10. a(x + y) = ax + ay.

Examples (Decision Boundaries)
Is the p2, p3 plane a vector space? (Yes: it contains the zero vector and is closed under vector addition and scalar multiplication.)

Examples (Decision Boundaries)
Is the line p1 + 2p2 - 2 = 0 a vector space? (No: for example, it does not contain the zero vector, and sums of points on the line leave the line.)

Other Vector Spaces
Polynomials of degree 2 or less.
Continuous functions on the interval [0, 1].
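Polynomials and continuous functions can be handled with the same vector machinery. As a rough numerical sketch (not from the slides; the grid size and the functions f, g are illustrative choices), sampling a function on a grid turns the function inner product ∫ f(t) g(t) dt into an ordinary dot product:

```python
import numpy as np

# Sketch: discretize functions on [0, 1] so they behave like vectors in R^n.
t = np.linspace(0.0, 1.0, 1001)
dt = t[1] - t[0]

f = t            # f(t) = t
g = 1.0 - t      # g(t) = 1 - t

# Riemann-sum approximation of the function inner product (f, g) = ∫ f g dt.
inner = np.sum(f * g) * dt
print(inner)     # ≈ 1/6, the exact value of ∫ t(1 - t) dt on [0, 1]
```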

Linear Independence
If a1 v1 + a2 v2 + ... + an vn = 0 implies that each ai = 0, then {v1, v2, ..., vn} is a set of linearly independent vectors.

Example
Let v1 and v2 be the given vectors, and suppose a1 v1 + a2 v2 = 0. This can only be true if a1 = a2 = 0; therefore the vectors are independent.
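The transcript omits the example's actual vectors, so here is a hedged numerical sketch with made-up v1, v2: stacking the vectors as columns, they are linearly independent exactly when the matrix has full column rank.

```python
import numpy as np

# Hypothetical example vectors (the originals are not shown in the transcript).
v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, 2.0, 3.0])

# Stack as columns; independence <=> full column rank.
A = np.column_stack([v1, v2])
rank = np.linalg.matrix_rank(A)
print(rank == A.shape[1])   # True -> linearly independent
```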

Spanning a Space
A subset of vectors spans a space if every vector in the space can be written as a linear combination of the vectors in the subset.

Basis Vectors
A set of basis vectors for the space X is a set of vectors that spans X and is linearly independent.
The dimension of a vector space, Dim(X), is equal to the number of vectors in the basis set.
If X is a finite-dimensional vector space, then every basis set of X has the same number of elements.

Example
Polynomials of degree 2 or less, with two basis sets, Basis A and Basis B. (Any three linearly independent vectors in the space will work.)
How can you represent the vector x = 1 + 2t using both basis sets?
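The slide's two basis sets are not preserved in this transcript, so the following sketch uses hypothetical choices: Basis A = {1, t, t²} and Basis B = {1, 1 + t, t²}. Representing each polynomial by its coefficient vector in the monomial basis, the coordinates of x = 1 + 2t in each basis come from solving a small linear system.

```python
import numpy as np

# Represent a polynomial a + b*t + c*t^2 by its coefficient vector [a, b, c].
x = np.array([1.0, 2.0, 0.0])   # x = 1 + 2t

# Hypothetical basis sets (the transcript omits the originals):
# Basis A: {1, t, t^2};  Basis B: {1, 1 + t, t^2}
A = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1]]).astype(float)
B = np.column_stack([[1, 0, 0], [1, 1, 0], [0, 0, 1]]).astype(float)

# Coordinates of x in each basis: solve  Basis @ coords = x.
coords_A = np.linalg.solve(A, x)   # [1, 2, 0]:   x = 1*1 + 2*t
coords_B = np.linalg.solve(B, x)   # [-1, 2, 0]:  x = -1*1 + 2*(1 + t)
```

The same abstract vector x thus gets a different column of numbers in each basis, which is the point of the slide.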

Inner Product / Norm
A scalar function of vectors x and y can be defined as an inner product, (x, y), provided the following are satisfied (for real inner products):
(x, y) = (y, x).
(x, a y1 + b y2) = a (x, y1) + b (x, y2).
(x, x) ≥ 0, where equality holds iff x = 0.
A scalar function of a vector x is called a norm, ||x||, provided the following are satisfied:
||x|| ≥ 0.
||x|| = 0 iff x = 0.
||a x|| = |a| ||x|| for scalar a.
||x + y|| ≤ ||x|| + ||y||.

Example
Standard Euclidean inner product: (x, y) = x^T y.
Standard Euclidean norm: ||x|| = (x, x)^(1/2) = (x^T x)^(1/2) = (x1^2 + x2^2 + ... + xn^2)^(1/2).
Angle: cos(θ) = (x, y) / (||x|| ||y||).
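These three Euclidean formulas are direct to compute; the vectors below are illustrative, not from the slides.

```python
import numpy as np

# Illustrative vectors (not from the transcript).
x = np.array([3.0, 4.0])
y = np.array([4.0, 3.0])

inner = x @ y                   # standard Euclidean inner product x^T y
norm_x = np.sqrt(x @ x)         # ||x|| = (x^T x)^(1/2)
norm_y = np.sqrt(y @ y)
theta = np.arccos(inner / (norm_x * norm_y))   # angle between x and y

print(norm_x)   # 5.0
print(inner)    # 24.0
```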

Orthogonality
Two vectors x, y ∈ X are orthogonal if (x, y) = 0.
Example: Any vector in the p2, p3 plane is orthogonal to the weight vector.

Gram-Schmidt Orthogonalization
Given independent vectors y1, y2, ..., yn, produce orthogonal vectors v1, v2, ..., vn.
Step 1: Set the first orthogonal vector to the first independent vector: v1 = y1.
Step 2: Subtract the portion of y2 that is in the direction of v1: v2 = y2 - a v1, where a is chosen so that v2 is orthogonal to v1: (v1, v2) = (v1, y2) - a (v1, v1) = 0, so a = (v1, y2) / (v1, v1).

Gram-Schmidt (Cont.)
The projection of y2 on v1 is a v1 = [(v1, y2) / (v1, v1)] v1.
Step k: Subtract the portion of yk that is in the direction of all previous vi:
vk = yk - Σ (i = 1 to k-1) [(vi, yk) / (vi, vi)] vi.
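The procedure above translates almost line for line into code. This is a minimal classical Gram-Schmidt sketch; the two input vectors are made up, not the slides' example.

```python
import numpy as np

def gram_schmidt(ys):
    """Classical Gram-Schmidt: turn independent vectors into orthogonal ones."""
    vs = []
    for y in ys:
        v = y.astype(float)
        # Subtract the projection of y onto each previous orthogonal vector.
        for u in vs:
            v = v - (u @ y) / (u @ u) * u
        vs.append(v)
    return vs

# Illustrative input (not from the transcript's example).
v1, v2 = gram_schmidt([np.array([2.0, 1.0]), np.array([1.0, 2.0])])
print(v1 @ v2)   # ~0: the outputs are orthogonal
```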

Example
Step 1: v1 = y1.
Step 2: v2 = y2 - [(v1, y2) / (v1, v1)] v1.

Vector Expansion
If a vector space X has a basis set {v1, v2, ..., vn}, then any x ∈ X has a unique vector expansion:
x = x1 v1 + x2 v2 + ... + xn vn.
If the basis vectors are orthogonal and we take the inner product of vj and x:
(vj, x) = Σ_i xi (vj, vi) = xj (vj, vj).
Therefore the coefficients of the expansion can be computed:
xj = (vj, x) / (vj, vj).
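For an orthogonal basis the coefficient formula xj = (vj, x) / (vj, vj) needs no matrix inverse. A short sketch with a hypothetical orthogonal (but not orthonormal) basis of ℝ²:

```python
import numpy as np

# Hypothetical orthogonal basis of R^2 and a vector to expand.
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
x = np.array([3.0, 1.0])

# x_j = (v_j, x) / (v_j, v_j) for an orthogonal basis.
x1 = (v1 @ x) / (v1 @ v1)   # 4/2 = 2
x2 = (v2 @ x) / (v2 @ v2)   # 2/2 = 1

# The expansion reconstructs x.
print(x1 * v1 + x2 * v2)    # [3. 1.]
```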

Column of Numbers
The vector expansion provides a meaning for writing a vector as a column of numbers:
x = [x1, x2, ..., xn]^T.
To interpret x, we need to know what basis was used for the expansion.

Reciprocal Basis Vectors
Definition of reciprocal basis vectors ri:
(ri, vj) = 0 for i ≠ j,
(ri, vj) = 1 for i = j,
where the basis vectors are {v1, v2, ..., vn} and the reciprocal basis vectors are {r1, r2, ..., rn}.
For vectors in ℝⁿ we can use the following inner product: (ri, vj) = ri^T vj.
Therefore, the equations for the reciprocal basis vectors become:
R^T B = I, so R^T = B^(-1), where B = [v1 v2 ... vn] and R = [r1 r2 ... rn].

Vector Expansion
Take the inner product of the first reciprocal basis vector with the vector to be expanded:
(r1, x) = x1 (r1, v1) + x2 (r1, v2) + ... + xn (r1, vn).
By definition of the reciprocal basis vectors:
(r1, v2) = (r1, v3) = ... = (r1, vn) = 0 and (r1, v1) = 1.
Therefore, the first coefficient in the expansion is x1 = (r1, x).
In general, we then have (even for nonorthogonal basis vectors):
xj = (rj, x).

Example Basis Vectors: Vector to Expand:

Example (Cont.) Reciprocal Basis Vectors: Expansion Coefficients: Matrix Form:
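The transcript drops the example's numbers, so here is a hedged sketch with a made-up nonorthogonal basis of ℝ²: the reciprocal basis vectors are the rows of B⁻¹, and each expansion coefficient is an inner product xj = (rj, x).

```python
import numpy as np

# Hypothetical nonorthogonal basis of R^2 and a vector given in the
# standard basis (the transcript omits the original numbers).
v1 = np.array([2.0, 0.0])
v2 = np.array([1.0, 2.0])
x = np.array([3.0, 2.0])

B = np.column_stack([v1, v2])

# Reciprocal basis: R^T = B^(-1), i.e. the rows of B^(-1) are r1^T, r2^T.
R_T = np.linalg.inv(B)
r1, r2 = R_T[0], R_T[1]

# Expansion coefficients x_j = (r_j, x); equivalently, solve B @ coords = x.
coords = np.array([r1 @ x, r2 @ x])
print(coords)                            # coefficients of x in {v1, v2}
print(coords[0] * v1 + coords[1] * v2)   # reconstructs x
```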

Example (Cont.) The interpretation of the column of numbers depends on the basis set used for the expansion.