Linear Vector Space and Matrix Mechanics


Linear Vector Space and Matrix Mechanics
Chapter 1, Lecture 1.4
Dr. Arvind Kumar, Physics Department, NIT Jalandhar
e-mail: iitd.arvind@gmail.com
https://sites.google.com/site/karvindk2013/

Definition of the projection of a vector: Consider two vectors a and b, and let θ be the angle between them.

Scalar projection of b along a: |b| cos θ = (a · b)/|a|.

Vector projection of b along the direction of a:

proj_a b = ((a · b)/|a|^2) a.

If a is of unit norm, |a| = 1, this reduces to proj_a b = (a · b) a.
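As a sketch, the projection formulas above can be checked numerically in Python with NumPy (the function names are illustrative, not from the lecture):

```python
import numpy as np

def scalar_projection(b, a):
    """Scalar projection of b along a: |b| cos(theta) = (a . b) / |a|."""
    return np.dot(a, b) / np.linalg.norm(a)

def vector_projection(b, a):
    """Vector projection of b along a: ((a . b) / |a|^2) a."""
    return (np.dot(a, b) / np.dot(a, a)) * a

a = np.array([1.0, 0.0])   # unit vector along x
b = np.array([3.0, 4.0])
print(scalar_projection(b, a))  # 3.0
print(vector_projection(b, a))  # [3. 0.]
```

Since a here has unit norm, the vector projection reduces to (a · b) a, as stated above.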

Gram-Schmidt orthonormalization method: A set of linearly independent basis vectors, each of unit norm and pairwise orthogonal, is said to form an orthonormal basis. E.g., two basis vectors |1> and |2> form an orthonormal basis if

<1|1> = <2|2> = 1   and   <1|2> = <2|1> = 0.

Gram-Schmidt orthonormalization is a procedure that converts a given linearly independent basis into an orthonormal basis. Let |I>, |II>, |III>, ... be the linearly independent basis vectors which we want to orthonormalize.

Step 1: Rescale the first vector by its norm. We get the 1st vector of the orthonormalized set:

|1> = |I> / |I|,   where |I| = sqrt(<I|I>).

Note that <1|1> = <I|I> / |I|^2 = 1.

Step 2: Subtract from the 2nd vector its projection along the 1st vector, leaving behind only the part perpendicular to the 1st. We get

|2'> = |II> - |1><1|II>.

Note that the above vector is orthogonal to |1>:

<1|2'> = <1|II> - <1|1><1|II> = 0.

Dividing by its norm, we get the 2nd basis vector of the orthonormalized set, which is orthogonal to the 1st and also of unit norm:

|2> = |2'> / |2'|,   <2|2> = 1.

Step 3: Consider

|3'> = |III> - |1><1|III> - |2><2|III>.

The above vector is orthogonal to both |1> and |2>. Dividing by its norm, we get the third vector of the orthonormal basis set:

|3> = |3'> / |3'|.

The above procedure can be extended to the remaining vectors of the linearly independent basis to form the complete orthonormal basis.
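The steps above can be sketched as a short Python/NumPy routine (a minimal illustration, not the lecture's own code):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis built from linearly independent input vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for e in basis:
            w = w - np.dot(e, w) * e         # subtract projection along each earlier vector
        basis.append(w / np.linalg.norm(w))  # rescale to unit norm
    return basis

vecs = [[1, 1, 0], [1, 0, 1], [0, 1, 1]]     # linearly independent, not orthogonal
basis = gram_schmidt(vecs)
gram = np.array([[np.dot(a, b) for b in basis] for a in basis])
print(np.round(gram, 10))  # identity matrix: pairwise orthogonal, each of unit norm
```

The Gram matrix of inner products coming out as the identity confirms that the output vectors are pairwise orthogonal and normalized.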

Schwarz inequality: This generalizes, to an arbitrary inner product space, the familiar statement about the angle between two vectors: the dot product of two vectors cannot exceed the product of their lengths. The Schwarz inequality is

|<V|W>|^2 <= <V|V><W|W>.

Proof: We define

|Z> = |V> - (<W|V> / <W|W>) |W>.

To prove the Schwarz inequality we make use of the fact that <Z|Z> >= 0, since the norm of any vector is non-negative. First note that

<W|Z> = <W|V> - (<W|V>/<W|W>) <W|W> = 0, -------(1)

so |Z> is orthogonal to |W>. Expanding <Z|Z> and using (1),

<Z|Z> = <V|Z> - (<V|W>/<W|W>) <W|Z> = <V|Z>. ---------(2)

Also

<V|Z> = <V|V> - (<W|V>/<W|W>) <V|W> = <V|V> - |<V|W>|^2 / <W|W>, ------(3)

where <V|W><W|V> = |<V|W>|^2 is a real, non-negative quantity. Using (2) and (3) in <Z|Z> >= 0, we get

<V|V> - |<V|W>|^2 / <W|W> >= 0,

which implies

|<V|W>|^2 <= <V|V><W|W>,

which is the Schwarz inequality.
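The inequality, and the orthogonality of |Z> to |W> used in the proof, can be verified numerically; a minimal Python/NumPy sketch with random complex vectors (variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Random complex vectors |V>, |W> in C^5
V = rng.standard_normal(5) + 1j * rng.standard_normal(5)
W = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# np.vdot conjugates its first argument, matching <V|W>
lhs = abs(np.vdot(V, W)) ** 2                   # |<V|W>|^2
rhs = np.vdot(V, V).real * np.vdot(W, W).real   # <V|V><W|W>
print(lhs <= rhs)  # True

# The vector |Z> from the proof is orthogonal to |W>:
Z = V - (np.vdot(W, V) / np.vdot(W, W)) * W
print(abs(np.vdot(W, Z)) < 1e-12)  # True
```

Note that <V|V> and <W|W> are real by construction, so taking .real only discards numerical round-off in the imaginary part.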

Exercise: Prove the triangle inequality: for any two vectors |V> and |W>,

|V + W| <= |V| + |W|.
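As a quick numerical sanity check of the exercise's claim (not a proof), in Python/NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(4)
w = rng.standard_normal(4)

# Norm of the sum never exceeds the sum of the norms
print(np.linalg.norm(v + w) <= np.linalg.norm(v) + np.linalg.norm(w))  # True
```

The proof itself follows from expanding |V + W|^2 and applying the Schwarz inequality to the cross terms.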