Chapter 7 Inner Product Spaces


Linear Algebra, 大葉大學 資訊工程系 黃鈴玲 (Da-Yeh University, Dept. of Computer Science and Information Engineering)

Inner Product Spaces In this chapter we extend concepts of Rn such as the dot product of two vectors, the norm of a vector, the angle between vectors, and the distance between points to a general vector space. This will enable us to talk about the magnitudes of functions and about orthogonal functions. These concepts are used to approximate functions by polynomials, a technique used to implement functions on calculators and computers. We will no longer be restricted to Euclidean geometry; we will be able to create our own geometries on Rn.

7.1 Inner Product Spaces The dot product was a key concept in Rn that led to the definitions of norm, angle, and distance. Our approach is to generalize the dot product of Rn to a general vector space via a mathematical structure called an inner product. The inner product, in turn, is used to define norm, angle, and distance for a general vector space.

Definition An inner product on a real vector space V is a function that associates a number, denoted 〈u, v〉, with each pair of vectors u and v of V. This function must satisfy the following conditions for all vectors u, v, and w and every scalar c.
1. 〈u, v〉 = 〈v, u〉 (symmetry axiom)
2. 〈u + v, w〉 = 〈u, w〉 + 〈v, w〉 (additive axiom)
3. 〈cu, v〉 = c〈u, v〉 (homogeneity axiom)
4. 〈u, u〉 ≥ 0, and 〈u, u〉 = 0 if and only if u = 0 (positive definite axiom)
A vector space V on which an inner product is defined is called an inner product space. Any function on a vector space that satisfies these axioms defines an inner product on the space; there can be many inner products on a given vector space.

Example 1 Let u = (x1, x2), v = (y1, y2), and w = (z1, z2) be arbitrary vectors in R². Prove that 〈u, v〉, defined as follows, is an inner product on R².
〈u, v〉 = x1y1 + 4x2y2
Determine the inner product of the vectors (-2, 5) and (3, 1) under this inner product.
Solution
Axiom 1: 〈u, v〉 = x1y1 + 4x2y2 = y1x1 + 4y2x2 = 〈v, u〉
Axiom 2: 〈u + v, w〉 = 〈(x1, x2) + (y1, y2), (z1, z2)〉 = 〈(x1 + y1, x2 + y2), (z1, z2)〉 = (x1 + y1)z1 + 4(x2 + y2)z2 = x1z1 + 4x2z2 + y1z1 + 4y2z2 = 〈(x1, x2), (z1, z2)〉 + 〈(y1, y2), (z1, z2)〉 = 〈u, w〉 + 〈v, w〉

Axiom 3: 〈cu, v〉 = 〈c(x1, x2), (y1, y2)〉 = 〈(cx1, cx2), (y1, y2)〉 = cx1y1 + 4cx2y2 = c(x1y1 + 4x2y2) = c〈u, v〉
Axiom 4: 〈u, u〉 = 〈(x1, x2), (x1, x2)〉 = x1² + 4x2² ≥ 0. Further, x1² + 4x2² = 0 if and only if x1 = 0 and x2 = 0, that is, u = 0. Thus 〈u, u〉 ≥ 0, and 〈u, u〉 = 0 if and only if u = 0.
The four inner product axioms are satisfied, so 〈u, v〉 = x1y1 + 4x2y2 is an inner product on R².
The inner product of the vectors (-2, 5) and (3, 1) is
〈(-2, 5), (3, 1)〉 = (-2)(3) + 4(5)(1) = 14
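The computation above can be checked numerically. Here is a small Python sketch (not part of the original slides; the helper name `inner` is ours) of this weighted inner product on R²:

```python
import numpy as np

# Weighted inner product <u, v> = x1*y1 + 4*x2*y2 on R^2 (Example 1).
def inner(u, v):
    weights = np.array([1.0, 4.0])  # the "4" that distinguishes it from the dot product
    return float(np.sum(weights * np.asarray(u, dtype=float) * np.asarray(v, dtype=float)))

print(inner((-2, 5), (3, 1)))   # (-2)(3) + 4(5)(1) = 14.0
print(inner((3, 1), (-2, 5)))   # symmetry axiom: same value
```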

Example 2 Consider the vector space M22 of 2 × 2 matrices. Let u and v be arbitrary 2 × 2 matrices,
u = [a b; c d], v = [e f; g h].
Prove that the following function is an inner product on M22.
〈u, v〉 = ae + bf + cg + dh
Determine the inner product of the matrices.
Solution
Axiom 1: 〈u, v〉 = ae + bf + cg + dh = ea + fb + gc + hd = 〈v, u〉
Axiom 3: Let k be a scalar. Then 〈ku, v〉 = kae + kbf + kcg + kdh = k(ae + bf + cg + dh) = k〈u, v〉
(The remaining axioms are verified similarly.)

Example 3 Consider the vector space Pn of polynomials of degree ≤ n. Let f and g be elements of Pn. Prove that the following function defines an inner product on Pn.
〈f, g〉 = ∫₀¹ f(x)g(x) dx
Determine the inner product of the polynomials f(x) = x² + 2x − 1 and g(x) = 4x + 1.
Solution
Axiom 1: 〈f, g〉 = ∫₀¹ f(x)g(x) dx = ∫₀¹ g(x)f(x) dx = 〈g, f〉
Axiom 2: 〈f + g, h〉 = ∫₀¹ [f(x) + g(x)]h(x) dx = ∫₀¹ f(x)h(x) dx + ∫₀¹ g(x)h(x) dx = 〈f, h〉 + 〈g, h〉

We now find the inner product of the functions f(x) = x² + 2x − 1 and g(x) = 4x + 1:
〈f, g〉 = ∫₀¹ (x² + 2x − 1)(4x + 1) dx = ∫₀¹ (4x³ + 9x² − 2x − 1) dx = [x⁴ + 3x³ − x² − x]₀¹ = 2
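A quick numerical check of this integral inner product, sketched in Python with NumPy's polynomial class (the helper name `poly_inner` is ours; coefficients are listed from the constant term up):

```python
from numpy.polynomial import Polynomial as P

# <f, g> = integral from 0 to 1 of f(x)g(x) dx, computed exactly
# by integrating the product polynomial.
def poly_inner(f, g):
    F = (f * g).integ()          # an antiderivative of f*g
    return float(F(1.0) - F(0.0))

f = P([-1, 2, 1])   # f(x) = x^2 + 2x - 1
g = P([1, 4])       # g(x) = 4x + 1
print(poly_inner(f, g))  # 2.0, matching the hand computation
```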

Norm of a Vector The norm of a vector in Rn can be expressed in terms of the dot product as
||u|| = √(u · u) = √(x1² + ⋯ + xn²)
We generalize this definition. Norms in a general vector space do not necessarily have geometric interpretations, but they are often important in numerical work.
Definition Let V be an inner product space. The norm of a vector v is denoted ||v|| and is defined by
||v|| = √〈v, v〉

Example 4 Consider the vector space Pn of polynomials with inner product
〈f, g〉 = ∫₀¹ f(x)g(x) dx
The norm of a function f generated by this inner product is
||f|| = √〈f, f〉 = √(∫₀¹ f(x)² dx)
Determine the norm of the function f(x) = 5x² + 1.
Solution
Using the above definition of norm, we get
||f||² = ∫₀¹ (5x² + 1)² dx = ∫₀¹ (25x⁴ + 10x² + 1) dx = [5x⁵ + (10/3)x³ + x]₀¹ = 28/3
The norm of the function f(x) = 5x² + 1 is √(28/3).

Example 2′ (supplementary) Consider the vector space M22 of 2 × 2 matrices, with u = [a b; c d]. By Example 2, the function 〈u, v〉 = ae + bf + cg + dh is an inner product on M22. The norm it induces is
||u|| = √〈u, u〉 = √(a² + b² + c² + d²)

Angle between two vectors The dot product in Rn was used to define the angle between vectors: the angle θ between vectors u and v in Rn is defined by
cos θ = (u · v) / (||u|| ||v||)
Definition Let V be an inner product space. The angle θ between two nonzero vectors u and v in V is given by
cos θ = 〈u, v〉 / (||u|| ||v||)

Example 5 Consider the inner product space Pn of polynomials with inner product
〈f, g〉 = ∫₀¹ f(x)g(x) dx
The angle between two nonzero functions f and g is given by
cos θ = 〈f, g〉 / (||f|| ||g||)
Determine the cosine of the angle between the functions f(x) = 5x² and g(x) = 3x.
Solution
We first compute ||f|| and ||g||:
||f|| = √(∫₀¹ 25x⁴ dx) = √5, ||g|| = √(∫₀¹ 9x² dx) = √3
Thus
cos θ = 〈f, g〉 / (||f|| ||g||) = (∫₀¹ 15x³ dx) / (√5 √3) = (15/4) / √15 = √15/4
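The norms and the angle of Example 5 can be checked with the same integral inner product; a Python sketch of ours (helper names `poly_inner` and `poly_norm` are not from the slides):

```python
import math
from numpy.polynomial import Polynomial as P

# <f, g> = integral from 0 to 1 of f(x)g(x) dx; norm and angle follow from it.
def poly_inner(f, g):
    F = (f * g).integ()
    return float(F(1.0) - F(0.0))

def poly_norm(f):
    return math.sqrt(poly_inner(f, f))

f = P([0, 0, 5])    # f(x) = 5x^2
g = P([0, 3])       # g(x) = 3x
cos_theta = poly_inner(f, g) / (poly_norm(f) * poly_norm(g))
print(poly_norm(f), poly_norm(g), cos_theta)  # sqrt(5), sqrt(3), sqrt(15)/4
```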

Example 2″ (supplementary) Consider the vector space M22 of 2 × 2 matrices with the inner product 〈u, v〉 = ae + bf + cg + dh of Example 2 and the norm it induces. The angle between u and v is given by
cos θ = 〈u, v〉 / (||u|| ||v||) = (ae + bf + cg + dh) / (√(a² + b² + c² + d²) √(e² + f² + g² + h²))

Orthogonal Vectors Definition Let V be an inner product space. Two nonzero vectors u and v in V are said to be orthogonal if 〈u, v〉 = 0.
Example 6 Show that the functions f(x) = 3x − 2 and g(x) = x are orthogonal in Pn with inner product 〈f, g〉 = ∫₀¹ f(x)g(x) dx.
Solution
〈f, g〉 = ∫₀¹ (3x − 2)x dx = ∫₀¹ (3x² − 2x) dx = [x³ − x²]₀¹ = 0
Thus the functions f and g are orthogonal in this inner product space.
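The orthogonality of f(x) = 3x − 2 and g(x) = x can likewise be verified numerically (a sketch of ours, using the same kind of helper as earlier):

```python
from numpy.polynomial import Polynomial as P

# Verify <f, g> = integral from 0 to 1 of (3x - 2) * x dx = 0.
def poly_inner(f, g):
    F = (f * g).integ()
    return float(F(1.0) - F(0.0))

f = P([-2, 3])   # f(x) = 3x - 2
g = P([0, 1])    # g(x) = x
print(poly_inner(f, g))  # 0.0: f and g are orthogonal
```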

Distance As with norm, the concept of distance need not have a direct geometric interpretation. It is, however, useful in numerical mathematics to be able to discuss how far apart various functions are.
Definition Let V be an inner product space with vector norm ||v|| = √〈v, v〉. The distance between two vectors (points) u and v is denoted d(u, v) and is defined by
d(u, v) = ||u − v||

Example 7 Consider the inner product space Pn of polynomials discussed earlier. Determine which of the functions g(x) = x² − 3x + 5 or h(x) = x² + 4 is closest to f(x) = x².
Solution
f − g = 3x − 5 and f − h = −4, so
d(f, g)² = ∫₀¹ (3x − 5)² dx = 13 and d(f, h)² = ∫₀¹ (−4)² dx = 16
Thus d(f, g) = √13 ≈ 3.61 and d(f, h) = 4; as we might suspect, g is closer than h to f.
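The two distances of Example 7, checked in a short Python sketch (the helper name `poly_dist` is ours):

```python
import math
from numpy.polynomial import Polynomial as P

# d(f, g) = ||f - g|| under <f, g> = integral from 0 to 1 of f(x)g(x) dx.
def poly_dist(f, g):
    diff = f - g
    F = (diff * diff).integ()
    return math.sqrt(float(F(1.0) - F(0.0)))

f = P([0, 0, 1])    # x^2
g = P([5, -3, 1])   # x^2 - 3x + 5
h = P([4, 0, 1])    # x^2 + 4
print(poly_dist(f, g), poly_dist(f, h))  # sqrt(13) ≈ 3.606 and 4.0: g is closer
```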

Inner Product on Cn For a complex vector space, the first axiom of the inner product is modified to read 〈u, v〉 = 〈v, u〉‾ (the complex conjugate of 〈v, u〉). An inner product can then be used to define norm, orthogonality, and distance, as for a real vector space. Let u = (x1, …, xn) and v = (y1, …, yn) be elements of Cn. The most useful inner product for Cn is
〈u, v〉 = x1ȳ1 + ⋯ + xnȳn
where ȳi denotes the complex conjugate of yi.

Example 8 Consider the vectors u = (2 + 3i, -1 + 5i), v = (1 + i, -i) in C². Compute (a) 〈u, v〉, and show that u and v are orthogonal. (b) ||u|| and ||v|| (c) d(u, v)
Solution
(a) 〈u, v〉 = (2 + 3i)(1 − i) + (−1 + 5i)(i) = (5 + i) + (−5 − i) = 0, so u and v are orthogonal.
(b) ||u|| = √(|2 + 3i|² + |−1 + 5i|²) = √(13 + 26) = √39; ||v|| = √(|1 + i|² + |−i|²) = √3
(c) u − v = (1 + 2i, −1 + 6i), so d(u, v) = ||u − v|| = √(5 + 37) = √42
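Example 8 can be reproduced with NumPy; note that `np.vdot(a, b)` conjugates its first argument, so 〈u, v〉 = Σ xᵢȳᵢ is `np.vdot(v, u)` (a sketch of ours, not from the slides):

```python
import numpy as np

u = np.array([2 + 3j, -1 + 5j])
v = np.array([1 + 1j, -1j])

ip = np.vdot(v, u)                          # <u, v> = sum of u_i * conj(v_i)
norm_u = np.sqrt(np.vdot(u, u).real)        # sqrt(39)
norm_v = np.sqrt(np.vdot(v, v).real)        # sqrt(3)
dist = np.sqrt(np.vdot(u - v, u - v).real)  # sqrt(42)
print(ip, norm_u, norm_v, dist)
```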

Homework Exercise 7.1: 1, 4, 8(a), 9(a), 10, 12, 13, 15, 17(a), 19, 20(a)

7.2 Non-Euclidean Geometry and Special Relativity Different inner products on Rn lead to different measures of vector norm, angle, and distance, that is, to different geometries:
dot product → Euclidean geometry
other inner products → non-Euclidean geometries
Example Let u = (x1, x2), v = (y1, y2) be arbitrary vectors in R². It was proved in Example 1 that 〈u, v〉, defined as follows, is an inner product on R².
〈u, v〉 = x1y1 + 4x2y2
This inner product differs from the dot product in the appearance of the 4. Consider the vector (0, 1) in this space. Its norm is
||(0, 1)|| = √〈(0, 1), (0, 1)〉 = √(0 + 4(1)) = 2

The norm of this vector in Euclidean geometry is 1; in our new geometry, however, the norm is 2. Figure 7.1

Consider the vectors (1, 1) and (-4, 1). The inner product of these vectors is
〈(1, 1), (-4, 1)〉 = (1)(-4) + 4(1)(1) = 0
Thus these two vectors are orthogonal. Figure 7.2

Let us use the definition of distance based on this inner product to compute the distance between the points (1, 0) and (0, 1). We have
d((1, 0), (0, 1)) = ||(1, 0) − (0, 1)|| = ||(1, −1)|| = √(1 + 4(1)) = √5
Figure 7.3
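The three computations of this section under 〈u, v〉 = x1y1 + 4x2y2, collected in one Python sketch (helper names are ours):

```python
import math

# Geometry induced on R^2 by <u, v> = x1*y1 + 4*x2*y2.
def ip(u, v):
    return u[0] * v[0] + 4 * u[1] * v[1]

def norm(u):
    return math.sqrt(ip(u, u))

def dist(u, v):
    return norm((u[0] - v[0], u[1] - v[1]))

print(norm((0, 1)))            # 2.0, not the Euclidean 1
print(ip((1, 1), (-4, 1)))     # 0: orthogonal in this geometry
print(dist((1, 0), (0, 1)))    # sqrt(5), not the Euclidean sqrt(2)
```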

7.4 Least-Squares Curves Goal: find a polynomial that best fits given data points. Consider a system Ax = y:
(1) If there are n equations in n variables and A⁻¹ exists, then x = A⁻¹y.
(2) If there are n equations in m variables with n > m, the system is overdetermined.
How do we solve an overdetermined system? We introduce a matrix called the pseudoinverse of A, denoted pinv(A), that leads to a least-squares solution x = pinv(A)y.

Definition Let A be a matrix. The matrix (AᵗA)⁻¹Aᵗ is called the pseudoinverse of A and is denoted pinv(A).
Example 1 Find the pseudoinverse of A =
Solution

Let Ax = y be a system of n linear equations in m variables with n > m, where A is of rank m. Then AᵗA is invertible, and
Ax = y ⇒ AᵗAx = Aᵗy ⇒ x = (AᵗA)⁻¹Aᵗy
so the system Ax = y has least-squares solution x = pinv(A)y.
If the system Ax = y has a unique solution, the least-squares solution is that unique solution. If the system is overdetermined, the least-squares solution is the closest we can get to a true solution. Since rank(A) = m, the system cannot have many solutions.
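A sketch (ours) of the formula pinv(A) = (AᵗA)⁻¹Aᵗ for a matrix of full column rank, checked against NumPy's built-in least-squares routine; the matrix below is an illustrative example, not the one from the slides:

```python
import numpy as np

# pinv(A) = (A^T A)^{-1} A^T, valid when A has full column rank.
def pinv_full_rank(A):
    A = np.asarray(A, dtype=float)
    return np.linalg.inv(A.T @ A) @ A.T

A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # 3 equations, 2 variables
y = np.array([1.0, 2.0, 2.0])
x = pinv_full_rank(A) @ y
print(x)                                     # least-squares solution
print(np.linalg.lstsq(A, y, rcond=None)[0])  # same answer from NumPy
```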

Example 2 Find the least-squares solution of the following overdetermined system of equations, and sketch the solution.
Solution
The matrix of coefficients A has rank(A) = 2, so AᵗA is invertible.

The least-squares solution is x = pinv(A)y; it is the point shown in Figure 7.9.

Least-Squares Curves The least-squares line or curve minimizes the sum of the squares of the vertical distances from the data points to the line or curve, d1² + d2² + ⋯ + dn² (Figure 7.10).

Example 3 Find the least-squares line for the data points (1, 1), (2, 4), (3, 2), (4, 4).
Solution
Let the equation of the line be y = a + bx. Substituting these points into the equation of the line, we get the overdetermined system
a + b = 1
a + 2b = 4
a + 3b = 2
a + 4b = 4
We find the least-squares solution. The matrix of coefficients A and column vector y are
A = [1 1; 1 2; 1 3; 1 4], y = (1, 4, 2, 4)ᵗ
It can be shown that
pinv(A) = (AᵗA)⁻¹Aᵗ = [1 0.5 0 -0.5; -0.3 -0.1 0.1 0.3]

The least-squares solution is
x = pinv(A)y = (1, 0.7)ᵗ
Thus a = 1, b = 0.7. The equation of the least-squares line for these data is y = 1 + 0.7x. Figure 7.11
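Example 3 can be reproduced with a few lines of NumPy (a sketch of ours):

```python
import numpy as np

# Least-squares line y = a + b*x for the data of Example 3.
pts = [(1, 1), (2, 4), (3, 2), (4, 4)]
A = np.array([[1.0, x] for x, _ in pts])   # columns: 1, x
y = np.array([float(yv) for _, yv in pts])
a, b = np.linalg.inv(A.T @ A) @ A.T @ y    # x = pinv(A) y
print(a, b)  # 1.0 and 0.7, i.e. y = 1 + 0.7x
```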

Example 4 Find the least-squares parabola for the data points (1, 7), (2, 2), (3, 1), (4, 3).
Solution
Let the equation of the parabola be y = a + bx + cx². Substituting these points into the equation of the parabola, we get the system
a + b + c = 7
a + 2b + 4c = 2
a + 3b + 9c = 1
a + 4b + 16c = 3
We find the least-squares solution. The matrix of coefficients A and column vector y are
A = [1 1 1; 1 2 4; 1 3 9; 1 4 16], y = (7, 2, 1, 3)ᵗ

The least-squares solution is
x = pinv(A)y = (15.25, -10.05, 1.75)ᵗ
Thus a = 15.25, b = -10.05, c = 1.75. The equation of the least-squares parabola for these data points is y = 15.25 - 10.05x + 1.75x². Figure 7.12
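And the parabola of Example 4 (a sketch of ours):

```python
import numpy as np

# Least-squares parabola y = a + b*x + c*x^2 for the data of Example 4.
pts = [(1, 7), (2, 2), (3, 1), (4, 3)]
A = np.array([[1.0, x, x**2] for x, _ in pts])  # columns: 1, x, x^2
y = np.array([float(yv) for _, yv in pts])
a, b, c = np.linalg.inv(A.T @ A) @ A.T @ y
print(a, b, c)  # 15.25, -10.05, 1.75
```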

Theorem 7.1 Let (x1, y1), …, (xn, yn) be a set of n data points, and let y = a0 + a1x + ⋯ + amx^m be a polynomial of degree m (n > m) that is to be fitted to these points. Substituting these points into the polynomial leads to a system Ax = y of n linear equations in the m + 1 variables a0, …, am, where the ith row of A is (1, xi, xi², …, xi^m) and y = (y1, …, yn)ᵗ. The least-squares solution to this system gives the coefficients of the least-squares polynomial for these data points. In Figure 7.13, y′ is the projection of y onto range(A).
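The matrix A of Theorem 7.1 is a Vandermonde matrix, which NumPy can build directly; a sketch of ours refitting the Example 4 data:

```python
import numpy as np

# Degree-m least-squares polynomial via the Vandermonde matrix of Theorem 7.1.
xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = np.array([7.0, 2.0, 1.0, 3.0])
m = 2
A = np.vander(xs, m + 1, increasing=True)       # row i: (1, x_i, x_i^2)
coeffs = np.linalg.lstsq(A, ys, rcond=None)[0]  # a0, a1, a2
print(coeffs)  # [15.25, -10.05, 1.75], the parabola of Example 4
```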

Homework Exercise 7.4 3, 11, 21