Multivariable Differentiation


Lectures on Calculus: Multivariable Differentiation

by William M. Faucette, University of West Georgia

Adapted from Calculus on Manifolds by Michael Spivak

Multivariable Differentiation Recall that a function f: R → R is differentiable at a ∈ R if there is a number f′(a) such that

lim_{h→0} (f(a+h) − f(a)) / h = f′(a).

Multivariable Differentiation This definition makes no sense for functions f: R^n → R^m for several reasons, not the least of which is that you cannot divide by a vector.

Multivariable Differentiation However, we can rewrite this definition so that it can be generalized to several variables. First, rewrite the definition this way:

lim_{h→0} |f(a+h) − f(a) − f′(a)h| / |h| = 0.

Multivariable Differentiation Notice that the function taking h to f′(a)h is a linear transformation from R to R. So we can view f′(a) as being a linear transformation, at least in the one-dimensional case.

Multivariable Differentiation So, we define a function f: R^n → R^m to be differentiable at a ∈ R^n if there exists a linear transformation λ from R^n to R^m so that

lim_{h→0} |f(a+h) − f(a) − λ(h)| / |h| = 0.

Multivariable Differentiation Notice that taking the length here is essential, since the numerator f(a+h) − f(a) − λ(h) is a vector in R^m and the denominator h is a vector in R^n.

Multivariable Differentiation Definition: The linear transformation λ is denoted Df(a) and called the derivative of f at a, provided

lim_{h→0} |f(a+h) − f(a) − λ(h)| / |h| = 0.

Multivariable Differentiation Notice that for f: R^n → R^m, the derivative Df(a): R^n → R^m is a linear transformation. Df(a) is the linear transformation most closely approximating the map f at a, in the sense that

lim_{h→0} |f(a+h) − f(a) − Df(a)(h)| / |h| = 0.

Multivariable Differentiation For a function f: R^n → R^m, the derivative Df(a) is unique if it exists. This result will follow from what we do later.

Multivariable Differentiation Since Df(a) is a linear transformation, we can give its matrix with respect to the standard bases on R^n and R^m. This matrix is an m×n matrix called the Jacobian matrix of f at a. We will see how to compute this matrix shortly.
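Before computing the Jacobian symbolically, it can be approximated numerically. The sketch below (not from the slides; the helper name and example map are my own) estimates each column of the m×n matrix by a central difference in one coordinate direction:

```python
import numpy as np

def jacobian_fd(f, a, eps=1e-6):
    """Approximate the m x n Jacobian matrix of f: R^n -> R^m at a,
    one column per coordinate direction, via central differences."""
    a = np.asarray(a, dtype=float)
    m, n = np.asarray(f(a)).size, a.size
    J = np.zeros((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (np.asarray(f(a + e)) - np.asarray(f(a - e))) / (2 * eps)
    return J

# f(x, y) = (xy, x + y^2) has Jacobian [[y, x], [1, 2y]],
# which at (1, 2) is [[2, 1], [1, 4]].
f = lambda v: np.array([v[0] * v[1], v[0] + v[1] ** 2])
print(jacobian_fd(f, [1.0, 2.0]))
```

The j-th column approximates the directional derivative of f along the j-th standard basis vector, which is exactly how the partial derivatives entering the Jacobian are defined later in these slides.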

Our First Lemma

Lemma 1 Lemma: If f: R^n → R^m is a linear transformation, then Df(a) = f.

Lemma 1 Proof: Let λ = f. Then, by linearity,

lim_{h→0} |f(a+h) − f(a) − λ(h)| / |h| = lim_{h→0} |f(a) + f(h) − f(a) − f(h)| / |h| = 0. QED

Our Second Lemma

Lemma 2 Lemma: Let T: R^m → R^n be a linear transformation. Then there is a number M such that |T(h)| ≤ M|h| for all h ∈ R^m.

Lemma 2 Proof: Let A be the matrix of T with respect to the standard bases for R^m and R^n, so A is an n×m matrix [a_ij]. If A is the zero matrix, then T is the zero linear transformation and there is nothing to prove. So assume A ≠ 0, and let K = max{|a_ij|} > 0.

Lemma 2 Proof: Then each component of T(h) satisfies

|(T(h))_i| = |Σ_j a_ij h_j| ≤ Σ_j |a_ij| |h_j| ≤ K Σ_j |h| = Km|h|.

So, measuring the length of T(h) by its largest component (with the Euclidean length, M = Km√n works), we need only let M = Km. QED
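A quick numerical check of the lemma (my own sketch, not from the slides): since the Euclidean norm of a vector is at most √n times its largest component, the bound |T(h)| ≤ Km√n·|h| must hold for every h, which we can verify on random inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 4))      # matrix of some T: R^4 -> R^3 (n = 3, m = 4)
K = np.abs(A).max()              # K = max |a_ij|
n, m = A.shape
M = K * m * np.sqrt(n)           # Euclidean-norm version of the constant

ok = all(
    np.linalg.norm(A @ h) <= M * np.linalg.norm(h) + 1e-12
    for h in rng.normal(size=(1000, 4))
)
print(ok)
```

The check passes for every sample because the component estimate in the proof is an identity-level bound, not a statistical one.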

The Chain Rule

The Chain Rule Theorem (Chain Rule): If f: R^n → R^m is differentiable at a, and g: R^m → R^p is differentiable at f(a), then the composition g∘f: R^n → R^p is differentiable at a and

D(g∘f)(a) = Dg(f(a)) ∘ Df(a).

The Chain Rule In this expression, the right side is a composition of linear transformations, which, of course, corresponds to the product of the corresponding Jacobian matrices at the respective points.
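That matrix identity can be tested numerically: the finite-difference Jacobian of g∘f should match the product of the individual Jacobians. This is a sketch with made-up example maps; `jacobian_fd` is my own helper, not something defined in the slides.

```python
import numpy as np

def jacobian_fd(f, a, eps=1e-6):
    """Central-difference Jacobian of f at a."""
    a = np.asarray(a, dtype=float)
    m, n = np.asarray(f(a)).size, a.size
    J = np.zeros((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (np.asarray(f(a + e)) - np.asarray(f(a - e))) / (2 * eps)
    return J

f = lambda v: np.array([v[0] * v[1], v[0] + v[1], np.sin(v[0])])  # R^2 -> R^3
g = lambda w: np.array([w[0] + w[2] ** 2, w[1] * w[2]])           # R^3 -> R^2
a = np.array([0.5, -1.0])

lhs = jacobian_fd(lambda v: g(f(v)), a)           # D(g∘f)(a), a 2x2 matrix
rhs = jacobian_fd(g, f(a)) @ jacobian_fd(f, a)    # Dg(f(a)) · Df(a)
print(np.allclose(lhs, rhs, atol=1e-4))
```

Note the shapes compose exactly as the theorem requires: a 2×3 Jacobian of g times a 3×2 Jacobian of f gives the 2×2 Jacobian of g∘f.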

The Chain Rule Proof: Let b = f(a), let λ = Df(a), and let μ = Dg(f(a)). Define

φ(x) = f(x) − f(a) − λ(x − a),
ψ(y) = g(y) − g(b) − μ(y − b),
ρ(x) = (g∘f)(x) − (g∘f)(a) − (μ∘λ)(x − a).

The Chain Rule Since f is differentiable at a, and λ is the derivative of f at a, we have

lim_{x→a} |φ(x)| / |x − a| = 0.

The Chain Rule Similarly, since g is differentiable at b, and μ is the derivative of g at b, we have

lim_{y→b} |ψ(y)| / |y − b| = 0.

The Chain Rule To show that g∘f is differentiable with derivative μ∘λ, we must show that

lim_{x→a} |ρ(x)| / |x − a| = 0.

The Chain Rule Recall that φ(x) = f(x) − f(a) − λ(x − a), and that μ is a linear transformation. Then we have

ρ(x) = g(f(x)) − g(b) − μ(λ(x − a))
     = g(f(x)) − g(b) − μ(f(x) − f(a) − φ(x))
     = g(f(x)) − g(b) − μ(f(x) − b) + μ(φ(x)).

The Chain Rule Next, recall that ψ(y) = g(y) − g(b) − μ(y − b). Then we have

ρ(x) = ψ(f(x)) + μ(φ(x)).

The Chain Rule From the preceding slide, we have ρ(x) = ψ(f(x)) + μ(φ(x)). So, we must show that

lim_{x→a} |ψ(f(x))| / |x − a| = 0  and  lim_{x→a} |μ(φ(x))| / |x − a| = 0.

The Chain Rule Recall that lim_{y→b} |ψ(y)| / |y − b| = 0. Given ε > 0, we can find δ > 0 so that

|ψ(f(x))| < ε |f(x) − b| whenever |f(x) − b| < δ,

which is true provided that |x − a| < δ₁, since f must be continuous at a.

The Chain Rule Then

|ψ(f(x))| < ε |f(x) − b| = ε |φ(x) + λ(x − a)| ≤ ε |φ(x)| + εM |x − a|.

Here, we’ve used Lemma 2 to find M so that |λ(x − a)| ≤ M |x − a|.

The Chain Rule Dividing by |x − a| and taking a limit, we get

lim_{x→a} |ψ(f(x))| / |x − a| ≤ ε lim_{x→a} |φ(x)| / |x − a| + εM = εM.

The Chain Rule Since ε > 0 is arbitrary, we have

lim_{x→a} |ψ(f(x))| / |x − a| = 0,

which is what we needed to show first.

The Chain Rule Recall that lim_{x→a} |φ(x)| / |x − a| = 0. Given ε > 0, we can find δ₂ > 0 so that

|φ(x)| < ε |x − a| whenever |x − a| < δ₂.

The Chain Rule By Lemma 2, we can find M so that |μ(φ(x))| ≤ M |φ(x)|. Hence

|μ(φ(x))| / |x − a| ≤ M |φ(x)| / |x − a| < Mε.

The Chain Rule Since ε > 0 is arbitrary, we have

lim_{x→a} |μ(φ(x))| / |x − a| = 0,

which is what we needed to show second. QED

The Derivative of f:RnRm

The Derivative of f:RnRm Let f be given by m coordinate functions f 1, . . . , f m. We can first make a reduction to the case where m=1 using the following theorem.

The Derivative of f:RnRm Theorem: If f:RnRm, then f is differentiable at a2Rn if and only if each f i is differentiable at a2Rn, and

The Derivative of f:RnRm Proof: One direction is easy. Suppose f is differentiable. Let i:RmR be projection onto the ith coordinate. Then f i= if. Since i is a linear transformation, by Lemma 1 it is differentiable and is its own derivative. Hence, by the Chain Rule, we have f i= if is differentiable and Df i(a) is the ith component of Df(a).

The Derivative of f:RnRm Proof: Conversely, suppose each f i is differentiable at a with derivative Df i(a). Set Then

The Derivative of f:RnRm Proof: By the definition of the derivative, we have, for each i,

The Derivative of f:RnRm Proof: Then This concludes the proof. QED

The Derivative of f:RnRm The preceding theorem reduces differentiating f:RnRm to finding the derivative of each component function f i:RnR. Now we’ll work on this problem.

Partial Derivatives

Partial Derivatives Let f: R^n → R and a ∈ R^n. We define the ith partial derivative of f at a by

D_i f(a) = lim_{h→0} (f(a^1, . . . , a^i + h, . . . , a^n) − f(a^1, . . . , a^n)) / h.
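In code, the ith partial derivative is just a one-variable difference quotient with the other coordinates held fixed. The sketch below (my own helper and example, not from the slides) uses a central difference rather than the one-sided limit above, purely for numerical accuracy:

```python
def partial_derivative(f, a, i, h=1e-6):
    """Approximate D_i f(a) for f: R^n -> R by a central difference in slot i."""
    ap = [float(x) for x in a]
    am = ap[:]
    ap[i] += h
    am[i] -= h
    return (f(ap) - f(am)) / (2 * h)

# f(x, y) = x^2 y: D_1 f(3, 2) = 2xy = 12 and D_2 f(3, 2) = x^2 = 9.
f = lambda v: v[0] ** 2 * v[1]
print(partial_derivative(f, (3, 2), 0), partial_derivative(f, (3, 2), 1))
```

This is exactly the "freeze every coordinate but one" idea in the definition: each partial derivative is an ordinary one-variable derivative.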

The Derivative of f:RnRm Theorem: If f:RnRm is differentiable at a, then Djf i(a) exists for 1≤ i ≤m, 1≤ j ≤n and f(a) is the mxn matrix (Djf i(a)).

The Derivative of f:RnRm Proof: Suppose first that m=1, so that f:RnR. Define h:RRn by h(x)=(a1, . . . , x, . . . ,an), with x in the jth place. Then

The Derivative of f:RnRm Proof: Hence, by the Chain Rule, we have

The Derivative of f:RnRm Proof: Since (fh)(aj) has the single entry Djf(a), this shows that Djf(a) exists and is the jth entry of the 1xn matrix f (a). The theorem now follows for arbitrary m since, by our previous theorem, each f i is differentiable and the ith row of f (a) is (f i)(a). QED

Pause Now we know that a function f is differentiable if and only if each component function f^i is, and that if f is differentiable, Df(a) is given by the matrix of partial derivatives of the component functions f^i. What we need is a condition to ensure that f is differentiable.

When is f differentiable? Theorem: If f: R^n → R^m, then Df(a) exists if all D_j f^i(x) exist in an open set containing a and if each function D_j f^i is continuous at a. (Such a function f is called continuously differentiable.)

When is f differentiable? Proof: As before, it suffices to consider the case when m = 1, so that f: R^n → R. Then, writing f(a+h) − f(a) as a telescoping sum that changes one coordinate at a time,

f(a+h) − f(a) = f(a^1 + h^1, a^2, . . . , a^n) − f(a^1, . . . , a^n)
  + f(a^1 + h^1, a^2 + h^2, a^3, . . . , a^n) − f(a^1 + h^1, a^2, . . . , a^n)
  + · · ·
  + f(a^1 + h^1, . . . , a^n + h^n) − f(a^1 + h^1, . . . , a^{n−1} + h^{n−1}, a^n).

When is f differentiable? Proof: Applying the Mean Value Theorem to the first difference, we have

f(a^1 + h^1, a^2, . . . , a^n) − f(a^1, . . . , a^n) = h^1 · D_1 f(b^1, a^2, . . . , a^n)

for some b^1 between a^1 and a^1 + h^1.

When is f differentiable? Proof: Applying the Mean Value Theorem in the ith place, the ith difference is

h^i · D_i f(a^1 + h^1, . . . , a^{i−1} + h^{i−1}, b^i, a^{i+1}, . . . , a^n)

for some b^i between a^i and a^i + h^i.

When is f differentiable? Proof: Then, writing c_i = (a^1 + h^1, . . . , b^i, a^{i+1}, . . . , a^n), so that c_i → a as h → 0,

lim_{h→0} |f(a+h) − f(a) − Σ_i D_i f(a) h^i| / |h| ≤ lim_{h→0} Σ_i |D_i f(c_i) − D_i f(a)| · |h^i| / |h| = 0,

since each D_i f is continuous at a. QED

Summary We have learned that a function f: R^n → R^m is differentiable if and only if each component function f^i: R^n → R is differentiable;

Summary We have learned that If f:Rn Rm is differentiable, all the partial derivatives of all the component functions exist and the matrix Df(a) is given by

Summary We have learned that If f:Rn Rm and all the partial derivatives Djf i(a) exist in a neighborhood of a and are continuous at a, then f is differentiable at a.