Consider minimizing and/or maximizing a function z = f(x,y) subject to a constraint g(x,y) = c.

[Figure: the surface z = f(x,y) over the xy-plane, with the constraint curve g(x,y) = c drawn beneath it.]

Parametrize the curve defined by g(x,y) = c as c(t) = (x(t), y(t)), so that g(c(t)) = c. For example, consider minimizing and maximizing the function z = x² − y² subject to the constraint x² + y² = 1. Parametrize x² + y² = 1 as c(t) = (cos t, sin t), so that cos²t + sin²t = 1.

Differentiating both sides of g(c(t)) = c and applying the chain rule gives

(d/dt) g(c(t)) = (d/dt) c  ⟹  ∇g(c(t)) · c′(t) = 0.

For the example, (d/dt)(cos²t + sin²t) = (d/dt) 1.

Parametrize the curve defined by g(x,y) = c as c(t) = (x(t), y(t)), so that g(c(t)) = c. Parametrize x² + y² = 1 as c(t) = (cos t, sin t), so that cos²t + sin²t = 1.

Differentiating, (d/dt) g(c(t)) = (d/dt) c implies ∇g(c(t)) · c′(t) = 0. For the example, (d/dt)(cos²t + sin²t) = (d/dt) 1 implies [2x, 2y] · [−sin t, cos t] = 0.

Maximizing/minimizing f(x,y) subject to g(x,y) = c is the same as maximizing/minimizing f(c(t)). Consequently, maximizing/minimizing f(x,y) = x² − y² subject to x² + y² = 1 is the same as maximizing/minimizing cos²t − sin²t. At an extremum, (d/dt) f(c(t)) = 0, that is, ∇f(c(t)) · c′(t) = 0; for the example, (d/dt)(cos²t − sin²t) = 0.

Observe that both ∇f(c(t)) and ∇g(c(t)) must be orthogonal to c′(t).
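The orthogonality ∇g(c(t)) · c′(t) = 0 can be verified numerically. Here is a minimal Python sketch for the circle example above (the function names are my own, not from the slides):

```python
import math

def grad_g(x, y):
    # Gradient of the constraint g(x, y) = x^2 + y^2
    return (2 * x, 2 * y)

def c(t):
    # Parametrization of the unit circle
    return (math.cos(t), math.sin(t))

def c_prime(t):
    # Tangent (velocity) vector of the parametrization
    return (-math.sin(t), math.cos(t))

# grad g(c(t)) . c'(t) should vanish for every t
for k in range(100):
    t = 2 * math.pi * k / 100
    (gx, gy), (vx, vy) = grad_g(*c(t)), c_prime(t)
    assert abs(gx * vx + gy * vy) < 1e-12
```

Symbolically, the dot product is −2 cos t sin t + 2 sin t cos t, which is identically zero.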

[Figure: the constraint curve in the xy-plane, with the tangent vector c′(t₀) and the gradients ∇f and ∇g drawn at a point c(t₀).]

If both ∇f(c(t)) and ∇g(c(t)) must be orthogonal to c′(t), then ∇f(c(t)) and ∇g(c(t)) must be multiples of each other, since the directions orthogonal to c′(t) in the plane form a single line. That is, for any point c(t₀) = (x₀, y₀) at which ∇f and ∇g are both orthogonal to c′(t), we must have ∇f(x₀, y₀) = λ∇g(x₀, y₀) for some scalar λ, assuming neither gradient is the zero vector. This motivates the Method of Lagrange Multipliers (stated in its most general form in Theorem 8 on page 226).

To maximize/minimize f(x,y) subject to g(x,y) = c:

(1) Set up the system of equations ∇f(x,y) = λ∇g(x,y) and g(x,y) = c.

(2) Let λ = 0 to find critical points of the function f(x,y), but eliminate these points from consideration if they do not satisfy g(x,y) = c.

(3) Assume λ ≠ 0, solve for (x,y), and substitute each candidate for an extremum into f to find the desired maximum/minimum values of f.
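The parallel-gradients condition can be checked without solving anything, using finite-difference gradients and the 2-D scalar cross product, which is zero exactly when two vectors are parallel. This sketch (my own addition) uses the example f = x² − y², g = x² + y² from the surrounding slides:

```python
import math

def grad(h, x, y, eps=1e-6):
    # Central-difference approximation to the gradient of h at (x, y)
    return ((h(x + eps, y) - h(x - eps, y)) / (2 * eps),
            (h(x, y + eps) - h(x, y - eps)) / (2 * eps))

def cross(u, v):
    # 2-D scalar cross product: zero iff u and v are parallel
    return u[0] * v[1] - u[1] * v[0]

f = lambda x, y: x**2 - y**2
g = lambda x, y: x**2 + y**2

# At the constrained extrema of f on the circle, grad f is parallel to grad g
for point in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
    assert abs(cross(grad(f, *point), grad(g, *point))) < 1e-4

# At a generic, non-extremal point of the circle, the gradients are not parallel
p = (math.cos(0.5), math.sin(0.5))
assert abs(cross(grad(f, *p), grad(g, *p))) > 0.1
```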

Find the extreme values of z = x² − y² along the circle of radius 1 centered at the origin in the xy-plane.

With f(x,y) = x² − y² and g(x,y) = x² + y², we have ∇f(x,y) = [2x, −2y] and ∇g(x,y) = [2x, 2y]. Setting ∇f(x,y) = λ∇g(x,y) and including the constraint equation, we have

2x = 2λx
−2y = 2λy
x² + y² = 1

λ = 0 implies x = y = 0, which is not possible, since the third equation is not satisfied. λ ≠ 0 implies either x or y must be zero, since we cannot have both λ = 1 and λ = −1. If x = 0, then from the third equation y = 1 or y = −1. If y = 0, then from the third equation x = 1 or x = −1.

Candidates for extrema are (0,1), (0,−1), (1,0), (−1,0).

f(0,1) = f(0,−1) = −1 is the minimum value of the function on the circle. f(1,0) = f(−1,0) = 1 is the maximum value of the function on the circle.
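A quick numeric cross-check (my own addition): restricted to the circle, f becomes cos²t − sin²t = cos 2t, so scanning t should reproduce the maximum 1 and minimum −1 found above.

```python
import math

def f_on_circle(t):
    # f(cos t, sin t) = cos^2 t - sin^2 t  (= cos 2t)
    return math.cos(t)**2 - math.sin(t)**2

ts = [2 * math.pi * k / 10000 for k in range(10000)]
values = [f_on_circle(t) for t in ts]

assert abs(max(values) - 1) < 1e-9   # attained at t = 0, pi: the points (1,0), (-1,0)
assert abs(min(values) + 1) < 1e-9   # attained at t = pi/2, 3pi/2: the points (0,1), (0,-1)
```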

Find the extreme values of f(x,y) = x² + y² along the line y = x + 1, that is, the line y − x = 1.

With f(x,y) = x² + y² and g(x,y) = y − x, we have ∇f(x,y) = [2x, 2y] and ∇g(x,y) = [−1, 1]. Setting ∇f(x,y) = λ∇g(x,y) and including the constraint equation, we have

2x = −λ
2y = λ
y − x = 1

λ = 0 implies x = y = 0, which is not possible, since the third equation is not satisfied. λ ≠ 0 implies x = −y. From the third equation, x = −1/2 and y = 1/2.

The only candidate for an extremum is (−1/2, 1/2). Since f(x,y) goes to infinity as either x or y goes to negative or positive infinity, there can be no global maximum; therefore, the single candidate gives f(−1/2, 1/2) = 1/2 as the minimum value of the function on the line.

Note how we could have solved this problem by substitution. We want the extreme values of f(x,y) = x² + y² along the line y = x + 1, that is, y − x = 1. Since y = x + 1, we may write

f(x,y) = x² + y² = x² + (x + 1)² = 2x² + 2x + 1 = h(x).

Setting h′(x) = 4x + 2 = 0, we find that x = −1/2 is a critical point. This critical point is a local minimum, since h″(x) = 4 > 0.

To use Lagrange multipliers to find the absolute maximum and minimum of a function f(x,y) over a region:

(1) Use Lagrange multipliers to find candidates for extrema on the boundary of the region.

(2) Add to the candidates for extrema all critical points of the function that lie in the region, and substitute each candidate into f.
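A short sketch confirming the substitution computation numerically (the names are mine):

```python
def h(x):
    # f restricted to the line y = x + 1:  x^2 + (x + 1)^2
    return 2 * x**2 + 2 * x + 1

# h'(x) = 4x + 2 vanishes at x = -1/2, giving the value 1/2
assert h(-0.5) == 0.5

# Grid check that x = -1/2 is indeed the minimizer
xs = [-2 + k / 1000 for k in range(4001)]
assert all(h(x) >= h(-0.5) for x in xs)
```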

Find the absolute maximum and minimum of f(x,y) = xy on the unit disc, i.e., where x² + y² ≤ 1.

With f(x,y) = xy and g(x,y) = x² + y², we have ∇f(x,y) = [y, x] and ∇g(x,y) = [2x, 2y].

First, we find all critical points of f(x,y) located in the unit disc by solving f_x = f_y = 0. The only critical point is (0,0).

Next, we locate the candidates for extrema on the boundary of the unit disc, that is, on the circle of radius 1. Setting ∇f(x,y) = λ∇g(x,y) and including the constraint equation, we have

y = 2λx
x = 2λy
x² + y² = 1

λ = 0 implies x = y = 0, which again gives us the critical point (0,0). λ ≠ 0 implies x and y must either both be zero or both be nonzero with x² = y². From the third equation, x = 1/√2 or x = −1/√2. Candidates for extrema are

(0,0), (1/√2, 1/√2), (−1/√2, 1/√2), (1/√2, −1/√2), (−1/√2, −1/√2).

f(0,0) = 0, f(1/√2, 1/√2) = f(−1/√2, −1/√2) = 1/2, and f(−1/√2, 1/√2) = f(1/√2, −1/√2) = −1/2.

The absolute maximum of f is 1/2 and occurs at (1/√2, 1/√2) and (−1/√2, −1/√2). The absolute minimum of f is −1/2 and occurs at (−1/√2, 1/√2) and (1/√2, −1/√2).

Note: from the second derivative test, we find that (0,0) is a saddle point.
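As a sanity check (not in the original slides), scan the boundary circle, where f(cos t, sin t) = cos t sin t = (1/2) sin 2t:

```python
import math

f = lambda x, y: x * y

ts = [2 * math.pi * k / 10000 for k in range(10000)]
boundary = [f(math.cos(t), math.sin(t)) for t in ts]

assert abs(max(boundary) - 0.5) < 1e-6   # 1/2 at the same-sign points (±1/√2, ±1/√2)
assert abs(min(boundary) + 0.5) < 1e-6   # -1/2 at the mixed-sign points
assert min(boundary) < f(0, 0) < max(boundary)  # the saddle value 0 lies strictly between
```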

Find the absolute maximum and minimum of f(x,y) = (x² + y²)/2 on the elliptical region defined by x²/2 + y² ≤ 1.

With f(x,y) = (x² + y²)/2 and g(x,y) = x²/2 + y², we have ∇f(x,y) = [x, y] and ∇g(x,y) = [x, 2y].

First, we find all critical points of f(x,y) located in the elliptical region by solving f_x = f_y = 0. The only critical point is (0,0).

Next, we locate the candidates for extrema on the boundary of the elliptical region. Setting ∇f(x,y) = λ∇g(x,y) and including the constraint equation, we have

x = λx
y = 2λy
x²/2 + y² = 1

λ = 0 implies x = y = 0, which again gives us the critical point (0,0). λ ≠ 0 implies x and y cannot both be nonzero, since we cannot have both λ = 1 and λ = 1/2.

If x = 0, then from the third equation y = 1 or y = −1; if y = 0, then from the third equation x = √2 or x = −√2.

Candidates for extrema are (0,0), (0, 1), (0, −1), (√2, 0), (−√2, 0).

f(0,0) = 0, f(0, 1) = f(0, −1) = 1/2, and f(√2, 0) = f(−√2, 0) = 1.

The absolute maximum of f is 1 and occurs at (√2, 0) and (−√2, 0). The absolute minimum of f is 0 and occurs at (0,0).

Using Lagrange multipliers to find the absolute maximum and minimum of a function of more than two variables over a region is a natural generalization of the method for a function of two variables.
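Again a numeric cross-check (my own addition), parametrizing the boundary ellipse x²/2 + y² = 1 as (√2 cos t, sin t):

```python
import math

f = lambda x, y: (x**2 + y**2) / 2

ts = [2 * math.pi * k / 10000 for k in range(10000)]
boundary = [f(math.sqrt(2) * math.cos(t), math.sin(t)) for t in ts]

assert abs(max(boundary) - 1.0) < 1e-6   # 1 at (±√2, 0)
assert abs(min(boundary) - 0.5) < 1e-6   # 1/2 at (0, ±1), the boundary minimum
assert f(0, 0) == 0.0                    # interior critical point beats the boundary minimum
```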

Find the absolute maximum and minimum of f(x,y,z) = xy + z² subject to the constraint x² + y² + z² = 1.

With f(x,y,z) = xy + z² and g(x,y,z) = x² + y² + z², we have ∇f(x,y,z) = [y, x, 2z] and ∇g(x,y,z) = [2x, 2y, 2z]. Setting ∇f(x,y,z) = λ∇g(x,y,z) and including the constraint equation, we have

y = 2λx
x = 2λy
2z = 2λz
x² + y² + z² = 1

λ = 0 implies x = y = z = 0, which is not possible, since the fourth equation is not satisfied. λ ≠ 0 implies either x = y = 0 and z = −1 or 1, or x² = y² = 1/2 and z = 0.

Candidates for extrema are (0, 0, 1), (0, 0, −1), (1/√2, 1/√2, 0), (−1/√2, 1/√2, 0), (1/√2, −1/√2, 0), (−1/√2, −1/√2, 0).

f(−1/√2, 1/√2, 0) = f(1/√2, −1/√2, 0) = −1/2 is the absolute minimum, and f(0, 0, 1) = f(0, 0, −1) = 1 is the absolute maximum.
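The candidate values can be tabulated mechanically; a short sketch (my own) that also confirms each candidate lies on the sphere:

```python
import math

f = lambda x, y, z: x * y + z**2
r = 1 / math.sqrt(2)

candidates = [(0, 0, 1), (0, 0, -1),
              (r, r, 0), (-r, -r, 0), (r, -r, 0), (-r, r, 0)]

for x, y, z in candidates:
    assert abs(x**2 + y**2 + z**2 - 1) < 1e-12  # on the unit sphere

values = [f(*p) for p in candidates]
assert max(values) == 1.0               # at (0, 0, ±1)
assert abs(min(values) + 0.5) < 1e-12   # -1/2 at the mixed-sign equatorial points
```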

Find the absolute maximum and minimum of f(x,y,z) = xy + yz subject to the constraint x² + y² + z² = 1.

With f(x,y,z) = xy + yz and g(x,y,z) = x² + y² + z², we have ∇f(x,y,z) = [y, x + z, y] and ∇g(x,y,z) = [2x, 2y, 2z]. Setting ∇f(x,y,z) = λ∇g(x,y,z) and including the constraint equation, we have

y = 2λx
x + z = 2λy
y = 2λz
x² + y² + z² = 1

λ = 0 implies y = 0 and x = −z, which gives us x = 1/√2 & z = −1/√2 or x = −1/√2 & z = 1/√2. λ ≠ 0 implies x = z ≠ 0 and y² = 2x², which gives us x = z = 1/2 and y = 1/√2 or −1/√2, or x = z = −1/2 and y = 1/√2 or −1/√2.

Candidates for extrema are (1/√2, 0, −1/√2), (−1/√2, 0, 1/√2), (1/2, 1/√2, 1/2), (1/2, −1/√2, 1/2), (−1/2, 1/√2, −1/2), (−1/2, −1/√2, −1/2).

f(1/2, −1/√2, 1/2) = f(−1/2, 1/√2, −1/2) = −1/√2 is the absolute minimum, and f(1/2, 1/√2, 1/2) = f(−1/2, −1/√2, −1/2) = 1/√2 is the absolute maximum.
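The same mechanical tabulation works here (my own sketch, checking the constraint and the claimed extreme values ±1/√2):

```python
import math

f = lambda x, y, z: x * y + y * z
s = 1 / math.sqrt(2)

candidates = [(s, 0, -s), (-s, 0, s),
              (0.5, s, 0.5), (0.5, -s, 0.5),
              (-0.5, s, -0.5), (-0.5, -s, -0.5)]

for x, y, z in candidates:
    assert abs(x**2 + y**2 + z**2 - 1) < 1e-12  # on the unit sphere

values = [f(*p) for p in candidates]
assert abs(max(values) - s) < 1e-12   # 1/√2 at (1/2, 1/√2, 1/2) and (-1/2, -1/√2, -1/2)
assert abs(min(values) + s) < 1e-12   # -1/√2 at the two mixed-sign candidates
```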

Let x, y, and z be the dimensions of a rectangular box. Maximize the volume V = xyz subject to the constraint that the surface area of the box is S = 2xy + 2xz + 2yz.

With f(x,y,z) = xyz and g(x,y,z) = 2xy + 2xz + 2yz, we have ∇f(x,y,z) = [yz, xz, xy] and ∇g(x,y,z) = [2y + 2z, 2x + 2z, 2x + 2y]. Setting ∇f(x,y,z) = λ∇g(x,y,z) and including the constraint equation, we have

yz = 2λ(y + z)
xz = 2λ(x + z)
xy = 2λ(x + y)
2xy + 2xz + 2yz = S

λ = 0 implies xy = xz = yz = 0, which is not possible, since the fourth equation is not satisfied (and the box must have positive dimensions). λ ≠ 0 implies none of x, y, or z can be 0.

From the first two equations, yz/(y + z) = xz/(x + z), from which it follows that x = y. From the last two equations, xz/(x + z) = xy/(x + y), from which it follows that y = z.

Then, from the constraint equation, with x = y = z we have 6x² = S, so the only candidate for an extremum with positive dimensions is (√(S/6), √(S/6), √(S/6)). This must be an absolute maximum, since the volume can only become smaller as any one of the dimensions goes to zero. The maximum volume is (S/6)^(3/2).
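A numeric check of the conclusion (my own sketch): using the relation x = y derived above, eliminate z from the surface-area constraint and search over the base side. For S = 6 the prediction is x = y = z = 1 and V = (6/6)^(3/2) = 1.

```python
def volume_with_square_base(x, S):
    # With x = y, solve 2x^2 + 4xz = S for z and return the volume x^2 * z
    z = (S - 2 * x**2) / (4 * x)
    return x**2 * z

S = 6.0
# Grid search over base sides 0.01 .. 1.709 (z stays positive on this range)
v_best, x_best = max((volume_with_square_base(0.01 + k / 1000, S), 0.01 + k / 1000)
                     for k in range(1700))

assert abs(v_best - 1.0) < 1e-3   # (S/6)^(3/2) = 1
assert abs(x_best - 1.0) < 1e-2   # the cube: sqrt(S/6) = 1
```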