D Nagesh Kumar, IISc. Optimization Methods: M2L4. Optimization using Calculus: Optimization of Functions of Multiple Variables subject to Equality Constraints.



Objectives

To optimize functions of multiple variables subject to equality constraints using:
- the method of constrained variation, and
- the method of Lagrange multipliers.

Constrained optimization

A function of multiple variables, f(X), is to be optimized subject to one or more equality constraints. These equality constraints, gj(X), may or may not be linear. The problem statement is as follows:

Maximize (or minimize) f(X), subject to gj(X) = 0, j = 1, 2, ..., m

where X = [x1, x2, ..., xn]^T.

Constrained optimization (contd.)

This requires that m ≤ n; if m > n the problem becomes overdefined and, in general, there is no solution. Of the many available methods, the method of constrained variation and the method of Lagrange multipliers are discussed here.

Solution by the method of constrained variation

For the optimization problem defined above, let us consider a specific case with n = 2 and m = 1 before we proceed to find the necessary and sufficient conditions for a general problem using Lagrange multipliers. The problem statement is as follows:

Minimize f(x1, x2), subject to g(x1, x2) = 0

For f(x1, x2) to have a minimum at a point X* = [x1*, x2*], a necessary condition is that the total derivative of f(x1, x2) must be zero at [x1*, x2*]:

df = (∂f/∂x1) dx1 + (∂f/∂x2) dx2 = 0    (1)

Method of constrained variation (contd.)

Since g(x1*, x2*) = 0 at the minimum point, variations dx1 and dx2 about the point [x1*, x2*] must be admissible variations, i.e. the perturbed point must still lie on the constraint:

g(x1* + dx1, x2* + dx2) = 0    (2)

Assuming dx1 and dx2 are small, the Taylor series expansion of this gives us

g(x1* + dx1, x2* + dx2) ≈ g(x1*, x2*) + (∂g/∂x1) dx1 + (∂g/∂x2) dx2 = 0    (3)

where the partial derivatives are evaluated at [x1*, x2*].

Method of constrained variation (contd.)

Since g(x1*, x2*) = 0, equation (3) reduces to

(∂g/∂x1) dx1 + (∂g/∂x2) dx2 = 0    at [x1*, x2*]    (4)

which is the condition that must be satisfied for all admissible variations. Assuming ∂g/∂x2 ≠ 0, (4) can be rewritten as

dx2 = −[(∂g/∂x1)/(∂g/∂x2)] dx1    (5)

Method of constrained variation (contd.)

Equation (5) indicates that once the variation along x1 (dx1) is chosen arbitrarily, the variation along x2 (dx2) is decided automatically so that the variation remains admissible. Substituting equation (5) in (1) we have:

df = [∂f/∂x1 − (∂f/∂x2)(∂g/∂x1)/(∂g/∂x2)] dx1 = 0    (6)

The bracketed expression on the left-hand side is called the constrained variation of f. Equation (6) has to be satisfied for all dx1, hence we have

∂f/∂x1 − (∂f/∂x2)(∂g/∂x1)/(∂g/∂x2) = 0    at [x1*, x2*]    (7)

This gives us the necessary condition for [x1*, x2*] to be an extreme point (maximum or minimum).
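Condition (7) can be checked numerically. The sketch below uses a hypothetical example not taken from the slides: minimize f = x1^2 + x2^2 subject to g = x1 + x2 − 2 = 0, whose constrained minimum is at (1, 1).

```python
# Hypothetical example (not from the slides): f = x1^2 + x2^2,
# g = x1 + x2 - 2 = 0.  The closed-form partial derivatives are used.

def df_dx1(x1, x2): return 2.0 * x1
def df_dx2(x1, x2): return 2.0 * x2
def dg_dx1(x1, x2): return 1.0
def dg_dx2(x1, x2): return 1.0

def constrained_variation(x1, x2):
    """Left-hand side of condition (7):
       df/dx1 - (df/dx2)*(dg/dx1)/(dg/dx2)."""
    return df_dx1(x1, x2) - df_dx2(x1, x2) * dg_dx1(x1, x2) / dg_dx2(x1, x2)

# At the constrained minimum (1, 1) the constrained variation vanishes;
# at the feasible but non-optimal point (0, 2) it does not.
print(constrained_variation(1.0, 1.0))   # 0.0
print(constrained_variation(0.0, 2.0))   # -4.0
```

For this example the constrained variation is 2·x1 − 2·x2, which is zero exactly where the constraint line touches the lowest level set of f.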

Solution by the method of Lagrange multipliers

Continuing with the same specific case of the optimization problem with n = 2 and m = 1, we define a quantity λ, called the Lagrange multiplier, as

λ = −(∂f/∂x2)/(∂g/∂x2)    at [x1*, x2*]    (8)

Using this in (7):

∂f/∂x1 + λ ∂g/∂x1 = 0    at [x1*, x2*]    (9)

And (8) can be written as

∂f/∂x2 + λ ∂g/∂x2 = 0    at [x1*, x2*]    (10)

Solution by the method of Lagrange multipliers (contd.)

Also, the constraint equation has to be satisfied at the extreme point:

g(x1*, x2*) = 0    (11)

Hence equations (9) to (11) represent the necessary conditions for the point [x1*, x2*] to be an extreme point. Note that λ could be expressed in terms of ∂f/∂x1 and ∂g/∂x1 as well, in which case ∂g/∂x1 has to be non-zero. Thus, these necessary conditions require that at least one of the partial derivatives of g(x1, x2) be non-zero at an extreme point.

Solution by the method of Lagrange multipliers (contd.)

The conditions given by equations (9) to (11) can also be generated by constructing a function L, known as the Lagrangian function, as

L(x1, x2, λ) = f(x1, x2) + λ g(x1, x2)    (12)

Treating L as a function of x1, x2 and λ, the necessary conditions for its extremum are given by

∂L/∂x1 = ∂f/∂x1 + λ ∂g/∂x1 = 0
∂L/∂x2 = ∂f/∂x2 + λ ∂g/∂x2 = 0    (13)
∂L/∂λ = g(x1, x2) = 0

Necessary conditions for a general problem

For a general problem with n variables and m equality constraints the problem is defined as shown earlier:

Maximize (or minimize) f(X), subject to gj(X) = 0, j = 1, 2, ..., m

where X = [x1, x2, ..., xn]^T. In this case the Lagrangian function, L, will have one Lagrange multiplier λj for each constraint:

L(x1, ..., xn, λ1, ..., λm) = f(X) + λ1 g1(X) + λ2 g2(X) + ... + λm gm(X)    (14)

Necessary conditions for a general problem (contd.)

L is now a function of n + m unknowns, x1, ..., xn, λ1, ..., λm, and the necessary conditions for the problem defined above are given by

∂L/∂xi = ∂f/∂xi + Σj λj ∂gj/∂xi = 0,   i = 1, 2, ..., n
∂L/∂λj = gj(X) = 0,   j = 1, 2, ..., m    (15)

which represent n + m equations in terms of the n + m unknowns, xi and λj. The solution to this set of equations gives us

X* = [x1*, ..., xn*]^T  and  λ* = [λ1*, ..., λm*]^T    (16)
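When the constraints are nonlinear, the n + m equations (15) are nonlinear and are typically solved iteratively, e.g. by Newton's method. A sketch under a hypothetical example (not from the slides): minimize f = x1^2 + x2^2 + x3^2 subject to g = x1·x2·x3 − 1 = 0, so n = 3, m = 1, and the stationary point near the start below is x* = (1, 1, 1) with λ* = −2.

```python
def F(v):
    """The n + m residuals of (15): stationarity plus the constraint."""
    x1, x2, x3, lam = v
    return [2*x1 + lam*x2*x3,
            2*x2 + lam*x1*x3,
            2*x3 + lam*x1*x2,
            x1*x2*x3 - 1.0]

def J(v):
    """Jacobian of F, derived by hand for this example."""
    x1, x2, x3, lam = v
    return [[2.0,    lam*x3, lam*x2, x2*x3],
            [lam*x3, 2.0,    lam*x1, x1*x3],
            [lam*x2, lam*x1, 2.0,    x1*x2],
            [x2*x3,  x1*x3,  x1*x2,  0.0]]

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
        x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Newton iteration: v <- v - J(v)^(-1) F(v), from a start near the solution.
v = [1.1, 0.9, 1.0, -1.9]
for _ in range(30):
    step = solve_linear(J(v), F(v))
    v = [vi - si for vi, si in zip(v, step)]
print(v)
```

Newton's method only finds a stationary point of L; whether it is a minimum must still be settled by the sufficiency check on the following slides.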

Sufficient conditions for a general problem

A sufficient condition for f(X) to have a relative minimum at X* is that each root of the polynomial in ε, defined by the following determinant equation, be positive:

|  L11−ε   L12    ...   L1n     g11   g21  ...  gm1  |
|  L21     L22−ε  ...   L2n     g12   g22  ...  gm2  |
|  ...                                               |
|  Ln1     Ln2    ...   Lnn−ε   g1n   g2n  ...  gmn  |  =  0    (17)
|  g11     g12    ...   g1n     0     0    ...  0    |
|  g21     g22    ...   g2n     0     0    ...  0    |
|  ...                                               |
|  gm1     gm2    ...   gmn     0     0    ...  0    |

Sufficient conditions for a general problem (contd.)

where

Lij = ∂²L/(∂xi ∂xj) evaluated at (X*, λ*),   gij = ∂gi/∂xj evaluated at X*    (18)

Similarly, a sufficient condition for f(X) to have a relative maximum at X* is that each root of the polynomial in ε defined by equation (17) be negative. If equation (17), on solving, yields roots some of which are positive and others negative, then the point X* is neither a maximum nor a minimum.
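The check (17)-(18) can be sketched for the hypothetical example f = x1^2 + x2^2, g = x1 + x2 − 2 at X* = (1, 1), λ* = −2 (not from the slides). Here L11 = L22 = 2, L12 = L21 = 0 and g11 = g12 = 1, so the bordered determinant is a polynomial of degree n − m = 1 in ε and two sample values determine it exactly.

```python
def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

def bordered(eps):
    """Determinant (17) for L11 = L22 = 2, L12 = 0, g11 = g12 = 1."""
    return det3([[2.0 - eps, 0.0,       1.0],
                 [0.0,       2.0 - eps, 1.0],
                 [1.0,       1.0,       0.0]])

# The determinant is linear in eps, so two samples pin down its root:
d0, d1 = bordered(0.0), bordered(1.0)   # d(eps) = d0 + (d1 - d0)*eps
root = -d0 / (d1 - d0)
print(root)   # 2.0 -> positive, so X* is a relative minimum
```

For larger n − m the determinant is a higher-degree polynomial in ε and all of its roots must be examined, not just one.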

Example

Minimize ..., subject to ...

Thank you