Optimization in Engineering Design 1 Lagrange Multipliers


Optimization in Engineering Design 1 Lagrange Multipliers

Optimization in Engineering Design 2 Lagrange Multipliers The method of Lagrange multipliers gives a set of necessary conditions for identifying optimal points of equality-constrained optimization problems. This is done by converting the constrained problem into an equivalent unconstrained problem with the help of certain unspecified parameters known as Lagrange multipliers. The classical problem formulation

minimize f(x1, x2, ..., xn)
subject to h1(x1, x2, ..., xn) = 0

can be converted to

minimize L(x, λ) = f(x) − λ h1(x)

where L(x, λ) is the Lagrangian function and λ is an unspecified positive or negative constant called the Lagrange multiplier.
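At a constrained optimum the stationarity of L means the gradients of f and h1 are parallel: ∇f = λ∇h1. A minimal numerical check of this, on a hypothetical example not taken from the slides (maximize f(x, y) = x + y on the circle h(x, y) = x² + y² − 2 = 0, whose optimum is (1, 1)):

```python
# Check the Lagrange condition grad(f) = lambda * grad(h) at a known optimum.
# Hypothetical example: maximize f(x, y) = x + y
# subject to h(x, y) = x**2 + y**2 - 2 = 0; the optimum is (1, 1).

def grad(fun, x, y, eps=1e-6):
    """Central-difference gradient of fun at (x, y)."""
    dfdx = (fun(x + eps, y) - fun(x - eps, y)) / (2 * eps)
    dfdy = (fun(x, y + eps) - fun(x, y - eps)) / (2 * eps)
    return dfdx, dfdy

f = lambda x, y: x + y
h = lambda x, y: x**2 + y**2 - 2

gf = grad(f, 1.0, 1.0)   # approximately (1, 1)
gh = grad(h, 1.0, 1.0)   # approximately (2, 2)
lam = gf[0] / gh[0]      # lambda = 1/2: the two gradients are parallel
```

The multiplier λ = 1/2 is exactly the ratio by which ∇h1 must be scaled to match ∇f at the optimum.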

Optimization in Engineering Design 3 Finding an Optimum using Lagrange Multipliers The new problem is: minimize L(x, λ) = f(x) − λ h1(x). Suppose that we fix λ = λ* and the unconstrained minimum of L(x, λ*) occurs at x = x*, and x* satisfies h1(x*) = 0; then x* minimizes f(x) subject to h1(x) = 0. The trick is to find the appropriate value of the Lagrange multiplier. This can be done by treating λ as a variable, finding the unconstrained minimum of L(x, λ), and adjusting λ so that h1(x) = 0 is satisfied.
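The "adjust λ until the constraint holds" idea can be sketched numerically. On a hypothetical problem (not from the slides), minimize f(x, y) = x² + 2y² subject to h1(x, y) = x + y − 3 = 0: for a fixed λ the unconstrained minimizer of L is x = λ/2, y = λ/4, so the constraint residual becomes a function of λ alone and can be driven to zero by bisection:

```python
# Treat lambda as a variable: minimize L(x, lambda) without constraints,
# then adjust lambda until h1(x*) = 0.
# Hypothetical problem: minimize f = x**2 + 2*y**2  s.t.  h1 = x + y - 3 = 0.
# dL/dx = 2x - lambda = 0 and dL/dy = 4y - lambda = 0 give the
# unconstrained minimizer x = lambda/2, y = lambda/4.

def x_of_lam(lam):
    return lam / 2.0, lam / 4.0

def h_of_lam(lam):
    x, y = x_of_lam(lam)
    return x + y - 3.0              # constraint residual as a function of lambda

# Bisection on the residual: h(0) = -3 < 0, h(10) = 4.5 > 0.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if h_of_lam(mid) < 0:
        lo = mid
    else:
        hi = mid

lam_star = 0.5 * (lo + hi)           # lambda* = 4
x_star, y_star = x_of_lam(lam_star)  # optimum at (2, 1)
```

Here the residual is linear in λ, so bisection converges quickly; in general each trial λ requires a fresh unconstrained minimization of L.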

Optimization in Engineering Design 4 Method

1. The original problem is rewritten as: minimize L(x, λ) = f(x) − λ h1(x).
2. Take the derivatives of L(x, λ) with respect to each xi and set them equal to zero. If there are n variables (i.e., x1, ..., xn), you will get n equations with n + 1 unknowns (the n variables xi and one Lagrange multiplier λ).
3. Express all xi in terms of the Lagrange multiplier λ.
4. Substitute the xi, written in terms of λ, into the constraint h1(x) = 0 and solve for λ.
5. Calculate x by using the value just found for λ.

Note that the n derivative equations and one constraint equation give n + 1 equations for the n + 1 unknowns! (See Example 5.3.)
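The five steps can be traced on a small hypothetical problem (this is not Example 5.3 from the text): minimize f(x1, x2) = x1² + x2² subject to h1 = x1 + x2 − 2 = 0. The derivatives are simple enough to carry out by hand in comments, with the final numbers computed in code:

```python
# Steps 1-5 on: minimize f(x1, x2) = x1**2 + x2**2
#               subject to h1(x1, x2) = x1 + x2 - 2 = 0
# Step 1: L(x, lam) = x1**2 + x2**2 - lam*(x1 + x2 - 2)
# Step 2: dL/dx1 = 2*x1 - lam = 0,   dL/dx2 = 2*x2 - lam = 0
# Step 3: x1 = lam/2, x2 = lam/2     (each x_i in terms of lambda)
# Step 4: substitute into h1: lam/2 + lam/2 - 2 = 0  ->  lam = 2
lam = 2.0
# Step 5: recover x from the value of lambda just found.
x1, x2 = lam / 2.0, lam / 2.0        # optimum at (1, 1)
f_opt = x1**2 + x2**2                # f* = 2
```

With n = 2 variables, steps 2 and 4 together supplied the promised n + 1 = 3 equations for (x1, x2, λ).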

Optimization in Engineering Design 5 Multiple constraints The Lagrange multiplier method can be used for any number of equality constraints. Suppose we have a classical problem formulation with k equality constraints:

minimize f(x1, x2, ..., xn)
subject to h1(x1, x2, ..., xn) = 0
...
hk(x1, x2, ..., xn) = 0

This can be converted into

minimize L(x, λ) = f(x) − λᵀ h(x)

where λᵀ is the transpose of the vector of Lagrange multipliers, which has length k.
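A hypothetical two-constraint example (not from the slides) shows the same five steps with a vector of multipliers: minimize f = x² + y² + z² subject to h1 = x + y + z − 3 = 0 and h2 = x − y = 0. The stationarity equations are solved by hand in comments, and the code evaluates the result:

```python
# minimize f = x**2 + y**2 + z**2
# s.t.  h1 = x + y + z - 3 = 0,   h2 = x - y = 0
# L = f - lam1*h1 - lam2*h2.  Setting dL/dx = dL/dy = dL/dz = 0:
#   2x - lam1 - lam2 = 0  ->  x = (lam1 + lam2)/2
#   2y - lam1 + lam2 = 0  ->  y = (lam1 - lam2)/2
#   2z - lam1        = 0  ->  z = lam1/2
# h2 gives x - y = lam2 = 0; h1 then gives 3*lam1/2 = 3, so lam1 = 2.
lam1, lam2 = 2.0, 0.0
x = (lam1 + lam2) / 2.0
y = (lam1 - lam2) / 2.0
z = lam1 / 2.0                       # optimum at (1, 1, 1)
f_opt = x**2 + y**2 + z**2           # f* = 3
```

Note λ2 = 0 here even though h2 is an equality that must hold: the symmetric objective already satisfies x = y at the optimum, so the second constraint exerts no "force".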

Optimization in Engineering Design 6 In closing Lagrange multipliers are very useful in sensitivity analyses (see Section 5.3). Setting the derivatives of L to zero may result in finding a saddle point, so additional checks are always useful. The Lagrange multiplier method requires equality constraints, so inequalities must first be converted. Kuhn and Tucker extended the Lagrangian theory to include the general classical single-objective nonlinear programming problem:

minimize f(x)
subject to gj(x) ≤ 0 for j = 1, 2, ..., J
hk(x) = 0 for k = 1, 2, ..., K
x = (x1, x2, ..., xN)
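A sketch of how the Kuhn–Tucker conditions handle one inequality, on a hypothetical example not from the text: minimize f(x) = (x − 2)² subject to g(x) = x − 1 ≤ 0. The conditions add a sign restriction μ ≥ 0 and complementary slackness μ·g(x) = 0, which are checked case by case:

```python
# KKT conditions for: minimize f(x) = (x - 2)**2  s.t.  g(x) = x - 1 <= 0
#   stationarity:            2*(x - 2) + mu = 0
#   dual feasibility:        mu >= 0
#   complementary slackness: mu * (x - 1) = 0
# Case "g inactive" (mu = 0): stationarity gives x = 2,
#   but g(2) = 1 > 0, so this case is infeasible.
# Case "g active" (x = 1): stationarity gives mu = -2*(1 - 2) = 2 >= 0,
#   and all conditions hold, so x* = 1 is the constrained minimizer.
x_star = 1.0
mu_star = -2.0 * (x_star - 2.0)      # mu* = 2
f_star = (x_star - 2.0) ** 2         # f* = 1
```

The positive multiplier signals that the constraint is active and "pushing back" on the objective, which is exactly the sensitivity information the closing slide alludes to.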