Constrained Optimization

Constrained Optimization
Objective of Presentation: To present the basic conceptual methods used to optimize design in real situations.
Essential Reality: In practical situations, designers are constrained or limited by physical realities, design standards, laws and regulations, etc.

Outline
Unconstrained Optimization (Review)
Constrained Optimization Approach
  Equality constraints: Lagrangeans, shadow prices
  Inequality constraints: Kuhn-Tucker conditions, complementary slackness

Unconstrained Optimization: Definitions
Optimization => maximum of a desired quantity, or minimum of an undesired quantity
Objective function = expression to be optimized = Z(X)
Decision variables = variables about which we can make decisions = X = (X1, ..., Xn)

Unconstrained Optimization: Graph
[Graph of F(X) versus X with stationary points labeled A through E.] B and D are maxima; A, C, and E are minima.

Unconstrained Optimization: Conditions
By calculus, if F(X) is continuous and analytic:
Primary conditions for maxima and minima: ∂F(X)/∂Xi = 0 for all i
Secondary conditions: ∂²F(X)/∂Xi² < 0 => maximum (B, D); ∂²F(X)/∂Xi² > 0 => minimum (A, C, E)
These define whether a point of no change in F(X) is a maximum or a minimum.

Unconstrained Optimization: Example
Example: housing insulation
Total cost = fuel cost + insulation cost
x = thickness of insulation
F(x) = K1/x + K2·x
Primary condition: dF(x)/dx = -K1/x² + K2 = 0 => x* = (K1/K2)^(1/2)
(Starred quantities are optimal.)

Unconstrained Optimization: Graph of Solution to Example
If K1 = 500 and K2 = 24, then x* = 4.56.
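
As a quick check on this example, here is a minimal sketch (not part of the original slides, assuming SymPy is available) that reproduces the first- and second-order conditions:

# Verify the insulation example: F(x) = K1/x + K2*x with K1 = 500, K2 = 24.
import sympy as sp

x = sp.symbols('x', positive=True)
K1, K2 = 500, 24
F = K1 / x + K2 * x                               # total cost = fuel + insulation

x_star = sp.solve(sp.Eq(sp.diff(F, x), 0), x)[0]  # primary condition: dF/dx = 0
print(float(x_star))                              # about 4.56
print(bool(sp.diff(F, x, 2).subs(x, x_star) > 0)) # True: second derivative > 0 => minimum
print(float(F.subs(x, x_star)))                   # minimum cost, about 219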

Constrained Optimization: General
“Constrained optimization” involves the optimization of a process subject to constraints.
Constraints come in two basic types:
Equality constraints -- some factors have to equal the constraint values
Inequality constraints -- some factors have to be less than or greater than the constraint values (these are “upper” and “lower” bounds)

Constrained Optimization: General Approach
To solve situations of increasing complexity (for example, those with equality or inequality constraints), transform the more difficult situation into one we already know how to deal with.
Note: this process introduces new variables!
Thus, we transform a “constrained” optimization into an “unconstrained” optimization.

Equality Constraints: Example
Example: best use of a budget
Maximize: Output = Z(X) = a0·x1^a1 · x2^a2
Subject to (s.t.): Total cost = Budget = p1·x1 + p2·x2
[Graph: Z(x) versus X, showing the constrained optimum Z* on the budget line.]
Note: ∂Z(X)/∂X ≠ 0 at the optimum
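
A small illustrative sketch of this note (not from the slides; the exponents a1 = a2 = 1, prices p1 = 2, p2 = 1, and budget = 10 are assumed numbers chosen for simplicity). It optimizes along the budget line and shows that the gradient of Z is not zero at the constrained optimum:

# Illustrative instance: Z = x1*x2, budget line 2*x1 + x2 = 10.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
Z = x1 * x2

# Substitute the budget line and optimize in one variable.
Z_line = Z.subs(x2, 10 - 2 * x1)
x1_star = sp.solve(sp.diff(Z_line, x1), x1)[0]     # 5/2
x2_star = 10 - 2 * x1_star                         # 5
print(x1_star, x2_star, Z.subs({x1: x1_star, x2: x2_star}))  # 5/2, 5, 25/2

# Gradient of Z at the constrained optimum is (5, 5/2), not zero.
print([sp.diff(Z, v).subs({x1: x1_star, x2: x2_star}) for v in (x1, x2)])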

Lagrangean Method: Approach
Transforms an equality-constrained problem into an unconstrained one.
Start with: Opt: F(x) s.t.: gj(x) = bj => gj(x) - bj = 0
Get to: L = F(x) - Σj λj[gj(x) - bj]
λj = Lagrangean multipliers (lambdas) -- these are unknown quantities for which we must solve.
Note: [gj(x) - bj] = 0 by definition, thus the optimum of F(x) = the optimum of L.

Lagrangean: Optimality Conditions
Since the new formulation is a single unconstrained function, we can use the formulas for unconstrained optimization. We set the partial derivatives equal to zero for all unknowns, the x's and the λ's.
Thus, to optimize L: ∂L/∂xi = 0 for all i; ∂L/∂λj = 0 for all j.

Lagrangean: Example Formulation
Problem: Opt: F(x) = 6·x1·x2 s.t.: g(x) = 3x1 + 4x2 = 18
Lagrangean: L = 6x1x2 - λ(3x1 + 4x2 - 18)
Optimality conditions:
∂L/∂x1 = 6x2 - 3λ = 0
∂L/∂x2 = 6x1 - 4λ = 0
∂L/∂λ = 0 => 3x1 + 4x2 - 18 = 0

Lagrangean: Graph for Example

Lagrangean: Example Solution
Solving as an unconstrained problem:
∂L/∂x1 = 6x2 - 3λ = 0
∂L/∂x2 = 6x1 - 4λ = 0
∂L/∂λ = 0 => 3x1 + 4x2 - 18 = 0
so that: λ = 2x2 = 1.5x1 (first two equations) => x2 = 0.75x1
=> 3x1 + 3x1 - 18 = 0 (third equation)
x1* = 18/6 = 3
x2* = 18/8 = 2.25
λ* = 4.5
F(x)* = 40.5
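
The same system can be solved symbolically (a minimal sketch, not from the original slides, assuming SymPy):

# Set up the Lagrangean for F = 6*x1*x2 s.t. 3*x1 + 4*x2 = 18 and solve
# the three optimality conditions simultaneously.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
F = 6 * x1 * x2
L = F - lam * (3 * x1 + 4 * x2 - 18)

eqs = [sp.diff(L, v) for v in (x1, x2, lam)]      # dL/dx1 = dL/dx2 = dL/dlam = 0
sol = sp.solve(eqs, [x1, x2, lam], dict=True)[0]
print(sol)                                        # {x1: 3, x2: 9/4, lam: 9/2}
print(F.subs(sol))                                # 81/2 = 40.5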

Lagrangean: Graph for Solution

Shadow Prices
The shadow price is the rate of change of the objective function per unit change in a constraint = ∂F(x)*/∂bj
It is extremely important for system design.
It defines the value of changing a constraint, and indicates whether it is worthwhile to change it:
Should we buy more resources?
Should we change environmental constraints?

Lagrangean Multiplier is a Shadow Price
The Lagrangean multiplier is interpreted as the shadow price on its constraint:
SPj = ∂F(x)*/∂bj = ∂L*/∂bj = ∂{F(x) - Σj λj[gj(x) - bj]}/∂bj = λj
Naturally, this is an instantaneous rate.

Lagrangean = Shadow Prices Example
Let’s see how this works in the example by changing the constraint by 0.1 units:
Opt: F(x) = 6x1x2 s.t.: g(x) = 3x1 + 4x2 = 18.1
The optimum values of the variables are x1* = 18.1/6, x2* = 18.1/8, λ* ≈ 4.5
Thus F(x)* = 6(18.1/6)(18.1/8) ≈ 40.95
ΔF(x) = 40.95 - 40.5 = 0.45 ≈ λ*(0.1)
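
A scripted check of this claim (an illustrative sketch, assuming SymPy; the budget level b is kept symbolic so the shadow price ∂F*/∂b can be read off directly):

# Solve the budget problem for a general right-hand side b, then compare
# F*(18.1) - F*(18) with the multiplier lambda* = dF*/db at b = 18.
import sympy as sp

x1, x2, lam, b = sp.symbols('x1 x2 lam b', real=True)
F = 6 * x1 * x2
L = F - lam * (3 * x1 + 4 * x2 - b)

eqs = [sp.diff(L, v) for v in (x1, x2, lam)]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)[0]
F_opt = sp.simplify(F.subs(sol))                   # optimal value as a function of b: b**2/8

print(F_opt.subs(b, 18.1) - F_opt.subs(b, 18))     # about 0.45
print(sp.diff(F_opt, b).subs(b, 18))               # 9/2 = 4.5, the shadow price lambda*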

Inequality Constraints: Example
Example: housing insulation
Min: Cost = K1/x + K2·x
s.t.: x ≥ 8 (minimum thickness)

Inequality Constraints: Approach
Transform the inequalities into equalities, then proceed as before.
Again, this introduces new variables -- in this case, “slack” variables that measure the slack, or distance, between the constraint and the amount used.
The resulting equations are known as the “Kuhn-Tucker conditions.”

Inequality Constraints -- Putting Slack Variables in the Lagrangean
Add a “slack variable”, sj, for each inequality:
gj(x) ≤ bj => gj(x) + sj² = bj
gj(x) ≥ bj => gj(x) - sj² = bj
The slack variables are squared so they are always positive.
Start from: Opt: F(x) s.t.: gj(x) ≤ bj
Get to: L = F(x) - Σj λj[gj(x) + sj² - bj]

Inequality Constraints: Complementary Slackness Conditions
The optimality conditions are:
∂L/∂xi = 0
∂L/∂λj = 0
plus: ∂L/∂sj = -2·λj·sj = 0
These new equations imply:
either sj = 0 and λj ≥ 0, or sj ≥ 0 and λj = 0
These are the “complementary slackness” conditions: either the slack or the lambda (or both) = 0 for each constraint j.

Interpretation of Complementary Slackness Conditions
If there is slack in constraint bj (λj = 0), i.e., there is more than enough of it => there is no value to the objective function in having more: λj = ∂F(x)/∂bj = 0.
If λj ≠ 0, then all of the available bj is used => sj = 0. In this case, it would be worthwhile to have more of this constrained resource available.

Example: Solution
Min: Cost = K1/x + K2·x s.t.: x ≥ 8 (minimum thickness)
Lagrangean: L = K1/x + K2x - λ[x - s² - b] = 500/x + 24x - λ[x - s² - 8]
Optimality conditions:
∂L/∂x = -500/x² + 24 - λ = 0
∂L/∂s = 2λs = 0
∂L/∂λ = 0 => x - s² = 8
If s = 0: x = 8, λ = 24 - 500/8² ≈ 16.2 (at that point)
Min cost = 254.5
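
A numerical cross-check of this solution (a sketch, not from the original slides, assuming SciPy; the bound x ≥ 8 is handled directly by the optimizer and the multiplier is recovered from the stationarity condition at the binding bound):

# Minimize 500/x + 24*x subject to x >= 8 and recover the multiplier.
from scipy.optimize import minimize

K1, K2 = 500, 24
cost = lambda v: K1 / v[0] + K2 * v[0]

res = minimize(cost, x0=[10.0], bounds=[(8.0, None)])
x_opt = res.x[0]
print(x_opt)                       # about 8.0: the bound is binding
print(cost(res.x))                 # about 254.5

lam = -K1 / x_opt**2 + K2          # stationarity: -500/x^2 + 24 - lambda = 0
print(lam)                         # about 16.2 (> 0: the constraint is costly)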

Example: Interpretation of Slack
Since λ ≈ 16.2 > 0, it is worth changing the constraint to get a better solution.
In this case, the constraint is a minimum, so easing or relaxing it means lowering it, i.e., setting a lower required thickness of insulation.
The unconstrained solution for lifetime cost is x* = 4.56, with optimum ≈ 219 < 254.5.

Slack Variable Example: Graph
The minimum cost with the 8-inch constraint on thickness = 254.5
If the constraint were lower, the optimum could be better (until the unconstrained minimum).

Another Application to the Example
Min: Cost = K1/x + K2·x s.t.: x ≥ 4 (NEW MINIMUM)
Lagrangean: L = K1/x + K2x - λ[x - s² - b] = 500/x + 24x - λ[x - s² - 4]
Optimality conditions:
-500/x² + 24 - λ = 0
2λs = 0
x - s² = 4
If λ = 0: x = 4.56, with slack s² = 0.56
Optimum ≈ 219
Since λ = 0, it is not worth changing the constraint.
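
The same SciPy sketch with the relaxed bound confirms that the constraint goes slack (again an illustration, not from the slides):

# Minimize 500/x + 24*x subject to x >= 4; the bound is no longer binding.
from scipy.optimize import minimize

K1, K2 = 500, 24
cost = lambda v: K1 / v[0] + K2 * v[0]

res = minimize(cost, x0=[10.0], bounds=[(4.0, None)])
x_opt = res.x[0]
print(x_opt)                       # about 4.56: interior of the feasible region
print(cost(res.x))                 # about 219, the unconstrained minimum
print(x_opt - 4.0)                 # slack s^2, about 0.56, so lambda = 0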

Summary of Presentation
Important mathematical approaches: Lagrangeans, Kuhn-Tucker conditions
Important concept: shadow prices
THESE ANALYSES GUIDE DESIGNERS TO CHALLENGE CONSTRAINTS