L12 Lagrange Multiplier Method: Homework Review, Summary, Test 1

Constrained Optimization: Lagrange Multiplier Method
Remember:
1. Standard form
2. Max problems: f(x) = -F(x)
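For reference, the standard form recalled here is the usual NLP minimization form:

```latex
\min_{x \in \mathbb{R}^n} f(x)
\quad \text{subject to} \quad
h_i(x) = 0,\; i = 1,\dots,p; \qquad
g_j(x) \le 0,\; j = 1,\dots,m.
```

A maximization of F(x) is handled by minimizing f(x) = -F(x), since max F(x) = -min[-F(x)].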

KKT Necessary Conditions for a Minimum
Regularity check: the gradients of the active inequality constraints must be linearly independent.
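Written out for the standard form (with multipliers ν for the equality constraints and u for the inequality constraints), the KKT necessary conditions are:

```latex
L(x,\nu,u) = f(x) + \sum_{i=1}^{p} \nu_i\, h_i(x) + \sum_{j=1}^{m} u_j\, g_j(x)
```

```latex
\nabla_x L = \nabla f(x^{*}) + \sum_{i} \nu_i^{*}\,\nabla h_i(x^{*}) + \sum_{j} u_j^{*}\,\nabla g_j(x^{*}) = 0
\qquad \text{(stationarity)}
```

```latex
h_i(x^{*}) = 0, \qquad g_j(x^{*}) \le 0 \qquad \text{(feasibility)}
```

```latex
u_j^{*}\, g_j(x^{*}) = 0 \quad \text{(switching)}, \qquad u_j^{*} \ge 0
```

together with the regularity check above.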

Problem (statement shown on the slide; not captured in the transcript)

KKT Necessary Conditions

Case 1

Case 2

Case 2 cont'd: find the multipliers

Case 2 cont'd: is it a regular point?
1. Is the point feasible? YES.
2. Are the active constraint gradients linearly independent (i.e., not parallel)? Is the matrix of constraint gradients non-singular?
Case 2 results in a KKT point!
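The case analysis on these slides comes from the switching condition u·g = 0: either u = 0 (Case 1, constraint inactive) or g = 0 (Case 2, constraint active). A minimal sketch of that enumeration follows, using a hypothetical one-constraint problem (the slide's actual problem statement is not in the transcript): minimize f(x) = (x1-3)² + (x2-3)² subject to g(x) = x1 + x2 - 4 ≤ 0.

```python
def kkt_cases():
    """Enumerate the KKT cases for the hypothetical illustrative problem
    min (x1-3)^2 + (x2-3)^2  s.t.  g = x1 + x2 - 4 <= 0."""
    # Case 1: u = 0 (inactive). Then grad f = 0 gives x = (3, 3).
    x1, x2, u = 3.0, 3.0, 0.0
    g = x1 + x2 - 4.0                  # g = 2 > 0: point infeasible
    case1_feasible = g <= 0.0          # Case 1 fails the feasibility check
    # Case 2: g = 0 (active). Stationarity: 2(x1-3) + u = 0, 2(x2-3) + u = 0
    # implies x1 = x2; with x1 + x2 = 4 this gives x1 = x2 = 2.
    x1 = x2 = 2.0
    u = -2.0 * (x1 - 3.0)              # u = 2 >= 0: multiplier sign OK
    case2_is_kkt = (u >= 0.0) and abs(x1 + x2 - 4.0) < 1e-12
    return case1_feasible, (x1, x2, u), case2_is_kkt

print(kkt_cases())  # (False, (2.0, 2.0, 2.0), True)
```

Case 1 is rejected as infeasible; Case 2 yields a KKT point at (2, 2) with u = 2, mirroring the pattern of the slides.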

Graphical Solution

Constraint Sensitivity
Note how relaxing h enlarges the feasible region but moves the optimum in the wrong "direction." Recall that ν can be positive or negative! Multiplying h by -1 flips the sign of ν.
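The sensitivity result behind this slide can be sketched as follows (sign conventions vary between texts; this follows the common one in which the constraints are perturbed to h_i(x) = e_i and g_j(x) ≤ e_j). At a regular optimum x*:

```latex
\frac{\partial f^{*}}{\partial e_i} = -\nu_i^{*}, \qquad
\frac{\partial f^{*}}{\partial e_j} = -u_j^{*}.
```

Since u_j* ≥ 0, relaxing an inequality never increases f*; but ν_i* can have either sign, so relaxing an equality may move f* the "wrong" way, which is why multiplying h by -1 (flipping the sign of ν) resolves the puzzle on the slide.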

Sufficient Condition
Is this a convex programming problem? Check f(x) and the constraints. From the convexity theorems:
1. H_f is PD
2. All constraints are linear
Therefore the KKT point is a global minimum!

True/False

Lagrange Multiplier Method
May produce a KKT point. A KKT point is a CANDIDATE minimum; it may not be a local minimum. If a point fails the KKT conditions, we cannot guarantee anything; the point may still be a minimum. We need a SUFFICIENT condition.

Convex set: all points on a straight line segment between any two points of the set remain in the set. (Figure: examples of convex sets and a non-convex set, where points on the connecting line lie outside the feasible region.)
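The line-segment definition the slide paraphrases:

```latex
S \text{ is convex} \iff
\lambda x^{(1)} + (1-\lambda)\, x^{(2)} \in S
\quad \forall\, x^{(1)}, x^{(2)} \in S,\ \lambda \in [0,1].
```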

Multiple variables: Fig. 4.21. What if it were an equality constraint? (Note: a misprint is flagged on the slide.)

Figure 4.22: Convex function f(x) = x², a "bowl that holds water."

Fig. 4.23: Convex function.

Test for a Convex Function
The definition above is difficult to use directly! However, Thm 4.8 (pg. 163): if the Hessian matrix of the function is PD or PSD at all points in the set S, then the function is convex. PD gives "strictly" convex; PSD gives "convex."
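A minimal numerical sketch of this test (assuming NumPy is available): classify a symmetric Hessian by its eigenvalues. For a quadratic function the Hessian is constant, so one check suffices; in general Thm 4.8 requires the check at all points of S.

```python
import numpy as np

def convexity_from_hessian(H, tol=1e-10):
    """Classify a symmetric Hessian: PD -> strictly convex, PSD -> convex."""
    eig = np.linalg.eigvalsh(np.asarray(H, dtype=float))
    if np.all(eig > tol):
        return "strictly convex (Hessian PD)"
    if np.all(eig >= -tol):
        return "convex (Hessian PSD)"
    return "not convex at this point"

# f(x) = x^2 has constant Hessian [[2]], PD everywhere (Fig 4.22's bowl).
print(convexity_from_hessian([[2.0]]))              # strictly convex (Hessian PD)
print(convexity_from_hessian([[2.0, 0.0],
                              [0.0, 0.0]]))         # convex (Hessian PSD)
```

A zero eigenvalue drops the classification from "strictly convex" to "convex," matching the PD vs. PSD distinction on the slide.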

Theorem
Given the feasible set S, S is convex if:
1. the h_i are linear
2. the g_j are convex (i.e., H_g is PD or PSD)
When f(x) and S are both convex, we have a "convex programming problem."

"Sufficient" Theorem 4.10
The first-order KKT conditions are necessary and sufficient for a GLOBAL minimum if:
1. f(x) is convex (H_f(x) positive definite)
2. x is restricted to a convex feasible set S: equality constraints must be linear, and inequality constraints must be convex.
HINT: linear functions are convex!

Summary
Lagrange multipliers are the instantaneous rate of change of f(x) w.r.t. relaxing a constraint. Equality constraints may need tightening rather than loosening. Convex sets assure contiguity and/or smoothness of f(x). A KKT point of a convex programming problem is a GLOBAL MINIMUM!