KKT Practice and Second Order Conditions from Nash and Sofer


Unconstrained: First Order Necessary Condition; Second Order Necessary Condition; Second Order Sufficient Condition
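The slide's formulas did not survive the transcript; the standard conditions for a local minimizer x* of a smooth f, as in Nash and Sofer, are:

```latex
\text{FONC: } \nabla f(x^{*}) = 0, \qquad
\text{SONC: } \nabla f(x^{*}) = 0 \ \text{and}\ \nabla^{2} f(x^{*}) \succeq 0, \qquad
\text{SOSC: } \nabla f(x^{*}) = 0 \ \text{and}\ \nabla^{2} f(x^{*}) \succ 0.
```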

Easiest Problem Linear equality constraints

KKT Conditions Note: for equality constraints the multipliers are unconstrained in sign, and complementarity is not an issue.
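The conditions themselves are missing from the transcript; for the linear-equality problem they read (a standard statement, not the slide's own rendering):

```latex
\min_{x} f(x) \ \text{ s.t. } \ Ax = b
\qquad\Longrightarrow\qquad
\nabla f(x^{*}) = A^{T}\lambda^{*}, \quad Ax^{*} = b,
```

with λ* free in sign and no complementarity condition.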

Null Space Representation Let x* be a feasible point, Ax* = b. Any other feasible point can be written as x = x* + p where Ap = 0. The feasible region is {x : x = x* + p, p ∈ N(A)}, where N(A) is the null space of A.
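A minimal sketch of this parameterization. The constraint A, b below is a hypothetical stand-in (only the feasible point x* = [4 0 0]' appears later in the transcript):

```python
# Hypothetical equality constraint: A = [1 1 1], b = 4, with the
# feasible point x* = [4, 0, 0]' from the later null-space slide.
A = [1.0, 1.0, 1.0]
b = 4.0
x_star = [4.0, 0.0, 0.0]

# Columns of Z span N(A): A z = 0 for each column, by construction.
Z = [[1.0, 0.0],
     [-1.0, 1.0],
     [0.0, -1.0]]

def feasible_point(v):
    """x = x* + Z v; every such x satisfies A x = b."""
    return [x_star[i] + Z[i][0] * v[0] + Z[i][1] * v[1] for i in range(3)]

x = feasible_point((0.5, -1.25))
print(sum(A[i] * x[i] for i in range(3)))   # 4.0 = b, for any choice of v
```

Varying v sweeps out the whole feasible region without ever leaving it, which is the point of the representation.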

Null and Range Spaces See Section 3.2 of Nash and Sofer for an example.

Orthogonality

Null Space Review

Constrained to Unconstrained You can convert any linear equality constrained optimization problem to an equivalent unconstrained problem. Method 1: substitution. Method 2: the null space representation and a feasible point.

Example Solve by substitution: the problem becomes an unconstrained one in the remaining variables.

Null Space Method With x* = [4 0 0]' and the substitution x = x* + Zv, the problem becomes an unconstrained one in v.

General Method There exists a null space matrix Z. The feasible region is {x* + Zv}. Equivalent "reduced" problem: minimize f(x* + Zv) over v.
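The reduced problem's formulas are not in the transcript; by the chain rule, with x = x* + Zv, they would be:

```latex
\phi(v) := f(x^{*} + Zv), \qquad
\nabla \phi(v) = Z^{T}\nabla f(x^{*} + Zv), \qquad
\nabla^{2}\phi(v) = Z^{T}\nabla^{2} f(x^{*} + Zv)\,Z.
```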

Optimality Conditions Assume feasible point and convert to null space formulation

Where is KKT? KKT implies the null space condition, and the null space condition implies KKT: Z'∇f(x*) = 0 says the gradient is orthogonal to Null(A), thus it must be in Range(A').

Lemma 14.1 Necessary Conditions If x* is a local min of f over {x|Ax=b}, and Z is a null-space matrix for A Or equivalently use KKT Conditions
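The displayed conditions are missing; in null-space form, Lemma 14.1's conclusion is usually stated as:

```latex
Z^{T}\nabla f(x^{*}) = 0, \qquad Z^{T}\nabla^{2} f(x^{*})\,Z \succeq 0,
```

or equivalently, in KKT form, ∇f(x*) = A'λ* for some λ*.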

Lemma 14.2 Sufficient Conditions If x* satisfies (where Z is a basis matrix for Null(A)) then x* is a strict local minimizer

Lemma 14.2 Sufficient Conditions (KKT form) If (x*, λ*) satisfies (where Z is a basis matrix for Null(A)) then x* is a strict local minimizer

Lagrange Multiplier λ* is called the Lagrange multiplier. It represents the sensitivity of the optimal value to small perturbations of the constraints.
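A numeric check of this sensitivity interpretation, using the example from the next slide with the right-hand side 10 replaced by a parameter b. The closed-form solution below is my own derivation from the KKT system, not the slide's:

```python
# min (x^2 + 4y^2)/2  s.t.  x - y = b.
# KKT: grad f = (x, 4y) = lam * (1, -1), so x = lam, y = -lam/4,
# and the constraint gives lam + lam/4 = b, i.e. lam = 4b/5.
def solve(b):
    lam = 4.0 * b / 5.0
    x, y = lam, -lam / 4.0
    f = (x**2 + 4.0 * y**2) / 2.0
    return x, y, lam, f

b, eps = 10.0, 1e-4
_, _, lam, f0 = solve(b)
_, _, _, f1 = solve(b + eps)
# The finite-difference slope of the optimal value matches lam = 8.
print((f1 - f0) / eps)
```

The printed slope is approximately 8, the multiplier itself: perturbing the constraint right-hand side by ε changes the optimal value by about λ*ε.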

Optimality conditions Consider min (x^2 + 4y^2)/2 s.t. x - y = 10

Optimality conditions Find KKT point Check SOSC
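One way to work this example by hand (my own arithmetic, consistent with the problem stated above):

```python
# KKT system for min (x^2 + 4y^2)/2  s.t.  x - y = 10:
#   x  - lam = 0      (stationarity in x)
#   4y + lam = 0      (stationarity in y)
#   x - y    = 10     (feasibility)
# Eliminating: x = lam, y = -lam/4, so lam + lam/4 = 10.
lam = 10 / (1 + 0.25)          # lam = 8
x, y = lam, -lam / 4           # x = 8, y = -2

# SOSC: Z = (1, 1)' spans Null([1, -1]); the reduced Hessian
# Z' H Z with H = diag(1, 4) must be positive definite.
reduced_hessian = 1 * 1 * 1 + 1 * 4 * 1   # = 5 > 0, strict local min
print(x, y, lam, reduced_hessian)
```

So the KKT point is (x, y) = (8, -2) with λ* = 8, and the reduced Hessian 5 > 0 confirms SOSC (the objective is also convex, so this is the global minimum).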

In Class Practice Find a KKT point Verify SONC and SOSC

Linear Equality Constraints - I

Linear Equality Constraints - II

Linear Equality Constraints - III so SOSC is satisfied, and x* is a strict local minimum. The objective is convex, so the KKT conditions are sufficient.

Next Easiest Problem Linear inequality constraints: the constraints form a polyhedron.

Close to Equality Case Equality FONC. [Figure: polyhedron Ax >= b bounded by a1x = b, a2x = b, a3x = b, a4x = b, with the minimizer x*, the unconstrained minimum, a contour set of the function, and the normals -a1, -a2.] Which λi are 0? What is the sign of λi?

Inequality Case Inequality FONC. [Figure: the same polyhedron Ax >= b with boundaries a1x = b through a4x = b, the minimizer x*, and the normals -a1, -a2.] Nonnegative multipliers imply the gradient points to the less-than side of the constraint.
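The FONC being referenced is not in the transcript; for the figure's convention min f(x) s.t. Ax >= b, it would be:

```latex
\nabla f(x^{*}) = A^{T}\lambda^{*}, \qquad \lambda^{*} \ge 0, \qquad
\lambda^{*T}\,(Ax^{*} - b) = 0.
```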

Lagrangian Multipliers

Lemma 14.3 Necessary Conditions If x* is a local min of f over {x|Ax≤b}, and Z is a null-space matrix for the active constraints, then for some vector λ*

Lemma 14.5 Sufficient Conditions (KKT form) If (x*, λ*) satisfies

Lemma 14.5 Sufficient Conditions (KKT form) where Z+ is a basis matrix for Null(A+) and A+ corresponds to the nondegenerate active constraints, i.e. the active constraints with strictly positive multipliers.
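A standard rendering of the hypotheses (the slide's own display is lost):

```latex
\nabla f(x^{*}) = A^{T}\lambda^{*}, \quad \lambda^{*} \ge 0, \quad
\lambda^{*T}(Ax^{*} - b) = 0, \qquad
Z_{+}^{T}\,\nabla^{2} f(x^{*})\,Z_{+} \succ 0
\ \Longrightarrow\ x^{*} \text{ is a strict local minimizer.}
```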

Sufficient Example Find solution and verify SOSC

Linear Inequality Constraints - I

Linear Inequality Constraints - II

Linear Inequality Constraints - III

Linear Inequality Constraints - IV

Example Problem

You Try Solve the problem using the above theorems:

Why Necessary and Sufficient? What are sufficient conditions good for? They are a way to confirm that a candidate point is a (local) minimum. But not every minimum satisfies any given SC. Necessary conditions tell you: if the necessary conditions don't hold, then you know you don't have a minimum. Under appropriate assumptions, every point that is a minimum satisfies the necessary conditions, so they are good stopping criteria: algorithms look for points that satisfy necessary conditions.

General Constraints

Lagrangian Function Optimality conditions are expressed using the Lagrangian and the Jacobian matrix, where each row is the gradient of a constraint.
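A common sign convention (assumed here, since the slide's display is lost) for constraints g(x) = 0 or g(x) >= 0:

```latex
L(x,\lambda) = f(x) - \lambda^{T} g(x), \qquad
\nabla_{x} L(x^{*},\lambda^{*}) = \nabla f(x^{*}) - \nabla g(x^{*})\,\lambda^{*} = 0.
```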

Theorem 14.2 Sufficient Conditions Equality (KKT form) If (x*, λ*) satisfies

Theorem 14.4 Sufficient Conditions Inequality (KKT) If (x*, λ*) satisfies

Lemma 14.4 Sufficient Conditions (KKT form) where Z+ is a basis matrix for Null(A+) and A+ corresponds to the Jacobian of the nondegenerate active constraints, i.e. the active constraints with strictly positive multipliers.

Sufficient Example Find solution and verify SOSC

Nonlinear Inequality Constraints - I

Nonlinear Inequality Constraints - II

Nonlinear Inequality Constraints - III

Sufficient Example Find solution and verify SOSC

Nonlinear Inequality Constraints - V

Nonlinear Inequality Constraints - VI

Theorem 14.1 Necessary Conditions - Equality If x* is a local min of f over {x|g(x)=0}, Z is a null-space matrix of the Jacobian ∇g(x*)', and x* is a regular point, then

Theorem 14.3 Necessary Conditions If x* is a local min of f over {x|g(x)>=0}, Z is a null-space matrix of the Jacobian ∇g(x*)', and x* is a regular point, then

Regular point x* is a regular point with respect to the constraints g(x) if the gradients of the active constraints are linearly independent. For equality constraints, all constraints are active, so the Jacobian should have linearly independent rows.
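A tiny sketch of this check in R^2 with two hypothetical active-constraint gradients (the values below are assumed for illustration only):

```python
# Regularity: the gradients of the active constraints must be
# linearly independent. With two active constraints in R^2 this
# reduces to a nonzero 2x2 determinant of the stacked Jacobian.
g1 = (1.0, 0.0)   # assumed gradient of the first active constraint
g2 = (0.0, 1.0)   # assumed gradient of the second active constraint
det = g1[0] * g2[1] - g1[1] * g2[0]
regular = det != 0.0
print(regular)   # True: this point would be regular
```

With more constraints or dimensions, the same test is a full-row-rank check on the active-constraint Jacobian.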

Necessary Example Show optimal solution x*=[1,0]’ is regular and find KKT point

Constraint Qualifications Regularity is an example of a constraint qualification (CQ). The KKT conditions are based on linearizations of the constraints; a CQ guarantees that this linearization is not getting us into trouble. Without a CQ, a KKT point might not exist at a local minimum. There are many other CQs, e.g., for inequalities, Slater's condition: there exists a strictly feasible point (g(x) < 0 for constraints written g(x) ≤ 0). Note that a CQ is not needed for linear constraints.

KKT Summary [Implication diagram: x* is a local min + CQ ⇒ KKT satisfied; KKT satisfied + SOSC ⇒ x* is a local min; KKT satisfied + convex f + convex constraints ⇒ x* is a global min.]