OR II GSLM 52800

Presentation transcript:


Outline: inequality constraints

NLP with Inequality Constraints

min f(x)
s.t. hi(x) = 0 for i = 1, …, p
     gj(x) ≤ 0 for j = 1, …, m

In matrix form: h(x) = 0, g(x) ≤ 0.

Binding Constraints

An inequality constraint gj(x) ≤ 0 is binding at a point x0 if gj(x0) = 0. An equality constraint is always binding.
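Computing the binding set at a point is a one-liner; here is a minimal sketch in Python (the function name, tolerance, and example constraints are my own, not from the slides), treating each constraint as a callable gj:

```python
def binding_set(g_list, x, tol=1e-8):
    """Indices j of the inequality constraints g_j(x) <= 0 that are
    binding at x, i.e. g_j(x) = 0 up to a numerical tolerance."""
    return [j for j, g in enumerate(g_list) if abs(g(x)) <= tol]

# Illustrative constraints: g1(x) = x1^2 + x2^2 - 2, g2(x) = x1 - 1.
g_list = [lambda x: x[0] ** 2 + x[1] ** 2 - 2, lambda x: x[0] - 1]
print(binding_set(g_list, (1.0, 1.0)))  # [0, 1]: both binding at (1, 1)
```

The tolerance matters in practice: with floating-point arithmetic a binding constraint rarely evaluates to exactly zero.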

Regular Points

Let K be the set of binding inequality constraints at x*. A point x* satisfying h(x) = 0 and g(x) ≤ 0 is a regular point if the gradients ∇hi(x*), i = 1, …, p, and ∇gj(x*), j ∈ K, are linearly independent.

FONC for Inequality Constraints (Karush-Kuhn-Tucker Necessary Conditions; KKT Conditions)

For a regular point x* to be a local min, there must exist λ* ∈ ℝ^p and μ* ∈ ℝ^m such that, in matrix form:

∇f(x*) + λ*ᵀ∇h(x*) + μ*ᵀ∇g(x*) = 0 (stationarity condition)
h(x*) = 0, g(x*) ≤ 0 (primal feasibility)
μ*ᵀg(x*) = 0 (complementary slackness conditions)
μ* ≥ 0 (non-negativity of the dual variables; dual feasibility)
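The KKT conditions can be checked numerically at a candidate point. Below is a small sketch in pure Python (the function names and the finite-difference gradient are my own additions), restricted to inequality constraints for brevity:

```python
import math

def grad(fun, x, eps=1e-6):
    """Central-difference gradient of a scalar function of a vector."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        g.append((fun(xp) - fun(xm)) / (2 * eps))
    return g

def kkt_residuals(f, g_list, x, mu):
    """Residuals of the KKT conditions for min f(x) s.t. g_j(x) <= 0:
    stationarity norm, worst primal violation, complementary slackness,
    and the smallest multiplier (which should be >= 0)."""
    gf = grad(f, x)
    gg = [grad(g, x) for g in g_list]
    stat = [gf[i] + sum(m * gj[i] for m, gj in zip(mu, gg))
            for i in range(len(x))]
    stat_norm = math.sqrt(sum(s * s for s in stat))
    primal = max(max(g(x) for g in g_list), 0.0)
    comp = sum(abs(m * g(x)) for m, g in zip(mu, g_list))
    return stat_norm, primal, comp, min(mu)

# min x^2 s.t. x >= 1, i.e. g(x) = 1 - x <= 0; x* = 1 with mu* = 2.
print(kkt_residuals(lambda x: x[0] ** 2, [lambda x: 1 - x[0]], [1.0], [2.0]))
```

At a KKT point the first three residuals are approximately zero and the last entry is non-negative.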

The Convex Cone Formed by Vectors v1 = (1, 0), v2 = (0, 1)

The convex cone formed by vectors v1 and v2 is {αv1 + βv2 : α, β ≥ 0}. For v1 = (1, 0) and v2 = (0, 1) this is the non-negative quadrant {(x, y) : x ≥ 0, y ≥ 0}.
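For two linearly independent vectors, cone membership amounts to solving a 2×2 system for the coefficients α and β and checking that both are non-negative. A pure-Python sketch (names are my own) using Cramer's rule:

```python
def in_cone_2d(v1, v2, w, tol=1e-12):
    """True if w = a*v1 + b*v2 for some a, b >= 0; v1 and v2 must be
    linearly independent (nonzero determinant). Solved by Cramer's rule."""
    det = v1[0] * v2[1] - v1[1] * v2[0]
    a = (w[0] * v2[1] - w[1] * v2[0]) / det
    b = (v1[0] * w[1] - v1[1] * w[0]) / det
    return a >= -tol and b >= -tol

print(in_cone_2d((1, 0), (0, 1), (2, 3)))   # True: in the first quadrant
print(in_cone_2d((1, 0), (0, 1), (-1, 1)))  # False
```

This same test reappears in the geometric reading of the KKT conditions, where the cone is generated by the gradients of the binding constraints.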

Geometric Interpretation of the KKT Condition

[Figure series: a point x0 on the boundary of the feasible region defined by g1(x) ≤ 0 and g2(x) ≤ 0, shown with the gradients ∇g1(x0), ∇g2(x0), ∇f(x0), and the region in which f decreases. In the first four configurations x0 cannot be a minimum; in the last configuration x0 can be a minimum.]

A Necessary Condition for x0 to be a Minimum

When −∇f(x0) lies in the convex cone of ∇g1(x0) and ∇g2(x0):
−∇f(x0) = μ1∇g1(x0) + μ2∇g2(x0) for some μ1, μ2 ≥ 0, i.e.,
∇f(x0) + μ1∇g1(x0) + μ2∇g2(x0) = 0.
[Figure: x0 on the boundary with the gradients ∇g1(x0), ∇g2(x0), and ±∇f(x0)]

Effect of Convexity

For a convex objective function over a convex feasible set, the KKT conditions are both necessary and sufficient. Convex gj and linear hi make the feasible region convex.

Example 10.11 of JB: KKT conditions

Example 10.11 of JB Tf(x) = (4(x1+1), 6(x24)) Tg1(x) = (2x1, 2x2) possibilities 2 = 0 > 0 1 16

Example 10.11 of JB

Case μ1 = 0, μ2 = 0 (both constraints not binding): stationarity gives x1 = −1, x2 = 4, violating a constraint.

Example 10.11 of JB

Case μ1 > 0, μ2 > 0 (both constraints binding): solve the two binding constraints together with the stationarity condition. [Figure: the gradients ∇g1(x0), ∇g2(x0), and ∇f(x0) at the candidate point]

Example 10.11 of JB

Case μ1 = 0, μ2 > 0 (only the second constraint binding): solving gives μ2 < 0, contradicting μ ≥ 0, so this case yields no KKT point.

Example 10.11 of JB

Case μ1 > 0, μ2 = 0 (only the first constraint binding): solve the binding constraint together with the stationarity condition.

KKT Condition with Non-negativity Constraints

min f(x), s.t. gi(x) ≤ 0, i = 1, …, m, x ≥ 0

KKT Condition with Non-negativity Constraints

Define L(x, μ) = f(x) + μᵀg(x).

KKT Condition with Non-negativity Constraints

Alternatively, define L(x, μ) = f(x) + μ1ᵀg(x) − μ2ᵀx.

Example 10.12 of JB

L(x, μ) = x1² − 8x1 + 4x2² − 16x2 + μ(x1 + x2 − 5)

KKT conditions:
(a) 2x1 − 8 + μ ≥ 0, 8x2 − 16 + μ ≥ 0
(b) x1 + x2 − 5 ≤ 0
(c) x1(2x1 − 8 + μ) = 0, x2(8x2 − 16 + μ) = 0
(d) μ(x1 + x2 − 5) = 0
(e) x1 ≥ 0, x2 ≥ 0, μ ≥ 0

Example 10.12 of JB

Eight cases, depending on whether each of x1, x2, and μ is = 0 or > 0. On checking, the case in which x1 + x2 ≤ 5 is binding with x1 > 0 and x2 > 0 gives x = (3.2, 1.8) and μ = 1.6, satisfying the KKT conditions at a regular point. Since the objective is convex and the constraint is linear, the KKT conditions are sufficient, so x = (3.2, 1.8) is optimal.
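The claimed KKT point of Example 10.12 can be checked directly against conditions (a)-(e); a quick verification in Python, using the objective and constraint implied by those conditions:

```python
x1, x2, mu = 3.2, 1.8, 1.6  # candidate point and multiplier

a1 = 2 * x1 - 8 + mu        # (a) dL/dx1, must be >= 0
a2 = 8 * x2 - 16 + mu       # (a) dL/dx2, must be >= 0
b = x1 + x2 - 5             # (b) primal feasibility, must be <= 0
c1, c2 = x1 * a1, x2 * a2   # (c) complementary slackness in x
d = mu * b                  # (d) complementary slackness in mu

# (e) x1, x2, mu >= 0 holds; a1, a2, b, c1, c2, d all come out
# (approximately) zero, so the constraint is binding and every
# KKT condition is satisfied with equality at this point.
print(a1, a2, b, c1, c2, d)
```

Since x1, x2, and μ are all strictly positive, each condition in (a), (b) must hold with equality here, which is exactly what the printed residuals confirm up to floating-point rounding.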