CS5321 Numerical Optimization


Chapter 12: Theory of Constrained Optimization (5/13/2019)

General form

  min_{x ∈ R^n} f(x)   subject to   c_i(x) = 0, i ∈ E;   c_i(x) ≥ 0, i ∈ I

E and I are index sets for the equality and inequality constraints.
The feasible set is Ω = {x | c_i(x) = 0, i ∈ E; c_i(x) ≥ 0, i ∈ I}.

Outline
- Equality and inequality constraints
- Lagrange multipliers
- Linear independence constraint qualification
- First/second order optimality conditions
- Duality

A single equality constraint

An example:

  min_x f(x) = x1 + x2   subject to   c1(x) = x1^2 + x2^2 − 2 = 0

The gradients are ∇f(x) = (1, 1)ᵀ and ∇c1(x) = (2x1, 2x2)ᵀ.
The optimal solution is at x* = (−1, −1)ᵀ.
Optimality condition: ∇f(x*) = λ1* ∇c1(x*); λ* = −1/2 in the example.
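As a sanity check, the stationarity condition ∇f(x*) = λ∇c1(x*) together with the constraint form a square system that a root finder can solve; a minimal sketch using SciPy (the starting guess is an arbitrary choice near the minimizer):

```python
import numpy as np
from scipy.optimize import fsolve

# KKT system for: min x1 + x2  s.t.  x1^2 + x2^2 - 2 = 0.
# Unknowns z = (x1, x2, lam): stationarity plus feasibility.
def kkt(z):
    x1, x2, lam = z
    return [1 - lam * 2 * x1,       # dL/dx1 = 0
            1 - lam * 2 * x2,       # dL/dx2 = 0
            x1**2 + x2**2 - 2]      # c1(x) = 0

x1, x2, lam = fsolve(kkt, [-0.5, -0.5, -1.0])  # guess in the negative quadrant
print(x1, x2, lam)  # -> approximately -1, -1, -0.5
```

Note the system also has a second root (1, 1, 1/2), the constrained maximizer; the starting guess selects the branch.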

A single inequality constraint

Example:

  min_x f(x) = x1 + x2   subject to   c1(x) = 2 − x1^2 − x2^2 ≥ 0

Case 1: the solution is inside the region c1 > 0 → unconstrained optimization: ∇f(x*) = 0.
Case 2: the solution is on the boundary of c1 → equality constraint: ∇f(x*) = λ*∇c1(x*) with λ* ≥ 0.
Complementarity condition: λ* c1(x*) = 0. Let λ* = 0 in case 1.
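The two cases can be checked with a generic constrained solver; a small sketch using SciPy's SLSQP (the method and starting point are assumptions, not from the slides):

```python
from scipy.optimize import minimize

# min x1 + x2  s.t.  c1(x) = 2 - x1^2 - x2^2 >= 0.
# Since grad f = (1, 1) never vanishes, case 1 is impossible here and
# the minimizer must lie on the boundary (case 2, constraint active).
res = minimize(lambda x: x[0] + x[1], x0=[0.0, 0.0], method="SLSQP",
               constraints=[{"type": "ineq",
                             "fun": lambda x: 2 - x[0]**2 - x[1]**2}])
print(res.x)  # -> approximately (-1, -1), on the circle x1^2 + x2^2 = 2
```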

Lagrangian function

Define L(x, λ) = f(x) − λ c1(x). Then
  ∇λ L(x, λ) = −c1(x)
  ∇x L(x, λ) = ∇f(x) − λ ∇c1(x)
The optimality conditions for the inequality-constrained problem:
  ∇x L(x*, λ*) = 0,  c1(x*) ≥ 0,  λ* ≥ 0,  λ* c1(x*) = 0
L(x, λ) is called the Lagrangian function; λ is called the Lagrange multiplier.

Lagrange multiplier

At the optimal solution x*, ∇f(x*) = λ1 ∇c1(x*). If c1(·) changes by δ units, f(·) changes by λ1 δ units: λ1 is the change of f per unit change of c1, i.e. the sensitivity of f to c1. λ1 is also called the shadow price or the dual variable.
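For the circle example, the optimal value as a function of the constraint right-hand side has a closed form, so the shadow-price interpretation can be checked by finite differences (a small sketch):

```python
import numpy as np

# min x1 + x2 s.t. x1^2 + x2^2 = 2 + delta has solution
# x* = -sqrt((2 + delta)/2) * (1, 1), so the optimal value is:
def f_star(delta):
    return -np.sqrt(2 * (2 + delta))

eps = 1e-6
shadow_price = (f_star(eps) - f_star(-eps)) / (2 * eps)  # central difference
print(shadow_price)  # -> approximately -0.5 = lambda*
```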

Constraint qualification

Active set: A(x) = E ∪ {i ∈ I | ci(x) = 0}.
Linear independence constraint qualification (LICQ): the gradients of the constraints in the active set, {∇ci(x) | i ∈ A(x)}, are linearly independent.
A point is called regular if it satisfies LICQ. Other constraint qualifications may be used.
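LICQ can be checked numerically by stacking the active-constraint gradients and computing the rank; a NumPy sketch (the second constraint c2 here is a made-up illustration, not from the slides):

```python
import numpy as np

# Active-constraint gradients at x = (-1, -1) for
# c1(x) = 2 - x1^2 - x2^2 and a hypothetical c2(x) = x1 - x2 = 0.
x = np.array([-1.0, -1.0])
grads = np.array([[-2 * x[0], -2 * x[1]],  # grad c1 = (2, 2)
                  [1.0, -1.0]])            # grad c2 (illustrative)
rank = np.linalg.matrix_rank(grads)
print(rank)  # 2 == number of active constraints -> LICQ holds at x
```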

First order conditions

A regular point x* that is a local minimizer of the constrained problem must satisfy the KKT (Karush–Kuhn–Tucker) conditions: there exists λ* such that
  ∇x L(x*, λ*) = 0
  ci(x*) = 0 for all i ∈ E
  ci(x*) ≥ 0 for all i ∈ I
  λi* ≥ 0 for all i ∈ I
  λi* ci(x*) = 0 for all i ∈ E ∪ I
The last one is called the complementarity condition.
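For the single-inequality example, all the conditions can be verified directly at the candidate point x* = (−1, −1) with λ* = 1/2 (a sketch):

```python
import numpy as np

# min x1 + x2  s.t.  c1(x) = 2 - x1^2 - x2^2 >= 0;
# candidate solution x* = (-1, -1) with multiplier lambda* = 0.5.
x = np.array([-1.0, -1.0])
lam = 0.5
grad_f = np.array([1.0, 1.0])
grad_c = np.array([-2 * x[0], -2 * x[1]])   # = (2, 2)
c = 2 - x[0]**2 - x[1]**2                   # = 0: constraint is active

stationarity = grad_f - lam * grad_c        # grad_x L(x*, lam*) = 0
print(stationarity, c, lam * c)             # (0, 0), feasible, complementary
```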

Second order conditions

Suppose x* is a solution to a constrained optimization problem and λ* satisfies the KKT conditions. The critical cone C(x*, λ*) is the set of vectors w such that
  ∇ci(x*)ᵀ w = 0 for all i ∈ E,
  ∇ci(x*)ᵀ w = 0 for all i ∈ I ∩ A(x*) with λi* > 0,
  ∇ci(x*)ᵀ w ≥ 0 for all i ∈ I ∩ A(x*) with λi* = 0.
Then for all w ∈ C(x*, λ*),
  wᵀ ∇²xx L(x*, λ*) w ≥ 0.

Projected Hessian

Let A(x*) = [∇ci(x*)ᵀ]_{i ∈ A(x*)} be the Jacobian of the active constraints, where A(x*) is the active set. It can be shown that C(x*, λ*) = Null A(x*).
Null A(x*) can be computed via the QR decomposition:
  A(x*)ᵀ = Q (R; 0) = (Q1 Q2) (R; 0),   span{Q2} = Null A(x*)
Define the projected Hessian H = Q2ᵀ ∇²xx L(x*, λ*) Q2.
The second order optimality condition is that H is positive semidefinite.
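For the equality-constrained circle example (x* = (−1, −1), λ* = −1/2), the projected Hessian can be computed exactly as described; a NumPy sketch:

```python
import numpy as np

# A(x*) = grad c1(x*)^T = [-2, -2], and
# hess_x L = hess f - lam * hess c1 = 0 - (-1/2) * 2I = I.
A = np.array([[-2.0, -2.0]])
Q, R = np.linalg.qr(A.T, mode="complete")   # A^T = Q [R; 0]
Q2 = Q[:, 1:]                               # orthonormal basis of Null(A)
hess_L = np.eye(2)
H = Q2.T @ hess_L @ Q2                      # projected Hessian
print(H)  # 1x1 matrix [[1.]] > 0 -> second order condition holds
```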

Duality: An example

  min_{x ≥ 0} f(x) = 3x1 + 4x2
  subject to  c1: x1 + x2 ≥ 3,   c2: 2x1 + 0.5x2 ≥ 2

Find a lower bound for f(x) via the constraints.
Ex: f(x) ≥ 3c1  →  3x1 + 4x2 ≥ 3x1 + 3x2 ≥ 9
Ex: f(x) ≥ 2c1 + 0.5c2  →  3x1 + 4x2 ≥ 3x1 + 2.25x2 ≥ 7
What is the maximum lower bound for f(x)? That is the dual problem:

  max_{y ≥ 0} g(y) = 3y1 + 2y2
  s.t.  y1 + 2y2 ≤ 3,   y1 + 0.5y2 ≤ 4
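Both LPs can be solved directly to see that the maximum lower bound equals the primal optimum. A sketch with scipy.optimize.linprog (which minimizes, so ≥ constraints and the dual's max are negated); the constraint data c1: x1 + x2 ≥ 3 and c2: 2x1 + 0.5x2 ≥ 2 are one consistent reading of the slide and should be treated as an assumption:

```python
from scipy.optimize import linprog

# Primal: min 3x1 + 4x2  s.t.  x1 + x2 >= 3, 2x1 + 0.5x2 >= 2, x >= 0
primal = linprog(c=[3, 4], A_ub=[[-1, -1], [-2, -0.5]], b_ub=[-3, -2])
# Dual:   max 3y1 + 2y2  s.t.  y1 + 2y2 <= 3, y1 + 0.5y2 <= 4, y >= 0
dual = linprog(c=[-3, -2], A_ub=[[1, 2], [1, 0.5]], b_ub=[3, 4])
print(primal.fun, -dual.fun)  # both 9.0: best lower bound = primal optimum
```

(linprog's default variable bounds are already x ≥ 0, so no explicit bounds are needed.)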

Dual problem

Primal problem: min_x f(x)  s.t.  c(x) ≥ 0.
No equality constraints; the inequality constraints ci are concave (−ci are convex). Let c(x) = [c1(x), …, cm(x)]ᵀ. The Lagrangian is L(x, λ) = f(x) − λᵀc(x), λ ∈ R^m.
The dual objective function is q(λ) = inf_x L(x, λ). If L(·, λ) is convex, inf_x L(x, λ) is attained where ∇x L(x, λ) = 0.
The dual problem is max_λ q(λ)  s.t.  λ ≥ 0; if L(·, λ) is convex, the additional constraint ∇x L(x, λ) = 0 may be imposed.
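For the inequality-constrained circle example (f linear, c1 concave), ∇x L = 0 gives x1 = x2 = −1/(2λ), so q(λ) = −1/(2λ) − 2λ for λ > 0; maximizing it recovers λ* and the primal optimal value. A numeric sketch (the bracketing interval is an arbitrary choice):

```python
from scipy.optimize import minimize_scalar

# Dual of: min x1 + x2  s.t.  2 - x1^2 - x2^2 >= 0.
# Setting grad_x L = 0 gives x1 = x2 = -1/(2*lam), hence:
def q(lam):
    return -1.0 / (2.0 * lam) - 2.0 * lam   # dual objective, lam > 0

res = minimize_scalar(lambda lam: -q(lam), bounds=(1e-3, 10.0),
                      method="bounded")
print(res.x, q(res.x))  # lam* ~ 0.5, q(lam*) ~ -2 = f(x*) at x* = (-1, -1)
```

Strong duality holds here: the dual optimum −2 equals f(−1, −1), and λ* = 1/2 matches the KKT multiplier found earlier.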