CS B553: Algorithms for Optimization and Learning. Constrained optimization.


Key Concepts
- Constraint formulations
- Necessary optimality conditions
- Lagrange multipliers: equality constraints
- KKT conditions: equalities and inequalities
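For reference, the general problem these slides study can be written explicitly (standard notation, matching the g(x) = 0 / h(x) ≤ 0 convention used below):

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad
g_i(x) = 0,\ i = 1,\dots,m,
\qquad
h_j(x) \le 0,\ j = 1,\dots,p.
```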

Figure 1: The objective function f, shown over the feasible set S.

Constrained local minima are either local minima of f that happen to lie inside S, or points on the boundary of S. (Figure 2.) For example, minimizing f(x) = x² over S = [1, 2] gives the boundary point x = 1, since the unconstrained minimizer x = 0 is infeasible.

Constraint formulations (Figure 3):
- Bound constraints: l ≤ x ≤ u, a box with bounds l_1 ≤ x_1 ≤ u_1 and l_2 ≤ x_2 ≤ u_2.
- Linear inequalities: Ax ≤ b; each row A_i of A and entry b_i cuts off a half-space, and S is their intersection.
- Linear equalities: Ax = b; S lies in the affine subspace defined by rows such as A_1 x = b_1.
- General nonlinear constraints: inequalities h_1(x) ≤ 0, h_2(x) ≤ 0 and equalities g_1(x) = 0.
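As a concrete illustration, each of these constraint families can be expressed with scipy.optimize; this is a sketch with a made-up objective and made-up numbers, not an example from the slides:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, NonlinearConstraint

# Hypothetical smooth objective.
f = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2

# Bound constraints: l <= x <= u.
bounds = [(0.0, 2.0), (0.0, 2.0)]

# Linear inequalities A x <= b, encoded as -inf <= A x <= b.
lin = LinearConstraint(np.array([[1.0, 1.0]]), -np.inf, 3.0)

# General nonlinear constraints: an equality g(x) = 0 and an inequality h(x) <= 0.
g_eq   = NonlinearConstraint(lambda x: x[0]**2 + x[1]**2 - 2.0, 0.0, 0.0)
h_ineq = NonlinearConstraint(lambda x: x[0] - x[1], -np.inf, 0.0)

res = minimize(f, x0=np.array([0.5, 1.0]), method="trust-constr",
               bounds=bounds, constraints=[lin, g_eq, h_ineq])
print(res.x)  # a constrained local minimizer
```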

Lagrange multipliers: one equality constraint g(x) = 0. At a local minimum (or maximum), the gradient of the objective and the gradient of the constraint must be parallel. (Figure 4: points x_1 and x_2 on the curve g(x) = 0, with ∇f and ∇g drawn at each; at the optimum the two gradients align.)
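A worked example (mine, not from the slides), using the sign convention ∇f(x*) = -λ∇g(x*) that appears later in this deck:

```latex
% Minimize f(x, y) = x^2 + y^2 subject to g(x, y) = x + y - 1 = 0.
% Lagrangian: L(x, y, \lambda) = f(x, y) + \lambda\, g(x, y).
% Stationarity:
\frac{\partial L}{\partial x} = 2x + \lambda = 0, \qquad
\frac{\partial L}{\partial y} = 2y + \lambda = 0
\;\Rightarrow\; x = y = -\lambda/2.
% Feasibility: x + y = 1 gives x^* = y^* = \tfrac12 and \lambda = -1.
% Check parallelism: \nabla f(x^*) = (1, 1) = -\lambda\, \nabla g(x^*).
```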

If the constraint gradient and the objective gradient are not parallel at x, then there exists some direction v you can move in that changes f without changing g(x), to first order. (Figure 5: ∇f(x) and ∇g(x) at a point x on the constraint, together with such a direction v.)
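One way to make this precise (my derivation, assuming ∇g(x) ≠ 0): take the component of -∇f orthogonal to ∇g.

```latex
v \;=\; -\left( \nabla f(x) \;-\;
        \frac{\nabla f(x)^{\top} \nabla g(x)}{\lVert \nabla g(x) \rVert^{2}}\,
        \nabla g(x) \right)
% Then v^{\top} \nabla g(x) = 0, so moving along v keeps g(x) = 0 to first order,
% and v^{\top} \nabla f(x) < 0 unless \nabla f \parallel \nabla g (Cauchy-Schwarz),
% so f strictly decreases along v. Hence at a constrained local minimum the
% gradients must be parallel.
```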

Interpretation: Suppose x* is a global minimum. If I were to relax the constraint g(x) = 0 at a constant rate toward g(x) = 1, the value of λ tells me the rate of decrease of f(x*). (Figure 6: x* with ∇f(x*) = -λ∇g(x*) and the constraint gradient ∇g(x*).)
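A quick numerical check of this sensitivity interpretation, a sketch assuming scipy and reusing the toy problem from the worked example above (where λ = -1):

```python
import numpy as np
from scipy.optimize import minimize

# min x0^2 + x1^2  s.t.  x0 + x1 - 1 = 0.  Relax the constraint to g(x) = c
# and watch how the optimal value moves.
f = lambda x: x[0]**2 + x[1]**2

def optimal_value(c):
    cons = {"type": "eq", "fun": lambda x: x[0] + x[1] - 1.0 - c}
    return minimize(f, x0=np.zeros(2), constraints=[cons]).fun

eps = 1e-4
dfdc = (optimal_value(eps) - optimal_value(0.0)) / eps
print(dfdc)  # about 1.0: f* decreases at rate -1 = lambda as the constraint relaxes
```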

One inequality constraint h(x) ≤ 0. Either:
1. x is a critical point of f with h(x) < 0 (the constraint is inactive and ∇f(x) = 0), or
2. x is on the boundary h(x) = 0 and satisfies a Lagrangian condition.
(Figure 7: the curve h(x) = 0 separating h(x) < 0 from h(x) > 0; an interior critical point x_1 with ∇f(x_1) = 0, and boundary points x_2, x_3 where the objective and constraint gradients align.)
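Both cases are unified by the Karush-Kuhn-Tucker (KKT) conditions; for a single inequality this is the standard statement, with multiplier μ:

```latex
% KKT conditions for  \min f(x)  subject to  h(x) \le 0:
\nabla f(x^*) + \mu\, \nabla h(x^*) = 0 \quad \text{(stationarity)}
\qquad h(x^*) \le 0 \quad \text{(primal feasibility)}
\qquad \mu \ge 0 \quad \text{(dual feasibility)}
\qquad \mu\, h(x^*) = 0 \quad \text{(complementary slackness)}
```

Case 1 corresponds to μ = 0 with h(x*) < 0 and ∇f(x*) = 0; case 2 corresponds to h(x*) = 0 with μ ≥ 0. Complementary slackness forces one of the two to hold.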

Multiple inequality constraints. (Figure 8: the regions h_1(x) < 0 and h_2(x) < 0, with points x at which one or both constraints are active.)
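Putting equalities and inequalities together gives the general KKT conditions previewed in the key concepts (standard statement; only constraints active at x* contribute multipliers):

```latex
% KKT conditions for  \min f(x)  s.t.  g_i(x) = 0 \; (i = 1..m),
%                                      h_j(x) \le 0 \; (j = 1..p):
\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*)
              + \sum_{j=1}^{p} \mu_j \nabla h_j(x^*) = 0
\qquad g_i(x^*) = 0 \;\; \forall i, \qquad h_j(x^*) \le 0 \;\; \forall j
\qquad \mu_j \ge 0, \qquad \mu_j\, h_j(x^*) = 0 \;\; \forall j
```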