
Survey of gradient based constrained optimization algorithms
Algorithms were selected based on their popularity. Additional details and additional algorithms can be found in Chapter 5 of Haftka and Gurdal's Elements of Structural Optimization.

Optimization with constraints
Standard formulation (reconstructed below).
Equality constraints are a challenge, but they are fortunately missing from most engineering design problems, so this lecture will deal only with inequality constraints.
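The standard formulation itself is not reproduced in this transcript; a common statement of it, using the sign convention that g_j(x) >= 0 is feasible (a reconstruction, not necessarily the slide's exact notation), is:

\[
\begin{aligned}
\min_{x}\ & f(x)\\
\text{such that}\ & g_j(x) \ge 0, \quad j = 1,\dots,n_g\\
& h_k(x) = 0, \quad k = 1,\dots,n_e
\end{aligned}
\]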

Derivative based optimizers
All are predicated on the assumption that function evaluations are expensive and gradients can be calculated. They are similar to a person put at night on a hill and directed to find the lowest point in an adjacent valley using a flashlight with a limited battery.
Basic strategy:
1. Flash the light to get the derivative and select a direction.
2. Go straight in that direction until you start going up or hit a constraint.
3. Repeat until converged.
Some methods move mostly along the constraint boundaries, some mostly in the interior of the feasible domain (interior point algorithms).

Gradient projection and reduced gradient methods
Find a good direction tangent to the active constraints.
Move a distance along it and then restore to the constraint boundaries.
A typical active set algorithm; the generalized reduced gradient method is the one used in Excel's Solver.
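The slide's formulas are not reproduced here; for reference, the projected steepest-descent direction used by gradient projection methods is commonly written as (standard textbook form, not necessarily the slide's notation):

\[
d = -\left[\,I - N\,(N^{T}N)^{-1}N^{T}\right]\nabla f(x),
\]

where the columns of N are the gradients of the active constraints, so that d lies in the tangent plane of the active constraint surfaces.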

Penalty function methods
Quadratic penalty function (reconstructed below).
A gradual rise of the penalty parameter leads to the sequential unconstrained minimization technique (SUMT). Why is it important?
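The quadratic penalty function itself is not reproduced in the transcript; with the convention that g_j(x) >= 0 is feasible, it is commonly written as (a hedged reconstruction):

\[
\phi(x, r) = f(x) + r \sum_{j=1}^{n_g} \bigl[\min\bigl(0,\, g_j(x)\bigr)\bigr]^{2},
\]

so that only violated constraints contribute, and the unconstrained minimizers of \phi(x, r) approach the constrained optimum as r grows large.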

Example 5.7.1

Contours for r=1.

Contours for r=1000.
For non-derivative methods one can avoid this ill-conditioning by making the penalty proportional to the absolute value of the violation instead of its square!

Problems: penalty
Find how many function evaluations fminunc and fminsearch need to solve the example when you use r=1000, starting from x0=[2,2]. Compare to going through r=1, 10, 100, 1000, starting the solution for each r value where the solution for the previous one stopped. A sketch of this continuation strategy is given below.
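Since the data of Example 5.7.1 is not reproduced on these slides, the sketch below illustrates the r = 1, 10, 100, 1000 continuation on the ring problem defined later in this lecture; it is a minimal illustration, not the required solution.

% SUMT continuation sketch: re-minimize the penalized objective for a
% gradually increasing penalty parameter r, warm-starting each run from
% the previous solution and accumulating the function evaluation count.
a = 10; ri = 10; ro = 20;                      % ring problem data (from later slides)
f = @(x) x(1)^2 + a*x(2)^2;                    % objective
g = @(x) [x(1)^2 + x(2)^2 - ri^2, ...
          ro^2 - x(1)^2 - x(2)^2];             % g >= 0 means feasible
x = [2, 2];                                    % starting point from the problem statement
total = 0;
for r = [1 10 100 1000]
    phi = @(x) f(x) + r*sum(min(0, g(x)).^2);  % quadratic exterior penalty
    [x, ~, ~, out] = fminsearch(phi, x);       % warm start from previous solution
    total = total + out.funcCount;
end
fprintf('x = [%g %g], total function evaluations = %d\n', x(1), x(2), total);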

5.9: Projected Lagrangian methods
Sequential quadratic programming:
Find the direction by solving a quadratic programming subproblem.
Find the step length alpha by minimizing a merit function along that direction.
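The two equations referenced on this slide did not survive transcription; the standard SQP direction-finding subproblem and line search, in common textbook notation (a reconstruction, not necessarily the slide's), are:

\[
\begin{aligned}
\min_{d}\ & \nabla f(x_k)^{T} d + \tfrac{1}{2}\, d^{T}\, \nabla^{2}_{xx} L(x_k,\lambda_k)\, d\\
\text{such that}\ & g_j(x_k) + \nabla g_j(x_k)^{T} d \ge 0, \quad j = 1,\dots,n_g,
\end{aligned}
\]

followed by x_{k+1} = x_k + \alpha d, where the step length \alpha is found by minimizing a merit (penalty) function along d.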

Matlab function fmincon
FMINCON attempts to solve problems of the form:
    min F(X)  subject to:  A*X  <= B,  Aeq*X  = Beq   (linear constraints)
     X                     C(X) <= 0,  Ceq(X) = 0     (nonlinear constraints)
                           LB <= X <= UB

[X,FVAL,EXITFLAG,OUTPUT,LAMBDA] = FMINCON(FUN,X0,A,B,Aeq,Beq,LB,UB,NONLCON)
The function NONLCON accepts X and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities respectively. (Set LB=[] and/or UB=[] if no bounds exist.)
Possible values of EXITFLAG:
 1  First order optimality conditions satisfied.
 0  Too many function evaluations or iterations.
-1  Stopped by output/plot function.
-2  No feasible point found.

Quadratic function and constraint example

function f=quad2(x)          % objective f = x1^2 + a*x2^2
global a
f=x(1)^2+a*x(2)^2;
end

function [c,ceq]=ring(x)     % nonlinear constraints: ri^2 <= x1^2 + x2^2 <= ro^2
global ri ro
c(1)=ri^2-x(1)^2-x(2)^2;     % inner radius constraint (c <= 0 is feasible)
c(2)=x(1)^2+x(2)^2-ro^2;     % outer radius constraint
ceq=[];                      % no equality constraints
end

x0=[1,10]; a=10; ri=10.; ro=20;
x =
fval =
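The fmincon call that produced this output is not shown on the slide; a minimal sketch consistent with the functions and data above (no linear constraints or bounds) would be:

global a ri ro               % shared with quad2 and ring
a = 10; ri = 10.; ro = 20;
x0 = [1, 10];
[x, fval, flag, output] = fmincon(@quad2, x0, [], [], [], [], [], [], @ring)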

Fuller output
Optimization completed because the objective function is non-decreasing in feasible directions, to within the default value of the function tolerance, and constraints are satisfied to within the default value of the constraint tolerance.
x =
fval =
flag = 1
output =
       iterations: 6
        funcCount: 22
     lssteplength: 1
         stepsize: e-06
        algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
    firstorderopt: e-08
  constrviolation: e-11
lambda.ineqnonlin' =

Making it harder for fmincon
a=1.1;
Maximum number of function evaluations exceeded; increase OPTIONS.MaxFunEvals.
x =
fval =
flag = 0
       iterations: 14
        funcCount: 202
     lssteplength: e-04
         stepsize:
        algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
    firstorderopt:
  constrviolation:

Restart sometimes helps
x0=x
x0 =
x =
fval =
flag = 1
       iterations: 15
        funcCount: 108
     lssteplength: 1
         stepsize: e-04
        algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
    firstorderopt: e-07
  constrviolation: e-07

Problem: fmincon
For the ring problem with a=10, find how the number of function evaluations depends on how narrow you make the ring (by reducing ro). Always start with x0=[1,10]. A sketch of one way to run this experiment is given below.
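One way to run the experiment, with the ro values chosen only for illustration (a sketch, not the required solution):

global a ri ro
a = 10; ri = 10.; x0 = [1, 10];
for ro_val = [20 15 12 11 10.5 10.1]          % progressively narrower ring
    ro = ro_val;                              % update the global seen by ring
    [x, fval, flag, output] = fmincon(@quad2, x0, [], [], [], [], [], [], @ring);
    fprintf('ro = %5.2f  exit flag = %2d  funcCount = %4d\n', ro, flag, output.funcCount);
end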