Engineering Optimization

Presentation transcript:

Engineering Optimization Concepts and Applications. Fred van Keulen, Matthijs Langelaar, CLA H21.1, A.vanKeulen@tudelft.nl. (The title-slide pictures show convex polyhedra, found at http://www.4dsolutions.net/ocn/wgraphics.html; these resemble the feasible domains of linear programming problems with three design variables.)

Contents Constrained Optimization: Optimality conditions recap Constrained Optimization: Algorithms Linear programming

Inequality constrained problems. Consider a problem with only inequality constraints: min f(x) subject to g_j(x) ≤ 0. At the optimum, only the active constraints matter; the optimality conditions are similar to the equality constrained case. (The figure shows contours of f and constraints g1, g2, g3 in the (x1, x2) plane.)

Inequality constraints. First order optimality: consider a feasible local variation ∂x around the optimum. For a feasible perturbation, the active constraints satisfy ∇g_jᵀ ∂x ≤ 0, and since this is a boundary optimum, ∇fᵀ ∂x ≥ 0 for every feasible perturbation. Together these conditions lead to ∇f + Σ_j μ_j ∇g_j = 0 over the active constraints.

Optimality condition. Multipliers must be non-negative: μ_j ≥ 0. Interpretation (given in Haftka): the negative gradient -∇f (a descent direction) lies in the cone spanned by the constraint gradients with positive multipliers.

Optimality condition (2). Equivalent interpretation (given in Belegundu): no descent direction exists within the cone of feasible directions. (The figure shows the feasible cone bounded by the gradients of g1 and g2, a feasible direction, and the descent direction -∇f in the (x1, x2) plane.)

Karush-Kuhn-Tucker conditions. First order optimality conditions for the constrained problem, stated via the Lagrangian L(x, λ, μ) = f(x) + Σ_k λ_k h_k(x) + Σ_j μ_j g_j(x): a KKT point satisfies ∇f + Σ_k λ_k ∇h_k + Σ_j μ_j ∇g_j = 0, h_k = 0, g_j ≤ 0, μ_j ≥ 0 and μ_j g_j = 0 (complementarity). Note, this condition applies only to regular points, i.e. where the active constraint gradients are linearly independent.
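As an illustrative note (not part of the original slides): a minimal numerical check of the KKT conditions, assuming the toy problem min x1² + x2² subject to x1 + x2 ≥ 1, whose optimum is x* = (0.5, 0.5) with μ* = 1.

```python
import numpy as np

# Assumed toy problem (illustration only, not from the slides):
#   minimize   f(x) = x1^2 + x2^2
#   subject to g(x) = 1 - x1 - x2 <= 0
def grad_f(x):
    return 2.0 * x

def g(x):
    return 1.0 - x[0] - x[1]

def grad_g(x):
    return np.array([-1.0, -1.0])

def kkt_residuals(x, mu):
    """Return stationarity, feasibility and complementarity residuals."""
    stationarity = grad_f(x) + mu * grad_g(x)   # grad L = 0
    feasibility = max(g(x), 0.0)                # g <= 0
    complementarity = mu * g(x)                 # mu * g = 0
    return stationarity, feasibility, complementarity

x_star, mu_star = np.array([0.5, 0.5]), 1.0
print(kkt_residuals(x_star, mu_star))   # all (close to) zero at the optimum
```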

Sufficiency. The KKT conditions are necessary conditions for local constrained minima. For sufficiency, consider the second-order conditions based on the active constraints: the Hessian of the Lagrangian must be positive definite on the tangent subspace of h and the active g. Special case: for a convex objective and a convex feasible region, the KKT conditions are sufficient for global optimality.

Significance of multipliers. Consider the case where the optimization problem depends on a parameter a: min f(x, a) subject to h(x, a) = 0, with Lagrangian L = f + λ h. The KKT conditions give ∇_x f + λ ∇_x h = 0 and h = 0; we are looking for the sensitivity of the optimal objective value, df*/da.

Significance of multipliers (3). Lagrange multipliers describe the sensitivity of the objective to changes in the constraints: df*/da = ∂L/∂a. Similar equations can be derived for multiple constraints and for inequalities. Multipliers give the "price of raising the constraint". Note, this makes it logical that at an optimum, multipliers of inequality constraints must be non-negative!

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Linear Programming

Constrained optimization methods. Approaches: transformation methods (penalty / barrier functions) plus unconstrained optimization algorithms; random methods / simplex-like methods; feasible direction methods; reduced gradient methods; approximation methods (SLP, SQP). Penalty and barrier methods treated before. Note, constrained problems can also have interior optima!

Augmented Lagrangian method. Recall the penalty method: min f(x) + p Σ_k h_k(x)². Disadvantages: a high penalty factor is needed for accurate results, and a high penalty factor causes ill-conditioning and slow convergence.

Augmented Lagrangian method. Basic idea: add the penalty term to the Lagrangian, L_A(x, λ) = f(x) + Σ_k λ_k h_k(x) + p Σ_k h_k(x)², and use estimates and updates of the multipliers. Also possible for inequality constraints. The multiplier update rules determine convergence; exact convergence is obtained for moderate values of p. The penalty term helps to make the Hessian of L positive definite.
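A minimal sketch of this idea, added for illustration (the toy problem, penalty value and step count are assumptions, not from the slides):

```python
import numpy as np
from scipy.optimize import minimize

# Assumed toy problem:  min x1^2 + x2^2   s.t.   h(x) = x1 + x2 - 1 = 0
f = lambda x: x[0]**2 + x[1]**2
h = lambda x: x[0] + x[1] - 1.0

def augmented_lagrangian(x0, lam=0.0, p=10.0, outer_iters=10):
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        # Unconstrained subproblem: minimize L_A(x) = f + lam*h + p*h^2
        L_A = lambda x: f(x) + lam * h(x) + p * h(x)**2
        x = minimize(L_A, x).x
        lam = lam + 2.0 * p * h(x)        # standard multiplier update rule
    return x, lam

x_opt, lam_opt = augmented_lagrangian([0.0, 0.0])
print(x_opt, lam_opt)   # approaches (0.5, 0.5), multiplier -> -1
```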

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Augmented Lagrangian Feasible directions methods Reduced gradient methods Approximation methods SQP Linear Programming

Feasible direction methods. Moving along the boundary: Rosen's gradient projection method and Zoutendijk's method of feasible directions. Basic idea: move along the steepest descent direction until constraints are encountered; the step direction is obtained by projecting the steepest descent direction onto the tangent plane; repeat until a KKT point is found. Both methods involve line searches along the feasible directions.

1. Gradient projection method. Iterations follow the constraint boundary h = 0. For nonlinear constraints, mapping back to the constraint surface is needed, in the normal space. For simplicity, consider a linear equality constrained problem: min f(x) subject to h(x) = Ax - b = 0. (The figure shows the constraint surface h = 0 in (x1, x2, x3) space.)

Gradient projection method (2). Recall: with ∇h the matrix of constraint gradients, the tangent space consists of the directions t with ∇hᵀ t = 0, and the normal space is spanned by the columns of ∇h. Projection: decompose any vector into a tangent and a normal component, s = t + ∇h λ.

Gradient projection method (3). Search direction in the tangent space: s = -P ∇f, with projection matrix P = I - ∇h (∇hᵀ ∇h)⁻¹ ∇hᵀ. In the nonlinear case, a correction in the normal space is needed afterwards to return to the constraint surface.

Correction to constraint boundary. Correction in the normal subspace, e.g. using Newton iterations on a first order Taylor approximation of h: repeat updates along the constraint normals until h = 0 is satisfied again. (The figure shows a tangent step sk from xk to x'k+1, followed by the correction back to the boundary at xk+1.)
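For illustration (an addition to the transcript): a numeric sketch of one projected step plus the normal-space Newton correction, assuming a simple quadratic objective and a spherical constraint.

```python
import numpy as np

# Assumed example: f(x) = ||x||^2, constraint h(x) = x1^2 + x2^2 + x3^2 - 1 = 0.
def f_grad(x):
    return 2.0 * x

def h(x):
    return np.array([x[0]**2 + x[1]**2 + x[2]**2 - 1.0])    # unit sphere

def h_jac(x):
    return np.array([[2*x[0], 2*x[1], 2*x[2]]])             # 1 x n Jacobian

def projected_step(x, alpha=0.1, corr_iters=5):
    A = h_jac(x).T                                           # columns span normal space
    P = np.eye(len(x)) - A @ np.linalg.solve(A.T @ A, A.T)   # projection matrix
    x_new = x - alpha * P @ f_grad(x)                        # move in tangent space
    for _ in range(corr_iters):                              # Newton correction (normal space)
        A = h_jac(x_new).T
        dx = -A @ np.linalg.solve(A.T @ A, h(x_new))
        x_new = x_new + dx
    return x_new

x = np.array([0.0, 0.6, 0.8])            # feasible starting point on the sphere
x_new = projected_step(x)
print(x_new, h(x_new))                   # h stays (close to) zero after the correction
```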

Practical aspects. How to deal with inequality constraints? Use an active set strategy: keep a set of active inequality constraints, treat these as equality constraints, and update the set regularly (heuristic rules). In the gradient projection method, if s = 0: check the multipliers, since this could be a KKT point; if any μ_i < 0, that constraint is inactive and can be removed from the active set.

Slack variables. Alternative way of dealing with inequality constraints: add slack variables, g_j(x) + s_j² = 0, and treat the result as equality constraints. Disadvantages: all constraints are considered all the time, and the number of design variables increases.

2. Zoutendijk's feasible directions. Basic idea: move along the steepest descent direction until constraints are encountered; at the constraint surface, solve a subproblem to find a descending feasible direction; repeat until a KKT point is found. Subproblem: minimize α over directions s that are descending (∇fᵀ s ≤ α) and feasible (∇g_jᵀ s ≤ α for the active constraints), with s bounded. The subproblem is an LP problem, which can be solved efficiently and gives the best search direction (but α must be negative; see Belegundu p. 168 for details). A sketch of this subproblem is given below.
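A minimal sketch of the direction-finding LP using scipy's linprog, added for illustration (the gradient values are assumed example data, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

# Direction-finding subproblem at a boundary point:
#   minimize   alpha
#   subject to grad_f . s  <= alpha        (descending)
#              grad_gj . s <= alpha        (feasible, active constraints)
#              -1 <= s_i <= 1              (normalization bounds)
grad_f = np.array([1.0, 2.0])          # assumed objective gradient
grad_g = [np.array([0.0, 1.0])]        # assumed gradients of active constraints

n = len(grad_f)
c = np.zeros(n + 1)                    # variables: [s_1, ..., s_n, alpha]
c[-1] = 1.0                            # objective: minimize alpha
A_ub = [np.append(grad_f, -1.0)] + [np.append(gj, -1.0) for gj in grad_g]
b_ub = np.zeros(len(A_ub))
bounds = [(-1.0, 1.0)] * n + [(None, None)]

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
s, alpha = res.x[:n], res.x[-1]
print(s, alpha)   # alpha < 0: a feasible descent direction exists
```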

Zoutendijk's method. Subproblem is linear, so it is efficiently solved. Determine the active set before solving the subproblem! When α = 0: a KKT point is found. The method needs a feasible starting point. Dr. Zoutendijk worked at the University of Leiden, and invented this method around 1970. Nonlinear equality constraints have no interior, and this method requires an interior: for a descent direction, α must be slightly negative, which means that the design is pushed slightly into the feasible region. See Belegundu for details.

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Augmented Lagrangian Feasible directions methods Reduced gradient methods Approximation methods SQP Linear Programming

Reduced gradient methods. Basic idea: choose a set of n - m decision variables d and use the reduced gradient in an unconstrained gradient-based method. Recall the reduced gradient: df/dd = ∂f/∂d - ∂f/∂s (∂h/∂s)⁻¹ ∂h/∂d (partial gradients taken as row vectors). The state variables s can be determined from h(d, s) = 0 (iteratively for nonlinear constraints): for the iterations, h(d, s) = 0 is written as a first order Taylor approximation, and an iterative procedure for s follows (basically a Newton method).

Reduced gradient method. Nonlinear constraints: Newton iterations to return to the constraint surface (determine s), s ← s - (∂h/∂s)⁻¹ h(d, s), repeated until convergence. A note on the selection of the variables is given in Papalambros, p. 190; the cost of the back-to-constraint mapping procedure depends strongly on the partitioning. Variants using 2nd order information also exist. Drawback: the selection of decision variables (but some procedures exist).
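For illustration (not from the slides): a scalar sketch of the reduced gradient with the Newton recovery of the state variable, assuming a small toy problem and step size.

```python
import numpy as np

# Assumed toy problem with x = (d, s):
#   f(d, s) = d^2 + s^2,   h(d, s) = d + 2*s - 2 = 0   (optimum at d = 0.4, s = 0.8)
df_dd = lambda d, s: 2*d        # partial f / partial d
df_ds = lambda d, s: 2*s        # partial f / partial s
dh_dd, dh_ds = 1.0, 2.0         # partial h / partial d, partial h / partial s

def solve_state(d, s0=0.0, iters=20):
    """Newton iterations on h(d, s) = 0 to recover the state variable s."""
    s = s0
    for _ in range(iters):
        s = s - (d + 2*s - 2.0) / dh_ds
    return s

def reduced_gradient(d):
    s = solve_state(d)
    # df/dd = df/dd - df/ds * (dh/ds)^-1 * dh/dd
    return df_dd(d, s) - df_ds(d, s) * (1.0 / dh_ds) * dh_dd

# Simple steepest descent in the decision variable only:
d = 0.0
for _ in range(50):
    d = d - 0.1 * reduced_gradient(d)
print(d, solve_state(d))   # approaches the constrained optimum (0.4, 0.8)
```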

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Augmented Lagrangian Feasible directions methods Reduced gradient methods Approximation methods SQP Linear Programming

Approximation methods. SLP: Sequential Linear Programming, solving a series of linear approximate problems: minimize f(x_k) + ∇fᵀ Δx subject to the linearized constraints g(x_k) + ∇gᵀ Δx ≤ 0 and h(x_k) + ∇hᵀ Δx = 0. Efficient methods for linearly constrained problems are available.

SLP 1-D illustration. SLP iterations approach a convex feasible domain from the outside. (The figure shows the objective f, the constraint g, and successive iterates x = 0.8 and x = 0.988.)

SLP points of attention. Solves an LP problem in every cycle: efficient only when the analysis cost is relatively high. Tendency to diverge; solution: trust region (move limits).
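A minimal sketch of an SLP loop with move limits using scipy's linprog, added for illustration (the toy problem and the limit value are assumptions, not from the slides):

```python
import numpy as np
from scipy.optimize import linprog

# Assumed toy problem:  min -x1 - x2
#   s.t.  g1 = x1^2 + x2 - 2 <= 0,   g2 = x1 + x2^2 - 2 <= 0   (optimum at (1, 1))
f_grad = np.array([-1.0, -1.0])
g = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] + x[1]**2 - 2.0])
g_jac = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])

x = np.array([0.0, 0.0])
move_limit = 0.5                       # trust region on each step component
for k in range(10):
    # Linearized constraints: g(x) + J(x) dx <= 0, plus move limits on dx
    res = linprog(f_grad, A_ub=g_jac(x), b_ub=-g(x),
                  bounds=[(-move_limit, move_limit)] * 2)
    x = x + res.x
print(x)   # converges to the optimum (1, 1)
```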

SLP points of attention (2). An infeasible starting point can result in an unsolvable LP problem. Solution: relax the constraints in the first cycles, e.g. replace g_j ≤ 0 by g_j ≤ β and add a penalty term k·β to the objective, with k sufficiently large to force the solution into the feasible region. The feasible domain is enlarged by β, which allows a certain amount of constraint violation.

SLP points of attention (3). Cycling can occur when the optimum lies on a curved constraint. Solution: move limit reduction strategy.

Method of Moving Asymptotes. First order method, by Svanberg (1987). Builds a convex approximate problem, approximating the responses using f̃(x) = R + Σ_i ( P_i / (U_i - x_i) + Q_i / (x_i - L_i) ). R, P_i, Q_i, U_i and L_i are determined based on the values of the gradient and objective, and on the history of the optimization process. See also p. 325 of Papalambros. The approximate problem is solved efficiently. Popular method in topology optimization.
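A small sketch of building this kind of approximation around a design point, added for illustration (the asymptote placement and the test function are assumptions, not from the slides):

```python
import numpy as np

# MMA-style approximation of a single response f around x_k, of the form
#   f_approx(x) = r + sum_i( p_i/(U_i - x_i) + q_i/(x_i - L_i) ).
def mma_approximation(f_k, grad_k, x_k, L, U):
    p = np.where(grad_k > 0, (U - x_k)**2 * grad_k, 0.0)
    q = np.where(grad_k < 0, -(x_k - L)**2 * grad_k, 0.0)
    r = f_k - np.sum(p / (U - x_k) + q / (x_k - L))
    def f_approx(x):
        return r + np.sum(p / (U - x) + q / (x - L))
    return f_approx

# Assumed example: approximate f(x) = x1^2 + 1/x2 around x_k = (1, 1).
f = lambda x: x[0]**2 + 1.0 / x[1]
grad = lambda x: np.array([2*x[0], -1.0 / x[1]**2])
x_k = np.array([1.0, 1.0])
L, U = x_k - 0.7, x_k + 0.7             # assumed moving asymptotes around x_k
f_tilde = mma_approximation(f(x_k), grad(x_k), x_k, L, U)
print(f(x_k), f_tilde(x_k))             # the approximation matches f at x_k
```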

Sequential Approximate Optimization. Zeroth order method: 1. determine an initial trust region; 2. generate sampling points (design of experiments); 3. build a response surface (e.g. Least Squares, Kriging, ...); 4. optimize the approximate problem; 5. check convergence, update the trust region, repeat from 2. Many variants! See also Lecture 4.

Sequential Approximate Optimization. Good approach for expensive models; the response surface dampens noise; versatile. (The figure shows the design domain, the trust region, the response surface, and the sub-optimal point it yields compared with the true optimum.)
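For illustration (an addition to the transcript): a compact sketch of one response-surface cycle; the stand-in "expensive" model, sampling scheme, quadratic basis and trust-region update are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

expensive_f = lambda x: (x[0] - 1.0)**2 + (x[1] + 0.5)**2    # stand-in "expensive" model

def sao_cycle(center, radius, n_samples=15, rng=np.random.default_rng(0)):
    # 1. Design of experiments: random samples in the trust region
    X = center + rng.uniform(-radius, radius, size=(n_samples, 2))
    y = np.array([expensive_f(x) for x in X])
    # 2. Least-squares fit of a full quadratic: [1, x1, x2, x1^2, x2^2, x1*x2]
    basis = lambda x: np.array([1, x[0], x[1], x[0]**2, x[1]**2, x[0]*x[1]])
    coef, *_ = np.linalg.lstsq(np.array([basis(x) for x in X]), y, rcond=None)
    surrogate = lambda x: coef @ basis(x)
    # 3. Optimize the surrogate within the trust region (bound constraints)
    bnds = [(c - radius, c + radius) for c in center]
    return minimize(surrogate, center, bounds=bnds).x

center, radius = np.array([3.0, 3.0]), 1.0
for k in range(6):
    center = sao_cycle(center, radius)
    radius *= 0.8                       # simple trust-region update rule
print(center)                           # approaches the true optimum (1, -0.5)
```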

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Augmented Lagrangian Feasible directions methods Reduced gradient methods Approximation methods SQP Linear Programming

SQP. SQP: Sequential Quadratic Programming, a Newton method to solve the KKT conditions. KKT points (equality constrained case): ∇L(x, λ) = ∇f + ∇h λ = 0 together with h(x) = 0. Newton's method is applied to this system of equations.

SQP (2). Newton step on the KKT system: solve the linear system ∇²L Δx + ∇h Δλ = -∇L, ∇hᵀ Δx = -h, and update x and λ.

SQP (3). Note: these are exactly the KKT conditions of a quadratic subproblem for finding the search direction sk: minimize ½ skᵀ ∇²L sk + ∇fᵀ sk subject to h + ∇hᵀ sk = 0.

Quadratic subproblem. A quadratic subproblem with linear constraints can be solved efficiently. General case: minimize ½ sᵀ A s + bᵀ s subject to Cs = d. KKT condition: A s + Cᵀ λ = -b together with Cs = d, a linear system in s and λ. Efficient specialized algorithms exist (Papalambros p. 318) to solve this system of equations; its solution gives the search direction and the new multiplier estimates.
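A minimal sketch that assembles and solves this KKT system directly with numpy, added for illustration (the matrices are assumed example data):

```python
import numpy as np

# Equality-constrained QP:  min 1/2 s'As + b's   s.t.   Cs = d
def solve_eq_qp(A, b, C, d):
    n, m = A.shape[0], C.shape[0]
    K = np.block([[A, C.T],
                  [C, np.zeros((m, m))]])      # KKT matrix
    rhs = np.concatenate([-b, d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]                    # step s and multipliers lambda

A = np.array([[2.0, 0.0], [0.0, 2.0]])         # Hessian of the (approximate) Lagrangian
b = np.array([-2.0, -4.0])                     # gradient term
C = np.array([[1.0, 1.0]])                     # linearized constraint
d = np.array([1.0])
s, lam = solve_eq_qp(A, b, C, d)
print(s, lam)                                  # s = (0, 1), lambda = 2 for this data
```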

Basic SQP algorithm: 1. choose an initial point x0 and initial multiplier estimates λ0; 2. set up the matrices for the QP subproblem; 3. solve the QP subproblem, giving sk and λk+1; 4. set xk+1 = xk + sk; 5. check the convergence criteria and, if not converged, repeat from 2; otherwise finished. (A sketch of this loop is given below.)
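A minimal sketch of the basic loop, added for illustration (the equality-constrained toy problem and the exact Hessian are assumptions, not from the slides):

```python
import numpy as np

# Assumed toy problem:  min f(x) = x1^2 + x2^2   s.t.   h(x) = x1 + x2 - 1 = 0
grad_f = lambda x: 2.0 * x
h = lambda x: np.array([x[0] + x[1] - 1.0])
grad_h = lambda x: np.array([[1.0], [1.0]])     # n x m matrix of constraint gradients
hess_L = lambda x, lam: 2.0 * np.eye(2)         # exact Hessian of the Lagrangian

x, lam = np.array([2.0, 0.0]), np.zeros(1)
for k in range(10):
    A, gg, C = hess_L(x, lam), grad_f(x), grad_h(x).T
    # QP subproblem: min 1/2 s'As + g's  s.t.  C s + h(x) = 0  (solved via its KKT system)
    K = np.block([[A, C.T], [C, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([-gg, -h(x)]))
    s, lam = sol[:2], sol[2:]                   # step and new multiplier estimate
    x = x + s
    if np.linalg.norm(s) < 1e-8:                # convergence check
        break
print(x, lam)   # converges to (0.5, 0.5) with lambda = -1
```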

SQP refinements. For convergence of the Newton method, the Hessian of the Lagrangian must be positive definite. A line search along sk improves robustness; it is a special line search that uses a "merit function" to locate the best point. To avoid computation of Hessian information, quasi-Newton approaches (DFP, BFGS) can be used (these also ensure positive definiteness). For dealing with inequality constraints, various active set strategies exist; they operate either on the original problem or on the quadratic subproblem.

Comparison

Method                    AugLag   Zoutendijk   GRG   SQP
Feasible starting point?  No       Yes          Yes   No
Nonlinear constraints?    Yes      Yes          Yes   Yes
Equality constraints?     Yes      Hard         Yes   Yes
Uses active set?          Yes      Yes          No    Yes
Iterates feasible?        No       Yes          No    No
Derivatives needed?       Yes      Yes          Yes   Yes

SQP is generally seen as the best general-purpose method for constrained problems.

Contents Constrained Optimization: Optimality Criteria Constrained Optimization: Algorithms Linear programming

Linear programming problem. Linear objective and constraint functions: minimize cᵀx subject to Ax ≤ b (and typically x ≥ 0).
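For illustration (an addition to the transcript): a minimal sketch of solving a small LP with scipy's linprog; the coefficient values are assumed example data.

```python
from scipy.optimize import linprog

# Assumed example LP:
#   minimize   -x1 - 2*x2
#   subject to  x1 + x2   <= 4
#               x1 + 3*x2 <= 6
#               x1, x2 >= 0
c = [-1.0, -2.0]
A_ub = [[1.0, 1.0],
        [1.0, 3.0]]
b_ub = [4.0, 6.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimum (3, 1) with value -5, at a vertex of the feasible polyhedron
```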

Feasible domain. Linear constraints divide the design space into two convex half-spaces. Feasible domain = intersection of convex half-spaces; result: X = convex polyhedron.

Global optimality. Convex objective function on a convex feasible domain: a KKT point is a global optimum (unique if the objective is strictly convex). KKT conditions for the LP problem: c + Aᵀμ = 0, μ ≥ 0, μᵀ(Ax - b) = 0, Ax ≤ b.