MAE 552 – Heuristic Optimization Lecture 1 January 23, 2002.

The optimization problem is then: find values of the variables that minimize or maximize the objective function while satisfying the constraints. The standard form of the constrained optimization problem can be written as:

Minimize:    F(x)                                  objective function
Subject to:  g_j(x) ≤ 0,   j = 1, …, m             inequality constraints
             h_k(x) = 0,   k = 1, …, l             equality constraints
             x_i^lower ≤ x_i ≤ x_i^upper,  i = 1, …, n    side constraints

where x = (x_1, x_2, x_3, …, x_n) are the design variables.
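The standard form above can be sketched directly in code. This is a minimal illustration with made-up problem data (the objective, constraints, and bounds here are examples, not from the lecture): a feasibility check that tests all three constraint types at a point.

```python
# Illustrative two-variable problem in the standard constrained form.
# All problem data below is an assumed example.

def objective(x):
    # F(x): objective function to minimize
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def inequality_constraints(x):
    # g_j(x) <= 0, j = 1..m
    return [x[0] + x[1] - 4.0]          # g_1: x1 + x2 <= 4

def equality_constraints(x):
    # h_k(x) = 0, k = 1..l
    return [x[0] - x[1] + 1.0]          # h_1: x1 - x2 + 1 = 0

bounds = [(-5.0, 5.0), (-5.0, 5.0)]     # side constraints on x1, x2

def is_feasible(x, tol=1e-8):
    """Check side, inequality, and equality constraints at a point x."""
    in_bounds = all(lo <= xi <= hi for xi, (lo, hi) in zip(x, bounds))
    ineq_ok = all(g <= tol for g in inequality_constraints(x))
    eq_ok = all(abs(h) <= tol for h in equality_constraints(x))
    return in_bounds and ineq_ok and eq_ok

print(is_feasible([1.0, 2.0]))   # True: g_1 = -1 <= 0 and h_1 = 0
print(is_feasible([3.0, 3.0]))   # False: g_1 = 2 > 0
```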

Conditions for Optimality – Unconstrained Problems
1. ∇F(x) = 0 — the gradient of F(x) must vanish at the optimum.
2. The Hessian matrix must be positive definite (i.e., all eigenvalues positive at the optimum point).
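Both conditions can be verified numerically. The sketch below uses finite differences on an assumed example function, F(x) = x1² + 2·x2², whose minimum is at the origin; the step sizes and tolerances are illustrative choices.

```python
import numpy as np

def F(x):
    # Example objective (assumed for illustration); minimum at x = (0, 0)
    return x[0] ** 2 + 2.0 * x[1] ** 2

def gradient(x, h=1e-6):
    # Central-difference approximation of grad F
    g = np.zeros(len(x))
    for i in range(len(x)):
        e = np.zeros(len(x))
        e[i] = h
        g[i] = (F(x + e) - F(x - e)) / (2 * h)
    return g

def hessian(x, h=1e-4):
    # Central differences of the gradient give the Hessian columns
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (gradient(x + e) - gradient(x - e)) / (2 * h)
    return H

x_star = np.array([0.0, 0.0])
print(np.allclose(gradient(x_star), 0.0, atol=1e-6))    # condition 1: grad F = 0
print(np.all(np.linalg.eigvalsh(hessian(x_star)) > 0))  # condition 2: Hessian PD
```

Checking positive definiteness via the eigenvalues (`eigvalsh` for a symmetric matrix) mirrors the slide's statement that all eigenvalues must be positive at the optimum.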

Conditions for Optimality – Unconstrained Problems
A positive definite Hessian at the minimum ensures only that a local minimum has been found. The minimum is the global minimum only if the Hessian can be shown to be positive definite for all possible values of x, which would imply a convex design space. This is very hard to prove in practice!
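In practice one can at best gather evidence of convexity by sampling the Hessian over the design space. The sketch below does this for an assumed example with an analytic Hessian; a passing check over a finite grid only suggests convexity on the sampled region, it is not a proof.

```python
import numpy as np

def hessian(x):
    # Analytic Hessian of the example F(x1, x2) = x1^4 + x1^2 + x2^2
    # (assumed for illustration); it is positive definite for all x.
    return np.array([[12.0 * x[0] ** 2 + 2.0, 0.0],
                     [0.0, 2.0]])

def positive_definite_on_grid(points):
    # Evidence of convexity: Hessian eigenvalues positive at every sample
    return all(np.all(np.linalg.eigvalsh(hessian(p)) > 0) for p in points)

grid = [np.array([a, b])
        for a in np.linspace(-2.0, 2.0, 9)
        for b in np.linspace(-2.0, 2.0, 9)]
print(positive_definite_on_grid(grid))   # True for this example
```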

Conditions for Optimality – Constrained Problems
Kuhn-Tucker Conditions
1. x* is feasible.
2. λ_j g_j(x*) = 0, j = 1, …, m, with λ_j ≥ 0.
3. ∇F(x*) + Σ_{j=1}^{m} λ_j ∇g_j(x*) + Σ_{k=1}^{l} λ_{m+k} ∇h_k(x*) = 0
These conditions only guarantee that x* is a local optimum.
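The three conditions can be checked numerically at a candidate point. The sketch below uses an assumed example problem, minimize F(x) = x1² + x2² subject to g(x) = 1 − x1 − x2 ≤ 0, whose optimum is x* = (0.5, 0.5) with multiplier λ = 1.

```python
import numpy as np

# Example problem data (assumed for illustration)
def grad_F(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])   # gradient of F = x1^2 + x2^2

def g(x):
    return 1.0 - x[0] - x[1]                    # inequality constraint g <= 0

def grad_g(x):
    return np.array([-1.0, -1.0])

x_star = np.array([0.5, 0.5])
lam = 1.0

feasible = g(x_star) <= 1e-9                          # condition 1: x* feasible
complementary = abs(lam * g(x_star)) <= 1e-9          # condition 2: lambda*g = 0
nonnegative = lam >= 0.0                              #   with lambda >= 0
stationary = np.allclose(                             # condition 3: stationarity
    grad_F(x_star) + lam * grad_g(x_star), 0.0)

print(feasible, complementary, nonnegative, stationary)
```

At x* the constraint is active (g = 0), so complementary slackness holds with λ > 0, and ∇F = (1, 1) is exactly cancelled by λ∇g = (−1, −1).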

Conditions for Optimality – Constrained Problems
In addition to the Kuhn-Tucker conditions, two other conditions must be satisfied to guarantee a global optimum:
1. The Hessian must be positive definite for all x.
2. The constraints must be convex. A constraint is convex if the line segment connecting any two points in the feasible region always lies entirely within the feasible region of the design space.
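The segment definition of constraint convexity translates directly into a sample-based check: pick two feasible points and test points along the line between them. As with the Hessian grid check, a pass is only evidence, not a proof. The constraint below (the unit disk, a convex set) is an assumed example.

```python
import numpy as np

def g(x):
    # Example constraint (assumed): feasible region is the unit disk,
    # g(x) = x1^2 + x2^2 - 1 <= 0, which is a convex set.
    return x[0] ** 2 + x[1] ** 2 - 1.0

def segment_stays_feasible(a, b, samples=50, tol=1e-9):
    """Test feasibility at sample points on the segment from a to b."""
    for t in np.linspace(0.0, 1.0, samples):
        x = (1.0 - t) * a + t * b
        if g(x) > tol:
            return False
    return True

a = np.array([0.9, 0.0])    # two feasible points in the disk
b = np.array([0.0, 0.9])
print(segment_stays_feasible(a, b))   # True: the disk is convex
```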