Chapter 7 Optimization

Content: Introduction; One-dimensional unconstrained optimization; Multidimensional unconstrained optimization; Example.

Introduction (1) Root finding and optimization are closely related: root finding locates the zeros of a function, while optimization locates its extrema, which occur where the derivative is zero.

Introduction (2) An optimization problem: find x, which minimizes or maximizes f(x), subject to di(x) ≤ ai for i = 1, …, m and ei(x) = bi for i = 1, …, p, where x is an n-dimensional design vector, f(x) is the objective function, the di(x) are inequality constraints, the ei(x) are equality constraints, and the ai and bi are constants.

Introduction (3) Classification. Constrained optimization problems: 1) Linear programming: f(x) and the constraints are linear. 2) Quadratic programming: f(x) is quadratic and the constraints are linear. 3) Nonlinear programming: f(x) is nonlinear (not linear or quadratic) and/or the constraints are nonlinear. Unconstrained optimization problems have no constraints.

One-dimensional (1) We are interested in finding the absolute highest or lowest value of a function. A multimodal function has local optima in addition to the global one, so a search can get trapped at a local optimum.

One-dimensional (2) To distinguish a global minimum (or maximum) from local ones we can try: graphing the function to gain insight into its behavior; using randomly generated starting guesses and picking the best of the resulting optima as the global one (a multi-start strategy, sketched below); perturbing the starting point to see whether the routine returns a better point or the same local optimum.
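A minimal sketch of the multi-start strategy mentioned above: minimize from several random starting guesses and keep the best result. The test function, search range, and number of restarts are illustrative assumptions, and scipy's general-purpose minimizer stands in for whatever local routine is actually used.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical multimodal test function (illustrative only).
def f(x):
    return np.sin(3.0 * x[0]) + 0.1 * x[0] ** 2

rng = np.random.default_rng(0)
best = None
for _ in range(20):                      # 20 randomly generated starting guesses
    x0 = rng.uniform(-5.0, 5.0, size=1)  # random start in an assumed search range
    res = minimize(f, x0, method="Nelder-Mead")
    if best is None or res.fun < best.fun:
        best = res                       # keep the lowest minimum found so far

print("best x =", best.x, "f =", best.fun)
```

The same idea applies to maximization by keeping the largest value instead of the smallest.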

One-dimensional (3) Golden section search (for a unimodal function). A unimodal function has a single maximum or minimum in a given interval. Steps of the GSS: pick two points [xl, xu] that bracket the extremum; pick a third point within this interval to determine whether an extremum has occurred; then pick a fourth point to determine whether the extremum lies within the first three or the last three points. The key to making this approach efficient is choosing the intermediate points wisely, minimizing function evaluations by reusing old values instead of recomputing them.

One-dimensional (4) The GSS principle is to keep the ratio of the section lengths equal at each iteration; this ratio is the golden ratio (sqrt(5) - 1)/2 ≈ 0.618.

One-dimensional (5) Step I: Start with two initial guesses (xl, xu). Step II: Calculate the interior points x1 and x2 according to the golden ratio: d = ((sqrt(5) - 1)/2)(xu - xl), x1 = xl + d, x2 = xu - d. Step III: Evaluate the function at x1 and x2.

One-dimensional (6) Step IV: If f(x1) > f(x2), the minimum lies in [xl, x1], so set xu = x1; if f(x1) < f(x2), the minimum lies in [x2, xu], so set xl = x2. Step V: Calculate the new interior point; because of the golden ratio, one of the old interior points is reused, so only one new function evaluation is needed per iteration. Repeat until the bracketing interval is small enough. Try it with the following example!

One-dimensional (7) Ex: Use the GSS to find the minimum of the given function, assuming xl = 0 and xu = 4. Answer: x = 1.4276.
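Below is a minimal golden-section search sketch for this example. The objective in the code, f(x) = x^2/10 - 2 sin x, is an assumption (the slide's formula is missing); it is the common textbook function whose minimum on [0, 4] lies near x ≈ 1.4276, matching the stated answer.

```python
import math

def golden_section_min(f, xl, xu, tol=1e-5, max_iter=100):
    """Golden-section search for a minimum of a unimodal f on [xl, xu]."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0   # golden ratio, about 0.618
    d = phi * (xu - xl)
    x1, x2 = xl + d, xu - d              # interior points (x2 < x1)
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if f1 > f2:                      # minimum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2      # old x2 becomes the new x1
            d = phi * (xu - xl)
            x2 = xu - d
            f2 = f(x2)                   # only one new evaluation
        else:                            # minimum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1      # old x1 becomes the new x2
            d = phi * (xu - xl)
            x1 = xl + d
            f1 = f(x1)
        if (xu - xl) < tol:
            break
    return (xl + xu) / 2.0

# Assumed objective; minimum near x = 1.4276 on [0, 4].
f = lambda x: x**2 / 10.0 - 2.0 * math.sin(x)
print(golden_section_min(f, 0.0, 4.0))
```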

Multidimensional… (1) Techniques to find the minimum or maximum of a function of several variables. They are classified as: methods that require derivative evaluation (gradient, or descent/ascent, methods) and methods that do not require derivative evaluation (non-gradient, or direct, methods).

Multidimensional… (2)

Multidimensional… (3) DIRECT METHODS: Random search. Based on evaluating the function at randomly selected values of the independent variables. If a sufficient number of samples is taken, the optimum will eventually be located. Ex: Locate the maximum of the function f(x, y) = y - x - 2x^2 - 2xy - y^2 (a sketch follows).
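A minimal sketch of the random-search idea for the example above. The search box (x in [-2, 2], y in [1, 3]) is an assumed choice for illustration; the analytical maximum of this function is f(-1, 1.5) = 1.25.

```python
import numpy as np

def f(x, y):
    # Objective from the slide: f(x, y) = y - x - 2x^2 - 2xy - y^2
    return y - x - 2.0 * x**2 - 2.0 * x * y - y**2

rng = np.random.default_rng(1)
n_samples = 100_000
# Assumed search box: x in [-2, 2], y in [1, 3].
x = rng.uniform(-2.0, 2.0, n_samples)
y = rng.uniform(1.0, 3.0, n_samples)
vals = f(x, y)
i = np.argmax(vals)                      # keep the sample with the largest f
print(f"x = {x[i]:.3f}, y = {y[i]:.3f}, f = {vals[i]:.3f}")
# The true maximum is f(-1, 1.5) = 1.25.
```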

Multidimensional… (4)

Multidimensional… (5) Advantages: works even for discontinuous and nondifferentiable functions; finds the global optimum rather than a local one, given enough samples. Disadvantages: as the number of independent variables grows, the task becomes onerous; it is not efficient, since it does not account for the behavior of the underlying function.

Multidimensional… (6) Univariate and pattern searches. More efficient than random search and still does not require derivative evaluation. The basic strategy is to change one variable at a time while the other variables are held constant; the problem is thus reduced to a sequence of one-dimensional searches that can be solved by a variety of methods (a sketch follows). The search becomes less efficient as you approach the maximum.
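A hedged sketch of the univariate (one-variable-at-a-time) strategy, reusing the random-search example's objective and performing each one-dimensional maximization with scipy's bounded scalar minimizer. The starting point, bounds, and number of cycles are illustrative assumptions.

```python
from scipy.optimize import minimize_scalar

def f(x, y):
    # Same illustrative objective as the random-search example.
    return y - x - 2.0 * x**2 - 2.0 * x * y - y**2

x, y = 0.0, 1.0                          # assumed starting point
for _ in range(10):                      # a few cycles through the coordinates
    # Maximize over x with y fixed (minimize the negative).
    x = minimize_scalar(lambda t: -f(t, y), bounds=(-2.0, 2.0),
                        method="bounded").x
    # Maximize over y with x fixed.
    y = minimize_scalar(lambda t: -f(x, t), bounds=(1.0, 3.0),
                        method="bounded").x

print(f"x = {x:.4f}, y = {y:.4f}, f = {f(x, y):.4f}")   # approaches (-1, 1.5)
```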

Multidimensional… (7) Univariate and Pattern Searches

Multidimensional… (8) Gradient method. If f(x, y) is a two-dimensional function, the gradient vector tells us in what direction the steepest ascent lies and how much we gain by taking a step in that direction. The directional derivative of f(x, y) at the point x = a, y = b, in a direction making an angle θ with the x axis, is (∂f/∂x) cos θ + (∂f/∂y) sin θ, with the partial derivatives evaluated at (a, b); it is largest in the direction of the gradient.

Multidimensional… (9) Gradient method (cont'd). For n dimensions the gradient is ∇f = [∂f/∂x1, ∂f/∂x2, …, ∂f/∂xn]^T.
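As a small illustration, the sketch below approximates the n-dimensional gradient by central finite differences; the test function and step size are illustrative choices, not taken from the slides.

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Illustrative function (same one assumed for the steepest-ascent sketch below).
f = lambda v: 2*v[0]*v[1] + 2*v[0] - v[0]**2 - 2*v[1]**2
print(grad(f, [-1.0, 1.0]))      # approximately [6, -6] at the starting point
```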

Multidimensional… (10) Steepest ascent method. Use the gradient vector to update x: x_{i+1} = x_i + h ∇f(x_i), i.e., move along the h axis in the gradient direction, where the step size h is chosen to maximize f along that direction (a one-dimensional search in h).

Multidimensional… (11) Example: Maximize the given function starting from x = -1, y = 1. Answer: x = 2, y = 1 (a sketch follows).
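A minimal steepest-ascent sketch for this example. The objective f(x, y) = 2xy + 2x - x^2 - 2y^2 is an assumption (the slide's formula is missing); it is the standard textbook example that starts at (-1, 1) and has its maximum at (2, 1). The step size h is found at each iteration by a bounded one-dimensional search.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumed objective: f(x, y) = 2xy + 2x - x^2 - 2y^2
def f(v):
    x, y = v
    return 2*x*y + 2*x - x**2 - 2*y**2

def grad(v):
    x, y = v
    return np.array([2*y + 2 - 2*x, 2*x - 4*y])   # analytical gradient

v = np.array([-1.0, 1.0])                          # starting point from the slide
for _ in range(20):
    g = grad(v)
    if np.linalg.norm(g) < 1e-8:                   # converged
        break
    # One-dimensional search: pick the step h that maximizes f along g
    # (the search range for h is an assumed bound).
    h = minimize_scalar(lambda t: -f(v + t * g), bounds=(0.0, 2.0),
                        method="bounded").x
    v = v + h * g

print(f"x = {v[0]:.4f}, y = {v[1]:.4f}, f = {f(v):.4f}")   # expect (2, 1), f = 2
```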