Part 2 - Chapter 7: Optimization

Presentation transcript:

Copyright © The McGraw-Hill Companies, Inc. Permission required for reproduction or display. Part 2 - Chapter 7: Optimization

Optimization. Root finding and optimization are related: both involve guessing and searching for a point on a function.

The fundamental difference: root finding searches for the zeros of a function or functions, while optimization finds the minimum or maximum of a function of one or more variables.

[Figure 7.2]

We'll be looking at strategies to find the minimum of our functions of interest.

Functions of more than one variable are harder to visualize.


Each contour represents a constant value of f(x, y). We will be looking at strategies for both one-dimensional and multidimensional optimization.

Mathematical Background. An optimization or mathematical programming problem generally can be stated as: Find x, which minimizes or maximizes f(x), subject to di(x) ≤ ai (i = 1, 2, ..., m) and ei(x) = bi (i = 1, 2, ..., p), where x is an n-dimensional design vector, f(x) is the objective function, di(x) are inequality constraints, ei(x) are equality constraints, and ai and bi are constants.

Optimization problems can be classified on the basis of the form of f(x):
– If f(x) and the constraints are linear, we have linear programming.
– If f(x) is quadratic and the constraints are linear, we have quadratic programming.
– If f(x) is not linear or quadratic and/or the constraints are nonlinear, we have nonlinear programming.
When constraints are included, we have a constrained optimization problem; otherwise, it is an unconstrained optimization problem.

One-Dimensional Unconstrained Optimization. If no limit is placed on x, it can be hard to tell where the minima or maxima occur. In multimodal functions, both local and global optima can occur. Usually, we are interested in finding the absolute highest or lowest value of a function.

How do we distinguish a global optimum from local ones?
– By graphing, to gain insight into the behavior of the function.
– By using randomly generated starting guesses and picking the best of the optima found as the global one (see the sketch below).
– By perturbing the starting point to see if the routine returns a better point or the same local minimum.
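A minimal sketch of the multistart idea, using MATLAB's fminsearch on a hypothetical multimodal function (the objective, the interval for the random starts, and the loop count are illustrative assumptions, not from the slides):

```matlab
% Multistart sketch: run a local minimizer from several random starting
% guesses and keep the best (lowest) minimum found.
f = @(x) sin(x) + 0.1*x.^2;            % hypothetical multimodal objective
best_x = NaN;  best_f = Inf;
for k = 1:20
    x0 = 10*rand - 5;                  % random starting guess in [-5, 5]
    [xk, fk] = fminsearch(f, x0);      % local minimum found from this start
    if fk < best_f
        best_x = xk;  best_f = fk;     % keep the lowest minimum so far
    end
end
fprintf('best x = %.4f, f(x) = %.4f\n', best_x, best_f)
```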

Example 7.1. If an object such as a bungee jumper is projected upward at a specified velocity, and if it is subjected to linear drag, its altitude as a function of time can be computed as given by Eq. 7.1.

Copyright © The McGraw-Hill Companies, Inc. Permission required for reproduction or display. Determine the time and magnitude of the peak elevation 14 Given g = 9.81 m/s 2 z o =100 m v o = 55 m/s m = 80 kg c = 15 kg/s

Equation for the position of a bungee jumper (Eq. 7.1): z(t) = z0 + (m/c)(v0 + mg/c)(1 - e^(-(c/m)t)) - (mg/c)t

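The missing slides presumably showed the worked solution. As a hedged sketch (not the textbook's listing), the peak can be located numerically with fminbnd by minimizing the negative of Eq. 7.1, and checked analytically by setting dz/dt = 0, which gives t = (m/c) ln(1 + v0 c/(m g)), about 3.83 s with z about 192.9 m:

```matlab
% Example 7.1 (sketch): peak elevation of the bungee jumper from Eq. 7.1.
g = 9.81; z0 = 100; v0 = 55; m = 80; c = 15;                   % given data
z = @(t) z0 + m/c*(v0 + m*g/c)*(1 - exp(-c/m*t)) - m*g/c*t;    % Eq. 7.1

% fminbnd minimizes, so minimize -z(t) over a bracketing interval [0, 8] s
[tpeak, znegmin] = fminbnd(@(t) -z(t), 0, 8);
zpeak = -znegmin;
fprintf('t = %.4f s, z = %.4f m\n', tpeak, zpeak)              % ~3.83 s, ~192.86 m

% analytical check: dz/dt = 0  ->  t = (m/c)*log(1 + v0*c/(m*g))
t_analytic = m/c*log(1 + v0*c/(m*g));
```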

Golden-Section Search. A unimodal function has a single maximum or minimum in a given interval. For a unimodal function:
– First pick two points that will bracket your extremum, [xL, xu].
– Pick an additional third point within this interval to determine whether a minimum (or maximum) has occurred.
– Then pick a fourth point to determine whether the minimum (or maximum) lies within the first three or the last three points.
– The key to making this approach efficient is choosing the intermediate points wisely, so that function evaluations are minimized by replacing old values with new ones.


Golden Ratio

The first condition specifies that the sum of the two sublengths ℓ1 and ℓ2 must equal the original interval length: ℓ0 = ℓ1 + ℓ2. The second says that the ratio of the lengths must be equal: ℓ1/ℓ0 = ℓ2/ℓ1.


The distance from each end of the interval to its interior point is d = R (xu - xL), where R is the golden ratio derived below.
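Slides 25 and 26 presumably carried the algebra; reconstructing it from the two conditions just stated (standard golden-section material, not copied from the transcript):

\[
\ell_0 = \ell_1 + \ell_2, \qquad \frac{\ell_1}{\ell_0} = \frac{\ell_2}{\ell_1}.
\]

Letting \(R = \ell_2/\ell_1\) and substituting the first condition into the second gives

\[
\frac{1}{1+R} = R \;\Rightarrow\; R^2 + R - 1 = 0 \;\Rightarrow\; R = \frac{\sqrt{5}-1}{2} \approx 0.61803,
\]

so the interior points are placed a distance \(d = R\,(x_u - x_L)\) in from each end of the bracket.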

The method starts with two initial guesses, xL and xu, that bracket one local extremum of f(x). Next, two interior points x1 = xL + d and x2 = xu - d are chosen according to the golden ratio, with d = R (xu - xL). The function is evaluated at these two interior points.

Two results can occur:
– If f(x1) < f(x2), the domain of x to the left of x2, from xL to x2, can be eliminated because it does not contain the minimum. Then x2 becomes the new xL for the next round, and the old x1 becomes the new x2.
– If f(x2) < f(x1), the domain of x to the right of x1, from x1 to xu, is eliminated. In this case, x1 becomes the new xu for the next round, and the old x2 becomes the new x1.

The real benefit of the golden ratio: because the original x1 and x2 were chosen using it, we do not need to recalculate all of the function values for the next iteration; one interior point, and its function value, carries over.

Example 7.2: Golden-Section Search

How do we decide whether x1 or x2 is our best guess for the minimum?

What would it take to package this into a function?
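One possible answer, as a minimal sketch rather than the textbook's own listing (the function name goldsect and the stopping test on interval width are my assumptions):

```matlab
function [xopt, fopt] = goldsect(f, xL, xu, tol)
% GOLDSECT  Minimal golden-section search for a minimum (illustrative sketch).
% f is a function handle, [xL, xu] brackets the minimum, tol is the desired
% final interval width.
R  = (sqrt(5) - 1) / 2;                % golden ratio, ~0.61803
d  = R * (xu - xL);
x1 = xL + d;  x2 = xu - d;             % interior points
f1 = f(x1);   f2 = f(x2);
while (xu - xL) > tol
    d = R * d;
    if f1 < f2                         % minimum lies in (x2, xu]
        xL = x2;
        x2 = x1;  f2 = f1;             % reuse old x1 as the new x2
        x1 = xL + d;  f1 = f(x1);      % only one new function evaluation
    else                               % minimum lies in [xL, x1)
        xu = x1;
        x1 = x2;  f1 = f2;             % reuse old x2 as the new x1
        x2 = xu - d;  f2 = f(x2);
    end
end
if f1 < f2, xopt = x1; fopt = f1; else, xopt = x2; fopt = f2; end
end
```

A usage example with a hypothetical objective: [xmin, fmin] = goldsect(@(x) x.^2/10 - 2*sin(x), 0, 4, 1e-6).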

Parabolic Interpolation. Similar to golden-section search, but the method for improving our guess for the minimum or the maximum is different. (In the example shown in the textbook, the author uses parabolic interpolation to find a maximum.)

Just as two points can be used to determine the equation of a line, y = ax + b, three points can be used to determine the equation of a parabola (a second-order polynomial), y = ax² + bx + c.

From the parabolic fit: differentiate, set the result equal to zero, and solve for the root, which corresponds to the maximum (or minimum) of the parabola.
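The calculus step, written out (a standard result, not copied from the transcript): for the fitted parabola

\[
y = a x^2 + b x + c, \qquad \frac{dy}{dx} = 2 a x + b = 0 \;\Longrightarrow\; x^{*} = -\frac{b}{2a},
\]

so each iteration fits a parabola through the three current points and takes \(x^{*} = -b/(2a)\) as the improved estimate of the optimum.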


Example 7.3. You should be able to create appropriate code to implement parabolic (quadratic) interpolation.
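A hedged sketch of such code for a minimum, using polyfit for the three-point fit (the test function, the starting triplet, and the fixed ten iterations are illustrative assumptions, not Example 7.3 itself):

```matlab
% Parabolic interpolation for a minimum (illustrative sketch, not the
% textbook's listing). Assumes x1 < x2 < x3 with f(x2) below f(x1) and f(x3).
f = @(x) x.^2/10 - 2*sin(x);                      % hypothetical test function
x1 = 0; x2 = 1; x3 = 4;                           % assumed bracketing triplet
for k = 1:10
    p  = polyfit([x1 x2 x3], f([x1 x2 x3]), 2);   % fit y = a x^2 + b x + c
    x4 = -p(2) / (2*p(1));                        % vertex: dy/dx = 0
    if x4 > x2                                    % keep a bracketing triplet
        if f(x4) < f(x2), x1 = x2; x2 = x4;       % minimum between x2 and x3
        else,             x3 = x4;                % minimum between x1 and x4
        end
    else
        if f(x4) < f(x2), x3 = x2; x2 = x4;       % minimum between x1 and x2
        else,             x1 = x4;                % minimum between x4 and x3
        end
    end
end
fprintf('x ~ %.4f, f ~ %.4f\n', x2, f(x2))
```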

MATLAB Functions for 1-D Optimization. If you can find the derivative analytically, use fzero on the derivative (an optimum is a root of f'(x)). If you can't find the derivative analytically, use fminbnd, which uses a combination of the golden-section and parabolic techniques. To find a maximum, just change the sign of the function.

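Both routes, sketched for Example 7.1 (my own illustrative code, not the slide's screenshot; the derivative below follows from Eq. 7.1):

```matlab
% Two MATLAB routes to the peak elevation of Example 7.1 (sketch).
g = 9.81; z0 = 100; v0 = 55; m = 80; c = 15;
z    = @(t)  z0 + m/c*(v0 + m*g/c)*(1 - exp(-c/m*t)) - m*g/c*t;   % Eq. 7.1
dzdt = @(t) (v0 + m*g/c)*exp(-c/m*t) - m*g/c;                     % its derivative

% Route 1: derivative available -> root finding on dz/dt with fzero
t1 = fzero(dzdt, 3);                    % initial guess near the peak

% Route 2: no derivative needed -> bounded minimization with fminbnd
% (minimize -z to find the maximum)
t2 = fminbnd(@(t) -z(t), 0, 8);

fprintf('t = %.4f s (fzero), %.4f s (fminbnd), z = %.2f m\n', t1, t2, z(t2))
```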

Multidimensional Optimization

[Example figures for a two-variable function, including a slide titled "Just for fun".]

Example

This is an example of constrained optimization.


fminsearch performs unconstrained optimization.
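A minimal fminsearch sketch on an assumed two-variable quadratic (the slide's own example is not reproduced in the transcript):

```matlab
% Unconstrained multidimensional minimization with fminsearch (sketch).
% Assumed objective: f(x, y) = 2 + x - y + 2x^2 + 2xy + y^2
f  = @(v) 2 + v(1) - v(2) + 2*v(1)^2 + 2*v(1)*v(2) + v(2)^2;
v0 = [-0.5, 0.5];                          % starting guess [x0, y0]
[vopt, fopt] = fminsearch(f, v0);          % Nelder-Mead simplex search
fprintf('x = %.4f, y = %.4f, f = %.4f\n', vopt(1), vopt(2), fopt)
```

For this particular quadratic the minimum comes out near x = -1, y = 1.5 with f = 0.75.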

Summary. We looked at both analytical and numerical techniques to find maximums and minimums for one-dimensional and multidimensional problems.

Summary:
– Analytical techniques, based on finding the derivative and setting it equal to zero
– Golden-section search technique
– Parabolic interpolation technique
– fzero (which requires that you be able to find the analytical derivative, since fzero is a root-finding function)
– fminbnd (constrained to an interval, one-dimensional)
– fminsearch (unconstrained, multidimensional)