Copyright © 2006 The McGraw-Hill Companies, Inc. Permission required for reproduction or display.

One-Dimensional Unconstrained Optimization (Chapter 13)
Credit: Prof. Lale Yurttas, Chemical Eng., Texas A&M University

Mathematical Background

An optimization (or mathematical programming) problem is generally stated as: find x, which minimizes or maximizes f(x) subject to

    d_i(x) ≤ a_i,   i = 1, 2, …, m        (*)
    e_i(x) = b_i,   i = 1, 2, …, p        (*)

where x is an n-dimensional design vector, f(x) is the objective function, the d_i(x) are inequality constraints, the e_i(x) are equality constraints, and the a_i and b_i are constants.

Optimization problems can be classified on the basis of the form of f(x):
– If f(x) and the constraints are linear, we have linear programming.
– If f(x) is quadratic and the constraints are linear, we have quadratic programming.
– If f(x) is not linear or quadratic and/or the constraints are nonlinear, we have nonlinear programming.

When the constraints (*) are included, we have a constrained optimization problem; otherwise, it is an unconstrained optimization problem.

One-Dimensional Unconstrained Optimization

How do we look for the global optimum?
– By graphing, to gain insight into the behavior of the function.
– By using randomly generated starting guesses and picking the largest (or smallest) of the optima found.
– By perturbing the starting point to see whether the routine returns a better point.

Root finding and optimization are related: both involve guessing and searching for a point on a function. The difference:
– Root finding searches for the zeros of a function.
– Optimization searches for a minimum or a maximum of a function.

In multimodal functions, both local and global optima can occur. We are mostly interested in the global optimum: the absolute highest or lowest value of the function.
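The multistart strategy above can be sketched in a few lines of Python. This is a minimal illustration, not the textbook's algorithm: the name `multistart_max` and the simple step-halving local search are my own choices.

```python
import math
import random

def multistart_max(f, lo, hi, n_starts=20, seed=1):
    """Crude global strategy: many random starts, each refined by a
    step-halving local search; return the best point found."""
    rng = random.Random(seed)
    best_x, best_f = None, -math.inf
    for _ in range(n_starts):
        x = rng.uniform(lo, hi)          # random starting guess
        step = (hi - lo) / 10
        while step > 1e-6:
            # try a move left or right; if neither improves, shrink the step
            for cand in (x - step, x + step):
                if lo <= cand <= hi and f(cand) > f(x):
                    x = cand
                    break
            else:
                step /= 2
        if f(x) > best_f:                # keep the largest of the local optima
            best_x, best_f = x, f(x)
    return best_x, best_f

# Example: the chapter's test function f(x) = 2 sin x - x^2/10 on [0, 4]
f = lambda x: 2 * math.sin(x) - x**2 / 10
best_x, best_f = multistart_max(f, 0.0, 4.0)
```

Each random start is refined locally; the best of the refined points is reported as the candidate global maximum.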

Golden Ratio

A unimodal function has a single maximum (or minimum) in a given interval. For a unimodal function:
– First pick two points [x_l, x_u] that bracket the extremum.
– Then pick two more points within this interval to determine whether the maximum occurs within the first three or the last three points.

Golden Ratio

The Parthenon in Athens, Greece, was constructed in the 5th century B.C. Its front dimensions can be fit exactly within a golden rectangle.

Golden-Section Search

1. Pick two initial guesses, x_l and x_u, that bracket one local extremum of f(x).
2. Choose two interior points x_1 and x_2 according to the golden ratio:

       d = ((√5 − 1)/2) (x_u − x_l),   x_1 = x_l + d,   x_2 = x_u − d

3. Evaluate the function at x_1 and x_2:
   – If f(x_1) > f(x_2), the domain of x to the left of x_2 (from x_l to x_2) does not contain the maximum and can be eliminated; x_2 becomes the new x_l, and the old x_1 becomes the new x_2.
   – If f(x_2) > f(x_1), the domain of x to the right of x_1 (from x_1 to x_u) can be eliminated; x_1 becomes the new x_u, and the old x_2 becomes the new x_1.
4. Repeat until the stopping criterion |x_u − x_l| < ε is satisfied.

The benefit of using the golden ratio is that we do not need to recalculate all the function values in the next iteration: one interior point (and its function value) is reused.

[Fig. 13.5]

Finding a Maximum Through Quadratic Interpolation

If we have three points that jointly bracket a maximum (or minimum), then:
1. Fit a parabola to the three points (using the Lagrange approach) to approximate f(x).
2. Solve df/dx = 0 for the vertex of the parabola; this is the estimate of the optimum point.
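Setting the derivative of the fitted parabola to zero gives a closed-form expression for its vertex, which can be sketched directly. The function name `parabola_vertex` is mine, and the bracketing points 0, 1, 4 below are an illustrative choice for the chapter's test function:

```python
import math

def parabola_vertex(f, x0, x1, x2):
    """x-coordinate of the vertex of the parabola through
    (x0, f(x0)), (x1, f(x1)), (x2, f(x2)) -- where its derivative is zero."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2) + f2 * (x0**2 - x1**2)
    den = 2 * f0 * (x1 - x2) + 2 * f1 * (x2 - x0) + 2 * f2 * (x0 - x1)
    return num / den

# One interpolation step on f(x) = 2 sin x - x^2/10, bracketed by 0, 1, 4
f = lambda x: 2 * math.sin(x) - x**2 / 10
x3 = parabola_vertex(f, 0.0, 1.0, 4.0)
```

A single step already lands near the true optimum; iterating (replacing one of the three bracketing points with the new estimate x3) drives the estimate closer at each pass.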

Example (from the textbook)

Use the golden-section search to find the maximum of f(x) = 2 sin x − x²/10 within the interval x_l = 0 and x_u = 4.
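The search for this example can be sketched as follows. This is a minimal maximization version; the name `golden_section_max` and the tolerance are my own choices, not the textbook's.

```python
import math

GOLDEN = (math.sqrt(5) - 1) / 2   # golden ratio R ≈ 0.61803

def golden_section_max(f, xl, xu, tol=1e-6):
    """Golden-section search for the maximum of a unimodal f on [xl, xu]."""
    d = GOLDEN * (xu - xl)
    x1, x2 = xl + d, xu - d        # interior points, x1 > x2
    f1, f2 = f(x1), f(x2)
    while (xu - xl) > tol:
        if f1 > f2:
            # maximum lies in [x2, xu]; old x1 is reused as the new x2
            xl, x2, f2 = x2, x1, f1
            x1 = xl + GOLDEN * (xu - xl)
            f1 = f(x1)
        else:
            # maximum lies in [xl, x1]; old x2 is reused as the new x1
            xu, x1, f1 = x1, x2, f2
            x2 = xu - GOLDEN * (xu - xl)
            f2 = f(x2)
    return (xl + xu) / 2

f = lambda x: 2 * math.sin(x) - x**2 / 10
xopt = golden_section_max(f, 0.0, 4.0)
fopt = f(xopt)
```

Because x_1 and x_2 divide the interval by the golden ratio, the surviving interior point of each iteration already sits at the correct position in the shrunken interval, so only one new function evaluation is needed per step. For this example the search converges to x ≈ 1.4276 with f(x) ≈ 1.7757.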

Newton's Method

An approach similar to the Newton-Raphson root-finding method can be used to find an optimum of f(x) by finding the root of f'(x), i.e., by solving f'(x) = 0:

    x_{i+1} = x_i − f'(x_i) / f''(x_i)

Disadvantage: the iteration may diverge if the starting point is not close enough to the optimum.

If it is difficult to find f'(x) and f''(x) analytically, a secant-like version of Newton's technique can be developed by using finite-difference approximations for the derivatives.

Example 13.3

Use Newton's method to find the maximum of f(x) = 2 sin x − x²/10 with an initial guess of x_0 = 2.5.

Solution: iterate x_{i+1} = x_i − f'(x_i)/f''(x_i), tabulating at each step:

    i    x    f(x)    f'(x)    f''(x)
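The iteration in this example can be sketched as follows. This is a minimal version: the function name `newton_opt` and the stopping tolerance are my own choices, and the derivatives of f(x) = 2 sin x − x²/10 are supplied analytically.

```python
import math

def newton_opt(df, d2f, x0, tol=1e-8, max_iter=50):
    """Find a stationary point of f by applying Newton-Raphson to f'(x):
    x_{i+1} = x_i - f'(x_i)/f''(x_i)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = 2 sin x - x^2/10, so f'(x) = 2 cos x - x/5 and f''(x) = -2 sin x - 1/5
df  = lambda x: 2 * math.cos(x) - x / 5
d2f = lambda x: -2 * math.sin(x) - 0.2
xopt = newton_opt(df, d2f, 2.5)
```

Starting from x_0 = 2.5, the iterates converge to x ≈ 1.4276 (where f''(x) < 0, confirming a maximum), the same optimum located by the golden-section search.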