L23 Numerical Methods part 3

Agenda: Project; Homework Review; Steepest Descent Algorithm; Summary; Test 4 results.

H22 answers: optimum solution = 0.444; min value = 0.0494; interval of uncertainty = 0.889; number of function evaluations = 6.

H22 cont'd: for iterations 2 onward, the interval is reduced to 0.888/1.333 ≈ 67% of its previous length at the cost of 2 function evaluations per iteration. This suggests a measure of efficiency: how much of the interval is eliminated per function evaluation.

H22 (Golden Section): optimum solution = 0.472; min value = 0.0124; interval of uncertainty = 0.764; number of function evaluations = 5. For iterations 2 onward, the interval is reduced to 61.8% of the previous interval I at the cost of only 1 function evaluation. By this measure of efficiency, Golden Section is best.
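
A quick check of that efficiency measure (a sketch; the per-evaluation reduction factor is my framing, while the 67%-per-2-evaluations and 61.8%-per-evaluation figures come from the slides):

```python
# Fraction of the interval retained per function evaluation (smaller = better).
alt_equal = (0.888 / 1.333) ** 0.5   # ~67% per 2 evals -> ~0.816 per eval
golden = 0.618                        # 61.8% per single evaluation
print(f"alternate equal interval: {alt_equal:.3f} kept per evaluation")
print(f"golden section:           {golden:.3f} kept per evaluation")
```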

Search algorithm? 1. Find a search direction; 2. Find the best step size α in that direction; 3. Repeat steps 1 and 2 until "done".
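
In symbols, this is the standard two-phase update (the notation is assumed; it does not appear on the slide):

```latex
x^{(k+1)} = x^{(k)} + \alpha_k \, d^{(k)}, \qquad k = 0, 1, 2, \ldots
```

where d^(k) is the search direction from step 1 and α_k is the step size from step 2.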

Unimodal functions in a "locale": monotonically decreasing then monotonically increasing (for a minimum), or monotonically increasing then monotonically decreasing (for a maximum). Figure 10.4: Unimodal function f(α).

Review: Step Size Methods
"Analytical": search direction = (−) gradient (i.e., a line search); find f′(α) = 0 with f″(α) ≥ 0
Region elimination ("interval reducing"): equal interval, alternate equal interval, Golden Section
Others: Newton-Raphson, successive quadratic interpolation
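
For the "analytical" case, f′(α) = 0 has a closed-form solution when f is quadratic. A minimal sketch (the matrix H, vector b, and starting point x below are made-up illustrative values):

```python
import numpy as np

# "Analytical" step size: for a quadratic f(x) = 0.5*x^T H x + b^T x,
# the line-search condition f'(alpha) = 0 can be solved in closed form.
H = np.array([[2.0, 0.0],
              [0.0, 4.0]])
b = np.array([-2.0, -4.0])
x = np.array([0.0, 0.0])

c = H @ x + b                      # gradient at x
d = -c                             # search direction (negative gradient)
alpha = -(c @ d) / (d @ H @ d)     # solves f'(alpha) = c.d + alpha*d^T H d = 0
# f''(alpha) = d^T H d >= 0 here, so this alpha is indeed a minimizer.
print(alpha, x + alpha * d)        # ~0.278, new point ~[0.556, 1.111]
```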

Successive Alternate Equal Interval: assume the bounding phase has bracketed the minimum. The minimum can be on either side of an interior point, because we only have point values of f, not the whole curve; but for sure it is not in the eliminated region! A sketch follows.
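
A sketch of one common version of equal-interval region elimination, consistent with the 67%-per-2-evaluations reduction seen in H22 (the function name and tolerance are illustrative, not from the slides):

```python
def alternate_equal_interval(f, a, b, tol=1e-3):
    """Region elimination with two interior points at thirds of [a, b].

    Each pass keeps the 2/3 of the interval that must still contain the
    minimum, so the interval shrinks to ~67% per 2 function evaluations.
    """
    n_evals = 0
    while b - a > tol:
        x1 = a + (b - a) / 3.0
        x2 = a + 2.0 * (b - a) / 3.0
        f1, f2 = f(x1), f(x2)
        n_evals += 2
        if f1 < f2:
            b = x2          # for unimodal f, the minimum is not in (x2, b]
        else:
            a = x1          # ... and here it is not in [a, x1)
    return (a + b) / 2, n_evals
```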

Golden Section. Figure 10.9: Graphic of a section partition.
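
A minimal sketch of Golden Section search (assuming a unimodal f already bracketed on [a, b]; names and tolerance are illustrative). Each iteration shrinks the interval to ≈61.8% of its length for only one new function evaluation, because one interior point is reused; that is what makes the method so efficient:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden Section search for the minimum of a unimodal f on [a, b]."""
    tau = (math.sqrt(5) - 1) / 2            # ~0.618, the golden ratio conjugate
    x1, x2 = b - tau * (b - a), a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    n_evals = 2
    while b - a > tol:
        if f1 > f2:                          # minimum must lie in [x1, b]
            a, x1, f1 = x1, x2, f2           # old x2 becomes the new x1
            x2 = a + tau * (b - a)
            f2 = f(x2)                       # only ONE new evaluation
        else:                                # minimum must lie in [a, x2]
            b, x2, f2 = x2, x1, f1           # old x1 becomes the new x2
            x1 = b - tau * (b - a)
            f1 = f(x1)                       # only ONE new evaluation
        n_evals += 1
    return (a + b) / 2, n_evals

# e.g. golden_section(lambda a: (a - 0.5) ** 2, 0.0, 1.0) -> (~0.5, n_evals)
```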

Descent Algorithm? If the search direction d satisfies the descent condition cᵀd < 0 (where c = ∇f), then descent is guaranteed!
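
A one-line first-order Taylor argument (standard, though not spelled out on the slide) shows why:

```latex
f(x + \alpha d) \approx f(x) + \alpha \, c^T d < f(x)
\quad \text{whenever } c^T d < 0 \text{ and } \alpha > 0 \text{ is small.}
```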

Steepest descent algorithm: how does it work? Take the search direction to be the negative gradient, d = −c = −∇f(x), then do a 1-D search for the step size α that minimizes f(x + αd); see the sketch below.
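
A minimal Python sketch combining the negative-gradient direction with a Golden Section line search (the step-size bracket alpha_hi, the tolerances, and the example function are assumptions for illustration, not from the slides):

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha_hi=2.0, eps=1e-6, max_iter=100):
    """Steepest descent: d = -grad f(x), step size by Golden Section search.

    alpha_hi is an assumed upper bound on the step (a real implementation
    would run a bounding phase first). Stops when ||c|| <= eps.
    """
    tau = (np.sqrt(5.0) - 1) / 2

    def golden(phi, a, b, tol=1e-8):
        # Golden Section line search on [a, b] (see the earlier sketch)
        x1, x2 = b - tau * (b - a), a + tau * (b - a)
        p1, p2 = phi(x1), phi(x2)
        while b - a > tol:
            if p1 > p2:
                a, x1, p1 = x1, x2, p2
                x2 = a + tau * (b - a); p2 = phi(x2)
            else:
                b, x2, p2 = x2, x1, p1
                x1 = b - tau * (b - a); p1 = phi(x1)
        return (a + b) / 2

    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        c = grad(x)                      # gradient at the current point
        if np.linalg.norm(c) <= eps:     # stopping criterion: ||c|| ~ 0
            break
        d = -c                           # steepest-descent direction
        alpha = golden(lambda a: f(x + a * d), 0.0, alpha_hi)
        x = x + alpha * d                # take the step
    return x

# Illustrative use on f(x) = x1^2 + 2*x2^2 (a made-up test function):
f = lambda x: x[0] ** 2 + 2 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
print(steepest_descent(f, grad, [1.0, 1.0]))   # -> approximately [0, 0]
```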

“Modified” Steepest-Descent Algorithm

Ex 10.4 Use Solver to find α*

Ex 10.4 (cont'd): when ||c|| = 0, the stopping criterion is met. Done!

H22, Prob. 10.52: Let's use SteepDescentTemplate.xls to set up Prob. 10.52 and solve it.

Summary
Step size methods: analytical, region elimination; Golden Section is very efficient.
Algorithms include stopping criteria (||c||, Δf).
Steepest descent algorithm: convergence is assured, but it takes lots of function evaluations (in the line search); each iteration is independent of previous moves (i.e., totally "local"); successive iterations slow down and may stall.