Exam 1
- Oct 3, closed book
- Place: ITE 119; Time: 12:30-1:45 pm
- One double-sided cheat sheet (8.5 in x 11 in) allowed
- Bring your calculator to the exam
- Covers Chapters 1-11: error analysis, Taylor series, roots of equations, linear systems
(Numerical Methods, Lecture 8, Prof. Jinbo Bi, CSE, UConn)
Today's class
- Optimization
- One-dimensional unconstrained optimization
(Numerical Methods, Fall 2011, Lecture 10)
Optimization
- Given a function f(x1, x2, x3, ..., xn), find the set of values that minimizes or maximizes the function
- Examples: lowering power usage in a circuit while maximizing speed; least-cost management of a supply chain
Optimization
- Optimizing a differentiable function means setting the derivative f'(x1, x2, x3, ..., xn) to zero and finding solutions to that equation
- At such a point, if f'' > 0 it is a minimum, and if f'' < 0 it is a maximum
- To find where the derivative is zero, we can use the root-finding techniques we discussed before
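A brief worked example (not from the slides) of the derivative test in one dimension:

```latex
f(x) = x^{3} - 3x, \qquad f'(x) = 3x^{2} - 3 = 0 \;\Rightarrow\; x = \pm 1, \qquad f''(x) = 6x.
```

Since f''(1) = 6 > 0, the point x = 1 is a local minimum; since f''(-1) = -6 < 0, the point x = -1 is a local maximum.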
Optimization
- Finding the roots of the derivative requires knowing what the derivative is
- Often the function has no convenient closed form, so you cannot derive the derivative analytically
- This requires a finite-difference approximation of the derivative
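A sketch of this idea (the function and interval are illustrative choices, not from the slides): approximate f' by a central difference, then locate its zero with bisection, and use a second-difference sign test to classify the point.

```python
def dfdx(f, x, h=1e-6):
    """Central-difference approximation of f'(x), O(h^2) accurate."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def bisect_root(g, a, b, tol=1e-10, max_iter=200):
    """Bisection for a root of g on [a, b]; assumes g(a) and g(b) differ in sign."""
    ga = g(a)
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        gm = g(m)
        if ga * gm <= 0.0:
            b = m
        else:
            a, ga = m, gm
        if b - a < tol:
            break
    return 0.5 * (a + b)

f = lambda x: x**2 - 4.0 * x + 3.0        # illustrative function; minimum at x = 2
xstar = bisect_root(lambda x: dfdx(f, x), 0.0, 5.0)

# Second-difference sign test: f''(x) ~ (f(x+h) - 2 f(x) + f(x-h)) / h^2 > 0 => minimum
h = 1e-4
curv = (f(xstar + h) - 2.0 * f(xstar) + f(xstar - h)) / h**2
```

Note that h cannot be made arbitrarily small: round-off error in the differences eventually dominates the truncation error.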
Optimization
General optimization problem: find x = (x1, x2, ..., xn) that minimizes or maximizes f(x), subject to
  d_i(x) <= a_i,  i = 1, ..., m   (inequality constraints)
  e_i(x) = b_i,   i = 1, ..., p   (equality constraints)
Optimization
- Unconstrained optimization: no constraints
- Constrained optimization: degrees of freedom = n - p - m
- Under-constrained (p + m <= n): solution possible
- Over-constrained (p + m > n): solution unlikely
Optimization
- Linear programming: the objective function and the constraints are all linear
- Quadratic programming: the objective function is quadratic and the constraints are linear
- Nonlinear programming: the objective function is not linear or quadratic, and/or the constraints are nonlinear
Optimization
Dimensionality:
- One-dimensional: the function depends on a single independent variable
- Multi-dimensional: the function depends on more than one variable
Optimization
- How do you know that the minimum or maximum you found is the global minimum or maximum?
- Multimodal functions have several local extrema
Optimization
How do you find the global extremum?
- Use graphical methods to gain insight
- Use multiple initial guesses to find several local extrema, then pick the best as the global one
- Slightly perturb the initial guesses to see if the extremum changes
- For large multi-dimensional problems it is sometimes too difficult to guarantee that you have found the global extremum
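The multiple-guess strategy can be sketched as follows. The multimodal test function and the simple step-halving local search are illustrative choices, not from the slides:

```python
import math

def f(x):
    # Illustrative multimodal function with several local maxima
    return math.sin(x) + math.sin(3.0 * x) / 3.0

def local_max(f, x0, step=0.5, tol=1e-9):
    """Crude local maximizer: hill-climb with a step that halves on failure."""
    x = x0
    while step > tol:
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            step *= 0.5
    return x

# Launch the local search from many starting guesses and keep the best result.
starts = [0.5 * i for i in range(21)]        # guesses spread over [0, 10]
candidates = [local_max(f, x0) for x0 in starts]
x_global = max(candidates, key=f)
```

Each start converges to some local maximum; taking the best over all starts gives a good (though not guaranteed) estimate of the global maximum.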
One-dimensional Unconstrained Optimization
Similar to root-finding:
- Bracketing methods
  - Golden-section search
  - Quadratic interpolation
- Open methods
  - Newton's method
Golden-section search
- Also called golden-ratio search
- Start with an interval that brackets the maximum
Golden-section search
- Assume that the function is unimodal over the interval
Golden-section search
- Pick two interior points in the interval using the golden ratio
Golden-section search
- Two possibilities for discarding part of the interval, depending on which interior point has the larger function value
Golden-section search
- Example: finding a maximum (worked-example figures omitted)
Golden-section search
Error analysis:
- The worst-case error occurs when the true optimum lies at the far end of the retained subinterval
- For example, take the case where the optimum falls in the upper subinterval but its true location is at that subinterval's left end
Golden-section search
- Considering the worst case, the relative error after an iteration is ea = (1 - R) * |(xu - xl) / xopt| * 100%, where R ~ 0.618 is the golden ratio conjugate
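A sketch of the method in code. The test function f(x) = 2 sin x - x^2/10 on [0, 4] is a standard textbook example, assumed here for illustration; the stopping test uses the worst-case relative error above:

```python
import math

R = (math.sqrt(5.0) - 1.0) / 2.0   # golden ratio conjugate, ~0.618

def golden_section_max(f, xl, xu, tol=1e-5, max_iter=200):
    """Golden-section search for a maximum of a unimodal f on [xl, xu].

    tol is the worst-case relative error in percent.
    """
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d           # interior points, x2 < x1
    f1, f2 = f(x1), f(x2)
    xopt = x1 if f1 > f2 else x2
    for _ in range(max_iter):
        if f1 > f2:                   # maximum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1
            x1 = xl + R * (xu - xl)
            f1 = f(x1)
        else:                         # maximum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2
            x2 = xu - R * (xu - xl)
            f2 = f(x2)
        xopt = x1 if f1 > f2 else x2
        ea = (1.0 - R) * abs((xu - xl) / xopt) * 100.0  # worst-case error, %
        if ea < tol:
            break
    return xopt, f(xopt)

f = lambda x: 2.0 * math.sin(x) - x**2 / 10.0   # assumed test function
xopt, fopt = golden_section_max(f, 0.0, 4.0)
```

Only one new function evaluation is needed per iteration, because one interior point is reused; the bracket shrinks by the factor R each time.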
Quadratic interpolation
- Use a second-order polynomial as an approximation of the function near the optimum
Quadratic interpolation
- Fit a parabola through three points x0, x1, x2; its maximum is at
  x3 = [f(x0)(x1^2 - x2^2) + f(x1)(x2^2 - x0^2) + f(x2)(x0^2 - x1^2)] / [2 f(x0)(x1 - x2) + 2 f(x1)(x2 - x0) + 2 f(x2)(x0 - x1)]
- Use this formula to get a new guess, then discard one of the three points, as in golden-section search
- x3 is the value of x that corresponds to the maximum of the quadratic function
Quadratic interpolation
- Example: finding a maximum (worked-example figures omitted)
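A sketch of successive quadratic (parabolic) interpolation for a maximum. The test function f(x) = 2 sin x - x^2/10 and the initial points 0, 1, 4 are a textbook-style example assumed for illustration:

```python
import math

def f(x):
    return 2.0 * math.sin(x) - x**2 / 10.0   # assumed test function

def quadratic_max(f, x0, x1, x2, tol=1e-7, max_iter=100):
    """Successive parabolic interpolation for a maximum, keeping three points."""
    for _ in range(max_iter):
        f0, f1v, f2v = f(x0), f(x1), f(x2)
        num = f0 * (x1**2 - x2**2) + f1v * (x2**2 - x0**2) + f2v * (x0**2 - x1**2)
        den = 2.0 * f0 * (x1 - x2) + 2.0 * f1v * (x2 - x0) + 2.0 * f2v * (x0 - x1)
        x3 = num / den                       # vertex of the fitted parabola
        if abs(x3 - x1) < tol:
            return x3
        # Discard one of the three points, as in golden-section search
        if x3 > x1:
            if f(x3) > f1v:
                x0, x1 = x1, x3
            else:
                x2 = x3
        else:
            if f(x3) > f1v:
                x2, x1 = x1, x3
            else:
                x0 = x3
    return x1

xq = quadratic_max(f, 0.0, 1.0, 4.0)
```

Starting from (0, 1, 4), the first parabola's vertex lands near x = 1.5055, and the iterates then close in on the true maximum; convergence is faster than golden-section when the function is smooth near the optimum.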
Newton's Method
- Newton-Raphson can be used to find the root of a function
- To find an optimum, apply Newton-Raphson to the derivative, since an optimum is a root of f'(x) = 0: x_{i+1} = x_i - f'(x_i) / f''(x_i)
Newton's Method
- Example: finding a maximum (worked-example figures omitted)
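Newton's method applied to the assumed test function f(x) = 2 sin x - x^2/10, with analytic first and second derivatives and a starting guess of x0 = 2.5 (a textbook-style setup):

```python
import math

def f(x):
    return 2.0 * math.sin(x) - x**2 / 10.0   # assumed test function

def fp(x):
    return 2.0 * math.cos(x) - x / 5.0       # f'(x)

def fpp(x):
    return -2.0 * math.sin(x) - 0.2          # f''(x)

def newton_opt(x, tol=1e-10, max_iter=50):
    """Newton iteration for an optimum: x <- x - f'(x)/f''(x)."""
    for _ in range(max_iter):
        dx = fp(x) / fpp(x)
        x -= dx
        if abs(dx) < tol:
            break
    return x

xopt = newton_opt(2.5)
# fpp(xopt) < 0 confirms the stationary point is a maximum
```

Convergence is quadratic near the optimum, but the iteration has no bracket and can diverge from a poor starting guess, as the next slide notes.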
Newton's Method
- Like Newton-Raphson for roots, Newton's method for finding optima may also diverge
- If the derivatives are not available, secant-like methods based on finite-difference approximations can be used
- Usually it is applied only near the optimum
- Hybrid methods: use a bracketing method to get near the optimum, then an open method to converge quickly
Next class
- Multi-dimensional optimization
- Read Chapters 13 and 14