Optimization Part II
G. Anuradha
Review of previous lecture: Steepest Descent
Choose the next step so that the function decreases: $F(x_{k+1}) < F(x_k)$. For small changes in $x$ we can approximate $F(x)$ by a first-order expansion:
$$F(x_{k+1}) = F(x_k + \Delta x_k) \approx F(x_k) + g_k^T \Delta x_k,$$
where $g_k \equiv \nabla F(x)\big|_{x = x_k}$ and $\Delta x_k = \alpha_k p_k$. If we want the function to decrease, we need $g_k^T \Delta x_k = \alpha_k\, g_k^T p_k < 0$. We can maximize the decrease by choosing $p_k = -g_k$, which gives the steepest descent update $x_{k+1} = x_k - \alpha_k g_k$.
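To make the update concrete, here is a minimal Python sketch of steepest descent with a fixed learning rate; the quadratic objective, starting point, and value of alpha are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Assumed example quadratic F(x) = 0.5 * x^T A x + d^T x  (not from the slides)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
d = np.array([-1.0, -1.0])

def grad_F(x):
    """Gradient of the quadratic: g = A x + d."""
    return A @ x + d

def steepest_descent(x0, alpha=0.1, iters=50):
    """Steepest descent update: x_{k+1} = x_k - alpha * g_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - alpha * grad_F(x)
    return x

print(steepest_descent([2.0, 2.0]))   # converges toward the minimizer -A^{-1} d = [1/3, 1/3]
```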
Example
Plot
Necessary and sufficient conditions for a minimum of a single-variable function
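As a reminder of the standard statements for a smooth single-variable function $F$ (these are standard calculus facts, not copied from the slide):

```latex
% First-order necessary condition for x^* to be a local minimum:
F'(x^*) = 0
% Second-order sufficient conditions for a strong local minimum:
F'(x^*) = 0 \quad \text{and} \quad F''(x^*) > 0
```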
Functions with two variables
Necessary conditions: both first partial derivatives of F vanish, i.e. the gradient is zero at the candidate point. Sufficient conditions: the gradient is zero and the Hessian matrix of second partial derivatives is positive definite.
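A small numerical sketch of checking these conditions; the example function, its derivatives, and the candidate point are assumptions made for illustration and do not come from the slides.

```python
import numpy as np

def F(x):
    # Assumed example: F(x1, x2) = x1^2 + 2*x1*x2 + 4*x2^2
    return x[0]**2 + 2*x[0]*x[1] + 4*x[1]**2

def gradient(x):
    return np.array([2*x[0] + 2*x[1], 2*x[0] + 8*x[1]])

def hessian(x):
    return np.array([[2.0, 2.0],
                     [2.0, 8.0]])

x_star = np.array([0.0, 0.0])
grad_zero = np.allclose(gradient(x_star), 0.0)                # necessary condition
hess_pos_def = np.all(np.linalg.eigvalsh(hessian(x_star)) > 0)  # sufficient condition
print(grad_zero, hess_pos_def)   # True True -> x_star is a strong local minimum
```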
Stationary Points
Effect of learning rate
The larger the learning rate, the more oscillatory the trajectory becomes, which can make the algorithm unstable. For quadratic functions an upper limit on the stable learning rate can be derived.
Stable Learning Rates (Quadratic)
For a quadratic function $F(x) = \tfrac{1}{2}x^T A x + d^T x + c$, the gradient is $\nabla F(x) = Ax + d$, so the steepest descent update becomes a linear dynamic system:
$$x_{k+1} = x_k - \alpha (A x_k + d) = [I - \alpha A]\, x_k - \alpha d.$$
Stability is determined by the eigenvalues of the matrix $[I - \alpha A]$. If $\lambda_i$ are the eigenvalues of $A$, the eigenvalues of $[I - \alpha A]$ are $(1 - \alpha \lambda_i)$. Stability requirement:
$$|1 - \alpha \lambda_i| < 1 \quad \text{for all } i \;\Longrightarrow\; \alpha < \frac{2}{\lambda_{\max}}.$$
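A short sketch of how this bound can be computed numerically; the matrix A below is an assumed example Hessian, not one from the slides.

```python
import numpy as np

# Assumed example Hessian of a quadratic F(x) = 0.5 x^T A x + d^T x + c
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues = np.linalg.eigvalsh(A)      # eigenvalues of A (symmetric), here 1 and 3
alpha_max = 2.0 / eigenvalues.max()      # stability requires alpha < 2 / lambda_max

# Verify: eigenvalues of [I - alpha*A] must lie strictly inside (-1, 1)
alpha = 0.9 * alpha_max
print(alpha_max, np.linalg.eigvals(np.eye(2) - alpha * A))
```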
Example
Newton’s Method
Start from the second-order approximation of $F$ about $x_k$:
$$F(x_k + \Delta x_k) \approx F(x_k) + g_k^T \Delta x_k + \tfrac{1}{2}\, \Delta x_k^T A_k \Delta x_k,$$
where $A_k$ is the Hessian of $F$ evaluated at $x_k$. Take the gradient of this second-order approximation with respect to $\Delta x_k$ and set it equal to zero to find the stationary point:
$$g_k + A_k \Delta x_k = 0 \;\Longrightarrow\; \Delta x_k = -A_k^{-1} g_k,$$
so the Newton update is $x_{k+1} = x_k - A_k^{-1} g_k$.
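A minimal sketch of the Newton update on an assumed quadratic (for a quadratic, a single Newton step lands exactly on the minimizer); the matrix A, vector d, and starting point are illustrative assumptions.

```python
import numpy as np

# Assumed example quadratic F(x) = 0.5 x^T A x + d^T x  (not from the slides)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
d = np.array([-1.0, -1.0])

def grad_F(x):
    return A @ x + d            # g_k

def hess_F(x):
    return A                    # A_k (constant for a quadratic)

def newton(x0, iters=5):
    """Newton update: x_{k+1} = x_k - A_k^{-1} g_k (solve instead of inverting)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - np.linalg.solve(hess_F(x), grad_F(x))
    return x

# One step is enough for a quadratic: the result is the minimizer -A^{-1} d
print(newton([10.0, -5.0], iters=1))   # approximately [0.333, 0.333]
```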
Example
Plot
These methods are used for line minimization (finding the minimum along a search direction) and have their own stopping criteria:
- Initial bracketing
- Line searches
- Newton’s method
- Secant method
- Sectioning method
Initial Bracketing
Initial bracketing helps find the range that contains the relative minimum; before the line search can proceed, the assumed minimum must be bracketed within a starting interval. Two schemes are used for this purpose; a sketch of one common scheme follows.
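The slides do not give the algorithm's details, so here is a sketch of one common step-doubling bracketing scheme; the test function, starting point, step size, and growth factor are assumptions, and the deck's two schemes may differ from this one.

```python
def bracket_minimum(f, x0=0.0, step=0.1, grow=2.0, max_iter=50):
    """Expand an interval until f starts to rise, so a minimum lies inside [a, c].

    Step-doubling scheme: march downhill, growing the step each time,
    and stop as soon as the function value increases again.
    """
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                        # make sure we are walking downhill
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= grow
        c, fc = b + step, f(b + step)
        if fc > fb:                    # f turned upward: minimum is bracketed
            return (a, b, c) if a < c else (c, b, a)
        a, fa, b, fb = b, fb, c, fc
    raise RuntimeError("no bracket found")

# Example with an assumed test function whose minimum is at x = 3
print(bracket_minimum(lambda x: (x - 3.0) ** 2))
```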
Sectioning methods
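As one example of a sectioning method, here is a sketch of golden-section search, which shrinks a bracketing interval by a fixed ratio each iteration; the slides do not say which sectioning variant they use, and the test function, interval, and tolerance below are assumptions.

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search: shrink the bracket [a, b] by the golden ratio each iteration."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0        # about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):        # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                  # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Using the bracket from the previous sketch on the same assumed test function
print(golden_section(lambda x: (x - 3.0) ** 2, 1.5, 6.3))   # approximately 3.0
```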