1
Steepest Descent Optimization
2
Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization
3
Newton -> Steepest Descent Method

Given: $f(x) \approx f(x_0) + dx^T g + \tfrac{1}{2}\, dx^T H\, dx$   (1)

Find: the stationary point $x^*$ such that $\nabla f(x^*) = 0$

Soln: Newton's Method, $dx = -\alpha H^{-1} g$   (2)

where $\alpha$ is a step length that should be calculated so that we never go uphill.
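As a rough illustration of the update in (2), here is a minimal MATLAB sketch (not from the slides; the quadratic test problem, fixed step length, and variable names are assumptions) contrasting a full Newton step with a plain steepest-descent step:

```matlab
% Sketch: one Newton step vs. one steepest-descent step on an
% assumed quadratic f(x) = 0.5*x'*H*x - b'*x.
H = [4 1; 1 2];            % assumed Hessian
b = [1; 1];
x = [0; 0];                % starting point

g = H*x - b;               % gradient at x
dx_newton = -(H\g);        % Newton step: -inv(H)*g
alpha = 0.1;               % assumed fixed step length
dx_sd = -alpha*g;          % steepest-descent step: -alpha*g

x_newton = x + dx_newton;
x_sd     = x + dx_sd;
```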
4
Assume H is diagonally dominant
Example: an ill-conditioned quadratic, $f(x) \sim (dx + dy)^2$, versus a well-conditioned quadratic, $f(x) \sim dx^2 + dy^2$, with their corresponding Hessians H.

For a diagonally dominant H, $[H^{-1}]_{ij} \approx \delta_{ij} / H_{ii}$. Substituting this approximation into (2) gives the Preconditioned Steepest Descent method: g is premultiplied by an approximation to the inverse of H.
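A minimal MATLAB sketch of this diagonal preconditioning (not from the slides; the test matrix, right-hand side, step length, and iteration count are assumptions):

```matlab
% Sketch: preconditioned steepest descent using only diag(H) as the
% approximation to inv(H) (assumes H is diagonally dominant).
H = [10 1; 1 2];                  % assumed diagonally dominant Hessian
b = [1; 1];
x = [0; 0];
alpha = 1.0;                      % assumed step length
for k = 1:20
    g  = H*x - b;                 % gradient of 0.5*x'*H*x - b'*x
    dx = -alpha * (g ./ diag(H)); % precondition: divide g_i by H_ii
    x  = x + dx;
end
```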
5
Poorly Conditioned Equations

Steepest Descent: for k = 1:niter, apply the steepest-descent update; end

(Figure: convergence paths on ill-conditioned vs. well-conditioned contours.)

Poorly conditioned equations converge slowly.
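The following MATLAB sketch illustrates this convergence-rate point numerically (the two test Hessians, step-length rule, and tolerance are assumptions, not slide content): plain steepest descent needs far more iterations when H is ill-conditioned.

```matlab
% Sketch: steepest descent on 0.5*x'*H*x for a well-conditioned and an
% ill-conditioned H; the ill-conditioned case converges slowly.
for H = {[2 0; 0 2], [2 0; 0 200]}   % well- vs. ill-conditioned (assumed)
    Hk = H{1};
    x = [1; 1];
    alpha = 1/max(eig(Hk));          % stable fixed step length
    k = 0;
    while norm(Hk*x) > 1e-6 && k < 1e5
        x = x - alpha*(Hk*x);        % steepest-descent update
        k = k + 1;
    end
    fprintf('cond(H) = %g: %d iterations\n', cond(Hk), k);
end
```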
6
Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization
7
Poorly Conditioned Equations

Steepest Descent: for k = 1:niter, apply the steepest-descent update; end

Poorly conditioned equations converge slowly.
8
Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization
9
Exact Line Search
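The body of this slide is not in the extracted text. As a hedged reconstruction of the standard result: minimizing the quadratic model (1) along the steepest-descent direction $-g$ gives the step length in closed form,

$$
\alpha^{*} = \arg\min_{\alpha} f(x_0 - \alpha g) = \frac{g^{T} g}{g^{T} H g},
\qquad dx = -\alpha^{*} g .
$$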
10
Numerical Line Search

Problem 1: Solve the above equation for the step length $\alpha$.

Problem 2: Write MATLAB code for Newton's method on the Rosenbrock function, but now compute the step length.

Problem 3: Write MATLAB code for the Steepest Descent method on the Rosenbrock function, but now compute the step length and use preconditioning.
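Not a solution to the problems above, but a minimal sketch of how a numerical step length can be computed: steepest descent on the Rosenbrock function with a standard backtracking (Armijo) line search. The constants c, rho, the iteration count, and the starting point are assumptions, not values from the slides.

```matlab
% Sketch: steepest descent on Rosenbrock with backtracking line search.
f  = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
gf = @(x) [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2);
            200*(x(2) - x(1)^2)];
x = [-1.2; 1];                    % assumed starting point
c = 1e-4; rho = 0.5;              % assumed Armijo constants
for k = 1:5000
    g = gf(x);
    alpha = 1;                    % start from a unit step
    while f(x - alpha*g) > f(x) - c*alpha*(g'*g)
        alpha = rho*alpha;        % shrink until sufficient decrease
    end
    x = x - alpha*g;              % never goes uphill by construction
end
```

The backtracking loop enforces the "never go uphill" requirement from (2) without needing the exact line-search formula; the same loop can wrap a Newton or preconditioned steepest-descent direction.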