A few words about convergence. We have been looking at the approximate error $e_a$ as our measure of convergence. A more technical means of differentiating the speed of convergence looks at asymptotic convergence.
Definition: rate of convergence. If
\[
\lim_{i \to \infty} \frac{\lvert x_{\text{true}} - x_{i+1} \rvert}{\lvert x_{\text{true}} - x_i \rvert^{p}} = \lambda ,
\]
we say that the method converges to $x_{\text{true}}$ with order $p > 0$. Higher $p$ means faster convergence: $p = 1$ is linear, $p = 2$ is quadratic.
$\lambda$ is the asymptotic error constant. Bisection: $p = 1$. Regula falsi: $p \approx 1.4$ to $1.6$.
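As a quick added illustration of this definition (not from the slides): bisection halves the bracketing interval every step, so the error bound $E_i$ (half the interval width) satisfies
\[
E_{i+1} = \tfrac{1}{2} E_i ,
\]
which matches the definition with $p = 1$ (linear convergence) and asymptotic error constant $\lambda = \tfrac{1}{2}$.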
Another open method is fixed-point iteration. Idea: rewrite the original equation $f(x) = 0$ into the form $x = g(x)$ and use the iteration $x_{i+1} = g(x_i)$ until the values converge. Example:
For our Manning's equation problem, this becomes:
Fortran program performing fixed-point iteration for Manning’s eq. example
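The Fortran program itself is not reproduced in this transcript; the following is a minimal sketch of what such a fixed-point loop might look like. The stand-in function $g(x) = \cos(x)$, the initial guess, the tolerance, and the iteration limit are illustrative assumptions, not the slide's Manning's-equation setup.

    ! Minimal fixed-point iteration sketch (not the original slide program).
    ! Solves x = g(x) by repeated substitution x_{i+1} = g(x_i).
    ! g(x) = cos(x) is only a stand-in; replace it with the rearranged
    ! Manning's equation g(h) for the slide's example.
    program fixed_point
      implicit none
      real :: x_old, x_new, ea
      integer :: i
      integer, parameter :: max_iter = 50
      real, parameter :: tol = 1.0e-6

      x_old = 0.5                          ! initial guess
      do i = 1, max_iter
         x_new = cos(x_old)                ! x_{i+1} = g(x_i)
         ea = abs((x_new - x_old) / x_new) ! approximate relative error
         print *, i, x_new, ea
         if (ea < tol) exit
         x_old = x_new
      end do
    end program fixed_point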
Fixed-point iteration doesn't always work. Basically, if $\lvert g'(x) \rvert < 1$ near the intersection of $y = g(x)$ with the line $y = x$, it will work (see your book for the derivation). Example where it doesn't work:
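A textbook-style illustration (added here; the slide's own graphical example is not reproduced): the same root can converge or diverge depending on which rearrangement $x = g(x)$ is chosen. Take $f(x) = x^2 - 2x - 3 = 0$, which has a root at $x = 3$.
\[
\text{A: } g(x) = \sqrt{2x + 3}, \quad g'(x) = \frac{1}{\sqrt{2x + 3}}, \quad \lvert g'(3) \rvert = \tfrac{1}{3} < 1 \;\Rightarrow\; \text{iteration converges near } x = 3.
\]
\[
\text{B: } g(x) = \frac{x^2 - 3}{2}, \quad g'(x) = x, \quad \lvert g'(3) \rvert = 3 > 1 \;\Rightarrow\; \text{iteration diverges near } x = 3.
\]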
The king of the root-finding methods: the Newton-Raphson method. It is based on the Taylor series expansion
\[
f(x_{i+1}) = f(x_i) + f'(x_i)\,(x_{i+1} - x_i) + \frac{f''(x_i)}{2!}\,(x_{i+1} - x_i)^2 + \cdots
\]
Truncate after the first-order term to get $f(x_{i+1}) \approx f(x_i) + f'(x_i)\,(x_{i+1} - x_i)$. At the root, $f(x_{i+1}) = 0$, so $0 = f(x_i) + f'(x_i)\,(x_{i+1} - x_i)$ and
\[
x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)} .
\]
Note that an evaluation of the derivative is required; you may have to do this numerically. However, the method can converge very quickly.
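A minimal Newton-Raphson sketch in Fortran, added here to make the update concrete. The test function $f(x) = x^3 - 2x - 5$, the initial guess, and the stopping criteria are illustrative assumptions, not the slides' Manning's-equation example.

    ! Minimal Newton-Raphson sketch (illustrative, not the slides' example).
    ! Iterates x_{i+1} = x_i - f(x_i)/f'(x_i) for f(x) = x**3 - 2*x - 5.
    program newton_raphson
      implicit none
      real :: x, fx, dfx, dx, ea
      integer :: i
      integer, parameter :: max_iter = 20
      real, parameter :: tol = 1.0e-6

      x = 2.0                              ! initial guess
      do i = 1, max_iter
         fx  = x**3 - 2.0*x - 5.0          ! f(x_i)
         dfx = 3.0*x**2 - 2.0              ! f'(x_i), evaluated analytically
         if (abs(dfx) < tiny(1.0)) then
            print *, 'zero slope reached; stopping'
            exit
         end if
         dx = fx / dfx
         x  = x - dx                       ! Newton update
         ea = abs(dx / x)                  ! approximate relative error
         print *, i, x, ea
         if (ea < tol) exit
      end do
    end program newton_raphson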
Example using our Manning's equation problem. The derivative of this with respect to $h$ is:
Spreadsheet example
Error analysis and convergence of Newton-Raphson. The error of the Newton-Raphson method can be estimated from
\[
E_{t,i+1} \approx \frac{-f''(x_r)}{2\,f'(x_r)}\,E_{t,i}^{2} ,
\]
where $x_r$ is the root. Because the error at iteration $i+1$ is proportional to the square of the previous error, the number of correct decimal places roughly doubles each iteration.
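For a rough sense of scale (an added arithmetic note, not from the slides): if the error at one iteration is about $10^{-3}$, the next is on the order of $(10^{-3})^{2} = 10^{-6}$ and the one after that on the order of $10^{-12}$, which is why the count of correct digits roughly doubles per step once the iterate is close to the root.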
Although Newton-Raphson converges very rapidly, it can diverge and fail to find roots: (1) if an inflection point is near the root, (2) if there is a local minimum or maximum, (3) if there are multiple roots, (4) if a zero slope is reached.
Secant method, continued. There is an alternate secant method that uses a small perturbation to approximate the derivative. Start with
\[
f'(x_i) \approx \frac{f(x_i + \delta x_i) - f(x_i)}{\delta x_i} ,
\]
where $\delta$ is a small perturbation fraction.
Now plug this approximation for the derivative into the Taylor-series approximation used in Newton-Raphson: $x_{i+1} = x_i - f(x_i)/f'(x_i)$ becomes
\[
x_{i+1} = x_i - \frac{\delta x_i\, f(x_i)}{f(x_i + \delta x_i) - f(x_i)} .
\]
No derivative evaluation is required (like the secant method), and only one initial guess is needed (like the Newton-Raphson method). Matlab example:
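The slides' Matlab example is not reproduced in this transcript; to keep the added samples in one language, here is a minimal Fortran sketch of the same perturbation-based update. The test function, perturbation fraction $\delta$, initial guess, and stopping criteria are illustrative assumptions.

    ! Minimal modified-secant sketch (illustrative, not the slides' Matlab code).
    ! Approximates f'(x_i) with a small perturbation delta*x_i and applies
    !   x_{i+1} = x_i - delta*x_i*f(x_i) / (f(x_i + delta*x_i) - f(x_i))
    program modified_secant
      implicit none
      real :: x, fx, fxp, dx, ea
      integer :: i
      integer, parameter :: max_iter = 30
      real, parameter :: tol = 1.0e-6, delta = 1.0e-4

      x = 2.0                              ! single initial guess
      do i = 1, max_iter
         fx  = f(x)
         fxp = f(x + delta*x)              ! perturbed evaluation, no f' needed
         dx  = delta*x*fx / (fxp - fx)     ! Newton-like step without the derivative
         x   = x - dx
         ea  = abs(dx / x)                 ! approximate relative error
         print *, i, x, ea
         if (ea < tol) exit
      end do

    contains

      real function f(xx)                  ! illustrative test function
        real, intent(in) :: xx
        f = xx**3 - 2.0*xx - 5.0
      end function f

    end program modified_secant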