Roots of Equations: Open Methods (Part 1) - Fixed Point Iteration & Newton-Raphson Methods
The following root finding methods will be introduced:
A. Bracketing Methods
   A.1. Bisection Method
   A.2. Regula Falsi
B. Open Methods
   B.1. Fixed Point Iteration
   B.2. Newton-Raphson Method
   B.3. Secant Method
B. Open Methods
To find a root of f(x) = 0, we construct a "magic" formula x_{i+1} = g(x_i) and apply it repeatedly to predict the root iteratively until x_i converges to a root. However, the iterates may diverge!
[Figure: the bisection method vs. an open method that diverges vs. an open method that converges]
What you should know about Open Methods
How do we construct the magic formula g(x)? How can we ensure convergence? What makes a method converge quickly or diverge? How fast does a method converge?
B.1. Fixed Point Iteration
Also known as one-point iteration or successive substitution. To find a root of f(x) = 0, we rearrange f(x) = 0 so that x stands alone on one side of the equation, i.e. x = g(x). If we can solve g(x) = x, we have solved f(x) = 0; such an x is called a fixed point of g(x). We solve g(x) = x by computing x_{i+1} = g(x_i) until x_{i+1} converges to x.
Fixed Point Iteration – Example
Reason: if the sequence converges, i.e. x_{i+1} → x and x_i → x as i → ∞, then taking limits in x_{i+1} = g(x_i) gives x = g(x), so x is a root of f(x) = 0.
Example: Find the root of f(x) = e^-x - x = 0 using g(x) = e^-x and x_0 = 0. (Answer: α = 0.56714329)

 i    x_i        ε_a (%)   ε_t (%)
 0    0            -       100.0
 1    1.000000   100.0      76.3
 2    0.367879   171.8      35.1
 3    0.692201    46.9      22.1
 4    0.500473    38.3      11.8
 5    0.606244    17.4       6.89
 6    0.545396    11.2       3.83
 7    0.579612     5.90      2.20
 8    0.560115     3.48      1.24
 9    0.571143     1.93      0.705
10    0.564879     1.11      0.399
Two Curve Graphical Method
The point x where the two curves f1(x) = x and f2(x) = g(x) intersect is the solution to f(x) = 0.
Fixed Point Iteration
There are infinitely many ways to construct g(x) from f(x). For example, for f(x) = x^2 - 2x - 3 = 0 (answer: x = 3 or -1):
Case a: x = sqrt(2x + 3)
Case b: x = (x^2 - 3) / 2
Case c: x = 3 / (x - 2)
So which one is better?
Case a: converges! (e.g. x_0 = 4 gives x_1 = 3.31662)
Case b: diverges!
Case c: converges, but more slowly
How do we choose g(x)? Can we tell which g(x) will converge to the solution before doing the computation?
Convergence of Fixed Point Iteration
By definition, the root α satisfies α = g(α), while the fixed point iteration computes x_{i+1} = g(x_i). Subtracting gives α - x_{i+1} = g(α) - g(x_i).
According to the mean value theorem for derivatives, if g(x) and g'(x) are continuous over the interval between x_i and α, there exists a value c within that interval such that g(α) - g(x_i) = g'(c)(α - x_i). Since α - x_{i+1} = g(α) - g(x_i), the errors E_i = α - x_i satisfy E_{i+1} = g'(c) E_i. Therefore, if |g'(c)| < 1, the error decreases with each iteration; if |g'(c)| > 1, the error increases. If the derivative is positive, the iterative solution is monotonic; if it is negative, the errors oscillate.
(a) |g'(x)| < 1, g'(x) positive: converges, monotonic
(b) |g'(x)| < 1, g'(x) negative: converges, oscillates
(c) |g'(x)| > 1, g'(x) positive: diverges, monotonic
(d) |g'(x)| > 1, g'(x) negative: diverges, oscillates
Fixed Point Iteration Impl. (as C function)
// x0: Initial guess of the root
// es: Acceptable relative percentage error
// iter_max: Maximum number of iterations allowed
// Note: g(x) has to be supplied; fabs() requires <math.h>
double g(double x);

double FixedPt(double x0, double es, int iter_max)
{
    double xr = x0;     // Estimated root
    double xr_old;      // xr from the previous iteration
    double ea = 100.0;  // Approximate relative percentage error
    int iter = 0;       // Number of iterations so far

    do {
        xr_old = xr;
        xr = g(xr_old);
        if (xr != 0.0)
            ea = fabs((xr - xr_old) / xr) * 100.0;
        iter++;
    } while (ea > es && iter < iter_max);

    return xr;
}
The following root finding methods will be introduced:
A. Bracketing Methods
   A.1. Bisection Method
   A.2. Regula Falsi
B. Open Methods
   B.1. Fixed Point Iteration
   B.2. Newton-Raphson Method
   B.3. Secant Method
B.2. Newton-Raphson Method
Use the slope of f(x) to predict the location of the root: x_{i+1} is the point where the tangent to f(x) at x_i intersects the x-axis, which gives x_{i+1} = x_i - f(x_i) / f'(x_i).
What happens when f'(α) = 0? For example, f(x) = (x - 1)^2 = 0: the root α = 1 is a double root, so both f and f' vanish there and the tangent near the root is nearly horizontal.
Error Analysis of Newton-Raphson Method
By definition, f(α) = 0, and the Newton-Raphson method computes x_{i+1} = x_i - f(x_i) / f'(x_i).
Suppose α is the true root (i.e., f(α) = 0). Expanding f(α) in a Taylor series about x_i:
0 = f(α) = f(x_i) + f'(x_i)(α - x_i) + (f''(c) / 2)(α - x_i)^2, where c lies between x_i and α.
Dividing by f'(x_i) and substituting x_{i+1} = x_i - f(x_i) / f'(x_i) gives
α - x_{i+1} = -(f''(c) / (2 f'(x_i)))(α - x_i)^2.
When x_i and α are very close to each other, c ≈ α, so E_{i+1} ≈ -(f''(α) / (2 f'(α))) E_i^2. The iterative process is said to be of second order.
The Order of Iterative Process (Definition)
Using an iterative process we obtain x_{k+1} from x_k (and possibly other information), giving x_0, x_1, x_2, ..., x_{k+1} as estimates of the root α. Let δ_k = α - x_k. We may then observe that
lim (k→∞) |δ_{k+1}| / |δ_k|^p = C for some constant C.
The process in such a case is said to be of p-th order. It is called superlinear if p > 1, quadratic if p = 2, linear if p = 1, and sublinear if p < 1.
Error of the Newton-Raphson Method
Each error is approximately proportional to the square of the previous error. This means that the number of correct decimal places roughly doubles with each iteration. Example: find the root of f(x) = e^-x - x = 0. (Answer: α = 0.56714329)
Error Analysis

 i    x_i           ε_t (%)      |δ_i|        estimated |δ_{i+1}|
 0    0             100          0.56714329   0.0582
 1    0.5           11.8         0.0671       8.16x10^-4
 2    0.566311003   0.147        8.32x10^-4   1.25x10^-7
 3    0.567143165   2.20x10^-5   1.25x10^-7   2.83x10^-15
 4    0.567143290   < 10^-8

(estimated |δ_{i+1}| = |f''(α) / (2 f'(α))| |δ_i|^2 ≈ 0.18095 |δ_i|^2)
Newton-Raphson vs. Fixed Point Iteration
Find the root of f(x) = e^-x - x = 0. (Answer: α = 0.56714329)

Fixed point iteration with g(x) = e^-x:
 i    x_i        ε_a (%)   ε_t (%)
 0    0            -       100.0
 1    1.000000   100.0      76.3
 2    0.367879   171.8      35.1
 3    0.692201    46.9      22.1
 4    0.500473    38.3      11.8
 5    0.606244    17.4       6.89
 6    0.545396    11.2       3.83
 7    0.579612     5.90      2.20
 8    0.560115     3.48      1.24
 9    0.571143     1.93      0.705
10    0.564879     1.11      0.399

Newton-Raphson:
 i    x_i           ε_t (%)
 0    0             100
 1    0.5           11.8
 2    0.566311003   0.147
 3    0.567143165   2.20x10^-5
 4    0.567143290   < 10^-8
Pitfalls of the Newton-Raphson Method
Sometimes the iteration converges very slowly. Example: f(x) = x^10 - 1 = 0 with x_0 = 0.5:
 i    x_i
 0    0.5
 1    51.65
 2    46.485
 ...
The nearly flat curve at x_0 throws the first iterate far from the root, and each subsequent step reduces x by only about 10%, so more than 40 iterations are needed to reach the root x = 1.
Figure (a): an inflection point (f''(x) = 0) in the vicinity of a root causes divergence. Figure (b): a local maximum or minimum causes oscillations.
Figure (c): the iteration may jump from a location close to one root to a location several roots away. Figure (d): a zero slope causes division by zero.
Overcoming the Pitfalls?
There are no general convergence criteria for the Newton-Raphson method; convergence depends on the nature of the function and on the accuracy of the initial guess. A guess close to the true root is always the better choice, and good knowledge of the function or a graphical analysis can help you make good guesses. Good software should recognize slow convergence or divergence, and at the end of the computation the final root estimate should always be substituted into the original function to verify the solution.
Other Facts
The Newton-Raphson method converges quadratically (when it converges), except when the root is a multiple root, in which case convergence is only linear. When the initial guess is close to the root, the Newton-Raphson method usually converges. To improve the chance of convergence, we can use a bracketing method to locate a good initial value for the Newton-Raphson method.
Summary
Differences between bracketing methods and open methods for locating roots: guarantee of convergence? performance?
Convergence criteria for the fixed point iteration method
Rate of convergence: linear, quadratic, superlinear, sublinear
The conditions that make the Newton-Raphson method converge quickly or diverge