Algorithms and Convergence


1 Algorithms and Convergence
Sec:1.3 Algorithms and Convergence

2 Sec:1.3 Algorithms and Convergence
An algorithm is a procedure that describes, in an unambiguous manner, a finite sequence of steps to be performed in a specified order. The objective of the algorithm is to implement a procedure that solves a problem or approximates a solution to it. We use pseudocode to describe algorithms; the pseudocode specifies the form of the input to be supplied and the form of the desired output.

3 Sec:1.3 Algorithms and Convergence
Looping techniques

Counter-controlled:
x = 1:5; vsum = 0;
for i = 1:5
    vsum = vsum + x(i);
end
vsum

Conditional execution (early exit with break):
x = 1:5; vsum = 0;
for i = 1:5
    vsum = vsum + x(i);
    if vsum > 5; break; end
end
vsum

Condition-controlled:
x = 1:5; vsum = 0; i = 1;
while i < 3
    vsum = vsum + x(i);
    i = i + 1;
end
vsum

Indentation makes the loop bodies easier to read.
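The three looping patterns translate directly to other languages; here is a short Python sketch of the same patterns (an illustration for comparison, not part of the slides, which use MATLAB):

```python
x = [1, 2, 3, 4, 5]

# counter-controlled: a fixed number of passes (MATLAB: for i = 1:5)
vsum1 = 0
for xi in x:
    vsum1 += xi            # sums all five elements

# conditional execution: leave the loop early with break
vsum2 = 0
for xi in x:
    vsum2 += xi
    if vsum2 > 5:
        break              # stops as soon as the running sum exceeds 5

# condition-controlled: loop while a condition holds (MATLAB: while i < 3)
vsum3, i = 0, 0
while i < 2:               # Python is 0-based, so i < 2 matches MATLAB's i < 3
    vsum3 += x[i]
    i += 1
```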

4 Sec:1.3 Algorithms and Convergence
P_N(x) = (x-1) - (x-1)^2/2 + (x-1)^3/3 - ... + (-1)^(N+1) (x-1)^N / N

Calculate: P_9(1.5)

Incremental version (builds each term from the previous one):
clear; clc
n = 9; x = 1.5;
s = +1; pw = x - 1;
pn = s*pw;
for i = 2:n
    s = -s;
    pw = pw*(x-1);
    term = s*pw/i;
    pn = pn + term;
end
pn

Direct version:
clear; clc
n = 9; x = 1.5;
pn = 0;
for i = 1:n
    term = (-1)^(i+1)*(x-1)^i/i;
    pn = pn + term;
end
pn
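As a quick cross-check, the direct version can be sketched in Python (an illustration, not part of the slides). Since P_N is the Taylor polynomial of ln x about x = 1, P_9(1.5) should land close to ln 1.5:

```python
import math

# direct evaluation of P_9(1.5), mirroring the second MATLAB version
n, x = 9, 1.5
pn = 0.0
for i in range(1, n + 1):
    term = (-1) ** (i + 1) * (x - 1) ** i / i
    pn += term

# pn approximates ln(1.5) = 0.405465...; the error is bounded by the
# magnitude of the next term, 0.5^10/10, which is below 1e-3
```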

5 Sec:1.3 Algorithms and Convergence
Construct an algorithm to determine the minimal value of N required for

|ln 1.5 - P_N(1.5)| < 10^-5, where
P_N(x) = (x-1) - (x-1)^2/2 + (x-1)^3/3 - ... + (-1)^(N+1) (x-1)^N / N

so that

S_N = P_N(1.5) = 0.5 - 0.5^2/2 + 0.5^3/3 - ... + (-1)^(N+1) 0.5^N / N

From calculus (the alternating series estimate) we know that |S - S_N| ≤ |term_{N+1}|, so we can stop as soon as the magnitude of the current term drops below the tolerance:

clear; clc
n = 13; x = 1.5;
pn = 0;
for i = 1:n
    term = (-1)^(i+1)*(x-1)^i/i;
    pn = pn + term;
    if abs(term) < 1e-5; N = i; break; end
end
pn
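The same stopping test can be sketched in Python (illustrative, not from the slides): keep adding terms until the current term falls below the tolerance, then report the index at which the loop stopped.

```python
import math

# find the minimal N: add terms until the current one is below the tolerance
x, tol = 1.5, 1e-5
pn, i = 0.0, 0
N = None
while N is None:
    i += 1
    term = (-1) ** (i + 1) * (x - 1) ** i / i
    pn += term
    if abs(term) < tol:    # alternating-series bound: |S - S_N| <= |term_{N+1}|
        N = i
```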

6 Sec:1.3 Algorithms and Convergence
An algorithm is stable if small changes in the initial data produce correspondingly small changes in the final results; otherwise it is unstable. Some algorithms are stable only for certain choices of initial data, and are called conditionally stable.

Example: How small is |π - 3.1415|? The slide solves the same equation twice, once using π and once using the rounded value 3.1415, and compares the resulting roots: the small change |π - 3.1415| in the data produces correspondingly small changes |x_1 - x̃_1| and |x_2 - x̃_2| in the computed roots.
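A minimal numerical sketch of this stability idea, using the linear equation πx - 22 = 0 as a stand-in example (this particular equation is an assumption for illustration, not necessarily the slide's):

```python
import math

# root of a*x - 22 = 0 for a = pi and for the rounded coefficient a = 3.1415
x1 = 22 / math.pi      # root with the exact coefficient
x2 = 22 / 3.1415       # root with the perturbed coefficient

data_change = abs(math.pi - 3.1415)   # perturbation in the data, about 9.3e-5
root_change = abs(x1 - x2)            # change in the root, about 2.1e-4: still small
```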

7 Sec:1.3 Algorithms and Convergence
Example: Rates of Convergence. Consider the two sequences

α_n = (n+1)/n^2,  γ_n = (n+3)/n^3

Both converge to α = 0, i.e. {α_n - α} → 0. If a positive constant K exists with

|α_n - α| ≤ K (1/n)^p for large n,

then we say that {α_n} converges to α with rate (order) of convergence O((1/n)^p) (read "big oh of"), and we write

α_n = α + O((1/n)^p)

Which one converges faster? Remark: compare with the Comparison test and the Limit comparison test.

8 Sec:1.3 Algorithms and Convergence
Example: Rates of Convergence (continued). For the two sequences

α_n = (n+1)/n^2,  γ_n = (n+3)/n^3,  {α_n - α} → 0 with α = 0

the definition applies as follows:

α_n = (n+1)/n^2 ≤ 2 (1/n)^1 for n ≥ 1, so p = 1
γ_n = (n+3)/n^3 ≤ 4 (1/n)^2 for n ≥ 1, so p = 2

Since a positive constant K exists with |α_n - α| ≤ K (1/n)^p for large n, we write α_n = α + O((1/n)^p) (read "big oh of"). Thus α_n = 0 + O(1/n) while γ_n = 0 + O((1/n)^2), so {γ_n} converges to zero faster. Remark: compare with the Comparison test and the Limit comparison test.
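The two bounds above can be checked numerically; a short Python sketch (illustrative, not part of the slides):

```python
# verify the bounds that give the rates O(1/n) and O(1/n^2)
for n in range(1, 1001):
    alpha_n = (n + 1) / n**2
    gamma_n = (n + 3) / n**3
    assert alpha_n <= 2 * (1 / n) ** 1    # K = 2, p = 1: n+1 <= 2n for n >= 1
    assert gamma_n <= 4 * (1 / n) ** 2    # K = 4, p = 2: n+3 <= 4n for n >= 1
```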

9 Sec:1.3 Algorithms and Convergence
Rates of Convergence. Suppose {β_n} is a sequence known to converge to zero, and {α_n} converges to a number α. If a positive constant K exists with

|α_n - α| ≤ K |β_n| for large n,

then we say that {α_n} converges to α with rate (order) of convergence O(β_n). (This expression is read "big oh of β_n".) For the two sequences {α_n} → α and {β_n = (1/n)^p} → 0, the condition reads |α_n - α| ≤ K (1/n)^p for large n. We are generally interested in the largest value of p with α_n = α + O((1/n)^p).

10 Root-finding problem f(x) = 0
The root-finding problem involves finding a root, or solution, of an equation of the form f(x) = 0 for a given function f. A root of this equation is also called a zero of the function f. Graphically, a root (or zero) of a function is an x-intercept of its graph.

Three numerical methods for root-finding:
Sec(2.1): The Bisection Method
Sec(2.2): Fixed-Point Iteration
Sec(2.3): The Newton-Raphson Method

11 Newton’s Method
THE NEWTON-RAPHSON METHOD is a method for finding successively better approximations to the roots (or zeros) of a function. To approximate a root of f(x) = 0, given an initial guess x_1, iterate

x_{n+1} = x_n - f(x_n)/f'(x_n)

Example: Use the Newton-Raphson method to estimate the root of f(x) = e^(-x) - x, employing an initial guess of x_1 = 0.

Here f(x) = e^(-x) - x and f'(x) = -e^(-x) - 1, so f(0) = 1 and f'(0) = -2.

x_1 = 0
x_2 = x_1 - f(x_1)/f'(x_1) = 0 - 1/(-2) = 0.5
x_3 = 0.5 - f(0.5)/f'(0.5) ≈ 0.566311

The true value of the root is 0.567143...; thus the approach rapidly converges on the true root.
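The iteration just described can be sketched in Python (an illustration; the slides themselves use MATLAB):

```python
import math

def newton(f, df, x, steps):
    """Newton-Raphson iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

f  = lambda t: math.exp(-t) - t
df = lambda t: -math.exp(-t) - 1
root = newton(f, df, 0.0, 4)   # four updates from x_1 = 0
```

Four steps already agree with the true root 0.567143... to well below 1e-6, illustrating the rapid (quadratic) convergence.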

12 Newton’s Method
THE NEWTON-RAPHSON METHOD is a method for finding successively better approximations to the roots (or zeros) of a function.

Example: Use the Newton-Raphson method to estimate the root of f(x) = e^(-x) - x, employing an initial guess of x_1 = 0.

clear
f  = @(t) exp(-t) - t;
df = @(t) -exp(-t) - 1;
x(1) = 0;
for i = 1:4
    x(i+1) = x(i) - f( x(i) )/df( x(i) );
end
x'

13 Newton’s Method
Example: Approximate a root of f(x) = cos x - x using Newton’s Method, employing an initial guess of x_1 = π/4.

clear
f  = @(t) cos(t) - t;
df = @(t) -sin(t) - 1;
x(1) = pi/4;
for i = 1:7
    x(i+1) = x(i) - f( x(i) )/df( x(i) );
end
x'

(The slide tabulates the iterates x_n for n = 1, ..., 8.)
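The same iteration applied to cos x - x, sketched in Python (illustrative, not part of the slides):

```python
import math

# Newton's method for cos(x) - x = 0 starting from pi/4
f  = lambda t: math.cos(t) - t
df = lambda t: -math.sin(t) - 1
x = math.pi / 4
for _ in range(7):             # produces x_2 ... x_8, matching the slide's table
    x = x - f(x) / df(x)

# x converges to the fixed point of cos, approximately 0.7390851332
```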

