Approximation of Functions


1 Approximation of Functions
Approximation problems are divided into five parts:
- Least-squares approximation of discrete functions (regression)
- Least-squares approximation of continuous functions using various polynomial bases
- Orthogonal basis functions
- Approximation of periodic functions
- Interpolation

2 Norm and Seminorm
A real-valued function $\|\cdot\|$ is called a norm on a vector space if it is defined everywhere on the space and satisfies the following conditions:
- $\|f\| > 0$ for $f \neq 0$; $\|f\| = 0$ iff $f = 0$
- $\|\alpha f\| = |\alpha|\,\|f\|$ for any real $\alpha$
- $\|f + g\| \le \|f\| + \|g\|$ (triangle inequality)
Lp norm: $\|f\|_p = \left(\int_a^b |f(x)|^p \, dx\right)^{1/p}$
- p = 2: Euclidean or L2 norm, $\|f\|_2 = \left(\int_a^b f(x)^2 \, dx\right)^{1/2}$
- p → ∞: $\|f\|_\infty = \max_{x \in [a,b]} |f(x)|$
Weighted Euclidean norm: $\|f\|_{2,w} = \left(\int_a^b f(x)^2\, w(x) \, dx\right)^{1/2}$
Euclidean seminorm on a grid $G = \{x_0, x_1, \ldots, x_n\}$: $\|f\|_{2,G} = \left(\sum_{j=0}^{n} f(x_j)^2\right)^{1/2}$ (only a seminorm, since it can vanish for a nonzero f that happens to be zero at every grid point)
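These norms are easy to evaluate numerically. A minimal sketch (the midpoint rule and the step count n are illustrative choices, not from the slides):

```python
import math

def l2_norm(f, a, b, n=100000):
    """Approximate the L2 norm (integral of f(x)^2 dx)^(1/2) by the midpoint rule."""
    h = (b - a) / n
    return math.sqrt(h * sum(f(a + (i + 0.5) * h) ** 2 for i in range(n)))

def sup_norm(f, a, b, n=100000):
    """Approximate the infinity norm max|f(x)| on a uniform grid."""
    return max(abs(f(a + i * (b - a) / n)) for i in range(n + 1))

f = lambda x: 1.0 / (1.0 + x * x)
# Exact L2 norm of f on [0, 1] is sqrt(1/4 + pi/8)
print(l2_norm(f, 0.0, 1.0))
print(sup_norm(f, 0.0, 1.0))  # 1.0, attained at x = 0
```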

3 Inner Product
The inner product of two real-valued continuous functions f(x) and g(x) is denoted $\langle f, g \rangle$ and is defined as:
- Continuous case: $\langle f, g \rangle = \int_a^b f(x)\, g(x)\, w(x)\, dx$
- Discrete case: $\langle f, g \rangle = \sum_{i=0}^{n} f(x_i)\, g(x_i)\, w_i$
Properties of the inner product:
- Commutativity: $\langle f, g \rangle = \langle g, f \rangle$
- Linearity: $\langle c_1 f + c_2 g, \varphi \rangle = c_1 \langle f, \varphi \rangle + c_2 \langle g, \varphi \rangle$
- Positivity: $\langle f, f \rangle \ge 0$; indeed $\langle f, f \rangle = \|f\|_{2,w}^2 = \int_a^b f(x)^2\, w(x)\, dx$
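Both forms of the inner product can be coded directly; a small sketch (the quadrature rule and grid sizes are illustrative assumptions):

```python
def inner_cont(f, g, a, b, w=lambda x: 1.0, n=100000):
    """Continuous inner product <f, g> = integral of f*g*w over [a, b], midpoint rule."""
    h = (b - a) / n
    return h * sum(f(x) * g(x) * w(x) for x in (a + (i + 0.5) * h for i in range(n)))

def inner_disc(f, g, xs, ws=None):
    """Discrete inner product <f, g> = sum over grid points of f(x_i)*g(x_i)*w_i."""
    ws = ws if ws is not None else [1.0] * len(xs)
    return sum(f(x) * g(x) * wi for x, wi in zip(xs, ws))

one = lambda x: 1.0
ident = lambda x: x
# <1, x> on [0, 1] is 1/2; commutativity gives the same value with arguments swapped
print(inner_cont(one, ident, 0.0, 1.0))
print(inner_cont(ident, one, 0.0, 1.0))
```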

4 Basis Functions: Linear Independence
A sequence of (n + 1) functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$ is linearly independent if
$\sum_{j=0}^{n} c_j \varphi_j = 0 \implies c_j = 0 \;\; \forall j$
Such a sequence builds (or spans) an (n + 1)-dimensional linear subspace.
From the definition of a norm: $\left\|\sum_{j=0}^{n} c_j \varphi_j\right\| = 0$ is true only if $c_j = 0 \;\forall j$.
Linearity of the inner product: $\left\langle \sum_{j=0}^{n} c_j \varphi_j, \varphi_k \right\rangle = \sum_{j=0}^{n} c_j \langle \varphi_j(x), \varphi_k(x) \rangle$

5 Orthogonal Functions
Two real-valued continuous functions f(x) and g(x) are said to be orthogonal if $\langle f, g \rangle = 0$.
A finite or infinite sequence of functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n, \ldots$ makes an orthogonal system if $\langle \varphi_i, \varphi_j \rangle = 0$ for all i ≠ j and $\varphi_i \neq 0$ for all i. In addition, if $\|\varphi_i\| = 1$, the sequence of functions is called an orthonormal system.
Pythagorean theorem for functions: $\langle f, g \rangle = 0 \implies \|f + g\|^2 = \|f\|^2 + \|g\|^2$, since
$\|f + g\|^2 = \langle f + g, f + g \rangle = \langle f, f \rangle + \langle f, g \rangle + \langle g, f \rangle + \langle g, g \rangle = \|f\|^2 + \|g\|^2$
Generalized for an orthogonal system: $\left\|\sum_{j=0}^{n} c_j \varphi_j\right\|^2 = \sum_{j=0}^{n} \|c_j \varphi_j\|^2$
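The orthogonality relations and the Pythagorean theorem can be checked numerically, for example with the first Legendre polynomials, which form an orthogonal system on [-1, 1] with weight w(x) = 1:

```python
def inner(f, g, a=-1.0, b=1.0, n=100000):
    """Inner product <f, g> on [a, b] with weight 1, by the midpoint rule."""
    h = (b - a) / n
    return h * sum(f(x) * g(x) for x in (a + (i + 0.5) * h for i in range(n)))

# First Legendre polynomials: an orthogonal (not orthonormal) system on [-1, 1]
P0 = lambda x: 1.0
P1 = lambda x: x
P2 = lambda x: 1.5 * x * x - 0.5

print(inner(P0, P1), inner(P0, P2), inner(P1, P2))  # all ~ 0

# Pythagorean theorem: ||P0 + P1||^2 = ||P0||^2 + ||P1||^2 since <P0, P1> = 0
s = lambda x: P0(x) + P1(x)
print(inner(s, s), inner(P0, P0) + inner(P1, P1))   # both ~ 8/3
```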

6 Least Square Problem
Let f(x) be a continuous real-valued function on (a, b) that is to be approximated by p(x), a linear combination of a system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$:
$p(x) = \sum_{j=0}^{n} c_j \varphi_j(x)$
Determine the coefficients $c_j = c_j^*$ such that a weighted Euclidean norm or seminorm of the error p(x) − f(x) becomes as small as possible:
- Continuous: $\|p(x) - f(x)\|_2^2 = \int_a^b \left(p(x) - f(x)\right)^2 dx$ is minimal when $c_j = c_j^*$
- Discrete: $\|p(x) - f(x)\|_{2,G}^2 = \sum_{i=1}^{m} \left(p(x_i) - f(x_i)\right)^2$ is minimal when $c_j = c_j^*$
Least-squares solution: $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$

7 Schematic of Least Square Solution
The target f(x) lies, in general, outside the space spanned by the system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$, while every candidate $p(x) = \sum_{j=0}^{n} c_j \varphi_j$ lies inside it. The least-squares solution $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j$ is the point of that subspace closest to f, characterized by:
$\langle f(x) - p^*(x), \varphi_k \rangle = 0$ for k = 0, 1, 2, … n

8 Least Square Solution: Proof
When ๐œ‘ 0 , ๐œ‘ 1 , ๐œ‘ 2 ,โ‹ฏ ๐œ‘ ๐‘› are linearly independent, the least square problem has a unique solution: ๐‘ โˆ— ๐‘ฅ = ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ where, p*(x) โ€“ f(x) is orthogonal to all ฯ†jโ€™s, j = 0, 1, 2, โ€ฆ n. We need to prove: ๐‘ โˆ— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ 2 is minimum when p*(x) โ€“ f(x) is orthogonal to all ฯ†jโ€™s, j = 0, 1, 2, โ€ฆ n existence and uniqueness of the least square solution

9 Least Square Solution: Proof
Let ๐‘ 0 , ๐‘ 1 , ๐‘ 2 ,โ‹ฏ ๐‘ ๐‘› be another sequence of coefficients with ๐‘ ๐‘— โ‰  ๐‘ ๐‘— โˆ— for at least one j, then ๐‘—=0 ๐‘› ๐‘ ๐‘— ๐œ‘ ๐‘— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ = ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ’ ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ + ๐‘ โˆ— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ If p*(x) โ€“ f(x) is orthogonal to all ฯ†jโ€™s, it is also orthogonal to their linear combination ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ’ ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ . According to Pythagorean theorem, we have: ๐‘—=0 ๐‘› ๐‘ ๐‘— ๐œ‘ ๐‘— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ 2 = ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ’ ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ 2 + ๐‘ โˆ— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ 2 > ๐‘ โˆ— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ 2 Therefore, if p*(x) โ€“ f(x) is orthogonal to all ฯ†jโ€™s, then p*(x) is the solution of the least square problem.

10 Least Square Solution: Normal Equations
If ๐œ‘ 0 , ๐œ‘ 1 , ๐œ‘ 2 ,โ‹ฏ ๐œ‘ ๐‘› are linearly independent, solution to the least square problem is: ๐‘ โˆ— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ =0 ๐‘˜=0, 1, 2,โ‹ฏ๐‘› where, ๐‘ โˆ— ๐‘ฅ = ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ Therefore, ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ โˆ’๐‘“ ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ =0 ๐‘˜=0, 1, 2,โ‹ฏ๐‘› ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ = ๐‘“ ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ ๐‘˜=0, 1, 2,โ‹ฏ๐‘› Normal Equations!

11 Least Square Solution: Normal Equations
๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ = ๐‘“ ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ ; ๐‘˜=0, 1, 2,โ‹ฏ๐‘› ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 0 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 0 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 0 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 0 = ๐‘“, ๐œ‘ ๐‘˜=0 ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 1 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 1 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 1 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 1 = ๐‘“, ๐œ‘ ๐‘˜=1 ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 2 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 2 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 2 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 2 = ๐‘“, ๐œ‘ ๐‘˜=2 โ‹ฎ โ‹ฎ โ‹ฎ โ‹ฏ โ‹ฎ = โ‹ฎ ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ ๐‘› + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ ๐‘› + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ ๐‘› +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ ๐‘› = ๐‘“, ๐œ‘ ๐‘› ๐‘˜=๐‘› Moreover, if ๐œ‘ 0 , ๐œ‘ 1 , ๐œ‘ 2 ,โ‹ฏ ๐œ‘ ๐‘› is an orthogonal system: ๐‘ ๐‘˜ โˆ— = ๐‘“, ๐œ‘ ๐‘˜ ๐œ‘ ๐‘˜ , ๐œ‘ ๐‘˜ ๐‘˜=0, 1, 2,โ‹ฏ๐‘›

12 Least Square Solution: Existence and Uniqueness
๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ = ๐‘“ ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ ; ๐‘˜=0, 1, 2,โ‹ฏ๐‘› ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 0 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 0 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 0 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 0 = ๐‘“, ๐œ‘ ๐‘˜=0 ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 1 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 1 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 1 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 1 = ๐‘“, ๐œ‘ ๐‘˜=1 ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ 2 + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ 2 + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ 2 +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ 2 = ๐‘“, ๐œ‘ ๐‘˜=2 โ‹ฎ โ‹ฎ โ‹ฎ โ‹ฏ โ‹ฎ = โ‹ฎ ๐‘ 0 โˆ— ๐œ‘ 0 , ๐œ‘ ๐‘› + ๐‘ 1 โˆ— ๐œ‘ 1 , ๐œ‘ ๐‘› + ๐‘ 2 โˆ— ๐œ‘ 2 , ๐œ‘ ๐‘› +โ‹ฏ+ ๐‘ ๐‘› โˆ— ๐œ‘ ๐‘› , ๐œ‘ ๐‘› = ๐‘“, ๐œ‘ ๐‘› ๐‘˜=๐‘› Solution to the normal equations exist and is unique unless the following homogenous system has a nontrivial solution for ๐‘ 0 โˆ— , ๐‘ 1 โˆ— ,โ‹ฏ ๐‘ ๐‘› โˆ— , i.e., ๐‘ ๐‘— โˆ— โ‰ 0 for at least one j: ๐‘—=0 ๐‘› ๐‘ ๐‘— โˆ— ๐œ‘ ๐‘— ๐‘ฅ , ๐œ‘ ๐‘˜ ๐‘ฅ = 0

13 Least Square Solution: Existence and Uniqueness
Suppose the homogeneous system $\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = 0$, $k = 0, 1, 2, \ldots, n$, had a nontrivial solution, i.e., $c_j^* \neq 0$ for at least one j. This would lead to
$\left\|\sum_{j=0}^{n} c_j^* \varphi_j\right\|^2 = \left\langle \sum_{j=0}^{n} c_j^* \varphi_j, \sum_{k=0}^{n} c_k^* \varphi_k \right\rangle = \sum_{k=0}^{n} \sum_{j=0}^{n} \langle \varphi_j, \varphi_k \rangle\, c_j^* c_k^* = \sum_{k=0}^{n} 0 \cdot c_k^* = 0$
which contradicts the linear independence of $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_n$. Hence only the trivial solution exists, and the least-squares solution exists and is unique.

14 Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a straight line. The basis functions are $\varphi_0(x) = 1$ and $\varphi_1(x) = x$, so $p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x)$.
Normal equations (continuous case, with weight w(x) = 1, so $\langle f, g \rangle = \int_0^1 f(x)\, g(x)\, dx$):
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
The required inner products:
$\langle \varphi_0, \varphi_0 \rangle = \int_0^1 dx = 1$; $\langle \varphi_0, \varphi_1 \rangle = \langle \varphi_1, \varphi_0 \rangle = \int_0^1 x \, dx = \frac{1}{2} = 0.5$
$\langle \varphi_1, \varphi_1 \rangle = \int_0^1 x \cdot x \, dx = \frac{1}{3}$; $\langle f, \varphi_0 \rangle = \int_0^1 \frac{dx}{1 + x^2} = \frac{\pi}{4}$; $\langle f, \varphi_1 \rangle = \int_0^1 \frac{x \, dx}{1 + x^2} = \frac{\ln 2}{2}$

15 Least Square Solution: Example (Continuous)
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
In matrix form:
$\begin{bmatrix} 1 & 1/2 \\ 1/2 & 1/3 \end{bmatrix} \begin{bmatrix} c_0^* \\ c_1^* \end{bmatrix} = \begin{bmatrix} \pi/4 \\ (\ln 2)/2 \end{bmatrix}$
By Cramer's rule (the determinant is $1/3 - 1/4 = 1/12$):
$c_0^* = \dfrac{(\pi/4)(1/3) - (0.5)(\ln 2)/2}{1/12} = 1.062; \qquad c_1^* = \dfrac{(1)(\ln 2)/2 - (0.5)(\pi/4)}{1/12} = -0.5535$
$p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x) = 1.062 - 0.5535x$
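This 2-by-2 solve is easy to reproduce; a sketch using Cramer's rule, as on the slide:

```python
import math

# Gram matrix <phi_j, phi_k> and right-hand side <f, phi_k> for the linear fit
G = [[1.0, 0.5],
     [0.5, 1.0 / 3.0]]
b = [math.pi / 4.0, math.log(2.0) / 2.0]

# Cramer's rule; the determinant is 1/3 - 1/4 = 1/12
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
c0 = (b[0] * G[1][1] - G[0][1] * b[1]) / det
c1 = (G[0][0] * b[1] - G[1][0] * b[0]) / det
print(c0, c1)  # ~ 1.062 and ~ -0.5535
```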

16 Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd-order polynomial. The basis functions are $\varphi_0 = 1$, $\varphi_1 = x$, $\varphi_2 = x^2$, and $p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j$.
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$
Additional inner products to be evaluated are:
$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \int_0^1 x^2 \cdot 1 \, dx = \frac{1}{3}$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \int_0^1 x^2 \cdot x \, dx = \frac{1}{4} = 0.25$; $\langle \varphi_2, \varphi_2 \rangle = \int_0^1 x^2 \cdot x^2 \, dx = \frac{1}{5} = 0.2$
$\langle f, \varphi_2 \rangle = \int_0^1 \frac{x^2 \, dx}{1 + x^2} = 1 - \frac{\pi}{4}$

17 Least Square Solution: Example (Continuous)
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$
In matrix form:
$\begin{bmatrix} 1 & 1/2 & 1/3 \\ 1/2 & 1/3 & 1/4 \\ 1/3 & 1/4 & 1/5 \end{bmatrix} \begin{bmatrix} c_0^* \\ c_1^* \\ c_2^* \end{bmatrix} = \begin{bmatrix} \pi/4 \\ (\ln 2)/2 \\ 1 - \pi/4 \end{bmatrix}$
Solving by Gauss elimination: $c_0^* = 1.030$; $c_1^* = -0.3605$; $c_2^* = -0.193$
$p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j(x) = 1.030 - 0.3605x - 0.193x^2$
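The Gauss elimination step can be reproduced with a small general-purpose solver; a sketch (partial pivoting is an added safeguard, not strictly required for this well-behaved 3-by-3 system):

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))  # pivot row
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            factor = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= factor * A[k][j]
            b[i] -= factor * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                        # back substitution
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

G = [[1, 1 / 2, 1 / 3],
     [1 / 2, 1 / 3, 1 / 4],
     [1 / 3, 1 / 4, 1 / 5]]
rhs = [math.pi / 4, math.log(2) / 2, 1 - math.pi / 4]
c = gauss_solve(G, rhs)
print(c)  # ~ [1.030, -0.3605, -0.193]
```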

18 Least Square Solution: Example (Continuous)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Estimate $\|p^*(x) - f(x)\|_\infty$. Two options:
- Analytical (+ numerical): locate the extrema of the error $p^*(x) - f(x)$ and evaluate the error there
- Numerical (+ visual): evaluate the error on a fine grid and inspect its maximum or plot it
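The numerical option can be sketched with the two fitted polynomials from the preceding slides, evaluating the error on a uniform grid:

```python
f  = lambda x: 1.0 / (1.0 + x * x)
p1 = lambda x: 1.062 - 0.5535 * x                   # linear fit from the slides
p2 = lambda x: 1.030 - 0.3605 * x - 0.193 * x * x   # quadratic fit from the slides

def max_abs_error(p, n=2000):
    """Numerical estimate of ||p - f||_inf on [0, 1] using a uniform grid."""
    return max(abs(p(i / n) - f(i / n)) for i in range(n + 1))

print(max_abs_error(p1))  # ~ 0.062, attained at x = 0
print(max_abs_error(p2))  # ~ 0.030, attained at x = 0
```

The quadratic's maximum error is roughly half the linear fit's, which quantifies the visual impression that it fits better.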

19 Discrete Data
Given (n + 1) observations or data pairs [(x0, y0), (x1, y1), (x2, y2), …, (xn, yn)] and (m + 1) basis functions $\varphi_0, \varphi_1, \varphi_2, \ldots, \varphi_m$, the approximating polynomial is $p(x) = \sum_{j=0}^{m} c_j \varphi_j(x)$. Requiring p(xi) = yi at every data point gives:
$c_0 \varphi_0(x_0) + c_1 \varphi_1(x_0) + c_2 \varphi_2(x_0) + \cdots + c_m \varphi_m(x_0) = y_0$
$c_0 \varphi_0(x_1) + c_1 \varphi_1(x_1) + c_2 \varphi_2(x_1) + \cdots + c_m \varphi_m(x_1) = y_1$
⋮
$c_0 \varphi_0(x_n) + c_1 \varphi_1(x_n) + c_2 \varphi_2(x_n) + \cdots + c_m \varphi_m(x_n) = y_n$
This is (n + 1) equations in (m + 1) unknowns:
- m < n: over-determined system, least-squares regression
- m = n: unique solution, interpolation
- m > n: under-determined system
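In the over-determined case (m < n), the least-squares solution comes from the normal equations $A^T A\, c = A^T y$, where A is the design matrix with entries $\varphi_j(x_i)$. A generic sketch, checked on a tiny straight-line example (the data values are illustrative):

```python
def design_matrix(xs, basis):
    """Rows phi_j(x_i): one row per data point, one column per basis function."""
    return [[phi(x) for phi in basis] for x in xs]

def normal_equations(A, y):
    """Form A^T A and A^T y for the over-determined system A c = y."""
    rows, cols = len(A), len(A[0])
    AtA = [[sum(A[i][j] * A[i][k] for i in range(rows)) for k in range(cols)]
           for j in range(cols)]
    Aty = [sum(A[i][j] * y[i] for i in range(rows)) for j in range(cols)]
    return AtA, Aty

basis = [lambda x: 1.0, lambda x: x]          # straight-line model
A = design_matrix([0.0, 1.0, 2.0], basis)
AtA, Aty = normal_equations(A, [1.0, 3.0, 5.0])
print(AtA, Aty)  # [[3.0, 3.0], [3.0, 5.0]] and [9.0, 13.0]
```

Note that the entries of $A^T A$ are exactly the discrete inner products $\langle \varphi_j, \varphi_k \rangle$ from the earlier slides.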

20 Least Square Solution: Example (Discrete Data)
The variation of ultimate shear strength (y) with curing temperature (x) for a certain rubber compound was reported (J. Quality Technology, 1971, pp ) as:

x, in °C:  138  140  144.5  146  148  151.5  153.5  157
y, in psi: 770  800  840    810  735  640    590    560

Fit a linear and a quadratic regression model to the data. The number of data points is m = 8; denote the elements of x and y by xi and yi (i = 1, …, m), with yi = f(xi). For the linear model, the basis functions and the polynomial are $\varphi_0(x) = 1$, $\varphi_1(x) = x$, $p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x)$, and the normal equations are:
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$

21 Least Square Solution: Example (Discrete Data)
$\langle \varphi_0, \varphi_0 \rangle = \sum_{i=1}^{m} 1 \cdot 1 = 8$; $\langle \varphi_1, \varphi_0 \rangle = \langle \varphi_0, \varphi_1 \rangle = \sum_{i=1}^{m} x_i = 1178.5$; $\langle \varphi_1, \varphi_1 \rangle = \sum_{i=1}^{m} x_i^2 = 173907.75$
$\langle f, \varphi_0 \rangle = \sum_{i=1}^{m} y_i = 5745$; $\langle f, \varphi_1 \rangle = \sum_{i=1}^{m} y_i x_i = 842125$
The normal equations become:
$\begin{bmatrix} 8 & 1178.5 \\ 1178.5 & 173907.75 \end{bmatrix} \begin{bmatrix} c_0^* \\ c_1^* \end{bmatrix} = \begin{bmatrix} 5745 \\ 842125 \end{bmatrix} \implies c_0^* = 2773.5, \; c_1^* = -13.95$
$p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x) = 2773.5 - 13.95x$
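These sums and the resulting fit can be reproduced directly from the data:

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]
m = len(x)

Sx  = sum(x)                                # <phi_0, phi_1>
Sxx = sum(xi * xi for xi in x)              # <phi_1, phi_1>
Sy  = sum(y)                                # <f, phi_0>
Sxy = sum(xi * yi for xi, yi in zip(x, y))  # <f, phi_1>

# Solve the 2x2 normal equations by Cramer's rule
det = m * Sxx - Sx * Sx
c0 = (Sy * Sxx - Sx * Sxy) / det
c1 = (m * Sxy - Sx * Sy) / det
print(Sx, Sxx, Sy, Sxy)  # 1178.5 173907.75 5745 842125
print(c0, c1)            # ~ 2773.5 and ~ -13.95
```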

22 Least Square Solution: Example (Discrete Data)
For the quadratic model, the basis functions and the polynomial are $\varphi_0 = 1$, $\varphi_1 = x$, $\varphi_2 = x^2$, $p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j$, and the normal equations are:
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$
Additional inner products to be evaluated are:
$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \sum_{i=1}^{m} 1 \cdot x_i^2 = 173907.75$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \sum_{i=1}^{m} x_i \cdot x_i^2 = \sum_{i=1}^{m} x_i^3 = 25707160.375$
$\langle \varphi_2, \varphi_2 \rangle = \sum_{i=1}^{m} x_i^2 \cdot x_i^2 = \sum_{i=1}^{m} x_i^4 \approx 3.8065 \times 10^9$; $\langle f, \varphi_2 \rangle = \sum_{i=1}^{m} y_i x_i^2 = 123643297.5$

23 Least Square Solution: Example (Discrete Data)
$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$
In matrix form:
$\begin{bmatrix} 8 & 1178.5 & 173907.75 \\ 1178.5 & 173907.75 & 25707160.375 \\ 173907.75 & 25707160.375 & 3806534454.19 \end{bmatrix} \begin{bmatrix} c_0^* \\ c_1^* \\ c_2^* \end{bmatrix} = \begin{bmatrix} 5745 \\ 842125 \\ 123643297.5 \end{bmatrix}$
Solving by Gauss elimination: $c_0^* \approx -21935.4$; $c_1^* \approx 322.10$; $c_2^* \approx -1.1407$
$p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j(x) \approx -21935.4 + 322.10x - 1.1407x^2$
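The transcript dropped the numerical coefficients on this slide; the sketch below recomputes them from the data using Cramer's rule on the 3-by-3 normal equations, so the printed values are a reconstruction (consistent with the signs that survive in the transcript and with the R² = 0.88 reported later):

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]

S = [sum(xi ** k for xi in x) for k in range(5)]              # power sums S_0 .. S_4
T = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]

G = [[S[0], S[1], S[2]],
     [S[1], S[2], S[3]],
     [S[2], S[3], S[4]]]                                      # Gram matrix

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def replace_col(M, j, b):
    """Copy of M with column j replaced by vector b (for Cramer's rule)."""
    return [[b[i] if k == j else M[i][k] for k in range(3)] for i in range(3)]

D = det3(G)
c = [det3(replace_col(G, j, T)) / D for j in range(3)]
print(c)  # roughly [-21935, 322.1, -1.141]
```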

24 Least Square Solution: Example (Discrete Data)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Use the sum of squared errors!

25 Least Square Solution: Example (Discrete Data)
Denote the observed values by $y_i = f(x_i)$ and the predicted values by $\hat{y}_i = p^*(x_i)$, and define
$\mu_y = \frac{1}{m} \sum_{i=1}^{m} y_i$ (mean), $\sigma_t^2 = \sum_{i=1}^{m} (y_i - \mu_y)^2$ (total sum of squares), $\varepsilon^2 = \sum_{i=1}^{m} (y_i - \hat{y}_i)^2$ (residual sum of squares)
The coefficient of determination r² (or R²) is given by:
$r^2 = \dfrac{\sigma_t^2 - \varepsilon^2}{\sigma_t^2}$
For the rubber-compound data: linear model R² = 0.73; quadratic model R² = 0.88 — the quadratic fits noticeably better.
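Both R² values can be verified from the data. A sketch (the linear slope and the quadratic coefficients below are recomputed from the data, since the transcript garbled those digits):

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]

def r_squared(pred):
    """Coefficient of determination (sigma_t^2 - eps^2) / sigma_t^2."""
    mu = sum(y) / len(y)
    total = sum((yi - mu) ** 2 for yi in y)                     # sigma_t^2
    resid = sum((yi - pred(xi)) ** 2 for xi, yi in zip(x, y))   # eps^2
    return (total - resid) / total

lin  = lambda t: 2773.5 - 13.95 * t                      # linear fit
quad = lambda t: -21935.4 + 322.10 * t - 1.1407 * t * t  # quadratic fit (recomputed)
print(r_squared(lin), r_squared(quad))  # ~ 0.73 and ~ 0.88
```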

26 Additional Points
- You can do multiple regression using the same framework.
- Some non-linear models can be linearized (e.g., $y = a e^{bx} \implies \ln y = \ln a + bx$) and then fitted by linear least squares.
- Evaluate the inner-product integrals using numerical integration for functions that are not easy to integrate by analytical means!
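The linearization point can be illustrated with an exponential model $y = a e^{bx}$: taking logarithms turns it into an ordinary straight-line fit for ln a and b. A sketch on hypothetical noise-free data (the values a = 2 and b = 0.5 are assumed purely for illustration):

```python
import math

# Hypothetical noise-free data generated from y = 2 * exp(0.5 * x)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * math.exp(0.5 * xi) for xi in xs]

# Linearize: ln y = ln a + b x, then fit a straight line to (x, ln y)
Y = [math.log(yi) for yi in ys]
m = len(xs)
Sx, Sy = sum(xs), sum(Y)
Sxx = sum(t * t for t in xs)
Sxy = sum(t * u for t, u in zip(xs, Y))
b = (m * Sxy - Sx * Sy) / (m * Sxx - Sx * Sx)
a = math.exp((Sy - b * Sx) / m)
print(a, b)  # recovers a ~ 2.0, b ~ 0.5
```

Note that this minimizes the squared error in ln y, not in y itself, so with noisy data the result differs slightly from a true non-linear least-squares fit.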

