Approximation of Functions
Approximation problems are divided into five parts:
- Least square approximation of discrete functions, or Regression
- Least square approximation of continuous functions using various basis polynomials
- Orthogonal basis functions
- Approximation of periodic functions
- Interpolation
Norm and Seminorm
A real-valued function ‖f‖ is called a norm on a vector space if it is defined everywhere on the space and satisfies the following conditions:
- ‖f‖ > 0 for f ≠ 0; ‖f‖ = 0 iff f = 0
- ‖αf‖ = |α| ‖f‖ for a real α
- ‖f + g‖ ≤ ‖f‖ + ‖g‖ (triangle inequality)
Lp-norm (for f defined on [a, b]): ‖f‖_p = (∫_a^b |f(x)|^p dx)^{1/p}
p = 2, Euclidean or L2 norm: ‖f‖_2 = (∫_a^b f(x)² dx)^{1/2}
p → ∞, maximum norm: ‖f‖_∞ = max_{x∈[a,b]} |f(x)|
Weighted Euclidean norm: ‖f‖_{2,w} = (∫_a^b f(x)² w(x) dx)^{1/2}
Euclidean seminorm on a grid x_0, x_1, …, x_m: ‖f‖_{2,S} = (Σ_{i=0}^{m} f(x_i)²)^{1/2}
Inner Product
The inner product of two real-valued continuous functions f(x) and g(x) is denoted by (f, g) and is defined as:
Continuous case: (f, g) = ∫_a^b f(x) g(x) w(x) dx
Discrete case: (f, g) = Σ_{i=0}^{m} f(x_i) g(x_i) w_i
Properties of the inner product:
- Commutativity: (f, g) = (g, f)
- Linearity: (c_1 f + c_2 g, h) = c_1 (f, h) + c_2 (g, h)
- Positivity: (f, f) ≥ 0
Relation to the norm: (f, f) = ‖f‖²_{2,w} = ∫_a^b f(x)² w(x) dx
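As a quick sketch (the grid, weights, and function choices here are illustrative, not from the slides), the discrete inner product can be coded directly from the definition:

```python
def inner(f, g, xs, ws=None):
    """Discrete inner product (f, g) = sum_i f(x_i) g(x_i) w_i."""
    ws = ws or [1.0] * len(xs)
    return sum(f(x) * g(x) * w for x, w in zip(xs, ws))

xs = [i / 10 for i in range(11)]   # a sample grid on [0, 1]
f = lambda x: x
g = lambda x: 1 - x

# Commutativity, and the norm relation (f, f) = ||f||^2
assert inner(f, g, xs) == inner(g, f, xs)
norm_sq = inner(f, f, xs)
print(norm_sq)  # sum of x_i^2 over the grid
```

With unit weights this reduces to the ordinary Euclidean dot product of the two function-value vectors.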
Basis Functions: Linear Independence
A sequence of (n + 1) functions φ_0, φ_1, φ_2, …, φ_n is linearly independent if
Σ_{j=0}^{n} c_j φ_j = 0 ⟹ c_j = 0 ∀j
This sequence of functions builds (or spans) an (n + 1)-dimensional linear subspace.
From the definition of the norm: ‖Σ_{j=0}^{n} c_j φ_j‖ = 0 is true only if c_j = 0 ∀j
Linearity of the inner product: (Σ_{j=0}^{n} c_j φ_j, φ_k) = Σ_{j=0}^{n} c_j (φ_j(x), φ_k(x))
Orthogonal Functions
Two real-valued continuous functions f(x) and g(x) are said to be orthogonal if (f, g) = 0.
A finite or infinite sequence of functions φ_0, φ_1, φ_2, …, φ_n, … makes an orthogonal system if (φ_i, φ_j) = 0 for all i ≠ j and ‖φ_i‖ ≠ 0 for all i. In addition, if ‖φ_i‖ = 1, the sequence of functions is called an orthonormal system.
Pythagorean theorem for functions: (f, g) = 0 ⟹ ‖f + g‖² = ‖f‖² + ‖g‖²
Proof: ‖f + g‖² = (f + g, f + g) = (f, f) + (f, g) + (g, f) + (g, g) = ‖f‖² + ‖g‖²
Generalized for an orthogonal system: ‖Σ_{j=0}^{n} c_j φ_j‖² = Σ_{j=0}^{n} c_j² ‖φ_j‖²
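The Pythagorean relation can be checked numerically. Here f(x) = x (odd) and g(x) = x² (even) are orthogonal on a symmetric grid with unit weights; the grid and functions are an illustrative choice, not from the slides:

```python
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]   # symmetric grid, unit weights

def inner(u, v):
    return sum(u(x) * v(x) for x in xs)

f = lambda x: x          # odd function
g = lambda x: x * x      # even function
s = lambda x: f(x) + g(x)

assert abs(inner(f, g)) < 1e-12      # (f, g) = 0 on this grid
lhs = inner(s, s)                    # ||f + g||^2
rhs = inner(f, f) + inner(g, g)      # ||f||^2 + ||g||^2
assert abs(lhs - rhs) < 1e-12        # Pythagorean theorem
```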
Least Square Problem
Let f(x) be a continuous real-valued function in (a, b) that is to be approximated by p(x), a linear combination of a system of (n + 1) linearly independent functions φ_0, φ_1, φ_2, …, φ_n:
p(x) = Σ_{j=0}^{n} a_j φ_j(x)
Determine the coefficients a_j = a_j* such that a weighted Euclidean norm or seminorm of the error p(x) − f(x) becomes as small as possible:
Continuous: ‖p(x) − f(x)‖² = ∫_a^b (p(x) − f(x))² dx is minimum when a_j = a_j*
Discrete: ‖p(x) − f(x)‖² = Σ_{i=1}^{m} (p(x_i) − f(x_i))² is minimum when a_j = a_j*
Least Square Solution: p*(x) = Σ_{j=0}^{n} a_j* φ_j(x)
Schematic of Least Square Solution
f(x) generally lies outside the space spanned by the system of (n + 1) linearly independent functions φ_0, φ_1, φ_2, …, φ_n. Every candidate p(x) = Σ_{j=0}^{n} a_j φ_j lies inside that space; the least square solution p*(x) = Σ_{j=0}^{n} a_j* φ_j is the orthogonal projection of f onto it.
Solution: (f(x) − p*(x), φ_k) = 0 for k = 0, 1, 2, …, n
Least Square Solution: Proof
When φ_0, φ_1, φ_2, …, φ_n are linearly independent, the least square problem has a unique solution p*(x) = Σ_{j=0}^{n} a_j* φ_j(x), where p*(x) − f(x) is orthogonal to all φ_j, j = 0, 1, 2, …, n. We need to prove:
- ‖p(x) − f(x)‖² is minimized when p(x) = p*(x), i.e., when p*(x) − f(x) is orthogonal to all φ_j, j = 0, 1, 2, …, n
- existence and uniqueness of the least square solution
Least Square Solution: Proof
Let b_0, b_1, b_2, …, b_n be another sequence of coefficients with b_j ≠ a_j* for at least one j. Then
Σ_{j=0}^{n} b_j φ_j(x) − f(x) = Σ_{j=0}^{n} (b_j − a_j*) φ_j(x) + (p*(x) − f(x))
If p*(x) − f(x) is orthogonal to all φ_j, it is also orthogonal to their linear combination Σ_{j=0}^{n} (b_j − a_j*) φ_j(x). According to the Pythagorean theorem:
‖Σ_{j=0}^{n} b_j φ_j(x) − f(x)‖² = ‖Σ_{j=0}^{n} (b_j − a_j*) φ_j(x)‖² + ‖p*(x) − f(x)‖² > ‖p*(x) − f(x)‖²
Therefore, if p*(x) − f(x) is orthogonal to all φ_j, then p*(x) is the solution of the least square problem.
Least Square Solution: Normal Equations
If φ_0, φ_1, φ_2, …, φ_n are linearly independent, the solution to the least square problem satisfies
(p*(x) − f(x), φ_k(x)) = 0, k = 0, 1, 2, …, n, where p*(x) = Σ_{j=0}^{n} a_j* φ_j(x)
Therefore,
(Σ_{j=0}^{n} a_j* φ_j(x) − f(x), φ_k(x)) = 0, k = 0, 1, 2, …, n
Σ_{j=0}^{n} a_j* (φ_j(x), φ_k(x)) = (f(x), φ_k(x)), k = 0, 1, 2, …, n
These are the Normal Equations!
Least Square Solution: Normal Equations
Σ_{j=0}^{n} a_j* (φ_j(x), φ_k(x)) = (f(x), φ_k(x)); k = 0, 1, 2, …, n. Written out:
a_0*(φ_0,φ_0) + a_1*(φ_1,φ_0) + a_2*(φ_2,φ_0) + … + a_n*(φ_n,φ_0) = (f,φ_0)   [k = 0]
a_0*(φ_0,φ_1) + a_1*(φ_1,φ_1) + a_2*(φ_2,φ_1) + … + a_n*(φ_n,φ_1) = (f,φ_1)   [k = 1]
a_0*(φ_0,φ_2) + a_1*(φ_1,φ_2) + a_2*(φ_2,φ_2) + … + a_n*(φ_n,φ_2) = (f,φ_2)   [k = 2]
⋮
a_0*(φ_0,φ_n) + a_1*(φ_1,φ_n) + a_2*(φ_2,φ_n) + … + a_n*(φ_n,φ_n) = (f,φ_n)   [k = n]
Moreover, if φ_0, φ_1, φ_2, …, φ_n is an orthogonal system, the matrix is diagonal and
a_j* = (f, φ_j)/(φ_j, φ_j), j = 0, 1, 2, …, n
Least Square Solution: Existence and Uniqueness
The normal equations, Σ_{j=0}^{n} a_j* (φ_j(x), φ_k(x)) = (f(x), φ_k(x)), k = 0, 1, 2, …, n, form an (n + 1) × (n + 1) linear system whose coefficient matrix has entries (φ_j, φ_k).
A solution of the normal equations exists and is unique unless the corresponding homogeneous system
Σ_{j=0}^{n} a_j (φ_j(x), φ_k(x)) = 0, k = 0, 1, 2, …, n
has a nontrivial solution a_0, a_1, …, a_n, i.e., a_j ≠ 0 for at least one j.
Least Square Solution: Existence and Uniqueness
Suppose the homogeneous system Σ_{j=0}^{n} a_j (φ_j(x), φ_k(x)) = 0, k = 0, 1, 2, …, n, had a nontrivial solution (a_j ≠ 0 for at least one j). This would lead to
‖Σ_{j=0}^{n} a_j φ_j‖² = (Σ_{j=0}^{n} a_j φ_j, Σ_{k=0}^{n} a_k φ_k) = Σ_{k=0}^{n} (Σ_{j=0}^{n} a_j (φ_j, φ_k)) a_k = Σ_{k=0}^{n} 0 · a_k = 0
so that Σ_j a_j φ_j = 0 with some a_j ≠ 0, which contradicts the assumption that φ_0, φ_1, φ_2, …, φ_n are linearly independent.
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a straight line. The basis functions are:
φ_0(x) = φ_0 = 1; φ_1(x) = φ_1 = x; p*(x) = Σ_{j=0}^{1} a_j* φ_j(x)
Normal equations: Σ_{j=0}^{n} a_j* (φ_j(x), φ_k(x)) = (f(x), φ_k(x)); k = 0, 1
a_0*(φ_0, φ_0) + a_1*(φ_1, φ_0) = (f, φ_0)
a_0*(φ_0, φ_1) + a_1*(φ_1, φ_1) = (f, φ_1)
With w(x) = 1 (continuous case, (f, g) = ∫_a^b f(x) g(x) w(x) dx):
(φ_0, φ_0) = ∫_0^1 dx = 1; (φ_0, φ_1) = (φ_1, φ_0) = ∫_0^1 x · 1 dx = 1/2 = 0.5
(φ_1, φ_1) = ∫_0^1 x · x dx = 1/3
(f, φ_0) = ∫_0^1 dx/(1 + x²) = π/4; (f, φ_1) = ∫_0^1 x dx/(1 + x²) = (ln 2)/2
Least Square Solution: Example (Continuous)
Substituting the inner products into the normal equations:
a_0* + (1/2) a_1* = π/4
(1/2) a_0* + (1/3) a_1* = (ln 2)/2
Solving the 2 × 2 system: a_0* = 1.062, a_1* = −0.5535
p*(x) = Σ_{j=0}^{1} a_j* φ_j(x) = 1.062 − 0.5535 x
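A sketch that reproduces these coefficients numerically, evaluating the two integrals with composite Simpson quadrature and solving the 2 × 2 system by Cramer's rule (the quadrature helper is an assumption, not part of the lecture):

```python
def simpson(fn, a, b, n=1000):
    """Composite Simpson's rule (n must be even)."""
    h = (b - a) / n
    s = fn(a) + fn(b)
    s += sum((4 if k % 2 else 2) * fn(a + k * h) for k in range(1, n))
    return s * h / 3

f = lambda x: 1.0 / (1.0 + x * x)

# Inner products for the basis {1, x} on [0, 1] (known in closed form)
g00, g01, g11 = 1.0, 0.5, 1.0 / 3.0
b0 = simpson(f, 0.0, 1.0)                   # = arctan(1) = pi/4
b1 = simpson(lambda x: x * f(x), 0.0, 1.0)  # = (ln 2)/2

# Solve the 2x2 normal equations by Cramer's rule
det = g00 * g11 - g01 * g01
a0 = (b0 * g11 - g01 * b1) / det
a1 = (g00 * b1 - g01 * b0) / det
print(round(a0, 3), round(a1, 4))  # 1.062 -0.5535
```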
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd order polynomial. The basis functions are:
φ_0 = 1; φ_1 = x; φ_2 = x²; p*(x) = Σ_{j=0}^{2} a_j* φ_j(x)
a_0*(φ_0, φ_0) + a_1*(φ_1, φ_0) + a_2*(φ_2, φ_0) = (f, φ_0)
a_0*(φ_0, φ_1) + a_1*(φ_1, φ_1) + a_2*(φ_2, φ_1) = (f, φ_1)
a_0*(φ_0, φ_2) + a_1*(φ_1, φ_2) + a_2*(φ_2, φ_2) = (f, φ_2)
Additional inner products to be evaluated are:
(φ_0, φ_2) = (φ_2, φ_0) = ∫_0^1 x² · 1 dx = 1/3
(φ_1, φ_2) = (φ_2, φ_1) = ∫_0^1 x² · x dx = 1/4 = 0.25; (φ_2, φ_2) = ∫_0^1 x² · x² dx = 1/5 = 0.2
(f, φ_2) = ∫_0^1 x²/(1 + x²) dx = 1 − π/4
Least Square Solution: Example (Continuous)
Substituting the inner products:
a_0* + (1/2) a_1* + (1/3) a_2* = π/4
(1/2) a_0* + (1/3) a_1* + (1/4) a_2* = (ln 2)/2
(1/3) a_0* + (1/4) a_1* + (1/5) a_2* = 1 − π/4
Solving by Gauss elimination: a_0* = 1.030; a_1* = −0.3605; a_2* = −0.193
p*(x) = Σ_{j=0}^{2} a_j* φ_j(x) = 1.030 − 0.3605 x − 0.193 x²
Least Square Solution: Example (Continuous)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Estimate ‖p*(x) − f(x)‖_∞. Two options:
- Analytical (+ numerical)
- Numerical (+ visual)
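The numerical option can be sketched by comparing the maximum error of the two fits from the preceding slides on a fine grid (the grid size is an arbitrary choice):

```python
f = lambda x: 1.0 / (1.0 + x * x)
p1 = lambda x: 1.062 - 0.5535 * x                   # linear least square fit
p2 = lambda x: 1.030 - 0.3605 * x - 0.193 * x * x   # quadratic least square fit

xs = [i / 1000 for i in range(1001)]                # fine grid on [0, 1]
err1 = max(abs(p1(x) - f(x)) for x in xs)           # ~ ||p1 - f||_inf
err2 = max(abs(p2(x) - f(x)) for x in xs)           # ~ ||p2 - f||_inf
print(err1, err2)
assert err2 < err1   # the quadratic fit has the smaller maximum error
```

The largest error of both fits occurs at x = 0, where the linear fit overshoots f(0) = 1 by about 0.062 and the quadratic by about 0.030.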
Discrete Data
(n + 1) observations or data pairs [(x_0, y_0), (x_1, y_1), (x_2, y_2), …, (x_n, y_n)]
(m + 1) basis functions: φ_0, φ_1, φ_2, …, φ_m
Approximating polynomial: p(x) = Σ_{j=0}^{m} a_j φ_j(x)
Requiring p(x_i) = y_i at every data point gives:
a_0 φ_0(x_0) + a_1 φ_1(x_0) + a_2 φ_2(x_0) + … + a_m φ_m(x_0) = y_0
a_0 φ_0(x_1) + a_1 φ_1(x_1) + a_2 φ_2(x_1) + … + a_m φ_m(x_1) = y_1
⋮
a_0 φ_0(x_n) + a_1 φ_1(x_n) + a_2 φ_2(x_n) + … + a_m φ_m(x_n) = y_n
(n + 1) equations in (m + 1) unknowns:
m < n: over-determined system, least square regression
m = n: unique solution, interpolation
m > n: under-determined system
Least Square Solution: Example (Discrete Data)
Variation of Ultimate Shear Strength (y) with curing Temperature (x) for a certain rubber compound was reported (J. Quality Technology, 1971) as:
x, in °C: 138, 140, 144.5, 146, 148, 151.5, 153.5, 157
y, in psi: 770, 800, 840, 810, 735, 640, 590, 560
Fit a linear and a quadratic regression model to the data. For the linear model, the basis functions and the polynomial are:
φ_0(x) = 1; φ_1(x) = x; p*(x) = Σ_{j=0}^{1} a_j* φ_j(x)
a_0*(φ_0, φ_0) + a_1*(φ_1, φ_0) = (f, φ_0)
a_0*(φ_0, φ_1) + a_1*(φ_1, φ_1) = (f, φ_1)
Number of data points m = 8; denote the data as (x_i, y_i), i = 1, 2, …, m, with unit weights.
Least Square Solution: Example (Discrete Data)
(φ_0, φ_0) = Σ_{i=1}^{m} 1 · 1 = 8; (φ_1, φ_0) = (φ_0, φ_1) = Σ_{i=1}^{m} x_i = 1178.5
(φ_1, φ_1) = Σ_{i=1}^{m} x_i² = 173907.75
(f, φ_0) = Σ_{i=1}^{m} y_i = 5745; (f, φ_1) = Σ_{i=1}^{m} x_i y_i = 842125
Normal equations:
8 a_0* + 1178.5 a_1* = 5745
1178.5 a_0* + 173907.75 a_1* = 842125
⟹ a_0* = 2773.5; a_1* = −13.95
p*(x) = Σ_{j=0}^{1} a_j* φ_j(x) = 2773.5 − 13.95 x
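These sums and coefficients can be checked with a few lines of Python (the variable names are illustrative):

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]
m = len(x)   # 8 data points

Sx  = sum(x)                            # (phi_0, phi_1)
Sxx = sum(v * v for v in x)             # (phi_1, phi_1)
Sy  = sum(y)                            # (f, phi_0)
Sxy = sum(a * b for a, b in zip(x, y))  # (f, phi_1)

# Solve the 2x2 normal equations by Cramer's rule
det = m * Sxx - Sx * Sx
a0 = (Sy * Sxx - Sx * Sxy) / det
a1 = (m * Sxy - Sx * Sy) / det
print(round(a0, 1), round(a1, 2))  # 2773.5 -13.95
```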
Least Square Solution: Example (Discrete Data)
For the quadratic model, the basis functions and the polynomial are:
φ_0 = 1; φ_1 = x; φ_2 = x²; p*(x) = Σ_{j=0}^{2} a_j* φ_j(x)
a_0*(φ_0, φ_0) + a_1*(φ_1, φ_0) + a_2*(φ_2, φ_0) = (f, φ_0)
a_0*(φ_0, φ_1) + a_1*(φ_1, φ_1) + a_2*(φ_2, φ_1) = (f, φ_1)
a_0*(φ_0, φ_2) + a_1*(φ_1, φ_2) + a_2*(φ_2, φ_2) = (f, φ_2)
Additional inner products to be evaluated are:
(φ_0, φ_2) = (φ_2, φ_0) = Σ_{i=1}^{m} x_i² = 173907.75
(φ_1, φ_2) = (φ_2, φ_1) = Σ_{i=1}^{m} x_i³ = 25707160.375
(φ_2, φ_2) = Σ_{i=1}^{m} x_i⁴ = 3806534454.19
(f, φ_2) = Σ_{i=1}^{m} y_i x_i² = 123643297.5
Least Square Solution: Example (Discrete Data)
Substituting the inner products:
8 a_0* + 1178.5 a_1* + 173907.75 a_2* = 5745
1178.5 a_0* + 173907.75 a_1* + 25707160.375 a_2* = 842125
173907.75 a_0* + 25707160.375 a_1* + 3806534454.19 a_2* = 123643297.5
Solving by Gauss elimination: a_0* ≈ −21936; a_1* ≈ 322.1; a_2* ≈ −1.141
p*(x) = Σ_{j=0}^{2} a_j* φ_j(x) ≈ −21936 + 322.1 x − 1.141 x²
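A sketch that assembles and solves the 3 × 3 normal equations by Gauss elimination, mirroring the slide (building the matrix from power sums is an implementation choice):

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]

# Power sums S[k] = sum x_i^k and right-hand side T[k] = sum y_i x_i^k
S = [sum(v ** k for v in x) for k in range(5)]
T = [sum(b * a ** k for a, b in zip(x, y)) for k in range(3)]

# Augmented normal-equation matrix for the basis {1, x, x^2}
A = [[S[0], S[1], S[2], T[0]],
     [S[1], S[2], S[3], T[1]],
     [S[2], S[3], S[4], T[2]]]

# Gauss elimination (no pivoting needed here) with back substitution
for i in range(3):
    for r in range(i + 1, 3):
        fct = A[r][i] / A[i][i]
        A[r] = [arv - fct * aiv for arv, aiv in zip(A[r], A[i])]
a = [0.0] * 3
for i in (2, 1, 0):
    a[i] = (A[i][3] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
print([round(v, 3) for v in a])  # roughly [-21936, 322.1, -1.141]
```

Note that the system is badly scaled (entries range over nine orders of magnitude), which is one motivation for the orthogonal bases discussed later.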
Least Square Solution: Example (Discrete Data)
How do you judge how good the fit is, or how do you compare the fits of two polynomials? Look at the sum of squared errors!
Least Square Solution: Example (Discrete Data)
Denote the data by y_i = f(x_i) and the fitted values by ŷ_i = p*(x_i), and define
ȳ = (Σ_{i=0}^{m} y_i)/(m + 1); σ_t² = Σ_{i=0}^{m} (y_i − ȳ)²; σ² = Σ_{i=0}^{m} (y_i − ŷ_i)²
The coefficient of determination r² (often written R²) is given by
r² = (σ_t² − σ²)/σ_t²
Linear fit: R² = 0.73. Quadratic fit: R² = 0.88.
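R² for both fits can be verified directly from the data and the fitted polynomials of the previous slides:

```python
x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157]
y = [770, 800, 840, 810, 735, 640, 590, 560]

def r_squared(fit):
    ybar = sum(y) / len(y)
    st = sum((v - ybar) ** 2 for v in y)               # total variation
    s  = sum((v - fit(u)) ** 2 for u, v in zip(x, y))  # residual variation
    return (st - s) / st

lin  = lambda t: 2773.5 - 13.95 * t
quad = lambda t: -21936 + 322.1 * t - 1.141 * t * t
r2l = r_squared(lin)
r2q = r_squared(quad)
print(round(r2l, 2), round(r2q, 2))  # 0.73 0.88
```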
Additional Points
- You can do multiple regression using the same framework.
- Some non-linear equations can be linearized and then fitted with this machinery.
- For functions that are not easy to integrate analytically, evaluate the inner-product integrals using numerical methods for integration.
ESO 208A: Computational Methods in Engineering Orthogonal Basis, Periodic Functions
Least Square Solution: Normal Equations
Normal equations: Σ_{j=0}^{n} a_j* (φ_j(x), φ_k(x)) = (f(x), φ_k(x)); k = 0, 1, 2, …, n
If φ_0, φ_1, φ_2, …, φ_n is an orthogonal system, the system is diagonal and
a_j* = (f, φ_j)/(φ_j, φ_j), j = 0, 1, 2, …, n
Let us explore orthogonal systems of basis functions!
Orthogonal Polynomials: Tchebycheff
Consider the identity: cos(n+1)θ + cos(n−1)θ = 2 cos θ cos nθ, n ≥ 1
n = 1: cos 2θ = 2 cos²θ − 1
n = 2: cos 3θ = 2 cos θ cos 2θ − cos θ = 4 cos³θ − 3 cos θ
Define x = cos θ, θ ∈ [0, π], and T_n(x) = cos nθ = cos(n cos⁻¹ x)
Recursion formula: T_{n+1}(x) = 2x T_n(x) − T_{n−1}(x)
Examples: T_0(x) = 1; T_1(x) = x; T_2(x) = 2x² − 1
Symmetry: T_n(−x) = (−1)ⁿ T_n(x)
Leading coefficient: 2^{n−1} for n ≥ 1 and 1 for n = 0
The T_n(x) constitute an orthogonal family of polynomials on [−1, 1]!
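The recursion translates directly into code that builds the coefficient lists of T_n (stored lowest degree first; the representation is an implementation choice):

```python
def chebyshev(n):
    """Coefficient lists (lowest degree first) of T_0..T_n via
    T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x)."""
    T = [[1], [0, 1]]                        # T_0 = 1, T_1 = x
    for _ in range(2, n + 1):
        prev, last = T[-2], T[-1]
        nxt = [0] + [2 * c for c in last]    # multiply T_k by 2x
        for i, c in enumerate(prev):         # subtract T_{k-1}
            nxt[i] -= c
        T.append(nxt)
    return T

T = chebyshev(4)
print(T[2], T[3], T[4])  # [-1, 0, 2] [0, -3, 0, 4] [1, 0, -8, 0, 8]
```

The leading coefficients 2, 4, 8 confirm the 2^{n−1} pattern stated above.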
Orthogonal Polynomials: Tchebycheff
T_n(x) has n zeros in [−1, 1], called the Tchebycheff abscissae:
x_k = cos((2k+1)π/(2n)), k = 0, 1, 2, …, n−1
(follows from cos nθ = 0 for θ = ((2k+1)/n)(π/2))
T_n(x) has n+1 extrema in [−1, 1]:
x_k = cos(kπ/n); T_n(x_k) = (−1)^k, k = 0, 1, 2, …, n
(follows from the fact that cos nθ has its extrema at θ = kπ/n)
Orthogonal Polynomials: Tchebycheff
Orthogonality (continuous, with weight w(x) = 1/√(1 − x²)):
(T_m(x), T_n(x)) = ∫_0^π cos mθ cos nθ dθ = ∫_{−1}^{1} T_m(x) T_n(x)/√(1 − x²) dx
  = 0 if m ≠ n; π/2 if m = n ≠ 0; π if m = n = 0
For arbitrary f(x), a ≤ x ≤ b, map to the standard interval with
x = (a + b)/2 + ((b − a)/2) t, −1 ≤ t ≤ 1
Orthogonal Polynomials: Tchebycheff
Orthogonality (discrete): for 0 ≤ m ≤ N and 0 ≤ n ≤ N,
(T_m(x), T_n(x)) = Σ_{k=0}^{N} T_m(x_k) T_n(x_k)
  = 0 if m ≠ n; (N + 1)/2 if m = n ≠ 0; N + 1 if m = n = 0
where {x_k} are the zeros of T_{N+1}(x)
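This discrete relation can be verified numerically using T_n(x) = cos(n cos⁻¹ x); here N = 6 is an arbitrary choice:

```python
import math

N = 6
# Zeros of T_{N+1}: x_k = cos((2k+1) pi / (2(N+1))), k = 0..N
xs = [math.cos((2 * k + 1) * math.pi / (2 * (N + 1))) for k in range(N + 1)]
T = lambda n, x: math.cos(n * math.acos(x))

def dot(m, n):
    return sum(T(m, x) * T(n, x) for x in xs)

assert abs(dot(2, 5)) < 1e-12                 # m != n       -> 0
assert abs(dot(3, 3) - (N + 1) / 2) < 1e-12   # m = n != 0   -> (N+1)/2
assert abs(dot(0, 0) - (N + 1)) < 1e-12       # m = n = 0    -> N+1
```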
Orthogonal Polynomials: Legendre
Solutions of Legendre's equation (for n a non-negative integer):
d/dx[(1 − x²) dP/dx] + n(n + 1) P = 0, −1 ≤ x ≤ 1
form an orthogonal set of polynomials given by (Rodrigues' formula):
P_0(x) = 1; P_n(x) = (1/(2ⁿ n!)) dⁿ/dxⁿ (x² − 1)ⁿ
Bonnet's recursive relation for n ≥ 2: P_n(x) = ((2n − 1)/n) x P_{n−1}(x) − ((n − 1)/n) P_{n−2}(x)
Examples: P_0(x) = 1; P_1(x) = x; P_2(x) = (3x² − 1)/2; P_3(x) = (5x³ − 3x)/2
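Bonnet's recursion can be coded with exact rational arithmetic to generate the coefficient lists (lowest degree first; using `fractions` is an implementation choice):

```python
from fractions import Fraction

def legendre(n):
    """Coefficient lists (lowest degree first) of P_0..P_n from Bonnet's
    recursion n P_n = (2n - 1) x P_{n-1} - (n - 1) P_{n-2}."""
    P = [[Fraction(1)], [Fraction(0), Fraction(1)]]   # P_0 = 1, P_1 = x
    for k in range(2, n + 1):
        xp = [Fraction(0)] + P[-1]                    # x * P_{k-1}
        nxt = [Fraction(2 * k - 1, k) * c for c in xp]
        for i, c in enumerate(P[-2]):                 # minus ((k-1)/k) P_{k-2}
            nxt[i] -= Fraction(k - 1, k) * c
        P.append(nxt)
    return P

P = legendre(3)
print(P[2], P[3])  # P_2 = (3x^2 - 1)/2, P_3 = (5x^3 - 3x)/2
```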
Orthogonal Polynomials: Legendre
Symmetry: P_n(−x) = (−1)ⁿ P_n(x)
Orthogonality (continuous, with weight w(x) = 1 on [−1, 1]):
(P_m(x), P_n(x)) = 0 if m ≠ n; 2/(2n + 1) if m = n
Normalization: P_n(1) = 1 and |P_n(x)| ≤ 1 on [−1, 1]
For discrete data on equidistant points, the analogous orthogonal system is the Gram polynomials.
Orthonormal Polynomials: Gram
For Equidistant Discrete Data: for −1 ≤ x ≤ 1, a net of (n + 1) equidistant points is given by
x_k = −1 + 2k/n for k = 0, 1, 2, …, n
On this net, the orthonormal set of polynomials {G_m(x)}_{m=0}^{n} is generated by:
G_{−1}(x) = 0; G_0(x) = 1/√(n + 1)
G_{m+1}(x) = α_m x G_m(x) − (α_m/α_{m−1}) G_{m−1}(x)
α_m = (n/(m + 1)) √((4(m + 1)² − 1)/((n + 1)² − (m + 1)²)), m = 0, 1, 2, …, n−1
Example (for n = 5): G_0(x) = 1/√6 ≈ 0.4082; G_1(x) = 5x/√70 ≈ 0.5976x; G_2(x) ≈ 1.0229x² − 0.4774
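A sketch of the recursion for n = 5. The closed form of α_m written above is a reconstruction (it matches the n = 5 example), so treat this as a numerical check of orthonormality rather than a reference implementation; α_{−1} is never actually used because G_{−1} = 0:

```python
import math

n = 5                                       # net of n + 1 = 6 points
xs = [-1 + 2 * k / n for k in range(n + 1)]

def alpha(m):
    return (n / (m + 1)) * math.sqrt(
        (4 * (m + 1) ** 2 - 1) / ((n + 1) ** 2 - (m + 1) ** 2))

# Build G_{-1}, G_0, G_1, G_2 as functions via the three-term recursion
G = [lambda x: 0.0, lambda x: 1 / math.sqrt(n + 1)]
for m in range(2):
    a, g1, g0 = alpha(m), G[-1], G[-2]
    b = alpha(m - 1) if m > 0 else 1.0      # placeholder: term vanishes at m = 0
    G.append(lambda x, a=a, b=b, g1=g1, g0=g0: a * x * g1(x) - (a / b) * g0(x))

def dot(u, v):
    return sum(u(x) * v(x) for x in xs)

# Orthonormality on the net: (G_m, G_n) = delta_mn for m, n = 0, 1, 2
for i in range(1, 4):
    for j in range(1, 4):
        expect = 1.0 if i == j else 0.0
        assert abs(dot(G[i], G[j]) - expect) < 1e-9
```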
Orthonormal Polynomials: Gram
Orthogonality: (G_m(x), G_n(x)) = Σ_{k=0}^{n} G_m(x_k) G_n(x_k) = 0 if m ≠ n; 1 if m = n
When m ≪ n^{1/2}, the G_m(x) are very similar to the (normalized) Legendre polynomials
When m ≫ n^{1/2}, the G_m(x) have very large oscillations between the net points, and hence a large maximum norm in [−1, 1]
When fitting a polynomial to equidistant data, one should never choose m larger than about 2n^{1/2}
Least Square Solution: Example (Continuous)
Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd order polynomial in the Legendre basis. For the Legendre polynomials, use x = (z + 1)/2 so that x in [0, 1] corresponds to z in [−1, 1]. The function becomes f(z) = 4/(5 + 2z + z²).
The basis functions are: P_0 = 1; P_1 = z; P_2 = (3z² − 1)/2; p*(z) = Σ_{j=0}^{2} a_j* P_j(z)
(f, P_0) = ∫_{−1}^{1} 4/(5 + 2z + z²) dz = π/2 = 1.5708
(f, P_1) = ∫_{−1}^{1} 4z/(5 + 2z + z²) dz = 2 ln 2 − π/2 = −0.1845
(f, P_2) = ∫_{−1}^{1} 2(3z² − 1)/(5 + 2z + z²) dz = 12 − 6 ln 2 − 5π/2 = −0.1286×10⁻¹
Since the P_j are orthogonal:
a_0* = (π/2)/2 = 0.7854; a_1* = −0.1845/(2/3) = −0.2768; a_2* = −0.1286×10⁻¹/(2/5) = −0.3216×10⁻¹
Least Square Solution: Example (Continuous)
With a_0* = 0.7854, a_1* = −0.2768, a_2* = −0.3216×10⁻¹:
p*(z) = Σ_{j=0}^{2} a_j* P_j(z) = 0.7854 − 0.2768 z − 0.3216×10⁻¹ (3z² − 1)/2 = 0.8015 − 0.2768 z − 0.4824×10⁻¹ z²
If you now use z = 2x − 1:
p*(x) = 0.8015 − 0.2768(2x − 1) − 0.4824×10⁻¹ (2x − 1)² = 1.030 − 0.3605 x − 0.193 x²
The least square polynomial is unique! It does not depend on the choice of basis.
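Since the Legendre basis is orthogonal, each coefficient is just a_j* = (f, P_j)/(P_j, P_j); a sketch that evaluates the three integrals with Simpson quadrature (the quadrature helper is an assumption, not part of the lecture):

```python
def simpson(fn, a, b, n=2000):
    """Composite Simpson's rule (n must be even)."""
    h = (b - a) / n
    s = fn(a) + fn(b)
    s += sum((4 if k % 2 else 2) * fn(a + k * h) for k in range(1, n))
    return s * h / 3

f = lambda z: 4.0 / (5.0 + 2.0 * z + z * z)
P = [lambda z: 1.0, lambda z: z, lambda z: (3 * z * z - 1) / 2]
norm_sq = [2.0, 2.0 / 3.0, 2.0 / 5.0]       # (P_n, P_n) = 2/(2n+1)

a = [simpson(lambda z, p=p: f(z) * p(z), -1.0, 1.0) / ns
     for p, ns in zip(P, norm_sq)]
print([round(v, 4) for v in a])  # close to [0.7854, -0.2768, -0.0322]

# The fit in x (via z = 2x - 1) matches the monomial-basis result at x = 0
pstar = lambda x: sum(v * p(2 * x - 1) for v, p in zip(a, P))
assert abs(pstar(0.0) - 1.030) < 1e-3
```

No linear system has to be solved here: that is the practical payoff of an orthogonal basis.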