Approximation of Functions
Approximation problems are divided into five parts:
- Least squares approximation of discrete functions (regression)
- Least squares approximation of continuous functions using various basis polynomials
- Orthogonal basis functions
- Approximation of periodic functions
- Interpolation
Norm and Seminorm

A real-valued function $\|f\|$ is called a norm on a vector space if it is defined everywhere on the space and satisfies the following conditions:
- $\|f\| > 0$ for $f \ne 0$; $\|f\| = 0$ iff $f = 0$ $\forall x \in [a, b]$
- $\|\alpha f\| = |\alpha| \, \|f\|$ for a real $\alpha$
- $\|f + g\| \le \|f\| + \|g\|$

Lp norm: $\|f\|_p = \left( \int_a^b |f(x)|^p \, dx \right)^{1/p}$

p = 2, Euclidean or L2 norm: $\|f\|_2 = \left( \int_a^b f(x)^2 \, dx \right)^{1/2}$

p → ∞: $\|f\|_\infty = \max_{x \in [a,b]} |f(x)|$

Weighted Euclidean norm: $\|f\|_{2,w} = \left( \int_a^b f(x)^2 \, w(x) \, dx \right)^{1/2}$

Euclidean seminorm over a grid $G = \{x_0, \ldots, x_m\}$: $\|f\|_{2,G} = \left( \sum_{i=0}^{m} f(x_i)^2 \right)^{1/2}$
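These norms are easy to evaluate numerically. The following sketch is not part of the original slides; it uses NumPy and SciPy and assumes f(x) = 1/(1 + x²) on [0, 1] (the function used in the worked examples later) with weight w(x) = 1:

```python
import numpy as np
from scipy.integrate import quad

a, b = 0.0, 1.0
f = lambda x: 1.0 / (1.0 + x**2)

# Lp norm: ||f||_p = ( integral_a^b |f(x)|^p dx )^(1/p)
def lp_norm(f, p, a=a, b=b):
    val, _ = quad(lambda x: abs(f(x))**p, a, b)
    return val**(1.0 / p)

# L-infinity norm: max |f(x)| over [a, b], estimated on a dense grid
xs = np.linspace(a, b, 10001)
linf = np.max(np.abs(f(xs)))

# Euclidean seminorm over a discrete grid x_0 .. x_m
grid = np.linspace(a, b, 9)
seminorm = np.sqrt(np.sum(f(grid)**2))

print(lp_norm(f, 2), linf, seminorm)
```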
Inner Product

The inner product of two real-valued continuous functions f(x) and g(x) is denoted by $\langle f, g \rangle$ and is defined as:

$\langle f, g \rangle = \int_a^b f(x) \, g(x) \, w(x) \, dx$ (continuous case)

$\langle f, g \rangle = \sum_{i=0}^{m} f(x_i) \, g(x_i) \, w_i$ (discrete case)

Properties of the inner product:
- Commutativity: $\langle f, g \rangle = \langle g, f \rangle$
- Linearity: $\langle c_1 f + c_2 g, h \rangle = c_1 \langle f, h \rangle + c_2 \langle g, h \rangle$
- Positivity: $\langle f, f \rangle \ge 0$

With $w(x) = 1$: $\langle f, f \rangle = \|f\|_2^2 = \int_a^b f(x)^2 \, dx$
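A minimal sketch of both forms of the inner product, assuming SciPy's quad for the integral and a plain weighted sum for the discrete case:

```python
import numpy as np
from scipy.integrate import quad

# Continuous case: <f, g> = integral_a^b f(x) g(x) w(x) dx
def inner_c(f, g, a, b, w=lambda x: 1.0):
    val, _ = quad(lambda x: f(x) * g(x) * w(x), a, b)
    return val

# Discrete case: <f, g> = sum_i f(x_i) g(x_i) w_i
def inner_d(f, g, x, w=None):
    x = np.asarray(x, dtype=float)
    w = np.ones_like(x) if w is None else np.asarray(w, dtype=float)
    return np.sum(f(x) * g(x) * w)

f = lambda x: 1.0 / (1.0 + x**2)
g = lambda x: x
print(inner_c(f, g, 0, 1))                    # = ln(2)/2, see the worked example below
print(inner_d(f, g, np.linspace(0, 1, 9)))
```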
Basis Functions: Linear Independence

A sequence of (n + 1) functions $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$ is linearly independent if

$\sum_{j=0}^{n} c_j \varphi_j = 0 \implies c_j = 0 \;\; \forall j$

Such a sequence of functions builds (or spans) an (n + 1)-dimensional linear subspace.

From the definition of norm: $\left\| \sum_{j=0}^{n} c_j \varphi_j \right\| = 0$ is true only if $c_j = 0 \;\; \forall j$.

Linearity of the inner product: $\left\langle \sum_{j=0}^{n} c_j \varphi_j, \varphi_k \right\rangle = \sum_{j=0}^{n} c_j \langle \varphi_j(x), \varphi_k(x) \rangle$
Orthogonal Functions

Two real-valued continuous functions f(x) and g(x) are said to be orthogonal if $\langle f, g \rangle = 0$.

A finite or infinite sequence of functions $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n, \cdots$ makes an orthogonal system if $\langle \varphi_i, \varphi_j \rangle = 0$ for all i ≠ j and $\|\varphi_i\| \ne 0$ for all i. In addition, if $\|\varphi_i\| = 1$, the sequence of functions is called an orthonormal system.

Pythagorean theorem for functions: $\langle f, g \rangle = 0 \implies \|f+g\|^2 = \|f\|^2 + \|g\|^2$

$\|f+g\|^2 = \langle f+g, f+g \rangle = \langle f, f \rangle + \langle f, g \rangle + \langle g, f \rangle + \langle g, g \rangle = \|f\|^2 + \|g\|^2$

Generalized for an orthogonal system: $\left\| \sum_{j=0}^{n} c_j \varphi_j \right\|^2 = \sum_{j=0}^{n} c_j^2 \|\varphi_j\|^2$
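A quick numerical check of orthogonality and the Pythagorean identity, assuming the inner product above with w = 1 on [−1, 1]; the Legendre polynomials P1 and P2 are used here only as a convenient example of an orthogonal system:

```python
from scipy.integrate import quad

inner = lambda f, g: quad(lambda x: f(x) * g(x), -1, 1)[0]
norm2 = lambda f: inner(f, f)                 # squared L2 norm

p1 = lambda x: x                              # Legendre P1
p2 = lambda x: 0.5 * (3 * x**2 - 1)           # Legendre P2

print(inner(p1, p2))                          # ~0: orthogonal on [-1, 1]
s = lambda x: p1(x) + p2(x)
print(norm2(s), norm2(p1) + norm2(p2))        # equal: ||f+g||^2 = ||f||^2 + ||g||^2
```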
Least Square Problem

Let f(x) be a continuous real-valued function on (a, b) that is to be approximated by p(x), a linear combination of a system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$, as shown below:

$p(x) = \sum_{j=0}^{n} c_j \varphi_j(x)$

Determine the coefficients $c_j = c_j^*$ such that a weighted Euclidean norm or seminorm of the error p(x) − f(x) becomes as small as possible:

Continuous case: $\|p(x) - f(x)\|^2 = \int_a^b \left( p(x) - f(x) \right)^2 dx$ is minimum when $c_j = c_j^*$

Discrete case: $\|p(x) - f(x)\|^2 = \sum_{i=1}^{m} \left( p(x_i) - f(x_i) \right)^2$ is minimum when $c_j = c_j^*$

Least squares solution: $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$
Schematic of Least Square Solution

[Schematic: f(x) lies outside the subspace spanned by the system of (n + 1) linearly independent functions $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$; a general element of that subspace is $p(x) = \sum_{j=0}^{n} c_j \varphi_j$, and the least squares solution $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j$ is the orthogonal projection of f(x) onto the subspace.]

Solution: $\langle f(x) - p^*(x), \varphi_k \rangle = 0$ for k = 0, 1, 2, … n
Least Square Solution: Proof

When $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$ are linearly independent, the least squares problem has a unique solution

$p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$

where p*(x) − f(x) is orthogonal to all φj's, j = 0, 1, 2, … n. We need to prove:
- $\|p^*(x) - f(x)\|^2$ is minimum when p*(x) − f(x) is orthogonal to all φj's, j = 0, 1, 2, … n
- existence and uniqueness of the least squares solution
Least Square Solution: Proof

Let $c_0, c_1, c_2, \cdots, c_n$ be another sequence of coefficients with $c_j \ne c_j^*$ for at least one j. Then

$\sum_{j=0}^{n} c_j \varphi_j(x) - f(x) = \sum_{j=0}^{n} (c_j - c_j^*) \varphi_j(x) + \left( p^*(x) - f(x) \right)$

If p*(x) − f(x) is orthogonal to all φj's, it is also orthogonal to their linear combination $\sum_{j=0}^{n} (c_j - c_j^*) \varphi_j(x)$. According to the Pythagorean theorem, we have:

$\left\| \sum_{j=0}^{n} c_j \varphi_j(x) - f(x) \right\|^2 = \left\| \sum_{j=0}^{n} (c_j - c_j^*) \varphi_j(x) \right\|^2 + \left\| p^*(x) - f(x) \right\|^2 > \left\| p^*(x) - f(x) \right\|^2$

The inequality is strict because the φj's are linearly independent and at least one $c_j \ne c_j^*$, so the first term on the right is strictly positive. Therefore, if p*(x) − f(x) is orthogonal to all φj's, then p*(x) is the solution of the least squares problem.
Least Square Solution: Normal Equations

If $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$ are linearly independent, the solution to the least squares problem satisfies

$\langle p^*(x) - f(x), \varphi_k(x) \rangle = 0, \quad k = 0, 1, 2, \cdots, n$

where $p^*(x) = \sum_{j=0}^{n} c_j^* \varphi_j(x)$. Therefore,

$\left\langle \sum_{j=0}^{n} c_j^* \varphi_j(x) - f(x), \varphi_k(x) \right\rangle = 0, \quad k = 0, 1, 2, \cdots, n$

$\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = \langle f(x), \varphi_k(x) \rangle, \quad k = 0, 1, 2, \cdots, n$

These are the normal equations!
Least Square Solution: Normal Equations

$\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = \langle f(x), \varphi_k(x) \rangle; \quad k = 0, 1, 2, \cdots, n$

Written out row by row:

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle + \cdots + c_n^* \langle \varphi_n, \varphi_0 \rangle = \langle f, \varphi_0 \rangle \quad (k = 0)$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle + \cdots + c_n^* \langle \varphi_n, \varphi_1 \rangle = \langle f, \varphi_1 \rangle \quad (k = 1)$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle + \cdots + c_n^* \langle \varphi_n, \varphi_2 \rangle = \langle f, \varphi_2 \rangle \quad (k = 2)$
⋮
$c_0^* \langle \varphi_0, \varphi_n \rangle + c_1^* \langle \varphi_1, \varphi_n \rangle + c_2^* \langle \varphi_2, \varphi_n \rangle + \cdots + c_n^* \langle \varphi_n, \varphi_n \rangle = \langle f, \varphi_n \rangle \quad (k = n)$

Moreover, if $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$ is an orthogonal system, the coefficient matrix is diagonal and

$c_j^* = \frac{\langle f, \varphi_j \rangle}{\langle \varphi_j, \varphi_j \rangle}, \quad j = 0, 1, 2, \cdots, n$
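The normal equations translate directly into code. A sketch, assuming the continuous inner product with w(x) = 1 and SciPy's quad; the helper name least_squares_fit is ours, not from the slides:

```python
import numpy as np
from scipy.integrate import quad

def least_squares_fit(f, basis, a, b):
    """Solve the normal equations sum_j c_j <phi_j, phi_k> = <f, phi_k>."""
    n = len(basis)
    inner = lambda u, v: quad(lambda x: u(x) * v(x), a, b)[0]
    # Gram matrix G[k, j] = <phi_j, phi_k> and right-hand side b[k] = <f, phi_k>
    G = np.array([[inner(basis[j], basis[k]) for j in range(n)] for k in range(n)])
    rhs = np.array([inner(f, basis[k]) for k in range(n)])
    return np.linalg.solve(G, rhs)

# For an orthogonal system G is diagonal, so c_j = <f, phi_j> / <phi_j, phi_j>.

f = lambda x: 1.0 / (1.0 + x**2)
basis = [lambda x: 1.0, lambda x: x]
print(least_squares_fit(f, basis, 0.0, 1.0))   # ~[1.062, -0.5535], see the example below
```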
Least Square Solution: Existence and Uniqueness

Consider again the normal equations $\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = \langle f(x), \varphi_k(x) \rangle$, k = 0, 1, 2, ⋯ n. A solution to the normal equations exists and is unique unless the corresponding homogeneous system

$\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = 0, \quad k = 0, 1, 2, \cdots, n$

has a nontrivial solution for $c_0^*, c_1^*, \cdots, c_n^*$, i.e., $c_j^* \ne 0$ for at least one j.
Least Square Solution: Existence and Uniqueness

Suppose the homogeneous system had such a nontrivial solution. This would lead to

$\left\| \sum_{j=0}^{n} c_j^* \varphi_j \right\|^2 = \left\langle \sum_{j=0}^{n} c_j^* \varphi_j, \sum_{k=0}^{n} c_k^* \varphi_k \right\rangle = \sum_{k=0}^{n} \left( \sum_{j=0}^{n} c_j^* \langle \varphi_j, \varphi_k \rangle \right) c_k^* = \sum_{k=0}^{n} 0 \cdot c_k^* = 0$

so that $\sum_{j=0}^{n} c_j^* \varphi_j = 0$ with $c_j^* \ne 0$ for at least one j, which contradicts the assumption that $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_n$ are linearly independent.
Least Square Solution: Example (Continuous)

Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a straight line. The basis functions are:

$\varphi_0(x) = \varphi_0 = 1; \quad \varphi_1(x) = \varphi_1 = x; \quad p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x)$

Normal equations: $\sum_{j=0}^{n} c_j^* \langle \varphi_j(x), \varphi_k(x) \rangle = \langle f(x), \varphi_k(x) \rangle$, k = 0, 1, 2, ⋯ n:

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$

Using the continuous-case inner product $\langle f, g \rangle = \int_a^b f(x) g(x) w(x) \, dx$ with w(x) = 1:

$\langle \varphi_0, \varphi_0 \rangle = \int_0^1 dx = 1$
$\langle \varphi_0, \varphi_1 \rangle = \langle \varphi_1, \varphi_0 \rangle = \int_0^1 x \cdot 1 \, dx = \frac{1}{2} = 0.5$
$\langle \varphi_1, \varphi_1 \rangle = \int_0^1 x \cdot x \, dx = \frac{1}{3}$
$\langle f, \varphi_0 \rangle = \int_0^1 \frac{dx}{1 + x^2} = \frac{\pi}{4}$
$\langle f, \varphi_1 \rangle = \int_0^1 \frac{x \, dx}{1 + x^2} = \frac{\ln 2}{2}$
Least Square Solution: Example (Continuous)

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$

$\begin{pmatrix} 1 & 1/2 \\ 1/2 & 1/3 \end{pmatrix} \begin{pmatrix} c_0^* \\ c_1^* \end{pmatrix} = \begin{pmatrix} \pi/4 \\ \ln 2 / 2 \end{pmatrix}$

By Cramer's rule:

$c_0^* = \frac{(\pi/4)(1/3) - (0.5)(\ln 2 / 2)}{(1)(1/3) - (0.5)(0.5)} = 1.062; \quad c_1^* = \frac{(1)(\ln 2 / 2) - (0.5)(\pi/4)}{(1)(1/3) - (0.5)(0.5)} = -0.5535$

$p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x) = 1.062 - 0.5535 x$
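The 2×2 system can be cross-checked in a few lines with NumPy:

```python
import numpy as np

G = np.array([[1.0, 0.5],
              [0.5, 1.0 / 3.0]])          # Gram matrix of {1, x} on [0, 1]
rhs = np.array([np.pi / 4.0, np.log(2.0) / 2.0])
print(np.linalg.solve(G, rhs))            # ~[ 1.0622 -0.5535]
```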
Least Square Solution: Example (Continuous)

Approximate the function f(x) = 1/(1 + x²) for x in [0, 1] using a 2nd-order polynomial. The basis functions are:

$\varphi_0 = 1; \quad \varphi_1 = x; \quad \varphi_2 = x^2; \quad p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j$

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$

Additional inner products to be evaluated are:

$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \int_0^1 x^2 \cdot 1 \, dx = \frac{1}{3}$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \int_0^1 x^2 \cdot x \, dx = \frac{1}{4} = 0.25$
$\langle \varphi_2, \varphi_2 \rangle = \int_0^1 x^2 \cdot x^2 \, dx = \frac{1}{5} = 0.2$
$\langle f, \varphi_2 \rangle = \int_0^1 \frac{x^2 \, dx}{1 + x^2} = 1 - \frac{\pi}{4}$
Least Square Solution: Example (Continuous)

$\begin{pmatrix} 1 & 1/2 & 1/3 \\ 1/2 & 1/3 & 1/4 \\ 1/3 & 1/4 & 1/5 \end{pmatrix} \begin{pmatrix} c_0^* \\ c_1^* \\ c_2^* \end{pmatrix} = \begin{pmatrix} \pi/4 \\ \ln 2 / 2 \\ 1 - \pi/4 \end{pmatrix}$

Solving by Gauss elimination: $c_0^* = 1.030; \; c_1^* = -0.3605; \; c_2^* = -0.193$

$p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j(x) = 1.03 - 0.3605 x - 0.193 x^2$
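The same cross-check works for the 3×3 system; note that the Gram matrix of {1, x, x²} on [0, 1] is the 3×3 Hilbert matrix:

```python
import numpy as np

G = np.array([[1.0,       1.0 / 2.0, 1.0 / 3.0],
              [1.0 / 2.0, 1.0 / 3.0, 1.0 / 4.0],
              [1.0 / 3.0, 1.0 / 4.0, 1.0 / 5.0]])
rhs = np.array([np.pi / 4.0, np.log(2.0) / 2.0, 1.0 - np.pi / 4.0])
print(np.linalg.solve(G, rhs))            # ~[ 1.030 -0.3605 -0.193]
```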
Least Square Solution: Example (Continuous)

How do you judge how good the fit is, or how do you compare the fit of two polynomials? Estimate $\|p^*(x) - f(x)\|_\infty$. Two options:
- Analytical (supplemented by numerical evaluation)
- Numerical (supplemented by visual comparison of the curves)
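A sketch of the numerical option: evaluate both fits on a dense grid and take the largest deviation as an estimate of the maximum error:

```python
import numpy as np

f = lambda x: 1.0 / (1.0 + x**2)
p1 = lambda x: 1.062 - 0.5535 * x                     # linear fit
p2 = lambda x: 1.030 - 0.3605 * x - 0.193 * x**2      # quadratic fit

xs = np.linspace(0.0, 1.0, 10001)
for p in (p1, p2):
    print(np.max(np.abs(p(xs) - f(xs))))              # estimate of ||p* - f||_inf
```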
Discrete Data

(n + 1) observations or data pairs [(x₀, y₀), (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ)]
(m + 1) basis functions: $\varphi_0, \varphi_1, \varphi_2, \cdots, \varphi_m$
Approximating polynomial: $p(x) = \sum_{j=0}^{m} c_j \varphi_j(x)$

Requiring p(xᵢ) = yᵢ at every data point gives:

$c_0 \varphi_0(x_0) + c_1 \varphi_1(x_0) + c_2 \varphi_2(x_0) + \cdots + c_m \varphi_m(x_0) = y_0$
$c_0 \varphi_0(x_1) + c_1 \varphi_1(x_1) + c_2 \varphi_2(x_1) + \cdots + c_m \varphi_m(x_1) = y_1$
⋮
$c_0 \varphi_0(x_n) + c_1 \varphi_1(x_n) + c_2 \varphi_2(x_n) + \cdots + c_m \varphi_m(x_n) = y_n$

This is (n + 1) equations in (m + 1) unknowns:
- m < n: over-determined system, least squares regression
- m = n: unique solution, interpolation
- m > n: under-determined system
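In matrix terms the system above is A c = y with A[i, j] = φⱼ(xᵢ), and the least squares solution solves the normal equations A^T A c = A^T y. A sketch using NumPy's lstsq; the helper names and the demo data are made up for illustration:

```python
import numpy as np

def design_matrix(x, basis):
    """A[i, j] = phi_j(x_i); the normal equations are A.T @ A @ c = A.T @ y."""
    x = np.asarray(x, dtype=float)
    return np.column_stack([phi(x) for phi in basis])

def fit(x, y, basis):
    A = design_matrix(x, basis)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)   # least squares when m < n
    return c

# Made-up sample data, straight-line basis {1, x}
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.4, 2.9, 4.1])
print(fit(x, y, [lambda x: np.ones_like(x), lambda x: x]))
```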
Least Square Solution: Example (Discrete Data)

Variation of ultimate shear strength (y) with curing temperature (x) for a certain rubber compound was reported (J. Quality Technology, 1971) as:

x, in °C:   138   140   144.5   146   148   151.5   153.5   157
y, in psi:  770   800   840     810   735   640     590     560

Fit a linear and a quadratic regression model to the data. For the linear model, the basis functions and the polynomial are:

$\varphi_0(x) = \varphi_0 = 1; \quad \varphi_1(x) = \varphi_1 = x; \quad p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x)$

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$

Given vectors: x = [138, 140, 144.5, 146, 148, 151.5, 153.5, 157] and y = f(x) = [770, 800, 840, 810, 735, 640, 590, 560]. Number of data points, m = 8. Denote the elements of x and y as xᵢ and yᵢ, respectively.
Least Square Solution: Example (Discrete Data)

With wᵢ = 1, the inner products are sums over the data:

$\langle \varphi_0, \varphi_0 \rangle = \sum_{i=1}^{m} 1 \cdot 1 = 8$
$\langle \varphi_1, \varphi_0 \rangle = \langle \varphi_0, \varphi_1 \rangle = \sum_{i=1}^{m} 1 \cdot x_i = \sum_{i=1}^{m} x_i = 1178.5$
$\langle \varphi_1, \varphi_1 \rangle = \sum_{i=1}^{m} x_i \cdot x_i = \sum_{i=1}^{m} x_i^2 = 173907.75$
$\langle f, \varphi_0 \rangle = \sum_{i=1}^{m} y_i \cdot 1 = 5745$
$\langle f, \varphi_1 \rangle = \sum_{i=1}^{m} y_i x_i = 842125$

$\begin{pmatrix} 8 & 1178.5 \\ 1178.5 & 173907.75 \end{pmatrix} \begin{pmatrix} c_0^* \\ c_1^* \end{pmatrix} = \begin{pmatrix} 5745 \\ 842125 \end{pmatrix} \implies c_0^* = 2773.5; \; c_1^* = -13.95$

$p^*(x) = \sum_{j=0}^{1} c_j^* \varphi_j(x) = 2773.5 - 13.95 x$
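The sums and the resulting coefficients can be reproduced directly:

```python
import numpy as np

x = np.array([138, 140, 144.5, 146, 148, 151.5, 153.5, 157])
y = np.array([770, 800, 840, 810, 735, 640, 590, 560])

G = np.array([[len(x),  x.sum()],
              [x.sum(), (x**2).sum()]])   # normal-equation matrix
rhs = np.array([y.sum(), (x * y).sum()])
print(np.linalg.solve(G, rhs))            # ~[ 2773.5  -13.95]
```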
Least Square Solution: Example (Discrete Data)

For the quadratic model, the basis functions and the polynomial are:

$\varphi_0 = 1; \quad \varphi_1 = x; \quad \varphi_2 = x^2; \quad p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j$

$c_0^* \langle \varphi_0, \varphi_0 \rangle + c_1^* \langle \varphi_1, \varphi_0 \rangle + c_2^* \langle \varphi_2, \varphi_0 \rangle = \langle f, \varphi_0 \rangle$
$c_0^* \langle \varphi_0, \varphi_1 \rangle + c_1^* \langle \varphi_1, \varphi_1 \rangle + c_2^* \langle \varphi_2, \varphi_1 \rangle = \langle f, \varphi_1 \rangle$
$c_0^* \langle \varphi_0, \varphi_2 \rangle + c_1^* \langle \varphi_1, \varphi_2 \rangle + c_2^* \langle \varphi_2, \varphi_2 \rangle = \langle f, \varphi_2 \rangle$

Additional inner products to be evaluated are:

$\langle \varphi_0, \varphi_2 \rangle = \langle \varphi_2, \varphi_0 \rangle = \sum_{i=1}^{m} 1 \cdot x_i^2 = 173907.75$
$\langle \varphi_1, \varphi_2 \rangle = \langle \varphi_2, \varphi_1 \rangle = \sum_{i=1}^{m} x_i \cdot x_i^2 = \sum_{i=1}^{m} x_i^3 = 25707160.375$
$\langle \varphi_2, \varphi_2 \rangle = \sum_{i=1}^{m} x_i^2 \cdot x_i^2 = \sum_{i=1}^{m} x_i^4 \approx 3806534454.19$
$\langle f, \varphi_2 \rangle = \sum_{i=1}^{m} y_i x_i^2 = 123643297.5$
Least Square Solution: Example (Discrete Data)

$\begin{pmatrix} 8 & 1178.5 & 173907.75 \\ 1178.5 & 173907.75 & 25707160.375 \\ 173907.75 & 25707160.375 & 3806534454.19 \end{pmatrix} \begin{pmatrix} c_0^* \\ c_1^* \\ c_2^* \end{pmatrix} = \begin{pmatrix} 5745 \\ 842125 \\ 123643297.5 \end{pmatrix}$

Solving by Gauss elimination: $c_0^* = -21935.4; \; c_1^* = 322.10; \; c_2^* = -1.1407$

$p^*(x) = \sum_{j=0}^{2} c_j^* \varphi_j(x) = -21935.4 + 322.10 x - 1.1407 x^2$
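NumPy's polyfit gives an independent check; note that it returns coefficients with the highest power first:

```python
import numpy as np

x = np.array([138, 140, 144.5, 146, 148, 151.5, 153.5, 157])
y = np.array([770, 800, 840, 810, 735, 640, 590, 560])

# Returns [c2, c1, c0] for c2*x^2 + c1*x + c0
print(np.polyfit(x, y, 2))    # ~[-1.1407, 322.10, -21935.4]
```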
Least Square Solution: Example (Discrete Data)

How do you judge how good the fit is, or how do you compare the fit of two polynomials? Compare the sums of squared errors!
Least Square Solution: Example (Discrete Data)

Denote the data and fitted values by $y_i = f(x_i)$ and $\hat{y}_i = p^*(x_i)$, and define

$\bar{y} = \frac{\sum_{i=0}^{m} y_i}{m + 1}, \quad S_t^2 = \sum_{i=0}^{m} (y_i - \bar{y})^2, \quad S^2 = \sum_{i=0}^{m} (y_i - \hat{y}_i)^2$

The coefficient of determination r² (or R²) is given by:

$r^2 = \frac{S_t^2 - S^2}{S_t^2}$

For the linear fit, R² = 0.73; for the quadratic fit, R² = 0.88.
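A sketch computing R² for both fitted models from the definition above:

```python
import numpy as np

x = np.array([138, 140, 144.5, 146, 148, 151.5, 153.5, 157])
y = np.array([770, 800, 840, 810, 735, 640, 590, 560])

def r_squared(y, y_hat):
    st2 = np.sum((y - y.mean())**2)       # total sum of squares, S_t^2
    s2 = np.sum((y - y_hat)**2)           # residual sum of squares, S^2
    return (st2 - s2) / st2

print(r_squared(y, 2773.5 - 13.95 * x))                        # ~0.73
print(r_squared(y, -21935.4 + 322.10 * x - 1.1407 * x**2))     # ~0.88
```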
Additional Points

- You can do multiple regression using the same framework.
- Some non-linear equations can be linearized and then fitted with linear least squares.
- Evaluate the integrals (e.g., the inner products) using numerical integration methods for functions that are not easy to integrate by analytical means!