Discrete Least Squares Approximation
Sec:8.1 Discrete Least Squares Approximation
Find a relationship between x and y. From this graph, it appears that the actual relationship between x and y is linear. The likely reason that no line precisely fits the data is errors in the data. Two candidate models are the quadratic P₂(x) = a₀ + a₁x + a₂x² and the straight line P₁(x) = a₀ + a₁x.
Fit the data in the table with the discrete least squares polynomial of degree at most 2, P₂(x) = a₀ + a₁x + a₂x². Each data point (xᵢ, yᵢ) gives one equation yᵢ = a₀ + a₁xᵢ + a₂xᵢ²:

1.0000 = a₀
1.2840 = a₀ + 0.25 a₁ + 0.0625 a₂
1.6487 = a₀ + 0.50 a₁ + 0.25 a₂
2.1170 = a₀ + 0.75 a₁ + 0.5625 a₂
2.7183 = a₀ + 1.00 a₁ + 1.00 a₂

In matrix form this is y = X·a, with y of size 5×1, X of size 5×3, and a of size 3×1. Multiplying both sides by Xᵀ gives the normal equations

Xᵀy = XᵀX·a,   i.e.   B·a = z with B = XᵀX,

a linear system of 3 equations in 3 unknowns (the first right-hand entry is Σyᵢ = 8.7680). Solving a = B⁻¹z gives the coefficients, so P₂(x) ≈ 1.0051 + 0.8647x + 0.8432x².
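As a check, the 3×3 normal-equation system can be set up and solved in a few lines of Python. This is a sketch, assuming the tabulated data are y = eˣ sampled at x = 0, 0.25, 0.5, 0.75, 1 (the sum Σyᵢ = 8.7680 shown above is consistent with that assumption):

```python
# Discrete least squares fit of P2(x) = a0 + a1*x + a2*x^2 via the normal equations.
# Data assumed for this example: y = e^x sampled at x = 0, 0.25, 0.5, 0.75, 1.
x = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [1.0000, 1.2840, 1.6487, 2.1170, 2.7183]

# Build B = X^T X and z = X^T y for the 5x3 design matrix X = [1, x, x^2]:
# B[r][c] = sum of x_i^(r+c),  z[r] = sum of y_i * x_i^r.
B = [[sum(xi ** (r + c) for xi in x) for c in range(3)] for r in range(3)]
z = [sum(yi * xi ** r for xi, yi in zip(x, y)) for r in range(3)]

def solve3(B, z):
    """Solve the 3x3 system B a = z by Gaussian elimination with partial pivoting."""
    A = [row[:] + [zi] for row, zi in zip(B, z)]   # augmented matrix
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(A[r][k]))  # pivot row
        A[k], A[p] = A[p], A[k]
        for r in range(k + 1, 3):
            f = A[r][k] / A[k][k]
            for c in range(k, 4):
                A[r][c] -= f * A[k][c]
    a = [0.0] * 3
    for k in (2, 1, 0):                             # back substitution
        a[k] = (A[k][3] - sum(A[k][c] * a[c] for c in range(k + 1, 3))) / A[k][k]
    return a

a = solve3(B, z)
print(a)   # approximately [1.0051, 0.8647, 0.8432]
```

Solving the same system with MATLAB's `B\Z` would give the same coefficients; the hand-rolled elimination is shown only to keep the sketch self-contained.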
Fit the data in the table with the discrete least squares polynomial of degree at most 2, P₂(x) = a₀ + a₁x + a₂x². The error of the fit is E = Σᵢ (yᵢ − P₂(xᵢ))², summed over the data points.
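Once the coefficients are known, the error E can be evaluated directly. A sketch in Python, assuming the data and fitted coefficients (≈ 1.0051, 0.8647, 0.8432) from this example:

```python
# Total squared error E = sum (y_i - P2(x_i))^2 of the degree-2 fit.
x = [0.0, 0.25, 0.5, 0.75, 1.0]
y = [1.0000, 1.2840, 1.6487, 2.1170, 2.7183]
a = [1.0051, 0.8647, 0.8432]     # fitted coefficients (assumed from this example)

def p2(t):
    return a[0] + a[1] * t + a[2] * t * t

E = sum((yi - p2(xi)) ** 2 for xi, yi in zip(x, y))
print(E)   # a small residual error, on the order of 1e-4
```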
EXAMPLE: Use least-squares regression to fit a straight line P₁(x) = a₀ + a₁x to the data x = [ ], y = [ ]. Plot the data and the regression line. The residuals are eᵢ = yᵢ − (a₀ + a₁xᵢ).

clear
x = [ ]';            % data abscissas (column vector)
Y = [ ]';            % data ordinates
n = length(x);
X = [ones(n,1) x];   % design matrix, so that Y = X*a
B = X'*X;  Z = X'*Y; % normal equations  B*a = Z
a = B\Z              % solve the normal equations
a = X\Y              % equivalent: MATLAB's least-squares backslash
xx = [0:0.1:19]';
yy = a(1) + a(2)*xx;
plot(x,Y,'xr',xx,yy,'-b'); grid on
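For readers without MATLAB, the same straight-line fit can be sketched in plain Python. The data vectors below are illustrative placeholders, since the slide leaves them blank:

```python
# Least-squares straight line y ~ a0 + a1*x via the two normal equations.
# Sample data are made-up placeholders; the slide leaves the vectors blank.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

m = len(x)
Sx = sum(x)
Sy = sum(y)
Sxx = sum(xi * xi for xi in x)
Sxy = sum(xi * yi for xi, yi in zip(x, y))

# Normal equations:
#   m*a0  + Sx*a1  = Sy
#   Sx*a0 + Sxx*a1 = Sxy
det = m * Sxx - Sx * Sx
a0 = (Sy * Sxx - Sx * Sxy) / det
a1 = (m * Sxy - Sx * Sy) / det

residuals = [yi - (a0 + a1 * xi) for xi, yi in zip(x, y)]
print(a0, a1)   # approximately 0.05 and 1.99 for these placeholder data
```

A useful sanity check: for a least-squares line the residuals always sum to zero, which follows directly from the first normal equation.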
Derivation for the formula:

EXAMPLE: Use least-squares regression to fit a straight line P₁(x) = a₀ + a₁x to the data x = [ ], y = [ ].

One strategy for fitting a "best" line through the data would be to minimize the sum of the absolute values of the residual errors for all the available data: find values of a₀ and a₁ that achieve

min over a₀, a₁ of  Σᵢ₌₁ᵐ |eᵢ| = Σᵢ₌₁ᵐ |yᵢ − a₀ − a₁xᵢ|,   where eᵢ = yᵢ − (a₀ + a₁xᵢ):

e₁ = y₁ − a₀ − a₁x₁
e₂ = y₂ − a₀ − a₁x₂
⋮
eₘ = yₘ − a₀ − a₁xₘ

To minimize a function of two variables, we need to set its partial derivatives to zero and simultaneously solve the resulting equations. The problem is that the absolute-value function is not differentiable at zero, so we might not be able to find solutions to this pair of equations.
EXAMPLE (Least Squares): Use least-squares regression to fit a straight line P₁(x) = a₀ + a₁x to the data x = [ ], y = [ ]. The least squares approach to this problem determines the best approximating line by minimizing the sum of the squares of the residuals eᵢ = yᵢ − (a₀ + a₁xᵢ):

min over a₀, a₁ of  E(a₀, a₁) = Σᵢ₌₁ᵐ eᵢ² = Σᵢ₌₁ᵐ (yᵢ − a₀ − a₁xᵢ)².

The least squares method is the most convenient procedure for determining best linear approximations. For a minimum to occur, we need both ∂E/∂a₀ = 0 and ∂E/∂a₁ = 0.
E(a₀, a₁) = Σᵢ₌₁ᵐ (yᵢ − a₀ − a₁xᵢ)². For a minimum to occur, we need both ∂E/∂a₀ = 0 and ∂E/∂a₁ = 0.

∂E/∂a₀ = Σᵢ₌₁ᵐ −2(yᵢ − a₀ − a₁xᵢ) = 0
⇒ −Σyᵢ + Σa₀ + (Σxᵢ)·a₁ = 0
⇒ m·a₀ + (Σxᵢ)·a₁ = Σyᵢ

∂E/∂a₁ = Σᵢ₌₁ᵐ −2xᵢ(yᵢ − a₀ − a₁xᵢ) = 0
⇒ −Σxᵢyᵢ + (Σxᵢ)·a₀ + (Σxᵢ²)·a₁ = 0
⇒ (Σxᵢ)·a₀ + (Σxᵢ²)·a₁ = Σxᵢyᵢ

These are called the normal equations.
Matrix form: stacking the data (x₁,y₁), …, (xₘ,yₘ) gives y = X·a for the line P₁(x) = a₀ + a₁x, where

y = [y₁; ⋮; yₘ] (m×1),   X = [1 x₁; ⋮ ⋮; 1 xₘ] (m×2),   a = [a₀; a₁] (2×1).

Multiplying both sides by Xᵀ gives Xᵀy = XᵀX·a:

[y₁ + ⋯ + yₘ; x₁y₁ + ⋯ + xₘyₘ] = [1 + ⋯ + 1, x₁ + ⋯ + xₘ; x₁ + ⋯ + xₘ, x₁² + ⋯ + xₘ²]·[a₀; a₁]

that is,

[Σyᵢ; Σxᵢyᵢ] = [m, Σxᵢ; Σxᵢ, Σxᵢ²]·[a₀; a₁]

These are exactly the normal equations

m·a₀ + (Σxᵢ)·a₁ = Σyᵢ
(Σxᵢ)·a₀ + (Σxᵢ²)·a₁ = Σxᵢyᵢ
Derivation for the formula: (polynomial of degree 2)
Least Squares: Use least-squares regression to fit a polynomial of degree 2, P₂(x) = a₀ + a₁x + a₂x², to the data (x₁,y₁), …, (xₘ,yₘ). Minimize

E(a₀, a₁, a₂) = Σᵢ₌₁ᵐ (yᵢ − a₀ − a₁xᵢ − a₂xᵢ²)².

For a minimum to occur, we need ∂E/∂a₀ = 0, ∂E/∂a₁ = 0, and ∂E/∂a₂ = 0.

∂E/∂a₀ = Σᵢ₌₁ᵐ −2(yᵢ − a₀ − a₁xᵢ − a₂xᵢ²) = 0
⇒ m·a₀ + (Σxᵢ)·a₁ + (Σxᵢ²)·a₂ = Σyᵢ
∂E/∂a₁ = Σᵢ₌₁ᵐ −2xᵢ(yᵢ − a₀ − a₁xᵢ − a₂xᵢ²) = 0
⇒ (Σxᵢ)·a₀ + (Σxᵢ²)·a₁ + (Σxᵢ³)·a₂ = Σxᵢyᵢ
∂E/∂a₂ = Σᵢ₌₁ᵐ −2xᵢ²(yᵢ − a₀ − a₁xᵢ − a₂xᵢ²) = 0
⇒ (Σxᵢ²)·a₀ + (Σxᵢ³)·a₁ + (Σxᵢ⁴)·a₂ = Σxᵢ²yᵢ
Collecting the three conditions gives the normal equations:

m·a₀ + (Σxᵢ)·a₁ + (Σxᵢ²)·a₂ = Σyᵢ
(Σxᵢ)·a₀ + (Σxᵢ²)·a₁ + (Σxᵢ³)·a₂ = Σxᵢyᵢ
(Σxᵢ²)·a₀ + (Σxᵢ³)·a₁ + (Σxᵢ⁴)·a₂ = Σxᵢ²yᵢ

Normal equations in matrix form:

[ m     Σxᵢ    Σxᵢ² ] [a₀]   [ Σyᵢ    ]
[ Σxᵢ   Σxᵢ²   Σxᵢ³ ] [a₁] = [ Σxᵢyᵢ  ]
[ Σxᵢ²  Σxᵢ³   Σxᵢ⁴ ] [a₂]   [ Σxᵢ²yᵢ ]
Matrix form: for P₂(x) = a₀ + a₁x + a₂x², stacking the data gives y = X·a, where

y = [y₁; ⋮; yₘ] (m×1),   X = [1 x₁ x₁²; ⋮ ⋮ ⋮; 1 xₘ xₘ²] (m×3),   a = [a₀; a₁; a₂] (3×1).

Multiplying both sides by Xᵀ gives the normal equations Xᵀy = XᵀX·a:

[y₁ + ⋯ + yₘ; x₁y₁ + ⋯ + xₘyₘ; x₁²y₁ + ⋯ + xₘ²yₘ]
  = [1 + ⋯ + 1, x₁ + ⋯ + xₘ, x₁² + ⋯ + xₘ²;
     x₁ + ⋯ + xₘ, x₁² + ⋯ + xₘ², x₁³ + ⋯ + xₘ³;
     x₁² + ⋯ + xₘ², x₁³ + ⋯ + xₘ³, x₁⁴ + ⋯ + xₘ⁴]·[a₀; a₁; a₂]

that is,

[ Σyᵢ    ]   [ m     Σxᵢ    Σxᵢ² ] [a₀]
[ Σxᵢyᵢ  ] = [ Σxᵢ   Σxᵢ²   Σxᵢ³ ] [a₁]
[ Σxᵢ²yᵢ ]   [ Σxᵢ²  Σxᵢ³   Σxᵢ⁴ ] [a₂]
Use least-squares regression to fit a polynomial of degree n, Pₙ(x) = a₀ + a₁x + ⋯ + aₙxⁿ, to the data (x₁,y₁), …, (xₘ,yₘ). Minimize

E(a₀, …, aₙ) = Σᵢ₌₁ᵐ (yᵢ − a₀ − a₁xᵢ − ⋯ − aₙxᵢⁿ)².

For a minimum to occur, we need ∂E/∂aₖ = 0 for k = 0, 1, …, n. This gives the normal equations in matrix form:

[ m      Σxᵢ     ⋯  Σxᵢⁿ   ] [a₀]   [ Σyᵢ    ]
[ Σxᵢ    Σxᵢ²    ⋯  Σxᵢⁿ⁺¹ ] [a₁] = [ Σxᵢyᵢ  ]
[ ⋮      ⋮       ⋱  ⋮      ] [⋮ ]   [ ⋮      ]
[ Σxᵢⁿ   Σxᵢⁿ⁺¹  ⋯  Σxᵢ²ⁿ  ] [aₙ]   [ Σxᵢⁿyᵢ ]

Here m is the number of data points and n is the degree of the polynomial; the coefficient matrix is (n+1)×(n+1), and a and the right-hand side are (n+1)×1.
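The general (n+1)×(n+1) system can be assembled directly from these sums. A Python sketch (the data below are illustrative, not from the slides; as a sanity check, data generated from an exact cubic should be recovered by a degree-3 fit):

```python
def polyfit_normal(x, y, n):
    """Fit a degree-n least squares polynomial by solving the normal equations."""
    # B[r][c] = sum of x_i^(r+c),  z[r] = sum of y_i * x_i^r.
    B = [[sum(xi ** (r + c) for xi in x) for c in range(n + 1)] for r in range(n + 1)]
    z = [sum(yi * xi ** r for xi, yi in zip(x, y)) for r in range(n + 1)]
    # Gaussian elimination with partial pivoting on the augmented matrix.
    A = [row[:] + [zi] for row, zi in zip(B, z)]
    for k in range(n + 1):
        p = max(range(k, n + 1), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        for r in range(k + 1, n + 1):
            f = A[r][k] / A[k][k]
            for c in range(k, n + 2):
                A[r][c] -= f * A[k][c]
    a = [0.0] * (n + 1)
    for k in range(n, -1, -1):                      # back substitution
        a[k] = (A[k][n + 1] - sum(A[k][c] * a[c] for c in range(k + 1, n + 1))) / A[k][k]
    return a

# Sanity check: data sampled from an exact cubic is recovered by a degree-3 fit.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [2 - x + 0.5 * x**2 + 0.25 * x**3 for x in xs]
coeffs = polyfit_normal(xs, ys, 3)
print(coeffs)   # approximately [2.0, -1.0, 0.5, 0.25]
```

Note that solving the normal equations directly can be ill-conditioned for large n; for production use, a QR-based solver (e.g. MATLAB's backslash on X itself) is preferred.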