Presentation on theme: "Adam Q. Colley EMIS8381 March 31, 2012.  One Dependent Variable  One or more (n) Independent Variables  (Independent to the Dependent Variable)  More."— Presentation transcript:

1 Adam Q. Colley EMIS8381 March 31, 2012

2  One Dependent Variable
 One or more (n) Independent Variables (each independent of the Dependent Variable)
 More data points than the number of variables
 Find the one equation with the least amount of error

3  Find coefficients for the Independent Variables
 Find the constant (the coefficient for x_0 = 1)
 Final equation: y = Σ a_i·x_i, for i = 0, 1, …, n
 n+1 equations are created by multiplying each data point by x_i and summing
 Solve the n+1 equations for the n+1 unknown values

4
      y   x1   x2 (= x1²)
     79    3    9
    366    7   49
    194    5   25
   1225   13  169
    600    9   81

5
          times x0                 times x1                   times x2
      y   x0  x1   x2        y    x0   x1    x2         y     x0    x1     x2
     79    1   3    9      237     3    9    27       711      9    27     81
    366    1   7   49     2562     7   49   343     17934     49   343   2401
    194    1   5   25      970     5   25   125      4850     25   125    625
   1225    1  13  169    15925    13  169  2197    207025    169  2197  28561
    600    1   9   81     5400     9   81   729     48600     81   729   6561
 Σ 2464    5  37  333    25094    37  333  3421    279120    333  3421  38229

Equations:
  2464   =   5·a0 +   37·a1 +   333·a2
  25094  =  37·a0 +  333·a1 +  3421·a2
  279120 = 333·a0 + 3421·a1 + 38229·a2

Solution: a0 = 8.777, a1 = 1.979, a2 = 7.048
Equation: y = 1.979·x1 + 7.048·x2 + 8.777
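The normal equations on this slide can be checked numerically. A minimal sketch, assuming NumPy is available (the slides themselves work by hand):

```python
# Solve the slide's three normal equations for a0, a1, a2 with NumPy.
import numpy as np

# Coefficient matrix and right-hand side, copied from the summed tables:
#   2464   =   5*a0 +   37*a1 +   333*a2
#   25094  =  37*a0 +  333*a1 +  3421*a2
#   279120 = 333*a0 + 3421*a1 + 38229*a2
A = np.array([[5.0,    37.0,   333.0],
              [37.0,  333.0,  3421.0],
              [333.0, 3421.0, 38229.0]])
b = np.array([2464.0, 25094.0, 279120.0])

a = np.linalg.solve(A, b)
print(np.round(a, 3))  # matches the slide: a0 = 8.777, a1 = 1.979, a2 = 7.048
```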

6  Raw Data:
   y ∈ R^m
   X ∈ R^(m×n)
 Calculated Data (A·a = b ⇒ a = A⁻¹·b):
   C = [1_m | X] ∈ R^(m×(n+1))   (a column of ones prepended to X)
   D = [C | y] ∈ R^(m×(n+2))
   T = Cᵀ·D ∈ R^((n+1)×(n+2))
   b = T_{:,n+2} ∈ R^(n+1)
   A = T_{:,1:n+1} ∈ R^((n+1)×(n+1))
   a = A⁻¹·b ∈ R^(n+1)
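The matrix recipe above can be sketched directly in NumPy using the data from slide 4 (NumPy is an assumption; the column slicing mirrors T_{:,1:n+1} and T_{:,n+2}):

```python
# Slide 6's matrix formulation: a single C^T product yields both A and b.
import numpy as np

y = np.array([79.0, 366.0, 194.0, 1225.0, 600.0])
X = np.array([[3.0, 9.0], [7.0, 49.0], [5.0, 25.0], [13.0, 169.0], [9.0, 81.0]])
m, n = X.shape

C = np.hstack([np.ones((m, 1)), X])    # C = [1_m | X],  m x (n+1)
D = np.hstack([C, y.reshape(-1, 1)])   # D = [C | y],    m x (n+2)
T = C.T @ D                            # T = C^T . D,    (n+1) x (n+2)
A = T[:, :n + 1]                       # first n+1 columns: A = C^T C
b = T[:, n + 1]                        # last column:       b = C^T y
a = np.linalg.solve(A, b)              # solve A.a = b (avoids an explicit inverse)
print(np.round(a, 3))                  # same coefficients as the worked example
```

Solving A·a = b directly is numerically preferable to forming A⁻¹, though it gives the same result as the slide's a = A⁻¹·b.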

7

8  Least Squares Method:
   Data values are known
   Finds a new linear function based on the known values
   Minimizes the error of the new function against the known values
 What if:
   We needed to find the function
   The function is non-linear

9  Example: y = a·f(x, b) + c
 f(x, b) = e^(bx)
 Least Squares will find values for a and c

10  Given LS(b), the least-squares error obtained when using b in f(x, b):
 LS(b) ≥ 0
 LS(b) is usually quasi-convex or better (example exception: sin(bx))
 LS(b) has no constraints
 Objective: minimize LS(b)

11  Use a search method on LS(b):

12  b_a = 3.5, b_b = 4.5, l = 0.1, n = 6

  k   b_a    b_b    b_λ    b_μ    LS(b_a)  LS(b_b)  LS(b_λ)  LS(b_μ)
  1   3.5    4.5    3.88   4.12   214.64   197.95    63.82    64.44
  2   3.88   4.5    4.12   4.27    63.82   197.95    64.44    98.76
  3   3.88   4.27   4.04   4.12    63.82    98.76    56.58    64.44
  4   3.88   4.12   3.96   4.04    63.82    64.44    56.29    56.58
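The interior points in the table (3.88 and 4.12 on [3.5, 4.5]) match the 0.382/0.618 golden-section ratios, so the search appears to be a golden-section search. A minimal sketch of that method follows; since the (x, y) data behind LS(b) is not shown in the deck, a stand-in quadratic with its minimum placed at b = 3.96 is used as the objective:

```python
# Golden-section search on an interval [a, b] with a length tolerance,
# shrinking the bracket around the minimum each iteration.
import math

def golden_section_min(f, a, b, tol):
    r = (math.sqrt(5) - 1) / 2           # golden ratio conjugate, ~0.618
    lam = a + (1 - r) * (b - a)          # interior point b_lambda
    mu = a + r * (b - a)                 # interior point b_mu
    while (b - a) > tol:
        if f(lam) < f(mu):               # minimum lies in [a, mu]
            b, mu = mu, lam
            lam = a + (1 - r) * (b - a)
        else:                            # minimum lies in [lam, b]
            a, lam = lam, mu
            mu = a + r * (b - a)
    return (a + b) / 2

# Stand-in objective with its minimum at b = 3.96 (the slide's answer):
best = golden_section_min(lambda t: (t - 3.96) ** 2, 3.5, 4.5, 0.1)
print(best)  # within the tolerance of 3.96
```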

13  b = 3.96, LS(b) = 56.29, R² = 0.9997
 y = 5.1837·e^(3.96x) − 3.0567
 Excel suggests: y = 3.8907·e^(4.2607x) + 0
   Covariance = 118.68, R² = 0.9972

14  Steps:
   Modify the X matrix using the search value for b
   Use the same LS function
   Use the coefficients to find the Ŷ values
   Subtract Y from Ŷ, square, and sum
   Use the summation to decide the next search step
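The steps above can be sketched as a single scoring function: for a trial b, rebuild the design matrix with an e^(bx) column, fit a and c by linear least squares, and return the summed squared error. The small (x, y) data set here is made up for illustration, generated from y = 2e^(1.5x) + 1 so that b = 1.5 is the best trial value; the deck's actual data is not shown:

```python
# LS(b) for the model y = a*e^(b*x) + c: linear fit of a and c at a fixed b.
import numpy as np

def ls_error(b, x, y):
    # Modify the X matrix using the search value for b: one column is e^(b*x).
    C = np.column_stack([np.exp(b * x), np.ones_like(x)])  # columns for a and c
    coef, *_ = np.linalg.lstsq(C, y, rcond=None)           # linear LS for [a, c]
    y_hat = C @ coef                                       # the Y-hat values
    return np.sum((y - y_hat) ** 2)                        # sum of (Y - Y-hat)^2

# Hypothetical data from y = 2*e^(1.5x) + 1:
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 2.0 * np.exp(1.5 * x) + 1.0

print(ls_error(1.5, x, y) < ls_error(2.0, x, y))  # True: the true b scores lower
```

A search method such as the one on slide 12 would then call ls_error at each trial b to decide the next step.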

15

16

