The Proof of unbiased estimator of β₂ (Ref. to Gujarati (2003), pp. 100-101)

1  The Proof of unbiased estimator of β₂ (Ref. to Gujarati (2003), pp. 100-101)

The least squares formula (estimator) of the slope in the simple regression case is

  b₂ = Σxᵢyᵢ / Σxᵢ² = ΣkᵢYᵢ,   where kᵢ = xᵢ / Σxᵢ², xᵢ = Xᵢ − X̄, yᵢ = Yᵢ − Ȳ

(since Σkᵢyᵢ = Σkᵢ(Yᵢ − Ȳ) = ΣkᵢYᵢ − ȲΣkᵢ = ΣkᵢYᵢ, using Σkᵢ = 0).

Substitute the PRF  Yᵢ = β₁ + β₂Xᵢ + uᵢ  into the b₂ formula:

  b₂ = Σkᵢ(β₁ + β₂Xᵢ + uᵢ) = β₁Σkᵢ + β₂ΣkᵢXᵢ + Σkᵢuᵢ = β₂ + Σkᵢuᵢ

since Σkᵢ = 0 and ΣkᵢXᵢ = 1.

Take the expectation on both sides:

  E(b₂) = β₂ + ΣkᵢE(uᵢ) = β₂   (unbiased estimator)

By assumptions: E(uᵢ) = 0, E(uᵢuⱼ) = 0 for i ≠ j, Var(uᵢ) = σ².
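
The identities Σkᵢ = 0 and ΣkᵢXᵢ = 1, and the unbiasedness of b₂, can be checked numerically. The following is a minimal Monte Carlo sketch added for illustration (it is not part of the original slides); the sample size n = 30, the true values β₁ = 2, β₂ = 0.5, σ = 1 and the fixed X grid are assumptions chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative setup (not from the slides): fixed regressor and true parameters
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0
X = np.linspace(0.0, 10.0, n)

x = X - X.mean()                  # deviation form: x_i = X_i - X-bar
k = x / np.sum(x**2)              # OLS weights: k_i = x_i / sum(x_i^2)

# The two identities used in the proof
print(np.isclose(k.sum(), 0.0))           # sum k_i = 0
print(np.isclose(np.sum(k * X), 1.0))     # sum k_i X_i = 1

# Monte Carlo: b2 = sum(k_i Y_i) should average to beta2 (unbiasedness)
b2_draws = []
for _ in range(20000):
    u = rng.normal(0.0, sigma, n)         # E(u_i) = 0, Var(u_i) = sigma^2
    Y = beta1 + beta2 * X + u
    b2_draws.append(np.sum(k * Y))
print(np.mean(b2_draws))                  # close to beta2 = 0.5
```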

2  The Proof of variance of β̂₂ (Ref. to Gujarati (2003), pp. 101-102)

  Var(b₂) = E[b₂ − E(b₂)]² = E[b₂ − β₂]² = E[Σkᵢuᵢ]²
          = E[k₁²u₁² + k₂²u₂² + k₃²u₃² + … + 2k₁k₂u₁u₂ + 2k₁k₃u₁u₃ + …]
          = k₁²E(u₁²) + k₂²E(u₂²) + k₃²E(u₃²) + … + 2k₁k₂E(u₁u₂) + 2k₁k₃E(u₁u₃) + …
          = k₁²σ² + k₂²σ² + k₃²σ² + … + 0 + 0 + 0 + …
          = σ²Σkᵢ² = σ²Σ(xᵢ / Σxᵢ²)² = σ²Σxᵢ² / (Σxᵢ²)² = σ² / Σxᵢ²

By assumptions: E(uᵢ) = 0, E(uᵢuⱼ) = 0 for i ≠ j, Var(uᵢ) = σ².
(Likewise, E(b₁) = β₁, i.e., b₁ is also an unbiased estimator.)
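
As a rough numerical check of the variance formula (again an illustrative sketch with assumed values, not material from the slides), the sampling variance of b₂ across simulated samples can be compared with σ² / Σxᵢ²:

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
x = X - X.mean()
k = x / np.sum(x**2)

b2_draws = np.empty(20000)
for r in range(b2_draws.size):
    Y = beta1 + beta2 * X + rng.normal(0.0, sigma, n)
    b2_draws[r] = np.sum(k * Y)

print(b2_draws.var())                # simulated Var(b2)
print(sigma**2 / np.sum(x**2))       # theoretical sigma^2 / sum(x_i^2)
```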

3  The Proof of covariance of β̂₁ and β̂₂: cov(β̂₁, β̂₂) (Ref. to Gujarati (2003), p. 102)

By definition:

  Cov(b₁, b₂) = E{[b₁ − E(b₁)][b₂ − E(b₂)]}
              = E{[(Ȳ − b₂X̄) − (Ȳ − β₂X̄)][b₂ − β₂]}
              = E{[−X̄(b₂ − β₂)][b₂ − β₂]}
              = −X̄ E(b₂ − β₂)²
              = −X̄ [σ² / Σxᵢ²]
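
A simulation can also illustrate this covariance formula. The sketch below (using the same assumed illustrative values as before, none of which come from the slides) estimates Cov(b₁, b₂) across repeated samples and compares it with −X̄σ² / Σxᵢ²:

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
x = X - X.mean()

b1_draws, b2_draws = np.empty(20000), np.empty(20000)
for r in range(20000):
    Y = beta1 + beta2 * X + rng.normal(0.0, sigma, n)
    b2 = np.sum(x * (Y - Y.mean())) / np.sum(x**2)   # slope estimate
    b1 = Y.mean() - b2 * X.mean()                    # intercept: b1 = Y-bar - b2 * X-bar
    b1_draws[r], b2_draws[r] = b1, b2

print(np.cov(b1_draws, b2_draws)[0, 1])      # simulated Cov(b1, b2)
print(-X.mean() * sigma**2 / np.sum(x**2))   # theoretical -X-bar * sigma^2 / sum(x_i^2)
```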

4  The Proof of minimum variance property of OLS (Ref. to Gujarati (2003), pp. 104-105)

The OLS estimator of β₂ is:  b₂ = Σxᵢyᵢ / Σxᵢ² = ΣkᵢYᵢ

Now suppose another linear estimator of β₂ is  b₂* = ΣwᵢYᵢ, and assume wᵢ ≠ kᵢ.

Since  b₂* = ΣwᵢYᵢ = Σwᵢ(β₁ + β₂Xᵢ + uᵢ) = β₁Σwᵢ + β₂ΣwᵢXᵢ + Σwᵢuᵢ,

take the expectation of b₂*:

  E(b₂*) = β₁Σwᵢ + β₂ΣwᵢXᵢ + ΣwᵢE(uᵢ) = β₁Σwᵢ + β₂ΣwᵢXᵢ    since E(uᵢ) = 0

For b₂* to be unbiased, i.e., E(b₂*) = β₂, there must be  Σwᵢ = 0  and  ΣwᵢXᵢ = 1.
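
To make the unbiasedness conditions concrete, here is a hypothetical construction (purely illustrative, not from the slides): perturb the OLS weights kᵢ by a component orthogonal to both the constant and X, which preserves Σwᵢ = 0 and ΣwᵢXᵢ = 1 while giving wᵢ ≠ kᵢ, and then confirm by simulation that b₂* = ΣwᵢYᵢ is still unbiased.

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
x = X - X.mean()
k = x / np.sum(x**2)                         # OLS weights

# Alternative weights w != k that still satisfy sum(w) = 0 and sum(w * X) = 1:
# add to k a perturbation d orthogonal to both the constant and X.
Z = np.column_stack([np.ones(n), X])
z = rng.normal(size=n)
d = z - Z @ np.linalg.lstsq(Z, z, rcond=None)[0]   # residual of z on (1, X)
w = k + 0.05 * d

print(np.isclose(w.sum(), 0.0), np.isclose(np.sum(w * X), 1.0))

# b2* = sum(w_i Y_i) is still unbiased for beta2
b2_star = [np.sum(w * (beta1 + beta2 * X + rng.normal(0.0, sigma, n)))
           for _ in range(20000)]
print(np.mean(b2_star))                      # close to beta2 = 0.5
```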

5  If b₂* is an unbiased estimator, then

  b₂* = ΣwᵢYᵢ = Σwᵢ(β₁ + β₂Xᵢ + uᵢ) = β₂ + Σwᵢuᵢ,  therefore  (b₂* − β₂) = Σwᵢuᵢ

And the variance of b₂* is:

  Var(b₂*) = E[b₂* − E(b₂*)]² = E[b₂* − β₂]² = E(Σwᵢuᵢ)² = Σwᵢ²E(uᵢ²) = σ²Σwᵢ²
           = σ²Σ[(wᵢ − kᵢ) + kᵢ]²
           = σ²Σ(wᵢ − kᵢ)² + σ²Σkᵢ² + 2σ²Σ(wᵢ − kᵢ)kᵢ
           = σ²Σ(wᵢ − kᵢ)² + σ²Σkᵢ²
           = σ²Σ(wᵢ − kᵢ)² + σ² / Σxᵢ²
           = σ²Σ(wᵢ − kᵢ)² + Var(b₂)

The cross term vanishes because Σ(wᵢ − kᵢ)kᵢ = Σwᵢkᵢ − Σkᵢ² = 1/Σxᵢ² − 1/Σxᵢ² = 0, using kᵢ = xᵢ/Σxᵢ² together with Σwᵢ = 0 and ΣwᵢXᵢ = 1.

Therefore, only when wᵢ = kᵢ does Var(b₂*) equal Var(b₂); otherwise Var(b₂*) > Var(b₂). Hence the OLS estimator b₂ has the minimum variance among linear unbiased estimators; if it did not, OLS would not be the best (BLUE).
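
The variance decomposition can be verified with the same hypothetical alternative weights as above (all numeric values remain illustrative assumptions): the simulated Var(b₂*) should be close to Var(b₂) + σ²Σ(wᵢ − kᵢ)².

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
x = X - X.mean()
k = x / np.sum(x**2)

# Alternative unbiased weights w = k + d, with d orthogonal to the constant and X
Z = np.column_stack([np.ones(n), X])
z = rng.normal(size=n)
d = z - Z @ np.linalg.lstsq(Z, z, rcond=None)[0]
w = k + 0.05 * d

b2_star = np.empty(20000)
for r in range(b2_star.size):
    Y = beta1 + beta2 * X + rng.normal(0.0, sigma, n)
    b2_star[r] = np.sum(w * Y)

var_b2 = sigma**2 / np.sum(x**2)                  # Var(b2) for OLS
print(b2_star.var())                              # simulated Var(b2*)
print(var_b2 + sigma**2 * np.sum((w - k)**2))     # Var(b2) + sigma^2 * sum((w_i - k_i)^2)
```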

6  The Proof of unbiased estimator of σ² (Ref. to Gujarati (2003), pp. 102-103)

The error variance is Var(uᵢ) = E(uᵢ²) = σ²; the OLS residuals may be written ûᵢ or eᵢ, and the claim is that σ̂² = Σeᵢ² / (n − 2) is an unbiased estimator of σ².

Since  Yᵢ = β₁ + β₂Xᵢ + uᵢ  and  Ȳ = β₁ + β₂X̄ + ū
  =>  yᵢ = β₂xᵢ + (uᵢ − ū)        (deviation form)

  eᵢ = Yᵢ − b₁ − b₂Xᵢ  and  0 = Ȳ − b₁ − b₂X̄   =>  eᵢ = yᵢ − b₂xᵢ

so  eᵢ = β₂xᵢ + (uᵢ − ū) − b₂xᵢ = −(b₂ − β₂)xᵢ + (uᵢ − ū)

Take squares and sum on both sides:

  Σeᵢ² = (b₂ − β₂)²Σxᵢ² + Σ(uᵢ − ū)² − 2(b₂ − β₂)Σxᵢ(uᵢ − ū)

Take expectations on both sides:

  E(Σeᵢ²) = E[(b₂ − β₂)²]Σxᵢ² + E[Σ(uᵢ − ū)²] − 2E[(b₂ − β₂)Σxᵢ(uᵢ − ū)]
                  I                   II                     III
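
Both the residual identity eᵢ = yᵢ − b₂xᵢ and the decomposition of Σeᵢ² hold exactly in any single sample, so they can be checked directly (the data below are simulated from assumed illustrative values, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(5)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
u = rng.normal(0.0, sigma, n)
Y = beta1 + beta2 * X + u

x, y = X - X.mean(), Y - Y.mean()            # deviation form
b2 = np.sum(x * y) / np.sum(x**2)
b1 = Y.mean() - b2 * X.mean()
e = Y - b1 - b2 * X                          # OLS residuals

# e_i = y_i - b2 * x_i (residuals in deviation form)
print(np.allclose(e, y - b2 * x))

# Exact decomposition of sum(e_i^2) for this one sample
lhs = np.sum(e**2)
rhs = ((b2 - beta2)**2 * np.sum(x**2)
       + np.sum((u - u.mean())**2)
       - 2 * (b2 - beta2) * np.sum(x * (u - u.mean())))
print(np.isclose(lhs, rhs))
```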

7  Utilize the OLS assumptions E(uᵢ) = 0, E(uᵢ²) = σ² and E(uᵢuⱼ) = 0 for i ≠ j, and get

  I = σ²,   II = (n − 1)σ²,   III = −2σ²

Substituting these three terms I, II and III into the equation gives

  E(Σeᵢ²) = σ² + (n − 1)σ² − 2σ² = (n − 2)σ²

And if we define  σ̂² = Σeᵢ² / (n − 2),  its expected value is

  E(σ̂²) = E(Σeᵢ²) / (n − 2) = (n − 2)σ² / (n − 2) = σ²

so σ̂² is an unbiased estimator of σ².
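
A final Monte Carlo sketch (same assumed illustrative values as in the earlier examples) checks that σ̂² = Σeᵢ² / (n − 2) averages to σ² across repeated samples:

```python
import numpy as np

rng = np.random.default_rng(6)
n, beta1, beta2, sigma = 30, 2.0, 0.5, 1.0   # assumed illustrative values
X = np.linspace(0.0, 10.0, n)
x = X - X.mean()

sigma2_hat = np.empty(20000)
for r in range(sigma2_hat.size):
    Y = beta1 + beta2 * X + rng.normal(0.0, sigma, n)
    b2 = np.sum(x * (Y - Y.mean())) / np.sum(x**2)
    b1 = Y.mean() - b2 * X.mean()
    e = Y - b1 - b2 * X
    sigma2_hat[r] = np.sum(e**2) / (n - 2)   # sigma-hat^2 = sum(e_i^2) / (n - 2)

print(sigma2_hat.mean())                     # close to sigma^2 = 1.0
```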

