
Slide 1: Basic Econometrics, Chapter 7. MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation (Prof. Himayatullah, May 2004)

Slide 2: 7-1. The Three-Variable Model: Notation and Assumptions

$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$ (7.1.1)

$\beta_2$ and $\beta_3$ are partial regression coefficients. The model rests on the following assumptions:

+ Zero mean value of $u_i$: $E(u_i \mid X_{2i}, X_{3i}) = 0$ for all $i$ (7.1.2)
+ No serial correlation: $\operatorname{Cov}(u_i, u_j) = 0$ for $i \neq j$ (7.1.3)
+ Homoscedasticity: $\operatorname{Var}(u_i) = \sigma^2$ (7.1.4)
+ Zero covariance between the disturbance and the regressors: $\operatorname{Cov}(u_i, X_{2i}) = \operatorname{Cov}(u_i, X_{3i}) = 0$ (7.1.5)
+ No specification bias; the model is correctly specified (7.1.6)
+ No exact collinearity between the X variables (7.1.7): no perfect multicollinearity among the explanatory variables. If an exact linear relationship exists, the X variables are said to be linearly dependent.
+ The model is linear in the parameters.
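As a hedged illustration (not part of the original slides), the sketch below simulates data satisfying assumptions (7.1.2) through (7.1.5) and (7.1.7), then fits model (7.1.1) by OLS. All variable names and parameter values are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Regressors with no exact linear relationship (assumption 7.1.7)
X2 = rng.normal(10.0, 2.0, n)
X3 = rng.normal(5.0, 1.0, n)

# Disturbance: zero mean, homoscedastic, independent of X2 and X3
u = rng.normal(0.0, 1.5, n)

# Hypothetical true parameters for the simulation
beta1, beta2, beta3 = 2.0, 0.8, -1.2
Y = beta1 + beta2 * X2 + beta3 * X3 + u

# OLS fit of Y on a constant, X2 and X3
X = sm.add_constant(np.column_stack([X2, X3]))
fit = sm.OLS(Y, X).fit()
print(fit.params)  # estimates should be close to (2.0, 0.8, -1.2)
```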

Slide 3: 7-2. Interpretation of Multiple Regression

$E(Y_i \mid X_{2i}, X_{3i}) = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i}$ (7.2.1)

Equation (7.2.1) gives the conditional mean or expected value of Y, conditional upon the given or fixed values of $X_2$ and $X_3$.

Slide 4: 7-3. The Meaning of Partial Regression Coefficients

$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_k X_{ki} + u_i$

$\beta_k$ measures the change in the mean value of Y per unit change in $X_k$, holding the remaining explanatory variables constant. It gives the "direct" effect of a unit change in $X_k$ on $E(Y_i)$, net of the other regressors $X_j$ ($j \neq k$). How do we control for the "true" effect of a unit change in $X_k$ on Y? (Read pages 195-197; a sketch of the partialling-out idea follows.)
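One way to see the "net of $X_j$" interpretation is the partialling-out (Frisch-Waugh-Lovell) result: the multiple-regression coefficient on $X_2$ equals the slope from regressing Y on the part of $X_2$ that is orthogonal to $X_3$. A minimal sketch with hypothetical data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X3 = rng.normal(0.0, 1.0, n)
X2 = 0.5 * X3 + rng.normal(0.0, 1.0, n)  # X2 correlated with X3
Y = 1.0 + 0.8 * X2 - 1.2 * X3 + rng.normal(0.0, 1.0, n)

# Full multiple regression: the coefficient on X2 is the partial effect
full = sm.OLS(Y, sm.add_constant(np.column_stack([X2, X3]))).fit()

# Partial out X3 from X2, then regress Y on the residualized X2
x2_resid = sm.OLS(X2, sm.add_constant(X3)).fit().resid
partial = sm.OLS(Y, sm.add_constant(x2_resid)).fit()

# The two slope estimates coincide (their standard errors differ)
print(full.params[1], partial.params[1])
```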

Slide 5: 7-4. OLS and ML Estimation of the Partial Regression Coefficients

This section (pages 197-201) provides:
1. The OLS estimators in the case of the three-variable regression $Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$
2. Variances and standard errors of the OLS estimators
3. Eight properties of the OLS estimators (pp. 199-201)
4. An introduction to the ML estimators
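The textbook derives the OLS estimators and their variances in scalar form; the matrix-form sketch below (a shorthand added here, not the book's notation) computes the same quantities for any number of regressors.

```python
import numpy as np

def ols(X, y):
    """OLS estimates, variances and standard errors via the normal equations.

    X: (n, k) design matrix including a column of ones; y: (n,) response.
    """
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y            # solves the normal equations
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - k)    # unbiased estimator of sigma^2
    var_beta = sigma2_hat * np.diag(XtX_inv)
    return beta_hat, var_beta, np.sqrt(var_beta)
```

Under the classical assumptions of Section 7-1, these estimators are BLUE. The ML estimators of the coefficients coincide with OLS, while the ML estimator of $\sigma^2$ divides the residual sum of squares by n rather than n - k and is therefore biased in small samples.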

Slide 6: 7-5. The Multiple Coefficient of Determination R² and the Multiple Coefficient of Correlation R

This section provides:
1. The definition of R² in the context of multiple regression, analogous to r² in the two-variable case.
2. $R = \sqrt{R^2}$ is the coefficient of multiple correlation; it measures the degree of association between Y and all the explanatory variables jointly.
3. The variance of a partial regression coefficient:

$\operatorname{Var}(\hat\beta_k) = \dfrac{\sigma^2}{\sum x_k^2} \cdot \dfrac{1}{1 - R_k^2}$ (7.5.6)

where $\hat\beta_k$ is the partial regression coefficient of regressor $X_k$, $x_k$ denotes deviations of $X_k$ from its mean, and $R_k^2$ is the R² from the regression of $X_k$ on the remaining regressors.
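The factor $1/(1 - R_k^2)$ in (7.5.6) is what later chapters call the variance inflation factor. A hedged sketch (hypothetical data and names) that computes $R_k^2$ and this factor:

```python
import numpy as np
import statsmodels.api as sm

def aux_r2(xk, x_others):
    """R_k^2: the R^2 from regressing one regressor on the others."""
    return sm.OLS(xk, sm.add_constant(x_others)).fit().rsquared

rng = np.random.default_rng(2)
n = 300
X3 = rng.normal(0.0, 1.0, n)
X2 = 0.7 * X3 + rng.normal(0.0, 1.0, n)  # some collinearity with X3

rk2 = aux_r2(X2, X3)
print(rk2, 1.0 / (1.0 - rk2))  # R_k^2 and the variance inflation factor
```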

Slide 7: 7-6. Example 7.1: The Expectations-Augmented Phillips Curve for the US (1970-1982)

This section illustrates the ideas introduced in the chapter using regression model (7.6.1). The data set is in Table 7.1.

Slide 8: 7-7. Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias

This section examines what happens when a simple regression is run where a multiple regression is appropriate. Doing so causes specification bias, which is discussed further in Chapter 13. (A simulation sketch follows.)
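A hedged simulation of the omitted-variable problem (all names and values hypothetical): when a relevant regressor correlated with the included one is left out, the simple-regression coefficient absorbs part of its effect.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
X3 = rng.normal(0.0, 1.0, n)
X2 = 0.6 * X3 + rng.normal(0.0, 1.0, n)  # X2 correlated with the omitted X3
Y = 1.0 + 0.8 * X2 + 1.5 * X3 + rng.normal(0.0, 1.0, n)

short = sm.OLS(Y, sm.add_constant(X2)).fit()                        # omits X3
full = sm.OLS(Y, sm.add_constant(np.column_stack([X2, X3]))).fit()  # correct

print(short.params[1])  # biased: picks up part of X3's effect
print(full.params[1])   # close to the true value 0.8
```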

Slide 9: 7-8. R² and the Adjusted R²

R² is a non-decreasing function of the number of explanatory variables: adding an X variable never decreases R².

$R^2 = \mathrm{ESS}/\mathrm{TSS} = 1 - \mathrm{RSS}/\mathrm{TSS} = 1 - \sum \hat u_i^2 / \sum y_i^2$ (7.8.1)

where $\hat u_i$ are the residuals and $y_i$ are deviations of Y from its mean. This creates a misleading incentive to add more (possibly irrelevant) variables to the regression, and it motivates an adjusted R² ($\bar R^2$) that takes the degrees of freedom into account:

$\bar R^2 = 1 - \dfrac{\sum \hat u_i^2 / (n - k)}{\sum y_i^2 / (n - 1)}$, or $\bar R^2 = 1 - \hat\sigma^2 / S_Y^2$ (7.8.2)

where $S_Y^2$ is the sample variance of Y and k is the number of parameters including the intercept term.

Substituting (7.8.1) into (7.8.2) gives

$\bar R^2 = 1 - (1 - R^2)\dfrac{n - 1}{n - k}$ (7.8.4)

For k > 1, $\bar R^2 < R^2$; as the number of X variables increases, $\bar R^2$ rises by less than R², and $\bar R^2$ can be negative.
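A minimal helper (not from the slides) implementing (7.8.1) and (7.8.4):

```python
import numpy as np

def r2_and_adjusted(y, y_hat, k):
    """R^2 per (7.8.1) and adjusted R^2 per (7.8.4).

    k = number of estimated parameters, including the intercept.
    """
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)       # residual sum of squares
    tss = np.sum((y - np.mean(y)) ** 2)  # total sum of squares
    r2 = 1.0 - rss / tss
    r2_bar = 1.0 - (1.0 - r2) * (n - 1) / (n - k)
    return r2, r2_bar
```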


Slide 11: 7-8. R² and the Adjusted R² (continued)

Comparing two R² values: to compare, the sample size n and the dependent variable must be the same.

Example 7.2: Coffee Demand Function Revisited (page 210).

The "game" of maximizing the adjusted R²: choosing the model that gives the highest $\bar R^2$ may be dangerous, for in regression the objective is not to maximize $\bar R^2$ but to obtain dependable estimates of the true population regression coefficients and to draw statistical inferences about them. One should be more concerned with the logical or theoretical relevance of the explanatory variables to the dependent variable and with their statistical significance.

Slide 12: 7-9. Partial Correlation Coefficients

This section provides:
1. An explanation of simple and partial correlation coefficients
2. The interpretation of simple and partial correlation coefficients (pages 211-214)
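For three variables, the first-order partial correlation between $X_1$ and $X_2$ holding $X_3$ constant has a standard closed form in terms of the simple correlations; a minimal sketch (hypothetical names, added here as an illustration):

```python
import numpy as np

def partial_corr(x1, x2, x3):
    """First-order partial correlation r_{12.3}: association between
    x1 and x2 with the linear influence of x3 held constant."""
    r12 = np.corrcoef(x1, x2)[0, 1]
    r13 = np.corrcoef(x1, x3)[0, 1]
    r23 = np.corrcoef(x2, x3)[0, 1]
    return (r12 - r13 * r23) / np.sqrt((1 - r13**2) * (1 - r23**2))
```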

Slide 13: 7-10. Example 7.3: The Cobb-Douglas Production Function (More on Functional Form)

$Y_i = \beta_1 X_{2i}^{\beta_2} X_{3i}^{\beta_3} e^{u_i}$ (7.10.1)

Log-transforming this model gives

$\ln Y_i = \ln \beta_1 + \beta_2 \ln X_{2i} + \beta_3 \ln X_{3i} + u_i = \beta_0 + \beta_2 \ln X_{2i} + \beta_3 \ln X_{3i} + u_i$ (7.10.2)

where $\beta_0 = \ln \beta_1$, so the model becomes linear in the parameters. The data set is in Table 7.3; the results are reported on page 216.
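A hedged sketch of fitting (7.10.2) to simulated data (inputs, parameter values, and names are hypothetical, not the Table 7.3 data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 100
labor = rng.uniform(50.0, 150.0, n)    # X2: hypothetical labor input
capital = rng.uniform(20.0, 80.0, n)   # X3: hypothetical capital input
u = rng.normal(0.0, 0.05, n)
output = 1.5 * labor**0.7 * capital**0.3 * np.exp(u)  # Cobb-Douglas (7.10.1)

# After the log transform (7.10.2) the model is linear in the parameters
X = sm.add_constant(np.column_stack([np.log(labor), np.log(capital)]))
fit = sm.OLS(np.log(output), X).fit()
print(fit.params)  # approx. [ln 1.5, 0.7, 0.3]; slopes are output elasticities
```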

Slide 14: 7-11. Polynomial Regression Models

$Y_i = \beta_0 + \beta_1 X_i + \beta_2 X_i^2 + \dots + \beta_k X_i^k + u_i$ (7.11.3)

Example 7.4: Estimating the Total Cost Function. The data set is in Table 7.4; the empirical results are on page 221.

--------------------------------------------------------------

7-12. Summary and Conclusions (page 221)
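Because (7.11.3) is linear in the parameters, it is estimated by ordinary OLS with powers of X entering as separate regressors. A minimal sketch with a hypothetical cubic total cost shape (not the Table 7.4 data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
output = np.linspace(1.0, 10.0, 50)  # X: hypothetical output levels

# Hypothetical cubic total cost plus noise
cost = 300 + 240 * output - 45 * output**2 + 3 * output**3
cost = cost + rng.normal(0.0, 20.0, 50)

# Powers of X are separate regressors; the model stays linear in the betas
X = sm.add_constant(np.column_stack([output, output**2, output**3]))
fit = sm.OLS(cost, X).fit()
print(fit.params)  # estimates of beta0 through beta3
```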

