Basic Econometrics, Chapter 7
MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation
Prof. Himayatullah, May 2004

The Three-Variable Model: Notation and Assumptions

$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$   (7.1.1)

$\beta_2$ and $\beta_3$ are partial regression coefficients. The model carries the following assumptions:
+ Zero mean value of $u_i$: $E(u_i \mid X_{2i}, X_{3i}) = 0$ for all $i$   (7.1.2)
+ No serial correlation: $\operatorname{Cov}(u_i, u_j) = 0$ for $i \neq j$   (7.1.3)
+ Homoscedasticity: $\operatorname{Var}(u_i) = \sigma^2$   (7.1.4)
+ Zero covariance between the disturbance and the regressors: $\operatorname{Cov}(u_i, X_{2i}) = \operatorname{Cov}(u_i, X_{3i}) = 0$   (7.1.5)
+ No specification bias, i.e. the model is correctly specified   (7.1.6)
+ No exact collinearity between the X variables   (7.1.7); if an exact linear relationship exists among them, the X variables are said to be linearly dependent
+ The model is linear in the parameters
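A minimal simulation sketch of a data-generating process satisfying these assumptions may help fix ideas. Python/NumPy and all parameter values here are illustrative assumptions, not part of the chapter; later slides reuse these variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Regressors: correlated with each other, but not exactly collinear (7.1.7)
X2 = rng.uniform(0, 10, size=n)
X3 = 0.5 * X2 + rng.uniform(0, 10, size=n)

# Disturbances: zero mean, homoscedastic, serially uncorrelated (7.1.2-7.1.4)
u = rng.normal(0, 2.0, size=n)

# Hypothetical true parameters (illustrative only)
beta1, beta2, beta3 = 1.0, 0.5, -0.3
Y = beta1 + beta2 * X2 + beta3 * X3 + u   # the population model (7.1.1)
```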

Interpretation of the Multiple Regression Equation

$E(Y_i \mid X_{2i}, X_{3i}) = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i}$   (7.2.1)

Equation (7.2.1) gives the conditional mean or expected value of Y, conditional upon the given or fixed values of $X_2$ and $X_3$.

The Meaning of Partial Regression Coefficients

$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_s X_{si} + u_i$

$\beta_k$ measures the change in the mean value of Y per unit change in $X_k$, holding the remaining explanatory variables constant. It gives the "direct" effect of a unit change in $X_k$ on $E(Y_i)$, net of the $X_j$ ($j \neq k$). How can we control for the other regressors so as to isolate the "true" effect of a unit change in $X_k$ on Y? (Read the relevant pages in the text; one standard construction is sketched below.)
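One way to see the "net of the other regressors" interpretation is the residual-regression (Frisch-Waugh-Lovell) construction. Continuing the simulated data above (a sketch, not the chapter's own derivation):

```python
# Full multiple regression of Y on a constant, X2 and X3
X = np.column_stack([np.ones(n), X2, X3])
b = np.linalg.lstsq(X, Y, rcond=None)[0]          # b[1] estimates beta2

# Net out X3: residualize X2 and Y on [1, X3], then regress residual on residual
Z = np.column_stack([np.ones(n), X3])
rX2 = X2 - Z @ np.linalg.lstsq(Z, X2, rcond=None)[0]
rY = Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]
b2_net = (rX2 @ rY) / (rX2 @ rX2)

print(b[1], b2_net)   # the two estimates coincide: the "direct" effect of X2
```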

OLS and ML Estimation of the Partial Regression Coefficients

This section provides:
1. The OLS estimators in the case of the three-variable regression $Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$
2. Variances and standard errors of the OLS estimators
3. Eight properties of the OLS estimators
4. An understanding of the ML estimators

A numerical sketch of items 1 and 2 follows this list.
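The deviation-form OLS expressions for the three-variable model can be computed directly; lowercase letters denote deviations from sample means. Continuing the simulation above (the variable names are mine, not the textbook's):

```python
# Deviation-form OLS formulas for the three-variable model
x2, x3, y = X2 - X2.mean(), X3 - X3.mean(), Y - Y.mean()
S22, S33, S23 = x2 @ x2, x3 @ x3, x2 @ x3
S2y, S3y = x2 @ y, x3 @ y
D = S22 * S33 - S23**2                  # > 0 when there is no exact collinearity

b2 = (S2y * S33 - S3y * S23) / D
b3 = (S3y * S22 - S2y * S23) / D
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

# Unbiased estimate of sigma^2 and the standard errors
resid = Y - (b1 + b2 * X2 + b3 * X3)
sigma2 = (resid @ resid) / (n - 3)      # n - k degrees of freedom, k = 3
se_b2 = np.sqrt(sigma2 * S33 / D)
se_b3 = np.sqrt(sigma2 * S22 / D)
```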

The Multiple Coefficient of Determination R² and the Multiple Coefficient of Correlation R

This section provides:
1. The definition of $R^2$ in the context of multiple regression, analogous to $r^2$ in the two-variable case
2. $R = \sqrt{R^2}$, the coefficient of multiple correlation, which measures the degree of association between Y and all the explanatory variables jointly
3. The variance of a partial regression coefficient:
$\operatorname{Var}(\hat\beta_k) = \dfrac{\sigma^2}{\sum x_k^2} \cdot \dfrac{1}{1 - R_k^2}$   (7.5.6)
where $\hat\beta_k$ is the partial regression coefficient of regressor $X_k$ and $R_k^2$ is the $R^2$ of the regression of $X_k$ on the remaining regressors

These quantities are computed in the sketch below.
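Continuing the numerical sketch, both forms of the coefficient variance agree (here the "remaining regressors" for $X_2$ reduce to $X_3$ alone, so $R_2^2$ is just the squared simple correlation):

```python
# Multiple R^2 and multiple correlation R
TSS = y @ y
RSS = resid @ resid
R2 = 1 - RSS / TSS
R = np.sqrt(R2)

# Auxiliary-regression form (7.5.6) of the variance of b2
R2_k = (S23**2) / (S22 * S33)                 # R^2 of X2 on X3
var_b2 = (sigma2 / S22) * (1.0 / (1.0 - R2_k))
print(var_b2, se_b2**2)                       # identical to sigma2 * S33 / D
```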

Example 7.1: The Expectations-Augmented Phillips Curve for the US

This section provides an illustration of the ideas introduced in the chapter, using regression model (7.6.1). The data set is in Table 7.1.

Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias

This section shows what happens when a simple regression is fitted where a multiple regression is called for: omitting a relevant explanatory variable causes specification bias, which is discussed at length in Chapter 13.
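A quick check with the simulated data above (where X3 was deliberately constructed to be correlated with X2) shows the bias; this demonstration is mine, not the chapter's:

```python
# Simple regression of Y on X2 alone picks up part of X3's effect
slope_simple = np.polyfit(X2, Y, 1)[0]
print(slope_simple, "vs full-model estimate", b2, "vs true", beta2)
# Expected bias is roughly beta3 times the slope of X3 on X2 (about 0.5 here),
# i.e. about -0.15 in this hypothetical setup
```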

R² and the Adjusted R²

$R^2$ is a non-decreasing function of the number of explanatory variables: an additional X variable will not decrease $R^2$.

$R^2 = \mathrm{ESS}/\mathrm{TSS} = 1 - \mathrm{RSS}/\mathrm{TSS} = 1 - \sum \hat u_i^2 \big/ \sum y_i^2$   (7.8.1)

This gives the wrong incentive to add ever more (possibly irrelevant) variables to the regression, and it motivates an adjusted $R^2$ ($\bar R^2$) that takes the degrees of freedom into account:

$\bar R^2 = 1 - \dfrac{\sum \hat u_i^2/(n-k)}{\sum y_i^2/(n-1)}$,   or equivalently   $\bar R^2 = 1 - \hat\sigma^2 / S_Y^2$   (7.8.2)

where $S_Y^2$ is the sample variance of Y and k is the number of parameters including the intercept term.

Substituting (7.8.1) into (7.8.2) gives

$\bar R^2 = 1 - (1 - R^2)\dfrac{n-1}{n-k}$   (7.8.4)

For k > 1, $\bar R^2 < R^2$; thus as the number of X variables increases, $\bar R^2$ increases by less than $R^2$, and $\bar R^2$ can be negative.
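Continuing the numerical sketch, (7.8.2) and (7.8.4) agree:

```python
# Adjusted R^2; k counts all parameters including the intercept
k = 3
R2_bar = 1 - (RSS / (n - k)) / (TSS / (n - 1))          # (7.8.2)
assert np.isclose(R2_bar, 1 - (1 - R2) * (n - 1) / (n - k))   # (7.8.4)
```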


R² and the Adjusted R² (continued)

Comparing two R² values: to compare, the sample size n and the dependent variable must be the same.

Example 7.2: Coffee Demand Function Revisited (page 210)

The "game" of maximizing the adjusted R²: choosing the model that gives the highest $\bar R^2$ may be dangerous, for in regression analysis our objective is not a high $\bar R^2$ as such but dependable estimates of the true population regression coefficients and statistical inference about them. One should be more concerned with the logical or theoretical relevance of the explanatory variables to the dependent variable and with their statistical significance. A small demonstration of the "game" follows.
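Adding a pure-noise regressor to the running simulation illustrates the point; this demonstration and the noise variable are mine, not the chapter's:

```python
# R^2 cannot fall when a regressor is added, but adjusted R^2 often does
junk = rng.normal(0, 1, size=n)
Xj = np.column_stack([np.ones(n), X2, X3, junk])
resid_j = Y - Xj @ np.linalg.lstsq(Xj, Y, rcond=None)[0]
R2_j = 1 - (resid_j @ resid_j) / TSS
R2_bar_j = 1 - (1 - R2_j) * (n - 1) / (n - 4)
print(R2_j >= R2, R2_bar_j < R2_bar)   # first is always True; second holds
                                       # whenever the added variable's |t| < 1
```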

Partial Correlation Coefficients

This section provides:
1. An explanation of simple and partial correlation coefficients
2. The interpretation of simple and partial correlation coefficients
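For the three-variable case, the first-order partial correlation between Y and $X_2$, holding $X_3$ constant, can be built from the three simple correlations. A sketch, using the common subscript convention 1 = Y, 2 = X2, 3 = X3:

```python
# First-order partial correlation r_{12.3} from the simple correlations
r12 = np.corrcoef(Y, X2)[0, 1]
r13 = np.corrcoef(Y, X3)[0, 1]
r23 = np.corrcoef(X2, X3)[0, 1]
r12_3 = (r12 - r13 * r23) / np.sqrt((1 - r13**2) * (1 - r23**2))
```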

Example 7.3: The Cobb-Douglas Production Function: More on Functional Form

$Y_i = \beta_1 X_{2i}^{\beta_2} X_{3i}^{\beta_3} e^{u_i}$   (7.10.1)

Taking logs turns this into a model that is linear in the parameters:

$\ln Y_i = \ln \beta_1 + \beta_2 \ln X_{2i} + \beta_3 \ln X_{3i} + u_i = \beta_0 + \beta_2 \ln X_{2i} + \beta_3 \ln X_{3i} + u_i$   (7.10.2)

The data set is in Table 7.3; the results are reported on page 216.
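A sketch of estimating (7.10.2) by OLS on logs. The input series and parameters here are simulated and made up, since Table 7.3 is not reproduced in these slides:

```python
# Hypothetical labor (Lab) and capital (Kap) inputs
Lab = rng.uniform(50, 200, size=n)
Kap = rng.uniform(20, 100, size=n)
Q = 1.5 * Lab**0.7 * Kap**0.3 * np.exp(rng.normal(0, 0.05, size=n))

# OLS on the log-linear form (7.10.2)
Xcd = np.column_stack([np.ones(n), np.log(Lab), np.log(Kap)])
b0, b2_hat, b3_hat = np.linalg.lstsq(Xcd, np.log(Q), rcond=None)[0]
print(b2_hat + b3_hat)   # estimated returns to scale, close to 1.0 here
```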

Polynomial Regression Models

$Y_i = \beta_0 + \beta_1 X_i + \beta_2 X_i^2 + \dots + \beta_k X_i^k + u_i$   (7.11.3)

Example 7.4: Estimating the Total Cost Function. The data set is in Table 7.4, and the empirical results are reported in the text.

Summary and Conclusions (page 221)
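A polynomial model is still linear in the parameters, so it is estimated by ordinary OLS on powers of X. A sketch on made-up cost data (Table 7.4 is not reproduced here, and the cubic coefficients are illustrative, not the chapter's estimates):

```python
# Cubic total-cost specification fitted by OLS
output = np.arange(1.0, 11.0)
cost = 300 + 25*output - 9*output**2 + output**3 + rng.normal(0, 5, size=10)

Xpoly = np.vander(output, 4, increasing=True)   # columns: 1, X, X^2, X^3
coefs = np.linalg.lstsq(Xpoly, cost, rcond=None)[0]
```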