Ch5 Relaxing the Assumptions of the Classical Model


Ch5 Relaxing the Assumptions of the Classical Model
1. Multicollinearity: What Happens if the Regressors Are Correlated?
2. Heteroscedasticity: What Happens if the Error Variance Is Nonconstant?
3. Autocorrelation: What Happens if the Error Terms Are Correlated?

1. Multicollinearity
Perfect collinearity: one regressor is an exact linear combination of the others, so OLS cannot be computed at all.
Multicollinearity: two or more regressors are highly (but not perfectly) correlated with each other.
The easiest symptom to check is the standard errors of the coefficients, which become inflated when regressors are highly correlated.
A reasonable way to relieve multicollinearity is to drop one of the highly correlated variables.

1. Tests of Multicollinearity
A relatively high R² in an equation with few significant t statistics;
Relatively high simple correlations between one or more pairs of explanatory variables.
The pairwise-correlation criterion is less useful for time-series data, and it cannot detect multicollinearity that arises because three or four variables are jointly related to each other.
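A minimal detection sketch in Python, assuming a pandas DataFrame `df` with explanatory variables `X2` and `X3` (hypothetical names). Pairwise correlations catch collinearity between two variables; variance inflation factors (VIFs) also pick up collinearity involving several regressors at once.

```python
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# df is assumed to hold the explanatory variables X2 and X3 (hypothetical names)
X = sm.add_constant(df[["X2", "X3"]])

# Pairwise simple correlations between explanatory variables
print(df[["X2", "X3"]].corr())

# Variance inflation factors: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from
# regressing regressor j on all the other regressors; values above ~10 are a
# common rough warning sign of multicollinearity
for j, name in enumerate(X.columns):
    if name != "const":
        print(name, variance_inflation_factor(X.values, j))
```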

2. Heteroscedasticity
Impact of heteroscedasticity on parameter estimators;
Corrections for heteroscedasticity;
Tests for heteroscedasticity.

2. Impact of Heteroscedasticity
Heteroscedasticity makes the OLS parameter estimators inefficient, although they remain unbiased and consistent; the usual OLS standard errors are also biased, so t and F tests become unreliable.
Heteroscedasticity most often occurs with cross-sectional data.

2. Correction of Heteroscedasticity
Known variance;
Unknown variance (error variance varies directly with an independent variable).

Known Variance
Two-variable regression model: $Y_i = \beta_1 + \beta_2 X_i + u_i$, with $\mathrm{Var}(u_i) = \sigma_i^2$ known.
Dividing the equation through by $\sigma_i$ gives $Y_i/\sigma_i = \beta_1 (1/\sigma_i) + \beta_2 (X_i/\sigma_i) + u_i/\sigma_i$, whose error term $u_i/\sigma_i$ has constant variance 1, so OLS on the transformed data (weighted least squares, WLS) is efficient.

Known Variance
Multiple linear regression model: $Y_i = \beta_1 + \beta_2 X_{2i} + \dots + \beta_k X_{ki} + u_i$, with $\mathrm{Var}(u_i) = \sigma_i^2$ known.
Let $w_i = 1/\sigma_i$ and multiply every variable (including the constant) by $w_i$.
Because $\mathrm{Var}(w_i u_i) = w_i^2 \sigma_i^2 = 1$, the transformed model satisfies the classical assumptions.
Therefore WLS, i.e. OLS applied to the transformed model, is BLUE.

Unknown Variance
Suppose the error variance varies directly with an independent variable, e.g. $\mathrm{Var}(u_i) = \sigma^2 X_i^2$.
Let $w_i = 1/X_i$ and multiply the equation through by $w_i$.
Because $\mathrm{Var}(w_i u_i) = \sigma^2$ is constant, the transformed error term is homoscedastic.
Therefore WLS (OLS on the transformed model) is BLUE.
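A minimal WLS sketch in Python with statsmodels, assuming the illustrative form Var(u_i) = σ²·X_i² used above, so the weights are 1/X_i² (statsmodels takes weights proportional to the inverse error variance). The data are simulated only to make the block self-contained.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with error standard deviation proportional to x (illustrative assumption)
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1.0, 10.0, n)
u = rng.normal(0.0, 1.0, n) * x                   # Var(u_i) = sigma^2 * x_i^2
y = 2.0 + 0.5 * x + u

X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()                          # unbiased but inefficient here
wls = sm.WLS(y, X, weights=1.0 / x**2).fit()      # weights = 1 / Var(u_i) up to a constant

print(ols.params, ols.bse)
print(wls.params, wls.bse)
```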

2. Tests of Heteroscedasticity
Informal method: plot the residuals (e.g. against the fitted values or a regressor) to see whether their spread differs from observation to observation.
Formal methods: the Goldfeld-Quandt test, the Breusch-Pagan test, and the White test.

Goldfeld-Quandt Test Steps:
Order the observations by the magnitude of the independent variable suspected of driving the heteroscedasticity;
Omit the middle d observations;
Fit two separate regressions, the first on the smaller-X observations and the second on the larger-X observations;
Calculate the residual sum of squares of each regression, RSS1 and RSS2;
Assuming the errors are normally distributed, under the null of homoscedasticity RSS2/RSS1 ~ F((N-d-2k)/2, (N-d-2k)/2), where k is the number of coefficients estimated in each regression.
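A sketch using the statsmodels implementation, which sorts by the chosen regressor, drops a middle fraction, and compares the two sub-sample residual sums of squares; `y` and `X` are carried over from the WLS sketch above.

```python
from statsmodels.stats.diagnostic import het_goldfeldquandt

# Sort by column 1 of X (the regressor, column 0 is the constant) and drop the
# middle 20% of observations before forming the RSS2/RSS1 ratio
f_stat, p_value, ordering = het_goldfeldquandt(y, X, idx=1, drop=0.2)
print(f_stat, p_value)
```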

Breusch-Pagan Test Steps:
First run OLS, save the residuals $\hat{u}_i$, and estimate $\hat{\sigma}^2 = \sum \hat{u}_i^2 / N$;
Run the auxiliary regression of $\hat{u}_i^2 / \hat{\sigma}^2$ on the variables $Z_{2i}, \dots, Z_{pi}$ thought to drive the heteroscedasticity;
If the error term is normally distributed and the null hypothesis of homoscedasticity is valid, then half the explained sum of squares of the auxiliary regression, ESS/2, is asymptotically distributed as $\chi^2$ with $p-1$ degrees of freedom.
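A sketch using the statsmodels implementation, which reports the asymptotically equivalent Lagrange-multiplier form (N·R² from regressing the squared residuals on the candidate variables); `ols` and `X` are the objects from the earlier sketches.

```python
from statsmodels.stats.diagnostic import het_breuschpagan

# Auxiliary regression of squared OLS residuals on the explanatory variables;
# the LM statistic N * R^2 is compared with a chi-squared distribution under the null
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols.resid, X)
print(lm_stat, lm_pvalue)
```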

White Test Steps:
First run OLS and save the residuals $\hat{u}_i$;
Run the auxiliary regression of $\hat{u}_i^2$ on the original regressors, their squares, and their cross-products;
When the null hypothesis of homoscedasticity is valid, $N R^2$ from this auxiliary regression is asymptotically distributed as $\chi^2$ with degrees of freedom equal to the number of regressors (excluding the constant) in the auxiliary regression.
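A sketch with the statsmodels implementation; `ols` and `X` are again the objects from the earlier sketches.

```python
from statsmodels.stats.diagnostic import het_white

# Auxiliary regression of squared residuals on the regressors, their squares and
# cross-products; N * R^2 is compared with a chi-squared distribution
lm_stat, lm_pvalue, f_stat, f_pvalue = het_white(ols.resid, X)
print(lm_stat, lm_pvalue)
```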

3. Serial Correlation
Impact of serial correlation on OLS estimators;
Corrections for serial correlation;
Tests for serial correlation.

Serial Correlation
Serial correlation often occurs in time-series studies;
First-order serial correlation: each error is correlated with the immediately preceding error;
Positive serial correlation: successive errors tend to have the same sign.

Impact of Serial Correlation
Serial correlation does not affect the unbiasedness or consistency of the OLS estimators, but it does affect their efficiency: OLS is no longer BLUE, and the usual standard errors are biased.

Correction of Serial Correlation
The model with serially correlated errors is usually written as $Y_t = \beta_1 + \beta_2 X_t + u_t$ with $u_t = \rho u_{t-1} + \varepsilon_t$, where $\varepsilon_t$ is well behaved (white noise).
The first-order serial-correlation coefficient can be estimated from the residuals as $\hat{\rho} = \sum_{t=2}^{T} \hat{u}_t \hat{u}_{t-1} \big/ \sum_{t=2}^{T} \hat{u}_{t-1}^2$.

Correction of Serial Correlation
When $\rho$ is known: generalized differencing. Transform the model to $Y_t - \rho Y_{t-1} = \beta_1(1-\rho) + \beta_2 (X_t - \rho X_{t-1}) + \varepsilon_t$; because $\varepsilon_t$ is serially uncorrelated, OLS on the quasi-differenced data is efficient.
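A minimal sketch of generalized (quasi-) differencing with a known ρ, using numpy and statsmodels; `y`, `x` and `rho` are illustrative names, and the first observation is simply dropped rather than given the Prais-Winsten treatment.

```python
import numpy as np
import statsmodels.api as sm

def generalized_difference(y, x, rho):
    """OLS on quasi-differenced data for a known first-order rho (sketch)."""
    y_star = y[1:] - rho * y[:-1]          # Y_t - rho * Y_{t-1}
    x_star = x[1:] - rho * x[:-1]          # X_t - rho * X_{t-1}
    X_star = sm.add_constant(x_star)       # intercept estimates beta_1 * (1 - rho)
    return sm.OLS(y_star, X_star).fit()

# res = generalized_difference(y, x, rho=0.6)
# beta1_hat = res.params[0] / (1 - 0.6)    # recover the original intercept
```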

Methods for estimating ρ:
The Cochrane-Orcutt procedure;
The Hildreth-Lu procedure.

Cochrane-Orcutt Procedure Steps:
Use OLS to estimate the original model $Y_t = \beta_1 + \beta_2 X_t + u_t$;
Use the residuals $\hat{u}_t$ from that equation to run the regression $\hat{u}_t = \rho \hat{u}_{t-1} + \varepsilon_t$ and obtain $\hat{\rho}$;
Use the estimated $\hat{\rho}$ to perform the generalized differencing transformation and obtain new parameter estimates;
Substitute these revised parameters into the original equation and obtain new estimated residuals.

Cochrane-Orcutt Procedure Steps (continued):
Use these second-round residuals to rerun the regression and obtain a new estimate of ρ;
The iterative process can be repeated until successive estimates of ρ differ by less than 0.01 or 0.005.
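A compact sketch of the iteration in Python, assuming a single regressor and the `y`, `x` arrays used earlier (illustrative, not the only possible implementation); statsmodels' GLSAR provides a built-in iterative version of the same idea.

```python
import numpy as np
import statsmodels.api as sm

def cochrane_orcutt(y, x, tol=0.005, max_iter=20):
    """Iterative Cochrane-Orcutt estimation for AR(1) errors (sketch)."""
    X = sm.add_constant(x)
    u = sm.OLS(y, X).fit().resid                          # step 1: OLS residuals
    rho_old = np.inf
    for _ in range(max_iter):
        # step 2: regress u_t on u_{t-1} (no constant) to estimate rho
        rho = np.sum(u[1:] * u[:-1]) / np.sum(u[:-1] ** 2)
        if abs(rho - rho_old) < tol:
            break
        rho_old = rho
        # step 3: generalized differencing and re-estimation
        y_star = y[1:] - rho * y[:-1]
        X_star = sm.add_constant(x[1:] - rho * x[:-1])
        params = sm.OLS(y_star, X_star).fit().params
        beta1, beta2 = params[0] / (1 - rho), params[1]
        # step 4: residuals from the *original* equation with the revised parameters
        u = y - beta1 - beta2 * x
    return beta1, beta2, rho

# statsmodels equivalent (iterative GLS with AR(1) errors):
# sm.GLSAR(y, sm.add_constant(x), rho=1).iterative_fit(maxiter=20)
```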

The Hildreth-Lu Procedure Steps:
Specify a set of grid values for ρ (e.g. 0, 0.1, ..., 0.9);
For each value of ρ, estimate the transformed (generalized-differenced) equation;
Select the equation with the lowest sum of squared residuals as the best equation;
The procedure can be repeated with a finer grid chosen in the neighborhood of the first-selected ρ until the desired accuracy is attained.
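A grid-search sketch (illustrative), reusing the `generalized_difference` helper defined in the earlier sketch:

```python
import numpy as np

def hildreth_lu(y, x, grid=np.arange(0.0, 1.0, 0.1)):
    """Pick the rho on the grid whose transformed equation has the lowest SSR."""
    ssr_by_rho = {rho: generalized_difference(y, x, rho).ssr for rho in grid}
    best_rho = min(ssr_by_rho, key=ssr_by_rho.get)
    return best_rho, ssr_by_rho[best_rho]

# best_rho, _ = hildreth_lu(y, x)
# refine around the first pick:
# hildreth_lu(y, x, grid=np.arange(best_rho - 0.09, best_rho + 0.1, 0.01))
```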

Test of Serial Correlation: Durbin-Watson Test
The test statistic is $DW = \sum_{t=2}^{T} (\hat{u}_t - \hat{u}_{t-1})^2 \big/ \sum_{t=1}^{T} \hat{u}_t^2$.
DW lies in [0, 4], with values near 2 indicating no first-order serial correlation. Positive serial correlation is associated with DW values below 2, and negative serial correlation with DW values above 2.
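A minimal sketch, assuming OLS residuals such as `ols.resid` from the fits above; statsmodels ships the same statistic as `durbin_watson`.

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson

def dw_statistic(resid):
    """DW = sum((u_t - u_{t-1})^2) / sum(u_t^2); values near 2 suggest no AR(1)."""
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# print(dw_statistic(ols.resid), durbin_watson(ols.resid))
```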

Range of the DW Statistic

Value of DW            Result
4 - dL < DW < 4        Reject the null hypothesis; negative serial correlation
4 - dU < DW < 4 - dL   Result indeterminate
2 < DW < 4 - dU        Accept the null hypothesis
dU < DW < 2            Accept the null hypothesis
dL < DW < dU           Result indeterminate
0 < DW < dL            Reject the null hypothesis; positive serial correlation