Chapter 12 – Autocorrelation


Chapter 12 – Autocorrelation (Econometrics)

Autocorrelation In the classical regression model, it is assumed that E(u_t u_s) = 0 for t ≠ s. What happens when this assumption is violated?

First-order autocorrelation: u_t = ρu_{t-1} + e_t, where |ρ| < 1 and e_t is a well-behaved (white-noise) error term.

Positive first-order autocorrelation (ρ > 0)

Negative first-order autocorrelation (ρ < 0)
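
The two cases can be illustrated by simulating the AR(1) error process u_t = ρu_{t-1} + e_t and computing the sample lag-1 autocorrelation. This is a minimal pure-Python sketch; the function names are my own, not from the text.

```python
import random

def simulate_ar1(rho, n=500, seed=42):
    """Simulate u_t = rho * u_{t-1} + e_t with standard normal e_t."""
    rng = random.Random(seed)
    u, prev = [], 0.0
    for _ in range(n):
        prev = rho * prev + rng.gauss(0.0, 1.0)
        u.append(prev)
    return u

def lag1_autocorr(u):
    """Sample lag-1 autocorrelation: sum u_t u_{t-1} / sum u_{t-1}^2."""
    num = sum(a * b for a, b in zip(u[1:], u[:-1]))
    den = sum(a * a for a in u[:-1])
    return num / den

pos = lag1_autocorr(simulate_ar1(0.7))   # close to +0.7: errors persist
neg = lag1_autocorr(simulate_ar1(-0.7))  # close to -0.7: errors flip sign
```

With positive ρ the residuals cluster on one side of zero for long stretches; with negative ρ they alternate sign, which is what the two slide figures depict.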

Incorrect model specification and apparent autocorrelation

Violation of assumption of classical regression model

Consequences of first-order autocorrelation
- OLS estimators are unbiased and consistent
- OLS estimators are not BLUE
- The estimated variance of the residuals is biased
- Biased estimates of the standard errors of the coefficients (usually a downward bias)
- Biased t-ratios (usually an upward bias)

Detection: the Durbin-Watson statistic, d = sum_{t=2}^{N} (e_t - e_{t-1})^2 / sum_{t=1}^{N} e_t^2 ≈ 2(1 - ρ̂).

Acceptance and rejection regions for DW statistic
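
As a sketch of the computation (d = sum of squared changes in the residuals divided by the sum of squared residuals; d near 2 suggests no first-order autocorrelation, d near 0 positive, d near 4 negative):

```python
def durbin_watson(e):
    """Durbin-Watson statistic:
    d = sum_{t=2}^{N} (e_t - e_{t-1})^2 / sum_{t=1}^{N} e_t^2."""
    num = sum((a - b) ** 2 for a, b in zip(e[1:], e[:-1]))
    den = sum(v * v for v in e)
    return num / den

# Smoothly trending residuals: strong positive autocorrelation, d near 0.
d_pos = durbin_watson([1.0, 2.0, 3.0, 4.0])    # (1 + 1 + 1) / (1 + 4 + 9 + 16) = 0.1
# Alternating residuals: strong negative autocorrelation, d near 4.
d_neg = durbin_watson([1.0, -1.0, 1.0, -1.0])  # (4 + 4 + 4) / 4 = 3.0
```

In practice d is compared against the tabulated lower and upper bounds d_L and d_U, which depend on N and the number of regressors.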

AR(1) correction: known ρ. Suppose y_t = β0 + β1 x_t + u_t with u_t = ρu_{t-1} + e_t. Lagging this relationship 1 period: y_{t-1} = β0 + β1 x_{t-1} + u_{t-1}. Multiplying the lagged equation by -ρ and adding it to the original, with a little bit of algebra: y_t - ρy_{t-1} = β0(1 - ρ) + β1(x_t - ρx_{t-1}) + e_t.

AR(1) correction: known ρ. Solution? Quasi-difference each variable: y*_t = y_t - ρy_{t-1} and x*_t = x_t - ρx_{t-1}. Regress y*_t on x*_t; the intercept of the transformed model is β0(1 - ρ).
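
The quasi-difference transform is a one-liner; note that it loses the first observation (a point the Prais-Winsten estimator later addresses). Function name is illustrative:

```python
def quasi_difference(z, rho):
    """Return z*_t = z_t - rho * z_{t-1} for t = 2..T (first observation lost)."""
    return [z[t] - rho * z[t - 1] for t in range(1, len(z))]

y_star = quasi_difference([1.0, 2.0, 3.0], 0.5)  # [2 - 0.5*1, 3 - 0.5*2] = [1.5, 2.0]
```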

AR(1) correction: known ρ. This procedure provides unbiased and consistent estimates of all model parameters and standard errors. If ρ = 1, a unit root is said to exist. In this case, quasi-differencing is equivalent to first-differencing: y_t - y_{t-1} = β1(x_t - x_{t-1}) + e_t.

Generalized least squares This approach is referred to as Generalized Least Squares (GLS). GLS estimation strategy: if one of the assumptions of the classical regression model is violated, transform the model so that the transformed model satisfies those assumptions, then estimate the transformed model.

AR(1) correction: unknown ρ. Cochrane-Orcutt procedure:
1. Estimate the original model using OLS and save the residuals.
2. Regress the saved residual on its own lag (without a constant) to estimate ρ.
3. Estimate a quasi-differenced version of the original model.
4. Use the estimated parameters to generate a new estimate of the error term, and return to step 2.
Repeat this process until the change in the parameter estimates becomes less than a selected threshold value. This results in consistent estimates of all model parameters and standard errors.
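
The iteration above can be sketched in pure Python for the one-regressor case, using the closed-form OLS slope. This is a minimal illustration under assumed synthetic data, not the textbook's own implementation:

```python
import random

def ols(y, x):
    """Closed-form simple OLS of y on a constant and x; returns (b0, b1)."""
    n = len(y)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    return ybar - b1 * xbar, b1

def cochrane_orcutt(y, x, tol=1e-8, max_iter=100):
    b0, b1 = ols(y, x)                       # step 1: OLS on the original model
    rho = 0.0
    for _ in range(max_iter):
        e = [yi - b0 - b1 * xi for yi, xi in zip(y, x)]
        # Step 2: regress e_t on e_{t-1} without a constant to estimate rho.
        new_rho = sum(a * b for a, b in zip(e[1:], e[:-1])) / \
                  sum(a * a for a in e[:-1])
        # Step 3: quasi-difference and re-estimate; transformed
        # intercept is b0 * (1 - rho), so undo that scaling.
        ys = [y[t] - new_rho * y[t - 1] for t in range(1, len(y))]
        xs = [x[t] - new_rho * x[t - 1] for t in range(1, len(x))]
        a0, b1 = ols(ys, xs)
        b0 = a0 / (1.0 - new_rho)
        if abs(new_rho - rho) < tol:         # step 4: iterate to convergence
            rho = new_rho
            break
        rho = new_rho
    return b0, b1, rho

# Synthetic data: y = 1 + 2x + u, with u_t = 0.6 u_{t-1} + e_t.
rng = random.Random(0)
x, y, u = [], [], 0.0
for t in range(200):
    u = 0.6 * u + rng.gauss(0.0, 1.0)
    xt = t / 10.0
    x.append(xt)
    y.append(1.0 + 2.0 * xt + u)
b0, b1, rho = cochrane_orcutt(y, x)  # rho should land near 0.6, b1 near 2
```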

Prais-Winsten estimator The Cochrane-Orcutt method involves the loss of one observation. The Prais-Winsten estimator is similar to the Cochrane-Orcutt method, but applies a different transformation to the first observation (see text, p. 444). Monte Carlo studies indicate a substantial efficiency gain from the use of the Prais-Winsten estimator relative to the Cochrane-Orcutt method.
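
The standard Prais-Winsten transformation rescales the first observation by sqrt(1 - ρ²) instead of dropping it; a sketch (function name illustrative):

```python
import math

def prais_winsten_transform(z, rho):
    """Quasi-difference a series, but keep the first observation
    rescaled by sqrt(1 - rho^2) so no observation is lost."""
    out = [math.sqrt(1.0 - rho ** 2) * z[0]]
    out.extend(z[t] - rho * z[t - 1] for t in range(1, len(z)))
    return out

z_star = prais_winsten_transform([2.0, 3.0, 4.0], 0.5)
# first element: sqrt(0.75) * 2; remainder: 3 - 0.5*2 = 2.0, 4 - 0.5*3 = 2.5
```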

Hildreth-Lu estimator A grid-search algorithm that helps ensure the estimator reaches a global minimum of the sum of squared errors rather than a local minimum.
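
A grid search over ρ in (-1, 1) can be sketched as follows: for each candidate ρ, quasi-difference, run OLS, and keep the ρ with the smallest SSR. Names and synthetic data are my own:

```python
import random

def ols(y, x):
    """Closed-form simple OLS of y on a constant and x; returns (b0, b1)."""
    n = len(y)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    return ybar - b1 * xbar, b1

def hildreth_lu(y, x):
    """Grid-search rho over (-0.99, 0.99) in steps of 0.01, minimizing the SSR
    of the quasi-differenced regression."""
    best = None
    for i in range(-99, 100):
        rho = i / 100.0
        ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
        xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
        a0, b1 = ols(ys, xs)
        ssr = sum((yi - a0 - b1 * xi) ** 2 for yi, xi in zip(ys, xs))
        if best is None or ssr < best[0]:
            best = (ssr, rho, a0 / (1.0 - rho), b1)
    return best  # (ssr, rho, b0, b1)

# Synthetic data: y = 1 + 2x + u, with u_t = 0.6 u_{t-1} + e_t.
rng = random.Random(1)
x, y, u = [], [], 0.0
for t in range(200):
    u = 0.6 * u + rng.gauss(0.0, 1.0)
    x.append(t / 10.0)
    y.append(1.0 + 2.0 * x[-1] + u)
ssr, rho, b0, b1 = hildreth_lu(y, x)
```

Because every grid point is evaluated, the search cannot get stuck in a local minimum of the SSR surface, at the cost of more regressions.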

Maximum likelihood estimator Selects the parameter values that maximize the computed probability of observing the realized outcomes for the dependent and independent variables; it is an asymptotically efficient estimator.

Higher-order autocorrelation AR(p): u_t = ρ1 u_{t-1} + ρ2 u_{t-2} + … + ρp u_{t-p} + e_t

Detection of AR(p) error process Breusch-Godfrey test:
1. Estimate the parameters of the original model and save the residuals.
2. Regress the estimated residual on all independent variables in the original model and the first p lagged residuals.
3. Compute the Breusch-Godfrey Lagrange Multiplier test statistic: LM = NR².
An AR(p) process is found to exist if the LM statistic exceeds the critical value for a χ² variate with p degrees of freedom. Alternatively, use the Box-Pierce or Ljung-Box statistic (see p. 451).
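
The steps above can be sketched for p = 1 with one regressor; the auxiliary regression of e_t on (1, x_t, e_{t-1}) needs a small 3x3 normal-equations solver. This is an illustrative sketch on assumed synthetic data:

```python
import random

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def breusch_godfrey_p1(y, x):
    """BG test with p = 1: regress OLS residuals on (1, x_t, e_{t-1}); LM = N R^2."""
    n = len(y)
    xbar, ybar = sum(x) / n, sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
         sum((xi - xbar) ** 2 for xi in x)
    b0 = ybar - b1 * xbar
    e = [yi - b0 - b1 * xi for yi, xi in zip(y, x)]
    # Auxiliary regression: e_t on (1, x_t, e_{t-1}) for t = 2..n.
    Y = e[1:]
    Z = [[1.0, x[t], e[t - 1]] for t in range(1, n)]
    XtX = [[sum(r[i] * r[j] for r in Z) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(Z, Y)) for i in range(3)]
    g = solve3(XtX, Xty)
    fitted = [sum(gi * zi for gi, zi in zip(g, z)) for z in Z]
    m = len(Y)
    ybar2 = sum(Y) / m
    r2 = 1.0 - sum((yi - fi) ** 2 for yi, fi in zip(Y, fitted)) / \
               sum((yi - ybar2) ** 2 for yi in Y)
    return m * r2  # compare with the chi-square(1) 5% critical value, 3.84

# Synthetic data with AR(1) errors (rho = 0.6) should produce a large LM.
rng = random.Random(3)
x, y, u = [], [], 0.0
for t in range(200):
    u = 0.6 * u + rng.gauss(0.0, 1.0)
    x.append(t / 10.0)
    y.append(1.0 + 2.0 * x[-1] + u)
lm = breusch_godfrey_p1(y, x)
```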

Lagged dependent variable as regressor The Durbin-Watson statistic is biased toward 2 when a lagged dependent variable is used as a regressor. Use Durbin's h test or the Lagrange Multiplier test (test statistic = (N-1)R² in this case). Correction: Hatanaka's estimator (on pp. 458-9 of the text).
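
Durbin's h combines the DW statistic with the estimated variance of the coefficient on the lagged dependent variable; the numbers below are hypothetical, purely for illustration:

```python
import math

def durbin_h(d, n, var_lag_coef):
    """Durbin's h: rho_hat = 1 - d/2, h = rho_hat * sqrt(n / (1 - n * Var(c))),
    where Var(c) is the estimated variance of the coefficient on y_{t-1}.
    Valid only when n * Var(c) < 1; compare |h| with 1.96 at the 5% level."""
    rho_hat = 1.0 - d / 2.0
    return rho_hat * math.sqrt(n / (1.0 - n * var_lag_coef))

# Hypothetical values: d = 1.6, n = 100, Var(coef on y_{t-1}) = 0.004.
h = durbin_h(1.6, 100, 0.004)  # 0.2 * sqrt(100 / 0.6), roughly 2.58 > 1.96
```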

Correction of AR(p) process Use Prais-Winsten (modified for an AR(p) process) or maximum likelihood estimator