Autocorrelation
Chapter 5
Dr. A. PHILIP AROKIADOSS, Assistant Professor, Department of Statistics, St. Joseph's College (Autonomous), Tiruchirappalli-620 002.

Aims and Learning Objectives By the end of this session students should be able to:
- Explain the nature of autocorrelation
- Understand the causes and consequences of autocorrelation
- Perform tests to determine whether a regression model has autocorrelated disturbances

Nature of Autocorrelation Autocorrelation is a systematic pattern in the errors, which can be attracting (positive autocorrelation) or repelling (negative autocorrelation). For efficiency (accurate estimation/prediction), all systematic information needs to be incorporated into the regression model.

Yt = 1 + 2X2t + 3X3t + Ut Cov (Ui, Uj) or E(Ui, Uj) = 0 Regression Model Yt = 1 + 2X2t + 3X3t + Ut Cov (Ui, Uj) or E(Ui, Uj) = 0 No autocorrelation: Autocorrelation: Cov (Ui, Uj)  0 or E(Ui, Uj)  0 Note: i  j In general E(Ut, Ut-s)  0

[Figure: three plots of Ut against time t: an attracting pattern (positive autocorrelation), a random scatter (no autocorrelation), and a repelling pattern (negative autocorrelation).]

Order of Autocorrelation For the model Yt = β1 + β2X2t + β3X3t + Ut:
1st order: Ut = ρUt-1 + νt
2nd order: Ut = ρ1Ut-1 + ρ2Ut-2 + νt
3rd order: Ut = ρ1Ut-1 + ρ2Ut-2 + ρ3Ut-3 + νt
where -1 < ρ < +1.
We will assume first-order autocorrelation, AR(1): Ut = ρUt-1 + νt
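As a quick illustration (not from the slides), an AR(1) disturbance series can be simulated in a few lines; with a large enough sample, the sample lag-1 autocorrelation of the series should be close to the chosen ρ. The value ρ = 0.8 and the seed are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, n, sigma=1.0, rng=rng):
    """Simulate n observations of U_t = rho * U_{t-1} + v_t, v_t ~ N(0, sigma^2)."""
    u = np.zeros(n)
    v = rng.normal(0.0, sigma, size=n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + v[t]
    return u

u = simulate_ar1(rho=0.8, n=5000)
# Sample lag-1 autocorrelation; should be near 0.8
r1 = np.corrcoef(u[1:], u[:-1])[0, 1]
```

Setting ρ negative instead produces the repelling, sign-alternating pattern described above.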

Causes of Autocorrelation
Indirect:
- Omitted variables
- Functional form
- Seasonality
Direct:
- Inertia or persistence
- Spatial correlation
- Cyclical influences

Consequences of Autocorrelation 1. The ordinary least squares estimators are still linear and unbiased. 2. Ordinary least squares is not efficient. 3. The usual formulas give incorrect standard errors for least squares. 4. Confidence intervals and hypothesis tests based on the usual standard errors are wrong.

Yt = 1 + 2Xt + et E(et, et-s)  0 ^ ^ Autocorrelated disturbances: Formula for ordinary least squares variance (no autocorrelation in disturbances): Formula for ordinary least squares variance (autocorrelated disturbances): Therefore when errors are autocorrelated ordinary least squares estimators are inefficient (i.e. not “best”)

Detecting Autocorrelation The residuals et provide proxies for the Ut.
Preliminary analysis (informal tests):
- Data: autocorrelation often occurs in time-series data (exceptions: spatial correlation, panel data)
- Graphical examination of residuals: plot et against time, or against et-1, to see if there is a relation

Formal Tests for Autocorrelation
- Runs test: analyse the uninterrupted sequences (runs) of residuals with the same sign
- Durbin-Watson (DW) d test: the ratio of the sum of squared differences in successive residuals to the residual sum of squares
- Breusch-Godfrey LM test: a more general test which does not assume the disturbances are AR(1)

et et-1 et Ho:  = 0 vs. H1:  = 0 ,  > 0, or  < 0 Durbin-Watson d Test Ho:  = 0 vs. H1:  = 0 ,  > 0, or  < 0 The Durbin-Watson Test statistic, d, is : et et-1 et n t = 2 t = 1 2 d= Ratio of the sum of squared differences in successive residuals to the residual sum of squares

The test statistic, d, is approximately related to the estimated ρ̂ as:
d ≈ 2(1 − ρ̂)
When ρ̂ = 0, the Durbin-Watson statistic is d ≈ 2.
When ρ̂ = 1, the Durbin-Watson statistic is d ≈ 0.
When ρ̂ = −1, the Durbin-Watson statistic is d ≈ 4.
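The approximation d ≈ 2(1 − ρ̂) is easy to verify numerically. A minimal sketch, using simulated AR(1) residuals with an illustrative ρ = 0.7:

```python
import numpy as np

rng = np.random.default_rng(2)

def durbin_watson(e):
    """d = sum_{t=2}^n (e_t - e_{t-1})^2 / sum_{t=1}^n e_t^2."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Residual proxy: AR(1) series with rho = 0.7
rho, n = 0.7, 10000
v = rng.normal(size=n)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + v[t]

d = durbin_watson(e)
rho_hat = np.sum(e[1:] * e[:-1]) / np.sum(e ** 2)  # lag-1 autocorrelation estimate
# d should sit close to 2 * (1 - rho_hat), well below 2 for positive rho
```

With ρ = 0.7 the statistic lands near 2(1 − 0.7) = 0.6, signalling positive autocorrelation.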

DW d Test: 4 Steps Step 1: Estimate the regression model and obtain the residuals et. Step 2: Compute the DW d test statistic. Step 3: Obtain dL and dU, the lower and upper critical points, from the Durbin-Watson tables.

Step 4: Implement the following decision rule:
- If d < dL, reject H0 in favour of positive autocorrelation.
- If d > 4 − dL, reject H0 in favour of negative autocorrelation.
- If dU < d < 4 − dU, do not reject H0.
- Otherwise (d between dL and dU, or between 4 − dU and 4 − dL), the test is inconclusive.
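The standard Durbin-Watson decision rule can be sketched as a small function; dL and dU must come from the Durbin-Watson tables for the given sample size and number of regressors (the numeric values in the usage note below are made up for illustration):

```python
def dw_decision(d: float, dL: float, dU: float) -> str:
    """Apply the Durbin-Watson step-4 decision rule to a computed d statistic."""
    if d < dL:
        return "reject H0: evidence of positive autocorrelation"
    if d > 4 - dL:
        return "reject H0: evidence of negative autocorrelation"
    if dU < d < 4 - dU:
        return "do not reject H0"
    return "inconclusive"
```

For example, with hypothetical table values dL = 1.2 and dU = 1.4, a computed d = 0.8 rejects H0 in favour of positive autocorrelation, while d = 1.3 falls in the inconclusive zone.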

Yt = 1 + 2X2t + 3X3t + 4Yt-1+ Ut Restrictive Assumptions: There is an intercept in the model X values are non-stochastic Disturbances are AR(1) Model does not include a lagged dependent variable as an explanatory variable, e.g. Yt = 1 + 2X2t + 3X3t + 4Yt-1+ Ut

Breusch-Godfrey LM Test This test is valid with lagged dependent variables and can be used to test for higher-order autocorrelation. Suppose, for example, that we estimate:
Yt = β1 + β2X2t + β3X3t + β4Yt-1 + Ut
and wish to test for autocorrelation of the form:
Ut = ρ1Ut-1 + ρ2Ut-2 + … + ρpUt-p + νt

Breusch-Godfrey LM Test: 4 Steps Step 1: Estimate Yt = β1 + β2X2t + β3X3t + β4Yt-1 + Ut and obtain the residuals et. Step 2: Estimate the following auxiliary regression model:
et = α1 + α2X2t + α3X3t + α4Yt-1 + ρ1et-1 + ρ2et-2 + … + ρpet-p + νt

Breusch-Godfrey LM Test Step 3: For large sample sizes, the test statistic is:
LM = (n − p)R² ~ χ²(p)
where R² is taken from the auxiliary regression. Step 4: If the test statistic exceeds the critical chi-square value, we can reject the null hypothesis of no serial correlation in any of the ρ terms.
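The four steps above can be sketched in plain numpy. This is an illustrative implementation on simulated data (the model, ρ = 0.6, and lag order p = 2 are made up); in practice a library routine such as statsmodels' acorr_breusch_godfrey would normally be used:

```python
import numpy as np

rng = np.random.default_rng(3)

def ols(X, y):
    """OLS fit: return residuals and the R-squared of the regression."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    r2 = 1.0 - (e @ e) / np.sum((y - y.mean()) ** 2)
    return e, r2

# Step 1: simulate a model with AR(1) errors and estimate it by OLS
n, rho, p = 300, 0.6, 2
x2 = rng.normal(size=n)
v = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + v[t]
y = 1.0 + 2.0 * x2 + u

X = np.column_stack([np.ones(n), x2])
e, _ = ols(X, y)

# Step 2: auxiliary regression of e_t on the regressors plus p lagged
# residuals (the first p observations are dropped to align the lags)
Xaux = np.column_stack([X[p:]] + [e[p - k : n - k] for k in range(1, p + 1)])
_, r2_aux = ols(Xaux, e[p:])

# Steps 3-4: LM statistic (n - p) * R^2, compared with the chi-square(p)
# critical value (5.99 at the 5% level for p = 2)
lm = (n - p) * r2_aux
reject = lm > 5.99
```

Because the simulated errors really are autocorrelated, the LM statistic comes out far above the critical value and the null of no serial correlation is rejected.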

Summary In this lecture we have: 1. Analysed the theoretical causes and consequences of autocorrelation 2. Described a number of methods for detecting the presence of autocorrelation