Lecturer Dr. Veronika Alhanaqtah


ECONOMETRICS

Topic 4.3. Possible problems in multiple linear regression estimated by OLS: Autocorrelation
1. Nature of autocorrelation
2. Autocorrelation of the first order
3. Consequences of autocorrelation
4. Correction of autocorrelation: robust standard errors
5. Detecting autocorrelation
5.1. Graphical analysis of residuals
5.2. The Durbin-Watson test
5.3. The Breusch-Godfrey test

OLS assumptions (from Topic 2)
(1) The expected value of a residual is zero for every observation: E(εi) = 0.
(2) The variance of the residuals is constant (even, uniform, homoscedastic) for every observation: Var(εi) = σ².
(3) Residuals are uncorrelated across observations: Cov(εi, εj) = 0 for i ≠ j.
(4) Residuals are independent of the regressors (x).
(5) The model is linear in its parameters. It follows that the beta-estimators are linear in yi: each estimator can be written as Σ ci yi, where the weights ci depend only on the regressors xi, not on the dependent variable y.

1. Nature of autocorrelation
Autocorrelation (serial correlation) is correlation between the residuals of different observations: Cov(εi, εj) ≠ 0 for some i ≠ j, which violates OLS assumption (3). It is most typical for time-series data. Positive autocorrelation means that a positive deviation tends to be followed by another positive deviation (and a negative by a negative).

1. Nature of autocorrelation
[Figure: scatter of Y against X, with summer and winter observations marked separately.]

1. Nature of autocorrelation
Example of negative autocorrelation. Negative autocorrelation means that a positive deviation is followed by a negative deviation, and vice versa. This situation can happen when we analyze the same relationship as above but use seasonal data (winter and summer observations alternating).

1. Nature of autocorrelation
Reasons for autocorrelation:
- mistakes in model specification
- time lags in changes of economic parameters
- the cobweb effect
- data smoothing

1. Reasons for autocorrelation
Model misspecification. If a relevant explanatory variable is omitted, or the functional form is wrong, the systematic part that is left out of the model ends up in the residuals; neighbouring residuals then tend to deviate in the same direction, i.e. to be correlated.

1. Reasons for autocorrelation
Time lag. Many economic parameters are cyclical, as a consequence of undulating economic cycles. Changes do not happen immediately: they take some time, i.e. a time lag.
Cobweb effect. In many spheres of economic activity, parameters react to changes in economic conditions with a delay, or time lag. For example, the supply of agricultural products reacts to price changes with a delay equal to the agricultural season. A high price of agricultural products in the previous year most likely leads to overproduction in the current year; as a consequence, the price falls, and so on. In this case the deviations of the residuals from each other are not accidental (random).
Data smoothing. Very often data over a long period of time are averaged over subintervals. To smooth a data set is to create an approximating function that captures the important patterns in the data while leaving out noise and other fine-scale structure.


1. Reasons for autocorrelation
Autocorrelation is examined in detail in time series analysis, spatial econometrics and panel data analysis. These are almost independent disciplines.

2. Autocorrelation of the first order
Autocorrelation may have a very complicated structure: AR, MA, ARMA, ARIMA, VAR, VMA, VARMA, VECM, ARCH, GARCH, EGARCH, FIGARCH, TARCH, AVARCH, ZARCH, CCC, DCC, BEKK, VEC, DLM, …

2. Autocorrelation of the first order
The simplest and most widely used scheme is first-order autocorrelation, AR(1):
εt = ρ εt−1 + ut,  |ρ| < 1,
where the ut are independent disturbances with zero mean and constant variance. The coefficient ρ is the correlation between successive residuals: ρ > 0 means positive autocorrelation, ρ < 0 negative autocorrelation, and ρ = 0 no autocorrelation.
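The first-order scheme is easy to see in a simulation. Below is a minimal sketch (in Python rather than the course's R; all names are illustrative): it generates residuals following the AR(1) equation and checks that the sample correlation between successive residuals is close to ρ.

```python
import random

def simulate_ar1(n, rho, seed=0):
    """Generate residuals following eps_t = rho * eps_{t-1} + u_t with iid N(0,1) shocks."""
    rng = random.Random(seed)
    eps = [rng.gauss(0.0, 1.0)]
    for _ in range(n - 1):
        eps.append(rho * eps[-1] + rng.gauss(0.0, 1.0))
    return eps

def lag1_corr(e):
    """Sample correlation between e_t and e_{t-1}."""
    a, b = e[1:], e[:-1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

eps = simulate_ar1(5000, rho=0.7)
print(lag1_corr(eps))  # close to 0.7
```

With a long series the sample lag-1 correlation recovers ρ quite precisely; with short series (like n = 19 later in this lecture) it is much noisier, which is why formal tests are needed.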

3. Consequences of autocorrelation
Beta-estimators are still linear and unbiased. Linearity means that the coefficient estimators are still linear with respect to y. Unbiasedness means that, on average, the estimators hit the unknown β-coefficients.
Beta-estimators remain consistent as long as the regressors are exogenous (a consistent estimator gives a precise value in large samples); with a lagged dependent variable among the regressors, however, OLS becomes inconsistent.
Beta-estimators are not efficient: they no longer have the smallest variance in comparison with estimators obtained by other methods.
The variance of the beta-coefficients is computed with bias. This bias is a consequence of the following fact: the variance not explained by the regression equation is no longer an unbiased estimate. The variance is highly likely underestimated, which leads to overestimated t-statistics. Consequently, if we rely on such inferences, we may mistakenly accept as significant coefficients that, in fact, are not significant.
The standard errors of the beta-coefficients are inconsistent: even when the number of observations is very large, the estimated variance of the beta-coefficients remains incorrect (biased).

3. Consequences of autocorrelation
As a consequence of the above, in the presence of autocorrelation:
statistical inferences based on the t- and F-statistics (which determine the significance of the beta-coefficients and of the coefficient of determination R2) are very likely incorrect, so the predictive qualities of the model deteriorate;
we can still use and interpret the beta-coefficients (the beta-estimators remain unbiased), but the standard errors are inconsistent, so we cannot construct confidence intervals for the beta-coefficients or test hypotheses about them.

4. Correction of autocorrelation: robust standard errors
What to do in the presence of autocorrelation? Correct the standard errors! We use a heteroscedasticity and autocorrelation consistent (HAC) covariance matrix instead of the usual one: the Newey-West estimator of the covariance matrix (1987; a number of later variants exist). This estimator improves inference in an OLS regression when the errors exhibit heteroscedasticity or autocorrelation, and it is computed by software. So, the idea is to replace the usual standard errors with robust (HAC) standard errors, which are the square roots of the diagonal elements of the corresponding robust covariance matrix.



4. Correction of autocorrelation: robust standard errors
In practice (in R):
(1) Estimate the model as usual:
model <- lm(data=data, y~x+z)
(2) Compute the robust covariance matrix ("sandwich" package):
vcovHAC(model)
(3) Use the robust covariance matrix for hypothesis testing ("lmtest" package):
coeftest(model, vcov.=vcovHAC)
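To illustrate what a HAC estimator such as vcovHAC does, here is a minimal sketch (in Python with illustrative names, not the course's R) for the simplest case, the variance of a sample mean, using the Bartlett weights of the original Newey-West proposal: with positively autocorrelated data the HAC variance exceeds the naive iid variance, which is exactly the correction the robust standard errors deliver.

```python
import random

def newey_west_var_mean(x, lags):
    """Newey-West (HAC) estimate of Var(mean(x)) with Bartlett kernel weights."""
    n = len(x)
    m = sum(x) / n
    d = [xi - m for xi in x]
    gamma0 = sum(di * di for di in d) / n          # sample variance (lag 0)
    s = gamma0
    for j in range(1, lags + 1):
        gj = sum(d[t] * d[t - j] for t in range(j, n)) / n  # lag-j autocovariance
        s += 2.0 * (1.0 - j / (lags + 1)) * gj              # Bartlett weight
    return s / n                                    # variance of the sample mean

rng = random.Random(1)
x = [rng.gauss(0, 1)]                               # AR(1) data with rho = 0.8
for _ in range(1999):
    x.append(0.8 * x[-1] + rng.gauss(0, 1))
m = sum(x) / len(x)
naive_var = sum((xi - m) ** 2 for xi in x) / len(x) / len(x)  # gamma0 / n
hac_var = newey_west_var_mean(x, lags=20)
print(hac_var > naive_var)  # True: positive autocorrelation inflates the true variance
```

With lags = 0 the formula collapses to the naive iid variance, so the HAC estimator is a strict generalization of the usual one.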

4. Correction of autocorrelation: robust standard errors
When is it advisable to use a robust covariance matrix and robust standard errors? In cases when we suspect the presence of autocorrelation but do not want to model its structure (the structure of the relationship between residuals): for example, in time series, or in data where there is geographical closeness between observations.

5. Detecting autocorrelation
5.1. Graphical analysis of residuals
Plot the residuals et against time (or et against et−1). Under positive autocorrelation, residuals of the same sign come in long runs; under negative autocorrelation, the signs tend to alternate; if there is no autocorrelation, the plot looks like random noise around zero.


5. Detecting autocorrelation
5.2. The Durbin-Watson test. Algorithm
Estimate the regression model by OLS and obtain the residuals et.
Compute the DW statistic:
DW = Σt=2..n (et − et−1)2 / Σt=1..n et2
Note: the distribution of the DW statistic is rather sophisticated; it is impossible to tabulate exact critical values for all possible samples; only lower and upper bounds (dL and dU) for the critical value of DW can be computed.
Statistical inference for the DW test: if r is the sample correlation of successive residuals, then DW ≈ 2(1 − r), so:
DW ≈ 0: strong positive autocorrelation
DW ≈ 2: no autocorrelation
DW ≈ 4: strong negative autocorrelation
Note: 1st-order autocorrelation is autocorrelation between "yesterday's" and "today's" observations.
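The DW statistic and its link to the residual correlation can be checked numerically. A minimal sketch (in Python with illustrative names; the lecture's own computations are done in R):

```python
import random

def durbin_watson(e):
    """DW = sum_{t=2}^n (e_t - e_{t-1})^2 / sum_{t=1}^n e_t^2; lies in [0, 4]."""
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(et ** 2 for et in e)
    return num / den

rng = random.Random(42)
e = [rng.gauss(0, 1)]                      # AR(1) residuals with rho = 0.6
for _ in range(999):
    e.append(0.6 * e[-1] + rng.gauss(0, 1))
dw = durbin_watson(e)
rho_hat = sum(e[t] * e[t - 1] for t in range(1, len(e))) / sum(v * v for v in e)
print(dw < 2.0)  # True: positive autocorrelation pushes DW below 2
```

For a long series the approximation DW ≈ 2(1 − r) is very tight; the small discrepancy comes only from the first and last residuals entering the two sums asymmetrically.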

5. Detecting autocorrelation
5.2. The Durbin-Watson test
Decision rule, using the tabulated bounds dL and dU: if DW < dL, reject H0 (positive autocorrelation is present); if DW > dU, do not reject H0; if dL ≤ DW ≤ dU, the test is inconclusive. To test for negative autocorrelation, apply the same rule to 4 − DW.


5. Detecting autocorrelation
5.3. The Breusch-Godfrey test. Algorithm
Estimate the regression model by OLS and obtain the residuals et.
Estimate an auxiliary regression of et on the initial regressors and the lagged residuals et−1, …, et−p, and compute its coefficient of determination R2aux.
Compute the BG statistic: BG = n·R2aux, where n is the number of observations and p is the assumed order of autocorrelation.
Statistical inference for the BG test: H0: no autocorrelation (the coefficients on all p lagged residuals are zero). If H0 is true, the BG statistic has a χ2 distribution with p degrees of freedom. If BG exceeds the critical value, H0 is rejected: there is autocorrelation in the model.
Note: it is advisable to prefer the BG test to the DW test. Nowadays, analogues of the DW test are used mainly in spatial econometrics.
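The mechanics of the test can be sketched end to end: run the auxiliary regression of the residuals on the regressors and lagged residuals, take n·R2, and compare with the χ2 critical value. A minimal illustration in Python (all names are mine; in a real analysis one would use bgtest from R's lmtest package):

```python
import random

def ols_r2(y, X):
    """R^2 of an OLS regression of y on the columns of X (intercept included in X).
    Solves the normal equations by Gaussian elimination with partial pivoting."""
    n, k = len(y), len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    b = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for cc in range(c, k):
                A[r][cc] -= f * A[c][cc]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    yhat = [sum(X[i][c] * beta[c] for c in range(k)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def breusch_godfrey(e, x, p):
    """BG statistic n*R^2 from the auxiliary regression of e_t on
    (1, x_t, e_{t-1}, ..., e_{t-p}); chi-square with p df under H0."""
    rows, y = [], []
    for t in range(p, len(e)):
        rows.append([1.0, x[t]] + [e[t - j] for j in range(1, p + 1)])
        y.append(e[t])
    return len(y) * ols_r2(y, rows)

rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(300)]   # one regressor, illustrative
e = [rng.gauss(0, 1)]                       # strongly autocorrelated residuals
for _ in range(299):
    e.append(0.7 * e[-1] + rng.gauss(0, 1))
bg = breusch_godfrey(e, x, p=1)
print(bg > 3.84)  # True: 3.84 = chi2(0.95, df=1), so H0 of no autocorrelation is rejected
```

With strongly autocorrelated residuals and a reasonably long series the statistic is far above the critical point; with only 19 observations, as in the worked example below, the test has much less power.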

5. Detecting autocorrelation. Examples of the DW and BG tests
Model with n = 19 observations.
Questions: Is there autocorrelation in the model? What is the order of autocorrelation (p)?
Assumption 1: autocorrelation of the first order. Assumption 2: autocorrelation of the second order.
Step 1. Autocorrelation of the 1st order: the DW test. In software we computed DW = 1.32. From the formula DW ≈ 2(1 − r) we compute r ≈ 1 − 1.32/2 = 0.34. There is only weak correlation between "yesterday's" and "today's" observations, i.e. almost no autocorrelation of the 1st order.

5. Detecting autocorrelation. Examples of the DW and BG tests
Step 2. Autocorrelation of the 2nd order: the BG test. In software we estimate an auxiliary regression of the residuals on the initial regressors and the residuals lagged once and twice, and compute the auxiliary R2. Then we compute the test statistic BG = n·R2.
H0 (no autocorrelation): the coefficients on both lagged residuals are simultaneously zero; if H0 is true, BG has a χ2 distribution with 2 degrees of freedom. Otherwise Ha holds.
Find the critical point in R: qchisq(0.95, df=2) ≈ 5.99.
Statistical inference: BG is below the critical point, so H0 (no 2nd-order autocorrelation) is not rejected. With so few observations, there is not enough evidence to reject H0. To sum up, we do not have enough data to reject the complete absence of autocorrelation.
