Lecture 17 Preview: Autocorrelation (Serial Correlation)
Regression Model
Standard Ordinary Least Squares (OLS) Premises
Estimation Procedures Embedded within the Ordinary Least Squares (OLS) Estimation Procedure
Covariance and Independence
What Is Autocorrelation (Serial Correlation)?
Consequences of Autocorrelation: The Mathematics, Our Suspicions, Confirming Our Suspicions
Accounting for Autocorrelation: An Example
Justifying the Generalized Least Squares (GLS) Estimation Procedure
Robust Standard Errors
Regression Model: yt = βConst + βxxt + εt, t = 1, 2, …, T, where yt is the dependent variable, xt the explanatory variable, and εt the error term; βConst and βx are the parameters. The error term is a random variable representing random influences: Mean[εt] = 0.
Standard Ordinary Least Squares (OLS) Premises:
Error Term Equal Variance Premise: The variance of the error term’s probability distribution for each observation is the same.
Error Term/Error Term Independence Premise: The error terms are independent.
Explanatory Variable/Error Term Independence Premise: The explanatory variables, the xt’s, and the error terms, the εt’s, are not correlated.
The OLS Estimation Procedure Includes Three Estimation Procedures:
Values of the parameters, βConst and βx: bx = Σ(xt − x̄)(yt − ȳ) / Σ(xt − x̄)², bConst = ȳ − bx x̄
Variance of the error term’s probability distribution, Var[e]: EstVar[e] = SSR / Degrees of Freedom
Variance of the coefficient estimate’s probability distribution, Var[bx]: EstVar[bx] = EstVar[e] / Σ(xt − x̄)²
Good News: When the standard premises are satisfied, each of these procedures is unbiased. Good News: When the standard premises are satisfied, the OLS estimation procedure for the coefficient value is the best linear unbiased estimation procedure (BLUE).
Crucial Point: When the ordinary least squares (OLS) estimation procedure performs its calculations, it implicitly assumes that the three standard (OLS) premises are satisfied.
Question: What happens when the error term/error term independence premise is violated?
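The three embedded estimation procedures above can be sketched numerically. This is a minimal illustration with made-up data; the function name and the example numbers are ours, not from the lecture:

```python
import numpy as np

def ols(x, y):
    """The three embedded OLS procedures: coefficient estimates,
    EstVar[e] = SSR / (T - 2), and EstVar[bx] = EstVar[e] / sum((x - xbar)^2)."""
    T = len(x)
    xbar, ybar = x.mean(), y.mean()
    sum_sqr_xdev = ((x - xbar) ** 2).sum()
    bx = ((x - xbar) * (y - ybar)).sum() / sum_sqr_xdev
    bconst = ybar - bx * xbar
    resid = y - (bconst + bx * x)
    est_var_e = (resid ** 2).sum() / (T - 2)   # SSR / degrees of freedom
    est_var_bx = est_var_e / sum_sqr_xdev
    return bx, bconst, est_var_e, est_var_bx

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.1, 5.9, 8.0])            # roughly y = 2x
bx, bconst, est_var_e, est_var_bx = ols(x, y)
```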
Correlation, Independence, and Covariance
Question: Does precipitation in Amherst help you predict the Nasdaq growth rate? Answer: No. The variables are independent. In a scatter diagram of the deviations of precipitation and Nasdaq growth, the observations are spread about evenly across the quadrants; hence, the covariance is about 0 (.9 ≈ 0).
Question: Does the Dow Jones growth rate help you predict the Nasdaq growth rate? Answer: Yes. The variables are not independent. In a scatter diagram of the deviations of Dow Jones and Nasdaq growth, most observations lie in quadrants I and III; hence, the covariance is positive (19.5).
Summary: When two variables are independent, their covariance is 0. When two variables are not independent, their covariance is not 0.
Variance of the sum of two independent variables: Var[x + y] = Var[x] + Var[y]
Variance of the sum of two variables: Var[x + y] = Var[x] + 2Cov[x, y] + Var[y]
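The variance-of-a-sum rule can be checked numerically. A sketch with simulated series (ours, not the precipitation/Nasdaq data):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.8 * x + rng.normal(size=100_000)   # y depends on x, so Cov[x, y] > 0

cov_xy = np.cov(x, y, ddof=1)[0, 1]
var_sum = np.var(x + y, ddof=1)
var_rule = np.var(x, ddof=1) + 2.0 * cov_xy + np.var(y, ddof=1)
# var_sum matches var_rule: Var[x + y] = Var[x] + 2Cov[x, y] + Var[y]
```

The identity holds exactly for sample moments (same ddof on each term), not just in expectation.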
What Is Autocorrelation? Autocorrelation often appears when analyzing time series data. Autocorrelation exists whenever the value of the previous observation’s error term would help us predict the value of this observation’s error term. (Lab 17.1)
Model: εt = ρεt−1 + vt, where the vt’s are independent, reflect random influences, and have Mean[vt] = 0.
No Autocorrelation: ρ = 0, so εt = vt.
Autocorrelation: ρ ≠ 0, so εt depends on εt−1. When ρ > 0: if εt−1 < 0, typically εt < 0; if εt−1 > 0, typically εt > 0.
Conclusion: When autocorrelation exists the error terms are not independent. When autocorrelation is present the error term/error term independence premise is violated.
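A quick simulation of the autocorrelation model, in the spirit of Lab 17.1 (our choice of ρ = .9 and sample size; a sketch, not the lab itself):

```python
import numpy as np

def ar1_errors(rho, T, rng):
    """Error terms from the model et = rho * e(t-1) + vt,
    with independent vt's and Mean[vt] = 0."""
    v = rng.normal(size=T)
    e = np.empty(T)
    e[0] = v[0]
    for t in range(1, T):
        e[t] = rho * e[t - 1] + v[t]
    return e

rng = np.random.default_rng(1)
e_auto = ar1_errors(0.9, 50_000, rng)
e_none = ar1_errors(0.0, 50_000, rng)
corr_auto = np.corrcoef(e_auto[1:], e_auto[:-1])[0, 1]   # near rho = .9
corr_none = np.corrcoef(e_none[1:], e_none[:-1])[0, 1]   # near 0
```

With ρ > 0 the correlation between this period's error and last period's is clearly positive; with ρ = 0 it is negligible.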
Consequences of Autocorrelation
How does the presence of autocorrelation affect the ordinary least squares (OLS) estimation procedures for the value of the coefficient, the variance of the error term’s probability distribution, and the variance of the coefficient estimate’s probability distribution?
Estimation Procedure for the Value of the Coefficient. Question: In the presence of autocorrelation, is the OLS estimation procedure for the value of the coefficient unbiased? That is, does Mean[bx] still equal βx?
Review: Arithmetic of Means
Mean of a constant plus a variable: Mean[c + x] = c + Mean[x]
Mean of a constant times a variable: Mean[cx] = cMean[x]
Mean of the sum of two variables: Mean[x + y] = Mean[x] + Mean[y]
Mean of the Coefficient Estimate’s Probability Distribution
Start from the expression for the coefficient estimate: bx = βx + Σ(xt − x̄)εt / Σ(xt − x̄)². Apply the arithmetic of means step by step: Mean[c + x] = c + Mean[x]; rewrite the fraction as a product; Mean[cx] = cMean[x]; Mean[x + y] = Mean[x] + Mean[y]; Mean[cx] = cMean[x]; and Mean[ε1] = Mean[ε2] = Mean[ε3] = 0. Hence Mean[bx] = βx.
Question: Have we relied on the error term/error term independence premise to show that the OLS estimation procedure for the coefficient value is unbiased? No.
Question: In the presence of autocorrelation, should we expect the OLS estimation procedure for the coefficient value still to be unbiased? Yes.
Focus on Step 2: Did the absence of autocorrelation play a role here?
OLS Estimation Procedure: Variance of the Coefficient Estimate’s Probability Distribution. Question: In the presence of autocorrelation, can we be confident that the OLS estimation procedure for the variance of the coefficient estimate’s probability distribution will still be unbiased?
Recall the two-step strategy we used to estimate the variance of the coefficient estimate’s probability distribution:
Step 1: Estimate the variance of the error term’s probability distribution from the available information: EstVar[e] = SSR / Degrees of Freedom.
Step 2: Apply the relationship between the variances of the coefficient estimate’s and error term’s probability distributions, Var[bx] = Var[e] / Σ(xt − x̄)²: EstVar[bx] = EstVar[e] / Σ(xt − x̄)².
Focus on Step 2: Did the absence of autocorrelation play a role here?
Review: Arithmetic of Variances
Variance of a constant times a variable: Var[cx] = c²Var[x]
Variance of the sum of a constant and a variable: Var[c + x] = Var[x]
Variance of the sum of two variables: Var[x + y] = Var[x] + 2Cov[x, y] + Var[y]
Variance of the sum of two independent variables: Var[x + y] = Var[x] + Var[y]
Variance of the Coefficient Estimate’s Probability Distribution
Start again from bx = βx + Σ(xt − x̄)εt / Σ(xt − x̄)². Apply the arithmetic of variances step by step: Var[c + x] = Var[x]; rewrite the fraction as a product; Var[cx] = c²Var[x]; because the εt’s are independent, the covariance terms drop out; Var[cx] = c²Var[x]; Var[εt] = Var[e] for every t; factor out Var[e]. Hence Var[bx] = Var[e] / Σ(xt − x̄)². Let us look at the independence step more carefully.
When autocorrelation is present the error terms are not independent.
Error term independence premise: the εt’s are independent. Crucial Observation: We did rely on the error term/error term independence premise to derive the relationship Var[bx] = Var[e] / Σ(xt − x̄)² used in Step 2 of our strategy (Step 1: EstVar[e] = SSR / Degrees of Freedom; Step 2: EstVar[bx] = EstVar[e] / Σ(xt − x̄)²). When autocorrelation is present the error terms are not independent. When the error terms are not independent, can we ignore the covariances? No.
Question: Have we relied on the error term/error term independence premise to show that Var[bx] = Var[e] / Σ(xt − x̄)²? Yes.
Question: In the presence of autocorrelation, can we expect the OLS estimation procedure for the variance of the coefficient estimate’s probability distribution still to be unbiased? No.
But when autocorrelation is present
Our Suspicions: In the presence of autocorrelation, the OLS estimation procedure for the coefficient value should be unbiased, but the OLS estimation procedure for the variance of the coefficient estimate’s probability distribution may be flawed.
Is the estimation procedure for the coefficient value unbiased? Unbiased estimation procedure: after many, many repetitions of the experiment, the average (mean) of the estimates equals the actual value. In the simulation, the actual coefficient (Act Coef) is set to 2; for each repetition we record the coefficient estimate (Coef Value Est), then compute the mean (average) and the variance of the value estimates from all repetitions.
Is the estimation procedure for the variance of the coefficient estimate’s probability distribution unbiased? For each repetition we calculate the estimate of the variance of the coefficient estimate’s probability distribution (Coef Var Est), EstVar[bx] = EstVar[e] / Sum Sqr XDev with EstVar[e] = SSR / Degrees of Freedom, and compare the average of the variance estimates from all repetitions with the actual variance of the estimated coefficient values. But when autocorrelation is present…
Is the OLS estimation procedure for the coefficient’s value unbiased?
Simulation Results (Lab 17.2). Is the OLS estimation procedure for the coefficient’s value unbiased? Is the OLS estimation procedure for the variance of the coefficient estimate’s probability distribution unbiased?

Rho   Actual Value of βx   Mean (Average) of the Estimated Values, bx, from All Repetitions   Variance of the Estimated Coef Values, bx, from All Repetitions   Average of Estimated Variances, EstVar[bx], from All Repetitions
0     2.0                  2.0                                                                .22                                                               .22
.6    2.0                  2.0                                                                1.11                                                              .28

When autocorrelation is absent: nothing but good news. When autocorrelation is present: Good news: the OLS estimation procedure for the coefficient value is unbiased. Bad news: the OLS procedure for estimating the variance of the coefficient estimate’s probability distribution is biased. All calculations based on the variance estimate will be flawed; that is, the standard errors, t-statistics, and tail probabilities appearing on the OLS regression printout are flawed.
Summary: Is the estimation procedure an unbiased estimation procedure for the:
coefficient value? Std Premises (OLS): Yes. Autocorrelation (OLS): Yes.
variance of the coefficient estimate’s probability distribution? Std Premises (OLS): Yes. Autocorrelation (OLS): No.
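The Lab 17.2 results can be reproduced in spirit with a short Monte Carlo. A sketch: the sample size, x values, and repetition count below are our own choices, so the numbers will not match the lab's exactly, but the pattern (unbiased coefficient, understated variance under autocorrelation) does:

```python
import numpy as np

def monte_carlo(rho, reps=2000, T=30, beta_x=2.0, seed=2):
    """For each repetition: draw autocorrelated errors, run OLS,
    record bx and the OLS estimate EstVar[bx]."""
    rng = np.random.default_rng(seed)
    x = np.arange(1.0, T + 1)                # fixed explanatory variable
    xdev = x - x.mean()
    ssx = (xdev ** 2).sum()
    bxs, est_vars = [], []
    for _ in range(reps):
        v = rng.normal(size=T)
        e = np.empty(T)
        e[0] = v[0]
        for t in range(1, T):
            e[t] = rho * e[t - 1] + v[t]     # et = rho * e(t-1) + vt
        y = beta_x * x + e
        bx = (xdev * (y - y.mean())).sum() / ssx
        resid = y - y.mean() - bx * xdev
        est_vars.append((resid ** 2).sum() / (T - 2) / ssx)
        bxs.append(bx)
    return np.mean(bxs), np.var(bxs), np.mean(est_vars)

mean0, var0, avg_est0 = monte_carlo(rho=0.0)
mean6, var6, avg_est6 = monte_carlo(rho=0.6)
# With rho = 0, the average EstVar[bx] tracks the actual variance of bx;
# with rho = .6, it understates the actual variance.
```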
Accounting for Autocorrelation
Step 1: Apply the Ordinary Least Squares (OLS) Estimation Procedure. Estimate the model’s parameters with the ordinary least squares (OLS) estimation procedure.
Step 2: Consider the Possibility of Autocorrelation. Ask whether there is reason to suspect that autocorrelation may be present. Use the ordinary least squares (OLS) regression results to “get a sense” of whether autocorrelation is a problem by examining the residuals. Use the Lagrange Multiplier approach by estimating an artificial regression to test for the presence of autocorrelation. Estimate the value of the parameter ρ in the autocorrelation model.
Step 3: Apply the Generalized Least Squares (GLS) Estimation Procedure. Apply the model of autocorrelation and algebraically manipulate the original model to derive a new, tweaked model in which the error terms do not suffer from autocorrelation. Use the ordinary least squares (OLS) estimation procedure to estimate the parameters of the tweaked model.
An Example: Disposable income and the consumption of durables.
Theory: Higher disposable income increases the consumption of durables.
Model: ConsDurt = βConst + βIncInct + εt. Theory: βInc > 0.
Consumer Durable Data: Monthly time series data of consumer durable consumption and income statistics, 2004 to 2009.
ConsDurt: Consumption of durables in month t (billions of 2005 dollars)
Inct: Disposable income in month t (billions of 2005 dollars)
Ordinary Least Squares (OLS)
Step 1: Apply the Ordinary Least Squares (OLS) Estimation Procedure.
Model: ConsDurt = βConst + βIncInct + εt. Dependent Variable: ConsDur. Explanatory Variable: Inc.
EViews Ordinary Least Squares (OLS) results (Dependent Variable: ConsDur): Inc Prob = 0.0000; Const Prob = 0.0656; Number of Observations: 72.
Interpretation: We estimate that a $1 increase in real disposable income increases the real consumption of durable goods by $.087.
Critical Result: The Inc coefficient estimate equals .087. This evidence, the positive sign of the coefficient estimate, suggests that higher disposable income increases the consumption of consumer durables, thereby supporting the theory.
H0: βInc = 0 — Higher disposable income does not affect the consumption of durables.
H1: βInc > 0 — Higher disposable income increases the consumption of durables.
Prob[Results IF H0 True] = .0000/2 < .0001.
Is there a potential problem here? If autocorrelation is present, the standard errors, t-statistics, and tail probabilities are flawed.
Strong economic period
Step 2: Consider the Possibility of Autocorrelation. Ask whether there is reason to suspect that autocorrelation may be present.
Model: ConsDurt = βConst + βIncInct + εt.
Key Observation: Business cycles tend to last for many months.
Strong economic period: Consumer confidence was high last month; consumers spent more freely, consumed more, last month: εt−1 > 0. Typically, consumer confidence will continue to be high this month; consumers will spend more freely, consume more, this month: εt > 0.
Weak economic period: Consumer confidence was low last month; consumers spent less freely, consumed less, last month: εt−1 < 0. Typically, consumer confidence will continue to be low this month; consumers will spend less freely, consume less, this month: εt < 0.
We can use the value of last month’s error term to predict this month’s. The error terms are not independent: positive autocorrelation is present.
Use the ordinary least squares (OLS) regression results to “get a sense” of whether autocorrelation is a problem by examining the residuals. We can think of the residuals as the estimated errors. The error terms, the εt’s, are unobservable; the residuals, the Rest’s, are observable:
yt = βConst + βxxt + εt, so εt = yt − (βConst + βxxt)
Estyt = bConst + bxxt, so Rest = yt − Estyt = yt − (bConst + bxxt)
Plotting Rest against Rest−1 in EViews: typically a negative residual is followed by another negative residual, and a positive residual is typically followed by another positive residual. Most scatter diagram points lie in the first and third quadrants.
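This diagnostic can be sketched with simulated data (the true parameters and ρ = .8 below are our choices for illustration, not the consumer-durables estimates):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
x = rng.normal(size=T)

# Errors with positive autocorrelation: et = .8 * e(t-1) + vt.
v = rng.normal(size=T)
e = np.empty(T)
e[0] = v[0]
for t in range(1, T):
    e[t] = 0.8 * e[t - 1] + v[t]
y = 10.0 + 2.0 * x + e

# Fit by OLS, then compare each residual with the previous one.
X = np.column_stack([np.ones(T), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
res = y - X @ b
lag_corr = np.corrcoef(res[1:], res[:-1])[0, 1]   # positive: points in quadrants I and III
```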
The Original Model: yt = βConst + βxxt + εt. The εt’s are not observable.
Use the Lagrange Multiplier approach by estimating an artificial regression to test for the presence of autocorrelation.
The Original Model: yt = βConst + βxxt + εt. The εt’s are not observable.
Autocorrelation Model: εt = ρεt−1 + vt. The vt’s are independent.
OLS Estimates: Estyt = bConst + bxxt. Residuals: Rest = yt − Estyt. The Rest’s are observable.
Rest = yt − Estyt
= βConst + βxxt + εt − (bConst + bxxt)
= βConst + βxxt + ρεt−1 + vt − (bConst + bxxt)   [substituting εt = ρεt−1 + vt]
= βConst − bConst + βxxt − bxxt + ρεt−1 + vt   [rearranging terms]
= (βConst − bConst) + (βx − bx)xt + ρεt−1 + vt   [factoring]
We can think of the residuals as the estimated error terms. But εt−1, an actual error term, is not observable, so we use the lagged residual in its place:
Rest = (βConst − bConst) + (βx − bx)xt + ρRest−1 + vt
Should we worry about autocorrelation here? No, the vt’s are independent.
Lagrange Multiplier (LM) Ordinary Least Squares (OLS)
EViews Ordinary Least Squares (OLS) artificial regression (Dependent Variable: Resid; Explanatory Variables: Inc, Const, Resid(−1)): Inc Prob = 0.8133; Const Prob = 0.8173; Resid(−1) Prob = 0.0000; Number of Observations: 72.
Critical Result: The Resid(−1) coefficient estimate is positive. The positive sign of the coefficient estimate suggests that an increase in last period’s residual increases this period’s residual. This evidence suggests that autocorrelation is present.
H0: ρ = 0 — No autocorrelation present. H1: ρ > 0 — Autocorrelation present.
Prob[Results IF H0 True] = .0000/2 < .0001.
Estimate the value of the autocorrelation parameter, ρ. Autocorrelation Model: εt = ρεt−1 + vt. Dependent Variable: Residual; Explanatory Variable: ResidualLag (no constant). NB: The autocorrelation model does not include a constant.
EViews Ordinary Least Squares (OLS) results (Dependent Variable: Residual): ResidualLag Prob = 0.0000; Number of Observations: 71.
Estimate of ρ: Estρ = .839.
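For a no-constant regression of the residual on its own lag, the OLS estimate of ρ reduces to a simple ratio. A sketch (the residual series below is simulated with ρ = .8 as a stand-in for first-stage OLS residuals; it is not the consumer-durables data):

```python
import numpy as np

def estimate_rho(res):
    """No-constant OLS of Res_t on Res_(t-1):
    Est(rho) = sum(Res_(t-1) * Res_t) / sum(Res_(t-1)^2)."""
    return (res[:-1] * res[1:]).sum() / (res[:-1] ** 2).sum()

# Illustrative residual series: et = .8 * e(t-1) + vt.
rng = np.random.default_rng(4)
T = 2000
v = rng.normal(size=T)
res = np.empty(T)
res[0] = v[0]
for t in range(1, T):
    res[t] = 0.8 * res[t - 1] + v[t]

rho_hat = estimate_rho(res)   # should land near .8
```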
The original model: yt = βConst + βxxt + εt, where yt = ConsDurt and xt = Inct.
Step 3: Apply the Generalized Least Squares (GLS) Estimation Procedure. Apply the model of autocorrelation and algebraically manipulate the original model to derive a new, tweaked model in which the error terms do not suffer from autocorrelation.
The original model: yt = βConst + βxxt + εt, where yt = ConsDurt and xt = Inct.
Autocorrelation model: εt = ρεt−1 + vt, where the vt’s are independent.
For t: yt = βConst + βxxt + εt
For t−1: yt−1 = βConst + βxxt−1 + εt−1
Multiply by ρ: ρyt−1 = ρβConst + ρβxxt−1 + ρεt−1
Subtract: yt − ρyt−1 = βConst − ρβConst + βxxt − ρβxxt−1 + εt − ρεt−1
Factor out βx: yt − ρyt−1 = βConst − ρβConst + βx(xt − ρxt−1) + εt − ρεt−1
Since εt = ρεt−1 + vt, the εt−1’s disappear: yt − ρyt−1 = βConst(1 − ρ) + βx(xt − ρxt−1) + vt
Substitute Estρ for ρ: (yt − Estρ·yt−1) = βConst(1 − Estρ) + βx(xt − Estρ·xt−1) + vt
Adjyt = βConst(1 − ρ) + βxAdjxt + vt, where Adjyt = yt − Estρ·yt−1 and Adjxt = xt − Estρ·xt−1.
New dependent variable: AdjConsDurt = ConsDurt − .839ConsDurt−1. New explanatory variable: AdjInct = Inct − .839Inct−1.
Question: Is autocorrelation still an issue? Answer: No, the vt’s are independent.
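The three-step procedure can be sketched end to end on simulated data. A sketch under assumptions of our own: true βConst = 5, βx = 2, ρ = .8, a slowly moving regressor, and the first observation simply dropped when quasi-differencing:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 400
x = np.cumsum(rng.normal(size=T))       # a slowly moving explanatory variable
v = rng.normal(size=T)
e = np.empty(T)
e[0] = v[0]
for t in range(1, T):
    e[t] = 0.8 * e[t - 1] + v[t]
y = 5.0 + 2.0 * x + e                   # the original model

# Step 1: OLS on the original model.
X = np.column_stack([np.ones(T), x])
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
res = y - X @ b_ols

# Step 2: estimate rho from the residuals (no constant).
rho_hat = (res[:-1] * res[1:]).sum() / (res[:-1] ** 2).sum()

# Step 3: quasi-difference and rerun OLS on the tweaked model.
adj_y = y[1:] - rho_hat * y[:-1]
adj_x = x[1:] - rho_hat * x[:-1]
A = np.column_stack([np.ones(T - 1), adj_x])
b_gls, *_ = np.linalg.lstsq(A, adj_y, rcond=None)
# b_gls[1] estimates beta_x; b_gls[0] estimates beta_Const * (1 - rho)
```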
Ordinary Least Squares (OLS)
Use the ordinary least squares (OLS) estimation procedure to estimate the parameters of the tweaked model. Dependent Variable: AdjConsDur. Explanatory Variable: AdjInc.
EViews Ordinary Least Squares (OLS) results (Dependent Variable: AdjConsDur): AdjInc Prob = 0.1545; Const Prob = 0.0093; Number of Observations: 71.
H0: βInc = 0. H1: βInc > 0.
Prob[Results IF H0 True] = .1545/2 ≈ .077.
The Ordinary Least Squares (OLS) and Generalized Least Squares (GLS) estimates of βInc compared — Tails Prob: Ordinary Least Squares (OLS): <.0001; Generalized Least Squares (GLS): .1545.
Justifying the Generalized Least Squares (GLS) Estimation Procedure
Lab 17.4, Sample Size = 30:

Rho   Estim Proc   Actual Value of βx   Mean (Average) of the Estimated Values, bx, from All Repetitions   Variance of the Estimated Values, bx, from All Repetitions   Average of Estimated Variances, EstVar[bx], from All Repetitions
0     OLS          2.0                  2.0                                                                 .22                                                          .22
.6    OLS          2.0                  2.0                                                                 1.11                                                         .28
.6    GLS          2.0                  2.0                                                                 1.01

Question: What about GLS? Summary: When autocorrelation is present: Can the ordinary least squares (OLS) estimation procedure for the variance of the coefficient estimate’s probability distribution be trusted? No. Which estimation procedure for the coefficient value is better, the ordinary least squares (OLS) or the generalized least squares (GLS)? GLS. Is the ordinary least squares (OLS) estimation procedure for the coefficient value the best linear unbiased estimation procedure (BLUE)? No.
Two issues emerge with the ordinary least squares (OLS) estimation procedure when autocorrelation is present: The standard error calculations made by the ordinary least squares (OLS) estimation procedure are flawed. While the ordinary least squares (OLS) estimation procedure for the coefficient value is unbiased, it is not the best linear unbiased estimation procedure (BLUE).
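A sketch of Lab 17.4's comparison. For simplicity we use the true ρ in the GLS step (the lab estimates it) and our own sample design, so the numbers differ from the table, but the efficiency ranking is the same:

```python
import numpy as np

rho, T, beta_x = 0.6, 30, 2.0
rng = np.random.default_rng(6)
b_ols, b_gls = [], []
for _ in range(3000):
    x = rng.normal(size=T)
    v = rng.normal(size=T)
    e = np.empty(T)
    e[0] = v[0]
    for t in range(1, T):
        e[t] = rho * e[t - 1] + v[t]
    y = beta_x * x + e
    # OLS on the original model
    xd = x - x.mean()
    b_ols.append((xd * (y - y.mean())).sum() / (xd ** 2).sum())
    # GLS: quasi-difference with the (here, known) rho, then OLS
    ay = y[1:] - rho * y[:-1]
    ax = x[1:] - rho * x[:-1]
    axd = ax - ax.mean()
    b_gls.append((axd * (ay - ay.mean())).sum() / (axd ** 2).sum())

mean_ols, mean_gls = np.mean(b_ols), np.mean(b_gls)
var_ols, var_gls = np.var(b_ols), np.var(b_gls)
# Both procedures are unbiased; the GLS estimates are less spread out.
```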
Ordinary Least Squares (OLS)
Robust Standard Errors: An Alternative Approach. Two issues emerge with the ordinary least squares (OLS) estimation procedure when autocorrelation is present: The standard error calculations made by the ordinary least squares (OLS) estimation procedure are flawed. While the ordinary least squares (OLS) estimation procedure for the coefficient value is unbiased, it is not the best linear unbiased estimation procedure (BLUE). Robust standard errors address the first issue.
EViews with HAC (Newey-West) standard errors (Dependent Variable: ConsDur): Inc Prob = 0.0032; Const Prob = 0.2822; Number of Observations: 72.
Standard errors based on the error term/error term independence premise (ordinary OLS, Dependent Variable: ConsDur): Inc Prob = 0.0000; Const Prob = 0.0656; Number of Observations: 72.
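For a single-regressor model, a HAC variance can be sketched by hand: a Bartlett-weighted sum of the autocovariances of the per-observation score contributions. This is a simplified Newey-West calculation on simulated data (our own lag choice and data; packages such as EViews handle details like lag selection for you):

```python
import numpy as np

def hac_var_slope(x, y, maxlags=4):
    """Simplified Newey-West (HAC) variance for the OLS slope in y = a + b*x."""
    xdev = x - x.mean()
    ssx = (xdev ** 2).sum()
    b = (xdev * (y - y.mean())).sum() / ssx
    res = y - y.mean() - b * xdev
    u = xdev * res                       # per-observation score contributions
    s = (u ** 2).sum()
    for lag in range(1, maxlags + 1):
        w = 1.0 - lag / (maxlags + 1.0)  # Bartlett kernel weight
        s += 2.0 * w * (u[lag:] * u[:-lag]).sum()
    return s / ssx ** 2

# Demo: trending x with positively autocorrelated errors.
rng = np.random.default_rng(7)
T = 300
x = np.arange(float(T))
v = rng.normal(size=T)
e = np.empty(T)
e[0] = v[0]
for t in range(1, T):
    e[t] = 0.7 * e[t - 1] + v[t]
y = 1.0 + 2.0 * x + e

xdev = x - x.mean()
ssx = (xdev ** 2).sum()
b = (xdev * (y - y.mean())).sum() / ssx
res = y - y.mean() - b * xdev
naive_var = (res ** 2).sum() / (T - 2) / ssx   # assumes independent errors
hac_var = hac_var_slope(x, y, maxlags=8)       # larger under positive autocorrelation
```

This mirrors the printouts above: with positive autocorrelation, the HAC variance exceeds the naive OLS variance, so the HAC t-statistic is smaller and the tail probability larger.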