AUTOCORRELATION

The third Gauss-Markov condition is that the values of the disturbance term in the observations in the sample be generated independently of each other.

[Figure: plot of y against x with fitted line y = β1 + β2x]
In the graph above, it is clear that this condition is violated. Positive values tend to be followed by positive ones, and negative values by negative ones. Successive values tend to have the same sign. This is described as positive autocorrelation.
[Figure: plot of y against x with fitted line y = β1 + β2x]

In this graph, positive values tend to be followed by negative ones, and negative values by positive ones. This is an example of negative autocorrelation.
First-order autoregressive autocorrelation: AR(1)

A particularly common type of autocorrelation, at least as an approximation, is first-order autoregressive autocorrelation, usually denoted AR(1) autocorrelation:

u_t = ρ u_{t-1} + ε_t
It is autoregressive because u_t depends on lagged values of itself, and first-order because it depends only on its previous value. u_t also depends on ε_t, an injection of fresh randomness at time t, often described as the innovation at time t.
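As a concrete illustration (not part of the original slides), an AR(1) disturbance term can be simulated in a few lines of Python. The function name and parameters here are my own.

```python
import random

def simulate_ar1(rho, n, seed=1):
    """Simulate u_t = rho * u_{t-1} + eps_t with standard normal innovations."""
    rng = random.Random(seed)
    u = []
    u_prev = 0.0                       # start the process at zero
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)      # innovation eps_t at time t
        u_prev = rho * u_prev + eps    # AR(1) recursion
        u.append(u_prev)
    return u

series = simulate_ar1(rho=0.6, n=50)
```

With rho set to 0 the recursion simply reproduces the innovations, which matches the no-autocorrelation benchmark used later in this sequence.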
Fifth-order autoregressive autocorrelation: AR(5)

Here is a more complex example of autoregressive autocorrelation. It is described as fifth-order, and so denoted AR(5), because u_t depends on lagged values of itself up to the fifth lag:

u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + ρ_3 u_{t-3} + ρ_4 u_{t-4} + ρ_5 u_{t-5} + ε_t
Third-order moving average autocorrelation: MA(3)

The other main type of autocorrelation is moving average autocorrelation, where the disturbance term is a linear combination of the current innovation and a finite number of previous ones.
This example is described as third-order moving average autocorrelation, denoted MA(3), because the disturbance term depends on the three previous innovations as well as the current one:

u_t = λ_0 ε_t + λ_1 ε_{t-1} + λ_2 ε_{t-2} + λ_3 ε_{t-3}
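An MA(3) disturbance can be simulated similarly (again my own sketch, with hypothetical weights `lam`): draw the innovations first, including three pre-sample values, and form the weighted sum.

```python
import random

def simulate_ma3(lam, n, seed=1):
    """Simulate u_t = lam[0]*eps_t + lam[1]*eps_{t-1} + lam[2]*eps_{t-2} + lam[3]*eps_{t-3}."""
    rng = random.Random(seed)
    # n innovations plus 3 pre-sample ones, so u_0 has its three lags available
    eps = [rng.gauss(0.0, 1.0) for _ in range(n + 3)]
    # eps[t + 3] is the innovation at time t, so eps[t + 3 - j] is its j-th lag
    return [sum(lam[j] * eps[t + 3 - j] for j in range(4)) for t in range(n)]

u = simulate_ma3([1.0, 0.5, 0.3, 0.2], n=50)
```

Setting all weights except lam[0] to zero recovers a serially independent disturbance term.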
The rest of this sequence gives examples of the patterns that are generated when the disturbance term is subject to AR(1) autocorrelation. The object is to provide some benchmark images to help you assess plots of residuals in time series regressions.
We will use 50 independent values of ε_t, taken from a normal distribution with mean 0, and generate series for u using different values of ρ.
We have started with ρ equal to 0, so there is no autocorrelation. We will increase ρ progressively in steps of 0.1.
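The experiment just described can be reproduced with a short script: draw one fixed set of 50 innovations and pass it through the AR(1) recursion for each value of ρ. This is a sketch, not the code used to produce the original plots.

```python
import random

rng = random.Random(42)
eps = [rng.gauss(0.0, 1.0) for _ in range(50)]   # one fixed set of 50 innovations

def ar1_from_innovations(rho, innovations):
    """Apply u_t = rho * u_{t-1} + eps_t to a given innovation series."""
    u, prev = [], 0.0
    for e in innovations:
        prev = rho * prev + e
        u.append(prev)
    return u

# the same innovations filtered with rho = 0.0, 0.1, ..., 0.9
series_by_rho = {k / 10: ar1_from_innovations(k / 10, eps) for k in range(10)}
```

Because every series is built from the same innovations, any difference in appearance between the plots is due solely to ρ.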
[Plots of the series for ρ = 0.1 and ρ = 0.2]
With ρ equal to 0.3, a pattern of positive autocorrelation is beginning to be apparent.
[Plots of the series for ρ = 0.4 and ρ = 0.5]
With ρ equal to 0.6, it is obvious that u is subject to positive autocorrelation. Positive values tend to be followed by positive ones and negative values by negative ones.
[Plots of the series for ρ = 0.7 and ρ = 0.8]
With ρ equal to 0.9, the sequences of values with the same sign have become long and the tendency to return to 0 has become weak.
The process is now approaching what is known as a random walk, where ρ is equal to 1 and the process becomes nonstationary. The terms random walk and nonstationarity will be defined in the next chapter. For the time being we will assume |ρ| < 1.
Next we will look at negative autocorrelation, starting with the same set of 50 independently distributed values of ε_t.
We will take larger steps this time.
With ρ equal to -0.6, you can see that positive values tend to be followed by negative ones, and vice versa, more frequently than you would expect as a matter of chance.
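One way to see this tendency numerically (my own illustration, not from the slides) is to count sign changes between successive values: under independence about half of successive pairs change sign, while with ρ = -0.6 the proportion is well above one half.

```python
import random

def sign_changes(u):
    """Count how often successive values of the series have opposite signs."""
    return sum(1 for a, b in zip(u, u[1:]) if a * b < 0)

# simulate a long AR(1) series with rho = -0.6 so the proportion is stable
rng = random.Random(0)
neg, prev = [], 0.0
for _ in range(5000):
    prev = -0.6 * prev + rng.gauss(0.0, 1.0)
    neg.append(prev)

print(sign_changes(neg) / len(neg))   # well above the 0.5 expected under independence
```

The slides use only 50 observations, which is why the pattern has to be judged by eye there; with a long simulated series the excess of sign changes is unmistakable.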
Now the pattern of negative autocorrelation is very obvious.
[EViews regression output: dependent variable LGFOOD, method least squares, 36 observations; coefficients, standard errors, t-statistics and p-values for C, LGDPI, and LGPRFOOD, with summary statistics including the Durbin-Watson statistic]

Finally, we will look at a plot of the residuals of the logarithmic regression of expenditure on food on income and relative price.
This is a plot of the residuals, of course, not of the disturbance term. But if the disturbance term is subject to autocorrelation, the residuals will be subject to a similar pattern of autocorrelation.
You can see that there is strong evidence of positive autocorrelation. Comparing the graph with the randomly generated patterns, one would say that ρ is about 0.6 or 0.7. The next step is to perform a formal test for autocorrelation, the subject of the next sequence.
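The standard formal test, covered in the next sequence, is the Durbin-Watson test. The statistic itself is easy to compute from the residuals; the sketch below uses the usual formula d = Σ(e_t - e_{t-1})² / Σ e_t², with d approximately 2(1 - ρ̂).

```python
def durbin_watson(residuals):
    """Durbin-Watson d statistic: near 2 with no autocorrelation,
    below 2 with positive autocorrelation, above 2 with negative."""
    num = sum((a - b) ** 2 for a, b in zip(residuals[1:], residuals[:-1]))
    den = sum(e * e for e in residuals)
    return num / den

# strong positive autocorrelation pushes d towards 0;
# these perfectly alternating residuals give d = 3.0
print(durbin_watson([1.0, -1.0, 1.0, -1.0]))   # 3.0
```

A residual plot like the one above, with ρ̂ around 0.6 or 0.7, would therefore produce a d statistic well below 2.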
Copyright Christopher Dougherty. This slideshow may be freely copied for personal use.