Presentation transcript:

AUTOCORRELATION 1 The third Gauss-Markov condition is that the values of the disturbance term in the observations in the sample be generated independently of each other. [Figure: scatter of observations about the line y = β1 + β2x, with y on the vertical axis and x on the horizontal axis]

AUTOCORRELATION 2 In the graph above, it is clear that this condition is violated. Positive values tend to be followed by positive ones, and negative values by negative ones. Successive values tend to have the same sign. This is described as positive autocorrelation. [Figure: scatter showing positive autocorrelation about the line y = β1 + β2x]

AUTOCORRELATION 3 In this graph, positive values tend to be followed by negative ones, and negative values by positive ones. This is an example of negative autocorrelation. [Figure: scatter showing negative autocorrelation about the line y = β1 + β2x]

First-order autoregressive autocorrelation: AR(1) AUTOCORRELATION 4 A particularly common type of autocorrelation, at least as an approximation, is first-order autoregressive autocorrelation, usually denoted AR(1) autocorrelation.
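In symbols, writing ρ for the autoregressive parameter and ε_t for the fresh random term at time t (the symbols used throughout this sequence), the AR(1) process is

$$ u_t = \rho u_{t-1} + \varepsilon_t $$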

AUTOCORRELATION 5 It is autoregressive, because u_t depends on lagged values of itself, and first-order, because it depends only on its previous value. u_t also depends on ε_t, an injection of fresh randomness at time t, often described as the innovation at time t.

Fifth-order autoregressive autocorrelation: AR(5) AUTOCORRELATION 6 Here is a more complex example of autoregressive autocorrelation. It is described as fifth-order, and so denoted AR(5), because it depends on lagged values of u_t up to the fifth lag.
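In the same notation, with a separate coefficient on each lag (the ρ_1, …, ρ_5 labels here are a generic convention, not taken from the slide), a fifth-order autoregressive process is

$$ u_t = \rho_1 u_{t-1} + \rho_2 u_{t-2} + \rho_3 u_{t-3} + \rho_4 u_{t-4} + \rho_5 u_{t-5} + \varepsilon_t $$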

Third-order moving average autocorrelation: MA(3) AUTOCORRELATION 7 The other main type of autocorrelation is moving average autocorrelation, where the disturbance term is a linear combination of the current innovation and a finite number of previous ones.
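Writing the moving-average weights as λ_1, λ_2, λ_3 (again a generic convention rather than the slide's own labels), a third-order moving average process is

$$ u_t = \varepsilon_t + \lambda_1 \varepsilon_{t-1} + \lambda_2 \varepsilon_{t-2} + \lambda_3 \varepsilon_{t-3} $$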

AUTOCORRELATION 8 This example is described as third-order moving average autocorrelation, denoted MA(3), because it depends on the three previous innovations as well as the current one.

AUTOCORRELATION 9 The rest of this sequence gives examples of the patterns that are generated when the disturbance term is subject to AR(1) autocorrelation. The object is to provide some benchmark images to help you assess plots of residuals in time series regressions.

AUTOCORRELATION 10 We will use 50 independent values of ε, taken from a normal distribution with 0 mean, and generate series for u using different values of ρ.
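A minimal sketch of the simulation just described, assuming numpy and matplotlib are available; the random seed, the unit variance of the innovations, and the particular ρ values plotted are illustrative choices, not taken from the slides.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
eps = rng.normal(loc=0.0, scale=1.0, size=50)      # 50 independent innovations with mean 0

def ar1_series(rho, eps):
    """Build u_t = rho * u_{t-1} + eps_t, starting the series at u_0 = eps_0."""
    u = np.empty_like(eps)
    u[0] = eps[0]
    for t in range(1, len(eps)):
        u[t] = rho * u[t - 1] + eps[t]
    return u

# Both positive and negative values of rho can be tried; rho = 0 reproduces pure noise.
for rho in (0.0, 0.3, 0.6, 0.9):
    plt.plot(ar1_series(rho, eps), label=f"rho = {rho}")
plt.axhline(0.0, color="black", linewidth=0.5)
plt.legend()
plt.show()
```

Because the same innovations are reused for every value of ρ, the plotted series differ only through the autoregressive parameter, which is exactly the comparison made in the slides that follow.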

AUTOCORRELATION 11 We have started with ρ equal to 0, so there is no autocorrelation. We will increase ρ progressively in steps of 0.1.

AUTOCORRELATION 12

AUTOCORRELATION 13

AUTOCORRELATION 14 With ρ equal to 0.3, a pattern of positive autocorrelation is beginning to be apparent.

AUTOCORRELATION 15

AUTOCORRELATION 16

AUTOCORRELATION 17 With ρ equal to 0.6, it is obvious that u is subject to positive autocorrelation. Positive values tend to be followed by positive ones and negative values by negative ones.

AUTOCORRELATION 18

AUTOCORRELATION 19

AUTOCORRELATION 20 With ρ equal to 0.9, the sequences of values with the same sign have become long and the tendency to return to 0 has become weak.

AUTOCORRELATION 21 The process is now approaching what is known as a random walk, where ρ is equal to 1 and the process becomes nonstationary. The terms random walk and nonstationarity will be defined in the next chapter. For the time being we will assume |ρ| < 1.
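For orientation (the formal treatment is left to the next chapter), setting ρ equal to 1 in the AR(1) equation above gives

$$ u_t = u_{t-1} + \varepsilon_t $$

so each value is simply the previous value plus a fresh innovation, and the series has no tendency to be pulled back towards 0.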

AUTOCORRELATION 22 Next we will look at negative autocorrelation, starting with the same set of 50 independently-distributed values of ε_t.

AUTOCORRELATION 23 We will take larger steps this time.

AUTOCORRELATION 24 With ρ equal to -0.6, you can see that positive values tend to be followed by negative ones, and vice versa, more frequently than you would expect as a matter of chance.

AUTOCORRELATION 25 Now the pattern of negative autocorrelation is very obvious.

[EViews output: Dependent Variable: LGFOOD; Method: Least Squares; Included observations: 36; regressors C, LGDPI, LGPRFOOD; the table reports the coefficient estimates with their standard errors, t-statistics, and p-values, together with R-squared, adjusted R-squared, the standard error of the regression, the Akaike and Schwarz criteria, the log likelihood, the F-statistic, and the Durbin-Watson statistic.]

AUTOCORRELATION 26 Finally, we will look at a plot of the residuals of the logarithmic regression of expenditure on food on income and relative price.

AUTOCORRELATION 27 This is the plot of the residuals, of course, not of the disturbance term. But if the disturbance term is subject to autocorrelation, then the residuals will be subject to a similar pattern of autocorrelation.

AUTOCORRELATION 28 You can see that there is strong evidence of positive autocorrelation. Comparing the graph with the randomly generated patterns, one would say that ρ is about 0.6 or 0.7. The next step is to perform a formal test for autocorrelation, the subject of the next sequence.
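As a preview of that test, here is a hedged sketch of how the same regression, its residuals, and the Durbin-Watson statistic could be computed with statsmodels; the file name demand.csv and the column names FOOD, DPI, and PRFOOD are hypothetical stand-ins for the data behind the EViews output above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical annual data: expenditure on food, disposable personal income, relative price of food.
data = pd.read_csv("demand.csv")

y = np.log(data["FOOD"])                                  # LGFOOD
X = sm.add_constant(np.log(data[["DPI", "PRFOOD"]]))      # C, LGDPI, LGPRFOOD

results = sm.OLS(y, X).fit()
print(results.summary())

residuals = results.resid                                 # the series plotted on the slide
print("Durbin-Watson:", durbin_watson(residuals))
```

A Durbin-Watson statistic close to 2 is consistent with no AR(1) autocorrelation; the strongly positive pattern visible in the residual plot would show up as a statistic well below 2.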

Copyright Christopher Dougherty. This slideshow may be freely copied for personal use.