Unit Roots & Forecasting


Unit Roots & Forecasting Methods of Economic Investigation Lecture 20

Last Time
- Descriptive time series processes
- Estimating with exogenous serial correlation
- Estimating with endogenous processes

Today's Class
- Non-stationary time series
  - Unit roots and spurious regressions
  - Orders of integration
- Returning to causal effects
  - Impulse response functions
- Forecasting

Random Walk Processes
- Definition: Et[xt+1] = xt, that is, today's value of x is the best predictor of tomorrow's value.
- This looks very similar to our AR(1) process, with φ = 1.
- Autocovariances of a random walk are not well defined in a technical sense, but imagine an AR(1) process with φ ≈ 1: we have nearly perfect autocorrelation between any two time periods.
- Persistence dies out so slowly that most of the variance will be due to very low-frequency "shocks."

Permanence of Shocks in Unit Root
- An innovation (a shock at time t) to a stationary AR process dies out eventually (the autocorrelation function declines to zero).
- A shock to a random walk is permanent.
- The variance increases over time: Var(xt) = Var(x0) + tσ2
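The linear variance growth is easy to check numerically. A minimal Monte Carlo sketch (illustrative only, not from the lecture; numpy assumed available): simulate many random walks and compute the cross-sectional variance at each date.

```python
import numpy as np

# Monte Carlo check that a random walk's variance grows linearly:
# Var(x_t) = Var(x_0) + t * sigma^2 (here x_0 is the first shock).
rng = np.random.default_rng(7)
sigma, T, paths = 1.0, 200, 20000
shocks = rng.normal(0.0, sigma, size=(paths, T))
x = np.cumsum(shocks, axis=1)      # each row is one random walk
var_t = x.var(axis=0)              # variance across paths at each date

# After t periods the walk is a sum of t shocks, so Var ~ t * sigma^2.
print(var_t[49] / 50, var_t[199] / 200)   # both close to 1.0
```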

Drifts and Trends
- Deterministic trend: yt = δt + xt + εt, where xt is some stationary process; yt is then "trend stationary."
- It is easy to add a deterministic trend to a random walk: with drift, yt = δ + yt-1 + εt rises by δ per period on average.

Orders of Integration
- A series is integrated of order p if p differences render it stationary.
- If a time series is integrated and differencing once renders it stationary, it is integrated of order 1, or I(1).
- If it is necessary to difference twice before the series is stationary, it is I(2), and so forth.

Integrated Series
- If a time series has a unit root, it is said to be integrated. First differencing the time series removes the unit root.
- E.g. in the case of a random walk yt = yt-1 + ut, ut ~ N(0, σ2), the first difference Δyt = ut is white noise, which is stationary.
- For an AR(p), a unit root implies the lag polynomial factors as
  1 − β1L − β2L2 − ... − βpLp = (1 − L)(1 − λ1L − λ2L2 − ... − λp-1Lp-1) = 0
  and as a result first differencing also removes the unit root.
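The random-walk case can be made concrete with a short simulation (illustrative only; numpy assumed available): the first difference of a simulated random walk recovers the white-noise innovations exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
u = rng.normal(0.0, 1.0, T)   # white-noise innovations u_t ~ N(0, 1)
y = np.cumsum(u)              # random walk: y_t = y_{t-1} + u_t
dy = np.diff(y)               # first difference

# Differencing undoes the cumulative sum: dy_t equals u_t, so the
# differenced series is stationary white noise.
print(np.allclose(dy, u[1:]))   # True
```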

Non-stationarity
- Non-stationarity can have important consequences for regression models and inference:
  - Autoregressive coefficients are biased.
  - t-statistics have non-normal distributions, even in large samples.
  - Spurious regression.

Problem: Spurious Regression
- Imagine two series that are generated by independent random walks.
- Suppose we run yt on xt using OLS, that is, we estimate yt = α + βxt + νt.
- In this case we tend to see a "significant" β, because the low-frequency changes in the two series make it seem as if they are in some way associated.

Unit Root Tests
- The standard Dickey-Fuller test is appropriate for AR(1) processes.
- Many economic and financial time series have a more complicated dynamic structure than is captured by a simple AR(1) model.
- Said and Dickey (1984) augmented the basic autoregressive unit root test to accommodate general ARMA(p, q) models with unknown orders; the result is called the augmented Dickey-Fuller (ADF) test.

ADF Test – 1
- The ADF test tests the null hypothesis that a time series yt is I(1) against the alternative that it is I(0), assuming that the dynamics in the data have an ARMA structure.
- The ADF test is based on estimating the test regression
  Δyt = β′Dt + Φyt-1 + ψ1Δyt-1 + … + ψp-1Δyt-p+1 + εt
  where Dt contains the deterministic variables (constant and/or trend), Φyt-1 carries the potential unit root, and the lagged differences Δyt-j soak up the other serial correlation.

ADF Test - 2
- To see why, start from the AR(p) in levels, yt = α1yt-1 + … + αpyt-p + εt. Subtract yt-1 from both sides and define Φ = (α1 + α2 + … + αp − 1); rearranging gives
  Δyt = Φyt-1 + ψ1Δyt-1 + … + ψp-1Δyt-p+1 + εt
- Test H0: Φ = 0 against the alternative Φ < 0.
- Use the special Dickey-Fuller upper and lower critical bounds: under the null, the test statistic is not normally distributed.

Estimating in Time Series
- Non-stationary time series can lead to many problems in econometric analysis. To work with time series, particularly in regression models, we should therefore first transform our variables into stationary time series.
- First differencing removes unit roots or trends, so difference a time series until it is I(0). Differencing too often is less of a problem, since a differenced stationary series is still stationary.
- Regressions of one stationary variable on another are less problematic: although observations may not be independent, we can expect the regression to have properties similar to those with cross-sectional data.

Impulse Response Function
- One of the most interesting things to do with an ARMA model is to form predictions of the variable given its past: we want to know Et(xt+j), and we can do inference with Vart(xt+j).
- The impulse response function is a simple way to do that: follow the path that x takes if it is kicked by a single unit shock.
- It is a characterization of the behavior of our models, and it allows us to start thinking about "causes" and "effects."

Impulse Response and MA(∞)
- The MA(∞) representation is the same thing as the impulse response function.
- The easiest way to calculate an MA(∞) representation is to simulate the impulse response function.
- The impulse response function is the same as Et(xt+j) − Et−1(xt+j).
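Simulating the impulse response is exactly as described: feed in a single unit shock and iterate the recursion with no further shocks. A minimal sketch for an AR(1) with φ = 0.5 (illustrative; numpy assumed available):

```python
import numpy as np

# Impulse response of x_t = phi * x_{t-1} + e_t: set e_0 = 1 and
# e_t = 0 afterwards; the resulting path gives the MA(inf) weights.
phi, horizon = 0.5, 6
x = np.zeros(horizon)
x[0] = 1.0                   # the unit shock at t = 0
for t in range(1, horizon):
    x[t] = phi * x[t - 1]    # no further shocks

print(x)   # [1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125], i.e. phi**j
```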

Causality and Impulse Response
- We can either forecast or simulate the effect of a given shock:
  - Pick a shock time/level to simulate and try to replicate the observed data (the issue is whether that shock is what really happened).
  - Know that a shock happened at time t and see if the observed change follows (more on this next time).
- Granger causality implies a correlation between the current value of one variable and the past values of others; it does not necessarily imply that changes in one variable "cause" changes in another.
- Use an F-test to jointly test the significance of the lags on the explanatory variables; this in effect tests for "Granger causality" between the variables.
- The correlation can be seen visually in impulse response functions.

[Figure omitted. Source: Cochrane, QJE (1994)]

Next Time
- Estimating causality in time series
- Some additional forecasting material
- Testing for breaks
- Regression discontinuity / event study