Time Series EC Burak Saltoglu

Time Series EC 532 2017 Burak Saltoglu

EC 532, 2nd half: Time Series Analysis
Topic 1: Linear time series
Topic 2: Nonstationary time series
Topic 3: Cointegration and unit roots
Topic 4: Vector autoregression (VAR)
Topic 5: Volatility modeling (if time allows)

References

Time series books: Hamilton, J. (1994), Time Series Analysis; Enders, W. (2014), Applied Econometric Time Series; Chatfield (2003), The Analysis of Time Series; Diebold, F. (2006), Elements of Forecasting

Reference books: Tsay, R. (2013); Walters, Applied Time Series Methods, Wiley, 2013; Granger, Long-Run Economic Relationships, 1990; Hamilton, Time Series Analysis, 1994.

Topic 1: Linear Time Series
Outline:
Non-stationary time series
Distributed lag models
Nonlinear models

Outline
Linear time series models
ARDL models
Granger causality test
AR and MA processes
Diagnostics in time series: correlogram, Box-Pierce Q statistic, Ljung-Box (LB) statistic
Forecasting

Later, in Topic 2: stationary versus non-stationary time series; testing for stationarity

The Reasons for Using Time Series
Psychological reasons: people do not change their habits immediately.
Technological reasons: the quantity of a resource needed or bought may not adjust quickly.
Institutional reasons: there may be constraints on individuals.

Distributed Lag Models
In the distributed lag (DL) model we have not only the current value of the explanatory variable but also its past value(s). With DL models, the effect of a shock to the explanatory variable lasts longer. We can (in principle) estimate DL models with OLS, because the lags of X are also non-stochastic.

Autoregressive Models
In autoregressive (AR) models, past value(s) of the dependent variable become explanatory variables. We cannot estimate an autoregressive model with OLS due to (1) the presence of stochastic explanatory variables and (2) the possibility of serial correlation.

ARDL Models In the ARDL models, we have both AR and DL part in one regression.

Granger Causality Test
Let us consider the relation between GNP and the money supply. A regression analysis can show us the relation between these two, but it cannot tell us the direction of the relation. The Granger causality test examines causality between series: the direction of the relation. We can test whether GNP causes the money supply to increase, or whether a monetary expansion leads GNP to rise, under the conditions defined by Granger.

Granger Causality Test
Steps for testing whether M Granger-causes GNP:
1. Regress GNP on all lagged GNP; obtain the restricted residual sum of squares RSS_R.
2. Regress GNP on all lagged GNP and all lagged M; obtain the unrestricted RSS_U.
The null is that the coefficients on lagged M are all zero. Test statistic:
F = [(RSS_R - RSS_U)/m] / [RSS_U/(n - k)]
where m is the number of lags of M and k is the number of parameters in step 2; F has df (m, n - k).
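The two-step procedure above can be sketched with plain NumPy OLS. The simulated series, the lag length, and the variable names (`gnp`, `m_supply`) are illustrative assumptions, not data from the lecture; here money genuinely enters the GNP equation, so the F statistic should be large.

```python
import numpy as np

# Sketch of the Granger-causality F-test: restricted vs unrestricted RSS.
rng = np.random.default_rng(0)
T, p = 300, 2                         # sample size and lag length m
m_supply = rng.normal(size=T)
gnp = np.zeros(T)
for t in range(2, T):                 # lagged money truly affects GNP here
    gnp[t] = 0.3 * gnp[t-1] + 0.5 * m_supply[t-1] + rng.normal()

def lagmat(x, p):
    # columns x_{t-1}, ..., x_{t-p} aligned with x[p:]
    return np.column_stack([x[p-i-1:len(x)-i-1] for i in range(p)])

y = gnp[p:]
X_r = np.column_stack([np.ones(T - p), lagmat(gnp, p)])       # restricted
X_u = np.column_stack([X_r, lagmat(m_supply, p)])             # + lagged M

def rss(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    return e @ e

rss_r, rss_u = rss(X_r, y), rss(X_u, y)
n, k = len(y), X_u.shape[1]
F = ((rss_r - rss_u) / p) / (rss_u / (n - k))   # ~ F(m, n-k) under the null
```

A large F (well above the F(2, n-k) critical value of about 3) rejects the null that lagged M carries no predictive content for GNP.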

Granger Causality Test

Linear Time Series Models: y(t)
Time series analysis is useful when the economic relationship is difficult to specify. Even when explanatory variables for y exist, their future values may be unavailable, so they cannot be used to forecast y(t).

Stationary Stochastic Process
Any time series can be thought of as being generated by a stochastic process. A stochastic process is said to be stationary if its mean and variance are constant over time and the covariance between two time periods depends only on the distance (lag) between them, not on the actual time at which the covariance is computed.

Time series and white noise
A process is white noise if it has zero mean, constant variance, and zero autocovariance at all nonzero lags.

Stationary Time Series
If a time series is invariant with respect to shifts in time, the process can be estimated with fixed coefficients. Strict-sense stationarity: the joint distribution of (y_t1, ..., y_tn) is unchanged when every index is shifted by the same amount.

Stationarity
Wide-sense (weak) stationarity: constant mean, constant variance, and autocovariances that depend only on the lag.

Stationarity
Strict-sense stationarity implies wide-sense stationarity, but the reverse is not true. Implication of stationarity: inference obtained from a non-stationary series is misleading and wrong.

Linear Time Series Models-AR Basic ARMA Models

Lag Operators Or we can use lag polynomials

Lag operators and polynomials

AR vs MA Representation

AUTOCORRELATIONS and AUTOCOVARIANCE FUNCTIONS

Autocorrelation

Partial Autocorrelation

Linear Time Series -AR For AR(1);

Linear Time Series Models-AR

Linear Time Series Models-AR

Linear Time Series Models-AR(1) So if you have data generated by an AR(1) process, its correlogram will diminish slowly (if it is stationary).
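A quick simulation illustrates the slow geometric decay; the coefficient 0.9 and the sample size are illustrative choices.

```python
import numpy as np

# Simulate a stationary AR(1) and check that the sample correlogram
# decays slowly, roughly like phi**k for lag k.
rng = np.random.default_rng(1)
phi, T = 0.9, 5000
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t-1] + rng.normal()

def acf(x, k):
    # sample autocorrelation at lag k
    x = x - x.mean()
    return (x[:-k] @ x[k:]) / (x @ x)

acf1, acf5 = acf(y, 1), acf(y, 5)     # theory: 0.9 and 0.9**5 ~ 0.59
```

The lag-1 autocorrelation sits near 0.9 and is still well above zero at lag 5, the slow decay the slide describes.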

AR(1) process: y_t = 0.99 y_{t-1} + ε_t

AR process simulation: y_t = 0.90 y_{t-1} + ε_t

AR process simulation: y_t = 0.5 y_{t-1} + ε_t

AR(1) with weak predictable part: y_t = 0.05 y_{t-1} + ε_t

y_t = ε_t

y_t = 0.9 y_{t-1} + ε_t versus y_t = 0.8 y_{t-1} + ε_t

Linear Time Series Models-AR(p): autoregressive process; expected value of y:

Linear Time Series Models-MA: Moving Average MA(k) Process
The term 'moving average' comes from the fact that y is constructed as a weighted sum of the most recent error terms.

MA(1) Correlogram

Linear Time Series Models-MA(1) So if you have data generated by an MA(1) process, its correlogram will decline to zero quickly (after one lag).

An MA(1) example One major implication is that the MA(1) process has a memory of only one lag: it forgets after one period, remembering only the previous realization.
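The one-lag memory can be checked numerically: for an MA(1), ρ₁ = θ/(1 + θ²) and ρ_k = 0 for k ≥ 2. The choice θ = 0.5 (so ρ₁ = 0.4) is illustrative.

```python
import numpy as np

# Simulate y_t = eps_t + theta * eps_{t-1} and verify that the
# autocorrelation cuts off after one lag.
rng = np.random.default_rng(2)
theta, T = 0.5, 20000
e = rng.normal(size=T + 1)
y = e[1:] + theta * e[:-1]

def acf(x, k):
    x = x - x.mean()
    return (x[:-k] @ x[k:]) / (x @ x)

rho1_theory = theta / (1 + theta**2)   # = 0.4
rho1, rho2 = acf(y, 1), acf(y, 2)      # rho2 should be near zero
```

The sample ρ̂₁ sits near the theoretical 0.4 while ρ̂₂ is statistically indistinguishable from zero, which is exactly the "memory of one lag" property.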

Variance-autocovariance of MA(2) (** since the error term is white noise)

MA(2)

Linear Time Series Models-MA: Moving Average MA(k) Process
The error term is white noise. An MA(k) model has k + 2 parameters (the mean, the k MA coefficients, and the error variance). Variance of y:

Homework: derive the autocorrelation function for MA(3), ..., MA(k).

ARMA Models: ARMA(1,1)

ARMA(1,1)

ARMA(1,1)

Model Selection
How well does the model fit the data? Adding additional lags for p and q will reduce the SSR, but each new parameter costs degrees of freedom and tends to worsen the out-of-sample forecasting performance of the fitted model. A parsimonious model optimizes this trade-off.

Two Model Selection Criteria: the Akaike Information Criterion (AIC) and the Schwarz Bayesian Criterion (SBC). k is the number of parameters estimated: (p + q + 1) if an intercept is allowed, else k = p + q; T is the number of observations. Choose the lag order that minimizes the AIC or SBC. The AIC may be biased towards selecting an overparameterized model, whereas the SBC is asymptotically consistent.
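A small sketch of lag selection by these criteria. The SSR-based forms AIC = T·ln(SSR/T) + 2k and SBC = T·ln(SSR/T) + k·ln(T) (the versions in Enders) are an assumption here, as is the simulated AR(2) data-generating process.

```python
import numpy as np

# Fit AR(p) for p = 1..5 on simulated AR(2) data and compare SBC values.
rng = np.random.default_rng(3)
T = 400
y = np.zeros(T)
for t in range(2, T):                  # true process is AR(2)
    y[t] = 0.5 * y[t-1] + 0.3 * y[t-2] + rng.normal()

def info_criteria(y, p):
    Y = y[p:]
    X = np.column_stack([np.ones(T - p)] +
                        [y[p-i-1:T-i-1] for i in range(p)])
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    ssr = np.sum((Y - X @ b) ** 2)
    n, k = len(Y), p + 1               # intercept plus p AR coefficients
    aic = n * np.log(ssr / n) + 2 * k
    sbc = n * np.log(ssr / n) + k * np.log(n)
    return aic, sbc

sbc_by_p = {p: info_criteria(y, p)[1] for p in range(1, 6)}
best_p = min(sbc_by_p, key=sbc_by_p.get)   # SBC typically picks p = 2 here
```

Moving from p = 1 to the true order p = 2 lowers the SBC, while padding the model with extra useless lags raises it, which is the parsimony trade-off in action.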

Characterization of Time Series
Visual inspection; autocorrelation order selection; tests for significance: Bartlett, Box-Pierce, Ljung-Box.

Correlogram
One simple test of stationarity is based on the autocorrelation function (ACF). The ACF at lag k is the correlation between y_t and y_{t-k}.

Sample Autocorrelation

Correlogram If we plot the sample autocorrelations against k, the graph is called the correlogram. As an example, let us look at the correlogram of Turkey's GDP.

Correlogram: Autocorrelation Function

Test for autocorrelation: Bartlett's test, for the null that ρ_k = 0.

ISE30 Return Correlation

Box-Pierce Q Statistics To test the joint hypothesis that all the autocorrelation coefficients up to lag m are simultaneously zero, one can use the Q statistic, where m = lag length and n = sample size.

Box-Pierce Q Statistics

Ljung-Box (LB) Statistics The LB statistic is a variant of the Q statistic with better small-sample properties.
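Both statistics can be computed directly from the sample autocorrelations; under the null of no autocorrelation each is approximately χ²(m). The white-noise series, n = 500, and m = 10 are illustrative assumptions.

```python
import numpy as np

# Box-Pierce: Q = n * sum(rho_k^2);  Ljung-Box: LB = n(n+2) * sum(rho_k^2/(n-k))
rng = np.random.default_rng(4)
n, m = 500, 10
y = rng.normal(size=n)                 # white noise: no autocorrelation

def acf(x, k):
    x = x - x.mean()
    return (x[:-k] @ x[k:]) / (x @ x)

rho = np.array([acf(y, k) for k in range(1, m + 1)])
Q = n * np.sum(rho**2)
LB = n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, m + 1)))
# chi2(10) 5% critical value is about 18.31; for white noise both
# statistics should usually stay below it
```

Note LB ≥ Q by construction, since each term is scaled up by (n + 2)/(n − k) > 1; the finite-sample correction is what improves the χ² approximation.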

Box-Jenkins approach to time series data:
Stop if the series is non-stationary.
Identification: choose the ARMA orders p and q.
Estimation: estimate the ARMA coefficients.
Diagnostic checking: is the model appropriate?
Forecasting.

Forecasting timeline: estimation period t = 1, ..., T (today is T); ex post forecasting period T+1, ..., T+R; ex ante forecasts beyond T+R.

Introduction to forecasting

In practice If we can consistently estimate the order via the AIC, then we can forecast future values of y. There are alternative measures of forecast accuracy.

Mean Square Prediction Error (MSPE) Method: choose the model with the lowest MSPE. If there are H observations in the holdback period, the MSPE for Model 1 is defined as:
MSPE = (1/H) Σ_{h=1}^{H} (y_{T+h} - ŷ_{T+h})²
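A sketch of an MSPE comparison over a holdback sample. The AR(1) data-generating process, the holdout size H = 50, and the naive benchmark (forecasting the unconditional mean of zero) are illustrative assumptions.

```python
import numpy as np

# Compare one-step-ahead MSPE of a fitted AR(1) against a naive
# mean-zero forecast over an H-period holdback sample.
rng = np.random.default_rng(5)
T, H, phi = 300, 50, 0.8
y = np.zeros(T + H)
for t in range(1, T + H):
    y[t] = phi * y[t-1] + rng.normal()

# fit AR(1) without intercept on the estimation period t = 1..T
phi_hat = (y[1:T] @ y[:T-1]) / (y[:T-1] @ y[:T-1])

f1 = phi_hat * y[T-1:T+H-1]            # model 1: y_hat(T+h) = phi_hat * y(T+h-1)
f2 = np.zeros(H)                       # model 2: unconditional-mean forecast
actual = y[T:T+H]

mspe1 = np.mean((actual - f1) ** 2)
mspe2 = np.mean((actual - f2) ** 2)    # model 1 should win for persistent data
```

Because the series is persistent, the AR(1) forecasts track the data and deliver the lower MSPE, so the MSPE criterion selects model 1.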

A Forecasting example for AR(1) Suppose we are given

A Forecasting example for AR(1) Left for forecasting

Introduction to forecasting

Forecast of AR(1) model

obs      forecast        actual
y(151)   -6.452201702    -6.265965609
y(152)   -5.806981532    -5.225758143
y(153)   -5.226283379    -5.175019085
y(154)   -4.703655041    -4.383751313
y(155)   -4.233289537    -4.204952791
y(156)   -3.809960583    -4.594492147
y(157)   -3.428964525    -4.742611541
y(158)   -3.086068072    -2.272600688
y(159)   -2.777461265    -1.975709705
y(160)   -2.499715139    -2.205455725
y(161)   -2.249743625    -1.568798136

AR(1) forecast

Summary Find the AR, MA order via autocovariances, correlogram plots Use, AIC, SBC to choose orders Check LB stats Run a regression Do forecasting (use RMSE or MSE) to choose the best out-of-sample forecasting model.

Topic II: Testing for Stationarity and Unit Roots EC 532

Outline
What is a unit root, and why is it important?
Spurious regression
Tests for unit roots: Dickey-Fuller and augmented Dickey-Fuller tests

Stationarity and the random walk: can we test via the ACF or Box-Ljung? Why is a formal test necessary? Source: W. Enders, Chapters 4 and 6.

Spurious Regression Regressions involving trending time series can produce spurious or dubious results: two variables carrying the same trend will move together, but this does not mean there is a genuine or natural relationship between them.

Spurious regression One of the OLS assumptions was the stationarity of the series; when it fails, we call such a regression spurious (Granger and Newbold, 1974).

Unit roots and cointegration: Clive Granger and Robert Engle

Spurious regression: the least squares estimates are not consistent, and the usual tests and inference do not hold. As a rule of thumb (Granger and Newbold, 1974), suspect a spurious regression when R² exceeds the Durbin-Watson statistic.

Example Spurious Regression: two simulated random walks (Ar1.xls)
X_t = X_{t-1} + u_t, u_t ~ N(0,1)
Y_t = Y_{t-1} + ε_t, ε_t ~ N(0,1)
u_t and ε_t are independent.
Spurious regression: Y_t = βX_t + u_t

Variable      Coefficient   Standard Error   t Stat     P-value
X             -0.3081       0.03216          -9.58025   9.87E-16

Examples: Gozalo

Unit Roots: Stationarity

Stationary and unit roots

Some Time Series Models: Random Walk Model
y_t = y_{t-1} + ε_t, where the error term ε_t is white noise.

Random Walk Now let us look at the dynamics of such a model: substituting repeatedly, y_t = y_0 + Σ_{i=1}^{t} ε_i, so Var(y_t) = tσ², which grows without bound.

Implications of Random walk

Random Walk: BİST30 index

Random Walk: ISE percentage returns

Why is a formal test necessary? For instance, the graph of the daily Brent oil series below shows a non-stationary time series.

Brent Oil: 20 years of daily data. End of lecture.

How instructive is the ACF?

Does crude oil data follow a random walk (i.e. contain a unit root)? Neither the graph nor the autocovariance function is a formal proof of a random walk. How about a standard t-test?

Testing for Unit Roots: Dickey-Fuller
It would not be appropriate to use a standard t-test to reject the null of a unit root: the t-statistic does not have its usual distribution under that null. Hypothesis tests based on non-stationary variables cannot be evaluated analytically, but non-standard test statistics can be obtained via Monte Carlo. Dickey and Fuller (1979, 1981) developed a formal test for unit roots.

Dickey Fuller Test There are three versions of the Dickey-Fuller (DF) unit root test (no constant, constant, constant plus trend). The null hypothesis is the same in all versions: the coefficient β₁ on the lagged level is zero, i.e. a unit root.


Dickey Fuller Test The test involves to estimate any of the below specifications

Dickey Fuller test So we will run the DF regression and test whether the slope is significant. The test statistic has the same form as a conventional t-test, but its distribution under the null is non-standard.
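The DF regression with a constant, Δy_t = α + γy_{t−1} + ε_t, can be run by hand; the simulated random walk and seed are illustrative. The t-ratio on γ is compared with Dickey-Fuller critical values (roughly −2.86 at 5% in the constant-only case), not the standard normal.

```python
import numpy as np

# DF regression on a pure random walk (gamma = 0 under the null).
rng = np.random.default_rng(6)
T = 500
y = np.cumsum(rng.normal(size=T))      # y_t = y_{t-1} + eps_t

dy, ylag = np.diff(y), y[:-1]
X = np.column_stack([np.ones(T - 1), ylag])
b, *_ = np.linalg.lstsq(X, dy, rcond=None)
e = dy - X @ b
se_gamma = np.sqrt((e @ e / (T - 3)) * np.linalg.inv(X.T @ X)[1, 1])
t_gamma = b[1] / se_gamma
# with a true unit root, t_gamma typically stays above the DF 5%
# critical value of about -2.86, so the unit root is not rejected
```

The point of the slide is that this t-ratio must be judged against the DF tables: against the normal ±1.96 it would reject far too often.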

Running DF Regression

Testing DF in EVIEWS

DF: EVIEWS

Testing for DF for other specifications: RW with trend

Dickey-Fuller F-test (1981). Joint hypotheses are tested with F-type statistics, computed as conventional F statistics but compared against critical values from the Dickey-Fuller tables.


Augmented Dickey Fuller

Augmented Dickey Fuller Test The augmented Dickey-Fuller (ADF) test handles the autocorrelation problem. The number of lags included, m, should be large enough that the error term is not serially correlated. The null hypothesis is again the same. Let us consider the GDP example again.

Augmented Dickey Fuller Test

Augmented Dickey Fuller Test At the 99% confidence level, we cannot reject the null ("not augmented" regression).

Augmented Dickey Fuller Test At the 99% confidence level, we reject the null. This time we "augmented" the regression to handle serial correlation. ***Because GDP is non-stationary in levels but stationary in first differences, it is called integrated of order one, I(1). A stationary series is I(0).

Augmented Dickey Fuller Test To handle the autocorrelation problem, the augmented Dickey-Fuller (ADF) test was proposed. The number of lags included, p, should be large enough that the error term is not serially correlated; in practice we use the SBC or AIC to whiten the residuals. The null hypothesis is again the same.

ADF

Example: Daily Brent Oil. We cannot reject the null of a unit root.

Augmented Dickey-Fuller test statistic: -1.321561, p = 0.8823*
Test critical values: 1% -3.959824; 5% -3.410679; 10% -3.127123
*MacKinnon (1996) one-sided p-values.

Augmented Dickey-Fuller Test Equation
Dependent Variable: D(BRENT)
Included observations: 5137 after adjustments

Variable     Coefficient   Std. Error   t-Statistic   Prob.
BRENT(-1)    -0.001151     0.000871     -1.321561     0.1864
C            -0.010891     0.021220     -0.513233     0.6078
@TREND(1)    1.78E-05      9.00E-06     1.979332      0.0478

R-squared 0.000767; mean dependent var 0.011207

Diagnostics: Monthly trl30

Trl30 and 360

I(1) and I(0) Series If a series is stationary, it is said to be I(0). If a series is not stationary but its first difference is stationary, it is said to be difference stationary, or I(1).

The next presentation investigates the stationarity behaviour of more than one time series, known as cointegration.

COINTEGRATION EC332 Burak Saltoglu

Economic theory implies equilibrium relationships between the levels of time series variables that are best described as I(1). Similarly, arbitrage arguments imply that the I(1) prices of certain financial time series are linked (two stocks, two emerging-market bonds, etc.).

Cointegration If two (or more) series are themselves non-stationary (I(1)), but a linear combination of them is stationary (I(0)), then the series are said to be cointegrated. Examples: inflation and interest rates; exchange rates and inflation rates; money demand: inflation, interest rates, income.

Money demand: r = interest rate, y = income, infl = inflation. Each series in the money-demand equation may be non-stationary (I(1)), yet the money-demand relationship may be stationary: the series wander around individually, but as an equilibrium relationship MD is stable. Even though the series themselves are non-stationary, they move closely together over time and their difference is stationary.

COINTEGRATION ANALYSIS Consider the m time series variables y_{1t}, y_{2t}, ..., y_{mt}, each known to be non-stationary, i.e. I(1). Then y_t = (y_{1t}, y_{2t}, ..., y_{mt})' is said to form one or more cointegrating relations if there are linear combinations of the y_{it} that are I(0), i.e. if there exists an m × r matrix β such that β'y_t is I(0), where r denotes the number of cointegrating vectors.

Testing for Cointegration: Engle-Granger residual-based tests (Econometrica, 1987). Step 1: run an OLS regression of y_{1t} (say) on the rest of the variables, y_{2t}, y_{3t}, ..., y_{mt}, and save the residual from this regression.

Step 2: apply the Dickey-Fuller (DF) unit root test to the residuals.

Residual Based Cointegration test: Dickey Fuller test Therefore, testing for cointegration amounts to testing whether the residuals from a combination of I(1) series are I(0). If û is I(0), we conclude the series are cointegrated: even though the individual series are I(1), their linear combination is I(0). This means there is an equilibrium vector, and if the variables diverge from equilibrium they will converge back to it at a later date. If the residuals appear to be I(1), then no cointegrating relationship exists, and inference based on these variables is not reliable.
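The two-step procedure can be sketched on simulated data. The cointegrating coefficient β = 2, sample size, and seed are illustrative; note that in practice the step-2 statistic is compared with Engle-Granger critical values, which are more negative than the ordinary DF ones because β is estimated.

```python
import numpy as np

# Engle-Granger sketch: two I(1) series sharing a common stochastic trend.
rng = np.random.default_rng(7)
T = 600
x = np.cumsum(rng.normal(size=T))      # common I(1) trend
y = 2.0 * x + rng.normal(size=T)       # cointegrated with beta = 2

# Step 1: cointegrating regression, save the residuals
X = np.column_stack([np.ones(T), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ b

# Step 2: DF regression on the residuals (no constant): du_t = g*u_{t-1} + e_t
du, ulag = np.diff(u), u[:-1]
g = (du @ ulag) / (ulag @ ulag)
e = du - g * ulag
t_g = g / np.sqrt((e @ e / (T - 2)) / (ulag @ ulag))
# strongly negative t_g: residuals look I(0), so the series are cointegrated
```

Because the residual here is essentially white noise, the step-2 t-ratio is far below any plausible critical value, and the OLS β̂ sits very close to the true 2 (super-consistency of the cointegrating regression).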

Higher order integration If two series are I(2), a linear combination of them may be I(1).

Example of an ECM: the following error correction model can be formed.

COINTEGRATION and Error Correction Mechanism Estimation of the ECM

Error Correction Term The error correction term tells us the speed with which the model returns to equilibrium after an exogenous shock. It should have a negative sign, indicating a move back towards equilibrium; a positive sign indicates movement away from equilibrium. The coefficient's magnitude should lie between 0 and 1: 0 suggests no adjustment one period later, 1 indicates full adjustment within the period.

An Example Are Turkish interest rates with different maturities (1 month versus 12 months) cointegrated? Step 1: test each series for I(1). Step 2: test whether the two series move together in the long run; if yes, set up an error correction mechanism.

So both series are non-stationary, i.e. I(1). Now we test whether there exists a linear combination of the two that is stationary.

COINTEGRATION and Error Correction Mechanism

Test for co-integration

COINTEGRATION and Error Correction Mechanism Estimate the ECM

ECM regression

Uses of Cointegration in Economics and Finance
Purchasing power parity: the FX-rate difference between two countries equals the inflation difference (Big Mac index, etc.).
Uncovered interest rate parity: the exchange rate can be determined by interest rate differentials.
Interest rate expectations: long and short rates should move together.
Consumption and income.
Hedge funds! (ECMs can be used to make money!)

Conclusion Testing for cointegration via the ADF is easy but can be problematic when the relationship is more than 2-dimensional (Johansen's method is more suitable). Nonlinear cointegration, near unit roots, and structural breaks are also important. The stationarity and long-run relationships of macro time series should be investigated in detail.

Vector Autoregression (VAR) Proposed by Christopher Sims in 1980, VAR is an econometric model used to capture the evolution of, and interdependencies among, multiple economic time series. VARs generalize univariate AR models: all variables in the system are treated symmetrically (each is explained by its own lags and the lags of all the other variables in the model). VAR models are a theory-free method of estimating economic relationships, an alternative to the "identification restrictions" of structural models.

VECTOR AUTOREGRESSION

Why VAR? Christopher Sims, from Princeton (Nobel Prize winner, 2011); first VAR paper in 1980.

VAR Models In a vector autoregression, all variables are regressed on their own and the other variables' lagged values. For example, a simple two-variable first-order system is called a VAR(1) model with dimension 2.

VAR Models Generally, a VAR(p) model with dimension k is y_t = m + A₁y_{t-1} + ... + A_p y_{t-p} + ε_t, where each A_i is a k × k matrix of coefficients and m and ε_t are k × 1 vectors. Furthermore, the errors have no serial correlation, but there can be contemporaneous correlation across equations.
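A minimal sketch of the bivariate VAR(1) y_t = m + A₁y_{t−1} + ε_t: simulate with known coefficients, then recover A₁ by equation-by-equation OLS (the same regressors appear in every equation, so OLS per equation is the standard estimator). The coefficient matrix here is illustrative, not the TRY interest-rate data.

```python
import numpy as np

# Simulate a stationary bivariate VAR(1) and estimate it by OLS.
rng = np.random.default_rng(8)
A1 = np.array([[0.5, 0.2],
               [0.1, 0.6]])            # eigenvalues 0.7 and 0.4: stable
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A1 @ y[t-1] + rng.normal(size=2)

# Stack regressors [1, y_{t-1}] once; lstsq handles both equations at once.
X = np.column_stack([np.ones(T - 1), y[:-1]])
B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
A1_hat = B[1:].T                       # rows: equations, columns: lagged vars
```

With T = 2000 the OLS estimates land close to the true A₁, mirroring the coefficient/standard-error/t-statistic layout of the EViews table in the example that follows.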

An example: a VAR for 1-month and 12-month TRY interest rates (monthly data).

VAR estimates (standard errors in ( ), t-statistics in [ ]):

                TRL30R equation                 TRL360R equation
TRL30R(-1)      0.748568 (0.08784) [8.52170]    0.061772 (0.05396) [1.14479]
TRL30R(-2)      0.060829 (0.07095) [0.85739]   -0.032331 (0.04358) [-0.74188]
TRL360R(-1)    -0.584529 (0.14401) [-4.05883]   1.255779 (0.08846) [14.1953]
TRL360R(-2)     0.507183 (0.15040) [3.37219]   -0.282513 (0.09239) [-3.05790]
C               0.002033 (0.02499) [0.08136]    0.025592 (0.01535) [1.66685]

trl30 and trl360: Akaike information criterion -4.089038; Schwarz criterion -3.914965

Hypothesis testing To test whether a VAR with lag order 8 is preferred to lag order 10, use a likelihood-ratio test on the two fitted systems.

VAR Models Impulse Response Functions: suppose we want to see the reaction of our simple VAR(1) model to a one-time shock, say ε₁ = [1, 0]' with all subsequent shocks zero. For a VAR(1), the response at horizon h is A₁^h ε₁.
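The impulse responses of a VAR(1) can be traced by repeatedly applying A₁ to the shock vector; the matrix below is the same illustrative stable A₁ as before, not an estimate from the data.

```python
import numpy as np

# Trace the response of a stable VAR(1) to the one-time shock [1, 0]'.
A1 = np.array([[0.5, 0.2],
               [0.1, 0.6]])
shock = np.array([1.0, 0.0])

irf = [shock]
for h in range(1, 11):
    irf.append(A1 @ irf[-1])           # response at horizon h is A1^h @ shock
irf = np.array(irf)
# at h = 1 the response is exactly A1 @ [1, 0]' = [0.5, 0.1];
# because the eigenvalues (0.7, 0.4) lie inside the unit circle,
# the response dies out toward zero as h grows
```

The shock to the first variable spills over to the second through the off-diagonal element of A₁, then both responses decay geometrically, the typical shape of an impulse response in a stationary VAR.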