Econometrics I
Professor William Greene
Stern School of Business
Department of Economics
Econometrics I Part 25 – Time Series
Modeling an Economic Time Series Observed y0, y1, …, yt,… What is the “sample” Random sampling? The “observation window”
Estimators Functions of sums of observations Law of large numbers? Nonindependent observations What does “increasing sample size” mean? Asymptotic properties? (There are no finite sample properties.)
Interpreting a Time Series
Time domain: a "process," y(t) = a·x(t) + b·y(t-1) + …; a regression-like approach/interpretation.
Frequency domain: a sum of cyclical terms, y(t) = Σj [aj cos(ωj t) + bj sin(ωj t)]; the contribution of different frequencies to the observed series.
(In "high frequency data" and financial econometrics, "frequency" is used slightly differently.)
For example,…
In parts…
Studying the Frequency Domain
Cannot identify the number of terms.
Cannot identify frequencies from the time series.
Deconstructing the variance, autocovariances and autocorrelations: contributions at different frequencies; apparent large weights at different frequencies.
Using Fourier transforms of the data.
Does this provide "new" information about the series?
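To make the Fourier-transform idea concrete, here is a minimal numpy sketch (the artificial series and all names are my own, not from the slides): it computes the periodogram of a series and reads off the frequencies that carry the largest weight.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
t = np.arange(T)
# An artificial series with a slow cycle (period 100), a fast cycle (period 8), and noise
y = 2.0 * np.sin(2 * np.pi * t / 100) + 0.5 * np.sin(2 * np.pi * t / 8) + rng.standard_normal(T)

# Periodogram: squared magnitude of the discrete Fourier transform of the demeaned series
freqs = np.fft.rfftfreq(T)                      # frequencies in cycles per observation
power = np.abs(np.fft.rfft(y - y.mean()))**2

# The largest weights should appear near frequencies 1/100 = 0.01 and 1/8 = 0.125
print(freqs[np.argsort(power)[-2:]])
```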
Autocorrelation in Regression
yt = β'xt + εt, with Cov(εt, εt-1) ≠ 0
Example: RealCons(t) = a + b·RealIncome(t) + ε(t)
U.S. data, quarterly, 1950-2000
Autocorrelation
How does it arise? What does it mean?
Modeling approaches:
Classical (direct, corrective): estimation that accounts for autocorrelation; inference in the presence of autocorrelation.
Contemporary (structural): model the source; incorporate the time series aspect in the model.
Stationary Time Series
yt = b1yt-1 + b2yt-2 + … + bPyt-P + εt
Autocovariance: γk = Cov[yt, yt-k]
Autocorrelation: ρk = γk / γ0
Stationary series: γk depends only on k, not on t.
Weak stationarity: E[yt] is not a function of t; E[yt·yt-s] depends only on |t-s|, not on t.
Strong stationarity: the joint distribution of [yt, yt-1, …, yt-s], for any window of length s periods, is not a function of t.
A condition for weak stationarity: the smallest (in absolute value) root of the characteristic polynomial, 1 - b1z - b2z2 - … - bPzP = 0, is greater than one in absolute value, i.e., all roots lie outside the unit circle (roots may be complex).
Example: yt = ρyt-1 + εt; 1 - ρz = 0 has root z = 1/ρ, so |z| > 1 => |ρ| < 1.
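As a quick illustration of the root condition, a minimal numpy sketch (the helper name is my own): it computes the roots of the characteristic polynomial for an AR(P) process and checks that all of them lie outside the unit circle.

```python
import numpy as np

def stationarity_check(ar_coeffs):
    """Roots of the characteristic polynomial 1 - b1*z - ... - bP*z^P = 0
    for the AR(P) process y(t) = b1*y(t-1) + ... + bP*y(t-P) + e(t).
    Weak stationarity requires every root to lie outside the unit circle."""
    b = np.asarray(ar_coeffs, dtype=float)
    # np.roots wants coefficients from the highest power down: -bP, ..., -b1, 1
    roots = np.roots(np.r_[-b[::-1], 1.0])
    return np.all(np.abs(roots) > 1.0), roots

# AR(1) with rho = 0.95: root z = 1/0.95 > 1, so stationary
print(stationarity_check([0.95]))
# AR(2) with b1 + b2 = 1: a unit root, so not stationary
print(stationarity_check([0.5, 0.5]))
```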
Stationary vs. Nonstationary Series
The Lag Operator
Lxt = xt-1, L2xt = xt-2, LPxt + LQxt = xt-P + xt-Q
Polynomials in L: yt = B(L)yt + et, so A(L)yt = et, where A(L) = 1 - B(L).
Invertibility: yt = [A(L)]-1 et
Inverting a Stationary Series
yt = ρyt-1 + εt, so (1 - ρL)yt = εt
yt = [1 - ρL]-1 εt = εt + ρεt-1 + ρ2εt-2 + …
Stationary series can be inverted: autoregressive vs. moving average form of the series.
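A minimal numerical check, with arbitrary parameter values, that the autoregressive recursion and its inverted moving-average form produce the same series when |ρ| < 1:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, T = 0.7, 200
eps = rng.standard_normal(T)

# Autoregressive form: y(t) = rho*y(t-1) + e(t)
y_ar = np.zeros(T)
y_ar[0] = eps[0]
for t in range(1, T):
    y_ar[t] = rho * y_ar[t - 1] + eps[t]

# Inverted (moving average) form: y(t) = e(t) + rho*e(t-1) + rho^2*e(t-2) + ...
y_ma = np.array([sum(rho**j * eps[t - j] for j in range(t + 1)) for t in range(T)])

# The two forms coincide up to floating-point error
print(np.max(np.abs(y_ar - y_ma)))
```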
Regression with Autocorrelation
yt = xt'β + εt, with εt = ρεt-1 + ut
(1 - ρL)εt = ut, so εt = (1 - ρL)-1 ut
E[εt] = E[(1 - ρL)-1 ut] = (1 - ρL)-1 E[ut] = 0
Var[εt] = (1 + ρ2 + ρ4 + …)σu2 = σu2/(1 - ρ2)
Cov[εt, εt-1] = Cov[ρεt-1 + ut, εt-1] = ρVar[εt-1] + Cov[ut, εt-1] = ρσu2/(1 - ρ2)
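A short simulation (parameter values are arbitrary) that verifies the Var and Cov expressions above for AR(1) disturbances:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, sigma_u, T = 0.8, 1.0, 100_000

# e(t) = rho*e(t-1) + u(t)
u = rng.normal(scale=sigma_u, size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + u[t]
e = e[1000:]  # drop a burn-in so the start-up at zero does not matter

print("Var[e]:       sample", e.var(), " theory", sigma_u**2 / (1 - rho**2))
print("Cov[e,e(-1)]: sample", np.cov(e[1:], e[:-1])[0, 1],
      " theory", rho * sigma_u**2 / (1 - rho**2))
```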
OLS vs. GLS
OLS: Unbiased? Consistent (except in the presence of a lagged dependent variable), but inefficient.
GLS: Consistent and efficient.
+----------------------------------------------------+
| Ordinary least squares regression                  |
| LHS=REALCONS   Mean                =   2999.436    |
| Autocorrel     Durbin-Watson Stat. =   .0920480    |
|                Rho = cor[e,e(-1)]  =   .9539760    |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+----------+
|Variable | Coefficient  | Standard Error |t-ratio |P[|T|>t] | Mean of X|
+---------+--------------+----------------+--------+---------+----------+
 Constant    -80.3547488     14.3058515     -5.617    .0000
 REALDPI      .92168567      .00387175     238.054    .0000   3341.47598
| Robust VC  Newey-West, Periods = 10 |
 Constant    -80.3547488     41.7239214     -1.926    .0555
 REALDPI      .92168567      .01503516      61.302    .0000   3341.47598
+---------------------------------------------+
| AR(1) Model:  e(t) = rho * e(t-1) + u(t)    |
| Final value of Rho      =     .998782       |
| Iter= 6, SS= 118367.007, Log-L=-941.371914  |
| Durbin-Watson:   e(t)   =     .002436       |
| Std. Deviation:  e(t)   =  490.567910       |
| Std. Deviation:  u(t)   =   24.206926       |
| Durbin-Watson:   u(t)   =  1.994957         |
| Autocorrelation: u(t)   =     .002521       |
| N[0,1] used for significance levels         |
+---------------------------------------------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] | Mean of X|
 Constant   1019.32680      411.177156       2.479    .0132
 REALDPI      .67342731      .03972593      16.952    .0000   3341.47598
 RHO          .99878181      .00346332     288.389    .0000
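A sketch of the three estimators behind this output, using statsmodels and its bundled quarterly U.S. macro dataset (1959-2009, so the sample differs from the slide's 1950-2000 and the numbers will not match exactly):

```python
import statsmodels.api as sm

# Quarterly U.S. macro data shipped with statsmodels
data = sm.datasets.macrodata.load_pandas().data
y = data["realcons"]
X = sm.add_constant(data["realdpi"])

# Ordinary least squares with conventional standard errors
ols = sm.OLS(y, X).fit()

# Same coefficients, Newey-West (HAC) standard errors with 10 lags
nw = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 10})

# Feasible GLS with AR(1) disturbances, estimated iteratively
glsar = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)

print(ols.params, nw.bse, sep="\n")
print(glsar.params, "estimated rho:", glsar.model.rho)
```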
Detecting Autocorrelation
Use the residuals.
Durbin-Watson: d = Σt=2..T (et - et-1)2 / Σt=1..T et2 ≈ 2(1 - r), where r is the first-order residual autocorrelation.
Assumes normally distributed disturbances and strictly exogenous regressors.
Variable addition (Godfrey): yt = β'xt + ρεt-1 + ut; use the regression residuals et in place of εt-1 and test ρ = 0.
Assumes consistency of b.
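A sketch of both diagnostics with statsmodels, again on the bundled macro data; the Breusch-Godfrey LM test implements the variable-addition idea:

```python
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

data = sm.datasets.macrodata.load_pandas().data
res = sm.OLS(data["realcons"], sm.add_constant(data["realdpi"])).fit()

# Durbin-Watson statistic of the OLS residuals (values near 0 signal strong positive autocorrelation)
print("DW =", durbin_watson(res.resid))

# Variable-addition (Breusch-Godfrey) LM test for autocorrelation of order 1
lm_stat, lm_pvalue, f_stat, f_pvalue = acorr_breusch_godfrey(res, nlags=1)
print("LM =", lm_stat, "  p-value =", lm_pvalue)
```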
A Unit Root?
How to test for ρ = 1?
By construction: εt - εt-1 = (ρ - 1)εt-1 + ut.
Test for γ = (ρ - 1) = 0 using regression? The variance goes to 0 faster than 1/T; a new table is needed, since the standard t tables cannot be used.
Dickey-Fuller tests.
Unit roots in economic data. (Are there?)
Nonstationary series: implications for conventional analysis.
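An augmented Dickey-Fuller sketch with statsmodels on the bundled macro data; trending consumption and income series are the kind for which the unit-root null is typically not rejected:

```python
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

data = sm.datasets.macrodata.load_pandas().data

# Augmented Dickey-Fuller test; the null hypothesis is that the series has a unit root
for name in ("realcons", "realdpi"):
    stat, pvalue, usedlag, nobs, crit, icbest = adfuller(data[name])
    print(name, "ADF statistic =", round(stat, 3), " p-value =", round(pvalue, 3))
```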
Reinterpreting Autocorrelation
Integrated Processes
Integrated of order P when the P'th differenced series is stationary.
Stationary series are I(0).
Trending series are often I(1). Then Δyt = yt - yt-1 is I(0). [Most macroeconomic data series.]
Accelerating series might be I(2). Then Δ2yt = (yt - yt-1) - (yt-1 - yt-2) is I(0). [Money stock in hyperinflationary economies; difficult to find many applications in economics.]
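A quick illustration with a simulated random walk, which is I(1) in levels and I(0) after first differencing:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
walk = np.cumsum(rng.standard_normal(500))   # a random walk is I(1)

print("level p-value:     ", round(adfuller(walk)[1], 3))           # unit root not rejected
print("first-diff p-value:", round(adfuller(np.diff(walk))[1], 3))  # stationary after differencing
```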
Cointegration: Real DPI and Real Consumption
Cointegration – Divergent Series?
Cointegration
x(t) and y(t) are obviously I(1).
It looks like any linear combination of x(t) and y(t) will also be I(1).
Does a model y(t) = βx(t) + u(t), where u(t) is I(0), make any sense? How can u(t) be I(0)?
In fact, there is a linear combination, [1, -β], that is I(0).
Example: y(t) = .1*t + noise, x(t) = .2*t + noise. y(t) and x(t) have a common trend, and the combination y(t) - .5*x(t) removes it: y(t) and x(t) are cointegrated.
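A sketch of an Engle-Granger cointegration test with statsmodels; it uses a common stochastic trend rather than the deterministic trends in the slide's example, and all parameter values are arbitrary:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(3)
T = 500
trend = np.cumsum(rng.standard_normal(T))      # common stochastic trend, I(1)
x = 2.0 * trend + rng.standard_normal(T)       # x(t) is I(1)
y = 1.0 * trend + rng.standard_normal(T)       # y(t) is I(1) and shares the trend

# Engle-Granger test; the null hypothesis is "no cointegration"
stat, pvalue, crit = coint(y, x)
print("stat =", round(stat, 3), " p-value =", round(pvalue, 3))
# A small p-value indicates y(t) - 0.5*x(t) is I(0): the series are cointegrated.
```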
Cointegration and I(0) Residuals