Computational Finance II: Time Series, Guest Lecture II. K. Ensor
Review: ACF and PACF; autocorrelation when estimating the mean. First differencing removes linear nonstationarities; s-th (seasonal) differencing removes seasonal nonstationarities.
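A minimal sketch in R (a close relative of the S-Plus used in this lecture) of removing a linear trend by first differencing and a seasonal pattern by s-th differencing; the built-in quarterly JohnsonJohnson series and the lag of 4 are illustrative choices, not part of the slide.

```r
## Differencing to remove nonstationarities
x <- log(JohnsonJohnson)       # built-in quarterly earnings series, used only as an example

dx  <- diff(x)                 # first difference (1 - B) x_t: removes a linear trend
d4x <- diff(x, lag = 4)        # seasonal difference (1 - B^4) x_t: removes quarterly seasonality

acf(x)                         # compare sample autocorrelations before and after differencing
acf(dx)
acf(d4x)
```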
Identifying periodic patterns: spectral decomposition of a realization of length n. Any finite sequence of n numbers can be represented as a sum of (n/2) sine and cosine functions of different periodicities and amplitudes. Examining the amplitudes (or squared amplitudes) shows which periodicities dominate the data.
Smoothed Periodogram for Johnson and Johnson Series
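A sketch of how a smoothed periodogram like the one on this slide could be produced in R; spec.pgram() with modified Daniell smoothing stands in for whatever S-Plus call was actually used, and the span and taper values are illustrative assumptions.

```r
## Smoothed periodogram of the (log) Johnson & Johnson quarterly earnings series
jj <- log(JohnsonJohnson)                    # log stabilizes the growing variance

## spans gives the widths of the modified Daniell smoothers applied to the raw periodogram
sp <- spec.pgram(jj, spans = c(3, 3), taper = 0.1)

## Frequency with the largest smoothed spectral estimate -> dominant period (in years here)
1 / sp$freq[which.max(sp$spec)]
```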
Regression with Autocorrelated Errors. Least squares summary for the linear plus periodic fit: N = 84, F-statistic on 2 and 81 df, p-value = 0; the coefficient table reports coef, std.err, t.stat, and p.value for the intercept, jjt, and cjj.
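A hedged sketch in R of the kind of fit summarized above; reading jjt as a linear time trend and cjj as a cosine term at the annual period is an assumption based on the slide's labels, so the exact regressors used in the lecture may differ.

```r
## Linear trend plus periodic regression for the (log) Johnson & Johnson series
jj  <- log(JohnsonJohnson)
jjt <- as.numeric(time(jj))            # linear time trend
cjj <- cos(2 * pi * jjt)               # cosine term at the annual period (illustrative choice)

fit.lm <- lm(jj ~ jjt + cjj)
summary(fit.lm)                        # coef, std.err, t.stat, p.value table as on the slide
```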
Residuals from Linear plus periodic regression
How to proceed? The residuals from the regression fit exhibit dependence across time lags. Identify a time series model for the residuals, then refit the regression plus time series model jointly using MLE.
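A sketch of this recipe in R: the ACF/PACF of the least squares residuals suggest the error model, and arima() with an xreg argument then estimates the regression and ARMA error parameters jointly by MLE (the lecture's S-Plus analogue is arima.mle); the AR(1) error order here is only an illustrative assumption.

```r
## Identify a model for the regression residuals, then refit regression + noise model by MLE
jj  <- log(JohnsonJohnson)
jjt <- as.numeric(time(jj))
cjj <- cos(2 * pi * jjt)               # same illustrative regressors as in the sketch above

res <- resid(lm(jj ~ jjt + cjj))
acf(res)                               # residual ACF/PACF suggest the error model order
pacf(res)

## Regression with (here) AR(1) errors, with all parameters estimated jointly by MLE
fit.mle <- arima(jj, order = c(1, 0, 0), xreg = cbind(jjt, cjj))
fit.mle
tsdiag(fit.mle)                        # residual diagnostics for the combined fit
```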
Regression with Autocorrelated Errors. How can we study the relationship between two or more time series? Consider a regression model for studying the term structure of interest rates, such as r3(t) = α + β r1(t) + e(t) with time series errors. Let's look at the relationship between two U.S. weekly interest rate series, measured in percentages: r1(t) = the 1-year Treasury constant maturity rate and r3(t) = the 3-year Treasury constant maturity rate, from 1/5/1962 to 9/10/1999.
Scatterplots of the two series at simultaneous time points, and of the change in each series. The two series are highly correlated.
Try this model? r3(t) = α + β r1(t) + e(t). S-Plus summary of the least squares fit: residual standard error = 0.538, N = 1967, F-statistic on 1 and 1965 df, p-value = 0; the coefficient table reports coef, std.err, t.stat, and p.value for the intercept and r1(t).
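A sketch in R of the level regression above; the actual weekly Treasury series are not included here, so two simulated random walks (clearly labeled placeholders) stand in for r1(t) and r3(t) just to make the code run.

```r
## Level regression r3(t) = alpha + beta * r1(t) + e(t)
## Placeholder data: two simulated random walks stand in for the weekly Treasury rates
set.seed(1)
n  <- 1967
r1 <- 5.0 + cumsum(rnorm(n, sd = 0.05))       # stand-in for the 1-year rate
r3 <- 5.5 + cumsum(rnorm(n, sd = 0.05))       # stand-in for the 3-year rate

fit.level <- lm(r3 ~ r1)
summary(fit.level)

## Inspect the residuals: with the actual rate series they are nonstationary,
## which argues against a long-run equilibrium between the two rates
plot(resid(fit.level), type = "l")
acf(resid(fit.level))
```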
Behavior of the residuals: the residuals are nonstationary. Thus there does not appear to be a long-term equilibrium between the two interest rates.
From a regression perspective, the assumptions of our regression model are violated. Let's consider the change series of the interest rates, c1(t) = (1 - B) r1(t) and c3(t) = (1 - B) r3(t), and now regress c3 on c1.
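A sketch of the change-series regression in R, with the same placeholder series standing in for the actual Treasury rates; diff() implements the (1 - B) operator.

```r
## Change series c1(t) = (1 - B) r1(t) and c3(t) = (1 - B) r3(t), then regress c3 on c1
set.seed(1)
r1 <- 5.0 + cumsum(rnorm(1967, sd = 0.05))    # placeholder for the 1-year rate
r3 <- 5.5 + cumsum(rnorm(1967, sd = 0.05))    # placeholder for the 3-year rate

c1 <- diff(r1)                                # (1 - B) r1(t)
c3 <- diff(r3)                                # (1 - B) r3(t)

fit.change <- lm(c3 ~ c1)
summary(fit.change)                           # N = 1966 observations after differencing
acf(resid(fit.change))                        # check for remaining residual autocorrelation
```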
Regression results: N = 1966, F-statistic on 1 and 1964 df, p-value = 0; the coefficient table reports coef, std.err, t.stat, and p.value for the intercept and c1(t).
Looking at the residuals: there is a small amount of autocorrelation, violating our regression assumptions.
The correct model: c3(t) = β c1(t) + e(t), with e(t) = a(t) + θ a(t-1), i.e., MA(1) errors. Parameter estimates and standard errors come from the joint MLE fit; R-squared = 85.4%.
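A sketch of how this final model could be fit in R: arima() with an xreg term and an MA(1) specification estimates c3(t) = β c1(t) + e(t), e(t) = a(t) + θ a(t-1) by MLE; include.mean = FALSE and the placeholder series are assumptions for illustration, not the lecture's output.

```r
## Regression of c3 on c1 with MA(1) errors, all parameters fitted jointly by MLE
set.seed(1)
r1 <- 5.0 + cumsum(rnorm(1967, sd = 0.05))    # placeholder for the 1-year rate
r3 <- 5.5 + cumsum(rnorm(1967, sd = 0.05))    # placeholder for the 3-year rate
c1 <- diff(r1)
c3 <- diff(r3)

## include.mean = FALSE drops the intercept, matching c3(t) = beta * c1(t) + e(t)
fit.ma <- arima(c3, order = c(0, 0, 1), xreg = c1, include.mean = FALSE)
fit.ma                                        # beta (xreg coefficient), theta (ma1), and standard errors
tsdiag(fit.ma)                                # residual plot, residual ACF, Ljung-Box p-values
```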
Regression + MA diagnostics
Summary: regression with autocorrelated errors. Fit the regression model. Check the residuals for the presence of autocorrelation. If autocorrelation is present, identify its structure and simultaneously fit (via MLE) the regression parameters and the time series parameters (in S-Plus, use arima.mle).