Models for Non-Stationary Time Series
The ARIMA(p,d,q) Time Series
Many non-stationary time series can be converted to a stationary time series by taking $d$th-order differences.
Let $\{x_t \mid t \in T\}$ denote a time series such that $\{w_t \mid t \in T\}$ is an ARMA(p,q) time series, where $w_t = \nabla^d x_t = (I - B)^d x_t$ is the $d$th-order difference of the series $x_t$. Then $\{x_t \mid t \in T\}$ is called an ARIMA(p,d,q) time series (an integrated autoregressive moving average time series).
The equation for the time series $\{w_t \mid t \in T\}$ is $\beta(B)w_t = \delta + \alpha(B)u_t$, or $\varphi(B)x_t = \delta + \alpha(B)u_t$, where $\varphi(B) = \beta(B)\nabla^d = \beta(B)(I - B)^d$. The equation for the time series $\{x_t \mid t \in T\}$ is therefore $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$.
Suppose that $d$ roots of the polynomial $\varphi(x)$ are equal to unity. Then $\varphi(x)$ can be written $\varphi(x) = (1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p)(1 - x)^d$, and $\varphi(B)$ can be written $\varphi(B) = (I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_p B^p)(I - B)^d = \beta(B)\nabla^d$. In this case the equation for the time series becomes $\varphi(B)x_t = \delta + \alpha(B)u_t$, or $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$.
Comments:
1. The operator $\varphi(B) = \beta(B)\nabla^d = I - \varphi_1 B - \varphi_2 B^2 - \cdots - \varphi_{p+d} B^{p+d}$ is called the generalized autoregressive operator ($d$ roots are equal to 1; the remaining $p$ roots satisfy $|r_i| > 1$).
2. The operator $\beta(B)$ is called the autoregressive operator ($p$ roots with $|r_i| > 1$).
3. The operator $\alpha(B)$ is called the moving average operator.
Example – ARIMA(1,1,1). The equation:
$(I - \beta_1 B)(I - B)x_t = \delta + (I + \alpha_1 B)u_t$
$(I - (1 + \beta_1)B + \beta_1 B^2)x_t = \delta + u_t + \alpha_1 u_{t-1}$
$x_t - (1 + \beta_1)x_{t-1} + \beta_1 x_{t-2} = \delta + u_t + \alpha_1 u_{t-1}$
or
$x_t = (1 + \beta_1)x_{t-1} - \beta_1 x_{t-2} + \delta + u_t + \alpha_1 u_{t-1}$
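As a quick arithmetic check of this expansion, here is a minimal Python sketch (using NumPy; the value chosen for $\beta_1$ is purely illustrative) that multiplies the two operator polynomials and reads off the difference-equation coefficients:

```python
import numpy as np

# Expand (I - b1*B)(I - B) for an ARIMA(1,1,1) and read off the
# difference-equation coefficients in x_t = phi_1 x_{t-1} + phi_2 x_{t-2} + ...
b1 = 0.5                                               # illustrative value for beta_1 (assumed)
phi_poly = np.polynomial.polynomial.polymul([1.0, -b1], [1.0, -1.0])
print(phi_poly)                                        # [ 1.  -1.5  0.5] = 1 - (1+b1)B + b1*B^2
phi = -phi_poly[1:]                                    # phi_1 = 1 + b1, phi_2 = -b1
print(phi)                                             # [ 1.5 -0.5]
```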
Modeling of Seasonal Time Series
If a time series $\{x_t : t \in T\}$ is seasonal, we would expect observations in the same season in adjacent years to have a higher autocorrelation than observations that are close in time (but in different seasons). For monthly data, for example, we would expect the autocorrelation function to show pronounced peaks at the seasonal lags 12, 24, 36, …
The AR(1) seasonal model (monthly data). This model satisfies the equation $x_t = \beta_{12}\, x_{t-12} + \delta + u_t$. The autocorrelation for this model can be shown to be $\rho(h) = \beta_{12}^{\,h/12}$ when $h$ is a multiple of 12, and $\rho(h) = 0$ otherwise. This model is also an AR(12) model with $\beta_1 = \cdots = \beta_{11} = 0$.
Graph: the autocorrelation function $\rho(h)$ of the seasonal AR(1) model.
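To see this behaviour numerically, the following sketch (plain NumPy; the coefficient 0.7, the period 12, and the series length are all assumed for illustration) simulates a seasonal AR(1) series and prints its sample autocorrelations, which are large only near the seasonal lags:

```python
import numpy as np

rng = np.random.default_rng(0)
beta12, n, burn = 0.7, 600, 200               # assumed illustration values
u = rng.normal(size=n + burn)
x = np.zeros(n + burn)
for t in range(12, n + burn):
    x[t] = beta12 * x[t - 12] + u[t]          # x_t = beta_12 x_{t-12} + u_t
x = x[burn:]                                  # drop the burn-in period

def sample_acf(series, h):
    """Sample autocorrelation at lag h."""
    s = series - series.mean()
    return float(np.dot(s[:-h], s[h:]) / np.dot(s, s))

for h in (1, 6, 11, 12, 13, 24, 36):
    print(h, round(sample_acf(x, h), 3))      # noticeable values only at lags 12, 24, 36
```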
The AR model with both seasonal and serial correlation. This model satisfies the equation $(I - \beta_1 B)(I - \beta_{12} B^{12})x_t = \delta + u_t$, i.e. $x_t = \beta_1 x_{t-1} + \beta_{12} x_{t-12} - \beta_1\beta_{12} x_{t-13} + \delta + u_t$. The autocorrelation for this model will satisfy the Yule-Walker equations given below. This model is also an AR(13) model.
The Yule-Walker equations: for this model the autocorrelation function satisfies
$\rho(h) = \beta_1\rho(h-1) + \beta_{12}\rho(h-12) - \beta_1\beta_{12}\rho(h-13)$ for $h \geq 1$,
with $\rho(0) = 1$ and $\rho(-h) = \rho(h)$.
Some solutions for $\rho(h)$ can be obtained by solving this system for $\rho(1), \ldots, \rho(13)$ and then applying the recursion for larger $h$.
Excel file for determining the autocorrelation function.
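For readers without the spreadsheet, here is a rough Python equivalent. Rather than solving the Yule-Walker system directly, it computes the random shock (MA(∞)) weights of the AR(13) model above and forms $\rho(h)$ from them; the coefficient values 0.5 and 0.7 are assumed purely for illustration.

```python
import numpy as np

# Theoretical autocorrelation rho(h) of the multiplicative seasonal/serial AR model
#   (I - b1*B)(I - b12*B^12) x_t = u_t,  an AR(13) with the coefficients below.
b1, b12 = 0.5, 0.7                                # assumed illustration values
ar = np.zeros(14)
ar[1], ar[12], ar[13] = b1, b12, -b1 * b12        # x_t = b1 x_{t-1} + b12 x_{t-12} - b1*b12 x_{t-13} + u_t

# psi-weights of the random shock form: psi_0 = 1, psi_j = sum_k ar_k * psi_{j-k}
J = 500                                           # truncation point for the infinite expansion
psi = np.zeros(J)
psi[0] = 1.0
for j in range(1, J):
    for k in range(1, min(j, 13) + 1):
        psi[j] += ar[k] * psi[j - k]

def rho(h):
    """Theoretical autocorrelation at lag h: gamma(h)/gamma(0), gamma(h) ~ sum_j psi_j psi_{j+h}."""
    return float(np.dot(psi[: J - h], psi[h:]) / np.dot(psi, psi))

for h in (1, 2, 11, 12, 13, 24):
    print(h, round(rho(h), 3))
```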
The general ARIMA model incorporating seasonality (with seasonal period $s$, e.g. $s = 12$ for monthly data) is
$\beta(B)\,\beta_s(B^s)\,(I - B)^d (I - B^s)^D x_t = \delta + \alpha(B)\,\alpha_s(B^s)\,u_t$,
where $\beta(B)$ and $\alpha(B)$ are the ordinary autoregressive and moving average operators, and $\beta_s(B^s)$ and $\alpha_s(B^s)$ are the corresponding seasonal operators in $B^s$.
Prediction
Three Important Forms of a Non-Stationary Time Series
The Difference Equation Form:
$x_t = \varphi_1 x_{t-1} + \varphi_2 x_{t-2} + \cdots + \varphi_{p+d} x_{t-p-d} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$, i.e. $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$.
The Random Shock Form:
$x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$, i.e. $x_t = \mu(t) + \psi(B)u_t$.
The Inverted Form:
$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$, i.e. $\pi(B)x_t = \delta + u_t$.
Example. Consider the ARIMA(1,1,1) time series
$(I - 0.8B)\nabla x_t = (I + 0.6B)u_t$
$(I - 0.8B)(I - B)x_t = (I + 0.6B)u_t$
$(I - 1.8B + 0.8B^2)x_t = (I + 0.6B)u_t$
$x_t = 1.8x_{t-1} - 0.8x_{t-2} + u_t + 0.6u_{t-1}$ (the difference equation form)
The random shock form: starting from $(I - 1.8B + 0.8B^2)x_t = (I + 0.6B)u_t$,
$x_t = (I - 1.8B + 0.8B^2)^{-1}(I + 0.6B)u_t$
$x_t = (I + 2.4B + 3.52B^2 + 4.416B^3 + 5.133B^4 + \cdots)u_t$
$x_t = u_t + 2.4u_{t-1} + 3.52u_{t-2} + 4.416u_{t-3} + 5.133u_{t-4} + \cdots$
The inverted form: starting from $(I - 1.8B + 0.8B^2)x_t = (I + 0.6B)u_t$,
$(I + 0.6B)^{-1}(I - 1.8B + 0.8B^2)x_t = u_t$
$(I - 2.4B + 2.24B^2 - 1.344B^3 + 0.806B^4 - \cdots)x_t = u_t$
$x_t = 2.4x_{t-1} - 2.24x_{t-2} + 1.344x_{t-3} - 0.806x_{t-4} + \cdots + u_t$
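The two expansions above are just power-series divisions of one operator polynomial by the other. Here is a minimal sketch (NumPy; the helper name `series_div` is illustrative, not from any library) that reproduces the $\psi$ and $\pi$ coefficients of this example:

```python
import numpy as np

def series_div(num, den, n):
    """First n coefficients of the power series num(B)/den(B) (requires den[0] != 0)."""
    num = np.pad(np.asarray(num, float), (0, n))
    den = np.pad(np.asarray(den, float), (0, n))
    c = np.zeros(n)
    c[0] = num[0] / den[0]
    for j in range(1, n):
        c[j] = (num[j] - np.dot(den[1 : j + 1], c[j - 1 :: -1])) / den[0]
    return c

# ARIMA(1,1,1) example from the text: (I - 0.8B)(I - B) x_t = (I + 0.6B) u_t
phi   = np.polynomial.polynomial.polymul([1.0, -0.8], [1.0, -1.0])   # 1 - 1.8B + 0.8B^2
alpha = [1.0, 0.6]

psi = series_div(alpha, phi, 6)          # random shock weights: 1, 2.4, 3.52, 4.416, ...
pi  = -series_div(phi, alpha, 6)[1:]     # inverted weights: pi_1 = 2.4, pi_2 = -2.24, pi_3 = 1.344, ...
print(np.round(psi, 4))
print(np.round(pi, 4))
```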
Forecasting an ARIMA(p,d,q) Time Series. Let $P_T$ denote $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$, the "past" up to time $T$. Then the optimal forecast of $x_{T+l}$ given $P_T$ is denoted by $\hat{x}_T(l) = E[x_{T+l} \mid P_T]$. This forecast minimizes the mean square error $E[(x_{T+l} - \hat{x}_T(l))^2 \mid P_T]$.
Three different forms of the forecast:
1. Random shock form
2. Inverted form
3. Difference equation form
Note: in all three forms, conditional on $P_T$ we use $E[u_{T+h} \mid P_T] = u_{T+h}$ for $h \leq 0$ and $E[u_{T+h} \mid P_T] = 0$ for $h > 0$, and $E[x_{T+h} \mid P_T] = x_{T+h}$ for $h \leq 0$ and $E[x_{T+h} \mid P_T] = \hat{x}_T(h)$ for $h > 0$.
Random Shock Form of the forecast. Recall
$x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$
so, for $t = T + l$,
$x_{T+l} = \mu(T+l) + u_{T+l} + \psi_1 u_{T+l-1} + \psi_2 u_{T+l-2} + \psi_3 u_{T+l-3} + \cdots$
Taking expectations of both sides and using $E[u_{T+h} \mid P_T] = 0$ for $h > 0$ gives
$\hat{x}_T(l) = \mu(T+l) + \psi_l u_T + \psi_{l+1} u_{T-1} + \psi_{l+2} u_{T-2} + \cdots$
To compute this forecast we need to compute $\{\ldots, u_{T-2}, u_{T-1}, u_T\}$ from $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$. Note that since $x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$, we have $u_t = x_t - \mu(t) - \psi_1 u_{t-1} - \psi_2 u_{t-2} - \cdots$, which can be calculated recursively; equivalently, $u_t = x_t - \hat{x}_{t-1}(1)$, the one-step forecast error.
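In practice this recursion is usually run through the finite difference-equation form rather than the infinite $\psi$ expansion. A sketch of that computation (the function name and the zero treatment of pre-sample values are assumptions made for illustration):

```python
import numpy as np

def random_shocks(x, phi, alpha, delta=0.0):
    """Recover u_t = x_t - (one-step forecast of x_t) for a model written in
    difference-equation form: x_t = sum_i phi_i x_{t-i} + delta + u_t + sum_j alpha_j u_{t-j}.
    Pre-sample x's and u's are treated as zero, so the first few shocks are only
    approximate and would normally be discarded as a burn-in."""
    x = np.asarray(x, float)
    u = np.zeros_like(x)
    for t in range(len(x)):
        xhat = delta
        for i, ph in enumerate(phi, start=1):
            if t - i >= 0:
                xhat += ph * x[t - i]
        for j, al in enumerate(alpha, start=1):
            if t - j >= 0:
                xhat += al * u[t - j]
        u[t] = x[t] - xhat                  # one-step forecast error = random shock
    return u

# e.g. for the ARIMA(1,1,1) example earlier: random_shocks(x, phi=[1.8, -0.8], alpha=[0.6])
```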
The error in the forecast is $e_T(l) = x_{T+l} - \hat{x}_T(l) = u_{T+l} + \psi_1 u_{T+l-1} + \cdots + \psi_{l-1} u_{T+1}$. The mean square error of the forecast is therefore $\sigma^2(l) = E[e_T(l)^2] = \sigma_u^2\,(1 + \psi_1^2 + \psi_2^2 + \cdots + \psi_{l-1}^2)$. Hence the forecast error variance grows with the lead time $l$.
Prediction limits for forecasts: $(1 - \alpha)100\%$ prediction limits for $x_{T+l}$ are $\hat{x}_T(l) \pm z_{\alpha/2}\,\sigma(l) = \hat{x}_T(l) \pm z_{\alpha/2}\,\sigma_u\sqrt{1 + \psi_1^2 + \cdots + \psi_{l-1}^2}$.
The Inverted Form: $\pi(B)x_t = \delta + u_t$, or $x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$, where $\pi(B) = [\alpha(B)]^{-1}\varphi(B) = [\alpha(B)]^{-1}[\beta(B)\nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$.
The inverted form of the forecast: $x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + \delta + u_t$, and for $t = T + l$,
$x_{T+l} = \pi_1 x_{T+l-1} + \pi_2 x_{T+l-2} + \cdots + \delta + u_{T+l}$.
Taking conditional expectations gives
$\hat{x}_T(l) = \pi_1\hat{x}_T(l-1) + \pi_2\hat{x}_T(l-2) + \cdots + \delta$.
Note: in this expression $\hat{x}_T(h) = x_{T+h}$ whenever $h \leq 0$ (the value has already been observed).
The difference equation form of the forecast:
$x_{T+l} = \varphi_1 x_{T+l-1} + \varphi_2 x_{T+l-2} + \cdots + \varphi_{p+d} x_{T+l-p-d} + \delta + u_{T+l} + \alpha_1 u_{T+l-1} + \alpha_2 u_{T+l-2} + \cdots + \alpha_q u_{T+l-q}$.
Taking conditional expectations gives
$\hat{x}_T(l) = \varphi_1\hat{x}_T(l-1) + \cdots + \varphi_{p+d}\hat{x}_T(l-p-d) + \delta + \alpha_1\hat{u}_T(l-1) + \cdots + \alpha_q\hat{u}_T(l-q)$,
where $\hat{x}_T(h) = x_{T+h}$ and $\hat{u}_T(h) = u_{T+h}$ for $h \leq 0$, and $\hat{u}_T(h) = 0$ for $h > 0$.
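A compact sketch of this recursion (Python; the function and argument names are illustrative, and it assumes the observed series is at least as long as the generalized autoregressive order):

```python
def forecast(x, u, phi, alpha, delta=0.0, lead=12):
    """Difference-equation-form forecasts xhat_T(1), ..., xhat_T(lead), given the
    observed series x, its random shocks u, and coefficients phi_i (generalized AR)
    and alpha_j (MA). Future shocks are set to 0; future x's are replaced by forecasts."""
    x, u = list(x), list(u)
    T = len(x)
    xx = x + [0.0] * lead                 # observations followed by forecasts
    for l in range(1, lead + 1):
        t = T + l - 1                     # position of x_{T+l}
        val = delta
        for i, ph in enumerate(phi, start=1):
            val += ph * xx[t - i]         # an observed value or an earlier forecast
        for j, al in enumerate(alpha, start=1):
            if t - j < T:                 # only shocks up to time T are known; later ones are 0
                val += al * u[t - j]
        xx[t] = val
    return xx[T:]
```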
Example: ARIMA(1,1,2). The model:
$x_t - x_{t-1} = \beta_1(x_{t-1} - x_{t-2}) + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$
or
$x_t = (1 + \beta_1)x_{t-1} - \beta_1 x_{t-2} + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$
or
$\varphi(B)x_t = \beta(B)(I - B)x_t = \alpha(B)u_t$
where $\varphi(x) = 1 - (1 + \beta_1)x + \beta_1 x^2 = (1 - \beta_1 x)(1 - x)$ and $\alpha(x) = 1 + \alpha_1 x + \alpha_2 x^2$.
The random shock form of the model: $x_t = \psi(B)u_t$, where $\psi(B) = [\beta(B)(I - B)]^{-1}\alpha(B) = [\varphi(B)]^{-1}\alpha(B)$, i.e. $\psi(B)\varphi(B) = \alpha(B)$. Thus
$(I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \psi_4 B^4 + \cdots)(I - (1 + \beta_1)B + \beta_1 B^2) = I + \alpha_1 B + \alpha_2 B^2$.
Hence
$\alpha_1 = \psi_1 - (1 + \beta_1)$, or $\psi_1 = 1 + \beta_1 + \alpha_1$;
$\alpha_2 = \psi_2 - \psi_1(1 + \beta_1) + \beta_1$, or $\psi_2 = \psi_1(1 + \beta_1) - \beta_1 + \alpha_2$;
$0 = \psi_h - \psi_{h-1}(1 + \beta_1) + \psi_{h-2}\beta_1$, or $\psi_h = \psi_{h-1}(1 + \beta_1) - \psi_{h-2}\beta_1$, for $h \geq 3$.
The inverted form of the model: $\pi(B)x_t = u_t$, where $\pi(B) = [\alpha(B)]^{-1}\beta(B)(I - B) = [\alpha(B)]^{-1}\varphi(B)$, i.e. $\pi(B)\alpha(B) = \varphi(B)$. Thus
$(I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \pi_4 B^4 - \cdots)(I + \alpha_1 B + \alpha_2 B^2) = I - (1 + \beta_1)B + \beta_1 B^2$.
Hence
$-(1 + \beta_1) = \alpha_1 - \pi_1$, or $\pi_1 = 1 + \beta_1 + \alpha_1$;
$\beta_1 = -\pi_2 - \pi_1\alpha_1 + \alpha_2$, or $\pi_2 = -\pi_1\alpha_1 - \beta_1 + \alpha_2$;
$0 = -\pi_h - \pi_{h-1}\alpha_1 - \pi_{h-2}\alpha_2$, or $\pi_h = -(\pi_{h-1}\alpha_1 + \pi_{h-2}\alpha_2)$, for $h \geq 3$.
Now suppose that $\beta_1 = 0.80$, $\alpha_1 = 0.60$ and $\alpha_2 = 0.40$. Then the random shock form coefficients $\psi_h$ and the inverted form coefficients $\pi_h$ can easily be computed from the recursions above, as in the short sketch below:
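A minimal sketch computing these coefficients directly from the recursions just derived (plain Python; the truncation at $h = 10$ is arbitrary):

```python
# Coefficients for the ARIMA(1,1,2) example with beta1 = 0.80, alpha1 = 0.60, alpha2 = 0.40.
b1, a1, a2 = 0.80, 0.60, 0.40
H = 10

psi = [1.0, 1 + b1 + a1]                       # psi_0, psi_1
psi.append(psi[1] * (1 + b1) - b1 + a2)        # psi_2
for h in range(3, H + 1):
    psi.append(psi[h - 1] * (1 + b1) - psi[h - 2] * b1)

pi = [1.0, 1 + b1 + a1]                        # pi_0 (= 1 by convention), pi_1
pi.append(-pi[1] * a1 - b1 + a2)               # pi_2
for h in range(3, H + 1):
    pi.append(-(pi[h - 1] * a1 + pi[h - 2] * a2))

print([round(v, 4) for v in psi])              # 1, 2.4, 3.92, 5.136, ...
print([round(v, 4) for v in pi])               # 1, 2.4, -1.84, 0.144, ...
```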
The Forecast Equations
The Difference Equation Form of the Forecast Equation: for this model ($\varphi_1 = 1 + \beta_1 = 1.8$, $\varphi_2 = -\beta_1 = -0.8$),
$\hat{x}_T(l) = 1.8\,\hat{x}_T(l-1) - 0.8\,\hat{x}_T(l-2) + \hat{u}_T(l) + 0.6\,\hat{u}_T(l-1) + 0.4\,\hat{u}_T(l-2)$,
where $\hat{x}_T(h) = x_{T+h}$ and $\hat{u}_T(h) = u_{T+h}$ for $h \leq 0$, and $\hat{u}_T(h) = 0$ for $h > 0$.
Computation of the Random Shock Series and One-Step Forecasts. [Table: one-step forecasts and random shock computations.]
Computation of the Mean Square Error of the Forecasts and Prediction Limits. [Table: mean square errors and prediction limits.]
Table: MSE of forecasts up to lead time $l = 12$ ($\sigma_u^2 = 2.56$).
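A sketch of how the entries of such a table can be computed from the $\psi$ weights (Python; $\sigma_u^2 = 2.56$ is taken from the caption above, and the 1.96 multiplier corresponds to 95% limits):

```python
import math

# MSE of the l-step forecasts and 95% prediction half-widths for the ARIMA(1,1,2)
# example, using sigma_u^2 = 2.56 and the psi-weight recursion derived earlier.
b1, a1, a2, sigma2 = 0.80, 0.60, 0.40, 2.56

psi = [1.0, 1 + b1 + a1, (1 + b1 + a1) * (1 + b1) - b1 + a2]
for h in range(3, 12):
    psi.append(psi[h - 1] * (1 + b1) - psi[h - 2] * b1)

for lead in range(1, 13):
    mse = sigma2 * sum(w * w for w in psi[:lead])      # sigma^2(l) = sigma_u^2 * sum_{j<l} psi_j^2
    half_width = 1.96 * math.sqrt(mse)                 # 95% limits: xhat_T(l) +/- half_width
    print(lead, round(mse, 2), round(half_width, 2))
```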
Table: raw observations, one-step ahead forecasts, and estimated errors.
Forecasts with 95% and 66.7% Prediction Limits
Graph: Forecasts with 95% and 66.7% Prediction Limits