Slide 1
Review and Summary: Box–Jenkins models, stationary time series, AR(p), MA(q), ARMA(p,q)
Slide 2
Models for Stationary Time Series
Slide 3
The Moving Average Time Series of order q, MA(q): Let $\{u_t : t \in T\}$ be a white noise time series with variance $\sigma^2$. Then $\{x_t : t \in T\}$ is a moving average time series of order q, MA(q), if it satisfies the equation:
$$x_t = \mu + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$
Slide 4
The mean, autocovariance, and autocorrelation functions for an MA(q) time series:
$$\mu_x = E[x_t] = \mu$$
$$\gamma(h) = \begin{cases} \sigma^2 \sum_{i=0}^{q-|h|} \alpha_i\, \alpha_{i+|h|}, & |h| \le q \quad (\alpha_0 = 1) \\ 0, & |h| > q \end{cases}$$
$$\rho(h) = \frac{\gamma(h)}{\gamma(0)}$$
Slide 5
The Autoregressive Time Series of order p, AR(p): Let $\{u_t : t \in T\}$ be a white noise time series with variance $\sigma^2$. Then $\{x_t : t \in T\}$ is called an autoregressive time series of order p, AR(p), if it satisfies the equation:
$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t$$
Slide 6
The mean value of a stationary AR(p) series:
$$\mu = \frac{\delta}{1 - \beta_1 - \beta_2 - \cdots - \beta_p}$$
The autocovariance function $\gamma(h)$ of a stationary AR(p) series satisfies the equations:
$$\gamma(h) = \beta_1 \gamma(h-1) + \beta_2 \gamma(h-2) + \cdots + \beta_p \gamma(h-p) \quad \text{for } h > 0$$
$$\gamma(0) = \beta_1 \gamma(1) + \beta_2 \gamma(2) + \cdots + \beta_p \gamma(p) + \sigma^2$$
7
with for h > p The Autocorrelation function (h) of a stationary AR(p) series Satisfies the equations: and
Slide 8
The solution can be written:
$$\rho(h) = c_1 \left(\tfrac{1}{r_1}\right)^h + c_2 \left(\tfrac{1}{r_2}\right)^h + \cdots + c_p \left(\tfrac{1}{r_p}\right)^h$$
where $r_1, r_2, \ldots, r_p$ are the roots of the polynomial
$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p$$
and $c_1, c_2, \ldots, c_p$ are determined by using the starting values of the sequence $\rho(h)$.
Slide 9
Stationarity of an AR(p) time series: consider the polynomial $\beta(x) = 1 - \beta_1 x - \cdots - \beta_p x^p$ with roots $r_1, r_2, \ldots, r_p$.
1. If $|r_i| > 1$ for all $i$, then $\{x_t : t \in T\}$ is stationary.
2. If $|r_i| < 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits deterministic behaviour.
3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.
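The slides state this root condition but show no computation; here is a minimal sketch, not from the deck, of how it can be checked numerically with NumPy. The function name `ar_stationarity` and the example coefficients are illustrative assumptions.

```python
import numpy as np

def ar_stationarity(beta):
    """Classify an AR(p) series from the roots of
    beta(x) = 1 - beta_1 x - ... - beta_p x^p."""
    # np.roots expects coefficients from the highest power down:
    # -beta_p, ..., -beta_1, 1
    coeffs = np.concatenate((-np.asarray(beta, float)[::-1], [1.0]))
    mods = np.abs(np.roots(coeffs))
    if np.all(mods > 1):
        return "stationary"
    if np.any(mods < 1):
        return "deterministic (explosive) behaviour"
    return "non-stationary random behaviour (unit root)"

print(ar_stationarity([0.5, 0.3]))  # roots outside the unit circle: stationary
print(ar_stationarity([1.0]))       # root at 1: a random walk
```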
Slide 10
The Mixed Autoregressive Moving Average Time Series of order (p,q), ARMA(p,q): Let $\{u_t : t \in T\}$ be a white noise time series with variance $\sigma^2$. A mixed autoregressive moving average time series, ARMA(p,q), $\{x_t : t \in T\}$ satisfies the equation:
$$x_t = \beta_1 x_{t-1} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \cdots + \alpha_q u_{t-q}$$
Slide 11
The mean value of a stationary ARMA(p,q) series:
$$\mu = \frac{\delta}{1 - \beta_1 - \cdots - \beta_p}$$
Stationarity of an ARMA(p,q) series: consider the polynomial $\beta(x) = 1 - \beta_1 x - \cdots - \beta_p x^p$ with roots $r_1, r_2, \ldots, r_p$.
1. If $|r_i| > 1$ for all $i$, then $\{x_t : t \in T\}$ is stationary.
2. If $|r_i| < 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits deterministic behaviour.
3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.
Slide 12
The autocovariance function $\gamma(h)$ satisfies:
for $h = 0, 1, \ldots, q$:
$$\gamma(h) = \beta_1 \gamma(h-1) + \cdots + \beta_p \gamma(h-p) + \sigma^2 \sum_{i=h}^{q} \alpha_i\, \psi_{i-h}$$
for $h > q$:
$$\gamma(h) = \beta_1 \gamma(h-1) + \cdots + \beta_p \gamma(h-p)$$
Slide 13
[Table: cross-covariances $\gamma_{ux}(h)$ between the white noise and the series at selected lags $h$; values not reproduced in this transcript]
Slide 14
The partial autocorrelation function at lag k is defined to be the last coefficient $\Phi_{kk}$ in the order-k Yule–Walker system. Using Cramer's rule:
$$\Phi_{kk} = \frac{\begin{vmatrix} 1 & \rho_1 & \cdots & \rho_{k-2} & \rho_1 \\ \rho_1 & 1 & \cdots & \rho_{k-3} & \rho_2 \\ \vdots & & & & \vdots \\ \rho_{k-1} & \rho_{k-2} & \cdots & \rho_1 & \rho_k \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 & \cdots & \rho_{k-1} \\ \rho_1 & 1 & \cdots & \rho_{k-2} \\ \vdots & & & \vdots \\ \rho_{k-1} & \rho_{k-2} & \cdots & 1 \end{vmatrix}}$$
(the denominator is the $k \times k$ autocorrelation determinant; the numerator replaces its last column with $(\rho_1, \ldots, \rho_k)'$).
Slide 15
A recursive formula for $\Phi_{kk}$ (the Durbin–Levinson recursion):
$$\Phi_{k+1,k+1} = \frac{\rho_{k+1} - \sum_{j=1}^{k} \Phi_{kj}\, \rho_{k+1-j}}{1 - \sum_{j=1}^{k} \Phi_{kj}\, \rho_j}, \qquad \Phi_{k+1,j} = \Phi_{kj} - \Phi_{k+1,k+1}\, \Phi_{k,k+1-j}$$
starting with $\Phi_{11} = \rho_1$.
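As an illustration of the recursion, here is a sketch, not from the slides, that translates the Durbin–Levinson steps into Python; the helper name `pacf_from_acf` is made up.

```python
import numpy as np

def pacf_from_acf(rho):
    """Durbin-Levinson recursion: partial autocorrelations phi_kk
    from autocorrelations rho[0..K] with rho[0] = 1."""
    K = len(rho) - 1
    pacf = np.zeros(K + 1)
    phi = np.zeros((K + 1, K + 1))
    pacf[1] = phi[1, 1] = rho[1]  # starting value: phi_11 = rho_1
    for k in range(1, K):
        num = rho[k + 1] - sum(phi[k, j] * rho[k + 1 - j] for j in range(1, k + 1))
        den = 1.0 - sum(phi[k, j] * rho[j] for j in range(1, k + 1))
        phi[k + 1, k + 1] = num / den
        for j in range(1, k + 1):
            phi[k + 1, j] = phi[k, j] - phi[k + 1, k + 1] * phi[k, k + 1 - j]
        pacf[k + 1] = phi[k + 1, k + 1]
    return pacf[1:]

# Check on an AR(1) with beta_1 = 0.6, where rho(h) = 0.6^h:
# the PACF should be 0.6 at lag 1 and 0 afterwards.
print(pacf_from_acf(0.6 ** np.arange(6)))
```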
Slide 16
Spectral density function $f(\lambda)$
Slide 17
Let $\{x_t : t \in T\}$ denote a time series with autocovariance function $\gamma(h)$, and let $f(\lambda)$ satisfy:
$$\gamma(h) = \int_{-\pi}^{\pi} e^{i h \lambda} f(\lambda)\, d\lambda$$
Then $f(\lambda)$ is called the spectral density function of the time series $\{x_t : t \in T\}$.
Slide 18
Linear Filters
Slide 19
Let $\{x_t : t \in T\}$ be any time series, and suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:
$$y_t = \sum_{s} a_s x_{t-s}$$
The time series $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter:
input $x_t$ → [linear filter $\{a_s\}$] → output $y_t$
Slide 20
Spectral theory for linear filters: if $\{y_t : t \in T\}$ is obtained from $\{x_t : t \in T\}$ by the linear filter $y_t = \sum_s a_s x_{t-s}$, then
$$f_y(\lambda) = \Big| \sum_s a_s e^{-i s \lambda} \Big|^2 f_x(\lambda)$$
Slide 21
Applications. The moving average time series of order q, MA(q): since $x_t = \mu + \alpha(B) u_t$ and white noise has spectral density $\sigma^2 / 2\pi$,
$$f_x(\lambda) = \frac{\sigma^2}{2\pi} \left| 1 + \alpha_1 e^{-i\lambda} + \cdots + \alpha_q e^{-i q \lambda} \right|^2$$
Slide 22
The autoregressive time series of order p, AR(p): since $\beta(B) x_t = \delta + u_t$, so that $x_t$ is white noise passed through the inverse filter,
$$f_x(\lambda) = \frac{\sigma^2 / 2\pi}{\left| 1 - \beta_1 e^{-i\lambda} - \cdots - \beta_p e^{-i p \lambda} \right|^2}$$
23
since where {z t |t T} is a MA(q) time series. The ARMA(p,q) Time series of order p,q
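Pulling the three cases together, the following sketch (assumed helper `arma_spectral_density`, NumPy only; not from the slides) evaluates this density on a frequency grid. The MA and AR cases follow by passing an empty coefficient list for the other side.

```python
import numpy as np

def arma_spectral_density(lam, beta=(), alpha=(), sigma2=1.0):
    """f(lam) = (sigma^2/2pi) |alpha(e^{-i lam})|^2 / |beta(e^{-i lam})|^2
    for the model beta(B) x_t = delta + alpha(B) u_t."""
    z = np.exp(-1j * np.asarray(lam, float))
    alpha_z = 1 + sum(a * z ** (k + 1) for k, a in enumerate(alpha))
    beta_z = 1 - sum(b * z ** (k + 1) for k, b in enumerate(beta))
    return sigma2 / (2 * np.pi) * np.abs(alpha_z) ** 2 / np.abs(beta_z) ** 2

lam = np.linspace(0.0, np.pi, 5)
print(arma_spectral_density(lam, beta=[0.8], alpha=[0.6, 0.4]))  # ARMA(1,2)
print(arma_spectral_density(lam, alpha=[0.6, 0.4]))              # MA(2)
print(arma_spectral_density(lam, beta=[0.8]))                    # AR(1)
```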
Slide 25
Three Important Forms of a Stationary Time Series
The difference equation form:
$$\beta(B) x_t = \delta + \alpha(B) u_t$$
or
$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$
Slide 26
The random shock form:
$$x_t = \mu + \psi(B) u_t$$
or
$$x_t = \mu + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$
where
$$\psi(B) = \beta(B)^{-1} \alpha(B) = I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \cdots$$
and
$$\mu = \beta(B)^{-1} \delta = \frac{\delta}{1 - \beta_1 - \beta_2 - \cdots - \beta_p}$$
Slide 27
The inverted form:
$$\pi(B) x_t = \delta + u_t$$
or
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$$
where
$$\pi(B) = [\alpha(B)]^{-1} \beta(B) = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$$
Slide 28
Models for Non-Stationary Time Series: the ARIMA(p,d,q) time series
Slide 29
An important fact: most non-stationary time series have changes that are stationary.
Recall the time series $\{x_t : t \in T\}$ defined by the equation $x_t = \beta_1 x_{t-1} + u_t$. Then:
1. If $|\beta_1| < 1$, the time series $\{x_t : t \in T\}$ is stationary.
2. If $|\beta_1| = 1$, the time series $\{x_t : t \in T\}$ is non-stationary.
3. If $|\beta_1| > 1$, the time series $\{x_t : t \in T\}$ is deterministic in nature.
Slide 30
In fact, if $\beta_1 = 1$ this equation becomes:
$$x_t = x_{t-1} + u_t$$
This is the equation of a well-known non-stationary time series, called a random walk.
Note: $x_t - x_{t-1} = (I - B) x_t = \nabla x_t = u_t$, where $\nabla = I - B$. Thus, by the simple transformation of computing first differences, we can convert the time series $\{x_t : t \in T\}$ into a stationary time series.
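A quick numerical illustration of this point, as a sketch under the assumption of standard normal noise (not part of the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=500)      # white noise u_t
x = np.cumsum(u)              # random walk: x_t = x_{t-1} + u_t
w = np.diff(x)                # first differences: (I - B) x_t
print(np.allclose(w, u[1:]))  # True -- differencing recovers the stationary noise
```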
Slide 31
Now consider the time series $\{x_t : t \in T\}$ defined by the equation:
$$\beta(B) x_t = \delta + \alpha(B) u_t$$
where $\beta(B) = I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_{p+d} B^{p+d}$. Let $r_1, r_2, \ldots, r_{p+d}$ be the roots of the polynomial $\beta(x)$, where:
$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_{p+d} x^{p+d}$$
32
Then 1) if |r i | > 1 for i = 1,2,...,p+d the time series {x t : t T} is a stationary time series. 2) if |r i | = 1 for at least one i (i = 1,2,...,p) and |r i | > 1 for the remaining values of i then the time series {x t : t T} is a non stationary time series. 3) if |r i | < 1 for at least one i (i = 1,2,...,p) then the time series {x t : t T} is a deterministic time series in nature.
Slide 33
Suppose that $d$ roots of the polynomial $\beta(x)$ are equal to unity. Then $\beta(x)$ can be written:
$$\beta(x) = (1 - \phi_1 x - \phi_2 x^2 - \cdots - \phi_p x^p)(1 - x)^d$$
and $\beta(B)$ could be written:
$$\beta(B) = (I - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p)(I - B)^d = \phi(B) \nabla^d$$
In this case the equation for the time series becomes:
$$\beta(B) x_t = \delta + \alpha(B) u_t \quad \text{or} \quad \phi(B) \nabla^d x_t = \delta + \alpha(B) u_t$$
34
Thus if we let w t = d x t then the equation for {w t : t T} becomes: (B)w t = + (B)u t Since the roots of (B) are all greater than 1 in absolute value then the time series {w t : t T} is a stationary ARMA(p,q) time series. The original time series, {x t : t T}, is called an ARIMA(p,d,q) time series or Integrated Moving Average Autoregressive time series. The reason for this terminology is that in the case that d = 1 then {x t : t T} can be expressed in terms of {w t : t T} as follows: x t = -1 w t = (I - B) -1 w t = (I + B + B 2 + B 3 + B 4 +...)w t = w t + w t-1 + w t-2 + w t-3 + w t-4 +...
Slide 35
Comments:
1. The operator $\beta(B) = \phi(B) \nabla^d$ is called the generalized autoregressive operator.
2. The operator $\phi(B)$ is called the autoregressive operator.
3. The operator $\alpha(B)$ is called the moving average operator.
4. If d = 0, the process is stationary and the level of the process is constant.
5. If d = 1, the level of the process is randomly changing.
6. If d = 2, the slope of the process is randomly changing.
Slide 36
$$\phi(B) x_t = \delta + \alpha(B) u_t$$
Slide 37
$$\phi(B) \nabla x_t = \delta + \alpha(B) u_t$$
Slide 38
$$\phi(B) \nabla^2 x_t = \delta + \alpha(B) u_t$$
Slide 39
Forecasting for ARIMA(p,d,q) Time Series
Slide 40
Consider the m+n random variables $x_1, x_2, \ldots, x_m, y_1, y_2, \ldots, y_n$ with joint density function $f(x_1, \ldots, x_m, y_1, \ldots, y_n) = f(\mathbf{x}, \mathbf{y})$, where $\mathbf{x} = (x_1, \ldots, x_m)$ and $\mathbf{y} = (y_1, \ldots, y_n)$.
Slide 41
Then the conditional density of $\mathbf{x} = (x_1, \ldots, x_m)$ given $\mathbf{y} = (y_1, \ldots, y_n)$ is defined to be:
$$f(\mathbf{x} \mid \mathbf{y}) = \frac{f(\mathbf{x}, \mathbf{y})}{f(\mathbf{y})}$$
Slide 42
In addition, the conditional expectation of $g(\mathbf{x}) = g(x_1, \ldots, x_m)$ given $\mathbf{y} = (y_1, \ldots, y_n)$ is defined to be:
$$E[g(\mathbf{x}) \mid \mathbf{y}] = \int g(\mathbf{x})\, f(\mathbf{x} \mid \mathbf{y})\, d\mathbf{x}$$
Slide 43
Prediction
Slide 44
Again consider the m+n random variables $x_1, \ldots, x_m, y_1, \ldots, y_n$. Suppose we are interested in predicting $g(x_1, \ldots, x_m) = g(\mathbf{x})$ given $\mathbf{y} = (y_1, \ldots, y_n)$. Let $t(y_1, \ldots, y_n) = t(\mathbf{y})$ denote any predictor of $g(\mathbf{x})$ given the information in the observations $\mathbf{y}$. Then the mean square error of $t(\mathbf{y})$ in predicting $g(\mathbf{x})$ is defined to be:
$$\mathrm{MSE}[t(\mathbf{y})] = E[\{t(\mathbf{y}) - g(\mathbf{x})\}^2 \mid \mathbf{y}]$$
Slide 45
It can be shown that the choice of $t(\mathbf{y})$ that minimizes $\mathrm{MSE}[t(\mathbf{y})]$ is $t(\mathbf{y}) = E[g(\mathbf{x}) \mid \mathbf{y}]$.
Proof: Let $v(t) = E\{[t - g(\mathbf{x})]^2 \mid \mathbf{y}\} = E[t^2 - 2 t g(\mathbf{x}) + g^2(\mathbf{x}) \mid \mathbf{y}] = t^2 - 2 t E[g(\mathbf{x}) \mid \mathbf{y}] + E[g^2(\mathbf{x}) \mid \mathbf{y}]$. Then $v'(t) = 2t - 2 E[g(\mathbf{x}) \mid \mathbf{y}] = 0$ when $t = E[g(\mathbf{x}) \mid \mathbf{y}]$.
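The slides prove this analytically; a small Monte Carlo check (an illustrative sketch using a bivariate normal pair, not from the deck) shows the conditional mean beating an alternative predictor:

```python
import numpy as np

# For jointly normal (x, y) with correlation r, E[x | y] = r * y.
rng = np.random.default_rng(1)
r, n = 0.7, 100_000
y = rng.normal(size=n)
x = r * y + np.sqrt(1 - r ** 2) * rng.normal(size=n)

mse_best = np.mean((r * y - x) ** 2)     # t(y) = E[x | y]; MSE -> 1 - r^2 = 0.51
mse_other = np.mean((0.9 * y - x) ** 2)  # any other predictor does worse
print(mse_best, mse_other)
```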
Slide 46
Three Important Forms of a Non-Stationary Time Series
The difference equation form:
$$\phi(B) \nabla^d x_t = \delta + \alpha(B) u_t \quad \text{or} \quad \beta(B) x_t = \delta + \alpha(B) u_t$$
or
$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_{p+d} x_{t-p-d} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$
Slide 47
The random shock form:
$$x_t = \mu(t) + \psi(B) u_t$$
or
$$x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$
where
$$\psi(B) = \beta(B)^{-1} \alpha(B) = [\phi(B) \nabla^d]^{-1} \alpha(B) = I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \cdots$$
$$\mu = \phi(B)^{-1} \delta = \frac{\delta}{1 - \phi_1 - \phi_2 - \cdots - \phi_p} \quad \text{and} \quad \mu(t) = \nabla^{-d} \mu$$
Slide 48
Note: $\nabla^d \mu(t) = \mu$, i.e., the d-th order differences of $\mu(t)$ are constant. This implies that $\mu(t)$ is a polynomial of degree d.
Slide 49
Consider the difference equation form:
$$\phi(B) \nabla^d x_t = \delta + \alpha(B) u_t \quad \text{or} \quad \beta(B) x_t = \delta + \alpha(B) u_t$$
Slide 50
Multiply both sides by $\beta(B)^{-1}$ to get:
$$\beta(B)^{-1} \beta(B) x_t = \beta(B)^{-1} \delta + \beta(B)^{-1} \alpha(B) u_t$$
or
$$x_t = \beta(B)^{-1} \delta + \beta(B)^{-1} \alpha(B) u_t = \mu(t) + \psi(B) u_t$$
Slide 51
The inverted form:
$$\pi(B) x_t = \delta + u_t$$
or
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$$
where
$$\pi(B) = [\alpha(B)]^{-1} \beta(B) = [\alpha(B)]^{-1} [\phi(B) \nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$$
Slide 52
Again consider the difference equation form:
$$\beta(B) x_t = \delta + \alpha(B) u_t$$
Multiply both sides by $\alpha(B)^{-1}$ to get:
$$\alpha(B)^{-1} \beta(B) x_t = \alpha(B)^{-1} \delta + \alpha(B)^{-1} \alpha(B) u_t$$
or
$$\pi(B) x_t = \delta + u_t$$
Slide 53
Forecasting an ARIMA(p,d,q) Time Series. Let $P_T$ denote $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$, the "past" up to time T. Then the optimal forecast of $x_{T+l}$ given $P_T$ is denoted by:
$$\hat{x}_{T+l} = E[x_{T+l} \mid P_T]$$
This forecast minimizes the mean square error $E[(x_{T+l} - \hat{x}_{T+l})^2 \mid P_T]$.
54
Three different forms of the forecast 1.Random Shock Form 2.Inverted Form 3.Difference Equation Form Note:
Slide 55
Random shock form of the forecast. Recall:
$$x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$
so
$$x_{T+l} = \mu(T+l) + u_{T+l} + \psi_1 u_{T+l-1} + \psi_2 u_{T+l-2} + \psi_3 u_{T+l-3} + \cdots$$
Taking expectations of both sides given $P_T$, and using $E[u_{T+h} \mid P_T] = 0$ for $h > 0$:
$$\hat{x}_{T+l} = \mu(T+l) + \psi_l u_T + \psi_{l+1} u_{T-1} + \psi_{l+2} u_{T-2} + \cdots$$
56
To compute this forecast we need to compute {…, u T-2, u T-1, u T } from {…, x T-2, x T-1, x T }. Note: x t = (t) + u t + 1 u t-1 + 2 u t-2 + 3 u t-3 +... Thus Which can be calculated recursively and
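In practice the recursion is easiest to run in the difference-equation form of the model. Here is a sketch (not from the slides; the function name `shocks` is made up) for the ARIMA(1,1,2) example developed on the later slides, with pre-sample values set to zero, which is an approximation:

```python
import numpy as np

def shocks(x, phi1=0.8, a1=0.6, a2=0.4):
    """Recover the random shocks u_t recursively from
    x_t = (1 + phi1) x_{t-1} - phi1 x_{t-2} + u_t + a1 u_{t-1} + a2 u_{t-2},
    treating pre-sample x's and u's as zero (an approximation)."""
    x = np.asarray(x, float)
    u = np.zeros_like(x)
    for t in range(len(x)):
        xm1 = x[t - 1] if t >= 1 else 0.0
        xm2 = x[t - 2] if t >= 2 else 0.0
        um1 = u[t - 1] if t >= 1 else 0.0
        um2 = u[t - 2] if t >= 2 else 0.0
        u[t] = x[t] - (1 + phi1) * xm1 + phi1 * xm2 - a1 * um1 - a2 * um2
    return u
```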
Slide 57
The error in the forecast:
$$e_T(l) = x_{T+l} - \hat{x}_{T+l} = u_{T+l} + \psi_1 u_{T+l-1} + \cdots + \psi_{l-1} u_{T+1}$$
Hence the mean square error of the forecast is:
$$E[e_T(l)^2] = \sigma^2 \left(1 + \psi_1^2 + \psi_2^2 + \cdots + \psi_{l-1}^2\right)$$
Slide 58
Prediction limits for forecasts. The $(1 - \alpha)100\%$ prediction limits for $x_{T+l}$ are:
$$\hat{x}_{T+l} \pm z_{\alpha/2}\, \sigma \sqrt{1 + \psi_1^2 + \cdots + \psi_{l-1}^2}$$
Slide 59
The inverted form (recalled):
$$\pi(B) x_t = \delta + u_t \quad \text{or} \quad x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$$
where $\pi(B) = [\alpha(B)]^{-1} \beta(B) = [\alpha(B)]^{-1} [\phi(B) \nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$
Slide 60
The inverted form of the forecast:
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + \delta + u_t$$
and for $t = T + l$:
$$x_{T+l} = \pi_1 x_{T+l-1} + \pi_2 x_{T+l-2} + \cdots + \delta + u_{T+l}$$
Taking conditional expectations:
$$\hat{x}_{T+l} = \pi_1 \hat{x}_{T+l-1} + \pi_2 \hat{x}_{T+l-2} + \cdots + \delta$$
Note: $\hat{x}_{T+h} = x_{T+h}$ for $h \le 0$.
Slide 61
The difference equation form of the forecast:
$$x_{T+l} = \beta_1 x_{T+l-1} + \beta_2 x_{T+l-2} + \cdots + \beta_{p+d} x_{T+l-p-d} + \delta + u_{T+l} + \alpha_1 u_{T+l-1} + \alpha_2 u_{T+l-2} + \cdots + \alpha_q u_{T+l-q}$$
Taking conditional expectations:
$$\hat{x}_{T+l} = \beta_1 \hat{x}_{T+l-1} + \cdots + \beta_{p+d} \hat{x}_{T+l-p-d} + \delta + \hat{u}_{T+l} + \alpha_1 \hat{u}_{T+l-1} + \cdots + \alpha_q \hat{u}_{T+l-q}$$
where $\hat{x}_{T+h} = x_{T+h}$ and $\hat{u}_{T+h} = u_{T+h}$ for $h \le 0$, and $\hat{u}_{T+h} = 0$ for $h > 0$.
Slide 62
Example. The model:
$$x_t - x_{t-1} = \phi_1 (x_{t-1} - x_{t-2}) + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$$
or
$$x_t = (1 + \phi_1) x_{t-1} - \phi_1 x_{t-2} + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$$
or $\beta(B) x_t = \phi(B)(I - B) x_t = \alpha(B) u_t$, where
$$\beta(x) = 1 - (1 + \phi_1) x + \phi_1 x^2 = (1 - \phi_1 x)(1 - x) \quad \text{and} \quad \alpha(x) = 1 + \alpha_1 x + \alpha_2 x^2$$
This is an ARIMA(1,1,2) model.
Slide 63
The random shock form of the model: $x_t = \psi(B) u_t$, where
$$\psi(B) = [\phi(B)(I - B)]^{-1} \alpha(B) = [\beta(B)]^{-1} \alpha(B), \quad \text{i.e.} \quad \psi(B)\, \beta(B) = \alpha(B)$$
Thus
$$(I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \psi_4 B^4 + \cdots)(I - (1 + \phi_1) B + \phi_1 B^2) = I + \alpha_1 B + \alpha_2 B^2$$
Hence, equating coefficients:
$$\psi_1 - (1 + \phi_1) = \alpha_1 \quad \text{or} \quad \psi_1 = 1 + \phi_1 + \alpha_1$$
$$\psi_2 - \psi_1 (1 + \phi_1) + \phi_1 = \alpha_2 \quad \text{or} \quad \psi_2 = \psi_1 (1 + \phi_1) - \phi_1 + \alpha_2$$
$$0 = \psi_h - \psi_{h-1} (1 + \phi_1) + \psi_{h-2} \phi_1 \quad \text{or} \quad \psi_h = \psi_{h-1} (1 + \phi_1) - \psi_{h-2} \phi_1 \quad \text{for } h \ge 3$$
Slide 64
The inverted form of the model: $\pi(B) x_t = u_t$, where
$$\pi(B) = [\alpha(B)]^{-1} \phi(B)(I - B) = [\alpha(B)]^{-1} \beta(B), \quad \text{i.e.} \quad \alpha(B)\, \pi(B) = \beta(B)$$
Thus
$$(I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \pi_4 B^4 - \cdots)(I + \alpha_1 B + \alpha_2 B^2) = I - (1 + \phi_1) B + \phi_1 B^2$$
Hence, equating coefficients:
$$-(1 + \phi_1) = \alpha_1 - \pi_1 \quad \text{or} \quad \pi_1 = 1 + \phi_1 + \alpha_1$$
$$\phi_1 = -\pi_2 - \pi_1 \alpha_1 + \alpha_2 \quad \text{or} \quad \pi_2 = -\pi_1 \alpha_1 - \phi_1 + \alpha_2$$
$$0 = -\pi_h - \pi_{h-1} \alpha_1 - \pi_{h-2} \alpha_2 \quad \text{or} \quad \pi_h = -(\pi_{h-1} \alpha_1 + \pi_{h-2} \alpha_2) \quad \text{for } h \ge 3$$
Slide 65
Now suppose that $\phi_1 = 0.80$, $\alpha_1 = 0.60$, and $\alpha_2 = 0.40$. Then the random shock form coefficients $\psi_h$ and the inverted form coefficients $\pi_h$ can easily be computed and are tabled below:
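The table's values did not survive the transcript, but a short sketch (Python, not from the slides) regenerates them by running both recursions from the previous two slides; the horizon of 12 matches the later MSE table.

```python
phi1, a1, a2 = 0.80, 0.60, 0.40
H = 12
psi = [1.0, 1 + phi1 + a1]                   # psi_0 = 1, psi_1 = 2.4
psi.append(psi[1] * (1 + phi1) - phi1 + a2)  # psi_2 = 3.92
pi = [0.0, 1 + phi1 + a1]                    # pi_1 = 2.4 (pi_0 unused)
pi.append(-pi[1] * a1 - phi1 + a2)           # pi_2 = -1.84
for h in range(3, H + 1):
    psi.append(psi[h - 1] * (1 + phi1) - psi[h - 2] * phi1)
    pi.append(-(pi[h - 1] * a1 + pi[h - 2] * a2))
for h in range(1, H + 1):
    print(f"h = {h:2d}  psi_h = {psi[h]:12.4f}  pi_h = {pi[h]:8.4f}")
```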
Slide 66
The Forecast Equations
Slide 67
The difference form of the forecast equation. For the example model:
$$\hat{x}_{T+l} = (1 + \phi_1) \hat{x}_{T+l-1} - \phi_1 \hat{x}_{T+l-2} + \hat{u}_{T+l} + \alpha_1 \hat{u}_{T+l-1} + \alpha_2 \hat{u}_{T+l-2}$$
with $\hat{x}_{T+h} = x_{T+h}$ and $\hat{u}_{T+h} = u_{T+h}$ for $h \le 0$, and $\hat{u}_{T+h} = 0$ for $h > 0$.
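A sketch of this forecast recursion for the example model (the function name `forecast` is assumed, and it needs at least two observed values and two shocks):

```python
def forecast(x, u, L, phi1=0.8, a1=0.6, a2=0.4):
    """L forecasts x_hat_{T+1}, ..., x_hat_{T+L} from the difference form,
    given the observed series x and its shocks u up to time T."""
    xs, us = list(x), list(u)
    for _ in range(L):
        us.append(0.0)  # hat u_{T+l} = 0 for l > 0
        xs.append((1 + phi1) * xs[-1] - phi1 * xs[-2]
                  + us[-1] + a1 * us[-2] + a2 * us[-3])
    return xs[len(x):]
```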
Slide 68
Computation of the random shock series and the one-step forecasts. [Table: one-step forecasts and random shock computations; values not reproduced in this transcript]
Slide 69
Computation of the mean square error of the forecasts and prediction limits. [Table: MSE of the forecasts and prediction limits; values not reproduced in this transcript]
Slide 70
Table: MSE of forecasts to lead time $l = 12$ ($\sigma^2 = 2.56$). [Values not reproduced in this transcript]
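The tabulated values can be regenerated from the $\psi$ recursion and the MSE formula $\sigma^2(1 + \psi_1^2 + \cdots + \psi_{l-1}^2)$; a sketch follows, where 1.96 is assumed as the 95% normal quantile:

```python
import numpy as np

phi1, a1, a2, sigma2 = 0.80, 0.60, 0.40, 2.56
psi = [1.0, 1 + phi1 + a1, (1 + phi1 + a1) * (1 + phi1) - phi1 + a2]
for h in range(3, 12):
    psi.append(psi[-1] * (1 + phi1) - psi[-2] * phi1)

mse = sigma2 * np.cumsum(np.array(psi) ** 2)  # MSE at leads l = 1..12
for l, m in enumerate(mse, start=1):
    print(f"l = {l:2d}  MSE = {m:12.2f}  95% limits: +/- {1.96 * np.sqrt(m):10.2f}")
```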
Slide 71
Table: raw observations, one-step-ahead forecasts, and estimated errors. [Values not reproduced in this transcript]
Slide 72
Forecasts with 95% and 66.7% prediction limits. [Table values not reproduced in this transcript]
Slide 73
Graph: forecasts with 95% and 66.7% prediction limits. [Graph not reproduced in this transcript]
Slide 74
Next topic: Modelling Seasonal Time Series