Box-Jenkins Models for Stationary Time Series: AR(p), MA(q), ARMA(p,q)


1 Review and Summary
Box-Jenkins models for stationary time series: AR(p), MA(q), ARMA(p,q)

2 Models for Non-Stationary Time Series

3 The Moving Average time series of order q, MA(q)
$\{x_t : t \in T\}$ is a moving average time series of order q, MA(q), if it satisfies the equation: $x_t = \mu + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$.

4 The mean of an MA(q) time series: $E[x_t] = \mu$. The autocovariance function for an MA(q) time series: $\sigma(h) = \sigma^2\sum_{i=0}^{q-|h|}\alpha_i\alpha_{i+|h|}$ for $|h| \le q$ (with $\alpha_0 = 1$) and $\sigma(h) = 0$ for $|h| > q$. The autocorrelation function for an MA(q) time series: $\rho(h) = \sigma(h)/\sigma(0)$.
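As a quick numerical check on these formulas, here is a short Python sketch (not part of the original slides; the coefficients $\alpha_1 = 0.6$, $\alpha_2 = 0.4$ and the sample size are illustrative choices) that simulates an MA(2) series and compares sample autocorrelations with the theoretical $\rho(h)$:

```python
# Simulate an MA(2) and compare sample vs theoretical autocorrelations.
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([1.0, 0.6, 0.4])      # alpha_0 = 1, alpha_1, alpha_2
n = 10_000

u = rng.normal(0.0, 1.0, n + len(alpha) - 1)     # white noise
x = np.convolve(u, alpha, mode="valid")          # x_t = sum_i alpha_i u_{t-i}

def theoretical_rho(alpha, h):
    """rho(h) for an MA(q); zero beyond lag q."""
    q = len(alpha) - 1
    if h > q:
        return 0.0
    num = sum(alpha[i] * alpha[i + h] for i in range(q - h + 1))
    return num / np.sum(alpha**2)

def sample_rho(x, h):
    x = x - x.mean()
    return np.sum(x[:-h] * x[h:]) / np.sum(x**2) if h > 0 else 1.0

for h in range(4):
    print(h, round(theoretical_rho(alpha, h), 3), round(sample_rho(x, h), 3))
```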

5 The Autoregressive time series of order p, AR(p)
$\{x_t : t \in T\}$ is called an autoregressive time series of order p, AR(p), if it satisfies the equation: $x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$.

6 The mean value of a stationary AR(p) series: $\mu = E[x_t] = \dfrac{\delta}{1 - \beta_1 - \beta_2 - \cdots - \beta_p}$.
The autocovariance function $\sigma(h)$ of a stationary AR(p) series satisfies the equations: $\sigma(h) = \beta_1\sigma(h-1) + \cdots + \beta_p\sigma(h-p)$ for $h > 0$, and $\sigma(0) = \beta_1\sigma(1) + \cdots + \beta_p\sigma(p) + \sigma^2$.

7 The autocorrelation function $\rho(h)$ of a stationary AR(p) series satisfies the equations: $\rho(h) = \beta_1\rho(h-1) + \beta_2\rho(h-2) + \cdots + \beta_p\rho(h-p)$ for $h > 0$, with $\rho(0) = 1$ and $\rho(-h) = \rho(h)$.

8 or: $\rho(h) = c_1 r_1^{-h} + c_2 r_2^{-h} + \cdots + c_p r_p^{-h}$, where $r_1, r_2, \ldots, r_p$ are the roots of the polynomial $\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p$ and $c_1, c_2, \ldots, c_p$ are determined by using the starting values of the sequence $\rho(h)$.
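A small illustrative sketch of these recursions (the AR(2) parameters are hypothetical choices, not from the slides): generating $\rho(h)$ from the Yule-Walker difference equation, starting from $\rho(0) = 1$ and $\rho(1) = \beta_1/(1 - \beta_2)$, the standard AR(2) starting value:

```python
# Theoretical ACF of an AR(2) via the Yule-Walker recursion
# rho(h) = beta1*rho(h-1) + beta2*rho(h-2) for h >= 2.
beta1, beta2 = 1.2, -0.5

rho = [1.0, beta1 / (1.0 - beta2)]
for h in range(2, 11):
    rho.append(beta1 * rho[h - 1] + beta2 * rho[h - 2])

for h, r in enumerate(rho):
    print(f"rho({h}) = {r: .4f}")   # damped oscillation: the roots are complex
```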

9 Stationarity of an AR(p) time series:
Consider the polynomial $\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p$ with roots $r_1, r_2, \ldots, r_p$.
1. $\{x_t : t \in T\}$ is stationary if $|r_i| > 1$ for all $i$.
2. If $|r_i| < 1$ for at least one $i$ then $\{x_t : t \in T\}$ exhibits deterministic behaviour.
3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$ then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.
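The root condition is easy to check numerically. A minimal sketch, assuming the coefficients $\beta_1, \ldots, \beta_p$ are given (note that numpy's `roots` expects coefficients from the highest degree down, so the polynomial is passed in reversed order):

```python
import numpy as np

def ar_is_stationary(beta):
    """True if all roots of 1 - beta1*x - ... - betap*x^p lie outside |x| = 1."""
    coeffs = np.r_[-np.asarray(beta)[::-1], 1.0]   # [-betap, ..., -beta1, 1]
    roots = np.roots(coeffs)
    return np.all(np.abs(roots) > 1.0)

print(ar_is_stationary([1.2, -0.5]))   # True  (both roots outside the unit circle)
print(ar_is_stationary([1.0]))         # False (unit root: the random walk)
```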

10 The Mixed Autoregressive Moving Average time series of order (p,q), ARMA(p,q)
A mixed autoregressive moving average time series, an ARMA(p,q) series, $\{x_t : t \in T\}$ satisfies the equation: $x_t = \beta_1 x_{t-1} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \cdots + \alpha_q u_{t-q}$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$.

11 The mean value of a stationary ARMA(p,q) series: $\mu = E[x_t] = \dfrac{\delta}{1 - \beta_1 - \cdots - \beta_p}$.
Stationarity of an ARMA(p,q) series: consider the polynomial $\beta(x) = 1 - \beta_1 x - \cdots - \beta_p x^p$ with roots $r_1, r_2, \ldots, r_p$.
1. $\{x_t : t \in T\}$ is stationary if $|r_i| > 1$ for all $i$.
2. If $|r_i| < 1$ for at least one $i$ then $\{x_t : t \in T\}$ exhibits deterministic behaviour.
3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$ then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.

12 The autocovariance function $\sigma(h)$ of an ARMA(p,q) series satisfies:
For $h = 0, 1, \ldots, q$: $\sigma(h) = \beta_1\sigma(h-1) + \cdots + \beta_p\sigma(h-p) + \sigma_{ux}(h) + \alpha_1\sigma_{ux}(h-1) + \cdots + \alpha_q\sigma_{ux}(h-q)$, where $\sigma_{ux}(k) = \mathrm{Cov}(u_{t+k}, x_t)$; for $h > q$: $\sigma(h) = \beta_1\sigma(h-1) + \cdots + \beta_p\sigma(h-p)$.

13 [Table: values of the cross-covariance $\sigma_{ux}(h)$ at lags $h = 0, -1, -2, -3$; $\sigma_{ux}(h) = 0$ for $h > 0$, since $x_t$ involves only current and past shocks.]

14 The partial autocorrelation function at lag k, $\Phi_{kk}$, is defined to be the last coefficient in the solution of the equations $\rho(j) = \Phi_{k1}\rho(j-1) + \Phi_{k2}\rho(j-2) + \cdots + \Phi_{kk}\rho(j-k)$, $j = 1, \ldots, k$. Using Cramer's rule: $\Phi_{kk} = \dfrac{\det P_k^*}{\det P_k}$, where $P_k = [\rho(i-j)]_{i,j=1}^{k}$ and $P_k^*$ is $P_k$ with its last column replaced by $(\rho(1), \rho(2), \ldots, \rho(k))'$.

15 A recursive formula for $\Phi_{kk}$ (the Durbin-Levinson recursion):
Starting with $\Phi_{11} = \rho_1$, $\Phi_{k+1,k+1} = \dfrac{\rho_{k+1} - \sum_{j=1}^{k}\Phi_{kj}\,\rho_{k+1-j}}{1 - \sum_{j=1}^{k}\Phi_{kj}\,\rho_j}$ and $\Phi_{k+1,j} = \Phi_{kj} - \Phi_{k+1,k+1}\Phi_{k,k+1-j}$ for $j = 1, \ldots, k$.
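The recursion translates directly into code. A minimal sketch (the function name `pacf_from_acf` is mine, not from the slides), taking autocorrelations $\rho(1), \rho(2), \ldots$ and returning $\Phi_{11}, \Phi_{22}, \ldots$:

```python
import numpy as np

def pacf_from_acf(rho):
    """Durbin-Levinson: return Phi_kk for k = 1..len(rho); rho[0] = rho(1), ..."""
    rho = np.asarray(rho, dtype=float)
    pacf = [rho[0]]                   # Phi_11 = rho(1)
    phi = np.array([rho[0]])          # current row Phi_k1 .. Phi_kk
    for k in range(1, len(rho)):
        num = rho[k] - np.dot(phi, rho[k - 1::-1])   # rho(k+1) - sum Phi_kj rho(k+1-j)
        den = 1.0 - np.dot(phi, rho[:k])             # 1 - sum Phi_kj rho(j)
        phi_next = num / den                         # Phi_{k+1,k+1}
        phi = np.append(phi - phi_next * phi[::-1], phi_next)
        pacf.append(phi_next)
    return pacf

# For an AR(1) with rho(h) = 0.7**h, the PACF should cut off after lag 1:
print(pacf_from_acf([0.7**h for h in range(1, 6)]))
```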

16 Spectral density function $f(\lambda)$

17 Let $\{x_t : t \in T\}$ denote a time series with autocovariance function $\sigma(h)$ and let $f(\lambda)$ satisfy: $\sigma(h) = \int_{-\pi}^{\pi} e^{ih\lambda} f(\lambda)\,d\lambda$.
Then $f(\lambda)$ is called the spectral density function of the time series $\{x_t : t \in T\}$: $f(\lambda) = \dfrac{1}{2\pi}\sum_{h=-\infty}^{\infty}\sigma(h)\,e^{-ih\lambda}$.

18 Linear Filters

19 Let $\{x_t : t \in T\}$ be any time series and suppose that the time series $\{y_t : t \in T\}$ is constructed as follows: $y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s} = a(B)x_t$. The time series $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter, with input $x_t$ and output $y_t$.

20 Spectral theory for linear filters:
If $\{y_t : t \in T\}$ is obtained from $\{x_t : t \in T\}$ by the linear filter $y_t = a(B)x_t$, then $f_y(\lambda) = \left|a(e^{-i\lambda})\right|^2 f_x(\lambda)$, where $a(e^{-i\lambda}) = \sum_s a_s e^{-is\lambda}$.

21 Applications. The moving average time series of order q, MA(q): $x_t - \mu = \alpha(B)u_t$, so $f_x(\lambda) = |\alpha(e^{-i\lambda})|^2 f_u(\lambda) = \dfrac{\sigma^2}{2\pi}\,|\alpha(e^{-i\lambda})|^2$, since $f_u(\lambda) = \dfrac{\sigma^2}{2\pi}$.

22 The autoregressive time series of order p, AR(p): $\beta(B)(x_t - \mu) = u_t$, so $f_x(\lambda) = \dfrac{\sigma^2}{2\pi}\,\dfrac{1}{|\beta(e^{-i\lambda})|^2}$, since $|\beta(e^{-i\lambda})|^2 f_x(\lambda) = f_u(\lambda)$.

23 The ARMA(p,q) time series: $\beta(B)(x_t - \mu) = \alpha(B)u_t = z_t$, where $\{z_t : t \in T\}$ is an MA(q) time series, so $f_x(\lambda) = \dfrac{\sigma^2}{2\pi}\,\dfrac{|\alpha(e^{-i\lambda})|^2}{|\beta(e^{-i\lambda})|^2}$.
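A short sketch evaluating this ARMA spectral density on a frequency grid (the model $\beta(x) = 1 - 0.8x$, $\alpha(x) = 1 + 0.6x$ and $\sigma^2 = 1$ are illustrative choices, not from the slides):

```python
# f(lambda) = (sigma^2 / (2*pi)) * |alpha(e^{-i*lambda})|^2 / |beta(e^{-i*lambda})|^2
import numpy as np

sigma2 = 1.0
beta = [0.8]          # AR side: beta(x) = 1 - 0.8 x
alpha = [0.6]         # MA side: alpha(x) = 1 + 0.6 x

lam = np.linspace(0.0, np.pi, 5)
z = np.exp(-1j * lam)

num = np.abs(1.0 + sum(a * z**(k + 1) for k, a in enumerate(alpha))) ** 2
den = np.abs(1.0 - sum(b * z**(k + 1) for k, b in enumerate(beta))) ** 2
f = sigma2 / (2.0 * np.pi) * num / den

for l_, f_ in zip(lam, f):
    print(f"lambda = {l_:.3f}  f = {f_:.4f}")   # mass concentrated at low frequencies
```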


25 Models for Non-Stationary Time Series
The ARIMA(p,d,q) time series

26 An important fact: most non-stationary time series have changes that are stationary.
Recall the time series $\{x_t : t \in T\}$ defined by the equation: $x_t = \beta_1 x_{t-1} + u_t$. Then
1) if $|\beta_1| < 1$ the time series $\{x_t : t \in T\}$ is a stationary time series;
2) if $|\beta_1| = 1$ the time series $\{x_t : t \in T\}$ is a non-stationary time series;
3) if $|\beta_1| > 1$ the time series $\{x_t : t \in T\}$ is deterministic in nature.

27 In fact, if $\beta_1 = 1$ this equation becomes: $x_t = x_{t-1} + u_t$. This is the equation of a well-known non-stationary time series (called a random walk).
Note: $x_t - x_{t-1} = (I - B)x_t = \nabla x_t = u_t$, where $\nabla = I - B$. Thus by the simple transformation of computing first differences we can convert the time series $\{x_t : t \in T\}$ into a stationary time series.
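A quick numerical illustration of this point (the simulation parameters are arbitrary): differencing a simulated random walk recovers the underlying white noise exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.normal(0.0, 1.0, 500)     # white noise
x = np.cumsum(u)                  # random walk: x_t = x_{t-1} + u_t

dx = np.diff(x)                   # first differences: (I - B)x_t
print(np.allclose(dx, u[1:]))     # True: differencing returns u_t exactly
```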

28 Now consider the time series $\{x_t : t \in T\}$ defined by the equation: $\phi(B)x_t = \delta + \alpha(B)u_t$, where $\phi(B) = I - \phi_1 B - \phi_2 B^2 - \cdots - \phi_{p+d}B^{p+d}$. Let $r_1, r_2, \ldots, r_{p+d}$ be the roots of the polynomial $\phi(x)$, where $\phi(x) = 1 - \phi_1 x - \phi_2 x^2 - \cdots - \phi_{p+d}x^{p+d}$.

29 Then
1) if $|r_i| > 1$ for $i = 1, 2, \ldots, p+d$ the time series $\{x_t : t \in T\}$ is a stationary time series;
2) if $|r_i| = 1$ for at least one $i$ and $|r_i| > 1$ for the remaining values of $i$, the time series $\{x_t : t \in T\}$ is a non-stationary time series;
3) if $|r_i| < 1$ for at least one $i$ the time series $\{x_t : t \in T\}$ is deterministic in nature.

30 Suppose that $d$ roots of the polynomial $\phi(x)$ are equal to unity; then $\phi(x)$ can be written: $\phi(x) = (1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p)(1 - x)^d$, and $\phi(B)$ can be written: $\phi(B) = (I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_p B^p)(I - B)^d = \beta(B)\nabla^d$. In this case the equation for the time series becomes: $\phi(B)x_t = \delta + \alpha(B)u_t$, or $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$.

31 Thus if we let $w_t = \nabla^d x_t$ then the equation for $\{w_t : t \in T\}$ becomes: $\beta(B)w_t = \delta + \alpha(B)u_t$. Since the roots of $\beta(x)$ are all greater than 1 in absolute value, the time series $\{w_t : t \in T\}$ is a stationary ARMA(p,q) time series. The original time series $\{x_t : t \in T\}$ is called an ARIMA(p,d,q), or Autoregressive Integrated Moving Average, time series. The reason for this terminology is that in the case $d = 1$, $\{x_t : t \in T\}$ can be expressed in terms of $\{w_t : t \in T\}$ as follows: $x_t = \nabla^{-1}w_t = (I - B)^{-1}w_t = (I + B + B^2 + B^3 + B^4 + \cdots)w_t = w_t + w_{t-1} + w_{t-2} + w_{t-3} + \cdots$, i.e. the series is obtained by summing (integrating) the stationary series.
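A sketch of the $d = 1$ case (the parameters $\beta_1 = 0.5$, $\alpha_1 = 0.3$ are illustrative): simulate a stationary ARMA(1,1) series $w_t$ and integrate it by cumulative summation to obtain an ARIMA(1,1,1) series $x_t$:

```python
import numpy as np

rng = np.random.default_rng(2)
beta1, alpha1 = 0.5, 0.3
n = 1000

u = rng.normal(size=n)
w = np.zeros(n)
for t in range(1, n):
    # ARMA(1,1): w_t = beta1*w_{t-1} + u_t + alpha1*u_{t-1}
    w[t] = beta1 * w[t - 1] + u[t] + alpha1 * u[t - 1]

x = np.cumsum(w)    # x_t = w_t + w_{t-1} + ... : the integrated (ARIMA) series
print(np.allclose(np.diff(x), w[1:]))   # differencing x recovers w
```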

32 Comments:
1. The operator $\phi(B) = \beta(B)\nabla^d$ is called the generalized autoregressive operator.
2. The operator $\beta(B)$ is called the autoregressive operator.
3. The operator $\alpha(B)$ is called the moving average operator.
4. If $d = 0$ then the process is stationary and the level of the process is constant.
5. If $d = 1$ then the level of the process is randomly changing.
6. If $d = 2$ then the slope of the process is randomly changing.

33 $\beta(B)x_t = \delta + \alpha(B)u_t$ (the case $d = 0$)

34 $\beta(B)\nabla x_t = \delta + \alpha(B)u_t$ (the case $d = 1$)

35 $\beta(B)\nabla^2 x_t = \delta + \alpha(B)u_t$ (the case $d = 2$)

36 Forecasting for ARIMA(p,d,q) Time Series

37 Consider the $m+n$ random variables $x_1, x_2, \ldots, x_m, y_1, y_2, \ldots, y_n$ with joint density function $f(x_1, \ldots, x_m, y_1, \ldots, y_n) = f(\mathbf{x}, \mathbf{y})$, where $\mathbf{x} = (x_1, \ldots, x_m)$ and $\mathbf{y} = (y_1, \ldots, y_n)$.

38 Then the conditional density of $\mathbf{x} = (x_1, \ldots, x_m)$ given $\mathbf{y} = (y_1, \ldots, y_n)$ is defined to be: $f(\mathbf{x} \mid \mathbf{y}) = \dfrac{f(\mathbf{x}, \mathbf{y})}{f(\mathbf{y})}$.

39 In addition, the conditional expectation of $g(\mathbf{x}) = g(x_1, \ldots, x_m)$ given $\mathbf{y} = (y_1, \ldots, y_n)$ is defined to be: $E[g(\mathbf{x}) \mid \mathbf{y}] = \int \cdots \int g(\mathbf{x})\,f(\mathbf{x} \mid \mathbf{y})\,d\mathbf{x}$.

40 Prediction

41 Again consider the $m+n$ random variables $x_1, \ldots, x_m, y_1, \ldots, y_n$.
Suppose we are interested in predicting $g(x_1, \ldots, x_m) = g(\mathbf{x})$ given $\mathbf{y} = (y_1, \ldots, y_n)$. Let $t(y_1, \ldots, y_n) = t(\mathbf{y})$ denote any predictor of $g(\mathbf{x})$ given the information in the observations $\mathbf{y}$. Then the mean square error of $t(\mathbf{y})$ in predicting $g(\mathbf{x})$ is defined to be $\mathrm{MSE}[t(\mathbf{y})] = E\{[t(\mathbf{y}) - g(\mathbf{x})]^2 \mid \mathbf{y}\}$.

42 It can be shown that the choice of $t(\mathbf{y})$ that minimizes $\mathrm{MSE}[t(\mathbf{y})]$ is $t(\mathbf{y}) = E[g(\mathbf{x}) \mid \mathbf{y}]$.
Proof: Let $v(t) = E\{[t - g(\mathbf{x})]^2 \mid \mathbf{y}\} = E[t^2 - 2t\,g(\mathbf{x}) + g^2(\mathbf{x}) \mid \mathbf{y}] = t^2 - 2t\,E[g(\mathbf{x}) \mid \mathbf{y}] + E[g^2(\mathbf{x}) \mid \mathbf{y}]$. Then $v'(t) = 2t - 2E[g(\mathbf{x}) \mid \mathbf{y}] = 0$ when $t = E[g(\mathbf{x}) \mid \mathbf{y}]$, and $v''(t) = 2 > 0$, so this is a minimum.

43 Three Important Forms of a Non-Stationary Time Series
The difference equation form: $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$ or $\phi(B)x_t = \delta + \alpha(B)u_t$, i.e. $x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_{p+d}x_{t-p-d} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$.

44 The random shock form: $x_t = \mu(t) + \psi(B)u_t$, or $x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$, where $\psi(B) = [\phi(B)]^{-1}\alpha(B) = [\beta(B)\nabla^d]^{-1}\alpha(B) = I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \cdots$, $\mu = [\beta(B)]^{-1}\delta = \delta/(1 - \beta_1 - \beta_2 - \cdots - \beta_p)$, and $\mu(t) = \nabla^{-d}\mu$.

45 Note: $\nabla^d\mu(t) = \mu$, i.e. the $d$th order differences are constant. This implies that $\mu(t)$ is a polynomial of degree $d$.

46 Consider the difference equation form: $\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t$ or $\phi(B)x_t = \delta + \alpha(B)u_t$.

47 Multiply both sides by $\phi(B)^{-1}$ to get $\phi(B)^{-1}\phi(B)x_t = \phi(B)^{-1}\delta + \phi(B)^{-1}\alpha(B)u_t$, or $x_t = \phi(B)^{-1}\delta + \phi(B)^{-1}\alpha(B)u_t$.

48 The inverted form: $\pi(B)x_t = \tau + u_t$, or $x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \tau + u_t$, where $\pi(B) = [\alpha(B)]^{-1}\phi(B) = [\alpha(B)]^{-1}[\beta(B)\nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$.

49 Again consider the difference equation form: $\phi(B)x_t = \delta + \alpha(B)u_t$. Multiply both sides by $\alpha(B)^{-1}$ to get $\alpha(B)^{-1}\phi(B)x_t = \alpha(B)^{-1}\delta + \alpha(B)^{-1}\alpha(B)u_t$, or $\pi(B)x_t = \tau + u_t$.

50 Forecasting an ARIMA(p,d,q) Time Series
Let $P_T$ denote $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$ = the "past" up to time $T$. Then the optimal forecast of $x_{T+l}$ given $P_T$ is denoted by $\hat{x}_T(l) = E[x_{T+l} \mid P_T]$. This forecast minimizes the mean square error.

51 Three different forms of the forecast:
1. Random shock form
2. Inverted form
3. Difference equation form
Note: $E[u_{T+l} \mid P_T] = 0$ for $l > 0$, while $E[u_{T+l} \mid P_T] = u_{T+l}$ for $l \le 0$.

52 Random shock form of the forecast
Recall $x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$, or, for $t = T + l$: $x_{T+l} = \mu(T+l) + u_{T+l} + \psi_1 u_{T+l-1} + \psi_2 u_{T+l-2} + \psi_3 u_{T+l-3} + \cdots$. Taking expectations of both sides given $P_T$ and using the note above: $\hat{x}_T(l) = \mu(T+l) + \psi_l u_T + \psi_{l+1}u_{T-1} + \psi_{l+2}u_{T-2} + \cdots$.

53 To compute this forecast we need to compute $\{\ldots, u_{T-2}, u_{T-1}, u_T\}$ from $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$.
Note: $x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$. Thus $u_t = x_t - \mu(t) - \psi_1 u_{t-1} - \psi_2 u_{t-2} - \cdots$, which can be calculated recursively.
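A minimal sketch of this recursion (the helper `recover_shocks` is hypothetical, the $\psi$ weights are assumed truncated to a finite list, and $u_s = 0$ is assumed before the start of the record):

```python
def recover_shocks(x, mu, psi):
    """u_t = x_t - mu(t) - sum_j psi_j * u_{t-j}, computed recursively."""
    u = []
    for t, xt in enumerate(x):
        s = sum(p * u[t - j] for j, p in enumerate(psi, start=1) if t - j >= 0)
        u.append(xt - mu(t) - s)
    return u

# Illustrative use: an MA(1) with mean 0, so psi = [0.6] and u_t = x_t - 0.6*u_{t-1}.
print(recover_shocks([1.0, 0.2, -0.5], mu=lambda t: 0.0, psi=[0.6]))
```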

54 The error in the forecast: $e_T(l) = x_{T+l} - \hat{x}_T(l) = u_{T+l} + \psi_1 u_{T+l-1} + \cdots + \psi_{l-1}u_{T+1}$.
The mean square error in the forecast: $E[e_T(l)^2] = \sigma^2\left(1 + \psi_1^2 + \psi_2^2 + \cdots + \psi_{l-1}^2\right)$. Hence the forecast variance grows with the lead time $l$.

55 Prediction limits for forecasts
$(1 - a)100\%$ confidence limits for $x_{T+l}$: $\hat{x}_T(l) \pm z_{a/2}\,\sigma\sqrt{1 + \psi_1^2 + \cdots + \psi_{l-1}^2}$.
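These limits are straightforward to compute once the $\psi$ weights are known. A sketch, assuming $z_{a/2} = 1.96$ for 95% limits and illustrative numbers throughout:

```python
import math

def prediction_limits(xhat, psi, sigma, lead, z=1.96):
    """Limits xhat +/- z*sigma*sqrt(1 + psi_1^2 + ... + psi_{lead-1}^2); psi[j-1] holds psi_j."""
    var_factor = 1.0 + sum(psi[j] ** 2 for j in range(lead - 1))
    half_width = z * sigma * math.sqrt(var_factor)
    return xhat - half_width, xhat + half_width

# e.g. lead 3 with psi = [1.4, 1.8, 2.1] and sigma = 1.6 (illustrative numbers):
print(prediction_limits(10.0, [1.4, 1.8, 2.1], sigma=1.6, lead=3))
```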

56 The inverted form (again): $\pi(B)x_t = \tau + u_t$, or $x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \tau + u_t$, where $\pi(B) = [\alpha(B)]^{-1}\phi(B) = [\alpha(B)]^{-1}[\beta(B)\nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$.

57 The inverted form of the forecast
Note: $x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + \tau + u_t$, and for $t = T + l$: $x_{T+l} = \pi_1 x_{T+l-1} + \pi_2 x_{T+l-2} + \cdots + \tau + u_{T+l}$. Taking conditional expectations given $P_T$: $\hat{x}_T(l) = \pi_1\hat{x}_T(l-1) + \pi_2\hat{x}_T(l-2) + \cdots + \tau$, where $\hat{x}_T(j) = x_{T+j}$ for $j \le 0$.

58 The difference equation form of the forecast
$x_{T+l} = \phi_1 x_{T+l-1} + \phi_2 x_{T+l-2} + \cdots + \phi_{p+d}x_{T+l-p-d} + \delta + u_{T+l} + \alpha_1 u_{T+l-1} + \alpha_2 u_{T+l-2} + \cdots + \alpha_q u_{T+l-q}$. Taking conditional expectations given $P_T$: $\hat{x}_T(l) = \phi_1\hat{x}_T(l-1) + \cdots + \phi_{p+d}\hat{x}_T(l-p-d) + \delta + \hat{u}_T(l) + \alpha_1\hat{u}_T(l-1) + \cdots + \alpha_q\hat{u}_T(l-q)$, where $\hat{u}_T(j) = 0$ for $j > 0$ and $\hat{u}_T(j) = u_{T+j}$ for $j \le 0$.
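A sketch of this recursion in code (the function and its inputs are illustrative, not from the slides): future $x$'s are replaced by their forecasts and future $u$'s by zero as the lead increases:

```python
def forecast(x_hist, u_hist, phi, alpha, delta, L):
    """Return forecasts xhat_T(1..L) for
       x_t = sum_i phi_i x_{t-i} + delta + u_t + sum_j alpha_j u_{t-j}."""
    xs = list(x_hist)                  # known x's; forecasts appended as we go
    T = len(x_hist)
    for l in range(1, L + 1):
        ar = sum(p * xs[T + l - 1 - i] for i, p in enumerate(phi, start=1))
        # u_{T+l-j} is known only when it lies in the past, i.e. j >= l; else 0
        ma = sum(a * u_hist[T + l - 1 - j]
                 for j, a in enumerate(alpha, start=1) if l - j <= 0)
        xs.append(ar + delta + ma)
    return xs[T:]

# Illustrative ARIMA(1,1,2) in difference-equation form: phi = (1 + b1, -b1), b1 = 0.8
print(forecast([10.0, 10.5, 11.2], [0.3, -0.1, 0.2],
               phi=[1.8, -0.8], alpha=[0.6, 0.4], delta=0.0, L=3))
```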

59 Example. The model: $x_t - x_{t-1} = \beta_1(x_{t-1} - x_{t-2}) + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$, or $x_t = (1 + \beta_1)x_{t-1} - \beta_1 x_{t-2} + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$; that is, $\phi(B)x_t = \beta(B)(I - B)x_t = \alpha(B)u_t$, where $\phi(x) = 1 - (1 + \beta_1)x + \beta_1 x^2 = (1 - \beta_1 x)(1 - x)$ and $\alpha(x) = 1 + \alpha_1 x + \alpha_2 x^2$. This is an ARIMA(1,1,2) model.

60 The random shock form of the model: $x_t = \psi(B)u_t$, where $\psi(B) = [\beta(B)(I - B)]^{-1}\alpha(B) = [\phi(B)]^{-1}\alpha(B)$, i.e. $\psi(B)\phi(B) = \alpha(B)$. Thus $(I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \psi_4 B^4 + \cdots)(I - (1 + \beta_1)B + \beta_1 B^2) = I + \alpha_1 B + \alpha_2 B^2$. Hence
$\alpha_1 = \psi_1 - (1 + \beta_1)$, or $\psi_1 = 1 + \alpha_1 + \beta_1$;
$\alpha_2 = \psi_2 - \psi_1(1 + \beta_1) + \beta_1$, or $\psi_2 = \psi_1(1 + \beta_1) - \beta_1 + \alpha_2$;
$0 = \psi_h - \psi_{h-1}(1 + \beta_1) + \psi_{h-2}\beta_1$, or $\psi_h = \psi_{h-1}(1 + \beta_1) - \psi_{h-2}\beta_1$ for $h \ge 3$.

61 The inverted form of the model: $\pi(B)x_t = u_t$, where $\pi(B) = [\alpha(B)]^{-1}\beta(B)(I - B) = [\alpha(B)]^{-1}\phi(B)$, i.e. $\pi(B)\alpha(B) = \phi(B)$. Thus $(I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \pi_4 B^4 - \cdots)(I + \alpha_1 B + \alpha_2 B^2) = I - (1 + \beta_1)B + \beta_1 B^2$. Hence
$-(1 + \beta_1) = \alpha_1 - \pi_1$, or $\pi_1 = 1 + \alpha_1 + \beta_1$;
$\beta_1 = -\pi_2 - \pi_1\alpha_1 + \alpha_2$, or $\pi_2 = -\pi_1\alpha_1 - \beta_1 + \alpha_2$;
$0 = -\pi_h - \pi_{h-1}\alpha_1 - \pi_{h-2}\alpha_2$, or $\pi_h = -(\pi_{h-1}\alpha_1 + \pi_{h-2}\alpha_2)$ for $h \ge 3$.

62 Now suppose that $\beta_1 = 0.80$, $\alpha_1 = 0.60$ and $\alpha_2 = 0.40$; then the random shock form coefficients $\psi_h$ and the inverted form coefficients $\pi_h$ can easily be computed and are tabled below.
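Since the table itself does not survive in this transcript, the following sketch recomputes the $\psi_h$ and $\pi_h$ coefficients from the recursions on the two preceding slides (first ten values):

```python
b1, a1, a2 = 0.80, 0.60, 0.40

psi = [1 + a1 + b1]                       # psi_1 = 1 + alpha1 + beta1
psi.append(psi[0] * (1 + b1) - b1 + a2)   # psi_2
pi = [1 + a1 + b1]                        # pi_1 = 1 + alpha1 + beta1
pi.append(-pi[0] * a1 - b1 + a2)          # pi_2
for h in range(2, 10):
    psi.append(psi[h - 1] * (1 + b1) - psi[h - 2] * b1)   # psi_h, h >= 3
    pi.append(-(pi[h - 1] * a1 + pi[h - 2] * a2))         # pi_h,  h >= 3

print("h    psi_h     pi_h")
for h in range(10):
    print(f"{h+1:<3} {psi[h]:8.4f} {pi[h]:8.4f}")
```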

63 The Forecast Equations

64 The Difference Form of the Forecast Equation

65 Computation of the Random Shock Series and One-Step Forecasts

66 Computation of the Mean Square Error of the Forecasts and Prediction Limits

67 Table: MSE of Forecasts to lead time l = 12 (s2 = 2.56)

68 Raw Observations, One-Step-Ahead Forecasts, Estimated Error, Error

69 Forecasts with 95% and 66.7% Prediction Limits

70 Graph: Forecasts with 95% and 66.7% Prediction Limits

