The Spectral Representation of Stationary Time Series

1 The Spectral Representation of Stationary Time Series

2 Stationary time series satisfy the properties:
- Constant mean: E(x_t) = μ
- Constant variance: Var(x_t) = σ²
- The correlation between two observations, x_t and x_{t+h}, depends only on the lag h.
These properties underlie the periodic nature of a stationary time series.

3 Recall that the harmonic series sketched below is a stationary time series, where λ1, λ2, …, λk are k values in (0, π) and X1, X2, …, Xk and Y1, Y2, …, Yk are independent random variables with mean 0 and Var(X_i) = Var(Y_i) = σi².
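A sketch of the series referred to here, in LaTeX notation, assuming the standard harmonic construction with E(X_i) = E(Y_i) = 0 and Var(X_i) = Var(Y_i) = σi²:

    x_t = \sum_{i=1}^{k} \left[ X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t) \right]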

4 With this time series E(x_t) = 0, and its variance and autocovariance are as sketched below.
We can give it a non-zero mean, μ, by adding μ to the right-hand side of the defining equation.
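A sketch of the corresponding moments, in LaTeX notation (assuming the harmonic construction above):

    E(x_t) = 0, \qquad Var(x_t) = \sum_{i=1}^{k} \sigma_i^2, \qquad \sigma(h) = Cov(x_t, x_{t+h}) = \sum_{i=1}^{k} \sigma_i^2 \cos(\lambda_i h)

With the added mean the series becomes x_t = μ + Σ_i [X_i cos(λ_i t) + Y_i sin(λ_i t)], so E(x_t) = μ while the variance and autocovariance are unchanged.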

5 We now extend this example to a wider class of time series, which turns out to be the complete set of weakly stationary time series. In this case the collection of frequencies may even vary over a continuous range of frequencies, [0, π].

6 The Riemann integral and the Riemann-Stieltjes integral. If F is continuous with derivative f, the Riemann-Stieltjes integral reduces to a Riemann integral; if F is a step function with jumps p_i at x_i, it reduces to a sum (see the sketch below).
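A sketch of the two standard cases, in LaTeX notation:

    \int_a^b g(x)\,dF(x) = \int_a^b g(x) f(x)\,dx        \quad (F \text{ continuous with } F' = f)
    \int_a^b g(x)\,dF(x) = \sum_i g(x_i)\, p_i            \quad (F \text{ a step function with jumps } p_i \text{ at } x_i)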

7 First, we are going to develop the concept of integration with respect to a stochastic process.
Let {U(λ): λ ∈ [0, π]} denote a stochastic process with mean 0 and independent increments; that is, E{[U(λ2) - U(λ1)][U(λ4) - U(λ3)]} = 0 for 0 ≤ λ1 < λ2 ≤ λ3 < λ4 ≤ π, and E[U(λ)] = 0 for 0 ≤ λ ≤ π.

8 In addition let G(λ) = E[U²(λ)] for 0 ≤ λ ≤ π and assume G(0) = 0. It is easy to show that G(λ) is monotonically non-decreasing, i.e. G(λ1) ≤ G(λ2) for λ1 < λ2.

9 Now let us define the stochastic integral ∫₀^π g(λ) dU(λ), analogous to the Riemann-Stieltjes integral.

10 Let 0 = λ0 < λ1 < λ2 < ... < λn = π be any partition of the interval [0, π], and let λ̄_i denote any value in the interval [λ_{i-1}, λ_i]. Consider the approximating sum sketched below. Suppose that, as the partition is refined, there exists a random variable V such that the sum converges to V in mean square.
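A sketch of the approximating sum and its mean-square limit, in LaTeX notation (assuming the standard construction):

    S_n = \sum_{i=1}^{n} g(\bar{\lambda}_i)\,[\,U(\lambda_i) - U(\lambda_{i-1})\,],
    \qquad E\big[(S_n - V)^2\big] \to 0 \ \text{as}\ \max_i (\lambda_i - \lambda_{i-1}) \to 0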

11 Then V is denoted by V = ∫₀^π g(λ) dU(λ).

12 Properties:
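The usual properties of this stochastic integral, stated as a sketch in LaTeX notation under the assumptions on U(λ) and G(λ) above:

    E\left[\int_0^{\pi} g(\lambda)\,dU(\lambda)\right] = 0
    E\left[\left(\int_0^{\pi} g(\lambda)\,dU(\lambda)\right)^2\right] = \int_0^{\pi} g^2(\lambda)\,dG(\lambda)
    E\left[\int_0^{\pi} g_1(\lambda)\,dU(\lambda)\int_0^{\pi} g_2(\lambda)\,dU(\lambda)\right] = \int_0^{\pi} g_1(\lambda)\,g_2(\lambda)\,dG(\lambda)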

13 The Spectral Representation of Stationary Time Series

14 Let {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} denote two mutually uncorrelated stochastic processes, each with mean 0 and independent increments. Also let F(λ) = E[X²(λ)] = E[Y²(λ)] for 0 ≤ λ ≤ π, with F(0) = 0. Now define the time series {x_t : t ∈ T} as follows (see the sketch below):
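In LaTeX notation, the standard definition this refers to (a sketch, built from the stochastic integrals above):

    x_t = \int_0^{\pi} \cos(\lambda t)\, dX(\lambda) + \int_0^{\pi} \sin(\lambda t)\, dY(\lambda)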

15 Then E[x_t] = 0.

16 Also Var(x_t) = ∫₀^π [cos²(λt) + sin²(λt)] dF(λ) = F(π).

17

18

19 Thus the time series {x_t : t ∈ T} defined above
is a stationary time series with autocovariance function σ(h) = ∫₀^π cos(λh) dF(λ). F(λ) is called the spectral distribution function. If f(λ) = F′(λ) exists, then f(λ) is called the spectral density function.

20 Note: The spectral distribution function, F(λ), and the spectral density function, f(λ), describe how the variance of x_t is distributed over the frequencies in the interval [0, π].

21 The autocovariance function, σ(h), can be computed from the spectral density function, f(λ), and conversely the spectral density function, f(λ), can be computed from the autocovariance function, σ(h), as sketched below.
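A sketch of the two relations, in LaTeX notation (using the one-sided convention on [0, π]):

    \sigma(h) = \int_0^{\pi} \cos(\lambda h)\, f(\lambda)\,d\lambda,
    \qquad
    f(\lambda) = \frac{1}{\pi}\left[\sigma(0) + 2\sum_{h=1}^{\infty}\sigma(h)\cos(\lambda h)\right]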

22 Example: Suppose X1, X2, …, Xk and Y1, Y2, …, Yk are independent random variables with mean 0 and Var(X_i) = Var(Y_i) = σi². Let λ1, λ2, …, λk denote k values in (0, π). Then x_t = Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)], as in the earlier example.

23 If we define {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} as the partial sums sketched below, this example fits the general representation. Note: X(λ) and Y(λ) are "random" step functions and F(λ) is a step function.
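A sketch of the step-function definitions, in LaTeX notation (assuming Var(X_i) = Var(Y_i) = σi² as above):

    X(\lambda) = \sum_{i:\,\lambda_i \le \lambda} X_i, \qquad
    Y(\lambda) = \sum_{i:\,\lambda_i \le \lambda} Y_i, \qquad
    F(\lambda) = \sum_{i:\,\lambda_i \le \lambda} \sigma_i^2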

24

25 Another important comment:
In the case when F(λ) is continuous with derivative f(λ) = F′(λ), the autocovariance becomes an ordinary integral, σ(h) = ∫₀^π cos(λh) f(λ) dλ.

26 Sometimes the spectral density function, f(λ), is extended to the interval [-π, π] and is assumed symmetric about 0 (i.e. f_s(λ) = f_s(-λ) = f(λ)/2). In this case it can be shown that the relations take the symmetric form sketched below.
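A sketch of the symmetric-density relations, in LaTeX notation:

    \sigma(h) = \int_{-\pi}^{\pi} e^{i\lambda h} f_s(\lambda)\,d\lambda
              = \int_{-\pi}^{\pi} \cos(\lambda h)\, f_s(\lambda)\,d\lambda,
    \qquad
    f_s(\lambda) = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty}\sigma(h)\, e^{-i\lambda h}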

27 From now on we will use the symmetric spectral density function and denote it by f(λ).
Hence the relations above hold with this f(λ) on [-π, π].

28 Example: Let {u_t : t ∈ T} be identically distributed and uncorrelated with mean zero (a white noise series). Thus σ(0) = σ² and σ(h) = 0 for h ≠ 0, and the spectral density is constant (see below).
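In LaTeX notation, with the symmetric convention, the white-noise spectral density is:

    f(\lambda) = \frac{1}{2\pi}\sum_{h}\sigma(h)\, e^{-i\lambda h} = \frac{\sigma^2}{2\pi}, \qquad -\pi \le \lambda \le \pi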

29 Graph: the spectral density of a white noise series is flat, equal to σ²/(2π) over [-π, π].

30 Linear Filters

31 Let {x_t : t ∈ T} be any time series and suppose that the time series {y_t : t ∈ T} is constructed as follows: y_t = Σ_s a_s x_{t-s}. The time series {y_t : t ∈ T} is said to be constructed from {x_t : t ∈ T} by means of a linear filter, with input x_t and output y_t.

32 Let σ_x(h) denote the autocovariance function of {x_t : t ∈ T} and σ_y(h) the autocovariance function of {y_t : t ∈ T}. Assume also that E[x_t] = E[y_t] = 0. Then:

33

34

35 Hence f_y(λ) = |A(e^{-iλ})|² f_x(λ), where A(e^{-iλ}) = Σ_s a_s e^{-iλs} is the transfer function of the linear filter.

36 Note: |A(e^{-iλ})|² = A(e^{-iλ}) A(e^{iλ}); hence the filter multiplies the input spectral density by this non-negative factor at each frequency λ.

37 Spectral density function of a Moving Average time series of order q, MA(q)
Let α0 = 1, α1, α2, …, αq denote q + 1 numbers. Let {u_t | t ∈ T} denote a white noise time series with variance σ². Let {x_t | t ∈ T} denote an MA(q) time series with μ = 0, i.e. x_t = u_t + α1 u_{t-1} + … + αq u_{t-q}. Note: {x_t | t ∈ T} is obtained from {u_t | t ∈ T} by a linear filter.

38 Now f_u(λ) = σ²/(2π). Hence f_x(λ) = (σ²/(2π)) |1 + α1 e^{-iλ} + … + αq e^{-iλq}|² (see the sketch below).
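A minimal numerical sketch of this formula in Python with numpy; the function name and the coefficient values below are mine, chosen only for illustration:

    import numpy as np

    def ma_spectral_density(alphas, sigma2, lams):
        """Spectral density of an MA(q) series x_t = sum_j alpha_j u_{t-j}, alpha_0 = 1:
        f(lam) = (sigma2 / (2*pi)) * |sum_j alpha_j e^{-i*lam*j}|^2."""
        alphas = np.asarray(alphas, dtype=float)        # alpha_0, alpha_1, ..., alpha_q
        j = np.arange(len(alphas))                      # powers of e^{-i*lam}
        A = np.exp(-1j * np.outer(lams, j)) @ alphas    # transfer function at each frequency
        return (sigma2 / (2 * np.pi)) * np.abs(A) ** 2

    lams = np.linspace(0, np.pi, 5)
    # hypothetical MA(2) coefficients: alpha_0 = 1, alpha_1 = 0.6, alpha_2 = -0.3
    print(ma_spectral_density([1.0, 0.6, -0.3], sigma2=1.0, lams=lams))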

39 Example: q = 1. f(λ) = (σ²/(2π)) (1 + α1² + 2α1 cos λ).

40 Example: q = 2. f(λ) = (σ²/(2π)) (1 + α1² + α2² + 2α1(1 + α2) cos λ + 2α2 cos 2λ).

41 Spectral density function for MA(1) Series

42 Spectral density function of an Autoregressive time series of order p, AR(p)
Let β1, β2, …, βp denote p numbers. Let {u_t | t ∈ T} denote a white noise time series with variance σ². Let {x_t | t ∈ T} denote an AR(p) time series with δ = 0, i.e. x_t = β1 x_{t-1} + … + βp x_{t-p} + u_t. Note: {u_t | t ∈ T} is obtained from {x_t | t ∈ T} by a linear filter.

43 Now f_u(λ) = σ²/(2π) = |1 - β1 e^{-iλ} - … - βp e^{-iλp}|² f_x(λ). Hence f_x(λ) = (σ²/(2π)) / |1 - β1 e^{-iλ} - … - βp e^{-iλp}|² (see the sketch below).
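A corresponding sketch for the AR(p) case, again in Python with numpy; the function name and coefficients are illustrative only:

    import numpy as np

    def ar_spectral_density(betas, sigma2, lams):
        """Spectral density of an AR(p) series beta(B) x_t = u_t, where
        beta(B) = 1 - beta_1 B - ... - beta_p B^p:
        f(lam) = (sigma2 / (2*pi)) / |1 - sum_j beta_j e^{-i*lam*j}|^2."""
        betas = np.asarray(betas, dtype=float)              # beta_1, ..., beta_p
        j = np.arange(1, len(betas) + 1)
        B = 1.0 - np.exp(-1j * np.outer(lams, j)) @ betas   # beta(e^{-i*lam})
        return (sigma2 / (2 * np.pi)) / np.abs(B) ** 2

    lams = np.linspace(0, np.pi, 5)
    # hypothetical AR(2) coefficients beta_1 = 1.2, beta_2 = -0.5 (inside the stationary region)
    print(ar_spectral_density([1.2, -0.5], sigma2=1.0, lams=lams))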

44 Example: p = 1. f(λ) = (σ²/(2π)) / (1 + β1² - 2β1 cos λ).

45 Example: p = 2. f(λ) = (σ²/(2π)) / (1 + β1² + β2² - 2β1(1 - β2) cos λ - 2β2 cos 2λ).

46 Example: Sunspot Numbers (1770-1869)

47

48 Autocorrelation function and partial autocorrelation function

49 Spectral density Estimate

50 Assuming an AR(2) model
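A sketch of what "assuming an AR(2) model" amounts to computationally, in Python with numpy: fit β1, β2 and σ² by the Yule-Walker equations, then plug them into the AR(2) spectral density. The function names are mine, and the series below is a simulated stand-in (the actual sunspot numbers are not reproduced here):

    import numpy as np

    def yule_walker_ar2(x):
        """Estimate (beta_1, beta_2, sigma2) of an AR(2) model from data x
        using Yule-Walker equations built from sample autocovariances."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        c = np.array([np.sum(x[:n - h] * x[h:]) / n for h in range(3)])  # c(0), c(1), c(2)
        R = np.array([[c[0], c[1]],
                      [c[1], c[0]]])
        b = np.linalg.solve(R, c[1:])          # (beta_1, beta_2)
        sigma2 = c[0] - b @ c[1:]              # innovation variance
        return b, sigma2

    def ar2_spectral_density(b, sigma2, lams):
        B = 1.0 - b[0] * np.exp(-1j * lams) - b[1] * np.exp(-2j * lams)
        return (sigma2 / (2 * np.pi)) / np.abs(B) ** 2

    # stand-in data: a simulated AR(2) series with a cycle, roughly mimicking sunspots
    rng = np.random.default_rng(0)
    x = np.zeros(300)
    for t in range(2, 300):
        x[t] = 1.4 * x[t - 1] - 0.7 * x[t - 2] + rng.normal()
    b, s2 = yule_walker_ar2(x)
    print(b, s2, ar2_spectral_density(b, s2, np.linspace(0, np.pi, 5)))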

51 A linear discrete time series
Moving Average time series of infinite order

52 Let θ0 = 1, θ1, θ2, … denote an infinite sequence of numbers.
Let {u_t | t ∈ T} denote a white noise time series: independent, mean 0, variance σ². Let {x_t | t ∈ T} be defined by the equation x_t = u_t + θ1 u_{t-1} + θ2 u_{t-2} + … . Then {x_t | t ∈ T} is called a linear discrete time series. Comment: A linear discrete time series is a Moving Average time series of infinite order.

53 The AR(1) Time series. Let {x_t | t ∈ T} be defined by the equation x_t = β1 x_{t-1} + δ + u_t. Then x_t = μ + u_t + β1 u_{t-1} + β1² u_{t-2} + …

where μ = δ/(1 - β1) and |β1| < 1. An alternative approach uses the back shift operator, B, defined by B x_t = x_{t-1}. The equation of the AR(1) series can then be written (1 - β1B) x_t = δ + u_t.

55 Now since (1 - β1B)^{-1} = 1 + β1B + β1²B² + …, the equation (1 - β1B) x_t = δ + u_t has the equivalent form x_t = (1 - β1B)^{-1}(δ + u_t) = μ + u_t + β1 u_{t-1} + β1² u_{t-2} + …

56 For the general AR(p) time series: β(B) x_t = δ + u_t,
where β(B) = 1 - β1B - β2B² - … - βpB^p. The time series {x_t | t ∈ T} can be written as a linear discrete time series, x_t = μ + [β(B)]^{-1} u_t, and [β(B)]^{-1} can be found by carrying out the multiplication β(B)θ(B) = 1 and matching coefficients of powers of B.

57 Thus the AR(p) time series β(B) x_t = δ + u_t
can be written x_t = μ + u_t + θ1 u_{t-1} + θ2 u_{t-2} + …, where θ(B) = [β(B)]^{-1}. Hence the θ_j can be generated recursively, as sketched below. This is called the Random Shock form of the series.
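A sketch of "carrying out the multiplication" numerically, in Python with numpy: since β(B)θ(B) = 1, the weights satisfy θ_j = β1 θ_{j-1} + … + βp θ_{j-p} with θ0 = 1. The function name and coefficients are illustrative:

    import numpy as np

    def random_shock_weights(betas, n_terms):
        """First n_terms coefficients theta_0, theta_1, ... of theta(B) = 1 / beta(B),
        where beta(B) = 1 - beta_1 B - ... - beta_p B^p (the random shock form)."""
        betas = np.asarray(betas, dtype=float)
        theta = np.zeros(n_terms)
        theta[0] = 1.0
        for j in range(1, n_terms):
            k = np.arange(1, min(j, len(betas)) + 1)
            theta[j] = np.sum(betas[k - 1] * theta[j - k])
        return theta

    # illustrative AR(2): x_t = 1.2 x_{t-1} - 0.5 x_{t-2} + u_t
    print(random_shock_weights([1.2, -0.5], n_terms=8))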

59 The Random Shock form of an
ARMA(p,q) time series: An ARMA(p,q) time series {x_t | t ∈ T} satisfies the equation β(B) x_t = δ + α(B) u_t, where β(B) = 1 - β1B - … - βpB^p and α(B) = 1 + α1B + … + αqB^q.

60 Again the time series {x_t | t ∈ T} can be written as a linear discrete time series,
namely x_t = μ + θ(B) u_t, where θ(B) = [β(B)]^{-1}[α(B)] can be found by carrying out the multiplication and matching coefficients of powers of B.

61 Thus an ARMA(p,q) time series can be written:
x_t = μ + u_t + θ1 u_{t-1} + θ2 u_{t-2} + …, where the θ_j are the coefficients of θ(B) = [β(B)]^{-1}[α(B)] (see the sketch below).
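The same multiplication for the ARMA(p,q) case, sketched in Python: matching coefficients in β(B)θ(B) = α(B) gives θ_j = α_j + β1 θ_{j-1} + … + βp θ_{j-p}, with α_j = 0 for j > q. Names and coefficients are illustrative:

    import numpy as np

    def arma_random_shock_weights(betas, alphas, n_terms):
        """Coefficients theta_0, theta_1, ... of theta(B) = alpha(B) / beta(B) for an
        ARMA(p, q) series beta(B) x_t = alpha(B) u_t, with alpha_0 = theta_0 = 1."""
        betas = np.asarray(betas, dtype=float)       # beta_1 ... beta_p
        alphas = np.asarray(alphas, dtype=float)     # alpha_0 = 1, alpha_1 ... alpha_q
        theta = np.zeros(n_terms)
        for j in range(n_terms):
            a_j = alphas[j] if j < len(alphas) else 0.0
            k = np.arange(1, min(j, len(betas)) + 1)
            theta[j] = a_j + np.sum(betas[k - 1] * theta[j - k])
        return theta

    # illustrative ARMA(1,1): (1 - 0.8 B) x_t = (1 + 0.4 B) u_t
    print(arma_random_shock_weights([0.8], [1.0, 0.4], n_terms=8))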

62 The inverted form of a stationary time series
Autoregressive time series of infinite order

63 An ARMA(p,q) time series {x_t | t ∈ T} satisfies the equation:
β(B) x_t = δ + α(B) u_t, where β(B) = 1 - β1B - … - βpB^p and α(B) = 1 + α1B + … + αqB^q. Suppose that [α(B)]^{-1} exists as a convergent power series in B. This will be true if the roots of the polynomial α(x) = 0 all exceed 1 in absolute value. The time series {x_t | t ∈ T} in this case is called invertible.

64 Then [α(B)]^{-1} β(B) x_t = [α(B)]^{-1} δ + u_t, or x_t = δ* + π1 x_{t-1} + π2 x_{t-2} + … + u_t, where [α(B)]^{-1} β(B) = 1 - π1B - π2B² - …

65 Thus an ARMA(p,q) time series can be written:
x_t = δ* + π1 x_{t-1} + π2 x_{t-2} + … + u_t, where the π_j are the coefficients of [α(B)]^{-1} β(B) (see the sketch below). This is called the inverted form of the time series. It expresses the time series as an autoregressive time series of infinite order.
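A sketch of the inverted-form coefficients in Python, under the convention [α(B)]^{-1} β(B) = 1 - π1B - π2B² - …, so that x_t = π1 x_{t-1} + π2 x_{t-2} + … + u_t (ignoring the constant). Function name and coefficients are illustrative:

    import numpy as np

    def inverted_form_weights(betas, alphas, n_terms):
        """Coefficients c_0, c_1, ... of beta(B) / alpha(B); the AR(infinity) weights of
        the inverted form are pi_j = -c_j for j >= 1 (c_0 = 1)."""
        betas = np.asarray(betas, dtype=float)       # beta_1 ... beta_p
        alphas = np.asarray(alphas, dtype=float)     # alpha_0 = 1, alpha_1 ... alpha_q
        c = np.zeros(n_terms)
        for j in range(n_terms):
            b_j = 1.0 if j == 0 else (-betas[j - 1] if j <= len(betas) else 0.0)
            m = np.arange(1, min(j, len(alphas) - 1) + 1)
            c[j] = b_j - np.sum(alphas[m] * c[j - m])
        return c

    # illustrative ARMA(1,1): (1 - 0.8 B) x_t = (1 + 0.4 B) u_t
    c = inverted_form_weights([0.8], [1.0, 0.4], n_terms=8)
    print(-c[1:])   # pi_1, pi_2, ... (the autoregressive weights of the inverted form)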

