Presentation on theme: "Stochastic models - time series" — Presentation transcript:

1 Stochastic models - time series. Random process: an infinite collection of random variables for which consistent distributions/probabilities exist. Random function: a family of random variables, e.g. {Y(t), t in Z}.

2 Specified if given
F(y_1,...,y_n; t_1,...,t_n) = Prob{Y(t_1) ≤ y_1,...,Y(t_n) ≤ y_n}
that are
symmetric: F(πy; πt) = F(y; t), π a permutation
compatible: F(y_1,...,y_m, ∞,...,∞; t_1,...,t_m, t_{m+1},...,t_n) = F(y_1,...,y_m; t_1,...,t_m)

3 Finite-dimensional distributions. First-order: F(y; t) = Prob{Y(t) ≤ y}. Second-order: F(y_1, y_2; t_1, t_2) = Prob{Y(t_1) ≤ y_1 and Y(t_2) ≤ y_2}, and so on.

4 Other methods: i) Y(t; ω), ω a random variable; ii) urn model; iii) probability on function space; iv) analytic formula, e.g. Y(t) = α cos(λt + φ), with α, λ fixed and φ uniform on (−π, π].
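
As a small illustration of method (iv), here is a minimal simulation sketch in Python (numpy assumed available); the symbols α, λ, φ and the particular values are illustrative reconstructions, not taken from the original slide.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_process(t, alpha=1.0, lam=0.3):
    """One realization of Y(t) = alpha*cos(lam*t + phi), phi ~ Uniform(-pi, pi]."""
    phi = rng.uniform(-np.pi, np.pi)   # the only randomness is the phase
    return alpha * np.cos(lam * t + phi)

t = np.arange(200)
y = cosine_process(t)                  # one random function Y(.)
```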

5 There may be densities. The Y(t) may be discrete, angles, proportions, ... Kolmogorov extension theorem: to specify a stochastic process, give the distribution of any finite subset {Y(τ_1),...,Y(τ_n)} in a consistent way, τ in A.

6 Moment functions. Mean function: c_Y(t) = E{Y(t)} = ∫ y dF(y; t)
= ∫ y f(y; t) dy if continuous
= Σ_j y_j f(y_j; t) if discrete
E{α_1 Y_1(t) + α_2 Y_2(t)} = α_1 c_1(t) + α_2 c_2(t); vector-valued case
mean level - signal plus noise: S(t) + ε(t), S(.): fixed
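
A hedged sketch of estimating the mean function c_Y(t) in the signal-plus-noise setting by averaging independent replicates; the signal S(t) and the noise level are made-up choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(100)
S = np.sin(2 * np.pi * t / 50)                    # fixed signal S(t) (illustrative)

# 500 independent replicates of Y(t) = S(t) + eps(t), eps ~ N(0, 0.5^2)
Y = S + rng.normal(scale=0.5, size=(500, t.size))

c_hat = Y.mean(axis=0)                            # pointwise sample mean estimates c_Y(t) = S(t)
print(np.abs(c_hat - S).max())                    # small: the estimate tracks the signal
```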

7 Second moments. Autocovariance function: c_YY(s, t) = cov{Y(s), Y(t)} = E{Y(s)Y(t)} - E{Y(s)}E{Y(t)}
non-negative definite: Σ_j Σ_k α_j α_k c_YY(t_j, t_k) ≥ 0, α_j scalars
Cross-covariance function: c_12(s, t) = cov{Y_1(s), Y_2(t)}
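
A quick numerical check of non-negative definiteness, on made-up data rather than anything from the slides: the matrix of sample covariances [c_YY(t_j, t_k)] should have no negative eigenvalues, which is equivalent to Σ_j Σ_k α_j α_k c_YY(t_j, t_k) ≥ 0 for all scalars α.

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.normal(size=(5000, 8))        # 5000 replicates observed at times t_1..t_8 (illustrative)
C = np.cov(Y, rowvar=False)           # sample version of the matrix [c_YY(t_j, t_k)]

# alpha' C alpha >= 0 for all alpha  <=>  all eigenvalues of C are >= 0 (up to rounding)
print(np.linalg.eigvalsh(C).min() >= -1e-10)
```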

8 Stationarity. Joint distributions of {Y(t+u_1),...,Y(t+u_{k-1}), Y(t)} do not depend on t, for k = 1, 2,... Often reasonable in practice, for some time stretches. Replaces "identically distributed".

9 Mean: E{Y(t)} = c_Y for t in Z
Autocovariance function: cov{Y(t+u), Y(t)} = c_YY(u), t, u in Z, u: lag
= E{Y(t+u)Y(t)} if mean 0
Autocorrelation function: ρ(u) = corr{Y(t+u), Y(t)}, |ρ(u)| ≤ 1
Cross-covariance function: cov{X(t+u), Y(t)} = c_XY(u)
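
A plain-numpy sketch of the usual sample estimates of c_YY(u) and ρ(u) for a stationary series; the divide-by-n convention is one common choice, and the data are simulated purely for illustration.

```python
import numpy as np

def sample_acvf(y, max_lag):
    """c_hat(u) = (1/n) * sum_t (y_{t+u} - ybar)(y_t - ybar), u = 0..max_lag."""
    y = np.asarray(y, dtype=float)
    n, d = y.size, y - y.mean()
    return np.array([np.sum(d[u:] * d[:n - u]) / n for u in range(max_lag + 1)])

def sample_acf(y, max_lag):
    c = sample_acvf(y, max_lag)
    return c / c[0]                   # rho_hat(u) = c_hat(u) / c_hat(0), so |rho_hat(u)| <= 1

rng = np.random.default_rng(3)
print(sample_acf(rng.normal(size=2000), 5))   # near 0 at lags u >= 1 for white noise
```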

10 Joint density: Prob{x < Y(t+u) < x+dx and y < Y(t) < y+dy} = f(x, y | u) dx dy

11 Some useful models (Chatfield notation). Purely random process / white noise: a sequence of uncorrelated (often independent) random variables Z_t, often with mean 0. The building block for the models below.
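
For concreteness, a sketch generating a purely random (white noise) series Z_t to serve as the building block in the examples that follow; the Gaussian choice and the variance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma_Z = 500, 1.0
Z = rng.normal(0.0, sigma_Z, size=n)  # Z_t: independent, mean 0, variance sigma_Z^2
```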

12 Random walk: X_t = X_{t-1} + Z_t (starting at X_0 = 0, say); not stationary.
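
A small check of the non-stationarity, as a sketch under the X_0 = 0 convention above: across replicates, var{X_t} grows like t, so the distributions depend on t.

```python
import numpy as np

rng = np.random.default_rng(5)
Z = rng.normal(size=(2000, 300))      # 2000 replicates of the white-noise increments
X = np.cumsum(Z, axis=1)              # X_t = Z_1 + ... + Z_t, i.e. X_t = X_{t-1} + Z_t
print(X[:, [9, 99, 299]].var(axis=0)) # roughly 10, 100, 300: var{X_t} = t * sigma_Z^2
```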

13 (*)

14 Moving average, MA(q): X_t = β_0 Z_t + β_1 Z_{t-1} + ... + β_q Z_{t-q}. From (*), stationary.

15 MA(1) with β_0 = 1, β_1 = -0.7
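
A sketch simulating this MA(1) and comparing the sample lag-1 autocorrelation with the theoretical MA(1) value β_1 / (1 + β_1²) ≈ -0.47; the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(6)
beta0, beta1 = 1.0, -0.7
Z = rng.normal(size=100_001)
X = beta0 * Z[1:] + beta1 * Z[:-1]            # MA(1): X_t = Z_t - 0.7*Z_{t-1}

r1 = np.corrcoef(X[1:], X[:-1])[0, 1]         # sample lag-1 autocorrelation
print(r1, beta1 / (1 + beta1**2))             # both close to -0.47; the acf is 0 beyond lag 1
```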

16 Backward shift operator B: BX_t = X_{t-1}. Linear process: X_t = Σ_j ψ_j Z_{t-j}; need a convergence condition on the coefficients (e.g. Σ_j ψ_j² < ∞).

17 Autoregressive process, AR(p): X_t = α_1 X_{t-1} + ... + α_p X_{t-p} + Z_t. First-order, AR(1): X_t = α X_{t-1} + Z_t, Markov; a linear process in the Z's. For convergence/stationarity, |α| < 1. (*)
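
A sketch simulating an AR(1) with an illustrative α = 0.6 (so |α| < 1 and the process is stationary); the sample lag-1 correlation should come out near α.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha, n = 0.6, 100_000               # |alpha| < 1: convergence/stationarity condition
Z = rng.normal(size=n)
X = np.empty(n)
X[0] = Z[0]
for t in range(1, n):
    X[t] = alpha * X[t - 1] + Z[t]    # AR(1): X_t = alpha*X_{t-1} + Z_t (Markov)

print(np.corrcoef(X[1:], X[:-1])[0, 1])   # close to alpha
```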

18 a.c.f.: from (*). p.a.c.f.: corr{Y(t), Y(t-m) | Y(t-1),...,Y(t-m+1)}, linearly; = 0 for m > p when Y is AR(p).
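
A sketch (assuming statsmodels is available) illustrating the cut-off property: for a simulated AR(2) with illustrative coefficients 0.5 and 0.3, the sample p.a.c.f. is roughly 0 beyond lag 2 while the a.c.f. decays slowly.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

np.random.seed(8)
# AR(2): X_t = 0.5*X_{t-1} + 0.3*X_{t-2} + Z_t, written via its AR lag polynomial
ar = np.array([1.0, -0.5, -0.3])      # 1 - 0.5B - 0.3B^2
ma = np.array([1.0])
X = ArmaProcess(ar, ma).generate_sample(nsample=20_000)

print(acf(X, nlags=5))                # decays gradually
print(pacf(X, nlags=5))               # roughly 0 for lags m > 2, as expected for an AR(2)
```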

19 In the general case, ... Useful for prediction.

20 ARMA(p, q): X_t = α_1 X_{t-1} + ... + α_p X_{t-p} + Z_t + β_1 Z_{t-1} + ... + β_q Z_{t-q}
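
A hedged sketch (assuming statsmodels) of an ARMA(1,1), X_t = α X_{t-1} + Z_t + β Z_{t-1}, with illustrative α = 0.5 and β = 0.4; ArmaProcess takes the AR and MA lag polynomials.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(9)
ar = np.array([1.0, -0.5])            # AR polynomial 1 - 0.5B  (alpha = 0.5)
ma = np.array([1.0, 0.4])             # MA polynomial 1 + 0.4B  (beta = 0.4)
proc = ArmaProcess(ar, ma)
print(proc.isstationary, proc.isinvertible)   # True, True for these coefficients
X = proc.generate_sample(nsample=5_000)       # one simulated ARMA(1,1) path
```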

21 ARIMA(p, d, q): the d-th difference ∇^d X_t = (1 - B)^d X_t follows an ARMA(p, q) model.
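
A sketch (assuming statsmodels) that integrates an AR(1) once and then fits an ARIMA(1,1,0), so that differencing inside the model recovers the AR coefficient; the value 0.6 is an illustrative choice.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(10)
dX = ArmaProcess(np.array([1.0, -0.6]), np.array([1.0])).generate_sample(nsample=2_000)
X = np.cumsum(dX)                     # integrate once: X is ARIMA(1, 1, 0)

res = ARIMA(X, order=(1, 1, 0)).fit() # d = 1 differences X before fitting the ARMA part
print(res.params)                     # AR(1) coefficient estimated near 0.6
```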

22 Some series and acf’s

23 Yule-Walker equations for AR(p). Correlate each side of the AR(p) equation with X_{t-k}.
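
A sketch of the resulting estimation step: correlating with X_{t-k} for k = 1,...,p gives p linear (Yule-Walker) equations in the α's, which can be solved using the sample autocorrelations. The helper below and the AR(2) coefficients (0.5, 0.3) are illustrative, not from the slides.

```python
import numpy as np
from scipy.linalg import toeplitz

def yule_walker_fit(x, p):
    """Solve the Yule-Walker equations R a = r for the AR(p) coefficients a."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = x.size
    c = np.array([np.sum(x[k:] * x[:n - k]) / n for k in range(p + 1)])
    rho = c / c[0]                    # sample autocorrelations rho(0), ..., rho(p)
    return np.linalg.solve(toeplitz(rho[:p]), rho[1:p + 1])

rng = np.random.default_rng(11)
Z = rng.normal(size=50_000)
X = np.zeros(Z.size)
for t in range(2, Z.size):            # AR(2) with alpha = (0.5, 0.3)
    X[t] = 0.5 * X[t - 1] + 0.3 * X[t - 2] + Z[t]
print(yule_walker_fit(X, 2))          # estimates close to [0.5, 0.3]
```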

24 Cumulants: multilinear functionals; equal to 0 if some subset of the variates is independent of the rest; of order > 2 they are 0 for the normal; the normal is determined by its moments.
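
A quick numeric illustration (assuming scipy): scipy.stats.kstat returns unbiased sample cumulants up to order 4, which are near 0 at orders 3 and 4 for normal data but not for a skewed distribution; the sample sizes are arbitrary.

```python
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(12)
x = rng.normal(size=200_000)
print([kstat(x, n) for n in (1, 2, 3, 4)])   # roughly [0, 1, 0, 0]: cumulants of order > 2 vanish

y = rng.exponential(size=200_000)
print(kstat(y, 3), kstat(y, 4))              # clearly nonzero for a non-normal distribution
```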

