Presentation transcript: "Algebra U = ∑ a(t) Y(t), E{U} = c_Y ∑ a(t)"

1 Algebra
U = ∑ a(t) Y(t), V = ∑ b(t) Y(t)
E{U} = c_Y ∑ a(t), where c_Y = E{Y(t)}
cov{U,V} = ∑_s ∑_t a(s) b(t) c_YY(s−t)
U is Gaussian if {Y(t)} is Gaussian
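A minimal Monte Carlo sketch of the covariance identity above, assuming Gaussian white noise for {Y(t)} so that c_YY(u) = σ² for u = 0 and 0 otherwise, which collapses the double sum to σ² ∑ a(t) b(t). The weights a and b are arbitrary illustrative choices.

```python
import numpy as np

# Check cov{U,V} = sum_{s,t} a(s) b(t) c_YY(s-t) by simulation,
# using white noise where c_YY(u) = sigma^2 * 1{u=0}.
rng = np.random.default_rng(0)
sigma = 2.0
a = np.array([1.0, 0.5, -0.25])
b = np.array([0.5, 1.0, 2.0])

Y = rng.normal(0.0, sigma, size=(200_000, 3))  # replications x time points
U = Y @ a                                      # U = sum_t a(t) Y(t)
V = Y @ b                                      # V = sum_t b(t) Y(t)

empirical = np.cov(U, V)[0, 1]
theoretical = sigma**2 * np.sum(a * b)         # double sum collapses for white noise
```

With these weights the theoretical value is 4 × 0.5 = 2.0, and the sample covariance over 200,000 replications should land close to it.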

2 Some useful stochastic models
Purely random / white noise (i.i.d., often mean assumed 0)
c_YY(u) = cov{Y(t+u), Y(t)} = σ_Y² if u = 0, = 0 if u ≠ 0
ρ_YY(u) = 1 if u = 0, = 0 if u ≠ 0
A building block
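A quick numerical illustration of the white-noise a.c.f.: the sample autocorrelation is exactly 1 at lag 0 and near 0 at nonzero lags. The `sample_acf` helper is an illustrative name, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(0.0, 1.0, 100_000)   # i.i.d. noise, mean 0

def sample_acf(x, lag):
    """Sample autocorrelation rho_hat(lag)."""
    x = x - x.mean()
    n = len(x)
    return np.dot(x[:n - lag], x[lag:]) / np.dot(x, x)

rho0 = sample_acf(Z, 0)   # identically 1
rho1 = sample_acf(Z, 1)   # near 0 for white noise
```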

3 Random walk
Y(t) = Y(t−1) + Z(t), Y(0) = 0, so Y(t) = ∑_{i=1}^t Z(i)
E{Y(t)} = t μ_Z, var{Y(t)} = t σ_Z²
Not stationary, but ∆Y(t) = Y(t) − Y(t−1) = Z(t) is
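The growing variance is easy to see by simulation: a cumulative sum of i.i.d. noise across many replications has var{Y(T)} ≈ T σ_Z². A sketch, with arbitrary illustrative values of T and σ:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
T, reps = 50, 100_000
Z = rng.normal(0.0, sigma, size=(reps, T))
Y = np.cumsum(Z, axis=1)       # Y(t) = sum_{i<=t} Z(i); Y(0) = 0 implicitly

var_T = Y[:, -1].var()         # should be close to T * sigma^2 = 112.5
```

Differencing recovers the noise: `np.diff(Y, axis=1)` gives back `Z[:, 1:]`, the stationary increments.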

4 Moving average, MA(q)
Y(t) = β(0)Z(t) + β(1)Z(t−1) + … + β(q)Z(t−q)
If E{Z(t)} = 0, then E{Y(t)} = 0
c(u) = 0 for u > q
c(u) = σ_Z² ∑_{t=0}^{q−u} β(t) β(t+u), u = 0, 1, …, q; c(−u) = c(u)
Stationary
MA(1): ρ(u) = 1 for u = 0; = β(1)/(1 + β(1)²) for u = ±1; = 0 otherwise
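A simulation check of the MA(1) a.c.f., assuming β(0) = 1 as in the ρ(1) formula above; β = 0.6 is an arbitrary illustrative coefficient, giving ρ(1) = 0.6/1.36 ≈ 0.441 and ρ(2) = 0.

```python
import numpy as np

rng = np.random.default_rng(3)
beta = 0.6
Z = rng.normal(0.0, 1.0, 500_001)
Y = Z[1:] + beta * Z[:-1]          # MA(1): Y(t) = Z(t) + beta * Z(t-1)

def sample_acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:len(x) - lag], x[lag:]) / np.dot(x, x)

rho1 = sample_acf(Y, 1)   # theory: beta / (1 + beta^2) ~ 0.441
rho2 = sample_acf(Y, 2)   # theory: 0, since lag 2 > q = 1
```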

5 Backward shift operator
Remember the translation operator T^u Y(t) = Y(t+u)
B^j Y(t) = Y(t−j)
Linear process: Y(t) = μ + ∑_j ψ_j Z(t−j)
Need a convergence condition, e.g. ∑ |ψ_j| < ∞ or ∑ ψ_j² < ∞

6 Autoregressive process, AR(p)
Y(t) = α(1)Y(t−1) + … + α(p)Y(t−p) + Z(t)
First-order, AR(1): Y(t) = α Y(t−1) + Z(t) (**)
Markov
Invertible; a linear process, Y(t) = ∑_{j≥0} α^j Z(t−j)
Need |α| < 1 for convergence in probability / stationarity
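A sketch of a stationary AR(1) simulation with |α| < 1. Starting the recursion in the stationary distribution (variance σ_Z²/(1 − α²)) avoids a burn-in transient; α = 0.7 is an arbitrary illustrative value.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha = 0.7                  # |alpha| < 1: stationarity condition
n = 400_000
Z = rng.normal(0.0, 1.0, n)

Y = np.empty(n)
Y[0] = Z[0] / np.sqrt(1 - alpha**2)   # draw Y(0) from the stationary law
for t in range(1, n):
    Y[t] = alpha * Y[t - 1] + Z[t]    # Y(t) = alpha * Y(t-1) + Z(t)  (**)

var_Y = Y.var()              # theory: sigma_Z^2 / (1 - alpha^2) ~ 1.961
```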

7 a.c.f. of AR(1)
From the previous slide (**): ρ(u) = α^|u|
p.a.c.f., using normal or linear definitions:
corr{Y(t), Y(t−m) | Y(t−1), …, Y(t−m+1)} = 0 for m > p when Y is AR(p)
Proof: via multiple regression
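The multiple-regression view of the p.a.c.f. can be sketched directly: for an AR(1), regress Y(t) on Y(t−1) and Y(t−2); the coefficient on Y(t−2) (the p.a.c.f. at lag 2) should be near zero. All names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha = 0.7
n = 200_000
Z = rng.normal(0.0, 1.0, n)
Y = np.empty(n)
Y[0] = Z[0]
for t in range(1, n):
    Y[t] = alpha * Y[t - 1] + Z[t]

# Regress Y(t) on (Y(t-1), Y(t-2)); the last coefficient is the p.a.c.f. at lag 2.
X = np.column_stack([Y[1:-1], Y[:-2]])
coef, *_ = np.linalg.lstsq(X, Y[2:], rcond=None)
# coef[0] ~ alpha;  coef[1] ~ 0, since the p.a.c.f. vanishes beyond lag p = 1
```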

8 In the general case, similarly. Useful for prediction.

9 Yule-Walker equations for AR(p)
Sometimes used for estimation.
Correlate each side of the AR(p) equation with X_{t−k}, k > 0, giving
ρ(k) = α(1)ρ(k−1) + … + α(p)ρ(k−p)
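A sketch of Yule-Walker estimation for an AR(2): form the sample a.c.f., then solve the linear system ρ(k) = α(1)ρ(k−1) + α(2)ρ(k−2) for k = 1, 2. The true coefficients (0.5, −0.3) are arbitrary illustrative values satisfying the stationarity conditions.

```python
import numpy as np

rng = np.random.default_rng(6)
a1, a2 = 0.5, -0.3           # true AR(2) coefficients (stationary)
n = 400_000
Z = rng.normal(0.0, 1.0, n)
Y = np.zeros(n)
for t in range(2, n):
    Y[t] = a1 * Y[t - 1] + a2 * Y[t - 2] + Z[t]

def sample_acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:len(x) - lag], x[lag:]) / np.dot(x, x)

r = np.array([sample_acf(Y, k) for k in range(3)])   # rho(0), rho(1), rho(2)
# Yule-Walker system: [[rho(0), rho(1)], [rho(1), rho(0)]] @ alpha = [rho(1), rho(2)]
R = np.array([[r[0], r[1]],
              [r[1], r[0]]])
alpha_hat = np.linalg.solve(R, r[1:])   # estimates of (a1, a2)
```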

10 ARMA(p,q): φ(B)Y_t = θ(B)Z_t

11 ARIMA(p,d,q)
∇X_t = X_t − X_{t−1}, ∇²X_t = X_t − 2X_{t−1} + X_{t−2}
arima.mle() fits by MLE assuming Gaussian noise
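The differencing operators ∇ and ∇² correspond directly to `numpy.diff` with `n=1` and `n=2`. A quadratic trend, for instance, is removed entirely by second differencing:

```python
import numpy as np

X = np.array([1.0, 4.0, 9.0, 16.0, 25.0])   # X_t = t^2, a quadratic trend
d1 = np.diff(X)         # nabla X_t   = X_t - X_{t-1}        -> [3, 5, 7, 9]
d2 = np.diff(X, n=2)    # nabla^2 X_t = X_t - 2X_{t-1} + X_{t-2} -> [2, 2, 2]
```

After d = 2 differences the series is constant, so an ARIMA(p,2,q) model would treat the remaining (here trivial) series as ARMA(p,q).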

12 ARMAX: φ(B)Y_t = β(B)X_t + θ(B)Z_t; arima.mle(…, xreg, …)
State space: s_t = F_t(s_{t−1}, Z_t), Y_t = H_t(s_t, Z_t); F_t, H_t could include X

13 Next: i.i.d. → mixing stationary process
Mixing has a variety of definitions; e.g. in the normal case, ∑_u |c_YY(u)| < ∞. See e.g. Cryer and Chan (2008).
CLT: Ȳ = ∑_{t=1}^T Y(t)/T is approximately normal with
E{Ȳ} = c_Y
var{Ȳ} = T^{−2} ∑_{s=1}^T ∑_{t=1}^T c_YY(s−t) ≈ T^{−1} ∑_u c_YY(u) = σ_Y²/T if white noise
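The variance approximation var{Ȳ} ≈ T^{−1} ∑_u c_YY(u) can be checked by simulation for a simple mixing process, an MA(1) with β(0) = 1, β(1) = β: there ∑_u c_YY(u) = (1 + β²) + 2β = (1 + β)². The parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
beta, T, reps = 0.6, 200, 20_000
Z = rng.normal(0.0, 1.0, size=(reps, T + 1))
Y = Z[:, 1:] + beta * Z[:, :-1]     # MA(1): a mixing stationary process

ybar = Y.mean(axis=1)               # sample mean over each replication
empirical = ybar.var()
# sum_u c_YY(u) = c(0) + 2 c(1) = (1 + beta^2) + 2 beta = (1 + beta)^2
theoretical = (1 + beta)**2 / T     # ~ 0.0128
```

Note the white-noise formula σ_Y²/T would give (1 + β²)/T ≈ 0.0068 here; ignoring the autocovariance at lag 1 understates the variance of the mean by nearly half.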

14 Cumulants
cum(Y_1, Y_2, …, Y_k) extends mean, variance, covariance:
cum(Y) = E{Y}, cum(Y,Y) = var{Y}, cum(X,Y) = cov{X,Y}
DRB (1975)

15

16 Proof of ordinary CLT
S_T = Y(1) + … + Y(T)
cum_k(S_T) = T κ_k, by additivity and independence
cum_k(S_T/√T) = T^{−k/2} cum_k(S_T) = O(T^{1−k/2}) → 0 for k > 2 as T → ∞
Normal cumulants of order > 2 are 0; the normal is determined by its moments
Hence (S_T − Tμ)/√T tends in distribution to N(0, σ²)

