Understanding Nonstationarity and Recognizing it When You See it


1 Understanding Nonstationarity and Recognizing it When You See it
D. A. Dickey, North Carolina State University

2 Nonstationary Forecast

3 "Trend Stationary" Forecast
Nonstationary Forecast

4 Autoregressive Model AR(1)
Y_t - μ = ρ(Y_{t-1} - μ) + e_t
Y_t = μ(1 - ρ) + ρY_{t-1} + e_t
ΔY_t = μ(1 - ρ) + (ρ - 1)Y_{t-1} + e_t
ΔY_t = (ρ - 1)(Y_{t-1} - μ) + e_t, where ΔY_t is Y_t - Y_{t-1}
AR(p): Y_t - μ = α_1(Y_{t-1} - μ) + α_2(Y_{t-2} - μ) + … + α_p(Y_{t-p} - μ) + e_t
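The level form and the difference form above are the same model, just rearranged. A quick numeric check (plain Python; the parameter values μ = 10, ρ = 0.8 are illustrative, not from the talk):

```python
import random

# Illustrative parameters (my choice, not from the slides)
mu, rho, n = 10.0, 0.8, 200
random.seed(1)
e = [random.gauss(0, 1) for _ in range(n)]

# Level form: Y_t = mu(1 - rho) + rho*Y_{t-1} + e_t
y = [mu]
for t in range(1, n):
    y.append(mu * (1 - rho) + rho * y[t - 1] + e[t])

# Difference form: dY_t = (rho - 1)(Y_{t-1} - mu) + e_t reproduces Y_t - Y_{t-1}
for t in range(1, n):
    assert abs((y[t] - y[t - 1]) - ((rho - 1) * (y[t - 1] - mu) + e[t])) < 1e-9
print("level and difference forms agree")
```

The difference form is the one actually fit in the unit root regressions later in the talk, since its Y_{t-1} coefficient is ρ - 1, which is 0 exactly under the unit root null.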

5 AR(1) Stationary ⇔ |ρ| < 1
OLS Regression Estimators – Stationary case
Mann and Wald (1940s): For |ρ| < 1, √n(ρ̂ - ρ) converges in law to N(0, 1 - ρ²).
More exciting algebra coming up ……

6 AR(1) Stationary ⇔ |ρ| < 1
OLS Regression Estimators – Stationary case
(1) Same limit if the sample mean is replaced by μ
(2) AR(p) ⇒ multivariate normal limits
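The stationary-case normality can be seen by simulation. A Monte Carlo sketch (illustrative settings of my choosing: ρ = 0.6, n = 400, 2000 replications), checking that √n(ρ̂ - ρ) has standard deviation near √(1 - ρ²) = 0.8:

```python
import random, statistics, math

# Monte Carlo sketch of the Mann-Wald result: for |rho| < 1,
# sqrt(n)(rho_hat - rho) is approximately N(0, 1 - rho^2).
random.seed(5)
rho, n, reps = 0.6, 400, 2000
z = []
for _ in range(reps):
    y = [0.0]
    for _ in range(n):
        y.append(rho * y[-1] + random.gauss(0, 1))
    # No-intercept OLS of Y_t on Y_{t-1}
    rho_hat = sum(y[t] * y[t + 1] for t in range(n)) / sum(y[t] ** 2 for t in range(n))
    z.append(math.sqrt(n) * (rho_hat - rho))

# Standard deviation should be near sqrt(1 - rho^2) = 0.8
print(round(statistics.pstdev(z), 2))
```

The same experiment with ρ = 1 is exactly what fails, which motivates the rest of the talk.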

7 |ρ| < 1 vs. ρ = 1
|ρ| < 1: Y_t - μ = ρ(Y_{t-1} - μ) + e_t = ρ(ρ(Y_{t-2} - μ) + e_{t-1}) + e_t = … = e_t + ρe_{t-1} + ρ²e_{t-2} + … + ρ^{k-1}e_{t-k+1} + ρ^k(Y_{t-k} - μ)
The expansion converges for |ρ| < 1, giving E{Y_t} = μ and Var{Y_t} = σ²/(1 - ρ²).
ρ = 1: But if ρ = 1, then Y_t = Y_{t-1} + e_t, a random walk:
Y_t = Y_0 + e_t + e_{t-1} + e_{t-2} + … + e_1, so Var{Y_t - Y_0} = tσ² and E{Y_t} = E{Y_0}.
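The linear growth of the random-walk variance, Var{Y_t - Y_0} = tσ², is easy to verify by simulation (a sketch with illustrative settings σ = 1, t = 50, 4000 simulated walks):

```python
import random, statistics

# Sketch: for a random walk, Var{Y_t - Y_0} = t * sigma^2 (sigma = 1 here)
random.seed(2)
t, reps = 50, 4000
endpoints = []
for _ in range(reps):
    y = 0.0
    for _ in range(t):
        y += random.gauss(0, 1)  # Y_t = Y_{t-1} + e_t
    endpoints.append(y)

# Sample variance of Y_t - Y_0 should be near t * sigma^2 = 50
print(round(statistics.pvariance(endpoints), 1))
```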

8 AR(1): |ρ| < 1 vs. ρ = 1
AR(1), |ρ| < 1: E{Y_t} = μ; Var{Y_t} is constant; the forecast of Y_{t+L} converges to μ (exponentially fast); the forecast error variance is bounded.
AR(1), ρ = 1: Y_t = Y_{t-1} + e_t; E{Y_t} = E{Y_0}; Var{Y_t} grows without bound; the forecast is not mean reverting.
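The standard AR(1) forecast function makes the contrast concrete: Ŷ_{t+L} = μ + ρ^L(Y_t - μ), with error variance σ²(1 - ρ^{2L})/(1 - ρ²). A sketch with illustrative values (μ = 10, ρ = 0.8, current value Y_t = 16, σ² = 1, all my choices):

```python
# AR(1) forecast function sketch (illustrative parameter values)
mu, rho, y_t, sigma2 = 10.0, 0.8, 16.0, 1.0

for L in (1, 5, 20, 50):
    forecast = mu + rho ** L * (y_t - mu)                     # -> mu exponentially fast
    err_var = sigma2 * (1 - rho ** (2 * L)) / (1 - rho ** 2)  # bounded
    print(L, round(forecast, 4), round(err_var, 4))
# err_var never exceeds sigma2/(1 - rho^2) ~= 2.7778;
# with rho = 1 it would instead be L*sigma2, growing without bound.
```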

9 E = mc² … ρ = ?

10 Nonstationary (ρ = 1) cases
Case 1: μ known (= 0). Regression estimators (Y_t on Y_{t-1}, no intercept):
ρ̂ - 1 = Σ Y_{t-1}e_t / Σ Y_{t-1}², so n(ρ̂ - 1) = (Σ Y_{t-1}e_t / n) / (Σ Y_{t-1}² / n²)

11 ρ = 1 ⇒ Nonstationary
Recall the stationary results. Note: all results are independent of σ².

12 Where are my clothes? H0: ρ = 1, H1: |ρ| < 1 ?

13 DF Distribution ??
Numerator: with Y_0 = 0, Y_{t-1} = e_1 + … + e_{t-1}, so the numerator
Y_1e_2 + Y_2e_3 + … + Y_{n-1}e_n = Σ_{i<j} e_i e_j = ½[(e_1 + … + e_n)² - (e_1² + … + e_n²)],
a quadratic form in the e's (the array of cross-products e_i e_j shown on the slide).

14 Denominator
For n observations the denominator Σ Y_{t-1}² is also a quadratic form, e′A_n e, in (e_1, …, e_n).
(The eigenvalues of A_n are the reciprocals of those of A_n⁻¹, which has a simpler form.)

15 Results
The quadratic forms can be written as weighted sums of squares of independent normals, and n⁻² e′A_n e converges in distribution.
Graph of g_{i,50}² and limit:

16 Histograms for n = 50 (marked points: -1.96, -8.1)

17 Theory 1: Donsker's Theorem (pp. 68, 137 Billingsley)
{e_t} an iid(0, σ²) sequence; S_n = e_1 + e_2 + … + e_n
X(t, n) = S_{[nt]} / (σ n^{1/2}), i.e. S_n normalized (example: n = 100)

18 Theory 1: Donsker's Theorem (pg. 137 Billingsley)
Donsker: X(t, n) converges in law to W(z), a "Wiener process".
Plots of X(t, n) versus z = t/n for n = 20, 100, 2000; 20 realizations of X(t, 100) vs. z = t/n.
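At z = 1 the theorem says X(1, n) = S_n/(σ√n) is approximately W(1) ~ N(0, 1), which is quick to check by simulation (a sketch; n = 200 and 4000 draws are illustrative choices):

```python
import random, statistics, math

# Monte Carlo sketch of Donsker's theorem at z = 1:
# X(1, n) = S_n/(sigma*sqrt(n)) should look like N(0, 1) = W(1)
random.seed(3)
n, reps = 200, 4000
x1 = [sum(random.gauss(0, 1) for _ in range(n)) / math.sqrt(n) for _ in range(reps)]

# Mean near 0, standard deviation near 1
print(round(statistics.mean(x1), 2), round(statistics.pstdev(x1), 2))
```

The theorem is stronger than this marginal check: the whole path converges, which is what licenses the functional limits on the next slide.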

19 Theory 2: Continuous Mapping Theorem (Billingsley pg. 72)
h(·) a continuous functional ⇒ h(X(t, n)) converges in law to h(W(t)).
For our estimators this gives n(ρ̂ - 1) ⇒ [W(1)² - 1] / [2∫₀¹ W(t)² dt] and τ ⇒ [W(1)² - 1] / [2(∫₀¹ W(t)² dt)^{1/2}].
The distribution is nonstandard; its percentiles are tabulated by simulation.
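The nonstandard limit can itself be approximated by Monte Carlo. A sketch for the zero-mean case (n = 200 and 2000 replications are illustrative choices): simulate random walks, regress Y_t on Y_{t-1} with no intercept, and look at the 5th percentile of n(ρ̂ - 1), which should be in the neighborhood of the tabled -8.1:

```python
import random

# Monte Carlo sketch of n(rho_hat - 1) under rho = 1, mu = 0 (no-intercept OLS)
random.seed(6)
n, reps = 200, 2000
stats = []
for _ in range(reps):
    y = [0.0]
    for _ in range(n):
        y.append(y[-1] + random.gauss(0, 1))
    rho_hat = sum(y[t] * y[t + 1] for t in range(n)) / sum(y[t] ** 2 for t in range(n))
    stats.append(n * (rho_hat - 1))

stats.sort()
print(round(stats[reps // 20], 1))  # empirical 5th percentile, near -8
```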

20 Nice proof Grandpa! As you see, I’m very excited.

21 Extension 1: Add a mean (intercept)
New quadratic forms, new distributions. The estimator is independent of Y_0.

22 Extension 2: Add linear trend
Regress Y_t on 1, t, Y_{t-1}; the regression annihilates Y_0 and βt.
New quadratic forms, new distributions.

23 The 6 Distributions (5% points)
f(t):                 0        mean     trend
coefficient n(ρ̂-1):  -8.1     -14.1    -21.8
t test τ:             -1.95    -2.93    -3.50
(For comparison, the N(0,1) 5% point is -1.96.)

24 τ percentiles
τ percentiles, n = 50 (Pr < τ):
f(t)    0.01   0.025  0.05   0.10   0.50   0.90   0.95   0.975  0.99
---     -2.62  -2.25  -1.95  -1.61  -0.49  0.91   1.31   1.66   2.08
1       -3.59  -3.32  -2.93  -2.60  -1.55  -0.41  -0.04  0.28   0.66
(1,t)   -4.16  -3.80  -3.50  -3.18  -2.16  -1.19  -0.87  -0.58  -0.24

τ percentiles, limit (Pr < τ):
f(t)    0.01   0.025  0.05   0.10   0.50   0.90   0.95   0.975  0.99
---     -2.58  -2.23  -1.95  -1.62  -0.51  0.89   1.28   1.62   2.01
1       -3.42  -3.12  -2.86  -2.57  -1.57  -0.44  -0.08  0.23   0.60
(1,t)   -3.96  -3.67  -3.41  -3.13  -2.18  -1.25  -0.94  -0.66  -0.32

25 Higher Order Models
Stationary example: "characteristic eqn." with roots 0.5 and 0.8 (both < 1).
Note: the Y_{t-1} coefficient in the difference-form regression is -(1 - 0.5)(1 - 0.8) = -0.1.
Nonstationary when a root equals 1.

26 Higher Order Models - General AR(2)
Roots a, b: (m - a)(m - b) = m² - (a + b)m + ab
AR(2): (Y_t - μ) = (a + b)(Y_{t-1} - μ) - ab(Y_{t-2} - μ) + e_t
Difference form: ΔY_t = -(1 - a)(1 - b)(Y_{t-1} - μ) + ab ΔY_{t-1} + e_t; the Y_{t-1} coefficient is 0 if there is a unit root ⇒ nonstationary.
The t test is the same as in AR(1); the coefficient test requires modification. The t test on the augmenting coefficient ⇒ N(0,1)!!
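The algebra above can be checked directly with the slide's example roots a = 0.5, b = 0.8:

```python
# Check of the AR(2) factorization with the slide's roots a = 0.5, b = 0.8
a, b = 0.5, 0.8

# Y_t = (a+b) Y_{t-1} - ab Y_{t-2} + e_t
phi1, phi2 = a + b, -a * b

# Difference form dY_t = gamma*Y_{t-1} + ab*dY_{t-1} + e_t has
# gamma = phi1 + phi2 - 1 = -(1-a)(1-b): zero exactly when a root equals 1
gamma = phi1 + phi2 - 1
assert abs(gamma - (-(1 - a) * (1 - b))) < 1e-12
print(phi1, phi2, round(gamma, 2))  # 1.3 -0.4 -0.1
```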

27 Tests
Regress ΔY_t on 1, t, Y_{t-1}, ΔY_{t-1}, …, ΔY_{t-k} (the "ADF" test).
The coefficients on the lagged differences ⇒ normal!
For ρ̂ - 1 (and its t): augmenting affects the limit distribution of the coefficient; it does not affect that of τ.

28 Silver example: nonstationary forecast vs. stationary forecast

29 Is AR(2) sufficient? Test vs. AR(5).
proc reg; model D = Y1 D1-D4; test D2=0, D3=0, D4=0;
Output (Source, df, Coeff, t, Pr > |t|): Intercept, Y_{t-1}, Y_{t-1}-Y_{t-2}, Y_{t-2}-Y_{t-3}, Y_{t-3}-Y_{t-4}, Y_{t-4}-Y_{t-5}; values in the slide.
F = 1152 / 871; Pr > F in the slide.

30 Fit AR(2) and do unit root test
Method 1: OLS output and tabled critical value (-2.86)
proc reg; model D = Y1 D1;
Output (Source, df, Coeff., t, Pr > |t|): Intercept, Y_{t-1}, Y_{t-1}-Y_{t-2}; values in the slide.
Method 2: PROC ARIMA output with printed unit root p-values
proc arima; identify var=silver stationarity = (dickey=(1));
Augmented Dickey-Fuller Unit Root Tests (Type, Lags, t, Prob < t): Zero Mean, Single Mean, Trend; values in the slide.
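Method 1 is an ordinary regression followed by a nonstandard table lookup. A hand-rolled sketch of the same single-mean regression in Python rather than SAS (the data are illustrative: a simulated stationary AR(1) with ρ = 0.5, so τ should fall well below the tabled -2.86 and the unit root is rejected):

```python
import math, random

# Single-mean Dickey-Fuller regression by hand: dY_t on (1, Y_{t-1});
# illustrative data: stationary AR(1), rho = 0.5
random.seed(4)
n, rho = 500, 0.5
y = [0.0]
for _ in range(n):
    y.append(rho * y[-1] + random.gauss(0, 1))

x = y[:-1]                               # Y_{t-1}
d = [y[t + 1] - y[t] for t in range(n)]  # dY_t

# OLS of d on (1, x); the slope estimates rho - 1
xbar, dbar = sum(x) / n, sum(d) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (di - dbar) for xi, di in zip(x, d)) / sxx
intercept = dbar - slope * xbar
sse = sum((di - intercept - slope * xi) ** 2 for xi, di in zip(x, d))
tau = slope / math.sqrt(sse / (n - 2) / sxx)  # the DF tau statistic
print(round(tau, 2), "reject H0: rho = 1" if tau < -2.86 else "fail to reject")
```

The only nonstandard ingredient is the critical value: -2.86 comes from the single-mean τ table, not from the t distribution.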

31 First part of the series: ACF, IACF, PACF

32 Full data: ACF, IACF, PACF

33 Amazon.com Stock ln(Closing Price)
Levels Differences

34 Unit root tests: levels and differences
Levels: Augmented Dickey-Fuller Unit Root Tests (Type, Lags, Tau, Pr < Tau): Zero Mean, 2 lags, Tau 1.85, Pr < Tau 0.9849; Single Mean and Trend rows in the slide.
Differences: Augmented Dickey-Fuller Unit Root Tests: Pr < Tau is < .0001 for Zero Mean, Single Mean, and Trend.

35 Are differences white noise (p = q = 0)?
Autocorrelation Check for White Noise (To Lag, Chi-Square, DF, Pr > ChiSq, Autocorrelations): values in the slide.

36 Amazon.com Stock Volume
Levels Differences

37 Volume: unit root tests and model fit
Augmented Dickey-Fuller Unit Root Tests (Type, Lags, Tau, Pr < Tau): Zero Mean, Single Mean, Trend (Trend: Pr < Tau < .0001).
Maximum Likelihood Estimation (Parameter, Estimate, t Value, Approx Pr > |t|, Lag, Variable): MU, MA1,1, AR1,1, AR1,2 on volume and NUM1 on date, each with Pr > |t| < .0001.
Autocorrelation check (To Lag, Chi-Square, DF, Pr > ChiSq, Autocorrelations): values in the slide.

38 Amazon.com Spread = ln(High/Low)
Levels Differences

39 Spread: unit root tests and model fit
Augmented Dickey-Fuller Unit Root Tests (Type, Lags, Tau, Pr < Tau): Zero Mean, Single Mean (< .0001), Trend (< .0001).
Maximum Likelihood Estimation (Parm, Estimate, t Value, Pr > |t|, Lag, Variable): MU, MA1,1, AR1,1, AR1,2 on spread and NUM1 on date (the MA and AR terms have Pr > |t| < .0001).
Autocorrelation check (To Lag, Chi-Square, DF, Pr > ChiSq, Autocorrelations): values in the slide.

40 S.E. Said: Use AR(k) model even if MA terms in true model.
N. Fountis: Vector Process with One Unit Root D. Lee: Double Unit Root Effect M. Chang: Overdifference Checks G. Gonzalez-Farias: Exact MLE K. Shin: Multivariate Exact MLE T. Lee: Seasonal Exact MLE Y. Akdi, B. Evans – Periodograms of Unit Root Processes

41 H. Kim: Panel Data tests S. Huang: Nonlinear AR processes S. Huh: Intervals: Order Statistics S. Kim: Intervals: Level Adjustment & Robustness J. Zhang: Long Period Seasonal. Q. Zhang: Comparing Seasonal Cointegration Methods.


