
1 Modern methods

The classical approach:

Method: Time series regression
  Pros: Easy to implement; fairly easy to interpret; covariates may be added (normalization); inference is possible (though sometimes questionable)
  Cons: Static; normal-based inference not generally reliable; cyclic component hard to estimate

Method: Decomposition
  Pros: Easy to interpret; possible to have dynamic seasonal effects; cyclic components can be estimated
  Cons: Descriptive (no inference per definition); static in trend

2 Explanation of the static behaviour: the classical approach assumes all components except the irregular ones (i.e. ε_t and IR_t) to be deterministic, i.e. fixed functions or constants. To overcome this problem, all components should be allowed to be stochastic, i.e. to be random variables. A time series y_t should, from a statistical point of view, be treated as a stochastic process. We will use the terms time series and process interchangeably, depending on the situation.

3 Stationary and non-stationary time series

4 AR-models

Consider the model y_t = δ + φ·y_{t−1} + a_t with {a_t} i.i.d. with zero mean and constant variance σ².

Set δ = 0 for the sake of simplicity ⇒ E(y_t) = 0.

Let R(k) = Cov(y_t, y_{t−k}) = Cov(y_t, y_{t+k}) = E(y_t·y_{t−k}) = E(y_t·y_{t+k}) ⇒ R(0) = Var(y_t), assumed to be constant.
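As a minimal sketch (not part of the slides), such a series can be simulated in Python; the function name simulate_ar1 is illustrative, and a_t is taken as normal noise here, although the slides only require i.i.d. noise with zero mean and constant variance:

```python
import numpy as np

def simulate_ar1(phi, sigma, n, delta=0.0, seed=0):
    # y_t = delta + phi*y_{t-1} + a_t; a_t drawn as N(0, sigma^2) here,
    # though the slides only assume i.i.d. noise with zero mean and variance sigma^2
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, sigma, size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = delta + phi * y[t - 1] + a[t]
    return y

y = simulate_ar1(phi=0.7, sigma=1.0, n=5000)  # delta = 0, as on the slide
```

The later snippets in this transcript reuse this simulated series y.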

5 Now:

R(0) = E(y_t·y_t) = E(y_t·(φ·y_{t−1} + a_t)) = φ·E(y_t·y_{t−1}) + E(y_t·a_t)
     = φ·R(1) + E((φ·y_{t−1} + a_t)·a_t) = φ·R(1) + φ·E(y_{t−1}·a_t) + E(a_t·a_t)
     = φ·R(1) + 0 + σ²   (since a_t is independent of y_{t−1})

R(1) = E(y_t·y_{t+1}) = E(y_t·(φ·y_t + a_{t+1})) = φ·E(y_t·y_t) + E(y_t·a_{t+1})
     = φ·R(0) + 0   (since a_{t+1} is independent of y_t)

R(2) = E(y_t·y_{t+2}) = E(y_t·(φ·y_{t+1} + a_{t+2})) = φ·E(y_t·y_{t+1}) + E(y_t·a_{t+2})
     = φ·R(1) + 0   (since a_{t+2} is independent of y_t)

6 The Yule-Walker equations:

R(0) = φ·R(1) + σ²
R(1) = φ·R(0)
R(2) = φ·R(1)
…

⇒ R(k) = φ·R(k−1) = … = φ^k·R(0)
⇒ R(0) = φ·(φ·R(0)) + σ² = φ²·R(0) + σ²

7 Note that for R(0) to be positive and finite (which we require of a variance), solving R(0) = φ²·R(0) + σ² gives

R(0) = σ² / (1 − φ²)

so the following must hold: −1 < φ < 1, i.e. |φ| < 1. This is in effect the condition for an AR(1)-process to be weakly stationary.

Note now that

ρ_k = Corr(y_t, y_{t−k}) = R(k)/R(0) = φ^k·R(0)/R(0) = φ^k
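A quick numeric sanity check, reusing the simulated series y (with φ = 0.7, σ = 1) from the earlier sketch, compares the sample variance with the stationary value σ²/(1 − φ²):

```python
phi, sigma = 0.7, 1.0
print(np.var(y))                # sample R(0), should be close to the value below
print(sigma**2 / (1 - phi**2))  # theoretical R(0) = sigma^2 / (1 - phi^2) ≈ 1.961
```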

8 ρ_k is called the autocorrelation function (ACF) of y_t. "Auto" because it gives correlations within the same time series. For pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series. By studying the ACF it might be possible to identify the approximate magnitude of φ.
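As an illustration (a sketch assuming statsmodels is available), the sample ACF of the simulated series can be compared with the theoretical values ρ_k = φ^k:

```python
from statsmodels.tsa.stattools import acf

sample_acf = acf(y, nlags=5)                # includes lag 0, which is always 1
theoretical = [0.7 ** k for k in range(6)]  # rho_k = phi^k for an AR(1)
print(np.round(sample_acf, 3))
print(np.round(theoretical, 3))
```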

9 Examples: [Slides 9–11 showed example plots, presumably time series and their ACFs; the figures are not recoverable from the transcript.]

12 The look of an ACF can be similar for different kinds of time series; e.g. the ACF for an AR(1) with φ = 0.3 could be approximately the same as the ACF for an autoregressive time series of order higher than 1 (we will discuss higher-order AR-models later).

To make a less ambiguous identification we need another statistic: the partial autocorrelation function (PACF):

υ_k = Corr(y_t, y_{t−k} | y_{t−k+1}, y_{t−k+2}, …, y_{t−1})

i.e. the conditional correlation between y_t and y_{t−k} given all observations in between. Note that −1 ≤ υ_k ≤ 1.
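For comparison, a sketch of estimating the PACF of the simulated AR(1) series with statsmodels; the expected pattern (a single spike near φ at lag 1) is a standard result, restated on the next slide:

```python
from statsmodels.tsa.stattools import pacf

sample_pacf = pacf(y, nlags=5)   # element 0 is lag 0, which is always 1
print(np.round(sample_pacf, 3))  # lag 1 near phi = 0.7, lags 2-5 near 0
```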

13 The PACF is a concept that is hard to interpret, but it can be shown that for an AR(1)-model υ_1 = φ and υ_k = 0 for k > 1. For AR(1)-models with φ positive the PACF therefore shows a single positive bar at lag 1, and for AR(1)-models with φ negative a single negative bar at lag 1. [The slide's PACF plots are not recoverable from the transcript.]

14 Assume now that we have a sample y_1, y_2, …, y_n from a time series assumed to follow an AR(1)-model. Example: [The slide showed a plot of the example series, presumably the DKK/USD exchange rate continued on slide 17; figure not recoverable from the transcript.]

15 The ACF and the PACF can be estimated from data by their sample counterparts:

Sample autocorrelation function (SAC):

r_k = Σ_{t=1}^{n−k} (y_t − ȳ)(y_{t+k} − ȳ) / Σ_{t=1}^{n} (y_t − ȳ)²

if n is large; otherwise a scaling might be needed.

Sample partial autocorrelation function (SPAC): complicated structure, so not shown here.

16 The variance functions of these two estimators can also be estimated ⇒ an opportunity to test the null hypothesis of zero autocorrelation or zero partial autocorrelation at a single lag k. The estimated sample functions are usually plotted together with critical limits based on the estimated variances.
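Such plots with critical limits can be produced, for instance, with statsmodels; a minimal sketch using the simulated series (the shaded band in these plots corresponds to the critical limits mentioned above):

```python
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(y, lags=20, ax=axes[0])    # SAC with approximate 95% limits
plot_pacf(y, lags=20, ax=axes[1])   # SPAC with approximate 95% limits
plt.tight_layout()
plt.show()
```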

17 Example (cont.): DKK/USD exchange rate. [The slide showed the SAC and SPAC plots with critical limits; figures not recoverable from the transcript.]

18 Ignoring all bars within the red limits, we would identify the series as being an AR(1) with positive φ. The value of φ is approximately 0.9 (the ordinate of the first bar in the SAC plot and in the SPAC plot).
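The graphical identification can be cross-checked by actually fitting an AR(1); a sketch with statsmodels' AutoReg, applied here to the simulated series since the DKK/USD data are not available in this transcript:

```python
from statsmodels.tsa.ar_model import AutoReg

res = AutoReg(y, lags=1, trend="c").fit()
print(res.params)  # [delta_hat, phi_hat]; phi_hat should match the first SAC/SPAC bar
```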

19 Higher-order AR-models

AR(2): y_t = δ + φ_1·y_{t−1} + φ_2·y_{t−2} + a_t, or y_t = δ + φ_2·y_{t−2} + a_t, i.e. the term φ_2·y_{t−2} must be present.

AR(3): y_t = δ + φ_1·y_{t−1} + φ_2·y_{t−2} + φ_3·y_{t−3} + a_t, or other combinations with φ_3·y_{t−3}.

AR(p): y_t = δ + φ_1·y_{t−1} + … + φ_p·y_{t−p} + a_t, i.e. different combinations where φ_p·y_{t−p} is present.

20 Typical patterns of the ACF and PACF for higher-order stationary AR-models (AR(p)):

ACF: similar pattern as for AR(1), i.e. (exponentially) decreasing bars, (most often) positive for φ_1 positive and alternating for φ_1 negative.

PACF: the first p values are non-zero with decreasing magnitude; the rest are all zero (cut-off point at p). (Most often) all positive if φ_1 positive and alternating if φ_1 negative.

Requirements for stationarity are usually complex to describe for p > 2.
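A sketch illustrating the PACF cut-off at p: simulate an AR(2) with illustrative coefficients chosen to satisfy the stationarity conditions, then inspect its sample PACF:

```python
from statsmodels.tsa.stattools import pacf

# AR(2): y_t = phi1*y_{t-1} + phi2*y_{t-2} + a_t (delta = 0)
rng = np.random.default_rng(1)
phi1, phi2 = 0.5, 0.3                 # illustrative, stationary choice
a = rng.normal(0.0, 1.0, size=5000)
y2 = np.zeros(5000)
for t in range(2, 5000):
    y2[t] = phi1 * y2[t - 1] + phi2 * y2[t - 2] + a[t]

print(np.round(pacf(y2, nlags=5), 3))  # non-zero at lags 1-2, near zero beyond: cut-off at p = 2
```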

21 Examples: AR(2) with φ_1 positive; AR(5) with φ_1 negative. [The slides' ACF/PACF plots for these examples are not recoverable from the transcript.]

