TIME SERIES ANALYSIS, Lecture 7. Time Domain Models: Red Noise; AR and ARMA Models. Supplementary reading: Wilks, chapter 8.
Recall the statistical model: [Observed Data] = [Signal] + [Noise]. The "noise" has to satisfy certain properties; if it does not, we must iterate on this process. We will now seek a general method of specifying just such a model.
Stationarity. We conventionally assume stationarity: the statistics of the time series do not change with time.
- Strict stationarity: all statistical properties are independent of absolute time t.
- Weak stationarity (covariance stationarity): the mean is constant and the statistics are a function of lag k but not of absolute time t.
- Cyclostationarity: the statistics are a periodic function of time.
Time Series Modeling. All linear time series models can be written in the form of the autoregressive moving-average ("ARMA") model of Box and Jenkins (1976), the ARMA(K, M) model:
$x_t = \sum_{k=1}^{K} \phi_k x_{t-k} + \varepsilon_t + \sum_{m=1}^{M} \theta_m \varepsilon_{t-m}$
We assume that the innovations $\varepsilon_t$ are Gaussian distributed (white noise).
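To make the ARMA(K, M) form concrete, here is a minimal simulation sketch in Python/NumPy; the function name simulate_arma, the burn-in length, and all parameter values are illustrative choices, not from the lecture.

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, burn=500, seed=0):
    """Simulate an ARMA(K, M) process
    x_t = sum_k phi[k]*x_{t-k} + eps_t + sum_m theta[m]*eps_{t-m},
    with eps_t ~ N(0, sigma^2). A burn-in period is discarded so the
    returned series is approximately stationary."""
    rng = np.random.default_rng(seed)
    K, M = len(phi), len(theta)
    eps = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(max(K, M), n + burn):
        ar = sum(phi[k] * x[t - k - 1] for k in range(K))
        ma = sum(theta[m] * eps[t - m - 1] for m in range(M))
        x[t] = ar + eps[t] + ma
    return x[burn:]

# Example: one ARMA(1,1) realization with phi_1 = 0.7, theta_1 = 0.3
x = simulate_arma([0.7], [0.3], n=1000)
print(x.std())
```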
Time Series Modeling. Consider the special case of a pure autoregressive AR(K) model (all $\theta_m = 0$):
$x_t = \sum_{k=1}^{K} \phi_k x_{t-k} + \varepsilon_t$
Suppose K = 1, and define $\phi_1 \equiv \rho$. This should look familiar! It is a special case of a Markov process (a process for which the state of the system depends on previous states).
Time Series Modeling. With K = 1 and $\phi_1 \equiv \rho$, and assuming the process is zero mean, we have
$x_t = \rho x_{t-1} + \varepsilon_t$
the lag-one correlated process, or "AR(1)" process.
AR(1) Process. For simplicity, we assume zero mean.
AR(1) Process. Let us take this series (a realization of the white-noise forcing $\varepsilon_t$) as a random forcing.
AR(1) series driven by that same random forcing. Blue: $\rho = 0.4$; Red: $\rho = 0.7$.
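A minimal NumPy sketch of what the blue/red curves illustrate: two AR(1) series built from the same white-noise forcing but with different $\rho$. The seed, series length, and helper name are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
eps = rng.normal(size=n)          # one realization of the white-noise forcing

def ar1_from_forcing(rho, eps):
    """Build an AR(1) series x_t = rho*x_{t-1} + eps_t from a given forcing."""
    x = np.zeros_like(eps)
    for t in range(1, len(eps)):
        x[t] = rho * x[t - 1] + eps[t]
    return x

x_blue = ar1_from_forcing(0.4, eps)   # weaker persistence
x_red = ar1_from_forcing(0.7, eps)    # stronger persistence, smoother series
print(x_blue.std(), x_red.std())
```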
AR(1) Process. What is the standard deviation of an AR(1) process? Taking the variance of $x_t = \rho x_{t-1} + \varepsilon_t$ and using stationarity ($\mathrm{Var}(x_t) = \mathrm{Var}(x_{t-1})$) gives $\sigma_x^2 = \rho^2 \sigma_x^2 + \sigma_\varepsilon^2$, so $\sigma_x = \sigma_\varepsilon / \sqrt{1 - \rho^2}$.
AR(1) Process. Blue: $\rho = 0.4$; Red: $\rho = 0.7$; Green: $\rho = 0.9$.
AR(1) Process. Suppose $\rho = 1$: a random walk ("Brownian motion"). The variance is infinite (it grows without bound with time), so the process is not stationary. How might we try to turn this into a stationary time series?
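One standard answer is to take first differences, which recovers the stationary white-noise increments. A minimal sketch (seed and lengths are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.normal(size=1000)
walk = np.cumsum(eps)           # random walk: x_t = x_{t-1} + eps_t
increments = np.diff(walk)      # first difference recovers the stationary forcing

# The walk's variance grows with the sample length; the differenced series' does not.
print(walk[:500].var(), walk.var())
print(increments.var())         # close to 1, stationary
```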
AR(1) Process: Autocorrelation Function. Let us assume the series x has been de-meaned, and define the lag-k autocorrelation
$\rho_k = \frac{E[x_t x_{t+k}]}{E[x_t^2]}$
which we can approximate from a sample of length N by
$r_k = \frac{\sum_{t=1}^{N-k} x_t x_{t+k}}{\sum_{t=1}^{N} x_t^2}$
AR(1) Process: Autocorrelation Function. With the lag-k autocorrelation defined as above, multiplying $x_t = \rho x_{t-1} + \varepsilon_t$ by $x_{t-1}$ and taking expectations gives $\rho_1 = \rho$: the lag-one autocorrelation equals the AR(1) parameter.
Autocorrelation Function: CO₂ since 1976.
AR(1) Process: Autocorrelation Function (theoretical). Applying $\rho_1 = \rho$ recursively, we thus have for an AR(1) process $\rho_k = \rho^k$.
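A minimal sketch comparing the sample ACF estimator from the previous slides with the theoretical $\rho_k = \rho^k$, along the lines of what the "rednoise.m"/"tcf.m"/"scf.m" figures presumably show; the function name, seed, and parameter values are illustrative.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation r_k = sum x_t x_{t+k} / sum x_t^2 (de-meaned x)."""
    x = np.asarray(x) - np.mean(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[:len(x) - k] * x[k:]) / denom
                     for k in range(max_lag + 1)])

# AR(1) realization with rho = 0.5, N = 100
rng = np.random.default_rng(1)
rho, n = 0.5, 100
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()

r = sample_acf(x, max_lag=10)
theory = rho ** np.arange(11)     # theoretical ACF: rho_k = rho^k
print(np.round(r, 2))
print(np.round(theory, 2))
```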
AR(1) Process: Autocorrelation Function. $\rho = 0.5$, N = 100 ("rednoise.m", "tcf.m", "scf.m").
AR(1) Process: Autocorrelation Function. $\rho = 0.5$ with N = 100 vs. $\rho = 0.5$ with N = 500 ("rednoise.m").
AR(1) Process: Autocorrelation Function. Glacial varves: $\hat\rho = 0.23$, N = 1000.
AR(1) Process: Autocorrelation Function. Northern Hemisphere temperature: $\hat\rho = 0.75$, N = 144.
AR(1) Process: Autocorrelation Function. Northern Hemisphere temperature (linearly detrended): $\hat\rho = 0.54$, N = 144.
AR(1) Process: Autocorrelation Function. Dec-Mar Nino3.
AR(1) Process. The sampling distribution for $\hat\rho$ is given by the sampling distribution for the slope parameter in linear regression.
AR(1) Process. Since the sampling distribution for $\hat\rho$ is that of a regression slope, how do we determine whether $\hat\rho$ is significantly non-zero? This is just the t test!
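A minimal sketch of that t test, assuming the usual regression-slope result $t = r_1\sqrt{(N-2)/(1-r_1^2)}$ with N-2 degrees of freedom; the function name and the white-noise example are illustrative.

```python
import numpy as np
from scipy import stats

def lag1_t_test(x):
    """t test for the lag-1 autocorrelation, treated as a regression slope:
    t = r1 * sqrt((N - 2) / (1 - r1**2)), with N - 2 degrees of freedom."""
    x = np.asarray(x) - np.mean(x)
    r1 = np.sum(x[:-1] * x[1:]) / np.sum(x * x)
    n = len(x)
    t = r1 * np.sqrt((n - 2) / (1.0 - r1 ** 2))
    p = 2 * stats.t.sf(abs(t), df=n - 2)   # two-sided p-value
    return r1, t, p

rng = np.random.default_rng(2)
print(lag1_t_test(rng.normal(size=144)))   # white noise: r1 should not be significant
```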
AR(1) Process. When serial correlation is present, the variance of the mean must be adjusted by a variance inflation factor. Recall that for an AR(1) series,
$\mathrm{Var}(\bar{x}) = \frac{\sigma^2}{N}\,\frac{1+\rho}{1-\rho}$
equivalent to an effective sample size $N' = N\,\frac{1-\rho}{1+\rho}$. This affects the significance of regression/correlation, as we saw previously.
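A minimal sketch of the adjustment: estimate the lag-1 autocorrelation, compute the effective sample size, and compare the naive and adjusted standard errors of the mean. The function name, seed, and parameter values are illustrative.

```python
import numpy as np

def effective_sample_size(x):
    """Effective sample size N' = N*(1 - r1)/(1 + r1) for an AR(1)-like series,
    where r1 is the sample lag-1 autocorrelation of the de-meaned series."""
    x = np.asarray(x) - np.mean(x)
    r1 = np.sum(x[:-1] * x[1:]) / np.sum(x * x)
    n = len(x)
    return n * (1.0 - r1) / (1.0 + r1)

rng = np.random.default_rng(3)
rho, n = 0.75, 144
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()

n_eff = effective_sample_size(x)
se_naive = x.std(ddof=1) / np.sqrt(n)         # ignores serial correlation
se_adjusted = x.std(ddof=1) / np.sqrt(n_eff)  # inflated standard error
print(n_eff, se_naive, se_adjusted)
```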
AR(1) Process: $x_t = \rho x_{t-1} + \varepsilon_t$.
AR(K) Process. Now consider an AR(K) process (for simplicity, we assume zero mean):
$x_t = \sum_{k=1}^{K} \phi_k x_{t-k} + \varepsilon_t$
Multiply this equation by $x_{t-k'}$ and sum (take expectations) to relate the coefficients to the autocorrelations.
AR(K) Process: Yule-Walker Equations.
$\rho_{k'} = \sum_{k=1}^{K} \phi_k \rho_{k'-k}, \qquad k' = 1, \ldots, K$
Use the sample autocorrelations $r_k$ in place of $\rho_k$ and solve this linear system for the coefficients $\phi_k$.
AR(K) Process: Yule-Walker Equations. Several results obtained for the AR(1) model generalize readily to the AR(K) model.
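A minimal NumPy sketch of solving the Yule-Walker system from sample autocorrelations; the function name yule_walker and the AR(2) test parameters are illustrative.

```python
import numpy as np

def yule_walker(x, K):
    """Estimate AR(K) coefficients from the Yule-Walker equations
    rho_j = sum_k phi_k * rho_{j-k}, using sample autocorrelations."""
    x = np.asarray(x) - np.mean(x)
    denom = np.sum(x * x)
    r = np.array([np.sum(x[:len(x) - k] * x[k:]) / denom for k in range(K + 1)])
    # Toeplitz system R * phi = r[1:], with R_ij = r[|i-j|]
    R = np.array([[r[abs(i - j)] for j in range(K)] for i in range(K)])
    return np.linalg.solve(R, r[1:])

# Example: recover the coefficients of a simulated AR(2) process
rng = np.random.default_rng(4)
phi_true = [0.6, 0.2]
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + rng.normal()
print(yule_walker(x, K=2))   # approximately [0.6, 0.2]
```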
AR(K) Process: Yule-Walker Equations. The AR(2) model is particularly important because of the range of behavior it can describe with a parsimonious number of parameters. For K = 2, the Yule-Walker equations give:
$\rho_1 = \phi_1 + \phi_2 \rho_1$
$\rho_2 = \phi_1 \rho_1 + \phi_2$
AR(2) Process. The AR(2) model is particularly important because of the range of behavior it can describe with a parsimonious number of parameters. The Yule-Walker equations readily give:
$\phi_1 = \frac{\rho_1 (1 - \rho_2)}{1 - \rho_1^2}, \qquad \phi_2 = \frac{\rho_2 - \rho_1^2}{1 - \rho_1^2}$
AR(2) Process. For stationarity, we must have $(\phi_1, \phi_2)$ inside the triangular region
$\phi_2 + \phi_1 < 1, \qquad \phi_2 - \phi_1 < 1, \qquad |\phi_2| < 1$
(the admissible region in the $(\phi_1, \phi_2)$ plane, with $\phi_1$ running from -2 to +2 and $\phi_2$ from -1 to +1).
AR(2) Process. For stationarity, we must have $\phi_2 + \phi_1 < 1$, $\phi_2 - \phi_1 < 1$, and $|\phi_2| < 1$. Note that this model allows for independent lag-1 and lag-2 correlation, so that both positive and negative correlations are possible.
AR(2) Process. Subject to the stationarity conditions above, the model allows for independent lag-1 and lag-2 correlation, so that both positive and negative correlations are possible ("artwo.m").
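A minimal sketch of an AR(2) simulator along the lines of what "artwo.m" presumably demonstrates (the function name and parameter values are illustrative); note the check that $(\phi_1, \phi_2)$ lies inside the stationarity triangle.

```python
import numpy as np

def simulate_ar2(phi1, phi2, n, seed=0):
    """Simulate an AR(2) process after checking the stationarity triangle:
    phi2 + phi1 < 1, phi2 - phi1 < 1, |phi2| < 1."""
    if not (phi2 + phi1 < 1 and phi2 - phi1 < 1 and abs(phi2) < 1):
        raise ValueError("(phi1, phi2) lies outside the stationarity region")
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.normal()
    return x

# phi1 > 0 with phi2 < 0 gives damped, quasi-oscillatory behavior
x = simulate_ar2(1.0, -0.5, n=500)
print(x.std())
```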
AR(K) Process: Selection Rules. The Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) penalize the residual variance of a fit by the number of parameters used; in one common form, $\mathrm{AIC} \approx N \ln \hat\sigma_\varepsilon^2 + 2(K+1)$ and $\mathrm{BIC} \approx N \ln \hat\sigma_\varepsilon^2 + (K+1) \ln N$. The minimum in AIC or BIC represents an 'optimal' tradeoff between degrees of freedom and variance explained.
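A minimal sketch of order selection: fit AR(K) by least squares for a range of K and compute AIC/BIC from the residual variance, using the common forms quoted above. The helper name fit_ar_ls, the seed, and the simulated AR(2) test case are illustrative.

```python
import numpy as np

def fit_ar_ls(x, K):
    """Least-squares AR(K) fit; returns coefficients and residual variance."""
    x = np.asarray(x) - np.mean(x)
    A = np.column_stack([x[K - k - 1:len(x) - k - 1] for k in range(K)])
    y = x[K:]
    phi, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ phi
    return phi, np.mean(resid ** 2)

rng = np.random.default_rng(5)
x = np.zeros(1000)
for t in range(2, len(x)):
    x[t] = 0.6 * x[t - 1] + 0.2 * x[t - 2] + rng.normal()

n = len(x)
for K in range(1, 7):
    _, s2 = fit_ar_ls(x, K)
    aic = n * np.log(s2) + 2 * (K + 1)
    bic = n * np.log(s2) + (K + 1) * np.log(n)
    print(K, round(aic, 1), round(bic, 1))   # minimum should be near K = 2
```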
AR(K) Process. ENSO: the Multivariate ENSO Index ("MEI").
AR(1) Fit: Autocorrelation Function, Dec-Mar Nino3.
AR(2) Fit: Autocorrelation Function, Dec-Mar Nino3 (lags m > 2).
AR(3) Fit: Autocorrelation Function, Dec-Mar Nino3 (theoretical AR(3) fit shown).
AR(K) Fit: Autocorrelation Function, Dec-Mar Nino3. Which K does the fit favor? Where is the minimum in BIC? Where is the minimum in AIC?
MA Model. Now consider the case K = 0: the pure moving-average (MA) model,
$x_t = \varepsilon_t + \sum_{m=1}^{M} \theta_m \varepsilon_{t-m}$
which represents a (weighted) running mean of the past M noise values. Consider the case where M = 1...
MA(1) Model. With K = 0 and M = 1,
$x_t = \varepsilon_t + \theta_1 \varepsilon_{t-1}$
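A minimal sketch of an MA(1) realization, illustrating the standard result that its autocorrelation is $\rho_1 = \theta_1/(1+\theta_1^2)$ and essentially zero beyond lag 1; the value of $\theta_1$, the series length, and the seed are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
theta1, n = 0.8, 5000
eps = rng.normal(size=n + 1)
x = eps[1:] + theta1 * eps[:-1]          # MA(1): x_t = eps_t + theta1*eps_{t-1}

xd = x - x.mean()
denom = np.sum(xd * xd)
r = [np.sum(xd[:n - k] * xd[k:]) / denom for k in range(1, 5)]  # sample ACF, lags 1-4

# Theory: rho_1 = theta1 / (1 + theta1^2), rho_k = 0 for k >= 2
print(np.round(r, 3), round(theta1 / (1 + theta1 ** 2), 3))
```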
ARMA(1,1) Model. Combining the two, with K = 1 and M = 1,
$x_t = \phi_1 x_{t-1} + \varepsilon_t + \theta_1 \varepsilon_{t-1}$