Models for Non-Stationary Time Series


The ARIMA(p,d,q) time series

Many non-stationary time series can be converted to a stationary time series by taking dth-order differences.

Let {x_t | t ∈ T} denote a time series such that {w_t | t ∈ T} is an ARMA(p,q) time series, where w_t = ∇^d x_t = (I - B)^d x_t is the dth-order difference of the series x_t. Then {x_t | t ∈ T} is called an ARIMA(p,d,q) time series (an integrated autoregressive moving average time series).

The equation for the time series {w_t | t ∈ T} is: β(B)w_t = δ + α(B)u_t. The equation for the time series {x_t | t ∈ T} is: β(B)∇^d x_t = δ + α(B)u_t, or φ(B)x_t = δ + α(B)u_t, where φ(B) = β(B)∇^d = β(B)(I - B)^d.

Suppose that d roots of the polynomial φ(x) are equal to unity. Then φ(x) can be written: φ(x) = (1 - β1x - β2x² - ... - βpx^p)(1 - x)^d, and φ(B) can be written: φ(B) = (I - β1B - β2B² - ... - βpB^p)(I - B)^d = β(B)∇^d. In this case the equation for the time series becomes: φ(B)x_t = δ + α(B)u_t or β(B)∇^d x_t = δ + α(B)u_t.

Comments: 1. The operator φ(B) = β(B)∇^d = I - φ1B - φ2B² - ... - φ_{p+d}B^{p+d} is called the generalized autoregressive operator. (d roots are equal to 1; the remaining p roots satisfy |r_i| > 1.) 2. The operator β(B) is called the autoregressive operator. (p roots with |r_i| > 1.) 3. The operator α(B) is called the moving average operator.

Example – ARIMA(1,1,1). The equation: (I - β1B)(I - B)x_t = δ + (I + α1B)u_t, i.e. (I - (1 + β1)B + β1B²)x_t = δ + u_t + α1u_{t-1}, so x_t - (1 + β1)x_{t-1} + β1x_{t-2} = δ + u_t + α1u_{t-1}, or x_t = (1 + β1)x_{t-1} - β1x_{t-2} + δ + u_t + α1u_{t-1}.
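The expansion of (I - β1B)(I - B) above can be checked by multiplying the two polynomial factors. A minimal sketch in Python, using an arbitrary illustrative value β1 = 0.5 (not taken from the slides):

```python
# Multiply the AR factor (1 - b1*B) by the differencing factor (1 - B)
# and check that the product matches 1 - (1 + b1)*B + b1*B^2.
# b1 = 0.5 is an arbitrary illustrative value, not from the slides.

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists [c0, c1, ...]."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

b1 = 0.5
ar = [1.0, -b1]      # 1 - b1*B
diff = [1.0, -1.0]   # 1 - B
phi = poly_mul(ar, diff)
print(phi)           # [1.0, -1.5, 0.5], i.e. 1 - (1 + b1)*B + b1*B^2
```

The same routine verifies any of the operator products used in the later examples.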

Modeling of Seasonal Time Series

If a time series {x_t : t ∈ T} is seasonal, we would expect observations in the same season in adjacent years to have a higher autocorrelation than observations that are close in time but in different seasons. For monthly data, for example, we would expect the autocorrelation function to have spikes at the seasonal lags 12, 24, 36, ... (illustrated by a figure in the original slides).

The AR(1) seasonal model. For monthly data this model satisfies the equation: x_t = β12 x_{t-12} + δ + u_t. The autocorrelation for this model can be shown to be ρ(h) = β12^{h/12} when h is a multiple of 12, and 0 otherwise. This model is also an AR(12) model with β1 = … = β11 = 0.
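The stated autocorrelation pattern, spikes only at multiples of the season, can be sketched directly. The value β12 = 0.7 below is illustrative, not from the slides:

```python
# Theoretical autocorrelation of the purely seasonal AR(1) model
# x_t = beta12 * x_{t-12} + u_t (monthly data, period 12).
# beta12 = 0.7 is an illustrative value, not taken from the slides.

def seasonal_ar1_acf(beta12, h, period=12):
    """rho(h): beta12^(h/period) at multiples of the period, 0 elsewhere."""
    if h % period == 0:
        return beta12 ** (h // period)
    return 0.0

beta12 = 0.7
acf = [seasonal_ar1_acf(beta12, h) for h in range(25)]
# Nonzero only at h = 0, 12, 24: 1, 0.7, 0.7^2; zero in between.
print(acf[0], acf[12], acf[24], acf[6])
```

This geometric decay at the seasonal lags mirrors the ordinary AR(1) decay, just stretched over the period.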

Graph: ρ(h)

The AR model with both seasonal and serial correlation. This model satisfies the equation: (I - β1B)(I - β12B^12)x_t = δ + u_t, which involves lags 1, 12 and 13; this model is also an AR(13) model. The autocorrelation for this model will satisfy the Yule-Walker equations.

The Yule-Walker Equations (shown on the original slides)

Some solutions for ρ_h

Excel file for determining Autocorrelation function

The general ARIMA model incorporating seasonality. In the standard multiplicative form: β(B)β_s(B^s)∇^d ∇_s^D x_t = δ + α(B)α_s(B^s)u_t, where ∇_s = I - B^s is the seasonal difference operator and β_s(B^s), α_s(B^s) are polynomials in B^s.

Prediction

Three Important Forms of a Non-Stationary Time Series

The Difference Equation Form: x_t = φ1x_{t-1} + φ2x_{t-2} + ... + φ_{p+d}x_{t-p-d} + δ + u_t + α1u_{t-1} + α2u_{t-2} + ... + αq u_{t-q}, i.e. φ(B)x_t = β(B)∇^d x_t = δ + α(B)u_t, where φ(B) = β(B)∇^d = β(B)(I - B)^d.

The Random Shock Form: x_t = m(t) + u_t + ψ1u_{t-1} + ψ2u_{t-2} + ψ3u_{t-3} + ..., i.e. x_t = m(t) + ψ(B)u_t, where ψ(B) = I + ψ1B + ψ2B² + ... = [φ(B)]^{-1}α(B). The coefficients ψ1, ψ2, ... are used to calculate the MSE of forecasts.

The Inverted Form: x_t = π1x_{t-1} + π2x_{t-2} + π3x_{t-3} + ... + δ* + u_t (δ* a constant), i.e. π(B)x_t = δ* + u_t, where π(B) = I - π1B - π2B² - ... = [α(B)]^{-1}φ(B).

Example (difference equation form). Consider the ARIMA(1,1,1) time series (I - 0.8B)∇x_t = (I + 0.6B)u_t, i.e. (I - 0.8B)(I - B)x_t = (I + 0.6B)u_t, so (I - 1.8B + 0.8B²)x_t = (I + 0.6B)u_t, and x_t = 1.8x_{t-1} - 0.8x_{t-2} + u_t + 0.6u_{t-1}.

The random shock form: from (I - 1.8B + 0.8B²)x_t = (I + 0.6B)u_t we get x_t = (I - 1.8B + 0.8B²)^{-1}(I + 0.6B)u_t = (I + 2.4B + 3.52B² + 4.416B³ + 5.1328B⁴ + …)u_t, i.e. x_t = u_t + 2.4u_{t-1} + 3.52u_{t-2} + 4.416u_{t-3} + 5.1328u_{t-4} + …
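The ψ weights above come from ψ(B)φ(B) = α(B): matching coefficients gives ψ1 = 1.8 + 0.6 and, since α has no terms beyond B, ψ_h = 1.8ψ_{h-1} - 0.8ψ_{h-2} for h ≥ 2. A short sketch of that recursion:

```python
# Psi (random shock) weights for (1 - 1.8B + 0.8B^2) x_t = (1 + 0.6B) u_t,
# from psi(B) * phi(B) = alpha(B):
#   psi_1 = 1.8 + 0.6, and psi_h = 1.8*psi_{h-1} - 0.8*psi_{h-2} for h >= 2.

def psi_weights(n):
    """Return [psi_0, psi_1, ..., psi_n]."""
    psi = [1.0, 1.8 + 0.6]   # psi_0 = 1, psi_1 = 2.4
    for _ in range(2, n + 1):
        psi.append(1.8 * psi[-1] - 0.8 * psi[-2])
    return psi

# Matches the slide's expansion 2.4, 3.52, 4.416, 5.1328 (up to rounding).
print(psi_weights(4))
```

The recursion reproduces the expansion coefficients without ever inverting the operator explicitly.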

The inverted form: (I + 0.6B)^{-1}(I - 1.8B + 0.8B²)x_t = u_t, i.e. (I - 2.4B + 2.24B² - 1.344B³ + 0.8064B⁴ + …)x_t = u_t, so x_t = 2.4x_{t-1} - 2.24x_{t-2} + 1.344x_{t-3} - 0.8064x_{t-4} + … + u_t.
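Similarly, matching coefficients in π(B)α(B) = φ(B) gives π1 = 1.8 + 0.6, π2 = -0.6π1 - 0.8, and π_h = -0.6π_{h-1} for h ≥ 3 (since φ has no terms beyond B²). Sketched as:

```python
# Pi (inverted form) weights for the same ARIMA(1,1,1) example,
# from pi(B) * alpha(B) = phi(B):
#   pi_1 = 2.4, pi_2 = -0.6*pi_1 - 0.8, pi_h = -0.6*pi_{h-1} for h >= 3.

def pi_weights(n):
    """Return [pi_1, pi_2, ..., pi_n]."""
    pi = [2.4, -0.6 * 2.4 - 0.8]
    for _ in range(3, n + 1):
        pi.append(-0.6 * pi[-1])
    return pi

# Matches the slide's 2.4, -2.24, 1.344, -0.8064 (up to rounding).
print(pi_weights(4))
```

Note the alternating signs and geometric decay at rate 0.6, driven by the MA root.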

Forecasting an ARIMA(p,d,q) Time Series. Let P_T denote {…, x_{T-2}, x_{T-1}, x_T}, the "past" up to time T. Then the optimal forecast of x_{T+l} given P_T is denoted by x̂_T(l). This forecast minimizes the mean square error E[(x_{T+l} - x̂_T(l))² | P_T].

Three different forms of the forecast: the random shock form, the inverted form, and the difference equation form.

Random shock form of the forecast. Recall x_t = m(t) + u_t + ψ1u_{t-1} + ψ2u_{t-2} + ψ3u_{t-3} + ..., so x_{T+l} = m(T + l) + u_{T+l} + ψ1u_{T+l-1} + ψ2u_{T+l-2} + ψ3u_{T+l-3} + ... Taking expectations of both sides given P_T, and using E[u_{T+j} | P_T] = 0 for j > 0 while u_{T+j} is known for j ≤ 0, gives x̂_T(l) = m(T + l) + ψ_l u_T + ψ_{l+1}u_{T-1} + ...

To compute this forecast we need to compute {…, u_{T-2}, u_{T-1}, u_T} from {…, x_{T-2}, x_{T-1}, x_T}. Note: x_t = m(t) + u_t + ψ1u_{t-1} + ψ2u_{t-2} + ψ3u_{t-3} + ..., thus u_t = x_t - m(t) - ψ1u_{t-1} - ψ2u_{t-2} - ..., which can be calculated recursively.
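In practice the shocks are recovered recursively from the difference equation rather than the infinite ψ expansion. A sketch for the ARIMA(1,1,1) example x_t = 1.8x_{t-1} - 0.8x_{t-2} + u_t + 0.6u_{t-1}, assuming pre-sample values are zero (a simplifying startup assumption, not from the slides):

```python
import random

# Recover the white-noise series {u_t} recursively from the observations,
# for x_t = 1.8 x_{t-1} - 0.8 x_{t-2} + u_t + 0.6 u_{t-1}.
# Pre-sample x and u values are taken as zero (startup assumption).

def simulate(u):
    """Generate x_t from a given shock series under the model above."""
    x = []
    for t, ut in enumerate(u):
        x1 = x[t - 1] if t >= 1 else 0.0
        x2 = x[t - 2] if t >= 2 else 0.0
        u1 = u[t - 1] if t >= 1 else 0.0
        x.append(1.8 * x1 - 0.8 * x2 + ut + 0.6 * u1)
    return x

def recover_u(x):
    """Invert the model: u_t = x_t - 1.8 x_{t-1} + 0.8 x_{t-2} - 0.6 u_{t-1}."""
    u = []
    for t, xt in enumerate(x):
        x1 = x[t - 1] if t >= 1 else 0.0
        x2 = x[t - 2] if t >= 2 else 0.0
        u1 = u[t - 1] if t >= 1 else 0.0
        u.append(xt - 1.8 * x1 + 0.8 * x2 - 0.6 * u1)
    return u

random.seed(0)
u_true = [random.gauss(0, 1.6) for _ in range(50)]
x = simulate(u_true)
u_hat = recover_u(x)
print(max(abs(a - b) for a, b in zip(u_true, u_hat)))  # tiny (round-off only)
```

Because the recursion is the exact algebraic inverse of the model equation, the recovered shocks match the true ones to round-off.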

The error in the forecast: e_T(l) = x_{T+l} - x̂_T(l) = u_{T+l} + ψ1u_{T+l-1} + ... + ψ_{l-1}u_{T+1}. Hence the mean square error of the forecast is MSE(l) = E[e_T(l)²] = σ²(1 + ψ1² + ... + ψ_{l-1}²).

Prediction limits for forecasts: the (1 - a)100% prediction limits for x_{T+l} are x̂_T(l) ± z_{a/2} σ √(1 + ψ1² + ... + ψ_{l-1}²).
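These limits only require the ψ weights and σ. A sketch for the ARIMA(1,1,1) example, borrowing σ² = 2.56 from the numerical example later in the slides and z = 1.96 for 95% limits:

```python
# Half-widths of 95% prediction intervals for the ARIMA(1,1,1) example
# (1 - 1.8B + 0.8B^2) x_t = (1 + 0.6B) u_t, using
#   MSE(l) = sigma^2 * (1 + psi_1^2 + ... + psi_{l-1}^2).
# sigma^2 = 2.56 is borrowed from the later slide example for illustration.

def psi_weights(n):
    psi = [1.0, 2.4]
    for _ in range(2, n + 1):
        psi.append(1.8 * psi[-1] - 0.8 * psi[-2])
    return psi

def forecast_se(lead, sigma2=2.56):
    """Standard error of the lead-step forecast."""
    psi = psi_weights(lead - 1)
    return (sigma2 * sum(w * w for w in psi[:lead])) ** 0.5

limits = [1.96 * forecast_se(l) for l in (1, 2, 3)]
print(limits)  # widens with lead time; the first is 1.96 * 1.6
```

For a non-stationary series the ψ weights do not die out, so the interval width keeps growing with the lead time.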

The Inverted Form: π(B)x_t = δ* + u_t, or x_t = π1x_{t-1} + π2x_{t-2} + π3x_{t-3} + ... + δ* + u_t, where π(B) = [α(B)]^{-1}φ(B) = [α(B)]^{-1}[β(B)∇^d] = I - π1B - π2B² - π3B³ - ...

The inverted form of the forecast. Note: x_t = π1x_{t-1} + π2x_{t-2} + ... + δ* + u_t, and for t = T + l: x_{T+l} = π1x_{T+l-1} + π2x_{T+l-2} + ... + δ* + u_{T+l}. Taking conditional expectations given P_T (with E[u_{T+l} | P_T] = 0 and x̂_T(j) = x_{T+j} for j ≤ 0) gives x̂_T(l) = π1x̂_T(l-1) + π2x̂_T(l-2) + ... + δ*.

The difference equation form of the forecast: x_{T+l} = φ1x_{T+l-1} + φ2x_{T+l-2} + ... + φ_{p+d}x_{T+l-p-d} + δ + u_{T+l} + α1u_{T+l-1} + α2u_{T+l-2} + ... + αq u_{T+l-q}. Taking conditional expectations given P_T replaces each x_{T+j} by x̂_T(j) (or by its observed value when j ≤ 0) and each future shock by zero.

Example: ARIMA(1,1,2). The model: x_t - x_{t-1} = β1(x_{t-1} - x_{t-2}) + u_t + α1u_{t-1} + α2u_{t-2}, or x_t = (1 + β1)x_{t-1} - β1x_{t-2} + u_t + α1u_{t-1} + α2u_{t-2}, i.e. φ(B)x_t = β(B)(I - B)x_t = α(B)u_t, where φ(x) = 1 - (1 + β1)x + β1x² = (1 - β1x)(1 - x) and α(x) = 1 + α1x + α2x².

The random shock form of the model: x_t = ψ(B)u_t, where ψ(B) = [β(B)(I - B)]^{-1}α(B) = [φ(B)]^{-1}α(B), i.e. ψ(B)φ(B) = α(B). Thus (I + ψ1B + ψ2B² + ψ3B³ + ψ4B⁴ + ...)(I - (1 + β1)B + β1B²) = I + α1B + α2B². Hence α1 = ψ1 - (1 + β1), or ψ1 = 1 + α1 + β1; α2 = ψ2 - ψ1(1 + β1) + β1, or ψ2 = ψ1(1 + β1) - β1 + α2; and 0 = ψ_h - ψ_{h-1}(1 + β1) + ψ_{h-2}β1, or ψ_h = ψ_{h-1}(1 + β1) - ψ_{h-2}β1 for h ≥ 3.

The inverted form of the model: π(B)x_t = u_t, where π(B) = [α(B)]^{-1}β(B)(I - B) = [α(B)]^{-1}φ(B), i.e. π(B)α(B) = φ(B). Thus (I - π1B - π2B² - π3B³ - π4B⁴ - ...)(I + α1B + α2B²) = I - (1 + β1)B + β1B². Hence -(1 + β1) = α1 - π1, or π1 = 1 + α1 + β1; β1 = -π2 - π1α1 + α2, or π2 = -π1α1 - β1 + α2; and 0 = -π_h - π_{h-1}α1 - π_{h-2}α2, or π_h = -(π_{h-1}α1 + π_{h-2}α2) for h ≥ 3.

Now suppose that β1 = 0.80, α1 = 0.60 and α2 = 0.40. Then the random shock form coefficients and the inverted form coefficients can easily be computed from the recursions above.
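The table of coefficients from the original slide can be reproduced by coding the two recursions directly with these parameter values:

```python
# Random-shock (psi) and inverted (pi) weights for the ARIMA(1,1,2) example
# with beta1 = 0.80, alpha1 = 0.60, alpha2 = 0.40, using the recursions:
#   psi_1 = 1 + a1 + b1, psi_2 = psi_1*(1+b1) - b1 + a2,
#   psi_h = psi_{h-1}*(1+b1) - psi_{h-2}*b1           (h >= 3)
#   pi_1  = 1 + a1 + b1, pi_2  = -pi_1*a1 - b1 + a2,
#   pi_h  = -(pi_{h-1}*a1 + pi_{h-2}*a2)              (h >= 3)

b1, a1, a2 = 0.80, 0.60, 0.40

def psi_weights(n):
    """Return [psi_0, ..., psi_n] for n >= 2."""
    psi = [1.0, 1 + a1 + b1]
    psi.append(psi[1] * (1 + b1) - b1 + a2)
    for _ in range(3, n + 1):
        psi.append(psi[-1] * (1 + b1) - psi[-2] * b1)
    return psi

def pi_weights(n):
    """Return [pi_1, ..., pi_n] for n >= 2."""
    pi = [1 + a1 + b1]
    pi.append(-pi[0] * a1 - b1 + a2)
    for _ in range(3, n + 1):
        pi.append(-(pi[-1] * a1 + pi[-2] * a2))
    return pi

for h, (ps, pw) in enumerate(zip(psi_weights(6)[1:], pi_weights(6)), start=1):
    print(h, round(ps, 4), round(pw, 4))
```

The first rows are ψ1 = π1 = 2.4, then ψ2 = 3.92 while π2 = -1.84, after which the two sequences behave very differently: ψ_h approaches a nonzero limit (the series is non-stationary) while π_h decays.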

The Forecast Equations

The Difference Form of the Forecast Equation. With these parameter values the model is x_t = 1.8x_{t-1} - 0.8x_{t-2} + u_t + 0.6u_{t-1} + 0.4u_{t-2}, so x̂_T(1) = 1.8x_T - 0.8x_{T-1} + 0.6u_T + 0.4u_{T-1}, x̂_T(2) = 1.8x̂_T(1) - 0.8x_T + 0.4u_T, and x̂_T(l) = 1.8x̂_T(l-1) - 0.8x̂_T(l-2) for l ≥ 3.
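The forecast recursion can be sketched by extending the series with forecasts and zeroed future shocks. The observation and shock values below are made up for illustration; in practice the shocks would come from the recursive white-noise computation shown earlier:

```python
# Difference-equation forecasts for the ARIMA(1,1,2) example
# x_t = 1.8 x_{t-1} - 0.8 x_{t-2} + u_t + 0.6 u_{t-1} + 0.4 u_{t-2}.
# Future shocks are replaced by 0; past shocks by their computed values.

def forecast(x, u, n_ahead):
    """x: observed series; u: aligned shock series; returns n_ahead forecasts."""
    xs = list(x)
    us = list(u) + [0.0] * n_ahead   # future shocks forecast as zero
    n = len(x)
    for t in range(n, n + n_ahead):
        xs.append(1.8 * xs[t - 1] - 0.8 * xs[t - 2]
                  + us[t] + 0.6 * us[t - 1] + 0.4 * us[t - 2])
    return xs[n:]

x = [10.0, 11.0, 12.5]   # hypothetical observations, not from the slides
u = [0.0, 0.5, -0.2]     # hypothetical recovered shocks
print(forecast(x, u, 3))
```

Notice that the shock terms only enter the first two steps; from lead 3 onward the forecast is driven purely by the generalized autoregressive recursion.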

Computation of the Random Shock Series and One-step Forecasts

Computation of the Mean Square Error of the Forecasts and Prediction Limits

Table: MSE of Forecasts to lead time l = 12 (s2 = 2.56)

Raw Observations, One-step-Ahead Forecasts, Estimated Errors

Forecasts with 95% and 66.7% Prediction Limits

Graph: Forecasts with 95% and 66.7% Prediction Limits

Next topic: Model Building