Autoregressive Integrated Moving Average (ARIMA) models

- Forecasting techniques based on exponential smoothing: the general assumption for these models is that the time series data are represented as the sum of two distinct components (deterministic & random).
- Random noise: generated through independent shocks to the process.
- In practice, however, successive observations show serial dependence.

ARIMA Models
- ARIMA modeling is also known as the Box-Jenkins methodology.
- Very popular: suitable for almost all time series & often generates more accurate forecasts than other methods.
- Limitations: if there is not enough data, ARIMA models may not forecast better than the decomposition or exponential smoothing techniques. Recommended number of observations: at least 30-50.
- Weak stationarity is required.
- Equally spaced observation intervals are required.

Linear Models for Time Series

Linear Filter
- A linear filter is a process that converts the input $x_t$ into the output $y_t$:
$$y_t = \sum_{i=-\infty}^{\infty} \psi_i x_{t-i}$$
- The conversion involves past, current, and future values of the input in the form of a summation with different weights $\psi_i$.
- Time invariant: the weights $\psi_i$ do not depend on time.
- Physically realizable: the output is a linear function of the current and past values of the input, $y_t = \sum_{i=0}^{\infty} \psi_i x_{t-i}$.
- Stable if $\sum_{i=-\infty}^{\infty} |\psi_i| < \infty$.
- In linear filters, stationarity of the input time series is also reflected in the output.
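
For illustration, a minimal numpy sketch of a causal (physically realizable), stable linear filter; the geometric weights are a hypothetical choice:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)         # stationary input x_t (white noise)

# Causal, stable filter: y_t = sum_{i>=0} psi_i * x_{t-i}, with sum |psi_i| < inf
psi = 0.7 ** np.arange(20)           # geometrically decaying weights (hypothetical)
y = np.convolve(x, psi)[: len(x)]    # truncated 'full' convolution aligns y_t with x_t

# For white-noise input, Var(y_t) should approach sigma^2 * sum(psi_i^2)
print(np.var(y), (psi ** 2).sum())
```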

Stationarity

Weak stationarity: constant mean, constant (finite) variance, & an autocovariance function that depends only on the lag $k$. A time series that fulfills these conditions tends to return to its mean and fluctuate around this mean with constant variance.
Note: strict stationarity requires, in addition to the conditions of weak stationarity, that the time series fulfill further conditions about its distribution, including skewness, kurtosis, etc.
Determining stationarity:
- Take snapshots of the process at different time points & observe its behavior: if similar over time, then the time series is stationary.
- A strong & slowly dying ACF suggests deviations from stationarity.
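
A minimal sketch of the ACF-based check, assuming the statsmodels and matplotlib libraries; the two simulated series are illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

rng = np.random.default_rng(1)
eps = rng.normal(size=300)

stationary = eps                   # white noise: stationary
random_walk = np.cumsum(eps)       # cumulative sum of shocks: nonstationary

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(stationary, ax=axes[0], lags=30, title="white noise: ACF dies quickly")
plot_acf(random_walk, ax=axes[1], lags=30, title="random walk: strong, slowly dying ACF")
plt.show()
```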

Infinite Moving Average
Input $x_t$ stationary; output $y_t = \sum_i \psi_i x_{t-i}$ stationary, with $\mu_y = \mu_x \sum_i \psi_i$ & $\gamma_y(k) = \sum_i \sum_j \psi_i \psi_j \gamma_x(k + i - j)$.
THEN, the linear process with white noise time series $\varepsilon_t$,
$$y_t = \mu + \sum_{i=0}^{\infty} \psi_i \varepsilon_{t-i},$$
is stationary; the $\varepsilon_t$ are independent random shocks, with $E(\varepsilon_t) = 0$ & $\operatorname{Var}(\varepsilon_t) = \sigma^2$.

Autocovariance function of the linear process (infinite moving average) with white noise input:
$$\gamma_y(k) = \sigma^2 \sum_{i=0}^{\infty} \psi_i \psi_{i+k}$$

The infinite moving average serves as a general class of models for any stationary time series.
THEOREM (Wold, 1938): Any nondeterministic weakly stationary time series $y_t$ can be represented as
$$y_t = \mu + \sum_{i=0}^{\infty} \psi_i \varepsilon_{t-i}, \qquad \text{where } \psi_0 = 1 \text{ and } \sum_{i=0}^{\infty} \psi_i^2 < \infty$$
INTERPRETATION: A stationary time series can be seen as the weighted sum of the present and past random disturbances.

Infinite moving average: impractical, since infinitely many weights must be estimated. Useless in practice except for special cases:
i. Finite order moving average (MA) models: weights set to 0, except for a finite number of weights.
ii. Finite order autoregressive (AR) models: weights generated using only a finite number of parameters.
iii. A mixture of finite order autoregressive & moving average models (ARMA).

Finite Order Moving Average (MA) Process
Moving average process of order q, MA(q):
$$y_t = \mu + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} - \cdots - \theta_q \varepsilon_{t-q}$$
MA(q): always stationary, regardless of the values of the weights.

$\varepsilon_t$ white noise.
Expected value of MA(q): $E(y_t) = \mu$
Variance of MA(q): $\gamma(0) = \sigma^2\left(1 + \theta_1^2 + \cdots + \theta_q^2\right)$
Autocovariance of MA(q):
$$\gamma(k) = \begin{cases} \sigma^2\left(-\theta_k + \theta_1\theta_{k+1} + \cdots + \theta_{q-k}\theta_q\right) & k = 1, 2, \dots, q \\ 0 & k > q \end{cases}$$
Autocorrelation of MA(q): $\rho(k) = \gamma(k)/\gamma(0)$

ACF: helps identify the MA model & its appropriate order, as it cuts off after lag q.
Real applications: the sample ACF $r(k)$ is not always zero after lag q, but it becomes very small in absolute value after lag q.
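
A small simulation sketch of this cut-off behavior, assuming statsmodels for the sample ACF; the MA(2) weights are hypothetical:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(2)
n, theta1, theta2 = 5000, 0.8, -0.5      # hypothetical MA(2) weights
eps = rng.normal(size=n + 2)

# MA(2) in the convention above: y_t = eps_t - theta1*eps_{t-1} - theta2*eps_{t-2}
y = eps[2:] - theta1 * eps[1:-1] - theta2 * eps[:-2]

print(np.round(acf(y, nlags=6), 3))      # r(1), r(2) clearly nonzero; r(k) ~ 0 for k > 2
```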

First Order Moving Average Process, MA(1)
q = 1: $y_t = \mu + \varepsilon_t - \theta_1 \varepsilon_{t-1}$
Autocovariance of MA(1): $\gamma(0) = \sigma^2(1 + \theta_1^2)$, $\gamma(1) = -\theta_1\sigma^2$, $\gamma(k) = 0$ for $k > 1$
Autocorrelation of MA(1): $\rho(1) = \dfrac{-\theta_1}{1 + \theta_1^2}$, $\rho(k) = 0$ for $k > 1$

Mean & variance: stable.
Positive autocorrelation: short runs where successive observations tend to follow each other.
Negative autocorrelation: successive observations oscillate.

Second Order Moving Average, MA(2) Process
$y_t = \mu + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2}$
Autocovariance of MA(2): $\gamma(0) = \sigma^2(1 + \theta_1^2 + \theta_2^2)$, $\gamma(1) = \sigma^2(-\theta_1 + \theta_1\theta_2)$, $\gamma(2) = -\theta_2\sigma^2$, $\gamma(k) = 0$ for $k > 2$
Autocorrelation of MA(2): $\rho(k) = \gamma(k)/\gamma(0)$, cutting off after lag 2

The sample ACF cuts off after lag 2

Finite Order Autoregressive Process
Wold's theorem: infinite number of weights, not helpful in modeling & forecasting.
Finite order MA process: estimate a finite number of weights & set the others equal to zero, so the oldest disturbances become obsolete for the next observation; only a finite number of disturbances contribute to the current value of the time series.
To take into account all the disturbances of the past, use autoregressive models: they estimate infinitely many weights that follow a distinct pattern, with only a small number of parameters.

First Order Autoregressive Process, AR(1)
Assume: the contributions of the disturbances that are far in the past are small compared to the more recent disturbances that the process has experienced.
Reflect the diminishing magnitudes of the contributions of past disturbances through a set of infinitely many weights of descending magnitude, such as the exponential decay pattern $\psi_i = \phi^i$, $|\phi| < 1$.
The weights on the disturbances, starting from the current disturbance and going back into the past: $1, \phi, \phi^2, \phi^3, \dots$

First order autoregressive process, AR(1):
$$y_t = \delta + \phi y_{t-1} + \varepsilon_t, \qquad \text{where } \delta = \mu(1 - \phi)$$
WHY AUTOREGRESSIVE? The current value $y_t$ is regressed on the previous value $y_{t-1}$ of the series itself.
AR(1) stationary if $|\phi| < 1$.

Mean of AR(1): $\mu = E(y_t) = \dfrac{\delta}{1 - \phi}$
Autocovariance function of AR(1): $\gamma(k) = \sigma^2\dfrac{\phi^k}{1 - \phi^2}$, $k = 0, 1, 2, \dots$
Autocorrelation function of AR(1): $\rho(k) = \gamma(k)/\gamma(0) = \phi^k$
The ACF for a stationary AR(1) process has an exponential decay form.
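
A brief sketch comparing the sample ACF of a simulated AR(1) with the theoretical decay $\rho(k) = \phi^k$, assuming statsmodels; $\phi = 0.8$ is a hypothetical choice:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
phi, n = 0.8, 5000                       # hypothetical AR(1) coefficient, |phi| < 1

y = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]       # AR(1): y_t = phi*y_{t-1} + eps_t (delta = 0)

print(np.round(acf(y, nlags=5), 3))      # sample ACF
print(np.round(phi ** np.arange(6), 3))  # theoretical rho(k) = phi^k
```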

Observe: the observations exhibit up/down movements.

Second Order Autoregressive Process, AR(2):
$$y_t = \delta + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \varepsilon_t$$
This model can be represented in the infinite MA form, which provides the conditions of stationarity for $y_t$ in terms of $\phi_1$ & $\phi_2$. WHY?
1. Infinite MA: apply the backshift operator $B$, so that $\Phi(B)y_t = \delta + \varepsilon_t$ with $\Phi(B) = 1 - \phi_1 B - \phi_2 B^2$, and invert:
$$y_t = \Phi(B)^{-1}(\delta + \varepsilon_t) = \mu + \Psi(B)\varepsilon_t = \mu + \sum_{i=0}^{\infty} \psi_i \varepsilon_{t-i}$$

where $\mu = \dfrac{\delta}{1 - \phi_1 - \phi_2}$ and $\Psi(B) = \sum_{i=0}^{\infty} \psi_i B^i$.

Calculate the weights $\psi_i$: we need $\Phi(B)\Psi(B) = 1$, i.e.
$$(1 - \phi_1 B - \phi_2 B^2)(\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots) = 1$$
Matching coefficients of the powers of $B$ gives $\psi_0 = 1$, $\psi_1 = \phi_1$, and $\psi_j = \phi_1\psi_{j-1} + \phi_2\psi_{j-2}$ for $j \ge 2$.

Solutions: the $\psi_j$ satisfy the second-order linear difference equation $\psi_j - \phi_1\psi_{j-1} - \phi_2\psi_{j-2} = 0$.
The solution: in terms of the 2 roots $m_1$ and $m_2$ of $m^2 - \phi_1 m - \phi_2 = 0$, i.e.
$$m_{1,2} = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2}$$
AR(2) stationary: $|m_1| < 1$ & $|m_2| < 1$.
Condition of stationarity for complex conjugate roots $a \pm ib$: $\sqrt{a^2 + b^2} < 1$.
AR(2) then has an infinite MA representation with absolutely summable weights.
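
The root check can be sketched numerically with numpy; the coefficient pairs below are hypothetical, one stationary and one not:

```python
import numpy as np

def ar2_roots(phi1, phi2):
    """Roots of the associated polynomial m^2 - phi1*m - phi2 = 0."""
    return np.roots([1.0, -phi1, -phi2])

# Stationary iff both roots lie strictly inside the unit circle
for phi1, phi2 in [(0.4, 0.3), (0.5, 0.6)]:
    m = ar2_roots(phi1, phi2)
    verdict = "stationary" if np.all(np.abs(m) < 1) else "nonstationary"
    print(phi1, phi2, np.round(m, 3), verdict)
```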

Mean of AR(2): $\mu = E(y_t) = \dfrac{\delta}{1 - \phi_1 - \phi_2}$
Autocovariance function:
For $k = 0$: $\gamma(0) = \phi_1\gamma(1) + \phi_2\gamma(2) + \sigma^2$
For $k > 0$: $\gamma(k) = \phi_1\gamma(k-1) + \phi_2\gamma(k-2)$ (the Yule-Walker equations)

Autocorrelation function: dividing by $\gamma(0)$ gives $\rho(k) = \phi_1\rho(k-1) + \phi_2\rho(k-2)$, $k > 0$.
Solutions:
A. Solve the Yule-Walker equations recursively, starting from $\rho(0) = 1$ and $\rho(1) = \phi_1/(1 - \phi_2)$.
B. General solution: obtain it through the roots $m_1$ & $m_2$ associated with the polynomial $m^2 - \phi_1 m - \phi_2 = 0$.

Case I: $m_1$, $m_2$ distinct real roots
$$\rho(k) = c_1 m_1^k + c_2 m_2^k$$
$c_1$, $c_2$ constants: can be obtained from $\rho(0)$, $\rho(1)$
Stationarity: $|m_1| < 1$ & $|m_2| < 1$
ACF form: a mixture of 2 exponential decay terms, e.g. in an AR(2) model. Such a model can be seen as an adjusted AR(1) model, for which a single exponential decay expression as in the AR(1) is not enough to describe the pattern in the ACF; an additional decay expression is added by introducing the second lag term $y_{t-2}$.

Case II: $m_1$, $m_2$ complex conjugates in the form $a \pm ib$
$$\rho(k) = R^k\left(c_1\cos(\lambda k) + c_2\sin(\lambda k)\right)$$
$c_1$, $c_2$: particular constants
ACF form: damped sinusoid; damping factor $R = \sqrt{a^2 + b^2}$; frequency $\lambda = \arctan(b/a)$; period $2\pi/\lambda$

Case III: one real root $m_0$ of multiplicity 2; $m_1 = m_2 = m_0$
$$\rho(k) = (c_1 + c_2 k)\, m_0^k$$
ACF form: exponential decay pattern

AR(2) process: $y_t = 4 + 0.4y_{t-1} + 0.5y_{t-2} + \varepsilon_t$
Roots of the polynomial $m^2 - 0.4m - 0.5 = 0$: real ($m_1 \approx 0.93$, $m_2 \approx -0.53$)
ACF form: mixture of 2 exponential decay terms

AR(2) process: $y_t = 4 + 0.8y_{t-1} - 0.5y_{t-2} + \varepsilon_t$
Roots of the polynomial $m^2 - 0.8m + 0.5 = 0$: complex conjugates ($m_{1,2} = 0.4 \pm 0.58i$, $R = \sqrt{0.5} \approx 0.71$)
ACF form: damped sinusoid behavior
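
A sketch simulating both example processes and printing their sample ACFs, assuming statsmodels; sample size and burn-in are arbitrary choices:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(4)

def simulate_ar2(delta, phi1, phi2, n=5000, burn=500):
    """Simulate y_t = delta + phi1*y_{t-1} + phi2*y_{t-2} + eps_t."""
    y = np.zeros(n + burn)
    eps = rng.normal(size=n + burn)
    for t in range(2, n + burn):
        y[t] = delta + phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]
    return y[burn:]                    # drop burn-in so start-up effects fade

y1 = simulate_ar2(4, 0.4, 0.5)     # real roots: mixture of two exponential decays
y2 = simulate_ar2(4, 0.8, -0.5)    # complex roots: damped sinusoid ACF
print(np.round(acf(y1, nlags=10), 2))
print(np.round(acf(y2, nlags=10), 2))
```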

General Autoregressive Process, AR(p)
Consider a pth order AR model:
$$y_t = \delta + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \varepsilon_t$$
or, in backshift notation, $\Phi(B)y_t = \delta + \varepsilon_t$ with $\Phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.

AR(p) stationary: if the roots of the polynomial $m^p - \phi_1 m^{p-1} - \cdots - \phi_p = 0$ are less than 1 in absolute value.
AR(p) then has an absolutely summable infinite MA representation: under the previous condition,
$$y_t = \mu + \sum_{i=0}^{\infty} \psi_i \varepsilon_{t-i}, \qquad \sum_{i=0}^{\infty} |\psi_i| < \infty$$

Weights of the random shocks: obtained from $\Psi(B) = \Phi(B)^{-1}$, i.e. by solving $\Phi(B)\Psi(B) = 1$ as in the AR(2) case.

For stationary AR(p):
Mean: $\mu = \dfrac{\delta}{1 - \phi_1 - \cdots - \phi_p}$
Autocovariance: $\gamma(k) = \phi_1\gamma(k-1) + \cdots + \phi_p\gamma(k-p)$ for $k > 0$; $\gamma(0) = \phi_1\gamma(1) + \cdots + \phi_p\gamma(p) + \sigma^2$

ACF: a pth order linear difference equation. The AR(p) ACF:
- satisfies the Yule-Walker equations $\rho(k) = \phi_1\rho(k-1) + \cdots + \phi_p\rho(k-p)$
- can be found from the p roots of the associated polynomial, e.g. for distinct & real roots $\rho(k) = c_1 m_1^k + c_2 m_2^k + \cdots + c_p m_p^k$
- in general the roots will not all be real, so the ACF is a mixture of exponential decay and damped sinusoid expressions

ACF
MA(q) process: a useful tool for identifying the order of the process, since it cuts off after lag q.
AR(p) process: a mixture of exponential decay & damped sinusoid expressions; it fails to provide information about the order of the AR.

Partial Autocorrelation Function
Consider three random variables X, Y, Z & the simple regressions of X on Z and of Y on Z:
$$\hat{X} = a_1 + b_1 Z, \qquad \hat{Y} = a_2 + b_2 Z$$
The errors are obtained from $X^* = X - \hat{X}$ & $Y^* = Y - \hat{Y}$.

Partial correlation between X & Y after adjusting for Z: the correlation between $X^*$ & $Y^*$. Partial correlation can be seen as the correlation between two variables after they have been adjusted for a common factor that affects them.

Partial autocorrelation function (PACF) between $y_t$ & $y_{t-k}$: the autocorrelation between $y_t$ & $y_{t-k}$ after adjusting for $y_{t-1}, y_{t-2}, \dots, y_{t-k+1}$.
AR(p) process: the PACF between $y_t$ & $y_{t-k}$ for $k > p$ should equal zero.
Consider a stationary time series $y_t$, not necessarily an AR process. For any fixed value of k, write the Yule-Walker equations for the ACF as if the process were an AR(k):
$$\rho(j) = \phi_{k1}\rho(j-1) + \phi_{k2}\rho(j-2) + \cdots + \phi_{kk}\rho(j-k), \qquad j = 1, 2, \dots, k$$

Matrix notation: $\mathbf{P}_k\boldsymbol{\phi}_k = \boldsymbol{\rho}_k$, where $\mathbf{P}_k$ is the $k \times k$ matrix with entries $\rho(|i-j|)$, $\boldsymbol{\phi}_k = (\phi_{k1}, \dots, \phi_{kk})'$ and $\boldsymbol{\rho}_k = (\rho(1), \dots, \rho(k))'$.
Solutions: $\boldsymbol{\phi}_k = \mathbf{P}_k^{-1}\boldsymbol{\rho}_k$.
For any given k, k = 1, 2, …, the last coefficient $\phi_{kk}$ is called the partial autocorrelation coefficient of the process at lag k.
AR(p) process: $\phi_{kk} = 0$ for $k > p$. Identify the order of an AR process by using the PACF.
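
A short sketch of order identification via the PACF, assuming statsmodels; the AR(2) coefficients reuse the complex-root example from above:

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(5)
n, phi1, phi2 = 5000, 0.8, -0.5
y = np.zeros(n)
eps = rng.normal(size=n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + eps[t]   # AR(2), delta = 0

# For an AR(2) process, phi_kk should cut off after lag 2
print(np.round(pacf(y, nlags=5), 3))
```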

[Figure: sample ACF & PACF plots for simulated MA(1), MA(2), AR(1), and AR(2) series]
Process | ACF                    | PACF
MA(1)   | cuts off after 1st lag | decay pattern
MA(2)   | cuts off after 2nd lag | decay pattern
AR(1)   | decay pattern          | cuts off after 1st lag
AR(2)   | decay pattern          | cuts off after 2nd lag

Invertibility of MA Models
Invertible moving average process: the MA(q) process is invertible if it has an absolutely summable infinite AR representation.
It can be shown that the infinite AR representation for MA(q) is
$$\sum_{i=0}^{\infty} \pi_i y_{t-i} = c + \varepsilon_t, \qquad \text{with } \sum_{i=0}^{\infty} |\pi_i| < \infty$$

Obtain the weights $\pi_i$: we need $\Pi(B)\Theta(B) = 1$, where $\Theta(B) = 1 - \theta_1 B - \cdots - \theta_q B^q$ and $\Pi(B) = \sum_{i=0}^{\infty} \pi_i B^i$.
Condition of invertibility: the roots of the associated polynomial $m^q - \theta_1 m^{q-1} - \cdots - \theta_q = 0$ must be less than 1 in absolute value.
An invertible MA(q) process can then be written as an infinite AR process.

The PACF of an MA(q) process possibly never cuts off: it is a mixture of exponential decay & damped sinusoid expressions.
In model identification, use both the sample ACF & the sample PACF.

Mixed Autoregressive-Moving Average (ARMA) Process
ARMA(p,q) model:
$$y_t = \delta + \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t - \theta_1\varepsilon_{t-1} - \cdots - \theta_q\varepsilon_{t-q}$$
Adjust the exponential decay pattern of the AR part by adding a few moving average terms.

Stationarity of the ARMA(p,q) process: related to the AR component.
ARMA(p,q) stationary if the roots of the polynomial $m^p - \phi_1 m^{p-1} - \cdots - \phi_p = 0$ are less than one in absolute value.
A stationary ARMA(p,q) has an infinite MA representation, $y_t = \mu + \Psi(B)\varepsilon_t$ with $\Psi(B) = \Theta(B)/\Phi(B)$.

Invertibility of the ARMA(p,q) process: related to the MA component.
Check through the roots of the polynomial $m^q - \theta_1 m^{q-1} - \cdots - \theta_q = 0$: if the roots are less than 1 in absolute value, then ARMA(p,q) is invertible & has an infinite AR representation, $\Pi(B)y_t = c + \varepsilon_t$.
Coefficients: $\Pi(B) = \Phi(B)/\Theta(B)$.

ARMA(1,1): $y_t = \delta + \phi y_{t-1} + \varepsilon_t - \theta\varepsilon_{t-1}$
Sample ACF & PACF: both exhibit exponential decay behavior.
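
A sketch of simulating and fitting an ARMA(1,1), assuming statsmodels; note that statsmodels writes both lag polynomials with plus signs, so its reported MA coefficient has the opposite sign of the $\theta$ convention used above:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(6)
# ARMA(1,1) with phi = 0.6; ma=[1, 0.4] means y_t = 0.6*y_{t-1} + eps_t + 0.4*eps_{t-1}
y = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4]).generate_sample(nsample=2000)

fit = ARIMA(y, order=(1, 0, 1)).fit()
print(fit.params)   # AR and MA estimates should land near 0.6 and 0.4
```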

Nonstationary Processes
No constant level, but homogeneous behavior over time. $y_t$ is homogeneous nonstationary if:
- it is not stationary, and
- its first difference, $w_t = y_t - y_{t-1} = (1 - B)y_t$, or a higher order difference, $w_t = (1 - B)^d y_t$, produces a stationary time series.
$y_t$ is an autoregressive integrated moving average process of order (p, d, q), ARIMA(p,d,q), if its dth difference, $w_t = (1 - B)^d y_t$, produces a stationary ARMA(p,q) process:
$$\Phi(B)(1 - B)^d y_t = \delta + \Theta(B)\varepsilon_t$$

The Random Walk Process, ARIMA(0,1,0)
$y_t = y_{t-1} + \varepsilon_t$, i.e. $(1 - B)y_t = \varepsilon_t$: the simplest nonstationary model.
First differencing eliminates the serial dependence & yields a white noise process.

Example: $y_t = 20 + y_{t-1} + \varepsilon_t$
Evidence of a nonstationary process:
- Sample ACF: dies out slowly
- Sample PACF: significant at the first lag only, with a lag-1 value close to 1
First difference $w_t = y_t - y_{t-1}$:
- Time series plot of $w_t$: stationary
- Sample ACF & PACF: do not show any significant values
Use ARIMA(0,1,0).
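
A small simulation sketch of this example, assuming statsmodels for the sample ACF:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(7)
eps = rng.normal(size=1000)

y = np.cumsum(20 + eps)     # random walk with drift: y_t = 20 + y_{t-1} + eps_t
w = np.diff(y)              # first difference: w_t = 20 + eps_t, white noise around 20

print(np.round(acf(y, nlags=5), 2))   # dies out very slowly (values near 1)
print(np.round(acf(w, nlags=5), 2))   # no significant autocorrelation
```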

The ARIMA(0,1,1) Process
$$y_t = y_{t-1} + \varepsilon_t - \theta\varepsilon_{t-1}$$
Infinite AR representation, derived from the invertibility of the MA(1) component:
$$y_t = (1 - \theta)\sum_{i=1}^{\infty} \theta^{i-1} y_{t-i} + \varepsilon_t$$
ARIMA(0,1,1) (= IMA(1,1)) can thus be expressed as an exponentially weighted moving average (EWMA) of all past values.

ARIMA(0,1,1) example:
- The mean of the process is moving upwards in time.
- Sample ACF: dies out relatively slowly.
- Sample PACF: 2 significant values, at lags 1 & 2.
- Possible model: AR(2); checking the roots of the fitted polynomial points to nonstationarity, so difference instead.
- The first difference looks stationary.
- Sample ACF & PACF of the first difference: an MA(1) model would be appropriate, since the ACF cuts off after the first lag & the PACF shows a decay pattern.
- Use ARIMA(0,1,1).
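
A closing sketch: simulate a series whose first difference is MA(1) and fit an ARIMA(0,1,1), assuming statsmodels; $\theta = 0.6$ is a hypothetical value (statsmodels reports it with the opposite sign, about $-0.6$):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(8)
theta = 0.6
eps = rng.normal(size=1001)

w = eps[1:] - theta * eps[:-1]   # first difference follows an MA(1)
y = np.cumsum(w)                 # integrate once -> ARIMA(0,1,1) series

fit = ARIMA(y, order=(0, 1, 1)).fit()
print(fit.params)                # MA(1) estimate near -theta in statsmodels' convention
```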