STAT 497 LECTURE NOTES 3 STATIONARY TIME SERIES PROCESSES (ARMA PROCESSES OR BOX-JENKINS PROCESSES)

AUTOREGRESSIVE PROCESSES AR(p) PROCESS: Y_t = φ_1 Y_{t−1} + φ_2 Y_{t−2} + ⋯ + φ_p Y_{t−p} + a_t, or φ_p(B) Y_t = a_t, where φ_p(B) = 1 − φ_1 B − φ_2 B² − ⋯ − φ_p B^p and a_t ~ WN(0, σ_a²).

AR(p) PROCESS Because it is already written in inverted form, the process is always invertible. To be stationary, the roots of φ_p(B) = 0 must lie outside the unit circle. The AR process is useful for describing situations in which the present value of a time series depends on its preceding values plus a random shock.

AR(1) PROCESS Y_t = φ Y_{t−1} + a_t, where a_t ~ WN(0, σ_a²). Always invertible. To be stationary, the root of φ(B) = 1 − φB = 0 must lie outside the unit circle.

AR(1) PROCESS The root of 1 − φB = 0 is B = 1/φ, so |B| > 1 requires |φ| < 1. OR, using the characteristic equation, the root of m − φ = 0 must lie inside the unit circle: m = φ, so again |φ| < 1. Hence |φ| < 1 is the STATIONARITY CONDITION.
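As a quick numerical check of this condition, the root of 1 − φB = 0 can be computed directly (a minimal Python sketch; the value of φ is an arbitrary illustration, not from the notes):

```python
import numpy as np

phi = 0.6  # illustrative AR(1) coefficient

# Root of the AR polynomial 1 - phi*B = 0 (coefficients ordered from highest power).
root = np.roots([-phi, 1.0])[0]   # equals 1/phi
print(abs(root) > 1)              # True: root lies outside the unit circle
print(abs(phi) < 1)               # the equivalent stationarity condition
```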

AR(1) PROCESS This process is sometimes called a Markov process because the distribution of Y_t given Y_{t−1}, Y_{t−2}, … is exactly the same as the distribution of Y_t given Y_{t−1}.

AR(1) PROCESS PROCESS MEAN: for the zero-mean form Y_t = φ Y_{t−1} + a_t, E(Y_t) = 0; if a constant term δ is included, Y_t = δ + φ Y_{t−1} + a_t, then E(Y_t) = μ = δ / (1 − φ).

AR(1) PROCESS AUTOCOVARIANCE FUNCTION: γ_k = Cov(Y_t, Y_{t−k}) = φ γ_{k−1} = φ^k γ_0, k ≥ 1, where γ_0 = Var(Y_t) = σ_a² / (1 − φ²).

AR(1) PROCESS AUTOCORRELATION FUNCTION: ρ_k = γ_k / γ_0 = φ^k, k ≥ 0.

AR(1) PROCESS When |φ| < 1, the process is stationary and the ACF decays exponentially.

AR(1) PROCESS 0 < φ < 1 ⇒ all autocorrelations are positive. −1 < φ < 0 ⇒ the autocorrelations alternate in sign, beginning with a negative value.
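A short simulation illustrates both sign patterns through the sample autocorrelations (a sketch; the coefficient values, sample size, and seed are arbitrary choices):

```python
import numpy as np

def simulate_ar1(phi, n, sigma=1.0, seed=0):
    """Simulate Y_t = phi*Y_{t-1} + a_t with a_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, sigma, n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + a[t]
    return y

def sample_acf(y, nlags):
    """Sample autocorrelations r_1, ..., r_nlags."""
    y = y - y.mean()
    c0 = np.dot(y, y) / len(y)
    return np.array([np.dot(y[k:], y[:-k]) / len(y) / c0 for k in range(1, nlags + 1)])

print(sample_acf(simulate_ar1(0.7, 5000), 5))    # all positive, decaying roughly like 0.7**k
print(sample_acf(simulate_ar1(-0.7, 5000), 5))   # alternating sign, starting negative
```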

AR(1) PROCESS RSF (Random Shock Form): using the geometric series, Y_t = (1 − φB)^{−1} a_t = Σ_{j=0}^∞ φ^j a_{t−j}.

AR(1) PROCESS RSF by the operator method: we know that (1 − φB)^{−1} = 1 + φB + φ²B² + ⋯ when |φ| < 1, so applying it to (1 − φB) Y_t = a_t gives Y_t = a_t + φ a_{t−1} + φ² a_{t−2} + ⋯.

AR(1) PROCESS RSF by recursion: substituting repeatedly, Y_t = φ Y_{t−1} + a_t = φ(φ Y_{t−2} + a_{t−1}) + a_t = ⋯ = Σ_{j=0}^∞ φ^j a_{t−j}.
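The recursive and random-shock forms can be checked against each other numerically; the sketch below (arbitrary φ, truncation point, and seed) compares the recursive simulation to a truncated random-shock sum:

```python
import numpy as np

phi, n, J = 0.6, 200, 60
rng = np.random.default_rng(1)
a = rng.normal(size=n + J)

# Recursive form: Y_t = phi*Y_{t-1} + a_t
y_rec = np.zeros(n + J)
for t in range(1, n + J):
    y_rec[t] = phi * y_rec[t - 1] + a[t]

# Truncated random shock form: Y_t ~ sum_{j=0}^{J} phi**j * a_{t-j}
t = n + J - 1
y_rsf = sum(phi**j * a[t - j] for j in range(J + 1))

print(y_rec[t], y_rsf)   # nearly identical, since phi**J is negligible
```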

THE SECOND ORDER AUTOREGRESSIVE PROCESS AR(2) PROCESS: Consider the series satisfying Y_t = φ_1 Y_{t−1} + φ_2 Y_{t−2} + a_t, where a_t ~ WN(0, σ_a²).

AR(2) PROCESS Always invertible; it is already in the inverted form. To be stationary, the roots of φ(B) = 1 − φ_1 B − φ_2 B² = 0 must lie outside the unit circle, OR the roots of the characteristic equation m² − φ_1 m − φ_2 = 0 must lie inside the unit circle.

AR(2) PROCESS The roots of the characteristic equation are m_{1,2} = (φ_1 ± √(φ_1² + 4φ_2)) / 2; they are real when φ_1² + 4φ_2 ≥ 0 and complex otherwise.

AR(2) PROCESS Considering both real and complex roots, we have the following stationarity conditions for the AR(2) process (see page 84 for the proof): φ_1 + φ_2 < 1, φ_2 − φ_1 < 1, and |φ_2| < 1.
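These inequalities can be cross-checked against the root condition (a sketch; the three coefficient pairs are made-up examples):

```python
import numpy as np

def ar2_stationary_by_conditions(phi1, phi2):
    return (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

def ar2_stationary_by_roots(phi1, phi2):
    # Roots of 1 - phi1*B - phi2*B**2 = 0 must lie outside the unit circle.
    roots = np.roots([-phi2, -phi1, 1.0])
    return np.all(np.abs(roots) > 1)

for phi1, phi2 in [(0.5, 0.3), (1.2, -0.3), (0.5, 0.6)]:
    print(ar2_stationary_by_conditions(phi1, phi2),
          ar2_stationary_by_roots(phi1, phi2))   # the two checks always agree
```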

AR(2) PROCESS THE AUTOCOVARIANCE FUNCTION: Assuming stationarity and that a_t is independent of Y_{t−k}, multiplying the model by Y_{t−k} and taking expectations, we have γ_k = φ_1 γ_{k−1} + φ_2 γ_{k−2} for k ≥ 1.

AR(2) PROCESS For k = 0, the same argument gives γ_0 = φ_1 γ_1 + φ_2 γ_2 + σ_a².

AR(2) PROCESS Dividing by γ_0 gives ρ_1 = φ_1 + φ_2 ρ_1, so ρ_1 = φ_1 / (1 − φ_2).

AR(2) PROCESS Similarly, ρ_2 = φ_1 ρ_1 + φ_2 = φ_1² / (1 − φ_2) + φ_2.

AR(2) PROCESS Solving the system for the variance, γ_0 = (1 − φ_2) σ_a² / [(1 + φ_2)((1 − φ_2)² − φ_1²)].

AR(2) PROCESS ACF: ρ_k = φ_1 ρ_{k−1} + φ_2 ρ_{k−2}, k ≥ 1; these are known as the Yule-Walker equations. The ACF shows an exponential decay or sinusoidal behavior.
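The recursion translates directly into code once ρ_1 = φ_1 / (1 − φ_2) is used as the starting value (a sketch with illustrative coefficients):

```python
def ar2_theoretical_acf(phi1, phi2, nlags):
    """Theoretical ACF of a stationary AR(2) via the Yule-Walker recursion."""
    rho = [1.0, phi1 / (1.0 - phi2)]          # rho_0 and rho_1
    for k in range(2, nlags + 1):
        rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])
    return rho

print(ar2_theoretical_acf(1.2, -0.5, 10))   # damped oscillation (complex roots)
print(ar2_theoretical_acf(0.5, 0.3, 10))    # smooth exponential-type decay (real roots)
```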

AR(2) PROCESS PACF: the PACF cuts off after lag 2.

AR(2) PROCESS RANDOM SHOCK FORM: using the operator method, Y_t = (1 − φ_1 B − φ_2 B²)^{−1} a_t = ψ(B) a_t, where the ψ weights are found by equating coefficients of B in (1 − φ_1 B − φ_2 B²) ψ(B) = 1.

The p-th ORDER AUTOREGRESSIVE PROCESS: AR(p) PROCESS Consider the process satisfying Y_t = φ_1 Y_{t−1} + ⋯ + φ_p Y_{t−p} + a_t, where a_t ~ WN(0, σ_a²). It is stationary provided that the roots of φ_p(B) = 1 − φ_1 B − ⋯ − φ_p B^p = 0 all lie outside the unit circle.

AR(p) PROCESS ACF: ρ_k = φ_1 ρ_{k−1} + ⋯ + φ_p ρ_{k−p}, k ≥ 1 (the Yule-Walker equations). The ACF tails off as a mixture of exponential decays and/or damped sine waves (the latter if some roots are complex). PACF: cuts off after lag p.
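For a general AR(p), the theoretical ACF and PACF can be computed numerically; below is a sketch assuming statsmodels' ArmaProcess helper (the AR(3) coefficients are made-up illustrations):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# AR(3): Y_t = 0.4 Y_{t-1} - 0.2 Y_{t-2} + 0.1 Y_{t-3} + a_t
# ArmaProcess expects lag-polynomial coefficients, so the AR signs are flipped.
ar = np.array([1.0, -0.4, 0.2, -0.1])
ma = np.array([1.0])
proc = ArmaProcess(ar, ma)

print(proc.isstationary)    # True: all roots of the AR polynomial lie outside the unit circle
print(proc.acf(lags=10))    # tails off (mixture of decays / damped sine waves)
print(proc.pacf(lags=10))   # essentially zero beyond lag p = 3
```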

MOVING AVERAGE PROCESSES Suppose you win 1 TL if a fair coin shows a head and lose 1 TL if it shows a tail. Denote the outcome on toss t by a_t. The average winnings on the last 4 tosses, i.e., the average pay-off Y_t = (a_t + a_{t−1} + a_{t−2} + a_{t−3}) / 4, is a MOVING AVERAGE PROCESS.

MOVING AVERAGE PROCESS In an MA(1) process the series is a weighted average of this period's random error and last period's random error; there is no memory of past levels. The impact of a shock to the series takes exactly one period to vanish for an MA(1) process; in an MA(2) process the shock lasts two periods and then fades away. Hence, in an MA(1) process the correlation lasts only one period.

MOVING AVERAGE PROCESSES Consider the process satisfying Y_t = a_t − θ_1 a_{t−1} − ⋯ − θ_q a_{t−q} = θ_q(B) a_t, where θ_q(B) = 1 − θ_1 B − ⋯ − θ_q B^q and a_t ~ WN(0, σ_a²).

MOVING AVERAGE PROCESSES Because an MA process is a finite linear combination of white noise terms, MA processes are always stationary. They are invertible if the roots of θ_q(B) = 0 all lie outside the unit circle. The MA process is useful for describing events that produce an immediate effect lasting only a short period of time.

THE FIRST ORDER MOVING AVERAGE PROCESS, MA(1) PROCESS Consider the process satisfying Y_t = a_t − θ a_{t−1} = (1 − θB) a_t, where a_t ~ WN(0, σ_a²).

MA(1) PROCESS From the autocovariance generating function: γ_0 = (1 + θ²) σ_a², γ_1 = −θ σ_a², and γ_k = 0 for k > 1.

MA(1) PROCESS ACF: ρ_1 = −θ / (1 + θ²) and ρ_k = 0 for k > 1, so the ACF cuts off after lag 1. General property of MA(1) processes: 2|ρ_k| ≤ 1, i.e., |ρ_1| ≤ 1/2.
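A tiny numerical check of both the formula and the bound (a sketch; θ = 0.8 is arbitrary):

```python
import numpy as np

theta = 0.8
rho1 = -theta / (1.0 + theta**2)
print(rho1)                    # -0.4878..., within the bound |rho_1| <= 0.5

# The bound holds for every theta and is attained at theta = +/- 1:
thetas = np.linspace(-5, 5, 1001)
print(np.max(np.abs(-thetas / (1 + thetas**2))))   # 0.5
```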

MA(1) PROCESS PACF: φ_kk = −θ^k (1 − θ²) / (1 − θ^{2(k+1)}), k ≥ 1; the PACF tails off rather than cutting off.

MA(1) PROCESS Basic characteristics of the MA(1) process: the ACF cuts off after lag 1; the PACF tails off exponentially, with a pattern depending on the sign of θ. Always stationary. Invertible if the root of 1 − θB = 0 lies outside the unit circle, or equivalently if the root of the characteristic equation m − θ = 0 lies inside the unit circle. INVERTIBILITY CONDITION: |θ| < 1.

MA(1) PROCESS It is already in RSF. IF (Inverted Form): Y_t = −θ Y_{t−1} − θ² Y_{t−2} − ⋯ + a_t, i.e., π_1 = −θ, π_2 = −θ², and in general π_j = −θ^j.

MA(1) PROCESS IF by the operator method: a_t = (1 − θB)^{−1} Y_t = (1 + θB + θ²B² + ⋯) Y_t, which is valid when |θ| < 1.

THE SECOND ORDER MOVING AVERAGE PROCESS, MA(2) PROCESS Consider the moving average process of order 2: Y_t = a_t − θ_1 a_{t−1} − θ_2 a_{t−2} = (1 − θ_1 B − θ_2 B²) a_t, where a_t ~ WN(0, σ_a²).

MA(2) PROCESS From the autocovariance generating function: γ_0 = (1 + θ_1² + θ_2²) σ_a², γ_1 = (−θ_1 + θ_1 θ_2) σ_a², γ_2 = −θ_2 σ_a², and γ_k = 0 for k > 2.

MA(2) PROCESS ACF: ρ_1 = (−θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²), ρ_2 = −θ_2 / (1 + θ_1² + θ_2²), and ρ_k = 0 for k > 2, so the ACF cuts off after lag 2. The PACF tails off exponentially or as a damped sine wave, depending on the sign and magnitude of the parameters.

MA(2) PROCESS Always stationary. Invertible if the roots of 1 − θ_1 B − θ_2 B² = 0 all lie outside the unit circle, OR if the roots of the characteristic equation m² − θ_1 m − θ_2 = 0 all lie inside the unit circle.

MA(2) PROCESS Invertibility conditions for the MA(2) process: θ_1 + θ_2 < 1, θ_2 − θ_1 < 1, and |θ_2| < 1.
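These conditions mirror the AR(2) stationarity conditions with θ in place of φ, so the same kind of root check applies (a sketch with made-up values):

```python
import numpy as np

def ma2_invertible(theta1, theta2):
    # Roots of 1 - theta1*B - theta2*B**2 = 0 must lie outside the unit circle.
    return np.all(np.abs(np.roots([-theta2, -theta1, 1.0])) > 1)

print(ma2_invertible(0.4, 0.2))   # True:  theta1+theta2 < 1, theta2-theta1 < 1, |theta2| < 1
print(ma2_invertible(0.5, 0.7))   # False: theta1+theta2 = 1.2 > 1
```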

MA(2) PROCESS It is already in RSF. IF: using the operator method, a_t = (1 − θ_1 B − θ_2 B²)^{−1} Y_t; expanding the inverse operator gives the π weights of the inverted form Y_t = π_1 Y_{t−1} + π_2 Y_{t−2} + ⋯ + a_t.

The q-th ORDER MOVING AVERAGE PROCESS, MA(q) PROCESS Consider the MA(q) process: Y_t = a_t − θ_1 a_{t−1} − ⋯ − θ_q a_{t−q} = θ_q(B) a_t, where a_t ~ WN(0, σ_a²).

MA(q) PROCESS The autocovariance function: γ_0 = σ_a² (1 + θ_1² + ⋯ + θ_q²), γ_k = σ_a² (−θ_k + θ_1 θ_{k+1} + ⋯ + θ_{q−k} θ_q) for k = 1, …, q, and γ_k = 0 for k > q. ACF: ρ_k = γ_k / γ_0 cuts off after lag q.
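The formula maps directly onto a short function (a sketch; it uses the same sign convention Y_t = a_t − θ_1 a_{t−1} − ⋯ − θ_q a_{t−q} as above):

```python
import numpy as np

def ma_q_autocovariances(thetas, sigma2=1.0):
    """Autocovariances gamma_0, ..., gamma_q of the MA(q) process
    Y_t = a_t - theta_1 a_{t-1} - ... - theta_q a_{t-q}, Var(a_t) = sigma2."""
    psi = np.concatenate(([1.0], -np.asarray(thetas, dtype=float)))  # psi_0, ..., psi_q
    q = len(psi) - 1
    return [sigma2 * np.dot(psi[: q + 1 - k], psi[k:]) for k in range(q + 1)]

# MA(2) check: gamma_0 = 1 + th1^2 + th2^2, gamma_1 = -th1 + th1*th2, gamma_2 = -th2
print(ma_q_autocovariances([0.4, 0.2]))   # approximately [1.20, -0.32, -0.20]
```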

THE AUTOREGRESSIVE MOVING AVERAGE PROCESSES, ARMA(p, q) PROCESSES If we assume that the series is partly autoregressive and partly moving average, we obtain a mixed ARMA process: φ_p(B) Y_t = θ_q(B) a_t, i.e., Y_t = φ_1 Y_{t−1} + ⋯ + φ_p Y_{t−p} + a_t − θ_1 a_{t−1} − ⋯ − θ_q a_{t−q}.

ARMA(p, q) PROCESSES For the process to be invertible, the roots of θ_q(B) = 0 must lie outside the unit circle. For the process to be stationary, the roots of φ_p(B) = 0 must lie outside the unit circle. Assuming that φ_p(B) and θ_q(B) share no common roots: Pure AR representation: π(B) Y_t = a_t with π(B) = θ_q(B)^{−1} φ_p(B). Pure MA representation: Y_t = ψ(B) a_t with ψ(B) = φ_p(B)^{−1} θ_q(B).
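The ψ weights of the pure MA representation can be generated by a simple recursion obtained from φ_p(B) ψ(B) = θ_q(B) (a sketch; the ARMA(2,1) coefficients used in the check are illustrative):

```python
def psi_weights(phis, thetas, nweights):
    """psi weights of the pure MA form Y_t = a_t + psi_1 a_{t-1} + psi_2 a_{t-2} + ...
    for phi(B) Y_t = theta(B) a_t, with phi(B) = 1 - phi_1 B - ... - phi_p B^p
    and theta(B) = 1 - theta_1 B - ... - theta_q B^q."""
    psi = [1.0] + [0.0] * nweights
    for j in range(1, nweights + 1):
        ar_part = sum(phis[i - 1] * psi[j - i] for i in range(1, min(j, len(phis)) + 1))
        ma_part = -thetas[j - 1] if j <= len(thetas) else 0.0
        psi[j] = ar_part + ma_part
    return psi

# ARMA(2,1) with phi_1 = 0.5, phi_2 = 0.3, theta_1 = 0.2:
# psi_1 = phi_1 - theta_1 = 0.3, psi_2 = phi_1*psi_1 + phi_2 = 0.45, ...
print(psi_weights([0.5, 0.3], [0.2], 5))
```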

ARMA(p, q) PROCESSES Autocovariance function and ACF: like the AR(p) process, the ACF tails off after lag q. PACF: like the MA(q) process, the PACF tails off after lag p.

ARMA(1, 1) PROCESSES The ARMA(1, 1) process can be written as Y_t = φ Y_{t−1} + a_t − θ a_{t−1}, or (1 − φB) Y_t = (1 − θB) a_t. Stationary if |φ| < 1. Invertible if |θ| < 1.

ARMA(1, 1) PROCESSES Autocovariance function: multiplying the model by Y_{t−k} and taking expectations gives γ_k = φ γ_{k−1} for k ≥ 2, while γ_0 and γ_1 pick up extra terms from a_t and a_{t−1}; in particular γ_1 = φ γ_0 − θ σ_a².

ARMA(1,1) PROCESS The process variance: γ_0 = σ_a² (1 + θ² − 2φθ) / (1 − φ²).
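A quick Monte Carlo check of the variance formula (a sketch; φ, θ, σ², the sample size, and the burn-in length are arbitrary choices):

```python
import numpy as np

phi, theta, sigma2, n = 0.7, 0.4, 1.0, 200_000
rng = np.random.default_rng(42)
a = rng.normal(0.0, np.sqrt(sigma2), n)

# Simulate Y_t = phi*Y_{t-1} + a_t - theta*a_{t-1}
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + a[t] - theta * a[t - 1]

gamma0_theory = sigma2 * (1 + theta**2 - 2 * phi * theta) / (1 - phi**2)
print(gamma0_theory)      # 1.176...
print(y[1000:].var())     # close to the theoretical value (burn-in discarded)
```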

ARMA(1,1) PROCESS Similarly, γ_1 = σ_a² (1 − φθ)(φ − θ) / (1 − φ²), so ρ_1 = (1 − φθ)(φ − θ) / (1 + θ² − 2φθ) and ρ_k = φ ρ_{k−1} for k ≥ 2.

ARMA(1,1) PROCESS Both the ACF and the PACF tail off after lag 1.

ARMA(1,1) PROCESS IF: a_t = (1 − θB)^{−1} (1 − φB) Y_t, so the inverted form is Y_t = Σ_{j=1}^∞ (φ − θ) θ^{j−1} Y_{t−j} + a_t.

ARMA(1,1) PROCESS RSF: Y_t = (1 − φB)^{−1} (1 − θB) a_t = a_t + Σ_{j=1}^∞ (φ − θ) φ^{j−1} a_{t−j}.
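The closed-form weights ψ_j = (φ − θ) φ^{j−1} can be verified against the recursive expansion of (1 − φB)^{−1}(1 − θB) (a minimal sketch; φ and θ are illustrative):

```python
phi, theta = 0.7, 0.4   # illustrative values

# psi weights by the ARMA(1,1) recursion: psi_1 = phi - theta, psi_j = phi * psi_{j-1}
psi = [1.0, phi - theta]
for _ in range(3):
    psi.append(phi * psi[-1])

closed_form = [1.0] + [(phi - theta) * phi ** (j - 1) for j in range(1, 5)]
print(psi)          # [1.0, 0.3, 0.21, 0.147, 0.1029] up to floating-point rounding
print(closed_form)  # identical
```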

[Example figures: AR(1) process; AR(2) process; MA(1) process; MA(2) process; ARMA(1,1) process; ARMA(1,1) process (contd.)]