
Modern methods

The classical approach:

Method                 | Pros                                                                                                                                          | Cons
-----------------------|-----------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------
Time series regression | Easy to implement. Fairly easy to interpret. Covariates may be added (normalization). Inference is possible (though sometimes questionable).   | Static. Normal-based inference not generally reliable. Cyclic component hard to estimate.
Decomposition          | Easy to interpret. Possible to have dynamic seasonal effects. Cyclic components can be estimated.                                              | Descriptive (no inference per definition). Static in trend.

Explanation of the static behaviour: The classical approach assumes all components except the irregular ones (i.e. ε_t and IR_t) to be deterministic, i.e. fixed functions or constants.

To overcome this problem, all components should be allowed to be stochastic, i.e. to be random variables. A time series y_t should, from a statistical point of view, be treated as a stochastic process. We will use the terms time series and process interchangeably, depending on the situation.

Stationary and non-stationary time series

AR-models

Consider the model y_t = δ + φ·y_{t-1} + a_t, with {a_t} i.i.d. with zero mean and constant variance σ².

Set δ = 0 for the sake of simplicity ⇒ E(y_t) = 0.

Let R(k) = Cov(y_t, y_{t-k}) = Cov(y_t, y_{t+k}) = E(y_t·y_{t-k}) = E(y_t·y_{t+k}) ⇒ R(0) = Var(y_t), assumed to be constant.
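As a minimal sketch of the model just defined (parameter values here are illustrative choices, not from the slides), one can simulate an AR(1) series in Python:

```python
# Simulate y_t = delta + phi * y_{t-1} + a_t with i.i.d. N(0, sigma^2) noise.
import numpy as np

rng = np.random.default_rng(seed=1)

def simulate_ar1(n, phi, delta=0.0, sigma=1.0):
    """Generate n observations from an AR(1) process."""
    a = rng.normal(0.0, sigma, size=n)
    y = np.empty(n)
    y[0] = delta + a[0]                  # simple start-up; a burn-in period could refine this
    for t in range(1, n):
        y[t] = delta + phi * y[t - 1] + a[t]
    return y

y = simulate_ar1(n=500, phi=0.9)
print(y[:5])
```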

Now:

R(0) = E(y_t·y_t) = E(y_t·(φ·y_{t-1} + a_t)) = φ·E(y_t·y_{t-1}) + E(y_t·a_t)
     = φ·R(1) + E((φ·y_{t-1} + a_t)·a_t) = φ·R(1) + φ·E(y_{t-1}·a_t) + E(a_t·a_t)
     = φ·R(1) + 0 + σ²   (since a_t is independent of y_{t-1})

R(1) = E(y_t·y_{t+1}) = E(y_t·(φ·y_t + a_{t+1})) = φ·E(y_t·y_t) + E(y_t·a_{t+1})
     = φ·R(0) + 0   (since a_{t+1} is independent of y_t)

R(2) = E(y_t·y_{t+2}) = E(y_t·(φ·y_{t+1} + a_{t+2})) = φ·E(y_t·y_{t+1}) + E(y_t·a_{t+2})
     = φ·R(1) + 0   (since a_{t+2} is independent of y_t)

R(0) = φ·R(1) + σ²
R(1) = φ·R(0)        (the Yule-Walker equations)
R(2) = φ·R(1)
…

⇒ R(k) = φ·R(k-1) = … = φ^k·R(0)

and, substituting R(1) = φ·R(0) into the first equation,

R(0) = φ²·R(0) + σ²
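A hedged numeric check of the recursion R(k) = φ^k·R(0) on a simulated AR(1) series; acovf() below is a hand-rolled helper written for this sketch, not a library function:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
phi, n = 0.9, 5000                        # illustrative values, not from the slides
a = rng.normal(size=n)
y = np.empty(n); y[0] = a[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + a[t]

def acovf(y, k):
    """Sample autocovariance at lag k (divisor n, as is conventional)."""
    yc = y - y.mean()
    return np.sum(yc[k:] * yc[:len(y) - k]) / len(y)

R0 = acovf(y, 0)
for k in range(4):
    print(k, round(acovf(y, k), 3), round(phi**k * R0, 3))  # sample R(k) vs. phi^k * R(0)
```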

Note that for R(0) to be positive and finite (which we require of a variance) the following must hold:

R(0) = φ²·R(0) + σ²  ⇒  R(0) = σ²/(1 - φ²)  ⇒  |φ| < 1

This is in effect the condition for an AR(1)-process to be weakly stationary. Note now that

ρ_k = R(k)/R(0) = φ^k·R(0)/R(0) = φ^k

ρ_k is called the autocorrelation function (ACF) of y_t. "Auto" because it gives correlations within the same time series; for pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series.

By studying the ACF it might be possible to identify the approximate magnitude of φ.
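Since ρ_k = φ^k, the theoretical ACF is trivial to tabulate; this sketch shows the geometric decay (alternating in sign when φ < 0) for two illustrative values of φ:

```python
# Theoretical AR(1) ACF values rho_k = phi^k for two illustrative phi values.
for phi in (0.9, -0.9):
    print(phi, [round(phi**k, 3) for k in range(6)])
```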

Examples: [ACF plots for AR(1) series with different values of φ]

The look of an ACF can be similar for different kinds of time series; e.g. the ACF for an AR(1) with φ = 0.3 could be approximately the same as the ACF for an autoregressive time series of order higher than 1 (we will discuss higher-order AR-models later).

For a less ambiguous identification we need another statistic: the partial autocorrelation function (PACF):

υ_k = Corr(y_t, y_{t-k} | y_{t-k+1}, y_{t-k+2}, …, y_{t-1})

i.e. the conditional correlation between y_t and y_{t-k} given all observations in between. Note that -1 ≤ υ_k ≤ 1.

A concept hard to interpret, but it can be shown that for an AR(1)-model υ_1 = φ and υ_k = 0 for k > 1. Hence, for AR(1)-models with φ positive the PACF shows a single positive spike at lag 1, and for AR(1)-models with φ negative it shows a single negative spike at lag 1; beyond lag 1 the PACF is zero.
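A sketch of this cut-off behaviour, assuming the statsmodels package is available (its pacf routine estimates υ_k from data); for an AR(1) series the estimate should show one spike at lag 1, near φ, and values close to zero beyond it:

```python
import numpy as np
from statsmodels.tsa.stattools import pacf  # assumes statsmodels is installed

rng = np.random.default_rng(seed=1)
phi, n = 0.9, 2000                          # illustrative values
a = rng.normal(size=n)
y = np.empty(n); y[0] = a[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + a[t]

print(np.round(pacf(y, nlags=5), 2))        # expect roughly [1.0, 0.9, ~0, ~0, ~0, ~0]
```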

Assume now that we have a sample y_1, y_2, …, y_n from a time series assumed to follow an AR(1)-model.

Example: [time series plot of the DKK/USD exchange rate]

The ACF and the PACF can be estimated from data by their sample counterparts:

Sample autocorrelation function (SAC):

r_k = Σ_{t=k+1}^{n} (y_t − ȳ)(y_{t-k} − ȳ) / Σ_{t=1}^{n} (y_t − ȳ)²

if n is large; otherwise a scaling might be needed.

Sample partial autocorrelation function (SPAC): complicated structure, so not shown here.
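The SAC estimator is easy to transcribe directly; a minimal hand-rolled sketch (which matches standard library ACF routines for large n):

```python
import numpy as np

def sac(y, max_lag):
    """Sample autocorrelation r_k for k = 0, ..., max_lag."""
    y = np.asarray(y, dtype=float)
    yc = y - y.mean()
    denom = np.sum(yc**2)
    return np.array([np.sum(yc[k:] * yc[:len(y) - k]) / denom
                     for k in range(max_lag + 1)])
```

For an AR(1) series with φ = 0.9 the first few values should decay roughly like 1, 0.9, 0.81, …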

The variance functions of these two estimators can also be estimated ⇒ an opportunity to test the null hypothesis of zero autocorrelation or partial autocorrelation at a single lag k. Estimated sample functions are usually plotted together with critical limits based on the estimated variances.
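A sketch of how such limits are typically computed: under the null of no autocorrelation, r_k is approximately N(0, 1/n), so bars outside ±2/√n are flagged (software packages use slightly more refined variance estimates):

```python
import numpy as np

def sac_with_limits(y, max_lag):
    """Print SAC bars compared against the approximate +/- 2/sqrt(n) limits."""
    y = np.asarray(y, dtype=float)
    yc = y - y.mean()
    denom = np.sum(yc**2)
    limit = 2 / np.sqrt(len(y))
    for k in range(1, max_lag + 1):
        r_k = np.sum(yc[k:] * yc[:len(y) - k]) / denom
        mark = "  <-- outside limits" if abs(r_k) > limit else ""
        print(f"lag {k:2d}: r = {r_k:+.3f}{mark}")
```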

Example (cont.): DKK/USD exchange rate. [SAC and SPAC plots, with critical limits marked]

Ignoring all bars within the red limits, we would identify the series as being an AR(1) with positive φ. The value of φ is approximately 0.9 (the ordinate of the first bar in both the SAC plot and the SPAC plot).
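A sketch of the same identification step in code (illustrative, not the slides' software): read φ̂ off the first SAC bar, or equivalently regress y_t on y_{t-1}:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
phi_true, n = 0.9, 2000                     # illustrative values
a = rng.normal(size=n)
y = np.empty(n); y[0] = a[0]
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + a[t]

yc = y - y.mean()
phi_hat_acf = np.sum(yc[1:] * yc[:-1]) / np.sum(yc**2)   # first SAC bar
phi_hat_ols = np.polyfit(y[:-1], y[1:], 1)[0]            # slope of y_t on y_{t-1}
print(round(phi_hat_acf, 3), round(phi_hat_ols, 3))      # both should be near 0.9
```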

Higher-order AR-models

AR(2): y_t = δ + φ_1·y_{t-1} + φ_2·y_{t-2} + a_t   (the term φ_2·y_{t-2} must be present; φ_1 may be zero)
AR(3): y_t = δ + φ_1·y_{t-1} + φ_2·y_{t-2} + φ_3·y_{t-3} + a_t, or other combinations including φ_3·y_{t-3}
AR(p): y_t = δ + φ_1·y_{t-1} + … + φ_p·y_{t-p} + a_t, i.e. different combinations including φ_p·y_{t-p}
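A minimal sketch of simulating a higher-order model; the coefficients below are illustrative and chosen inside the AR(2) stationarity region (φ_2 + φ_1 < 1, φ_2 − φ_1 < 1, |φ_2| < 1):

```python
import numpy as np

rng = np.random.default_rng(seed=4)

def simulate_ar(n, phis, sigma=1.0):
    """Simulate an AR(p) process with coefficients phis = [phi_1, ..., phi_p]."""
    p = len(phis)
    a = rng.normal(0.0, sigma, size=n)
    y = np.zeros(n)
    for t in range(p, n):
        y[t] = sum(phis[j] * y[t - 1 - j] for j in range(p)) + a[t]
    return y

y2 = simulate_ar(n=2000, phis=[0.5, 0.3])   # an AR(2); the term phi_2 * y_{t-2} is present
```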

Typical patterns of the ACF and PACF for higher-order stationary AR-models (AR(p)):

ACF: Similar pattern as for AR(1), i.e. (exponentially) decreasing bars, (most often) positive when φ_1 is positive and alternating when φ_1 is negative.

PACF: The first p values υ_1, …, υ_p are non-zero, with decreasing magnitude; the rest are all zero (cut-off point at p). (Most often) all positive if φ_1 is positive and alternating if φ_1 is negative.

The requirements for stationarity are usually complex to describe for p > 2. A small demonstration of the PACF cut-off follows below.
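A sketch of the cut-off property, again assuming statsmodels is available: for a simulated AR(2) series, the SPAC should be clearly non-zero at lags 1-2 and near zero afterwards:

```python
import numpy as np
from statsmodels.tsa.stattools import pacf  # assumes statsmodels is installed

rng = np.random.default_rng(seed=4)
n, phi1, phi2 = 2000, 0.5, 0.3              # illustrative AR(2) coefficients
a = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + a[t]

print(np.round(pacf(y, nlags=6), 2))        # expect spikes at lags 1 and 2, then ~0
```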

Examples: [ACF and PACF plots for an AR(2) with φ_1 positive, and an AR(5) with φ_1 negative]