Forecasting
The purpose is to forecast, not to explain the historical pattern. Models for forecasting may not make sense as a description of the "physical" behaviour of the time series. Common sense and mathematics in a good combination produce "optimal" forecasts. With time series regression models, forecasting (prediction) is a natural next step, and forecast limits (intervals) can be constructed. With classical decomposition, forecasting may be done, but there is no assessment of accuracy and no forecast limits are produced. Classical decomposition is therefore usually combined with exponential smoothing methods.

Exponential smoothing
Use the historical data to forecast the future. Let different parts of the history have different impact on the forecasts. The forecast model is not derived from any statistical theory.

Single exponential smoothing
Given are historical values y1, y2, …, yT. Assume the data contain no trend.

Algorithm for forecasting:

ŷt+1 = α·yt + (1 – α)·ŷt

where α is a smoothing parameter with value between 0 and 1. The forecast procedure is a recursion formula. How shall we choose α? Where should we start, i.e. what initial value should be used?

For long time series: Use a part (usually the first half) of the historical data and calculate its average ȳ. Set the initial smoothed value equal to this average. Update with the rest of the historical data using the recursion formula.

Example: Sales of everyday commodities (yearly data)

Year  Sales value
1985  151
1986  151
1987  147
1988  149
1989  146
1990  142
1991  143
1992  145
1993  141
1994  143
1995  145
1996  138
1997  147
1998  151
1999  148
2000  148

Note! This time series is short, but we use it for illustration purposes!

Calculate the average of the first 8 observations of the series: ȳ = (151 + 151 + … + 145)/8 = 146.75, and set this as the initial value. Assume first that the sales are very stable, i.e. during the period the background mean value is assumed not to change. Then set α to be relatively small, which means that the latest observation plays a smaller role than the history in the forecasts. Rule of thumb: 0.05 < α < 0.3. E.g. set α = 0.1. Update using the next 8 values of the historical data.
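A minimal Python sketch of this procedure (data and α are from the slides; variable names are my own). It also shows the variant where the recursion is run over all observations, which is what the MINITAB output further below corresponds to:

```python
# Single exponential smoothing of the yearly sales series.
sales = [151, 151, 147, 149, 146, 142, 143, 145,
         141, 143, 145, 138, 147, 151, 148, 148]
alpha = 0.1

# Slide procedure: initialize with the mean of the first half,
# then update only with the second half.
level = sum(sales[:8]) / 8            # 146.75
for y in sales[8:]:
    level = alpha * y + (1 - alpha) * level
print(level)                          # approx. 146.20

# MINITAB variant: same initial level, but the recursion is run
# over ALL observations (see the note a few slides below).
level = sum(sales[:8]) / 8
for y in sales:
    level = alpha * y + (1 - alpha) * level
print(level)                          # approx. 146.043, matching MINITAB
```

Re-running the second variant with α = 0.5 reproduces the forecast 147.873 shown in the later output.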

Forecasts:

For short time series: Calculate the average of all historical data, i.e. ȳ = (y1 + y2 + … + yT)/T, and update from the beginning of the time series. There are several alternatives: average of all data and update from the middle of the series; average of the first half and update from the beginning, etc.

Analysis of example data with MINITAB 

MTB > Name c3 "FORE1" c4 "UPPE1" c5 "LOWE1"
MTB > SES 'Sales values';
SUBC> Weight 0.1;
SUBC> Initial 8;
SUBC> Forecasts 3;
SUBC> Fstore 'FORE1';
SUBC> Upper 'UPPE1';
SUBC> Lower 'LOWE1';
SUBC> Title "SES alpha=0.1".

Single Exponential Smoothing for Sales values

Data    Sales values
Length  16

Smoothing Constant
Alpha  0.1

Accuracy Measures
MAPE   2.2378
MAD    3.2447
MSD   14.4781

Forecasts
Period  Forecast  Lower    Upper
17      146.043   138.094  153.992
18      146.043   138.094  153.992
19      146.043   138.094  153.992

Note! MINITAB runs the smoothing recursion from the first value (the Initial subcommand only sets the starting level), which is why its forecast 146.043 differs slightly from the hand calculation above.

Assume now that the sales are less stable, i.e. during the period the background mean value is possibly changing. (Note that a change means an occasional "level shift", not a systematic trend.) Set α to be relatively large, which means that the latest observation becomes more important in the forecasts. E.g. set α = 0.5 (a bit exaggerated).

Single Exponential Smoothing for Sales values

Data    Sales values
Length  16

Smoothing Constant
Alpha  0.5

Accuracy Measures
MAPE   1.9924
MAD    2.8992
MSD   13.0928

Forecasts
Period  Forecast  Lower    Upper
17      147.873   140.770  154.976
18      147.873   140.770  154.976
19      147.873   140.770  154.976

Slightly narrower prediction intervals

We can also use some adaptive procedure to continuously evaluate the forecast ability and possibly change the smoothing parameter over time. Alternatively, we can run the process with different values of α and choose the one that performs best; MINITAB can do this automatically (optimal α). A simple grid search is sketched below.
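A minimal sketch of such a grid search, choosing α by minimizing the sum of squared one-step-ahead forecast errors over the historical data. The criterion and initialization are one possible choice, so exact agreement with MINITAB's optimizer is not guaranteed:

```python
# Grid search for the smoothing parameter alpha minimizing the
# sum of squared one-step-ahead forecast errors (SSE).
sales = [151, 151, 147, 149, 146, 142, 143, 145,
         141, 143, 145, 138, 147, 151, 148, 148]

def sse(alpha, data):
    level = sum(data[:8]) / 8          # initial level: mean of first half
    total = 0.0
    for y in data:
        total += (y - level) ** 2      # one-step-ahead forecast error
        level = alpha * y + (1 - alpha) * level
    return total

best_alpha = min((a / 100 for a in range(1, 100)), key=lambda a: sse(a, sales))
print(best_alpha)   # MINITAB reported about 0.567 for this series;
                    # the exact value depends on initialization details
```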

Yet narrower prediction intervals:

Single Exponential Smoothing for Sales values
---
Smoothing Constant
Alpha  0.567101

Accuracy Measures
MAPE   1.7914
MAD    2.5940
MSD   12.1632

Forecasts
Period  Forecast  Lower    Upper
17      148.013   141.658  154.369
18      148.013   141.658  154.369
19      148.013   141.658  154.369

Exponential smoothing for time series with trend and/or seasonal variation:
Double exponential smoothing (one smoothing parameter), for trend
Holt's method (two smoothing parameters), for trend
Multiplicative Winters' method (three smoothing parameters), for seasonal variation (and trend)
Additive Winters' method (three smoothing parameters), for seasonal variation (and trend)
A sketch of Holt's method follows below.
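A minimal sketch of Holt's method under standard assumptions: a level recursion and a trend recursion with smoothing parameters α and β. The start-up values, parameter values and example data are illustrative, not from the slides:

```python
# Holt's linear (two-parameter) exponential smoothing.
def holt(data, alpha, beta, horizon):
    level, trend = data[0], data[1] - data[0]   # simple start-up values
    for y in data[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # h-step-ahead forecast: last level plus h times the trend
    return [level + h * trend for h in range(1, horizon + 1)]

print(holt([112, 118, 132, 129, 121, 135, 148, 148, 136, 119], 0.3, 0.1, 3))
```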

Modern methods
The classical approach:

Method: Time series regression
  Pros: Easy to implement. Fairly easy to interpret. Covariates may be added (normalization). Inference is possible (though sometimes questionable).
  Cons: Static. Normal-based inference not generally reliable. Cyclic component hard to estimate.

Method: Decomposition
  Pros: Easy to interpret. Possible to have dynamic seasonal effects. Cyclic components can be estimated.
  Cons: Descriptive (no inference per definition). Static in trend.

Explanation of the static behaviour: The classical approach assumes all components except the irregular ones (the error term and IRt) to be deterministic, i.e. fixed functions or constants. To overcome this problem, all components should be allowed to be stochastic, i.e. be random variates. A time series yt should, from a statistical point of view, be treated as a stochastic process. We will use the terms time series and process interchangeably, depending on the situation.

Stationary and non-stationary time series
Characteristics of a stationary time series: constant mean, constant variance. A time series with trend is non-stationary!

ARIMA models (Box-Jenkins models)
A stationary time series can be modelled on the basis of the serial correlations in it. A non-stationary time series can be transformed into a stationary time series, modelled, and back-transformed to the original scale (e.g. for purposes of forecasting). ARIMA stands for Auto Regressive, Integrated, Moving Average: the "Integrated" part has to do with the transformation, while the AR and MA parts can be modelled on a stationary series.

Different types of transformation
1. From a series with linear trend to a series with no trend: first-order differences zt = yt – yt–1

MTB > diff c1 c2

Note that the differenced series varies around zero.

2. From a series with quadratic trend to a series with no trend: second-order differences
wt = zt – zt–1 = (yt – yt–1) – (yt–1 – yt–2) = yt – 2yt–1 + yt–2

MTB > diff 2 c3 c4
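The same differences in a short Python sketch, mirroring the two MTB diff commands above (the example series is made up for illustration):

```python
# First- and second-order differencing of a series.
y = [3, 5, 8, 12, 17, 23, 30]        # made-up series with a quadratic trend

z = [y[t] - y[t - 1] for t in range(1, len(y))]    # first differences
w = [z[t] - z[t - 1] for t in range(1, len(z))]    # second differences

print(z)   # [2, 3, 4, 5, 6, 7] -- a linear trend remains
print(w)   # [1, 1, 1, 1, 1]    -- trend removed
```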

3. From a series with non-constant variance (heteroscedastic) to a series with constant variance (homoscedastic): Box-Cox transformations (Box & Cox, 1964):

g(yt) = ((yt + λ2)^λ1 – 1)/λ1   if λ1 ≠ 0
g(yt) = ln(yt + λ2)             if λ1 = 0

Practically, λ2 is chosen so that yt + λ2 is always > 0. Simpler form, if we know that yt is always > 0 (as is the usual case for measurements):

g(yt) = yt^λ    if λ ≠ 0
g(yt) = ln yt   if λ = 0

The log transform (ln yt) usually also makes the data "more" normally distributed. Example: application of the square-root (√yt) and log (ln yt) transforms.
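A small sketch applying these two transforms (the series is made up; for the general Box-Cox transform with estimated λ, scipy.stats.boxcox could be used instead):

```python
# Variance-stabilizing transforms of a positive-valued series.
import math

y = [2.1, 4.5, 9.8, 20.3, 44.7, 92.2]   # made-up series, variance grows with level

root = [math.sqrt(v) for v in y]        # square-root transform (lambda = 1/2)
log = [math.log(v) for v in y]          # log transform (lambda = 0)

print([round(v, 2) for v in root])
print([round(v, 2) for v in log])
```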

AR-models (for stationary time series)
Consider the model yt = δ + φ·yt–1 + at, with {at} i.i.d. with zero mean and constant variance σ², and where δ (delta) and φ (phi) are (unknown) parameters. Set δ = 0 for the sake of simplicity ⇒ E(yt) = 0. Let R(k) = Cov(yt, yt–k) = Cov(yt, yt+k) = E(yt·yt–k) = E(yt·yt+k) ⇒ R(0) = Var(yt), assumed to be constant.

Now:
R(0) = E(yt·yt) = E(yt·(φ·yt–1 + at)) = φ·E(yt·yt–1) + E(yt·at)
     = φ·R(1) + E((φ·yt–1 + at)·at) = φ·R(1) + φ·E(yt–1·at) + E(at·at)
     = φ·R(1) + 0 + σ²   (since at is independent of yt–1)
R(1) = E(yt·yt+1) = E(yt·(φ·yt + at+1)) = φ·E(yt·yt) + E(yt·at+1)
     = φ·R(0) + 0   (since at+1 is independent of yt)
R(2) = E(yt·yt+2) = E(yt·(φ·yt+1 + at+2)) = φ·E(yt·yt+1) + E(yt·at+2)
     = φ·R(1) + 0   (since at+2 is independent of yt)

R(0) = φ·R(1) + σ²
R(1) = φ·R(0)        (Yule-Walker equations)
R(2) = φ·R(1)
…
⇒ R(k) = φ·R(k – 1) = … = φ^k·R(0), and R(0) = φ²·R(0) + σ²

Note that for R(0) to be positive and finite (which we require of a variance), we get R(0) = σ²/(1 – φ²), so the following must hold: |φ| < 1. This is in effect the condition for an AR(1) process to be weakly stationary. Now, note that ρk = Corr(yt, yt–k) = R(k)/R(0) = φ^k.

ρk is called the autocorrelation function (ACF) of yt: "auto" because it gives correlations within the same time series. For pairs of different time series one can define the cross-correlation function, which gives correlations at different lags between the series. By studying the ACF it might be possible to identify the approximate magnitude of φ.
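A short simulation sketch illustrating this: generate an AR(1) series and compare the lag-1 sample autocorrelation with the true φ (the seed and parameter values are arbitrary):

```python
# Simulate an AR(1) process and estimate phi from the lag-1 autocorrelation.
import random

random.seed(1)
phi, n = 0.7, 2000
y = [0.0]
for _ in range(n - 1):
    y.append(phi * y[-1] + random.gauss(0, 1))   # y_t = phi*y_{t-1} + a_t

mean = sum(y) / n
num = sum((y[t] - mean) * (y[t - 1] - mean) for t in range(1, n))
den = sum((v - mean) ** 2 for v in y)
print(num / den)   # lag-1 sample autocorrelation, close to phi = 0.7
```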

Examples:

The look of an ACF can be similar for different kinds of time series; e.g. the ACF for an AR(1) with φ = 0.3 could be approximately the same as the ACF for an autoregressive time series of higher order than 1 (we will discuss higher-order AR models later). To make a less ambiguous identification we need another statistic: the partial autocorrelation function (PACF): υk = Corr(yt, yt–k | yt–k+1, yt–k+2, …, yt–1), i.e. the conditional correlation between yt and yt–k given all observations in between. Note that –1 ≤ υk ≤ 1.

A concept sometimes hard to interpret, but it can be shown that for AR(1) models the PACF has a single non-zero spike at lag 1: positive when φ is positive, and negative when φ is negative.

Assume now that we have a sample y1, y2,…, yn from a time series assumed to follow an AR(1)-model. Example:

The ACF and the PACF can be estimated from data by their sample counterparts. Sample autocorrelation function (SAC):

rk = Σ(t = k+1 to n) (yt – ȳ)(yt–k – ȳ) / Σ(t = 1 to n) (yt – ȳ)²

(valid if n is large; otherwise a scaling might be needed). Sample partial autocorrelation function (SPAC): complicated structure, so not shown here.
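A sketch of the SAC formula above in Python (statsmodels' acf and pacf functions could be used instead; the ±2/√n limits anticipate the critical limits discussed next):

```python
# Sample autocorrelation function (SAC) up to a given lag,
# with approximate critical limits +/- 2/sqrt(n).
def sac(y, max_lag):
    n = len(y)
    mean = sum(y) / n
    den = sum((v - mean) ** 2 for v in y)
    return [sum((y[t] - mean) * (y[t - k] - mean) for t in range(k, n)) / den
            for k in range(1, max_lag + 1)]

y = [151, 151, 147, 149, 146, 142, 143, 145,
     141, 143, 145, 138, 147, 151, 148, 148]
limit = 2 / len(y) ** 0.5
for k, r in enumerate(sac(y, 5), start=1):
    flag = "*" if abs(r) > limit else " "    # flag bars outside the limits
    print(f"lag {k}: {r:+.3f} {flag}")
```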

The variance functions of these two estimators can also be estimated ⇒ opportunity to test H0: ρk = 0 vs. Ha: ρk ≠ 0, or H0: υk = 0 vs. Ha: υk ≠ 0, for a particular value of k. The estimated sample functions are usually plotted together with critical limits based on the estimated variances.

Example (cont.): DKK/USD exchange rate. SAC and SPAC plots, with critical limits marked.

Ignoring all bars within the red limits, we would identify the series as an AR(1) with positive φ. The value of φ is approximately 0.9 (the ordinate of the first bar in the SAC plot and in the SPAC plot).

Higher-order AR-models
AR(2): yt = δ + φ1·yt–1 + φ2·yt–2 + at, where φ2 ≠ 0, i.e. yt–2 must be present
AR(3): yt = δ + φ1·yt–1 + φ2·yt–2 + φ3·yt–3 + at, or other combinations with φ3·yt–3
AR(p): yt = δ + φ1·yt–1 + … + φp·yt–p + at, i.e. different combinations with φp·yt–p

Stationarity conditions: For p > 2, difficult to express in closed form. For p = 2, the values of φ1 and φ2 must satisfy φ1 + φ2 < 1, φ2 – φ1 < 1 and –1 < φ2 < 1, i.e. they must lie within the (blue) triangle in the figure below.
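A tiny helper expressing these three conditions (the function name is my own):

```python
# Check whether an AR(2) process with coefficients phi1, phi2 is stationary:
# the point (phi1, phi2) must lie inside the stationarity triangle.
def ar2_is_stationary(phi1, phi2):
    return phi1 + phi2 < 1 and phi2 - phi1 < 1 and abs(phi2) < 1

print(ar2_is_stationary(0.5, 0.3))   # True: inside the triangle
print(ar2_is_stationary(0.9, 0.3))   # False: phi1 + phi2 >= 1
```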

Typical patterns of the ACF and PACF for higher-order stationary AR-models (AR(p)):
ACF: similar pattern as for AR(1), i.e. (exponentially) decreasing bars, (most often) positive for φ1 positive and alternating for φ1 negative.
PACF: the first p values of υk are non-zero with decreasing magnitude; the rest are all zero (cut-off point at p). (Most often) all positive if φ1 is positive and alternating if φ1 is negative.

Examples: AR(2) with φ1 positive; AR(5) with φ1 negative.