Chapter 6: Forecasting/Prediction


First Note That: Forecasting is Extrapolating
NOTE: Some slides have blank sections. They are based on a teaching style in which the corresponding blanks (derivations, theorem proofs, examples, …) are worked out in class on the board or overhead projector.

STAT 1301 Example: Predict the price of a car that weighs 3500 lbs. Extrapolation would say it's about $16,000.

STAT 1301 Example (cont.): Predict the price of a car that weighs 3500 lbs. Extrapolation would say it's about $16,000. Oops!!!

Note: Forecasting is Extrapolating
Lesson from regression analysis and the 1301 example? Don't do it.
However, it's not very useful to
- "predict" the sunspot number for 2012
- "predict" sales for the previous quarter
- …
That is, forecasting the future forces us to extrapolate.

Deterministic Signal-plus-Noise Models: X_t = s_t + Z_t
Example signal: s_t = C, C constant

Recall: sometimes it's not easy to tell whether a deterministic signal is present in the data. Is there a deterministic signal?

Realizations: is there a deterministic signal? (Global Temperature Data)

How would you predict/forecast into the future? It depends on the model. (Global Temperature Data)

Suppose I told you this is a realization from the model (other realizations shown). Now which forecast do you prefer?

Forecasting Setting in this Chapter
Forecast future behavior of a time series given a finite realization of its past. The use of ARMA models for this purpose has become popular in recent years.
ARMA Forecasting vs. Curve Fitting
- Curve fitting (regression): underlying assumption that future behavior follows some deterministic path with only random fluctuations.
- ARMA forecasting: underlying assumption that the future is guided only by its correlation to the past.

Problem: Notation: “Best Predictor”: Mean Square Sense

Prediction Background
Theorem 6.1: E[(Y - E[Y|X])^2] <= E[(Y - h(X))^2] for any random variable h(X) such that E[h(X)^2] < infinity, i.e., E[Y|X] minimizes the mean square error between Y and square-integrable random variables that are functions of X. (Proved in probability theory.)
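The inequality can be checked numerically. Below is a minimal sketch (not from the text) using a small hypothetical discrete joint distribution for (X, Y): the conditional mean attains a smaller mean square error than any rival function of X.

```python
# Toy check of Theorem 6.1 on a hypothetical discrete joint distribution:
# E[Y|X] attains the smallest mean square error among functions of X.
points = [((0.0, 0.0), 0.25), ((0.0, 2.0), 0.25), ((1.0, 1.0), 0.50)]

def mse(h):
    """Mean square error E[(Y - h(X))^2] under the toy distribution."""
    return sum(p * (y - h(x)) ** 2 for (x, y), p in points)

cond_exp = lambda x: 1.0          # here E[Y|X=0] = E[Y|X=1] = 1
rival    = lambda x: 2.0 * x      # any other function of X

print(mse(cond_exp), mse(rival))  # 0.5 vs 1.5: the conditional mean wins
```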

Notes: We restrict our attention to "linear predictors", i.e., we are looking for a predictor that is a linear combination of the observed values X_1, …, X_t0.

Theorem 6.2: (Projection Theorem) If X_t is a stationary time series, then a linear predictor of X_{t0+l} is the best linear predictor (has minimum mean square error among linear predictors) if and only if its forecast error is uncorrelated with each of the observed values X_1, …, X_t0.
Proof:

Notes:
3. We let Xhat_t0(l) denote the best linear predictor of X_{t0+l}.
4. Denote the coefficients of Xhat_t0(l) by a_1, …, a_t0, i.e., Xhat_t0(l) = sum_{k=1}^{t0} a_k X_{t0+1-k}.

Theorem 6.3: (proof – HW) If X_t is a stationary process with autocovariance function gamma_k, then the coefficients a_1, …, a_t0 of the best linear predictor are the solutions to the system Gamma a = gamma(l), where Gamma is the t0 x t0 matrix [gamma_|i-j|] and gamma(l) = (gamma_l, gamma_{l+1}, …, gamma_{l+t0-1})'.

Note 5: The typical application of these results is the case t0 = n = realization length. This results in a “large” system of equations to solve.
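To make Note 5 concrete, here is a sketch (Python/NumPy rather than tswge; the AR(1) parameters are hypothetical) that builds and solves the Theorem 6.3 system for an AR(1). Since gamma_k is proportional to phi^|k| for an AR(1), the solution puts all weight on the most recent observation, recovering the familiar one-step predictor phi * X_t0.

```python
import numpy as np

# Sketch (not tswge): solve the Theorem 6.3 system Gamma a = gamma(l)
# for an AR(1) with phi = 0.8, predicting one step ahead from t0 = 10
# observations, ordered most recent first.
phi, t0, ell = 0.8, 10, 1
gamma = phi ** np.arange(t0 + ell)                    # autocovariances (up to scale)
idx = np.abs(np.arange(t0)[:, None] - np.arange(t0))  # |i - j|
Gamma = gamma[idx]                                    # Toeplitz matrix [gamma_|i-j|]
rhs = gamma[ell:ell + t0]                             # (gamma_l, ..., gamma_{l+t0-1})
a = np.linalg.solve(Gamma, rhs)
# a is approximately (0.8, 0, ..., 0): only the latest observation matters
```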

Box-Jenkins Approach
* Assumes: X_t is a stationary/invertible ARMA process.
* In this setting:
- the best predictor is easily obtained
- it results in "natural approximations" in practice
- for "moderately" large t0, results are very similar to the projection-theorem predictor
- this is the approach we will use in tswge

Result : (Section 6.2)

Review:
- the best predictor may be difficult to obtain
- Theorem 6.2 (Projection Theorem)

(Stationary and invertible ARMA, mu = 0)
Note: the best forecast is LINEAR, but it is not in a form suitable for calculation.
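For a concrete case, the AR(1) psi-weights are psi_k = phi^k, and the general-linear-process form of the forecast collapses to something computable. A sketch (zero mean, hypothetical innovations) showing the two forms agree:

```python
# Sketch: for a zero-mean AR(1) built from a finite innovation history,
# the general-linear-process forecast  sum_j psi_{l+j} a_{t0-j}
# (psi_k = phi^k) equals the computable form  phi^l * X_{t0}.
phi, ell = 0.8, 3
a = [0.5, -1.2, 0.7, 0.3, -0.4]          # hypothetical innovations a_1..a_5
x = 0.0
for shock in a:                           # X_t = phi*X_{t-1} + a_t, X_0 = 0
    x = phi * x + shock                   # x ends as X_{t0}, t0 = 5

glp = sum(phi ** (ell + j) * a[-1 - j] for j in range(len(a)))
simple = phi ** ell * x
print(glp, simple)                        # the two forecast forms agree
```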

Note: using the l-step-ahead forecast error. Nonzero mean case.

Properties

Forecasting Based on the Difference Equation
Suppose X_t is ARMA(p,q), i.e., phi(B)X_t = theta(B)a_t

Recall:

Non-Zero Mean Form

Note: each term on the right-hand side is replaced by (1) its known value as of time t0 or (2) its forecast from t0.

Forecasts using the difference equation are more easily calculated than those from the GLP. However, there are still problems in implementation.
Practical Situation:
(a) only X_1, …, X_t0 have been observed
(b) the model has been estimated
(c) mu is unknown
(d) the a_k's are unknown

In Practice: we use the difference-equation form with the following modifications:
(i) the estimated model is assumed to be the true model
(ii) mu is estimated by the sample mean
(iii) the a_k's are estimated from the data

Basic Formula for Calculating Forecasts from Difference Equation

Example 6.1: AR(1)
(1 - .8B)(X_t - 25) = a_t
X_1, …, X_80 observed
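For this model the difference-equation forecasts satisfy the recursion Xhat_80(l) = 25 + .8(Xhat_80(l-1) - 25), a damped exponential decaying toward the mean. A sketch in Python (the last observed value X_80 = 28.0 is hypothetical, since the realization is not reproduced here):

```python
# Forecast recursion for Example 6.1's AR(1), (1 - .8B)(X_t - 25) = a_t:
# Xhat(l) = 25 + 0.8 * (Xhat(l-1) - 25), started from the last observation.
# X_80 = 28.0 is a hypothetical value for illustration.
phi, mu, x_last = 0.8, 25.0, 28.0

forecasts, prev = [], x_last
for _ in range(10):
    prev = mu + phi * (prev - mu)   # damped-exponential decay toward mu
    forecasts.append(prev)

print(forecasts[0], forecasts[-1])  # 27.4, then approaching 25
```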

Example 6.2: (1 - 1.2B + .6B^2)(X_t - mu) = (1 - .5B)a_t
Calculate Forecasts:

Example 6.2: X_t, t = 1, 2, …, 25 from (1 - 1.2B + .6B^2)(X_t - 50) = (1 - .5B)a_t

  t    X_t     â_t   |   t    X_t     â_t
  1   49.49   0.00   |  14   50.00   -.42
  2   51.14          |  15   49.70   -.61
  3   49.97  -1.72   |  16   51.73   1.77
  4   49.74          |  17   51.88    .49
  5   50.35    .42   |  18   50.58   -.11
  6   49.57   -.81   |  19   48.78  -1.22
  7   50.39    .70   |  20   49.14    .51
  8   51.04    .65   |  21   48.10  -1.36
  9   49.93   -.78   |  22   49.52    .59
 10   49.78    .08   |  23   50.06   -.23
 11   48.37  -1.38   |  24   50.66    .17
 12   48.85   -.03   |  25   52.05   1.36
 13   49.85    .22   |

Example 6.2: Calculate Forecasts
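Using the last values from the table (X_24 = 50.66, X_25 = 52.05, â_25 = 1.36), the difference-equation forecasts can be sketched in Python; small discrepancies from the values quoted later (51.40, 50.46, 49.73, 49.42) come from the two-decimal rounding of the tabled inputs.

```python
# Difference-equation forecasts for Example 6.2,
# (1 - 1.2B + .6B^2)(X_t - 50) = (1 - .5B)a_t, from t0 = 25.
# X_24, X_25 and the last residual estimate are taken from the table.
phi1, phi2, theta1, mu = 1.2, -0.6, 0.5, 50.0
x24, x25, a25 = 50.66, 52.05, 1.36

f = []
for ell in range(1, 5):
    prev1 = f[-1] if ell >= 2 else x25                       # Xhat(l-1) or X_25
    prev2 = f[-2] if ell >= 3 else (x25 if ell == 2 else x24)
    ahead = phi1 * (prev1 - mu) + phi2 * (prev2 - mu)
    if ell == 1:
        ahead -= theta1 * a25       # MA term enters only while a_t is known
    f.append(mu + ahead)

print([round(v, 2) for v in f])     # [51.38, 50.43, 49.69, 49.37]
```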

Eventual Forecast Function Recall:

Eventual Forecast Function

Notes:

Example 6.1 (cont.): Forecasts using the AR(1) Model
What should forecasts look like?
- like rho_k for an AR(1)
- a damped exponential, C_1(.8)^k


Example 6.2 (cont) Forecasts using the ARMA(2,1) Model

Basic Formula for Calculating Forecasts from Difference Equation

Probability Limits for Forecasts Recall:

Note: If the white noise is normal, then the l-step-ahead forecast errors are normal with mean zero and variance sigma_a^2 (psi_0^2 + psi_1^2 + … + psi_{l-1}^2).

100(1 - alpha)% Probability Limits for Forecasts: Xhat_t0(l) +/- z_{1-alpha/2} sigma_a (sum_{j=0}^{l-1} psi_j^2)^{1/2} (see Theorem 2.3).

Example 6.2 (cont.): half-widths via psi.weights.wge

 l   Forecast   Half-width
 1    51.40       1.70
 2    50.46       2.07
 3    49.73       2.11
 4    49.42       2.12
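The half-widths above can be reproduced (up to rounding) from the psi-weight recursion for the ARMA(2,1). A sketch in Python, backing sigma_a out of the tabled l = 1 half-width; the last digits differ slightly from the table because of that rounding:

```python
# Psi weights for Example 6.2's ARMA(2,1) via the recursion
# psi_0 = 1, psi_k = phi1*psi_{k-1} + phi2*psi_{k-2} - theta1*(k == 1),
# then half-widths  z * sigma_a * sqrt(sum_{j<l} psi_j^2).
# sigma_a is backed out of the tabled l = 1 half-width (1.70 = 1.96*sigma_a).
phi1, phi2, theta1, z = 1.2, -0.6, 0.5, 1.96

psi = [1.0]
for k in range(1, 4):
    p = phi1 * psi[-1] + phi2 * (psi[-2] if k >= 2 else 0.0)
    if k == 1:
        p -= theta1
    psi.append(p)
# psi == [1.0, 0.7, 0.24, -0.132]

sigma_a = 1.70 / z
half = [z * sigma_a * sum(w * w for w in psi[:l]) ** 0.5 for l in range(1, 5)]
print([round(h, 2) for h in half])   # [1.7, 2.08, 2.11, 2.13]
```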

Example 6.2 Forecasts with 95% Prediction Limits

tswge demo
To forecast from a stationary ARMA model use
fore.arma.wge(x, phi, theta, n.ahead, lastn = FALSE, plot = TRUE, limits = TRUE)

Example 6.1: Forecasts using the AR(1) Model
data(fig6.1nf)
fore.arma.wge(fig6.1nf, phi = .8, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = FALSE)
fore.arma.wge(fig6.1nf, phi = .8, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = TRUE)
fore.arma.wge(fig6.1nf, phi = .8, n.ahead = 20, lastn = TRUE, plot = TRUE, limits = FALSE)

tswge demo (continued)
Example 6.2: Forecasts using the ARMA(2,1) Model
data(fig6.2nf)
fore.arma.wge(fig6.2nf, phi = c(1.2, -.6), theta = .5, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = FALSE)
fore.arma.wge(fig6.2nf, phi = c(1.2, -.6), theta = .5, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = TRUE)

Forecasting with ARUMA Models
* Formally, proceed as in the stationary case.

Notes: (1) Consider the ARUMA model as a limit of stationary models.

Example 6.3:

Example 6.3:

Example 6.4:

Example 6.5:
Note: forecasts follow a line determined by the last 2 data values.
In General: forecasts follow a polynomial of degree d - 1 that passes through the last d time points.
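The (1 - B)^2 claim can be sketched directly (the last two data values below are hypothetical): the difference-equation forecasts Xhat(l) = 2 Xhat(l-1) - Xhat(l-2) advance by a constant slope, i.e., they lie on the line through the last two observations.

```python
# Forecasts from (1 - B)^2 X_t = a_t satisfy Xhat(l) = 2*Xhat(l-1) - Xhat(l-2),
# so they lie on the line through the last two observations.
# The last two values below are hypothetical.
x_prev, x_last = 48.3, 49.1
slope = x_last - x_prev             # equivalently: f[i] = x_last + (i+1)*slope

f, p2, p1 = [], x_prev, x_last
for _ in range(5):
    nxt = 2 * p1 - p2               # difference-equation forecast
    f.append(nxt)
    p2, p1 = p1, nxt

print(f)                            # linear extrapolation: 49.9, 50.7, 51.5, ...
```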

tswge demo
fore.aruma.wge(x, phi, theta, d, s, lambda, n.ahead, lastn = FALSE, plot = TRUE, limits = TRUE)

Example 6.4: Forecasts using the ARUMA(1,1,0) Model, (1 - .8B)(1 - B)X_t = a_t
x = gen.aruma.wge(n = 200, phi = .8, d = 1)
fore.aruma.wge(x, phi = .8, d = 1, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = TRUE)

Example 6.5: Forecasts using the ARUMA(0,2,0) Model, (1 - B)^2 X_t = a_t
x = gen.aruma.wge(n = 50, d = 2)
fore.aruma.wge(x, d = 2, n.ahead = 20, lastn = FALSE, plot = TRUE, limits = TRUE)

Example:

Example 6.6: Seasonal Model
Forecasts: Xhat_t0(l) = X_{t0+l-s} for l = 1, …, s, and the pattern repeats, i.e., forecasts are an exact replica of the last s values.
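A sketch of the pure seasonal case (the quarterly values below are hypothetical): forecasting (1 - B^4)X_t = a_t simply replays the last four observations.

```python
# Forecasts from the purely seasonal model (1 - B^4) X_t = a_t simply
# replay the last s = 4 observations: Xhat(l) = X_{t0 + l - 4}.
# The data below are hypothetical.
s = 4
x = [10.2, 14.8, 13.1, 9.5, 10.6, 15.0, 13.4, 9.9]   # last two "years"

history = list(x)
forecasts = []
for _ in range(8):                   # forecast two seasonal cycles ahead
    forecasts.append(history[-s])    # copy the value s steps back
    history.append(forecasts[-1])

print(forecasts)                     # the last 4 observations, repeated twice
```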

More general seasonal model forecasts follow the general pattern of the last s points, but are not an exact replica; "finer detail" is incorporated through phi(B) and theta(B).

tswge demo
fore.aruma.wge(x, phi, theta, d, s, lambda, n.ahead, lastn = FALSE, plot = TRUE, limits = TRUE)

Forecasts from the pure seasonal model (1 - B^4)X_t = a_t
x = gen.aruma.wge(n = 20, s = 4)
fore.aruma.wge(x, s = 4, n.ahead = 8, lastn = FALSE, plot = TRUE, limits = FALSE)
fore.aruma.wge(x, s = 4, n.ahead = 8, lastn = TRUE, plot = TRUE, limits = FALSE)

Example: Forecasts using the seasonal model (1 - .9B)(1 - B^4)X_t = a_t
x = gen.aruma.wge(n = 20, phi = .9, s = 4)
fore.aruma.wge(x, phi = .9, s = 4, n.ahead = 8, lastn = FALSE, plot = TRUE, limits = FALSE)
Also try lastn = TRUE.

Example 6.8: Seasonal Model with Trend (airline model)
Note that the full model characteristic equation has 2 roots of +1 (see Table 5.5).
Log airline data and forecasts from t0 = 108.

tswge demo – log airline data
phi_1(B)(1 - B)(1 - B^12)X_t = a_t, where phi_1(B) is the 12th-order operator given in (6.53) of Woodward, Gray, and Elliott (2017)
data(airlog)
phi1 = c(-.36, -.05, -.14, -.11, .04, .09, -.02, .02, .17, .03, -.10, -.38)
fore.aruma.wge(airlog, phi = phi1, d = 1, s = 12, n.ahead = 36, lastn = TRUE, plot = TRUE, limits = FALSE)

Example 6.9: Cyclic Models
Forecasts follow a non-damping sinusoid (adaptive).

Forecasts: Note: It can be shown that the forecasts satisfy

Signal-plus-Noise Forecasts: X_t = s_t + Z_t
The text considers

Forecasting Strategy (linear case)
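One plausible reading of the linear-case strategy, sketched in Python (this is not the fore.sigplusnoise.wge implementation; phi is assumed known here, and the realization is simulated for illustration): fit the line by least squares, forecast the residuals with an AR(1), and add the two pieces.

```python
import numpy as np

# Sketch of a linear signal-plus-noise forecast (not fore.sigplusnoise.wge):
# (1) fit the line b0 + b1*t by least squares, (2) forecast the residuals
# with an AR(1), (3) add the two pieces back together.  phi is assumed
# known here; in practice it would be estimated from the residuals.
rng = np.random.default_rng(0)
n, phi = 50, 0.8
t = np.arange(1, n + 1)
z = np.zeros(n)
for i in range(1, n):                 # AR(1) noise around the line
    z[i] = phi * z[i - 1] + rng.normal()
x = 10 + 0.2 * t + z                  # hypothetical realization

b1, b0 = np.polyfit(t, x, 1)          # least-squares slope and intercept
resid = x - (b0 + b1 * t)

h = np.arange(1, 11)                  # 10 steps ahead
line_part = b0 + b1 * (n + h)         # extrapolated trend
noise_part = resid[-1] * phi ** h     # AR(1) residual forecasts decay to zero
forecast = line_part + noise_part
# forecasts approach the extrapolated trend line as h grows
```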

tswge demo
fore.sigplusnoise.wge(x, linear = TRUE, freq = 0, max.p = 5, n.ahead = 10, lastn = FALSE, plot = TRUE, limits = TRUE)
x = gen.sigplusnoise.wge(n = 50, b0 = 10, b1 = .2, phi = .8)
xfore = fore.sigplusnoise.wge(x, linear = TRUE, n.ahead = 10, lastn = FALSE, limits = FALSE)

Effect of starting values on ARUMA Model


Observation: Forecasts from ARMA/ARUMA models are more adaptive than signal-plus-noise forecasts. The application may dictate whether this adaptability is desirable.