Time Series and Forecasting BIA 674 SUPPLY CHAIN ANALYTICS Additional Material.


Stationarity

We know that not all time series are stationary. However, it is easy to convert a trended or seasonal time series to a stationary one: simply use the concept of "differencing."

Differencing

Convert a trended time series to a stationary time series using the differencing method

Convert a seasonal time series to a stationary time series

Converting a time series to a stationary one: non-constant variance can be removed by applying a natural log transformation.
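As a sketch of these ideas, differencing can be implemented in a few lines; the series below are made up for illustration and are not data from the slides:

```python
import math

def difference(series, lag=1):
    """Difference a series: y_t minus y_{t-lag}."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# Trended series: first differences are constant, i.e. the trend is removed.
trend = [3 * t + 10 for t in range(8)]
diffs = difference(trend)                      # -> [3, 3, 3, 3, 3, 3, 3]

# Quarterly seasonal series: differencing at the seasonal lag (4) removes the pattern.
seasonal = [20, 35, 50, 25] * 3
seasonal_diffs = difference(seasonal, lag=4)   # -> all zeros

# Non-constant variance: take natural logs before differencing.
growing = [100, 110, 121, 133.1]
log_series = [math.log(y) for y in growing]
```

Differencing at lag 1 handles trend, differencing at the seasonal lag handles seasonality, and the two can be combined for series that have both.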

Autocorrelation

The Problem. The stereo sales data suggest that the pattern of sales is not completely random: large values tend to follow large values, and small values tend to follow small values. The time series may be autocorrelated, i.e., successive observations are correlated with one another. Do the autocorrelations support this conclusion?

Autocorrelations. Recall that successive observations in a random series are probabilistically independent of one another. Many time series violate this property and are instead autocorrelated. A correlation coefficient is a summary statistic that measures the extent of the linear relationship between two variables; as such, it can be used to identify explanatory relationships. The "auto" means that successive observations of the same series are correlated with one another. To understand autocorrelations, it is first necessary to understand what it means to lag a time series.

Autocorrelations in Excel To lag by 1 month, we simply “push down” the series by one row. Lags are simply previous observations, removed by a certain number of periods from the present time.

Lags and Autocorrelations for Stereo Sales

Autocorrelation. In evaluating time series data, it is useful to look at the correlation between successive observations over time. This measure of correlation is called autocorrelation and may be calculated as follows:

r_k = Σ (from t = k+1 to n) (y_t − ȳ)(y_{t−k} − ȳ) / Σ (from t = 1 to n) (y_t − ȳ)²

where r_k = autocorrelation coefficient for a k-period lag, ȳ = mean of the time series, y_t = value of the time series at period t, and y_{t−k} = value of the time series k periods before period t.
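The coefficient r_k can be computed directly from this definition; the toy series below is invented for illustration:

```python
def autocorr(y, k):
    """Autocorrelation coefficient r_k for a k-period lag:
    sum of (y_t - ybar)(y_{t-k} - ybar) over t = k+1..n,
    divided by the sum of (y_t - ybar)^2 over the whole series."""
    n = len(y)
    ybar = sum(y) / n
    num = sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, n))
    den = sum((v - ybar) ** 2 for v in y)
    return num / den

series = [2, 4, 6, 8, 10, 12]     # strongly trended toy series
r1 = autocorr(series, 1)          # -> 0.5, a large lag-1 autocorrelation
```

Computing r_1, r_2, … in a loop gives exactly the ACF values that a correlogram plots against the lag.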

Correlograms: An Alternative Method of Data Exploration. The plot of the autocorrelation function (ACF) versus time lag is called a correlogram. The horizontal scale is the time lag; the vertical axis is the autocorrelation coefficient. Patterns in a correlogram are used to analyze key features of the data.

AUTOCORRELATION

Example Autocorrelation.xls

Autocorrelation. Autocorrelation coefficients for different time lags can be used to answer the following questions about a time series. Are the data random? If the autocorrelations between y_t and y_{t−k} are close to zero for all lags, then the successive values of the time series are not related to each other.

Correlograms: An Alternative Method of Data Exploration. Is there a trend? If the series has a trend, y_t and y_{t−k} are highly correlated: the autocorrelation coefficients are significantly different from zero for the first few lags and then gradually drop toward zero. The autocorrelation coefficient at lag 1 is often very large (close to 1). A series that contains a trend is said to be non-stationary.

Example: Mobile Home Shipments. Correlogram for the mobile home shipments. Note that these are quarterly data.

Correlograms: An Alternative Method of Data Exploration. Is there a seasonal pattern? If a series has a seasonal pattern, there will be a significant autocorrelation coefficient at the seasonal time lag or at multiples of the seasonal lag. The seasonal lag is 4 for quarterly data and 12 for monthly data.

Correlograms: An Alternative Method of Data Exploration Is it stationary? A stationary time series is one whose basic statistical properties, such as the mean and variance, remain constant over time. Autocorrelation coefficients for a stationary series decline to zero fairly rapidly, generally after the second or third time lag.

Correlograms: An Alternative Method of Data Exploration. To determine whether the autocorrelation at lag k is significantly different from zero, the following hypotheses and rule of thumb may be used: H₀: ρ_k = 0, Hₐ: ρ_k ≠ 0. For any k, reject H₀ if |r_k| > 2/√n, where n is the number of observations. This rule of thumb is for α = 5%.

Correlograms: An Alternative Method of Data Exploration. The hypothesis test developed to determine whether a particular autocorrelation coefficient is significantly different from zero is: Hypotheses: H₀: ρ_k = 0, Hₐ: ρ_k ≠ 0. Test statistic: z = r_k / (1/√n) = r_k √n. Reject H₀ if |z| > z_{α/2}.
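A minimal sketch of this rule, using hypothetical ACF values: an autocorrelation is flagged when |r_k| exceeds z/√n, which with z = 1.96 is essentially the 2/√n rule of thumb at α = 5%:

```python
import math

def significant_lags(acf_values, n, z=1.96):
    """Return the lags whose |r_k| exceeds z / sqrt(n),
    the approximate 95% cutoff for a white-noise series."""
    cutoff = z / math.sqrt(n)
    return [k for k, r in enumerate(acf_values, start=1) if abs(r) > cutoff]

# Hypothetical autocorrelations for a series of n = 36 observations.
acfs = [0.62, 0.41, 0.18, 0.05, -0.02]
flagged = significant_lags(acfs, n=36)   # cutoff = 1.96/6 ≈ 0.327 -> lags 1 and 2
```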

Example: Japanese Exchange Rate. As the world's economy becomes increasingly interdependent, exchange rates between currencies have become important in making business decisions. For many U.S. businesses, the Japanese exchange rate (in yen per U.S. dollar) is an important decision variable. A time series plot of the Japanese-yen/U.S.-dollar exchange rate is shown below. On the basis of this plot, would you say the data are stationary? Is there any seasonal component in this time series plot?

Example: Japanese Exchange Rate

Here is the autocorrelation structure for EXRJ. With a sample size of 12, the critical value is 2/√12 ≈ 0.58. This is the approximate 95% critical value for rejecting the null hypothesis of zero autocorrelation at lag k.

Example: Japanese Exchange Rate. The correlogram for EXRJ is given below.

Example: Japanese Exchange Rate. Since the autocorrelation coefficients fall below the critical value after just two periods, we can conclude that there is no trend in the data. No seasonality is observed.

Example: Japanese Exchange Rate. To check for seasonality at α = .05, the hypotheses are: H₀: ρ₁₂ = 0, Hₐ: ρ₁₂ ≠ 0. The test statistic is z = r₁₂ √n; reject H₀ if |z| > z_{.025}.

Example: Japanese Exchange Rate. Since the test statistic does not exceed the critical value, we do not reject H₀; seasonality does not appear to be an attribute of the data.

ACF of Forecast Error The autocorrelation function of the forecast errors is very useful in determining if there is any remaining pattern in the errors (residuals) after a forecasting model has been applied. This is not a measure of accuracy, but rather can be used to indicate if the forecasting method could be improved.

Random Series

Time Series Plot of Demand for Parts

Visual Inspection Demands vary randomly around the sample mean of $ (shown as the horizontal centerline). The variance appears to be constant through time, and there are no obvious time series patterns. To check formally whether this apparent randomness holds, we calculate the first 10 autocorrelations.

Findings None of the autocorrelations is significantly large. These findings are consistent with randomness. For all practical purposes there is no time series pattern to these demand data.

AUTOCORRELATION. Panels: (a) (quite) stationary; (b) (rather) non-stationary; (c) trend; (d) seasonality (quarterly).

The Random Walk Model

Random Walk Model. Random series are sometimes building blocks for other time series models; the random walk model is an example. In the random walk model the series itself is not random. However, its differences, that is, the changes from one period to the next, are random. This type of behavior is typical of stock price data.

Solution. The Dow Jones series itself is not random, due to its upward trend, so we form the differences in column C with the formula =B7-B6, which is copied down column C. The differences can be seen on the next slide. A graph of the differences (see the graph following the data) shows the series to be much more random, varying around the mean difference. The runs test appears in column H and shows that there is no evidence of nonrandom differences; the observed number of runs is almost identical to the expected number.

Differences for Dow Jones Data

Time Series Plot of Dow Differences

Solution -- continued. Similarly, the autocorrelations are all small except for a random "blip" at lag 11. Because these values are 11 months apart, we would tend to ignore this autocorrelation. Assuming the random walk model is adequate, the forecast of April 1992 made in March 1992 is the observed March value plus the mean difference. A measure of the forecast accuracy is the standard deviation of the differences. We can be 95% certain that our forecast will be within two standard deviations.

Additional Forecasting. If we wanted to forecast further into the future, say 3 months, based on the data through March 1992, we would add three times the mean difference to the most recent value. That is, we just project the trend that far into the future. We caution against forecasting too far into the future for such a volatile series as the Dow.
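The random-walk-with-drift forecast just described (last observed value plus h times the mean difference) can be sketched as follows; the index levels are hypothetical, not the actual Dow data:

```python
def random_walk_forecast(series, h=1):
    """h-step forecast under a random walk with drift:
    last observed value plus h times the mean difference."""
    diffs = [series[t] - series[t - 1] for t in range(1, len(series))]
    drift = sum(diffs) / len(diffs)        # mean of the period-to-period changes
    return series[-1] + h * drift

# Hypothetical index levels; mean difference = (25 - 15 + 50 + 30) / 4 = 22.5.
levels = [3000, 3025, 3010, 3060, 3090]
one_ahead = random_walk_forecast(levels)        # 3090 + 22.5 = 3112.5
three_ahead = random_walk_forecast(levels, 3)   # 3090 + 3 * 22.5 = 3157.5
```

The standard deviation of the differences would then serve as the measure of forecast accuracy, as described above.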

Autoregressive Models

The Plot and Data. A retailer has recorded its weekly sales of hammers (units purchased) for the past 42 weeks. The data are found in the file. The graph of this time series appears below and reveals a "meandering" behavior.

The Plot and Data. The values begin high and stay high for a while, then get lower and stay lower for a while, then get higher again. This behavior could be caused by any number of things. How useful is autoregression for modeling these data, and how would it be used for forecasting?

Autocorrelations A good place to start is with the autocorrelations of the series. These indicate whether the Sales variable is linearly related to any of its lags. The first six autocorrelations are shown below.

Autocorrelations -- continued The first three of them are significantly positive, and then they decrease. Based on this information, we create three lags of Sales and run a regression of Sales versus these three lags. Here is the output from this regression

Autoregression Output with Three Lagged Variables

Autocorrelations -- continued. We see that R² is fairly high, about 57%, and we note the standard error of estimate s_e. However, the p-values for lags 2 and 3 are both quite large. It appears that once the first lag is included in the regression equation, the other two are not really needed. Therefore we reran the regression with only the first lag included.

Autoregression Output with a Single Lagged Variable

Forecasts from Autoregression. This graph shows the original Sales variable and its forecasts.

Regression Equation. The estimated regression equation is Forecasted Sales_t = a + 0.793 Sales_{t−1}, where a is the intercept from the output above. The associated R² is approximately 65%. The R² is a measure of the reasonably good fit we see in the previous graph, whereas s_e is a measure of the likely forecast error for short-term forecasts: a short-term forecast could easily be off by as much as two standard errors, or about 31 hammers.

Regression Equation -- continued. To use the regression equation for forecasting future sales values, we substitute known or forecasted sales values into the right-hand side of the equation. Specifically, the forecast for week 43, the first week after the data period, is approximately 98.6, from ForecastedSales₄₃ = a + 0.793 Sales₄₂. The forecast for week 44 is approximately 92.0 and requires the forecasted value of sales in week 43: ForecastedSales₄₄ = a + 0.793 ForecastedSales₄₃.
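A lag-1 autoregression of this kind can be fit with ordinary least squares and then iterated forward, feeding each forecast back into the equation. The sales numbers below are hypothetical, so the fitted coefficients are illustrative only (they are not the 0.793 equation from the slides):

```python
def fit_ar1(y):
    """Least-squares fit of y_t = a + b * y_{t-1}:
    a simple regression of the series on its own first lag."""
    x, z = y[:-1], y[1:]                  # lagged and current values
    n = len(x)
    xbar, zbar = sum(x) / n, sum(z) / n
    b = (sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
         / sum((xi - xbar) ** 2 for xi in x))
    a = zbar - b * xbar
    return a, b

def iterate_forecasts(a, b, last, h):
    """Feed each forecast back into the equation to go h steps ahead."""
    out = []
    for _ in range(h):
        last = a + b * last
        out.append(last)
    return out

# Hypothetical weekly hammer sales.
sales = [100, 96, 104, 110, 101, 95, 99, 105]
a, b = fit_ar1(sales)
next_two = iterate_forecasts(a, b, sales[-1], 2)
```

When the fitted slope b is positive but less than 1, the iterated forecasts drift back toward the series mean — exactly the dampening behavior discussed on the next slide.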

Forecasts Perhaps these two forecasts of future sales are on the mark and perhaps they are not. The only way to know for certain is to observe future sales values. However, it is interesting that in spite of the upward movement in the series, the forecasts for weeks 43 and 44 are downward movements.

Regression Equation Properties. The downward trend is caused by a combination of two properties of the regression equation. First, the coefficient of Sales_{t−1}, 0.793, is positive, so the equation forecasts that large sales will be followed by large sales (that is, positive autocorrelation). Second, however, this coefficient is less than 1, and this provides a dampening effect: the equation forecasts that a large value will follow a large value, but not one quite as large.

Seasonality Indexes
Ratio-to-moving-average
Monthly seasonality indexes
Using moving averages to smooth data and eliminate seasonality

Seasonal Variation. One of the components of a time series. Seasonal variations are fluctuations that coincide with certain seasons and are repeated year after year. Understanding seasonal fluctuations helps in planning for sufficient goods and materials on hand to meet varying seasonal demand. Analysis of seasonal fluctuations over a period of years also helps in evaluating current sales.

Seasonal Component
PERIOD LENGTH | "SEASON" LENGTH | NUMBER OF "SEASONS" IN PATTERN
Week | Day | 7
Month | Week | 4 – 4.5
Month | Day | 28 – 31
Year | Quarter | 4
Year | Month | 12
Year | Week | 52

Seasonal Index. A number, usually expressed in percent, that expresses the relative value of a season with respect to the average for the year (100%). Ratio-to-moving-average method: the method most commonly used to compute the typical seasonal pattern. It eliminates the trend (T), cyclical (C), and irregular (I) components from the time series.

Ratio-to-moving Average - Example 1. The table below shows the quarterly sales for Toys International for the years beginning in 2001. The sales are reported in millions of dollars. Determine a quarterly seasonal index using the ratio-to-moving-average method.

Step (1): Organize the time series data in column form.
Step (2): Compute the 4-quarter moving totals.
Step (3): Compute the 4-quarter moving averages.
Step (4): Compute the centered moving averages by averaging two successive 4-quarter moving averages.
Step (5): Compute the ratio by dividing actual sales by the centered moving averages.
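The moving-total, moving-average, centering, and ratio steps can be sketched as follows; the quarterly sales are made up and are not the Toys International figures:

```python
def ratio_to_moving_average(sales):
    """For quarterly data: 4-quarter moving totals and averages,
    centered moving averages, and actual/centered ratios."""
    totals = [sum(sales[i:i + 4]) for i in range(len(sales) - 3)]      # moving totals
    averages = [t / 4 for t in totals]                                 # moving averages
    centered = [(averages[i] + averages[i + 1]) / 2                    # centering
                for i in range(len(averages) - 1)]
    # Ratios line up with quarters 3 .. n-2 (two quarters lost at each end).
    ratios = [sales[i + 2] / centered[i] for i in range(len(centered))]
    return centered, ratios

# Two years of hypothetical quarterly sales with a strong Q3 peak.
quarters = [6, 4, 10, 8, 7, 5, 11, 9]
centered, ratios = ratio_to_moving_average(quarters)
```

Averaging the ratios that belong to the same quarter (and rescaling them so they average 1) would then give the quarterly seasonal indexes.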

Ratio-to-moving Average - Example 1

Ratio-to-moving Average - Example 1. Deseasonalized Sales = Sales / Seasonal Index


Ratio-to-moving Average - Example 1. Given the deseasonalized linear equation for Toys International sales as Ŷ = a + b·t, generate the seasonally adjusted forecast for each of the quarters of 2007 (Winter, Spring, Summer, Fall). For each quarter, compute t and the unadjusted forecast Ŷ from the trend equation (e.g., at t = 28), then multiply Ŷ by that quarter's seasonal index (e.g., Ŷ × 1.519) to obtain the seasonally adjusted quarterly forecast.

Monthly seasons - Example 2
Steps in the process for monthly seasons:
1. Find the average historical demand for each month.
2. Compute the average demand over all months.
3. Compute a seasonal index for each month.
4. Estimate next year's total demand.
5. Divide this estimate of total demand by the number of months, then multiply it by the seasonal index for that month.
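The steps above can be sketched as a short function. The average monthly demands are the ones implied by the example that follows (they total 1,128, so the overall average is 94), and 1,200 is the Year 4 total demand estimate used there:

```python
def monthly_seasonal_forecast(avg_monthly_demand, next_year_total):
    """Steps 2-5: overall average, per-month seasonal index,
    and the seasonally adjusted monthly forecast."""
    overall = sum(avg_monthly_demand) / len(avg_monthly_demand)      # step 2
    indexes = [d / overall for d in avg_monthly_demand]              # step 3
    base = next_year_total / len(avg_monthly_demand)                 # step 5: per-month share
    return [round(base * idx) for idx in indexes]

# Average monthly demands (Jan..Dec) from the example; overall average = 94.
avg = [90, 80, 85, 100, 123, 115, 105, 100, 90, 80, 80, 80]
forecasts = monthly_seasonal_forecast(avg, 1200)   # Jan: 100 * 90/94 -> 96
```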

Seasonal Index Example. Table: monthly demand for Year 1, Year 2, and Year 3, with columns for the average yearly demand, the average monthly demand, and the seasonal index. Total average annual demand = 1,128.

Seasonal Index Example 2. Same table: the average monthly demand is the total average annual demand divided by 12 months, 1,128 / 12 = 94.

Seasonal Index Example 2. Same table: the seasonal index for a month is its average yearly demand divided by the average monthly demand, e.g., for January, .957 ( = 90/94).

Seasonal Index Example 2
MONTH | AVERAGE YEARLY DEMAND | AVERAGE MONTHLY DEMAND | SEASONAL INDEX
Jan | 90 | 94 | .957 ( = 90/94)
Feb | 80 | 94 | .851 ( = 80/94)
Mar | 85 | 94 | .904 ( = 85/94)
Apr | 100 | 94 | 1.064 ( = 100/94)
May | 123 | 94 | 1.309 ( = 123/94)
June | 115 | 94 | 1.223 ( = 115/94)
July | 105 | 94 | 1.117 ( = 105/94)
Aug | 100 | 94 | 1.064 ( = 100/94)
Sept | 90 | 94 | .957 ( = 90/94)
Oct | 80 | 94 | .851 ( = 80/94)
Nov | 80 | 94 | .851 ( = 80/94)
Dec | 80 | 94 | .851 ( = 80/94)
Total average annual demand = 1,128

Seasonal Index Example 2 - Seasonal forecast for Year 4
MONTH | DEMAND | MONTH | DEMAND
Jan | 1,200 × .957 / 12 = 96 | July | 1,200 × 1.117 / 12 = 112
Feb | 1,200 × .851 / 12 = 85 | Aug | 1,200 × 1.064 / 12 = 106
Mar | 1,200 × .904 / 12 = 90 | Sept | 1,200 × .957 / 12 = 96
Apr | 1,200 × 1.064 / 12 = 106 | Oct | 1,200 × .851 / 12 = 85
May | 1,200 × 1.309 / 12 = 131 | Nov | 1,200 × .851 / 12 = 85
June | 1,200 × 1.223 / 12 = 122 | Dec | 1,200 × .851 / 12 = 85

Seasonal Index Example 2. Chart: monthly demand (Jan through Dec; vertical axis roughly 70 to 130) for Year 1, Year 2, and Year 3 demand, with the Year 4 forecast overlaid.

Eliminate seasonality using moving averages. Moving averages smooth out the noise in the data. For example, each January sales are lower than December sales; an unsuspecting analyst might conclude that there is a downward trend in the data!
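The smoothing idea can be demonstrated with a centered 12-month moving average over made-up monthly data; because every 12-month window contains exactly one full seasonal cycle, the December peak and January dip cancel out of the smoothed series:

```python
def centered_12_month_ma(y):
    """Centered 12-month moving average: each smoothed point averages
    one full year of data, so the seasonal pattern cancels out."""
    totals = [sum(y[i:i + 12]) for i in range(len(y) - 11)]
    return [(totals[i] + totals[i + 1]) / 24 for i in range(len(totals) - 1)]

# Two years of made-up monthly sales: flat at 100, with a Nov-Dec peak (120, 150).
monthly = ([100] * 10 + [120, 150]) * 2
smoothed = centered_12_month_ma(monthly)   # constant: the Dec-to-Jan "drop" vanishes
```

The smoothed series is perfectly flat here, so no analyst would mistake the seasonal January drop for a downward trend.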

Outliers

Outliers in regression are data points where the absolute value of the error (actual value of y – predicted value of y) exceeds two standard errors. Europe.xls

Nonlinearities

Nonlinearities. Independent variables can interact with or influence a dependent variable in nonlinear ways. Priceandads.xls

Mixed Models

MOVING AVERAGE (MA) METHOD

WEIGHTED MA METHOD

AUTOREGRESSION. Coefficients are obtained via least squares (regression).

MIXED MODELS

Data Patterns

Data Pattern. A time series is likely to contain some or all of the following components: trend, seasonal, cyclical, and irregular.

Data Pattern. The trend in a time series is the long-term change in the level of the data, i.e., observations grow or decline over an extended period of time. Positive trend: the series moves upward over an extended period of time. Negative trend: the series moves downward over an extended period of time. Stationary: there is neither a positive nor a negative trend.

Data Pattern Seasonal pattern in time series is a regular variation in the level of data that repeats itself at the same time every year. Examples: Retail sales for many products tend to peak in November and December. Housing starts are stronger in spring and summer than fall and winter.

Data Pattern. Cyclical patterns in a time series are wavelike upward and downward movements of the data around the long-term trend. They are of longer duration and are less regular than seasonal fluctuations. The causes of cyclical fluctuations are usually less apparent than those of seasonal variations.

Data Pattern. Irregular patterns in time series data are the fluctuations that are not part of the other three components. These are the most difficult to capture in a forecasting model.

No Trend (Stationary): demand seems to cluster around a specific level.

Trend: demand consistently increases or decreases over time.

Seasonality

Cyclical

Data Patterns & Model Selection

Data Patterns and Model Selection. Forecasting techniques used for stationary time series data are: naive methods; simple averaging methods; moving averages; simple exponential smoothing; and autoregressive moving average (ARMA) models.

Data Patterns and Model Selection. Methods used for time series data with a trend are: moving averages; Holt's linear exponential smoothing; simple regression; growth curves; exponential models; time series decomposition; and autoregressive integrated moving average (ARIMA) models.

Data Patterns and Model Selection. For time series data with a seasonal component, the goal is to estimate seasonal indexes from historical data. These indexes are used to include seasonality in a forecast or to remove its effect from the observed values. Forecasting methods to be considered for this type of data are: Winters' exponential smoothing; time series multiple regression; and autoregressive integrated moving average (ARIMA) models.

Example: GDP, in 1996 Dollars. For GDP, which has a trend and a cycle but no seasonality, the following might be appropriate: Holt's exponential smoothing; linear regression trend; causal regression; time series decomposition.

Example: Quarterly data on private housing starts. Private housing starts have a trend, seasonality, and a cycle. The likely forecasting models are: Winters' exponential smoothing; linear regression trend with seasonal adjustment; causal regression; time series decomposition.

Example: U.S. billings of the Leo Burnett advertising agency. For U.S. billings of Leo Burnett advertising, there is a nonlinear trend, with no seasonality and no cycle; therefore, the models appropriate for this data set are: nonlinear regression trend; causal regression.

handbook/index.htm