MOVING AVERAGES AND EXPONENTIAL SMOOTHING

Dr. Mohammed Alahmed
http://fac.ksu.edu.sa/alahmed
alahmed@ksu.edu.sa
(011) 4674108

Introduction This chapter introduces models applicable to stationary time series data as well as to series with a trend component, a seasonal component, or both. The forecasting methods discussed in this chapter can be classified as: Averaging methods, which apply equally weighted observations; and Exponential smoothing methods, which apply an unequal set of weights to past data, where the weights decay exponentially from the most recent to the most distant data points. All methods in this group require certain parameters to be defined. These parameters (with values between 0 and 1) determine the unequal weights to be applied to past data.

Introduction Averaging methods If a time series is generated by a constant process subject to random error, then the mean is a useful statistic and can be used as a forecast for the next period. Averaging methods are suitable for stationary time series data, where the series is in equilibrium around a constant value (the underlying mean) with a constant variance over time.

Introduction Exponential smoothing methods The simplest exponential smoothing method is simple exponential smoothing (SES), where only one parameter needs to be estimated. Holt’s method makes use of two different parameters and allows forecasting for series with a trend. The Holt-Winters’ method involves three smoothing parameters to smooth the data, the trend, and the seasonal index. Adaptive-response-rate single exponential smoothing (ADRES) lets the smoothing parameter adapt over time.

Averaging Methods The Mean Uses the average of all the historical data as the forecast. When a new observation becomes available, the forecast for time t + 2 is the mean of the previously observed data plus this new observation. This method is appropriate when there is no noticeable trend or seasonality.
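
A minimal Python sketch of this idea; the function name and the sales values are purely illustrative, not the chapter's notation or data:

```python
def mean_forecast(history):
    """Forecast the next period as the mean of all observations to date."""
    return sum(history) / len(history)

# Hypothetical stationary series: each new observation simply joins the average
sales = [9.2, 8.7, 9.0, 9.4, 8.9]
print(mean_forecast(sales))            # forecast for period 6
print(mean_forecast(sales + [9.1]))    # updated forecast for period 7 once y6 arrives
```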

Averaging Methods The Moving Average The moving average for time period t is the mean of the k most recent observations. The constant k is specified at the outset. The smaller k is, the more weight is given to recent periods; the greater k is, the less weight is given to the most recent periods. A large k is desirable when there are wide, infrequent fluctuations in the series.

Moving Averages A small k is most desirable when there are sudden shifts in the level of the series. For quarterly data, a four-quarter moving average, MA(4), eliminates or averages out seasonal effects; for monthly data, a 12-month moving average, MA(12), does the same. Equal weights are assigned to each observation used in the average. Each new data point is included in the average as it becomes available, and the oldest data point is discarded.

Moving Averages A moving average of order k, MA(k), is the average of the k most recent consecutive observations, where k is the number of terms in the moving average. The moving average model does not handle trend or seasonality very well, although it can do better than the overall mean.
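
A short Python sketch of an MA(k) forecast; the function name and sample sales figures are illustrative, not the department-store data in the example that follows:

```python
def moving_average_forecast(history, k=3):
    """MA(k): forecast the next period as the mean of the k most recent observations."""
    if len(history) < k:
        raise ValueError("need at least k observations")
    return sum(history[-k:]) / k

# Hypothetical weekly sales (millions of dollars)
sales = [5.3, 4.4, 5.4, 5.8, 5.6, 4.8, 5.6]
print(moving_average_forecast(sales, k=3))   # average of the three most recent weeks
```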

Example: Weekly Department Store Sales The weekly sales figures (in millions of dollars) presented in the following table are used by a major department store to determine the need for temporary sales personnel.

Example: Weekly Department Store Sales

Example: Weekly Department Store Sales Use a three-week moving average (k = 3) of the department store sales to forecast weeks 24 and 26. The forecast for week 24 is the average of the sales in weeks 21 through 23, F24 = (y21 + y22 + y23)/3, and the forecast error is e24 = y24 – F24. The forecast for week 26 is F26 = (y23 + y24 + y25)/3.

Example: Weekly Department Store Sales RMSE = 0.63

Summary of Moving Averages
Advantages:
- Easily understood
- Easily computed
- Provides stable forecasts
Disadvantages:
- Requires saving many past data points: at least the k periods used in the moving average computation
- Lags behind a trend
- Ignores complex relationships in the data

Motivation of Exponential Smoothing The simple moving average method assigns equal weights (1/k) to all k data points. Arguably, recent observations provide more relevant information than observations further in the past, so we want a weighting scheme that assigns decreasing weights to the more distant observations.

Exponential Smoothing Methods This method provides an exponentially weighted moving average of all previously observed values. It is appropriate for data with no predictable upward or downward trend. The aim is to estimate the current level and use it as a forecast of future values. This is a natural extension of the moving average method. With simple moving average forecasts, the mean of the past k observations used as a forecast gives equal weights (1/k) to all k data points. With exponential smoothing, the idea is that the most recent observations will usually provide the best guide to the future, so we want a weighting scheme with decreasing weights as the observations get older.

Simple Exponential Smoothing Formally, the exponential smoothing equation is
Ft+1 = αyt + (1 – α)Ft
where
Ft+1 = forecast for the next period
α = smoothing constant
yt = observed value of the series in period t
Ft = old forecast for period t
The forecast Ft+1 is based on weighting the most recent observation yt with a weight α and weighting the most recent forecast Ft with a weight of (1 – α).
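
A minimal Python sketch of this update, assuming the common convention (used later in the chapter) that the first forecast equals the first observation; the function name and sample values are illustrative:

```python
def ses_forecasts(y, alpha, f1=None):
    """Simple exponential smoothing: F[t+1] = alpha*y[t] + (1 - alpha)*F[t].
    Returns the one-step-ahead forecasts F1, F2, ..., F[n+1]."""
    forecasts = [y[0] if f1 is None else f1]     # F1: by default, the first observation
    for obs in y:
        forecasts.append(alpha * obs + (1 - alpha) * forecasts[-1])
    return forecasts

# Hypothetical series; the last element is the forecast for the period after the data end
print(ses_forecasts([92.0, 95.1, 90.3, 92.5], alpha=0.3)[-1])
```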

Simple Exponential Smoothing The implication of exponential smoothing can be better seen if the previous equation is expanded by replacing Ft with its components as follows:
Ft+1 = αyt + (1 – α)[αyt-1 + (1 – α)Ft-1]
     = αyt + α(1 – α)yt-1 + (1 – α)^2 Ft-1

Simple Exponential Smoothing If this substitution process is repeated by replacing Ft-1 by its components, Ft-2 by its components, and so on, the result is:
Ft+1 = αyt + α(1 – α)yt-1 + α(1 – α)^2 yt-2 + … + α(1 – α)^(t-1) y1 + (1 – α)^t F1
Therefore, Ft+1 is a weighted moving average of all past observations.

Simple Exponential Smoothing The following table shows the weights assigned to past observations for α = 0.2, 0.4, and 0.6 (the weight on the observation k periods old is α(1 – α)^k):
α      yt      yt-1    yt-2    yt-3    yt-4
0.2    0.200   0.160   0.128   0.102   0.082
0.4    0.400   0.240   0.144   0.086   0.052
0.6    0.600   0.240   0.096   0.038   0.015

Simple Exponential Smoothing The exponential smoothing equation rewritten in the following form elucidates the role of the weighting factor α:
Ft+1 = Ft + α(yt – Ft)
The exponential smoothing forecast is the old forecast plus an adjustment for the error that occurred in the last forecast.

Simple Exponential Smoothing The value of the smoothing constant α must be between 0 and 1; it cannot be equal to 0 or 1. If stable predictions with smoothed random variation are desired, then a small value of α is desirable. If a rapid response to a real change in the pattern of observations is desired, a large value of α is appropriate. To estimate α, forecasts are computed for α equal to 0.1, 0.2, 0.3, …, 0.9, and the sum of squared forecast errors is computed for each; the value of α with the smallest RMSE is chosen for producing the future forecasts, as in the sketch below.
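
A self-contained Python sketch of this grid search; it assumes the first forecast equals the first observation and that in-sample one-step-ahead errors are used for the RMSE (the function names are mine):

```python
def ses(y, alpha):
    """One-step-ahead SES forecasts F1..Fn, with F1 set to the first observation."""
    f = [y[0]]
    for obs in y[:-1]:
        f.append(alpha * obs + (1 - alpha) * f[-1])
    return f

def rmse(actual, forecast):
    """Root mean squared error over matched actual/forecast pairs."""
    se = [(a - p) ** 2 for a, p in zip(actual, forecast)]
    return (sum(se) / len(se)) ** 0.5

def best_alpha(y):
    """Try alpha = 0.1, 0.2, ..., 0.9 and keep the value with the smallest RMSE."""
    grid = [round(0.1 * i, 1) for i in range(1, 10)]
    scores = {a: rmse(y, ses(y, a)) for a in grid}
    best = min(scores, key=scores.get)
    return best, scores[best]
```

Applied to a series such as the consumer sentiment data in the example that follows, best_alpha returns the grid value with the lowest in-sample RMSE.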

Simple Exponential Smoothing To start the algorithm, we need F1, because F2 = αy1 + (1 – α)F1. Since F1 is not known, we can: set the first estimate equal to the first observation, or use the average of the first five or six observations for the initial smoothed value.

Example: University of Michigan Index of Consumer Sentiment The University of Michigan Index of Consumer Sentiment for January 1995 through December 1996. We want to forecast the index using the simple exponential smoothing method.

Example: University of Michigan Index of Consumer Sentiment Since no forecast is available for the first period, we will set the first estimate equal to the first observation. We try α = 0.3 and α = 0.6.

Example: University of Michigan Index of Consumer Sentiment Note that the first forecast is the first observed value, F1 = y1. The forecasts for Feb. 95 (t = 2) and Mar. 95 (t = 3) are evaluated as follows: F2 = αy1 + (1 – α)F1 and F3 = αy2 + (1 – α)F2.

For α = 0.6, RMSE = 2.66; for α = 0.3, RMSE = 2.96.

Simple Exponential Smoothing
Advantages:
- Simpler than other forms
- Requires limited data
Disadvantages:
- Lags behind actual data
- No trend or seasonality
http://www.excel-easy.com/

Holt’s Exponential smoothing Holt’s two parameter exponential smoothing method is an extension of simple exponential smoothing. It adds a growth factor (or trend factor) to the smoothing equation as a way of adjusting for the trend.

Holt’s Exponential smoothing Three equations and two smoothing constants are used in the model:
The exponentially smoothed series, or current level estimate: Ft+1 = αyt + (1 – α)(Ft + Tt)
The trend estimate: Tt+1 = β(Ft+1 – Ft) + (1 – β)Tt
The forecast m periods into the future: Ht+m = Ft+1 + mTt+1

Holt’s Exponential smoothing
Ft+1 = smoothed value for period t + 1
α = smoothing constant for the data
yt = new observation or actual value of the series in period t
Ft = forecast value for time period t
Tt+1 = trend estimate
β = smoothing constant for the trend estimate
m = number of periods to be forecast into the future
Ht+m = Holt’s forecast value for period t + m

Holt’s Exponential smoothing The weights α and β can be selected subjectively or by minimizing a measure of forecast error such as the RMSE. Large weights result in more rapid changes in the component; small weights result in less rapid changes.

Holt’s Exponential smoothing The initialization process for Holt’s linear exponential smoothing requires two estimates: one to get the first smoothed value F1, and the other to get the initial trend estimate T1. One alternative is to set F1 = y1 and T1 = 0.
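
A minimal Python sketch of Holt’s updates as written above, using the F1 = y1, T1 = 0 initialization adopted in the Acme example that follows; the function name is illustrative:

```python
def holt_forecasts(y, alpha, beta, m=1):
    """Holt's two-parameter smoothing.
    Level:    F[t+1] = alpha*y[t] + (1 - alpha)*(F[t] + T[t])
    Trend:    T[t+1] = beta*(F[t+1] - F[t]) + (1 - beta)*T[t]
    Forecast: H[t+m] = F[t+1] + m*T[t+1]
    """
    level, trend = y[0], 0.0             # F1 = y1, T1 = 0
    one_step = []                        # in-sample one-step-ahead forecasts
    for obs in y:
        one_step.append(level + trend)
        new_level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return one_step, level + m * trend   # fitted values and the m-step-ahead forecast
```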

Example: Quarterly sales of saws for Acme tool company The following table shows the sales of saws for the Acme Tool Company; these are quarterly sales from 1994 through 2000.

Example: Quarterly sales of saws for Acme tool company Examination of the plot shows: a non-stationary time series; seasonal variation seems to exist; sales for the first and fourth quarters are larger than for the other quarters.

Example: Quarterly sales of saws for Acme tool company The plot of the Acme data shows that there might be a trend in the data, so we will try Holt’s model to produce forecasts. We need two initial values: the first smoothed value F1 and the initial trend value T1. We will use the first observation as the estimate of the smoothed value F1 and set the initial trend value T1 = 0. We will use α = 0.3 and β = 0.1.

Example: Quarterly sales of saws for Acme tool company

Example: Quarterly sales of saws for Acme tool company The RMSE for this application, with α = 0.3 and β = 0.1, is RMSE = 155.5. The plot also showed the possibility of seasonal variation that needs to be investigated.

Winters’ Exponential Smoothing Winters’ exponential smoothing model is the second extension of the basic exponential smoothing model. It is used for data that exhibit both trend and seasonality. It is a three-parameter model that is an extension of Holt’s method: an additional equation adjusts the model for the seasonal component.

Winters’ Exponential Smoothing The four equations necessary for Winters’ multiplicative method are:
The exponentially smoothed series: Ft = αyt/St-s + (1 – α)(Ft-1 + Tt-1)
The trend estimate: Tt = β(Ft – Ft-1) + (1 – β)Tt-1
The seasonality estimate: St = γyt/Ft + (1 – γ)St-s

Winters’ Exponential Smoothing The forecast m periods into the future: Wt+m = (Ft + mTt)St-s+m
where
Ft = smoothed value for period t
α = smoothing constant for the data
yt = new observation or actual value in period t
Ft-1 = average experience of the series smoothed to period t – 1
Tt = trend estimate
β = smoothing constant for the trend estimate
St = seasonal component estimate
γ = smoothing constant for the seasonality estimate
m = number of periods in the forecast lead period
s = length of seasonality (number of periods in the season)
Wt+m = forecast for m periods into the future

Winters’ Exponential Smoothing As with Holt’s linear exponential smoothing, the weights α, β, and γ can be selected subjectively or by minimizing a measure of forecast error such as the RMSE. As with all exponential smoothing methods, we need initial values for the components to start the algorithm: the initial values for the level Ft, the trend Tt, and the seasonal indices St must be set.

Winters’ Exponential Smoothing To determine initial estimates of the seasonal indices we need to use at least one complete season’s data (i.e., s periods); therefore, we initialize the trend and the level at period s.
Initialize the level as: Fs = (y1 + y2 + … + ys)/s
Initialize the trend as: Ts = [(ys+1 – y1) + (ys+2 – y2) + … + (y2s – ys)]/s^2
Initialize the seasonal indices as: Si = yi/Fs, for i = 1, 2, …, s
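
A compact Python sketch of Winters’ multiplicative updates with the level and seasonal-index initialization above, except that the initial trend is set to zero for simplicity; the function name and that simplification are mine, and the seasonal index used for the future forecast assumes a lead time of at most one season:

```python
def winters_forecasts(y, s, alpha, beta, gamma, m=1):
    """Winters' multiplicative smoothing for a series y with season length s."""
    season_mean = sum(y[:s]) / s
    level, trend = season_mean, 0.0                      # level from the first season; trend simplified to 0
    seasonals = [y[i] / season_mean for i in range(s)]   # initial seasonal indices
    one_step = []                                        # in-sample one-step-ahead forecasts
    for t in range(s, len(y)):
        s_idx = seasonals[t % s]                         # index from one season earlier
        one_step.append((level + trend) * s_idx)
        new_level = alpha * y[t] / s_idx + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        seasonals[t % s] = gamma * y[t] / new_level + (1 - gamma) * s_idx
        level = new_level
    future = (level + m * trend) * seasonals[(len(y) - 1 + m) % s]
    return one_step, future
```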

Winters’ Exponential Smoothing We will apply Winters’ method to the Acme Tool Company sales. The value for α is 0.4, the value for β is 0.1, and the value for γ is 0.3. The smoothing constant α smooths the data to eliminate randomness, the smoothing constant β smooths the trend in the data set, and the smoothing constant γ smooths the seasonality in the data. The initial values for the smoothed series Ft, the trend Tt, and the seasonal index St must be set.

Example: Quarterly Sales of Saws for Acme tool

Example: Quarterly Sales of Saws for Acme tool The RMSE for this application, with α = 0.4, β = 0.1, and γ = 0.3, is RMSE = 83.36. Note the decrease in RMSE.

Adaptive-Response-Rate Single Exponential Smoothing (ADRES) This method is the same as simple exponential smoothing except that it uses a different α value (smoothing factor) for each period. The value of α is determined in such a way that it adapts to changes in the basic pattern of the data. Other than its ability to adapt to changing circumstances, this method is the same as the simple smoothing model, so it is applied to data that have little trend or seasonality.

Simple exponential smoothing model: we choose the α value by selecting the value that minimizes the RMSE associated with the model; we are allowed to choose only a single value for α. ADRES smoothing model: we allow the α value to adapt as the data change.

Adaptive-Response-Rate Single Exponential Smoothing (ADRES)
Ft+1 = αtyt + (1 – αt)Ft
where:
αt = |St / At|
St = βet + (1 – β)St-1 (smoothed error)
At = β|et| + (1 – β)At-1 (absolute smoothed error)
et = yt – Ft (error)
As you can see from the formulas, there may be a different α value for each period. In most cases β is assigned a value of either 0.1 or 0.2 (see Table 3.5 on pg. 116).
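
A Python sketch of these ADRES updates; the handling of the first period (forecast equal to the first observation, smoothed errors starting at zero) and the guard against division by zero are my assumptions, and the function name is illustrative:

```python
def adres_forecasts(y, beta=0.2, f1=None):
    """Adaptive-response-rate SES: alpha[t] = |S[t] / A[t]| is recomputed each period."""
    forecast = y[0] if f1 is None else f1
    smoothed_err, abs_err = 0.0, 0.0
    fitted = []                                    # in-sample forecasts F1..Fn
    for obs in y:
        fitted.append(forecast)
        err = obs - forecast                                     # e[t] = y[t] - F[t]
        smoothed_err = beta * err + (1 - beta) * smoothed_err    # S[t]
        abs_err = beta * abs(err) + (1 - beta) * abs_err         # A[t]
        alpha = abs(smoothed_err / abs_err) if abs_err else beta
        forecast = alpha * obs + (1 - alpha) * forecast          # F[t+1]
    return fitted, forecast                        # fitted values and the next-period forecast
```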

Table 3-5

Using Single, Holt’s or ADRES Smoothing to Forecast a Seasonal Data Series When we observe a seasonal pattern in the data, we can either use Winters’ smoothing model or apply the following steps (sketched in code below):
1. Calculate seasonal indices for the series.
2. Deseasonalize the original data by dividing each value by its corresponding seasonal index.
3. Apply a forecasting method (such as SES, Holt’s, or ADRES) to the deseasonalized series to produce an intermediate forecast of the deseasonalized data.
4. Reseasonalize the series by multiplying each deseasonalized forecast by its corresponding seasonal index.
(See Table 3.6 and Figure 3.7.)
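
A small Python sketch of steps 2 through 4, assuming the seasonal indices have already been computed (step 1); the function names, data values, and index values are hypothetical:

```python
def reseasonalized_forecast(y, seasonal_indices, smooth):
    """Deseasonalize, forecast with a non-seasonal method, then reseasonalize."""
    s = len(seasonal_indices)
    deseasonalized = [obs / seasonal_indices[t % s] for t, obs in enumerate(y)]
    intermediate = smooth(deseasonalized)                # forecast of the deseasonalized series
    return intermediate * seasonal_indices[len(y) % s]   # put the seasonality back

def ses_next(series, alpha=0.3):
    """Simple exponential smoothing used here as the non-seasonal forecasting step."""
    f = series[0]
    for obs in series:
        f = alpha * obs + (1 - alpha) * f
    return f

# Hypothetical quarterly data and seasonal indices (one index per quarter)
quarterly_sales = [500, 350, 250, 400, 450, 350, 200, 300]
indices = [1.2, 0.9, 0.7, 1.2]
print(reseasonalized_forecast(quarterly_sales, indices, ses_next))
```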

Table 3-6

Figure 3-7

Event Modeling This technique helps to improve forecast accuracy by taking into consideration special events such as promotions or natural disasters. The logic of event models is similar to that of seasonal models: just as each period (i.e., a month or a quarter) is assigned its own index for seasonality, each event type is assigned its own index for a specific promotional activity. See the event values used by a condiment manufacturer on pg. 121.
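
A toy Python sketch of the analogy; the event names and index values below are made up for illustration, not the condiment manufacturer’s values referenced above:

```python
# Hypothetical event indices, used like seasonal indices (1.00 = no special event)
event_indices = {"none": 1.00, "coupon_drop": 1.25, "price_promotion": 1.40}

def event_adjusted_forecast(base_forecast, event_type):
    """Scale a baseline (event-free) forecast by the index of the event scheduled for that period."""
    return base_forecast * event_indices[event_type]

print(event_adjusted_forecast(500.0, "price_promotion"))   # 500 * 1.40 = 700.0
```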

Conclusion If the time series is stationary, the moving-average method of forecasting is appropriate for predicting future values. When recent-past observations are thought to contain more information than distant-past observations, some form of exponential smoothing may be appropriate. A suggested method for choosing an optimal smoothing constant is to minimize the RMSE. When a trend is present, simple exponential smoothing becomes less able to produce accurate predictions.

Conclusion Holt’s smoothing adds a growth factor to the smoothing model to account for trend. When seasonality is also present, Winters’ smoothing model is appropriate. Adaptive-response-rate single exponential smoothing provides another technique that can be useful when the level of the forecasted variable changes infrequently.