


Presentation transcript: Forecasting Models (Chapter 7)

2 1 Forecasting Models CHAPTER 7

3 2 7.1 Introduction to Time Series Forecasting Forecasting is the process of predicting the future. Forecasting is an integral part of almost all business enterprises. Examples: –Manufacturing firms forecast demand for their products to schedule manpower and raw material allocation. –Service organizations forecast customer arrival patterns to maintain adequate customer service.

4 3 More examples –Security analysts forecast revenues, profits, and debt ratios to make investment recommendations. –Firms consider economic forecasts of indicators (housing starts, changes in gross national product) before deciding on capital investments. Introduction

5 4 Good forecasts can lead to –Reduced inventory costs. –Lower overall personnel costs. –Increased customer satisfaction. The forecasting process can be based on: –Educated guess. –Expert opinions. –Past history of data values, known as a time series. Introduction

6 5 Components of a Time Series – Long-term trend A time series may be stationary or exhibit trend over time. Long-term trend is typically modeled as a linear, quadratic or exponential function. – Seasonal variation When a repetitive pattern is observed over some time horizon, the series is said to have seasonal behavior. Seasonal effects are usually associated with calendar or climatic changes. Seasonal variation is frequently tied to yearly cycles.

7 6 – Cyclical variation An upturn or downturn not tied to seasonal variation. Usually results from changes in economic conditions. – Random effects Components of a Time Series

8 7 Components of a Time Series [Figure: time series value plotted against time, extended into the future, for three cases - a stationary time series, a linear trend time series, and a linear trend with seasonality time series.]

9 8 The goal of a time series forecast is to identify factors that can be predicted. This is a systematic approach involving the following steps. –Step 1: Hypothesize a form for the time series model. –Step 2: Select a forecasting technique. –Step 3: Prepare a forecast. Steps in the Time Series Forecasting Process

10 9 Step 1: Identify components included in the time series –Collect historical data. –Graph the data vs. time. –Hypothesize a form for the time series model. –Verify this hypothesis statistically. Steps in the Time Series Forecasting Process

11 10 Step 2: Select a Forecasting Technique –Select a forecasting technique from among several techniques available. The selection includes Determination of input parameter values Performance evaluation on past data of each technique Step 3: Prepare a Forecast using the selected technique Steps in the Time Series Forecasting Process

12 11 7.2 Stationary Forecasting Models In a stationary model the mean value of the time series is assumed to be constant. The general form of such a model is y_t = β_0 + ε_t, where: y_t = the value of the time series at time period t; β_0 = the unchanged mean value of the time series; ε_t = a random error term at time period t. The values of ε_t are assumed to be independent and to have a mean of 0.

13 12 Stationary Forecasting Models Checking for trend: Use linear regression if ε_t is normally distributed. Use a nonparametric test if ε_t is not normally distributed. Checking for a seasonality component: Autocorrelation measures the relationship between the values of the time series in different periods. Lag k autocorrelation measures the correlation between time series values which are k periods apart. –Autocorrelation between successive periods indicates a possible trend. –Lag 7 autocorrelation indicates one-week seasonality (daily data). –Lag 12 autocorrelation indicates 12-month seasonality (monthly data). Checking for cyclical components.
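
As an illustration of the lag-k autocorrelation check described above, here is a minimal Python sketch (not part of the original slides); the function name and the short daily-demand list are hypothetical.

    import numpy as np

    def lag_autocorrelation(series, k):
        # Correlation between y_t and y_(t-k), i.e., the lag-k autocorrelation.
        y = np.asarray(series, dtype=float)
        return np.corrcoef(y[k:], y[:-k])[0, 1]

    # Hypothetical daily data: a strong lag-7 autocorrelation suggests weekly seasonality.
    daily_demand = [120, 130, 125, 140, 160, 210, 220,
                    118, 128, 131, 138, 158, 205, 225]
    print(lag_autocorrelation(daily_demand, 7))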

14 13 Moving Average Methods The last period technique: the forecast for the next period is the last observed value, F_{t+1} = y_t.

15 14 Moving Average Methods The moving average method: the forecast is the average of the last n observations of the time series, F_{t+1} = (y_t + y_{t-1} + … + y_{t-n+1}) / n.

16 15 Moving Average Methods The weighted moving average method –More recent values of the time series get larger weights than past values when performing the forecast: F_{t+1} = w_1·y_t + w_2·y_{t-1} + w_3·y_{t-2} + … + w_n·y_{t-n+1}, where w_1 ≥ w_2 ≥ … ≥ w_n ≥ 0 and Σ w_i = 1.

17 16 Moving Average Methods – Forecasts for Future Time Periods The forecast for time period t+1 is the forecast for all future time periods: F_{t+k} = F_{t+1} for k = 2, 3, … This forecast is revised only when new data becomes available.

18 17 Moving Average Methods - YOHO BRAND YO-YOs Galaxy Industries is interested in forecasting weekly demand for its YoHo brand yo-yos. The yo-yo is a mature product. This year's demand pattern is expected to repeat next year. To forecast next year's demand, the past 52 weeks' demand records were collected.

19 18 Moving Average Methods - YOHO BRAND YO-YOs Three forecasting methods were suggested: –Last period technique - suggested by Amy Chang. –Four-period moving average - suggested by Bob Gunther. –Four-period weighted moving average - suggested by Carlos Gonzalez. Management wants to determine: –If a stationary model can be used. –What forecast will be obtained using each method?

20 19 YOHO BRAND YO-YOs - Solution Construct the time series plot. [Figure: weekly demand plot.] Neither seasonality nor cyclical effects can be observed.

21 20 Is trend present? Run linear regression to test β_1 in the model y_t = β_0 + β_1·t + ε_t. Excel results: the p-value for β_1 is 0.71601. This large p-value indicates that there is little evidence that trend exists. Conclusion: a stationary model is appropriate.
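
The same regression check can be sketched outside Excel. The snippet below is an illustration only (the series shown is synthetic, not the actual 52-week YoHo data): it regresses the series on time with scipy and reads off the p-value of the slope β_1; a large p-value, such as the 0.71601 above, argues for a stationary model.

    from scipy import stats

    def trend_test(series):
        # Regress y_t on t = 1..N and return the slope estimate and its p-value.
        t = list(range(1, len(series) + 1))
        result = stats.linregress(t, series)
        return result.slope, result.pvalue

    # Synthetic stationary-looking series used only to demonstrate the call.
    demand = [23, 31, 27, 26, 34, 25, 29, 30, 24, 28]
    slope, p_value = trend_test(demand)
    print(slope, p_value)   # a large p-value gives little evidence of trend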

22 21 Forecast for Week 53 Last period technique (Amy's forecast): F_53 = y_52 = 484 boxes. Four-period moving average (Bob's forecast): F_53 = (y_52 + y_51 + y_50 + y_49) / 4 = (484 + 482 + 393 + 245) / 4 = 401 boxes. Four-period weighted moving average (Carlos's forecast): F_53 = 0.4y_52 + 0.3y_51 + 0.2y_50 + 0.1y_49 = 0.4(484) + 0.3(482) + 0.2(393) + 0.1(245) = 441.3 boxes.
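
A minimal Python sketch of the three forecasts above (assuming only the last four weekly demands are at hand; the function names are illustrative):

    def last_period(y):
        return y[-1]

    def moving_average(y, n):
        return sum(y[-n:]) / n

    def weighted_moving_average(y, weights):
        # weights[0] applies to the most recent observation, weights[1] to the next most recent, ...
        recent_first = y[::-1][:len(weights)]
        return sum(w * v for w, v in zip(weights, recent_first))

    demand = [245, 393, 482, 484]                                   # weeks 49-52
    print(last_period(demand))                                      # 484 (Amy)
    print(moving_average(demand, 4))                                # 401.0 (Bob)
    print(weighted_moving_average(demand, [0.4, 0.3, 0.2, 0.1]))    # ≈ 441.3 (Carlos)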

23 22 Since the time series is stationary, the forecasts for weeks 54 and 55 remain as the forecast for week 53. These forecasts will be revised pending observation of the actual demand in week 53. Forecast for Weeks 54 and 55

24 23 This technique is used to forecast stationary time series. All the previous values of historical data affect the forecast. The Exponential Smoothing Technique

25 24 The Exponential Smoothing Technique For each period create a smoothed value L_t of the time series that represents all the information known by time t. The smoothed value L_t is the weighted average of –The current period's actual value (with weight α). –The forecast value for the current period (with weight 1 - α). The smoothed value L_t becomes the forecast for period t+1.

26 25 The Exponential Smoothing Technique An initial "forecast" is needed to start the process. Define: F_{t+1} = the forecast value for time t+1; y_t = the value of the time series at time t; α = the smoothing constant (0 ≤ α ≤ 1). The recursive formula is F_{t+1} = α·y_t + (1 - α)F_t.

27 26 The Exponential Smoothing Technique – Generating an initial forecast –Approach 1: Set the initial forecast equal to the first observation (F_2 = y_1) and continue from t = 3 with the recursive formula. –Approach 2: Average the initial "n" values of the time series and use this average as the forecast for period n+1. Begin using exponential smoothing from that time period onward.

28 27 The Exponential Smoothing Technique – Future Forecasts Since this technique deals with stationary time series, the forecasts for future periods do not change. Assume N is the number of periods for which data are available. Then F_{N+1} = α·y_N + (1 - α)F_N, and F_{N+k} = F_{N+1} for k = 2, 3, …

29 28 YOHO BRAND YO-YOs The Exponential Smoothing Technique An exponential smoothing forecast is suggested, with α = 0.1. An initial forecast is created at t = 2 by F_2 = y_1 = 415. The recursive formula is used from period 3 onward: F_3 = 0.1y_2 + 0.9F_2 = 0.1(236) + 0.9(415) = 397.10; F_4 = 0.1y_3 + 0.9F_3 = 0.1(348) + 0.9(397.10) = 392.19; and so on, until period 53 is reached (N+1 = 52+1 = 53): F_53 = 0.1y_52 + 0.9F_52 = 0.1(484) + 0.9(382.32) = 392.49. F_54 = F_55 = 392.49 (= F_53).
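
A short Python sketch of this recursion (illustrative only; it uses just the first three weekly demands given above):

    def exponential_smoothing(y, alpha):
        # Returns [F_2, F_3, ..., F_(N+1)] with F_2 = y_1 and F_(t+1) = alpha*y_t + (1 - alpha)*F_t.
        forecasts = [y[0]]
        for value in y[1:]:
            forecasts.append(alpha * value + (1 - alpha) * forecasts[-1])
        return forecasts

    weekly_demand = [415, 236, 348]                   # weeks 1-3 of the YoHo series
    print(exponential_smoothing(weekly_demand, 0.1))  # ≈ [415, 397.10, 392.19] = F_2, F_3, F_4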

30 29 YOHO BRAND YO-YOs The Exponential Smoothing Technique (Excel)

31 30 YOHO BRAND YO-YOs The Exponential Smoothing Technique (Excel) Notice the amount of smoothing included in the smoothed series.

32 31 The Exponential Smoothing Technique – Average age of information Relationship between exponential smoothing and the simple moving average –The two techniques will generate forecasts having the same average age of information if α = 2 / (n + 1). –This formula is a useful guide to the appropriate value for α. An exponential smoothing forecast "based on a large number of periods" should have a small α. A small α provides a lot of smoothing. A large α provides a fast response to recent changes in the time series and a smaller amount of smoothing.

33 32 7.3 Evaluating the performance of Forecasting Techniques Several forecasting methods have been presented. Which one of these forecasting methods gives the “best” forecast?

34 33 Performance Measures Generally, to evaluate each forecasting method: –Select an evaluation measure. –Calculate the value of the evaluation measure using the forecast errors Δ_t = y_t - F_t. –Select the forecast with the smallest value of the evaluation measure.

35 34 Performance Measures – Sample Example Find the forecasts and the errors for each forecasting technique applied to the following stationary time series.
Time:                                 1      2      3      4       5       6
Time series:                        100    110     90     80     105     115
3-period moving average forecast:                        100   93.33    91.6
Error (3-period MA):                                     -20   11.67    23.4
3-period weighted MA (.5,.3,.2):                          98      89    85.5
Error (3-period WMA):                                    -18      16    29.5

36 35 Performance Measures MSE = Σ Δ_t² / n; MAD = Σ |Δ_t| / n; MAPE = (Σ |Δ_t / y_t|) / n; LAD = max |Δ_t|, where Δ_t = y_t - F_t is the forecast error in period t and n is the number of periods with a forecast.

37 36 Performance Measures – MSE for the Sample Example MSE for the moving average technique: MSE = Σ Δ_t² / n = [(-20)² + (11.67)² + (23.4)²] / 3 = 361.24. MSE for the weighted moving average technique: MSE = Σ Δ_t² / n = [(-18)² + (16)² + (29.5)²] / 3 = 483.4. Divide by 3, not by 6 periods: periods 1, 2, and 3 do not have a forecast.

38 37 Performance Measures – MAD for the Sample Example MAD for the moving average technique: MAD = Σ |Δ_t| / n = (|-20| + |11.67| + |23.4|) / 3 = 18.36. MAD for the weighted moving average technique: MAD = Σ |Δ_t| / n = (|-18| + |16| + |29.5|) / 3 = 21.17.

39 38 Performance Measures – MAPE for the Sample Example MAPE for the moving average technique: MAPE = (Σ |Δ_t / y_t|) / n = (|-20/80| + |11.67/105| + |23.4/115|) / 3 = 0.188. MAPE for the weighted moving average technique: MAPE = (Σ |Δ_t / y_t|) / n = (|-18/80| + |16/105| + |29.5/115|) / 3 = 0.211.

40 39 Performance Measures – LAD for the Sample Example LAD for the moving average technique: LAD = max |Δ_t| = max{|-20|, |11.67|, |23.4|} = 23.4. LAD for the weighted moving average technique: LAD = max |Δ_t| = max{|-18|, |16|, |29.5|} = 29.5.
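
The four measures can be reproduced with a few lines of Python (a sketch; the error and actual values are taken from the sample example above):

    def performance_measures(errors, actuals):
        # errors are forecast errors (y_t - F_t); actuals are the corresponding y_t values.
        n = len(errors)
        mse = sum(e ** 2 for e in errors) / n
        mad = sum(abs(e) for e in errors) / n
        mape = sum(abs(e / y) for e, y in zip(errors, actuals)) / n
        lad = max(abs(e) for e in errors)
        return mse, mad, mape, lad

    actuals = [80, 105, 115]                 # periods 4-6 of the sample series
    ma_errors = [-20, 11.67, 23.4]           # 3-period moving average errors
    wma_errors = [-18, 16, 29.5]             # 3-period weighted moving average errors
    print(performance_measures(ma_errors, actuals))   # ≈ (361.2, 18.36, 0.188, 23.4)
    print(performance_measures(wma_errors, actuals))  # ≈ (483.4, 21.17, 0.211, 29.5)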

41 40 Performance Measures – YOHO BRAND YO-YOs Spreadsheet formulas (from the template): =B4 (drag to cell C56), =B5-C5, =ABS(D5), =D5^2, =E5/B5. Highlight cells D5:G5 and drag to cells D55:G55.

42 41 Performance Measures – YOHO BRAND YO-YOs Forecast begins at period 5. Spreadsheet formulas (from the template): =AVERAGE(B4:B7) (drag to cell C56), =B8-C8, =ABS(D8), =D8^2, =E8/B8, =C56 (drag to C58). Highlight cells D8:G8 and drag to cells D55:G55.

43 42 Use the performance measures to select a good set of values for each model parameter. –For the moving average: the number of periods (n). –For the weighted moving average: The number of periods (n), The weights (W i ). –For the exponential smoothing: The exponential smoothing factor (  ). Excel Solver can be used to determine the values of the model parameters. Performance Measures – Selecting Model Parameters

44 43 Weights for the Weighted Moving Average – Minimize MSE using Solver. [Solver setup: the objective is the cell containing the MSE (minimize); the changing cells are the weights; the constraints require the weights to be nonincreasing and to sum to 1.]
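
Outside Excel, the same search can be sketched with scipy.optimize (an illustration under stated assumptions: the demand history list is hypothetical, a 4-period window is used, and SLSQP enforces the sum-to-1 and nonincreasing-weight constraints):

    import numpy as np
    from scipy.optimize import minimize

    def wma_mse(weights, y, n=4):
        # MSE of n-period weighted-moving-average forecasts over the history y.
        errors = []
        for t in range(n, len(y)):
            window = y[t - n:t][::-1]                 # most recent observation first
            errors.append(y[t] - float(np.dot(weights, window)))
        return float(np.mean(np.square(errors)))

    y = [23, 31, 27, 26, 34, 25, 29, 30, 24, 28, 33, 26]   # hypothetical demand history
    constraints = (
        {"type": "eq",   "fun": lambda w: np.sum(w) - 1},  # weights sum to 1
        {"type": "ineq", "fun": lambda w: w[:-1] - w[1:]}, # weights are nonincreasing
    )
    result = minimize(wma_mse, x0=[0.4, 0.3, 0.2, 0.1], args=(y,),
                      bounds=[(0, 1)] * 4, constraints=constraints, method="SLSQP")
    print(result.x, result.fun)                             # fitted weights and their MSE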

45 44 Selecting Forecasting Technique Key issues considered when determining the technique to be used in forecasting stationary time series: –The degree of autocorrelation. –The possibility of future shifts in time series values. –The desired responsiveness of the forecasting technique. –The amount of data required to calculate the forecast.

46 45 7.4 Time Series with Linear Trend If we suspect trend, we should assess whether the trend is linear or nonlinear. Here we assume only linear trend: y_t = β_0 + β_1·t + ε_t. Forecasting methods –Linear regression forecast. –Holt's linear exponential smoothing technique.

47 46 The Linear Regression Approach Construct the regression equation based on the available historical data. The independent variable is "time". The dependent variable is the "time-series value". Forecasts should not extend to periods far into the future.

48 47 Holt's Technique – A qualitative demonstration [Figure: qualitative demonstration of Holt's technique - plotted data points y_t over time.]

49 48 The Holt's Technique Holt's linear exponential smoothing adjusts the level L_t and the trend T_t in each period. Level: L_t = α·y_t + (1 - α)F_t. Trend: T_t = β(L_t - L_{t-1}) + (1 - β)T_{t-1}. Forecast: F_{t+1} = L_t + T_t. Initial values: L_2 = y_2, T_2 = y_2 - y_1, F_3 = L_2 + T_2. Here α = smoothing constant for the time series level; β = smoothing constant for the time series trend; L_t = estimate of the time series level for time t as of time t; T_t = estimate of the time series trend for time t as of time t; y_t = value of the time series at time t; F_t = forecast of the value of the time series for time t calculated at a period prior to time t.

50 49 Future Forecasts Forecasting k periods into the future –By linear regression: F_{N+k} = b_0 + b_1(N+k), using the fitted regression coefficients. –By Holt's linear exponential smoothing technique: F_{N+k} = L_N + k·T_N.

51 50 American Family Products Corp. Standard and Poor’s (S&P) is a bond rating firm. It is conducting an analysis of American Family Products Corp. (AFP). The forecast of year-end current assets is required for years 11 and 12, based on data over the previous 10 years.

52 51 American Family Products Corp. Data: year-end current assets ($ million). The company's assets have been increasing at a relatively constant rate over time.
Year:             1     2     3     4     5     6     7     8     9    10
Current assets: 1990  2280  2328  2635  3249  3310  3256  3533  3826  4119

53 52 A linear trend seems to exist. AFP - SOLUTION Forecasting with the Linear Regression Model

54 53 AFP - SOLUTION Forecasting with the Linear Regression Model The regression equation: ŷ_t = 1788.2 + 229.89t. The small p-value for the slope (from the Excel output) indicates that the trend term is significant.

55 54 AFP - SOLUTION Forecasting with the Linear Regression Model Forecasts for Years 11 and 12. Spreadsheet formula: =$B$31+$B$32*A2, dragged to cells C3:C13.
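
The fitted equation and the forecasts can be reproduced from the current-assets table above; a minimal Python sketch (illustrative, not the Excel template itself):

    import numpy as np

    years = np.arange(1, 11)
    assets = np.array([1990, 2280, 2328, 2635, 3249, 3310, 3256, 3533, 3826, 4119])

    slope, intercept = np.polyfit(years, assets, deg=1)   # ≈ 229.89 and 1788.2
    print(intercept, slope)

    for year in (11, 12):
        # Forecast of year-end current assets ($ million) for years 11 and 12.
        print(year, intercept + slope * year)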

56 55 AFP - SOLUTION Forecasting using the Holt's technique Demonstration of the calculation procedure with α = 0.1 and β = 0.2, using the first years of data (y_1 = 1990, y_2 = 2280, y_3 = 2328, y_4 = 2635, y_5 = 3249, …). Initial values: L_2 = y_2, T_2 = y_2 - y_1, F_3 = L_2 + T_2; thereafter F_{t+1} = L_t + T_t. Year 2: y_2 = 2280, L_2 = 2280.00, T_2 = 2280 - 1990 = 290, F_3 = 2280 + 290 = 2570.00. Year 3: y_3 = 2328, L_3 = (0.1)(2328) + (1 - 0.1)(2570.00) = 2545.80, T_3 = (0.2)(2545.80 - 2280) + (1 - 0.2)(290.00) = 285.16, F_4 = 2545.80 + 285.16 = 2830.96.
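
A compact Python sketch of Holt's recursion with α = 0.1 and β = 0.2 (illustrative; it reproduces F_3 = 2570.00 and F_4 = 2830.96 from the first few AFP values):

    def holt_forecasts(y, alpha, beta):
        # Returns the one-step-ahead forecasts [F_3, F_4, ..., F_(N+1)].
        level, trend = y[1], y[1] - y[0]             # L_2 = y_2, T_2 = y_2 - y_1
        forecasts = [level + trend]                  # F_3 = L_2 + T_2
        for value in y[2:]:
            new_level = alpha * value + (1 - alpha) * forecasts[-1]   # L_t
            trend = beta * (new_level - level) + (1 - beta) * trend   # T_t
            level = new_level
            forecasts.append(level + trend)          # F_(t+1) = L_t + T_t
        return forecasts

    current_assets = [1990, 2280, 2328, 2635, 3249]  # first five years of AFP data
    print(holt_forecasts(current_assets, alpha=0.1, beta=0.2))  # starts 2570.0, 2830.96, ...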

57 56 AFP - SOLUTION Forecasting using the Holt’s technique (Excel)

58 57 7.5 Time Series with Trend, Seasonality, and Cyclical Variation Many time series exhibit seasonal and cyclical variation along with trend. Seasonality and cyclical variation arise due to calendar, climate, or economic factors. Two models are considered: –Additive model: y_t = T_t + C_t + S_t + ε_t. –Multiplicative model: y_t = T_t·C_t·S_t·ε_t. Here y_t is the time series value, T_t the trend component, C_t the cyclical component, S_t the seasonal component, and ε_t the random error.

59 58 This technique can be used to develop an additive or multiplicative model. The time series is first decomposed to its components (trend, seasonality, cyclical variation). After these components have been determined, the series is re-composed by –adding the components - in the additive model –multiplying the components - in the multiplicative model. The Classical Decomposition

60 59 The Classical Decomposition - Procedure (1) Smooth the time series to remove random effects and seasonality: calculate moving averages MA_t. Determine "period factors" to isolate the (seasonal) × (error) factor: calculate the ratio y_t / MA_t. Determine the "unadjusted seasonal factors" to eliminate the random component from the period factors: average all the y_t / MA_t that correspond to the same season.

61 60 The Classical Decomposition - Procedure (2) Determine the "adjusted seasonal factors": calculate [unadjusted seasonal factor] / [average seasonal factor]. Determine "deseasonalized data values": calculate y_t / [adjusted seasonal factor]_t. Determine a deseasonalized trend forecast: use linear regression on the deseasonalized time series. Determine an "adjusted seasonal forecast": calculate [deseasonalized trend forecast] × [adjusted seasonal factor].

62 61 The CFA is the exclusive bargaining agent for the state-supported Canadian college faculty. Membership in the organization has grown over the years, but in the summer months there was always a decline. To prepare the budget for the 2001 fiscal year, a forecast of the average quarterly membership covering the year 2001 is required. CANADIAN FACULTY ASSOCIATION (CFA)

63 62 Membership records from 1997 through 2000 were collected and graphed. CFA - Solution

64 63 CFA - Solution [Figure: quarterly membership, 1997-2000.] The graph exhibits a long-term trend and a seasonality pattern.

65 64 Classical Decomposition – step 1: Isolating Trend and Cyclical Components Smooth the time series to remove random effects and seasonality: calculate moving averages. Average membership for the first 4 periods = [7130 + 6940 + 7354 + 7556] / 4 = 7245.0; this first moving average is centered at quarter (1+4)/2 = 2.5. Average membership for periods 2-5 = [6940 + 7354 + 7556 + 7673] / 4 = 7380.75; this second moving average is centered at quarter (2+5)/2 = 3.5. The centered moving average of the first two moving averages is [7245.0 + 7380.75] / 2 = 7312.875, with centered location t = 3.
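
A small Python sketch of the centered moving average (it reproduces the 7312.875 above from the first five quarterly membership values; the function name is illustrative):

    def centered_moving_average(y, window=4):
        # For an even window, average each pair of adjacent window means.
        plain = [sum(y[i:i + window]) / window for i in range(len(y) - window + 1)]
        return [(a + b) / 2 for a, b in zip(plain, plain[1:])]

    membership = [7130, 6940, 7354, 7556, 7673]      # first five quarters of the CFA data
    print(centered_moving_average(membership))       # [7312.875], centered at t = 3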

66 65 Classical Decomposition – step 2: [Seasonal × Random Error] Factors Determine "period factors" to isolate the (seasonal) × (random error) factor: calculate the ratio y_t / MA_t. The centered moving average only represents T_t·C_t, so the seasonal × random error factors are S_t·ε_t = y_t / (T_t·C_t). Example: in period 7 (3rd quarter of 1998), S_7·ε_7 = 7662 / 7643.875 = 1.00237.

67 66 Classical Decomposition – step 3: Unadjusted Seasonal Factors Determine the "unadjusted seasonal factors" to eliminate the random component from the period factors: average all the y_t / MA_t that correspond to the same season. Averaging the seasonal × random error factors eliminates the random factor from S_t·ε_t, leaving only the seasonality component for each season. Example: the unadjusted seasonal factor for the third quarter is S_3 = {S_3,97·ε_3,97 + S_3,98·ε_3,98 + S_3,99·ε_3,99} / 3 = {1.0056 + 1.0024 + 1.0079} / 3 = 1.0053.

68 67 Classical Decomposition – step 4: Adjusted Seasonal Factors Determine the "adjusted seasonal factors": calculate [unadjusted seasonal factor] / [average seasonal factor]. Without seasonality, the seasonal factor for each season would equal 1, so the sum of all seasonal factors would equal 4; the adjustment of the unadjusted seasonal factors maintains this sum of 4. Example: the average seasonal factor is (1.0149 + 0.9658 + 1.00533 + 1.01624) / 4 = 1.00057. The adjusted seasonal factor for the 3rd quarter is S_3 / [average seasonal factor] = 1.00533 / 1.00057 = 1.00476 (Excel's exact value = 1.004759).

69 68 Classical Decomposition – step 5: The Deseasonalized Time Series Determine "deseasonalized data values": calculate y_t / [adjusted seasonal factor]_t. Deseasonalizing the time series gives y_t / (adjusted S_t) = T_t·C_t·ε_t. Example: the deseasonalized series value for the 2nd quarter of 1998 = y_6 / [adjusted S_2] = 7332 / 0.9652 = 7595.94.

70 69 Classical Decomposition – step 5: The Time Series Trend Seasonality has been substantially reduced. This graph represents T_t·C_t·ε_t.

71 70 Classical Decomposition – step 6: The Time Series Trend Component Determine a deseasonalized trend forecast: use linear regression on the deseasonalized time series. Trend factor: T_t = 7069.6677 + 78.4046t.

72 71 Classical Decomposition – step 7: The Forecast Assuming no cyclical effects, the forecast becomes F(quarter i, time N+k) = T_{N+k} × (adjusted S_i). Trend factor: T_17 = 7069.6677 + 78.4046(17) = 8402. Forecast(Q1, t = 17) = (8402)(1.01433) = 8523.
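
Steps 4 and 7 can be reproduced from the numbers above; a minimal Python sketch (assuming, as on the slides, no cyclical effects):

    unadjusted = [1.0149, 0.9658, 1.00533, 1.01624]   # unadjusted seasonal factors, Q1-Q4
    average = sum(unadjusted) / len(unadjusted)        # ≈ 1.00057
    adjusted = [s / average for s in unadjusted]       # adjusted factors now sum to 4
    print(adjusted[0])                                 # Q1 factor ≈ 1.0143

    def trend(t):
        # Deseasonalized trend from the regression in step 6.
        return 7069.6677 + 78.4046 * t

    print(trend(17) * adjusted[0])                     # forecast for Q1, t = 17: ≈ 8523 members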

73 72 Classical Decomposition – step 7: The Forecast Assuming cyclical effects, create a series of the cyclical component as follows: divide the deseasonalized time series by the trend component (from the regression), T_t·C_t·ε_t / T_t = C_t·ε_t, and then smooth out the error component using moving averages to isolate C_t. Perform the forecast: F(quarter i, time N+k) = T_{N+k} × C_N × (adjusted S_i).

74 73 Classical Decomposition Template

75 74 The Additive Model – The Multiple Regression Approach For a time series with trend and seasonality: y_t = T_t + S_t + R_t, which translates to y_t = β_0 + β_1·t + β_2·S_1 + … + β_k·S_{k-1} + ε_t, where β_0 + β_1·t is the trend element and the seasonal dummy terms β_2·S_1, …, β_k·S_{k-1} are the seasonality element.

76 75 Troy's Mobil Station Troy owns a gas station that experiences seasonal variation in sales. In addition, due to a steady increase in population, Troy feels that average sales are increasing generally.

77 76 Troy's Mobil Station Data [Table: quarterly sales by season - Fall (F), Winter (W), Spring (Spg), Summer (Smr).]

78 77 Troy's Mobil Station – Multiple Regression input data The input data contain a trend variable (t) and three seasonal dummy variables, Fall, Winter, and Spring; each dummy equals 1 in its own season and 0 otherwise ("not Fall", "not Winter", "not Spring"), so Summer serves as the base season.

79 78 Troy’s Mobil Station Template

80 79 Troy's Mobil Station – Multiple regression (Excel output) Extremely good fit. Extremely useful. All the variables are linearly related to sales.

81 80 Troy's Mobil Station – Multiple regression (Graphical interpretation) Trend line: 3610.625 + 58.33125t. Seasonal adjustments relative to the trend line: Fall -155.00, Winter -322.93, Spring -248.27, Summer 0.

82 81 The forecasting additive model is: F t = 3610.625 + 58.33 t – 155 F – 323 W – 248.27 S Forecasts for year 5 are produced as follows: F(Year 5, Fall) = 3610.625+58.33( 21 ) – 155( 1 ) – 323( 0 ) – 248.27( 0 ) F(Year 5, Winter) = 3610.625+58.33( 22 ) – 155( 0 ) – 323( 1 ) – 248.27( 0 ) F(Year 5, Spring) = 3610.625+58.33( 23 ) – 155( 0 ) – 323( 0 ) – 248.27( 1 ) F(Year 5, Summer) = 3610.625+58.33( 24 ) – 155( 0 ) – 323( 0 ) – 248.27( 0 ) Troy’s Mobil Station – Performing the forecast
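
A short Python sketch that evaluates this fitted additive model for year 5 (the coefficients are taken from the slide, with its rounding; Summer is the base season):

    def troy_forecast(t, season):
        # Additive dummy-variable model: trend plus a seasonal adjustment (Summer is the base).
        seasonal = {"Fall": -155.0, "Winter": -323.0, "Spring": -248.27, "Summer": 0.0}
        return 3610.625 + 58.33 * t + seasonal[season]

    # Year 5 corresponds to periods t = 21, 22, 23, 24.
    for t, season in zip(range(21, 25), ["Fall", "Winter", "Spring", "Summer"]):
        print(season, round(troy_forecast(t, season), 2))   # roughly 4681, 4571, 4704, 5011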

Copyright © John Wiley & Sons, Inc. All rights reserved. Reproduction or translation of this work beyond that named in Section 117 of the United States Copyright Act without the express written consent of the copyright owner is unlawful. Requests for further information should be addressed to the Permissions Department, John Wiley & Sons, Inc. Adopters of the textbook are granted permission to make back-up copies for their own use only, to make copies for distribution to students of the course the textbook is used in, and to modify this material to best suit their instructional needs. Under no circumstances can copies be made for resale. The Publisher assumes no responsibility for errors, omissions, or damages caused by the use of these programs or from the use of the information contained herein.

