Time Series Forecasting


Time Series Forecasting Chapter 16

Time Series Forecasting 16.1 Time Series Components and Models 16.2 Time Series Regression: Basic Models 16.3 Time Series Regression: More Advanced Models (Optional) 16.4 Multiplicative Decomposition

Time Series Forecasting 16.5 Simple Exponential Smoothing 16.6 Holt–Winters' Models 16.7 Forecast Error Comparisons 16.8 Index Numbers

Time Series Components and Models. Trend: long-run growth or decline. Cycle: long-run up and down fluctuation around the trend level. Seasonal: regular periodic up and down movements that repeat within the calendar year. Irregular: erratic, very short-run movements that follow no regular pattern.

Trend

Cycle

Seasonal Variation

No Trend When there is no trend, the least squares point estimate b0 of β0 is just the average y value ȳ. The model is yt = β0 + εt. That is, we have a horizontal line that crosses the y axis at the average value.

Example 16.1: The Cod Catch Case. Bay City Seafood Co. operates fishing trawlers and a fish processing plant. It wishes to forecast its minimum and maximum possible revenues, so it needs both a point forecast and a prediction interval forecast of the monthly cod catch.

Example 16.1: Cod Catch (In Tons)
Month       Year 1   Year 2
January        362      276
February       381      334
March          317      394
April          297      334
May            399      384
June           402      314
July           375      344
August         349      337
September      386      345
October        328      362
November       389      314
December       343      365

Example 16.1: Plot of Cod Catch versus Time

Example 16.1: The Cod Catch Case #4 The company believes this data pattern will continue, so it seems reasonable to use the "no trend" regression model yt = β0 + εt. The point estimate of β0 is ȳ = 351.29, which is also the point forecast of any future monthly cod catch: ŷ = 351.29.

Example 16.1: The Cod Catch Case #5 A prediction interval for the "no trend" model is [ȳ ± t(α/2) s √(1 + 1/n)], where t(α/2) is based on n − 1 degrees of freedom. A 95 percent prediction interval is [351.29 ± t(.025) s √(1 + 1/24)].
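The following short Python sketch (not part of the original slides; it assumes numpy and scipy are available) shows how the no-trend point forecast and 95 percent prediction interval can be computed from the Example 16.1 data.

# No-trend model: point forecast = sample mean; prediction interval uses t with n-1 df
import numpy as np
from scipy import stats

cod = np.array([362, 381, 317, 297, 399, 402, 375, 349, 386, 328, 389, 343,
                276, 334, 394, 334, 384, 314, 344, 337, 345, 362, 314, 365])
n = len(cod)
y_bar = cod.mean()                           # point forecast of any future month (about 351.29)
s = cod.std(ddof=1)                          # sample standard deviation
t_crit = stats.t.ppf(0.975, df=n - 1)        # t value for a 95% interval, n-1 degrees of freedom
half_width = t_crit * s * np.sqrt(1 + 1 / n)
print(f"95% prediction interval: [{y_bar - half_width:.2f}, {y_bar + half_width:.2f}]")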

Trend When sales increase (or decrease) over time, we have a trend. Often that trend is linear, and a linear trend is modeled using regression: sales is the dependent variable and time is the independent variable (weeks, months, quarters, or years). Besides simple linear regression, quadratic regression is sometimes used.

Example 16.2: Calculator Sales Case For two years, Smith's Department Store has carried the Bismark X-12 calculator. Sales are increasing, and Smith's wishes to forecast demand with both a point estimate and a prediction interval.

Example 16.2: Calculator Sales Data
Month       Year 1   Year 2
January        197      296
February       211      276
March          203      305
April          247      308
May            239      356
June           269      393
July           308      363
August         262      386
September      258      443
October        256      308
November       261      358
December       288      384

Example 16.2: Plot of Calculator Sales versus Time

Example 16.2: The Calculator Sales Case #4 Smith's believes the trend will continue, so it is reasonable to use the "linear trend" regression model yt = β0 + β1t + εt. The point forecasts are ŷ25 = 198.029 + 8.074(25) = 399.9 and ŷ26 = 198.029 + 8.074(26) = 408.0. A software package shows the corresponding prediction intervals are [328.6, 471.2] and [336.0, 479.9], respectively.
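A hedged sketch of how the linear trend fit and the month 25 and 26 forecasts could be reproduced; it assumes the statsmodels package is available and uses the Example 16.2 data, so the output should roughly match the values quoted above.

# Linear trend model y_t = beta0 + beta1*t + eps_t fit by ordinary least squares
import numpy as np
import statsmodels.api as sm

sales = np.array([197, 211, 203, 247, 239, 269, 308, 262, 258, 256, 261, 288,
                  296, 276, 305, 308, 356, 393, 363, 386, 443, 308, 358, 384])
t = np.arange(1, 25)
fit = sm.OLS(sales, sm.add_constant(t)).fit()        # b0 about 198.0, b1 about 8.07

new_t = sm.add_constant(np.array([25, 26]), has_constant="add")
pred = fit.get_prediction(new_t)                     # point forecasts and prediction intervals
print(pred.summary_frame(alpha=0.05)[["mean", "obs_ci_lower", "obs_ci_upper"]])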

Seasonality Some products have demand that varies a great deal by period: coats, bathing suits, bicycles. This periodic variation is called seasonality. Seasonality alters the linear relationship between time and demand.

Types of Seasonality Constant seasonal variation is where the magnitude of the swing does not depend on the level of the time series Increasing seasonal variation is where the magnitude of the swing increases as the level of the time series increases

Modeling Seasonality Within regression, seasonality can be modeled using dummy variables. Consider the model: yt = β0 + β1t + βQ2Q2 + βQ3Q3 + βQ4Q4 + εt. For Quarter 1, Q2 = 0, Q3 = 0, and Q4 = 0. For Quarter 2, Q2 = 1, Q3 = 0, and Q4 = 0. For Quarter 3, Q2 = 0, Q3 = 1, and Q4 = 0. For Quarter 4, Q2 = 0, Q3 = 0, and Q4 = 1. Each β coefficient then gives the seasonal impact of that quarter relative to Quarter 1: negative means lower sales, positive means higher sales.

Example 16.3: The Bike Sales Case A bicycle shop in Switzerland has sold the TRK-50 mountain bike for four years and wishes to describe sales using a regression model.

Example 16.3: Quarterly Sales Data
Year   Quarter    t   Sales
 1        1       1     10
 1        2       2     31
 1        3       3     43
 1        4       4     16
 2        1       5     11
 2        2       6     33
 2        3       7     45
 2        4       8     17
 3        1       9     13
 3        2      10     34
 3        3      11     48
 3        4      12     19
 4        1      13     15
 4        2      14     37
 4        3      15     51
 4        4      16     21

Example 16.3: MINITAB Plot of Bike Sales

Example 16.3: MINITAB Output of Dummy Variable Regression

Example 16.3: The Bike Sales Case #5 ŷ17 = b0 + b1(17) + bQ2(0) + bQ3(0) + bQ4(0) ŷ17 = 8.75 + 0.5(17) = 17.250 ŷ18 = b0 + b1(18) + bQ2(1) + bQ3(0) + bQ4(0) ŷ18 = 8.75 + 0.5(18) + 21 = 38.750 ŷ19 = b0 + b1(19) + bQ2(0) + bQ3(1) + bQ4(0) ŷ19 = 8.75 + 0.5(19) + 33.5 = 51.750 ŷ20 = b0 + b1(20) + bQ2(0) + bQ3(0) + bQ4(1) ŷ20 = 8.75 + 0.5(20) + 4.5 = 23.250
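A minimal sketch (assuming pandas and statsmodels) of the dummy-variable regression on the Example 16.3 data; variable names are illustrative, and the printed forecasts should approximately reproduce ŷ17 through ŷ20 above.

# Dummy-variable seasonal model: sales ~ t + quarter dummies (Quarter 1 is the baseline)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "sales": [10, 31, 43, 16, 11, 33, 45, 17, 13, 34, 48, 19, 15, 37, 51, 21],
    "t": np.arange(1, 17),
    "quarter": pd.Categorical([1, 2, 3, 4] * 4),
})
fit = smf.ols("sales ~ t + C(quarter)", data=df).fit()
print(fit.params)                  # about b0 = 8.75, b1 = 0.5, Q2 = 21, Q3 = 33.5, Q4 = 4.5

future = pd.DataFrame({"t": [17, 18, 19, 20], "quarter": pd.Categorical([1, 2, 3, 4])})
print(fit.predict(future))         # about 17.25, 38.75, 51.75, 23.25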

Time Series Regression: More Advanced Models Sometimes, transforming the sales data makes it easier to forecast Square root Quartic roots Natural logarithms While these transformations can make the forecasting easier, they make it harder to understand the resulting model

Example 16.4: Traveler’s Rest Case #1 Seasonal Regression

Example 16.4: MegaStat Regression Output

Example 16.4: MegaStat Output of Predicted Values

Autocorrelation One of the assumptions of regression is that the error terms are independent With time series data, that assumption is often violated Positive or negative autocorrelation is common One type of autocorrelation is first-order autocorrelation

First-Order Autocorrelation The error term in time period t is related to the error term in period t−1: εt = φεt−1 + at. φ is the correlation coefficient that measures the relationship between the error terms. at is an error term, often called a random shock.

Autocorrelation Continued We can test for first-order autocorrelation using the Durbin-Watson statistic, covered in Chapter 15. One approach to dealing with first-order autocorrelation is to predict future values of the error term using the model εt = φεt−1 + at.

Autoregressive Model The error term εt can be related to more than just the previous error term εt−1. This is often the case with seasonal data. The autoregressive error term model of order q, εt = φ1εt−1 + φ2εt−2 + … + φqεt−q + at, relates the error term to any number of past error terms. The Box-Jenkins methodology can be used to systematically build a model that relates εt to an appropriate number of past error terms.
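A brief sketch (assuming numpy and statsmodels; the residual series is simulated rather than taken from any example in the chapter) of checking residuals with the Durbin-Watson statistic and fitting a first-order autoregressive error model.

# Simulate residuals with first-order autocorrelation (phi = 0.6), then detect and model it
import numpy as np
from statsmodels.stats.stattools import durbin_watson
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
shocks = rng.normal(size=200)                    # random shocks a_t
resid = np.zeros(200)
for i in range(1, 200):
    resid[i] = 0.6 * resid[i - 1] + shocks[i]    # eps_t = phi * eps_(t-1) + a_t

print(durbin_watson(resid))                      # well below 2 suggests positive autocorrelation
ar1 = ARIMA(resid, order=(1, 0, 0), trend="n").fit()
print(ar1.params[0])                             # estimate of phi, should be near 0.6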

Multiplicative Decomposition We can use the multiplicative decomposition method to decompose a time series into its components: Trend Seasonal Cyclical Irregular

Steps to Multiplicative Decomposition #1 Compute a moving average This eliminates the seasonality Averaging period matches seasonal period

Steps to Multiplicative Decomposition #2 Compute a two-period centered moving average. The average from Step 1 needs to be matched up with a specific period. Consider a 4-period moving average: the first average covers periods 1, 2, 3, and 4, so it is centered at period 2.5, which does not match any period. Averaging this 2.5-centered value with the next one, centered at 3.5, gives a value centered at period 3, which does match a period. Step 2 is not needed if Step 1 uses an odd number of periods.

Steps to Multiplicative Decomposition #3 The original demand for each period is divided by the value computed in Step 2 for that same period. The first and last few periods do not have a value from Step 2, so these periods are skipped. All of the values from Step 3 for season 1 are averaged together to form the seasonal factor for season 1. This is repeated for every season; if there are four seasons, there will be four factors.

Steps to Multiplicative Decomposition #4 The original demand for each period is divided by the appropriate seasonal factor for that period; this gives us the deseasonalized observation. A forecast is then prepared using the deseasonalized observations, usually by simple linear regression. The deseasonalized forecast for each period from Step 6 is multiplied by the appropriate seasonal factor for that period, which returns seasonality to the forecast.

Steps to Multiplicative Decomposition #5 We estimate the period-by-period cyclical and irregular component by dividing the deseasonalized observation from Step 5 by the deseasonalized forecast from Step 6. We then use a three-period moving average to average out the irregular component, leaving the cyclical component. Dividing the value from Step 8 by the value from Step 9 gives us the irregular component. Values close to one indicate small cyclical and irregular components; we are interested in long-term patterns.
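The steps above can be mirrored in a short pandas sketch; the function below is an assumed helper (not textbook code) for a monthly series, and the step numbers in the comments follow the slides loosely.

# Multiplicative decomposition of a monthly pandas Series y into SN, TR, CL, IR
import numpy as np
import pandas as pd

def multiplicative_decompose(y, period=12):
    # Steps 1-2: moving average over one seasonal period, then a 2-period centering average
    cma = y.rolling(period).mean().rolling(2).mean().shift(-period // 2)
    # Step 3: seasonal-irregular ratios, averaged by season and normalized to average 1
    ratio = y / cma
    factors = ratio.groupby(np.arange(len(y)) % period).mean()
    factors = factors * period / factors.sum()
    sn = pd.Series(factors.values[np.arange(len(y)) % period], index=y.index)
    # Steps 4-7: deseasonalize, fit a straight-line trend, and reseasonalize the fitted values
    deseason = y / sn
    t = np.arange(1, len(y) + 1)
    b1, b0 = np.polyfit(t, deseason, 1)
    trend = pd.Series(b0 + b1 * t, index=y.index)
    forecast = trend * sn
    # Steps 8-10: cyclical-irregular ratios, 3-period moving average for the cycle, then the irregular
    cl_ir = deseason / trend
    cl = cl_ir.rolling(3, center=True).mean()
    ir = cl_ir / cl
    return sn, trend, cl, ir, forecast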

Example 16.5: The Tasty Cola Case Discount Soda Shop owns ten drive-in soft drink stores. They have been selling Tasty Cola for three years and wish to forecast monthly demand. The data shows seasonal variation.

Example 16.5: Monthly Sales of Tasty Cola (In Hundreds of Cases); t runs 1–12, 13–24, and 25–36 for Years 1, 2, and 3
Month       Year 1   Year 2   Year 3
January        189      244      298
February       229      296      378
March          249      319      373
April          289      370      443
May            260      313      374
June           431      556        …
July           660      831    1,004
August         777      960    1,153
September      915    1,152    1,388
October        613      759      904
November       485      607      715
December       277      371      441

Example 16.5: Monthly Sales of Tasty Cola Graphically

Example 16.5: The Tasty Cola Case #4 Seasonal variation seems to be increasing, so it is reasonable to use the multiplicative model yt = TRt × SNt × CLt × IRt, where TRt, SNt, CLt, and IRt represent the trend, seasonal, cyclical, and irregular components in time period t.

Example 16.5: Estimation of the Seasonal Factors

Example 16.5: Plot of Tasty Cola Sales and Deseasonalized Data

Example 16.5: Forecasts of Future Values of Tasty Cola Sales

Example 16.5: A Plot of the Observed and Forecast Tasty Cola Sales

Simple Exponential Smoothing Earlier, we saw that when there is no trend, the least squares point estimate b0 of β0 is just the average y value in the model yt = β0 + εt. That gave us a horizontal line that crosses the y axis at the average value. Since we estimate β0 using regression, each period is weighted the same. If β0 is slowly changing over time, we want to weight more recent periods more heavily. Exponential smoothing does just this.

Exponential Smoothing Continued Exponential smoothing takes the form ST = αyT + (1 − α)ST−1. Alpha (α) is a smoothing constant between zero and one, typically between 0.02 and 0.30; smaller values of alpha represent slower change. We want to test the data and find an alpha value that minimizes the sum of squared forecast errors.
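A plain-Python sketch of the smoothing recursion; the function name and the choice of initializing S0 from the first six observations (as in Example 16.6 below) are assumptions for illustration.

# Simple exponential smoothing: S_T = alpha*y_T + (1 - alpha)*S_(T-1)
def simple_exponential_smoothing(y, alpha, n_init=6):
    level = sum(y[:n_init]) / n_init         # initial estimate S_0
    forecasts = []
    for obs in y:
        forecasts.append(level)              # one-period-ahead forecast of this observation
        level = alpha * obs + (1 - alpha) * level
    return level, forecasts

def sse(y, alpha):
    # sum of squared one-period-ahead forecast errors for a candidate alpha
    _, f = simple_exponential_smoothing(y, alpha)
    return sum((obs - fc) ** 2 for obs, fc in zip(y, f))

# Pick the alpha that minimizes the sum of squared forecast errors, e.g.:
# best_alpha = min((a / 100 for a in range(2, 31)), key=lambda a: sse(cod_catch, a))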

Example 16.6: The Cod Catch Case Example 16.1 suggested that the no trend model may approximate the cod catch series, but it is also possible that β0 is slowly changing over time. We will estimate β0 using the first six observations; this initial estimate is denoted S0. The average of the first six observations gives S0 = 359.67.

Example 16.6: The Cod Catch Case #2 Assume that at the end of period T − 1, we have ST−1 as the estimate of β0. When, in period T, we obtain a new observation yT, we can update ST−1 to ST. We compute the updated estimate using the smoothing equation ST = αyT + (1 − α)ST−1, where α is between zero and one.

Example 16.6: One-Period-Ahead Forecast Using Exponential Smoothing

Example 16.7: The Cod Catch Case Example 16.6 showed that α = 0.02 was a good value. We can use α = 0.02 to forecast future monthly cod catches. Example 16.6 estimated S24 = 356.13. The point forecast made in month 24 of any future monthly cod catch is 356.13.

Example 16.7: The Cod Catch Case #2 Assume the cod catch in January of year 3 is y25 = 384. Then S25 = αy25 + (1 − α)S24 = 0.02(384) + 0.98(356.13) = 356.69, and 356.69 is the point forecast made in month 25 for any future monthly cod catch.

Example 16.7: MINITAB Output of Using Exponential Smoothing to Forecast Cod

Holt–Winters’ Models Simple exponential smoothing cannot handle trend or seasonality. Holt–Winters’ double exponential smoothing can handle trended data of the form yt = β0 + β1t + εt and assumes β0 and β1 are changing slowly over time. We first find initial estimates of β0 and β1, then use updating equations to track changes over time. The method requires two smoothing constants, called alpha and gamma. The updating equations are in Appendix K of the CD-ROM.

Multiplicative Winters’ Method Double exponential smoothing cannot handle seasonality. Multiplicative Winters’ method can handle trended, seasonal data of the form yt = (β0 + β1t) · SNt + εt and assumes β0, β1, and SNt are changing slowly over time. We first find initial estimates of β0 and β1 and of the seasonal factors, then use updating equations to track them over time. The method requires three smoothing constants: alpha, gamma, and delta. The updating equations are in Appendix K of the CD-ROM.
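A hedged sketch of how these models could be fit with the statsmodels package (using its own routines rather than the Appendix K updating equations); the series shown is the Example 16.2 calculator data, and a Winters' call for a seasonal series such as the Tasty Cola data is indicated in a comment.

# Holt's double exponential smoothing for trended data; Winters' method adds a seasonal term
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing, Holt

calculator = pd.Series([197, 211, 203, 247, 239, 269, 308, 262, 258, 256, 261, 288,
                        296, 276, 305, 308, 356, 393, 363, 386, 443, 308, 358, 384])
holt_fit = Holt(calculator).fit()     # estimates the two smoothing constants (alpha, gamma)
print(holt_fit.forecast(2))           # forecasts for periods 25 and 26

# Multiplicative Winters' method for a monthly seasonal series such as tasty_cola:
# fit = ExponentialSmoothing(tasty_cola, trend="add", seasonal="mul", seasonal_periods=12).fit()
# print(fit.forecast(12))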

MINITAB Output of Double Exponential Smoothing for Calculator Sales Data


MINITAB Output of Graphical Forecasts When α = 0.496 and γ = 0.142

MINITAB Output Using Winters’ Method to Forecast Tasty Cola Sales

Forecast Error Comparison Forecast errors: et = yt − ŷt. Error comparison criteria: mean absolute deviation (MAD), the average of the absolute errors |et|, and mean squared deviation (MSD), the average of the squared errors et².
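A small sketch of the two criteria (assumed function names, plain Python with numpy).

# MAD = average of |e_t|; MSD = average of e_t squared
import numpy as np

def mad(actual, forecast):
    errors = np.asarray(actual) - np.asarray(forecast)
    return np.mean(np.abs(errors))

def msd(actual, forecast):
    errors = np.asarray(actual) - np.asarray(forecast)
    return np.mean(errors ** 2)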

Example: The Tasty Cola Case

Index Numbers Index numbers allow us to compare changes in time series over time. We begin by selecting a base period. Every period is converted to an index by dividing its value by the base-period value and then multiplying the result by 100; this is a simple (quantity) index.
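A one-function sketch of a simple index; the values shown are made up, not the installment credit figures on the next slide.

# Simple index: 100 * (value in period t) / (value in the base period)
def simple_index(values, base_position=0):
    base = values[base_position]
    return [100 * v / base for v in values]

print(simple_index([80, 92, 100, 115, 130]))   # base period = first value, indexed to 100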

Example: Installment Credit Installment Credit Outstanding (in billions of dollars): 1990–1996

Aggregate Price Index Often we wish to compare a group of items. To do this, we compute the total price of the items over time and then index this total; this is an aggregate price index.

Example: Market Basket of Groceries

Weighted Aggregate Price Index An aggregate price index assumes all items in the basket are purchased with the same frequency A weighted aggregate price index takes into account varying purchasing frequency The Laspeyres index assumes the same mixture of items for all periods as was used in the base period The Paasche index allows the mixture of items in the basket to change over time as purchasing habits change
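A short sketch of the two weighted indexes; all prices and quantities below are made up rather than taken from the grocery-basket slides.

# Laspeyres uses base-period quantities q0; Paasche uses current-period quantities qt
def laspeyres(p0, pt, q0):
    return 100 * sum(p * q for p, q in zip(pt, q0)) / sum(p * q for p, q in zip(p0, q0))

def paasche(p0, pt, qt):
    return 100 * sum(p * q for p, q in zip(pt, qt)) / sum(p * q for p, q in zip(p0, qt))

p0 = [1.10, 2.50, 0.99]   # base-period prices for a three-item basket
pt = [1.45, 2.80, 1.20]   # current-period prices
q0 = [40, 10, 25]         # base-period quantities
qt = [35, 12, 30]         # current-period quantities
print(laspeyres(p0, pt, q0), paasche(p0, pt, qt))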

Example: Laspeyres Index (base quantity weights) 1992 (base) and 1997 Prices for a Market Basket of Grocery Items

Example: Paasche Index (current quantity weights) 1992 (base) and 1997 Prices for a Market Basket of Grocery Items