
1 King Abdulaziz University Faculty of Engineering Industrial Engineering Dept. IE 436 Dynamic Forecasting

2 CHAPTER 3 Exploring Data Patterns and an Introduction to Forecasting Techniques. Cross-sectional data: collected at a single point in time. A time series: data collected and recorded over successive increments of time. (Page 62)

3 Exploring Time Series Data Patterns: Horizontal (stationary). Trend. Cyclical. Seasonal. A stationary series: its mean and variance remain constant over time.

4 The Trend: the long-term component that represents the growth or decline in the time series. The Cyclical Component: the wavelike fluctuation around the trend. FIGURE 3-2 Trend and Cyclical Components of an Annual Time Series Such as Housing Costs (Page 63)

5 The Seasonal Component: a pattern of change that repeats itself year after year. FIGURE 3-3 Electrical Usage for Washington Water Power Company (Page 64)

6 Exploring Data Patterns with Autocorrelation Analysis. Autocorrelation: the correlation between a variable lagged one or more periods and itself.

$$r_k = \frac{\sum_{t=k+1}^{n}(Y_t - \bar{Y})(Y_{t-k} - \bar{Y})}{\sum_{t=1}^{n}(Y_t - \bar{Y})^2} \qquad (3.1)$$

where $r_k$ = autocorrelation coefficient for a lag of k periods, $\bar{Y}$ = mean of the values of the series, $Y_t$ = observation in time period t, and $Y_{t-k}$ = observation at time period t-k. (Pages 64-65)
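As a quick illustration of Equation 3.1, here is a minimal Python sketch (not from the textbook; the function name and the short series are illustrative assumptions) that computes the lag-k autocorrelation coefficient:

```python
# Minimal sketch of Equation 3.1: the lag-k autocorrelation coefficient r_k.
# (Illustrative only; the data below are hypothetical, not the Table 3-1 data.)
def autocorrelation(y, k):
    """Return r_k, the lag-k autocorrelation coefficient of the series y."""
    n = len(y)
    y_bar = sum(y) / n
    numerator = sum((y[t] - y_bar) * (y[t - k] - y_bar) for t in range(k, n))
    denominator = sum((value - y_bar) ** 2 for value in y)
    return numerator / denominator

sales = [123, 130, 125, 138, 145, 142, 141, 146, 147, 157, 150, 160]
print(round(autocorrelation(sales, 1), 3))  # lag 1 autocorrelation
```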

7 Autocorrelation Function (Correlogram): a graph of the autocorrelations for various lags. Computation of the lag 1 autocorrelation coefficient for the data in Table 3-1 (Page 65).

8 Example 3.1. Data are presented in Table 3-1 (Page 65). Table 3-2 shows the computations that lead to the calculation of the lag 1 autocorrelation coefficient. Figure 3-4 contains a scatter diagram of the pairs of observations $(Y_t, Y_{t-1})$. The lag 1 autocorrelation coefficient is then computed using the totals from Table 3-2 and Equation 3.1.

9 Autocorrelation Function (Correlogram) (Cont.) FIGURE 3-5 Correlogram or Autocorrelation Function for the Data Used in Example 3.1 Minitab instructions: Stat > Time Series > Autocorrelation

10 Questions to Be Answered Using Autocorrelation Analysis: Are the data random? Do the data have a trend? Are the data stationary? Are the data seasonal? (Page 68)

11 Are the data random? If a series is random, the successive values are not related to each other, and almost all the autocorrelation coefficients are not significantly different from zero.

12 Is an autocorrelation coefficient significantly different from zero? At a specified confidence level, a series can be considered random if the autocorrelation coefficients are within the interval $[0 \pm t \cdot SE(r_k)]$ (z instead of t for large samples). The autocorrelation coefficients of random data have an approximately normal sampling distribution. The following t statistic can be used: $t = r_k / SE(r_k)$

13 Standard error of the autocorrelation at lag k:

$$SE(r_k) = \sqrt{\frac{1 + 2\sum_{i=1}^{k-1} r_i^2}{n}} \qquad (3.2)$$

where $r_i$ = the autocorrelation at time lag i, k = the time lag, and n = the number of observations in the time series.

14 Example 3.2 (Page 69). A hypothesis test: is a particular autocorrelation coefficient significantly different from zero? At the 0.05 significance level, the critical values ±2.2 are the upper and lower t points for n − 1 = 11 degrees of freedom. Decision rule: if t < −2.2 or t > 2.2, reject $H_0: \rho_k = 0$. Note: t is given directly in the Minitab output under the heading T.
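A minimal sketch of this test, combining Equation 3.2 with the decision rule above (the lag autocorrelations, sample size, and critical value below are illustrative assumptions, not the Example 3.2 figures):

```python
import math

def autocorr_se(r, k, n):
    """SE(r_k) per Equation 3.2, given r[1], ..., r[k-1] (r[0] is unused padding)."""
    return math.sqrt((1 + 2 * sum(r[i] ** 2 for i in range(1, k))) / n)

def autocorr_t(r, k, n):
    """t statistic for H0: rho_k = 0."""
    return r[k] / autocorr_se(r, k, n)

r = [None, 0.57, 0.46, 0.11]   # hypothetical lag 1-3 autocorrelation coefficients
n = 12                         # number of observations
t = autocorr_t(r, 1, n)
print(abs(t) > 2.2)            # reject H0: rho_1 = 0 when this is True
```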

15 Is an autocorrelation coefficient different from zero? (Cont.) The modified Box-Pierce Q statistic (developed by Ljung and Box), "LBQ": a portmanteau test that examines a whole set of autocorrelation coefficients at once.

16 $$Q = n(n+2)\sum_{k=1}^{m}\frac{r_k^2}{n-k} \qquad (3.3)$$

where n = number of observations, k = the time lag, m = number of time lags to be considered, and $r_k$ = the kth autocorrelation coefficient lagged k time periods. The value of Q can be compared with the chi-square distribution with m degrees of freedom.
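A minimal sketch of Equation 3.3 (the autocorrelation values and sample size below are hypothetical; in practice they would be computed from the data):

```python
# Minimal sketch of the Ljung-Box Q statistic in Equation 3.3.
def ljung_box_q(r, n):
    """Q = n(n+2) * sum over k of r_k^2 / (n - k), for r = [r_1, ..., r_m]."""
    return n * (n + 2) * sum(r_k ** 2 / (n - k) for k, r_k in enumerate(r, start=1))

# Hypothetical autocorrelations for m = 10 lags of a series with n = 24 observations.
r = [0.25, -0.10, 0.05, 0.30, -0.20, 0.12, 0.02, -0.07, 0.15, 0.04]
q = ljung_box_q(r, n=24)
print(round(q, 2))  # compare with the chi-square critical value, df = m = 10
```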

17 Example 3.3 (Page 70): a table of the time series observations, with columns t and $Y_t$.

18 FIGURE 3-7 Autocorrelation Function for the Data Used in Example 3.3

19 The Q statistic for m = 10 time lags is calculated as 7.75 (using Minitab). The chi-square critical value (0.05 significance level, degrees of freedom df = m = 10) is obtained from Table B-4 (Page 527). Since Q is less than this critical value, the conclusion is that the series is random.

20 Do the Data Have a Trend? A significant relationship exists between successive time series values. The autocorrelation coefficients are large for the first several time lags and then gradually drop toward zero as the number of periods increases. The autocorrelation for time lag 1 is close to 1; for time lag 2 it is large but smaller than that for time lag 1.

21 Example 3.4 (Page 72): annual data (Year, $Y_t$) given in Table 3-4 (Page 74).

22 Data Differencing. A time series can be differenced to remove the trend and create a stationary series. See FIGURE 3-8 (Page 73) for differencing the data of Example 3.1, and FIGURES 3-12, 3-13 (Page 75).
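A minimal sketch of first differencing (the trended series below is hypothetical):

```python
# Minimal sketch: first differences Y_t - Y_(t-1) remove a linear trend,
# leaving a roughly stationary series that is one observation shorter.
def difference(y, lag=1):
    return [y[t] - y[t - lag] for t in range(lag, len(y))]

trended = [10, 13, 15, 18, 22, 25, 27, 31, 34, 36]   # hypothetical
print(difference(trended))                            # [3, 2, 3, 4, 3, 2, 4, 3, 2]
```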

23 Are The Data Seasonal? For quarterly data: a significant autocorrelation coefficient will appear at time lag 4. For monthly data: a significant autocorrelation coefficient will appear at time lag 12.
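A minimal, self-contained sketch of this check for quarterly data (the series and the rough 2/sqrt(n) significance bound are illustrative assumptions):

```python
import math

def lag_autocorrelation(y, k):
    """Lag-k autocorrelation coefficient (Equation 3.1)."""
    n, y_bar = len(y), sum(y) / len(y)
    num = sum((y[t] - y_bar) * (y[t - k] - y_bar) for t in range(k, n))
    den = sum((value - y_bar) ** 2 for value in y)
    return num / den

quarterly = [10, 25, 18, 40, 12, 27, 20, 44, 13, 29, 22, 47]   # hypothetical
r4 = lag_autocorrelation(quarterly, 4)
print(abs(r4) > 2 / math.sqrt(len(quarterly)))  # True suggests quarterly seasonality
```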

24 Example 3.5 (Page 76): quarterly sales by year, for quarters ending March 31, June 30, September 30, and December 31. Table 3-5 (Page 77). See Figures 3-14, 3-15 (Page 77).

25 Time Series Graph FIGURE 3-14 Time Series Plot of Quarterly Sales for Coastal Marine for Example 3.5

26 Autocorrelation coefficients at time lags 1 and 4 are significantly different from zero; sales are seasonal on a quarterly basis. FIGURE 3-15 Autocorrelation Function for Quarterly Sales for Coastal Marine for Example 3.5

27 Choosing a Forecasting Technique. Questions to be considered: Why is a forecast needed? Who will use the forecast? What are the characteristics of the data? What time period is to be forecast? What are the minimum data requirements? How much accuracy is required? What will the forecast cost?

28 Choosing a Forecasting Technique (Cont.) The forecaster should accomplish the following: Define the nature of the forecasting problem. Explain the nature of the data. Describe the properties of the techniques. Develop criteria for selection.

29 Choosing a Forecasting Technique (Cont.) Factors considered: Level of detail. Time horizon. Whether based on judgment or data manipulation. Management acceptance. Cost.

30 General considerations for choosing the appropriate method:
Judgment. Uses: can be used in the absence of historical data (e.g., a new product); most helpful in medium- and long-term forecasts. Considerations: subjective estimates are subject to the biases and motives of estimators.
Causal. Uses: sophisticated methods; very good for medium- and long-term forecasts. Considerations: must have historical data; relationships can be difficult to specify.
Time series. Uses: easy to implement; work well when the series is relatively stable. Considerations: rely exclusively on past data; most useful for short-term estimates.

31 Key: Pattern of data: ST, stationary; T, trended; S, seasonal; C, cyclical. Time horizon: S, short term (less than three months); I, intermediate; L, long term. Type of model: TS, time series; C, causal. Seasonal: s, length of seasonality. Variable: V, number of variables.
Method (pattern of data; time horizon; type of model; minimal data requirements, nonseasonal or seasonal):
Naïve: ST, T, S; S; TS; 1
Simple averages: ST; S; TS; 30
Moving averages: ST; S; TS; 4-20
Single exponential smoothing: ST; S; TS; 2
Linear (double) exponential smoothing (Holt's): T; S; TS; 3
Quadratic exponential smoothing: T; S; TS; 4
Seasonal exponential smoothing (Winter's): S; S; TS; 2 x s
Adaptive filtering: S; S; TS; 5 x s
Simple regression: T; I; C; 10
Multiple regression: C, S; I; C; 10 x V
Classical decomposition: S; S; TS; 5 x s
Exponential trend models: T; I, L; TS; 10
S-curve fitting: T; I, L; TS; 10
Gompertz models: T; I, L; TS; 10
Growth curves: T; I, L; TS; 10
Census X-12: S; S; TS; 6 x s
ARIMA (Box-Jenkins): ST, T, C, S; S; TS; 24 nonseasonal, 3 x s seasonal
Leading indicators: C; S; C; 24
Econometric models: C; S; C; 30
Time series multiple regression: T, S; I, L; C; 6 x s

32 Measuring Forecast Error. Basic forecasting notation: $Y_t$ = actual value of a time series in time t; $\hat{Y}_t$ = forecast value for time period t; $e_t = Y_t - \hat{Y}_t$ = forecast error in time t (residual).

33 Measuring Forecasting Error (Cont.): The Mean Absolute Deviation (MAD), The Mean Squared Error (MSE), The Root Mean Squared Error (RMSE), The Mean Absolute Percentage Error (MAPE), The Mean Percentage Error (MPE). Equations ( )
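These measures follow directly from the notation above; here is a minimal sketch using the standard definitions (the actual and forecast values below are hypothetical, and MAPE/MPE are expressed in percent):

```python
import math

def accuracy_measures(actual, forecast):
    """MAD, MSE, RMSE, MAPE (%), and MPE (%) from paired actuals and forecasts."""
    e = [y - f for y, f in zip(actual, forecast)]   # e_t = Y_t - Yhat_t
    n = len(e)
    mad = sum(abs(x) for x in e) / n
    mse = sum(x ** 2 for x in e) / n
    rmse = math.sqrt(mse)
    mape = 100 * sum(abs(x / y) for x, y in zip(e, actual)) / n
    mpe = 100 * sum(x / y for x, y in zip(e, actual)) / n
    return mad, mse, rmse, mape, mpe

actual   = [58, 54, 60, 55, 62, 62, 65, 63, 70]   # hypothetical
forecast = [56, 58, 54, 60, 55, 62, 62, 65, 63]   # hypothetical
print(accuracy_measures(actual, forecast))
```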

34 Example 3.6 (Page 83): evaluate the model using MAD, MSE, RMSE, MAPE, and MPE. These measures are used for: the measurement of a technique's usefulness or reliability; comparison of the accuracy of two different techniques; the search for an optimal technique.

35 Empirical Evaluation of Forecasting Methods. Results of the forecast accuracy for a sample of 3003 time series (1997): Complex methods do not necessarily produce more accurate forecasts than simpler ones. Various accuracy measures (MAD, MSE, MAPE) produce consistent results. The performance of methods depends on the forecasting horizon and the kind of data analyzed (yearly, quarterly, monthly).

36 Determining the Adequacy of a Forecasting Technique. Do the residuals indicate a random series? (Examine the autocorrelation coefficients of the residuals; there should be no significant ones.) Are the residuals approximately normally distributed? Is the technique simple and understood by decision makers?
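A minimal sketch of the first check, flagging residual autocorrelations that fall outside the approximate ±2/sqrt(n) limits (the residuals and the maximum lag below are illustrative assumptions):

```python
import math

def lag_autocorrelation(y, k):
    """Lag-k autocorrelation coefficient (Equation 3.1)."""
    n, y_bar = len(y), sum(y) / len(y)
    num = sum((y[t] - y_bar) * (y[t - k] - y_bar) for t in range(k, n))
    den = sum((value - y_bar) ** 2 for value in y)
    return num / den

def residuals_look_random(residuals, max_lag=6):
    """True if no residual autocorrelation exceeds the rough 2/sqrt(n) bound."""
    bound = 2 / math.sqrt(len(residuals))
    return all(abs(lag_autocorrelation(residuals, k)) <= bound
               for k in range(1, max_lag + 1))

residuals = [1.2, -0.8, 0.3, -1.1, 0.9, -0.2, 0.5, -0.6, 1.0, -0.4, 0.1, -0.9]
print(residuals_look_random(residuals))
```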