
1 ECNE610 Managerial Economics, Week 4, March 2014. Dr. Mazharul Islam. Chapter 5

2 Demand Estimation and Forecasting (Chapter 5). Dr. Mazharul Islam

3 Lesson Objectives
- Know how to specify and interpret a regression.
- Understand the importance of forecasting in business.
- Describe six different forecasting techniques.
- Use seasonal and smoothing methods.
- Recognize the limitations of consumer data.

4 Data Sources
- Primary data collection: data collected for your own purposes, through observation, experimentation, or surveys.
- Secondary data compilation: data collected and compiled by an outside source, or by someone in your organization who provides others access to the data, in print or electronic form.

5 Types of Data
Data are either qualitative or quantitative; quantitative data are further divided into continuous and discrete.

6 Data Timing
- Time-series data consist of a set of ordered data values observed at successive points in time.
- Cross-sectional data are a set of data values observed at a fixed point in time.

7 Data Timing (Panda's sales reports)
Sales (in $1000s)    2003   2004   2005   2006
Jeddah                435    460    475    490
Riyadh                320    345    375    395
Dammam                405    390    410    395
Madina                260    270    285    280
Each row (one city across the years) is time-series data; each column (all cities in one year) is cross-sectional data.

8 Populations and Samples
- A population is the set of all items or individuals of interest. Examples: all likely voters in the next election; all parts produced today; all sales receipts for November.
- A sample is a subset of the population. Examples: 1,000 voters selected at random for interview; a few parts selected for destructive testing; every 100th receipt selected for audit.

9 Introduction to Regression Analysis
Regression analysis is used to:
- Predict the value of a dependent variable based on the value of at least one independent variable
- Explain the impact of changes in an independent variable on the dependent variable
Dependent variable: the variable we wish to explain. Independent variable: the variable used to explain the dependent variable.

10 The Multiple Regression Model
Idea: examine the linear relationship between one dependent variable (y) and two or more independent variables (x_i).
Population model: y = β0 + β1·x1 + β2·x2 + ... + βk·xk + ε, where β0 is the y-intercept, β1, ..., βk are the population slopes, and ε is the random error.
Estimated multiple regression model: ŷ = b0 + b1·x1 + b2·x2 + ... + bk·xk, where ŷ is the estimated (predicted) value of y, b0 is the estimated intercept, and b1, ..., bk are the estimated slope coefficients.

11 The Least Squares Equation
For the simple (one-variable) case, the formulas for b1 and b0 are:
b1 = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)²   and   b0 = ȳ − b1·x̄
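As a quick illustration of these formulas, here is a minimal Python sketch (not part of the original slides; the small x and y arrays are made-up numbers used only to show the computation):

```python
import numpy as np

# Illustrative data (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# b1 = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# b0 = y_bar - b1 * x_bar
b0 = y_bar - b1 * x_bar

print(f"b0 = {b0:.3f}, b1 = {b1:.3f}")
```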

12 Example
A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
- Dependent variable: pie sales (units per week)
- Independent variables: price (in $) and advertising (in $100s)
Data are collected for 15 weeks.

13 Formulate the Model
Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50           3.3
  2       460        7.50           3.3
  3       350        8.00           3.0
  4       430        8.00           4.5
  5       350        6.80           3.0
  6       380        7.50           4.0
  7       430        4.50           3.0
  8       470        6.40           3.7
  9       450        7.00           3.5
 10       490        5.00           4.0
 11       340        7.20           3.5
 12       300        7.90           3.2
 13       440        5.90           4.0
 14       450        5.00           3.5
 15       300        7.00           2.7

Correlation matrix:
              Pie Sales    Price      Advertising
Pie Sales       1
Price          -0.44327     1
Advertising     0.55632     0.03044    1

Estimated regression model: Sales = b0 + b1(Price) + b2(Advertising)

14 Regression Output
Regression Statistics:
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

ANOVA:
              df       SS          MS          F         Significance F
  Regression   2    29460.027   14730.013    6.53861     0.01201
  Residual    12    27033.306    2252.776
  Total       14    56493.333

               Coefficients   Standard Error   t Stat     P-value   Lower 95%    Upper 95%
  Intercept     306.52619       114.25389       2.68285   0.01993    57.58835    555.46404
  Price         -24.97509        10.83213      -2.30565   0.03979   -48.57626     -1.37392
  Advertising    74.13096        25.96732       2.85478   0.01449    17.55303    130.70888
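The slides use Excel for this output; as a minimal sketch only, the same fit can be reproduced in Python with numpy (the tooling choice is an assumption, not part of the slides):

```python
import numpy as np

# Pie sales data from slide 13 (15 weeks)
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470, 450, 490,
                  340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5, 4.0,
                   3.5, 3.2, 4.0, 3.5, 2.7])

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(price), price, advert])

# Least squares fit of Sales = b0 + b1*Price + b2*Advertising
b, _, _, _ = np.linalg.lstsq(X, sales, rcond=None)
print(np.round(b, 5))   # expected from the slide: 306.52619, -24.97509, 74.13096

# R Square = 1 - SSE/SST
fitted = X @ b
ss_total = np.sum((sales - sales.mean()) ** 2)
ss_resid = np.sum((sales - fitted) ** 2)
print(round(1 - ss_resid / ss_total, 5))   # expected: ~0.52148
```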

15 The Regression Equation
Sales = 306.526 − 24.975(Price) + 74.131(Advertising)
where Sales is in number of pies per week, Price is in $, and Advertising is in $100s.

16 Using the Model to Make Predictions
Predict sales for a week in which the selling price is $5.50 and advertising is $350:
Sales = 306.526 − 24.975(5.50) + 74.131(3.5) = 428.62
Predicted sales are 428.62 pies. Note that Advertising is in $100s, so $350 means x2 = 3.5.
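A quick numerical check of this prediction, using the coefficients from the slide-14 output (a Python sketch for illustration, not part of the slides):

```python
import numpy as np

# Coefficients taken from the regression output on slide 14
b = np.array([306.52619, -24.97509, 74.13096])   # [intercept, price, advertising]

# Price = $5.50, advertising = $350, i.e. 3.5 since advertising is measured in $100s
new_x = np.array([1.0, 5.50, 3.5])
print(round(float(new_x @ b), 2))                # ~428.62 pies per week
```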

17 Coefficient of Determination (continued)
(Regression output as on slide 14.)
R Square = 0.52148: 52.1% of the variation in pie sales is explained by the variation in price and advertising.
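As a quick arithmetic check, R Square is the regression sum of squares divided by the total sum of squares from the ANOVA table (a minimal sketch; the numbers come from slide 14):

```python
SSR, SST = 29460.027, 56493.333   # regression and total sums of squares (slide 14)
r_squared = SSR / SST
print(round(r_squared, 5))        # ~0.52148, i.e. 52.1% of variation explained
```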

18 F-Test for Overall Significance of the Model
Shows whether there is a linear relationship between all of the x variables considered together and y.
To test the statistical significance of the regression relation between the response variable y and the set of variables x2 and x3, i.e. to choose between the alternatives
H0: β2 = β3 = 0   versus   HA: β2 and β3 not both zero,
we use the test statistic F = MSR / MSE = (SSR / k) / (SSE / (n − k − 1)),
where the numerator's df = k and the denominator's df = (n − k − 1).
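A minimal sketch of this F computation, using the ANOVA numbers from the slide-14 output (the scipy call for the p-value is an assumption about tooling, not something shown on the slides):

```python
from scipy import stats

# ANOVA quantities from the regression output (slide 14)
SSR, SSE = 29460.027, 27033.306   # regression and residual sums of squares
k, n = 2, 15                      # number of predictors, number of observations

MSR = SSR / k
MSE = SSE / (n - k - 1)
F = MSR / MSE
p_value = stats.f.sf(F, k, n - k - 1)   # upper-tail probability of the F distribution

print(round(F, 4), round(p_value, 5))   # ~6.5386 and ~0.01201, matching the output
```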

19 F-Test for Overall Significance (continued)
(Regression output as on slide 14.)
F = 6.53861 with 2 and 12 degrees of freedom; the p-value for the F-test is 0.01201 (Significance F in the output).

20 F-Test for Overall Significance (continued)
H0: β2 = β3 = 0   HA: β2 and β3 not both zero
α = 0.05; df(numerator) = 2, df(denominator) = 12
Test statistic: F = MSR / MSE = 6.5386
Critical value: F(0.05) = 3.885
Decision: F = 6.5386 falls in the rejection region (F > 3.885), so reject H0 at α = 0.05.
Conclusion: the regression model does explain a significant portion of the variation in pie sales (there is evidence that at least one independent variable affects y).

21 Significance Tests for βi
't' test for a population slope: is there a linear relationship between xi and y?
Null and alternative hypotheses: H0: βi = 0 (no linear relationship); HA: βi ≠ 0 (a linear relationship does exist)
Test statistic: t = bi / s(bi), with df = n − k − 1

22 Decision rule: reject H0 if |t| > t(α/2) with df = n − k − 1; otherwise do not reject H0.

23 Are Individual Variables Significant? (continued)
(Regression output as on slide 14.)
t-value for Price: t = −2.306, with p-value 0.0398
t-value for Advertising: t = 2.855, with p-value 0.0145

24 Inferences about the Slope: t Test Example
H0: βi = 0; HA: βi ≠ 0
d.f. = 15 − 2 − 1 = 12, α = 0.05, t(α/2) = 2.1788
From the Excel output:
               Coefficients   Standard Error   t Stat     P-value
  Price          -24.97509      10.83213       -2.30565   0.03979
  Advertising     74.13096      25.96732        2.85478   0.01449
Decision: reject H0 for each variable; each test statistic falls in the rejection region (|t| > 2.1788, p-values < 0.05).
Conclusion: there is evidence that both Price and Advertising affect pie sales at α = 0.05.
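A minimal sketch of these slope t-tests, using the coefficients and standard errors from the slide-14 output (the scipy calls are an assumption about tooling, used only to reproduce the critical value and p-values):

```python
from scipy import stats

n, k = 15, 2
df = n - k - 1   # 12

# (coefficient, standard error) pairs from the slide-14 output
slopes = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}

t_crit = stats.t.ppf(1 - 0.025, df)   # ~2.1788 for a two-tailed test at alpha = 0.05
for name, (b, se) in slopes.items():
    t_stat = b / se
    p_value = 2 * stats.t.sf(abs(t_stat), df)
    print(f"{name}: t = {t_stat:.3f}, p = {p_value:.4f}, reject H0: {abs(t_stat) > t_crit}")
```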

25 Forecasting Horizon
Forecasting horizon: the number of future periods covered by a forecast (it consists of one or more forecasting periods). It is sometimes referred to as forecast lead time. The forecasting horizon, or lead time, is typically divided into four categories:
- Immediate term: less than one month
- Short term: one to three months
- Medium term: three months to two years
- Long term: two years or more

26 Time-Series Components
A time series can be broken into four components: trend, seasonal, cyclical, and random.

27 Trend Component
- Long-run increase or decrease over time (overall upward or downward movement)
- Data taken over a long period of time
(Chart: sales versus time showing an upward trend.)

28 Trend Component (continued)
- Trend can be upward or downward
- Trend can be linear or non-linear
- Can be stationary or non-stationary
(Charts: a downward linear trend and an upward nonlinear trend of sales over time.)

29 Seasonal Component
- Short-term regular wave-like patterns
- Observed within one year, often monthly or quarterly
- Recurrence period: the shortest period of repetition (must be less than one year)
(Chart: quarterly sales over time, cycling through winter, spring, summer, and fall.)

30 Cyclical Component
- Long-term wave-like patterns
- Occur regularly but may vary in length
- Often measured peak to peak or trough to trough
(Chart: sales over several years showing one full cycle.)

31 Random Component
- Unpredictable, random, "residual" fluctuations
- Due to random variations in nature, accidents, or unusual events
- "Noise" in the time series

32 Multiplicative Time-Series Model
- Used primarily for forecasting
- Allows consideration of seasonal variation
- The observed value in the time series is the product of the components: y_t = T_t × S_t × C_t × I_t
- Classical decomposition is used to identify the various components
where T_t = trend value at time t, S_t = seasonal value at time t, C_t = cyclical value at time t, and I_t = irregular (random) value at time t.

33 Seasonal Adjustment
1. Compute each moving average
2. Compute the centered moving averages
3. Isolate the seasonal component by determining the ratio-to-moving-average values
4. Determine seasonal indexes and normalize if necessary
5. Deseasonalize the time series
6. Develop a trend line using the deseasonalized data
7. Develop unadjusted forecasts using trend projection
8. Seasonally adjust the forecasts

34 Moving Averages (continued)
Example: four-quarter moving average
- First average: (23 + 40 + 25 + 27) / 4 = 28.75
- Second average: (40 + 25 + 27 + 32) / 4 = 31.00
- etc.

35 Seasonal Data
Quarter:  1   2   3   4   5   6   7   8   9   10   11   ...
Sales:    23  40  25  27  32  48  33  37  37  50   40   ...

36 Calculating Moving Averages
Each moving average is for a consecutive block of 4 quarters.
Quarter:  1   2   3   4   5   6   7   8   9   10   11
Sales:    23  40  25  27  32  48  33  37  37  50   40

Average Period   4-Quarter Moving Average
    2.5               28.75
    3.5               31.00
    4.5               33.00
    5.5               35.00
    6.5               37.50
    7.5               38.75
    8.5               39.25
    9.5               41.00
    etc.

37 Centered Moving Averages
Average periods of 2.5 or 3.5 don't match the original quarters, so we average two consecutive moving averages to get centered moving averages.
Average Period   4-Quarter Moving Average      Centered Period   Centered Moving Average
    2.5               28.75                         3                 29.88
    3.5               31.00                         4                 32.00
    4.5               33.00                         5                 34.00
    5.5               35.00                         6                 36.25
    6.5               37.50                         7                 38.13
    7.5               38.75                         8                 39.00
    8.5               39.25                         9                 40.13
    9.5               41.00                         etc.
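A minimal Python sketch of steps 1 and 2, using the quarterly sales series from slide 35 (the code itself is illustrative, not from the slides):

```python
import numpy as np

sales = np.array([23, 40, 25, 27, 32, 48, 33, 37, 37, 50, 40], dtype=float)

# Step 1: 4-quarter moving averages; each spans quarters i..i+3,
# so it sits at "average period" i + 1.5 (2.5, 3.5, ...)
ma4 = np.convolve(sales, np.ones(4) / 4, mode="valid")

# Step 2: centered moving averages (average of two consecutive 4-quarter MAs,
# which lines up with whole quarters 3, 4, 5, ...)
centered = (ma4[:-1] + ma4[1:]) / 2

print(ma4)       # 28.75, 31.00, 33.00, 35.00, 37.50, 38.75, 39.25, 41.00 (as on slide 36)
print(centered)  # 29.875, 32, 34, 36.25, 38.125, 39, 40.125 (slide 37 rounds these to 29.88, ..., 40.13)
```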

38 Calculating the Ratio-to-Moving Average
Divide the actual sales value by the centered moving average for that quarter:
Ratio-to-moving average = Actual sales / Centered moving average

39 Calculating Seasonal Indexes
Quarter   Sales   Centered Moving Average   Ratio-to-Moving Average
  1        23             --                       --
  2        40             --                       --
  3        25            29.88                    0.837
  4        27            32.00                    0.844
  5        32            34.00                    0.941
  6        48            36.25                    1.324
  7        33            38.13                    0.865
  8        37            39.00                    0.949
  9        37            40.13                    0.922
 10        50            etc.                     etc.
 11        40
Example: 0.837 = 25 / 29.88

40 Calculating Seasonal Indexes (continued)
(Same table as slide 39.)
Average all of the fall ratio-to-moving-average values to get fall's seasonal index; do the same for the other three seasons to get the other seasonal indexes.
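A minimal sketch of steps 3 and 4 in Python (the assignment of quarter positions to seasons is an assumption for illustration; the slides do not say which quarter is which season, and the indexes computed from this short series will not equal the separate illustrative set shown on slide 41):

```python
import numpy as np

sales = np.array([23, 40, 25, 27, 32, 48, 33, 37, 37, 50, 40], dtype=float)

ma4 = np.convolve(sales, np.ones(4) / 4, mode="valid")
centered = (ma4[:-1] + ma4[1:]) / 2              # aligned with quarters 3..9

# Step 3: ratio-to-moving-average values for quarters 3..9
ratios = sales[2:2 + len(centered)] / centered

# Step 4: average the ratios belonging to the same quarter-of-year position,
# then normalize so the four indexes sum to 4
groups = {}
for quarter_pos, r in zip(range(2, 2 + len(ratios)), ratios):
    groups.setdefault(quarter_pos % 4, []).append(r)
raw_index = {q: float(np.mean(v)) for q, v in groups.items()}
scale = 4.0 / sum(raw_index.values())
seasonal_index = {q: v * scale for q, v in raw_index.items()}

print({q: round(v, 3) for q, v in sorted(seasonal_index.items())})  # four indexes summing to 4
```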

41 Interpreting Seasonal Indexes
Suppose we get these seasonal indexes:
Season    Seasonal Index
Spring        0.825
Summer        1.310
Fall          0.920
Winter        0.945
Sum         = 4.000  (four seasons, so the indexes must sum to 4)
Interpretation: spring sales average 82.5% of the annual average sales; summer sales are 31.0% higher than the annual average sales; etc.

42 If the sum of the unadjusted seasonal indexes is not equal to 4, multiply each index by the adjustment factor 4 / (sum of the unadjusted indexes) so that the adjusted indexes sum to 4.

43 To Find the Seasonal Index
The adjusted seasonal indexes are obtained by applying this multiplier to each unadjusted index. These seasonal indexes can now be used to remove the seasonal component from the original series.

44 Deseasonalizing
The data are deseasonalized by dividing each observed value by its seasonal index:
Deseasonalized value = Observed value / Seasonal index
This smooths the data by removing seasonal variation.

45 Deseasonalizing (continued)
Quarter   Sales   Seasonal Index   Deseasonalized Sales
  1        23         0.825              27.88
  2        40         1.310              30.53
  3        25         0.920              27.17
  4        27         0.945              28.57
  5        32         0.825              38.79
  6        48         1.310              36.64
  7        33         0.920              35.87
  8        37         0.945              39.15
  9        37         0.825              44.85
 10        50         1.310              38.17
 11        40         0.920              43.48
  etc.
Example: 27.88 = 23 / 0.825
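A minimal sketch of step 5 in Python, using the sales series from slide 35 and the illustrative seasonal indexes from slide 41 (the code is for illustration only):

```python
import numpy as np

sales = np.array([23, 40, 25, 27, 32, 48, 33, 37, 37, 50, 40], dtype=float)

# Illustrative seasonal indexes from slide 41, repeated to cover every quarter
season_index = np.array([0.825, 1.310, 0.920, 0.945])
indexes = np.resize(season_index, len(sales))

deseasonalized = sales / indexes
print(np.round(deseasonalized, 2))
# 27.88, 30.53, 27.17, 28.57, 38.79, 36.64, 35.87, 39.15, 44.85, 38.17, 43.48 (as on slide 45)
```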

46 Unseasonalized vs. Seasonalized
(Chart comparing the original and deseasonalized sales series.)

47 Fitting Trend Models
Once the seasonality has been removed from the data set, trend models can be applied to the deseasonalized data.

48 Trend-Based Forecasting
Estimate a trend line using regression analysis, with time (t) as the independent variable:
Year   Time Period (t)   Sales (y)
1999         1              20
2000         2              40
2001         3              30
2002         4              50
2003         5              70
2004         6              65

49 Trend-Based Forecasting (continued)
The linear trend model fitted to these data (same table as slide 48) is ŷ = 12.333 + 9.5714·t.

50 Trend-Based Forecasting (continued)
Forecast for 2005 (i.e. t = 7): ŷ = 12.333 + 9.5714(7) ≈ 79.33
Year   Time Period (t)   Sales (y)
1999         1              20
2000         2              40
2001         3              30
2002         4              50
2003         5              70
2004         6              65
2005         7              ??
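A minimal sketch of this trend fit and forecast in Python, using the data from slide 48 (numpy's polyfit is an assumption about tooling, not part of the slides):

```python
import numpy as np

t = np.arange(1, 7)                                   # time periods 1..6 (1999-2004)
y = np.array([20, 40, 30, 50, 70, 65], dtype=float)

# Fit a linear trend y = b0 + b1*t by least squares
b1, b0 = np.polyfit(t, y, 1)                          # polyfit returns highest power first
print(f"trend line: y_hat = {b0:.3f} + {b1:.4f} t")   # ~12.333 + 9.5714 t

# Trend projection for 2005 (t = 7)
forecast_2005 = b0 + b1 * 7
print(round(forecast_2005, 2))                        # ~79.33
```

For a seasonal series, this unadjusted trend forecast would then be multiplied by the appropriate seasonal index (step 8 of the procedure on slide 33).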

