Slide 1: Econ 240C, Lecture 6

Slide 2: Part I: Box-Jenkins Magic
ARMA models of time series are all built from one source: white noise.

Slide 3: Analysis and Synthesis
White noise, WN(t), is a sequence of independent draws from a normal distribution, N(0, σ²), indexed by time.

Slide 4: Analysis
Random walk, RW(t). Analysis formulation:
  RW(t) = RW(t-1) + WN(t)
  RW(t) - RW(t-1) = WN(t)
  RW(t) - Z*RW(t) = WN(t), where Z is the lag operator: Z*RW(t) = RW(t-1)
  [1 - Z]*RW(t) = WN(t)
  Δ*RW(t) = WN(t)
This shows how you turn a random walk into white noise: first-difference it.

Slide 5: Synthesis
Random walk, synthesis formulation:
  RW(t) = {1/[1 - Z]}*WN(t)
  RW(t) = [1 + Z + Z² + ...]*WN(t)
  RW(t) = WN(t) + Z*WN(t) + Z²*WN(t) + ...
  RW(t) = WN(t) + WN(t-1) + WN(t-2) + ...
This shows how you build a random walk from white noise: accumulate the shocks.
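
The two formulations are easy to check by simulation. The following is a minimal Python sketch, not part of the original slides (the course itself works in EViews); the seed and sample length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)                 # arbitrary seed, for reproducibility
n = 500                                        # arbitrary sample length
wn = rng.normal(loc=0.0, scale=1.0, size=n)    # white noise: draws from N(0, 1)

# Synthesis: RW(t) = WN(t) + WN(t-1) + ... is the cumulative sum of the shocks
rw = np.cumsum(wn)

# Analysis: first-differencing the random walk recovers the white noise
recovered = np.diff(rw, prepend=0.0)
assert np.allclose(recovered, wn)
```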

Slide 6: Analysis
Autoregressive process of the first order, analysis formulation:
  ARONE(t) = b*ARONE(t-1) + WN(t)
  ARONE(t) - b*ARONE(t-1) = WN(t)
  ARONE(t) - b*Z*ARONE(t) = WN(t)
  [1 - b*Z]*ARONE(t) = WN(t)
The operator [1 - b*Z] is a quasi-difference; it shows how you turn a first-order autoregressive process into white noise.

Slide 7: Synthesis
Autoregressive process of the first order, synthesis formulation:
  ARONE(t) = {1/[1 - b*Z]}*WN(t)
  ARONE(t) = [1 + b*Z + b²*Z² + ...]*WN(t)
  ARONE(t) = WN(t) + b*Z*WN(t) + b²*Z²*WN(t) + ...
  ARONE(t) = WN(t) + b*WN(t-1) + b²*WN(t-2) + ...
This shows how you turn white noise into a first-order autoregressive process.
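
As a check on the algebra, here is a small illustrative sketch (not course material) showing that the recursive analysis form and the truncated MA(∞) synthesis form generate the same AR(1) series once the start-up is handled the same way; b = 0.6 and the sample length are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, b = 300, 0.6                                # arbitrary stationary AR(1) coefficient
wn = rng.normal(size=n)

# Recursive (analysis) form: ARONE(t) = b*ARONE(t-1) + WN(t), started at WN(0)
ar = np.zeros(n)
ar[0] = wn[0]
for t in range(1, n):
    ar[t] = b * ar[t - 1] + wn[t]

# Synthesis form: ARONE(t) = WN(t) + b*WN(t-1) + b^2*WN(t-2) + ... (finite history here)
ar_ma = np.array([sum(b**j * wn[t - j] for j in range(t + 1)) for t in range(n)])

assert np.allclose(ar, ar_ma)
```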

Slide 8: Part II: Characterizing Time Series Behavior
Mean function: m(t) = E[time_series(t)]
  White noise: m(t) = E[WN(t)] = 0 for all t
  Random walk: m(t) = E[WN(t) + WN(t-1) + ...] = 0 for all t
  First-order autoregressive process: m(t) = E[WN(t) + b*WN(t-1) + b²*WN(t-2) + ...] = 0 for all t
Note that for all three types of time series we calculate the mean function from the synthesis expression for the time series.

Slide 9: Characterization: the Autocovariance Function
  White noise: E[WN(t)*WN(t-u)] = 0 for u > 0. This uses the orthogonality (independence) property of white noise.
  Random walk: E[RW(t)*RW(t-u)] = E{[WN(t) + WN(t-1) + WN(t-2) + ...]*[WN(t-u) + WN(t-u-1) + ...]} = σ² + σ² + σ² + ..., a sum that grows without bound. This uses the orthogonality property of white noise plus the theoretically infinite history of a random walk.

Slide 10: The Autocovariance Function
  E[ARONE(t)*ARONE(t-u)] = b*E[ARONE(t-1)*ARONE(t-u)] + E[WN(t)*ARONE(t-u)]
  γ_AR,AR(u) = b*γ_AR,AR(u-1) + 0, for u > 0.
This uses both the analysis and the synthesis formulations for ARONE(t). The analysis formulation is used to multiply by ARONE(t-u) and take expectations. The synthesis formulation is used, after lagging, to show that ARONE(t-u) depends only on WN(t-u) and earlier shocks, so the cross term is zero.
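
Filling in the steps compactly (standard algebra, not spelled out on the original slide): multiply the analysis form by ARONE(t-u), take expectations, and iterate the recursion to obtain the geometric decay used on the next slide.

```latex
\begin{aligned}
\gamma(u) &= E\left[\mathrm{ARONE}(t)\,\mathrm{ARONE}(t-u)\right] \\
          &= b\,E\left[\mathrm{ARONE}(t-1)\,\mathrm{ARONE}(t-u)\right]
             + E\left[\mathrm{WN}(t)\,\mathrm{ARONE}(t-u)\right] \\
          &= b\,\gamma(u-1) + 0 \qquad (u > 0) \\
\Rightarrow\ \gamma(u) &= b^{u}\,\gamma(0)
  \quad\text{and}\quad
  \rho(u) = \frac{\gamma(u)}{\gamma(0)} = b^{u}.
\end{aligned}
```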

Slide 11: The Autocorrelation Function
  ρ_x,x(u) = γ_x,x(u)/γ_x,x(0)
  White noise: ρ_WN,WN(u) = 0 for u ≠ 0
  Random walk: ρ_RW,RW(u) = 1 for all u
  Autoregressive of the first order: ρ_x,x(u) = b^u
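
A quick simulation sketch (illustrative only) comparing the theoretical AR(1) autocorrelations ρ(u) = b^u with sample autocorrelations computed by statsmodels; the coefficient and sample size are arbitrary.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(2)
n, b = 5000, 0.6                      # large n so the sample ACF is close to b**u
wn = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = b * x[t - 1] + wn[t]

sample_acf = acf(x, nlags=5, fft=True)    # sample autocorrelations, lags 0..5
theory_acf = b ** np.arange(6)            # rho(u) = b**u
print(np.round(sample_acf, 3))
print(np.round(theory_acf, 3))
```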

Slides 12-14: Visual Preview of the Autocorrelation Function (figures)

Slide 15: Drop Lag Zero: The Mirror Image of the Mark of Zorro
(Figure comparing autocorrelation functions from lag 1 onward; panels: White Noise, First Order Autoregressive, Random Walk.)

Slide 16: Part III: Analysis in the Lab: Process
  Identification
  Estimation
  Verification
  Forecasting

Slide 17: Analysis in the Lab: Process Identification
Identification: is the time series stationary?
  Trace
  Histogram
  Autocorrelation function
If it is, proceed. If it is not, difference it (prewhitening).

Slide 18: Change in Business Inventories, 1987 $
(Figure: trace of the series.) No trend, no seasonal.

Slide 19: Change in Business Inventories, 1987 $
(Figure: histogram.) Symmetric, not normal.

Slide 20: Change in Business Inventories, 1987 $: Correlogram
Sample: 1954:1 1998:2. Included observations: 176.

  Lag     AC      PAC     Q-Stat    Prob
   1     0.634   0.634    71.932   0.000
   2     0.391  -0.018    99.434   0.000
   3     0.230  -0.018   108.99    0.000
   4    -0.025  -0.267   109.11    0.000
   5    -0.146  -0.033   113.00    0.000
   6    -0.156   0.033   117.46    0.000
   7    -0.153   0.011   121.83    0.000
   8    -0.128  -0.034   124.87    0.000
   9    -0.074  -0.008   125.90    0.000
  10    -0.001   0.057   125.90    0.000
  11     0.048   0.029   126.34    0.000
  12     0.055  -0.032   126.91    0.000
  13     0.069   0.010   127.84    0.000
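
The correlogram above comes from EViews. For readers working in Python instead, a rough equivalent can be assembled from statsmodels; this is a sketch under the assumption that `cbusin87` is a pandas Series holding the change-in-inventories data (the data themselves are not reproduced here).

```python
import pandas as pd
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.stats.diagnostic import acorr_ljungbox

def correlogram(series: pd.Series, lags: int = 13) -> pd.DataFrame:
    """EViews-style correlogram: AC, PAC, Ljung-Box Q-statistic and p-value."""
    ac = acf(series, nlags=lags, fft=True)[1:]      # drop lag 0
    pac = pacf(series, nlags=lags)[1:]
    lb = acorr_ljungbox(series, lags=lags)          # DataFrame with lb_stat, lb_pvalue (recent statsmodels)
    return pd.DataFrame({"AC": ac, "PAC": pac,
                         "Q-Stat": lb["lb_stat"].to_numpy(),
                         "Prob": lb["lb_pvalue"].to_numpy()},
                        index=pd.RangeIndex(1, lags + 1, name="Lag"))

# correlogram(cbusin87)  # expect slowly decaying AC and a single large PAC spike at lag 1
```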

Slide 21: Process: Analysis in the Lab
Identification
  Conclude: stationary.
  Conjecture: autoregressive of the first order.

Slide 22: Process: Analysis in the Lab
Estimation
  EViews model:
    time_series(t) = constant + residual(t)
    residual(t) = b*residual(t-1) + WN(t)?
  Combine the two:
    [time_series(t) - c] = b*[time_series(t-1) - c] + WN(t)?
  EViews specification: cbusin87 c ar(1)

Slide 23: Estimation Output (EViews)
Dependent Variable: CBUSIN87
Method: Least Squares
Sample (adjusted): 1954:2 1997:4
Included observations: 175 after adjusting endpoints
Convergence achieved after 3 iterations

  Variable   Coefficient   Std. Error   t-Statistic   Prob.
  C           36.58319      6.977023     5.243381     0.0000
  AR(1)        0.636816     0.058435    10.89788      0.0000

  R-squared             0.407055    Mean dependent var     35.67438
  Adjusted R-squared    0.403627    S.D. dependent var     43.38323
  S.E. of regression   33.50278     Akaike info criterion   9.872497
  Sum squared resid   194181.5      Schwarz criterion       9.908666
  Log likelihood     -861.8435      F-statistic           118.7638
  Durbin-Watson stat    1.978452    Prob(F-statistic)       0.000000

  Inverted AR Roots: .64
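
For reference only: the same mean-plus-AR(1) specification can be fit in Python with statsmodels, which should give coefficients close to the EViews output above (the estimation methods differ slightly, so the numbers will not match to every digit). This again assumes `cbusin87` is a pandas Series with a quarterly index.

```python
from statsmodels.tsa.arima.model import ARIMA

# order=(1, 0, 0) with a constant corresponds to
# [y(t) - c] = b*[y(t-1) - c] + WN(t), i.e. the EViews spec `cbusin87 c ar(1)`
model = ARIMA(cbusin87, order=(1, 0, 0), trend="c")
fit = model.fit()
print(fit.summary())          # expect roughly C ≈ 36.6 and AR(1) ≈ 0.64
```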

Slide 24: Estimation
Goodness of fit:
  Is there structure in the residuals? Are they orthogonal?
  Are the residuals normally distributed?

Slide 25: Goodness of Fit and Trace of the Residuals
(Figure.) Conclude: good fit, random residuals.

Slide 26: Correlogram of the Residuals
Sample: 1954:2 1997:4. Included observations: 175.
Q-statistic probabilities adjusted for 1 ARMA term(s).

  Lag     AC      PAC     Q-Stat    Prob
   1     0.007   0.007    0.0095
   2    -0.007  -0.007    0.0187   0.891
   3     0.156   0.156    4.4132   0.110
   4    -0.145  -0.151    8.2199   0.042
   5    -0.150  -0.148   12.307    0.015
   6    -0.046  -0.071   12.688    0.026
   7    -0.058  -0.012   13.314    0.038
   8    -0.063  -0.040   14.055    0.050
   9    -0.042  -0.071   14.378    0.072
  10     0.023  -0.005   14.477    0.106
  11     0.054   0.046   15.020    0.131
  12     0.001  -0.011   15.021    0.182
  13     0.064   0.027   15.805    0.200

Conclude: orthogonal.
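
The corresponding residual check in Python (a sketch, assuming `fit` is the ARIMA result from the estimation sketch above) uses the Ljung-Box test; the `model_df=1` argument accounts for the single estimated ARMA parameter, matching the "adjusted for 1 ARMA term(s)" note in the EViews output.

```python
from statsmodels.stats.diagnostic import acorr_ljungbox

resid = fit.resid                                   # residuals from the AR(1) fit above
lb = acorr_ljungbox(resid, lags=13, model_df=1)     # Q-statistics and p-values, lags 1..13
print(lb)                                           # small p-values would flag leftover structure
```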

Slide 27: Histogram of the Residuals
(Figure.) Conclude: not normal, kurtotic.
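
The normality judgment can also be made with the Jarque-Bera test, which is driven by the skewness and excess kurtosis of the residuals; a sketch, again assuming `fit` from the estimation example above.

```python
from statsmodels.stats.stattools import jarque_bera

jb_stat, jb_pvalue, skew, kurtosis = jarque_bera(fit.resid)
print(jb_stat, jb_pvalue, skew, kurtosis)   # a kurtosis well above 3 matches "kurtotic"
```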

Slide 28: Process: Analysis in the Lab
  Identification
  Estimation
  Verification: is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.

Slide 29: Process: Analysis in the Lab
  Identification
  Estimation
  Verification: is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.
  Forecasting: one-period-ahead forecasts.

Slide 30: Process: Analysis in the Lab
Forecasting
  The estimated model:
    [cbusin87(t) - 36.58] = 0.637*[cbusin87(t-1) - 36.58] + N(t),
    where N(t) is an independent error series but is not normally distributed.
  The forecast is based on the estimated model:
    [cbusin87(1998.1) - 36.58] = 0.637*[cbusin87(1997.4) - 36.58] + N(1998.1)

Slide 31: Process: Analysis in the Lab
Estimation
  EViews model:
    time_series(t) = constant + residual(t)
    residual(t) = b*residual(t-1) + WN(t)?
  Combine the two:
    [time_series(t) - c] = b*[time_series(t-1) - c] + WN(t)?
  EViews specification: cbusin87 c ar(1)

Slide 32: Estimation Output (EViews)
Dependent Variable: CBUSIN87
Method: Least Squares
Sample (adjusted): 1954:2 1997:4
Included observations: 175 after adjusting endpoints
Convergence achieved after 3 iterations

  Variable   Coefficient   Std. Error   t-Statistic   Prob.
  C           36.58319      6.977023     5.243381     0.0000
  AR(1)        0.636816     0.058435    10.89788      0.0000

  R-squared             0.407055    Mean dependent var     35.67438
  Adjusted R-squared    0.403627    S.D. dependent var     43.38323
  S.E. of regression   33.50278     Akaike info criterion   9.872497
  Sum squared resid   194181.5      Schwarz criterion       9.908666
  Log likelihood     -861.8435      F-statistic           118.7638
  Durbin-Watson stat    1.978452    Prob(F-statistic)       0.000000

  Inverted AR Roots: .64

Slide 33: The Forecast
Take expectations of the model, conditional on information as of 1997.4:
  E_1997.4[cbusin87(1998.1) - 36.58] = 0.637*E_1997.4[cbusin87(1997.4) - 36.58] + E_1997.4[N(1998.1)]
  E_1997.4[cbusin87(1998.1)] is the forecast conditional on what we know as of 1997.4.
  cbusin87(1997.4) = 74, the value of the series in 1997.4.
  E_1997.4[N(1998.1)] = 0, the best guess for the shock.

Slide 34: The Forecast
Calculate the forecast by hand:
  For a one-period-ahead forecast, the standard error of the regression can be used as the standard error of the forecast.
  Upper band: forecast + 2*SER
  Lower band: forecast - 2*SER
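
The hand calculation is short enough to reproduce directly. A sketch using the numbers from the estimation output (c = 36.58, b = 0.637, last observation 74.0, SER = 33.50):

```python
c, b = 36.58, 0.637          # estimated constant and AR(1) coefficient
last_value = 74.0            # cbusin87 in 1997:4
ser = 33.50                  # standard error of the regression

forecast = c + b * (last_value - c)    # about 60.4, matching the EViews forecast of 60.41
upper = forecast + 2 * ser             # rough upper 95% band
lower = forecast - 2 * ser             # rough lower 95% band
print(round(forecast, 2), round(lower, 2), round(upper, 2))
```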

Slide 35: The Forecast
Use EViews as a check.

Slide 36: (figure)

Slide 37: 95% Confidence Intervals and the Forecast, Visual (figure)

Slide 38: The Numerical Forecast in EViews and the Standard Error of the Forecast

  Period    Actual       Forecast     Std. Error
  1995:4     14.60000     14.60000    NA
  1996:1     -3.000000    -3.000000   NA
  1996:2      6.700000     6.700000   NA
  1996:3     37.90000     37.90000    NA
  1996:4     32.90000     32.90000    NA
  1997:1     63.70000     63.70000    NA
  1997:2     77.60000     77.60000    NA
  1997:3     47.50000     47.50000    NA
  1997:4     74.00000     74.00000    NA
  1998:1     NA           60.41081    33.50278
  1998:2     NA           NA          NA

Slide 39: Part IV: Process: Fill in the Blanks
The series: the ratio of inventories to sales, total business.

Slide 40: What is the first step?

Slide 41: Spreadsheet
  1992:01  1.560000
  1992:02  1.560000
  1992:03  1.540000
  1992:04  1.530000
  1992:05  1.530000
  1992:06  1.520000
  1992:07  1.510000
  1992:08  1.540000
  1992:09  1.520000
  1992:10  1.520000
  1992:11  1.520000
  1992:12  1.530000
  1993:01  1.500000
  1993:02  1.510000
  ...
  2003:01  1.36

Slide 42: Trace (figure). Conclusions?

Slide 43: Histogram (figure). Conclusions?

Slide 44: Correlogram (figure). Conclusions?

Slide 45: What is the next step?

Slide 46: Conjecture: Model

Slide 47: What do we do now?

Slide 48: (figure)

Slide 49: What do we do next?

Slide 50: What conclusions can we draw?

Slide 51: Conclusions

Slide 52: (figure)

Slide 53: If we accepted this model, what would the formula be for Ratioinvsale(t)?

Slide 54: Make a one-period-ahead forecast; what is the standard error of the forecast?