1 Econ 240 C Lecture 6
2 Part I: Box-Jenkins Magic. ARMA models of time series are all built from one source: white noise.
3 Analysis and Synthesis. White noise, WN(t), is a sequence of draws from a normal distribution, N(0, σ²), indexed by time.
4 Analysis. Random walk, RW(t), analysis formulation:
RW(t) = RW(t-1) + WN(t)
RW(t) - RW(t-1) = WN(t)
RW(t) - Z*RW(t) = WN(t)
[1 - Z]*RW(t) = WN(t)
Δ*RW(t) = WN(t), which shows how you turn a random walk into white noise.
5 Synthesis. Random walk, synthesis formulation:
RW(t) = {1/[1 - Z]}*WN(t)
RW(t) = [1 + Z + Z² + …]*WN(t)
RW(t) = WN(t) + Z*WN(t) + …
RW(t) = WN(t) + WN(t-1) + WN(t-2) + …, which shows how you build a random walk from white noise.
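The synthesis formula says a random walk is just the running sum of its shocks, so both directions can be checked in a few lines. A minimal sketch, assuming Gaussian white noise with unit variance (the seed and series length are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
wn = rng.normal(0.0, 1.0, T)        # white noise WN(t) ~ N(0, 1)

# Synthesis: RW(t) = WN(t) + WN(t-1) + ... = cumulative sum of the shocks
rw = np.cumsum(wn)

# Analysis: first-differencing recovers the white noise exactly
recovered = np.diff(rw, prepend=0.0)
print(np.allclose(recovered, wn))   # True
```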
6 Analysis. Autoregressive process of the first order, analysis formulation:
ARONE(t) = b*ARONE(t-1) + WN(t)
ARONE(t) - b*ARONE(t-1) = WN(t)
ARONE(t) - b*Z*ARONE(t) = WN(t)
[1 - b*Z]*ARONE(t) = WN(t) is a quasi-difference, and shows how you turn an autoregressive process of the first order into white noise.
7 Synthesis. Autoregressive process of the first order, synthesis formulation:
ARONE(t) = {1/[1 - b*Z]}*WN(t)
ARONE(t) = [1 + b*Z + b²*Z² + …]*WN(t)
ARONE(t) = WN(t) + b*Z*WN(t) + b²*Z²*WN(t) + …
ARONE(t) = WN(t) + b*WN(t-1) + b²*WN(t-2) + …, which shows how you turn white noise into an autoregressive process of the first order.
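The same check works for the first-order autoregressive process: build ARONE(t) recursively from white noise, then apply the quasi-difference [1 - b*Z] to recover the shocks. A sketch with an illustrative b = 0.6:

```python
import numpy as np

rng = np.random.default_rng(1)
T, b = 500, 0.6
wn = rng.normal(0.0, 1.0, T)

# Synthesis: ARONE(t) = b*ARONE(t-1) + WN(t), started at zero
ar = np.zeros(T)
for t in range(1, T):
    ar[t] = b * ar[t - 1] + wn[t]

# Analysis: the quasi-difference [1 - b*Z] recovers the shocks
quasi_diff = ar[1:] - b * ar[:-1]
print(np.allclose(quasi_diff, wn[1:]))   # True
```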
8 Part II: Characterizing Time Series Behavior. Mean function, m(t) = E[time_series(t)].
White noise: m(t) = E[WN(t)] = 0, all t.
Random walk: m(t) = E[WN(t) + WN(t-1) + …] = 0, all t.
First order autoregressive process: m(t) = E[WN(t) + b*WN(t-1) + b²*WN(t-2) + …] = 0, all t.
Note that for all three types of time series we calculate the mean function from the synthetic expression for the time series.
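A quick Monte Carlo check of these zero mean functions: simulate many replications of each process and average across replications at a fixed date. The replication count, horizon, and b below are illustrative:

```python
import numpy as np

# Monte Carlo check that m(t) = 0 for white noise, the random walk,
# and the AR(1): average 10,000 replications of each series at t = 50.
rng = np.random.default_rng(2)
reps, T, b = 10_000, 50, 0.6
wn = rng.normal(0.0, 1.0, (reps, T))

rw = np.cumsum(wn, axis=1)          # random walk paths
ar = np.zeros((reps, T))            # AR(1) paths
for t in range(1, T):
    ar[:, t] = b * ar[:, t - 1] + wn[:, t]

# Cross-sectional averages at the last date should all be near zero
print(wn[:, -1].mean(), rw[:, -1].mean(), ar[:, -1].mean())
```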
9 Characterization: the Autocovariance Function.
E[WN(t)*WN(t-u)] = 0 for u > 0; this uses the orthogonality (independence) property of white noise.
E[RW(t)*RW(t-u)] = E{[WN(t) + WN(t-1) + WN(t-2) + …]*[WN(t-u) + WN(t-u-1) + …]} = σ² + σ² + …, which diverges; this uses the orthogonality property of white noise plus the theoretically infinite history of a random walk.
10 The Autocovariance Function.
E[ARONE(t)*ARONE(t-u)] = b*E[ARONE(t-1)*ARONE(t-u)] + E[WN(t)*ARONE(t-u)]
γ_AR,AR(u) = b*γ_AR,AR(u-1) + 0 for u > 0. This uses both the analytic and the synthetic formulations for ARONE(t): the analytic formulation is used to multiply by ARONE(t-u) and take expectations; the synthetic formulation is used to lag and show that ARONE(t-1) depends only on WN(t-1) and earlier shocks.
11 The Autocorrelation Function.
ρ_x,x(u) = γ_x,x(u)/γ_x,x(0)
White noise: ρ_WN,WN(u) = 0 for u > 0.
Random walk: ρ_RW,RW(u) = 1, all u.
Autoregressive of the first order: ρ_x,x(u) = b^u.
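These three autocorrelation signatures are easy to reproduce by simulation: roughly zero at nonzero lags for white noise, near one at every lag for the random walk, and geometric decay b^u for the AR(1). A sketch with an illustrative b = 0.6; acf here is a plain sample estimator written out by hand, not a library routine:

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function out to max_lag."""
    x = x - x.mean()
    gamma0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[u:], x[:len(x) - u]) / len(x) / gamma0
                     for u in range(max_lag + 1)])

rng = np.random.default_rng(3)
T, b = 5_000, 0.6
wn = rng.normal(size=T)
rw = np.cumsum(wn)
ar = np.zeros(T)
for t in range(1, T):
    ar[t] = b * ar[t - 1] + wn[t]

print(acf(wn, 3).round(2))   # roughly [1, 0, 0, 0]
print(acf(rw, 3).round(2))   # all near 1
print(acf(ar, 3).round(2))   # roughly [1, b, b^2, b^3]
```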
12 Visual Preview of the Autocorrelation Function
13 Visual Preview of the Autocorrelation Function
14 Visual Preview of the Autocorrelation Function
15 Drop Lag Zero: The Mirror Image of the Mark of Zorro. (Figure: autocorrelation functions plotted from lag 1 for White Noise, First Order Autoregressive, and Random Walk.)
16 Part III. Analysis in the Lab: Process Identification.
Identification
Estimation
Verification
Forecasting
17 Analysis in the Lab: Process Identification.
Identification: Is the time series stationary?
Trace
Histogram
Autocorrelation function
If it is, proceed. If it is not, difference (prewhitening).
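The stationarity question can be screened crudely before eyeballing the trace, histogram, and correlogram: a lag-one autocorrelation very near one is the telltale of a series that needs differencing. This is a heuristic, not a formal unit-root test; the threshold 0.97 and the simulated series are illustrative:

```python
import numpy as np

def rho1(x):
    """Sample lag-one autocorrelation."""
    x = x - x.mean()
    return np.dot(x[1:], x[:-1]) / np.dot(x, x)

def needs_differencing(x, threshold=0.97):
    """Crude heuristic (not a formal unit-root test): a lag-one
    autocorrelation very near one suggests a nonstationary series."""
    return rho1(x) > threshold

rng = np.random.default_rng(4)
rw = np.cumsum(rng.normal(size=2_000))   # a random walk: nonstationary

x = rw
while needs_differencing(x):
    x = np.diff(x)                        # prewhitening by differencing
print(needs_differencing(x))              # the differenced series passes
```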
18 Change in Business Inventories, 1987 $ No trend, no seasonal
19 Change in Business Inventories, 1987 $ Symmetric, not normal
20 Change in Business Inventories, 1987 $. Correlogram. Sample: 1954:1 1998:2; included observations: 176. (EViews correlogram output with AC, PAC, Q-Stat, and Prob columns; the numeric entries were lost in extraction. The bars show the autocorrelations decaying smoothly from lag one while the partial correlations cut off after lag one.)
21 Process: Analysis in the Lab. Identification.
Conclude: stationary.
Conjecture: autoregressive of the first order.
22 Process: Analysis in the Lab. Estimation.
EVIEWS model:
time series(t) = constant + residual(t)
residual(t) = b*residual(t-1) + WN(t)?
Combine the two:
[time series(t) - c] = b*[time series(t-1) - c] + WN(t)?
EVIEWS specification: cbusin87 c ar(1)
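Outside EViews, the same AR(1)-with-constant fit can be sketched as conditional least squares: estimate c by the sample mean, then regress the demeaned series on its own lag. The data below are simulated stand-ins for cbusin87, with illustrative values of c and b (this is an approximation to EViews' iterative estimator, not a replication of it):

```python
import numpy as np

# Conditional least squares for [x(t) - c] = b*[x(t-1) - c] + WN(t),
# analogous to the EViews spec "cbusin87 c ar(1)". Simulated data.
rng = np.random.default_rng(5)
T, c_true, b_true = 2_000, 25.0, 0.64
x = np.empty(T)
x[0] = c_true
for t in range(1, T):
    x[t] = c_true + b_true * (x[t - 1] - c_true) + rng.normal()

c_hat = x.mean()                      # estimate of the constant
y, ylag = x[1:] - c_hat, x[:-1] - c_hat
b_hat = np.dot(y, ylag) / np.dot(ylag, ylag)
ser = np.sqrt(np.sum((y - b_hat * ylag) ** 2) / (len(y) - 2))
print(round(b_hat, 2), round(ser, 2))
```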
23 Dependent Variable: CBUSIN87. Method: Least Squares. Sample (adjusted): 1954:2 1997:4; included observations: 175 after adjusting endpoints. Convergence achieved after 3 iterations. (Regression output table with Coefficient, Std. Error, t-Statistic, and Prob. for C and AR(1), plus R-squared, Adjusted R-squared, S.E. of regression, Sum squared resid, Log likelihood, Durbin-Watson stat, Mean and S.D. of dependent var, Akaike and Schwarz criteria, F-statistic and its Prob; the numeric entries were lost in extraction.) Inverted AR Roots: .64
24 Estimation.
Goodness of fit.
Structure in the residuals? Are they orthogonal?
Are the residuals normally distributed?
25 Goodness of Fit and Trace of the Residuals Conclude: Good fit, random residuals
26 Correlogram of the Residuals. Sample: 1954:2 1997:4; included observations: 175. Q-statistic probabilities adjusted for 1 ARMA term(s). (EViews correlogram of the residuals with AC, PAC, Q-Stat, and Prob columns; the numeric entries were lost in extraction. The bars show no autocorrelations outside the bands.) Conclude: orthogonal.
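The verification step can be sketched numerically: compute the residual autocorrelations and compare them with the approximate ±2/√T bands that EViews draws on the correlogram. The residuals below are simulated stand-ins for the actual ones:

```python
import numpy as np

# Verification sketch: residuals from a well-specified model should be
# orthogonal, so nearly all sample autocorrelations should fall inside
# the approximate +/- 2/sqrt(T) bands.
rng = np.random.default_rng(6)
resid = rng.normal(size=175)          # stand-in for model residuals
T = len(resid)

r = resid - resid.mean()
acs = np.array([np.dot(r[u:], r[:T - u]) / np.dot(r, r)
                for u in range(1, 13)])
band = 2.0 / np.sqrt(T)
print(np.sum(np.abs(acs) > band))     # expect few or no exceedances
```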
27 Histogram of the Residuals Conclude: Not normal, kurtotic
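The "not normal, kurtotic" call can be quantified with the sample skewness and kurtosis of the standardized residuals (a normal distribution has skewness 0 and kurtosis 3; EViews' Jarque-Bera statistic combines the two). The Laplace draws below are a heavy-tailed stand-in for the actual residuals:

```python
import numpy as np

rng = np.random.default_rng(7)
resid = rng.laplace(size=10_000)   # heavy-tailed stand-in residuals

z = (resid - resid.mean()) / resid.std()
skew = np.mean(z ** 3)             # 0 for a normal distribution
kurt = np.mean(z ** 4)             # 3 for a normal; 6 for a Laplace
print(round(skew, 2), round(kurt, 2))
```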
28 Process: Analysis in the Lab.
Identification
Estimation
Verification: Is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.
29 Process: Analysis in the Lab.
Identification
Estimation
Verification: Is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.
Forecasting: one period ahead forecasts.
30 Process: Analysis in the Lab. Forecasting.
The estimated model:
[cbusin87(t) - c] = 0.637*[cbusin87(t-1) - c] + N(t), where N(t) is an independent error series but is not normally distributed.
The forecast is based on the estimated model:
[cbusin87(1998.1) - c] = 0.637*[cbusin87(1997.4) - c] + N(1998.1)
31 Process: Analysis in the Lab. Estimation.
EVIEWS model:
time series(t) = constant + residual(t)
residual(t) = b*residual(t-1) + WN(t)?
Combine the two:
[time series(t) - c] = b*[time series(t-1) - c] + WN(t)?
EVIEWS specification: cbusin87 c ar(1)
32 Dependent Variable: CBUSIN87. Method: Least Squares. Sample (adjusted): 1954:2 1997:4; included observations: 175 after adjusting endpoints. Convergence achieved after 3 iterations. (Same regression output table as slide 23; numeric entries lost in extraction.) Inverted AR Roots: .64
33 The Forecast.
Take expectations of the model, as of 1997:4:
E[cbusin87(1998.1) - c] = 0.637*E[cbusin87(1997.4) - c] + E[N(1998.1)]
E[cbusin87(1998.1)] is the forecast conditional on what we know as of 1997:4.
cbusin87(1997.4) = 74, the value of the series in 1997:4.
E[N(1998.1)] = 0, the best guess for the shock.
34 The Forecast.
Calculate the forecast by hand.
For a one period ahead forecast, the standard error of the regression can be used for the standard error of the forecast.
Calculate the upper band: forecast + 2*SER.
Calculate the lower band: forecast - 2*SER.
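The by-hand recipe above is a few lines of arithmetic. b = 0.637 and the 1997:4 value of 74 come from the slides; the constant c and the SER below are PLACEHOLDER values, since those numbers did not survive in the regression output:

```python
# One-period-ahead forecast and its 2*SER band, following the slide's
# recipe for [x(t) - c] = b*[x(t-1) - c] + N(t).
c, b = 30.0, 0.637        # c is a PLACEHOLDER; b = 0.637 is from the slides
last, ser = 74.0, 20.0    # 1997:4 value from the slides; ser a PLACEHOLDER

forecast = c + b * (last - c)   # E[x(1998:1) | info through 1997:4]
upper = forecast + 2 * ser      # upper band
lower = forecast - 2 * ser      # lower band
print(forecast, lower, upper)
```

With the actual constant and SER from the EViews output substituted in, this reproduces the interval EViews reports.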
35 The Forecast. Use EVIEWS as a check.
36
37 95 % Confidence Intervals and the Forecast, Visual
38 The Numerical Forecast in EVIEWS and the Standard Error of the Forecast. (Table of quarterly actual and forecast values, 1996:1 through 1998:2, with NA entries where no forecast or actual exists; the numeric columns were lost in extraction.)
39 Part IV. Process: Fill in the Blanks. The ratio of inventories to sales, total business.
40 What is the first step?
41 Spreadsheet. (Table of quarterly observations of the inventories-to-sales ratio; the values were lost in extraction.)
42 Trace. Conclusions?
43 Histogram. Conclusions?
44 Correlogram. Conclusions?
45 What is the Next Step?
46 Conjecture: Model
47 What do we do now?
48
49 What do we do next?
50 What conclusions can we draw?
51 Conclusions
52
53 If we accepted this model, what would the formula be? Ratioinvsale(t) = ?
54 Make a one period ahead forecast; what is the standard error of the forecast?