

1 Econ 240 C Lecture 6

2 Part I: Box-Jenkins Magic. All ARMA models of time series are built from one source: white noise.

3 Analysis and Synthesis. White noise, WN(t), is a sequence of independent draws from a normal distribution, N(0, σ²), indexed by time.

4 Analysis. Random walk, RW(t). Analysis formulation:
RW(t) = RW(t-1) + WN(t)
RW(t) - RW(t-1) = WN(t)
RW(t) - Z*RW(t) = WN(t)
[1 - Z]*RW(t) = WN(t)
Δ*RW(t) = WN(t), which shows how you turn a random walk into white noise.

5 Synthesis. Random walk, synthesis formulation:
RW(t) = {1/[1 - Z]}*WN(t)
RW(t) = [1 + Z + Z² + ...]*WN(t)
RW(t) = WN(t) + Z*WN(t) + Z²*WN(t) + ...
RW(t) = WN(t) + WN(t-1) + WN(t-2) + ..., which shows how you build a random walk from white noise.
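The analysis and synthesis formulations above can be checked in a short simulation. This is a Python/numpy sketch (the lecture itself works in EVIEWS): build a random walk from white noise by a running sum, then recover the shocks by first-differencing.

```python
import numpy as np

rng = np.random.default_rng(0)
wn = rng.normal(0.0, 1.0, size=1_000)   # white noise: draws from N(0, sigma^2)

# Synthesis: RW(t) = WN(t) + WN(t-1) + WN(t-2) + ... is a running sum
rw = np.cumsum(wn)

# Analysis: first-differencing, [1 - Z]*RW(t) = WN(t), recovers the shocks
recovered = np.diff(rw)
```

The round trip is exact: differencing the cumulative sum returns the original white-noise input (after the first observation).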

6 Analysis. Autoregressive process of the first order, analysis formulation:
ARONE(t) = b*ARONE(t-1) + WN(t)
ARONE(t) - b*ARONE(t-1) = WN(t)
ARONE(t) - b*Z*ARONE(t) = WN(t)
[1 - b*Z]*ARONE(t) = WN(t), which is a quasi-difference and shows how you turn an autoregressive process of the first order into white noise.

7 Synthesis. Autoregressive process of the first order, synthesis formulation:
ARONE(t) = {1/[1 - b*Z]}*WN(t)
ARONE(t) = [1 + b*Z + b²*Z² + ...]*WN(t)
ARONE(t) = WN(t) + b*Z*WN(t) + b²*Z²*WN(t) + ...
ARONE(t) = WN(t) + b*WN(t-1) + b²*WN(t-2) + ..., which shows how you turn white noise into an autoregressive process of the first order.
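The recursion (analysis form) and the infinite moving-average (synthesis form) describe the same process. A numpy sketch, not from the lecture: run the recursion on one white-noise history, and separately build the series from the truncated weighted sum of past shocks; the two agree because the weights b^k die out geometrically.

```python
import numpy as np

rng = np.random.default_rng(1)
b, n = 0.6, 5_000
wn = rng.normal(0.0, 1.0, size=n)

# Analysis form, run as a recursion: ARONE(t) = b*ARONE(t-1) + WN(t)
ar = np.empty(n)
ar[0] = wn[0]
for t in range(1, n):
    ar[t] = b * ar[t - 1] + wn[t]

# Synthesis form: ARONE(t) = WN(t) + b*WN(t-1) + b^2*WN(t-2) + ...,
# truncated after 60 terms since b^60 is negligible
weights = b ** np.arange(60)
synth = np.convolve(wn, weights)[:n]
```

The truncation error is of order b^60, far below floating-point tolerance, so the two constructions match element by element.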

8 Part II: Characterizing Time Series Behavior
Mean function: m(t) = E[time_series(t)]
White noise: m(t) = E WN(t) = 0, all t
Random walk: m(t) = E[WN(t) + WN(t-1) + ...] = 0, all t
First order autoregressive process: m(t) = E[WN(t) + b*WN(t-1) + b²*WN(t-2) + ...] = 0, all t
Note that for all three types of time series we calculate the mean function from the synthesis expression for the time series.
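The mean function is an ensemble expectation at each date t, not a time average along one path; a single random walk wanders far from zero even though E[RW(t)] = 0. A numpy sketch (illustrative, not from the lecture): simulate many independent random walks and average across them at each date.

```python
import numpy as np

rng = np.random.default_rng(5)

# Ensemble of 10,000 independent random walks, each 100 steps long
wn_paths = rng.normal(size=(10_000, 100))   # one white-noise history per row
rw_paths = np.cumsum(wn_paths, axis=1)      # one random walk per row

# m(t) = E[RW(t)] = 0 at every t: average across the ensemble at each date
ensemble_mean = rw_paths.mean(axis=0)
```

The cross-path average stays near zero at every date, even though the variance of any single path grows with t.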

9 Characterization: the Autocovariance Function
White noise: E[WN(t)*WN(t-u)] = 0 for u > 0, using the orthogonality (independence) property of white noise.
Random walk: E[RW(t)*RW(t-u)] = E{[WN(t) + WN(t-1) + WN(t-2) + ...]*[WN(t-u) + WN(t-u-1) + ...]} = σ² + σ² + ... = ∞, using the orthogonality property of white noise plus the theoretically infinite history of a random walk.

10 The Autocovariance Function
E[ARONE(t)*ARONE(t-u)] = b*E[ARONE(t-1)*ARONE(t-u)] + E[WN(t)*ARONE(t-u)]
γ_AR,AR(u) = b*γ_AR,AR(u-1) + 0, for u > 0. This uses both the analysis and the synthesis formulations for ARONE(t): the analysis formulation is used to multiply by ARONE(t-u) and take expectations, and the synthesis formulation is used, after lagging, to show that ARONE(t-u) depends only on WN(t-u) and earlier shocks.

11 The Autocorrelation Function
ρ_x,x(u) = γ_x,x(u)/γ_x,x(0)
White noise: ρ_WN,WN(u) = 0 for u > 0
Random walk: ρ_RW,RW(u) = 1, all u
Autoregressive of the first order: ρ_AR,AR(u) = b^u
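These three autocorrelation signatures show up in simulated data. A numpy sketch (hand-rolled sample ACF; the lecture reads these off the EVIEWS correlogram): white noise is near zero at every positive lag, the AR(1) decays like b^u, and the random walk stays near one.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation rho(u) for lags u = 0, 1, ..., max_lag."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    denom = np.dot(xc, xc)
    return np.array([np.dot(xc[u:], xc[:len(x) - u]) / denom
                     for u in range(max_lag + 1)])

rng = np.random.default_rng(2)
n, b = 100_000, 0.6
wn = rng.normal(size=n)
ar = np.empty(n)
ar[0] = wn[0]
for t in range(1, n):
    ar[t] = b * ar[t - 1] + wn[t]   # ARONE(t) = b*ARONE(t-1) + WN(t)
rw = np.cumsum(wn)                  # RW(t) = RW(t-1) + WN(t)

acf_wn = sample_acf(wn, 3)   # near 0 at every lag u > 0
acf_ar = sample_acf(ar, 3)   # near b**u: 0.6, 0.36, 0.216
acf_rw = sample_acf(rw, 3)   # near 1 at every lag
```

With a long sample the estimates sit close to the theoretical values, which is exactly what makes the correlogram useful for identification.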

12 Visual Preview of the Autocorrelation Function

13 Visual Preview of the Autocorrelation Function

14 Visual Preview of the Autocorrelation Function

15 Drop Lag Zero: The Mirror Image of the Mark of Zorro
(Figure: autocorrelation functions from lag 1 onward for white noise, a first order autoregressive process, and a random walk.)

16 Part III. Analysis in the Lab: Process
Identification
Estimation
Verification
Forecasting

17 Analysis in the Lab: Process
Identification: Is the time series stationary? Check the trace, the histogram, and the autocorrelation function. If it is, proceed; if it is not, difference it (prewhitening).

18 Change in Business Inventories, 1987 $ No trend, no seasonal

19 Change in Business Inventories, 1987 $ Symmetric, not normal

20 Change in Business Inventories, 1987 $
Sample: 1954:1 1998:2. Included observations: 176.
(Correlogram: the autocorrelation function decays geometrically from lag one; the partial autocorrelation function has a single spike at lag one and is near zero thereafter.)

21 Process: Analysis in the Lab
Identification. Conclude: stationary. Conjecture: autoregressive of the first order.

22 Process: Analysis in the Lab
Estimation. EVIEWS model:
time series(t) = constant + residual(t)
residual(t) = b*residual(t-1) + WN(t)?
Combine the two:
[time series(t) - c] = b*[time series(t-1) - c] + WN(t)?
EVIEWS specification: cbusin87 c ar(1)
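The idea behind the EVIEWS specification "cbusin87 c ar(1)" can be sketched with ordinary least squares in numpy: fit y(t) = a + b*y(t-1) + e(t), then back out the implied mean c = a/(1-b). The series below is a simulated stand-in, since the cbusin87 data live in the EVIEWS workfile; c_true, b_true, and the shock scale are illustrative assumptions, and this OLS regression is the conditional least-squares analog of the EVIEWS AR(1) estimator, not the identical algorithm.

```python
import numpy as np

# Simulated stand-in for the series (illustrative parameter values)
rng = np.random.default_rng(3)
c_true, b_true, n = 30.0, 0.64, 2_000
y = np.empty(n)
y[0] = c_true
for t in range(1, n):
    y[t] = c_true + b_true * (y[t - 1] - c_true) + rng.normal(0.0, 10.0)

# Least squares of y(t) on a constant and one lag: y(t) = a + b*y(t-1) + e(t)
X = np.column_stack([np.ones(n - 1), y[:-1]])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat = a_hat / (1.0 - b_hat)   # implied mean of the series
```

With a couple thousand observations the estimates land close to the values used to generate the data.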

23 Dependent Variable: CBUSIN87. Method: Least Squares. Sample (adjusted): 1954:2 1997:4. Included observations: 175 after adjusting endpoints. Convergence achieved after 3 iterations.
(EVIEWS output table: coefficient, standard error, t-statistic, and probability for C and AR(1), plus the usual summary statistics. Inverted AR root: 0.64.)

24 Estimation
Goodness of fit: Is there structure in the residuals, or are they orthogonal? Are the residuals normally distributed?

25 Goodness of Fit and Trace of the Residuals Conclude: Good fit, random residuals

26 Residuals. Sample: 1954:2 1997:4. Included observations: 175. Q-statistic probabilities adjusted for 1 ARMA term.
(Correlogram of the residuals: autocorrelations and partial autocorrelations are near zero at all lags.)
Conclude: orthogonal.

27 Histogram of the Residuals Conclude: Not normal, kurtotic

28 Process: Analysis in the Lab
Identification
Estimation
Verification: Is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.
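The "structure left in the residuals?" question is what the Q-statistic column in the EVIEWS correlogram answers. A hand-rolled numpy sketch of the Ljung-Box Q statistic (the data are simulated; the cutoff 21 is the familiar 5% chi-square critical value with 12 degrees of freedom):

```python
import numpy as np

def ljung_box_q(resid, m):
    """Ljung-Box Q statistic over the first m residual autocorrelations."""
    resid = np.asarray(resid, dtype=float)
    n = len(resid)
    rc = resid - resid.mean()
    denom = np.dot(rc, rc)
    q = 0.0
    for u in range(1, m + 1):
        rho = np.dot(rc[u:], rc[:n - u]) / denom
        q += rho * rho / (n - u)
    return n * (n + 2) * q

rng = np.random.default_rng(4)
clean = rng.normal(size=175)        # orthogonal residuals: accept the model
q_clean = ljung_box_q(clean, 12)    # ~chi-square(12); 5% cutoff about 21

trending = np.cumsum(clean)         # leftover structure: reject the model
q_bad = ljung_box_q(trending, 12)
```

Orthogonal residuals give a small Q; residuals with remaining structure give a Q far beyond any reasonable critical value.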

29 Process: Analysis in the Lab
Identification
Estimation
Verification: Is there any structure left in the residuals? If not, we are back to our building block, orthogonal residuals, and we accept the model.
Forecasting: one period ahead forecasts.

30 Process: Analysis in the Lab
Forecasting. The estimated model:
[cbusin87(t) - c] = 0.637*[cbusin87(t-1) - c] + N(t), where N(t) is an independent error series but is not normally distributed.
The forecast is based on the estimated model:
[cbusin87(1998.1) - c] = 0.637*[cbusin87(1997.4) - c] + N(1998.1)

31 Process: Analysis in the Lab
Estimation. EVIEWS model:
time series(t) = constant + residual(t)
residual(t) = b*residual(t-1) + WN(t)?
Combine the two:
[time series(t) - c] = b*[time series(t-1) - c] + WN(t)?
EVIEWS specification: cbusin87 c ar(1)

32 Dependent Variable: CBUSIN87. Method: Least Squares. Sample (adjusted): 1954:2 1997:4. Included observations: 175 after adjusting endpoints. Convergence achieved after 3 iterations.
(EVIEWS output table: coefficient, standard error, t-statistic, and probability for C and AR(1), plus the usual summary statistics. Inverted AR root: 0.64.)

33 The Forecast
Take expectations of the model, as of 1997.4:
E[cbusin87(1998.1) - c] = 0.637*E[cbusin87(1997.4) - c] + E N(1998.1)
E cbusin87(1998.1) is the forecast conditional on what we know as of 1997.4.
cbusin87(1997.4) = 74, the value of the series in 1997.4.
E N(1998.1) = 0, the best guess for the shock.

34 The Forecast
Calculate the forecast by hand. For a one period ahead forecast, the standard error of the regression can be used as the standard error of the forecast.
Upper band: forecast + 2*SER
Lower band: forecast - 2*SER
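The by-hand calculation is a few lines of arithmetic. In this sketch the AR(1) coefficient 0.637 and the last observation 74 are taken from the lecture's estimated model, while the mean c and the SER are illustrative stand-ins, since those numerical values are not reported here.

```python
# One-period-ahead forecast from the estimated AR(1) model
# [y(t) - c] = b*[y(t-1) - c] + WN(t)
c, b = 30.0, 0.637   # mean c is an assumed value; b = 0.637 from the lecture
ser = 20.0           # standard error of the regression (assumed value)
y_last = 74.0        # cbusin87(1997.4), the last observed value

forecast = c + b * (y_last - c)   # E[y(1998.1)] = c + b*(y(1997.4) - c)
upper = forecast + 2 * ser        # upper band, roughly 95%
lower = forecast - 2 * ser        # lower band
print(forecast, lower, upper)
```

With these stand-in numbers the point forecast is 30 + 0.637*(74 - 30) = 58.028, bracketed by the two-SER bands; EVIEWS can then be used as a check, as the next slide suggests.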

35 The Forecast. Use EVIEWS as a check.

36

37 95 % Confidence Intervals and the Forecast, Visual

38 The Numerical Forecast in EVIEWS and the Standard Error of the Forecast
(Table: quarterly observations for 1996 through 1997:4, followed by the 1998:1 one period ahead forecast and its standard error.)

39 Part IV. Process: Fill in the Blanks. The ratio of inventories to sales, total business.

40 What is the first step?

41 Spreadsheet
(Table: quarterly observations of the inventories-to-sales ratio.)

42 Trace. Conclusions?

43 Histogram Conclusions?

44 Correlogram. Conclusions?

45 What is the Next Step?

46 Conjecture: Model

47 What do we do now?

48

49 What do we do next?

50 What conclusions can we draw?

51 Conclusions

52

53 If we accepted this model, what would the formula be? Ratioinvsale(t)

54 Make a one period ahead forecast; what is the standard error of the forecast?