Econometrics I
Professor William Greene
Stern School of Business, Department of Economics

Econometrics I Part 25 – Time Series

Modeling an Economic Time Series
- Observed y0, y1, …, yt, …
- What is the "sample"? Random sampling?
- The "observation window"

Estimators
- Functions of sums of observations
- Law of large numbers? The observations are not independent.
- What does "increasing sample size" mean?
- Asymptotic properties? (There are no finite sample properties.)

Interpreting a Time Series
- Time domain: a "process," y(t) = ax(t) + by(t-1) + …; a regression-like approach/interpretation.
- Frequency domain: y(t) is a sum of terms measuring the contribution of different frequencies to the observed series.
- (In "high frequency data" in financial econometrics, "frequency" is used slightly differently.)

Studying the Frequency Domain
- Cannot identify the number of terms
- Cannot identify the frequencies from the time series alone
- Deconstructing the variance, autocovariances and autocorrelations into contributions at different frequencies
- Apparently large weights at particular frequencies
- Uses Fourier transforms of the data
- Does this provide "new" information about the series?
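As a rough illustration of "large weights at different frequencies," a sketch (not from the slides; the two embedded frequencies, sample size, and noise level are arbitrary choices) using numpy's FFT:

```python
# Periodogram sketch: a series built from two sine waves plus noise.
# The squared FFT magnitudes show large weights exactly at the two
# embedded frequencies (bins 10 and 40 here -- illustrative values).
import numpy as np

rng = np.random.default_rng(7)
T = 1024
t = np.arange(T)
y = (np.sin(2 * np.pi * 10 * t / T)
     + 0.5 * np.sin(2 * np.pi * 40 * t / T)
     + 0.1 * rng.standard_normal(T))

power = np.abs(np.fft.rfft(y)) ** 2      # periodogram ordinates
peaks = np.argsort(power[1:])[-2:] + 1   # two largest, skipping frequency 0
print(sorted(peaks.tolist()))            # -> [10, 40]
```

The two signal frequencies dominate the noise floor by several orders of magnitude, which is what "apparent large weights" looks like in practice.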

Autocorrelation in Regression
- yt = b′xt + εt, with Cov(εt, εt-1) ≠ 0
- Example: RealCons = a + b·RealIncome + εt
- U.S. data, quarterly, 1950-2000

Autocorrelation
- How does it arise? What does it mean?
- Modeling approaches:
  - Classical, direct: corrective. Estimation that accounts for autocorrelation; inference in the presence of autocorrelation.
  - Contemporary, structural: model the source; incorporate the time series aspect in the model.

Stationary Time Series
- yt = b1yt-1 + b2yt-2 + … + bPyt-P + et
- Autocovariance: γk = Cov[yt, yt-k]
- Autocorrelation: ρk = γk / γ0
- Stationary series: γk depends only on k, not on t
- Weak stationarity: E[yt] is not a function of t; E[yt yt-s] is not a function of t, only of the lag s
- Strong stationarity: the joint distribution of [yt, yt-1, …, yt-s], for any window of length s periods, is not a function of t
- A condition for weak stationarity: the smallest (in absolute value) root of the characteristic polynomial 1 - b1z - b2z² - … - bPz^P = 0 is greater than one, i.e., all roots lie outside the unit circle. (Roots may be complex.)
- Example: yt = θyt-1 + et; 1 - θz = 0 has root z = 1/θ, and |z| > 1 if and only if |θ| < 1.
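The root condition can be checked mechanically. A minimal sketch (the AR coefficients below are illustrative, not from the slides):

```python
# Check weak stationarity of an AR(P) process by examining the roots of
# the characteristic polynomial 1 - b1*z - b2*z^2 - ... - bP*z^P = 0.
import numpy as np

def is_stationary(b):
    """b = [b1, ..., bP]; stationary if every root lies outside the unit circle."""
    # np.roots wants coefficients ordered from highest degree down to the
    # constant term: -bP*z^P - ... - b1*z + 1
    coefs = np.concatenate((-np.asarray(b, dtype=float)[::-1], [1.0]))
    roots = np.roots(coefs)
    return np.all(np.abs(roots) > 1.0)

print(is_stationary([0.5]))       # AR(1) with |theta| < 1 -> True
print(is_stationary([1.0]))       # unit root: z = 1 -> False
print(is_stationary([0.5, 0.3]))  # an AR(2) example (stationary)
```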

Stationary vs. Nonstationary Series

The Lag Operator
- Lxt = xt-1
- L²xt = xt-2
- L^P xt + L^Q xt = xt-P + xt-Q
- Polynomials in L: yt = B(L)yt + et, or A(L)yt = et
- Invertibility: yt = [A(L)]^-1 et

Inverting a Stationary Series
- yt = ρyt-1 + et implies (1 - ρL)yt = et
- yt = [1 - ρL]^-1 et = et + ρet-1 + ρ²et-2 + …
- A stationary series can be inverted: autoregressive vs. moving average form of the series.
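The inversion can be verified numerically: generating an AR(1) series by recursion and rebuilding it from the moving average form gives the same values. A sketch with an arbitrary ρ = 0.6:

```python
# Confirm that the AR(1) recursion y_t = rho*y_{t-1} + e_t matches the
# inverted moving-average form y_t = e_t + rho*e_{t-1} + rho^2*e_{t-2} + ...
# (rho and the sample size are illustrative choices).
import numpy as np

rng = np.random.default_rng(0)
rho, T = 0.6, 200
e = rng.standard_normal(T)

# Autoregressive form, started from y = 0
y_ar = np.zeros(T)
for t in range(T):
    y_ar[t] = (rho * y_ar[t - 1] if t > 0 else 0.0) + e[t]

# Moving-average form, truncated at the start of the sample
y_ma = np.array([sum(rho**j * e[t - j] for j in range(t + 1)) for t in range(T)])

print(np.max(np.abs(y_ar - y_ma)))  # agreement up to floating-point error
```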

Regression with Autocorrelation
- yt = xt′β + εt, with εt = ρεt-1 + ut
- (1 - ρL)εt = ut, so εt = (1 - ρL)^-1 ut
- E[εt] = E[(1 - ρL)^-1 ut] = (1 - ρL)^-1 E[ut] = 0
- Var[εt] = (1 + ρ² + ρ⁴ + …)σu² = σu²/(1 - ρ²)
- Cov[εt, εt-1] = Cov[ρεt-1 + ut, εt-1] = ρCov[εt-1, εt-1] + Cov[ut, εt-1] = ρσu²/(1 - ρ²)
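The variance and covariance formulas can be confirmed by simulation. A sketch assuming ρ = 0.8 and σu = 1 (illustrative values, not from the slides):

```python
# Simulate AR(1) disturbances eps_t = rho*eps_{t-1} + u_t and compare the
# sample moments with Var = sigma_u^2/(1-rho^2) and Cov = rho*Var.
import numpy as np

rng = np.random.default_rng(1)
rho, sigma_u, T = 0.8, 1.0, 200_000
u = rng.normal(0.0, sigma_u, T)

eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]
eps = eps[1000:]  # drop burn-in so the start at zero does not matter

var_theory = sigma_u**2 / (1 - rho**2)   # = 1/0.36 = 2.778
cov_theory = rho * var_theory            # = 2.222
print(np.var(eps), var_theory)
print(np.cov(eps[1:], eps[:-1])[0, 1], cov_theory)
```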

OLS vs. GLS
- OLS: unbiased; consistent (except in the presence of a lagged dependent variable); inefficient.
- GLS: consistent and efficient.

+----------------------------------------------------+
| Ordinary     least squares regression              |
| LHS=REALCONS Mean                 =   2999.436     |
| Autocorrel   Durbin-Watson Stat.  =   .0920480     |
|              Rho = cor[e,e(-1)]   =   .9539760     |
+----------------------------------------------------+
+---------+--------------+----------------+--------+---------+----------+
|Variable | Coefficient  | Standard Error |t-ratio |P[|T|>t] | Mean of X|
+---------+--------------+----------------+--------+---------+----------+
 Constant   -80.3547488     14.3058515      -5.617   .0000
 REALDPI     .92168567      .00387175      238.054   .0000    3341.47598
| Robust VC   Newey-West, Periods = 10 |
 Constant   -80.3547488     41.7239214      -1.926   .0555
 REALDPI     .92168567      .01503516       61.302   .0000    3341.47598
+---------------------------------------------+
| AR(1) Model:  e(t) = rho * e(t-1) + u(t)    |
| Final value of Rho        =    .998782      |
| Iter= 6, SS= 118367.007, Log-L=-941.371914  |
| Durbin-Watson:  e(t)      =    .002436      |
| Std. Deviation: e(t)      =  490.567910     |
| Std. Deviation: u(t)      =   24.206926     |
| Durbin-Watson:  u(t)      =   1.994957      |
| Autocorrelation: u(t)     =    .002521      |
| N[0,1] used for significance levels         |
+---------------------------------------------+
|Variable | Coefficient  | Standard Error |b/St.Er.|P[|Z|>z] | Mean of X|
 Constant   1019.32680      411.177156       2.479   .0132
 REALDPI     .67342731      .03972593       16.952   .0000    3341.47598
 RHO         .99878181      .00346332      288.389   .0000
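The AR(1) estimates in the output above come from an iterative FGLS procedure of the Cochrane-Orcutt type. A minimal sketch on simulated data (the REALCONS/REALDPI series are not reproduced here; the "true" parameter values below are arbitrary illustrative choices):

```python
# Cochrane-Orcutt style iteration: estimate rho from OLS residuals, then
# re-fit OLS on the quasi-differenced data y_t - rho*y_{t-1}, and repeat.
import numpy as np

rng = np.random.default_rng(2)
T, beta0, beta1, rho_true = 5000, 2.0, 0.9, 0.7   # arbitrary true values

x = rng.standard_normal(T).cumsum() * 0.1 + np.linspace(0.0, 10.0, T)
u = rng.standard_normal(T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho_true * eps[t - 1] + u[t]
y = beta0 + beta1 * x + eps

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(T), x])
b = ols(X, y)                  # starting values from plain OLS
for _ in range(10):
    e = y - X @ b
    rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])   # residual autocorrelation
    ys = y[1:] - rho * y[:-1]                    # quasi-differenced data
    Xs = X[1:] - rho * X[:-1]
    b = ols(Xs, ys)

print(rho, b)  # rho near 0.7, intercept near 2.0, slope near 0.9
```

This mirrors the structure of the reported output: a final value of rho plus coefficients re-estimated on the transformed data.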

Detecting Autocorrelation
- Use the least squares residuals.
- Durbin-Watson: d = Σt=2..T (et - et-1)² / Σt=1..T et² ≈ 2(1 - r), where r is the first-order autocorrelation of the residuals. Assumes normally distributed disturbances and strictly exogenous regressors.
- Variable addition (Godfrey): yt = β′xt + ρεt-1 + ut; use the regression residual et-1 in place of εt-1 and test ρ = 0. Assumes consistency of b.
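The Durbin-Watson statistic is simple to compute from residuals. A sketch contrasting white-noise residuals with heavily autocorrelated ones (simulated; the ρ = 0.95 used below is an arbitrary choice):

```python
# Durbin-Watson: d = sum_{t=2}^T (e_t - e_{t-1})^2 / sum_t e_t^2,
# approximately 2*(1 - r) where r is the first-order autocorrelation.
import numpy as np

def durbin_watson(e):
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(3)
T = 100_000

e_white = rng.standard_normal(T)          # no autocorrelation
print(durbin_watson(e_white))             # near 2

e_ar = np.zeros(T)
u = rng.standard_normal(T)
for t in range(1, T):
    e_ar[t] = 0.95 * e_ar[t - 1] + u[t]   # strong positive autocorrelation
print(durbin_watson(e_ar))                # near 2*(1 - 0.95) = 0.1
```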

A Unit Root?
- How to test for ρ = 1?
- By construction: εt - εt-1 = (ρ - 1)εt-1 + ut
- Test for γ = (ρ - 1) = 0 using regression? Under the null, the variance of the estimator goes to 0 faster than 1/T. A new table is needed; can't use the standard t tables.
- Dickey-Fuller tests
- Unit roots in economic data. (Are there?)
- Nonstationary series; implications for conventional analysis.
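The nonstandard distribution is easy to see by simulation: under ρ = 1, the t ratio for γ is centered well below zero instead of at zero, which is why the Dickey-Fuller tables are needed. A sketch (the sample size and replication count are arbitrary):

```python
# Simulate the t-ratio for gamma in (y_t - y_{t-1}) = gamma*y_{t-1} + u_t
# when the true process is a random walk (rho = 1, gamma = 0).
import numpy as np

rng = np.random.default_rng(4)
T, reps = 200, 2000
tstats = np.empty(reps)
for r in range(reps):
    y = np.cumsum(rng.standard_normal(T))   # random walk
    dy, ylag = np.diff(y), y[:-1]
    g = (ylag @ dy) / (ylag @ ylag)         # OLS slope, no constant
    resid = dy - g * ylag
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (ylag @ ylag))
    tstats[r] = g / se

print(np.mean(tstats))  # well below 0, unlike an ordinary t statistic
```

Rejecting whenever the t ratio is below the usual -1.645 would reject far too often; the Dickey-Fuller critical values are considerably further left.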

Reinterpreting Autocorrelation

Integrated Processes
- A series is integrated of order P, written I(P), when the P-th differenced series is stationary.
- Stationary series are I(0).
- Trending series are often I(1); then Δyt = yt - yt-1 is I(0). [Most macroeconomic data series.]
- Accelerating series might be I(2); then Δ²yt = (yt - yt-1) - (yt-1 - yt-2) is I(0). [Money stock in hyperinflationary economies. It is difficult to find many I(2) applications in economics.]
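Differencing in action, on simulated data (the drift and sample size are arbitrary choices, not from the slides):

```python
# An I(1) series (random walk with drift) has a variance that grows with
# the sample, while its first difference is stationary. Differencing an
# I(2) series twice does the same job.
import numpy as np

rng = np.random.default_rng(5)
T = 10_000
rw = np.cumsum(0.1 + rng.standard_normal(T))  # I(1): random walk with drift
i2 = np.cumsum(rw)                            # I(2): integrated twice

d1 = np.diff(rw)        # I(0): constant drift plus white noise
d2 = np.diff(i2, n=2)   # I(0) as well

# Sample variance of the level keeps growing as the window lengthens;
# the variance of the differenced series does not.
print(np.var(rw[: T // 2]), np.var(rw))
print(np.var(d1[: T // 2]), np.var(d1))
```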

Cointegration: Real DPI and Real Consumption

Cointegration – Divergent Series?

Cointegration
- x(t) and y(t) are obviously I(1).
- It looks like any linear combination of x(t) and y(t) will also be I(1).
- Does a model y(t) = βx(t) + u(t), where u(t) is I(0), make any sense? How can u(t) be I(0)?
- In fact, there is a linear combination, [1, -β], that is I(0).
- Example: y(t) = .1*t + noise, x(t) = .2*t + noise. Then y(t) and x(t) have a common trend, and y(t) and x(t) are cointegrated.
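The slide's example can be simulated directly: with y(t) = .1t + noise and x(t) = .2t + noise, the combination y - .5x removes the common trend, so [1, -.5] is the cointegrating vector here:

```python
# Two trending series sharing a common trend, as in the slide's example.
# Each series wanders ever farther from zero, but y - 0.5*x is just noise.
import numpy as np

rng = np.random.default_rng(6)
T = 5000
t = np.arange(T)
y = 0.1 * t + rng.standard_normal(T)
x = 0.2 * t + rng.standard_normal(T)

z = y - 0.5 * x   # the common trend cancels; z is stationary
print(np.var(y), np.var(x), np.var(z))  # huge, huge, small
```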

Cointegration and I(0) Residuals