Part 25: Time Series. Econometrics I, Professor William Greene, Stern School of Business, Department of Economics.



Econometrics I, Part 25 – Time Series

Modeling an Economic Time Series
- Observed y_0, y_1, …, y_t, …
- What is the sample? Random sampling?
- The observation window

Estimators
- Functions of sums of observations
- Law of large numbers? The observations are not independent.
- What does increasing the sample size mean?
- Asymptotic properties? (There are no finite-sample properties.)

Interpreting a Time Series
- Time domain: a process y(t) = a·x(t) + b·y(t-1) + …, with a regression-like approach/interpretation.
- Frequency domain: a sum of terms such as y(t) = Σ_j [a_j cos(ω_j t) + b_j sin(ω_j t)], giving the contribution of different frequencies to the observed series. (In high-frequency data and financial econometrics, "frequency" is used slightly differently.)

For example, …

In parts …

Studying the Frequency Domain
- Cannot identify the number of terms
- Cannot identify frequencies from the time series
- Deconstructing the variance, autocovariances, and autocorrelations: contributions at different frequencies; apparent large weights at different frequencies
- Using Fourier transforms of the data
- Does this provide new information about the series?

Autocorrelation in Regression
- y_t = b·x_t + ε_t, with Cov(ε_t, ε_{t-1}) ≠ 0
- Example: RealCons_t = a + b·RealIncome_t + ε_t (U.S. data, quarterly)

Autocorrelation
- How does it arise? What does it mean?
- Modeling approaches:
- Classical (direct, corrective): estimation that accounts for autocorrelation; inference in the presence of autocorrelation
- Contemporary (structural): model the source; incorporate the time-series aspect in the model

Stationary Time Series
- y_t = b_1 y_{t-1} + b_2 y_{t-2} + … + b_P y_{t-P} + e_t
- Autocovariance: γ_k = Cov[y_t, y_{t-k}]
- Autocorrelation: ρ_k = γ_k / γ_0
- Stationary series: γ_k depends only on k, not on t
- Weak stationarity: E[y_t] is not a function of t; E[y_t · y_{t-s}] is a function only of |t-s|, not of t or s
- Strong stationarity: the joint distribution of [y_t, y_{t-1}, …, y_{t-s}], for any window of length s periods, is not a function of t
- A condition for weak stationarity: the smallest root of the characteristic polynomial 1 - b_1 z - b_2 z² - … - b_P z^P = 0 is greater than one in absolute value, i.e., all roots lie outside the unit circle (roots may be complex)
- Example: y_t = ρ y_{t-1} + e_t. Here 1 - ρz = 0 has root z = 1/ρ, so |z| > 1 ⇒ |ρ| < 1.
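The root condition in the last bullet can be checked numerically. A minimal sketch (the helper name `ar_is_stationary` is my own, not from the slides):

```python
import numpy as np

def ar_is_stationary(b):
    """Stationarity check for y_t = b_1 y_{t-1} + ... + b_P y_{t-P} + e_t:
    all roots of 1 - b_1 z - ... - b_P z^P = 0 must lie outside the
    unit circle."""
    # np.roots expects coefficients ordered from the highest power of z down
    coefs = [-bj for bj in reversed(b)] + [1.0]
    roots = np.roots(coefs)
    return bool(np.all(np.abs(roots) > 1.0)), roots

ok, roots = ar_is_stationary([0.5])   # 1 - 0.5z = 0 -> z = 2: stationary
bad, _ = ar_is_stationary([1.0])      # unit root: z = 1, not stationary
```

For the AR(1) example on the slide this reproduces z = 1/ρ directly.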

Stationary vs. Nonstationary Series

The Lag Operator
- L x_t = x_{t-1}; L² x_t = x_{t-2}; L^P x_t + L^Q x_t = x_{t-P} + x_{t-Q}
- Polynomials in L: y_t = B(L) y_t + e_t, i.e., A(L) y_t = e_t
- Invertibility: y_t = [A(L)]^{-1} e_t

Inverting a Stationary Series
- y_t = ρ y_{t-1} + e_t, so (1 - ρL) y_t = e_t
- y_t = [1 - ρL]^{-1} e_t = e_t + ρ e_{t-1} + ρ² e_{t-2} + …
- Stationary series can be inverted
- Autoregressive vs. moving-average form of the series
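The inversion can be verified numerically: the AR recursion and the truncated MA(∞) sum produce the same series. A minimal sketch with an assumed ρ = 0.6 (my choice, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, T = 0.6, 200
e = rng.standard_normal(T)

# Autoregressive form: y_t = rho * y_{t-1} + e_t, started from y_0 = e_0
y = np.empty(T)
y[0] = e[0]
for t in range(1, T):
    y[t] = rho * y[t - 1] + e[t]

# Moving-average form: y_t = e_t + rho*e_{t-1} + rho^2*e_{t-2} + ...
# truncated at J lags; rho^J is negligible for J = 50
J = 50
y_ma = np.array([sum(rho ** j * e[t - j] for j in range(min(J, t) + 1))
                 for t in range(T)])
```

The two constructions agree up to the (tiny) truncation error of order ρ^(J+1).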

Regression with Autocorrelation
- y_t = x_t′b + e_t, with e_t = ρ e_{t-1} + u_t
- (1 - ρL) e_t = u_t, so e_t = (1 - ρL)^{-1} u_t
- E[e_t] = E[(1 - ρL)^{-1} u_t] = (1 - ρL)^{-1} E[u_t] = 0
- Var[e_t] = (1 + ρ² + ρ⁴ + …) σ_u² = σ_u² / (1 - ρ²)
- Cov[e_t, e_{t-1}] = Cov[ρ e_{t-1} + u_t, e_{t-1}] = ρ Cov[e_{t-1}, e_{t-1}] + Cov[u_t, e_{t-1}] = ρ σ_u² / (1 - ρ²)
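These moment results can be checked by simulation. A minimal sketch with assumed values ρ = 0.5 and σ_u = 1 (my choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
rho, sigma_u, T = 0.5, 1.0, 200_000
u = sigma_u * rng.standard_normal(T)

e = np.empty(T)
e[0] = u[0] / np.sqrt(1 - rho ** 2)   # start from the stationary distribution
for t in range(1, T):
    e[t] = rho * e[t - 1] + u[t]

var_theory = sigma_u ** 2 / (1 - rho ** 2)          # = 4/3 here
cov1_theory = rho * sigma_u ** 2 / (1 - rho ** 2)   # = 2/3 here
cov1_hat = np.mean((e[1:] - e.mean()) * (e[:-1] - e.mean()))
```

The sample variance and first autocovariance of the simulated e_t match the formulas on the slide up to sampling error.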

OLS vs. GLS
- OLS: unbiased? Consistent (except in the presence of a lagged dependent variable), but inefficient.
- GLS: consistent and efficient.

[Computer output: OLS regression of REALCONS on REALDPI, reporting the Durbin-Watson statistic and rho = cor[e, e(-1)]; a Newey-West robust covariance matrix (10 periods); and an AR(1) model e(t) = rho·e(t-1) + u(t) estimated iteratively, with N[0,1] used for significance levels.]

Detecting Autocorrelation
- Use residuals.
- Durbin-Watson: d = Σ_{t=2}^T (e_t - e_{t-1})² / Σ_{t=1}^T e_t². Assumes normally distributed disturbances and strictly exogenous regressors.
- Variable addition (Godfrey): y_t = x_t′β + ρ ε_{t-1} + u_t; use the regression residuals e_t and test ρ = 0. Assumes consistency of b.
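The Durbin-Watson statistic is straightforward to compute from residuals. A minimal numpy sketch (the function name and the simulated example are mine):

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson d = sum_{t=2}^T (e_t - e_{t-1})^2 / sum_{t=1}^T e_t^2.
    d is approximately 2(1 - rho): near 2 for uncorrelated residuals,
    below 2 under positive autocorrelation."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Demonstration on simulated residuals
rng = np.random.default_rng(2)
u = rng.standard_normal(100_000)          # uncorrelated: d near 2
e = np.empty(100_000)
e[0] = u[0]
for t in range(1, 100_000):
    e[t] = 0.8 * e[t - 1] + u[t]          # AR(1) with rho = 0.8: d near 0.4
d_white, d_ar = durbin_watson(u), durbin_watson(e)
```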

A Unit Root?
- How to test for ρ = 1?
- By construction: ε_t - ε_{t-1} = (ρ - 1) ε_{t-1} + u_t
- Test for γ = (ρ - 1) = 0 using regression? The variance goes to 0 faster than 1/T. A new table is needed; the standard t tables can't be used.
- Dickey-Fuller tests
- Unit roots in economic data (are there any?)
- Nonstationary series: implications for conventional analysis
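A minimal sketch of the simplest Dickey-Fuller regression (no constant, no trend), written directly in numpy rather than with a packaged routine; the function name and simulated examples are mine:

```python
import numpy as np

def dickey_fuller_t(y):
    """t-ratio for gamma in  dy_t = gamma * y_{t-1} + u_t  (no constant,
    no trend). Under a unit root this statistic follows the Dickey-Fuller
    distribution, not the t distribution, so it is compared with DF
    critical values (about -1.95 at the 5% level for this case)."""
    y = np.asarray(y, dtype=float)
    dy, ylag = np.diff(y), y[:-1]
    gamma = np.sum(ylag * dy) / np.sum(ylag ** 2)
    resid = dy - gamma * ylag
    s2 = np.sum(resid ** 2) / (len(dy) - 1)
    return gamma / np.sqrt(s2 / np.sum(ylag ** 2))

rng = np.random.default_rng(3)
u = rng.standard_normal(5_000)
t_rw = dickey_fuller_t(np.cumsum(u))   # random walk: fails to reject
ar = np.empty(5_000)
ar[0] = u[0]
for t in range(1, 5_000):
    ar[t] = 0.5 * ar[t - 1] + u[t]
t_ar = dickey_fuller_t(ar)             # stationary AR(1): strongly negative
```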

Reinterpreting Autocorrelation

Integrated Processes
- A series is integrated of order P, I(P), when the P-th differenced series is stationary.
- Stationary series are I(0).
- Trending series are often I(1); then y_t - y_{t-1} = Δy_t is I(0). [Most macroeconomic data series.]
- Accelerating series might be I(2); then (y_t - y_{t-1}) - (y_{t-1} - y_{t-2}) = Δ²y_t is I(0). [Money stock in hyperinflationary economies. Difficult to find many applications in economics.]
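The definitions can be illustrated directly: cumulating white noise produces an I(1) series, cumulating twice produces an I(2) series, and differencing undoes each step exactly. A small numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
e = rng.standard_normal(1_000)          # I(0) white noise

y1 = np.cumsum(e)                       # I(1): a random walk
y2 = np.cumsum(np.cumsum(e))            # I(2): a cumulated random walk

d1 = np.diff(y1)        # first difference of an I(1) series -> I(0)
d2 = np.diff(y2, n=2)   # second difference of an I(2) series -> I(0)
```

Here d1 recovers e (shifted by one observation) and d2 recovers e (shifted by two).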

Cointegration: Real DPI and Real Consumption

Cointegration – Divergent Series?

Cointegration
- x(t) and y(t) are obviously I(1).
- It looks like any linear combination of x(t) and y(t) will also be I(1).
- Does a model y(t) = βx(t) + u(t), where u(t) is I(0), make any sense? How can u(t) be I(0)?
- In fact, there is a linear combination, [1, -β], that is I(0).
- Example: y(t) = 0.1·t + noise, x(t) = 0.2·t + noise. y(t) and x(t) have a common trend, so y(t) and x(t) are cointegrated.
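A simulation in the spirit of the slide: here the common trend is a stochastic I(1) trend rather than the deterministic 0.1t/0.2t example, and β = 0.5 is an assumed value, so this is an illustrative sketch only.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 10_000
trend = np.cumsum(rng.standard_normal(T))   # common I(1) stochastic trend

x = trend + rng.standard_normal(T)          # I(1)
y = 0.5 * trend + rng.standard_normal(T)    # I(1), shares the trend with x

# First-stage (Engle-Granger style) regression of y on x, no constant:
# the cointegrating combination [1, -beta] should leave an I(0) residual
beta = np.sum(x * y) / np.sum(x * x)
u = y - beta * x
```

The estimated β is close to 0.5, and the residual u has a small, stable variance even though x and y both wander.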

Cointegration and I(0) Residuals