Why does the autocorrelation matter when making inferences?


Why does the autocorrelation matter when making inferences? Consider estimation of the mean of a stationary series, E[X(t)] = m for all t. If X(1), ..., X(n) are iid, what is the sampling distribution of the estimator of m, namely the sample mean? Spring 2005, K. Ensor, STAT 421
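In the iid case the sample mean has variance Var(X(t))/n, which a short simulation makes concrete; a minimal sketch in Python (NumPy assumed; the sample size, replication count, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 100, 20_000
sigma = 1.0

# Sampling distribution of the sample mean for iid data:
# draw many independent samples of size n and look at the
# spread of the resulting sample means.
means = rng.normal(0.0, sigma, size=(reps, n)).mean(axis=1)

# The variance of the means should be close to sigma**2 / n.
print(np.var(means))
```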

Why does the autocorrelation matter? What if X(t) has the following structure (an autoregressive model of order 1, AR(1)): X(t) - m = a(X(t-1) - m) + e(t). Then Corr(X(t), X(t+h)) = a^|h| for all h, and the variance of the sample mean is approximately Var(X̄) ≈ [(1 + a)/(1 - a)] Var(X(t))/n. Spring 2005, K. Ensor, STAT 421
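This variance inflation can be checked by simulation; a sketch under the slide's AR(1) assumptions (NumPy assumed; unit-variance innovations, and the burn-in length and seed are my choices — the large-n approximation and the simulated variance should agree to within roughly ten percent):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, a = 100, 20_000, 0.9

# Simulate many AR(1) series X(t) = a*X(t-1) + e(t) (mean m = 0,
# unit-variance e) and compare the variance of the sample mean
# with the approximation on the slide.
e = rng.normal(size=(reps, n + 200))   # extra 200 steps as burn-in
x = np.zeros_like(e)
for t in range(1, e.shape[1]):
    x[:, t] = a * x[:, t - 1] + e[:, t]
x = x[:, 200:]                         # drop burn-in

var_mean = np.var(x.mean(axis=1))
var_x = 1.0 / (1.0 - a**2)             # stationary Var(X(t)) for unit-variance e
approx = (1 + a) / (1 - a) * var_x / n # the slide's approximation

print(var_mean, approx)
```

Note that with a = 0.9 the variance of the sample mean is more than an order of magnitude larger than the iid value Var(X(t))/n, which is the point of the slide.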

Comparing the sample sizes Let m denote the number of iid observations and n the number of correlated observations. Setting the variances of the two sample means equal and solving for m as a function of n yields m = n(1 - a)/(1 + a). For n = 100 and a = 0.9, m ≈ 5 iid observations. If n = 100 and a = -0.9, the equivalent number of iid observations is 1900. For a = 0.5 and a = -0.5 (the correlation at lag 1), the equivalent sample sizes are about 33 and 300, respectively. Spring 2005, K. Ensor, STAT 421
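The conversion above is a one-line formula; a small sketch reproducing the slide's numbers (the function name is mine):

```python
def equivalent_iid_n(n, a):
    """Number of iid observations whose sample mean has the same
    variance as that of n AR(1) observations with lag-1 correlation a."""
    return n * (1 - a) / (1 + a)

# The four cases discussed on the slide, with n = 100.
for a in (0.9, -0.9, 0.5, -0.5):
    print(a, equivalent_iid_n(100, a))
```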

Why? Why does the autocorrelation make such a big difference in our ability to estimate the mean? With positive correlation, neighboring observations carry largely overlapping information, so each new observation tells us less about m than an independent draw would; with negative correlation, successive observations partially cancel each other's errors, so the sample mean settles down faster than in the iid case. The same arguments hold for other mean functions of the process, or for other functions of the process we want to estimate. Spring 2005, K. Ensor, STAT 421

Summary A time series is a sequentially observed series exhibiting correlation between the observations. The autocorrelation, partial autocorrelation and cross-correlation functions are measures of this correlation. This dependence structure, along with proper assumptions, allows us to forecast the future of the process. Correct inference requires incorporating knowledge of the dependence structure. Spring 2005, K. Ensor, STAT 421
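The sample autocorrelation function mentioned in the summary can be computed directly from the usual estimator r(h) = c(h)/c(0); a minimal sketch (NumPy assumed; the function name and the AR(1) test series are mine — for an AR(1) process the estimated ACF should decay roughly like a^h):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation r(h) = c(h)/c(0), where
    c(h) = (1/n) * sum_t (x_t - xbar)(x_{t+h} - xbar)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    c0 = np.dot(xc, xc) / n
    return np.array([np.dot(xc[:n - h], xc[h:]) / (n * c0)
                     for h in range(max_lag + 1)])

# Check against an AR(1) series with a = 0.9, whose theoretical
# ACF at lags 0..3 is 1, 0.9, 0.81, 0.729.
rng = np.random.default_rng(2)
a, n = 0.9, 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + rng.normal()

print(sample_acf(x, 3))
```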