Bristol MSc Time Series Econometrics, Spring 2015
Univariate time series processes, moments, stationarity

Overview
Moving average processes
Autoregressive processes
MA representation of autoregressive processes
Computing first and second moments: means, variances, autocovariances, autocorrelations
Stationarity, strong and weak; ergodicity
The lag operator, lag polynomials, invertibility
Mostly from Hamilton (1994), but see also Cochrane's monograph on time series.

Aim
These univariate concepts are needed in multivariate analysis.
Moment calculation is a building block in forming the likelihood of time series data, and therefore in estimation.
These are foundational tools in descriptive time series modelling.

Two notions of the mean in time series
1. Imagine many computers simulating a series in parallel. If at date t we took an average across all of them, what would that average converge to as we made the number I of these computers large?
2. Suppose we used one computer to simulate a time series process. What would the average of all T observations converge to as T got very large?
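In standard notation (the slide's own formulas did not survive transcription), the two notions are:

\frac{1}{I}\sum_{i=1}^{I} y_t^{(i)} \to E[Y_t] \text{ as } I \to \infty \quad \text{(ensemble, or cross-section, mean)}

\frac{1}{T}\sum_{t=1}^{T} y_t \to \mu \text{ as } T \to \infty \quad \text{(time average, when the process is ergodic for the mean)}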

Variance, autocovariance
The variance.
The general autocovariance, which nests the variance.
Related to the notion of covariance in general multivariate analysis (i.e. not just time series).
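The standard definitions (reconstructed; the slide's equations were not transcribed):

\gamma_0 = E[(Y_t - \mu)^2] \quad \text{(variance)}
\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] \quad \text{(autocovariance of order } j\text{; } j = 0 \text{ gives the variance)}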

Autocorrelation, correlation
The autocorrelation of order j is the autocovariance of order j divided by the variance.
Autocorrelation comes from the definition and computation of the general notion of correlation in multivariate, not necessarily time-series, analysis.
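In symbols:

\rho_j = \gamma_j / \gamma_0, \quad \text{with } \rho_0 = 1 \text{ and } |\rho_j| \le 1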

Moving average processes
First order MA process, 'MA(1)': mu and theta are parameters, e is a white noise shock.
Computing the mean and variance of an MA(1).
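A reconstruction in standard notation, with e_t white noise with variance \sigma^2:

Y_t = \mu + e_t + \theta e_{t-1}
E[Y_t] = \mu + E[e_t] + \theta E[e_{t-1}] = \mu
\gamma_0 = E[(e_t + \theta e_{t-1})^2] = (1 + \theta^2)\sigma^2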

White noise
The cross-section average of the shocks is zero.
The variance is some constant.
No 'correlation' across different units.
Gaussian white noise if, in addition, the shocks are normally distributed.
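In symbols:

E[e_t] = 0, \quad E[e_t^2] = \sigma^2, \quad E[e_t e_\tau] = 0 \text{ for } t \ne \tau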

1st autocovariance of an MA(1)
Higher order autocovariances of an MA(1) are 0. It is an exercise to explain why this is.
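The calculation the slide presumably displayed:

\gamma_1 = E[(e_t + \theta e_{t-1})(e_{t-1} + \theta e_{t-2})] = \theta\sigma^2

For j > 1, no e terms overlap across the two brackets, so \gamma_j = 0.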

Higher order MA processes
MA(2).
MA(n).
And we can have infinite order MA processes, referred to as MA(inf).
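In the same notation:

MA(2): Y_t = \mu + e_t + \theta_1 e_{t-1} + \theta_2 e_{t-2}
MA(n): Y_t = \mu + e_t + \theta_1 e_{t-1} + \cdots + \theta_n e_{t-n}
MA(inf): Y_t = \mu + \sum_{j=0}^{\infty} \theta_j e_{t-j}, \quad \theta_0 = 1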

Why a 'moving average' process?
The RHS is an 'average' [actually a weighted sum].
And it is a sum whose coverage, or window, 'moves' as the time index grows.

Stationarity, ergodicity
Weak or covariance stationarity: the mean and autocovariances are independent of t.
Strong stationarity: the joint density of elements of the sequence depends not on t, only on the gaps between the elements.
Ergodicity: convergence of the 'time-series' average to the 'cross-section' average.
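Weak stationarity, stated formally:

E[Y_t] = \mu \text{ for all } t, \qquad E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j \text{ for all } t \text{ and each } j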

Cross-sectional and time series stationarity
y_t = rho*y_{t-1} + sd*e_t; rho = 0.8, sd = 1
Top panel: variance of outturns ACROSS simulations.
Bottom panel: rolling variance OVER TIME for one simulation.

Cross-sectional and time-series non-stationarity
y_t = rho*y_{t-1} + sd*e_t; rho = 1.002, sd = 1
Coefficient just over unity, but the cross-sectional variance is exploding...
And the rolling time-series variance is not constant either.

Matlab code to simulate ARs, compute and plot cross-sectional and time-series variances

%script to demonstrate non-stationarity in an AR(1) and the time-series /
%cross-sectional notions of variance
clear all;                   %ensures memory doesn't carry forward errors from runs of old versions
tsample=1000;                %length of time series to simulate
mcsample=50;                 %number of time series in our monte carlo
rho=1.002;                   %autoregressive parameter
sd=1;                        %standard deviation of shocks
shocks=randn(tsample,mcsample);
y=zeros(tsample,mcsample);   %store our simulated data here
csvar=zeros(tsample-1,1);    %store cross-sectional variances here
tsvar=zeros(tsample-1,1);    %store rolling time-series variances here
%simulate mcsample independent AR(1) paths
for i=1:mcsample
    for j=2:tsample
        y(j,i)=rho*y(j-1,i)+sd*shocks(j,i);
    end
end
%calculate cross-sectional variances, across paths at each date
for i=2:tsample
    csvar(i-1)=var(y(i,:));
end
%calculate rolling time-series variances for the first path
for j=2:tsample
    tsvar(j-1)=var(y(1:j,1));
end
%chart results
figure
subplot(2,1,1)
plot(csvar)
title('Cross-sectional variances, rho=1.002')
subplot(2,1,2)
plot(tsvar)
title('Time-series variances, rho=1.002')

AR(1), MA(1)
[Figure: simulated AR(1) and MA(1) series; initial shock = 0, theta = 0.7]

Matlab code to simulate MA(1), AR(1)
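The code on this slide was not transcribed. A minimal sketch of what such a script might look like, using theta = 0.7 from the previous slide; the AR coefficient phi = 0.7 and the sample length are assumptions for illustration:

%sketch: simulate and plot an AR(1) and an MA(1) driven by the same shocks
tsample = 200;          %length of simulated series (assumed)
phi   = 0.7;            %AR(1) coefficient (assumed)
theta = 0.7;            %MA(1) coefficient, as on the previous slide
e = randn(tsample,1);   %Gaussian white noise shocks

yar = zeros(tsample,1); %AR(1): y_t = phi*y_{t-1} + e_t, starting from zero
yma = zeros(tsample,1); %MA(1): y_t = e_t + theta*e_{t-1}
for t = 2:tsample
    yar(t) = phi*yar(t-1) + e(t);
    yma(t) = e(t) + theta*e(t-1);
end

figure
plot([yar yma])
legend('AR(1)','MA(1)')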

AR and ARMA processes
AR(1).
AR(2).
ARMA(1,1).
Which process you use will depend on whether you have economics/theory to guide you, or on statistical criteria.
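In equations:

AR(1): Y_t = c + \phi Y_{t-1} + e_t
AR(2): Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t
ARMA(1,1): Y_t = c + \phi Y_{t-1} + e_t + \theta e_{t-1}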

MA representation of an AR(1)
Derive the MA rep by repeatedly substituting out for lagged Y using the AR(1) form.
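The substitution, written out (a reconstruction of the slide's derivation):

Y_t = c + \phi Y_{t-1} + e_t
    = c + \phi(c + \phi Y_{t-2} + e_{t-1}) + e_t
    = c(1 + \phi) + \phi^2 Y_{t-2} + e_t + \phi e_{t-1}
    = c(1 + \phi + \cdots + \phi^{k-1}) + \phi^k Y_{t-k} + \sum_{j=0}^{k-1} \phi^j e_{t-j}

With |\phi| < 1, the term \phi^k Y_{t-k} vanishes as k \to \infty, leaving Y_t = c/(1-\phi) + \sum_{j=0}^{\infty} \phi^j e_{t-j}.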

MA(inf) representation of an AR(1)
Exists provided |phi| < 1.
Shows that for a stationary AR(1), we can view today's Y as the sum of the infinite sequence of past shocks.
Note that the imprint of past shocks on today is smaller the further back in time they happened.
And that is true because of the dampening implied by |phi| < 1.

Impulse response function for an AR(1)
Start from zero. The effect of a shock today is the shock itself.
The effect of that shock tomorrow, in period t+1. And then propagated out another period...
The IRF asks: what is the effect of a shock (an impulse) at a particular horizon in the future?
Note the relationship with the MA(inf) rep of an AR(1).
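In symbols, for a shock e_t hitting the AR(1) at date t:

\partial Y_t/\partial e_t = 1, \quad \partial Y_{t+1}/\partial e_t = \phi, \quad \partial Y_{t+2}/\partial e_t = \phi^2, \quad \text{and in general } \partial Y_{t+h}/\partial e_t = \phi^h

These are exactly the coefficients of the MA(inf) representation.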

IRF for an AR(1): an example
Suppose phi = 0.8, c = 0, e_0 = 1.
e_0 = 1 is what we would often take as a standardised shock size to illustrate the shape of the IRF for an estimated time series process. Or we might take a shock of size 1 standard deviation.
Note how the IRF for the stationary AR(1) is monotonic [always falling] and dies out.
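This follows directly from the MA(inf) coefficients: with \phi = 0.8, the IRF at horizons h = 0, 1, 2, 3, ... is 0.8^h, i.e. 1, 0.8, 0.64, 0.512, ...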

The forecast in an AR(1)
The ('time series') expectation of Y_1 given information at date 0.
The forecast at some horizon h.
The forecast error we will record when period h comes along and we observe the data we forecast.
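In symbols (reconstructed):

E_0[Y_1] = c + \phi Y_0
E_0[Y_h] = c(1 + \phi + \cdots + \phi^{h-1}) + \phi^h Y_0
\text{forecast error at horizon } h: \; Y_h - E_0[Y_h]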

Forecast errors in an AR(1)
Partially construct the MA rep of an outturn for Y at horizon h in the future.
We see that the forecast error at horizon h is a moving average of the shocks that hit between now and h.
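Writing out the outturn and subtracting the forecast:

Y_h = c(1 + \phi + \cdots + \phi^{h-1}) + \phi^h Y_0 + \sum_{j=0}^{h-1} \phi^j e_{h-j}
Y_h - E_0[Y_h] = \sum_{j=0}^{h-1} \phi^j e_{h-j}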

Forecast error analysis
Armed with our analytical time series forecast errors...
We can compute their expectation.
We can compare the expectation to the outturn. [Are the forecasts biased?]
We can compute the expected autocorrelation of the errors. Their variance...
Connection with the empirical literature on rational expectations and survey/professional-forecaster measures of inflation expectations.
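For example, the forecast error above has expectation zero (each e has mean zero) and variance

\sigma^2 \sum_{j=0}^{h-1} \phi^{2j} = \sigma^2 \, \frac{1 - \phi^{2h}}{1 - \phi^2}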

VAR(1) representation of an AR(2)
The AR(2), and the VAR(1) representation of the AR(2). The first line has the 'meat'; the second line is just an identity.
Bold type is sometimes used to denote matrices.
Why do this? Certain formulae for IRFs, standard errors, or forecast error variances are easily derivable for first order models. So get the higher order model into first order form and then proceed...
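The standard companion form (reconstructed):

\begin{bmatrix} Y_t \\ Y_{t-1} \end{bmatrix} = \begin{bmatrix} c \\ 0 \end{bmatrix} + \begin{bmatrix} \phi_1 & \phi_2 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} Y_{t-1} \\ Y_{t-2} \end{bmatrix} + \begin{bmatrix} e_t \\ 0 \end{bmatrix}

The first row is the AR(2); the second is the identity Y_{t-1} = Y_{t-1}.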

The lag operator
The lag operator shifts the time subscript backwards or, if we write its inverse, forwards.
We can use it to express time series processes like AR models differently.
The lag operator is commutative with multiplication.
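In symbols:

L Y_t = Y_{t-1}, \quad L^2 Y_t = Y_{t-2}, \quad L^{-1} Y_t = Y_{t+1}

so the AR(1) Y_t = c + \phi Y_{t-1} + e_t can be written (1 - \phi L) Y_t = c + e_t.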

Rediscovering the MA(inf) representation of an AR(1) with the lag operator
Operate on both sides of the lag-operator form of the AR(1) with a compound operator, then expand the product on the LHS of the resulting equation.
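A reconstruction of the steps:

(1 + \phi L + \phi^2 L^2 + \cdots + \phi^t L^t)(1 - \phi L) Y_t = (1 + \phi L + \cdots + \phi^t L^t)(c + e_t)

and the compound operator on the LHS telescopes:

(1 + \phi L + \cdots + \phi^t L^t)(1 - \phi L) = 1 - \phi^{t+1} L^{t+1}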

Rediscovering... ctd
The LHS of the above written explicitly, without lag operators.
Note that as t goes to infinity, we are left with just Y_t on the LHS.
So with the aid of the lag operator, we have rediscovered the MA(inf) representation of the AR(1).
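Explicitly (under the same reconstruction):

Y_t - \phi^{t+1} Y_{-1} = c(1 + \phi + \cdots + \phi^t) + \sum_{j=0}^{t} \phi^j e_{t-j}

and letting t \to \infty with |\phi| < 1 gives Y_t = c/(1-\phi) + \sum_{j=0}^{\infty} \phi^j e_{t-j}.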

Lag operators and invertibility of an AR(1)
This is what we have established, implying that these operators are approximately inverses of one another.
Note this property of (any) inverse operator: '1' here is the identity operator.
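That is, for |\phi| < 1:

(1 - \phi L)^{-1} = \sum_{j=0}^{\infty} \phi^j L^j, \qquad (1 - \phi L)^{-1}(1 - \phi L) = 1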

Invertibility, ctd...
Provided |phi| < 1, we can operate on both sides of the lag-operator form of the AR(1) with the inverse of the operator (1 - phi L) to get the MA(inf) form.
This is what is referred to as the 'invertibility' property of an AR(1) process.
Analogous properties are deduced for multivariate vector autoregressive (VAR) processes too.
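Applying the inverse (and noting Lc = c for a constant):

Y_t = (1 - \phi L)^{-1}(c + e_t) = \frac{c}{1 - \phi} + \sum_{j=0}^{\infty} \phi^j e_{t-j}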

Computing the mean and variance of an AR(2)
More involved than for the AR(1).
Introduces likelihood computation for more complex processes.
Introduces the recursive nature of autocovariances and its usefulness.
NB: it will simply be an exercise to do this for an AR(1) process.

Mean of an AR(2)
Here is an AR(2) process. Start with calculating the mean.
To get the mean, simply take expectations of both sides.
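The calculation (reconstructed):

Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t

Taking expectations, and using E[Y_t] = E[Y_{t-1}] = E[Y_{t-2}] = \mu under stationarity:

\mu = c + \phi_1 \mu + \phi_2 \mu \quad \Rightarrow \quad \mu = \frac{c}{1 - \phi_1 - \phi_2}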

Variance of an AR(2)
Rewrite our AR(2) using the substitution c = mu(1 - phi_1 - phi_2) for the constant term.
This is what we get after making the substitution for c.
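The AR(2) in deviations from the mean:

Y_t - \mu = \phi_1 (Y_{t-1} - \mu) + \phi_2 (Y_{t-2} - \mu) + e_t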

Variance of an AR(2)
Multiply by Y_t - mu and take expectations, and we get a recursive equation in the autocovariances, which we can denote gamma_j.
The sigma^2 term comes from the expectation of the cross-product of Y_t - mu with the shock.
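The resulting equation:

\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2

using E[(Y_t - \mu)e_t] = \sigma^2: the only piece of Y_t correlated with the date-t shock is e_t itself.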

Variance of an AR(2)
General form of the recursive autocovariance equation, formed by multiplying not by Y_t - mu, but by Y_{t-j} - mu, and then taking expectations.
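That is, for j >= 1 (where E[(Y_{t-j} - \mu)e_t] = 0):

\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2}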

Variance of an AR(2)
Divide both sides by the variance, the 0th order autocovariance, to get an equation in autocorrelations.
Set j = 1 to get the first result below, noting that rho_0 = 1 and rho_1 = rho_{-1}.
Set j = 2, and the recursive equation in autocorrelations implies the second, which we can rewrite by substituting in the expression for rho_1.
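The steps in symbols:

\rho_j = \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2}
j = 1: \quad \rho_1 = \phi_1 + \phi_2 \rho_1 \;\Rightarrow\; \rho_1 = \frac{\phi_1}{1 - \phi_2}
j = 2: \quad \rho_2 = \phi_1 \rho_1 + \phi_2 = \frac{\phi_1^2}{1 - \phi_2} + \phi_2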

Variance of an AR(2)
Rewrite the autocovariances on the RHS in terms of autocorrelations.
Then substitute in the autocorrelations which we found on the last slide...
And rearrange as an equation in gamma_0, the variance, which is what we were trying to solve for. Done!
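Putting the pieces together:

\gamma_0 = \phi_1 \rho_1 \gamma_0 + \phi_2 \rho_2 \gamma_0 + \sigma^2 \quad \Rightarrow \quad \gamma_0 = \frac{\sigma^2}{1 - \phi_1 \rho_1 - \phi_2 \rho_2}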

Recap
Moving average processes
Autoregressive processes
ARMA processes
Methods for computing first and second moments of these
Impulse response functions
Forecasts, forecast errors
MA(infinity) representation of an AR(1)
Lag operators, polynomials in the lag operator