Econ 427 Lecture 11 Slides: Moving Average Models (Byron Gangnes)



Review: Wold's Theorem. Wold's Theorem says that we can represent any covariance-stationary time series process as an infinite distributed lag of white noise: $y_t = \sum_{i=0}^{\infty} b_i \varepsilon_{t-i}$, where $\varepsilon_t \sim WN(0, \sigma^2)$.
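As a minimal numerical sketch of this idea (my own illustration, not from the slides): we can build a covariance-stationary series by applying a long but finite distributed lag of white noise, approximating the infinite Wold form. The weights $b_i = 0.9^i$ and the truncation point are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 500, 200                      # sample size and truncation point of the "infinite" lag
eps = rng.normal(0.0, 1.0, T + K)    # white noise innovations
b = 0.9 ** np.arange(K + 1)          # illustrative square-summable Wold weights b_i

# y_t = sum_{i=0}^{K} b_i * eps_{t-i}: a truncated version of the Wold representation
y = np.array([np.dot(b, eps[t + K - np.arange(K + 1)]) for t in range(T)])
print(y.mean(), y.var())             # roughly 0 and sum(b_i^2) * sigma^2
```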

Finite approximations. The problem is that we do not have infinitely many data points to use in estimating infinitely many coefficients! We want to find a good way to approximate the unknown (and unknowable) true model that drives real-world data. –The key to this is to find a parsimonious, yet accurate, approximation. We will look at two building blocks of such models now, the MA and AR models, as well as their combination. We start by examining the dynamic patterns generated by the basic types of models.

Moving Average (MA) models. The MA(1) model: $y_t = \varepsilon_t + \theta \varepsilon_{t-1}$, where $\varepsilon_t \sim WN(0, \sigma^2)$. Intuition? –Past shocks (innovations) in the series feed into the succeeding period.
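To make that intuition concrete, here is a tiny sketch (my own illustration, with $\theta = 0.7$ as an assumed value) showing how each observation combines the current shock with last period's shock:

```python
import numpy as np

theta = 0.7                          # assumed MA(1) coefficient for illustration
rng = np.random.default_rng(1)
eps = rng.normal(size=6)             # white noise shocks eps_0, ..., eps_5

for t in range(1, 6):
    y_t = eps[t] + theta * eps[t - 1]          # MA(1): y_t = eps_t + theta * eps_{t-1}
    print(f"t={t}: eps_t={eps[t]:+.2f}, theta*eps_(t-1)={theta*eps[t-1]:+.2f}, y_t={y_t:+.2f}")
```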

Properties of an MA(1) series. Unconditional mean: $E(y_t) = E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = 0$. Unconditional variance: $\mathrm{var}(y_t) = \mathrm{var}(\varepsilon_t) + \theta^2 \mathrm{var}(\varepsilon_{t-1}) = \sigma^2 (1 + \theta^2)$. The mean result uses two statistics facts: 1) for two random variables x and y, E(x+y) = E(x) + E(y); 2) if a is a constant, E(ax) = aE(x). The variance result uses: 1) var(x+y) = var(x) + var(y), if x and y are uncorrelated; 2) var(ax) = a²var(x).
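As a quick check of these formulas (a simulation sketch, assuming $\theta = 0.7$ and $\sigma = 1$), the sample mean and variance of a long simulated MA(1) should be close to 0 and $\sigma^2(1 + \theta^2)$:

```python
import numpy as np

theta, sigma, T = 0.7, 1.0, 200_000
rng = np.random.default_rng(2)
eps = rng.normal(0.0, sigma, T + 1)
y = eps[1:] + theta * eps[:-1]       # simulated MA(1) series

print("sample mean:", y.mean())                     # ~ 0
print("sample var :", y.var())                      # ~ sigma^2 * (1 + theta^2) = 1.49
print("theory var :", sigma**2 * (1 + theta**2))
```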

Properties of an MA(1) series. Conditional mean (the mean given that we are at a particular time period t, i.e. given an "information set" $\Omega_{t-1} = \{\varepsilon_{t-1}, \varepsilon_{t-2}, \ldots\}$): $E(y_t \mid \Omega_{t-1}) = \theta \varepsilon_{t-1}$. We read this as "the expected value of $y_t$ conditional on the information set." That just means we want the expected value taking into account that we know the values of the past epsilons.

Properties of an MA(1) series. Conditional variance: $\mathrm{var}(y_t \mid \Omega_{t-1}) = E[(y_t - E(y_t \mid \Omega_{t-1}))^2 \mid \Omega_{t-1}] = E(\varepsilon_t^2) = \sigma^2$. To see this, plug in the MA(1) expression for $y_t$ and the expression for the expected value of $y_t$ conditional on the information set (which we just calculated). Notice they differ only by $\varepsilon_t$, the innovation in y that we have not yet observed.
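A small simulation sketch of both conditional moments (my own illustration, with assumed values $\theta = 0.7$, $\sigma = 1$, and $\varepsilon_{t-1} = 1.5$): condition on a known value of $\varepsilon_{t-1}$, draw many possible values of $\varepsilon_t$, and check that the mean of $y_t$ is $\theta\varepsilon_{t-1}$ while its variance is $\sigma^2$.

```python
import numpy as np

theta, sigma = 0.7, 1.0
eps_lag = 1.5                               # a known past innovation (the information set)
rng = np.random.default_rng(3)
eps_t = rng.normal(0.0, sigma, 100_000)     # many possible draws of the unobserved eps_t

y_t = eps_t + theta * eps_lag               # MA(1) values consistent with the information set
print("conditional mean:", y_t.mean(), "vs theta*eps_lag =", theta * eps_lag)   # ~ 1.05
print("conditional var :", y_t.var(), "vs sigma^2 =", sigma**2)                 # ~ 1.0
```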

Properties of an MA(1) series. Autocovariance function: $\gamma(1) = \theta \sigma^2$, and $\gamma(\tau) = 0$ for $\tau > 1$. Let's look at how that is derived…

Autocovariance function. Here is the derivation of that: $\gamma(\tau) = E(y_t\, y_{t-\tau}) = E[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-\tau} + \theta \varepsilon_{t-\tau-1})]$. At $\tau = 1$ the only product with a nonzero expectation is $\theta E(\varepsilon_{t-1}^2) = \theta \sigma^2$. When $\tau > 1$, all terms are zero. Check it!
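The derivation can also be checked numerically (a sketch, again assuming $\theta = 0.7$ and $\sigma = 1$): the sample autocovariance at lag 1 should be near $\theta\sigma^2$, and near zero at longer lags.

```python
import numpy as np

theta, sigma, T = 0.7, 1.0, 200_000
rng = np.random.default_rng(4)
eps = rng.normal(0.0, sigma, T + 1)
y = eps[1:] + theta * eps[:-1]       # simulated MA(1)

def autocov(x, tau):
    """Sample autocovariance at lag tau (about the sample mean)."""
    x = x - x.mean()
    return np.mean(x[tau:] * x[:-tau]) if tau > 0 else np.mean(x * x)

print("gamma(1):", autocov(y, 1), "vs theta*sigma^2 =", theta * sigma**2)  # ~ 0.7
print("gamma(2):", autocov(y, 2), "vs 0")                                   # ~ 0
print("gamma(3):", autocov(y, 3), "vs 0")                                   # ~ 0
```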

Properties of an MA(1) series. The autocorrelation function is just this divided by the variance: $\rho(1) = \theta \sigma^2 / [\sigma^2 (1 + \theta^2)] = \theta / (1 + \theta^2)$, and $\rho(\tau) = 0$ for $\tau > 1$.
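A sketch comparing this theoretical value with the sample ACF of a simulated MA(1) (assumed $\theta = 0.7$, so $\rho(1) = 0.7/1.49 \approx 0.47$):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

theta, T = 0.7, 200_000
rng = np.random.default_rng(5)
eps = rng.normal(size=T + 1)
y = eps[1:] + theta * eps[:-1]                # simulated MA(1)

rho1_theory = theta / (1 + theta**2)          # theoretical rho(1) = 0.4698...
sample_acf = acf(y, nlags=3, fft=True)        # sample autocorrelations at lags 0..3
print("theoretical rho(1):", rho1_theory)
print("sample ACF lags 0-3:", sample_acf)     # ~ [1, 0.47, 0, 0]
```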

Can $y_t$ be written another way? Under certain circumstances, we can write an MA process in an autoregressive representation. –The series can be written as a function of past lags of itself (plus a current innovation). Why does this form have intuitive appeal? –Today's value of the variable is related explicitly to past values of the variable.

Invertibility. In this case, we say that the MA process is invertible. This will occur when $|\theta| < 1$. See the book for a demonstration of how this works and why this is the key criterion.
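A sketch of what invertibility buys us (my own illustration, assuming $\theta = 0.7$): because $|\theta| < 1$, the unobserved innovations can be recovered recursively from the observed series via $\varepsilon_t = y_t - \theta \varepsilon_{t-1}$, with the effect of the unknown starting value dying out; equivalently, $y_t$ has an AR representation with weights $(-1)^{j+1}\theta^j$ on its own lags.

```python
import numpy as np

theta, T = 0.7, 500
rng = np.random.default_rng(6)
eps = rng.normal(size=T + 1)
y = eps[1:] + theta * eps[:-1]          # MA(1) data; eps[1:] are the true innovations

# Recover innovations recursively: eps_hat_t = y_t - theta * eps_hat_{t-1}
eps_hat = np.zeros(T)
for t in range(T):
    eps_hat[t] = y[t] - theta * (eps_hat[t - 1] if t > 0 else 0.0)

# The startup error decays like theta^t, so later recovered innovations match the true ones
print("max recovery error after obs 100:", np.max(np.abs(eps_hat[100:] - eps[1:][100:])))

# First few AR(infinity) weights on y_{t-1}, y_{t-2}, ...: theta, -theta^2, theta^3, ...
ar_weights = [(-1) ** (j + 1) * theta**j for j in range(1, 6)]
print("AR representation weights:", np.round(ar_weights, 3))
```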

What does an MA(1) look like? ma1seriesa = whitenoise + .7*whitenoise(-1)
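The command on the slide appears to be EViews-style syntax. A rough Python equivalent (a sketch, using $\theta = 0.7$ to match the .7 on the slide) that generates and plots a comparable series:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
whitenoise = rng.normal(size=151)
ma1seriesa = whitenoise[1:] + 0.7 * whitenoise[:-1]   # y_t = eps_t + 0.7 * eps_{t-1}

plt.plot(ma1seriesa)
plt.title("Simulated MA(1) series, theta = 0.7")
plt.xlabel("t")
plt.show()
```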

Correlogram from MA(1)
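A sketch of producing such a correlogram (sample ACF) for a simulated MA(1) in Python; only the first autocorrelation should stand out, consistent with the theory above.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

rng = np.random.default_rng(8)
eps = rng.normal(size=501)
ma1seriesa = eps[1:] + 0.7 * eps[:-1]   # simulated MA(1), as on the previous slide

plot_acf(ma1seriesa, lags=12)           # bars outside the bands indicate significant autocorrelation
plt.show()
```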