Presentation transcript:

Algebra. Let U = ∑_t a(t) Y(t) and V = ∑_t b(t) Y(t). Then
E{U} = c_Y ∑_t a(t)
cov{U, V} = ∑_s ∑_t a(s) b(t) c_YY(s − t)
and U is Gaussian if {Y(t)} is Gaussian.
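A quick Monte Carlo sanity check of the expectation formula, E{U} = c_Y ∑ a(t). The coefficients `a` and mean `c_Y` below are illustrative values, not from the lecture:

```python
import random

a = [1.0, 2.0, 3.0]      # illustrative coefficients a(t)
c_Y = 5.0                # mean of Y(t)
random.seed(0)

reps = 20000
total = 0.0
for _ in range(reps):
    y = [random.gauss(c_Y, 1.0) for _ in a]
    total += sum(at * yt for at, yt in zip(a, y))   # one draw of U
print(abs(total / reps - c_Y * sum(a)) < 0.2)       # E{U} = 5 * 6 = 30
```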

Some useful stochastic models. Purely random / white noise: {Y(t)} i.i.d. (mean often assumed 0).
c_YY(u) = cov{Y(t+u), Y(t)} = σ_Y² if u = 0, = 0 if u ≠ 0
ρ_YY(u) = 1 if u = 0, = 0 if u ≠ 0
A building block.
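The white-noise a.c.f. can be checked on simulated data; `sample_acf` is an assumed helper name, not lecture code:

```python
import random

def sample_acf(y, lag):
    """Sample autocorrelation rho_hat(lag) of the series y."""
    n = len(y)
    m = sum(y) / n
    c0 = sum((v - m) ** 2 for v in y) / n
    cu = sum((y[t + lag] - m) * (y[t] - m) for t in range(n - lag)) / n
    return cu / c0

random.seed(0)
z = [random.gauss(0, 1) for _ in range(5000)]
print(sample_acf(z, 0))              # exactly 1.0 at lag 0
print(abs(sample_acf(z, 1)) < 0.05)  # near 0 at nonzero lags
```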

Random walk. Y(t) = Y(t−1) + Z(t), Y(0) = 0, so Y(t) = ∑_{i=1}^{t} Z(i).
E{Y(t)} = t μ_Z, var{Y(t)} = t σ_Z²
Not stationary, but ΔY(t) = Y(t) − Y(t−1) = Z(t) is.
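A sketch of both claims, under the assumption Z(t) ~ N(0, 1): differencing the walk recovers the increments, and var{Y(t)} grows like t:

```python
import random

random.seed(1)
z = [random.gauss(0, 1) for _ in range(1000)]
y = [0.0]
for zt in z:                          # Y(t) = Y(t-1) + Z(t), Y(0) = 0
    y.append(y[-1] + zt)

# differencing recovers the increments (up to float rounding)
diff = [y[t] - y[t - 1] for t in range(1, len(y))]
recovered = all(abs(d - zt) < 1e-9 for d, zt in zip(diff, z))

# var{Y(t)} = t * sigma_Z^2: sample variance of Y(200) over 400 fresh walks
ends = [sum(random.gauss(0, 1) for _ in range(200)) for _ in range(400)]
m = sum(ends) / len(ends)
var_end = sum((e - m) ** 2 for e in ends) / len(ends)
print(recovered, 140 < var_end < 260)   # variance near t = 200
```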

Moving average, MA(q). Y(t) = β(0)Z(t) + β(1)Z(t−1) + … + β(q)Z(t−q)
If E{Z(t)} = 0, then E{Y(t)} = 0.
c(u) = σ_Z² ∑_{t=0}^{q−u} β(t) β(t+u) for u = 0, 1, …, q; c(u) = 0 for u > q; c(−u) = c(u). Stationary.
MA(1): ρ(u) = 1 for u = 0; = β(1)/(1 + β(1)²) for u = ±1; = 0 otherwise.
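A minimal check of the MA(1) formula, assuming β(0) = 1 and an illustrative β(1) = 0.6: the sample lag-1 autocorrelation should match β(1)/(1 + β(1)²):

```python
import random

beta1 = 0.6
rho1_theory = beta1 / (1 + beta1 ** 2)    # lag-1 ACF formula for MA(1)

random.seed(2)
z = [random.gauss(0, 1) for _ in range(20001)]
y = [z[t] + beta1 * z[t - 1] for t in range(1, len(z))]   # MA(1), beta(0) = 1

n = len(y)
m = sum(y) / n
c0 = sum((v - m) ** 2 for v in y) / n
c1 = sum((y[t + 1] - m) * (y[t] - m) for t in range(n - 1)) / n
print(abs(c1 / c0 - rho1_theory) < 0.03)   # sample ACF matches the formula
```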

Backward shift operator. Recall the translation operator T^u Y(t) = Y(t+u); similarly B^j Y(t) = Y(t−j).
Linear process: Y(t) = ∑_i β_i Z(t−i). Needs a convergence condition, e.g. ∑_i |β_i| < ∞ or ∑_i β_i² < ∞.
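A small sketch (`backshift` is my name, not lecture code): B^j acting on a stored series, plus a numeric check that geometrically decaying weights β_i = α^i satisfy the summability condition:

```python
def backshift(y, j, t):
    """B^j Y(t) = Y(t - j) for a series stored as a list indexed by t."""
    return y[t - j]

y = [10, 11, 12, 13, 14]
print(backshift(y, 2, 4))   # Y(2) = 12

# summability holds for geometrically decaying weights beta_i = alpha**i
alpha = 0.9
partial = sum(abs(alpha) ** i for i in range(1000))
print(abs(partial - 1 / (1 - alpha)) < 1e-6)   # sums to 1/(1 - alpha) = 10
```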

Autoregressive process, AR(p): Y(t) = α(1)Y(t−1) + … + α(p)Y(t−p) + Z(t).
First-order, AR(1): Y(t) = α Y(t−1) + Z(t) (**), which is Markov.
Invertible to a linear process, Y(t) = ∑_{i≥0} α^i Z(t−i), with |α| < 1 needed for convergence in probability / stationarity.
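A sketch of the linear-process representation, with an illustrative α = 0.7: unwinding the AR(1) recursion and truncating the sum ∑ α^i Z(t−i) at a large lag reproduces the recursion almost exactly, because the α^i tail is negligible:

```python
import random

alpha = 0.7
random.seed(3)
z = [random.gauss(0, 1) for _ in range(500)]

y = [0.0]
for zt in z:
    y.append(alpha * y[-1] + zt)     # recursion Y(t) = alpha Y(t-1) + Z(t)

t = len(z) - 1
linear = sum(alpha ** i * z[t - i] for i in range(200))   # truncated linear process
print(abs(linear - y[-1]) < 1e-12)   # alpha**200 tail is negligible
```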

a.c.f. of AR(1): from the previous slide (**), ρ(u) = α^|u|.
p.a.c.f., using normal or linear definitions: corr{Y(t), Y(t−m) | Y(t−1), …, Y(t−m+1)} = 0 for m > p when Y is AR(p).
Proof: via multiple regression.
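A numeric illustration with an assumed α = 0.7: for AR(1), ρ(1) ≈ α, and since ρ(2) = ρ(1)², the lag-2 partial autocorrelation (computed here via one Durbin–Levinson step) is ≈ 0:

```python
import random

def acf(y, lag):
    n = len(y)
    m = sum(y) / n
    def c(u):
        return sum((y[t + u] - m) * (y[t] - m) for t in range(n - u)) / n
    return c(lag) / c(0)

alpha = 0.7
random.seed(4)
y = [0.0]
for _ in range(20000):
    y.append(alpha * y[-1] + random.gauss(0, 1))
y = y[200:]                          # drop burn-in; series is near-stationary

r1, r2 = acf(y, 1), acf(y, 2)
pacf2 = (r2 - r1 ** 2) / (1 - r1 ** 2)   # lag-2 PACF, Durbin-Levinson step
print(abs(r1 - alpha) < 0.03, abs(pacf2) < 0.03)
```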

In the general case: useful for prediction.

Yule–Walker equations for AR(p), sometimes used for estimation: correlate each side of the AR(p) equation with X_{t−k}.
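A sketch of Yule–Walker estimation for an assumed AR(2) with a1 = 0.5, a2 = 0.3: the equations ρ(1) = a1 + a2 ρ(1) and ρ(2) = a1 ρ(1) + a2 form a 2×2 linear system in (a1, a2), solved here from sample autocorrelations:

```python
import random

a1, a2 = 0.5, 0.3                     # true AR(2) coefficients (stationary)
random.seed(5)
x = [0.0, 0.0]
for _ in range(30000):
    x.append(a1 * x[-1] + a2 * x[-2] + random.gauss(0, 1))
x = x[500:]                           # drop burn-in

n = len(x)
m = sum(x) / n
c0 = sum((v - m) ** 2 for v in x) / n
def rho(u):
    return sum((x[t + u] - m) * (x[t] - m) for t in range(n - u)) / n / c0

r1, r2 = rho(1), rho(2)
# Yule-Walker: [[1, r1], [r1, 1]] [a1, a2]^T = [r1, r2]^T, solved by Cramer
det = 1 - r1 * r1
a1_hat = (r1 - r1 * r2) / det
a2_hat = (r2 - r1 * r1) / det
print(abs(a1_hat - a1) < 0.05, abs(a2_hat - a2) < 0.05)
```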

ARMA(p, q): φ(B)Y_t = θ(B)Z_t
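Written out for p = q = 1 with φ(B) = 1 − φB and θ(B) = 1 + θB, the model is Y_t = φY_{t−1} + Z_t + θZ_{t−1}; a minimal simulation sketch (the coefficient values are illustrative):

```python
import random

phi, theta = 0.5, 0.4
random.seed(8)
y_prev, z_prev = 0.0, 0.0
y = []
for _ in range(10):
    z = random.gauss(0, 1)
    y_t = phi * y_prev + z + theta * z_prev   # Y_t = phi Y_{t-1} + Z_t + theta Z_{t-1}
    y.append(y_t)
    y_prev, z_prev = y_t, z
print(len(y))   # 10 simulated values
```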

ARIMA(p, d, q). ∇X_t = X_t − X_{t−1}, ∇²X_t = X_t − 2X_{t−1} + X_{t−2}
arima.mle() fits by maximum likelihood assuming Gaussian noise.
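The differencing operator is easy to see on a deterministic series: applied to a quadratic trend t², one difference leaves a linear trend and two differences leave a constant:

```python
def diff(x):
    """del X_t = X_t - X_{t-1}; apply twice for del^2."""
    return [x[t] - x[t - 1] for t in range(1, len(x))]

x = [t ** 2 for t in range(8)]   # quadratic trend: 0, 1, 4, 9, 16, 25, 36, 49
d1 = diff(x)                     # linear:   [1, 3, 5, 7, 9, 11, 13]
d2 = diff(d1)                    # constant: [2, 2, 2, 2, 2, 2]
print(d1, d2)
```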

ARMAX: φ(B)Y_t = β(B)X_t + θ(B)Z_t, fitted via arima.mle(…, xreg, …).
State space: s_t = F_t(s_{t−1}, z_t), Y_t = H_t(s_t, Z_t); the formulation could include X.
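A minimal state-space sketch with hypothetical linear choices of F and H (an AR(1) hidden state observed in noise), not the lecture's example:

```python
import random

# state equation s_t = F_t(s_{t-1}, z_t); observation Y_t = H_t(s_t, v_t)
random.seed(6)
phi, s = 0.8, 0.0
obs = []
for _ in range(5):
    s = phi * s + random.gauss(0, 1)       # hidden AR(1) state
    obs.append(s + random.gauss(0, 0.5))   # state observed in noise
print(len(obs))   # 5 noisy observations of the hidden state
```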

Next: i.i.d. → mixing stationary process. Mixing has a variety of definitions; e.g. in the normal case, ∑_u |c_YY(u)| < ∞. See e.g. Cryer and Chan (2008).
CLT: Ȳ = ∑_{t=1}^{T} Y(t)/T is asymptotically normal with
E{Ȳ} = c_Y
var{Ȳ} = T^{−2} ∑_{s=1}^{T} ∑_{t=1}^{T} c_YY(s − t) ≈ T^{−1} ∑_u c_YY(u), = σ_YY/T if white noise.
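The white-noise case var{Ȳ} = σ_YY/T is easy to check by simulation, computing the variance of the sample mean over many replications (T and the replication count are illustrative):

```python
import random

random.seed(7)
T, reps = 100, 4000
means = []
for _ in range(reps):
    y = [random.gauss(0, 1) for _ in range(T)]
    means.append(sum(y) / T)
m = sum(means) / reps
var_mean = sum((v - m) ** 2 for v in means) / reps
print(abs(var_mean - 1 / T) < 0.002)   # var{Y-bar} ~= sigma^2 / T = 0.01
```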

Cumulants. cum(Y₁, Y₂, …, Y_k) extends the mean, variance, and covariance:
cum(Y) = E{Y}, cum(Y, Y) = var{Y}, cum(X, Y) = cov(X, Y)
DRB (1975)
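A sketch of the first three sample cumulants (function names are mine): the first two coincide with the sample mean and variance, and the third vanishes for symmetric data:

```python
def k1(y):                    # cum(Y) = E{Y}
    return sum(y) / len(y)

def k2(y):                    # cum(Y, Y) = var{Y}
    m = k1(y)
    return sum((v - m) ** 2 for v in y) / len(y)

def k3(y):                    # third cumulant: 0 for symmetric data
    m = k1(y)
    return sum((v - m) ** 3 for v in y) / len(y)

y = [-2, -1, 0, 1, 2]
print(k1(y), k2(y), k3(y))    # 0.0 2.0 0.0
```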

Proof of the ordinary CLT. Let S_T = Y(1) + … + Y(T).
cum_k(S_T) = T κ_k, by additivity and independence.
cum_k(S_T/√T) = T^{−k/2} cum_k(S_T) = O(T · T^{−k/2}) → 0 for k > 2 as T → ∞.
Normal cumulants of order > 2 are 0, and the normal is determined by its moments, so (S_T − Tμ)/√T tends in distribution to N(0, σ²).
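The key scaling step can be seen numerically: the bound T · T^(−k/2) shrinks to 0 for every k > 2; for k = 3 it is T^(−1/2):

```python
k = 3
bounds = [T * T ** (-k / 2) for T in (10, 100, 1000, 10000)]   # = T**(-1/2)
print(all(b1 > b2 for b1, b2 in zip(bounds, bounds[1:])))      # prints True
```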