Stochastic models - time series.

Stochastic models - time series. Random process: an infinite collection of consistent distributions (the probabilities exist). Random function: a family of random variables, {Y(t;ω), t ∈ Z, ω ∈ Ω}, with Z = {0, ±1, ±2, ...} and Ω a sample space.

Specified if given F(y1,...,yn; t1,...,tn) = Prob{Y(t1) ≤ y1, ..., Y(tn) ≤ yn}, n = 1, 2, ... The F's are symmetric, in the sense that F(yπ; tπ) = F(y; t) for π a permutation (applied to the y's and t's together). The F's are compatible: F(y1,...,ym, ∞,...,∞; t1,...,tm, tm+1,...,tn) = F(y1,...,ym; t1,...,tm), m + 1 ≤ n = 2, 3, ...

Finite-dimensional distributions. First-order: F(y;t) = Prob{Y(t) ≤ y}. Second-order: F(y1,y2; t1,t2) = Prob{Y(t1) ≤ y1 and Y(t2) ≤ y2}, and so on.

Normal process/series. Finite-dimensional distributions are multivariate normal. Multivariate normal: entries are linear combinations of i.i.d. standard normals, Y = μ + αZ, with μ: s by 1, α: s by r, Y: s by 1, Z ~ N_r(0, I), I the r by r identity. E(Y) = μ, var(Y) = αα′ (s by s). Conditional marginals are linear in Y2 when conditioning on it.
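As a small illustration (not from the slides), the construction Y = μ + αZ with Z a vector of i.i.d. standard normals can be checked by simulation; the names mu and A and the particular dimensions below are illustrative choices, not part of the original material.

```python
import numpy as np

rng = np.random.default_rng(0)

s, r = 3, 4                       # Y is s-by-1, alpha is s-by-r, Z is r-by-1
mu = np.array([1.0, -2.0, 0.5])   # mean vector (illustrative values)
A = rng.normal(size=(s, r))       # the s-by-r matrix "alpha"

# Y = mu + A Z with Z ~ N_r(0, I), so E(Y) = mu and var(Y) = A A'
n = 100_000
Z = rng.standard_normal(size=(r, n))
Y = mu[:, None] + A @ Z

print(np.allclose(Y.mean(axis=1), mu, atol=0.05))     # sample mean ~ mu
print(np.allclose(np.cov(Y), A @ A.T, atol=0.1))      # sample cov ~ A A'
```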

Other methods: i) Y(t;ω), ω a random variable; ii) urn model; iii) probability on function space; iv) analytic formula, e.g. Y(t) = α cos(ωt + φ), with α, ω fixed and φ uniform on (−π, π].
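A sketch of the analytic-formula construction in item iv). The values of α and ω are illustrative; the only randomness is the phase φ, drawn once per realization, so each draw gives a whole function of t.

```python
import numpy as np

rng = np.random.default_rng(1)

alpha, omega = 2.0, 0.3            # fixed (illustrative values)
t = np.arange(200)

def realization():
    phi = rng.uniform(-np.pi, np.pi)     # one random phase per path
    return alpha * np.cos(omega * t + phi)

paths = np.stack([realization() for _ in range(5000)])
# pointwise mean is ~0 and the pointwise variance is constant (~alpha**2 / 2),
# consistent with this being a stationary process
print(np.abs(paths.mean(axis=0)).max())
print(paths.var(axis=0)[:5])
```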

There may be densities. The Y(t) may be discrete, angles, proportions, vectors, ... Kolmogorov extension theorem: to specify a stochastic process, give the distribution of any finite subset {Y(τ1), ..., Y(τn)} in a consistent way, for τ's in A.

Moment functions. Mean function: cY(t) = E{Y(t)} = ∫ y dF(y;t) = ∫ y f(y;t) dy if continuous, = Σj yj f(yj;t) if discrete. E{α1 Y1(t) + α2 Y2(t)} = α1 c1(t) + α2 c2(t); similarly in the vector-valued case. Signal plus noise: Y(t) = S(t) + ε(t), e.g. S(.) fixed or random.

Second moments. Autocovariance function: cYY(s,t) = cov{Y(s),Y(t)} = E{Y(s)Y(t)} − E{Y(s)}E{Y(t)}. It is non-negative definite: Σj Σk αj αk cYY(tj, tk) ≥ 0 for scalars αj, since this sum equals var{Σj αj Y(tj)}. Crosscovariance function: c12(s,t) = cov{Y1(s), Y2(t)}.
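A plain numpy sketch of the sample versions of these quantities. The 1/n normalization and the function names are conventions chosen here, not stated on the slide.

```python
import numpy as np

def sample_autocov(y, u):
    """Sample autocovariance c_YY(u) at lag u, 1/n convention."""
    y = np.asarray(y, dtype=float)
    n, ybar = len(y), y.mean()
    return np.sum((y[u:] - ybar) * (y[:n - u] - ybar)) / n

def sample_crosscov(x, y, u):
    """Sample crosscovariance c_XY(u) = cov{X(t+u), Y(t)}, 1/n convention."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = min(len(x), len(y))
    return np.sum((x[u:n] - x[:n].mean()) * (y[:n - u] - y[:n].mean())) / n

rng = np.random.default_rng(2)
z = rng.standard_normal(500)
print(sample_autocov(z, 0), sample_autocov(z, 1))   # ~1 and ~0 for white noise
```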

Stationarity. The joint distributions {Y(t+u1), ..., Y(t+uk−1), Y(t)} do not depend on t, for k = 2, 3, ... Often reasonable in practice, particularly over some time stretches. Replaces "identically distributed" (i.i.d.).

Mean: E{Y(t)} = cY for t in Z. Autocovariance function: cov{Y(t+u), Y(t)} = cYY(u), t, u in Z, u the lag; = E{Y(t+u)Y(t)} if the mean is 0. Autocorrelation function: ρYY(u) = corr{Y(t+u), Y(t)}, |ρYY(u)| ≤ 1. Crosscovariance function: cov{X(t+u), Y(t)} = cXY(u).

Joint density: Prob{x < Y(t+u) < x+dx and y < Y(t) < y+dy} = f(x, y, u) dx dy.

(*) Extend to case of > 2 variables

Some useful models (with a brief switch of notation). Purely random / white noise: often mean 0. The basic building block.

Random walk: not stationary.
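The slide's equation did not survive transcription; assuming the usual definition X(t) = X(t−1) + Z(t) with Z white noise, this sketch shows the non-stationarity directly: var{X(t)} grows linearly in t.

```python
import numpy as np

rng = np.random.default_rng(3)

n_paths, n_steps = 20_000, 200
Z = rng.standard_normal((n_paths, n_steps))
X = Z.cumsum(axis=1)               # X(t) = Z(1) + ... + Z(t): a random walk

# var{X(t)} = t * var{Z}, so the variance depends on t: not stationary
print(X.var(axis=0)[[9, 49, 199]])     # roughly 10, 50, 200
```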

Moving average, MA(q). From (*): stationary.

MA(1) with θ0 = 1, θ1 = −.7. Estimate of ρ(k).
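A sketch matching the MA(1) above, simulating X(t) = Z(t) − 0.7 Z(t−1) and comparing the lag-1 sample autocorrelation with the theoretical value θ1/(1 + θ1²); the helper sample_acf is defined here for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

theta1, n = -0.7, 5000
Z = rng.standard_normal(n + 1)
X = Z[1:] + theta1 * Z[:-1]        # MA(1): X(t) = Z(t) + theta1 * Z(t-1)

def sample_acf(y, u):
    y = y - y.mean()
    return np.sum(y[u:] * y[:len(y) - u]) / np.sum(y * y)

rho1_theory = theta1 / (1 + theta1**2)           # about -0.47
print(sample_acf(X, 1), rho1_theory)
print(sample_acf(X, 2), sample_acf(X, 3))        # ~0: acf cuts off after lag q=1
```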

Backward shift operator B, BX(t) = X(t−1); remember the translation operator T. Linear process: need a convergence condition, e.g. Σ|ψi| < ∞ or Σ|ψi|² < ∞.
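A sketch of a (truncated) linear process X(t) = Σ ψi Z(t−i); the geometric weights ψi = 0.8^i are illustrative and satisfy the summability condition Σ|ψi| < ∞.

```python
import numpy as np

rng = np.random.default_rng(5)

psi = 0.8 ** np.arange(50)         # illustrative weights, sum |psi_i| finite
Z = rng.standard_normal(5000 + len(psi))
# X(t) = sum_i psi_i * Z(t - i), computed by convolution, 'valid' part only
X = np.convolve(Z, psi, mode="valid")
print(X.mean(), X.var())           # var ~ sum psi_i**2 = 1/(1 - 0.64) ~ 2.78
```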

Autoregressive process, AR(p). First-order, AR(1): Markov. (*) Expressible as a linear process; invertible. Requires |φ1| < 1 for convergence in probability / stationarity.

a.c.f. of AR(1): from (*). p.a.c.f.: corr{Y(t), Y(t−m) | Y(t−1), ..., Y(t−m+1)}, with the intermediate values removed linearly; = 0 for m > p when Y is AR(p).
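A sketch for the AR(1) case X(t) = φ X(t−1) + Z(t), with an illustrative φ = 0.6: the sample acf should decay roughly like φ^k, and the lag-2 partial autocorrelation, computed here from the standard Durbin-Levinson step (r2 − r1²)/(1 − r1²), should be near 0.

```python
import numpy as np

rng = np.random.default_rng(6)

phi, n = 0.6, 10_000
Z = rng.standard_normal(n)
X = np.empty(n)
X[0] = Z[0]
for t in range(1, n):
    X[t] = phi * X[t - 1] + Z[t]           # AR(1) recursion

def acf(y, u):
    y = y - y.mean()
    return np.sum(y[u:] * y[:len(y) - u]) / np.sum(y * y)

r1, r2, r3 = acf(X, 1), acf(X, 2), acf(X, 3)
print(r1, r2, r3, "vs", phi, phi**2, phi**3)     # acf ~ phi**k
# lag-2 partial autocorrelation: ~0 for an AR(1)
print((r2 - r1**2) / (1 - r1**2))
```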

In the general case, useful for prediction.

ARMA(p,q): φ(B)Xt = θ(B)Zt.
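A sketch of generating an ARMA(1,1) series from the recursion implied by φ(B)Xt = θ(B)Zt, i.e. X(t) = φ1 X(t−1) + Z(t) + θ1 Z(t−1); the coefficients and burn-in length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

phi1, theta1 = 0.5, 0.4            # illustrative ARMA(1,1) coefficients
n, burn = 5000, 500
Z = rng.standard_normal(n + burn)
X = np.zeros(n + burn)
for t in range(1, n + burn):
    # phi(B) X_t = theta(B) Z_t  <=>  X_t = phi1 X_{t-1} + Z_t + theta1 Z_{t-1}
    X[t] = phi1 * X[t - 1] + Z[t] + theta1 * Z[t - 1]
X = X[burn:]                       # drop burn-in so the start-up value fades
print(X.mean(), X.var())
```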

ARIMA(p,d,q). ∇Xt = Xt − Xt−1, ∇²Xt = Xt − 2Xt−1 + Xt−2.
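The differencing operator above corresponds directly to np.diff: one application gives ∇Xt, two give ∇²Xt. As a small check, first-differencing a random walk recovers its white-noise increments.

```python
import numpy as np

rng = np.random.default_rng(8)

Z = rng.standard_normal(1000)
X = Z.cumsum()                     # a random walk (an ARIMA(0,1,0) series)

dX = np.diff(X)                    # first difference:  X_t - X_{t-1}
d2X = np.diff(X, n=2)              # second difference: X_t - 2 X_{t-1} + X_{t-2}

print(np.allclose(dX, Z[1:]))                          # differencing recovers Z
print(np.allclose(d2X, X[2:] - 2 * X[1:-1] + X[:-2]))  # matches the formula
```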

Yule-Walker equations for AR(p): sometimes used for estimation. Obtained by correlating each side of the AR(p) equation with Xt−k.
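A sketch of Yule-Walker estimation for an AR(2), assuming the standard equations ρ(k) = φ1 ρ(k−1) + φ2 ρ(k−2), k = 1, 2: plug in sample autocorrelations and solve the 2-by-2 linear system. The true coefficients used to simulate the data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# simulate an AR(2) with known coefficients (illustrative, stationary choice)
phi = np.array([0.5, 0.3])
n, burn = 20_000, 500
Z = rng.standard_normal(n + burn)
X = np.zeros(n + burn)
for t in range(2, n + burn):
    X[t] = phi[0] * X[t - 1] + phi[1] * X[t - 2] + Z[t]
X = X[burn:]

def acf(y, u):
    y = y - y.mean()
    return np.sum(y[u:] * y[:len(y) - u]) / np.sum(y * y)

r1, r2 = acf(X, 1), acf(X, 2)
# Yule-Walker for p = 2:  r1 = phi1 + phi2*r1,  r2 = phi1*r1 + phi2
R = np.array([[1.0, r1],
              [r1, 1.0]])
rhs = np.array([r1, r2])
print(np.linalg.solve(R, rhs))     # should be close to [0.5, 0.3]
```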

Cumulants. An extension of mean, variance, covariance: cum(Y1, Y2, ..., Yk). Useful for nonlinear relationships, approximations, ... A multilinear functional; 0 if some subset of the variates is independent of the rest; 0 for orders > 2 when the variables are normal (the normal is determined by its moments; used in a CLT proof).

Some series and acf’s