Presentation transcript:

Stochastic models - time series. Random process: an infinite collection of random variables for which consistent finite-dimensional probability distributions exist. Random function: a family of random variables, e.g. {Y(t), t in Z}

Specified if given $F(y_1,\dots,y_n; t_1,\dots,t_n) = \mathrm{Prob}\{Y(t_1) \le y_1, \dots, Y(t_n) \le y_n\}$ that are symmetric: $F(\pi y; \pi t) = F(y; t)$, $\pi$ a permutation; compatible: $F(y_1,\dots,y_m,\infty,\dots,\infty; t_1,\dots,t_m,t_{m+1},\dots,t_n) = F(y_1,\dots,y_m; t_1,\dots,t_m)$

Finite-dimensional distributions. First-order: $F(y; t) = \mathrm{Prob}\{Y(t) \le y\}$. Second-order: $F(y_1, y_2; t_1, t_2) = \mathrm{Prob}\{Y(t_1) \le y_1 \text{ and } Y(t_2) \le y_2\}$, and so on

Other methods: i) $Y(t; \omega)$, $\omega$ a random variable; ii) urn model; iii) probability on function space; iv) analytic formula, e.g. $Y(t) = \beta \cos(\nu t + \phi)$ with $\beta, \nu$ fixed and $\phi$ uniform on $(-\pi, \pi]$
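The cosine process with a random phase can be checked by simulation. A minimal sketch (not part of the original slides; numpy assumed, with illustrative values $\beta = 1$, $\nu = 0.5$), averaging over many realizations to see that the mean function is zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Y(t) = beta * cos(nu * t + phi), with beta and nu fixed and the phase
# phi drawn uniformly on (-pi, pi]: each draw of phi gives one
# realization (sample path) of the process.
beta, nu = 1.0, 0.5
t = np.arange(100)
phi = rng.uniform(-np.pi, np.pi, size=5000)            # one phase per realization
paths = beta * np.cos(nu * t[None, :] + phi[:, None])  # 5000 sample paths

# Averaging across realizations: E{Y(t)} = 0 for every t (uniform phase).
mean_over_paths = paths.mean(axis=0)
```

With 5000 realizations the sample mean at every t should be close to 0, consistent with the analytic expectation.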

There may be densities. The Y(t) may be discrete, angles, proportions, ... Kolmogorov extension theorem: to specify a stochastic process, give the distribution of every finite subset $\{Y(\tau_1),\dots,Y(\tau_n)\}$, $\tau_i$ in A, in a consistent way

Moment functions. Mean function: $c_Y(t) = E\{Y(t)\} = \int y \, dF(y;t)$; $= \int y \, f(y;t) \, dy$ if continuous; $= \sum_j y_j \, f(y_j; t)$ if discrete. Linearity: $E\{\alpha_1 Y_1(t) + \alpha_2 Y_2(t)\} = \alpha_1 c_1(t) + \alpha_2 c_2(t)$, similarly in the vector-valued case. Mean level - signal plus noise: $S(t) + \varepsilon(t)$, $S(\cdot)$ fixed

Second moments. Autocovariance function: $c_{YY}(s,t) = \mathrm{cov}\{Y(s), Y(t)\} = E\{Y(s)Y(t)\} - E\{Y(s)\}E\{Y(t)\}$. Non-negative definite: $\sum_j \sum_k \alpha_j \alpha_k \, c_{YY}(t_j, t_k) \ge 0$ for scalars $\alpha_j$. Crosscovariance function: $c_{12}(s,t) = \mathrm{cov}\{Y_1(s), Y_2(t)\}$

Stationarity. The joint distributions of $\{Y(t+u_1),\dots,Y(t+u_{k-1}), Y(t)\}$ do not depend on t, for k = 1, 2, ... Often reasonable in practice, at least for some time stretches. Replaces "identically distributed"

For a stationary process: mean $E\{Y(t)\} = c_Y$ for t in Z; autocovariance function $\mathrm{cov}\{Y(t+u), Y(t)\} = c_{YY}(u)$, t, u in Z, u the lag; $= E\{Y(t+u)Y(t)\}$ if mean 0; autocorrelation function $\rho(u) = \mathrm{corr}\{Y(t+u), Y(t)\}$, $|\rho(u)| \le 1$; crosscovariance function $\mathrm{cov}\{X(t+u), Y(t)\} = c_{XY}(u)$
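The stationary autocorrelation function can be estimated directly from a single observed series. A minimal sketch (not from the slides; numpy assumed) of the usual sample estimator with the full-sample mean and 1/n normalization, checked on white noise, for which the autocorrelation is 0 at every nonzero lag:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_acf(y, max_lag):
    """Sample autocorrelation r(u), u = 0..max_lag, for a series y.

    Uses the mean-corrected estimator with the 1/n normalization,
    so r(0) = 1 by construction.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    d = y - y.mean()
    c0 = np.dot(d, d) / n                 # sample variance (lag-0 autocovariance)
    return np.array([np.dot(d[:n - u], d[u:]) / (n * c0)
                     for u in range(max_lag + 1)])

# White noise: r(u) should be near 0 for all u >= 1.
z = rng.standard_normal(5000)
r = sample_acf(z, 10)
```

The sampling error of each r(u) is of order $1/\sqrt{n}$, which is the basis of the usual ±2/√n significance bands on ACF plots.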

Joint density: $\mathrm{Prob}\{x < Y(t+u) < x+dx \text{ and } y < Y(t) < y+dy\} = f(x, y \mid u) \, dx \, dy$

Some useful models (Chatfield's notation). Purely random / white noise: often mean 0. A basic building block

Random walk: $X_t = X_{t-1} + Z_t$ with $Z_t$ white noise; not stationary
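The non-stationarity of the random walk shows up directly in its variance, which grows linearly in t. A simulation sketch (not from the slides; numpy assumed), estimating the variance across many independent walks:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random walk X_t = X_{t-1} + Z_t with X_0 = 0 and Z_t standard white
# noise.  Var{X_t} = t * sigma^2 grows with t: not stationary.
n_paths, n_steps = 2000, 400
z = rng.standard_normal((n_paths, n_steps))
x = np.cumsum(z, axis=1)                # one random walk per row

var_over_time = x.var(axis=0)           # variance across realizations at each t
```

The estimated variance at step t should be close to t (here sigma^2 = 1), so the variance at step 400 is roughly eight times that at step 50.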

(*) [equation shown on the slide; not captured in the transcript]

Moving average, MA(q): $Y_t = \beta_0 Z_t + \beta_1 Z_{t-1} + \dots + \beta_q Z_{t-q}$. From (*), stationary

MA(1): $\beta_0 = 1$, $\beta_1 = -0.7$
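For this MA(1) the theoretical autocorrelation is $\rho(1) = \beta_1/(1+\beta_1^2) \approx -0.470$ and $\rho(u) = 0$ for $u \ge 2$, which a simulation can confirm. A sketch (not from the slides; numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)

# MA(1): Y_t = Z_t + beta1 * Z_{t-1}, with beta0 = 1 and beta1 = -0.7.
# Theory: rho(1) = beta1 / (1 + beta1**2), rho(u) = 0 for u >= 2.
beta1 = -0.7
n = 200_000
z = rng.standard_normal(n + 1)
y = z[1:] + beta1 * z[:-1]

d = y - y.mean()
c0 = np.dot(d, d) / n
rho = lambda u: np.dot(d[:n - u], d[u:]) / (n * c0)   # sample autocorrelation

rho1_theory = beta1 / (1 + beta1**2)    # -0.7 / 1.49
```

The cut-off of the ACF after lag q is the standard diagnostic for identifying an MA(q) model.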

Backward shift operator: $B Y_t = Y_{t-1}$. Linear process: $Y_t = \sum_j \beta_j Z_{t-j}$; need a convergence condition on the $\beta_j$

Autoregressive process, AR(p). First-order, AR(1): $Y_t = \alpha Y_{t-1} + Z_t$; Markov. A linear process; for convergence/stationarity, $|\alpha| < 1$ (*)
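For a stationary AR(1) the autocorrelation decays geometrically, $\rho(u) = \alpha^u$. A simulation sketch (not from the slides; numpy assumed, with illustrative $\alpha = 0.6$), dropping a burn-in so the start-up transient is gone:

```python
import numpy as np

rng = np.random.default_rng(4)

# AR(1): Y_t = alpha * Y_{t-1} + Z_t, stationary when |alpha| < 1.
# Theory: rho(u) = alpha**u.
alpha = 0.6
n = 200_000
z = rng.standard_normal(n + 500)
y = np.empty(n + 500)
y[0] = z[0]
for t in range(1, n + 500):
    y[t] = alpha * y[t - 1] + z[t]
y = y[500:]                             # discard burn-in

d = y - y.mean()
c0 = np.dot(d, d) / n
rho = lambda u: np.dot(d[:n - u], d[u:]) / (n * c0)   # sample autocorrelation
```

The geometric decay of the ACF, in contrast to the hard cut-off of an MA(q), is the standard diagnostic signature of an autoregression.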

a.c.f.: from (*). p.a.c.f.: $\mathrm{corr}\{Y(t), Y(t-m) \mid Y(t-1),\dots,Y(t-m+1)\}$, computed linearly; $= 0$ for $m > p$ when Y is AR(p)

In the general case, the p.a.c.f. is useful for prediction

ARMA(p,q): combines the AR(p) and MA(q) forms, $Y_t = \alpha_1 Y_{t-1} + \dots + \alpha_p Y_{t-p} + Z_t + \beta_1 Z_{t-1} + \dots + \beta_q Z_{t-q}$

ARIMA(p,d,q): $Y_t$ for which the d-th difference $\nabla^d Y_t = (1-B)^d Y_t$ is a stationary ARMA(p,q)
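Differencing is what connects the non-stationary models back to the stationary ones. A minimal sketch (not from the slides; numpy assumed) of the simplest case, d = 1: a random walk is ARIMA(0,1,0), and its first difference recovers the underlying white noise exactly:

```python
import numpy as np

rng = np.random.default_rng(6)

# A random walk is ARIMA(0, 1, 0): its first difference is white noise,
# i.e. a stationary ARMA(0, 0) series.
z = rng.standard_normal(10_000)
x = np.cumsum(z)                        # random walk: not stationary
dx = np.diff(x)                         # first difference: equals z[1:]
```

In practice d is chosen by differencing until the series looks stationary, then an ARMA(p,q) is fitted to the differenced series.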

Some series and acf's [plots shown on the slides; not captured in the transcript]

Yule-Walker equations for AR(p). Correlate each side of $X_t = \alpha_1 X_{t-1} + \dots + \alpha_p X_{t-p} + Z_t$ with $X_{t-k}$, giving $\rho(k) = \sum_{j=1}^{p} \alpha_j \rho(k-j)$ for $k \ge 1$
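The Yule-Walker equations can be solved with sample autocorrelations in place of the theoretical ones to estimate the AR coefficients. A sketch (not from the slides; numpy assumed, with illustrative true coefficients $\alpha_1 = 0.5$, $\alpha_2 = -0.3$) for an AR(2):

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a stationary AR(2) and recover its coefficients from the
# Yule-Walker equations  rho(k) = a1*rho(k-1) + a2*rho(k-2),  k = 1, 2.
a1, a2 = 0.5, -0.3
n = 200_000
z = rng.standard_normal(n + 500)
y = np.zeros(n + 500)
for t in range(2, n + 500):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + z[t]
y = y[500:]                             # discard burn-in

d = y - y.mean()
c0 = np.dot(d, d) / n
rho = np.array([np.dot(d[:n - u], d[u:]) / (n * c0) for u in range(3)])

# Matrix form of Yule-Walker:  R @ [a1, a2] = [rho(1), rho(2)]
R = np.array([[rho[0], rho[1]],
              [rho[1], rho[0]]])
a_hat = np.linalg.solve(R, rho[1:3])
```

This is the method-of-moments estimator for AR models; for large n the estimates a_hat should be close to the true (a1, a2).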

Cumulants. A multilinear functional; = 0 if some subset of the variates is independent of the rest; = 0 for order > 2 when the variables are normal; the normal is determined by its moments