Time Series Analysis.

Definition: A Time Series {xt : t ∈ T} is a collection of random variables, usually parameterized by 1) the real line T = R = (-∞, ∞), 2) the non-negative real line T = R+ = [0, ∞), 3) the integers T = Z = {…, -2, -1, 0, 1, 2, …}, or 4) the non-negative integers T = Z+ = {0, 1, 2, …}.

If xt is a vector, the collection of random vectors {xt : t ∈ T} is a multivariate time series or multi-channel time series. If t is a vector, the collection of random variables {xt : t ∈ T} is a multidimensional “time” series or spatial series (with T = R^k, k-dimensional Euclidean space, or a k-dimensional lattice).

Example of a spatial time series

The project: buoys are located in a grid across the Pacific Ocean, measuring surface temperature, wind speed (two components) and other quantities. The data are collected almost continuously. The purpose is to study El Niño.

Technical Note: The probability measure of a time series is defined by specifying the joint distribution (in a consistent manner) of all finite subsets of {xt : t ∈ T}, i.e. marginal distributions of subsets of random variables computed from the joint density of a complete set of variables should agree with the distribution assigned to that subset of variables.

The time series is Normal if all finite subsets of {xt : t ∈ T} have a multivariate normal distribution. Similar statements hold for multi-channel time series and multidimensional time series.

Definition: m(t) = mean value function of {xt : t ∈ T} = E[xt] for t ∈ T. s(t,s) = covariance function of {xt : t ∈ T} = E[(xt - m(t))(xs - m(s))] for t,s ∈ T.

For multichannel time series: m(t) = mean vector function of {xt : t ∈ T} = E[xt] for t ∈ T, and S(t,s) = covariance matrix function of {xt : t ∈ T} = E[(xt - m(t))(xs - m(s))′] for t,s ∈ T. The ith element of the k × 1 vector m(t), mi(t) = E[xit], is the mean value function of the time series {xit : t ∈ T}. The (i,j)th element of the k × k matrix S(t,s), sij(t,s) = E[(xit - mi(t))(xjs - mj(s))], is called the cross-covariance function of the two time series {xit : t ∈ T} and {xjt : t ∈ T}.

Definition: The time series {xt : t ∈ T} is stationary if the joint distribution of xt1, xt2, … , xtk is the same as the joint distribution of xt1+h, xt2+h, … , xtk+h for all finite subsets t1, t2, … , tk of T and all choices of h.

Definition: The multi-channel time series {xt : t ∈ T} is stationary if the joint distribution of xt1, xt2, … , xtk is the same as the joint distribution of xt1+h, xt2+h, … , xtk+h for all finite subsets t1, t2, … , tk of T and all choices of h.

Definition: The multidimensional time series {xt : t ∈ T} is stationary if the joint distribution of xt1, xt2, … , xtk is the same as the joint distribution of xt1+h, xt2+h, … , xtk+h for all finite subsets t1, t2, … , tk of T and all choices of h.

Stationarity (illustration): the distribution of the observations at a set of time points is the same as the distribution at the same set of points shifted in time.

Some Implications of Stationarity: If {xt : t ∈ T} is stationary then the distribution of xt is the same for all t ∈ T, and the joint distribution of xt, xt+h is the same as the joint distribution of xs, xs+h.

Implications of Stationarity for the mean value function and the covariance function: If {xt : t ∈ T} is stationary then, for t ∈ T, m(t) = E[xt] = m, and for t,s ∈ T, s(t,s) = E[(xt - m)(xs - m)] = E[(xt+h - m)(xs+h - m)] = E[(xt-s - m)(x0 - m)] (taking h = -s) = s(t-s).

If the multi-channel time series {xt : t ∈ T} is stationary then for t ∈ T, m(t) = E[xt] = m, and for t,s ∈ T, S(t,s) = S(t-s). Thus for stationary time series the mean value function is constant and the covariance function is only a function of the distance in time (t – s).

If the multidimensional time series {xt : t ∈ T} is stationary then for t ∈ T, m(t) = E[xt] = m, and for t,s ∈ T, s(t,s) = E[(xt - m)(xs - m)] = s(t-s) (called the covariogram). Variogram: V(t,s) = V(t - s) = Var[xt - xs] = E[(xt - xs)²] = Var[xt] + Var[xs] – 2Cov[xt, xs] = 2[s(0) - s(t-s)].

Definition: r(t,s) = autocorrelation function of {xt : t ∈ T} = correlation between xt and xs, for t,s ∈ T. If {xt : t ∈ T} is stationary then r(h) = autocorrelation function of {xt : t ∈ T} = correlation between xt and xt+h = s(h)/s(0).

Definition: The time series {xt : t ∈ T} is weakly stationary if: m(t) = E[xt] = m for all t ∈ T, and s(t,s) = s(t-s) for all t,s ∈ T, or r(t,s) = r(t-s) for all t,s ∈ T.
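These definitions translate directly into sample estimates. Below is a minimal sketch (mine, not from the slides; Python with NumPy assumed) of estimating the constant mean m and the autocovariance and autocorrelation functions s(h), r(h) from one observed realization of a weakly stationary series.

```python
import numpy as np

def sample_autocovariance(x, max_lag):
    """Estimate s(h) = E[(x_{t+h} - m)(x_t - m)] for h = 0, ..., max_lag."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m_hat = x.mean()                          # estimate of the constant mean m
    s_hat = np.empty(max_lag + 1)
    for h in range(max_lag + 1):
        s_hat[h] = np.sum((x[h:] - m_hat) * (x[:n - h] - m_hat)) / n
    return m_hat, s_hat

# Illustrative use on a simulated series (standard normal white noise here):
rng = np.random.default_rng(0)
x = rng.normal(size=500)
m_hat, s_hat = sample_autocovariance(x, max_lag=10)
r_hat = s_hat / s_hat[0]                      # sample autocorrelation r(h) = s(h)/s(0)
print(m_hat, np.round(r_hat, 3))
```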

Examples of stationary time series

Let X denote a single random variable with mean m and standard deviation s. In addition X may also be Normal (this condition is not necessary). Let xt = X for all t ∈ T = {…, -2, -1, 0, 1, 2, …}. Then E[xt] = m = E[X] for t ∈ T, and s(h) = E[(xt+h - m)(xt - m)] = Cov(xt+h, xt) = E[(X - m)(X - m)] = Var(X) = s² for all h.

Excel file illustrating this time series

Suppose {xt : t  T} are identically distributed and uncorrelated (independent). Then E[xt] = m for t  T and s(h) = E[(xt+h - m)(xt - m)] = Cov(xt+h,xt )

The autocorrelation function: r(h) = 1 if h = 0 and 0 if h ≠ 0. Comment: If m = 0 then the time series {xt : t ∈ T} is called a white noise time series. Thus a white noise time series consists of independent identically distributed random variables with mean 0 and common variance s².
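A quick illustrative check of this (mine, not from the slides; Python with NumPy assumed): simulate white noise and confirm that the sample autocorrelation is about 1 at lag 0 and about 0 at all other lags.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
u = rng.normal(loc=0.0, scale=sigma, size=1000)   # iid, mean 0, variance sigma^2

u0 = u - u.mean()
# sample r(h) for h = 0..5: expect roughly [1, 0, 0, 0, 0, 0]
r = [np.sum(u0[h:] * u0[:len(u) - h]) / np.sum(u0 * u0) for h in range(6)]
print(np.round(r, 3))
```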

Excel file illustrating this time series

Suppose X1, X2, … , Xk and Y1, Y2, … , Yk are independent random variables with E[Xi] = E[Yi] = 0 and Var(Xi) = Var(Yi) = si². Let l1, l2, … , lk denote k values in (0, π). For any t ∈ T = {…, -2, -1, 0, 1, 2, …} define xt = X1cos(l1t) + Y1sin(l1t) + … + Xkcos(lkt) + Yksin(lkt).

Excel file illustrating this time series

Then

Hence

Hence, using cos(A – B) = cos(A) cos(B) + sin(A) sin(B) and the expression above:
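The algebra on the last few slides is not reproduced in this transcript; the following is a reconstruction of the standard result, assuming the sin-cosine model xt = Σi [Xi cos(li t) + Yi sin(li t)] with E[Xi] = E[Yi] = 0 and Var(Xi) = Var(Yi) = si² as defined above.

```latex
E[x_t] = 0, \qquad
\sigma(h) = E[x_{t+h}\,x_t]
  = \sum_{i=1}^{k}\sigma_i^2\bigl[\cos(\lambda_i(t+h))\cos(\lambda_i t)
        + \sin(\lambda_i(t+h))\sin(\lambda_i t)\bigr]
  = \sum_{i=1}^{k}\sigma_i^2\cos(\lambda_i h)
```

So the mean is constant and the autocovariance depends only on the lag h, i.e. the sin-cosine series is (weakly) stationary.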

The Moving Average Time series of order q, MA(q): Let a0 = 1, a1, a2, … , aq denote q + 1 numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = a0ut + a1ut – 1 + a2ut – 2 + … + aqut – q. Then {xt | t ∈ T} is called a Moving Average time series of order q, MA(q).
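A small simulation sketch of this definition (my illustration, not from the slides; Python with NumPy assumed):

```python
import numpy as np

def simulate_ma(a, sigma, n, seed=0):
    """Simulate n observations of x_t = a0*u_t + a1*u_{t-1} + ... + aq*u_{t-q}."""
    a = np.asarray(a, dtype=float)                 # [a0, a1, ..., aq] with a0 = 1
    q = len(a) - 1
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma, size=n + q)         # white noise shocks
    return np.convolve(u, a, mode="valid")         # x[t] = sum_i a[i] * u[t + q - i]

x = simulate_ma(a=[1.0, 0.5, -0.3], sigma=2.0, n=500)
print(x.mean(), x.var())
```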

Excel file illustrating this time series

The mean: m = E[xt] = 0. The autocovariance function: s(h) = E[xt+hxt] = s²(a0ah + a1ah+1 + … + aq–haq) for 0 ≤ h ≤ q, and s(h) = 0 for h > q.

The autocovariance function for an MA(q) time series: s(h) = s²(a0ah + a1ah+1 + … + aq–haq) for 0 ≤ h ≤ q and 0 for h > q. The autocorrelation function for an MA(q) time series: r(h) = s(h)/s(0) = (a0ah + … + aq–haq)/(a0² + a1² + … + aq²) for 0 ≤ h ≤ q and 0 for h > q.
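These formulas can be evaluated directly; a sketch (my illustration; Python with NumPy assumed), showing that the autocorrelation cuts off to zero after lag q:

```python
import numpy as np

def ma_autocovariance(a, sigma, max_lag):
    """s(h) = sigma^2 * (a_0 a_h + a_1 a_{h+1} + ... + a_{q-h} a_q) for 0 <= h <= q, else 0."""
    a = np.asarray(a, dtype=float)
    q = len(a) - 1
    s = np.zeros(max_lag + 1)
    for h in range(min(q, max_lag) + 1):
        s[h] = sigma**2 * np.dot(a[: q - h + 1], a[h:])
    return s

s = ma_autocovariance(a=[1.0, 0.5, -0.3], sigma=2.0, max_lag=5)
r = s / s[0]               # autocorrelation r(h) = s(h)/s(0)
print(np.round(r, 3))      # zero for lags h > q = 2
```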

The Autoregressive Time series of order p, AR(p): Let b1, b2, … , bp denote p numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + b2xt – 2 + … + bpxt – p + d + ut. Then {xt | t ∈ T} is called an Autoregressive time series of order p, AR(p).
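A simulation sketch of this definition (my illustration, not from the slides; Python with NumPy assumed). A burn-in period is discarded so that, when the model is stationary, the start-up values stop mattering.

```python
import numpy as np

def simulate_ar(b, d, sigma, n, burn_in=500, seed=0):
    """Simulate x_t = b1*x_{t-1} + ... + bp*x_{t-p} + d + u_t."""
    b = np.asarray(b, dtype=float)                  # [b1, ..., bp]
    p = len(b)
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma, size=n + burn_in)
    x = np.zeros(n + burn_in)
    for t in range(p, n + burn_in):
        x[t] = np.dot(b, x[t - p:t][::-1]) + d + u[t]   # most recent value first
    return x[burn_in:]

x = simulate_ar(b=[0.7, 0.2], d=4.1, sigma=2.0, n=2000)
print(x.mean())    # for a stationary model this should be near d / (1 - b1 - ... - bp)
```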

Excel file illustrating this time series

Comment: An Autoregressive time series is not necessarily stationary. Suppose {xt | t ∈ T} is an AR(1) time series satisfying the equation xt = xt – 1 + ut, where {ut | t ∈ T} is a white noise time series with variance s², i.e. b1 = 1 and d = 0.

Then E[xt] is constant, but Var[xt] = Var[x0] + ts² and is not constant. A time series {xt | t ∈ T} satisfying the equation xt = xt – 1 + ut is called a Random Walk.
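A quick numerical illustration of this non-stationarity (mine; Python with NumPy assumed): simulate many independent random walks started at 0 and watch the variance across paths grow roughly like t·s².

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, n_steps, n_paths = 1.0, 200, 5000
u = rng.normal(0.0, sigma, size=(n_paths, n_steps))
x = np.cumsum(u, axis=1)                   # each row is one random walk x_t = x_{t-1} + u_t

for t in (10, 50, 200):
    print(t, round(x[:, t - 1].var(), 1))  # sample variance across paths, roughly t * sigma^2
```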

Derivation of the mean, autocovariance function and autocorrelation function of a stationary Autoregressive time series. We use extensively the rules of expectation.

Assume that the autoregressive time series {xt | t ∈ T} defined by the equation xt = b1xt – 1 + b2xt – 2 + … + bpxt – p + d + ut is stationary. Let m = E(xt). Then m = b1m + b2m + … + bpm + d, so that m = d/(1 - b1 - b2 - … - bp).

The Autocovariance function, s(h), of a stationary autoregressive time series {xt | t ∈ T} can be determined by using the defining equation. Thus s(h) = E[(xt - m)(xt – h - m)], with xt - m = b1(xt – 1 - m) + … + bp(xt – p - m) + ut.

Hence s(h) = b1s(h - 1) + b2s(h - 2) + … + bps(h - p) + E[ut(xt – h - m)], where s(h - j) = E[(xt – j - m)(xt – h - m)].

Now E[ut(xt – h - m)] = s² if h = 0 and 0 if h > 0, since xt – h depends only on shocks up to time t – h.

The equations for the autocovariance function of an AR(p) time series: s(0) = b1s(1) + b2s(2) + … + bps(p) + s², s(1) = b1s(0) + b2s(1) + … + bps(p - 1), s(2) = b1s(1) + b2s(0) + … + bps(p - 2), etc.

Or, using s(-h) = s(h), the first p + 1 equations involve only s(0), s(1), … , s(p), and s(h) = b1s(h - 1) + b2s(h - 2) + … + bps(h - p) for h > p.

Use the first p + 1 equations to find s(0), s(1), … , s(p). Then use s(h) = b1s(h - 1) + … + bps(h - p) to compute s(h) for h > p.

The Autoregressive Time series of order p, AR(p): Let b1, b2, … , bp denote p numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + b2xt – 2 + … + bpxt – p + d + ut. Then {xt | t ∈ T} is called an Autoregressive time series of order p, AR(p).

If the autoregressive time series {xt | t ∈ T} defined by the equation xt = b1xt – 1 + … + bpxt – p + d + ut is stationary, then m = E[xt] = d/(1 - b1 - b2 - … - bp).

The Autocovariance function, s(h), of a stationary autoregressive time series {xt | t ∈ T} defined by the equation xt = b1xt – 1 + … + bpxt – p + d + ut satisfies the equations below.

The mean: m = d/(1 - b1 - … - bp). The autocovariance function for an AR(p) time series (the Yule-Walker Equations): s(0) = b1s(1) + … + bps(p) + s², s(h) = b1s(h - 1) + … + bps(h - p) for h = 1, … , p, and the same recursion for h > p.

Use the first p + 1 equations (the Yule-Walker Equations) to find s(0), s(1), … , s(p). Then use s(h) = b1s(h - 1) + … + bps(h - p) to compute s(h) for h > p.

The Autocorrelation function, r(h), of a stationary autoregressive time series {xt | t ∈ T}: dividing through by s(0), the Yule-Walker Equations become r(h) = b1r(h - 1) + b2r(h - 2) + … + bpr(h - p) for h = 1, 2, … , p (with r(0) = 1 and r(-h) = r(h)),

and r(h) = b1r(h - 1) + b2r(h - 2) + … + bpr(h - p) for h > p.

To find r(h) and s(0): solve the first p equations for r(1), … , r(p); then, for h > p, use the recursion r(h) = b1r(h - 1) + … + bpr(h - p); and recover s(0) from s(0) = s²/(1 - b1r(1) - … - bpr(p)).
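A sketch of this recipe in code (my illustration, not from the slides; Python with NumPy assumed): build the p Yule-Walker equations in r(1), … , r(p), solve them, extend by the recursion for h > p, and recover s(0).

```python
import numpy as np

def ar_autocorrelations(b, sigma, max_lag):
    """Solve the Yule-Walker equations for r(1..p), extend to max_lag, return (r, s0)."""
    b = np.asarray(b, dtype=float)
    p = len(b)
    # r(h) = sum_j b_j * r(|h - j|); terms with |h - j| = 0 use r(0) = 1 and go to the RHS
    A = np.eye(p)
    rhs = np.zeros(p)
    for h in range(1, p + 1):
        for j in range(1, p + 1):
            k = abs(h - j)
            if k == 0:
                rhs[h - 1] += b[j - 1]
            else:
                A[h - 1, k - 1] -= b[j - 1]
    r = np.ones(max_lag + 1)
    r[1:p + 1] = np.linalg.solve(A, rhs)
    for h in range(p + 1, max_lag + 1):          # recursion for h > p
        r[h] = np.dot(b, r[h - 1:h - p - 1:-1])
    s0 = sigma**2 / (1.0 - np.dot(b, r[1:p + 1]))
    return r, s0

r, s0 = ar_autocorrelations(b=[0.7, 0.2], sigma=2.0, max_lag=5)
print(np.round(r, 4), round(s0, 3))   # matches the worked AR(2) example later in the slides
```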

Suppose X1, X2, … , Xk and Y1, Y2, … , Yk are independent random variables with E[Xi] = E[Yi] = 0 and Var(Xi) = Var(Yi) = si². Let l1, l2, … , lk denote k values in (0, π). For any t ∈ T = {…, -2, -1, 0, 1, 2, …} define xt = X1cos(l1t) + Y1sin(l1t) + … + Xkcos(lkt) + Yksin(lkt).

Three important Models of Stationary Time Series: Sin-Cosine series with k frequencies; Moving Average Time series of order q – MA(q); Autoregressive time series of order p – AR(p). Comment: Any non-trivial stationary time series can be approximated by either a Sin-Cos series of order k, an MA(q) time series, or an AR(p) time series.

Sin-Cos series of order k: xt = X1cos(l1t) + Y1sin(l1t) + … + Xkcos(lkt) + Yksin(lkt) for any t ∈ T = {…, -2, -1, 0, 1, 2, …}, where l1, l2, … , lk denote k fixed values in (0, π) and X1, X2, … , Xk and Y1, Y2, … , Yk are independent random variables with E[Xi] = E[Yi] = 0 and Var(Xi) = Var(Yi) = si².

The Moving Average Time series of order q, MA(q): Let a0 = 1, a1, a2, … , aq denote q + 1 numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = a0ut + a1ut – 1 + … + aqut – q. Then {xt | t ∈ T} is called a Moving Average time series of order q, MA(q).

The Autoregressive Time series of order p, AR(p): Let b1, b2, … , bp denote p numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + … + bpxt – p + d + ut. Then {xt | t ∈ T} is called an Autoregressive time series of order p, AR(p).

Mean value, autocovariance and autocorrelation functions for a Sin-Cos(k) time series (of order k): m = E[xt] = 0; s(h) = s1²cos(l1h) + … + sk²cos(lkh); r(h) = s(h)/s(0), where s(0) = s1² + … + sk².

Mean value, autocovariance and autocorrelation functions for an MA(q) time series: m = E[xt] = 0; the autocovariance function s(h) = s²(a0ah + … + aq–haq) for 0 ≤ h ≤ q and 0 for h > q; the autocorrelation function r(h) = s(h)/s(0).

Mean value, autocovariance and autocorrelation functions for an AR(p) time series: the mean m = d/(1 - b1 - … - bp); the autocovariance function satisfies the Yule-Walker Equations s(h) = b1s(h - 1) + … + bps(h - p) for h ≥ 1, and s(0) = b1s(1) + … + bps(p) + s².

The autocorrelation function for an AR(p) time series: the Yule-Walker Equations r(h) = b1r(h - 1) + … + bpr(h - p) for h = 1, … , p, and the same recursion for h > p.

Example: Consider the AR(2) time series xt = 0.7xt – 1 + 0.2xt – 2 + 4.1 + ut, where {ut} is a white noise time series with standard deviation s = 2.0. White noise ≡ independent, mean zero (normal). Find m, s(h), r(h).

To find r(h), solve the Yule-Walker equations r(1) = 0.7 + 0.2r(1) and r(2) = 0.7r(1) + 0.2; thus r(1) = 0.7/0.8 = 0.875 and r(2) = 0.7(0.875) + 0.2 = 0.8125.

For h > 2, r(h) = 0.7r(h – 1) + 0.2r(h – 2). This can be used in sequence to find r(3) = 0.744, r(4) = 0.683, and so on.

To find s(0), use s(0) = 0.7s(1) + 0.2s(2) + s², or s(0)(1 – 0.7r(1) – 0.2r(2)) = 4, so s(0) = 4/0.225 = 17.778.

To find s(h), use s(h) = r(h)s(0). To find m, use m = 4.1/(1 – 0.7 – 0.2) = 41.
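A numerical cross-check of this worked example (my illustration; Python with NumPy assumed): simulate a long realization of the AR(2) model and compare the sample mean, variance and first two autocorrelations with m = 41, s(0) ≈ 17.78, r(1) = 0.875 and r(2) = 0.8125.

```python
import numpy as np

rng = np.random.default_rng(3)
b1, b2, d, sigma = 0.7, 0.2, 4.1, 2.0
n, burn = 200_000, 1_000
u = rng.normal(0.0, sigma, size=n + burn)
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = b1 * x[t - 1] + b2 * x[t - 2] + d + u[t]
x = x[burn:]

x0 = x - x.mean()
r1 = np.dot(x0[1:], x0[:-1]) / np.dot(x0, x0)
r2 = np.dot(x0[2:], x0[:-2]) / np.dot(x0, x0)
print(round(x.mean(), 2), round(x.var(), 2), round(r1, 3), round(r2, 3))
# expected approximately: 41.0, 17.78, 0.875, 0.8125
```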

An explicit formula for r(h) for an auto-regressive time series of order p.

Consider solving the difference equation r(h) = b1r(h – 1) + … + bpr(h – p). This difference equation can be solved by setting up the polynomial b(x) = 1 – b1x – b2x² – … – bpx^p, where r1, r2, … , rp are the roots of the polynomial b(x).

The difference equation has the general solution r(h) = c1(1/r1)^h + c2(1/r2)^h + … + cp(1/rp)^h, where c1, c2, … , cp are determined by using the starting values of the sequence r(h).

Example: An AR(1) time series. Here r(h) = b1r(h – 1) for h ≥ 1, so r(h) = b1^h, and s(h) = s(0)b1^h.

The difference equation can also be solved by setting up the polynomial b(x) = 1 – b1x – … – bpx^p, with roots r1, … , rp. Then a general formula for r(h) is r(h) = c1(1/r1)^h + … + cp(1/rp)^h.
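A sketch of this root-based approach in code (my illustration; Python with NumPy assumed): find the roots of b(x) = 1 - b1x - … - bpx^p, then solve a small linear system for the constants c1, … , cp from the first p values of r(h).

```python
import numpy as np

def explicit_acf(b, r_init):
    """Roots r_i of b(x) = 1 - b1*x - ... - bp*x^p and constants c_i with
    r(h) = sum_i c_i * (1/r_i)**h, fitted to r_init = [r(0), ..., r(p-1)]."""
    b = np.asarray(b, dtype=float)
    p = len(b)
    # np.roots expects coefficients from the highest power down: [-bp, ..., -b1, 1]
    roots = np.roots(np.concatenate((-b[::-1], [1.0])))
    V = np.array([[(1.0 / rt) ** h for rt in roots] for h in range(p)], dtype=complex)
    c = np.linalg.solve(V, np.asarray(r_init, dtype=complex))
    return roots, c

# AR(2) example from the slides: b1 = 0.7, b2 = 0.2, with r(0) = 1 and r(1) = 0.875
roots, c = explicit_acf(b=[0.7, 0.2], r_init=[1.0, 0.875])
h = np.arange(6)
r_h = np.real(sum(ci * (1.0 / ri) ** h for ci, ri in zip(c, roots)))
print(np.round(roots, 4), np.round(r_h, 4))   # r_h reproduces the recursion's values
```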

Example: An AR(2) time series. Here r(h) = b1r(h – 1) + b2r(h – 2) for h > 1.

Setting up the polynomial b(x) = 1 – b1x – b2x², with roots r1 and r2.

Then a general formula for r(h) is r(h) = c1(1/r1)^h + c2(1/r2)^h. For h = 0 and h = 1 this gives 1 = c1 + c2 and r(1) = c1(1/r1) + c2(1/r2). Solving for c1 and c2:

Solving for c1 and c2 gives c1 = (r(1) – 1/r2)/(1/r1 – 1/r2) and c2 = 1 – c1, and then the general formula for r(h) is r(h) = c1(1/r1)^h + c2(1/r2)^h.

If r1 and r2 are real, then r(h) = c1(1/r1)^h + c2(1/r2)^h is a mixture of two exponentials.

If r1 and r2 are complex conjugates, then c1 and c2 are also complex conjugates. Some important complex identities: e^(ix) = cos(x) + i sin(x) and e^(-ix) = cos(x) – i sin(x).

The above identities can be shown using the power series expansions of e^x, cos(x) and sin(x): e^x = 1 + x + x²/2! + x³/3! + …, cos(x) = 1 – x²/2! + x⁴/4! – …, sin(x) = x – x³/3! + x⁵/5! – ….

Some other trig identities:

Hence, when r1 and r2 are complex conjugates, r(h) = c1(1/r1)^h + c2(1/r2)^h reduces to a damped cosine wave: an exponentially decaying amplitude multiplied by a cosine in h.

Example: Consider the AR(2) time series xt = 0.7xt – 1 + 0.2xt – 2 + 4.1 + ut, where {ut} is a white noise time series with standard deviation s = 2.0. The correlation function was found before using the difference equation r(h) = 0.7r(h – 1) + 0.2r(h – 2).

Alternatively, setting up the polynomial b(x) = 1 – 0.7x – 0.2x², with roots r1 ≈ 1.0895 and r2 ≈ -4.5895, so 1/r1 ≈ 0.9179 and 1/r2 ≈ -0.2179.

Thus r(h) ≈ 0.962(0.9179)^h + 0.038(-0.2179)^h.

Another Example: Consider the AR(2) time series xt = 0.2xt – 1 – 0.5xt – 2 + 4.1 + ut, where {ut} is a white noise time series with standard deviation s = 2.0. The correlation function is found as before using the difference equation r(h) = 0.2r(h – 1) – 0.5r(h – 2).

Alternatively, setting up the polynomial b(x) = 1 – 0.2x + 0.5x², whose roots r1, r2 = 0.2 ± 1.4i are complex conjugates with |r1| = |r2| = √2 > 1.

Thus r(h) is a damped cosine wave, r(h) = A(1/√2)^h cos(θh + φ), where the damping factor 1/√2 = |1/r1| and the frequency θ come from 1/r1 = 0.1 – 0.7i, and A and φ are determined from the starting values r(0) = 1 and r(1) = 0.2/1.5 ≈ 0.133.

Conditions for stationarity Autoregressive Time series of order p, AR(p)

The Autoregressive Time series of order p, AR(p): for certain values of b1, b2, … , bp the time series is not stationary. Consider the AR(1) time series {xt | t ∈ T} with d = 0: xt = b1xt – 1 + ut. If b1 = 1 the series is a random walk, which is not stationary.

If |b1| > 1 and d = 0, the value of xt increases in magnitude and ut eventually becomes negligible; the time series {xt | t ∈ T} then approximately satisfies the equation xt = b1xt – 1 and exhibits deterministic behaviour. Finally, if |b1| < 1 then the time series {xt | t ∈ T} is stationary.

Summarizing: if |b1| < 1 then the time series {xt | t ∈ T} is stationary; if |b1| > 1 then the time series {xt | t ∈ T} is deterministic; if |b1| = 1 then the time series {xt | t ∈ T} is random but non-stationary.

Generalizing: Let b1, b2, … , bp denote p numbers. Let {ut | t ∈ T} denote a white noise time series with variance s² (independent, mean 0, variance s²). Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + … + bpxt – p + d + ut, i.e. {xt | t ∈ T} is an Autoregressive time series of order p.

Consider the polynomial b(x) = 1 – b1x – b2x² – … – bpx^p with roots r1, r2, … , rp. Then {xt | t ∈ T} is stationary if |ri| > 1 for all i. If |ri| < 1 for at least one i then {xt | t ∈ T} exhibits deterministic behaviour. If |ri| ≥ 1 for all i and |ri| = 1 for at least one i then {xt | t ∈ T} exhibits non-stationary random behaviour.
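A check of this criterion in code (my illustration; Python with NumPy assumed): compute the roots of b(x) = 1 - b1x - … - bpx^p and classify the AR(p) model accordingly.

```python
import numpy as np

def classify_ar(b):
    """Classify an AR(p) model from the roots of b(x) = 1 - b1*x - ... - bp*x^p."""
    b = np.asarray(b, dtype=float)
    roots = np.roots(np.concatenate((-b[::-1], [1.0])))
    mags = np.abs(roots)
    if np.all(mags > 1):
        return roots, "stationary"
    if np.any(mags < 1):
        return roots, "deterministic (explosive) behaviour"
    return roots, "non-stationary random behaviour (a unit root)"

print(classify_ar([0.7, 0.2]))   # stationary
print(classify_ar([1.0]))        # random walk: unit root, non-stationary
print(classify_ar([1.2]))        # |root| < 1: deterministic behaviour
```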

Special Cases: The AR(1) time series. Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + d + ut.

Consider the polynomial b(x) = 1 – b1x, with root r1 = 1/b1. {xt | t ∈ T} is stationary if |r1| > 1, or |b1| < 1. If |r1| < 1, or |b1| > 1, then {xt | t ∈ T} exhibits deterministic behaviour. If |r1| = 1, or |b1| = 1, then {xt | t ∈ T} exhibits non-stationary random behaviour.

Special Cases: The AR(2) time series. Let {xt | t ∈ T} be defined by the equation xt = b1xt – 1 + b2xt – 2 + d + ut.

Consider the polynomial b(x) = 1 – b1x – b2x², where r1 and r2 are the roots of b(x). {xt | t ∈ T} is stationary if |r1| > 1 and |r2| > 1. This is true if b1 + b2 < 1, b2 – b1 < 1 and b2 > -1. These inequalities define a triangular region for b1 and b2. If |ri| < 1 for some i then {xt | t ∈ T} exhibits deterministic behaviour. If |ri| ≤ 1 for i = 1, 2 and |ri| = 1 for at least one i then {xt | t ∈ T} exhibits non-stationary random behaviour.
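The triangular region can be checked directly (a small illustration of mine; plain Python):

```python
def ar2_is_stationary(b1, b2):
    """AR(2) stationarity via the triangle conditions b1 + b2 < 1, b2 - b1 < 1, b2 > -1."""
    return (b1 + b2 < 1) and (b2 - b1 < 1) and (b2 > -1)

print(ar2_is_stationary(0.7, 0.2))    # True  (first worked example: real roots)
print(ar2_is_stationary(0.2, -0.5))   # True  (second worked example: complex roots)
print(ar2_is_stationary(0.5, 0.5))    # False (b1 + b2 = 1 gives a unit root)
```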

Patterns of the ACF and PACF of AR(2) Time Series: in the shaded region of the (b1, b2) stationarity triangle the roots of the AR operator are complex.