The Spectral Representation of Stationary Time Series


Stationary time series satisfy the properties: constant mean (E(x_t) = μ), constant variance (Var(x_t) = σ²), and covariance between two observations x_t and x_{t+h} that depends only on the lag h. These properties make possible the spectral (frequency-domain) representation of a stationary time series.

Recall that
x_t = Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)]
is a stationary time series, where λ_1, λ_2, …, λ_k are k values in (0, π) and X_1, X_2, …, X_k and Y_1, Y_2, …, Y_k are independent random variables with E(X_i) = E(Y_i) = 0 and Var(X_i) = Var(Y_i) = σ_i².

With this time series, E(x_t) = 0 and σ(h) = Σ_{i=1}^k σ_i² cos(λ_i h). We can give it a non-zero mean, μ, by adding μ to the equation:
x_t = μ + Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)], so that E(x_t) = μ and σ(h) is unchanged.

We now extend this example to a wider class of time series, which turns out to be the complete class of weakly stationary time series. In this wider class the collection of frequencies may vary over the continuous range [0, π].

The Riemann integral is ∫_a^b g(x) dx. The Riemann–Stieltjes integral is ∫_a^b g(x) dF(x). If F is continuous with derivative f, then ∫_a^b g(x) dF(x) = ∫_a^b g(x) f(x) dx. If F is a step function with jumps p_i at the points x_i, then ∫_a^b g(x) dF(x) = Σ_i g(x_i) p_i.
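
A small numerical sketch may make the two cases concrete (assuming NumPy; the helper rs_integral is hypothetical, written for this note). It approximates ∫ g dF by the sum Σ_i g(x_i*)[F(x_i) − F(x_{i−1})], which reproduces the ordinary weighted Riemann integral when F is smooth and a weighted sum of jumps when F is a step function:

```python
import numpy as np

def rs_integral(g, F, a, b, n=10_000):
    """Approximate the Riemann-Stieltjes integral of g with respect to F
    on [a, b] by the sum  sum_i g(x_i) * [F(x_i) - F(x_{i-1})]."""
    x = np.linspace(a, b, n + 1)
    return np.sum(g(x[1:]) * np.diff(F(x)))

g = lambda x: np.cos(x)

# Continuous F with derivative f:  integral of g dF = integral of g(x) f(x) dx.
F_cont = lambda x: x**2 / 2                  # so f(x) = x
print(rs_integral(g, F_cont, 0.0, np.pi))    # ~ integral_0^pi x cos(x) dx = -2

# Step F with jumps p_i at x_i:  integral of g dF = sum_i g(x_i) p_i.
jumps_x = np.array([0.5, 1.5, 2.5])
jumps_p = np.array([0.2, 0.5, 0.3])
F_step = lambda x: np.where(x[:, None] >= jumps_x, jumps_p, 0.0).sum(axis=1)
print(rs_integral(g, F_step, 0.0, np.pi))    # matches the weighted jump sum:
print(np.sum(g(jumps_x) * jumps_p))
```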

First, we are going to develop the concept of integration with respect to a stochastic process. Let {U(λ): λ ∈ [0, π]} denote a stochastic process with mean 0 and independent increments; that is,
E{[U(λ_2) − U(λ_1)][U(λ_4) − U(λ_3)]} = 0 for 0 ≤ λ_1 < λ_2 ≤ λ_3 < λ_4 ≤ π,
and E[U(λ)] = 0 for 0 ≤ λ ≤ π.

In addition, let G(λ) = E[U²(λ)] for 0 ≤ λ ≤ π and assume G(0) = 0. It is easy to show that G(λ) is monotonically non-decreasing, i.e. G(λ_1) ≤ G(λ_2) for λ_1 < λ_2, since by the independent-increments property G(λ_2) = G(λ_1) + E{[U(λ_2) − U(λ_1)]²} ≥ G(λ_1).

Now let us define the stochastic integral ∫_0^π g(λ) dU(λ), analogous to the Riemann–Stieltjes integral.

Let 0 = λ_0 < λ_1 < λ_2 < … < λ_n = π be any partition of the interval, and let λ_i* denote any value in the interval [λ_{i−1}, λ_i]. Consider the sum
V_n = Σ_{i=1}^n g(λ_i*)[U(λ_i) − U(λ_{i−1})].
Suppose that max_i (λ_i − λ_{i−1}) → 0 as n → ∞ and that there exists a random variable V such that E[(V_n − V)²] → 0.

Then V is denoted by V = ∫_0^π g(λ) dU(λ).

Properties:
E[∫_0^π g(λ) dU(λ)] = 0, and
E{[∫_0^π g(λ) dU(λ)]²} = ∫_0^π g²(λ) dG(λ).
More generally, E{[∫_0^π g_1(λ) dU(λ)][∫_0^π g_2(λ) dU(λ)]} = ∫_0^π g_1(λ) g_2(λ) dG(λ).

The Spectral Representation of Stationary Time Series

Let {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} denote two uncorrelated stochastic processes, each with mean 0 and independent increments. Also let F(λ) = E[X²(λ)] = E[Y²(λ)] for 0 ≤ λ ≤ π, with F(0) = 0. Now define the time series {x_t : t ∈ T} as follows:
x_t = ∫_0^π cos(λt) dX(λ) + ∫_0^π sin(λt) dY(λ).

Then E[x_t] = 0 for all t, by the first property of the stochastic integral.

Also, using the second property and the fact that X and Y are uncorrelated,
E[x_t x_{t+h}] = ∫_0^π [cos(λt) cos(λ(t+h)) + sin(λt) sin(λ(t+h))] dF(λ) = ∫_0^π cos(λh) dF(λ),
which depends only on the lag h.

Thus the time series {x_t : t ∈ T} defined by
x_t = ∫_0^π cos(λt) dX(λ) + ∫_0^π sin(λt) dY(λ)
is a stationary time series with autocovariance function σ(h) = ∫_0^π cos(λh) dF(λ). F(λ) is called the spectral distribution function. If the derivative f(λ) = F′(λ) exists, then f(λ) is called the spectral density function.

Note: the spectral distribution function, F(λ), and the spectral density function, f(λ), describe how the variance of x_t is distributed over the frequencies in the interval [0, π].

The autocovariance function, σ(h), can be computed from the spectral density function, f(λ), as follows:
σ(h) = ∫_0^π cos(λh) f(λ) dλ.
Also, the spectral density function, f(λ), can be computed from the autocovariance function, σ(h), as follows:
f(λ) = (1/π)[σ(0) + 2 Σ_{h=1}^∞ σ(h) cos(λh)].
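
A minimal sketch of the second formula (assuming NumPy; the helper spectral_density is hypothetical). It evaluates f(λ) on [0, π] from a finite list of autocovariances, here for an MA(1) series whose autocovariances are σ(0) = (1 + a_1²)σ² and σ(1) = a_1 σ², and compares the result with the closed form:

```python
import numpy as np

# f(lam) = (1/pi) * ( sigma(0) + 2 * sum_{h >= 1} sigma(h) cos(lam * h) )
def spectral_density(acov, lam):
    h = np.arange(1, len(acov))
    return (acov[0] + 2.0 * np.sum(acov[1:] * np.cos(np.outer(lam, h)), axis=1)) / np.pi

# MA(1) with a_1 = 0.6 and sigma^2 = 1: sigma(0) = 1 + a_1^2, sigma(1) = a_1,
# and sigma(h) = 0 for h > 1.
a1 = 0.6
acov = np.array([1 + a1**2, a1])
lam = np.linspace(0, np.pi, 5)
print(spectral_density(acov, lam))
print((1 + a1**2 + 2 * a1 * np.cos(lam)) / np.pi)   # closed form, same values
```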

Example: Suppose X_1, X_2, …, X_k and Y_1, Y_2, …, Y_k are independent random variables with E(X_i) = E(Y_i) = 0 and Var(X_i) = Var(Y_i) = σ_i². Let λ_1, λ_2, …, λ_k denote k values in (0, π). Then
x_t = Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)]
has autocovariance function σ(h) = Σ_{i=1}^k σ_i² cos(λ_i h).

If we define {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} by X(λ) = Σ_{i: λ_i ≤ λ} X_i and Y(λ) = Σ_{i: λ_i ≤ λ} Y_i, then F(λ) = Σ_{i: λ_i ≤ λ} σ_i². Note: X(λ) and Y(λ) are "random" step functions and F(λ) is a step function.
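
A simulation sketch of this example (assuming NumPy, and assuming Gaussian amplitudes purely for convenience). Because each realization keeps its X_i and Y_i fixed, the autocovariance is checked by averaging across many realizations rather than along one series:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = np.array([0.5, 1.2, 2.4])        # frequencies lambda_i in (0, pi)
sig2 = np.array([1.0, 0.5, 0.25])      # sigma_i^2 = Var(X_i) = Var(Y_i)
reps, n = 20_000, 8                    # many realizations of a short series

X = rng.normal(0.0, np.sqrt(sig2), size=(reps, 1, 3))
Y = rng.normal(0.0, np.sqrt(sig2), size=(reps, 1, 3))
t = np.arange(n).reshape(1, n, 1)
x = np.sum(X * np.cos(lam * t) + Y * np.sin(lam * t), axis=2)   # (reps, n)

# E[x_t x_{t+h}] should equal sigma(h) = sum_i sigma_i^2 cos(lambda_i h):
for h in range(4):
    print(h, round(np.mean(x[:, 0] * x[:, h]), 3),
          round(float(np.sum(sig2 * np.cos(lam * h))), 3))
```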

Another important comment: in the case when F(λ) is continuous, the spectral density f(λ) = F′(λ) exists and F(λ) = ∫_0^λ f(u) du, so the variance of x_t is spread smoothly over the frequency range rather than concentrated at a finite set of frequencies.

Sometimes the spectral density function, f(λ), is extended to the interval [−π, π] and is assumed symmetric about 0 (i.e. f_s(λ) = f_s(−λ) = f(λ)/2). In this case
σ(h) = ∫_{−π}^{π} cos(λh) f_s(λ) dλ = ∫_{−π}^{π} e^{iλh} f_s(λ) dλ.
It can be shown that
f_s(λ) = (1/2π) Σ_{h=−∞}^{∞} σ(h) e^{−iλh} = (1/2π) Σ_{h=−∞}^{∞} σ(h) cos(λh).

From now on we will use the symmetric spectral density function and denote it by f(λ). Hence
σ(h) = ∫_{−π}^{π} e^{iλh} f(λ) dλ and f(λ) = (1/2π) Σ_{h=−∞}^{∞} σ(h) e^{−iλh}.

Example: Let {u_t : t ∈ T} be identically distributed and uncorrelated with mean zero (a white noise series). Thus σ(0) = Var(u_t) = σ² and σ(h) = 0 for h ≠ 0, and
f(λ) = (1/2π) Σ_{h=−∞}^{∞} σ(h) e^{−iλh} = σ²/(2π) for −π ≤ λ ≤ π.

Graph: the spectral density of a white noise series is constant (flat) at σ²/(2π) over [−π, π].
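
As a quick numerical check (a sketch assuming NumPy), the periodogram of a simulated white noise series scatters around the flat level σ²/(2π):

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 100_000, 2.0
u = rng.normal(0, np.sqrt(sigma2), n)

# Periodogram I(lam_j) = |sum_t u_t e^{-i lam_j t}|^2 / (2 pi n); for white
# noise its average over frequencies is close to sigma^2 / (2 pi).
I = np.abs(np.fft.rfft(u))**2 / (2 * np.pi * n)
print(np.mean(I[1:]), sigma2 / (2 * np.pi))   # both ~ 0.318
```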

Linear Filters

Let {x_t : t ∈ T} be any time series, and suppose that the time series {y_t : t ∈ T} is constructed as follows:
y_t = Σ_s a_s x_{t−s}.
The time series {y_t : t ∈ T} is said to be constructed from {x_t : t ∈ T} by means of a linear filter, with input x_t and output y_t.

Let σ_x(h) denote the autocovariance function of {x_t : t ∈ T} and σ_y(h) the autocovariance function of {y_t : t ∈ T}. Assume also that E[x_t] = E[y_t] = 0. Then
σ_y(h) = E[y_t y_{t+h}] = Σ_s Σ_r a_s a_r σ_x(h + s − r).
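
A sketch verifying this double-sum formula by simulation (assuming NumPy; the filter y_t = x_t + 0.6 x_{t−1} and the AR(1) input are illustrative choices, with σ_x(h) = b^|h|/(1 − b²) for unit-variance noise):

```python
import numpy as np

b, a = 0.7, np.array([1.0, 0.6])
sigma_x = lambda h: b**abs(h) / (1 - b**2)   # AR(1) autocovariance

# sigma_y(h) = sum_s sum_r a_s a_r sigma_x(h + s - r)
def sigma_y(h):
    return sum(a[s] * a[r] * sigma_x(h + s - r)
               for s in range(len(a)) for r in range(len(a)))

# Compare with a long simulation of the filtered series:
rng = np.random.default_rng(2)
n = 500_000
u = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = b * x[t - 1] + u[t]
y = x[1:] + 0.6 * x[:-1]
for h in range(3):
    print(h, round(np.mean(y[:len(y)-h] * y[h:]), 3), round(sigma_y(h), 3))
```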

Hence the spectral densities are related by
f_y(λ) = |A(e^{−iλ})|² f_x(λ), where A(e^{−iλ}) = Σ_s a_s e^{−iλs}
is the transfer function of the linear filter.

Note:
|A(e^{−iλ})|² = A(e^{−iλ}) A(e^{iλ}) = Σ_s Σ_r a_s a_r e^{−iλ(s−r)},
hence σ_y(h) = ∫_{−π}^{π} e^{iλh} |A(e^{−iλ})|² f_x(λ) dλ.

Spectral density function of a moving average time series of order q, MA(q). Let a_0 = 1, a_1, a_2, …, a_q denote q + 1 numbers. Let {u_t : t ∈ T} denote a white noise time series with variance σ². Let {x_t : t ∈ T} denote the MA(q) time series with μ = 0:
x_t = u_t + a_1 u_{t−1} + … + a_q u_{t−q} = Σ_{s=0}^q a_s u_{t−s}.
Note: {x_t : t ∈ T} is obtained from {u_t : t ∈ T} by a linear filter.

Now f_u(λ) = σ²/(2π). Hence
f_x(λ) = |Σ_{s=0}^q a_s e^{−iλs}|² σ²/(2π).
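
A sketch of this formula (assuming NumPy; the helper ma_spectral_density is hypothetical). For an MA(1) it reproduces the q = 1 closed form given below:

```python
import numpy as np

def ma_spectral_density(a, sigma2, lam):
    """f_x(lam) = |sum_{s=0}^q a_s e^{-i s lam}|^2 * sigma^2 / (2 pi),
    with a = (a_0, a_1, ..., a_q) and a_0 = 1."""
    s = np.arange(len(a))
    A = np.sum(a * np.exp(-1j * np.outer(lam, s)), axis=1)
    return np.abs(A)**2 * sigma2 / (2 * np.pi)

lam = np.linspace(0, np.pi, 5)
print(ma_spectral_density(np.array([1.0, 0.6]), 1.0, lam))        # MA(1)
print((1 + 0.6**2 + 2 * 0.6 * np.cos(lam)) / (2 * np.pi))         # closed form
```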

Example: q = 1.
f_x(λ) = |1 + a_1 e^{−iλ}|² σ²/(2π) = (1 + a_1² + 2a_1 cos λ) σ²/(2π).

Example: q = 2.
f_x(λ) = |1 + a_1 e^{−iλ} + a_2 e^{−2iλ}|² σ²/(2π) = [1 + a_1² + a_2² + 2a_1(1 + a_2) cos λ + 2a_2 cos 2λ] σ²/(2π).

Graph: spectral density function for an MA(1) series.

Spectral density function of an autoregressive time series of order p, AR(p). Let b_1, b_2, …, b_p denote p numbers. Let {u_t : t ∈ T} denote a white noise time series with variance σ². Let {x_t : t ∈ T} denote the AR(p) time series with δ = 0:
x_t = b_1 x_{t−1} + … + b_p x_{t−p} + u_t.
Note: {u_t : t ∈ T} is obtained from {x_t : t ∈ T} by a linear filter:
u_t = x_t − b_1 x_{t−1} − … − b_p x_{t−p}.

Now f_u(λ) = |1 − Σ_{s=1}^p b_s e^{−iλs}|² f_x(λ) = σ²/(2π). Hence
f_x(λ) = σ² / (2π |1 − Σ_{s=1}^p b_s e^{−iλs}|²).
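
The companion sketch for the AR case (assuming NumPy; the helper ar_spectral_density is hypothetical). For an AR(1) it reproduces the p = 1 closed form given below:

```python
import numpy as np

def ar_spectral_density(b, sigma2, lam):
    """f_x(lam) = sigma^2 / (2 pi |1 - sum_{s=1}^p b_s e^{-i s lam}|^2),
    with b = (b_1, ..., b_p)."""
    s = np.arange(1, len(b) + 1)
    B = 1 - np.sum(b * np.exp(-1j * np.outer(lam, s)), axis=1)
    return sigma2 / (2 * np.pi * np.abs(B)**2)

lam = np.linspace(0, np.pi, 5)
print(ar_spectral_density(np.array([0.7]), 1.0, lam))             # AR(1)
print(1 / (2 * np.pi * (1 - 2 * 0.7 * np.cos(lam) + 0.7**2)))     # closed form
```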

Example: p = 1.
f_x(λ) = σ² / [2π |1 − b_1 e^{−iλ}|²] = σ² / [2π (1 − 2b_1 cos λ + b_1²)].

Example: p = 2.
f_x(λ) = σ² / [2π |1 − b_1 e^{−iλ} − b_2 e^{−2iλ}|²] = σ² / {2π [1 + b_1² + b_2² − 2b_1(1 − b_2) cos λ − 2b_2 cos 2λ]}.

Example: Sunspot Numbers (1770–1869)

Graphs (not reproduced in this transcript): the sunspot series itself, its autocorrelation function and partial autocorrelation function, a spectral density estimate, and the spectral density assuming an AR(2) model.

A linear discrete time series: moving average time series of infinite order

Let θ_0 = 1, θ_1, θ_2, … denote an infinite sequence of numbers. Let {u_t : t ∈ T} denote a white noise time series with mean 0 and variance σ². Let {x_t : t ∈ T} be defined by the equation
x_t = θ_0 u_t + θ_1 u_{t−1} + θ_2 u_{t−2} + … = Σ_{s=0}^∞ θ_s u_{t−s}.
Then {x_t : t ∈ T} is called a linear discrete time series. Comment: a linear discrete time series is a moving average time series of infinite order.

The AR(1) time series: let {x_t : t ∈ T} be defined by the equation x_t = b_1 x_{t−1} + u_t. Then, by repeated substitution,
x_t = u_t + b_1 u_{t−1} + b_1² u_{t−2} + … ,

i.e. x_t = Σ_{s=0}^∞ θ_s u_{t−s}, where θ_s = b_1^s, and the series converges provided |b_1| < 1. An alternative approach uses the backshift operator, B, defined by B x_t = x_{t−1}. The equation x_t = b_1 x_{t−1} + u_t can be written
(1 − b_1 B) x_t = u_t.

Now since (1 − b_1 B)^{−1} = 1 + b_1 B + b_1² B² + … , the equation (1 − b_1 B) x_t = u_t has the equivalent form
x_t = (1 − b_1 B)^{−1} u_t = u_t + b_1 u_{t−1} + b_1² u_{t−2} + … .

For the general AR(p) time series b(B) x_t = u_t, where b(B) = 1 − b_1 B − b_2 B² − … − b_p B^p, the time series {x_t : t ∈ T} can be written as a linear discrete time series x_t = [b(B)]^{−1} u_t, and [b(B)]^{−1} can be found by carrying out the multiplication in b(B) [b(B)]^{−1} = 1 and equating coefficients of powers of B.

Thus the AR(p) time series b(B) x_t = u_t can be written
x_t = θ(B) u_t = u_t + θ_1 u_{t−1} + θ_2 u_{t−2} + … ,
where θ(B) = [b(B)]^{−1} = 1 + θ_1 B + θ_2 B² + … . Hence x_t is expressed in terms of the present and past values of the white noise series. This is called the random shock form of the series.

The random shock form of an ARMA(p,q) time series: an ARMA(p,q) time series {x_t : t ∈ T} satisfies the equation b(B) x_t = a(B) u_t, where b(B) = 1 − b_1 B − b_2 B² − … − b_p B^p and a(B) = 1 + a_1 B + a_2 B² + … + a_q B^q.

Again, the time series {x_t : t ∈ T} can be written as a linear discrete time series, namely x_t = θ(B) u_t, where θ(B) = [b(B)]^{−1} a(B) = 1 + θ_1 B + θ_2 B² + … can be found by carrying out the multiplication in b(B) θ(B) = a(B) and equating coefficients of powers of B.

Thus an ARMA(p,q) time series can be written
x_t = u_t + θ_1 u_{t−1} + θ_2 u_{t−2} + … , where θ(B) = [b(B)]^{−1} a(B).
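
A sketch that carries out this coefficient matching numerically (assuming NumPy; the helper random_shock_weights is hypothetical and uses the b/a notation of these slides). Equating coefficients of B^j in b(B) θ(B) = a(B) gives the recursion θ_j = a_j + Σ_{s=1}^{min(j,p)} b_s θ_{j−s} with θ_0 = 1 and a_j = 0 for j > q:

```python
import numpy as np

def random_shock_weights(b, a, n_weights=8):
    """theta_0, theta_1, ... of theta(B) = [b(B)]^{-1} a(B), found by
    equating coefficients of B^j in b(B) theta(B) = a(B).  For a pure
    AR(p), pass a = []."""
    theta = np.zeros(n_weights)
    theta[0] = 1.0
    for j in range(1, n_weights):
        theta[j] = a[j - 1] if j <= len(a) else 0.0
        for s in range(1, min(j, len(b)) + 1):
            theta[j] += b[s - 1] * theta[j - s]
    return theta

print(random_shock_weights(b=[0.5], a=[]))     # AR(1): theta_j = 0.5^j
print(random_shock_weights(b=[0.5], a=[0.4]))  # ARMA(1,1): 1, 0.9, 0.45, ...
```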

The inverted form of a stationary time series: autoregressive time series of infinite order

An ARMA(p,q) time series {x_t : t ∈ T} satisfies the equation b(B) x_t = a(B) u_t, where b(B) = 1 − b_1 B − … − b_p B^p and a(B) = 1 + a_1 B + … + a_q B^q. Suppose that the inverse operator [a(B)]^{−1} exists. This will be true if the roots of the polynomial a(x) = 1 + a_1 x + … + a_q x^q all exceed 1 in absolute value. The time series {x_t : t ∈ T} in this case is called invertible.

Then [a(B)]^{−1} b(B) x_t = u_t, or
x_t = π_1 x_{t−1} + π_2 x_{t−2} + … + u_t,
where π(B) = [a(B)]^{−1} b(B) = 1 − π_1 B − π_2 B² − … .

Thus an ARMA(p,q) time series can be written
x_t = π_1 x_{t−1} + π_2 x_{t−2} + … + u_t, where π(B) = [a(B)]^{−1} b(B).
This is called the inverted form of the time series. It expresses the time series as an autoregressive time series of infinite order.
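
The π weights can be computed the same way as the θ weights (a sketch assuming NumPy; the helper inverted_form_weights is hypothetical). Equating coefficients of B^j in a(B) π(B) = b(B), with c_0 = 1 and c_j = −π_j, gives c_j = −b_j − Σ_{s=1}^{min(j,q)} a_s c_{j−s}, where b_j = 0 for j > p:

```python
import numpy as np

def inverted_form_weights(b, a, n_weights=8):
    """pi_1, pi_2, ... of pi(B) = [a(B)]^{-1} b(B) = 1 - pi_1 B - pi_2 B^2 - ...,
    found by equating coefficients of B^j in a(B) pi(B) = b(B).
    Internally c_0 = 1 and c_j = -pi_j."""
    c = np.zeros(n_weights + 1)
    c[0] = 1.0
    for j in range(1, n_weights + 1):
        c[j] = -b[j - 1] if j <= len(b) else 0.0
        for s in range(1, min(j, len(a)) + 1):
            c[j] -= a[s - 1] * c[j - s]
    return -c[1:]

# ARMA(1,1): x_t = 0.5 x_{t-1} + u_t + 0.4 u_{t-1}  (invertible since |a_1| < 1)
print(inverted_form_weights(b=[0.5], a=[0.4]))   # pi_j = 0.9 * (-0.4)^{j-1}
```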