The Spectral Representation of Stationary Time Series.

Stationary time series satisfy the properties:
1. Constant mean: $E(x_t) = \mu$.
2. Constant variance: $\mathrm{Var}(x_t) = \sigma^2$.
3. The correlation between two observations $x_t$ and $x_{t+h}$ depends only on the lag $h$.
These properties make it possible to represent a stationary time series as a superposition of periodic components.

Recall that
$$x_t = \sum_{i=1}^{k} \left[ X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t) \right]$$
is a stationary time series, where $\lambda_1, \lambda_2, \ldots, \lambda_k$ are $k$ frequencies in $(0, \pi)$ and $X_1, \ldots, X_k$ and $Y_1, Y_2, \ldots, Y_k$ are independent random variables with
$$E[X_i] = E[Y_i] = 0, \qquad \mathrm{Var}(X_i) = \mathrm{Var}(Y_i) = \sigma_i^2.$$

For this time series
$$E[x_t] = 0 \quad \text{and} \quad \gamma(h) = \mathrm{Cov}(x_t, x_{t+h}) = \sum_{i=1}^{k} \sigma_i^2 \cos(\lambda_i h).$$
We can give it a non-zero mean $\mu$ by adding $\mu$ to the equation:
$$x_t = \mu + \sum_{i=1}^{k} \left[ X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t) \right].$$
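A quick Monte Carlo sketch of these two facts. All numerical values below (frequencies, variances, the mean, the lag) are illustrative choices, not from the text:

```python
import numpy as np

# Monte Carlo sketch: for x_t = mu + sum_i [X_i cos(lambda_i t) + Y_i sin(lambda_i t)],
# the ensemble mean should be mu and Cov(x_t, x_{t+h}) should equal
# sum_i sigma_i^2 cos(lambda_i h), independent of t.
rng = np.random.default_rng(0)
lam = np.array([0.5, 1.2, 2.5])      # illustrative frequencies in (0, pi)
sig2 = np.array([1.0, 0.5, 0.25])    # illustrative variances sigma_i^2
mu, n_rep = 3.0, 200_000

X = rng.normal(0.0, np.sqrt(sig2), size=(n_rep, 3))  # X_i ~ N(0, sigma_i^2)
Y = rng.normal(0.0, np.sqrt(sig2), size=(n_rep, 3))

def x_at(s):
    """One draw of x_s for each of the n_rep replications."""
    return mu + (X * np.cos(lam * s) + Y * np.sin(lam * s)).sum(axis=1)

t, h = 7, 4                          # check a single (t, t + h) pair
xt, xth = x_at(t), x_at(t + h)
gamma_emp = np.mean((xt - mu) * (xth - mu))
gamma_theory = np.sum(sig2 * np.cos(lam * h))
```

The empirical covariance agrees with $\sum_i \sigma_i^2 \cos(\lambda_i h)$ to Monte Carlo accuracy, for any choice of $t$.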

We now extend this example to a wider class of time series, which turns out to be the complete class of weakly stationary time series. In this case the collection of frequencies may vary over the continuous range $[0, \pi]$.

The Riemann integral: $\int_a^b g(x)\,dx$.
The Riemann–Stieltjes integral: $\int_a^b g(x)\,dF(x)$.
If $F$ is continuous with derivative $f$, then
$$\int_a^b g(x)\,dF(x) = \int_a^b g(x)\,f(x)\,dx.$$
If $F$ is a step function with jumps $p_i$ at the points $x_i$, then
$$\int_a^b g(x)\,dF(x) = \sum_i g(x_i)\,p_i.$$
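The two cases can be sketched numerically. The choices $g(x) = \cos x$, $F(x) = x^2$ and the jump points below are illustrative:

```python
import numpy as np

# Numerical sketch of the two Riemann-Stieltjes cases for I = ∫ g dF on [0, 1].
g = lambda x: np.cos(x)

# Case 1: F continuous with derivative f. Take F(x) = x^2, so f(x) = 2x,
# and ∫ g dF = ∫ g(x) f(x) dx (left Riemann sum on a fine grid).
x = np.linspace(0.0, 1.0, 200_001)
f = 2.0 * x
case1 = np.sum(g(x[:-1]) * f[:-1] * np.diff(x))

# The same integral as a Stieltjes sum  Σ g(x_i*) [F(x_i) - F(x_{i-1})]:
F = x**2
stieltjes = np.sum(g(x[:-1]) * np.diff(F))

# Case 2: F a step function with jumps p_i at points x_i:
# ∫ g dF collapses to  Σ g(x_i) p_i.
xi = np.array([0.2, 0.5, 0.9])
pi_ = np.array([0.3, 0.5, 0.2])
case2 = np.sum(g(xi) * pi_)
```

Here `case1` and `stieltjes` agree (both approximate $\int_0^1 2x\cos x\,dx$), while `case2` is just the weighted sum over the jump points.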

First, we are going to develop the concept of integration with respect to a stochastic process. Let $\{U(\lambda) : \lambda \in [0, \pi]\}$ denote a stochastic process with mean 0 and independent increments; that is,
$$E\{[U(\lambda_2) - U(\lambda_1)][U(\lambda_4) - U(\lambda_3)]\} = 0 \quad \text{for } 0 \le \lambda_1 < \lambda_2 \le \lambda_3 < \lambda_4 \le \pi,$$
and $E[U(\lambda)] = 0$ for $0 \le \lambda \le \pi$.

In addition, let $G(\lambda) = E[U^2(\lambda)]$ for $0 \le \lambda \le \pi$, and assume $G(0) = 0$. It is easy to show that $G(\lambda)$ is monotonically non-decreasing, i.e. $G(\lambda_1) \le G(\lambda_2)$ for $\lambda_1 < \lambda_2$.

Now let us define $\int_0^{\pi} g(\lambda)\,dU(\lambda)$, analogous to the Riemann–Stieltjes integral.

Let $0 = \lambda_0 < \lambda_1 < \lambda_2 < \cdots < \lambda_n = \pi$ be any partition of the interval, and let $\lambda_i^*$ denote any value in the interval $[\lambda_{i-1}, \lambda_i]$. Consider the sum
$$V_n = \sum_{i=1}^{n} g(\lambda_i^*)\,\left[U(\lambda_i) - U(\lambda_{i-1})\right].$$
Suppose that $\max_i (\lambda_i - \lambda_{i-1}) \to 0$ and that there exists a random variable $V$ such that
$$\lim_{n \to \infty} E\left[(V_n - V)^2\right] = 0.$$

Then $V$ is denoted by
$$V = \int_0^{\pi} g(\lambda)\,dU(\lambda).$$

Properties:
$$E\left[\int_0^{\pi} g(\lambda)\,dU(\lambda)\right] = 0,$$
$$E\left[\left(\int_0^{\pi} g(\lambda)\,dU(\lambda)\right)\left(\int_0^{\pi} h(\lambda)\,dU(\lambda)\right)\right] = \int_0^{\pi} g(\lambda)\,h(\lambda)\,dG(\lambda).$$
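These properties can be checked by simulation. The sketch below takes $U$ to be Brownian motion on $[0, \pi]$, a convenient mean-zero process with independent increments for which $G(\lambda) = \lambda$; both this choice of $U$ and the choice $g(\lambda) = \cos 3\lambda$ are illustrative assumptions:

```python
import numpy as np

# Approximate V = ∫ g(lambda) dU(lambda) by Stieltjes-type sums over a fine
# partition, with U Brownian motion on [0, pi] (so G(lambda) = lambda), and
# check:  E[V] = 0  and  E[V^2] = ∫ g^2 dG.
rng = np.random.default_rng(1)
n, n_rep = 256, 40_000
lam = np.linspace(0.0, np.pi, n + 1)
dlam = np.diff(lam)
g = np.cos(3.0 * lam[:-1])                 # g evaluated at left endpoints

# Increments U(lam_i) - U(lam_{i-1}) ~ N(0, dlam_i), independent across i:
dU = rng.normal(0.0, np.sqrt(dlam), size=(n_rep, n))
V = dU @ g                                 # one approximating sum per replication

target = np.sum(g**2 * dlam)               # ∫ g^2 dG with G(lambda) = lambda
```

Across replications, `V` averages to about 0 and `V**2` averages to about `target`, matching the two properties.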

The Spectral Representation of Stationary Time Series

Let $\{X(\lambda) : \lambda \in [0, \pi]\}$ and $\{Y(\lambda) : \lambda \in [0, \pi]\}$ denote two mutually uncorrelated stochastic processes with mean 0 and independent increments. Also let
$$F(\lambda) = E[X^2(\lambda)] = E[Y^2(\lambda)] \quad \text{for } 0 \le \lambda \le \pi, \qquad F(0) = 0.$$
Now define the time series $\{x_t : t \in T\}$ as follows:
$$x_t = \int_0^{\pi} \cos(t\lambda)\,dX(\lambda) + \int_0^{\pi} \sin(t\lambda)\,dY(\lambda).$$

Then $E[x_t] = 0$.

Also
$$\mathrm{Cov}(x_t, x_{t+h}) = \int_0^{\pi} \cos(h\lambda)\,dF(\lambda).$$

Thus the time series $\{x_t : t \in T\}$ defined by
$$x_t = \int_0^{\pi} \cos(t\lambda)\,dX(\lambda) + \int_0^{\pi} \sin(t\lambda)\,dY(\lambda)$$
is a stationary time series with autocovariance function
$$\gamma(h) = \int_0^{\pi} \cos(h\lambda)\,dF(\lambda).$$
$F(\lambda)$ is called the spectral distribution function. If the derivative $f(\lambda) = F'(\lambda)$ exists, then $f(\lambda)$ is called the spectral density function.

Note: the spectral distribution function $F(\lambda)$ and the spectral density function $f(\lambda)$ describe how the variance of $x_t$ is distributed over the frequencies in the interval $[0, \pi]$.

The autocovariance function $\gamma(h)$ can be computed from the spectral density function $f(\lambda)$ as follows:
$$\gamma(h) = \int_0^{\pi} \cos(h\lambda)\,f(\lambda)\,d\lambda.$$
Also, the spectral density function $f(\lambda)$ can be computed from the autocovariance function $\gamma(h)$ as follows:
$$f(\lambda) = \frac{1}{\pi}\left[\gamma(0) + 2\sum_{h=1}^{\infty} \gamma(h)\cos(h\lambda)\right], \qquad 0 \le \lambda \le \pi.$$
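The inversion pair can be verified numerically. The autocovariance below ($\gamma(0)=2$, $\gamma(1)=0.8$, zero beyond lag 1) is an illustrative choice of a valid MA(1)-type autocovariance:

```python
import numpy as np

# Numerical check of the inversion pair on [0, pi] for an illustrative
# autocovariance: gamma(0) = 2, gamma(1) = 0.8, gamma(h) = 0 for h >= 2.
gamma = {0: 2.0, 1: 0.8}
g = lambda h: gamma.get(abs(h), 0.0)

lam = np.linspace(0.0, np.pi, 100_001)
# f(lambda) = (1/pi) [ gamma(0) + 2 * sum_{h>=1} gamma(h) cos(h lambda) ]
f = (g(0) + 2.0 * g(1) * np.cos(lam)) / np.pi

# Recover gamma(h) = ∫_0^pi cos(h lambda) f(lambda) d lambda by a left Riemann sum:
rec = [np.sum(np.cos(h * lam[:-1]) * f[:-1] * np.diff(lam)) for h in range(4)]
```

Integrating $\cos(h\lambda)\,f(\lambda)$ returns $2,\ 0.8,\ 0,\ 0$ for $h = 0, 1, 2, 3$, as it should.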

Example: Let $\{u_t : t \in T\}$ be identically distributed and uncorrelated with mean zero (a white noise series). Then
$$\gamma(0) = \sigma^2, \qquad \gamma(h) = 0 \text{ for } h \ne 0,$$
and
$$f(\lambda) = \frac{\gamma(0)}{\pi} = \frac{\sigma^2}{\pi}, \qquad 0 \le \lambda \le \pi.$$

Graph: the white-noise spectral density is constant over $[0, \pi]$.

Example: Suppose $X_1, \ldots, X_k$ and $Y_1, Y_2, \ldots, Y_k$ are independent random variables with
$$E[X_i] = E[Y_i] = 0, \qquad \mathrm{Var}(X_i) = \mathrm{Var}(Y_i) = \sigma_i^2.$$
Let $\lambda_1, \lambda_2, \ldots, \lambda_k$ denote $k$ values in $(0, \pi)$. Then
$$x_t = \sum_{i=1}^{k}\left[X_i \cos(\lambda_i t) + Y_i \sin(\lambda_i t)\right] \quad \text{has} \quad \gamma(h) = \sum_{i=1}^{k} \sigma_i^2 \cos(\lambda_i h).$$

If we define
$$X(\lambda) = \sum_{i\,:\,\lambda_i \le \lambda} X_i \quad \text{and} \quad Y(\lambda) = \sum_{i\,:\,\lambda_i \le \lambda} Y_i \quad \text{for } \lambda \in [0, \pi],$$
then
$$x_t = \int_0^{\pi} \cos(t\lambda)\,dX(\lambda) + \int_0^{\pi} \sin(t\lambda)\,dY(\lambda), \qquad F(\lambda) = \sum_{i\,:\,\lambda_i \le \lambda} \sigma_i^2.$$
Note: $X(\lambda)$ and $Y(\lambda)$ are "random" step functions and $F(\lambda)$ is a step function.

Another important comment: in the case when $F(\lambda)$ is continuous with $f(\lambda) = F'(\lambda)$,
$$\gamma(h) = \int_0^{\pi} \cos(h\lambda)\,f(\lambda)\,d\lambda.$$

Sometimes the spectral density function $f(\lambda)$ is extended to the interval $[-\pi, \pi]$ and assumed symmetric about 0, i.e.
$$f_s(\lambda) = f_s(-\lambda) = \tfrac{1}{2} f(\lambda).$$
In this case it can be shown that
$$\gamma(h) = \int_{-\pi}^{\pi} e^{ih\lambda} f_s(\lambda)\,d\lambda \quad \text{and} \quad f_s(\lambda) = \frac{1}{2\pi}\sum_{h=-\infty}^{\infty} \gamma(h)\,e^{-ih\lambda}.$$

Hence, from now on we will use the symmetric spectral density function and denote it simply by $f(\lambda)$.

Linear Filters

Let $\{x_t : t \in T\}$ be any time series and suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:
$$y_t = \sum_{s} a_s\,x_{t-s}.$$
The time series $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter with weights $a_s$: input $x_t$, output $y_t$.

Let $\gamma_x(h)$ denote the autocovariance function of $\{x_t : t \in T\}$ and $\gamma_y(h)$ the autocovariance function of $\{y_t : t \in T\}$. Assume also that $E[x_t] = E[y_t] = 0$. Then
$$\gamma_y(h) = \sum_{s}\sum_{r} a_s\,a_r\,\gamma_x(h - s + r).$$

Hence
$$f_y(\lambda) = \left|A(e^{-i\lambda})\right|^2 f_x(\lambda),$$
where
$$A(e^{-i\lambda}) = \sum_{s} a_s\,e^{-is\lambda}$$
is the transfer function of the linear filter.
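This relation can be checked numerically for a short filter applied to white noise, for which the symmetric spectral density is $f_x(\lambda) = \sigma^2/2\pi$. The filter weights and $\sigma^2$ below are illustrative:

```python
import numpy as np

# Check f_y(lambda) = |A(e^{-i lambda})|^2 f_x(lambda) for the filter
# y_t = a_0 x_t + a_1 x_{t-1} applied to white noise with variance sigma^2.
a = np.array([0.7, -0.4])            # illustrative filter weights a_0, a_1
sigma2 = 1.5
lam = np.linspace(-np.pi, np.pi, 200_001)

A = a[0] + a[1] * np.exp(-1j * lam)              # transfer function A(e^{-i lambda})
f_y = np.abs(A) ** 2 * sigma2 / (2.0 * np.pi)    # f_x = sigma^2 / (2 pi)

def gamma_y(h):
    """Recover gamma_y(h) = ∫ e^{i h lambda} f_y(lambda) d lambda (left Riemann sum)."""
    return np.sum(np.real(np.exp(1j * h * lam[:-1]) * f_y[:-1]) * np.diff(lam))
```

Direct computation gives $\gamma_y(0) = (a_0^2 + a_1^2)\sigma^2$, $\gamma_y(1) = a_0 a_1 \sigma^2$, and zero beyond lag 1, which is exactly what integrating $e^{ih\lambda} f_y(\lambda)$ returns.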

Note:
$$\left|A(e^{-i\lambda})\right|^2 = A(e^{-i\lambda})\,A(e^{i\lambda}),$$
hence
$$f_y(\lambda) = A(e^{-i\lambda})\,A(e^{i\lambda})\,f_x(\lambda).$$

Spectral density function of a moving average time series of order q, MA(q). Let $\beta_0 = 1, \beta_1, \beta_2, \ldots, \beta_q$ denote $q + 1$ numbers. Let $\{u_t : t \in T\}$ denote a white noise time series with variance $\sigma^2$, and let $\{x_t : t \in T\}$ denote the MA(q) time series with $\mu = 0$:
$$x_t = u_t + \beta_1 u_{t-1} + \cdots + \beta_q u_{t-q}.$$
Note: $\{x_t : t \in T\}$ is obtained from $\{u_t : t \in T\}$ by a linear filter.

Now
$$f_u(\lambda) = \frac{\sigma^2}{2\pi}.$$
Hence
$$f_x(\lambda) = \left|\beta(e^{-i\lambda})\right|^2 f_u(\lambda) = \frac{\sigma^2}{2\pi}\left|\sum_{s=0}^{q} \beta_s\,e^{-is\lambda}\right|^2.$$

Example: q = 1.
$$f(\lambda) = \frac{\sigma^2}{2\pi}\left(1 + 2\beta_1\cos\lambda + \beta_1^2\right).$$

Example: q = 2.
$$f(\lambda) = \frac{\sigma^2}{2\pi}\left(1 + \beta_1^2 + \beta_2^2 + 2\beta_1(1+\beta_2)\cos\lambda + 2\beta_2\cos 2\lambda\right).$$
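Both closed forms can be checked pointwise against the transfer-function expression $(\sigma^2/2\pi)\,|\beta(e^{-i\lambda})|^2$. The coefficient values below are illustrative:

```python
import numpy as np

# Check the q = 1 and q = 2 closed forms against (sigma^2 / 2 pi) |beta(e^{-i lambda})|^2.
sigma2 = 2.0
lam = np.linspace(-np.pi, np.pi, 50_001)
c = sigma2 / (2.0 * np.pi)

# q = 1 (illustrative beta_1):
b1 = 0.6
f1_closed = c * (1 + 2 * b1 * np.cos(lam) + b1**2)
f1_transfer = c * np.abs(1 + b1 * np.exp(-1j * lam)) ** 2

# q = 2 (illustrative beta_1, beta_2):
b1, b2 = 0.6, -0.3
f2_closed = c * (1 + b1**2 + b2**2
                 + 2 * b1 * (1 + b2) * np.cos(lam) + 2 * b2 * np.cos(2 * lam))
f2_transfer = c * np.abs(1 + b1 * np.exp(-1j * lam) + b2 * np.exp(-2j * lam)) ** 2

# Integrating f over [-pi, pi] returns gamma(0); for MA(2) that is
# sigma^2 (1 + beta_1^2 + beta_2^2).
gamma0 = np.sum(f2_closed[:-1] * np.diff(lam))
```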

Spectral density function for an MA(1) series (graph).

Spectral density function of an autoregressive time series of order p, AR(p). Let $\beta_1, \beta_2, \ldots, \beta_p$ denote $p$ numbers. Let $\{u_t : t \in T\}$ denote a white noise time series with variance $\sigma^2$, and let $\{x_t : t \in T\}$ denote the AR(p) time series with $\mu = 0$:
$$x_t = \beta_1 x_{t-1} + \cdots + \beta_p x_{t-p} + u_t.$$
Note: $\{u_t : t \in T\}$ is obtained from $\{x_t : t \in T\}$ by a linear filter.

Now
$$f_u(\lambda) = \frac{\sigma^2}{2\pi} = \left|\beta(e^{-i\lambda})\right|^2 f_x(\lambda), \quad \text{where } \beta(e^{-i\lambda}) = 1 - \sum_{s=1}^{p} \beta_s\,e^{-is\lambda}.$$
Hence
$$f_x(\lambda) = \frac{\sigma^2}{2\pi\left|1 - \sum_{s=1}^{p} \beta_s\,e^{-is\lambda}\right|^2}.$$

Example: p = 1.
$$f(\lambda) = \frac{\sigma^2}{2\pi\left(1 - 2\beta_1\cos\lambda + \beta_1^2\right)}.$$

Example: p = 2.
$$f(\lambda) = \frac{\sigma^2}{2\pi\left(1 + \beta_1^2 + \beta_2^2 - 2\beta_1(1-\beta_2)\cos\lambda - 2\beta_2\cos 2\lambda\right)}.$$
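The AR formulas can be checked the same way: integrating the p = 1 density over $[-\pi, \pi]$ should recover $\gamma(0) = \sigma^2/(1-\beta_1^2)$, and the p = 2 closed form should match the transfer-function expression pointwise. Coefficient values are illustrative and chosen inside the stationarity region:

```python
import numpy as np

sigma2 = 1.0
lam = np.linspace(-np.pi, np.pi, 200_001)

# p = 1 (illustrative beta_1, |beta_1| < 1): gamma(0) = sigma^2 / (1 - beta_1^2).
b1 = 0.5
f1 = sigma2 / (2 * np.pi * (1 - 2 * b1 * np.cos(lam) + b1**2))
gamma0 = np.sum(f1[:-1] * np.diff(lam))          # left Riemann sum over [-pi, pi]

# p = 2 (illustrative beta_1, beta_2): closed form vs transfer-function form.
b1, b2 = 0.5, -0.3
denom = (1 + b1**2 + b2**2
         - 2 * b1 * (1 - b2) * np.cos(lam) - 2 * b2 * np.cos(2 * lam))
f2_closed = sigma2 / (2 * np.pi * denom)
f2_transfer = sigma2 / (2 * np.pi *
                        np.abs(1 - b1 * np.exp(-1j * lam) - b2 * np.exp(-2j * lam)) ** 2)
```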

Example: Sunspot Numbers.

Autocorrelation function and partial autocorrelation function

Spectral density estimate

Assuming an AR(2) model

A linear discrete time series: Moving average time series of infinite order

Let $\psi_0 = 1, \psi_1, \psi_2, \ldots$ denote an infinite sequence of numbers, and let $\{u_t : t \in T\}$ denote a white noise time series (independent, mean 0, variance $\sigma^2$). Let $\{x_t : t \in T\}$ be defined by the equation
$$x_t = \sum_{s=0}^{\infty} \psi_s\,u_{t-s}.$$
Then $\{x_t : t \in T\}$ is called a linear discrete time series. Comment: a linear discrete time series is a moving average time series of infinite order.

The AR(1) time series. Let $\{x_t : t \in T\}$ be defined by the equation
$$x_t = \beta_1 x_{t-1} + u_t.$$
Then
$$x_t = \sum_{s=0}^{\infty} \beta_1^s\,u_{t-s}.$$

That is, $x_t$ is a linear discrete time series with $\psi_s = \beta_1^s$, where $|\beta_1| < 1$. An alternative approach uses the backshift operator $B$, defined by $B x_t = x_{t-1}$. The equation
$$x_t - \beta_1 x_{t-1} = u_t$$
can be written
$$(1 - \beta_1 B)\,x_t = u_t.$$

Now since
$$(1 - \beta_1 B)^{-1} = 1 + \beta_1 B + \beta_1^2 B^2 + \cdots,$$
the equation $(1 - \beta_1 B)\,x_t = u_t$ has the equivalent form
$$x_t = (1 - \beta_1 B)^{-1} u_t = u_t + \beta_1 u_{t-1} + \beta_1^2 u_{t-2} + \cdots.$$

The time series $\{x_t : t \in T\}$ can thus be written as a linear discrete time series
$$x_t = \psi(B)\,u_t, \quad \text{where } \psi(B) = 1 + \psi_1 B + \psi_2 B^2 + \cdots = [\beta(B)]^{-1}.$$
For the general AR(p) time series
$$\beta(B)\,x_t = u_t, \qquad \beta(B) = 1 - \beta_1 B - \cdots - \beta_p B^p,$$
$[\beta(B)]^{-1}$ can be found by carrying out the multiplication $\beta(B)\,\psi(B) = 1$ and equating coefficients.
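Carrying out the multiplication $\beta(B)\psi(B) = 1$ and equating the coefficients of $B^j$ gives the recursion $\psi_0 = 1$, $\psi_j = \sum_{i=1}^{\min(j,p)} \beta_i \psi_{j-i}$, which is easy to implement (the AR coefficients below are illustrative):

```python
import numpy as np

def random_shock_weights(beta, n_weights):
    """Weights psi_0, ..., psi_{n-1} of an AR(p) series, beta = [beta_1, ..., beta_p].

    From beta(B) psi(B) = 1:  psi_0 = 1,  psi_j = sum_{i=1}^{min(j,p)} beta_i psi_{j-i}.
    """
    psi = [1.0]
    for j in range(1, n_weights):
        psi.append(sum(b * psi[j - i]
                       for i, b in enumerate(beta, start=1) if j - i >= 0))
    return np.array(psi)

# AR(1) sanity check: psi_j should equal beta_1^j.
psi = random_shock_weights([0.7], 6)

# AR(2) example (illustrative coefficients):
# psi_0 = 1, psi_1 = 0.5, psi_2 = 0.5*0.5 + 0.3 = 0.55, psi_3 = 0.5*0.55 + 0.3*0.5 = 0.425
psi2 = random_shock_weights([0.5, 0.3], 5)
```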

Thus the AR(p) time series $\beta(B)\,x_t = u_t$ can be written
$$x_t = \psi(B)\,u_t = u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \cdots, \quad \text{where } \psi(B) = [\beta(B)]^{-1}.$$
This is called the random shock form of the series.


An ARMA(p, q) time series $\{x_t : t \in T\}$ satisfies the equation
$$\beta(B)\,x_t = \alpha(B)\,u_t,$$
where
$$\beta(B) = 1 - \beta_1 B - \cdots - \beta_p B^p \quad \text{and} \quad \alpha(B) = 1 + \alpha_1 B + \cdots + \alpha_q B^q.$$
The random shock form of an ARMA(p, q) time series:

Again the time series $\{x_t : t \in T\}$ can be written as a linear discrete time series, namely
$$x_t = \psi(B)\,u_t, \quad \text{where } \psi(B) = [\beta(B)]^{-1}\,\alpha(B),$$
which can be found by carrying out the multiplication $\beta(B)\,\psi(B) = \alpha(B)$ and equating coefficients.
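Equating coefficients of $B^j$ in $\beta(B)\psi(B) = \alpha(B)$ gives $\psi_0 = 1$ and $\psi_j = \alpha_j + \sum_{i=1}^{\min(j,p)} \beta_i \psi_{j-i}$ (with $\alpha_j = 0$ for $j > q$). A sketch, with illustrative ARMA(1, 1) parameter values:

```python
import numpy as np

def arma_shock_weights(beta, alpha, n_weights):
    """Random-shock weights psi_j of an ARMA(p, q) series.

    From beta(B) psi(B) = alpha(B):
      psi_0 = 1,  psi_j = alpha_j + sum_{i=1}^{min(j,p)} beta_i psi_{j-i},
    with alpha_j = 0 for j > q.
    """
    psi = [1.0]
    for j in range(1, n_weights):
        a_j = alpha[j - 1] if j <= len(alpha) else 0.0
        psi.append(a_j + sum(b * psi[j - i]
                             for i, b in enumerate(beta, start=1) if j - i >= 0))
    return np.array(psi)

# ARMA(1,1) check (illustrative beta_1 = 0.5, alpha_1 = 0.4):
# psi_1 = alpha_1 + beta_1, and psi_j = beta_1 psi_{j-1} for j >= 2.
b1, a1 = 0.5, 0.4
psi = arma_shock_weights([b1], [a1], 6)
```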

Thus an ARMA(p, q) time series can be written
$$x_t = u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \cdots, \quad \text{where } \psi(B) = [\beta(B)]^{-1}\,\alpha(B).$$

The inverted form of a stationary time series: Autoregressive time series of infinite order

An ARMA(p, q) time series $\{x_t : t \in T\}$ satisfies the equation
$$\beta(B)\,x_t = \alpha(B)\,u_t,$$
with $\beta(B)$ and $\alpha(B)$ as before. Suppose that $[\alpha(B)]^{-1}$ exists. This will be true if the roots of the polynomial
$$\alpha(x) = 1 + \alpha_1 x + \cdots + \alpha_q x^q$$
all exceed 1 in absolute value. The time series $\{x_t : t \in T\}$ in this case is called invertible.

Then
$$[\alpha(B)]^{-1}\beta(B)\,x_t = u_t, \quad \text{i.e.} \quad \pi(B)\,x_t = u_t,$$
where
$$\pi(B) = [\alpha(B)]^{-1}\beta(B) = 1 - \pi_1 B - \pi_2 B^2 - \cdots,$$
or
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + u_t.$$

Thus an ARMA(p, q) time series can be written
$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + u_t, \quad \text{where } \pi(B) = [\alpha(B)]^{-1}\beta(B).$$
This is called the inverted form of the time series. It expresses the time series as an autoregressive time series of infinite order.
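The weights $\pi_j$ can be computed the same way as the random-shock weights, by expanding $[\alpha(B)]^{-1}\beta(B)$ as a power series in $B$. For an invertible MA(1) this reduces to the familiar expansion $u_t = \sum_j (-\alpha_1)^j x_{t-j}$, i.e. $\pi_j = -(-\alpha_1)^j$. A sketch with an illustrative $\alpha_1$:

```python
import numpy as np

def inverted_weights(beta, alpha, n_weights):
    """Weights pi_1, pi_2, ... in  x_t = pi_1 x_{t-1} + pi_2 x_{t-2} + ... + u_t.

    c_j are the power-series coefficients of [alpha(B)]^{-1} beta(B) (c_0 = 1),
    obtained by equating coefficients in alpha(B) c(B) = beta(B); the document's
    convention pi(B) = 1 - pi_1 B - ... then gives pi_j = -c_j for j >= 1.
    """
    c = [1.0]
    for j in range(1, n_weights + 1):
        b_j = -beta[j - 1] if j <= len(beta) else 0.0
        c.append(b_j - sum(a * c[j - i]
                           for i, a in enumerate(alpha, start=1) if j - i >= 0))
    return np.array([-cj for cj in c[1:]])

# Invertible MA(1) check (illustrative alpha_1 = 0.6, |alpha_1| < 1):
# pi_j should equal -(-alpha_1)^j, i.e. 0.6, -0.36, 0.216, ...
a1 = 0.6
pi = inverted_weights([], [a1], 6)
```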