Multivariate Time Series Analysis

Let $\{x_t : t \in T\}$ be a multivariate ($p$-variate) time series. Definition: $\mu(t)$ = mean value function of $\{x_t : t \in T\}$ = $E[x_t]$ for $t \in T$. $\Sigma(t,s)$ = lagged covariance matrix of $\{x_t : t \in T\}$ = $E\{[x_t - \mu(t)][x_s - \mu(s)]'\}$ for $t, s \in T$.

Definition: The time series $\{x_t : t \in T\}$ is stationary if the joint distribution of $(x_{t_1}, x_{t_2}, \dots, x_{t_k})$ is the same as the joint distribution of $(x_{t_1+h}, x_{t_2+h}, \dots, x_{t_k+h})$ for all finite subsets $t_1, t_2, \dots, t_k$ of $T$ and all choices of $h$.

In this case $\mu(t) = \mu$, a constant, for $t \in T$, and $\Sigma(t,s) = E\{[x_t - \mu][x_s - \mu]'\} = E\{[x_{t+h} - \mu][x_{s+h} - \mu]'\} = E\{[x_{t-s} - \mu][x_0 - \mu]'\} = \Sigma(t-s)$ for $t, s \in T$.

Definition: The time series $\{x_t : t \in T\}$ is weakly stationary if $\mu(t) = \mu$ for $t \in T$, and $\Sigma(t,s) = \Sigma(t-s)$ for $t, s \in T$.

In this case $\Sigma(h) = E\{[x_{t+h} - \mu][x_t - \mu]'\} = \mathrm{Cov}(x_{t+h}, x_t)$ is called the lagged covariance matrix of the process $\{x_t : t \in T\}$.
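
As a quick illustration of this definition, here is a minimal numpy sketch of how $\Sigma(h)$ might be estimated from $T$ observations of a $p$-variate series; the function name `lagged_cov`, the $1/T$ normalization and the white-noise example are illustrative choices, not taken from the slides.

```python
import numpy as np

def lagged_cov(x, h):
    """Estimate Sigma(h) = E[(x_{t+h} - mu)(x_t - mu)'] from a (T, p) data array."""
    x = np.asarray(x, dtype=float)
    T, p = x.shape
    xc = x - x.mean(axis=0)          # centre each component series
    if h < 0:
        return lagged_cov(x, -h).T   # Sigma(-h) = Sigma(h)'
    # average of outer products (x_{t+h})(x_t)', divided by T (not T - h)
    return sum(np.outer(xc[t + h], xc[t]) for t in range(T - h)) / T

# Example: bivariate white noise -- Sigma(h) should be near 0 for h != 0
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 2))
print(lagged_cov(x, 0))
print(lagged_cov(x, 1))
```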

The Cross Correlation Function and the Cross Spectrum

Note: $\sigma_{ij}(h)$ = the $(i,j)$th element of $\Sigma(h)$, and is called the cross covariance function of $\{x_{it} : t \in T\}$ and $\{x_{jt} : t \in T\}$. $\rho_{ij}(h) = \dfrac{\sigma_{ij}(h)}{\sqrt{\sigma_{ii}(0)\,\sigma_{jj}(0)}}$ is called the cross correlation function of $\{x_{it} : t \in T\}$ and $\{x_{jt} : t \in T\}$.

Definitions: i) $f_{ij}(\lambda) = \dfrac{1}{2\pi}\sum_{k=-\infty}^{\infty}\sigma_{ij}(k)\,e^{-ik\lambda}$ is called the cross spectrum of $\{x_{it} : t \in T\}$ and $\{x_{jt} : t \in T\}$. Note: since $\sigma_{ij}(k) \neq \sigma_{ij}(-k)$ in general, $f_{ij}(\lambda)$ is complex. ii) If $f_{ij}(\lambda) = c_{ij}(\lambda) - i\,q_{ij}(\lambda)$, then $c_{ij}(\lambda)$ is called the cospectrum (coincident spectral density) and $q_{ij}(\lambda)$ is called the quadrature spectrum.

iii) If $f_{ij}(\lambda) = A_{ij}(\lambda)\exp\{i\,\phi_{ij}(\lambda)\}$, then $A_{ij}(\lambda)$ is called the cross amplitude spectrum and $\phi_{ij}(\lambda)$ is called the phase spectrum.

Definition: $F(\lambda) = \big[f_{ij}(\lambda)\big]_{i,j=1,\dots,p}$ is called the spectral matrix.

The Multivariate Wiener–Khinchin Relations ($p$-variate): $F(\lambda) = \dfrac{1}{2\pi}\sum_{h=-\infty}^{\infty}\Sigma(h)\,e^{-ih\lambda}$ and $\Sigma(h) = \int_{-\pi}^{\pi} e^{ih\lambda}\,F(\lambda)\,d\lambda$.
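
A small numerical sanity check of these relations is sketched below, assuming the $1/(2\pi)$ convention used above; the bivariate MA(1)-style example and the midpoint integration grid are my own illustrative choices.

```python
import numpy as np

# Bivariate MA(1): x_t = e_t + Theta e_{t-1}, with e_t white noise, Cov(e_t) = I.
Theta = np.array([[0.5, 0.2],
                  [0.1, 0.4]])
Sigma = {0: np.eye(2) + Theta @ Theta.T, 1: Theta, -1: Theta.T}

def F(lam):
    """Spectral matrix F(lambda) = (1/2pi) * sum_h Sigma(h) e^{-i h lambda}."""
    return sum(Sigma[h] * np.exp(-1j * h * lam) for h in (-1, 0, 1)) / (2 * np.pi)

# Recover Sigma(h) = integral_{-pi}^{pi} e^{i h lambda} F(lambda) d lambda (midpoint rule)
n = 4000
dlam = 2 * np.pi / n
lams = np.linspace(-np.pi, np.pi, n, endpoint=False) + dlam / 2
for h in (-1, 0, 1, 2):
    integral = sum(np.exp(1j * h * lam) * F(lam) for lam in lams) * dlam
    print(h, np.round(integral.real, 3))   # matches Sigma(h); ~0 for |h| > 1
```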

Lemma: Assume that $\sum_{h=-\infty}^{\infty}\big|\sigma_{ij}(h)\big| < \infty$ for all $i, j$. Then $F(\lambda)$ is: i) positive semidefinite: $a^{*}F(\lambda)a \geq 0$ for every complex vector $a$; ii) Hermitian: $F(\lambda) = F^{*}(\lambda)$ = the adjoint of $F(\lambda)$ = the complex conjugate transpose of $F(\lambda)$, i.e. $f_{ij}(\lambda) = \overline{f_{ji}(\lambda)}$.

Corollary: The fact that $F(\lambda)$ is positive semidefinite also means that all square submatrices along the diagonal have a nonnegative determinant. Hence $f_{ii}(\lambda)f_{jj}(\lambda) - \big|f_{ij}(\lambda)\big|^{2} \geq 0$, and $\big|f_{ij}(\lambda)\big|^{2} \leq f_{ii}(\lambda)f_{jj}(\lambda)$, or $c_{ij}^{2}(\lambda) + q_{ij}^{2}(\lambda) \leq f_{ii}(\lambda)f_{jj}(\lambda)$.

Definition: $K_{ij}^{2}(\lambda) = \dfrac{\big|f_{ij}(\lambda)\big|^{2}}{f_{ii}(\lambda)f_{jj}(\lambda)} = \dfrac{c_{ij}^{2}(\lambda) + q_{ij}^{2}(\lambda)}{f_{ii}(\lambda)f_{jj}(\lambda)}$ = squared coherency function. Note: $0 \leq K_{ij}^{2}(\lambda) \leq 1$.
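
The bound $0 \leq K_{ij}^{2}(\lambda) \leq 1$ follows from the corollary above; the short sketch below computes the squared coherency for an arbitrary Hermitian positive semidefinite $2\times 2$ matrix built as $BB^{*}$ (the matrix $B$ is an illustrative choice).

```python
import numpy as np

def squared_coherency(F):
    """K^2(lambda) = |f_12|^2 / (f_11 * f_22) for a 2x2 spectral matrix F(lambda)."""
    return abs(F[0, 1]) ** 2 / (F[0, 0].real * F[1, 1].real)

# Build a Hermitian positive semidefinite example: F = B B*
B = np.array([[1.0 + 0.3j, 0.2 - 0.1j],
              [0.4 + 0.0j, 0.8 + 0.5j]])
F = B @ B.conj().T
print(squared_coherency(F))   # always between 0 and 1 for a PSD Hermitian matrix
```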

Definition: $A_{ij}(\lambda) = \sqrt{c_{ij}^{2}(\lambda) + q_{ij}^{2}(\lambda)}$ and $\phi_{ij}(\lambda) = \tan^{-1}\!\left(\dfrac{-q_{ij}(\lambda)}{c_{ij}(\lambda)}\right)$.

Applications and Examples of Multivariate Spectral Analysis

Example I - Linear Filters

Let $\{(x_t, y_t)' : t \in T\}$ denote a bivariate time series with zero mean. Suppose that the time series $\{y_t : t \in T\}$ is constructed as follows: $y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s}$, $t = \dots, -2, -1, 0, 1, 2, \dots$

The time series $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter.

Continuing: $\sigma_{yy}(h) = \mathrm{Cov}(y_{t+h}, y_t) = \sum_{s}\sum_{r} a_s a_r\,\sigma_{xx}(h - s + r)$.

Thus the spectral density of the time series $\{y_t : t \in T\}$ is $f_{yy}(\lambda) = \big|A(\lambda)\big|^{2} f_{xx}(\lambda)$, where $A(\lambda) = \sum_{s=-\infty}^{\infty} a_s e^{-is\lambda}$.

Comment A: $A(\lambda) = \sum_{s} a_s e^{-is\lambda}$ is called the transfer function of the linear filter. $g(\lambda) = \big|A(\lambda)\big|$ is called the gain of the filter, while $\phi(\lambda) = \arg A(\lambda)$ is called the phase shift of the filter.
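
To make the transfer function concrete, the sketch below evaluates $A(\lambda)$, the gain $|A(\lambda)|$ and the phase $\arg A(\lambda)$ for a simple three-term filter; the weights are arbitrary illustrative values.

```python
import numpy as np

# Filter y_t = sum_s a_s x_{t-s} with weights a_0, a_1, a_2 (illustrative)
weights = {0: 0.5, 1: 0.3, 2: 0.2}

def transfer(lam):
    """A(lambda) = sum_s a_s e^{-i s lambda}."""
    return sum(a * np.exp(-1j * s * lam) for s, a in weights.items())

for lam in np.linspace(0, np.pi, 5):
    A = transfer(lam)
    print(f"lambda={lam:.3f}  gain={abs(A):.3f}  phase={np.angle(A):+.3f}")
```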

Also, $\sigma_{xy}(h) = \mathrm{Cov}(x_{t+h}, y_t) = \sum_{s} a_s\,\sigma_{xx}(h + s)$.

Thus the cross spectrum of the bivariate time series is $f_{xy}(\lambda) = \overline{A(\lambda)}\,f_{xx}(\lambda)$ (equivalently, $f_{yx}(\lambda) = A(\lambda)\,f_{xx}(\lambda)$).

Comment B: $K_{xy}^{2}(\lambda) = \dfrac{\big|f_{xy}(\lambda)\big|^{2}}{f_{xx}(\lambda)f_{yy}(\lambda)} = \dfrac{\big|A(\lambda)\big|^{2}f_{xx}^{2}(\lambda)}{f_{xx}(\lambda)\,\big|A(\lambda)\big|^{2}f_{xx}(\lambda)} = 1$ = squared coherency function; for a noise-free linear filter the squared coherency is identically one.

Example II - Linear Filters with additive noise at the output

Let $\{(x_t, y_t)' : t \in T\}$ denote a bivariate time series with zero mean. Suppose that the time series $\{y_t : t \in T\}$ is constructed as follows: $y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s} + v_t$, $t = \dots, -2, -1, 0, 1, 2, \dots$ The noise $\{v_t : t \in T\}$ is independent of the series $\{x_t : t \in T\}$ (and may be white noise).

Continuing: the spectral density of the time series $\{y_t : t \in T\}$ is $f_{yy}(\lambda) = \big|A(\lambda)\big|^{2} f_{xx}(\lambda) + f_{vv}(\lambda)$.

Also, since $\{v_t : t \in T\}$ is independent of $\{x_t : t \in T\}$, $\sigma_{xy}(h) = \mathrm{Cov}(x_{t+h}, y_t) = \sum_{s} a_s\,\sigma_{xx}(h + s)$, exactly as before.

Thus the cross spectrum of the bivariate time series is again $f_{xy}(\lambda) = \overline{A(\lambda)}\,f_{xx}(\lambda)$.

Thus $K_{xy}^{2}(\lambda) = \dfrac{\big|f_{xy}(\lambda)\big|^{2}}{f_{xx}(\lambda)f_{yy}(\lambda)} = \dfrac{\big|A(\lambda)\big|^{2}f_{xx}(\lambda)}{\big|A(\lambda)\big|^{2}f_{xx}(\lambda) + f_{vv}(\lambda)} = \dfrac{1}{1 + \dfrac{f_{vv}(\lambda)}{\big|A(\lambda)\big|^{2}f_{xx}(\lambda)}}$ = squared coherency function, where $\dfrac{f_{vv}(\lambda)}{\big|A(\lambda)\big|^{2}f_{xx}(\lambda)}$ is the noise to signal ratio.
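
As a check on this formula, the sketch below evaluates the squared coherency for a white-noise input passed through an illustrative three-weight filter with white additive noise at the output; for white noise the spectra are flat, $f_{xx}(\lambda) = \sigma_x^2/(2\pi)$ and $f_{vv}(\lambda) = \sigma_v^2/(2\pi)$, under the convention used here. The variances and weights are arbitrary choices.

```python
import numpy as np

sigma_x2, sigma_v2 = 1.0, 0.5           # input and noise variances (illustrative)
weights = {0: 0.5, 1: 0.3, 2: 0.2}      # filter weights a_s (illustrative)

def transfer(lam):
    return sum(a * np.exp(-1j * s * lam) for s, a in weights.items())

f_xx = sigma_x2 / (2 * np.pi)           # white-noise input: flat spectrum
f_vv = sigma_v2 / (2 * np.pi)           # white output noise: flat spectrum

for lam in np.linspace(0, np.pi, 5):
    gain2 = abs(transfer(lam)) ** 2
    nsr = f_vv / (gain2 * f_xx)         # noise-to-signal ratio
    K2 = 1.0 / (1.0 + nsr)              # squared coherency
    print(f"lambda={lam:.3f}  NSR={nsr:.3f}  K^2={K2:.3f}")
```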

Estimation of the Cross Spectrum

Let $(x_1, y_1), (x_2, y_2), \dots, (x_T, y_T)$ denote $T$ observations on a bivariate time series with zero mean. If the series has non-zero mean, one uses $x_t - \bar{x}$ and $y_t - \bar{y}$ in place of $x_t$ and $y_t$. Again assume that $T = 2m + 1$ is odd.

Then define: $a_k^{x} = \dfrac{2}{T}\sum_{t=1}^{T} x_t\cos(\lambda_k t)$ and $b_k^{x} = \dfrac{2}{T}\sum_{t=1}^{T} x_t\sin(\lambda_k t)$, with $\lambda_k = 2\pi k/T$ and $k = 0, 1, 2, \dots, m$.

Also $a_k^{y} = \dfrac{2}{T}\sum_{t=1}^{T} y_t\cos(\lambda_k t)$ and $b_k^{y} = \dfrac{2}{T}\sum_{t=1}^{T} y_t\sin(\lambda_k t)$ for $k = 0, 1, 2, \dots, m$.

The Periodogram & the Cross-Periodogram

Also $I_{T}^{xx}(\lambda_k) = \dfrac{T}{2}\Big[\big(a_k^{x}\big)^{2} + \big(b_k^{x}\big)^{2}\Big]$ and $I_{T}^{yy}(\lambda_k) = \dfrac{T}{2}\Big[\big(a_k^{y}\big)^{2} + \big(b_k^{y}\big)^{2}\Big]$ for $k = 0, 1, 2, \dots, m$.

Finally, the cross-periodogram is $I_{T}^{xy}(\lambda_k) = \dfrac{2}{T}\left(\sum_{t=1}^{T} x_t e^{-i\lambda_k t}\right)\overline{\left(\sum_{t=1}^{T} y_t e^{-i\lambda_k t}\right)} = \dfrac{T}{2}\Big[\big(a_k^{x}a_k^{y} + b_k^{x}b_k^{y}\big) + i\big(a_k^{x}b_k^{y} - a_k^{y}b_k^{x}\big)\Big]$.

Note: $\operatorname{Re} I_{T}^{xy}(\lambda_k) = \dfrac{T}{2}\big(a_k^{x}a_k^{y} + b_k^{x}b_k^{y}\big)$ and $\operatorname{Im} I_{T}^{xy}(\lambda_k) = \dfrac{T}{2}\big(a_k^{x}b_k^{y} - a_k^{y}b_k^{x}\big)$.

Also $I_{T}^{xy}(\lambda_k) = \overline{I_{T}^{yx}(\lambda_k)}$ and $\big|I_{T}^{xy}(\lambda_k)\big|^{2} = I_{T}^{xx}(\lambda_k)\,I_{T}^{yy}(\lambda_k)$.

The sample cross-spectrum, cospectrum & quadrature spectrum

Recall that the periodogram $I_{T}^{xx}(\lambda)$ has asymptotic expectation $4\pi f_{xx}(\lambda)$. Similarly, the asymptotic expectation of $I_{T}^{xy}(\lambda)$ is $4\pi f_{xy}(\lambda)$. An asymptotically unbiased estimator of $f_{xy}(\lambda)$ can be obtained by dividing $I_{T}^{xy}(\lambda)$ by $4\pi$.

The sample cross spectrum: $\hat{f}_{xy}(\lambda_k) = \dfrac{1}{4\pi} I_{T}^{xy}(\lambda_k)$.

The sample cospectrum: $\hat{c}_{xy}(\lambda_k) = \dfrac{1}{4\pi}\operatorname{Re} I_{T}^{xy}(\lambda_k) = \dfrac{T}{8\pi}\big(a_k^{x}a_k^{y} + b_k^{x}b_k^{y}\big)$.

The sample quadrature spectrum: $\hat{q}_{xy}(\lambda_k) = -\dfrac{1}{4\pi}\operatorname{Im} I_{T}^{xy}(\lambda_k) = -\dfrac{T}{8\pi}\big(a_k^{x}b_k^{y} - a_k^{y}b_k^{x}\big)$.

The sample Cross amplitude spectrum, Phase spectrum & Squared Coherency

Recall: $A_{xy}(\lambda) = \sqrt{c_{xy}^{2}(\lambda) + q_{xy}^{2}(\lambda)}$, $\phi_{xy}(\lambda) = \tan^{-1}\!\left(\dfrac{-q_{xy}(\lambda)}{c_{xy}(\lambda)}\right)$ and $K_{xy}^{2}(\lambda) = \dfrac{c_{xy}^{2}(\lambda) + q_{xy}^{2}(\lambda)}{f_{xx}(\lambda)\,f_{yy}(\lambda)}$.

Thus their sample counterparts can be defined in a similar manner. Namely: $\hat{A}_{xy}(\lambda_k) = \sqrt{\hat{c}_{xy}^{2}(\lambda_k) + \hat{q}_{xy}^{2}(\lambda_k)}$, $\hat{\phi}_{xy}(\lambda_k) = \tan^{-1}\!\left(\dfrac{-\hat{q}_{xy}(\lambda_k)}{\hat{c}_{xy}(\lambda_k)}\right)$ and $\hat{K}_{xy}^{2}(\lambda_k) = \dfrac{\hat{c}_{xy}^{2}(\lambda_k) + \hat{q}_{xy}^{2}(\lambda_k)}{\hat{f}_{xx}(\lambda_k)\,\hat{f}_{yy}(\lambda_k)}$.
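
Putting the estimation formulas together, the sketch below computes the Fourier coefficients, the periodograms and cross-periodogram, and from them the sample cospectrum, quadrature spectrum, amplitude, phase and raw squared coherency for a simulated bivariate series; the simulated data and variable names are my own. Note that the raw (unsmoothed) squared coherency is identically 1, which is why the smoothing described next is needed.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 201                                   # T = 2m + 1 odd
m = (T - 1) // 2
t = np.arange(1, T + 1)

x = rng.standard_normal(T)
y = 0.7 * np.roll(x, 1) + 0.3 * rng.standard_normal(T)    # y: lagged, noisy copy of x

k = np.arange(1, m + 1)
lam = 2 * np.pi * k / T                   # Fourier frequencies lambda_k

def fourier_coefs(z):
    """a_k = (2/T) sum z_t cos(lambda_k t), b_k = (2/T) sum z_t sin(lambda_k t)."""
    a = (2.0 / T) * (np.cos(np.outer(lam, t)) @ z)
    b = (2.0 / T) * (np.sin(np.outer(lam, t)) @ z)
    return a, b

ax, bx = fourier_coefs(x)
ay, by = fourier_coefs(y)

I_xx = (T / 2.0) * (ax**2 + bx**2)                          # periodograms
I_yy = (T / 2.0) * (ay**2 + by**2)
I_xy = (T / 2.0) * ((ax*ay + bx*by) + 1j*(ax*by - ay*bx))   # cross-periodogram

c_hat = I_xy.real / (4 * np.pi)                             # sample cospectrum
q_hat = -I_xy.imag / (4 * np.pi)                            # sample quadrature spectrum
A_hat = np.sqrt(c_hat**2 + q_hat**2)                        # sample cross amplitude
phi_hat = np.arctan2(-q_hat, c_hat)                         # sample phase spectrum
K2_raw = (c_hat**2 + q_hat**2) / ((I_xx/(4*np.pi)) * (I_yy/(4*np.pi)))
print(K2_raw[:5])                                           # ~1 everywhere: smoothing needed
```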

Consistent Estimation of the Cross-spectrum $f_{xy}(\lambda)$

Daniell Estimator

$\bar{c}_{xy}(\lambda_k) = \dfrac{1}{2d+1}\sum_{j=-d}^{d}\hat{c}_{xy}(\lambda_{k+j})$ = the Daniell estimator of the cospectrum, and $\bar{q}_{xy}(\lambda_k) = \dfrac{1}{2d+1}\sum_{j=-d}^{d}\hat{q}_{xy}(\lambda_{k+j})$ = the Daniell estimator of the quadrature spectrum (averages of the sample cospectrum and quadrature spectrum over $2d+1$ neighbouring Fourier frequencies).
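
A sketch of the Daniell smoother, under the assumption that it is a simple moving average of the raw ordinates over $2d+1$ neighbouring Fourier frequencies; the span $d$ and the reflection used at the edges are my own choices.

```python
import numpy as np

def daniell(values, d):
    """Average each ordinate with its d neighbours on each side (2d+1 point window)."""
    values = np.asarray(values, dtype=float)
    padded = np.pad(values, d, mode="reflect")      # simple edge handling
    kernel = np.ones(2 * d + 1) / (2 * d + 1)
    return np.convolve(padded, kernel, mode="valid")

# Example: smooth noisy cospectrum ordinates (illustrative values)
rng = np.random.default_rng(2)
c_hat = 1.0 + 0.5 * rng.standard_normal(100)
c_smooth = daniell(c_hat, d=5)
print(c_hat[:5].round(3), c_smooth[:5].round(3))
```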

Weighted Covariance Estimator: $\tilde{c}_{xy}(\lambda) = \dfrac{1}{2\pi}\sum_{|h| \leq M} w\!\left(\dfrac{h}{M}\right)\hat{\sigma}_{xy}(h)\cos(h\lambda)$ and $\tilde{q}_{xy}(\lambda) = \dfrac{1}{2\pi}\sum_{|h| \leq M} w\!\left(\dfrac{h}{M}\right)\hat{\sigma}_{xy}(h)\sin(h\lambda)$, where $\hat{\sigma}_{xy}(h) = \dfrac{1}{T}\sum_{t}\big(x_{t+h} - \bar{x}\big)\big(y_t - \bar{y}\big)$ and $w(\cdot)$ is a lag-window (weight) function with truncation point $M$.

Again, once the cospectrum and quadrature spectrum have been estimated, the cross spectrum, amplitude spectrum, phase spectrum and coherency can be estimated generally as follows, using either a) the Daniell estimator or b) the weighted covariance estimator of $c_{xy}(\lambda)$ and $q_{xy}(\lambda)$:

Namely: $\hat{f}_{xy}(\lambda) = \hat{c}_{xy}(\lambda) - i\,\hat{q}_{xy}(\lambda)$, $\hat{A}_{xy}(\lambda) = \sqrt{\hat{c}_{xy}^{2}(\lambda) + \hat{q}_{xy}^{2}(\lambda)}$, $\hat{\phi}_{xy}(\lambda) = \tan^{-1}\!\left(\dfrac{-\hat{q}_{xy}(\lambda)}{\hat{c}_{xy}(\lambda)}\right)$ and $\hat{K}_{xy}^{2}(\lambda) = \dfrac{\hat{c}_{xy}^{2}(\lambda) + \hat{q}_{xy}^{2}(\lambda)}{\hat{f}_{xx}(\lambda)\,\hat{f}_{yy}(\lambda)}$, where $\hat{c}_{xy}$, $\hat{q}_{xy}$, $\hat{f}_{xx}$ and $\hat{f}_{yy}$ denote whichever consistent (smoothed) estimates are used.
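
Finally, a sketch of a weighted covariance (lag-window) estimator of the cospectrum and quadrature spectrum, using the $1/(2\pi)$ convention from above; the Bartlett-style triangular weights and the truncation point $M$ are my own choices rather than anything specified in the slides.

```python
import numpy as np

def cross_cov(x, y, h):
    """sigma_xy(h) = (1/T) * sum_t (x_{t+h} - xbar)(y_t - ybar)."""
    T = len(x)
    xc, yc = x - x.mean(), y - y.mean()
    if h < 0:
        return cross_cov(y, x, -h)          # sigma_xy(-h) = sigma_yx(h)
    return np.dot(xc[h:], yc[:T - h]) / T

def weighted_cov_estimate(x, y, lam, M):
    """Lag-window estimates of the cospectrum and quadrature spectrum at frequency lam."""
    hs = np.arange(-M, M + 1)
    w = 1.0 - np.abs(hs) / (M + 1)          # Bartlett-style triangular weights
    s = np.array([cross_cov(x, y, h) for h in hs])
    c_hat = np.sum(w * s * np.cos(hs * lam)) / (2 * np.pi)
    q_hat = np.sum(w * s * np.sin(hs * lam)) / (2 * np.pi)
    return c_hat, q_hat

# Example: y is a lagged, noisy copy of x (illustrative data)
rng = np.random.default_rng(3)
x = rng.standard_normal(400)
y = 0.6 * np.roll(x, 2) + 0.4 * rng.standard_normal(400)
print(weighted_cov_estimate(x, y, lam=0.5, M=20))
```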