Multivariate Time Series Analysis


Let {x_t : t ∈ T} be a multivariate time series.
Definition:
m(t) = mean value function of {x_t : t ∈ T} = E[x_t] for t ∈ T.
S(t,s) = lagged covariance matrix of {x_t : t ∈ T} = E{[x_t − m(t)][x_s − m(s)]′} for t,s ∈ T.

Definition: The time series {x_t : t ∈ T} is stationary if the joint distribution of (x_{t1}, x_{t2}, ..., x_{tk}) is the same as the joint distribution of (x_{t1+h}, x_{t2+h}, ..., x_{tk+h}) for all finite subsets t1, t2, ..., tk of T and all choices of h.

In this case m(t) = m (a constant vector) for t ∈ T, and
S(t,s) = E{[x_t − m][x_s − m]′} = E{[x_{t+h} − m][x_{s+h} − m]′} = E{[x_{t−s} − m][x_0 − m]′} = S(t − s) for t,s ∈ T.

Definition: The time series {x_t : t ∈ T} is weakly stationary if m(t) = m for t ∈ T, and S(t,s) = S(t − s) for t,s ∈ T.

In this case S(h) = E{[x_{t+h} − m][x_t − m]′} = Cov(x_{t+h}, x_t) is called the lagged covariance matrix of the process {x_t : t ∈ T}.
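As a sketch (not from the slides), the lagged covariance matrix S(h) can be estimated from a sample of a weakly stationary series; the function name and the simulated data below are illustrative.

```python
import numpy as np

def lagged_cov(x, h):
    """Sample lagged covariance matrix S(h) = E{[x_{t+h} - m][x_t - m]'}.

    x is a (T, p) array holding T observations of a p-variate series;
    the (biased) estimator divides by T.
    """
    x = np.asarray(x, dtype=float)
    T, _ = x.shape
    xc = x - x.mean(axis=0)          # centre at the sample mean
    if h < 0:                        # S(-h) = S(h)' for a stationary series
        return lagged_cov(x, -h).T
    return xc[h:].T @ xc[:T - h] / T

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 2))    # illustrative white-noise sample
S0 = lagged_cov(x, 0)                # contemporaneous covariance matrix
S3 = lagged_cov(x, 3)
```

Note that S(0) is symmetric, while S(h) for h ≠ 0 generally is not; only the transpose identity S(−h) = S(h)′ holds.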

The Cross Correlation Function and the Cross Spectrum

Note: σ_ij(h) = (i,j)th element of S(h), and is called the cross covariance function of {x_it} and {x_jt}. ρ_ij(h) = σ_ij(h)/[σ_ii(0) σ_jj(0)]^{1/2} is called the cross correlation function of {x_it} and {x_jt}.

Definitions:
i) f_ij(λ) = (1/4π) Σ_{k=−∞}^{∞} σ_ij(k) e^{−iλk} is called the cross spectrum of {x_it} and {x_jt}. Note: since σ_ij(k) ≠ σ_ij(−k) in general, f_ij(λ) is complex valued.
ii) If f_ij(λ) = c_ij(λ) − i q_ij(λ), then c_ij(λ) is called the cospectrum (coincident spectral density) and q_ij(λ) is called the quadrature spectrum.

iii) If f_ij(λ) = A_ij(λ) exp{i φ_ij(λ)}, then A_ij(λ) is called the cross amplitude spectrum and φ_ij(λ) is called the phase spectrum.

Definition: F(λ) = [f_ij(λ)], the p × p matrix with (i,j)th element f_ij(λ), is called the spectral matrix.

The Multivariate Wiener-Khinchin Relations (p-variate):
F(λ) = (1/4π) Σ_{h=−∞}^{∞} S(h) e^{−iλh} and S(h) = 2 ∫_{−π}^{π} F(λ) e^{iλh} dλ.
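As a numerical sanity check (not part of the slides) of the 4π-normalised transform pair above, one can build f(λ) in the univariate case from a toy absolutely summable autocovariance σ(h) = 0.5^|h| and recover σ(3) by the inverse relation; all names and values here are illustrative.

```python
import numpy as np

# Toy autocovariance: sigma(h) = 0.5^|h|, truncated where 0.5^200 is negligible
hs = np.arange(-200, 201)
sigma = 0.5 ** np.abs(hs)

# f(lambda) = (1/4 pi) * sum_h sigma(h) e^{-i lambda h}  (real, since sigma is even)
lam = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
f = (sigma[None, :] * np.exp(-1j * np.outer(lam, hs))).sum(axis=1).real / (4 * np.pi)

# Inverse relation: sigma(h) = 2 * integral_{-pi}^{pi} f(lambda) e^{i lambda h} d lambda,
# approximated by a rectangle rule on the periodic grid (imaginary part integrates to 0)
dlam = 2 * np.pi / lam.size
sigma3 = 2 * np.sum(f * np.cos(3 * lam)) * dlam
```

The recovered value sigma3 agrees with σ(3) = 0.5³ = 0.125, confirming the factor of 2 in the inverse relation under this normalisation.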

Lemma: Assume that Σ_h |σ_ij(h)| < ∞ for all i, j. Then F(λ) is:
i) positive semidefinite: a* F(λ) a ≥ 0 for any complex p-vector a;
ii) Hermitian: F(λ) = F*(λ) = the adjoint of F(λ) = the complex conjugate transpose of F(λ), i.e. f_ij(λ) = the complex conjugate of f_ji(λ).

Corollary: Since F(λ) is positive semidefinite, every square submatrix along the diagonal has a nonnegative determinant. Hence f_ii(λ) ≥ 0 and f_ii(λ) f_jj(λ) − |f_ij(λ)|² ≥ 0, or |f_ij(λ)|² ≤ f_ii(λ) f_jj(λ).

Definition: K²_ij(λ) = |f_ij(λ)|² / [f_ii(λ) f_jj(λ)] is called the squared coherency function. Note: 0 ≤ K²_ij(λ) ≤ 1.

Definition: A_ij(λ) = [c²_ij(λ) + q²_ij(λ)]^{1/2} and φ_ij(λ) = tan⁻¹[−q_ij(λ)/c_ij(λ)], expressing the cross amplitude and phase spectra in terms of the cospectrum and quadrature spectrum.

Applications and Examples of Multivariate Spectral Analysis

Example I - Linear Filters

Let {x_t : t ∈ T} and {y_t : t ∈ T} denote a bivariate time series with zero mean. Suppose that the time series {y_t : t ∈ T} is constructed as follows:
y_t = Σ_{s=−∞}^{∞} a_s x_{t−s}, t = ..., −2, −1, 0, 1, 2, ...

The time series {y_t : t ∈ T} is said to be constructed from {x_t : t ∈ T} by means of a linear filter.

Continuing, the spectral density of the time series {y_t : t ∈ T} is:
f_yy(λ) = |A(λ)|² f_xx(λ), where A(λ) = Σ_s a_s e^{−iλs}.

Comment A: A(λ) = Σ_s a_s e^{−iλs} is called the transfer function of the linear filter; |A(λ)| is called the gain of the filter, while φ(λ) = arg A(λ) is called the phase shift of the filter.
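To make Comment A concrete, here is a small sketch (the coefficients and names are illustrative, not from the slides) computing the transfer function, gain, and phase shift of a three-point moving-average filter.

```python
import numpy as np

def transfer_function(coeffs, lags, lam):
    """A(lambda) = sum_s a_s e^{-i lambda s} for filter weights a_s at the given lags."""
    lam = np.atleast_1d(lam)
    return (np.asarray(coeffs) * np.exp(-1j * np.outer(lam, lags))).sum(axis=1)

a = [0.25, 0.5, 0.25]          # symmetric smoother: y_t = 0.25 x_{t+1} + 0.5 x_t + 0.25 x_{t-1}
lags = [-1, 0, 1]
lam = np.linspace(0.0, np.pi, 5)
A = transfer_function(a, lags, lam)
gain = np.abs(A)               # |A(lambda)|
phase = np.angle(A)            # arg A(lambda)
```

A symmetric filter has real A(λ), hence zero phase shift wherever A(λ) > 0; this smoother passes λ = 0 with gain 1 and annihilates λ = π.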

Also σ_xy(h) = Σ_s a_s σ_xx(h − s); thus the cross spectrum of the bivariate time series is:
f_xy(λ) = A(λ) f_xx(λ).

Comment B: K²_xy(λ) = |f_xy(λ)|² / [f_xx(λ) f_yy(λ)] = |A(λ)|² f_xx(λ)² / [f_xx(λ) |A(λ)|² f_xx(λ)] = 1, the squared coherency function. A noiseless linear filter therefore has perfect coherency at every frequency.

Example II - Linear Filters with additive noise at the output

Let {x_t : t ∈ T} and {y_t : t ∈ T} denote a bivariate time series with zero mean. Suppose that the time series {y_t : t ∈ T} is constructed as follows:
y_t = Σ_s a_s x_{t−s} + v_t, t = ..., −2, −1, 0, 1, 2, ...
The noise {v_t : t ∈ T} is independent of the series {x_t : t ∈ T} (it may be white noise).

Continuing, the spectral density of the time series {y_t : t ∈ T} is:
f_yy(λ) = |A(λ)|² f_xx(λ) + f_vv(λ).

Also σ_xy(h) = Σ_s a_s σ_xx(h − s), unchanged by the independent noise; thus the cross spectrum of the bivariate time series is:
f_xy(λ) = A(λ) f_xx(λ).

Thus K²_xy(λ) = |A(λ)|² f_xx(λ) / [|A(λ)|² f_xx(λ) + f_vv(λ)] = 1 / [1 + f_vv(λ)/(|A(λ)|² f_xx(λ))], the squared coherency function, where f_vv(λ)/(|A(λ)|² f_xx(λ)) is the noise-to-signal ratio at frequency λ.
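The coherency/noise trade-off above can be sketched in a few lines; squared_coherency is an illustrative helper, with S = |A(λ)|² f_xx(λ) the filtered-signal spectrum and N = f_vv(λ) the noise spectrum.

```python
def squared_coherency(signal_spec, noise_spec):
    """K^2 = S/(S + N) = 1/(1 + N/S), with S = |A|^2 f_xx and N = f_vv."""
    return signal_spec / (signal_spec + noise_spec)

# No noise: the linear-filter case of Example I, perfect coherency.
k2_clean = squared_coherency(2.0, 0.0)
# Noise power equal to filtered signal power: coherency drops to 1/2.
k2_noisy = squared_coherency(2.0, 2.0)
```

Coherency thus measures, frequency by frequency, the fraction of the output spectrum explained by the linear filter.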



The sample cross-spectrum, cospectrum & quadrature spectrum

Recall that the periodogram has asymptotic expectation 4π f_xx(λ). Similarly, the asymptotic expectation of the cross periodogram I_xy(λ) is 4π f_xy(λ). An asymptotically unbiased estimator of f_xy(λ) can therefore be obtained by dividing I_xy(λ) by 4π.

The sample cross spectrum
f̂_xy(λ) = (1/4πT) [Σ_t x_t e^{−iλt}] [conjugate of Σ_t y_t e^{−iλt}] = I_xy(λ)/4π.

The sample cospectrum
ĉ_xy(λ) = Re f̂_xy(λ).

The sample quadrature spectrum
q̂_xy(λ) = −Im f̂_xy(λ), so that f̂_xy(λ) = ĉ_xy(λ) − i q̂_xy(λ).
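A minimal sketch of these sample quantities via the FFT (the function and data are illustrative; the ordinates sit at the Fourier frequencies 2πj/T):

```python
import numpy as np

def sample_cross_spectrum(x, y):
    """Sample cross spectrum f_xy = I_xy/(4 pi) at the Fourier frequencies,
    returned as (cospectrum, quadrature spectrum) with f = c - i q."""
    T = len(x)
    X = np.fft.fft(x - np.mean(x))     # X(lambda_j) = sum_t x_t e^{-i lambda_j t}
    Y = np.fft.fft(y - np.mean(y))
    f_xy = X * np.conj(Y) / (4 * np.pi * T)
    return f_xy.real, -f_xy.imag

rng = np.random.default_rng(1)
x = rng.standard_normal(256)
c_xx, q_xx = sample_cross_spectrum(x, x)   # cross spectrum of x with itself
```

The cross spectrum of a series with itself reduces to its (real, nonnegative) sample spectrum: the quadrature part vanishes identically.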

The sample Cross amplitude spectrum, Phase spectrum & Squared Coherency

Recall A_ij(λ) = [c²_ij(λ) + q²_ij(λ)]^{1/2}, φ_ij(λ) = tan⁻¹[−q_ij(λ)/c_ij(λ)] and K²_ij(λ) = |f_ij(λ)|² / [f_ii(λ) f_jj(λ)].

Thus their sample counterparts can be defined in a similar manner. Namely
Â_xy(λ) = [ĉ²_xy(λ) + q̂²_xy(λ)]^{1/2}, φ̂_xy(λ) = tan⁻¹[−q̂_xy(λ)/ĉ_xy(λ)] and K̂²_xy(λ) = [ĉ²_xy(λ) + q̂²_xy(λ)] / [f̂_xx(λ) f̂_yy(λ)].

Consistent Estimation of the Cross-spectrum f_xy(λ)

Daniell Estimator

c̄_xy(λ) = (1/(2m+1)) Σ_{k=−m}^{m} ĉ_xy(λ + 2πk/T) = the Daniell estimator of the cospectrum, and
q̄_xy(λ) = (1/(2m+1)) Σ_{k=−m}^{m} q̂_xy(λ + 2πk/T) = the Daniell estimator of the quadrature spectrum.
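The Daniell average over the 2m+1 nearest Fourier frequencies amounts to a circular boxcar convolution of the raw ordinates; a sketch (names illustrative), applicable to either the sample cospectrum or the sample quadrature spectrum:

```python
import numpy as np

def daniell_smooth(raw, m):
    """Average each ordinate with its m neighbours on either side (2m+1 total),
    wrapping around the frequency grid."""
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    padded = np.concatenate([raw[-m:], raw, raw[:m]])   # circular padding
    return np.convolve(padded, kernel, mode="valid")

raw = np.arange(8.0)          # illustrative raw spectrum ordinates
smooth = daniell_smooth(raw, 1)
```

Averaging over 2m+1 ordinates reduces the variance of the raw estimate by roughly that factor, at the cost of some bias, which is the usual consistency trade-off.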

Weighted Covariance Estimator

c̄_xy(λ) = (1/4π) Σ_{|h|≤m} w(h/m) σ̂_xy(h) cos(λh) and q̄_xy(λ) = (1/4π) Σ_{|h|≤m} w(h/m) σ̂_xy(h) sin(λh), where w(·) is a lag window (weight function) and σ̂_xy(h) is the sample cross covariance function.

Again, once the cospectrum and quadrature spectrum have been estimated, the cross spectrum, amplitude spectrum, phase spectrum and coherency can be estimated using either a) the Daniell estimator or b) the weighted covariance estimator of c_xy(λ) and q_xy(λ).

Namely
f̄_xy(λ) = c̄_xy(λ) − i q̄_xy(λ), Ā_xy(λ) = [c̄²_xy(λ) + q̄²_xy(λ)]^{1/2}, φ̄_xy(λ) = tan⁻¹[−q̄_xy(λ)/c̄_xy(λ)], and K̄²_xy(λ) = [c̄²_xy(λ) + q̄²_xy(λ)] / [f̄_xx(λ) f̄_yy(λ)].
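Putting the last two steps together, a sketch (with hypothetical array inputs on a common frequency grid) of the final estimates:

```python
import numpy as np

def cross_spectral_summaries(c, q, fxx, fyy):
    """Amplitude, phase, and squared coherency from smoothed estimates of
    the cospectrum c, quadrature spectrum q, and the two auto-spectra."""
    c, q, fxx, fyy = map(np.asarray, (c, q, fxx, fyy))
    amplitude = np.sqrt(c**2 + q**2)          # A = |c - i q|
    phase = np.arctan2(-q, c)                 # phi = arg(c - i q)
    coherency = (c**2 + q**2) / (fxx * fyy)   # K^2
    return amplitude, phase, coherency

A, phi, K2 = cross_spectral_summaries([1.0], [0.0], [2.0], [2.0])
```

Using arctan2 rather than a bare arctangent keeps the phase in the correct quadrant when the cospectrum is negative.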