Correlation and spectral analysis


Correlation and spectral analysis
Objective:
– investigation of the correlation structure of time series
– identification of the major harmonic components in time series
Tools:
– auto-covariance and auto-correlation function
– cross-covariance and cross-correlation function
– variance spectrum and spectral density function

Autocovariance and autocorrelation function
Autocovariance function of series Y(t):
  γ_YY(k) = E[(Y(t) − μ_Y)(Y(t+k) − μ_Y)]
Covariance at lag 0 = variance:
  γ_YY(0) = σ_Y²
To ensure comparability, the autocovariance is scaled by the variance, which gives the autocorrelation function:
  ρ_YY(k) = γ_YY(k) / γ_YY(0)
For a random (uncorrelated) series: ρ_YY(k) = 0 for all k ≠ 0

Estimation of autocovariance and autocorrelation function
– Estimation of autocovariance:
  c_YY(k) = (1/n) Σ_{t=1}^{n−k} (Y(t) − m_Y)(Y(t+k) − m_Y)
– Estimation of autocorrelation function:
  r_YY(k) = c_YY(k) / c_YY(0)
– Note that the estimators are biased (division by n rather than by (n−k−1)); this gives a smaller mean square error
– 95% confidence limits for zero correlation: ±1.96/√n
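The biased estimators above can be sketched in a few lines of Python/NumPy (the function name `acf` and the test series are illustrative, not from the slides):

```python
import numpy as np

def acf(y, max_lag):
    """Biased autocorrelation estimate r(k): divide by n (not n-k-1) at every lag."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    d = y - y.mean()
    # c[k] = (1/n) * sum_{t} (Y(t)-m)(Y(t+k)-m), the biased autocovariance
    c = np.array([np.sum(d[:n - k] * d[k:]) / n for k in range(max_lag + 1)])
    return c / c[0]                          # scale by the lag-0 variance

y = [1.0, 2.0, 3.0, 4.0]
r = acf(y, 1)
print(r)                                     # [1.   0.25]
ci = 1.96 / np.sqrt(len(y))                  # 95% limits for zero correlation
```

Dividing by n rather than n−k−1 shrinks the high-lag estimates toward zero, which is exactly the bias/variance trade-off the slide mentions.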

Example autocorrelation function for monthly rainfall station PATAS ( )

Example PATAS continued (1)

Example PATAS continued (2): periodicity with T = 12 months, due to the fixed annual occurrence of the monsoon rainfall

Autocorrelogram of daily rainfall at PATAS (lag shown up to 1 month)

Autocorrelation function of a harmonic
For a harmonic Y(t) = A sin(2πft + φ) it follows:
– the covariance of a harmonic remains a harmonic of the same frequency f, so it preserves frequency information:
  γ_YY(k) = (A²/2) cos(2πfk)
– information on the phase shift φ vanishes
– the amplitude of the covariance is equal to the variance of Y(t): γ_YY(0) = A²/2
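A numerical check of these properties (A, f, and the phase φ are illustrative values; the phase is arbitrary precisely to show it drops out of the covariance):

```python
import numpy as np

A, f, phi, n = 2.0, 0.1, 0.7, 1000           # assumed values; n covers whole periods
t = np.arange(n)
y = A * np.sin(2 * np.pi * f * t + phi)
d = y - y.mean()
# Biased autocovariance estimate at lags 0..10
cov = np.array([np.sum(d[:n - k] * d[k:]) / n for k in range(11)])
# Theoretical covariance of a harmonic: same frequency f, phase phi absent
theory = (A ** 2 / 2) * np.cos(2 * np.pi * f * np.arange(11))
print(np.max(np.abs(cov - theory)) < 0.05)   # True: covariance matches (A^2/2)cos(2*pi*f*k)
```

Changing `phi` leaves `cov` essentially unchanged, while `cov[0]` equals the variance A²/2 = 2.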

Y(t) = harmonic series

Autocovariance and autocorrelation of harmonic

Cross-covariance and cross-correlation function
Cross-covariance relates X(t) to Y(t+k) at lag k.
[Figure: series X and Y plotted against time t, shifted relative to each other by lag k]

Cross-covariance and cross-correlation function
Definition of the cross-covariance function:
  γ_XY(k) = E[(X(t) − μ_X)(Y(t+k) − μ_Y)]
Definition of the cross-correlation function:
  ρ_XY(k) = γ_XY(k) / (σ_X σ_Y)
Note:
– for lag 0: ρ_XY(0) < 1 unless the series are perfectly correlated
– the maximum may occur at a lag ≠ 0
– for two water-level stations along a river, maximum correlation occurs at lag k = distance / celerity
– hence the upstream station X(t) gives maximum correlation with Y(t+k) at the downstream site
– or the downstream site Y(t) gives maximum correlation with X(t−k) at the upstream site

Estimation of cross-covariance and cross-correlation function
Estimation of cross-covariance:
  c_XY(k) = (1/n) Σ_{t=1}^{n−k} (X(t) − m_X)(Y(t+k) − m_Y)
Estimation of cross-correlation:
  r_XY(k) = c_XY(k) / (s_X s_Y)
Note that the estimators are biased, since n rather than (n−k−1) is used as the divisor, but this gives a smaller mean square error.
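A sketch of this estimator applied to the two-stations situation above: here the downstream series Y is the upstream series X delayed by 3 time steps plus noise (the function name `ccf`, the delay, and the noise level are illustrative):

```python
import numpy as np

def ccf(x, y, max_lag):
    """Biased cross-correlation r_XY(k) between X(t) and Y(t+k)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    dx, dy = x - x.mean(), y - y.mean()
    sx, sy = dx.std(), dy.std()                   # ddof=0, consistent with the /n divisor
    return np.array([np.sum(dx[:n - k] * dy[k:]) / n / (sx * sy)
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(1)
x = rng.normal(size=2000)                         # "upstream" series X(t)
y = np.roll(x, 3) + 0.1 * rng.normal(size=2000)   # "downstream" Y(t) = X(t-3) + noise
r = ccf(x, y, 10)
print(int(np.argmax(r)))                          # 3: X(t) leads Y(t+k) at lag k = 3
```

The peak lag recovers the travel time (distance / celerity in the river example), and the peak value stays below 1 because of the added noise.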

Variance spectrum and spectral density function
A plot of the variance of the harmonics versus their frequencies is called the power or variance spectrum.
– If S_p(f) is the ordinate of the continuous spectrum, then the variance contributed by all frequencies in the interval (f, f+df) is given by S_p(f)·df. Hence:
  σ_Y² = ∫₀^∞ S_p(f) df
– Hydrological processes can be considered frequency-limited, hence harmonics with f > f_c do not contribute significantly to the variance of the process. Then:
  σ_Y² = ∫₀^{f_c} S_p(f) df
– f_c = Nyquist or cut-off frequency
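The idea that the variance contributions of the individual harmonics add up to the total variance can be checked numerically via Parseval's theorem (a minimal NumPy sketch with an arbitrary white-noise test series):

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(size=256)
n = len(y)
d = y - y.mean()
Y = np.fft.rfft(d)
# Variance contributed by each Fourier frequency; interior bins are doubled
# because they represent a +f / -f pair of harmonics.
var_parts = (np.abs(Y) ** 2) / n ** 2
var_parts[1:-1] *= 2
print(np.isclose(var_parts.sum(), d.var()))   # True: the spectrum partitions the variance
```

This is the discrete counterpart of σ² = ∫ S_p(f) df: each `var_parts[j]` plays the role of S_p(f)·df for one frequency bin.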

Spectrum (2)
Scaling of the variance spectrum by the variance (to make spectra comparable) gives the spectral density function:
  S_d(f) = S_p(f) / σ_Y²
The spectral density function is the Fourier transform of the autocorrelation function:
  S_d(f) = 2Δt [1 + 2 Σ_{k=1}^∞ ρ_YY(k) cos(2πfkΔt)],  0 ≤ f ≤ f_c = 1/(2Δt)
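As an illustration of this Fourier-transform relation, the sketch below compares the cosine series of S_d(f) against the closed-form spectral density of an AR(1) process, for which ρ(k) = φ^k (Δt = 1; the coefficient φ = 0.6 is an assumed example value, not from the slides):

```python
import numpy as np

phi = 0.6                                   # assumed AR(1) coefficient, rho(k) = phi**k
f = np.linspace(0.0, 0.5, 6)                # frequencies in [0, f_c] with dt = 1
k = np.arange(1, 200)                       # truncate the fast-converging series
rho = phi ** k
# Cosine-series form: S_d(f) = 2 * (1 + 2 * sum_k rho(k) cos(2 pi f k))
s_series = 2 * (1 + 2 * (rho[:, None] * np.cos(2 * np.pi * np.outer(k, f))).sum(axis=0))
# Closed form for AR(1): S_d(f) = 2 (1 - phi^2) / (1 - 2 phi cos(2 pi f) + phi^2)
s_closed = 2 * (1 - phi ** 2) / (1 - 2 * phi * np.cos(2 * np.pi * f) + phi ** 2)
print(np.max(np.abs(s_series - s_closed)) < 1e-6)   # True
```

Note that for white noise (ρ(k) = 0, k > 0) the series collapses to S_d(f) = 2, the value used later for the white-noise confidence limits.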

Estimation of spectrum
– Replace ρ_YY(k) by r_YY(k) for k = 0, 1, 2, …, M
– M = maximum lag for which the acf is estimated; M is to be carefully selected
– To reduce the sampling variance of the estimate, a smoothing function is applied: the spectral density at f_k is estimated as a weighted average of the density at f_{k−1}, f_k, f_{k+1}
– The smoothing function is called a spectral window, and the spectral window has to be carefully designed
– An appropriate window is the Tukey window:
  w(k) = ½ (1 + cos(πk/M)),  k = 0, 1, …, M
– In the frequency domain it implies that f_k is a weighted average according to: ¼ f_{k−1}, ½ f_k, ¼ f_{k+1}
– Bandwidth: B = 4/(3MΔt)
– Number of degrees of freedom: ν = 8n/(3M)

Estimation of spectrum (2)
The spectral estimator s(f) for S_d(f) becomes:
  s(f) = 2Δt [1 + 2 Σ_{k=1}^{M} w(k) r_YY(k) cos(2πfkΔt)]
It is to be estimated at frequencies between 0 and f_c = 1/(2Δt); according to Jenkins and Watts the number of frequency points should be 2 to 3 times (M+1).
(1−α)100% confidence limits follow from the chi-squared distribution with ν degrees of freedom:
  ν s(f)/χ²_{ν, 1−α/2} ≤ S_d(f) ≤ ν s(f)/χ²_{ν, α/2}
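The estimator can be sketched as follows (Python/NumPy, Δt = 1 by default; the function name `tukey_spectrum` and the period-12 test series are illustrative, not from the slides):

```python
import numpy as np

def tukey_spectrum(y, M, dt=1.0):
    """Smoothed spectral density estimate s(f) using a Tukey lag window of width M."""
    y = np.asarray(y, float)
    n = len(y)
    d = y - y.mean()
    c = np.array([np.sum(d[:n - k] * d[k:]) / n for k in range(M + 1)])
    r = c / c[0]                                            # autocorrelation r(k)
    w = 0.5 * (1 + np.cos(np.pi * np.arange(M + 1) / M))    # Tukey window, w(0)=1, w(M)=0
    # About 3(M+1) frequency points between 0 and f_c = 1/(2 dt), per Jenkins & Watts
    freqs = np.arange(3 * (M + 1)) / (3 * (M + 1)) / (2 * dt)
    k = np.arange(1, M + 1)
    s = 2 * dt * (1 + 2 * (w[1:] * r[1:]
                           * np.cos(2 * np.pi * np.outer(freqs, k) * dt)).sum(axis=1))
    return freqs, s

rng = np.random.default_rng(2)
t = np.arange(1200)
y = np.sin(2 * np.pi * t / 12) + 0.5 * rng.normal(size=1200)   # period-12 signal + noise
freqs, s = tukey_spectrum(y, M=120)
print(freqs[np.argmax(s)])                  # close to 1/12 ~ 0.0833
```

With M = 10% of N, the peak of the smoothed estimate lands at the frequency of the injected harmonic, mirroring the PATAS monthly-rainfall example that follows.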

Confidence limits for white noise (s(f) = 2): the variance of the estimate reduces with decreasing M; for n > 25 the variance reduces only slowly

Estimation of spectrum (3)
– To reduce the sampling variance, M should be taken small, say M = 10 to 15% of N (= series length)
– However, a small M leads to a large bandwidth B, and a large B gives smoothing over a large frequency range
– E.g. if one expects significant harmonics with periods of 16 and 24 hours in an hourly series:
  - the frequency difference is 1/16 − 1/24 = 1/48
  - hence: B < 1/48
  - so: M > 4×48/3 = 64
  - by choosing M = 10% of N, then N > 640 data points, or about one month
– Since it is not known in advance which harmonics are significant, the estimation is to be repeated for different M
– White noise: ρ_YY(k) = 0 for k > 0; since the density integrates to 1 over 0 ≤ f ≤ 1/2, it follows that for white noise s(f) ≈ 2

Example of spectrum of monthly rainfall data for station PATAS

Example of spectrum of monthly rainfall data for station PATAS (2)

Example of spectrum of monthly rainfall data for station PATAS (3) Variance is maximum at f = 1/12 (period T = 12 months)