Stochastic Process Theory and Spectral Estimation

Presentation transcript:

Stochastic Process Theory and Spectral Estimation Bijan Pesaran Center for Neural Science New York University

Overview: stochastic process theory (see Appendix); spectral estimation.

Fourier Transform. Real functions have conjugate-symmetric transforms. Parseval's Theorem: total power is conserved.
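In standard notation (the symbols below are conventional, not taken from the slides), the Fourier transform, the symmetry for real signals, and Parseval's theorem read

    \tilde{x}(f) = \int_{-\infty}^{\infty} x(t)\, e^{-2\pi i f t}\, dt, \qquad
    \tilde{x}(-f) = \tilde{x}^{*}(f) \ \text{for real } x(t), \qquad
    \int_{-\infty}^{\infty} |x(t)|^{2}\, dt = \int_{-\infty}^{\infty} |\tilde{x}(f)|^{2}\, df.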

Examples of Fourier Transforms. (Figure: example signals shown in the time domain alongside their frequency-domain transforms.)

Time translation invariance leads directly to spectral analysis: the Fourier basis is the eigenbasis of the time-translation operator.

Implications for the second moment: if the process is stationary, the second moment is time-translation invariant, so the correlation function depends only on the time lag.
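In conventional notation (a sketch of the standard result, not the slide's own equations), the time-domain statement and its frequency-domain consequence are

    \langle x(t)\, x(t') \rangle = C(t - t'), \qquad
    \langle \tilde{x}(f)\, \tilde{x}^{*}(f') \rangle = S(f)\, \delta(f - f'),

where S(f), the spectrum, is the Fourier transform of the correlation function C(\tau) (the Wiener-Khinchin relation).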

Stationarity. Stationarity means neighboring frequencies are uncorrelated; the same is not true for neighboring times, whose correlations are in general nonzero.

Cross-spectrum and coherence

Coherence. Coherence measures the linear association between two time series. The cross-spectrum is the Fourier transform of the cross-correlation function.
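A standard way to write these quantities (a hedged sketch in conventional notation; normalization conventions vary): for processes x and y with spectra S_x(f), S_y(f) and cross-correlation C_{xy}(\tau),

    S_{xy}(f) = \int C_{xy}(\tau)\, e^{-2\pi i f \tau}\, d\tau, \qquad
    \gamma_{xy}(f) = \frac{S_{xy}(f)}{\sqrt{S_{x}(f)\, S_{y}(f)}},

with the coherence taken as the magnitude |\gamma_{xy}(f)| (or its square), a number between 0 and 1.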

Coherence: the phase of the coherency varies with frequency and can be read as a frequency-dependent time delay between the two time series.

Advantages of spectral estimation: neighboring frequency bins are uncorrelated; error bars are relatively easy to calculate; the statistical estimators are stable; signals that occur together but have different frequencies can be separated; and the normalized quantities allow averaging and comparisons.

Estimating spectra. Simple spectral estimates: the periodogram, its bias and variance; tapering, which amounts to smoothing the spectrum; multitaper estimates using Slepians; spectrum and coherence.
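For reference, the periodogram of N samples x_0, ..., x_{N-1} is (standard definition, written in conventional notation)

    I_N(f) = \frac{1}{N} \left| \sum_{t=0}^{N-1} x_t\, e^{-2\pi i f t} \right|^{2}.

Its expectation is the true spectrum smoothed by a spectral window (bias), and its variance does not shrink as N grows, which is what motivates tapering and multitaper averaging.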

Example LFP spectrum. (Figure: single-trial periodogram compared with a single-trial multitaper estimate, 2NT = 10.)

Spectral estimation problem. The Fourier transform requires an infinite sequence of data; in reality we only have finite sequences, so we compute a truncated DFT.

What happens if we have a finite sequence of data? Truncation means the DFT of the data is the convolution of the true transform with the transform of the rectangular window.

The Fourier transform of a rectangular window is the Dirichlet kernel: multiplying by the window in the time domain corresponds to convolving with the Dirichlet kernel in the frequency domain.
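Concretely, for a length-N rectangular window (standard form, conventional notation),

    D_N(f) = \sum_{n=0}^{N-1} e^{-2\pi i f n} = e^{-i\pi f (N-1)}\, \frac{\sin(N\pi f)}{\sin(\pi f)},

a kernel with a central lobe of width about 1/N and sidelobes that decay only slowly with frequency.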

Bias Bias is the difference between the expected value of an estimator and the true value. The Dirichlet kernel is not a delta function, therefore the sample estimate is biased and doesn’t equal the true value.

Narrowband bias: local bias due to the central lobe. Broadband bias: bias leaking in from distant frequencies through the sidelobes. (Figure: normalized Dirichlet kernel, with the 20% height level marked.)

Data tapers. We can do better than multiplying the data by a rectangular kernel: choose a function that tapers the data to zero towards the edges of the segment. Many choices of data taper exist: the Hanning taper, Hamming taper, triangular taper and so on.

Triangular taper: broadens the central lobe, reduces the sidelobes. (Figure: the Fejér kernel, for the triangular taper, compared with the Dirichlet kernel, for the rectangular taper.)
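The Fejér kernel has the standard form

    F_N(f) = \frac{1}{N}\, |D_N(f)|^{2} = \frac{1}{N} \left( \frac{\sin(N\pi f)}{\sin(\pi f)} \right)^{2},

whose sidelobes fall off roughly as 1/f^2 rather than 1/f, at the price of a central lobe about twice as wide as the Dirichlet kernel's.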

Spectral concentration problem. Tapering the data reduces the sidelobes but broadens the central lobe. Are there "optimal" tapers? Find strictly time-localized functions whose Fourier transforms are maximally localized on the frequency interval [-W, W].

Optimal tapers: given the DFT of a finite series, find the series that maximizes the fraction of its energy falling in the [-W, W] frequency band.
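In the usual formulation (a sketch in conventional notation), for a sequence u_0, ..., u_{N-1} with transform U(f) one maximizes the concentration ratio

    \lambda = \frac{\int_{-W}^{W} |U(f)|^{2}\, df}{\int_{-1/2}^{1/2} |U(f)|^{2}\, df},
    \qquad U(f) = \sum_{n=0}^{N-1} u_n\, e^{-2\pi i f n}.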

Discrete Prolate Spheroidal Sequences. Solved by Slepian, Landau and Pollak. The solutions are an orthogonal family of sequences satisfying the following eigenvalue equation.
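A standard way to write that eigenvalue equation (conventional notation; the symbols are not taken from the slides) is

    \sum_{m=0}^{N-1} \frac{\sin\big(2\pi W (n - m)\big)}{\pi (n - m)}\, u^{(k)}_m = \lambda_k\, u^{(k)}_n,
    \qquad n = 0, \ldots, N-1,

with the n = m term taken to be 2W and eigenvalues ordered 1 > \lambda_0 > \lambda_1 > \cdots > 0.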

Slepian functions: eigenvectors of the eigenvalue equation; orthonormal on [-1/2, 1/2] and orthogonal on [-W, W]. K = 2WT - 1 eigenvalues are close to 1 and the rest are close to 0, corresponding to 2WT - 1 functions concentrated within [-W, W].

(Figure: power of the kth Slepian function within the bandwidth [-W, W].)

Comparing Slepian functions Systematic trade-off between narrowband and broadband bias

Advantages of Slepian tapers: using multiple tapers recovers the edges of the time window. (Figure: tapers for 2WT = 6.)

Multitaper spectral estimation. Each data taper provides an uncorrelated estimate; average over them to get the spectral estimate. Treat different trials as additional tapers and average over them as well.
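A minimal sketch of this procedure in Python, using SciPy's DPSS tapers (the function name, default choices and sampling-rate handling below are illustrative assumptions, not code from the talk):

    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_spectrum(x, fs, NW=3.0, K=None):
        # Hypothetical sketch: direct multitaper estimate with Slepian (DPSS) tapers.
        # x: 1-D data segment, fs: sampling rate (Hz), NW: time-bandwidth product,
        # K: number of tapers (default 2*NW - 1, the tapers whose eigenvalues are near 1).
        N = len(x)
        K = int(2 * NW - 1) if K is None else K
        tapers = dpss(N, NW, Kmax=K)              # shape (K, N)
        # Taper the data K ways, Fourier transform each tapered copy,
        # and average the squared magnitudes across tapers.
        xk = np.fft.rfft(tapers * x[None, :], axis=1)
        S = np.mean(np.abs(xk) ** 2, axis=0) / fs
        freqs = np.fft.rfftfreq(N, d=1.0 / fs)
        return freqs, S

Trials can be treated as extra tapers by stacking their tapered transforms before the average, as the slide suggests.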

Cross-spectrum and coherency
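In the same conventional notation (a sketch, not the slide's own equations), the multitaper cross-spectrum and coherency are built from the tapered transforms \tilde{x}_k(f) and \tilde{y}_k(f):

    \hat{S}_{xy}(f) = \frac{1}{K} \sum_{k=1}^{K} \tilde{x}_k(f)\, \tilde{y}^{*}_k(f),
    \qquad
    \hat{\gamma}_{xy}(f) = \frac{\hat{S}_{xy}(f)}{\sqrt{\hat{S}_{xx}(f)\, \hat{S}_{yy}(f)}}.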

Advantages of multiple tapers. Increasing the number of tapers reduces the variance of the spectral estimators and gives explicit control of the trade-off between narrowband bias, broadband bias and variance: a "better microscope", a local frequency basis for analyzing signals.

Time-frequency resolution: control resolution in the time-frequency plane using the parameters T and W of the Slepians. (Figure: a time-frequency tile of width T in time and 2W in frequency.)

Example LFP spectrograms. (Figure: multitaper estimates with T = 0.5 s, W = 10 Hz and with T = 0.2 s, W = 25 Hz.)

Summary. Time series present particular challenges for statistical analysis; spectral analysis is a valuable form of time series analysis.

Appendix

Data is modeled as a stochastic process. (Figure: example spike and LFP recordings.) Similar considerations apply to EEG, MEG, ECoG, intracellular membrane potentials, intrinsic and extrinsic optical images, 2-photon line scans and so on.

Stochastic process theory: defining stochastic processes; time translation invariance; ergodicity; moments (correlation functions) and spectra; example Gaussian processes.

Stochastic processes. Each time series is a realization of a stochastic process. Given a sequence of observations at a set of times, a stochastic process is characterized by their joint probability distribution; it is akin to rolling a die for each time series, with a probability distribution over whole time series. The alternative is a deterministic process, which has no stochastic variability.
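Written out (conventional notation, sketching what the slide describes): for observations at times t_1 < t_2 < \cdots < t_n, the process is characterized by the joint distribution

    P\big[x(t_1), x(t_2), \ldots, x(t_n)\big].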

Defining stochastic processes: as high-dimensional random variables, where rolling one die picks a point, i.e. a function, in N-dimensional space; or as indexed families of random variables, rolling many dice.

Challenge of data analysis. We can never know the full probability distribution of the data: the curse of dimensionality.

Parametric methods. Parametric methods infer the PDF by considering a parameterized subspace, employing relatively strong models of the underlying process.

Non-parametric methods. Non-parametric methods use the observed data to infer statistical properties of the PDF, employing relatively weak models of the underlying process.

Stationarity. Stochastic processes don't exactly repeat themselves, but they have statistical regularities: stationarity.

Ergodicity. Ensemble averages are equivalent to time averages; this is often assumed in experimental work and is more stringent than stationarity. A process equal to a random constant is not ergodic unless that constant can take only one value. Is activity in which that constant varies with time ergodic?
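In symbols (a sketch in conventional notation), ergodicity of the mean says that time averages converge to the ensemble average:

    \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} x(t)\, dt = \langle x(t) \rangle.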

Gaussian processes: the Ornstein-Uhlenbeck process and the Wiener process.

Ornstein-Uhlenbeck Process: exponentially decaying correlation function; obtained by passing white noise through a 'leaky' integrator; the spectrum is Lorentzian.
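In one common parameterization (a sketch; the symbols \theta and \sigma are assumptions, not from the slides), the process, its correlation function and its Lorentzian spectrum are

    dx(t) = -\theta\, x(t)\, dt + \sigma\, dW(t), \qquad
    C(\tau) = \frac{\sigma^{2}}{2\theta}\, e^{-\theta |\tau|}, \qquad
    S(f) = \frac{\sigma^{2}}{\theta^{2} + (2\pi f)^{2}}.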

Ornstein Uhlenbeck process

Markovian process: "the future depends on the past only through the present". This simplifies the joint probability density.
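In symbols (conventional notation), the Markov property and the factorization it gives are

    P\big(x_n \mid x_{n-1}, \ldots, x_1\big) = P\big(x_n \mid x_{n-1}\big)
    \;\Rightarrow\;
    P\big(x_1, \ldots, x_n\big) = P(x_1) \prod_{k=2}^{n} P\big(x_k \mid x_{k-1}\big).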