Second order stationary p.p.

Second order stationary p.p.

dN(t) = ∫ exp{iλt} dZ_N(λ) dt   (Kolmogorov)
N(t) = ∫ [exp{iλt} − 1] / (iλ) dZ_N(λ)

Power spectrum: cov{dZ_N(λ), dZ_N(μ)} = δ(λ − μ) f_NN(λ) dλ dμ   (remember Z is complex-valued)

Bivariate p.p. {M, N}
Coherency: R_MN(λ) = f_MN(λ) / √{f_MM(λ) f_NN(λ)}   (cp. the correlation coefficient r)
Coherence: |R_MN(λ)|²   (cp. r²)

The Empirical FT. What is the large sample distribution of the EFT?
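
As a concrete illustration (a minimal NumPy sketch; the function name empirical_ft and the vectorised layout are mine, not from the slides), the EFT d_N^T(λ) = ∫_0^T exp{−iλt} dN(t) = Σ_j exp{−iλτ_j} is computed directly from the observed event times τ_j:

import numpy as np

def empirical_ft(event_times, lams):
    # d_N^T(lam) = sum over events t_j in [0, T) of exp(-i * lam * t_j)
    t = np.asarray(event_times, dtype=float)
    lam = np.asarray(lams, dtype=float)
    return np.exp(-1j * np.outer(lam, t)).sum(axis=1)

# e.g. evaluate at the Fourier frequencies 2*pi*r/T, r = 1, ..., 200:
# d = empirical_ft(times, 2 * np.pi * np.arange(1, 201) / T)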

The complex normal.
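
For reference, a standard (circular) form of the complex normal in the scalar case:

Z \sim N_1^{C}(\mu, \sigma^2)
\iff
\operatorname{Re} Z \sim N(\operatorname{Re}\mu,\ \sigma^2/2),\quad
\operatorname{Im} Z \sim N(\operatorname{Im}\mu,\ \sigma^2/2)\ \text{independently},

so that E|Z − μ|² = σ².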

Theorem. Suppose the p.p. N is stationary and mixing; then, for λ ≠ 0, d_N^T(λ) is asymptotically complex normal, N_1^C(0, 2πT f_NN(λ)).

Proof. Write dN(t) = ∫ exp{iλt} dZ_N(λ) dt (Kolmogorov); then
d_N^T(λ) = ∫ Δ^T(λ − α) dZ_N(α), with Δ^T(λ) = ∫_0^T exp{−iλt} dt.
Evaluate the first- and second-order cumulants; bound the higher-order cumulants; the normal distribution is determined by its moments.

Consider

Comments.
- Already used to study the rate estimate
- Tapering makes
- Asymptotic independence is obtained for different frequencies
- The frequencies 2πr/T are special, e.g. Δ^T(2πr/T) = 0 for r ≠ 0
- Asymptotic independence is also obtained if separate stretches are considered
- The p-vector version involves the p × p spectral density matrix f_NN(λ)

Estimation of the (power) spectrum. An estimate whose limit is a random variable

Some moments. The estimate is asymptotically unbiased. The final term drops out if λ = 2πr/T ≠ 0. Best to correct for the mean, i.e. work with d_N^T(λ) − (N(T)/T) Δ^T(λ).

Periodogram values at distinct Fourier frequencies are asymptotically independent, since the d^T values are asymptotically independent; the periodogram ordinates are approximately independent exponentials with means f_NN(λ). Use them to form estimates, e.g. estimate f_NN(λ) by averaging L neighbouring ordinates I_NN^T(2πr/T).
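
A minimal sketch of this recipe (NumPy; the function name, the number of frequencies n_freq and the default span L are illustrative), forming periodogram ordinates at the Fourier frequencies 2πr/T and averaging L neighbours:

import numpy as np

def smoothed_spectrum(event_times, T, n_freq=400, L=10):
    # periodogram I(2*pi*r/T) = |d^T(2*pi*r/T)|^2 / (2*pi*T), r = 1, 2, ...
    # (r = 0 is excluded, so no explicit mean correction is needed)
    lams = 2 * np.pi * np.arange(1, n_freq + 1) / T
    t = np.asarray(event_times, dtype=float)
    d = np.exp(-1j * np.outer(lams, t)).sum(axis=1)
    I = np.abs(d) ** 2 / (2 * np.pi * T)
    # estimate f_NN by averaging L neighbouring ordinates
    k = n_freq // L
    f_hat = I[:k * L].reshape(k, L).mean(axis=1)
    lam_mid = lams[:k * L].reshape(k, L).mean(axis=1)
    return lam_mid, f_hat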

Approximate marginal confidence intervals
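
One standard form, assuming the smoothed estimate behaves like an average of L independent exponential periodogram ordinates with common mean f_NN(λ), so that it is approximately distributed as f_NN(λ) χ²_{2L}/(2L):

\hat f_{NN}^{\,T}(\lambda) \approx f_{NN}(\lambda)\,\frac{\chi^2_{2L}}{2L}
\quad\Longrightarrow\quad
\frac{2L\,\hat f_{NN}^{\,T}(\lambda)}{\chi^2_{2L;\,0.975}}
\;\le\; f_{NN}(\lambda) \;\le\;
\frac{2L\,\hat f_{NN}^{\,T}(\lambda)}{\chi^2_{2L;\,0.025}}

On the log scale the interval has constant width, which is one reason spectra are usually plotted as log f.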

More on choice of L

Approximation to bias

Indirect estimate. One could approximate the p.p. by a 0-1 time series and use a time-series spectral estimate. Choice of cells?
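
A sketch of that indirect route (NumPy; the cell width dt and the names are illustrative): bin the events into counts over cells of width dt, then apply an ordinary time-series periodogram to the binned series.

import numpy as np

def binned_spectrum(event_times, T, dt=0.05):
    # approximate the point process by a count (0-1) series in cells of width dt
    n = int(np.ceil(T / dt))
    counts = np.histogram(event_times, bins=n, range=(0.0, n * dt))[0]
    x = counts - counts.mean()                    # correct for the mean rate
    d = np.fft.rfft(x)                            # ~ d^T(lam) at lam = 2*pi*m/(n*dt)
    lams = 2 * np.pi * np.fft.rfftfreq(n, d=dt)   # angular frequencies
    I = np.abs(d) ** 2 / (2 * np.pi * n * dt)     # periodogram of the binned series
    return lams[1:], I[1:]                        # drop the lam = 0 ordinate

The cell width matters: frequencies are only available up to the Nyquist limit π/dt, and binning distorts the estimate near it, which is one way the choice of cells enters.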

Estimation of a finite-dimensional parameter θ. Approximate likelihood (assuming the I^T values are independent exponentials).
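
A sketch of that approximate (Whittle-type) likelihood (NumPy/SciPy; the parametric spectrum f_model, the starting value and the optimiser choice are illustrative):

import numpy as np
from scipy.optimize import minimize

def whittle_nll(theta, lams, I, f_model):
    # negative log likelihood treating periodogram ordinates I(lam_r) as
    # independent exponentials with means f_theta(lam_r)
    f = f_model(lams, theta)
    return np.sum(np.log(f) + I / f)

# usage sketch:
# theta_hat = minimize(whittle_nll, x0=theta0,
#                      args=(lams, I, f_model), method="Nelder-Mead").x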

Bivariate case.

Cross-periodogram. Smoothed periodogram.
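
A sketch of the bivariate computation (NumPy; the names and the block-averaging scheme are illustrative): auto- and cross-periodograms at the Fourier frequencies, smoothed over L neighbouring ordinates, and the resulting coherence estimate |R_MN(λ)|².

import numpy as np

def coherence_estimate(times_M, times_N, T, n_freq=400, L=10):
    lams = 2 * np.pi * np.arange(1, n_freq + 1) / T
    dM = np.exp(-1j * np.outer(lams, np.asarray(times_M, dtype=float))).sum(axis=1)
    dN = np.exp(-1j * np.outer(lams, np.asarray(times_N, dtype=float))).sum(axis=1)
    I_MM = np.abs(dM) ** 2 / (2 * np.pi * T)   # auto-periodograms
    I_NN = np.abs(dN) ** 2 / (2 * np.pi * T)
    I_MN = dM * np.conj(dN) / (2 * np.pi * T)  # cross-periodogram
    k = n_freq // L
    def block(a):
        return a[:k * L].reshape(k, L).mean(axis=1)   # smooth over L ordinates
    f_MM, f_NN, f_MN = block(I_MM), block(I_NN), block(I_MN)
    coh = np.abs(f_MN) ** 2 / (f_MM * f_NN)    # estimated coherence
    return block(lams), coh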

Complex Wishart

Predicting N via M
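
A sketch of the standard frequency-domain form this refers to: predict N linearly from M using the transfer function A(λ) = f_NM(λ)/f_MM(λ); the spectrum of the prediction error is then

f_{NN}(\lambda)\,\bigl(1 - |R_{MN}(\lambda)|^{2}\bigr),

so high coherence means good linear predictability of N from M at frequency λ.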

Plug in estimates.

Large sample distributions. var log|A^T(λ)| ≈ [|R_MN(λ)|^{-2} − 1]/L;  var arg A^T(λ) ≈ [|R_MN(λ)|^{-2} − 1]/L.
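
Used with a normal approximation and the estimated coherency plugged in, these give approximate confidence intervals for the log-gain and phase (a sketch, assuming the variance expressions above):

\log|\hat A^{T}(\lambda)| \;\pm\; z_{\alpha/2}\,
\sqrt{\bigl[\,|\hat R_{MN}(\lambda)|^{-2} - 1\,\bigr]/L},
\qquad
\arg \hat A^{T}(\lambda) \;\pm\; z_{\alpha/2}\,
\sqrt{\bigl[\,|\hat R_{MN}(\lambda)|^{-2} - 1\,\bigr]/L}.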

Networks: partial spectra, trivariate (M, N, O). Remove the linear time-invariant effect of M from N and O and examine the coherency of the residuals.
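
In the usual notation this amounts to the partial cross-spectrum of N and O given M and the corresponding partial coherency (standard definitions, stated here for concreteness):

f_{NO\cdot M}(\lambda) \;=\; f_{NO}(\lambda) \;-\; \frac{f_{NM}(\lambda)\, f_{MO}(\lambda)}{f_{MM}(\lambda)},
\qquad
R_{NO\cdot M}(\lambda) \;=\;
\frac{R_{NO}(\lambda) - R_{NM}(\lambda)\, R_{MO}(\lambda)}
{\sqrt{\bigl(1 - |R_{NM}(\lambda)|^{2}\bigr)\bigl(1 - |R_{MO}(\lambda)|^{2}\bigr)}}.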

Point process system. An operation S carrying a point process M into another point process N:
N = S[M],  S = {input, mapping, output}
Pr{dN(t) = 1 | M} = {μ + ∫ a(t − u) dM(u)} dt
System identification: determining the characteristics of a system from inputs and corresponding outputs {M(t), N(t); 0 ≤ t < T}.
Like regression vs. bivariate (correlation) analysis.
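
A sketch of the identification step (NumPy; the names and smoothing scheme are illustrative): estimate the transfer function A(λ) = f_NM(λ)/f_MM(λ) by plugging in smoothed cross- and auto-spectral estimates.

import numpy as np

def transfer_function_estimate(times_M, times_N, T, n_freq=400, L=10):
    lams = 2 * np.pi * np.arange(1, n_freq + 1) / T
    dM = np.exp(-1j * np.outer(lams, np.asarray(times_M, dtype=float))).sum(axis=1)
    dN = np.exp(-1j * np.outer(lams, np.asarray(times_N, dtype=float))).sum(axis=1)
    I_MM = np.abs(dM) ** 2 / (2 * np.pi * T)   # input auto-periodogram
    I_NM = dN * np.conj(dM) / (2 * np.pi * T)  # output-input cross-periodogram
    k = n_freq // L
    def block(a):
        return a[:k * L].reshape(k, L).mean(axis=1)   # smooth over L ordinates
    A_hat = block(I_NM) / block(I_MM)          # complex: gain |A_hat|, phase arg A_hat
    return block(lams), A_hat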

Realizable case. a(u) = 0 for u < 0; A(λ) is then of special form. E.g. pure delay: N(t) = M(t − τ), arg A(λ) = −λτ.
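
For the pure-delay example the phase is linear in λ, so the delay can be read off the slope of the estimated phase; a small sketch (illustrative), given frequencies lams and a transfer-function estimate A_hat such as the one returned above:

import numpy as np

def delay_estimate(lams, A_hat):
    # arg A(lam) ~ -lam * tau, so fit a line to the unwrapped phase
    phase = np.unwrap(np.angle(A_hat))
    return -np.polyfit(lams, phase, 1)[0]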

Advantages of the frequency domain approach.
- techniques for many stationary processes look the same
- approximately i.i.d. sample values
- assessing models (character of departure)
- time-varying variant
- ...