The Empirical FT. What is the large sample distribution of the EFT?


The complex normal. Z = X + iY is complex normal when (X, Y) is jointly Gaussian; in the mean-zero circular case E[ZZᵀ] = 0 and the law is determined by Σ = E[Z Z̄ᵀ].

Theorem. Suppose X is stationary and mixing. Then for λ ≢ 0 (mod 2π) the empirical FT dᵀ(λ) = Σ_{t=0}^{T−1} X(t) e^{−iλt} is asymptotically complex normal with mean 0 and variance 2πT f_XX(λ).

Proof sketch.
- Write out dᵀ(λ).
- Evaluate the first- and second-order cumulants.
- Bound the higher-order cumulants.
- The normal is determined by its moments.
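The limiting complex normality can be checked numerically. A minimal simulation sketch (numpy assumed; the AR(1) model with φ = 0.6, T = 512, the frequency, and the replicate count are all illustrative choices, not from the slides): replicate dᵀ(λ) and standardize by the theoretical variance 2πT f(λ), so the real and imaginary parts should each be approximately N(0, 1/2).

```python
import numpy as np

rng = np.random.default_rng(0)
T, reps, burn = 512, 2000, 100
phi = 0.6                                  # illustrative AR(1) coefficient
lam = 2 * np.pi * 50 / T                   # a Fourier frequency away from 0

e = rng.standard_normal((reps, T + burn))
x = np.zeros((reps, T + burn))
for t in range(1, T + burn):               # simulate X(t) = phi*X(t-1) + e(t)
    x[:, t] = phi * x[:, t - 1] + e[:, t]
x = x[:, burn:]                            # drop burn-in

d = x @ np.exp(-1j * lam * np.arange(T))   # empirical FT d^T(lam), one per replicate
f_lam = 1 / (2 * np.pi * abs(1 - phi * np.exp(-1j * lam)) ** 2)   # AR(1) spectrum
z = d / np.sqrt(2 * np.pi * T * f_lam)     # standardize by the limit variance
print(z.real.mean(), z.real.var(), z.imag.var())   # ~ 0, ~ 0.5, ~ 0.5
```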

Consider

Comments.
- Already used to study the mean estimate.
- Tapering, h(t)X(t).
- Get asymptotic independence for different frequencies.
- The frequencies 2πr/T are special, e.g. Δᵀ(2πr/T) = 0 for r ≠ 0.
- Also get asymptotic independence if we consider separate stretches.
- The p-vector version involves the p × p spectral density matrix f_XX(λ).
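The special role of the Fourier frequencies can be seen directly: Δᵀ(λ) = Σ_{t=0}^{T−1} exp(−iλt) vanishes exactly at λ = 2πr/T for r ≠ 0 (mod T). A quick numerical check (numpy assumed; T = 128 is an arbitrary choice):

```python
import numpy as np

T = 128
t = np.arange(T)

def Delta(lam):
    # Delta^T(lam) = sum_{t=0}^{T-1} exp(-i*lam*t)
    return np.sum(np.exp(-1j * lam * t))

vals = {r: abs(Delta(2 * np.pi * r / T)) for r in [0, 1, 5, 63]}
print(vals)   # T at r = 0, numerically zero elsewhere
```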

Estimation of the (power) spectrum. The periodogram Iᵀ(λ) = |dᵀ(λ)|² / (2πT) is an estimate whose limit is a random variable, not the parameter: it does not converge to f(λ).

Some moments.
- The estimate is asymptotically unbiased.
- The final term drops out if λ = 2πr/T, r ≠ 0.
- Best to correct for the mean, i.e. work with the mean-corrected series X(t) − X̄.
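A small sketch of these moment facts (numpy assumed; the unit-variance white-noise series, the nonzero mean of 3, and T are illustrative): after mean correction, the periodogram ordinates at the nonzero Fourier frequencies average out near the true level f = 1/(2π) ≈ 0.159, even though each individual ordinate stays exponentially scattered.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1024
x = rng.standard_normal(T) + 3.0          # unit-variance white noise, nonzero mean
x = x - x.mean()                          # correct for the mean first
d = np.fft.fft(x)                        # d^T(2*pi*r/T), r = 0, ..., T-1
I = np.abs(d) ** 2 / (2 * np.pi * T)      # periodogram
m = I[1:T // 2].mean()                    # average over nonzero Fourier frequencies
print(m)                                  # close to f(lam) = 1/(2*pi) ~ 0.159
```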

Periodogram values at distinct Fourier frequencies are asymptotically independent, since the dᵀ values are; asymptotically the Iᵀ(2πr/T) are independent exponentials with mean f(2πr/T). Use this to form estimates by averaging.
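Averaging L neighbouring, asymptotically independent exponential ordinates gives a consistent estimate. A sketch with illustrative T and L, assuming numpy (white noise, so the target level is 1/(2π)):

```python
import numpy as np

rng = np.random.default_rng(2)
T, L = 4096, 32                           # L = number of ordinates averaged
x = rng.standard_normal(T)                # white noise, f(lam) = 1/(2*pi)
I = np.abs(np.fft.fft(x)) ** 2 / (2 * np.pi * T)
I = I[1:T // 2]                           # drop r = 0, keep one half
fhat = I[: (len(I) // L) * L].reshape(-1, L).mean(axis=1)   # average L at a time
print(fhat.mean(), fhat.std())            # mean ~ 1/(2*pi); spread shrinks like 1/sqrt(L)
```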

Approximate marginal confidence intervals
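One standard construction of such intervals (scipy assumed; the values of L and of the estimate are hypothetical): treat 2L·f̂(λ)/f(λ) as approximately chi-squared on 2L degrees of freedom and invert.

```python
from scipy.stats import chi2

L = 32                      # hypothetical number of ordinates averaged
fhat = 0.16                 # hypothetical smoothed spectral estimate at lam
alpha = 0.05
# 2*L*fhat(lam)/f(lam) is approximately chi-squared on 2L degrees of freedom
lo = 2 * L * fhat / chi2.ppf(1 - alpha / 2, 2 * L)
hi = 2 * L * fhat / chi2.ppf(alpha / 2, 2 * L)
print(lo, hi)               # approximate 95% interval for f(lam)
```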

More on the choice of L: increasing L reduces the variance (∝ 1/L) but increases the bias, since the estimate averages f over a band of width ≈ 2πL/T.

Approximation to bias

Indirect estimate: transform the sample autocovariances, downweighted by a lag window, back to the frequency domain.

Estimation of a finite-dimensional parameter θ: maximize an approximate likelihood formed by treating the Iᵀ values as independent exponentials with means f(λ_r; θ).
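This approximate likelihood is the Whittle likelihood. A sketch for a single AR(1) parameter (numpy and scipy assumed; φ = 0.5, T, and the unit innovation variance are illustrative simulation choices):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
T, burn, phi_true = 2048, 100, 0.5
e = rng.standard_normal(T + burn)
x = np.zeros(T + burn)
for t in range(1, T + burn):
    x[t] = phi_true * x[t - 1] + e[t]     # simulate AR(1) data
x = x[burn:]
x = x - x.mean()                          # mean-correct

I = np.abs(np.fft.fft(x)) ** 2 / (2 * np.pi * T)
lam = 2 * np.pi * np.arange(1, T // 2) / T
I = I[1:T // 2]

def f_ar1(phi, lam):
    # AR(1) spectral density; unit innovation variance assumed (true here)
    return 1.0 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * lam)) ** 2)

def neg_whittle(phi):
    f = f_ar1(phi, lam)
    return np.sum(np.log(f) + I / f)      # minus the exponential log likelihood

res = minimize_scalar(neg_whittle, bounds=(-0.99, 0.99), method="bounded")
print(res.x)                              # close to phi_true = 0.5
```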

Bivariate case.

Cross-periodogram, and its smoothed version.
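A sketch of the smoothed cross-periodogram and the squared coherence it yields (numpy assumed; the lag-1 relation and the noise level are illustrative, chosen so the true coherence is 1/1.25 = 0.8):

```python
import numpy as np

rng = np.random.default_rng(4)
T, L = 4096, 64
x = rng.standard_normal(T)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(T)   # Y(t) = X(t-1) + noise

dx, dy = np.fft.fft(x), np.fft.fft(y)

def smooth(J):
    J = J[1:T // 2]
    return J[: (len(J) // L) * L].reshape(-1, L).mean(axis=1)

fxy = smooth(dx * np.conj(dy) / (2 * np.pi * T))   # smoothed cross-periodogram
fxx = smooth(np.abs(dx) ** 2 / (2 * np.pi * T))
fyy = smooth(np.abs(dy) ** 2 / (2 * np.pi * T))
coh = np.abs(fxy) ** 2 / (fxx * fyy)               # squared coherence estimate
print(coh.mean())                                  # ~ 0.8
```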

Complex Wishart

Predicting Y via X
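Predicting Y from X at frequency λ uses the complex regression coefficient A(λ) = f_YX(λ)/f_XX(λ). A sketch with an instantaneous relation Y = 2X + noise (numpy assumed; the gain and noise level are illustrative), where the plug-in estimate should recover |A| ≈ 2:

```python
import numpy as np

rng = np.random.default_rng(5)
T, L = 4096, 64
x = rng.standard_normal(T)
y = 2.0 * x + rng.standard_normal(T)       # Y = 2X + noise, so A(lam) = 2

dx, dy = np.fft.fft(x), np.fft.fft(y)

def smooth(J):
    J = J[1:T // 2]
    return J[: (len(J) // L) * L].reshape(-1, L).mean(axis=1)

fyx = smooth(dy * np.conj(dx) / (2 * np.pi * T))   # smoothed cross-periodogram f_YX
fxx = smooth(np.abs(dx) ** 2 / (2 * np.pi * T))
A = fyx / fxx                                      # plug-in regression coefficient
print(np.abs(A).mean())                            # ~ 2
```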

Plug in estimates.

Large sample distributions.
var log|Âᵀ(λ)| ≈ [|R(λ)|⁻² − 1]/L
var arg Âᵀ(λ) ≈ [|R(λ)|⁻² − 1]/L
where R(λ) = R_YX(λ) is the coherency.

Advantages of the frequency domain approach.
- Techniques for many stationary processes look the same.
- Approximately i.i.d. sample values.
- Assessing models (character of departure).
- Time-varying variant…