Estimation of the spectral density function


Estimation of the spectral density function

The spectral density function, f(λ). The spectral density function f(λ) is a symmetric, non-negative function defined on the interval [−π, π] satisfying f(−λ) = f(λ) and γ(h) = ∫_{−π}^{π} e^{ihλ} f(λ) dλ. The spectral density function can be calculated from the autocovariance function, and vice versa: f(λ) = (1/2π) Σ_{h=−∞}^{∞} γ(h) e^{−ihλ}.
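The two-way relationship between f(λ) and γ(h) can be checked numerically. A minimal sketch (not part of the original slides; function names are ours): for an AR(1) process, γ(h) = σ²φ^|h|/(1 − φ²) and f(λ) = σ²/(2π|1 − φe^{−iλ}|²), so a truncated version of the inversion sum should reproduce the closed form.

```python
import numpy as np

def ar1_acvf(phi, sigma2, max_lag):
    """gamma(h) = sigma2 * phi^h / (1 - phi^2) for h = 0..max_lag (AR(1) process)."""
    h = np.arange(max_lag + 1)
    return sigma2 * phi ** h / (1.0 - phi ** 2)

def f_from_acvf(acvf, lam):
    """Truncated inversion: f(lam) ~ (1/2pi) * sum_{|h|<=m} gamma(h) e^{-i h lam}."""
    m = len(acvf) - 1
    lags = np.arange(-m, m + 1)
    gammas = acvf[np.abs(lags)]            # gamma(-h) = gamma(h) by symmetry
    return float(np.real(np.sum(gammas * np.exp(-1j * lags * lam)))) / (2 * np.pi)

phi, sigma2, lam = 0.5, 1.0, 0.7
gamma = ar1_acvf(phi, sigma2, max_lag=200)
f_sum = f_from_acvf(gamma, lam)
# closed form for the AR(1) spectral density under the same normalization
f_exact = sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * lam)) ** 2)
```

Since |φ|^200 is negligible, the truncated sum and the closed form agree to machine precision, and f(−λ) = f(λ) holds by symmetry of the sum.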

Some complex number results: e^{iθ} = cos θ + i sin θ, |e^{iθ}| = 1, and, for λ ≠ 0, Σ_{t=1}^{T} e^{iλt} = e^{iλ}(e^{iλT} − 1)/(e^{iλ} − 1). These identities are used throughout what follows.

Expectations of Linear and Quadratic forms of a weakly stationary Time Series

Expectations, Variances and Covariances of Linear forms

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L = Σ_{t=1}^{T} a_t x_t. Then E[L] = μ Σ_{t=1}^{T} a_t and Var[L] = Σ_r γ(r) Σ_{t∈S_r} a_t a_{t+r}, where S_r = {1, 2, ..., T−r} if r ≥ 0 and S_r = {1−r, 2−r, ..., T} if r ≤ 0.

Proof

Since E[x_t] = μ, E[L] = Σ_t a_t E[x_t] = μ Σ_{t=1}^{T} a_t. Also, since Cov[x_s, x_t] = γ(t − s), Var[L] = Σ_s Σ_t a_s a_t γ(t − s) = Σ_r γ(r) Σ_{t∈S_r} a_t a_{t+r}. Q.E.D.

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L_1 = Σ_{t=1}^{T} a_t x_t and L_2 = Σ_{t=1}^{T} b_t x_t.

Expectations, Variances and Covariances of Linear forms: Summary

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L = Σ_{t=1}^{T} a_t x_t. Then E[L] = μ Σ_{t=1}^{T} a_t and Var[L] = Σ_r γ(r) Σ_{t∈S_r} a_t a_{t+r}, where S_r = {1, 2, ..., T−r} if r ≥ 0 and S_r = {1−r, 2−r, ..., T} if r ≤ 0.

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L_1 = Σ_{t=1}^{T} a_t x_t and L_2 = Σ_{t=1}^{T} b_t x_t. Then Cov[L_1, L_2] = Σ_r γ(r) Σ_{t∈S_r} a_t b_{t+r}, where S_r = {1, 2, ..., T−r} if r ≥ 0 and S_r = {1−r, 2−r, ..., T} if r ≤ 0.

Expectations, Variances and Covariances of Quadratic forms

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series, and let Q = Σ_{s=1}^{T} Σ_{t=1}^{T} a_{st} x_s x_t be a quadratic form in the observations. Then:

and

and S_r = {1, 2, ..., T−r} if r ≥ 0, S_r = {1−r, 2−r, ..., T} if r ≤ 0, and κ(h, r, s) = the fourth-order cumulant = E[(x_t − μ)(x_{t+h} − μ)(x_{t+r} − μ)(x_{t+s} − μ)] − [γ(h)γ(r−s) + γ(r)γ(h−s) + γ(s)γ(h−r)]. Note: κ(h, r, s) = 0 if {x_t : t ∈ T} is normal.

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series, and let Q be a quadratic form in the observations. Then:

and where

Examples. The sample mean: x̄ = (1/T) Σ_{t=1}^{T} x_t.

and Thus

Also

and where

Thus, for large T, Var[x̄] ≈ (2π/T) f(0). Compare this with the value σ²/T obtained when the observations are uncorrelated.

Basic property of the Fejér kernel: if g(λ) is a continuous function, then ∫_{−π}^{π} F_T(λ) g(λ) dλ → g(0) as T → ∞, where F_T(λ) is the Fejér kernel (normalized to integrate to one). Thus T·Var[x̄] → 2π f(0).
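This averaging property can be illustrated numerically. The sketch below is ours, not the slides': it uses the Fejér kernel in the form F_T(λ) = sin²(Tλ/2)/(2πT sin²(λ/2)), which integrates to one over [−π, π], and checks that ∫ F_T g ≈ g(0) for a smooth g.

```python
import numpy as np

def fejer_kernel(lam, T):
    """F_T(lam) = sin^2(T lam / 2) / (2 pi T sin^2(lam / 2)); value T/(2 pi) at lam = 0."""
    lam = np.asarray(lam, dtype=float)
    out = np.full(lam.shape, T / (2 * np.pi))
    nz = np.abs(lam) > 1e-12
    out[nz] = np.sin(T * lam[nz] / 2) ** 2 / (2 * np.pi * T * np.sin(lam[nz] / 2) ** 2)
    return out

T = 1000
grid = np.linspace(-np.pi, np.pi, 200001)   # fine grid: the kernel oscillates at scale pi/T
step = grid[1] - grid[0]
F = fejer_kernel(grid, T)
g = 2.0 + np.cos(grid)                      # continuous test function with g(0) = 3
total = np.sum(F) * step                    # Riemann sum of F_T: close to 1
smoothed = np.sum(F * g) * step             # Riemann sum of F_T * g: close to g(0) = 3
```

As T grows, the kernel's mass concentrates at λ = 0, which is exactly why the variance limit above picks out f(0).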

The sample autocovariance function. The sample autocovariance function is defined by: C_x(h) = (1/T) Σ_{t=1}^{T−|h|} (x_t − x̄)(x_{t+|h|} − x̄)

where x̄ = (1/T) Σ_{t=1}^{T} x_t, or, if μ is known, C_x(h) = (1/T) Σ_{t=1}^{T−|h|} (x_t − μ)(x_{t+|h|} − μ).

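A direct implementation of C_x(h) is straightforward. The sketch below is ours (names included) and supports both the sample-mean version and the known-μ version.

```python
import numpy as np

def sample_acvf(x, max_lag, mu=None):
    """C_x(h) = (1/T) * sum_{t=1}^{T-h} (x_t - c)(x_{t+h} - c),
    where c is the sample mean, or mu when the mean is known."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    c = x.mean() if mu is None else mu
    d = x - c
    return np.array([np.dot(d[: T - h], d[h:]) / T for h in range(max_lag + 1)])

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
C = sample_acvf(x, max_lag=10)
```

Note the divisor is T (not T − h), the usual biased form; it guarantees C_x(0) equals the sample variance and |C_x(h)| ≤ C_x(0).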
Theorem. Assume μ is known and the time series is normal. Then: E[C_x(h)] = γ(h),

and Cov[C_x(h), C_x(r)] ≈ (1/T) Σ_u [γ(u) γ(u + r − h) + γ(u + r) γ(u − h)].

Proof. Assume μ is known and the time series is normal. Then:

where

since

hence

Thus

and Finally

Where

Thus

Expectations, Variances and Covariances of Linear forms: Summary

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L = Σ_{t=1}^{T} a_t x_t. Then E[L] = μ Σ_{t=1}^{T} a_t and Var[L] = Σ_r γ(r) Σ_{t∈S_r} a_t a_{t+r}, where S_r = {1, 2, ..., T−r} if r ≥ 0 and S_r = {1−r, 2−r, ..., T} if r ≤ 0.

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series. Let L_1 = Σ_{t=1}^{T} a_t x_t and L_2 = Σ_{t=1}^{T} b_t x_t. Then Cov[L_1, L_2] = Σ_r γ(r) Σ_{t∈S_r} a_t b_{t+r}, where S_r = {1, 2, ..., T−r} if r ≥ 0 and S_r = {1−r, 2−r, ..., T} if r ≤ 0.

Expectations, Variances and Covariances of Quadratic forms

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series, and let Q = Σ_{s=1}^{T} Σ_{t=1}^{T} a_{st} x_s x_t be a quadratic form in the observations. Then:

and

and S_r = {1, 2, ..., T−r} if r ≥ 0, S_r = {1−r, 2−r, ..., T} if r ≤ 0, and κ(h, r, s) = the fourth-order cumulant = E[(x_t − μ)(x_{t+h} − μ)(x_{t+r} − μ)(x_{t+s} − μ)] − [γ(h)γ(r−s) + γ(r)γ(h−s) + γ(s)γ(h−r)]. Note: κ(h, r, s) = 0 if {x_t : t ∈ T} is normal.

Theorem. Let {x_t : t ∈ T} be a weakly stationary time series, and let Q be a quadratic form in the observations. Then:

Estimation of the spectral density function

The Discrete Fourier Transform

Let x_1, x_2, x_3, ..., x_T denote T observations on a univariate time series with zero mean (if the series has non-zero mean, one uses x_t − x̄ in place of x_t). Also assume that T = 2m + 1 is odd. Then x_t = a_0 + Σ_{k=1}^{m} [a_k cos(λ_k t) + b_k sin(λ_k t)]

where a_k = (2/T) Σ_{t=1}^{T} x_t cos(λ_k t) and b_k = (2/T) Σ_{t=1}^{T} x_t sin(λ_k t), with λ_k = 2πk/T and k = 0, 1, 2, ..., m.

The Discrete Fourier transform: X_k = Σ_{t=1}^{T} x_t e^{−iλ_k t}, k = 0, 1, 2, ..., m.

Note:

Since

Thus

Summary: The Discrete Fourier transform: X_k = Σ_{t=1}^{T} x_t e^{−iλ_k t}, k = 0, 1, 2, ..., m.

Theorem. With λ_k = 2πk/T and λ_h = 2πh/T: E[X_k] = 0, E[|X_k|²] ≈ 2πT f(λ_k), and E[X_k X̄_h] ≈ 0 for k ≠ h,

where

Proof Note Thus

where

Thus Also

with  =2  (k/T)+ with  =2  (h/T)+

Thus and

Defn: The Periodogram: I(λ_k) = (T/4)(a_k² + b_k²) = (1/T) |Σ_{t=1}^{T} x_t e^{−iλ_k t}|², with λ_k = 2πk/T and k = 0, 1, 2, ..., m.
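Under this convention (our reconstruction of the garbled formula: I(λ_k) = (1/T)|Σ x_t e^{−iλ_k t}|²), the periodogram can be computed with an FFT and cross-checked against the (T/4)(a_k² + b_k²) form; function names are ours.

```python
import numpy as np

def periodogram(x):
    """I(lam_k) = (1/T) |sum_t x_t e^{-i lam_k t}|^2 at lam_k = 2*pi*k/T,
    k = 1..m, where T = 2m + 1 (the series is mean-centered first)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    m = (T - 1) // 2
    d = np.fft.fft(x - x.mean())
    return np.abs(d[1 : m + 1]) ** 2 / T

rng = np.random.default_rng(1)
x = rng.standard_normal(201)                    # T = 2m + 1 observations
T = len(x)
m = (T - 1) // 2
t = np.arange(1, T + 1)
xc = x - x.mean()
k = 3
lam_k = 2 * np.pi * k / T
a_k = (2.0 / T) * np.sum(xc * np.cos(lam_k * t))
b_k = (2.0 / T) * np.sum(xc * np.sin(lam_k * t))
I_direct = (T / 4.0) * (a_k ** 2 + b_k ** 2)    # (T/4)(a_k^2 + b_k^2)
I_fft = periodogram(x)[k - 1]                   # same ordinate via the FFT
```

The FFT indexes t from 0 rather than 1, but that only changes the phase of X_k, not its modulus, so the two computations agree.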

Periodogram for the sunspot data

note:

Theorem

In addition: E[I(λ_k)] → 2π f(λ_k); Var[I(λ_k)] → (2π f(λ_k))² if k ≠ 0; and Cov[I(λ_k), I(λ_h)] → 0 if k ≠ h.

Proof Note Let

Recall the basic property of the Fejér kernel: if g(λ) is a continuous function, then ∫_{−π}^{π} F_T(λ) g(λ) dλ → g(0) as T → ∞. The remainder of the proof is similar.

Consistent Estimation of the Spectral Density Function f(λ)

Smoothed Periodogram Estimators

Defn: The Periodogram: I(λ_k) = (T/4)(a_k² + b_k²), k = 0, 1, 2, ..., m

Properties: E[I(λ_k)] → 2π f(λ_k); Var[I(λ_k)] → (2π f(λ_k))² if k ≠ 0; and Cov[I(λ_k), I(λ_h)] → 0 if k ≠ h.

Spectral density Estimator: f̂(λ_k) = I(λ_k)/(2π)

Properties: E[f̂(λ_k)] → f(λ_k), and Var[f̂(λ_k)] → f(λ_k)² if k ≠ 0. The second property states that f̂(λ) is not a consistent estimator of f(λ): its variance does not decrease as T → ∞.

Periodogram spectral density estimator properties: E[f̂(λ_k)] → f(λ_k) (asymptotically unbiased), but Var[f̂(λ_k)] → f(λ_k)² if k ≠ 0. The second property states that f̂(λ) is not a consistent estimator of f(λ).

Examples of using the packages SPSS and Statistica

Example 1 – Sunspot data

Using SPSS: open the data.

Select Graphs -> Time Series -> Spectral.

The following window appears. Select the variable.

Select the window. Choose the periodogram and/or spectral density. Choose whether to plot by frequency or period.

Periodogram spectral density estimator properties: E[f̂(λ_k)] → f(λ_k) (asymptotically unbiased), but Var[f̂(λ_k)] → f(λ_k)² if k ≠ 0. The second property states that f̂(λ) is not a consistent estimator of f(λ).

Smoothed Estimators of the spectral density

The Daniell Estimator: f̂_D(λ_k) = (1/(2d + 1)) Σ_{j=−d}^{d} f̂(λ_{k+j})

Properties

Now let T → ∞ and d → ∞ such that d/T → 0. Then we obtain asymptotically unbiased and consistent estimators; that is, E[f̂_D(λ)] → f(λ) and Var[f̂_D(λ)] → 0.
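As a sketch (our code; handling the edges by reflection is our choice, not the slides'), the Daniell estimator is just a centered moving average of the periodogram ordinates:

```python
import numpy as np

def daniell_smooth(I, d):
    """Average I over 2d+1 neighbouring frequencies:
    out[k] = (1/(2d+1)) * sum_{|j|<=d} I[k+j], with edges reflected."""
    I = np.asarray(I, dtype=float)
    padded = np.r_[I[d:0:-1], I, I[-2 : -d - 2 : -1]]   # reflect d values at each end
    w = np.full(2 * d + 1, 1.0 / (2 * d + 1))           # equal Daniell weights
    return np.convolve(padded, w, mode="valid")
```

Dividing the smoothed ordinates by 2π (under the normalization used here) then gives the spectral density estimate f̂_D(λ_k).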

Choosing the Daniell option in SPSS

k = 5

k = 9

k = 5

Other smoothed estimators

More generally, consider the smoothed periodogram f̂_s(λ_k) = Σ_{|j|≤d} w_j f̂(λ_{k+j}), where the weights satisfy w_j ≥ 0, w_{−j} = w_j and Σ_{|j|≤d} w_j = 1.

Theorem (asymptotic behaviour of smoothed periodogram estimators). Let x_t = Σ_j α_j u_{t−j}, where {u_t} are independent random variables with mean 0 and variance σ², with Σ_j |α_j| < ∞. Let d_T be an increasing sequence such that d_T → ∞ and d_T/T → 0 as T → ∞.

Then the smoothed periodogram estimator is asymptotically unbiased and consistent. Proof: see Fuller, page 292.

Weighted Covariance Estimators. Recall that f̂(λ) = (1/2π) Σ_{|h|<T} C_x(h) e^{−iλh}, where C_x(h) is the sample autocovariance function.

The Weighted Covariance Estimator: f̂_w(λ) = (1/2π) Σ_{|h|≤m} w_m(h) C_x(h) e^{−iλh}, where {w_m(h): h = 0, ±1, ±2, ...} is a sequence of weights such that: i) 0 ≤ w_m(h) ≤ w_m(0) = 1; ii) w_m(−h) = w_m(h); iii) w_m(h) = 0 for |h| > m.
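A sketch of the weighted covariance estimator with Bartlett weights w_m(h) = 1 − |h|/m (our code; it centers at the sample mean and uses the biased autocovariance):

```python
import numpy as np

def weighted_cov_estimate(x, m, lam):
    """f_w(lam) = (1/2pi) * sum_{|h|<=m} w_m(h) C_x(h) e^{-i h lam},
    with Bartlett weights w_m(h) = 1 - |h|/m."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    d = x - x.mean()
    # biased sample autocovariances C_x(0..m)
    C = np.array([np.dot(d[: T - h], d[h:]) / T for h in range(m + 1)])
    h = np.arange(-m, m + 1)
    w = 1.0 - np.abs(h) / m
    return float(np.real(np.sum(w * C[np.abs(h)] * np.exp(-1j * h * lam))) / (2 * np.pi))

rng = np.random.default_rng(2)
x = rng.standard_normal(400)
f_hat = weighted_cov_estimate(x, m=20, lam=0.5)
```

Because the Bartlett weights have a non-negative spectral window (the Fejér kernel), this estimate is guaranteed non-negative, unlike some other lag-window choices.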

The spectral window for this estimator is defined by W_m(λ) = (1/2π) Σ_{|h|≤m} w_m(h) e^{−iλh}. Properties: i) W_m(λ) = W_m(−λ); ii) ∫_{−π}^{π} W_m(λ) dλ = w_m(0) = 1.

Also, f̂_w(λ) = ∫_{−π}^{π} W_m(λ − v) f̂(v) dv ≈ (2π/T) Σ_k W_m(λ − λ_k) f̂(λ_k) (using a Riemann-sum approximation) = the smoothed periodogram estimator.

Asymptotic behaviour for large T:

Examples of w_m(h) = w(h/m): 1. Bartlett: w(x) = 1 − |x| for |x| ≤ 1, 0 otherwise.

2. Parzen: w(x) = 1 − 6x² + 6|x|³ for |x| ≤ 1/2, w(x) = 2(1 − |x|)³ for 1/2 < |x| ≤ 1. 3. Blackman-Tukey: w(x) = 1 − 2a + 2a cos(πx), with a = 0.23 (Hamming), a = 0.25 (Hanning).
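The three lag windows can be written directly (our code; the Parzen formula is the standard piecewise-cubic form, supplied here because the slide's formula was garbled):

```python
import numpy as np

def bartlett_w(x):
    """w(x) = 1 - |x| on [-1, 1], zero outside."""
    x = np.abs(np.asarray(x, dtype=float))
    return np.where(x <= 1, 1 - x, 0.0)

def parzen_w(x):
    """w(x) = 1 - 6x^2 + 6|x|^3 for |x| <= 1/2, 2(1 - |x|)^3 for 1/2 < |x| <= 1."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    lo = x <= 0.5
    hi = (x > 0.5) & (x <= 1)
    out[lo] = 1 - 6 * x[lo] ** 2 + 6 * x[lo] ** 3
    out[hi] = 2 * (1 - x[hi]) ** 3
    return out

def blackman_tukey_w(x, a=0.25):
    """w(x) = 1 - 2a + 2a cos(pi x); a = 0.23 Hamming, a = 0.25 Hanning."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= 1, 1 - 2 * a + 2 * a * np.cos(np.pi * x), 0.0)

xs = np.linspace(-1, 1, 9)
```

All three satisfy the required conditions: w(0) = 1, symmetry, and 0 ≤ w(x) ≤ 1 on the support.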

Spectral windows compared: Daniell, Tukey, Parzen, Bartlett.

Approximate distribution and consistency:

Note: if W_m(λ) is concentrated in a "peak" about λ = 0 and f(λ) is nearly constant over its width, then E[f̂_w(λ)] ≈ f(λ) and Var[f̂_w(λ)] ≈ (f²(λ)/T) Σ_{|h|≤m} w_m²(h).

Confidence Limits in Spectral Density Estimation

1. Satterthwaite's approximation: approximate the distribution of f̂(λ) by that of c·χ²_r, 2. where c and r are chosen so that the means and variances match: cr = E[f̂(λ)] and 2c²r = Var[f̂(λ)].

Thus r = 2(E[f̂(λ)])² / Var[f̂(λ)] = the equivalent degrees of freedom (EDF).

and Now Thus

Confidence limits for the spectral density function f(λ): let χ²_{α/2} and χ²_{1−α/2} denote the upper and lower critical values of the chi-square distribution with r d.f., i.e. P[χ²_{1−α/2} ≤ r f̂(λ)/f(λ) ≤ χ²_{α/2}] ≈ 1 − α. Then a (1 − α)·100% confidence interval for f(λ) is: r f̂(λ)/χ²_{α/2} ≤ f(λ) ≤ r f̂(λ)/χ²_{1−α/2}.
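Given the EDF r, the interval is easy to compute. A sketch using scipy.stats.chi2 (the function name is ours):

```python
from scipy.stats import chi2

def spectral_ci(f_hat, r, alpha=0.05):
    """(1 - alpha)*100% interval for f(lam):
    r*f_hat/chi2_upper <= f(lam) <= r*f_hat/chi2_lower,
    where r is the equivalent degrees of freedom from Satterthwaite's approximation."""
    upper = chi2.ppf(1 - alpha / 2, df=r)   # upper critical value chi^2_{alpha/2}
    lower = chi2.ppf(alpha / 2, df=r)       # lower critical value chi^2_{1-alpha/2}
    return r * f_hat / upper, r * f_hat / lower

lo, hi = spectral_ci(f_hat=2.0, r=10)
```

Note the interval is asymmetric about f̂(λ), as expected for a scaled chi-square pivot.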

Estimation of the spectral density function: Summary

The spectral density function, f(λ). The spectral density function f(λ) is a symmetric, non-negative function defined on the interval [−π, π] satisfying f(−λ) = f(λ) and γ(h) = ∫_{−π}^{π} e^{ihλ} f(λ) dλ.

Periodogram spectral density estimator properties: E[f̂(λ_k)] → f(λ_k) (asymptotically unbiased), but Var[f̂(λ_k)] → f(λ_k)² if k ≠ 0. The second property states that f̂(λ) is not a consistent estimator of f(λ).

Smoothed Estimators of the spectral density

Smoothed Periodogram Estimators: f̂_s(λ_k) = Σ_{|j|≤d} w_j f̂(λ_{k+j}), where w_j ≥ 0, w_{−j} = w_j and Σ_{|j|≤d} w_j = 1. The Daniell estimator takes w_j = 1/(2d + 1).

The Weighted Covariance Estimator: f̂_w(λ) = (1/2π) Σ_{|h|≤m} w_m(h) C_x(h) e^{−iλh}, where {w_m(h): h = 0, ±1, ±2, ...} is a sequence of weights such that: i) 0 ≤ w_m(h) ≤ w_m(0) = 1; ii) w_m(−h) = w_m(h); iii) w_m(h) = 0 for |h| > m.

Choices for w_m(h) = w(h/m): 1. Bartlett: w(x) = 1 − |x| for |x| ≤ 1. 2. Parzen: w(x) = 1 − 6x² + 6|x|³ for |x| ≤ 1/2, w(x) = 2(1 − |x|)³ for 1/2 < |x| ≤ 1. 3. Blackman-Tukey: w(x) = 1 − 2a + 2a cos(πx), with a = 0.23 (Hamming), a = 0.25 (Hanning).

The spectral window for this estimator is defined by W_m(λ) = (1/2π) Σ_{|h|≤m} w_m(h) e^{−iλh}. Properties: i) W_m(λ) = W_m(−λ); ii) ∫_{−π}^{π} W_m(λ) dλ = w_m(0) = 1.

Also, f̂_w(λ) = ∫_{−π}^{π} W_m(λ − v) f̂(v) dv ≈ (2π/T) Σ_k W_m(λ − λ_k) f̂(λ_k) (using a Riemann-sum approximation) = the smoothed periodogram estimator.

Approximate distribution and consistency:

Note: if W_m(λ) is concentrated in a "peak" about λ = 0 and f(λ) is nearly constant over its width, then E[f̂_w(λ)] ≈ f(λ) and Var[f̂_w(λ)] ≈ (f²(λ)/T) Σ_{|h|≤m} w_m²(h).

Confidence limits for the spectral density function f(λ): let χ²_{α/2} and χ²_{1−α/2} denote the upper and lower critical values of the chi-square distribution with r d.f., i.e. P[χ²_{1−α/2} ≤ r f̂(λ)/f(λ) ≤ χ²_{α/2}] ≈ 1 − α. Then a (1 − α)·100% confidence interval for f(λ) is: r f̂(λ)/χ²_{α/2} ≤ f(λ) ≤ r f̂(λ)/χ²_{1−α/2}.

and Now Thus
