Inference. Estimation for a stationary point process (p.p.) {N(t)}, rate p_N, observed for 0 < t < T. First-order.



Asymptotically normal.

Theorem. Suppose the cumulant spectra are bounded; then N(T) is asymptotically N(T p_N, 2πT f_2(0)). Proof. The normal distribution is determined by its moments.
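A minimal simulation sketch of the theorem, assuming the simplest case of a homogeneous Poisson process, for which 2π f_2(0) reduces to p_N, so the limiting law of N(T) is N(T p_N, T p_N). All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
p_N, T, reps = 3.0, 200.0, 5000          # rate, window length, replications

# counts N(T) of a homogeneous Poisson process observed on (0, T]
counts = rng.poisson(p_N * T, size=reps)

# standardize by the theoretical mean T p_N and variance T p_N
z = (counts - p_N * T) / np.sqrt(p_N * T)
```

The standardized counts z should behave like draws from N(0, 1): sample mean near 0, sample standard deviation near 1.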

Second-order.

Bivariate p.p.

Volkonski and Rozanov (1959): if N_T(I), T = 1, 2, ... is a sequence of point processes with p_{N_T} → 0 as T → ∞, then, under further regularity conditions, the sequence with rescaled time, N_T(I/p_{N_T}), T = 1, 2, ..., tends to a Poisson process.

Perhaps I_{NM}^T(u) is approximately Poisson, with rate T p_{NM}(u).

Take a bin width of L/T, L fixed. N_T(t) has a spike if M has a spike in (t, t+dt] and N has a spike in (t+u, t+u+L/T]; its rate ~ p_{NM}(u) L/T → 0 as T → ∞. Then N_T(I_T) is approximately Poisson, and I_{NM}^T(u) ~ N_T(I_T) is approximately Poisson with mean T p_{NM}(u).
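A simulation sketch of this Poisson approximation, under the simplifying assumption of two independent homogeneous Poisson processes M and N (so p_{NM}(u) = p_M p_N): the count of pairs with lag in a bin of width L/T should be approximately Poisson with mean L p_M p_N, and a Poisson count has variance equal to its mean. Parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
p_M, p_N, T, L, u = 2.0, 2.0, 500.0, 5.0, 1.0
h = L / T                                    # lag bin of width L/T, shrinking with T

counts = []
for _ in range(2000):
    M = np.sort(rng.uniform(0, T, rng.poisson(p_M * T)))
    N = np.sort(rng.uniform(0, T, rng.poisson(p_N * T)))
    # number of pairs (sigma_j, tau_i) with lag tau_i - sigma_j in (u, u + h]
    J = (np.searchsorted(N, M + u + h) - np.searchsorted(N, M + u)).sum()
    counts.append(J)
counts = np.asarray(counts, dtype=float)

# independence gives p_NM(u) = p_M p_N, so E[J] ~ T p_NM(u) (L/T) = L p_M p_N = 20
```

Across replications the sample mean should be near L p_M p_N and the variance-to-mean ratio near 1, as expected for an approximately Poisson count.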

Variance-stabilizing transform for the Poisson: the square root.

For large mean, the Poisson distribution is approximately normal.
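A quick sketch of the stabilization: for Poisson counts the raw variance grows linearly with the mean, while the variance of the square root settles near 1/4 regardless of the mean (the delta-method value). Means chosen here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
raw, stabilized = {}, {}
for mu in (10.0, 100.0, 1000.0):
    x = rng.poisson(mu, size=200_000)
    raw[mu] = x.var()                  # grows linearly with the mean
    stabilized[mu] = np.sqrt(x).var()  # ~ 1/4, whatever the mean
```

This is why square-root-transformed counts are convenient for plotting with common error bars.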

Nonstationary case. p_N(t).

A Poisson case. Rate λ(t) = exp{α + β cos(γt + δ)}, θ = (α, β, γ, δ). Log-likelihood: l(θ) = Σ_i log λ(τ_i) − ∫_0^T λ(t) dt, 0 < t < T.
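A sketch of fitting this model numerically, under illustrative parameter values and with the frequency γ treated as known (estimating γ itself is a harder problem): the spikes are simulated by thinning a dominating homogeneous process, the integral in l(θ) is approximated on a grid, and −l(θ) is minimized with Nelder-Mead.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
T = 500.0
alpha, beta, gamma, delta = 1.0, 0.8, 2 * np.pi * 0.05, 0.3  # illustrative truth

def rate(t, a, b, d):
    # lambda(t) = exp{a + b cos(gamma t + d)}, gamma held fixed
    return np.exp(a + b * np.cos(gamma * t + d))

# simulate the inhomogeneous Poisson process by thinning a rate-lam_max process
lam_max = np.exp(alpha + abs(beta))
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=int(4 * lam_max * T)))
cand = cand[cand < T]
keep = rng.uniform(size=cand.size) < rate(cand, alpha, beta, delta) / lam_max
spikes = cand[keep]

# l(theta) = sum_i log lambda(tau_i) - integral_0^T lambda(t) dt
grid = np.linspace(0.0, T, 50_001)
dt = grid[1] - grid[0]

def negloglik(theta):
    a, b, d = theta
    return -(np.log(rate(spikes, a, b, d)).sum() - rate(grid, a, b, d).sum() * dt)

fit = minimize(negloglik, x0=np.array([0.0, 0.5, 0.0]), method="Nelder-Mead")
a_hat, b_hat, d_hat = fit.x
```

With a few thousand spikes over this window, the estimates of α, β, and δ should land close to the values used in the simulation.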