Point processes on the line. Nerve firing.

Stochastic point process. Building blocks. A process on $R$, $\{N(t)\}$, $t \in R$, with a consistent set of distributions
$\Pr\{N(I_1)=k_1, \ldots, N(I_n)=k_n\}$, the $k_1, \ldots, k_n$ integers $\geq 0$, the $I$'s Borel sets of $R$.
Consistency example. If $I_1$, $I_2$ are disjoint,
$\Pr\{N(I_1)=k_1,\; N(I_2)=k_2,\; N(I_1 \cup I_2)=k_3\} = 1$ if $k_1 + k_2 = k_3$, $= 0$ otherwise.
Guttorp book, Chapter 5.
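
A minimal simulation sketch of these building blocks (plain Python/NumPy; the rate, interval endpoints, and helper names are illustrative, not from the slides): it draws a homogeneous Poisson process and checks the slide's consistency requirement pathwise.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_process(rate, T):
    """Homogeneous Poisson process on [0, T]: the total count is
    Poisson(rate*T) and, given the count, the points are i.i.d. uniform."""
    n = rng.poisson(rate * T)
    return np.sort(rng.uniform(0.0, T, size=n))

def N(points, a, b):
    """Counting measure N(I) for the interval I = (a, b]."""
    return int(np.sum((points > a) & (points <= b)))

# Consistency example from the slide: for disjoint I1, I2,
# N(I1) + N(I2) = N(I1 u I2) holds for every realization.
tau = poisson_process(rate=5.0, T=10.0)
assert N(tau, 0, 2) + N(tau, 2, 5) == N(tau, 0, 5)
```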

Points: $\ldots \leq \tau_{-1} \leq \tau_0 \leq \tau_1 \leq \ldots$, the discontinuities of $\{N\}$; $N(t) = \#\{j : 0 < \tau_j \leq t\}$.
Simple: $\tau_j \neq \tau_k$ if $j \neq k$; the points are isolated, so $dN(t) = 0$ or $1$.
Surprise. A simple point process is determined by its void probabilities, $\Pr\{N(I) = 0\}$, $I$ compact.
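
The void probabilities can at least be illustrated numerically. A hedged sketch, assuming a homogeneous Poisson process of rate $p$, for which $\Pr\{N(I)=0\} = \exp(-p|I|)$; all names and parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
rate, length, trials = 2.0, 1.0, 100_000
# N(I) ~ Poisson(rate * |I|); the interval is void exactly when the count is 0.
counts = rng.poisson(rate * length, size=trials)
print(np.mean(counts == 0), np.exp(-rate * length))  # both ~0.135
```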

Conditional intensity. Simple case. History $H_t = \{\tau_j \leq t\}$.
$\Pr\{dN(t)=1 \mid H_t\} = \lambda(t;\omega)\,dt$, $\lambda$ a r.v.; it has all the information.
The probability that the points in $[0,T)$ are $t_1, \ldots, t_N$ is
$\Pr\{dN(t_1)=1, \ldots, dN(t_N)=1\} = \lambda(t_1) \cdots \lambda(t_N) \exp\{-\int_0^T \lambda(t)\,dt\}\,dt_1 \cdots dt_N$,
the limit of products of the form $[1-\lambda(h)h][1-\lambda(2h)h] \cdots \lambda(t_1)\lambda(t_2) \cdots$.
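
When the conditional intensity does not depend on the past (the inhomogeneous-Poisson special case), the displayed density gives the log-likelihood $\sum_j \log\lambda(t_j) - \int_0^T \lambda(t)\,dt$ directly. A sketch under that assumption; the trend intensity and the observed points are hypothetical:

```python
import numpy as np
from scipy.integrate import quad

def log_likelihood(points, lam, T):
    """Log of the slide's density: sum_j log lam(t_j) - int_0^T lam(t) dt.
    Valid when lam(t) does not depend on the history H_t."""
    integral, _ = quad(lam, 0.0, T)
    return float(np.sum(np.log([lam(t) for t in points])) - integral)

a, b, T = 0.5, 0.1, 10.0                # illustrative trend parameters
tau = np.array([0.7, 2.3, 5.1, 9.4])    # hypothetical observed points
print(log_likelihood(tau, lambda t: np.exp(a + b * t), T))
```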

Parameters. Suppose the points are isolated: $dN(t) = 1$ if there is a point in $(t, t+dt]$, $= 0$ otherwise.
1. (Mean) rate/intensity. $E\{dN(t)\} = p_N(t)\,dt = \Pr\{dN(t) = 1\}$.
$\sum_j g(\tau_j) = \int g(s)\,dN(s)$, so $E\{\sum_j g(\tau_j)\} = \int g(s)\,p_N(s)\,ds$.
Trend: $p_N(t) = \exp\{\alpha + \beta t\}$. Cycle: a $\cos(\gamma t + \delta)$ term.
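
A histogram estimate of the rate follows directly from $E\{dN(t)\} = p_N(t)\,dt$: counts per bin divided by the bin width. A minimal sketch, assuming a homogeneous Poisson process so the estimate should hover near the true rate:

```python
import numpy as np

def rate_estimate(points, T, nbins=50):
    """Estimate p_N(t) by binned counts divided by the bin width."""
    counts, edges = np.histogram(points, bins=nbins, range=(0.0, T))
    return counts / (edges[1] - edges[0]), edges

rng = np.random.default_rng(2)
T, rate = 100.0, 3.0
tau = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
p_hat, _ = rate_estimate(tau, T)
print(p_hat.mean())  # ~3, the true rate
```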

Product density of order 2. $\Pr\{dN(s)=1 \text{ and } dN(t)=1\} = E\{dN(s)\,dN(t)\} = [\delta(s-t)\,p_N(t) + p_{NN}(s,t)]\,ds\,dt$. A factorial moment.

Autointensity. $\Pr\{dN(t)=1 \mid dN(s)=1\} = (p_{NN}(s,t)/p_N(s))\,dt$, $s \neq t$,
$= h_{NN}(s,t)\,dt$,
$= p_N(t)\,dt$ if the increments are uncorrelated.

Covariance density/cumulant density of order 2.
$\mathrm{cov}\{dN(s), dN(t)\} = q_{NN}(s,t)\,ds\,dt$, $s \neq t$,
$= [\delta(s-t)\,p_N(s) + q_{NN}(s,t)]\,ds\,dt$ generally.
$q_{NN}(s,t) = p_{NN}(s,t) - p_N(s)\,p_N(t)$, $s \neq t$.
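
These second-order quantities can be estimated from pairwise differences. A sketch for the stationary case, assuming a homogeneous Poisson process, for which $q_{NN} = 0$ and hence $h_{NN}(u) \approx p_N$; the estimator and its tuning constants are illustrative, and edge effects are ignored:

```python
import numpy as np

def autointensity(points, u_max, bin_width):
    """Histogram estimate of h_NN(u) = p_NN(u)/p_N for 0 < u <= u_max:
    pairwise forward differences, normalized by (#points * bin width)."""
    diffs = points[None, :] - points[:, None]
    diffs = diffs[(diffs > 0) & (diffs <= u_max)]
    counts, _ = np.histogram(diffs, bins=int(u_max / bin_width),
                             range=(0.0, u_max))
    return counts / (len(points) * bin_width)

rng = np.random.default_rng(3)
T, rate = 200.0, 2.0
tau = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
print(autointensity(tau, u_max=5.0, bin_width=0.25).mean())  # ~2 = p_N
```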

Identities.
1. $\sum_{j,k} g(\tau_j, \tau_k) = \iint g(s,t)\,dN(s)\,dN(t)$.
Expected value.
$E\{\iint g(s,t)\,dN(s)\,dN(t)\} = \iint g(s,t)\,[\delta(s-t)\,p_N(t) + p_{NN}(s,t)]\,ds\,dt$
$= \int g(t,t)\,p_N(t)\,dt + \iint g(s,t)\,p_{NN}(s,t)\,ds\,dt$.

2. $\mathrm{cov}\{\sum_j g(\tau_j), \sum_k h(\tau_k)\} = \mathrm{cov}\{\int g(s)\,dN(s), \int h(t)\,dN(t)\}$
$= \iint g(s)\,h(t)\,[\delta(s-t)\,p_N(s) + q_{NN}(s,t)]\,ds\,dt$
$= \int g(t)\,h(t)\,p_N(t)\,dt + \iint g(s)\,h(t)\,q_{NN}(s,t)\,ds\,dt$.
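
For a homogeneous Poisson process $q_{NN} = 0$, so identity 2 collapses to $\mathrm{cov}\{\sum_j g(\tau_j), \sum_k h(\tau_k)\} = \int g(t)\,h(t)\,p_N\,dt$, which is easy to check by simulation. A sketch with illustrative $g$, $h$, and parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
T, rate, trials = np.pi / 2, 5.0, 20_000
g, h = np.sin, np.cos

gs, hs = [], []
for _ in range(trials):
    tau = rng.uniform(0, T, rng.poisson(rate * T))
    gs.append(g(tau).sum())
    hs.append(h(tau).sum())

lhs = np.cov(gs, hs)[0, 1]
rhs = rate * np.sin(T) ** 2 / 2  # int_0^T sin*cos * p_N dt = p_N sin^2(T)/2
print(lhs, rhs)                  # both ~2.5
```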

Product density of order $k$. For $t_1, \ldots, t_k$ all distinct,
$\Pr\{dN(t_1)=1, \ldots, dN(t_k)=1\} = E\{dN(t_1) \cdots dN(t_k)\} = p_{N \cdots N}(t_1, \ldots, t_k)\,dt_1 \cdots dt_k$.

Cumulant density of order $k$. For $t_1, \ldots, t_k$ distinct,
$\mathrm{cum}\{dN(t_1), \ldots, dN(t_k)\} = q_{N \cdots N}(t_1, \ldots, t_k)\,dt_1 \cdots dt_k$.

Stationarity. The joint distributions
$\Pr\{N(I_1+t)=k_1, \ldots, N(I_n+t)=k_n\}$, $k_1, \ldots, k_n$ integers $\geq 0$,
do not depend on $t$, for $n = 1, 2, \ldots$
Rate. $E\{dN(t)\} = p_N\,dt$.
Product density of order 2. $\Pr\{dN(t+u)=1 \text{ and } dN(t)=1\} = [\delta(u)\,p_N + p_{NN}(u)]\,dt\,du$.

Autointensity. $\Pr\{dN(t+u)=1 \mid dN(t)=1\} = (p_{NN}(u)/p_N)\,du$, $u \neq 0$, $= h_{NN}(u)\,du$.
Covariance density. $\mathrm{cov}\{dN(t+u), dN(t)\} = [\delta(u)\,p_N + q_{NN}(u)]\,dt\,du$.

Mixing. $\mathrm{cov}\{dN(t+u), dN(t)\}$ small for large $|u|$:
$|p_{NN}(u) - p_N\,p_N|$ small for large $|u|$,
$h_{NN}(u) = p_{NN}(u)/p_N \approx p_N$ for large $|u|$,
$\int |q_{NN}(u)|\,du < \infty$.
See the preceding examples.

Power spectral density. Frequency side, $\lambda$, vs. time side, $t$; $\lambda/2\pi$ is the frequency in cycles per unit time. Non-negative. Unifies the analyses of processes of widely varying types.
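
The standard estimate starts from the empirical Fourier transform $d^T(\lambda) = \sum_j \exp\{-i\lambda\tau_j\}$. A sketch of the resulting periodogram at the Fourier frequencies $2\pi k/T$ (where the mean term drops out exactly), assuming a Poisson process, whose spectrum is flat at $p_N/2\pi$; all parameter values are illustrative:

```python
import numpy as np

def periodogram(points, T, k_max):
    """I(l) = |sum_j exp(-i l tau_j)|^2 / (2 pi T) at l = 2 pi k / T."""
    lams = 2 * np.pi * np.arange(1, k_max + 1) / T
    d = np.exp(-1j * np.outer(lams, points)).sum(axis=1)  # empirical FT
    return lams, np.abs(d) ** 2 / (2 * np.pi * T)

rng = np.random.default_rng(5)
T, rate = 500.0, 4.0
tau = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
lams, I = periodogram(tau, T, k_max=200)
print(I.mean(), rate / (2 * np.pi))  # both ~0.64: a flat spectrum
```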

Examples.

Spectral representation. Stationary increments (Kolmogorov).

Algebra/calculus of point processes. Consider the process $\{\tau_j, \tau_j + u\}$. Stationary case:
$dN(t) = dM(t) + dM(t+u)$.
Taking "E": $p_N\,dt = p_M\,dt + p_M\,dt$, so $p_N = 2\,p_M$.
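
A quick numerical check of $p_N = 2\,p_M$ under this construction (the lag $u$ and the rate are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
T, rate_M, u = 1000.0, 1.5, 0.3
tau_M = np.sort(rng.uniform(0, T, rng.poisson(rate_M * T)))
tau_N = np.sort(np.concatenate([tau_M, tau_M + u]))  # points tau_j and tau_j+u
print(len(tau_N) / (T + u), 2 * rate_M)  # both ~3.0
```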

Taking "E" again,

Association. Measuring it? Due to chance? Are two processes associated, e.g. a time series and a point process? How strongly? Can one predict one from the other?
Some characteristics of dependence:
$E(XY) \neq E(X)\,E(Y)$
$E(Y \mid X) = g(X)$
$X = g(\epsilon)$, $Y = h(\epsilon)$, $\epsilon$ a r.v.
$f(x,y) \neq f(x)\,f(y)$
$\mathrm{corr}(X,Y) \neq 0$

Bivariate point process case. Two types of points, $\{\tau_j\}$ and $\{\sigma_k\}$.
Cross-intensity. $\Pr\{dN(t)=1 \mid dM(s)=1\} = (p_{MN}(t,s)/p_M(s))\,dt$.
Cross-covariance density. $\mathrm{cov}\{dM(s), dN(t)\} = q_{MN}(s,t)\,ds\,dt$; no $\delta(\cdot)$ term.

Frequency domain approach. Coherency, coherence.
Cross-spectrum. Coherency: $R_{MN}(\lambda) = f_{MN}(\lambda)/\sqrt{f_{MM}(\lambda)\,f_{NN}(\lambda)}$, complex-valued, set to 0 if the denominator is 0.
Coherence: $|R_{MN}(\lambda)|^2 = |f_{MN}(\lambda)|^2/\{f_{MM}(\lambda)\,f_{NN}(\lambda)\}$, with $|R_{MN}(\lambda)|^2 \leq 1$; cf. the multiple $R^2$ of regression.
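
In practice the coherence of two point processes is often estimated by binning each into a count series and applying a smoothed-periodogram method. A sketch using SciPy's Welch-based estimator; the jittered construction of the second process and all tuning constants are illustrative:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(7)
T, rate, dt = 2000.0, 5.0, 0.1

# Hypothetical bivariate process: N's points are M's, independently jittered,
# so the two are strongly (linearly) associated at low frequencies.
tau_M = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))
tau_N = tau_M + rng.normal(0.0, 0.5, size=tau_M.size)

# Bin each process into counts dM, dN and estimate |R_MN|^2 by Welch's method.
edges = np.arange(0.0, T + dt, dt)
dM, _ = np.histogram(tau_M, bins=edges)
dN, _ = np.histogram(tau_N, bins=edges)
freqs, coh = coherence(dM, dN, fs=1.0 / dt, nperseg=1024)
print(coh[1:6])  # near 1 at low frequencies, decaying as the jitter bites
```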

Proof. Filtering. $M = \{\tau_j\} \to \int a(t-v)\,dM(v) = \sum_j a(t-\tau_j)$.
Consider $dO(t) = dN(t) - [\int a(t-v)\,dM(v)]\,dt$ (stationary increments), and let $A(\lambda) = \int \exp\{-i\lambda u\}\,a(u)\,du$.
Then $f_{OO}(\lambda)$ is a minimum at $A(\lambda) = f_{NM}(\lambda)\,f_{MM}(\lambda)^{-1}$.
Minimum value: $(1 - |R_{MN}(\lambda)|^2)\,f_{NN}(\lambda) \geq 0$, hence $0 \leq |R_{MN}(\lambda)|^2 \leq 1$.

Coherence is thus a measure of the linear time-invariant association of the components of a stationary bivariate process.

Empirical examples. Sea hare.

Muscle spindle

Spectral representation approach. Filtering.
$dO(t)/dt = \int a(t-v)\,dM(v) = \sum_j a(t-\tau_j) = \int \exp\{it\lambda\}\,A(\lambda)\,dZ_M(\lambda)$.

Partial coherency. Trivariate process $\{M, N, O\}$. "Removes" the linear time-invariant effects of $O$ from $M$ and $N$.
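
The slide does not display the formula; the usual convention (e.g., in Brillinger's time series text), stated here as background rather than as part of the slides, is

$$ R_{MN|O}(\lambda) \;=\; \frac{R_{MN}(\lambda) - R_{MO}(\lambda)\,R_{ON}(\lambda)}{\sqrt{\bigl(1-|R_{MO}(\lambda)|^2\bigr)\bigl(1-|R_{NO}(\lambda)|^2\bigr)}}, $$

the frequency-domain analogue of a partial correlation coefficient.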