1 Point processes on the line. Nerve firing.

2 Stochastic point process. Building blocks. A process on R, {N(t)}, t in R, with a consistent set of distributions Pr{N(I_1) = k_1, ..., N(I_n) = k_n}, the k_1, ..., k_n integers ≥ 0 and the I's Borel sets of R. Consistency example: if I_1, I_2 are disjoint, Pr{N(I_1) = k_1, N(I_2) = k_2, N(I_1 ∪ I_2) = k_3} = 1 if k_1 + k_2 = k_3, = 0 otherwise. (Guttorp book, Chapter 5.)

3 Points: ... ≤ τ_{-1} ≤ τ_0 ≤ τ_1 ≤ ..., the discontinuities of {N}. N(t) = #{0 < τ_j ≤ t}. Simple: τ_j ≠ τ_k if j ≠ k, i.e. the points are isolated and dN(t) = 0 or 1. Surprise: a simple point process is determined by its void probabilities Pr{N(I) = 0}, I compact.
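As a minimal sketch of these building blocks (hypothetical event times, NumPy assumed), the counting function N(t) = #{0 < τ_j ≤ t} and its binned increments can be computed directly from the points:

```python
import numpy as np

# Hypothetical event times tau_j (any sorted array of positive times will do).
tau = np.array([0.7, 1.3, 2.9, 4.1, 4.8])

def N(t, tau=tau):
    """Counting function N(t) = #{0 < tau_j <= t}."""
    return np.searchsorted(tau, t, side="right")

# For a simple process, increments dN over sufficiently small bins are 0 or 1.
h = 0.1
dN, _ = np.histogram(tau, bins=np.arange(0.0, 5.0 + h, h))
print(N(3.0), dN.max())   # -> 3 1
```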

4 Conditional intensity. Simple case. History H_t = {τ_j ≤ t}. Pr{dN(t) = 1 | H_t} = λ(t; ω)dt, ω a r.v. It has all the information. The probability that the points in [0, T) are t_1, ..., t_N is Pr{dN(t_1) = 1, ..., dN(t_N) = 1} = λ(t_1) ... λ(t_N) exp{-∫ λ(t)dt} dt_1 ... dt_N, the limit of products of the form [1 - λ(h)h][1 - λ(2h)h] ... λ(t_1)λ(t_2) ...
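A hedged sketch of this likelihood in the special case of an inhomogeneous Poisson process, where λ(t) does not depend on the history H_t; the rate function, interval length, and point pattern below are made up for illustration:

```python
import numpy as np
from scipy.integrate import quad

def log_likelihood(times, lam, T):
    """log of the density on slide 4, Poisson special case:
    sum_j log lam(t_j) - integral_0^T lam(t) dt."""
    integral, _ = quad(lam, 0.0, T)
    return sum(np.log(lam(t)) for t in times) - integral

# Hypothetical rate with a log-linear trend, lam(t) = exp(alpha + beta*t).
alpha, beta, T = 0.0, 0.1, 10.0
lam = lambda t: np.exp(alpha + beta * t)
times = [1.2, 3.4, 5.0, 9.7]      # hypothetical observed points in [0, T)
print(log_likelihood(times, lam, T))
```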

5 Parameters. Suppose the points are isolated: dN(t) = 1 if there is a point in (t, t+dt], = 0 otherwise. 1. (Mean) rate/intensity. E{dN(t)} = p_N(t)dt = Pr{dN(t) = 1}. Since Σ_j g(τ_j) = ∫ g(s)dN(s), E{Σ_j g(τ_j)} = ∫ g(s)p_N(s)ds. Trend: p_N(t) = exp{α + βt}. Cycle: ρ cos(γt + φ).
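One way to get a feel for such rate functions is to simulate from them. Below is a hedged sketch (parameter values hypothetical) using Lewis-Shedler thinning to generate an inhomogeneous Poisson process with the trend rate p_N(t) = exp{α + βt}:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_by_thinning(rate, rate_max, T, rng=rng):
    """Lewis-Shedler thinning: simulate a homogeneous Poisson process of rate
    rate_max on [0, T) and keep each candidate t with probability rate(t)/rate_max."""
    n = rng.poisson(rate_max * T)
    candidates = np.sort(rng.uniform(0.0, T, size=n))
    keep = rng.uniform(size=n) < rate(candidates) / rate_max
    return candidates[keep]

# Hypothetical trend rate p_N(t) = exp(alpha + beta*t), increasing on [0, T),
# so its maximum over the interval is rate(T).
alpha, beta, T = 1.0, 0.05, 20.0
rate = lambda t: np.exp(alpha + beta * t)
points = simulate_by_thinning(rate, rate(T), T)
print(len(points), points[:5])
```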

6 Product density of order 2. Pr{dN(s) = 1 and dN(t) = 1} = E{dN(s)dN(t)} = [δ(s-t)p_N(t) + p_NN(s,t)]dsdt. A factorial moment.

7 Autointensity. Pr{dN(t) = 1 | dN(s) = 1} = (p_NN(s,t)/p_N(s))dt, s ≠ t, = h_NN(s,t)dt, = p_N(t)dt if the increments are uncorrelated.

8 Covariance density / cumulant density of order 2. cov{dN(s), dN(t)} = q_NN(s,t)dsdt, s ≠ t, = [δ(s-t)p_N(s) + q_NN(s,t)]dsdt generally. q_NN(s,t) = p_NN(s,t) - p_N(s)p_N(t), s ≠ t.

9 Identities. 1. Σ_{j,k} g(τ_j, τ_k) = ∫∫ g(s,t)dN(s)dN(t). Expected value: E{∫∫ g(s,t)dN(s)dN(t)} = ∫∫ g(s,t)[δ(s-t)p_N(t) + p_NN(s,t)]dsdt = ∫ g(t,t)p_N(t)dt + ∫∫ g(s,t)p_NN(s,t)dsdt.

10 2. cov{Σ_j g(τ_j), Σ_k h(τ_k)} = cov{∫ g(s)dN(s), ∫ h(t)dN(t)} = ∫∫ g(s)h(t)[δ(s-t)p_N(s) + q_NN(s,t)]dsdt = ∫ g(t)h(t)p_N(t)dt + ∫∫ g(s)h(t)q_NN(s,t)dsdt.

11 Product density of order k. For t_1, ..., t_k all distinct, Pr{dN(t_1) = 1, ..., dN(t_k) = 1} = E{dN(t_1) ... dN(t_k)} = p_{N...N}(t_1, ..., t_k)dt_1 ... dt_k.

12 Cumulant density of order k. For t_1, ..., t_k distinct, cum{dN(t_1), ..., dN(t_k)} = q_{N...N}(t_1, ..., t_k)dt_1 ... dt_k.

13 Stationarity. The joint distributions Pr{N(I_1 + t) = k_1, ..., N(I_n + t) = k_n}, k_1, ..., k_n integers ≥ 0, do not depend on t, for n = 1, 2, ... Rate: E{dN(t)} = p_N dt. Product density of order 2: Pr{dN(t+u) = 1 and dN(t) = 1} = [δ(u)p_N + p_NN(u)]dtdu.

14 Autointensity. Pr{dN(t+u) = 1 | dN(t) = 1} = (p_NN(u)/p_N)du, u ≠ 0, = h_NN(u)du. Covariance density: cov{dN(t+u), dN(t)} = [δ(u)p_N + q_NN(u)]dtdu.
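A hedged sketch of how h_NN(u) might be estimated in the stationary case, by binning all positive pairwise differences τ_k - τ_j; the spike times, observation length T, and bin width are hypothetical, and edge effects are ignored:

```python
import numpy as np

def autointensity(times, T, u_max=1.0, h=0.05):
    """Histogram estimate of h_NN(u) = p_NN(u)/p_N for 0 < u <= u_max.
    p_NN(u) is estimated by the number of ordered pairs with
    u < tau_k - tau_j <= u + h, divided by T*h (edge effects ignored)."""
    times = np.asarray(times)
    diffs = times[None, :] - times[:, None]     # all pairwise differences
    diffs = diffs[diffs > 0]                    # positive lags only
    counts, edges = np.histogram(diffs, bins=np.arange(0.0, u_max + h, h))
    p_N = len(times) / T                        # estimated mean rate
    p_NN = counts / (T * h)
    return edges[:-1] + h / 2, p_NN / p_N       # bin centres, estimated h_NN(u)

# Hypothetical check with a (roughly) Poisson spike train on [0, T):
T = 100.0
times = np.sort(np.random.default_rng(1).uniform(0, T, size=300))
u, h_hat = autointensity(times, T)
print(h_hat[:5])    # should fluctuate around p_N = 3 (uncorrelated increments)
```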

15 (Figure.)

16 Mixing. cov{dN(t+u), dN(t)} small for large |u|; |p_NN(u) - p_N p_N| small for large |u|; h_NN(u) = p_NN(u)/p_N ≈ p_N for large |u|; ∫ |q_NN(u)|du < ∞. See the preceding examples.

17 Power spectral density. Frequency side, λ, vs. time side, t. λ/2π: frequency in cycles per unit time. Non-negative. Unifies the analyses of processes of widely varying types.
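A standard way of writing the power spectrum of a stationary point process is f_NN(λ) = (2π)^{-1}[p_N + ∫ exp{-iλu} q_NN(u)du], taken as the assumed convention in the sketch below; the spike times and number of frequencies are hypothetical:

```python
import numpy as np

def periodogram(times, T, n_freq=200):
    """Point-process periodogram I(lambda) = |sum_j exp(-i*lambda*tau_j)|^2 / (2*pi*T),
    evaluated at the Fourier frequencies lambda_k = 2*pi*k/T, k = 1, ..., n_freq."""
    freqs = 2 * np.pi * np.arange(1, n_freq + 1) / T
    d = np.exp(-1j * np.outer(freqs, np.asarray(times))).sum(axis=1)
    return freqs, np.abs(d) ** 2 / (2 * np.pi * T)

# Hypothetical check: for a Poisson process of rate p_N the spectrum is flat,
# f_NN(lambda) = p_N / (2*pi), and the periodogram fluctuates around that level.
rng = np.random.default_rng(2)
T, p_N = 200.0, 5.0
times = np.sort(rng.uniform(0, T, size=rng.poisson(p_N * T)))
freqs, I = periodogram(times, T)
print(I.mean(), p_N / (2 * np.pi))
```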

18 Examples.

19 (Figure.)

20 Spectral representation. Stationary increments (Kolmogorov).

21 Algebra/calculus of point processes. Consider the process {τ_j, τ_j + u} built from M = {τ_j}. Stationary case: dN(t) = dM(t) + dM(t+u). Taking "E", p_N dt = p_M dt + p_M dt, so p_N = 2p_M.

22 Taking "E" again,
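A sketch of the second-moment computation the slide points to, using the stationary covariance density of slide 14 and the slide's convention dN(t) = dM(t) + dM(t+u); the frequency-domain remark at the end is an addition, not part of the original slide:

```latex
\begin{align*}
\operatorname{cov}\{dN(t+v),\,dN(t)\}
  &= \sum_{a,b\in\{0,u\}} \operatorname{cov}\{dM(t+v+a),\,dM(t+b)\} \\
  &= \bigl[\,2\delta(v)\,p_M + \{\delta(v-u)+\delta(v+u)\}\,p_M \\
  &\qquad\; + 2q_{MM}(v) + q_{MM}(v-u) + q_{MM}(v+u)\,\bigr]\,dt\,dv .
\end{align*}
```

Hence q_NN(v) = 2q_MM(v) + q_MM(v-u) + q_MM(v+u) + p_M{δ(v-u) + δ(v+u)} for v ≠ 0, and in the frequency domain f_NN(λ) = 2{1 + cos(uλ)} f_MM(λ).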

23 Association. Measuring it? Due to chance? Are two processes associated, e.g. a time series and a point process? How strongly? Can one predict one from the other? Some characteristics of dependence: E(XY) ≠ E(X)E(Y); E(Y|X) = g(X); X = g(ω), Y = h(ω), ω a r.v.; f(x,y) ≠ f(x)f(y); corr(X,Y) ≠ 0.

24 Bivariate point process case. Two types of points, (τ_j, γ_k). Cross-intensity: Pr{dN(t) = 1 | dM(s) = 1} = (p_MN(t,s)/p_M(s))dt. Cross-covariance density: cov{dM(s), dN(t)} = q_MN(s,t)dsdt, with no δ() term.

25 (Figure.)

26 Frequency domain approach. Coherency, coherence. Cross-spectrum f_MN(λ). Coherency: R_MN(λ) = f_MN(λ)/√{f_MM(λ)f_NN(λ)}, complex-valued, taken to be 0 if the denominator is 0. Coherence: |R_MN(λ)|² = |f_MN(λ)|²/{f_MM(λ)f_NN(λ)}, with |R_MN(λ)|² ≤ 1; cf. the multiple R².
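A hedged sketch of estimating |R_MN(λ)|² from two spike trains by smoothing periodograms and cross-periodograms over neighbouring Fourier frequencies; the spike trains, smoothing span, and jitter used in the toy check are all hypothetical:

```python
import numpy as np

def coherence(times_M, times_N, T, n_freq=400, span=21):
    """Estimate |R_MN(lambda)|^2 = |f_MN|^2 / (f_MM * f_NN) by boxcar-smoothing
    point-process (cross-)periodograms; the 2*pi*T normalisation cancels in the ratio."""
    freqs = 2 * np.pi * np.arange(1, n_freq + 1) / T
    dM = np.exp(-1j * np.outer(freqs, times_M)).sum(axis=1)
    dN = np.exp(-1j * np.outer(freqs, times_N)).sum(axis=1)
    kernel = np.ones(span) / span
    f_MM = np.convolve(np.abs(dM) ** 2, kernel, mode="same")
    f_NN = np.convolve(np.abs(dN) ** 2, kernel, mode="same")
    f_MN = np.convolve(dM * np.conj(dN), kernel, mode="same")
    return freqs, np.abs(f_MN) ** 2 / (f_MM * f_NN)

# Hypothetical check: N is M with small timing jitter, so the coherence should be
# near 1 at low frequencies and fall off where the jitter destroys alignment.
rng = np.random.default_rng(4)
T = 200.0
times_M = np.sort(rng.uniform(0, T, size=1000))
times_N = np.sort(times_M + rng.normal(0, 0.2, size=times_M.size))
freqs, coh = coherence(times_M, times_N, T)
print(coh[:3], coh[-3:])
```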

27 Proof. Filtering: with M = {τ_j}, ∫ a(t-v)dM(v) = Σ_j a(t-τ_j). Consider dO(t) = dN(t) - [∫ a(t-v)dM(v)]dt (stationary increments), where A(λ) = ∫ exp{-iλu}a(u)du. f_OO(λ) is a minimum at A(λ) = f_NM(λ)f_MM(λ)^{-1}. Minimum: (1 - |R_MN(λ)|²)f_NN(λ). Hence 0 ≤ |R_MN(λ)|² ≤ 1.
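A sketch of the minimisation behind the last two statements, using the convention f_MN(λ) = conj(f_NM(λ)):

```latex
\begin{align*}
f_{OO}(\lambda)
 &= f_{NN}(\lambda) - A(\lambda)f_{MN}(\lambda) - \overline{A(\lambda)}\,f_{NM}(\lambda)
    + |A(\lambda)|^2 f_{MM}(\lambda) \\
 &= f_{MM}(\lambda)\,\Bigl|A(\lambda) - \frac{f_{NM}(\lambda)}{f_{MM}(\lambda)}\Bigr|^2
    + f_{NN}(\lambda)\bigl(1 - |R_{MN}(\lambda)|^2\bigr),
\end{align*}
```

which is minimised at A(λ) = f_NM(λ)/f_MM(λ); since f_OO(λ) ≥ 0, the minimum value (1 - |R_MN(λ)|²)f_NN(λ) ≥ 0, which forces |R_MN(λ)|² ≤ 1.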

28 Proof (continued). Coherence: a measure of the linear time-invariant association of the components of a stationary bivariate process.

29 Empirical examples. Sea hare.

30 (Figure.)

31 Muscle spindle

32 Spectral representation approach. Filtering: dO(t)/dt = ∫ a(t-v)dM(v) = Σ_j a(t-τ_j) = ∫ exp{itλ}A(λ)dZ_M(λ).

33 Partial coherency. Trivariate process {M, N, O}. "Removes" the linear time-invariant effects of O from M and N.
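In analogy with partial correlation, the removal can be written as follows (a standard formula, added here as a hedged reminder rather than taken from the original slides):

```latex
R_{MN|O}(\lambda)
  = \frac{R_{MN}(\lambda) - R_{MO}(\lambda)\,R_{ON}(\lambda)}
         {\sqrt{\{1 - |R_{MO}(\lambda)|^2\}\{1 - |R_{NO}(\lambda)|^2\}}} .
```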

34 (Figure.)

