Random processes
Matlab
What is a random process?
A random process
Is defined by its finite-dimensional distributions
–The probability of events at a finite number of time points
The finite-dimensional distributions have to be ‘consistent’
–Integrating over one time point gives the finite-dimensional distribution for the other time points
Given a consistent family of finite-dimensional distributions on ‘good enough’ spaces, there is a unique process with those distributions (Kolmogorov)
–‘Good enough’ means Borel
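A compact way to write the consistency requirement, as a LaTeX sketch using densities p_{t_1,...,t_n} for the finite-dimensional distributions (there is also a symmetry condition under permuting the time points):

\int p_{t_1,\dots,t_n}(x_1,\dots,x_n)\, dx_n = p_{t_1,\dots,t_{n-1}}(x_1,\dots,x_{n-1})

That is, integrating out the value at one time point must reproduce the finite-dimensional distribution over the remaining time points.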
Stationarity and ergodicity
How to measure the resting membrane potential of a neuron?
Stationarity and ergodicity
I arrive at the lab this morning, prepare a neuron for recording, and measure its membrane potential at 10 am sharp. The value is -75.3 mV.
Is this the resting potential of the neuron?
Stationarity and ergodicity
The measurement is noisy
We want a number of repeats of the same measurement
How to get repeated measurements?
Stationarity and ergodicity
Repeated measurement:
–I arrive at the lab a second time this morning, prepare a neuron for recording, and measure its membrane potential at 10 am sharp. The value is -80.9 mV.
What is the problem?
Stationarity and ergodicity
Repeated measurement 1:
–I arrive at the lab 600 times this morning, prepare a neuron for recording each time, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)
Go to Matlab
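A minimal sketch of what the Matlab demonstration might look like, not the course code: the membrane potential is modelled as AR(1) fluctuations around an assumed resting value of -75 mV (coefficient and noise level are arbitrary choices), and both kinds of repeated measurement are simulated.

% Sketch (assumptions: AR(1) fluctuations around -75 mV, arbitrary noise level)
rng(1);                          % for reproducibility
nTrials  = 600;                  % "arrive at the lab 600 times"
nSamples = 600;                  % one sample per second from 10:00 to 10:10
mu  = -75;                       % assumed resting potential (mV)
a   = 0.95;                      % AR(1) coefficient (assumption)
sig = 2;                         % innovation standard deviation (mV, assumption)

% Repeated measurement 1: 600 independent preparations, one sample each
v1 = mu + (sig/sqrt(1-a^2)) * randn(nTrials, 1);   % draws from the stationary distribution

% Repeated measurement 2: one preparation, 600 consecutive samples
v2 = zeros(nSamples, 1);
x  = (sig/sqrt(1-a^2)) * randn;  % start in the stationary distribution
for t = 1:nSamples
    x = a*x + sig*randn;
    v2(t) = mu + x;
end

fprintf('average across trials (measurement 1): %.2f mV\n', mean(v1));
fprintf('average across time   (measurement 2): %.2f mV\n', mean(v2));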
Theoretically,
Repeated measurement 1:
–I arrive at the lab 600 times this morning, prepare a neuron for recording each time, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)
Practically,
Repeated measurement 1:
–I arrive at the lab 600 times this morning, prepare a neuron for recording each time, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)
What to do?
Ergodicity
For an ergodic process,
–Averaging across many repeated trials (repeated measurement 1)
–Averaging across time for a single trial (repeated measurement 2)
–are equal
An ergodic process is always stationary; the reverse may not be true
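A sketch of the distinction, with an invented example: a process that is stationary but not ergodic because every trial carries its own random DC offset, so a single trial's time average does not converge to the ensemble mean. The offset size and noise model are arbitrary choices.

% Sketch: stationary but NOT ergodic (assumption: random per-trial offset + white noise)
rng(2);
nTrials = 200; nSamples = 1000;
offsets = 5*randn(nTrials, 1);                           % one offset per trial
X = repmat(offsets, 1, nSamples) + randn(nTrials, nSamples);

ensembleMean = mean(X(:, 1));    % average across trials at a single time point
timeMean     = mean(X(1, :));    % average across time for a single trial

fprintf('ensemble mean: %.2f   time mean (trial 1): %.2f\n', ensembleMean, timeMean);
% The ensemble mean is near 0; the time mean stays near offsets(1),
% so averaging across trials and averaging across time disagree.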
What makes a stationary process ergodic?
Asymptotic independence
Samples that are far enough apart in time are independent
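A sketch of asymptotic independence for one concrete case, an AR(1) process (the coefficient is an arbitrary choice): the correlation between samples decays geometrically with the lag, so samples far apart are effectively unrelated.

% Sketch: correlation between samples of an AR(1) process decays with the lag
rng(3);
a = 0.9; n = 1e5;
x = filter(1, [1 -a], randn(n, 1));       % x(t) = a*x(t-1) + noise
for L = [1 5 10 50]
    C = corrcoef(x(1:end-L), x(1+L:end));
    fprintf('lag %3d: sample correlation %.3f (theory %.3f)\n', L, C(1,2), a^L);
end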
Correlation, independence, Gaussian and non-Gaussian processes
Independence vs. lack of correlation
Two variables are independent if knowing anything about one of them allows no deductions about the other that you could not already make
Two variables are uncorrelated if their covariance is 0
Independence implies lack of correlation
Lack of correlation in general does not imply independence
Go to Matlab
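A minimal sketch of the classic counterexample this Matlab session presumably explores (the variable names and sample size are mine): x standard normal and y = x^2 are uncorrelated, yet y is completely determined by x.

% Sketch: uncorrelated but strongly dependent
rng(4);
x = randn(1e5, 1);
y = x.^2;                               % a deterministic function of x
C = cov(x, y);
fprintf('sample covariance of x and x^2: %.4f\n', C(1,2));
% Near 0, because E[x^3] = 0 for a symmetric distribution;
% yet knowing x determines y exactly, so x and y are not independent.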
Independence vs. lack of correlation
For variables that are jointly Gaussian, lack of correlation implies independence
What are jointly Gaussian variables?
Jointly Gaussian variables
The distribution of each by itself is Gaussian
The joint distribution of each pair is Gaussian
The joint distribution of each triplet is Gaussian
…
(allowing for degeneracy)
Go to Matlab
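A sketch of the kind of demonstration this could be (the covariance values and sample sizes are arbitrary): drawing a jointly Gaussian pair with a chosen covariance via a Cholesky factor, and, for contrast, a pair whose members are each Gaussian but which is not jointly Gaussian.

% Sketch: sample a zero-mean jointly Gaussian pair with a chosen covariance
rng(5);
Sigma = [1 0.8; 0.8 1];            % illustrative covariance
Lc = chol(Sigma, 'lower');
X  = Lc * randn(2, 1e5);           % each column is one jointly Gaussian sample
disp('sample covariance:'); disp(cov(X'));

% Contrast: marginally Gaussian but NOT jointly Gaussian
u = randn(1e5, 1);
s = sign(randn(1e5, 1));           % random +/-1, independent of u
v = s .* u;                        % v is also standard normal
C = cov(u, v);
fprintf('cov(u, v) = %.4f\n', C(1,2));
% u and v are uncorrelated, but u + v equals 0 half the time,
% so this linear combination is not Gaussian: (u, v) is not jointly Gaussian.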
Jointly Gaussian variables
Because of the issue of degeneracy, the formal definition is indirect
For example: random variables are jointly Gaussian if all linear combinations are Gaussian (allowing the degenerate case of identically 0 variables)
Or using characteristic functions
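For reference, the characteristic-function version, written as a LaTeX sketch: a random vector X with mean \mu and covariance \Sigma is jointly Gaussian exactly when, for every vector t,

\varphi_X(t) = \mathbb{E}\!\left[e^{\,i\, t^\top X}\right] = \exp\!\left(i\, t^\top \mu - \tfrac{1}{2}\, t^\top \Sigma\, t\right)

This form needs no density, so it also covers the degenerate case of a singular \Sigma.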
Characterizing jointly Gaussian variables
A 1-d Gaussian variable is fully characterized by its mean and variance
These determine its probability density function and therefore all other quantifiers
An n-d Gaussian variable is fully characterized by the mean of each component and their covariances
These determine the joint probability density and therefore all other quantifiers
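The densities behind this characterization, as a LaTeX sketch (the n-d formula assumes a non-degenerate, i.e. invertible, covariance \Sigma):

1-d:  p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

n-d:  p(x) = (2\pi)^{-n/2}\, \det(\Sigma)^{-1/2}\, \exp\!\left(-\tfrac{1}{2}\,(x-\mu)^\top \Sigma^{-1} (x-\mu)\right)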
Gaussian process
A random process is Gaussian if all finite-dimensional distributions are jointly Gaussian
A Gaussian process is determined by specifying the mean at each moment in time and a matrix of covariances between the values at different moments in time
All finite-dimensional distributions are Gaussian, and are therefore determined by the above data
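A minimal sketch of specifying and sampling a Gaussian process on a time grid; the zero mean and the squared-exponential covariance function are illustrative choices, not part of the slides.

% Sketch: a Gaussian process on a grid = mean vector + covariance matrix
rng(6);
t   = (0:0.01:5)';                            % time grid
mu  = zeros(size(t));                         % mean at each time (assumption)
tau = 0.5;                                    % correlation time (assumption)
C   = exp(-(t - t').^2 / (2*tau^2));          % covariance between every pair of times
Lc  = chol(C + 1e-10*eye(numel(t)), 'lower'); % small jitter for numerical stability
x   = mu + Lc*randn(numel(t), 1);             % one sample path
plot(t, x); xlabel('time'); ylabel('x(t)');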
Stationary Gaussian processes
If the process is in addition stationary
–The mean and variances are constant as a function of time
–The 2-d distributions depend only on the time difference, not on the absolute time
In that case, the covariance matrix is constant along the diagonals
–‘Toeplitz matrices’
The covariance is specified by a function of the delay between samples
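A short sketch of the Toeplitz structure; the exponential autocovariance is an arbitrary choice used only to show a covariance matrix that is constant along its diagonals.

% Sketch: stationary covariance depends only on the delay, so the matrix is Toeplitz
c = 0.8 .^ (0:5);          % assumed autocovariance as a function of the delay
C = toeplitz(c);           % constant along each diagonal
disp(C);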
Stationary Gaussian processes
The autocovariance function is also called
–Autocorrelation function
–Covariance function
–Correlation function
–…
Make sure you know the normalization (what is the value of the function at lag 0?)
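A sketch of why the normalization matters, using an AR(1) test signal (an arbitrary choice): the same estimate can be reported so that its value at lag 0 is the variance (autocovariance) or 1 (autocorrelation coefficient).

% Sketch: two normalizations of the same function
rng(7);
a = 0.9; n = 1e5;
x = filter(1, [1 -a], randn(n, 1));             % AR(1) test signal
x = x - mean(x);
maxLag = 20;
acov = zeros(maxLag+1, 1);
for L = 0:maxLag
    acov(L+1) = mean(x(1:end-L) .* x(1+L:end)); % autocovariance estimate at lag L
end
acorr = acov / acov(1);                         % normalized: value at lag 0 is 1
fprintf('at lag 0: autocovariance %.3f, autocorrelation %.3f\n', acov(1), acorr(1));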