Spike train decoding
Summary
Decoding of stimulus from response
– Two-choice case: discrimination, ROC curves
– Population decoding: MAP and ML estimators, bias and variance, Fisher information, Cramér-Rao bound (see the sketch below)
– Spike train decoding
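To make the population-decoding items above concrete, here is a minimal sketch (not from the lecture; the Gaussian tuning curves, all parameter values, and the helper names `rates` and `ml_decode` are assumptions for illustration) of maximum-likelihood decoding from an independent-Poisson population, together with the Cramér-Rao bound computed from the Fisher information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: Gaussian tuning curves tiling the stimulus axis.
centers = np.linspace(-3, 3, 20)            # preferred stimulus values s_a
sigma, r_max, T = 1.0, 50.0, 0.1            # tuning width, peak rate (Hz), counting window (s)

def rates(s):
    """Mean firing rates f_a(s) of all neurons for stimulus s."""
    return r_max * np.exp(-0.5 * ((s - centers) / sigma) ** 2)

def ml_decode(counts, grid=np.linspace(-3, 3, 601)):
    """Maximum-likelihood estimate: argmax_s sum_a [n_a log f_a(s) - f_a(s) T]."""
    f = rates(grid[:, None]) * T             # expected counts, shape (grid points, neurons)
    loglik = (counts * np.log(f) - f).sum(axis=1)
    return grid[np.argmax(loglik)]

s_true = 0.7
counts = rng.poisson(rates(s_true) * T)      # one trial of Poisson spike counts
print("ML estimate:", ml_decode(counts))

# Fisher information of the Poisson population, I_F(s) = T * sum_a f_a'(s)^2 / f_a(s);
# the Cramer-Rao bound says any unbiased estimator has variance >= 1 / I_F(s).
ds = 1e-4
f_prime = (rates(s_true + ds) - rates(s_true - ds)) / (2 * ds)
I_F = T * np.sum(f_prime ** 2 / rates(s_true))
print("Cramer-Rao bound on estimator std:", 1 / np.sqrt(I_F))
```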
Chapter 4
Entropy
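For reference, the standard definition that the following slides rely on, with P[r] the probability of observing response r:

```latex
H[R] \;=\; -\sum_{r} P[r]\,\log_2 P[r] \quad \text{bits}
```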
Mutual information: I = H - H_noise (the noise entropy never exceeds the full response entropy, H_noise <= H, so I >= 0)
Mutual information
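A minimal sketch (the toy conditional distribution and prior are assumed, not from the lecture) of the decomposition used on these slides, computing I = H - H_noise for a discrete stimulus and response:

```python
import numpy as np

# Hypothetical conditional distribution P(r | s): rows are stimuli, columns responses.
P_r_given_s = np.array([[0.8, 0.2],
                        [0.3, 0.7]])
P_s = np.array([0.5, 0.5])                 # stimulus prior

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))         # bits

P_r = P_s @ P_r_given_s                    # marginal response distribution
H_response = entropy(P_r)                  # full response entropy H
H_noise = np.sum(P_s * np.array([entropy(row) for row in P_r_given_s]))
print("I =", H_response - H_noise, "bits") # mutual information, always >= 0
```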
KL divergence
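A small sketch (toy joint distribution assumed) of the KL divergence, together with the standard identity that the mutual information equals the KL divergence between the joint distribution and the product of its marginals:

```python
import numpy as np

def kl(p, q):
    """D_KL(p || q) = sum_x p(x) log2(p(x) / q(x)), in bits."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Hypothetical joint distribution P(s, r) over two stimuli and two responses.
P_joint = np.array([[0.40, 0.10],
                    [0.15, 0.35]])
P_s = P_joint.sum(axis=1)
P_r = P_joint.sum(axis=0)
product = np.outer(P_s, P_r)               # what the joint would be if S and R were independent

print("D_KL(joint || product of marginals) =",
      kl(P_joint.ravel(), product.ravel()), "bits  (= mutual information I(S;R))")
```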
Continuous variables
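For continuous variables the sum becomes an integral, and entropy is only defined relative to a measurement resolution Δr. As a standard worked example (an addition, not from the slides), the Gaussian maximizes this differential entropy at fixed variance:

```latex
h[R] \;=\; -\int p(r)\,\log_2 p(r)\,dr, \qquad
H[R] \;\approx\; h[R] - \log_2 \Delta r \ \ (\text{at resolution } \Delta r), \qquad
h_{\text{Gauss}} \;=\; \tfrac{1}{2}\log_2\!\left(2\pi e\,\sigma^2\right).
```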
Entropy maximization
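A minimal sketch of the entropy-maximization (histogram equalization) argument, assuming a made-up exponential stimulus ensemble: for a bounded, monotonic response, output entropy is maximized when the input-output function is proportional to the cumulative distribution of the stimulus, so that all response levels are used equally often.

```python
import numpy as np

rng = np.random.default_rng(1)
stimuli = rng.exponential(scale=1.0, size=100_000)   # hypothetical stimulus ensemble

# The empirical CDF of the stimulus defines the optimal saturating response function.
sorted_s = np.sort(stimuli)
def optimal_response(s, r_max=1.0):
    return r_max * np.searchsorted(sorted_s, s) / len(sorted_s)

# Responses to the same ensemble are then uniformly distributed on [0, r_max],
# which maximizes entropy for a response confined to that range.
r = optimal_response(stimuli)
hist, _ = np.histogram(r, bins=10, range=(0, 1), density=True)
print(np.round(hist, 2))    # approximately flat: ~1.0 in every bin
```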
Population of neurons
Retinal Ganglion Cell Receptive Fields
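A rough sketch of the whitening argument behind center-surround receptive fields (all constants are assumptions; this is an illustration, not the lecture's derivation): with natural-image power falling off roughly as 1/k² and white input noise, the information-maximizing filter rises with spatial frequency at low k, rolls off where noise dominates, and has a center-surround-like spatial profile.

```python
import numpy as np

k = np.fft.rfftfreq(256, d=1.0)              # spatial frequency (cycles / sample)
k[0] = k[1]                                  # avoid division by zero at the DC term

signal_power = 1.0 / k ** 2                  # assumed natural-scene spectrum ~ 1/k^2
noise_power = 100.0                          # assumed flat (white) input noise

# Pure whitening would set |W(k)| ~ k; the signal-to-noise factor rolls the
# filter off where noise dominates, giving a bandpass characteristic.
W = k * signal_power / (signal_power + noise_power)

rf = np.fft.fftshift(np.fft.irfft(W))        # spatial profile of the filter
center = len(rf) // 2
print(np.round(rf[center - 10:center + 11], 4))   # expect a positive central peak flanked by negative (surround) lobes
```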
Temporal processing in LGN
Temporal vs spatial coding
Entropy of spike trains
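A minimal sketch of a direct-method style estimate, using synthetic Poisson spikes (bin width, rate, and duration are made-up values): bin the train into binary letters, group letters into words of length L, and estimate the entropy rate from the word distribution. Real estimates additionally extrapolate over word length and data size.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, rate, duration = 0.003, 40.0, 200.0                  # 3 ms bins, 40 Hz, 200 s of data
n_bins = int(duration / dt)
letters = (rng.random(n_bins) < rate * dt).astype(int)   # 0/1 spike letters

def word_entropy_rate(letters, L, dt):
    """Entropy per unit time of L-letter words, in bits/s."""
    n_words = len(letters) // L
    words = letters[:n_words * L].reshape(n_words, L)
    codes = words @ (2 ** np.arange(L))                  # encode each binary word as an integer
    p = np.bincount(codes) / n_words                     # empirical word distribution
    p = p[p > 0]
    H_word = -np.sum(p * np.log2(p))                     # bits per word
    return H_word / (L * dt)                             # bits per second

for L in (2, 4, 8):
    print(f"L = {L}: {word_entropy_rate(letters, L, dt):.1f} bits/s")
```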
Spike train mutual information measurements quantify stimulus-specific aspects of neural encoding. The mutual information rate of bullfrog peripheral auditory neurons was estimated at:
– 1.4 bits/sec for a broadband noise stimulus
– 7.8 bits/sec for a bullfrog call-like stimulus
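For context, such rates come from the standard decomposition (not specific to the bullfrog data): subtract the noise entropy rate, estimated across repeated presentations of the same stimulus, from the total spike-train entropy rate.

```latex
\frac{I_m}{T} \;=\; \frac{1}{T}\left(H_{\text{total}} - H_{\text{noise}}\right)
\quad \text{bits/s}, \qquad T \to \infty .
```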
Summary
Information theory quantifies how much a response says about a stimulus
– Stimulus, response entropy
– Noise entropy
– Mutual information, KL divergence
Maximizing information transfer yields biological receptive fields
– Factorial codes
– Equalization
– Whitening
Spike train mutual information