Spike Train Decoding

Presentation transcript:

Spike Train Decoding

Summary
- Decoding of stimulus from response
  - Two-choice case: discrimination, ROC curves
  - Population decoding: MAP and ML estimators, bias and variance, Fisher information, Cramér-Rao bound
  - Spike train decoding
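The two-choice discrimination setup above can be sketched numerically. In this hypothetical example, responses to the two stimuli are Gaussian with a discriminability d' of 1; sweeping a decision threshold z gives a hit rate and a false-alarm rate, which together trace out the ROC curve. All numbers are illustrative.

```python
# Two-choice discrimination and the ROC curve (illustrative numbers).
# Responses to stimuli s+ and s- are modeled as Gaussians separated by one
# standard deviation (d' = 1); a threshold z gives a hit rate beta(z) and a
# false-alarm rate alpha(z), and sweeping z traces out the ROC curve.
import numpy as np

rng = np.random.default_rng(0)
r_plus = rng.normal(20.0, 5.0, 2000)    # responses when s+ is shown
r_minus = rng.normal(15.0, 5.0, 2000)   # responses when s- is shown

thresholds = np.linspace(0.0, 40.0, 9)
roc = [((r_minus > z).mean(), (r_plus > z).mean()) for z in thresholds]

# Area under the ROC curve = probability that a random s+ response exceeds
# a random s- response, i.e. the two-alternative forced-choice percent correct.
auc = (r_plus[:, None] > r_minus[None, :]).mean()
print(f"AUC = {auc:.3f}")   # close to Phi(d'/sqrt(2)), about 0.76
```

The AUC-equals-2AFC identity is what links the ROC analysis of the two-choice case to behavioral percent correct.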

Chapter 4

Entropy
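A minimal sketch of the entropy of a discrete distribution, H = -Σ_x p(x) log2 p(x), measured in bits:

```python
# Discrete entropy in bits: H[X] = -sum_x p(x) log2 p(x).
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

print(entropy_bits([0.5, 0.5]))      # fair coin: 1 bit
print(entropy_bits([0.25] * 4))      # uniform over 4 outcomes: 2 bits
```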

Mutual information: I = H - H_noise, so the noise entropy H_noise is less than the total response entropy H

Mutual information
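Equivalently to I = H - H_noise, mutual information can be computed directly from a joint distribution p(s, r). A minimal sketch, with a hypothetical two-stimulus, two-response channel:

```python
# Mutual information from a joint distribution p(s, r):
# I = H[R] - H[R|S] = sum_{s,r} p(s,r) log2( p(s,r) / (p(s) p(r)) ).
import numpy as np

def mutual_information_bits(p_joint):
    p = np.asarray(p_joint, dtype=float)
    ps = p.sum(axis=1, keepdims=True)   # marginal over stimuli
    pr = p.sum(axis=0, keepdims=True)   # marginal over responses
    mask = p > 0
    return float(np.sum(p[mask] * np.log2((p / (ps * pr))[mask])))

# Hypothetical noiseless channel: the response fully determines the stimulus.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(mutual_information_bits(p))  # 1 bit
```

When stimulus and response are independent, the ratio inside the log is 1 everywhere and the information is zero.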

KL divergence
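The Kullback-Leibler divergence D_KL(P||Q) = Σ_x p(x) log2(p(x)/q(x)) measures how far Q is from P in bits; mutual information is exactly the KL divergence between the joint p(s, r) and the product of marginals p(s)p(r). A minimal sketch with illustrative distributions:

```python
# KL divergence in bits: D_KL(P || Q) = sum_x p(x) log2( p(x) / q(x) ).
# It is zero iff P = Q and is always non-negative.
import numpy as np

def kl_bits(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

print(kl_bits([0.5, 0.5], [0.5, 0.5]))   # identical distributions: 0 bits
print(kl_bits([0.9, 0.1], [0.5, 0.5]))   # a biased coin vs. a fair coin
```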

Continuous variables

Entropy maximization
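A response confined to a fixed range has maximal entropy when it is uniformly distributed, and a neuron achieves this by matching its input-output nonlinearity to the cumulative distribution of the stimulus (histogram equalization). A sketch with illustrative numbers, using the empirical CDF as the "tuning curve":

```python
# Entropy maximization by histogram equalization (illustrative numbers).
# Passing the stimulus through its own cumulative distribution F yields a
# response r = F(s) that is uniform on [0, 1], the maximum-entropy output
# for a bounded response.
import numpy as np

rng = np.random.default_rng(1)
s = rng.normal(0.0, 1.0, 100000)          # stimulus samples

# Empirical CDF as the nonlinearity: r = rank(s) / n is uniform on [0, 1).
r = np.searchsorted(np.sort(s), s) / len(s)

counts, _ = np.histogram(r, bins=10, range=(0.0, 1.0))
p = counts / counts.sum()
h = -np.sum(p[p > 0] * np.log2(p[p > 0]))
print(h)   # close to log2(10) = 3.32 bits, the maximum for 10 bins
```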

Population of neurons
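Decoding from a population can be sketched with maximum likelihood. This hypothetical example assumes Gaussian tuning curves and independent Poisson spike counts; all parameters (preferred stimuli, tuning width, peak rate, counting window) are illustrative.

```python
# Maximum-likelihood decoding from a hypothetical population with Gaussian
# tuning curves and independent Poisson spike counts.
import numpy as np

rng = np.random.default_rng(2)
prefs = np.linspace(-40.0, 40.0, 21)      # preferred stimuli (e.g. degrees)
sigma, r_max, T = 10.0, 50.0, 0.1         # tuning width, peak rate (Hz), window (s)

def rates(s):
    """Mean firing rate of each neuron for stimulus s."""
    return r_max * np.exp(-0.5 * ((s - prefs) / sigma) ** 2)

s_true = 5.0
counts = rng.poisson(rates(s_true) * T)   # one observed population response

# Scan candidate stimuli and keep the one maximizing the Poisson log
# likelihood (the count-factorial term is constant in s and is dropped).
grid = np.linspace(-40.0, 40.0, 801)
loglik = [np.sum(counts * np.log(rates(s) * T + 1e-12) - rates(s) * T)
          for s in grid]
s_ml = grid[int(np.argmax(loglik))]
print(s_ml)   # should land near s_true = 5.0
```

For an unbiased estimator like this, the Cramér-Rao bound says the variance of s_ml cannot fall below the inverse of the Fisher information of the population response.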

Retinal Ganglion Cell Receptive Fields

Temporal processing in LGN

Temporal vs spatial coding

Entropy of spike trains
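One standard way to estimate spike-train entropy (the "direct method") is to bin the train into small time bins, read off binary words of length L, and compute the entropy of the word distribution. A sketch on a synthetic Poisson-like train; the bin size, word length, rate, and duration are all illustrative.

```python
# Direct-method sketch: spike-train entropy rate from the distribution of
# binary words, on a synthetic Poisson-like spike train.
import numpy as np

rng = np.random.default_rng(3)
dt, L, rate, T = 0.002, 8, 40.0, 200.0        # 2 ms bins, 8-bin words, 40 Hz, 200 s
spikes = rng.random(int(T / dt)) < rate * dt  # Bernoulli approximation per bin

words = spikes[: len(spikes) // L * L].reshape(-1, L)
codes = words.astype(int) @ (2 ** np.arange(L))   # encode each word as an int
p = np.bincount(codes, minlength=2 ** L) / len(codes)
h_word = -np.sum(p[p > 0] * np.log2(p[p > 0]))

h_rate = h_word / (L * dt)
print(h_rate, "bits/s")   # word entropy divided by word duration
```

In practice the estimate is biased by finite data, so the bin size and word length are extrapolated carefully; this sketch omits those corrections.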

Spike train mutual information measurements quantify stimulus-specific aspects of neural encoding. The mutual information of bullfrog peripheral auditory neurons was estimated as:
- 1.4 bits/s for a broadband noise stimulus
- 7.8 bits/s for a bullfrog call-like stimulus

Summary
- Information theory quantifies how much a response says about a stimulus
  - Stimulus and response entropy
  - Noise entropy
  - Mutual information, KL divergence
- Maximizing information transfer yields biological receptive fields
  - Factorial codes
  - Equalization
  - Whitening
- Spike train mutual information
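The whitening idea in the summary can be sketched as a linear transform that removes pairwise correlations, so the output covariance is the identity; for a Gaussian signal under a power constraint this maximizes transmitted information. The covariance matrix below is illustrative.

```python
# Whitening sketch: transform correlated inputs so the output covariance is
# the identity matrix (illustrative 2-D covariance).
import numpy as np

rng = np.random.default_rng(4)
C = np.array([[2.0, 1.2],
              [1.2, 1.5]])                 # correlated input covariance
x = rng.multivariate_normal([0.0, 0.0], C, size=50000)

# Whitening filter W = C^{-1/2}, built from the eigendecomposition of the
# sample covariance.
evals, evecs = np.linalg.eigh(np.cov(x.T))
W = evecs @ np.diag(evals ** -0.5) @ evecs.T
y = x @ W.T

print(np.cov(y.T))   # approximately the identity matrix
```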