Neural Coding and Decoding. Albert E. Parker, Center for Computational Biology, Department of Mathematical Sciences, Montana State University. Collaborators: Alexander Dimitrov, Tomas Gedeon, John P. Miller, Zane Aldworth.

Problem: Determine a coding scheme: how does neural ensemble activity represent information about sensory stimuli? Our approach: construct a model using probability and information theory; optimize the model to cluster the neural responses, which gives an approximation of a coding scheme given the available data; apply our method to the cricket cercal system.

Neural Coding and Decoding. Goal: What conditions must a coding scheme satisfy? Demands: An animal needs to recognize the same object on repeated exposures; coding has to be deterministic at this level. The code must deal with uncertainties introduced by the environment and neural architecture; coding is by necessity stochastic at this finer scale. Major problem: The search for a coding scheme requires large amounts of data.

How to determine a coding scheme? Idea: Model a part of a neural system as a communication channel using information theory. This model enables us to meet the demands of a coding scheme: define a coding scheme as a relation between stimulus and neural response classes, and construct a coding scheme that is stochastic on the finer scale yet almost deterministic on the classes. It also lets us deal with the major problem: use whatever quantity of data is available to construct coarse but optimally informative approximations of the coding scheme, and refine the coding scheme as more data becomes available. Finally, investigate the cricket cercal sensory system.

A Stochastic Map. The input X (stimulus) and output Y (response) are related by the conditional probability Q(Y|X); the relationship between X and Y is completely described by Q. In neural coding, the realizations are a stimulus sequence X = x and a response sequence Y = y, related through Q(Y = y | X = x).
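As a toy illustration (not part of the talk; all sizes and names below are made up), the stochastic map Q(Y|X) over discrete stimulus and response values can be written as a row-stochastic matrix, from which responses can be sampled:

```python
import numpy as np

# Minimal sketch: Q(Y|X) as a row-stochastic matrix, with one row per
# stimulus value x and one column per response value y (sizes are arbitrary).
rng = np.random.default_rng(0)
n_stimuli, n_responses = 4, 6

Q = rng.random((n_stimuli, n_responses))
Q /= Q.sum(axis=1, keepdims=True)          # each row Q(. | x) sums to 1

def sample_response(x, Q, rng):
    """Draw a response y ~ Q(Y | X = x)."""
    return rng.choice(Q.shape[1], p=Q[x])

print("sampled response for stimulus x = 2:", sample_response(2, Q, rng))
```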

Determining Stimulus/Response Classes. Given a joint probability p(X,Y): [figure: the space of stimulus sequences X plotted against the space of response sequences Y].

Stimulus and Response Classes. [figure: distinguishable stimulus/response classes in the space of stimulus sequences X and response sequences Y].

Information Theoretic Quantities. A quantizer or encoder, Q, relates the environmental stimulus, X, to the neural response, Y, through a process called quantization; in general, Q is a stochastic map. The reproduction space Y is a quantization of X. This can be repeated: let Y_f be a reproduction of Y, so there is a quantizer q(Y_f|Y). Use mutual information to measure the degree of dependence between X and Y_f. Use conditional entropy to measure the self-information of Y_f given Y.
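A minimal numerical sketch of these two quantities (assuming discrete spaces and invented toy distributions; this is not the authors' code) might look like the following, where p_xy is the joint p(x, y) and q_fy is the quantizer q(y_f|y):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint distribution p_xy[x, y] = p(x, y)."""
    prod = p_xy.sum(axis=1, keepdims=True) * p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

def conditional_entropy(p_ab):
    """H(A|B) in bits from a joint distribution p_ab[a, b] = p(a, b)."""
    p_b = np.broadcast_to(p_ab.sum(axis=0, keepdims=True), p_ab.shape)
    mask = p_ab > 0
    return float(-np.sum(p_ab[mask] * np.log2(p_ab[mask] / p_b[mask])))

rng = np.random.default_rng(1)
p_xy = rng.random((4, 8)); p_xy /= p_xy.sum()         # toy joint p(x, y)
q_fy = rng.random((2, 8)); q_fy /= q_fy.sum(axis=0)   # toy quantizer q(y_f | y); columns sum to 1

p_x_yf = p_xy @ q_fy.T            # p(x, y_f) = sum_y p(x, y) q(y_f | y)
p_yf_y = q_fy * p_xy.sum(axis=0)  # p(y_f, y) = q(y_f | y) p(y)

print("I(X;Y_f) =", mutual_information(p_x_yf))
print("H(Y_f|Y) =", conditional_entropy(p_yf_y))
```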

The Model. Problem: To determine a coding scheme between X and Y requires large amounts of data. Idea: Determine the coding scheme between X and Y_f, a clustering (reproduction) of Y, such that Y_f preserves as much information (mutual information) with X as possible while the self-information (entropy) of Y_f given Y is maximized. That is, we are searching for an optimal mapping (quantizer) q(y_f|y) that satisfies these conditions. Justification: Jaynes' maximum entropy principle, which states that of all the quantizers that satisfy a given set of constraints, we should choose the one that maximizes the entropy.

Equivalent Optimization Problems:
- Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y), constrained by I(X;Y_f) >= I_o. Here I_o determines the informativeness of the reproduction.
- Deterministic annealing (Rose, '98): maximize F(q(y_f|y)) = H(Y_f|Y) + beta I(X;Y_f). Small beta favors maximum entropy; large beta favors maximum I(X;Y_f).
- Augmented Lagrangian with a Newton CG line search.
- Implicit solution.
- Simplex algorithm: maximize I(X;Y_f) over the vertices of the constraint space.
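The sketch below (reusing mutual_information, conditional_entropy, p_xy, and q_fy from the earlier sketch; it is an illustration, not the optimizer described on the slide) evaluates the annealing cost F(q) = H(Y_f|Y) + beta I(X;Y_f) and, for the simplex remark, brute-forces I(X;Y_f) over all deterministic quantizers, i.e. the vertices of the constraint space:

```python
from itertools import product
import numpy as np

def annealing_cost(p_xy, q_fy, beta):
    """F(q) = H(Y_f|Y) + beta * I(X;Y_f) for a quantizer q_fy[f, y] = q(y_f | y)."""
    p_y = p_xy.sum(axis=0)
    return conditional_entropy(q_fy * p_y) + beta * mutual_information(p_xy @ q_fy.T)

def best_vertex(p_xy, n_classes):
    """Brute-force max of I(X;Y_f) over deterministic quantizers (tiny problems only)."""
    n_y = p_xy.shape[1]
    best_q, best_val = None, -np.inf
    for assignment in product(range(n_classes), repeat=n_y):
        q = np.zeros((n_classes, n_y))
        q[list(assignment), np.arange(n_y)] = 1.0   # each response y sent to exactly one class
        val = mutual_information(p_xy @ q.T)
        if val > best_val:
            best_q, best_val = q, val
    return best_q, best_val

q_star, I_star = best_vertex(p_xy, n_classes=2)
print("max I(X;Y_f) over vertices:", I_star)
print("annealing cost there (beta = 1.5):", annealing_cost(p_xy, q_star, beta=1.5))
```

For small beta the entropy term dominates and a soft (stochastic) quantizer scores well; as beta grows the objective is dominated by I(X;Y_f), whose maximum over the constraint space sits at a vertex, which is the intuition behind annealing on beta.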

Application to synthetic data (p(X,Y) is known). [figure: clusters recovered by the algorithm, starting from random clusters].

The Optimization Problem for Real Data. Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y), constrained by H(X) - H_G(X|Y_f) >= I_o. Here I_o determines the informativeness of the reproduction.
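If H_G(X|Y_f) denotes a Gaussian-model estimate of the conditional entropy computed from data (an assumption about the notation; the slide does not spell it out), a hypothetical sketch is below. Because a Gaussian has the maximum entropy for a given covariance, H(X) - H_G(X|Y_f) would then lower-bound the mutual information, which is why it could stand in for I(X;Y_f) in the constraint:

```python
import numpy as np

def gaussian_conditional_entropy(stimuli, labels):
    """Gaussian-model estimate of H(X|Y_f) in bits (hypothetical sketch).

    stimuli: (n_samples, d) array of stimulus vectors X.
    labels:  (n_samples,) array of reproduction classes y_f.
    Each class needs several samples for a usable covariance estimate.
    """
    n, d = stimuli.shape
    h = 0.0
    for f in np.unique(labels):
        cluster = stimuli[labels == f]
        cov = np.cov(cluster, rowvar=False) + 1e-9 * np.eye(d)   # regularized class covariance
        _, logdet = np.linalg.slogdet(2.0 * np.pi * np.e * cov)  # ln det(2*pi*e*Sigma_f)
        h += (len(cluster) / n) * 0.5 * logdet / np.log(2.0)     # Gaussian entropy of class f, in bits
    return h
```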

Modeling the cricket cercal sensory system as a communication channel. [diagram: signal -> nervous system, treated as a communication channel].

Wind Stimulus and Neural Response in the cricket cercal system. [figures: neural responses Y (over a 30 minute recording) evoked by a white noise wind stimulus; the neural responses (all doublets) within a 12 ms window; some of the air current stimuli X preceding one of the neural responses, plotted against time in ms, with the first spike occurring at T = 0].

Quantization: A quantizer is any map f: Y -> Y_f from Y to a reproduction space Y_f with finitely many elements. Quantizers can be deterministic or probabilistic. [diagram: the response space Y and its reproduction Y_f].
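A tiny illustration (with invented response names) of the two kinds of quantizer: a deterministic map sends each response pattern to exactly one class, while a probabilistic quantizer gives a distribution over classes for each response; both can be written as a matrix q(y_f|y) whose columns sum to one:

```python
import numpy as np

responses = ["doublet_A", "doublet_B", "doublet_C", "doublet_D"]   # made-up labels

# Deterministic quantizer: a lookup table, equivalently a 0/1 column-stochastic matrix.
det_map = {"doublet_A": 0, "doublet_B": 0, "doublet_C": 1, "doublet_D": 1}
det_q = np.zeros((2, len(responses)))
for j, y in enumerate(responses):
    det_q[det_map[y], j] = 1.0

# Probabilistic quantizer: each column is a distribution over the two classes.
prob_q = np.array([[0.9, 0.8, 0.2, 0.1],
                   [0.1, 0.2, 0.8, 0.9]])

assert np.allclose(det_q.sum(axis=0), 1.0)
assert np.allclose(prob_q.sum(axis=0), 1.0)
```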

Applying the algorithm to cricket sensory data. [figures: the response space Y quantized into Y_f with two classes (Class 1, Class 2) and with three classes (Class 1, Class 2, Class 3)].

Conclusions. We model a part of the neural system as a communication channel, and define a coding scheme through relations between classes of stimulus/response pairs:
- Coding is probabilistic on the individual elements of X and Y.
- Coding is almost deterministic on the stimulus/response classes.
To recover such a coding scheme, we propose a new method to quantize neural spike trains:
- Quantize the response patterns to a small finite space (Y_f).
- Use information theoretic measures to determine the optimal quantizer for a fixed reproduction size.
- Refine the coding scheme by increasing the reproduction size.
We present preliminary results with cricket sensory data.