Presentation transcript:

Neural Coding and Decoding
Albert E. Parker
Center for Computational Biology, Department of Mathematical Sciences, Montana State University
Collaborators: Alexander Dimitrov, John P. Miller, Zane Aldworth, Tomas Gedeon, Brendan Mumey

Problem: How does neural ensemble activity represent information about sensory stimuli?
Our approach: Build a model using information theory (my research: probability theory), then use the model to determine a coding scheme (my research: numerical optimization techniques).

Neural Coding and Decoding. Goal: Determine a coding scheme. How does neural ensemble activity represent information about sensory stimuli?
Demands:
- An animal needs to recognize the same object on repeated exposures, so coding has to be deterministic at this level.
- The code must deal with uncertainties introduced by the environment and neural architecture, so coding is by necessity stochastic at this finer scale.
Major problem: The search for a coding scheme requires large amounts of data.

How to determine a coding scheme? Idea: Model a part of a neural system as a communication channel using information theory. This model enables us to:
Meet the demands of a coding scheme:
- Define a coding scheme as a relation between stimulus and neural response classes.
- Construct a coding scheme that is stochastic on the finer scale yet almost deterministic on the classes.
Deal with the major problem:
- Use whatever quantity of data is available to construct coarse but optimally informative approximations of the coding scheme.
- Refine the coding scheme as more data becomes available.
We investigate the cricket cercal sensory system.

Stimulus and Response Classes (figure): the stimulus space X (stimulus sequences) and the response space Y (response sequences) form stimulus/response sequence pairs, which are grouped into distinguishable classes of stimulus/response pairs.

Information Theoretic Quantities. A quantizer or encoder, Q, relates the environmental stimulus, X, to the neural response, Y, through a process called quantization. In general, Q is a stochastic map Q(y|x), and the reproduction space Y is a quantization of X. This can be repeated: let Y_f be a reproduction of Y, so there is a quantizer q(y_f|y): Y -> Y_f. Use the mutual information I(X;Y_f) to measure the degree of dependence between X and Y_f. Use the conditional entropy H(Y_f|Y) to measure the self-information of Y_f given Y.
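To make these quantities concrete, here is a minimal numerical sketch (not from the talk): it computes I(X;Y_f) and H(Y_f|Y) for an assumed joint distribution p(x,y) and an assumed stochastic quantizer q(y_f|y). The array sizes and numbers are purely illustrative; in practice p(x,y) would be estimated from stimulus/response data.

```python
import numpy as np

# Toy joint distribution p(x, y): rows index stimuli x, columns index responses y (assumed values).
p_xy = np.array([[0.20, 0.05, 0.05],
                 [0.05, 0.30, 0.05],
                 [0.05, 0.05, 0.20]])

# Stochastic quantizer q(y_f | y): rows index y, columns index reproduction classes y_f (assumed values).
q = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.1, 0.9]])

p_y = p_xy.sum(axis=0)        # marginal p(y)
p_x = p_xy.sum(axis=1)        # marginal p(x)
p_x_yf = p_xy @ q             # joint p(x, y_f) = sum_y p(x, y) q(y_f | y)
p_yf = p_x_yf.sum(axis=0)     # marginal p(y_f)

def safe_log2(z):
    """log2 that returns 0 where the argument is 0 (0 log 0 = 0 convention)."""
    return np.where(z > 0, np.log2(np.where(z > 0, z, 1.0)), 0.0)

# Mutual information I(X;Y_f) = sum_{x,y_f} p(x,y_f) log2[ p(x,y_f) / (p(x) p(y_f)) ]
I_x_yf = np.sum(p_x_yf * safe_log2(p_x_yf / np.outer(p_x, p_yf)))

# Conditional entropy H(Y_f|Y) = -sum_{y,y_f} p(y) q(y_f|y) log2 q(y_f|y)
H_yf_given_y = -np.sum(p_y[:, None] * q * safe_log2(q))

print(f"I(X;Y_f) = {I_x_yf:.4f} bits")
print(f"H(Y_f|Y) = {H_yf_given_y:.4f} bits")
```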

The Model. Problem: To determine a coding scheme between X and Y requires large amounts of data. Idea: Determine the coding scheme between X and Y_f, a squashing (reproduction) of Y, such that Y_f preserves as much information (mutual information) with X as possible and the self-information (entropy) of Y_f given Y is maximized. That is, we are searching for an optimal mapping (quantizer) q(y_f|y) that satisfies these conditions. Justification: Jaynes' maximum entropy principle, which states that of all the quantizers satisfying a given set of constraints, one should choose the one that maximizes the entropy.
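A compact statement of this optimization problem, reconstructed from this slide and the next one (I_o denotes the required level of informativeness):

```latex
\max_{q(y_f \mid y)} \; H(Y_f \mid Y)
\quad \text{subject to} \quad
I(X; Y_f) \ge I_o,
\qquad
q(y_f \mid y) \ge 0, \;\;
\sum_{y_f} q(y_f \mid y) = 1 \;\; \text{for all } y.
```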

Equivalent Optimization Problems
- Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y) constrained by I(X;Y_f) ≥ I_o, where I_o determines the informativeness of the reproduction.
- Deterministic annealing (Rose, '98): maximize F(q(y_f|y)) = H(Y_f|Y) + β I(X;Y_f). Small β favors maximum entropy; large β favors maximum I(X;Y_f).
- Simplex algorithm: maximize I(X;Y_f) over the vertices of the constraint space.
- Implicit solution for the optimal q(y_f|y).
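As a concrete illustration of the vertex-based formulation, here is a small sketch (not from the talk) that enumerates the vertices of the constraint space, i.e., the deterministic quantizers f: Y -> Y_f, and picks the one maximizing I(X;Y_f). The joint distribution and the reproduction size N are illustrative assumptions.

```python
import itertools
import numpy as np

def mutual_information(p_joint):
    """Mutual information (bits) between the row and column variables of a joint distribution."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / (px @ py)[mask])))

# Toy joint distribution p(x, y); illustrative numbers only.
p_xy = np.array([[0.15, 0.05, 0.02, 0.03],
                 [0.05, 0.20, 0.03, 0.02],
                 [0.02, 0.03, 0.25, 0.15]])
n_y = p_xy.shape[1]   # size of the response space Y
N = 2                 # size of the reproduction space Y_f (assumed small)

best_I, best_map = -np.inf, None
# Each vertex of the constraint space is a deterministic assignment f: Y -> Y_f.
for assignment in itertools.product(range(N), repeat=n_y):
    # Build p(x, y_f) by pooling the columns of p(x, y) assigned to each class y_f.
    p_x_yf = np.zeros((p_xy.shape[0], N))
    for y, yf in enumerate(assignment):
        p_x_yf[:, yf] += p_xy[:, y]
    I = mutual_information(p_x_yf)
    if I > best_I:
        best_I, best_map = I, assignment

print(f"best deterministic quantizer f(y) = {best_map}, I(X;Y_f) = {best_I:.4f} bits")
```

Exhaustive enumeration like this is exponential in |Y|, which is why the talk's annealing and continuation approaches matter for realistically sized response spaces.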


Modeling the cricket cercal sensory system as a communication channel (figure): the signal and the nervous system are mapped onto the components of a communication channel.

Wind Stimulus and Neural Response in the cricket cercal system (figure): neural responses Y (over a 30 minute recording) caused by a white noise wind stimulus; the responses shown (all doublets) fall within a 12 ms window (T, in ms; at T = 0 the first spike occurs); also shown are some of the air current stimuli X preceding one of the neural responses (time in ms).

Quantization (figure: Y and its reproduction Y_f): A quantizer is any map f: Y -> Y_f from Y to a reproduction space Y_f with finitely many elements. Quantizers can be deterministic or probabilistic.
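A quantizer can be represented as a conditional probability matrix q(y_f|y); a deterministic quantizer is the special case in which each row puts all its mass on a single class. A minimal sketch with assumed sizes and values:

```python
import numpy as np

n_y, n_yf = 4, 2   # sizes of Y and of the reproduction space Y_f (assumed)

# Deterministic quantizer: each y is assigned to exactly one class y_f,
# so every row of q(y_f | y) is one-hot (a vertex of the constraint space).
f = np.array([0, 0, 1, 1])          # f: Y -> Y_f
q_det = np.eye(n_yf)[f]

# Probabilistic (stochastic) quantizer: each row is a distribution over Y_f.
q_prob = np.array([[0.9, 0.1],
                   [0.7, 0.3],
                   [0.2, 0.8],
                   [0.1, 0.9]])

# Both are valid quantizers: rows are non-negative and sum to 1.
assert np.allclose(q_det.sum(axis=1), 1.0) and np.allclose(q_prob.sum(axis=1), 1.0)
print(q_det)
print(q_prob)
```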

Applying the algorithm to cricket sensory data (figure): the response space Y and the resulting reproductions Y_f.

Conclusions
We model a part of the neural system as a communication channel and define a coding scheme through relations between classes of stimulus/response pairs.
- Coding is probabilistic on the individual elements of X and Y.
- Coding is almost deterministic on the stimulus/response classes.
To recover such a coding scheme, we propose a new method to quantize neural spike trains.
- Quantize the response patterns to a small finite space (Y_f).
- Use information theoretic measures to determine the optimal quantizer for a fixed reproduction size.
- Refine the coding scheme by increasing the reproduction size.
We present preliminary results with cricket sensory data.