Optimizing Neural Information Capacity through Discretization

Tatyana O. Sharpee
Neuron, Volume 94, Issue 5, Pages 954-960 (June 2017)
DOI: 10.1016/j.neuron.2017.04.044
Copyright © 2017 Elsevier Inc. Terms and Conditions

Figure 1. Approximating Optimal Nonlinear Transformations with Threshold-like Computations. (A) Networks with three or more layers serve as "universal function approximators," meaning that they can compute any desired function of the input variables provided the intermediate layer has a sufficient number of units. Deep networks can reduce the size of the intermediate layer while still approximating complex functions of their inputs. (B) A logistic function, a necessary ingredient for achieving flexible computations, can be approximated by averaging over a large number of identical low-precision threshold devices. (C) An alternative is to use fewer, higher-precision threshold devices whose thresholds are staggered to match the input distribution. (D) Graph of optimal thresholds showing their successive splitting as the precision of individual neurons increases.
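The averaging strategy of panel (B) can be sketched numerically: a population of identical binary threshold units, each with a noisy (low-precision) threshold, produces a smooth sigmoid once responses are averaged. This is a minimal illustration, not code from the paper; the Gaussian threshold jitter and all parameter values are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_threshold_population(x, n_units=1000, theta=0.0, noise_sd=1.0):
    """Average response of n_units identical binary threshold devices.

    Each unit fires (outputs 1) when the input exceeds its effective
    threshold, which jitters around theta -- the 'low precision' of
    panel (B). Averaging the step responses yields a smooth,
    logistic-like nonlinearity.
    """
    jitter = rng.normal(0.0, noise_sd, size=n_units)  # assumed Gaussian
    # Fraction of units whose threshold each input value exceeds.
    return (x[:, None] > (theta + jitter)[None, :]).mean(axis=1)

x = np.linspace(-4.0, 4.0, 81)
avg = noisy_threshold_population(x)
# avg rises smoothly and monotonically from ~0 to ~1 around theta
```

With Gaussian jitter the averaged response actually approximates the Gaussian CDF, which is close in shape to the logistic function.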

Figure 2. Examples of Discrete Symmetry Breaking in Biology. (A) Two mutually exclusive genetic codes with similar binding profiles. The top 2 × 2 matrix corresponds to standard base pairing, and the bottom right 2 × 2 matrix is an alternative code with notations from Piccirilli et al. (1990). Binding affinities are denoted by color and approximate the binding strengths according to the model from Wagner (2005). The two codes are mutually exclusive because of strong mixed binding (off-diagonal blocks). (B) Breaking symmetry in neural function to create separate neuronal types. Optimal encoding based on two (or more) distinct thresholds (orange and yellow curves here) requires sharper activation functions than encoding based on a single type of nonlinearity (blue). Further, the activation function (orange) whose threshold is closer to the peak of the input distribution (gray) should be sharper than the activation function whose threshold is further from the peak (yellow). The theoretical approach can be applied to encoding by neurons or by intracellular receptors.
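The optimality criterion behind these staggered thresholds is information maximization: the output entropy of a discrete code is largest when the thresholds split the input's probability mass into equal parts. The sketch below, a toy illustration assuming a standard-Gaussian input (not a calculation from the paper), compares a balanced two-threshold code against a lopsided one.

```python
import math

def gauss_cdf(x):
    """CDF of the standard Gaussian, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def output_entropy(thresholds):
    """Entropy (bits) of the discrete code obtained by binning a
    standard-Gaussian input at the given thresholds."""
    cuts = [-math.inf] + sorted(thresholds) + [math.inf]
    masses = [gauss_cdf(b) - gauss_cdf(a) for a, b in zip(cuts, cuts[1:])]
    return -sum(p * math.log2(p) for p in masses if p > 0.0)

# Thresholds at the 1/3 and 2/3 quantiles split the probability mass
# evenly across the three output levels, maximizing entropy at log2(3).
q = 0.4307  # approx. the 2/3 quantile of the standard Gaussian
balanced = output_entropy([-q, q])
lopsided = output_entropy([0.0, 1.0])
# balanced ~ log2(3) ~ 1.585 bits; lopsided is strictly smaller
```

The same quantile argument explains panel (D) of Figure 1: as neurons gain precision, a single optimal threshold splits into several, each claiming an equal share of the input distribution.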

Figure 3. Predictions that Relate the Distribution of Ion Channel Type and Density to the Statistics of Presynaptic Inputs. (A) Schematic representation of the synaptic inputs. (B) The amplitude distribution of synaptic inputs should be matched by the distribution of thresholds of the Na channel types present at the synapse, and by their densities. (C) The Fourier spectra of the inputs should be matched to the inactivation dynamics of the ion channels present at the synapse.
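One way to read the matching prediction of panel (B) is as quantile placement: channel activation thresholds positioned at evenly spaced quantiles of the input-amplitude distribution, so each channel type is recruited by an equal share of inputs. The sketch below is an illustrative interpretation, not the paper's procedure; the lognormal amplitude distribution and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical synaptic input amplitudes (lognormal shape is an
# illustrative assumption, not a claim from the paper).
amplitudes = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

def staggered_thresholds(samples, n_types):
    """Place n_types activation thresholds at evenly spaced quantiles
    of the empirical input-amplitude distribution, so that each channel
    type is recruited by a roughly equal fraction of inputs."""
    qs = np.arange(1, n_types + 1) / (n_types + 1)
    return np.quantile(samples, qs)

thresholds = staggered_thresholds(amplitudes, 3)
# thresholds sit at the 25%, 50%, and 75% points of the amplitude
# distribution, splitting the inputs into four equal-mass bands
```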