Number detectors spontaneously emerge in a deep neural network designed for visual object recognition
by Khaled Nasr, Pooja Viswanathan, and Andreas Nieder


Number detectors spontaneously emerge in a deep neural network designed for visual object recognition. Khaled Nasr, Pooja Viswanathan, and Andreas Nieder. Science Advances 5(5):eaav7903, May 8, 2019. Published by AAAS.

Fig. 1 An HCNN for object recognition. (A) Simplified architecture of the HCNN. The feature extraction network consists of convolutional layers that compute multiple feature maps. Each feature map represents the presence of a certain visual feature at all possible locations in the input and is computed by convolving the input with a filter and then applying a nonlinear activation function. Max-pooling layers aggregate responses by computing the maximum response in small nonoverlapping regions of their input. The classification network consists of a global average-pooling layer that computes the average response in each input feature map, and a fully connected layer in which the response of each unit represents the probability that a specific object class is present in the input image. (B) Example images representative of those used in the test set, with the top five predictions made by the network for each image ranked by confidence; the successful classification of a wolf spider among other arthropods is shown as an example. Ground-truth labels are shown above each image. Images shown here are from the public domain (Wikimedia Commons). Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
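For illustration, a minimal sketch of an HCNN with this layer structure, written in Keras; the layer counts, filter sizes, and class count are placeholders, not the configuration used in the paper:

```python
# Minimal sketch of an HCNN of the kind described above (hypothetical layer
# sizes and class count; not the authors' exact configuration).
from tensorflow.keras import layers, models

def build_hcnn(input_shape=(224, 224, 3), n_classes=1000):
    inputs = layers.Input(shape=input_shape)
    x = inputs
    # Feature extraction network: convolution (filter + nonlinearity)
    # followed by max pooling over small non-overlapping regions.
    for n_filters in (64, 128, 256, 512):
        x = layers.Conv2D(n_filters, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D(pool_size=2)(x)
    # Classification network: global average pooling over each feature map,
    # then a fully connected layer whose units give class probabilities.
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_hcnn()
model.summary()
```

Global average pooling collapses each feature map to a single summary value, so the final fully connected layer sees one response per feature map rather than a full spatial grid.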

Fig. 2 Numerosity-tuned units emerging in the HCNN. (A) Examples of the stimuli used to assess numerosity encoding. Standard stimuli contain dots of the same average radius. Dots in Area & Density stimuli have a constant total area and density across all numerosities. Dots in Shape & Convex hull stimuli have random shapes and a uniform pentagonal convex hull (for numerosities >4). (B) Tuning curves for individual numerosity-selective network units. Colored curves show the average responses for each stimulus set; black curves show the average responses over all stimulus sets. Error bars indicate SEM. PN, preferred numerosity. (C) Same as (B), but for neurons in monkey prefrontal cortex (20). Only the average responses over all stimulus sets are shown. (D) Distribution of preferred numerosities of the numerosity-selective network units. (E) Same as (D), but for real neurons recorded in monkey prefrontal cortex [data from (20)]. Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
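A hedged sketch of one way such tuning curves and preferred numerosities could be computed from recorded unit responses; the array layout `responses[unit, numerosity, stimulus_set, trial]` and the argmax criterion for preferred numerosity are assumptions for illustration, not the authors' selection procedure:

```python
# Hypothetical sketch: derive tuning curves and preferred numerosities from
# unit responses. Assumes responses[unit, numerosity, stimulus_set, trial].
import numpy as np

def tuning_curves(responses):
    # Average over trials within each stimulus set -> per-set tuning curves.
    per_set = responses.mean(axis=3)    # (units, numerosities, sets)
    # Average over stimulus sets -> overall tuning curve per unit.
    overall = per_set.mean(axis=2)      # (units, numerosities)
    # Simplified criterion: preferred numerosity = peak of the overall curve.
    preferred = overall.argmax(axis=1)
    return per_set, overall, preferred

rng = np.random.default_rng(0)
fake = rng.random((100, 30, 3, 20))     # 100 units, 30 numerosities, 3 sets
per_set, overall, pn = tuning_curves(fake)
```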

Fig. 3 Tuning curves of numerosity-selective network units. Average tuning curves of the numerosity-selective network units tuned to each numerosity. Each curve is computed by averaging the responses of all numerosity-selective units that have the same preferred numerosity; the pooled responses are normalized to the 0 to 1 range. The preferred numerosity and the number of numerosity-selective network units are indicated above each curve. Error bars indicate SEM. Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
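A short sketch of the pooling and normalization step described here, continuing the hypothetical arrays from the previous sketch; min-max normalization to the 0 to 1 range is assumed:

```python
# Pool the curves of all units sharing a preferred numerosity, then
# normalize each pooled curve to the 0-1 range (assumed min-max scaling).
import numpy as np

def pooled_normalized_curves(overall, preferred):
    pooled = {}
    for p in np.unique(preferred):
        curve = overall[preferred == p].mean(axis=0)
        lo, hi = curve.min(), curve.max()
        pooled[p] = (curve - lo) / (hi - lo) if hi > lo else curve * 0.0
    return pooled
```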

Fig. 4 Tuning properties of numerosity-selective network units. (A) Left: Average tuning curves for network units preferring each numerosity, plotted on a linear scale. Right: The same tuning curves plotted on a logarithmic scale. (B) Average goodness-of-fit measure for fitting Gaussian functions to the tuning curves on different scales [P(linear vs. log) = 0.009; P(linear vs. power 1/2) = 0.003; P(linear vs. power 1/3) = 0.001]. (C) SD of the best-fitting Gaussian function for each of the tuning curves of numerosity-selective network units on the different scales. Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
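A sketch of how Gaussian tuning functions might be fit on the different numerosity scales named in the figure (linear, logarithmic, square root, cube root) using SciPy; the goodness-of-fit measure (r²) and the initialization are assumptions, not the authors' exact fitting procedure:

```python
# Fit a Gaussian to one tuning curve after transforming the numerosity axis,
# and compare fit quality across scales. Numerosities are assumed >= 1.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, b):
    return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b

SCALES = {
    "linear":   lambda n: n,
    "log":      np.log,
    "pow(1/2)": np.sqrt,
    "pow(1/3)": np.cbrt,
}

def fit_on_scales(numerosities, curve):
    results = {}
    for name, transform in SCALES.items():
        x = transform(numerosities.astype(float))
        p0 = [curve.max() - curve.min(), x[curve.argmax()],
              max(x.std(), 1e-3), curve.min()]
        params, _ = curve_fit(gaussian, x, curve, p0=p0, maxfev=10000)
        residuals = curve - gaussian(x, *params)
        r2 = 1.0 - residuals.var() / curve.var()
        results[name] = {"sigma": abs(params[2]), "r2": r2}
    return results
```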

Fig. 5 Relevance of numerosity-selective units to network performance. Average activity of numerosity-selective network units shown as a function of the numerical distance between preferred numerosity and sample numerosity in the matching task. Left: Data from network units. Responses were averaged separately for correct trials (black) and error trials (gray); responses during error trials are normalized to the maximum average response during correct trials (P = 0.019). Right: The same plot for real neurons recorded from monkey prefrontal cortex [data from (20)]. Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
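A hypothetical sketch of the correct-versus-error comparison: activity is averaged by the numerical distance between a unit's preferred numerosity and the sample numerosity, separately for correct and error trials, with error-trial responses normalized to the peak of the correct-trial curve; the trial record format is assumed:

```python
# Assumes each trial is a dict with keys 'distance' (preferred minus sample
# numerosity), 'activity', and 'correct' (bool), and that both trial types
# occur at every distance.
import numpy as np
from collections import defaultdict

def activity_by_distance(trials):
    groups = defaultdict(lambda: {"correct": [], "error": []})
    for t in trials:
        key = "correct" if t["correct"] else "error"
        groups[t["distance"]][key].append(t["activity"])
    dists = sorted(groups)
    correct = np.array([np.mean(groups[d]["correct"]) for d in dists])
    error = np.array([np.mean(groups[d]["error"]) for d in dists])
    # Normalize error-trial activity to the peak of the correct-trial curve.
    return dists, correct, error / correct.max()
```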

Fig. 6 Performance of the HCNN model in the numerosity matching task. (A) Left: Performance functions resulting from the discrimination of numerosities, plotted on a linear scale. Each curve shows the probability of the model predicting that the sample image contains the same number of items as the test image (peak of the function); the sample numerosity is indicated above each curve. Right: The same performance functions plotted on a logarithmic scale. (B) Average goodness-of-fit measure for fitting Gaussian functions to the performance tuning curves on different scales [P(linear vs. log) = 0.003; P(linear vs. power 1/2) = 0.049; P(linear vs. power 1/3) = 0.016]. *P < 0.05. (C) SD of the best-fitting Gaussian function for each of the performance tuning curves on the different scales. Khaled Nasr et al. Sci Adv 2019;5:eaav7903. Published by AAAS.
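A minimal sketch of how such performance functions could be tabulated from the model's match/non-match decisions; the decision and numerosity arrays are hypothetical. The resulting curves can then be fit with Gaussians on different scales as in the sketch after Fig. 4:

```python
# decisions, samples, tests are NumPy arrays over trials: decisions[i] is 1
# if the model judged trial i a numerosity match, else 0; samples[i] and
# tests[i] are the numerosities shown on that trial (hypothetical layout).
import numpy as np

def performance_functions(decisions, samples, tests, numerosities):
    curves = np.zeros((len(numerosities), len(numerosities)))
    for si, s in enumerate(numerosities):
        for ti, t in enumerate(numerosities):
            mask = (samples == s) & (tests == t)
            curves[si, ti] = decisions[mask].mean()
    return curves  # row si peaks at ti == si when discrimination is good
```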