What is the role of the Dentate Gyrus?


What is the role of the Dentate Gyrus? The trisynaptic circuit in the hippocampus: the Perforant Path carries input from the entorhinal cortex (EC); the mossy fibers form the DG-CA3 connections; the Schaffer collaterals form the CA3-CA1 connections; the recurrent collaterals form the CA3-CA3 connections; CA1 then projects to the subiculum (Sb). Erika Cerasti

It does not seem to be a new frame of mind (Rodriguez et al., 2002, and others from the Sevilla group).

The entire hippocampus, or rather the medial pallium. Reptiles, mammals, and birds all show: neurogenesis, zinc, acetylcholine, recurrent nets, and spatial memory. Turtles: spatial memory. Goldfish: spatial memory, but no Dentate Gyrus! (Font et al., 2001; Lindsey and Tropepe, 2006; Bingman et al., 2005; Smeets et al., 1989; Martinez-Garcia and Olucha, 1987; Montagnese et al., 1993; Reiner, 1993; Krebs et al., 1996; Sherry et al., 1993; Rodriguez et al., 2002)

The new mammalian contraption is the Dentate Gyrus. Georg Striedter, J Comp Neurology (2016): Evolution of the hippocampus in reptiles and birds.

Autoassociative network: CA3. To address this question we refer to theories (David Marr, Bruce McNaughton, Edmund Rolls) which consider CA3 to work as an autoassociator, performing memory functions through the presence of a large number of recurrent connections (RC) in the area. Different patterns (memories) are stored in the synaptic weights of the network via Hebbian learning, as in the Hopfield model. When an external input arrives, it imposes an activity pattern on the network; this pattern of activity is the network's representation of the external input, and Hebbian learning, that is, modification of the synaptic weights, stores it. The storage capacity, i.e. the maximum number of storable patterns, depends on the connectivity of the network and on the sparseness of its representations.
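A minimal sketch of such an autoassociator, in the spirit of the Hopfield model (the network size, number of patterns, and synchronous update scheme are illustrative assumptions, not the parameters of the CA3 model discussed here):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                      # neurons, stored patterns (assumed sizes)

# Random binary (+1/-1) patterns to store
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) learning rule; no self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=10):
    """Iterate synchronous sign updates until the network settles."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1               # break ties consistently
    return s

# Degrade a stored pattern by flipping 20% of its units, then retrieve it
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
cue[flip] *= -1
out = retrieve(cue)
overlap = (out @ patterns[0]) / N   # 1.0 means perfect recall
```

With a load of P/N = 0.05, well below the classic 0.138 capacity limit, the degraded cue is pulled back to the stored attractor, illustrating why capacity depends on connectivity and sparseness.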

However, the Hopfield model does not say how representations are established, i.e. what determines the activity distribution relative to each pattern or external input: how the storage occurs. Memory storage is difficult to implement in CA3 by itself, because of the interference due to previously stored memories (and recurrent-connection dominance). This calls for a differentiation of storage and retrieval phases: external input activity should prevail during storage, while recurrent collateral activity should prevail during retrieval. Proposed mechanisms include McNaughton's detonator synapses and the acetylcholine hypothesis (Hasselmo et al.). In particular, some studies suggest the DG, as a preprocessing stage, could help CA3 form memory representations and separate storage from retrieval: the DG can impose on CA3 the new pattern to store, and the mossy fibers (MF) have strong and sparse synapses, suitable for that, as shown in the work of Treves and Rolls (1992).

Functional differentiation of the afferent inputs to CA3: information measures. The information available for retrieval should be at least equal to the amount that is actually retrievable by the net. A comparison of the amount of storable information carried by the afferent inputs to CA3 (information vs CA3 sparseness a) shows that the Perforant Path input is too weak to force learning, and that the mossy fiber input to CA3 is the one crucial to establish informative representations in CA3 (Treves & Rolls, Hippocampus 1992). This theoretical work involved only discrete representations, with a binary distribution of DG activity. Hypothesis: MF inputs establish new representations; PP inputs relay the cue for retrieval.

Experimental tests of the hypothesis: spatial tasks in rodents. The experimental support for this hypothesis comes from studies of rodents performing spatial tasks, involving spatial representations. We therefore concentrate our study on the role of the DG in the storage of spatial representations in CA3; we can do that because the firing code of the DG is now known: as Leutgeb et al. showed, DG cells present multiple firing fields.


Experimental tests of the hypothesis: spatial tasks in rodents. Acquisition index: errors (T1-5 D1) - (T6-10 D1). Retrieval index: errors (T6-10 D1) - (T1-5 D2).

From discrete to continuous representations. The theoretical study involved discrete representations, while the experimental work involves spatial representations in rodents; so, to study the role of the DG in the storage of spatial representations in CA3, we have to pass from discrete representations to continuous ones, referring to the concept of the continuous 2-dimensional attractor. If we want to represent spatial locations in the network, the configurations of the network have to change continuously, as the spatial input does. In an ideal continuous attractor, each position in space (x0, y0), (x1, y1), (x2, y2), ... corresponds to an attractor state for the population vector; the ensemble of such attractors for all locations in space comprises the memory of the environment (a "chart"). Memory of multiple environments may require the storage of multiple charts: the multi-chart model (Samsonovich and McNaughton, 1997).

Spatial representations in DG: multiple firing fields per DG cell. The firing activity in DG consists of spatially localized, place-like receptive fields, and, as Leutgeb et al. showed, DG cells present multiple firing fields. Coding model: sparse level of firing, p ≃ 0.03; number of firing fields per cell given by a Poisson probability with mean q ≃ 1.7. Leutgeb et al., Science 2007.
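The slide's coding model can be sketched by sampling field counts from the stated Poisson distribution (the population size and environment size are illustrative assumptions; note that p ≃ 0.03 refers to the sparseness of firing at any given location, since fields are small, not to the fraction of cells with at least one field):

```python
import numpy as np

rng = np.random.default_rng(1)

# Statistics from Leutgeb et al. (2007) as used by the model:
q = 1.7          # mean number of firing fields per DG granule cell
N_DG = 10_000    # illustrative population size (assumed)
L = 1.0          # side of a square environment, arbitrary units (assumed)

# Number of fields per cell is Poisson(q)
n_fields = rng.poisson(q, size=N_DG)

# Field centres are taken uniform over the environment
centers = [rng.uniform(0, L, size=(m, 2)) for m in n_fields]

active_fraction = np.mean(n_fields > 0)   # fraction of cells with fields
mean_fields = n_fields.mean()             # should be close to q
```

The fraction of cells with at least one field comes out near 1 - exp(-q) ≈ 0.82, so most granule cells carry some spatial signal even though each is silent over most of the environment.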

Model

Storing a new map. Hypothesis: mossy fibers drive the storage of new information. Modeling assumption: only DG inputs carry the relevant information about a new spatial map, representative of an environment, to be stored in CA3. The model is composed of a layer of DG cells with activity β(x), projecting through the mossy fibers to a layer of CA3 cells; the connectivity level is indicated by c, the number of DG cells projecting to a single CA3 cell. The recurrent connections in CA3 are removed and treated as noise δ. We then define the fields and study how the amount of information about the new spatial map depends on the parameters.

Model details. c = number of DG cells projecting to a single CA3 cell, with DG sparseness p_DG ≃ 0.03. The spatial firing of each DG unit is a sum of Gaussian functions, one for each of its fields, where Q_j, the number of fields of cell j, follows a Poisson distribution with mean q = 1.7. The firing of the CA3 units then results from the summed DG input, with noise δ, through threshold-linear units.
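A sketch of this feedforward DG → CA3 stage (population sizes, field width, noise level, and the median-based threshold are all illustrative assumptions; in the model the threshold is set to achieve a target CA3 sparseness):

```python
import numpy as np

rng = np.random.default_rng(2)

N_DG, N_CA3 = 2000, 300   # assumed population sizes
c = 50                    # DG cells converging on each CA3 cell
q, sigma = 1.7, 0.1       # mean fields per DG cell, assumed field width
L = 1.0                   # square environment side (arbitrary units)

# Each DG cell: Poisson(q) Gaussian fields with uniform centres
n_fields = rng.poisson(q, N_DG)
centers = [rng.uniform(0, L, (m, 2)) for m in n_fields]

def dg_rate(x):
    """DG population rate vector at position x: sum of Gaussian fields."""
    r = np.zeros(N_DG)
    for j, cs in enumerate(centers):
        if len(cs):
            d2 = np.sum((cs - x) ** 2, axis=1)
            r[j] = np.sum(np.exp(-d2 / (2 * sigma ** 2)))
    return r

# Sparse random DG -> CA3 connectivity: exactly c inputs per CA3 cell
J = np.zeros((N_CA3, N_DG))
for i in range(N_CA3):
    J[i, rng.choice(N_DG, c, replace=False)] = rng.uniform(0, 1, c)

def ca3_rate(x, noise=0.1):
    """Threshold-linear CA3 response to DG input at x, plus noise delta."""
    h = J @ dg_rate(x) + noise * rng.standard_normal(N_CA3)
    thr = np.median(h)               # crude threshold (assumed), sets ~50% active
    return np.maximum(h - thr, 0.0)

r = ca3_rate(np.array([0.5, 0.5]))
```

Scanning `ca3_rate` over a grid of positions would give the CA3 place maps whose information content the analytical calculation quantifies.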

Analytical Calculation

Mutual information. We calculate the amount of information about a new environment, i.e. about position x, that is established in the CA3 cells (activity η) by the DG spatial activity (β): the mutual information I(x; η) = H(η) - H(η | x), the entropy of the CA3 activity minus its conditional entropy given position.
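As a toy illustration of computing I(x; η) = H(η) - H(η | x), here is a plug-in histogram estimate for a single binary unit tuned to a discretized position (the bin count, firing probabilities, and sample size are illustrative assumptions, not the model's parameters):

```python
import numpy as np

rng = np.random.default_rng(4)

n_pos, n_samples = 16, 50_000            # assumed position bins and samples
x = rng.integers(0, n_pos, n_samples)    # uniformly visited positions

# Position-tuned binary response: fires with high probability in a few bins
p_fire = np.where(np.isin(x, [2, 3, 11]), 0.8, 0.1)
eta = rng.random(n_samples) < p_fire

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(eta): entropy of the marginal response distribution
p_eta = np.bincount(eta.astype(int), minlength=2) / n_samples
H = entropy(p_eta)

# H(eta | x): conditional entropy, averaged over position bins
Hc = 0.0
for b in range(n_pos):
    mask = x == b
    pb = np.bincount(eta[mask].astype(int), minlength=2) / mask.sum()
    Hc += mask.mean() * entropy(pb)

I = H - Hc   # bits of spatial information in this unit's activity
```

For these assumed tuning probabilities the estimate lands around a quarter of a bit; the analytical calculation in the talk does the analogous computation for the full threshold-linear CA3 population.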


m-fields decomposition. The mutual information per single CA3 cell is decomposed into pieces, each considering the contribution of a fixed number m of input fields:

⟨I⟩ = Σ_m C_m ⟨D_m⟩

where ⟨·⟩ denotes an average over the quenched variables, D_m is the spatial signal produced by m DG fields, and C_m is a combination of Poisson coefficients.

Simulations

Simulations. The DG acts as a spatial random number generator for CA3. The study of the information has been performed through both simulations and analytical calculation, and the two agree; the graphs show the dependence on c, the DG → CA3 connectivity. Erika Cerasti & AT, 2010, 2013.

Adding Recurrent Collaterals: Continuous Attractors? Now we look at what happens when we add the recurrent collaterals to the system: pre-wired vs self-organized connectivity.

Enhanced effect of neurogenesis? New, potentially neurogenesis-related observations (by Charlotte Alme in the Moser lab): a sizable fraction of «overactive» CA3 units, perhaps the majority of the IEG-active ones, as seen by Nora Abrous.

[Histogram: frequency vs number of rooms in which a CA3 cell is active]

Hypothesis: the overactive CA3 units are those driven by 1, 2 or more youthful DG units (see the model by S. Temprana, E. Kropff & A. Schinder).

[Histogram: frequency vs number of rooms in which a CA3 cell is active, split by input type: 2 hyperactive adult-born GC inputs, 1 hyperactive adult-born GC input, no hyperactive adult-born GC input]

Future plans: differences among rodents... may they be related to the different habitat? Compare the burrows of a Norway rat with the burrows of a Cape mole-rat (lab of Irmgard Amrein, Zurich).

If not before, then stop here, please.

Learning in the recurrent collaterals: retrieving a stored map. Until now we considered only storage; now we consider retrieval of the spatial representations established in CA3 by the DG. The model is the same as before, with the addition of RC terms in the firing of the CA3 units: the activity of a CA3 unit is now determined by the input from the DG (via the mossy fibers) and by the input from the other CA3 units, with noise and a threshold as before, through weights J_ij. Differently from before, we allow plasticity of the J_ij during the presentation of the input. The protocol is: an external input arrives from the DG about a new map x; a learning phase follows, in which the J_ij are allowed to change; then, in a retrieval phase, the input is turned off and the system has to maintain the information. We measure the amount of information the RC activity contains about a previously presented map, in the absence of input, studying the retrieval of a single spatial map both when a single map is stored and when multiple maps are stored.

Learning: pre-wired connectivity vs Hebbian learning. The learning could be represented either by a pre-wired connectivity or by Hebbian learning; we study the system and compare the two cases. For the pre-wired connectivity, the weights are chosen as exponentially decreasing functions of the distance between place-field centers; what happens in this case is quite well established by previous studies of 1D and 2D models. For the self-organizing connectivity, the weights are allowed to change during the learning phase according to a Hebbian rule in which a trace term is present.
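A sketch of the pre-wired case, with an exponential weight profile over the distance between place-field centres, plus a generic Hebbian-with-trace update for the self-organized case (the network size, length scale, learning rate, and the exact form of the trace rule are all assumptions, since the slide does not give the formulas):

```python
import numpy as np

rng = np.random.default_rng(3)

N = 400      # assumed number of CA3 units
lam = 0.15   # assumed length scale of the exponential profile
L = 1.0      # square environment side (arbitrary units)

# One place-field centre per CA3 cell, uniform over the environment
centers = rng.uniform(0, L, size=(N, 2))

# Pre-wired weights: exponentially decreasing with pairwise centre distance
d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
W = np.exp(-d / lam)
np.fill_diagonal(W, 0.0)   # no self-connections

def hebbian_update(W, r, r_trace, eta=0.01, tau=0.9):
    """Self-organized alternative (assumed form): Hebbian rule with a
    trace term, where r_trace is a running average of past activity."""
    r_trace = tau * r_trace + (1 - tau) * r
    W = W + eta * np.outer(r, r_trace)   # correlate activity with its trace
    np.fill_diagonal(W, 0.0)
    return W, r_trace

# One illustrative learning step on a random activity vector
r0 = rng.random(N)
W2, tr = hebbian_update(W, r0, np.zeros(N))
```

In the pre-wired case W is symmetric by construction, which is what supports a quasi-continuous attractor; repeated Hebbian-with-trace updates instead let the weight structure self-organize from the DG-driven activity.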

MF inputs alone, acting as a spatial random number generator, produce a reasonable spatial code; recurrent connections preserve it, but such randomly established representations are distorted (compare the accurate and the distorted charts).

Lack of precision is a finite-size effect: real CA3 attractors can be precise (λ=1, λ=2), but self-organized ones may not be accurate.

Such lack of accuracy comes with less spatial information, but much more contextual information than for grid cells.

Can one iron out the wrinkles in CA3 charts with RC learning? More learning makes the attractors more informative about position, but less informative about context; so we should pay attention to Rene Hen's distinction between position and context. Cerasti & Treves, 2013.