
Short-term and Long-term Memory
Jochen Triesch, UC San Diego, http://cogsci.ucsd.edu/~triesch
Presentation transcript

Slide 1: Short-term and Long-term Memory
Motivation: very simple circuits can store patterns of activity.
Short-term: stability of activity patterns due to non-linear activity dynamics.
Long-term: storage of patterns through modifications of synapses.
The simplest STM system: a short-term memory (STM) network.
Stimulus-specific activity during the delay period in units in temporal and prefrontal cortex (after Fuster, 1996).

Slide 2
Stationary points: all three satisfy the same fixed-point condition, which reduces to a cubic equation for e0 with three solutions. Linearizing around these three stationary points results in three linear systems (τ = 20 ms).
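
The slide's equations were shown as images and are not in the transcript. Below is a minimal numerical sketch assuming a standard two-unit model of the form τ de_i/dt = -e_i + S(w e_j), with a Naka-Rushton nonlinearity S(P) = M P^2/(σ^2 + P^2) for P ≥ 0. The values w = 3, M = 100 and σ = 120 are assumptions chosen to give three stationary points; only τ = 20 ms is taken from the slide.

    import numpy as np

    tau = 20.0                 # membrane time constant in ms (from the slide)
    w = 3.0                    # assumed mutual excitatory weight
    M, sigma = 100.0, 120.0    # assumed Naka-Rushton parameters

    def S(P):
        """Naka-Rushton nonlinearity (assumed form): M*P^2/(sigma^2 + P^2) for P >= 0."""
        P = np.maximum(P, 0.0)
        return M * P**2 / (sigma**2 + P**2)

    def S_prime(P):
        """Derivative of S, needed for the linearization."""
        P = np.maximum(P, 0.0)
        return 2 * M * P * sigma**2 / (sigma**2 + P**2)**2

    # Symmetric stationary points satisfy e0 = S(w*e0), i.e. e0*(sigma^2 + w^2 e0^2) = M w^2 e0^2.
    # That is a cubic: e0 = 0 or w^2 e0^2 - M w^2 e0 + sigma^2 = 0 (three real roots here).
    nonzero = np.roots([w**2, -M * w**2, sigma**2])
    fixed_points = np.concatenate(([0.0], np.sort(nonzero.real)))
    print("stationary points e0:", fixed_points)       # -> 0, 20, 80 with these assumed parameters

    # Linearize around each stationary point: the Jacobian of the two-unit system is
    # (1/tau) * [[-1, w*S'(w e0)], [w*S'(w e0), -1]], eigenvalues (1/tau) * (-1 +/- w*S'(w e0)).
    for e0 in fixed_points:
        a = w * S_prime(w * e0)
        lam = np.array([-1 + a, -1 - a]) / tau
        kind = "stable" if np.all(lam < 0) else "saddle (unstable)"
        print(f"e0 = {e0:6.2f}: eigenvalues {lam}, {kind}")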

Slide 3
Simulation: one run started at (14, 25), another started at (50, 20).
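
A sketch of this simulation under the same assumptions (all parameters except τ are placeholders): simple forward-Euler integration from the two initial conditions given on the slide. With these assumed parameters the first run decays back to rest while the second settles at the elevated stationary point, illustrating the bistability.

    tau, w, M, sigma = 20.0, 3.0, 100.0, 120.0   # same assumed parameters as in the sketch above

    def S(P):
        P = max(P, 0.0)
        return M * P**2 / (sigma**2 + P**2)

    def simulate(e1, e2, T=1000.0, dt=0.1):
        """Forward-Euler integration of the assumed two-unit STM model for T milliseconds."""
        for _ in range(int(T / dt)):
            de1 = (-e1 + S(w * e2)) / tau
            de2 = (-e2 + S(w * e1)) / tau
            e1, e2 = e1 + dt * de1, e2 + dt * de2
        return round(e1, 2), round(e2, 2)

    # The two initial conditions from the slide:
    print("start (14, 25) ->", simulate(14.0, 25.0))   # with these parameters: decays to rest (0, 0)
    print("start (50, 20) ->", simulate(50.0, 20.0))   # with these parameters: settles near (80, 80)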

Slide 4: Hysteresis in STM Model
Consider an additional input K applied to both units.
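
A sketch of the hysteresis experiment under the same assumed model, with the extra input K added inside the nonlinearity (an assumption about where K enters). K is stepped up and then back down, and the settled activity is recorded at each step: on the way up the activity jumps to the elevated branch once K passes a critical value, and on the way down it stays elevated even at K = 0, which is the memory.

    tau, w, M, sigma = 20.0, 3.0, 100.0, 120.0   # same assumed parameters as in the sketches above

    def S(P):
        P = max(P, 0.0)
        return M * P**2 / (sigma**2 + P**2)

    def settle(e, K, T=1000.0, dt=0.1):
        """Relax the symmetric state e1 = e2 = e with a constant extra input K to both units."""
        for _ in range(int(T / dt)):
            e += dt * (-e + S(w * e + K)) / tau
        return e

    e = 0.0
    K_up = [5.0 * k for k in range(13)]          # K = 0, 5, ..., 60
    trace_up = []
    for K in K_up:                               # slowly increase K: the activity eventually jumps up
        e = settle(e, K)
        trace_up.append(round(e, 1))
    trace_down = []
    for K in reversed(K_up):                     # then decrease K: the activity stays on the elevated
        e = settle(e, K)                         # branch all the way back to K = 0 (hysteresis)
        trace_down.append(round(e, 1))
    print("K increasing:", list(zip(K_up, trace_up)))
    print("K decreasing:", list(zip(list(reversed(K_up)), trace_down)))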

Slide 5: Forgetting in STM Model
Problem: there should be some forgetting.
Idea: incorporate adaptation (fatigue) into the units.
The a_i variables slowly adjust the slope of the Naka-Rushton non-linearity. If a unit is active for a long time, it experiences "fatigue" (this models a very slow hyperpolarizing potassium current).
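
The adaptation equations are not in the transcript. One common way to implement the idea, sketched below under stated assumptions, is to give each unit a slow variable a_i that raises the semi-saturation constant of its Naka-Rushton function, which lowers the function's slope when the unit has been active for a long time. The coupling form and the constant g are placeholders.

    M, sigma = 100.0, 120.0   # assumed Naka-Rushton parameters, as in the sketches above
    g = 0.7                   # assumed strength with which adaptation raises the semi-saturation

    def S(P, a):
        """Naka-Rushton nonlinearity with fatigue: the adaptation variable a raises the
        semi-saturation constant, which lowers the curve's slope for a fatigued unit."""
        P = max(P, 0.0)
        return M * P**2 / ((sigma + g * a)**2 + P**2)

    # The same input drives a fatigued unit (large a) less strongly than a fresh one (a = 0):
    print(S(240.0, 0.0), S(240.0, 60.0))   # roughly 80.0 vs. 68.7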

Slide 6
Simulation: present a brief input K = 50 to unit 1 for 200 ms < t < 400 ms.
Observation: between 5 and 6 seconds after the stimulus, the network forgets.
Explanation: treat a_i(t) as a constant (it is a slowly changing variable) and plot the isoclines of the e-dynamics with a_i as a parameter: the stable and unstable nodes join and vanish.
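
A sketch of the forgetting simulation and of the quasi-static explanation, under the same assumptions as above plus an assumed slow adaptation time constant τ_a = 4000 ms and dynamics da_i/dt = (-a_i + e_i)/τ_a; the exact time at which forgetting occurs therefore depends on these placeholder values. The second loop freezes a and shows the stable and unstable non-zero fixed points approaching each other and vanishing as a grows, the saddle-node described on the slide.

    tau, tau_a = 20.0, 4000.0                  # activity vs. (assumed) slow adaptation time constant, ms
    w, M, sigma, g = 3.0, 100.0, 120.0, 0.7    # assumed parameters, as in the sketches above

    def S(P, a):
        P = max(P, 0.0)
        return M * P**2 / ((sigma + g * a)**2 + P**2)

    # Forward-Euler simulation with a brief input pulse K = 50 to unit 1 between 200 ms and 400 ms.
    dt, T = 0.5, 8000.0
    e1 = e2 = a1 = a2 = 0.0
    for n in range(int(T / dt)):
        t = n * dt
        K = 50.0 if 200.0 <= t < 400.0 else 0.0
        de1 = (-e1 + S(w * e2 + K, a1)) / tau
        de2 = (-e2 + S(w * e1, a2)) / tau
        da1 = (-a1 + e1) / tau_a               # adaptation slowly tracks each unit's own activity
        da2 = (-a2 + e2) / tau_a
        e1, e2, a1, a2 = e1 + dt * de1, e2 + dt * de2, a1 + dt * da1, a2 + dt * da2
        if t % 1000.0 == 0.0:
            print(f"t = {t/1000:3.0f} s: e1 = {e1:6.1f}, a1 = {a1:5.1f}")
    # With these placeholder parameters the activity switches on during the pulse, persists for a
    # few seconds, and then collapses back to rest as the adaptation builds up: forgetting.

    # Quasi-static view: freeze a1 = a2 = a and find the non-zero symmetric fixed points
    # e = S(w*e, a), i.e. the roots of w^2 e^2 - M w^2 e + (sigma + g*a)^2 = 0.
    for a in [0.0, 10.0, 20.0, 30.0, 40.0, 45.0]:
        disc = (M * w**2)**2 - 4 * w**2 * (sigma + g * a)**2
        if disc >= 0:
            lo = (M * w**2 - disc**0.5) / (2 * w**2)   # unstable node
            hi = (M * w**2 + disc**0.5) / (2 * w**2)   # stable node (the memory state)
            print(f"a = {a:4.1f}: unstable node at e = {lo:5.1f}, stable node at e = {hi:5.1f}")
        else:
            print(f"a = {a:4.1f}: the stable and unstable nodes have merged and vanished")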

Slide 7: Discussion of STM Model
Positive:
- simple account of the behavior of prefrontal neurons in delayed match-to-sample tasks
Limitations:
- provides only a qualitative account
- no notion of interference
- …

Slide 8: Long-Term Memory (associative memory)
Our simplest model neuron so far: the McCulloch-Pitts neuron.
- binary, i.e. two states: -1 (inactive) and +1 (active)
- N neurons connected via weighted connections w_ij that represent different synaptic strengths (positive and negative)
- the next activity is determined by applying a non-linear function to the difference between a unit's weighted sum of inputs and its threshold μ_i: s_i(t+1) = sgn( Σ_j w_ij s_j(t) - μ_i )
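
A minimal sketch of the update rule just described; the weights and thresholds in the example are arbitrary illustrative numbers.

    import numpy as np

    def mp_update(s, W, mu):
        """One update of binary (+1/-1) McCulloch-Pitts units: s_i <- sgn(sum_j W_ij s_j - mu_i)."""
        h = W @ s - mu                       # weighted input sum minus threshold
        return np.where(h >= 0, 1, -1)       # convention here: ties go to +1

    # Tiny illustration with three units and arbitrary weights/thresholds:
    W = np.array([[0.0, 0.5, -0.3],
                  [0.5, 0.0,  0.8],
                  [-0.3, 0.8, 0.0]])
    mu = np.zeros(3)
    s = np.array([1, -1, 1])
    print(mp_update(s, W, mu))               # -> [-1  1 -1]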

Slide 9: The Hopfield Network
A network of McCulloch-Pitts neurons with symmetric all-to-all connections and zero thresholds:
- connection from unit j to unit i: w_ij
- symmetry: w_ij = w_ji
- zero threshold: μ_i = 0
Asynchronous updating: pick a unit at random, apply the update rule for this unit only, then pick the next unit at random and update it, and so on.
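
A sketch of asynchronous updating in such a network; the random symmetric weight matrix here is only a placeholder to make the snippet runnable.

    import numpy as np

    rng = np.random.default_rng(0)

    def async_update(s, W, n_steps=1000):
        """Asynchronous Hopfield dynamics: repeatedly pick one unit at random and update only it."""
        s = s.copy()
        for _ in range(n_steps):
            i = rng.integers(len(s))
            s[i] = 1 if W[i] @ s >= 0 else -1     # zero threshold
        return s

    # Tiny illustration with a random symmetric weight matrix (zero diagonal, zero thresholds):
    N = 8
    A = rng.normal(size=(N, N))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    s0 = rng.choice([-1, 1], size=N)
    print("initial state:", s0)
    print("final state:  ", async_update(s0, W))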

Slide 10
[Figure from Hertz, Krogh, Palmer (1991).] Example: object recognition; each pixel has a corresponding unit; the network exhibits pattern completion.

Slide 11: Storing a single pattern
Idea: an activity pattern (state) ξ is "stored" if it is a fixed point of the update equation, i.e. if the network is in this state and the update rule is applied, the state does not change: sgn( Σ_j w_ij ξ_j ) = ξ_i for all i.
Claim: the pattern is stabilized if the weights are set according to w_ij = (1/N) ξ_i ξ_j.
Proof: with these weights, Σ_j w_ij ξ_j = (1/N) ξ_i Σ_j ξ_j^2 = ξ_i (since ξ_j^2 = 1), and hence sgn( Σ_j w_ij ξ_j ) = ξ_i, so the pattern does not change.
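
A numerical check of the claim, using the one-pattern Hebb rule w_ij = ξ_i ξ_j / N: the stored pattern is a fixed point of the update, and a moderately corrupted version of it is pulled back (pattern completion). The pattern size and the number of flipped bits are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100
    xi = rng.choice([-1, 1], size=N)             # the pattern to be stored

    W = np.outer(xi, xi) / N                     # one-pattern Hebb rule: w_ij = xi_i * xi_j / N
    h = W @ xi                                   # local field: h_i = (1/N) xi_i * sum_j xi_j^2 = xi_i
    print("stored pattern is a fixed point:", np.array_equal(np.where(h >= 0, 1, -1), xi))

    # Pattern completion: corrupt some bits and let a few synchronous updates pull the state back.
    s = xi.copy()
    flipped = rng.choice(N, size=15, replace=False)
    s[flipped] *= -1                             # flip 15 of the 100 bits
    for _ in range(5):
        s = np.where(W @ s >= 0, 1, -1)
    print("corrupted pattern is completed:  ", np.array_equal(s, xi))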

Slide 12: Multiple patterns, storage capacity
Weights (superposition over patterns): w_ij = (1/N) Σ_k ξ_i^k ξ_j^k.
Consider the stability of pattern ξ^p: split the sum over k into two parts, k = p and the rest:
Σ_j w_ij ξ_j^p = ξ_i^p + (1/N) Σ_j Σ_{k≠p} ξ_i^k ξ_j^k ξ_j^p.
The first term alone would mean the pattern is stable. The second term is the so-called crosstalk term; it can make the pattern unstable if it is big, which happens for large P and small N, i.e. many patterns in a small net.
Capacity: P_max ≈ 0.138 N.
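
A sketch that makes the crosstalk term visible: store P random patterns with the superimposed Hebb rule and count how many pattern bits are flipped by a single update. N = 200 and the list of P values are arbitrary choices; note that P_max ≈ 0.138 N refers to the breakdown of retrieval under the full dynamics, not just to these single-step bit flips.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 200

    def fraction_flipped(P):
        """Store P random patterns with w_ij = (1/N) sum_k xi_i^k xi_j^k and count the pattern
        bits whose sign is flipped by the crosstalk term in a single update."""
        xis = rng.choice([-1, 1], size=(P, N))
        W = xis.T @ xis / N                      # superimposed Hebb rule
        np.fill_diagonal(W, 0.0)
        h = xis @ W.T                            # local fields for every bit of every stored pattern
        return np.mean(np.sign(h) != xis)

    for P in [5, 10, 20, 28, 40, 60]:            # 0.138 * N is about 28 here
        print(f"P = {P:2d}: fraction of immediately unstable bits = {fraction_flipped(P):.4f}")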

Slide 13: Energy Function for Hopfield Net
The dynamics of the Hopfield network are governed by a bounded function of the state that decreases over time (an energy function or Lyapunov function): E = -(1/2) Σ_{i,j} w_ij s_i s_j.
Metaphor: an energy landscape. Every unit update can only take us downhill or leave us at the same level; the state slides down to the closest local energy minimum.
Note: this is an inaccurate picture, because the states are not points on a plane but corners of an N-dimensional hypercube.
[Figure: energy plotted over states, showing a stored pattern with its basin of attraction.]
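
A numerical check of the Lyapunov property for the energy function E = -1/2 Σ_ij w_ij s_i s_j: under asynchronous updates the energy never increases (up to floating-point rounding). The stored patterns and the initial state are random placeholders.

    import numpy as np

    rng = np.random.default_rng(3)
    N, P = 100, 5
    xis = rng.choice([-1, 1], size=(P, N))
    W = xis.T @ xis / N                          # Hebb weights for a few random patterns
    np.fill_diagonal(W, 0.0)

    def energy(s):
        """Hopfield energy (Lyapunov) function E = -1/2 * sum_ij w_ij s_i s_j."""
        return -0.5 * s @ W @ s

    s = rng.choice([-1, 1], size=N)              # random initial state
    energies = [energy(s)]
    for _ in range(2000):                        # asynchronous updates, one random unit at a time
        i = rng.integers(N)
        s[i] = 1 if W[i] @ s >= 0 else -1
        energies.append(energy(s))

    print("energy never increases:", bool(np.all(np.diff(energies) <= 1e-9)))
    print("initial vs. final energy:", energies[0], energies[-1])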

Slide 14: Example: phone book
Idea: code text strings by having a number of units for each letter/digit.
[Figures on the slide showed the stored patterns, recall from only a partial pattern, and spurious states.]
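
A toy sketch of the phone-book idea. The slide does not give the encoding, so the details here are assumptions: each character gets a fixed random ±1 codeword of 8 units, two made-up entries are stored with the Hebb rule, and a partial cue (the name only, with the rest left blank) is completed by asynchronous updates. With only two, fairly different entries this typically recovers the full entry; with many similar entries the same procedure can instead settle into a spurious mixture state.

    import numpy as np

    rng = np.random.default_rng(4)
    CHARS = " abcdefghijklmnopqrstuvwxyz0123456789-"
    BITS = 8                                       # units per character (an arbitrary choice)
    CODE = {c: rng.choice([-1, 1], size=BITS) for c in CHARS}   # fixed random codeword per character

    def encode(text, length=16):
        """Turn a string into a +/-1 pattern, BITS units per character position."""
        text = text.ljust(length)[:length]
        return np.concatenate([CODE[c] for c in text])

    def decode(s):
        """Read a pattern back: pick the best-matching codeword at each character position."""
        chars = []
        for p in range(0, len(s), BITS):
            chunk = s[p:p + BITS]
            chars.append(max(CHARS, key=lambda c: CODE[c] @ chunk))
        return "".join(chars)

    # Two toy phone-book entries (made-up names and numbers), stored with the Hebb rule:
    entries = ["alice 834-2201", "brian 371-5588"]
    patterns = np.array([encode(e) for e in entries])
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)

    # Recall from a partial pattern: we remember the name but not the number.
    s = encode("alice")                            # unknown characters default to blanks
    for _ in range(4000):                          # asynchronous updates
        i = rng.integers(N)
        s[i] = 1 if W[i] @ s >= 0 else -1
    print(repr(decode(s)))                         # typically prints the full stored entry, padded with blanks

    # With many similar or conflicting entries, the same procedure can instead end up in a
    # spurious state: a stable mixture that corresponds to none of the stored entries.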

Slide 15: Critique of Hopfield memory
Positive:
- content-addressable memory
- weights can be computed directly; learning is instantaneous
- the distributed architecture results in fault tolerance (graceful degradation) if, e.g., some units or connections are pruned
- many extensions, e.g. for temporal patterns
Negative:
- poor biological realism
- poor generalization (no invariance, can't shape basins of attraction)
- only a qualitative account
Note: more biologically plausible models of specific memory systems (e.g. the CA3 region of the hippocampus) have been proposed.

Slide 16: Conclusions: Neurodynamics
Positive:
- conceptually simple models with only stereotypic connectivity
- a range of "interesting" phenomena, providing a qualitative account of cognitive phenomena
- at the least, metaphors for how a range of things may work
Negative:
- not necessarily easy to scale up
- need many parameters
- no learning

