Memory Network Maintenance Using Spike-Timing Dependent Plasticity
David Jangraw, ELE ’07
Advisor: John Hopfield, Department of Molecular Biology

Figures (except panel titles) from J. J. Hopfield (2006), "Searching for memories, Sudoku, implicit check-bits, and the iterative use of not-always-correct rapid neural computation," arXiv.org, 19 Sep 2006.

Associative Memory
Associative memory refers to learned connections between ideas that one remembers as being associated. Associative memory is not fully understood; here we refine a simplified model of it. Each property of the person or thing being remembered is represented by the activity of one neuron. All neurons active in a memory are connected, so thinking of one thing about a memory (e.g., a name) recalls the others (e.g., height, eye color). We model the activity of these neurons in MATLAB.

In the memory grid figure, each black square represents an active neuron encoding something about a person: for example, if the bottom row of neurons encodes eye color (brown, blue, green, ...) and the "green" neuron is active, the person has green eyes. The figure distinguishes the ideal memory, the drifted memory, and non-memory neuron activity.

Synaptic Drift
Synapse: a connection between two neurons.
Synaptic strength (matrix T): T21 is proportional to the size of the electrical response in neuron 2 evoked by an electrical spike in neuron 1.
Synaptic drift (δT): small, random changes in synaptic strength caused by "noisy" cellular processes.
The problem: normal memories have equal activity in each participating neuron, but synaptic drift causes unequal activity (a corrupted memory) or even memory loss. Can the system correct itself?

Spike-Timing Dependent Plasticity
Spike-timing dependent plasticity (STDP) is an experimentally observed phenomenon by which the relative timing of spikes changes synaptic strength.

Stabilization of Firing
Each neuron is modeled as a parallel RC circuit with a firing threshold: when the membrane voltage exceeds threshold, the cell fires, and the firing frequency f is our measure of activity. If we feed an AC input of the form I = I0 + A sin(2πwt) into a neuron, a large range of DC offsets I0 drives the cell to fire at frequency w, creating a plateau on the f-I curve.

Synchronization
Disconnected model neurons were given slightly different input currents, causing varying delays in spike timing. STDP was applied based on the average delay after 50 spikes of each neuron, and it successfully synchronized the firing of many neurons. The slope of the STDP rule affects the speed and stability of synchronization. (A simulation sketch of this setup appears after the Future Directions section.)

Drift Correction
A simplified memory network was built from continuous-variable neurons that reproduce the spiking neuron's f-I curve and spike-timing patterns. Random synaptic drift (≤5%) was applied to each active connection in a memory, and every drifted memory converged back to an ideal memory. This indicates that spiking neurons could correct for synaptic drift using STDP.

Future Directions
Load multiple, overlapping memories into the network and test performance. Create a converging memory network of spiking neurons. Use a more realistic STDP rule.
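The synchronization experiment lends itself to a compact simulation. The sketch below is a minimal Python reconstruction (the poster's model was written in MATLAB): leaky integrate-and-fire neurons receive a drive I = I0 + A sin(2πwt), spike times are collected, and each neuron's drive is then nudged in proportion to its average spike delay relative to the population, standing in for the STDP-driven change in synaptic strength. All parameter values (tau, R, v_th, A, w, the slope m) and the exact form of the update are illustrative assumptions, not the poster's code.

```python
# Minimal sketch (not the poster's original MATLAB code) of the synchronization
# experiment: leaky integrate-and-fire neurons driven by I(t) = I0 + A*sin(2*pi*w*t),
# with an STDP-like correction applied to each neuron's drive after a block of spikes.
# All parameter values, and the use of a DC-offset update in place of an explicit
# synaptic-strength matrix, are illustrative assumptions.
import numpy as np

def simulate_block(I0, A=0.1, w=20.0, tau=0.02, R=1.0, v_th=1.0,
                   dt=1e-4, t_max=2.0):
    """Integrate leaky RC neurons (tau*dV/dt = -V + R*I) and return spike times."""
    n = len(I0)
    v = np.zeros(n)
    spikes = [[] for _ in range(n)]
    for step in range(int(t_max / dt)):
        t = step * dt
        i_in = I0 + A * np.sin(2 * np.pi * w * t)   # sinusoidal drive per neuron
        v += dt / tau * (-v + R * i_in)             # RC membrane dynamics
        fired = v >= v_th
        for k in np.where(fired)[0]:
            spikes[k].append(t)
        v[fired] = 0.0                               # reset after a spike
    return spikes

def stdp_like_update(I0, spikes, m=0.5, n_spikes=50):
    """Shift each neuron's drive in proportion to its average spike delay
    relative to the population, standing in for the STDP change in T."""
    mean_t = np.array([np.mean(s[:n_spikes]) for s in spikes])  # first 50 spikes
    delay = mean_t - mean_t.mean()                   # positive delay => firing late
    return I0 + m * delay                            # late neurons get more drive

# Two neurons with slightly different DC offsets, as in the poster.
I0 = np.array([1.50, 1.55])
for it in range(10):
    spikes = simulate_block(I0)
    mean_t = [np.mean(s[:50]) for s in spikes]
    print(f"iteration {it}: spike-time difference = {mean_t[1] - mean_t[0]:+.4f} s")
    I0 = stdp_like_update(I0, spikes)
```

With these (assumed) parameters the printed spike-time difference should shrink across iterations, mirroring the synchronization the poster reports; raising the slope m too far would instead produce the oscillation and instability noted in the stability analysis below.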
Stability Analysis
A small positive change in synaptic strength (synaptic drift, +δT) produces a proportional positive change in the input current (+δI) to the receiving neuron. This makes the neuron fire slightly earlier than the neurons connected to it (-δt). STDP converts this spike-timing advance into a negative change in synaptic strength (-δT), which in turn produces a proportional negative change in input current (-δI) that can stabilize the memory.

This argument rests on four observations: changes in input current (while still on the f-I plateau) lead to changes in spike timing; changes in spike timing are converted into proportional changes in synaptic strength; spike pairs separated by too long an interval are ignored; and AC input produces the plateau on the neuron's f-I curve.

Two neurons with slightly different input currents were synchronized using STDP rules with different slopes m. Low m produced slow convergence, high m produced damped oscillation, and extremely high m (not shown) produced instability. (A linearized sketch of this loop follows.)
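The loop can also be written as a one-line linear recursion, which makes the dependence on the STDP slope m explicit. In the sketch below, c1 and c2 are hypothetical gains (current per unit synaptic strength, timing advance per unit current) introduced only for illustration; with them, a drift perturbation δT is multiplied by (1 - m*c1*c2) on every STDP update.

```python
# Linearized sketch of the stability loop described above: a synaptic-drift
# perturbation dT changes input current (dI = c1*dT), which advances spike
# timing (dt = -c2*dI), which STDP converts back into a weight change
# (dT_stdp = m*dt).  The constants c1, c2 and the values of m are illustrative
# assumptions, not fitted to the poster's simulations.
c1, c2 = 1.0, 1.0          # current-per-weight and timing-advance-per-current gains

def drift_relaxation(dT0, m, n_iter=12):
    """Return the perturbation dT after each STDP update; decays if 0 < m*c1*c2 < 2."""
    dT, history = dT0, []
    for _ in range(n_iter):
        dI = c1 * dT           # +dT -> +dI to the receiving neuron
        dt = -c2 * dI          # more drive -> the neuron fires earlier (-dt)
        dT += m * dt           # STDP turns the timing advance into -dT
        history.append(dT)
    return history

for m in (0.2, 1.8, 2.3):      # low slope, high slope, unstable slope
    print(f"m = {m}: {[round(x, 3) for x in drift_relaxation(0.05, m)]}")
```

For m*c1*c2 between 0 and 1 the perturbation decays monotonically (slow convergence at small m), between 1 and 2 it decays with alternating sign (damped oscillation), and above 2 it grows without bound (instability), matching the three regimes observed in the synchronization experiment.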