Biologically-Inspired Neural Nets Modeling the Hippocampus

Hippocampus 101
- In 1957, Scoville and Milner reported on patient HM.
- Since then, numerous studies have used fMRI and PET scans to demonstrate use of the hippocampus during learning and recall.
- Numerous rat studies that monitor individual neurons demonstrate the existence of place cells.
- Generally, the hippocampus is associated with intermediate-term memory (ITM).

Hippocampus 101
- In 1994, Wilson and McNaughton demonstrated that sharp-wave bursts (SPW) during sleep are time-compressed replays of sequences learned earlier.
- Levy hypothesizes that the hippocampus teaches learned sequences to the neocortex as part of a biased random process.
- Levy also hypothesizes that erasure/bias demotion happens when the neocortex signals to the hippocampus that the sequence has been acquired, probably during slow-wave sleep (SWS).

Cornu Ammonis
- The most significant feature in the hippocampus is the Cornu Ammonis (CA).
- Most work in the Levy Lab focuses specifically on the CA3 region, although recently we’ve started re-examining the CA1 region as well.

Minimal Model
CA3 recurrent activity

Typical Equations: Definitions
- y_j: net excitation of neuron j
- x_j: external input to j
- z_j: output state of j
- θ: threshold to fire
- K_I: feedforward inhibition
- K_R: feedback inhibition
- K_0: resting conductance
- c_ij: connectivity from i to j
- w_ij: weight between i and j
- ε: rate constant of synaptic modification
- α: spike decay rate
- t: time
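The equations themselves appear only as images on the slide and are not reproduced in this transcript. The block below is a plausible reconstruction, assuming the shunting-inhibition activation rule and local associative synaptic rule typical of Levy-style CA3 models; the exact forms used in the lab's simulations may differ.

```latex
% Plausible reconstruction (an assumption), built from the definitions above.
% Net excitation of neuron j: recurrent drive divided by drive plus inhibition.
y_j(t) = \frac{\sum_i c_{ij}\, w_{ij}\, z_i(t-1)}
              {\sum_i c_{ij}\, w_{ij}\, z_i(t-1) + K_R \sum_i z_i(t-1) + K_I \sum_i x_i(t) + K_0}

% Output state: fire on crossing threshold, or when driven by external input.
z_j(t) = \begin{cases} 1 & \text{if } y_j(t) \ge \theta \ \text{or}\ x_j(t) = 1 \\ 0 & \text{otherwise} \end{cases}

% Local associative modification, gated by postsynaptic firing; \bar{z}_i is a
% presynaptic trace decaying at rate \alpha (the "spike decay rate" above).
\Delta w_{ij} = \varepsilon \, z_j(t) \bigl( \bar{z}_i(t-1) - w_{ij}(t) \bigr)
```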

Fundamental Properties
- Neurons are McCulloch-Pitts-type threshold elements.
- Synapses modify associatively via a local Hebbian-type rule.
- Most connections are excitatory.
- Recurrent excitation is sparse, asymmetric, and randomly connected.
- Inhibitory neurons approximately control net activity.
- In CA3, recurrent excitation contributes more to activity than external excitation.
- Activity is low, but not too low.
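Below is a minimal sketch in Python/NumPy of a network with these properties. This is illustrative code, not the Levy Lab's simulator: the class name MinimalCA3, the shunting-inhibition update, and all default parameter values are assumptions chosen to match the definitions and properties above.

```python
import numpy as np

class MinimalCA3:
    """Illustrative McCulloch-Pitts network: sparse random excitatory recurrence,
    divisive (shunting) inhibition, and a local Hebbian-type learning rule."""

    def __init__(self, n=1024, connectivity=0.10, theta=0.5,
                 k_r=0.05, k_i=0.02, k_0=0.5, epsilon=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.n, self.theta, self.epsilon = n, theta, epsilon
        self.k_r, self.k_i, self.k_0 = k_r, k_i, k_0
        # c_ij: sparse, asymmetric, random excitatory connectivity (no self-connections)
        self.c = (rng.random((n, n)) < connectivity).astype(float)
        np.fill_diagonal(self.c, 0.0)
        # w_ij: modifiable weights, nonzero only where a connection exists
        self.w = 0.4 * self.c

    def step(self, z_prev, x):
        """One time step: recurrent excitation, divisive inhibition, threshold."""
        excite = self.w.T @ z_prev                               # drive onto each neuron j
        inhibit = self.k_r * z_prev.sum() + self.k_i * x.sum() + self.k_0
        y = excite / (excite + inhibit)                          # net excitation y_j
        return np.where((y >= self.theta) | (x > 0), 1.0, 0.0)  # external input forces firing

    def learn(self, z_prev, z):
        """Local Hebbian-type rule: when j fires, w_ij moves toward z_i(t-1)."""
        self.w += self.epsilon * self.c * z[None, :] * (z_prev[:, None] - self.w)

    def train_sequence(self, patterns, passes=20):
        """Repeatedly present a sequence of external input patterns, learning as we go."""
        for _ in range(passes):
            z = np.zeros(self.n)
            for x in patterns:
                z_next = self.step(z, x)
                self.learn(z, z_next)
                z = z_next

    def recall(self, cue, steps=10):
        """Cue with one external pattern, then let recurrent activity run on its own."""
        zero = np.zeros(self.n)
        z = self.step(zero, cue)
        trajectory = [z]
        for _ in range(steps):
            z = self.step(z, zero)
            trajectory.append(z)
        return trajectory
```

Whether such a toy version keeps activity "low, but not too low" and reproduces the behaviors on the later slides depends entirely on parameter tuning; the point here is the architecture, not the numbers.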

Model Variables

Functional:
1. Average activity
2. Activity fluctuations
3. Sequence length memory capacity
4. Average lifetime of local context neurons
5. Speed of learning
6. Ratio of external to recurrent excitations

Actual:
1. Number of neurons
2. Percent connectivity
3. Time span of synaptic associations
4. Threshold to fire
5. Feedback inhibition weight constant
6. Feedforward inhibition weight constant
7. Resting conductance
8. Rate constant of synaptic modification
9. Input code
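In the sketch above, each of the "actual" variables maps onto a constructor argument. The values below are hypothetical, chosen only for illustration; the slide does not give the values used in the lab's simulations.

```python
# Hypothetical settings for the "actual" variables (illustration only).
net = MinimalCA3(
    n=1024,             # number of neurons
    connectivity=0.10,  # percent connectivity
    theta=0.5,          # threshold to fire
    k_r=0.05,           # feedback inhibition weight constant
    k_i=0.02,           # feedforward inhibition weight constant
    k_0=0.5,            # resting conductance
    epsilon=0.05,       # rate constant of synaptic modification
    seed=0,
)
# The "time span of synaptic associations" is fixed at one time step in this
# sketch, and the "input code" is set by how external patterns are built
# (see the sequence-completion sketch below).
```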

Eleven Problems
1. Simple sequence completion
2. Spontaneous rebroadcast
3. One-trial learning
4. Jump-ahead recall
5. Sequence disambiguation (context past)
6. Finding a shortcut
7. Goal finding (context future)
8. Combining appropriate subsequences
9. Transverse patterning
10. Transitive inference
11. Trace conditioning

Sequence Completion
- Train on sequence ABCDEFG.
- Provide input A.
- Network recalls BCDEFG.
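A sketch of this protocol, reusing the MinimalCA3 class and the net instance from the earlier sketches. The make_codes and decode helpers (the "input code") are assumptions for illustration; a toy network this small is not guaranteed to recall the full BCDEFG tail without tuning.

```python
def make_codes(n, letters, active=20):
    """Assign each letter its own disjoint block of externally driven neurons."""
    codes = {}
    for k, letter in enumerate(letters):
        x = np.zeros(n)
        x[k * active:(k + 1) * active] = 1.0
        codes[letter] = x
    return codes

def decode(z, codes, fraction=0.5):
    """Report which letters' coding neurons are mostly active in state z."""
    return [c for c, x in codes.items() if z[x > 0].mean() > fraction]

codes = make_codes(net.n, "ABCDEFG")
net.train_sequence([codes[c] for c in "ABCDEFG"], passes=30)

# Cue with A alone and let recurrent activity run: ideally B, C, D, ... reappear in order.
for t, z in enumerate(net.recall(codes["A"], steps=8)):
    print(t, decode(z, codes))
```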

Rebroadcast
- Train network on one or more sequences.
- Provide random input patterns.
- All or part of one of the trained sequences is recalled.
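Continuing the same toy sketch, a rebroadcast probe simply replaces the trained cue with a random external pattern and checks whether the evoked trajectory falls back onto part of a stored sequence.

```python
# Random, untrained cue at roughly the same sparsity as the letter codes (illustrative).
rng = np.random.default_rng(7)
random_cue = (rng.random(net.n) < 0.02).astype(float)
for t, z in enumerate(net.recall(random_cue, steps=8)):
    print(t, decode(z, codes))   # does any stored subsequence reappear?
```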

One-trial learning
- Requires a high rate of synaptic modification (large ε).
- Does not use the same parameters as the other problems.
- Models short-term memory (STM) instead of intermediate-term memory (ITM, the hippocampus).

Jump-ahead recall
- With adjusted inhibition, sequence completion can be short-circuited.
- Train network on ABCDEFG.
- Provide A.
- Network recalls G, or possibly BDG, etc.
- Inhibition in the hippocampus does vary.

Disambiguation
- Train network on patterns ABC456GHI and abc456ghi.
- Present pattern A to the network.
- Network recalls BC456GHI.
- Requires patterns 4, 5, and 6 to be coded differently depending on past context.
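The training/testing protocol above, sketched with the same toy class and helpers. The two sequences share the external codes for 4, 5, and 6; whether local-context neurons actually emerge to disambiguate them depends on network size, activity level, and training beyond what this small illustration guarantees.

```python
net2 = MinimalCA3(n=2048, seed=2)
codes2 = make_codes(net2.n, "ABC456GHIabcghi")
upper = [codes2[ch] for ch in "ABC456GHI"]
lower = [codes2[ch] for ch in "abc456ghi"]
for _ in range(30):                        # interleave training of the two sequences
    net2.train_sequence(upper, passes=1)
    net2.train_sequence(lower, passes=1)
# Cue with uppercase A: does recall come out of the shared 456 segment as GHI, not ghi?
for t, z in enumerate(net2.recall(codes2["A"], steps=10)):
    print(t, decode(z, codes2))
```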

Shortcuts
- Train network on pattern ABC456GHIJKL456PQR.
- Present pattern A to the network.
- Network recalls BC456PQR.
- Uses the common neurons of patterns 4, 5, and 6 to generate a shortcut.

Goal Finding
- Train network on pattern ABC456GHIJKL456PQR.
- Present pattern A and part of pattern K to the network.
- Network recalls BC456GHIJK…
- Requires use of context future.

Combinations
- Train network on patterns ABC456GHI and abc456ghi.
- Present pattern A and part of pattern i to the network.
- Network recalls BC456ghi.
- Also requires use of context future.

Transverse Patterning
- Similar to rock, paper, scissors.
- Train network on sequences [AB]a+, [AB]b-, [BC]b+, [BC]c-, [AC]c+, [AC]a-.
- Present [AB] and part of + to the network, and the network will generate a.
- Present [BC] and part of + to the network, and the network will generate b.
- Present [AC] and part of + to the network, and the network will generate c.

Transitive Inference
- Transitivity: if A>B and B>C, then A>C.
- Train network on [AB]a+, [AB]b-, [BC]b+, [BC]c-, [CD]c+, [CD]d-, [DE]d+, [DE]e-.
- Present [BD] and part of + to the network, and it will generate b.

Trace Conditioning
- Train network on sequence A……B.
- Vary the amount of time between presentation of pattern A and pattern B.
- Computational results match experimental results on trace conditioning in rabbits.

Important Recent Discoveries
- Addition of a random “starting pattern” improves performance of the network.
- Synaptic failures improve performance (and reduce energy requirements).
- Addition of a CA1 decoder improves performance.