CS 416 Artificial Intelligence
Lecture 20: Biologically-Inspired Neural Nets, Modeling the Hippocampus

Hippocampus 101

In 1957, Scoville and Milner reported on patient HM. Since then, numerous studies have used fMRI and PET scans to demonstrate use of the hippocampus during learning and recall, and numerous rat studies monitoring individual neurons demonstrate the existence of place cells. Generally, the hippocampus is associated with intermediate-term memory (ITM).

HM had large regions of his hippocampus and other temporal-lobe structures removed bilaterally as a treatment for epilepsy. He consequently suffered from both retrograde and anterograde amnesia. The retrograde amnesia was graded with respect to time, with no noticeable loss for events more than about ten years before the surgery. (His IQ actually went up a couple of points, though within the test's margin of error.)

Hippocampus 101

In 1994, Wilson and McNaughton demonstrated that sharp-wave bursts (SPWs) during sleep are time-compressed replays of sequences learned earlier; Buzsáki reported similar results in 1996. Levy hypothesizes that the hippocampus teaches learned sequences to the neocortex as part of a biased random process. Levy also hypothesizes that erasure/bias demotion happens when the neocortex signals to the hippocampus that the sequence has been acquired, probably during slow-wave sleep (SWS). These hypotheses may also relate to PTSD.

Cornu Ammonis

The most significant structure in the hippocampus is the Cornu Ammonis (CA). Most work in the Levy Lab focuses specifically on the CA3 region, although recently we have started re-examining the CA1 region as well. (Ammon, "the hidden one," was the supreme divinity of the Egyptian pantheon.)

Minimal Model

The minimal model captures CA3 recurrent activity. The hippocampus receives highly processed, abstract information. Other important areas of the brain include the visual, auditory, and motor cortices; Broca's and Wernicke's regions; and multimodal cortices such as Brodmann's area 46/9, the prefrontal cortex. The hippocampus is strongly interconnected with the brain's multimodal regions. It is thought to be a random recoder, activated more strongly when novelty is encountered, whose purpose is to associate things we cannot otherwise associate. High dimensionality is key.

Typical Equations

Definitions:
  y_j    net excitation of neuron j
  x_j    external input to neuron j
  z_j    output state of neuron j
  θ      threshold to fire
  K_I    feedforward inhibition
  K_R    feedback inhibition
  K_0    resting conductance
  c_ij   connectivity from i to j
  w_ij   weight between i and j
  ε      rate constant of synaptic modification
  α      spike decay rate
  t      time

Memory is stored in the weights; neurons have binary (0/1) outputs.
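The equations themselves did not survive the transcript. The following LaTeX is a reconstruction from the definitions above, patterned on published descriptions of Levy-style CA3 models; the exact form used in the lecture is an assumption.

```latex
% Net excitation of neuron j: recurrent drive, divisively normalized by
% feedback inhibition, feedforward inhibition, and the resting conductance.
\[
y_j(t) \;=\;
\frac{\sum_i w_{ij}\, c_{ij}\, z_i(t-1)}
     {\sum_i w_{ij}\, c_{ij}\, z_i(t-1) \;+\; K_R \sum_i z_i(t-1)
      \;+\; K_I \sum_i x_i(t) \;+\; K_0}
\]
% Binary output: fire on crossing threshold, or when forced by external input.
\[
z_j(t) \;=\;
\begin{cases}
  1 & \text{if } y_j(t) \ge \theta \text{ or } x_j(t) = 1,\\
  0 & \text{otherwise.}
\end{cases}
\]
% Local associative modification, with a presynaptic trace \bar{z}_i whose
% decay is set by the spike decay rate \alpha.
\[
w_{ij}(t+1) \;=\; w_{ij}(t) \;+\; \varepsilon\, z_j(t)\,
  \bigl(\bar{z}_i(t) - w_{ij}(t)\bigr),
\qquad
\bar{z}_i(t) \;=\; \alpha\, \bar{z}_i(t-1) + (1-\alpha)\, z_i(t-1).
\]
```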

Fundamental Properties

Neurons are McCulloch-Pitts-type threshold elements. Synapses modify associatively, via a local Hebbian-type rule. Most connections are excitatory; recurrent excitation is sparse, asymmetric, and randomly connected, while inhibitory neurons approximately control net activity. In CA3, recurrent excitation contributes more to activity than external excitation does. Activity is low, but not too low. (A toy implementation of these properties is sketched below.)
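Here is a minimal sketch of those properties in code. It is a toy under my own assumptions (parameter values, synchronous updates, and the simplification that forced external input stands in for feedforward inhibition), not the Levy Lab's actual implementation.

```python
# Toy Levy-style CA3 net: binary McCulloch-Pitts units, sparse asymmetric
# random excitatory recurrence, divisive inhibition, local Hebbian learning.
import numpy as np

rng = np.random.default_rng(0)

N = 140         # number of neurons (assumed)
P_CONN = 0.10   # percent connectivity: sparse, asymmetric, random
THETA = 0.5     # threshold to fire
K_R = 0.05      # feedback-inhibition weight constant
K_0 = 0.5       # resting conductance
EPS = 0.2       # rate constant of synaptic modification

c = (rng.random((N, N)) < P_CONN).astype(float)  # fixed 0/1 connectivity c_ij
np.fill_diagonal(c, 0.0)                         # no self-connections
w = 0.4 * np.ones((N, N))                        # modifiable weights w_ij

def step(z_prev, x, learn=True):
    """One synchronous update; z_prev and x are 0/1 vectors of length N."""
    drive = (w * c).T @ z_prev                      # sum_i w_ij c_ij z_i(t-1)
    y = drive / (drive + K_R * z_prev.sum() + K_0)  # divisive inhibition
    z = ((y >= THETA) | (x > 0)).astype(float)      # threshold or forced input
    if learn:
        # Local Hebbian-type rule: when j fires, pull w_ij toward z_i(t-1).
        w[:] = w + EPS * z[None, :] * (z_prev[:, None] - w) * c
    return z
```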

Model Variables

Functional:
  average activity
  activity fluctuations
  sequence-length memory capacity
  average lifetime of local context neurons
  speed of learning
  ratio of external to recurrent excitation

Actual:
  number of neurons
  percent connectivity
  time span of synaptic associations
  threshold to fire
  feedback inhibition weight constant
  feedforward inhibition weight constant
  resting conductance
  rate constant of synaptic modification
  input code
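For concreteness, the "actual" variables fit naturally into a single configuration object; the field names and default values below are my own, not the lab's.

```python
from dataclasses import dataclass

@dataclass
class CA3Config:
    n_neurons: int = 140            # number of neurons
    pct_connectivity: float = 0.10  # percent connectivity
    assoc_time_span: int = 1        # time span of synaptic associations (steps)
    theta: float = 0.5              # threshold to fire
    k_r: float = 0.05               # feedback inhibition weight constant
    k_i: float = 0.05               # feedforward inhibition weight constant
    k_0: float = 0.5                # resting conductance
    eps: float = 0.2                # rate constant of synaptic modification
    input_code: str = "disjoint"    # how external patterns are encoded
```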

Eleven Problems

1. Simple sequence completion
2. Spontaneous rebroadcast
3. One-trial learning
4. Jump-ahead recall
5. Sequence disambiguation (context past)
6. Finding a shortcut
7. Goal finding (context future)
8. Combining appropriate subsequences
9. Transverse patterning
10. Transitive inference
11. Trace conditioning

Sequence Completion

Train on the sequence ABCDEFG. Provide input A; the network recalls BCDEFG. (A hypothetical driver for the toy net follows.)
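Using the toy sketch from "Fundamental Properties", the protocol looks like the following. The disjoint-block letter coding is my assumption, and an untuned toy is not guaranteed to recall as cleanly as the lab's model.

```python
# Teach the toy net ABCDEFG, cue with A, then read out free-running recall.
letters = "ABCDEFG"
blk = N // len(letters)
pat = {ch: np.zeros(N) for ch in letters}
for k, ch in enumerate(letters):           # disjoint neuron block per letter
    pat[ch][k * blk:(k + 1) * blk] = 1.0

for epoch in range(25):                    # training: clamp the sequence
    z = np.zeros(N)
    for ch in letters:
        z = step(z, pat[ch], learn=True)

z = step(np.zeros(N), pat["A"], learn=False)         # provide input A only
for t in range(6):                                   # recall: no external input
    z = step(z, np.zeros(N), learn=False)
    print(max(letters, key=lambda ch: z @ pat[ch]))  # ideally B C D E F G
```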

Rebroadcast

Train the network on one or more sequences. Then provide random input patterns: all or part of one of the trained sequences is recalled.

One-Trial Learning

Requires a high synaptic modification rate, and does not use the same parameters as the other problems. This models short-term memory (STM) rather than the intermediate-term memory (ITM) associated with the hippocampus.

Jump-Ahead Recall

With adjusted inhibition, sequence completion can be short-circuited. Train the network on ABCDEFG and provide A; the network recalls G, or possibly B, D, G, and so on. Inhibition in the hippocampus does vary.

Disambiguation

Train the network on the patterns ABC456GHI and abc456ghi. Present pattern A to the network; the network recalls BC456GHI. This requires patterns 4, 5, and 6 to be coded differently depending on past context. (A hypothetical test script follows.)
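The toy protocol extends to this test as follows. In the full model, learned "local context neurons" are what keep the shared 456 segment context-dependent; the untuned toy may well fail where the real model succeeds, so treat this only as an illustration of the experiment.

```python
# Two sequences share the middle "456"; train on both, then cue with "A".
seqs = [list("ABC456GHI"), list("abc456ghi")]
symbols = sorted({ch for s in seqs for ch in s})
blk = N // len(symbols)
pat = {ch: np.zeros(N) for ch in symbols}
for k, ch in enumerate(symbols):               # disjoint block per symbol
    pat[ch][k * blk:(k + 1) * blk] = 1.0

for epoch in range(25):
    for seq in seqs:
        z = np.zeros(N)
        for ch in seq:
            z = step(z, pat[ch], learn=True)

z = step(np.zeros(N), pat["A"], learn=False)   # uppercase cue
for t in range(8):
    z = step(z, np.zeros(N), learn=False)
    print(max(symbols, key=lambda ch: z @ pat[ch]))  # want: B C 4 5 6 G H I
```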

Shortcuts

Train the network on the pattern ABC456GHIJKL456PQR. Present pattern A to the network; the network recalls BC456PQR. The common neurons of patterns 4, 5, and 6 are used to generate the shortcut.

Goal Finding

Train the network on the pattern ABC456GHIJKL456PQR. Present pattern A and part of pattern K to the network; the network recalls BC456GHIJK... This requires use of context future.

Combinations

Train the network on the patterns ABC456GHI and abc456ghi. Present pattern A and part of pattern i to the network; the network recalls BC456ghi. This also requires use of context future.

Transverse Patterning

Similar to rock-paper-scissors. Train the network on the sequences [AB]a+, [AB]b-, [BC]b+, [BC]c-, [AC]c+, [AC]a-. Present [AB] and part of + to the network, and it will generate a; present [BC] and part of + and it will generate b; present [AC] and part of + and it will generate c. (A sketch of the trial encoding follows.)
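In the toy coding, a bracketed compound such as [AB] can be read as the union of the A and B patterns, with + and - as patterns of their own. Both that reading and the trial structure below are my assumptions.

```python
# Hypothetical transverse-patterning training for the toy net.
symbols = list("ABCabc") + ["+", "-"]
blk = N // len(symbols)
pat = {ch: np.zeros(N) for ch in symbols}
for k, ch in enumerate(symbols):               # disjoint block per symbol
    pat[ch][k * blk:(k + 1) * blk] = 1.0

def union(chars):                              # compound stimulus, e.g. [AB]
    return np.maximum.reduce([pat[ch] for ch in chars])

trials = [("AB", "a", "+"), ("AB", "b", "-"),  # a beats b,
          ("BC", "b", "+"), ("BC", "c", "-"),  # b beats c,
          ("AC", "c", "+"), ("AC", "a", "-")]  # and c beats a

for epoch in range(25):                        # stimulus -> choice -> outcome
    for stim, choice, outcome in trials:
        z = step(np.zeros(N), union(stim), learn=True)
        z = step(z, pat[choice], learn=True)
        z = step(z, pat[outcome], learn=True)

# Test: present union("AB") plus part of pat["+"] as external input, then
# check whether pat["a"] or pat["b"] dominates the recalled activity.
```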

Transitive Inference

Transitivity: if A > B and B > C, then A > C. Train the network on [AB]a+, [AB]b-, [BC]b+, [BC]c-, [CD]c+, [CD]d-, [DE]d+, [DE]e-. Present [BD] and part of + to the network, and it will generate b.

Trace Conditioning

Train the network on the sequence A......B, varying the amount of time between the presentation of pattern A and pattern B. Computational results match experimental results on trace conditioning in rabbits.

Important Recent Discoveries

Adding a random "starting pattern" improves network performance. Synaptic failures improve performance (and reduce energy requirements). Adding a CA1 decoder improves performance.