Connectionist Modelling Summer School Lecture Two.


The Mapping Principle
Patterns of Activity
–An input pattern is transformed to an output pattern.
Activation States are Vectors
–Each pattern of activity can be considered a unique point in a space of states. The activation vector identifies this point in space. [Figure: points located by their x, y, z coordinates in activation space]
Mapping Functions
–T = F(S): the network maps a source space S (the network inputs) to a target space T (the outputs). The mapping function F is most likely complex. No simple mathematical formula can capture it explicitly. [Figure: source space S mapped onto target space T]
Hyperspace
–Input states generally have a high dimensionality. Most network states are therefore considered to populate HyperSpace.
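A minimal Python sketch of the mapping idea (the weight matrix, input pattern, and logistic squashing function here are invented for illustration, not taken from the lecture): an input activation vector, a point in the source space S, is transformed into an output activation vector, a point in the target space T.

    import numpy as np

    def logistic(x):
        # Element-wise squashing function; keeps activations between 0 and 1.
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical weights: rows index output units, columns index input units.
    W = np.array([[0.5, -0.3, 0.8],
                  [0.2,  0.7, -0.6]])

    s = np.array([1.0, 0.0, 0.5])   # an input pattern: a point in the 3-D source space S
    t = logistic(W @ s)             # the mapped output pattern: a point in the 2-D target space T
    print(t)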

The Principle of Superposition
[Figure: two weight matrices are added together to form a single composite matrix]
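A minimal sketch of superposition (the patterns below are invented and chosen to be orthogonal): each association is stored as its own Hebbian outer-product matrix, the composite matrix is simply their sum, and each stored output can still be retrieved from the composite.

    import numpy as np

    # Two input-output pairs; the input patterns are orthogonal.
    in1, out1 = np.array([1.0, 0.0]), np.array([0.0, 1.0, 1.0])
    in2, out2 = np.array([0.0, 1.0]), np.array([1.0, 0.0, 1.0])

    # One association matrix per pair (outer product of output and input).
    M1 = np.outer(out1, in1)
    M2 = np.outer(out2, in2)

    # Superposition: the composite matrix is the sum of the individual matrices.
    composite = M1 + M2

    print(composite @ in1)   # [0. 1. 1.] -> recovers out1
    print(composite @ in2)   # [1. 0. 1.] -> recovers out2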

Hebbian Learning
Cellular Association
–“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” (Hebb 1949, p. 50)
Learning Connections
–Take the product of the excitation of the two cells and change the value of the connection in proportion to this product. [Figure: a connection with weight w linking an input unit a_in to an output unit a_out]
The Learning Rule
–Δw = ε × a_in × a_out, where ε is the learning rate.
Changing Connections
–If a_in = 0.5, a_out = 0.75, and ε = 0.5, then Δw = 0.5 × 0.75 × 0.5 = 0.1875. And if w_start = 0.0, then w_next = 0.1875.
Calculating Correlations
–[Table of input and output activation values, not recoverable from the transcript]
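The rule and the worked example above, as a short Python sketch (the values are taken directly from the slide):

    def hebb_update(a_in, a_out, epsilon):
        # Hebb rule: the weight change is the product of the two activations,
        # scaled by the learning rate epsilon.
        return epsilon * a_in * a_out

    delta_w = hebb_update(a_in=0.5, a_out=0.75, epsilon=0.5)
    w_next = 0.0 + delta_w          # w_start = 0.0
    print(delta_w, w_next)          # 0.1875 0.1875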

Nature of mental representation
Mind as a physical symbol system
–Software/hardware distinction
–Symbol manipulation by rule-governed processes
[Figure: phrase-structure tree (S, NP, VP, V, Art, N) for the sentence “The boy broke the vase”]

Nature of mental representation
Mind as a parallel distributed processing system
–Representations are coded in connections and node activities

Evidence for rules
Regular and Irregular Morphology
–talk => talked
–ram => rammed
–pit => pitted
–hit => hit
–come => came
–sleep => slept
–go => went

Evidence for rules
Errors in performance
–hitted
–sleeped
–goed, wented
U-shaped development
Recovery from errors

Evidence for rules
Rote Learning Processes
–Initial error-free performance
Rule Extraction and Application
–Overgeneralisation errors
–Speedy learning of new regular past tense forms
Rote plus Rule
–Continued application of regularisation process
–Recovery from regularisation of irregulars

Models of English past tense
Dual mechanism account
–Rule-governed component deals with regular mappings
–Separate listing of exceptions
–Blocking principle
–Imperfect retrieval of irregular past tense representations results in overgeneralisation
–Pinker & Prince 1988
[Figure: the input stem feeds both an Exceptions route and a Rule route, which produce the output inflection]
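A minimal sketch of the dual-route idea (the exception list and the retrieval-success probability are invented for illustration, and the rule component is simplified to bare “-ed” suffixation): a successfully retrieved exception blocks the rule; when retrieval of an irregular fails, the rule applies anyway and produces an overgeneralisation such as “goed”.

    import random

    # Hypothetical listing of exceptions (irregular stem -> past form).
    EXCEPTIONS = {"hit": "hit", "come": "came", "sleep": "slept", "go": "went"}

    def past_tense(stem, p_retrieve=0.9):
        # Blocking principle: a retrieved exception pre-empts the rule.
        if stem in EXCEPTIONS and random.random() < p_retrieve:
            return EXCEPTIONS[stem]
        # Rule-governed component: add the regular -ed inflection.
        return stem + "ed"

    print(past_tense("talk"))   # 'talked' (rule)
    print(past_tense("go"))     # usually 'went'; occasionally 'goed' (overgeneralisation)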

Models of English past tense
PDP accounts
–Single homogeneous architecture
–Superposition
–Competition between different verb types results in overregularisation and irregularisation
–Vocabulary discontinuity
–Rumelhart & McClelland 1986
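A minimal sketch of the single-mechanism idea (the distributed codings below are invented and tiny; the actual Rumelhart & McClelland model used Wickelfeature representations and a perceptron-style learning rule): regular and irregular stem-to-past mappings are stored superposed in one weight matrix, trained with a simple error-correcting rule, so the different verb types compete for the same connections.

    import numpy as np

    # Invented distributed codings: (stem pattern, past-tense pattern).
    patterns = {
        "talk": (np.array([1., 0., 1., 0.]), np.array([1., 0., 1., 0., 1.])),  # regular
        "go":   (np.array([0., 1., 0., 1.]), np.array([0., 1., 0., 1., 0.])),  # irregular
    }

    W = np.zeros((5, 4))   # one homogeneous weight matrix shared by all verbs
    lr = 0.2

    for _ in range(50):    # error-correcting (delta rule) training
        for stem, past in patterns.values():
            W += lr * np.outer(past - W @ stem, stem)

    for verb, (stem, past) in patterns.items():
        print(verb, np.round(W @ stem, 2))   # both mappings read out of the same weights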