Slide 1: Connectionist Modelling Summer School, Lecture Two
Slide 2: The Mapping Principle

Patterns of Activity: an input pattern is transformed into an output pattern.
Activation States are Vectors: each pattern of activity can be considered a unique point in a space of states; the activation vector identifies this point in space.
Mapping Functions, T = F(S): the network maps a source space S (the network inputs) onto a target space T (the outputs). The mapping function F is most likely complex; no simple mathematical formula can capture it explicitly.
Hyperspace: input states generally have a high dimensionality, so most network states are considered to populate hyperspace.
[Figure: input and output activation vectors shown as points in three-dimensional (x, y, z) state spaces S and T.]
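As an illustration of the mapping idea, here is a minimal sketch (not from the lecture) of a single-layer network mapping a 3-dimensional input vector onto a 2-dimensional output vector; the weight values and the logistic squashing function are illustrative assumptions.

```python
import numpy as np

# A source pattern S (the network input) is a point in input space.
s = np.array([0.9, 0.1, 0.6])            # 3-dimensional input activation vector

# The mapping is carried by the connection weights (arbitrary example values).
W = np.array([[ 0.5, -0.3,  0.8],
              [-0.2,  0.7,  0.1]])       # 2 output units x 3 input units

def logistic(x):
    """Squash net input into the (0, 1) activation range."""
    return 1.0 / (1.0 + np.exp(-x))

# T = F(S): the network maps the input vector onto an output vector,
# i.e. onto a point in the (here 2-dimensional) target space.
t = logistic(W @ s)
print(t)                                  # approximately [0.71 0.49]
```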
Slide 3: The Principle of Superposition

[Figure: two weight matrices (Matrix 1, with entries of ±1 and ±0.25; Matrix 2, with entries of 0.0, ±0.5 and 0.9) and their Composite Matrix.]
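The principle the slide illustrates can be sketched directly: two pattern associations stored as outer-product (Hebbian) weight matrices are superimposed by simple addition, and, for orthogonal input patterns, both associations can still be retrieved from the composite matrix. The pattern vectors below are illustrative, not the values shown on the slide.

```python
import numpy as np

# Two input/output pattern pairs (illustrative +/-1 vectors).
in1, out1 = np.array([ 1, -1,  1, -1]), np.array([ 1,  1, -1])
in2, out2 = np.array([ 1,  1, -1, -1]), np.array([-1,  1,  1])

# Each association is stored as an outer-product (Hebbian) weight matrix.
W1 = np.outer(out1, in1) / len(in1)
W2 = np.outer(out2, in2) / len(in2)

# Superposition: the composite matrix is the element-wise sum of the two.
W = W1 + W2

# Both associations are still retrievable from the single composite matrix
# because the two input patterns are orthogonal.
print(np.sign(W @ in1))   # recovers out1: [ 1  1 -1]
print(np.sign(W @ in2))   # recovers out2: [-1  1  1]
```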
Slide 4: Hebbian Learning

Cellular Association: "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." (Hebb 1949, p. 50)
Learning Connections: take the product of the excitation of the two cells and change the value of the connection in proportion to this product.
The Learning Rule: Δw = ε · a_in · a_out, where ε is the learning rate and w is the connection from the input unit (activation a_in) to the output unit (activation a_out).
Changing Connections: if a_in = 0.5, a_out = 0.75, and ε = 0.5, then Δw = 0.5 × 0.75 × 0.5 = 0.1875; and if w_start = 0.0, then w_next = 0.1875.
Calculating Correlations: two input units (0, 1) and one output unit (2), with ± activations:

  Input 0   Input 1   Output 2
    +         +          +
    +         -          -
    -         +          -
    -         -          +
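The rule and the slide's worked example, as a short sketch (the function name is mine):

```python
def hebbian_update(w, a_in, a_out, epsilon):
    """Hebbian learning: change the connection in proportion to the product
    of the sending and receiving activations, scaled by the learning rate."""
    delta_w = epsilon * a_in * a_out
    return w + delta_w

# Worked example from the slide: a_in = 0.5, a_out = 0.75, epsilon = 0.5
w_next = hebbian_update(w=0.0, a_in=0.5, a_out=0.75, epsilon=0.5)
print(w_next)   # 0.1875
```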
Slide 5: Nature of mental representation

Mind as a physical symbol system
– Software/hardware distinction
– Symbol manipulation by rule-governed processes
[Figure: phrase-structure tree for "The boy broke the vase" (S → NP VP; NP → Art N; VP → V NP).]
Slide 6: Nature of mental representation

Mind as a parallel distributed processing system
– Representations are coded in connections and node activities
Slide 7: Evidence for rules

Regular and Irregular Morphology
– talk => talked
– ram => rammed
– pit => pitted
– hit => hit
– come => came
– sleep => slept
– go => went
Slide 8: Evidence for rules

Errors in performance
– hitted
– sleeped
– goed, wented
U-shaped development
Recovery from errors
Slide 9: Evidence for rules

Rote Learning Processes
– Initial error-free performance
Rule Extraction and Application
– Overgeneralisation errors
– Speedy learning of new regular past tense forms
Rote plus Rule
– Continued application of the regularisation process
– Recovery from regularisation of irregulars
Slide 10: Models of English past tense

Dual mechanism account
– Rule-governed component deals with regular mappings
– Separate listing of exceptions
– Blocking principle
– Imperfect retrieval of irregular past tense representations results in overgeneralisation
– Pinker & Prince 1988
[Figure: dual-route diagram in which the input stem feeds both the Exceptions route and the Rule route, either of which can produce the output inflection.]
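A toy sketch of the dual-route idea (the verb list, the function name, and the retrieval probability are invented for illustration): a successfully retrieved exception blocks the rule, and when retrieval of the exception fails, the rule applies and the irregular verb is overgeneralised.

```python
import random

# Hypothetical listed exceptions (separate from the rule-governed component).
EXCEPTIONS = {"go": "went", "come": "came", "hit": "hit", "sleep": "slept"}

def past_tense(stem, retrieval_prob=0.8):
    # Blocking principle: a retrieved exception pre-empts the regular rule.
    if stem in EXCEPTIONS and random.random() < retrieval_prob:
        return EXCEPTIONS[stem]
    # Rule-governed component: add "-ed" to the stem.
    return stem + "ed"

print(past_tense("talk"))   # "talked"
print(past_tense("go"))     # usually "went", occasionally the error "goed"
```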
Slide 11: Models of English past tense

PDP accounts
– Single homogeneous architecture
– Superposition
– Competition between different verb types results in overregularisation and irregularisation
– Vocabulary discontinuity
– Rumelhart & McClelland 1986
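As a rough illustration of how a single homogeneous, superpositional network can come to overregularise a rare irregular verb, here is a toy sketch; the feature coding, the delta-rule training regime, and the frequency schedule are invented for illustration and are not the Wickelfeature model of Rumelhart & McClelland (1986).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy coding: each stem is a random +/-1 feature vector plus a
# bias input, and the regular past tense is the stem features with an extra
# "-ed" unit switched on.
n_feat, n_regular = 20, 30
stems = rng.choice([-1.0, 1.0], size=(n_regular, n_feat))
reg_inputs = np.hstack([stems, np.ones((n_regular, 1))])    # stem + bias
reg_targets = np.hstack([stems, np.ones((n_regular, 1))])   # copy stem, "-ed" on

irr_input = np.append(rng.choice([-1.0, 1.0], size=n_feat), 1.0)
irr_target = np.append(rng.choice([-1.0, 1.0], size=n_feat), 0.0)  # vowel change, no "-ed"

W = np.zeros((n_feat + 1, n_feat + 1))   # one weight matrix shared by every verb
lr = 0.02

# Delta-rule training; the regulars are seen far more often than the irregular.
for epoch in range(200):
    for x, t in zip(reg_inputs, reg_targets):
        W += lr * np.outer(t - W @ x, x)
    if epoch % 10 == 0:                   # the irregular is trained only occasionally
        W += lr * np.outer(irr_target - W @ irr_input, irr_input)

# Because all verbs are superimposed in the same weights, the frequent regular
# pattern wins the competition: the "-ed" unit also comes on for the irregular
# stem, i.e. the network overregularises it.
print("'-ed' activation for the irregular stem:", round((W @ irr_input)[-1], 2))
# Typically prints a value much closer to 1 (the regular response) than to 0.
```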