Connectionist Modelling Summer School Lecture Two
The Mapping Principle

Patterns of Activity
– An input pattern is transformed into an output pattern.

Activation States are Vectors
– Each pattern of activity can be considered a unique point in a space of states.
– The activation vector identifies this point in space.
[Figure: an activation pattern plotted as a point with coordinates (x, y, z) in state space]

Mapping Functions
– T = F(S): the network maps a source space S (the network inputs) to a target space T (the outputs).
– The mapping function F is most likely complex; no simple mathematical formula can capture it explicitly.
[Figure: mapping from source space S to target space T]

Hyperspace
– Input states generally have a high dimensionality.
– Most network states are therefore considered to populate hyperspace.
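A minimal sketch of the mapping principle in NumPy (the dimensions, random weights, and the particular squashing function are illustrative assumptions, not from the lecture): the network realises F as a weight matrix followed by a nonlinearity, carrying a point in the source space S to a point in the target space T.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    """Standard logistic squashing function."""
    return 1.0 / (1.0 + np.exp(-x))

# Source space S: 5-dimensional input activation vectors.
# Target space T: 3-dimensional output activation vectors.
W = rng.normal(0.0, 1.0, size=(3, 5))  # illustrative random weights

def F(s):
    """The network's mapping function: one point in S -> one point in T."""
    return sigmoid(W @ s)

s = np.array([0.2, 0.9, 0.0, 0.5, 0.7])  # a point in input hyperspace
t = F(s)                                  # the corresponding point in T
print("input vector  (point in S):", s)
print("output vector (point in T):", t.round(3))
```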
The Principle of Superposition
– Several associations can be stored in the same set of connections: the weight matrices learned for individual associations are simply added together.
[Figure: Matrix + Matrix = Composite Matrix]
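A minimal sketch of superposition, assuming simple outer-product (Hebbian) storage and orthogonal input patterns (both assumptions of this toy example): each association is a matrix, the composite matrix is just their sum, and each output can still be recalled from the composite.

```python
import numpy as np

# Two orthogonal input patterns and their target output patterns.
in1  = np.array([1.0, 0.0, 1.0, 0.0])
in2  = np.array([0.0, 1.0, 0.0, 1.0])
out1 = np.array([1.0, 1.0, 0.0])
out2 = np.array([0.0, 1.0, 1.0])

# One association matrix per pattern pair (outer-product storage).
M1 = np.outer(out1, in1)
M2 = np.outer(out2, in2)

# Superposition: the composite matrix is simply the sum of the two.
composite = M1 + M2

# Recall: because in1 and in2 are orthogonal, each input retrieves its
# own output from the composite (scaled by the input's squared length).
print(composite @ in1 / (in1 @ in1))  # -> [1. 1. 0.], i.e. out1
print(composite @ in2 / (in2 @ in2))  # -> [0. 1. 1.], i.e. out2
```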
Hebbian Learning

Cellular Association
– “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” (Hebb 1949, p. 50)

Learning Connections
– Take the product of the excitation of the two cells and change the value of the connection in proportion to this product.
[Figure: input unit with activation a_in connected by weight w to output unit with activation a_out]

The Learning Rule
– Δw = ε × a_in × a_out, where ε is the learning rate.

Changing Connections
– If a_in = 0.5, a_out = 0.75, and ε = 0.5, then Δw = 0.5 × 0.75 × 0.5 = 0.1875.
– And if w_start = 0.0, then w_next = w_start + Δw = 0.1875.

Calculating Correlations
[Figure: table of input and output activations and their products]
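The learning rule and the slide's worked example as runnable Python (the function name is mine, not from the lecture):

```python
def hebb_update(w, a_in, a_out, epsilon):
    """Hebbian rule: change the weight in proportion to the product
    of the two cells' activations (delta_w = epsilon * a_in * a_out)."""
    delta_w = epsilon * a_in * a_out
    return w + delta_w, delta_w

# The slide's worked example: a_in = 0.5, a_out = 0.75, epsilon = 0.5.
w_next, delta_w = hebb_update(w=0.0, a_in=0.5, a_out=0.75, epsilon=0.5)
print("delta w =", delta_w)   # 0.5 * 0.75 * 0.5 = 0.1875
print("w next  =", w_next)    # 0.0 + 0.1875 = 0.1875
```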
Nature of mental representation
Mind as a physical symbol system
– Software/hardware distinction
– Symbol manipulation by rule-governed processes
[Figure: parse tree for “The boy broke the vase”: S → NP VP; NP → Art N; VP → V NP]
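A toy illustration of rule-governed symbol manipulation (my example, not from the lecture): the rewrite rules behind the slide's parse tree, applied recursively to expand the symbol S into a sentence.

```python
import random

# Context-free rewrite rules for the slide's parse tree.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Art", "N"]],
    "VP":  [["V", "NP"]],
    "Art": [["the"]],
    "N":   [["boy"], ["vase"]],
    "V":   [["broke"]],
}

def expand(symbol):
    """Rule-governed expansion: rewrite a symbol until only words remain."""
    if symbol not in RULES:          # terminal symbol (a word)
        return [symbol]
    production = random.choice(RULES[symbol])
    return [word for sym in production for word in expand(sym)]

print(" ".join(expand("S")))  # e.g. "the boy broke the vase"
```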
Nature of mental representation
Mind as a parallel distributed processing system
– Representations are coded in connections and node activities
Evidence for rules
Regular and Irregular Morphology
– talk => talked
– ram => rammed
– pit => pitted
– hit => hit
– come => came
– sleep => slept
– go => went
Evidence for rules
Errors in performance
– hitted
– sleeped
– goed, wented
U-shaped development
Recovery from errors
Evidence for rules
Rote Learning Processes
– Initial error-free performance
Rule Extraction and Application
– Overgeneralisation errors
– Speedy learning of new regular past tense forms
Rote plus Rule
– Continued application of regularisation process
– Recovery from regularisation of irregulars
Models of English past tense
Dual mechanism account
– Rule-governed component deals with regular mappings
– Separate listing of exceptions
– Blocking principle: successful retrieval of a stored irregular form blocks application of the regular rule
– Imperfect retrieval of irregular past tense representations results in overgeneralisation
– Pinker & Prince 1988
[Figure: Input Stem feeds both the Rule and the Exceptions routes, yielding the Output Inflection]
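A minimal sketch of the dual-mechanism account (the word list and function names are illustrative, not Pinker & Prince's implementation): the exception listing is consulted first, a successful lookup blocks the regular rule, and failed retrieval of an irregular lets the rule apply, producing overgeneralisations like "goed".

```python
# Separate listing of exceptions (an associative memory in the full
# account; a dictionary stands in for it here).
EXCEPTIONS = {"hit": "hit", "come": "came", "sleep": "slept", "go": "went"}

def regular_rule(stem):
    """Rule-governed component: append -ed (spelling adjustments such
    as consonant doubling are omitted for brevity)."""
    return stem + "d" if stem.endswith("e") else stem + "ed"

def past_tense(stem, retrieval_succeeds=True):
    """Blocking principle: a successfully retrieved irregular form
    blocks application of the regular rule."""
    if retrieval_succeeds and stem in EXCEPTIONS:
        return EXCEPTIONS[stem]
    return regular_rule(stem)

print(past_tense("talk"))                          # -> talked
print(past_tense("go"))                            # -> went (rule blocked)
print(past_tense("go", retrieval_succeeds=False))  # -> goed (overgeneralisation)
```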
Models of English past tense
PDP accounts
– Single homogeneous architecture
– Superposition
– Competition between different verb types results in overregularisation and irregularisation
– Vocabulary discontinuity
– Rumelhart & McClelland 1986
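A minimal sketch of the single-mechanism idea, with several loud assumptions: the feature vectors are invented toy patterns (not Rumelhart & McClelland's Wickelfeature coding), the network is a one-layer delta-rule associator, and the single output unit simply signals "apply the regular +ed inflection". Because regular and irregular verbs are superposed in one set of weights, features the irregular shares with the regulars initially pull its output toward the regular inflection (overregularisation), and only later does its distinctive feature suppress the rule-like response; with these toy patterns the printed trajectory typically shows that rise and fall.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy phonological feature vectors (illustrative only). The last feature
# is unique to the irregular verb; the rest overlap with the regulars.
regulars  = np.array([[1, 1, 0, 0, 1, 0, 0],
                      [1, 0, 1, 0, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 0],
                      [1, 1, 1, 0, 0, 1, 0]], dtype=float)
irregular = np.array([1, 1, 0, 1, 0, 0, 1], dtype=float)

# One output unit: "apply the regular +ed inflection".
# Targets: 1 for every regular verb, 0 for the irregular.
stems   = np.vstack([regulars, irregular])
targets = np.array([1.0, 1.0, 1.0, 1.0, 0.0])

w, lr = np.zeros(7), 0.5          # one superposed weight vector for all verbs
for epoch in range(1, 201):
    for x, t in zip(stems, targets):
        y = sigmoid(w @ x)
        w += lr * (t - y) * y * (1 - y) * x   # delta rule
    if epoch in (1, 5, 20, 200):
        print(f"epoch {epoch:3d}: P(+ed | irregular) = "
              f"{sigmoid(w @ irregular):.2f}")
```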