The Past Tense Model
Psych 85-419/719, Feb 13, 2001
Facts About English Morphology
– Lots of rule-governed items (e.g., add -ed)
– Lots of exceptions (run -> ran, go -> went)
– Past-tense sound conditioned by the last sound of the stem (/t/, /d/, /Id/; a small sketch of this rule follows below)
– U-shaped learning …
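To make the last-sound conditioning concrete, here is a minimal Python sketch. It assumes a toy string-of-phonemes spelling for stems; the function name regular_past_suffix and the set of voiceless finals are illustrative choices, not part of the model discussed in these slides.

```python
# Minimal sketch of the regular past-tense allomorphy, assuming stems are given
# as toy phoneme strings (not IPA, and not the representation used in the model).
VOICELESS_FINALS = set("pkfsx")  # illustrative set of voiceless stem-final sounds

def regular_past_suffix(stem: str) -> str:
    """Pick /t/, /d/, or /Id/ based only on the stem's final sound."""
    final = stem[-1]
    if final in ("t", "d"):        # 'want' -> wanted, 'need' -> needed: /Id/
        return "Id"
    if final in VOICELESS_FINALS:  # 'walk' -> walked: /t/
        return "t"
    return "d"                     # 'hug' -> hugged, 'play' -> played: /d/

for stem in ["wAk", "h^g", "want"]:  # toy transcriptions of walk, hug, want
    print(stem, "+", regular_past_suffix(stem))
```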
U-Shaped Learning
– Early on, the child memorizes lots of word forms
– Then, begins to overregularize
  – go -> goed, wented
– Later on, sorts out regulars and exceptions
[Figure: accuracy plotted over time, tracing the U-shaped curve]
A Standard Account
[Diagram: components for knowledge of morphological rules, knowledge of phonological rules, and knowledge of exceptions, plus the input]
U-Shaped Learning in the Standard Account
– Child learns words; creates lexical entries for them. High accuracy.
– Begins to infer the rule. Overregularizations arise from the conflict between the rule and the stored forms.
– Essence of the phenomenon: the nature of the morphological rule-acquisition mechanism. Domain specific.
The Rumelhart & McClelland Alternative
– There are no separate modules for rules and exceptions
– U-shaped learning arises from the statistics of the language
  – High-frequency words contain lots of exceptions; low-frequency words less so
  – Presumption: children learn high-frequency words first, then many more regulars
  – Casts U-shaped learning as arising from basic learning principles. Domain general.
The Simulation
– Created a pattern associator to map uninflected phonological forms onto their inflections
– Divided training into phases: an early stage with many high-frequency items, a later stage with proportionally more regulars (a schematic sketch follows below)
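A schematic sketch of that setup, under heavy assumptions: the one-layer associator below uses a simple perceptron-style update, and the vector size, learning rate, epoch counts, and randomly generated stand-in verb pairs are all placeholders rather than details of the original simulation.

```python
# Schematic sketch of a one-layer pattern associator trained in two phases.
# Not the original simulation: sizes, rates, and the random stand-in "verbs"
# below are all placeholders.
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES = 460                        # size of the feature vectors (assumed)
W = np.zeros((N_FEATURES, N_FEATURES))  # weights from stem features to past-tense features
b = np.zeros(N_FEATURES)

def train(pairs, epochs=10, lr=0.1):
    """Perceptron-style updates nudging the output toward the target inflection."""
    global W, b
    for _ in range(epochs):
        for x, target in pairs:
            y = (W @ x + b > 0).astype(float)   # current guess at the past-tense features
            err = target - y
            W += lr * np.outer(err, x)
            b += lr * err

def fake_verb():
    """Stand-in for a (stem features, past-tense features) pair."""
    return (rng.integers(0, 2, N_FEATURES).astype(float),
            rng.integers(0, 2, N_FEATURES).astype(float))

high_freq = [fake_verb() for _ in range(10)]    # phase 1: few verbs, many irregulars
more_verbs = [fake_verb() for _ in range(400)]  # phase 2: many verbs, mostly regulars

train(high_freq)                 # early training on high-frequency items
train(high_freq + more_verbs)    # later training in which regulars dominate
```

The point of the two-phase regime is that the shift in the training vocabulary, not a special rule mechanism, is what is supposed to drive the U-shaped pattern.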
The Representation
– Coded triplets of phonemes
  – BLADE -> /#bl/, /bla/, /lad/, /ad#/
– Each phoneme coded as features
  – voicing, front, back, …
– Created detectors for triplets of features
– Input and output: activity over sets of such feature detectors (see the sketch below)
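A small sketch of the triplet step, assuming phonemes arrive as a list of symbols (these context triplets are often called Wickelphones in the literature); the mapping from triplets to feature detectors is omitted.

```python
# Sketch of the phoneme-triplet coding. Only the triplet step is shown; the
# mapping from triplets to feature detectors is left out.
def phoneme_triplets(phonemes):
    """Return each phoneme with its left and right context, padded with '#'."""
    padded = ["#"] + list(phonemes) + ["#"]
    return [tuple(padded[i:i + 3]) for i in range(len(padded) - 2)]

print(phoneme_triplets(["b", "l", "a", "d"]))   # "blade"
# [('#', 'b', 'l'), ('b', 'l', 'a'), ('l', 'a', 'd'), ('a', 'd', '#')]
```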
Results of the Simulation
– Successfully learned both regulars and exceptions in the same system
– Showed the U-shaped pattern
– Learned the cues to the final phoneme (/d/, /t/, /Id/)
– Made novel predictions about the degree of regularization based on verb type
Pinker’s Objection I: The Representation
– Can’t represent all words
  – Was it supposed to?
– Doesn’t capture the right regularity
  – stem + /d/
– Allows for rules to be inferred that never happen
  – Is it the business of theories to cover this?
Pinker’s Objection II: Modeling Just Inflections
– The “rule” for inflection applies more broadly than the past tense
  – Similar rule for plurals, word stems
– Implicit claim: if regularities are present in several domains, there must be one representation to accommodate them all
  – Not learned by a past-tense module, and redundantly by a plural module, etc.
Pinker’s Objection III: It Isn’t Just Phonology
– Homophones: right/write
  – “righted the boat”; “wrote the novel”
– Verbs derived from nouns or adjectives:
  – Batter flied out to left field (flew)
  – I braked the car suddenly (broke)
  – scissors
– Semantic distance?
  – The Toronto Maple Leafs
  – The Minnesota Timberwolves
Pinker’s Objection IV: The Model Doesn’t Work Very Well
– Makes a lot of errors on generalization
– … and errors that aren’t what people would make
  – mail -> membled
Pinker’s Objection V: The Frequency Story on U-Shaped Learning Is Bogus
– When you look at children’s vocabulary, regulars and exceptions aren’t blocked out the way they were in the simulation
– Claim: this is fundamentally the wrong account of the phenomenon
  – Not an accident of the frequency distribution of the language, but the result of an induction mechanism
Pinker & Prince’s Conclusions
– The implemented model is broken
  – Wrong account, wrong predictions, doesn’t work as advertised
– Ergo, it doesn’t provide a counter to the standard account
– And further: you need rules to adequately account for the phenomena
Does Either Theory Really Explain Why the Stem Is Preserved for Regulars?
– PDP: the “rule” is learned because it is present in the training set
  – … and it’s easier to learn than an arbitrary one
– Traditional account: because that is what the rule is
What About the /d/, /t/ Business?
– PDP account: that’s in the statistics of the input
– Traditional account: part of our phonological knowledge of the language
– Perhaps an articulatory reason: harder to switch voicing between two obstruents
– Is this incompatible with either story?
What Is Really Going On with Morphology?
– People CAN inflect novel forms
  – wug -> wugged
– But is that what language is about?
– Intuition: people generally don’t produce speech by creating uninflected forms and then processing them to produce the inflection
– More on this later in the course
Next Class: The Backprop Learning Algorithm
– Read for class: PDP1, Chapter 8, “Learning internal representations…”