The Emergentist Approach To Language As Embodied in Connectionist Networks
James L. McClelland, Stanford University


Some Simple Sentences The man liked the book. The boy loved the sun. The woman hated the rain.

The Standard Approach: Units and Rules
Levels: Sentences; Clauses and phrases; Words; Morphemes; Phonemes
Rewrite rules:
S -> NP VP
VP -> Tense V (NP)
Tense V -> V+{past}
V -> like, love, hate…
N -> man, boy, sun…
man -> /m/ /ae/ /n/
{past} -> ed
ed -> /t/ or /d/ or /^d/ (depending on context)
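The rewrite rules above can be expressed as a toy generator. This is a simplified sketch, not the course's actual formalism: Tense is folded directly into past-tense verb forms, the optional NP is made obligatory, and the morphological and phonological rules are omitted.

```python
import random

# Toy version of the grammar on this slide; details are simplifications.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V+past", "NP"]],              # Tense folded into the verb form
    "N": [["man"], ["boy"], ["woman"], ["book"], ["sun"], ["rain"]],
    "V+past": [["liked"], ["loved"], ["hated"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol; anything without a rule is a word."""
    if symbol not in GRAMMAR:
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))  # e.g. "the boy loved the rain"
```

Every sentence this grammar generates has the same five-word shape as the sample sentences on the previous slide.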

What happens with exceptions?

Standard Approach to the Past Tense We form the past tense by using a (simple) rule. If an item is an exception, the rule is blocked. –So we say ‘took’ instead of ‘taked’. If you’ve never seen an item before, you use the rule. If an item is an exception, but you forget the exceptional past tense, you apply the rule. Predictions: –Regular inflection of ‘nonce forms’: This man is blinging. Yesterday he … This girl is tupping. Yesterday she … –Over-regularization errors: goed, taked, bringed

The Emergentist Approach Language (like perception, etc.) arises from the interactions of neurons, each of which operates according to a common set of simple principles of processing, representation, and learning. Units and rules are useful to approximately describe what emerges from these interactions, but they have no mechanistic or explanatory role in language processing, language change, or language learning.

An Emergentist Theory: Natural Selection No grand design Organisms produce offspring with random differences (mating helps with this) Forces of nature favor those best suited to survive Survivors leave more offspring, so their traits are passed on The full range of the animal kingdom, including all the capabilities of the human mind, emerges from these very basic principles

An Emergentist/Connectionist Approach to the Past Tense Knowledge is in connections Experience causes connections to change Sensitivity to regularities emerges

The RM Model Learns from verb [root, past tense] pairs –[like, liked]; [love, loved]; [carry, carried]; [take, took] Present and past are represented as patterns of activation over units that stand for phonological features.

Examples of ‘wickelfeatures’ in the verb ‘baked’ Starts with a stop followed by a vowel Has a long vowel preceded by a stop and followed by a stop Ends with a dental stop preceded by a velar stop Ends with an unvoiced sound preceded by another unvoiced sound
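As a rough sketch of this kind of context-sensitive representation, one can encode a word as phoneme triples (Wickelphones). Note this is a simplification: the actual RM model used triples of phonological features (wickelfeatures), not whole phonemes.

```python
def wickelphones(phonemes):
    """Each phoneme paired with its left and right neighbors,
    with '#' marking the word boundary."""
    padded = ["#"] + list(phonemes) + ["#"]
    return [tuple(padded[i - 1:i + 2]) for i in range(1, len(padded) - 1)]

# 'baked' as a rough phoneme sequence /b eI k t/
print(wickelphones(["b", "eI", "k", "t"]))
# -> [('#', 'b', 'eI'), ('b', 'eI', 'k'), ('eI', 'k', 't'), ('k', 't', '#')]
```

Replacing each phoneme in a triple with its features (stop, vowel, voiced, etc.) yields feature descriptions like the ones listed above.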

A Pattern Associator Network (diagram): input units carry a pattern representing the sound of the verb root; a matrix of connections maps this onto output units carrying a pattern representing the sound of the verb’s past tense; each output unit sums its input and becomes active with probability p(a=1).
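A minimal sketch of one such output unit: its activation probability p(a=1) is a logistic function of its summed weighted input. The threshold parameter here is an assumption, not a detail taken from the slide.

```python
import math

def p_active(inputs, weights, threshold=0.0):
    """Probability that an output unit comes on, p(a=1),
    as a logistic function of its summed weighted input."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-(net - threshold)))

print(round(p_active([1, 1, 0], [2.0, 2.0, -1.0]), 3))  # -> 0.982
print(p_active([0, 0, 0], [2.0, 2.0, -1.0]))            # -> 0.5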

Learning rule for the Pattern Associator network For each output unit: –Determine activity of the unit based on its input. –If the unit is active when target is not: Reduce each weight coming into the unit from each active input unit. –If the unit is inactive when the target is active: Increase the weight coming into the unit from each active input unit. Each connection weight adjustment is very small –Learning is gradual and cumulative
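The rule above can be sketched for a single output unit. The learning-rate value is an assumption; the actual model applies this update across all output units on every training pair.

```python
def update_weights(weights, inputs, target, output, lr=0.05):
    """Perceptron-style update: raise weights from active inputs when
    the unit should have fired but didn't; lower them when it fired
    but shouldn't have; leave them alone when output matches target."""
    if output == target:
        return weights
    delta = lr if target == 1 else -lr
    return [w + delta * x for w, x in zip(weights, inputs)]

w = [0.0, 0.0]
# unit inactive (0) but target active (1): weights from active inputs grow
w = update_weights(w, inputs=[1, 0], target=1, output=0)
print(w)  # -> [0.05, 0.0]
```

Because each adjustment is small, many repetitions of a mapping are needed before its weights dominate, which is what makes learning gradual and cumulative.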

Over-regularization errors in the RM network (figure)
Most frequent past tenses in English: felt, had, made, got, gave, took, came, went, looked, needed.
The network was trained with the top ten words only at first; the figure marks the point where 400 more words were introduced.

Some features of the model Regulars co-exist with exceptions. The model produces the regular past for most unfamiliar test items. The model captures the different subtypes among the regulars: –like-liked –love-loved –hate-hated The model is sensitive to the no-change pattern among irregulars: –hit-hit –cut-cut –put-put

Additional characteristics The model exploits gangs of related exceptions. –dig-dug –cling-clung –swing-swung The ‘regular pattern’ infuses exceptions as well as regulars: –say-said, do-did –have-had –keep-kept, sleep-slept –burn-burnt –teach-taught

Extensions As was noted two or three times throughout the quarter, networks that use fixed inputs and outputs without hidden units in between are limited, so subsequent models have relied on intermediate layers of units between inputs and outputs. The principles of the model apply to other forms of information, not only phonological information: if semantic as well as phonological information about an item is available as part of the input, then the model will exploit semantic as well as phonological similarity (as we argue humans do). One can re-formulate the learning problem as one involving a speaker and a hearer. Here the learning problem for the speaker is formulated as: –Produce an output that will allow the listener to construct the message you wish to transmit (e.g. hit + past; read + present) –Keep the output as short as possible, as long as it can be understood. This pressure causes networks to create a special kind of irregular form, which we will call reduced regulars: did, made, had, said

Lupyan & McClelland (2003) Model of Past Tense Learning Weights in the network are adjusted using an error-correcting rule –so that the network learns to map phonological input onto specified target outputs. Activations of input units are also adjusted –to ensure that the input contains the information needed to produce the output –and to allow the input to be as simple as possible. Initial input patterns are fully regular. High-frequency items are presented more often during training. Because the network weights are stronger for these items, the network can afford to reduce the input for these high-frequency items. (Diagram: a phonological input, subject to reduction, maps onto a semantic target pattern corresponding to the intended meaning.)
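A minimal sketch of this idea for a single unit (the learning rate, the ‘shrink’ pressure term, and the one-unit setup are all assumptions, not the model’s actual parameters): both the weights and the input activations are adjusted to reduce output error, while a small penalty pulls input activations toward zero, so inputs that strong weights already handle can afford to be reduced.

```python
import math

def train_step(x, w, target, lr=0.1, shrink=0.01):
    """Adjust weights to reduce error, and adjust the input both to
    reduce error and to be as small (simple) as possible."""
    y = 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
    err = target - y
    new_w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    new_x = [xi + lr * err * wi - shrink * xi for wi, xi in zip(w, x)]
    return new_x, new_w

# With strong weights the output is already right, so the error term is
# tiny and the shrink pressure wins: the input activation is reduced.
x, w = train_step([1.0], [10.0], target=1.0)
print(x[0] < 1.0)  # -> True
```

High-frequency items get more training steps, so their weights strengthen sooner and their inputs are reduced more, paralleling the emergence of reduced regulars like “did” and “had”.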

Simulation of Emergence of Reduced Regulars In English, frequent items are less likely to be regular; items ending in /d/ or /t/ are also less likely to be regular. Both effects emerged in our simulation. While the past tense is usually one phoneme longer than the present, this is less true for the high-frequency past tense items. Reduction of high-frequency past tenses affects a phoneme other than the word-final /d/ or /t/.

Key Features of the Past Tense model No lexical entries and no rules No problem of rule induction or grammar selection