Language Mind and Brain: The Unification Problem Colin Phillips Cognitive Neuroscience of Language Laboratory Department of Linguistics University of Maryland

Unification Problem The objective is to bridge the gap between linguistic models, real-time models of mental processes, and brain-level models, i.e., the focus is on the Unification Problem. Cognitive Neuroscience of Language necessarily draws on disparate areas - not just looking at pictures of brains!

?? Unification Problem

No longer concerned with questions of whether linguistics is a natural science Language is clearly a remarkable specialization of human neurobiology (e.g., a sophisticated symbolic, recursive system, with fixed and parameterized aspects) What is it about human psychology & neurobiology that allows it to support the things that we know about language? In addressing this question, are we faced with problems or with mysteries?

Unification Problem...And if there are mismatches between theories at different levels, whose problem is this? Linguists? Psychologists? Neuroscientists?

Unification Problem On mismatches between cognitive and neural models “The more we learn about the brain, the greater the disanalogy becomes.” (A philosophy of (neuro-)science talk, October 2001)

Unification Problem On mismatches between cognitive and neural models “If language is unlike anything else in the biological world, …then too bad for biology!” (linguist, often accompanied by story about chemistry & physics)

Encoding & Computation Two main issues –How are linguistic representations encoded? –How are linguistic representations computed?

Sensory Maps Internal representations of the outside world. Cellular neuroscience has discovered a great deal in this area.

Notions of sensory maps may be applicable to human phonetic representations… …although attempts to find them have had little success to date. Vowel Space

Encoding of Symbols: Abstraction But most areas of linguistics (phonology, morphology, syntax, semantics) are concerned with symbolic, abstract representations,...which do not involve internal representations of dimensions of the outside world. …hence, the notion of sensory maps does not get us very far into language

Computation: Discrete Infinity In neuroscience there are many findings about long-term storage of object representations (e.g., edges, faces, grandmothers, toothbrushes, …). Such representations are always finite in number and can be retrieved from long-term memory. BUT, much of what interests us in linguistics is infinite. If representations are drawn from an infinite set, they cannot be retrieved from long-term memory; they must be constructed on-line, as needed - this poses a different kind of challenge …
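To make the "finite means, infinite set" point concrete, here is a minimal sketch (toy grammar and vocabulary invented for illustration, not material from the talk): a handful of rewrite rules with one recursive rule already generates an unbounded set of sentences, so the sentences themselves cannot all be stored and retrieved.

```python
# Toy illustration of discrete infinity: a finite rule set that generates
# an unbounded number of sentences via recursion (S inside S).
# The rules and words are invented for illustration only.
import random

RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["John"], ["Mary"], ["the", "pizza"]],
    "VP": [["ate", "NP"], ["said", "that", "S"]],   # recursion: VP -> said that S
}

def generate(symbol="S", depth=0, max_depth=5):
    """Randomly expand a symbol; recursion means no fixed bound on sentence length."""
    if symbol not in RULES:                 # terminal word
        return [symbol]
    options = RULES[symbol]
    if depth >= max_depth:                  # cap recursion for the demo only
        options = [o for o in options if "S" not in o] or options
    out = []
    for sym in random.choice(options):
        out.extend(generate(sym, depth + 1, max_depth))
    return out

for _ in range(3):
    print(" ".join(generate()))
# e.g. "Mary said that John ate the pizza" -- arbitrarily deep embeddings are possible,
# so these structures cannot all be stored; they must be assembled on-line.
```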

Overview of Talks

1. The Unification Problem

Overview of Talks 1. The Unification Problem 2. Building Syntactic Relations どの生徒に …

Overview of Talks 1. The Unification Problem 2. Building Syntactic Relations 3. Abstraction: Sounds to Symbols どの生徒に …

Overview of Talks 1. The Unification Problem 2. Building Syntactic Relations 3. Abstraction: Sounds to Symbols 4. Linguistics and Learning どの生徒に …

with help from... University of Maryland Shani Abada Sachiko Aoshima Daniel Garcia-Pedrosa Ana Gouvea Nina Kazanina Moti Lieberman Leticia Pablos David Poeppel Beth Rabbin Silke Urban Carol Whitney University of Delaware Evniki Edgar Bowen Hui Baris Kabak Tom Pellathy Dave Schneider Kaia Wong Alec Marantz, MIT Elron Yellin, MIT National Science Foundation James S. McDonnell Foundation Human Frontiers Science Program Japan Science & Technology Program Kanazawa Institute of Technology

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Discrete Infinity Linguistic Creativity Ability to make infinite use of finite means The finite and infinite aspects of this system present rather different challenges for explicit linking hypotheses

Discrete Infinity Linguistic Creativity Ability to make infinite use of finite means Lexical entries, Argument Structure templates Widespread assumption: same representations accessed in comprehension, production, acceptability ratings, etc. Learner’s task is to construct a single lexical entry that covers all of these tasks

Discrete Infinity Linguistic Creativity Ability to make infinite use of finite means Sentence structures Structures must be assembled, cannot simply be retrieved from memory Widespread assumption: multiple different systems responsible for structure assembly in comprehension, production, acceptability, etc. Learner must master a number of different systems

Discrete Infinity Linguistic Creativity Ability to make infinite use of finite means Time-independent vs. Time-dependent systems

Standard View ? 217 x 32 = ? arithmetic

Standard View ? 217 x 32 = ? specialized algorithm arithmetic

Standard View ? 217 x 32 = ? specialized algorithm ? something deeper arithmetic

Standard View specialized algorithm speaking understanding grammatical knowledge, competence language

Standard View specialized algorithm speaking understanding grammatical knowledge, competence precise but ill-adapted to real-time operation language

Standard View specialized algorithm speaking understanding grammatical knowledge, competence well-adapted to real-time operation but maybe inaccurate language

If speaking and understanding involve different systems, there must be an additional store of knowledge that encodes what is shared between speaking and understanding.

As soon as we assume constructs such as ‘parsing strategies’, we are adopting task-specific mechanisms, and endorsing something like the standard architecture.

Standard View specialized algorithm speaking understanding grammatical knowledge, competence language

speaking, understanding, grammaticality Analysis-by-Synthesis Sentences are understood by internally generating a representation that matches the input No separate time-independent system of knowledge We know that humans have a real-time system for linguistic computation - issue is whether that’s all there is

But wait a minute... Wasn’t this all shown to be wrong long ago? (Fodor, Bever & Garrett 1974; Levelt 1974; Fillenbaum 1971; etc., etc.)? And a recent commentary: “In this desert of ignorance there have been attempts to resurrect earlier claims that the grammar and the parser are one and the same thing. … The enterprise is misconceived … probably incoherent.” (Smith, 1999)

Motivation for Standard Architecture –How to constrain hypothesis generation –Grammars are not incremental real-time systems –Evidence for input/output-specific strategies –Analysis-by-synthesis implies active generation, ahead of input –Reputation of performance systems as fast but inaccurate –Debunking of ‘Derivational Theory of Complexity’

Motivation for Alternative Architecture A time-dependent system of computation makes it feasible to generate testable linking hypotheses

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Linear Order and Constituency Linguistic Inquiry, 2003

Incremental Structure Building Evidence that sentence structures can only be assembled in a left-to-right derivation.

John said that he ate the entire pizza S NP VP S’ Comp V S NP VP V NP

John said that he ate the entire pizza S NP VP S’ Comp V S NP VP V NP Constituents

John said that he ate the entire pizza S NP VP S’ Comp V S NP VP V NP Constituents

John said that he ate the entire pizza S NP VP S’ Comp V S NP VP V NP Constituents

John said that he ate the entire pizza S NP VP S’ Comp V S NP VP V NP Constituents

Many tools used to diagnose groupings of words: –coordination –deletion –interpretation (coreference) –movement, focus, topicalization –etc. There are many cases where the tools converge on the same result There are also many cases where the tools yield conflicting results

Incremental Structure Building A (Phillips 2003)

Incremental Structure Building AB (Phillips 2003)

Incremental Structure Building A BC (Phillips 2003)

Incremental Structure Building A B CD (Phillips 2003)

Incremental Structure Building A B C DE (Phillips 2003)

Incremental Structure Building AB (Phillips 2003)

Incremental Structure Building AB constituent (Phillips 2003)

Incremental Structure Building A BC constituent is destroyed by addition of new material (Phillips 2003)

Incremental Structure Building A BC (Phillips 2003)

Incremental Structure Building A BC constituent (Phillips 2003)

Incremental Structure Building A B CD constituent is destroyed by addition of new material (Phillips 2003)

Incremental Structure Building the cat (Phillips 2003)

Incremental Structure Building the cat sat (Phillips 2003)

Incremental Structure Building the cat sat on (Phillips 2003)

Incremental Structure Building the cat sat on the rug (Phillips 2003)

Incremental Structure Building the cat sat on (Phillips 2003)

Incremental Structure Building the cat sat on the rug (Phillips 2003)

Incremental Structure Building the cat sat on the rug [sat on] is a temporary constituent, which is destroyed as soon as the NP [the rug] is added. (Phillips 2003)

Incremental Structure Building Conflicting Constituency Tests Verb + Preposition sequences can undergo coordination… (1) The cat sat on and slept under the rug. …but cannot undergo pseudogapping (Baltin & Postal, 1996) (2) *The cat sat on the rug and the dog did the chair. (Phillips 2003)

Incremental Structure Building the cat sat on (Phillips 2003)

Incremental Structure Building the cat sat on and slept under (Phillips 2003)

Incremental Structure Building the cat sat on and slept under coordination applies early, before the V+P constituent is destroyed. (Phillips 2003)

Incremental Structure Building the cat sat on (Phillips 2003)

Incremental Structure Building the cat sat on the rug (Phillips 2003)

Incremental Structure Building the cat sat on the rug and the dog did (Phillips 2003)

Incremental Structure Building the cat sat on the rug and the dog did pseudogapping applies too late, after the V+P constituent is destroyed. (Phillips 2003)

Incremental Structure Building Constituency Problem Different diagnostics of constituency frequently yield conflicting results Incrementality Hypothesis (a) Syntactic processes see a ‘snapshot’ of a derivation - they target constituents that are present when the process applies (b) Conflicts reflect the fact that different processes have different linear properties
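As a rough sketch of the incrementality idea (my own toy encoding, not the implementation in Phillips 2003), the snippet below attaches each incoming word at the right edge of the current structure and checks which word strings form constituents; [sat on] is a constituent right after "on" is attached and stops being one once "the rug" is added.

```python
# Sketch of strictly left-to-right structure building (encoding assumed for
# illustration): each new word combines at the right edge, so earlier groupings
# like [sat on] are only temporary constituents, 'destroyed' when later
# material is attached inside them.

def attach(tree, word):
    """Attach a word at the right edge, extending the rightmost constituent."""
    if tree is None:
        return word
    if isinstance(tree, str):
        return [tree, word]
    return [tree[0], attach(tree[1], word)]       # extend the rightmost daughter

def constituents(tree, found=None):
    """Collect the word string spanned by every constituent in the tree."""
    if found is None:
        found = set()
    if isinstance(tree, str):
        return found
    words, stack = [], [tree]
    while stack:                                   # gather the span, left to right
        node = stack.pop()
        if isinstance(node, str):
            words.append(node)
        else:
            stack.extend(reversed(node))
    found.add(" ".join(words))
    for daughter in tree:
        constituents(daughter, found)
    return found

tree = None
for w in ["the-cat", "sat", "on", "the-rug"]:
    tree = attach(tree, w)
    print(f"after {w!r}: 'sat on' is a constituent -> {'sat on' in constituents(tree)}")
# After 'on', [sat on] is a constituent; after 'the-rug' it no longer is,
# because the new word was attached inside it.
```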

Ellipsis blocks Scope/Binding Bill read all the books in a week (ambiguous: collective/distributive scope) …and Sue did in a month (unambiguous: collective scope only) Bill read as many books as Sue did in a week. (ambiguous) Bill read as many books in a week as Sue did in a month. (unambiguous)

Ellipsis blocks Scope/Binding John gave books to the children on Tuesday …and Mary did on Thursday John gave books to the children on each other’s birthdays *…and Mary did on each other’s first day of school

Japanese John … [Japanese example text not preserved in the transcript]

gave books to the children on each other’s birthdays VP V V IP John +fin I’ read all the books in a week VP V PP IP John +fin I’

gave books to the children on each other’s birthdays VP V V IP John +fin I’ read all the books in a week VP V PP IP John +fin I’

gave books to the children on each other’s birthdays VP V V IP John +fin I’ and VP IP Bill did I’

gave books to the children on each other’s birthdays VP V V IP John +fin I’ and VP IP Bill did I’

gave books to the children VP V IP John +fin I’

gave books to the children on each other’s birthdays VP V IP John +fin I’

gave books to the children on each other’s birthdays VP V IP John +fin I’ IP and VP IP Bill did I’

gave books to the children on each other’s birthdays VP V IP John +fin I’ IP and VP IP Bill did I’

gave books to the children on each other’s birthdays VP V IP John +fin I’ IP and VP IP Bill did I’ gave books to the children V VP

gave books to the children on each other’s birthdays VP V IP John +fin I’ IP and VP IP Bill did I’ gave books to the children V VP on each other’s first day of school VP

Movement & Binding a.John gave books to them on each other’s birthdays.

Movement & Binding a.John gave books to them on each other’s birthdays. gave books to them on each other’s birthdays VP V V (Pesetsky 1995)

Movement & Binding a.John gave books to them on each other’s birthdays. gave books to them on each other’s birthdays VP V V (Pesetsky 1995)

Movement & Binding b. …and [give books to them] he did ___ on each other’s birthdays (Pesetsky 1995)

give books VP V to them

give books VP V to them IP hedid

give books VP V to them IP hedid

give books to them VP V give books VP V to them IP he did I’ constituent movement

give books to them on each other’s birthdays VP V V give books VP V to them IP he did I’ constituent movement

give books to them on each other’s birthdays VP V V give books VP V to them IP he did I’ constituent movement binding under c-command

Interim Conclusion By building syntactic structures from left-to-right we can explain a number of otherwise mysterious constituency phenomena (see Phillips 1996, 2003 for more examples; see Richards 1999, 2002 for some applications to Japanese) We knew independently that humans have a left-to-right structure-building system (i.e. parser, producer) Possibility arises that the incremental left-to-right system is the only structure-building system that humans have Other arguments leading to related conclusions about grammar, in widely varying formalisms: Kempson et al. (2001), Steedman (2000), Kempen (1999), Milward (1992, 1994)

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Grammatical Accuracy It is not enough to show that syntactic structure-building looks like a real-time process If the real-time system is the only system, then it should also show the syntactic sophistication normally associated with the grammar - the parser cannot be ‘dumb’ Question: does the parser access only grammatically legal structural analyses?

Beyond Ambiguity Much of parsing literature focuses on issues of ambiguity, i.e. when 2 structures are grammatically possible, how to choose the right one? More basic question: grammatical search i.e. how do we figure out if a sequence of words has any grammatical analyses?

Example: Argument Structure Dative and Double-Object Constructions Alternator Verb: give The millionaire gave the painting to the museum. The millionaire gave the museum the painting. Non-alternator Verb: donate The millionaire donated the painting to the museum. *The millionaire donated the museum the painting. (Phillips, Edgar & Kabak, 2000)

Example: Argument Structure A Severe Garden-Path Sentence Alternator Verb: give The man gave the boy the dog bit a cookie

Example: Argument Structure A Severe Garden-Path Sentence Alternator Verb: give The man gave [the boy [the dog bit]] a cookie

Example: Argument Structure A Severe Garden-Path Sentence Alternator Verb: give The man gave [the boy [the dog bit]] a cookie availability of double-object parse leads to difficulty at embedded verb

Example: Argument Structure A Severe Garden-Path Sentence Alternator Verb: give The man gave [the boy [the dog bit]] a cookie Non-Alternator Verb: donate The man donated [the boy [the dog bit … availability of double-object parse leads to difficulty at embedded verb difficulty should arise earlier in the sentence

Results (partial) Relative to unambiguous control sentences, readers get into difficulty at V with alternator verbs, at NP2 with non-alternators. --> Argument structure constraint immediately active on-line Alternators (e.g. give) / Non-Alternators (e.g. donate)
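A crude way to picture how argument-structure information could act immediately on-line, using invented lexical frames rather than the actual experimental model: a parser that consults a verb's frames can only entertain the double-object analysis when the verb licenses it, which places the predicted difficulty at the embedded verb for give but at the second NP for donate.

```python
# Sketch: argument-structure frames consulted on-line (frames are illustrative).
# An "NP NP" frame licenses the double-object analysis; without it, a second
# bare NP signals trouble as soon as it appears.
FRAMES = {
    "gave":    [("NP", "PP"), ("NP", "NP")],   # alternator: dative + double object
    "donated": [("NP", "PP")],                 # non-alternator: dative only
}

def first_trouble_point(verb, words_after_verb):
    """Return the word at which no grammatical analysis remains (very crude)."""
    double_object_ok = ("NP", "NP") in FRAMES[verb]
    np_count = 0
    for w in words_after_verb:
        if w.startswith("NP"):
            np_count += 1
            if np_count == 2 and not double_object_ok:
                return w            # e.g. difficulty at 'the dog' after 'donated the boy'
        elif w == "V-embedded" and double_object_ok:
            return w                # 'bit' forces reanalysis of [gave the boy the dog]
    return None

# 'The man gave/donated the boy the dog bit a cookie'
print(first_trouble_point("gave",    ["NP1", "NP2", "V-embedded", "NP3"]))  # V-embedded
print(first_trouble_point("donated", ["NP1", "NP2", "V-embedded", "NP3"]))  # NP2
```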

Example: Movement Constraints

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook wonderful dinners.

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook wonderful dinners.

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook what

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook what

Grammatical Accuracy in Parsing Wh-Questions What do Englishmen cook

Grammatical Accuracy in Parsing Wh-Questions What do Englishmen cook gap

Grammatical Accuracy in Parsing Wh-Questions What do Englishmen cook gap 

Grammatical Accuracy in Parsing Long-distance Wh-Questions Few people think that anybody realizes that Englishmen cook wonderful dinners

Grammatical Accuracy in Parsing Long-distance Wh-Questions Few people think that anybody realizes that Englishmen cook what

Grammatical Accuracy in Parsing Long-distance Wh-Questions What do few people think that anybody realizes that Englishmen cook gap 

Grammatical Accuracy in Parsing The plan to remove the equipment ultimately destroyed the building.

Grammatical Accuracy in Parsing The plan to remove the equipment ultimately destroyed the building. Direct Object NP

Grammatical Accuracy in Parsing The plan to remove the equipment ultimately destroyed the building. Direct Object NP Main Clause

Grammatical Accuracy in Parsing The plan to remove the equipment ultimately destroyed the building. Direct Object NP Main Clause Subject NP Embedded Clause

Grammatical Accuracy in Parsing What did the plan to remove the equipment ultimately destroy

Grammatical Accuracy in Parsing What did the plan to remove the equipment ultimately destroy gap 

Grammatical Accuracy in Parsing What did the plan to remove ultimately destroy the building

Grammatical Accuracy in Parsing What did the plan to remove gap ultimately destroy the building 

Grammatical Accuracy in Parsing What did the plan to remove gap ultimately destroy the building  Subject Island Constraint A wh-phrase cannot be moved out of a subject.

Question… Do people respect island constraints on movement immediately on-line? (tomorrow’s talk)

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Incrementality Question: is structure building immediate? Does it operate on a word-by-word level? …Even in a language where this may be hard?

Incremental Application of Binding Constraints in Japanese Sachiko Aoshima Colin Phillips Amy Weinberg

Structure-building in Japanese NP-wa NP-ni [NP-ga NP-o V] V Head-driven Parsing (e.g. Pritchett 1991, Mulders 2002) Structure-building is delayed until verbs are processed –explains how parsing is possible in Japanese –accounts for flexibility, limited garden-paths in Japanese Incremental Parsing Structure-building occurs immediately –accounts for native-speaker intuition of continuous comprehension –hard to demonstrate experimentally

John-ga … (Mazuka & Itoh 1995)

John-ga Mary-ni … (Mazuka & Itoh 1995)

John-ga Mary-ni ringo-o … (Mazuka & Itoh 1995)

John-ga Mary-ni ringo-o tabeta … (Mazuka & Itoh 1995)

John-ga Mary-ni ringo-o tabeta inu-o ageta (Mazuka & Itoh 1995)

Verb Surprise Effects Surprise effect appears at verb Can be explained by both head-driven and incremental theories (Schneider 1999, Mulders 2002) Better: evidence of structure-building that precedes the verb e.g. immediate application of grammatical constraints

English To which of his children did the man give a gift (Aoshima, Phillips & Weinberg 2002)

English To which of his children did the man give a gift Which of his children gave the man a gift? (Aoshima, Phillips & Weinberg 2002)

Japanese which of his children (DAT) the man (NOM) … which of his children (NOM) the man (DAT) … (Aoshima, Phillips & Weinberg 2002)

Japanese pronoun and its antecedent which of his children (DAT) the man (NOM) … his which of his children (NOM) the man (DAT) … *? which of his children (DAT)

which of his children (DAT) the man (NOM) … which of his children (NOM) the man (DAT) … Gender Mismatch the woman

Gender Mismatch which of his children (DAT) the man (NOM) … his which of his children (NOM) the man (DAT) … *? which of his children (DAT) the woman Gender mismatch the woman Gender mismatch irrelevant

Conditions
a. Scrambled - Gender Mismatch: Adverb / [his / which NP]-dat / Adverb / NP(female)-nom / Adverb / NP-acc / verb-Q / NP(male)-top / verb
b. Scrambled - Gender Match: Adverb / [his / which NP]-dat / Adverb / NP(male)-nom / Adverb / NP-acc / verb-Q / NP(female)-top / verb
c. Non-scrambled - Gender Mismatch: Adverb / [his / which NP]-nom / Adverb / NP(female)-dat / Adverb / NP-acc / verb-Q / NP(male)-top / verb
d. Non-scrambled - Gender Match: Adverb / [his / which NP]-nom / Adverb / NP(male)-dat / Adverb / NP-acc / verb-Q / NP(male)-top / verb

Examples
a. 台所で 彼の どの子供に 朝食後 叔母が 急いで お弁当を 渡したか 父親は 覚えていた。
b. 台所で 彼の どの子供に 朝食後 叔父が 急いで お弁当を 渡したか 叔母は 覚えていた。
c. 台所で 彼の どの子供が 朝食後 叔母に 急いで お弁当を 渡したか 父親は 覚えていた。
d. 台所で 彼の どの子供が 朝食後 叔父に 急いで お弁当を 渡したか 父親は 覚えていた。
(Gloss of a: roughly, ‘In the kitchen, the father remembered to which of his children the aunt hurriedly handed a lunchbox after breakfast.’)

Design & Procedure –2 x 2 factorial design –4 lists were created by distributing 24 items in a Latin Square design –56 filler sentences –Comprehension questions: matching a subject with a predicate –Self-paced reading task (Moving Window) –40 native speakers of Japanese
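For readers unfamiliar with the design, here is a generic sketch of Latin-square list construction (standard practice, not code from this study): every list contains each item exactly once, each condition appears equally often, and each participant reads only one list.

```python
# Generic Latin-square assignment: each list gets every item exactly once and
# each condition equally often; each subject sees one list (plus fillers).
N_ITEMS, N_CONDITIONS, N_LISTS = 24, 4, 4

def latin_square_lists(n_items=N_ITEMS, n_conditions=N_CONDITIONS, n_lists=N_LISTS):
    lists = [[] for _ in range(n_lists)]
    for item in range(n_items):
        for lst in range(n_lists):
            condition = (item + lst) % n_conditions   # rotate conditions across lists
            lists[lst].append((item + 1, "abcd"[condition]))
    return lists

lists = latin_square_lists()
print(lists[0][:4])   # [(1, 'a'), (2, 'b'), (3, 'c'), (4, 'd')]
print(lists[1][:4])   # [(1, 'b'), (2, 'c'), (3, 'd'), (4, 'a')]
```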

Self-paced reading task

Self-paced reading task どの子供に

Self-paced reading task 叔母は

Self-paced reading task 母親が

Self-paced reading task ケーキを

Self-paced reading task 焼いたと

Self-paced reading task 台所で

Self-paced reading task お手伝いさんに

Self-paced reading task 知らせましたか。

Results: Scrambled conditions Slowdown at mismatching NP is observed. his/her ± Match F1(1,39) = 8.6, p < .01; F2(1,23) = 7.4, p < .01

Results: Non-scrambled conditions Slowdown at mismatching NP only when NP is a possible antecedent. his/her ± Match Fs < 1

Summary: Experiment 3 NP-nom Verb HIS-WH gap Binding constraint application takes place in advance of the verb. Wh-gap is posited in a simple clause.

Experiment 3 (off-line): Grammatical judgment test which of his children (DAT) the man (NOM) … which of his children (NOM) the man (DAT) … *?

Experiment 3 (off-line): Stimuli
a. Non-wh, Non-scrambled: [His children]-nom Adv the man-dat
b. Non-wh, Scrambled: [His children]-dat Adv the man-nom
c. Wh, Non-scrambled: [Which of his children]-nom Adv the man-dat
d. Wh, Scrambled: [Which of his children]-dat Adv the man-nom

Experiment 3 (off-line): Design & Procedure –4 lists were created by distributing 32 items in a Latin Square design –16 items used the same materials as the online test; 16 items were new –32 filler sentences –Anaphoric relation judgment task –40 native speakers of Japanese, the same individuals as in the online test

Experiment 3 (off-line): Results Backwards anaphora is accepted more readily in the scrambled conditions. This confirms that the binding contrasts assumed in the online test are correct.

Grammatical Search and Reanalysis David Schneider Colin Phillips Journal of Memory & Language, 2001

Previous study showed incremental application of grammatical constraints, i.e. derivations operate on a word-by-word time-scale. Next study: do real-time derivations show consistency, i.e. does structure-building keep to a single derivation? How is the grammar searched to find possible analyses?

Grammatical Search the man knows the woman NP VP V NP S (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V S (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V S It’s clear that this is the right conclusion, but it’s less clear how the system reaches this conclusion. (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V Option 1: combine with a local NP, ignoring existing status of the NP. (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V Option 1: combine with a local NP, ignoring existing status of the NP. (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V Option 2: search the structure for an NP subject that currently lacks a θ-role, i.e., focused search. (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V Option 2: search the structure for an NP subject that currently lacks a θ-role, i.e., focused search. (Schneider & Phillips, 2001)

Grammatical Search the man knows the woman NP VP V S NP likes V Option 2: search the structure for an NP subject that currently lacks a θ-role, i.e., focused search. This fails, so reanalysis is needed, … but only as a last resort operation.

the man knows the woman NP VP V S NP likes V Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP likes V S’ NP t who Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP likes V S’ NP t who S Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. S’ NP t who An unconstrained search will not be affected by the presence of the higher NP. likes V (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. S’ NP t who An unconstrained search will not be affected by the presence of the higher NP. likes V S (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. S’ NP t who An unconstrained search will not be affected by the presence of the higher NP. likes V S Probe Antecedents for reflexives. (Schneider & Phillips, 2001)

the man knows the woman NP VP V S NP Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. S’ NP t who An unconstrained search will not be affected by the presence of the higher NP. likes V S Probe Antecedents for reflexives. ...the recipe herself ...the recipe himself (Schneider & Phillips, 2001)

Test Case If there is a higher NP, currently lacking a θ-role, a focused search will find it. An unconstrained search will not be affected by the presence of the higher NP. Probe Antecedents for reflexives. the man knows the woman NP VP V S NP likes V S’ NP t who S ...the recipe herself ...the recipe himself (Schneider & Phillips, 2001)

Grammatical Search Relative to its unambiguous control, high attached reflexives pose no difficulty. (Schneider & Phillips, 2001)

Grammatical Search Relative to its unambiguous control, high attached reflexives pose no difficulty. …but low attached reflexives present great difficulty. (Schneider & Phillips, 2001)

Therefore... High attachment is chosen …despite well-known local-attachment biases. the man knows the woman NP VP V S NP likes V S’ NP t who S Grammatical search is focused, constrained by existing commitments (Schneider & Phillips, 2001)

Therefore... High attachment is chosen …despite well-known local-attachment biases. Grammatical search is focused, constrained by existing commitments knows the woman NP VP V S likes V the man NP This helps us to understand how grammatical search proceeds in simple cases.
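The focused-search idea can be sketched roughly as follows (toy tree encoding and example are assumptions of mine, not the Schneider & Phillips materials): when a new verb needs a subject, the search only considers NPs that still lack a θ-role, so a role-bearing local NP is skipped in favor of a higher NP that still needs one.

```python
# Sketch of focused grammatical search (encoding invented for illustration):
# nodes record whether an NP has already received a theta-role; a search for a
# subject only considers NPs that still lack one.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    label: str
    words: str = ""
    has_theta_role: bool = False
    children: List["Node"] = field(default_factory=list)

def focused_search(root: Node) -> Optional[Node]:
    """Return an NP in the existing structure that still lacks a theta-role."""
    stack = [root]
    while stack:
        node = stack.pop()
        if node.label == "NP" and not node.has_theta_role:
            return node
        stack.extend(node.children)
    return None

# Invented mini-structure: one NP has already been assigned a theta-role by an
# earlier verb; the displaced wh-phrase has not, so only it is a candidate.
who      = Node("NP", "who (the man)", has_theta_role=False)
woman    = Node("NP", "the woman", has_theta_role=True)
relative = Node("S'", children=[who, Node("S", children=[woman, Node("VP", "likes")])])

result = focused_search(relative)
print(result.words)   # -> 'who (the man)': the role-bearing local NP is skipped
```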

Conclusions Local attachment is easy The local reanalysis is easy (cf. Sturt et al. 2000) So why is local attachment avoided? Reanalysis is a Last Resort (even if it’s easy)

Reversal of results…

NP-biased: hear, warn, understand S-biased: claim, believe, suspect

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Time Resolution Syntax: phrase-by-phrase time scale Reading-time studies: word-by-word time scale Brain recordings: millisecond time scale

Event-Related Potentials (ERPs) s1 s2 s3 John is laughing.
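For readers new to ERPs, a minimal sketch of how an ERP waveform is obtained (the sampling rate, onsets, and simulated signal are made up): epochs of the continuous EEG are cut out time-locked to stimulus onsets such as s1, s2, s3, baseline-corrected, and averaged, so that activity not time-locked to the words averages out.

```python
# Minimal ERP sketch: average EEG epochs time-locked to stimulus onsets.
# Sampling rate, onsets, and the simulated signal are illustrative only.
import numpy as np

fs = 500                               # Hz
eeg = np.random.randn(60 * fs)         # 60 s of fake single-channel EEG
onsets_s = [5.0, 12.0, 19.0, 26.0]     # word onsets (s1, s2, ...), in seconds
pre, post = 0.1, 0.8                   # epoch: 100 ms before to 800 ms after onset

epochs = []
for t in onsets_s:
    start = int((t - pre) * fs)
    stop = int((t + post) * fs)
    epoch = eeg[start:stop]
    epoch = epoch - epoch[: int(pre * fs)].mean()   # baseline-correct on pre-stimulus interval
    epochs.append(epoch)

erp = np.mean(epochs, axis=0)          # averaging attenuates activity not time-locked to the words
times_ms = (np.arange(erp.size) / fs - pre) * 1000
print(erp.shape, times_ms[0], times_ms[-1])         # (450,) -100.0 ... 898.0 ms
```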

Evolving understanding of ERP components associated with language…

Semantically unexpected input She spread the warm bread with socks. (Kutas & Hillyard, 1980) She was stung by a fly. (Kutas et al., 1984; Federmeier & Kutas, 1999) (Slide from Kaan, 2001)

N400 Negative polarity peaking at around 400 ms; central scalp distribution (Slide from Kaan, 2001)

ERP Sentence Processing Developing understanding of N400 is informative Response to ‘violations’ N400 I drink my coffee with cream and sugar I drink my coffee with cream and socks Kutas & Hillyard (1980)

Vervet Monkeys Many predators: leopard, lion, hyena, jackal, eagle, etc. Distinct alarm calls for different predators

ERP Sentence Processing Developing understanding of N400 is informative Response to normal sentences N400 Fully Congruent Most new drugs are tested on white lab rats. Van Petten & Kutas (1991)

ERP Sentence Processing N400 N400 to semantic anomalies is a special case of a much more general phenomenon All words elicit N400-like response, timing and amplitude proportional to congruency, frequency, etc. More detailed understanding is contingent on more detailed models of semantic interpretation

Morpho-Syntactic violations Every Monday he mows the lawn. Every Monday he *mow the lawn. The plane brought us to paradise. The plane brought *we to paradise. (Coulson et al., 1998) (Slide from Kaan, 2001)

he mows he *mow P600 (Slide from Kaan, 2001)

he mows he *mow P600 Left Anterior Negativity (LAN) (Slide from Kaan, 2001)

ERP Sentence Processing LAN, P600 Sie bereist das(neuter) Land(neuter) … Sie bereist den(masculine) Land(neuter) … ‘she travels the land’... Gunter et al. (2000)

Kaan et al. (2000) ERP Sentence Processing P600 Emily wondered who the performer in the concert had imitated for the audience’s amusement. Emily wondered whether the performer in the concert had imitated a pop star for the audience’s amusement. P600 reflects normal structure- building processes.

Electrophysiology of Wh-movement Colin Phillips, Nina Kazanina, Shani Abada, Daniel Garcia-Pedrosa [revision of work done at UDel in 2000]

Experiment Design Materials a. The actress wished that the producers knew that the witty host would tell the jokes during the party. b. The actress wished that the producers knew which jokes the witty host would tell __ during the party. c. The producers knew that the actress wished that the witty host would tell the jokes during the party. d. The producers knew which jokes the actress wished that the witty host would tell __ during the party.

a. The actress wished that the producers knew that the witty host would tell the jokes during the party. b. The actress wished that the producers knew which jokes the witty host would tell during the party. c. The producers knew that the actress wished that the witty host would tell the jokes during the party. d. The producers knew which jokes the actress wished that the witty host would tell during the party. Short wh-dependency Experiment Design Materials

a. The actress wished that the producers knew that the witty host would tell the jokes during the party. b. The actress wished that the producers knew which jokes the witty host would tell during the party. c. The producers knew that the actress wished that the witty host would tell the jokes during the party. d. The producers knew which jokes the actress wished that the witty host would tell during the party. Long wh-dependency Experiment Design Materials

+wh -wh Electrode PZ Short Conditions n=20 Effect of wh-movement significant (p<.01) from … ms onwards Verb

+wh -wh Electrode PZ Long Conditions n=20 Effect of wh-movement significant (p<.01) from … ms onwards Verb

How fast is Structural Computation? Silke Urban Colin Phillips

Background Early Left Anterior Negativity (Angela Friederici, Anja Hahne, et al.)

Neville et al., 1991 The scientist criticized a proof of the theorem. The scientist criticized Max’s of proof the theorem.

500ms/word

Hahne & Friederici, 1999  Das Baby wurde gefüttert The baby was fed  Das Baby wurde im gefüttert The baby was in-the fed Question: are the brain responses to violations automatic?

Hahne & Friederici, 1999 P600

Hahne & Friederici, 1999 ELAN

Very fast: ms Automatic Left anterior (= frontal) scalp distribution Elicited by a subclass of syntactic violations “ Phase 1 (100–300 ms) represents the time window in which the initial syntactic structure is formed on the basis of information about the word category.” (Friederici 2002)

Questions about ELAN How plausible is it that ELAN reflects syntactic structure building? –Speed: 150ms is faster than lexical access! –Generality: ELAN is not elicited by most violations; almost all studies on ELAN involve one construction (for each of German, English) –Localization…

Brodmann Areas

160 SQUID whole-head array pickup coil & SQUID assembly Magnetoencephalography (MEG)

(Friederici et al. 2000)

Two regions of interest (Friederici et al. 2000)

Anterior Temporal Lobe? Why is anterior temporal lobe so important in ELAN? How does it differ from Broca’s area (BA 44 etc.) that are implicated so often in other studies? Friederici: both responsible for ‘structure building’; BA44 also responsible for ‘syntactic working memory’; ‘the inferior portion of BA44 is selectively activated when syntactic processes are in focus.’ Anterior temporal lobe associated with –lexical information –activated in fMRI by comparisons of sentences with word lists

Alternative Interpretation ELAN reflects violation/suppression of automatic lexical prediction –accounts for localization to anterior temporal lobe –accounts for very early timing –might account for automaticity –accounts for very limited distribution

Neville et al., 1991 The scientist criticized a proof of the theorem. The scientist criticized Max’s of proof the theorem. NP Max’s N

Hahne & Friederici, 1999  Das Baby wurde gefüttert The baby was fed  Das Baby wurde im gefüttert The baby was in-the fed NP the N in PP

Prediction If ELAN reflects violation of lexical prediction, rather than syntactic structure-building, then… –change lexical predictions –keep syntactic violation the same –should ‘turn off’ ELAN brain response

Neville et al., 1991 The scientist criticized a proof of the theorem. The scientist criticized Max’s of proof the theorem. NP Max’s N Possible to block the automatic prediction of an N following a possessor: ellipsis

Ellipsis Possessors may appear alone in ellipsis contexts: Although I like Mary’s theory, I don’t like John’s.

Experimental Conditions Although Erica kissed Mary’s mother, she did not kiss the daughter of the bride. Although Erica kissed Mary’s mother, she did not kiss Dana’s of the bride. Although the bridesmaid kissed Mary, she did not kiss the daughter of the bride. Although the bridesmaid kissed Mary, she did not kiss Dana’s of the bride.

Experimental Conditions Although Erica kissed Mary’s mother, she did not kiss the daughter of the bride. Although Erica kissed Mary’s mother, she did not kiss Dana’s of the bride. Although the bridesmaid kissed Mary, she did not kiss the daughter of the bride. Although the bridesmaid kissed Mary, she did not kiss Dana’s of the bride.  ellipsis possible ellipsis impossible

Experimental Design 384 sentences per session –128 targets (drawn from 128 sets of 4 conditions) –64 items designed to elicit ‘agreement violation’ LAN –192 filler items, designed to hide violations and promote ellipsis Procedure –RSVP (Rapid Serial Visual Presentation), 500ms/word –Grammaticality judgment task Recording: 32-electrode montage 22 subjects (so far)

+

Although

Erica

kissed

Mary’s

mother,

she

did

not

kiss

Dana’s

of

the

bride.

??? good bad

Preliminary Results a. Although … Mary’s mother … Dana’s of … b. Although … Mary … Dana’s of … b a

Interim Conclusion Preliminary results lend support to our interpretation of the (E)LAN - the anterior negativity is reduced in an ellipsis context –structural violation is identical in both conditions –obligatory lexical prediction of N following possessor (e.g. Mary’s…) is absent in ellipsis context Structure-building may begin ~ ms after a word is presented

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

The Binding Problem Discrete infinity Individual neurons (or groups of neurons) can store finite information about objects, words, etc. But sentences are infinite in number! Representing structure: can’t just activate all words e.g. THE + MAN + ATE + PIZZA Must create & discard structures quickly: 100s of msec

Temporal Binding the man ate pizza phase locked ~25ms, 40Hz
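A schematic rendering of the temporal-binding idea, using only the numbers quoted on the slide (~25 ms cycle, 40 Hz) plus an assumed slot width; this illustrates the concept rather than any model from the talk: items bound into one representation occupy distinct phase slots within a shared gamma cycle, which also makes the capacity limit and the single level of hierarchy easy to see.

```python
# Toy phase-slot illustration of temporal binding (a sketch of the concept,
# not a model from the talk): items bound into one representation occupy
# distinct phase slots within a shared ~25 ms (40 Hz) gamma cycle.
GAMMA_PERIOD_MS = 25.0     # ~40 Hz, as quoted on the slide
SLOT_MS = 3.0              # assumed width needed to keep slots distinguishable
MAX_BINDINGS = int(GAMMA_PERIOD_MS // SLOT_MS)   # ~8: close to the ~7-item limit noted below

def bind(words):
    """Assign each word a firing time (phase slot) within one gamma cycle."""
    if len(words) > MAX_BINDINGS:
        raise ValueError("too many items to bind in a single cycle")
    return {w: i * SLOT_MS for i, w in enumerate(words)}

print(bind(["the", "man", "ate", "pizza"]))
# {'the': 0.0, 'man': 3.0, 'ate': 6.0, 'pizza': 9.0} -- one flat level of structure;
# the capacity limit and lack of hierarchy are exactly the limitations noted on the next slides.
```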

Evidence from Animals & Humans Direct recordings (cat) EEG recordings (human) (Singer 1999) (Tallon-Baudry & Bertrand 1999)

Limitations… No evidence yet of role in human syntax Limited capacity - ~7 bindings 1-level of hierarchy only Interesting hypothesis (Whitney & Weinberg 2002) –temporal binding is the neural representation used for the ‘syntactic workspace’ –additional neural encoding mechanism used for long-term representation and storage

Outline The Challenge Real-time Grammar Accurate Parsing Incremental Parsing Mapping onto the Brain: Electrophysiology Encoding Outlook

Overview –challenge for unification: real-time hypotheses Real-time Grammar –syntactic derivations look like real-time derivations Accurate Parsing –real-time derivations have the sophistication that is needed Incremental Parsing –real-time interpretation is time-locked to incoming words Mapping onto the Brain: Electrophysiology –extreme time-precision of EEG/MEG can be linked to detailed linguistic constructs Encoding –plausible models of neural encoding of structure are emerging Unification: Problem or Mystery…