Parsers and Grammars Colin Phillips

Outline
– The Standard History of Psycholinguistics
– Parsing and rewrite rules
– Initial optimism
– Disappointment and the DTC
– Emergence of independent psycholinguistics
– Reevaluating relations between competence and performance systems

Standard View
217 x 32 = ?
arithmetic: we compute the answer with a specialized algorithm, grounded in something deeper (knowledge of arithmetic itself)

Standard View
language: grammatical knowledge (competence) provides a recursive characterization of well-formed expressions; speaking and understanding are carried out by specialized algorithms

Standard View
The competence grammar is precise but ill-adapted to real-time operation.

Standard View
The speaking/understanding algorithms are well-adapted to real-time operation but maybe inaccurate.

Grammatical Knowledge
How is grammatical knowledge accessed in syntactic computation for...
(a) grammaticality judgment
(b) understanding
(c) speaking
Almost no proposals under the standard view. This presents a serious obstacle to unification at the level of syntactic computation.

Townsend & Bever (2001, ch. 2) “Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively ‘grammatical’ while being difficult to understand, and conversely.”

Townsend & Bever (2001, ch. 2) “…Despite this distinction the syntactic model had great appeal as a model of the processes we carry out when we talk and listen. It was tempting to postulate that the theory of what we know is a theory of what we do, thus answering two questions simultaneously. 1. What do we know when we know a language? 2. What do we do when we use what we know?

Townsend & Bever (2001, ch. 2) “…It was assumed that this knowledge is linked to behavior in such a way that every syntactic operation corresponds to a psychological process. The hypothesis linking language behavior and knowledge was that they are identical.

Miller (1962)
1. Mary hit Mark. K(ernel)
2. Mary did not hit Mark. N
3. Mark was hit by Mary. P
4. Did Mary hit Mark? Q
5. Mark was not hit by Mary. NP
6. Didn't Mary hit Mark? NQ
7. Was Mark hit by Mary? PQ
8. Wasn't Mark hit by Mary? PNQ

Miller (1962) Transformational Cube

Townsend & Bever (2001, ch. 2) “The initial results were breathtaking. The amount of time it takes to produce a sentence, given another variant of it, is a function of the distance between them on the sentence cube. (Miller & McKean 1964).” “…It is hard to convey how exciting these developments were. It appeared that there was to be a continuing direct connection between linguistic and psychological research. […] The golden age had arrived.”

Townsend & Bever (2001, ch. 2) “Alas, it soon became clear that either the linking hypothesis was wrong, or the grammar was wrong, or both.”

Townsend & Bever (2001, ch. 2) “The moral of this experience is clear. Cognitive science made progress by separating the question of what people understand and say from how they understand and say it. The straightforward attempt to use the grammatical model directly as a processing model failed. The question of what humans know about language is not only distinct from how children learn it, it is distinct from how adults use it.”

A Simple Derivation
S (starting axiom)
1. S → NP VP
2. VP → V NP
3. NP → D N
4. N → Bill
5. V → hit
6. D → the
7. N → ball
Applying the rules in sequence expands S into the tree [S [NP Bill] [VP [V hit] [NP [D the] [N ball]]]]: 'Bill hit the ball'.
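
To make the derivation concrete, here is a minimal sketch (mine, not from the slides) of top-down rewriting in Python. One assumption: the slide's tree places Bill directly under the subject NP, so the sketch adds an NP → N step that the slide leaves implicit.

```python
# Toy top-down derivation using the slide's rewrite rules.
# Each step rewrites the leftmost occurrence of the rule's left-hand side.

def derive(start, steps):
    symbols = [start]
    print(" ".join(symbols))
    for lhs, rhs in steps:
        i = symbols.index(lhs)                          # leftmost occurrence
        symbols = symbols[:i] + rhs + symbols[i + 1:]
        print(" ".join(symbols))
    return symbols

steps = [
    ("S",  ["NP", "VP"]),   # 1. S -> NP VP
    ("NP", ["N"]),          # implicit on the slide: subject NP is a bare N
    ("N",  ["Bill"]),       # 4. N -> Bill
    ("VP", ["V", "NP"]),    # 2. VP -> V NP
    ("V",  ["hit"]),        # 5. V -> hit
    ("NP", ["D", "N"]),     # 3. NP -> D N
    ("D",  ["the"]),        # 6. D -> the
    ("N",  ["ball"]),       # 7. N -> ball
]
derive("S", steps)          # final line printed: Bill hit the ball
```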

Reverse the derivation...

A Simple Derivation (reversed)
Working bottom-up, word by word:
Bill → [NP Bill]
hit → [NP Bill] [V hit]
the → [NP Bill] [V hit] [D the]
ball → [NP Bill] [V hit] [D the] [N ball]
reduce: [D the] [N ball] → [NP the ball]
reduce: [V hit] [NP the ball] → [VP hit the ball]
reduce: [NP Bill] [VP hit the ball] → [S Bill hit the ball]
The same rules 1-7 apply in reverse to combine the words into constituents until S is reached.
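
Reversing the derivation amounts to bottom-up (shift-reduce) parsing. A minimal sketch under the same toy grammar (with 'Bill' entered directly as NP for brevity): nothing combines until the final word arrives, which is exactly the incrementality problem, and deciding when to reduce is where indeterminacy creeps in.

```python
# Toy shift-reduce recognizer: the rewrite rules run "backwards".
# Greedy policy: after each shift, reduce whenever the top of the stack
# matches the right-hand side of some rule.

RULES = [
    ("NP", ["D", "N"]),
    ("VP", ["V", "NP"]),
    ("S",  ["NP", "VP"]),
]
LEXICON = {"Bill": "NP", "hit": "V", "the": "D", "ball": "N"}

def parse(words):
    stack = []
    for w in words:
        stack.append(LEXICON[w])                 # shift
        reduced = True
        while reduced:                           # reduce greedily
            reduced = False
            for lhs, rhs in RULES:
                if stack[-len(rhs):] == rhs:
                    del stack[-len(rhs):]
                    stack.append(lhs)
                    reduced = True
        print(w, "->", stack)
    return stack == ["S"]

print(parse("Bill hit the ball".split()))        # True
```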

Transformations
wh-movement
Structural description: X - wh-NP - Y (indexed 1 2 3)
Structural change: 2 1 0 3 (the wh-phrase is fronted, leaving a null, 0, in its original position)

Transformations
VP-ellipsis
Structural description: X - VP1 - Y - VP2 - Z (indexed 1 2 3 4 5)
Structural change: 1 2 3 0 5 (the second VP deletes, leaving a null)
Condition: VP1 = VP2
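
Read mechanically, a structural change is just a permutation over the factored string, with 0 inserting a null. A toy sketch, assuming the reconstruction of the notation above; the factorization is supplied by hand and do-support is ignored.

```python
# Apply a structural change such as "2 1 0 3" to a hand-factored string.
# 0 stands for the null (trace) left behind by movement.

def apply_sc(factors, change):
    pieces = ["_" if i == 0 else factors[i] for i in change]
    return " ".join(p for p in pieces if p)

# wh-movement, SD: X - wh-NP - Y  =>  SC: 2 1 0 3
factors = {1: "you think Mary hit", 2: "who", 3: ""}
print(apply_sc(factors, [2, 1, 0, 3]))   # who you think Mary hit _
```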

Difficulties
– How to build structure incrementally in right-branching structures
– How to recognize the output of transformations that create nulls

Summary
– Running the grammar 'backwards' is not so straightforward: problems of indeterminacy and incrementality
– Disappointment in empirical tests of the Derivational Theory of Complexity
– Unable to account for the processing of local ambiguities

Standard View specialized algorithm recursive characterization of well-formed expressions speaking understanding grammatical knowledge, competence language

Grammatical Knowledge
How is grammatical knowledge accessed in syntactic computation for...
(a) grammaticality judgment
(b) understanding
(c) speaking
Almost no proposals under the standard view. This presents a serious obstacle to unification at the level of syntactic computation.

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Grammar as Parser - Problems
– Incremental structure building with PS rules (e.g. S → NP VP): delay, prediction/guessing
– Indeterminacy (how to recover nulls created by transformations)

Grammar as Parser - Solutions
Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)
'sat on the rug': VP → V PP, PP → P NP
[VP [V sat] [PP [P on] [NP the rug]]]

Grammar as Parser - Solutions
Lexicalized grammars make incremental structure-building much easier (available in HPSG, minimalism, LFG, Categorial Grammar, etc.)
'sat on the rug': sit (comp: __ P), on (comp: __ N)
Each head's lexical entry specifies the complement it expects next.
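
A minimal sketch of why lexicalization helps (the category labels and frame format are illustrative, not those of any particular framework): each head announces the complement it expects, so every incoming word can attach the moment it arrives instead of waiting for a complete rule expansion.

```python
# Incremental attachment driven by lexical complement frames.  Each word
# projects a phrase and either satisfies the most recent open complement
# expectation or opens new material; heads then post their own expectation.

LEX = {                     # word: (projected category, expected complement)
    "sat": ("VP", "PP"),    # sit subcategorizes for a PP
    "on":  ("PP", "NP"),    # on subcategorizes for an NP
    "the": ("NP", "N"),     # the determiner projects the NP, expects its N
    "rug": ("N",  None),
}

def attach_incrementally(words):
    expecting = []                          # stack of open complement slots
    for w in words:
        proj, comp = LEX[w]
        if expecting and expecting[-1] == proj:
            expecting.pop()                 # w is the expected complement
            print(f"{w}: attaches as the expected {proj}")
        else:
            print(f"{w}: opens a new {proj}")
        if comp:
            expecting.append(comp)          # w's own complement expectation
    return not expecting                    # all expectations satisfied?

print(attach_incrementally(["sat", "on", "the", "rug"]))   # True
```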

Grammar as Parser - Solutions Problem of seeking nulls in movement structures

Transformations
wh-movement
Structural description: X - wh-NP - Y (indexed 1 2 3)
Structural change: 2 1 0 3 (the wh-phrase is fronted, leaving a null, 0, in its original position)

Transformations
VP-ellipsis
Structural description: X - VP1 - Y - VP2 - Z (indexed 1 2 3 4 5)
Structural change: 1 2 3 0 5 (the second VP deletes, leaving a null)
Condition: VP1 = VP2

Grammar as Parser - Solutions
The problem of seeking nulls in movement structures becomes the problem of seeking licensing features for displaced phrases, e.g. for a wh-phrase, seek a Case assigner and a thematic role assigner. The requirement to find licensing features is a basic component of all syntactic composition.

Incremental Structure Building An investigation of the grammatical consequences of incremental, left-to-right structure building

Incremental Structure Building
Building left-to-right with abstract terminals:
A → [A B] → [A [B C]] → [A [B [C D]]] → [A [B [C [D E]]]]
[A B] is a constituent until C is added; the constituent is destroyed by the addition of new material. Likewise [B C] is a constituent until D is added.

Incremental Structure Building
the cat → the cat sat → the cat sat on → the cat sat on the rug
[sat on] is a temporary constituent, which is destroyed as soon as the NP [the rug] is added.
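
A sketch of the 'snapshot' idea: represent each parse state as a nested bracketing (the bracketings are hand-supplied, following the slides) and check at each moment whether [sat on] is a constituent.

```python
# Snapshots of strictly left-to-right assembly; nested lists stand in for
# trees.  [sat on] is a constituent at step 3 and gone at step 4.

def leaves(tree):
    return [tree] if isinstance(tree, str) else [w for s in tree for w in leaves(s)]

def constituents(tree):
    """Every subtree of the tree, identified by its flat word tuple."""
    if isinstance(tree, str):
        return set()
    out = {tuple(leaves(tree))}
    for sub in tree:
        out |= constituents(sub)
    return out

snapshots = [
    ["the", "cat"],
    [["the", "cat"], "sat"],
    [["the", "cat"], ["sat", "on"]],                    # temporary constituent
    [["the", "cat"], ["sat", ["on", ["the", "rug"]]]],  # ...now destroyed
]
for snap in snapshots:
    print(" ".join(leaves(snap)), "->", ("sat", "on") in constituents(snap))
```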

Incremental Structure Building
Conflicting Constituency Tests
Verb + Preposition sequences can undergo coordination...
(1) The cat sat on and slept under the rug.
...but cannot undergo pseudogapping (Baltin & Postal, 1996)
(2) *The cat sat on the rug and the dog did the chair.

Incremental Structure Building
the cat [[sat on] and [slept under]] ...
Coordination applies early, before the V+P constituent is destroyed.

Incremental Structure Building
the cat sat on the rug and the dog did ...
Pseudogapping applies too late, after the V+P constituent is destroyed.

Incremental Structure Building
Constituency Problem: different diagnostics of constituency frequently yield conflicting results.
Incrementality Hypothesis
(a) Structures are assembled strictly incrementally
(b) Syntactic processes see a 'snapshot' of a derivation - they target constituents that are present when the process applies
(c) Conflicts reflect the simple fact that different processes have different linear properties
Applied to interactions among binding, movement, ellipsis, prosodic phrasing, clitic placement, islands, etc. (Phillips 1996, in press; Richards 1999, 2000; Guimaraes 1999; etc.)

Interim Conclusion
Grammatical derivations look strikingly like the incremental derivations of a parsing system. But we want to be explicit about this, so...

Computational Modeling (Schneider 1999; Schneider & Phillips, 1999)

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Townsend & Bever (2001, ch. 2) “Linguists made a firm point of insisting that, at most, a grammar was a model of competence - that is, what the speaker knows. This was contrasted with effects of performance, actual systems of language behaviors such as speaking and understanding. Part of the motive for this distinction was the observation that sentences can be intuitively ‘grammatical’ while being difficult to understand, and conversely.”

Grammaticality ≠ Parsability “It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common.” (Phillips, 1995)

Grammaticality ≠ Parsability
Grammatical sentences that are hard to parse
– The cat the dog the rat bit chased fled
– John gave the man the dog bit a sandwich
Ungrammatical sentences that are understandable
– Who do you think that left?
– The children is happy
– The millionaire donated the museum a painting

Grammaticality ≠ Parsability
Grammatical sentences that are hard to parse
– The cat the dog the rat bit chased fled
– John gave the man the dog bit a sandwich
Difficulty can arise independently of the grammar
– resource (memory) limitations
– incorrect choices at ambiguities

(Preliminary) Incomplete structural dependencies have a cost - that is what makes center embedding hard.
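
To make 'cost of incomplete dependencies' concrete, a toy counter (a drastic simplification, with hand-annotated increments): each subject noun opens a dependency that only a later verb can close, and the load at a word is the number still open. In the center-embedded example the load climbs to three before a single verb arrives.

```python
# Toy memory-load counter: +1 when a word opens a dependency awaiting a
# later verb, -1 when a verb closes one.  Annotations supplied by hand.

def trace_load(annotated):
    load = 0
    for word, delta in annotated:
        load += delta
        print(f"{word:8s} open dependencies: {load}")

# "The cat the dog the rat bit chased fled"
trace_load([("cat", 1), ("dog", 1), ("rat", 1),
            ("bit", -1), ("chased", -1), ("fled", -1)])
```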

A Contrast (Gibson 1998)
Relative Clause within a Sentential Complement (RC-in-SC):
The fact [CP that the employee [RC who the manager hired] stole office supplies] worried the executive.
Sentential Complement within a Relative Clause (SC-in-RC):
#The executive [RC who the fact [CP that the employee stole office supplies] worried] hired the manager.
RC-in-SC is easier to process than SC-in-RC.

A Contrast (Gibson 1998)
RC-in-SC: [SC that the employee [RC who the manager hired] stole ...]
SC-in-RC: [RC who the fact [SC that the employee stole office supplies] worried]
RC-in-SC is easier to process than SC-in-RC. The contrast is motivated by off-line complexity ratings.

Grammaticality ≠ Parsability
Ungrammatical sentences that are understandable
– Who do you think that left?
– The children is happy
– The millionaire donated the museum a painting
The system can represent illegal combinations (e.g. the categories are appropriate, but feature values are inappropriate). The fact that understandable errors are (i) diagnosable and (ii) nearly grammatical should not be overlooked.

Grammaticality ≠ Parsability Are the parser’s operations fully grammatically accurate?

Standard View
language: grammatical knowledge (competence) provides a recursive characterization of well-formed expressions; the specialized algorithms for speaking and understanding are well-adapted to real-time operation but maybe inaccurate.

Grammatical Accuracy in Parsing
The grammar looks rather like a parser. BUT, does the parser look like a grammar? That is, are the parser's operations fully grammatically accurate at every step, even in situations where such accuracy appears quite difficult to achieve? (Phillips & Wong 2000)

Self-Paced Reading (e.g. Phillips & Wong 2000)
Words are presented one at a time; each button press reveals the next word, so we can measure reading time per word: 'We -- can measure reading time per word.'
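
A minimal console sketch of the moving-window paradigm just described: each key press reveals the next word, and the time between presses is logged as that word's reading time. The use of input() is only an approximation; real experiments use dedicated presentation software and button boxes.

```python
import time

def self_paced_reading(sentence):
    """Moving-window self-paced reading: press Enter to reveal each word."""
    words, times = sentence.split(), []
    for word in words:
        start = time.perf_counter()
        input(word + "  ")                      # word stays up until key press
        times.append(time.perf_counter() - start)
    return list(zip(words, times))

for word, rt in self_paced_reading("We can measure reading time per word."):
    print(f"{word:10s} {rt * 1000:6.0f} ms")
```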

Grammatical Accuracy in Parsing Wh-Questions (Phillips & Wong 2000)

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook wonderful dinners. (Phillips & Wong 2000)

Grammatical Accuracy in Parsing Wh-Questions Englishmen cook what (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
Wh-Questions
What do Englishmen cook [gap]? ✓ (the fronted wh-phrase is linked to a gap after the verb) (Phillips & Wong 2000)

Grammatical Accuracy in Parsing Long-distance Wh-Questions Few people think that anybody realizes that Englishmen cook wonderful dinners (Phillips & Wong 2000)

Grammatical Accuracy in Parsing Long-distance Wh-Questions Few people think that anybody realizes that Englishmen cook what (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
Long-distance Wh-Questions
What do few people think that anybody realizes that Englishmen cook [gap]? ✓ (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
‘Parasitic Gaps’
The plan to remove the equipment ultimately destroyed the building.
[Subject NP The plan [Embedded Clause to remove the equipment]] [Main Clause ultimately destroyed [Direct Object NP the building]] (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
‘Parasitic Gaps’
What did the plan to remove the equipment ultimately destroy [gap]? ✓ (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
‘Parasitic Gaps’
*What did the plan to remove [gap] ultimately destroy the building? ✗
Subject Island Constraint: a wh-phrase cannot be moved out of a subject. (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
‘Parasitic Gaps’
What did the plan to remove [gap] ultimately destroy [gap]? ✓ (infinitive)
Parasitic Gap Generalization: the good gap ‘rescues’ the bad gap. (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
‘Parasitic Gaps’
*What did the plan that removed [gap] ultimately destroy [gap]? ✗ (finite)
Revised Generalization (informal): only mildly bad gaps can be rescued by good gaps.

Grammaticality Ratings
Ratings from 50 subjects (figure not reproduced in transcript)

Grammatical Accuracy in Parsing
A ‘Look-Ahead’ Problem
What did the plan to remove [gap] ultimately destroy [gap]? ✓ (infinitive)
The good gap rescues the bad gap, BUT the bad gap appears before the good gap... a look-ahead problem.

Grammatical Accuracy in Parsing
A ‘Look-Ahead’ Problem
What did the plan to remove [gap] ultimately destroy [gap]? ✓ (infinitive)
Question: when the parser reaches the embedded verb, does it construct a dependency - even though the gap would be a ‘bad’ gap?

Grammatical Accuracy in Parsing
A ‘Look-Ahead’ Problem
What did the plan to remove [gap] ultimately destroy [gap]? ✓ (infinitive) - RISKY
*What did the plan that removed [gap] ultimately destroy [gap]? ✗ (finite) - RECKLESS

Grammatical Accuracy in Parsing
Question: what do speakers do when they get to the verb embedded inside the subject NP?
(i) RISKY: create a gap in infinitival clauses only - violates a constraint, but may be rescued
(ii) RECKLESS: create a gap in all clause types - violates a constraint; cannot be rescued
(iii) CONSERVATIVE: do not create a gap
(Phillips & Wong 2000)
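
The three strategies can be stated as a single decision rule. A schematic sketch, assuming a parse state that records whether there is an unresolved wh-filler, whether the current position is inside a subject island, and whether the clause is infinitival; the function name and arguments are invented for illustration.

```python
# Decision rule for positing a gap at the embedded verb, per strategy.

def posit_gap(strategy, has_filler, in_subject_island, infinitival):
    if not has_filler:
        return False                  # no wh-phrase to associate with a gap
    if not in_subject_island:
        return True                   # ordinary active gap-filling
    if strategy == "conservative":
        return False                  # never violate the island
    if strategy == "risky":
        return infinitival            # violate only where rescue is possible
    if strategy == "reckless":
        return True                   # violate even where rescue is impossible
    raise ValueError(strategy)

for s in ("conservative", "risky", "reckless"):
    print(f"{s:12s} infinitive: {posit_gap(s, True, True, True)}"
          f"  finite: {posit_gap(s, True, True, False)}")
```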

Grammatical Accuracy in Parsing
Materials
a. ...what ... infinitival verb... [infinitive, gap ok] - a gap here is RISKY
b. ...whether ... infinitival verb... [infinitive, no gap]
c. ...what ... finite verb... [finite, gap not ok] - a gap here is RECKLESS
d. ...whether ... finite verb... [finite, no gap]
(Phillips & Wong 2000)

Grammatical Accuracy in Parsing
Materials
a. The outspoken environmentalist worked to investigate what the local campaign to preserve the important habitats had actually harmed in the area that the birds once used as a place for resting while flying south. [infinitive, gap]
b. ...whether the local campaign to preserve... [infinitive, no gap]
c. ...what the local campaign that preserved... [finite, gap]
d. ...whether the local campaign that preserved... [finite, no gap]
(Phillips & Wong 2000)

Grammatical Accuracy in Parsing
What did the plan to remove [gap] ultimately destroy [gap]? ✓ (infinitive) - RISKY (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
*What did the plan that removed [gap] ultimately destroy [gap]? ✗ (finite) - RECKLESS (Phillips & Wong 2000)

Grammatical Accuracy in Parsing
Conclusion
– Structure-building is extremely grammatically accurate, even when the word order of a language is not cooperative
– Constraints on movement are violated in exactly the environments where the grammar allows the violation to be forgiven (this may help to explain discrepancies in past studies)
– Such accuracy is required if grammatical computation is to be understood as real-time, on-line computation

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Derivational Theory of Complexity ‘The psychological plausibility of a transformational model of the language user would be strengthened, of course, if it could be shown that our performance on tasks requiring an appreciation of the structure of transformed sentences is some function of the nature, number and complexity of the grammatical transformations involved.’ (Miller & Chomsky 1963: p. 481)

Miller (1962)
1. Mary hit Mark. K(ernel)
2. Mary did not hit Mark. N
3. Mark was hit by Mary. P
4. Did Mary hit Mark? Q
5. Mark was not hit by Mary. NP
6. Didn't Mary hit Mark? NQ
7. Was Mark hit by Mary? PQ
8. Wasn't Mark hit by Mary? PNQ

Miller (1962) Transformational Cube

Derivational Theory of Complexity
Miller & McKean (1964): matching sentences with the same meaning or 'kernel'
Joe warned the old woman. (K) → The old woman was warned by Joe. (P): 1.65 s
Joe warned the old woman. (K) → Joe didn't warn the old woman. (N): 1.40 s
Joe warned the old woman. (K) → The old woman wasn't warned by Joe. (PN): 3.12 s
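
The 'sentence cube' treats each sentence type as a subset of the features {N, P, Q}, and the predicted cost of matching two variants is the number of transformational steps between them, i.e. the size of the symmetric difference of their feature sets. A short sketch pairing the distances with the times above.

```python
# Miller's transformational cube: sentence types as feature sets over
# {N, P, Q}; steps between two types = size of the symmetric difference.

def cube_distance(a, b):
    return len(set(a) ^ set(b))

K, P, N, PN = set(), {"P"}, {"N"}, {"P", "N"}
print(cube_distance(K, P))    # 1 step  (matched in 1.65 s)
print(cube_distance(K, N))    # 1 step  (1.40 s)
print(cube_distance(K, PN))   # 2 steps (3.12 s)
```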

McMahon (1963)
a. i. seven precedes thirteen (K, true)  ii. thirteen precedes seven (K, false)
b. i. thirteen is preceded by seven (P, true)  ii. seven is preceded by thirteen (P, false)
c. i. thirteen does not precede seven (N, true)  ii. seven does not precede thirteen (N, false)
d. i. seven is not preceded by thirteen (PN, true)  ii. thirteen is not preceded by seven (PN, false)

Easy Transformations
Passive
– The first shot the tired soldier the mosquito bit fired missed.
– The first shot fired by the tired soldier bitten by the mosquito missed.
Heavy NP Shift
– I gave a complete set of the annotated works of H.H. Munro to Felix.
– I gave to Felix a complete set of the annotated works of H.H. Munro.
Full Passives
– Fido was kissed (by Tom).
Adjectives
– The {red house / house which is red} is on fire.

Failure of DTC?
Any DTC-like prediction is contingent on a particular theory of grammar, which may be wrong.
It's not surprising that transformations are not the only contributor to perceptual complexity:
– memory demands, which may increase or decrease difficulty
– ambiguity, where the grammar does not help
– difficulty of access

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Garden Paths & Temporary Ambiguity
– The horse raced past the barn fell.
– Weapons test scores a hit.
– John gave the man the dog bit a sandwich.
The grammar can account for the existence of global ambiguities (e.g. 'Visiting relatives can be boring'), but not local ambiguities, since the grammar does not typically assemble structure incrementally.

Garden Paths & Temporary Ambiguity
Ambiguity was originally studied as a test of solutions to the incrementality problem.
Heuristics & Strategies (e.g. Bever, 1970)
– NP V => subject verb
– V NP => verb object
– V NP NP => verb object object
Garden paths were used as evidence for the effects of heuristics.

Garden Paths & Temporary Ambiguity
Heuristics & Strategies
– NP V => subject verb: The horse raced past the barn fell
– V NP => verb object: The student knew the answer was wrong
– V NP NP => verb object object: John gave the man the dog bit a sandwich
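
A toy implementation of the first heuristic (with part-of-speech tags supplied by hand, including the garden-path tagging of 'raced' as a main verb): commit to the first noun-verb sequence as subject plus main verb, and flag a garden path if a second finite verb arrives with the main-verb slot already filled.

```python
# Bever-style "NP V => subject verb" heuristic: commit to the first verb
# as the main verb; a later unattachable verb signals a garden path.

def nvn_heuristic(tagged):
    main_verb_seen = False
    for word, pos in tagged:
        if pos == "V":
            if main_verb_seen:
                return f"garden path at '{word}': main-verb slot already filled"
            main_verb_seen = True       # commit: treat this as the main verb
    return "parsed without surprise"

horse = [("the", "D"), ("horse", "N"), ("raced", "V"), ("past", "P"),
         ("the", "D"), ("barn", "N"), ("fell", "V")]
print(nvn_heuristic(horse))             # garden path at 'fell'
```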

Ambiguity Resolution
Observation: 'heuristics' miss a generalization about how ambiguities are preferentially resolved.
Kimball (1973): seven principles of surface structure parsing (e.g. Right Association)
Frazier (1978), Fodor & Frazier (1978): Minimal Attachment, Late Closure
Various others; much controversy...

Ambiguity Resolution
Assumptions
– grammatical parses are accessed (unclear how)
– the simplest analysis of an ambiguity is chosen (uncontroversial)
– structural complexity affects simplicity (partly controversial)
– structural complexity determines simplicity (most controversial)

Ambiguity Resolution
Relevance to the architecture of language
– Comprehension-specific heuristics that compensate for the inadequacy of the grammar imply an independent system
– Comprehension-specific notions of structural complexity are compatible with an independent system
If the grammar says nothing about ambiguity, and structural complexity is irrelevant to ambiguity resolution, as some argue, then ambiguity is irrelevant to the question of parser-grammar relations.

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Parsing ≠ Production
Parsing generates meaning from form; production generates form from meaning.
Different 'bottlenecks' in the two domains:
– garden paths in comprehension
– the word-category constraint in production errors
– etc.
Lexical access: speaking and recognizing words differ, but do we assume that this reflects different systems?
Contemporary production theories are now incremental structure-building systems, more similar to comprehension models.

Arguments for Architecture
1. Available grammars don't make good parsing devices
2. Grammaticality ≠ Parsability
3. Failure of DTC
4. Evidence for parser-specific structure
5. Parsing/production have distinct properties
6. Possibility of independent damage to parsing/production
7. Competence/performance distinction is necessary, right?

Competence & Performance
Different kinds of formal systems: 'competence systems' and 'performance systems'
– The difference between what a system can generate given unbounded resources and what it can generate given bounded resources
– The difference between a cognitive system and its behavior

Competence & Performance
(1) It's impossible to deny the distinction between cognitive states and actions, the distinction between knowledge and its deployment.
(2) How to distinguish ungrammatical-but-comprehensible examples (e.g. John speaks fluently English) from hard-to-parse examples?
(3) How to distinguish garden-path sentences (e.g. The horse raced past the barn fell) from ungrammatical sentences?
(4) How to distinguish complexity-overload sentences (e.g. The cat the dog the rat chased saw fled) from ungrammatical sentences?

Competence & Performance “It is straightforward enough to show that sentence parsing and grammaticality judgments are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common.” (Phillips, 1995) This argument is spurious!

Summary
– The motivation for combining learning theories with theories of adult knowledge is well understood; much more evidence is needed.
– Theories of comprehension and production were long thought to be independent of 'competence' models.
– In fact, combining them is quite feasible; if this is right, it becomes possible to investigate linguistic knowledge in real time.