1
Double R Theory January 2011
Jerry Ball Human Effectiveness Directorate 711th Human Performance Wing Air Force Research Laboratory
2
Theoretical Foundations Language Representation and Processing
Double R Grammar – Cognitive Linguistic theory of the grammatical encoding of referential and relational meaning
Double R Process – Psycholinguistic theory of the processing of English text into Double R Grammar based representations
Double R Model – Computational implementation using the ACT-R cognitive architecture and modeling environment
DoubleRTheory.com
4
Theoretical Foundations Grounding Language in Experience
Symbol Grounding (Harnad) – Ungrounded symbols are meaningless; there must be a chain from abstract to perceptually grounded concepts that provides the grounding for abstract concepts
Perceptual Symbol Systems (Barsalou) – No purely abstract concepts; the brain is a highly evolved perceptual (motor) organ; imagery simulates perceptual experience
Embodied Cognition (Lakoff et al.) – Abstract concepts are often understood via metaphorical association with more concrete concepts: Good is up / Bad is down; Life is a journey
5
Theoretical Foundations Situation Model
Situation Model (Kintsch et al.)
Originally viewed as a propositional text base (van Dijk & Kintsch) – an elaboration of the propositions in the linguistic input
Now viewed as a Spatial-Imaginal (and Temporal) representation of the objects and situations described by linguistic expressions and encoded directly from the environment (Zwaan et al.)
Non-propositional (in part); non-textual
No available computational implementations
Provides grounding for linguistic representations
6
Abstract Concepts vs. Perceptually Grounded Language
The Prevailing “Cognitive Psychological” View
[Diagram: in the real world, perception of a “pilot” feeds cognition in the mental box, which maps the word “pilot” to an arbitrary internal symbol XY-123 (aka PILOT)]
Concept ~ abstract, amodal fixed point in conceptual space
7
Abstract Concepts vs. Perceptually Grounded Language
An Emerging “Embodied Cognition” View
[Diagram: perception of a “pilot” in the real world grounds a perceptual symbol in the mental box; explicit (perceptual) cognition is the simulation of perceptual experience]
Do we really need abstract concepts? How are they learned?
Concept ~ dynamic and tangled interconnections of associated experiences
8
Language is Grounded in a Situation Model
SRE: Situation Referring Expression; ORE: Object Referring Expression; PRED: Predicate
[Tree for “The horse runs”: the ORE “the horse” functions as subject and refers to the horse; the SRE head “runs” refers to the PRED]
A dynamic mental simulation of a horse running would be better!
9
Language is Grounded in a Situation Model
Each experience of a running event changes the RUN concept!
[Tree for “The paint runs”: the ORE “the paint” functions as subject and refers to the paint; the SRE head “runs” refers to the PRED]
A dynamic mental simulation of paint running would be better!
10
Guiding Linguistic Principles
Jackendoff’s (1983) Grammatical Constraint: …one should prefer a semantic theory that explains otherwise arbitrary generalizations about the syntax and the lexicon…a theory’s deviations from efficient encoding must be vigorously justified, for what appears to be an irregular relationship between syntax and semantics may turn out merely to be a bad theory of one or the other
11
Guiding Linguistic Principles
Langacker’s Cognitive Grammar (1987, 1991)
Grammar is simply the structuring and symbolization of semantic content
Exclusionary Fallacy – the assumption that one analysis, motivation, categorization, cause, function or explanation for a linguistic phenomenon necessarily precludes another
Rule/List Fallacy – the assumption, on grounds of simplicity, that particular statements (i.e. lists) must be excised from the grammar of a language if general statements (i.e. rules) that subsume them can be established
12
Construction Grammar (Fillmore, Goldberg, Sag, etc.)
Constructions—the basic units of grammar—are pairings of form, function and meaning
Form: the man hit the ball
Function: subject – predicator – object
Meaning: hit(agent:man patient:ball) – semantic roles as “concepts” (uppercase word syndrome)
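The form–function–meaning pairing can be sketched as a small data structure. This is purely illustrative; the class and field names are invented and are not part of any published Construction Grammar implementation.

```python
from dataclasses import dataclass

@dataclass
class Construction:
    """A construction pairs a form with a function and a meaning."""
    form: list          # word sequence, one entry per constituent
    function: list      # grammatical functions, one per constituent
    meaning: str        # predicate with semantic roles

# The transitive clause example from the slide
hit = Construction(
    form=["the man", "hit", "the ball"],
    function=["subject", "predicator", "object"],
    meaning="hit(agent:man, patient:ball)",
)

# Each constituent's form is paired with its grammatical function
pairs = list(zip(hit.form, hit.function))
print(pairs)  # [('the man', 'subject'), ('hit', 'predicator'), ('the ball', 'object')]
```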
13
Construction Grammar
Declarative Clause + Intransitive Verb construction: The woman sneezed
Decl Clause + Transitive Verb construction: The man hit the ball
Wh-Question + Ditransitive Verb + Passive construction: Who was given the ball?
Decl Clause + Intrans Verb + Causative construction: The woman sneezed the napkin off the table
14
X-Bar Theory
Key element of Chomsky’s Generative Grammar from the 1970s to the 1990s
Theory of the universal structure of all languages; autonomous from meaning
X-Bar structure presumed to be innate (not learned)
Replaced the Phrase Structure Grammar component of the earlier theory (e.g. S → NP VP; NP → Det N; …)
Has gone through several major revisions, resulting in more and more complex syntactic representations
Subsumed by other theoretical considerations in Chomsky’s Minimalist Program (circa 1995)
15
X-Bar Theory (Chomsky 1970)
Universal structure of all languages, except that relative locations can vary (e.g. complements may occur before or after the head)
Generalization over syntactic categories – NP, VP, AP, PP
[Schema: XP → Specifier X-Bar; X-Bar → X (head) Complement(s)]
Universal structure of all languages is a very strong claim – generative linguists spent the next 20+ years trying to demonstrate it!
16
X-Bar Theory ~ 1993
[Diagram: the X-Bar schema (XP (X’’) → Spec X-Bar (X’); X-Bar → X Comp (YP)) next to the now universal structure of the clause]
Universal structure of all languages – something went seriously wrong!
Locally adheres to the X-Bar schema, but globally very complex!
17
X-Bar Theory (adapted in Ball 2007)
What’s right about X-Bar Theory:
Grammatical functions: specifier, head, complement, modifier (but they need to be semantically motivated)
Generalization over grammatical categories – referring expression
[Schema: the XP is a referring expression with a referential layer and a relational layer; the specifier indicates the referential function; X (the head) is the semantically most significant element; complements are arguments of the relational head]
18
Simpler Syntax (Culicover & Jackendoff 2005)
Reaction against the complex syntactic representations of modern mainstream generative grammar
Against syntactocentrism – if there is a level of meaning representation, then syntactic representations can be simpler
Flat, as opposed to deeply nested, syntactic representations
Culicover & Jackendoff are former students of Chomsky
19
Comprehensive Grammars of English
Cambridge Grammar (Huddleston & Pullum, 2002) – informed by linguistic theory, but attempts to cover most of English with all its exceptions; adds functional categories to syntactic representations
Longman’s Grammar (Quirk et al., 1985) – focus on basic functions of linguistic elements; in the spirit of Functional Grammar, as opposed to Chomsky’s Generative Grammar
20
Double R Grammar
Theory of the grammatical encoding of Referential and Relational meaning
Derived from X-Bar Theory prior to the introduction of functional heads (Chomsky, 1970)
Grammatical Functions (GFs) explicitly represented
Phrase Level: Specifier, Head, Complement, Modifier
Clause Level: Specifier, Head, Subject (Comp), Modifier
Specifier + Head → Referring Expression (Max Proj) – all the grammatical info needed to support reference
Specifier = locus of Referential meaning; Head = locus of Relational meaning
21
Basic Nominal – X-Bar Theory (Chomsky 1970)
[Tree for “the captain”: NP (Maximal Projection) → D N-Bar; N-Bar → N; D is the (implicit) specifier and N the (implicit) head and lexical item]
Later – D reanalyzed as the head of DP (a functional head): DP → D-bar → NP
Noun is the head of the nominal (NP); the N-bar level is required
Grammatical Functions are implicit in the syntactic representation
22
Basic Nominal – Simpler Syntax
[Tree for “the captain”: NP → D N (head); a double line marks the head]
One (explicit) phrase level GF: Head
Noun is the head of the nominal (NP); no N-bar level
23
Basic Nominal – Cambridge Grammar
[Tree for “the captain”: NP → Det:D Head:N, pairing each GF with a syntactic category and lexical item]
Four phrase level (NP) GFs: Head, Determiner, Complement, Modifier
Noun is the head of the nominal (NP); an N-Bar level is allowed, but not required
Note: Nominal = N-bar, not NP, for H&P
24
Nominal ~ Referring Expression
John Lyons, Semantics, Vol 2, 1977, p. 445 “Looked at from a semantic point of view, nominals are referring expressions” “They are expressions which have a certain potential for reference”
25
Basic Nominal – Double R
[Tree for “the captain”: Object Referring Expression (ORE) → Spec Head; Spec → D “the” (referential pole); Head → N “captain” (relational pole)]
Four phrase level GFs: Head, Specifier, Complement, Modifier
Nominal ~ Object Referring Expression; noun is the head of the nominal; no N-bar level
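As a rough sketch of the Double R analysis of a basic nominal, an object referring expression pairs a specifier (the referential pole) with a head (the relational pole). The dictionary format here is invented for illustration and is not the representation used in the actual Double R Model.

```python
# Hypothetical sketch of a Double R object referring expression (ORE):
# the specifier carries referential meaning, the head relational meaning.
def make_ore(spec_word, head_word):
    return {
        "type": "ORE",
        "spec": {"gf": "specifier", "cat": "D", "word": spec_word},  # referential pole
        "head": {"gf": "head", "cat": "N", "word": head_word},       # relational pole
    }

# The basic nominal from the slide: "the captain"
ore = make_ore("the", "captain")
print(ore["spec"]["word"], ore["head"]["word"])  # the captain
```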
26
Basic Clause X-Bar Theory ~ 1970s
[Deep Structure tree for “Joe runs”: S → NP VP; VP → Specv V-Bar; V-Bar → V; TENSEpres under Specv]
Structure of S not explained by X-Bar Theory circa 1970 – no specifier or head of S
Deep Structure gets transformed into Surface Structure (Transformational Grammar) – TENSEpres + run → runs
27
Basic Clause – Simpler Syntax
[Simpler Syntax tiers for “Joe runs” – Syntactic tier: S → NP AUX VP, with TENSEpres under AUX (affix hopping, a vestige of Transformational Grammar); GF tier: Subject; CS: RUN(AGENT:X)]
Clause level GFs: Subject, Object, Second Object
Head of S not specified in Culicover (2009); in Jackendoff (2002), no lexical items in the syntactic tier
28
Basic Clause – Cambridge Grammar
[Tree for “Joe runs”: Clause → Subj:NP Predicate:VP; Predicate:VP → Predicator:V]
Clause level GFs: Predicate ~ Head of Clause; Subject ~ External Complement; Modifier
Additional phrase level (VP) GF: Predicator ~ Head of VP
No equivalent to a determiner at the clause level!
29
Situation Referring Expression (SRE)
Basic Clause – Double R
[Tree for “Joe runs”: SRE → Subj (Spec+)Head; Subj → ORE ((Spec+)Head PN “Joe”); (Spec+)Head → Vfin “runs”]
Clause level GFs: Head, Specifier, Subject ~ External Complement, Modifier
SRE ~ Clause or S; Specifier fused with Head
30
Basic Clause X-Bar Theory ~ 1970s
[Tree for “Joe kicks the ball”: S → NP VP; VP → Specv (TENSEpres) V-Bar (head); V-Bar → V (head) NP (comp); NP → D (spec) N-bar (head); N-Bar → N (head)]
Later – VP reanalyzed as the head of S, and the subject NP reanalyzed as the specifier of S (it is left of the head, so it must be a spec!)
Later – tense reanalyzed as the head of IP (inflection phrase), and S reanalyzed as CP (complementizer phrase) with C-bar = IP: CP → IP = C-bar (head); IP → NP (spec) I-bar (head); I-bar → I (tense head) VP (comp)
31
Basic Clause – X-Bar Theory ~ 1980s
[Tree for “Joe kicks the ball” ~ 1980s: CP → IP = C-bar (head); IP → NP (spec, the subject) I-bar (head); I-bar → I (head, TENSEpres) VP (comp); VP → V-Bar (head) NP (comp); V-Bar → V (head)]
Later – additional levels proposed: AgrP (agreement), AgrSP, AgrOP, NegP (negation), ModP (modality), etc.
The sentence now adheres to X-Bar Theory!
32
Basic Clause – X-Bar Theory ~ 1993
[Tree for “Joe kicks the ball” ~ 1993: subject agreement (“Joe”?), TP = IP, object agreement, TENSEpres, with the VP way down at the bottom; structure below the VP not shown]
Some languages have object agreement, so the universal, innate structure must have this layer!
Universal clausal structure of all languages!
33
Basic Clause – Simpler Syntax
[Simpler Syntax tiers for “Joe kicks the ball” – Syntactic tier: S → NP AUX VP; VP → V (head) NP; TENSEpres under AUX (affix hopping); GF tier: Subject, Object; CS: KICK(AGENT:X PATIENT:Y)]
34
Basic Clause – Cambridge Grammar
[Tree for “Joe kicks the ball”: Clause → Subj:NP Predicate:VP; Predicate:VP → Predicator:V Obj:NP]
Additional phrase level (VP) GF: Object ~ Complement
35
Basic Clause – Double R
[Tree for “Joe kicks the ball”: SRE → Subj (Spec+)Head; Subj → ORE ((Spec+)Head PN “Joe”); (Spec+)Head → Pred-Trans-Verb (Head Vfin “kicks”, Obj → ORE (Spec D “the”, Head N “ball”))]
Additional phrase level GF: Object ~ Complement
36
Basic Clause with Auxiliary – Simpler Syntax
[Simpler Syntax tiers for “Joe is kicking the ball” – Syntactic tier: S → NP AUX VP; AUX → TENSEpres VAUX “be” (affix hopping); VP[PROG-part] → V “kick” NP (D “the”, N “ball”); GF tier: Subject, Object; CS: KICK(AGENT:X PATIENT:Y)]
37
Basic Clause with Auxiliary – Cambridge Grammar
[Tree for “Joe is kicking the ball”: Clause → Subj:NP Predicate:VP; Predicate:VP → Predicator:V “is” Comp:Clausebare; the bare clause (no subject or tense) → Predicate:VP → Predicator:V “kicking” Obj:NP (Det:D “the”, Head:N “ball”)]
“is” is the head of the clause! No specifier GF; catenative verbs
38
Basic Clause with Auxiliary – Double R
[Tree for “Joe is kicking the ball”: SRE → Subj (ORE, Head N “Joe”) Spec (Aux “is”) Head (Pred-Trans-Verb: Head V “kicking”, Obj → ORE (Spec D “the”, Head N “ball”))]
“kicking” is the head of the clause
39
Possessive Nominal – Simpler Syntax
[Tree for “Joe’s book”: NP → NP ’s N; the genitive ’s attaches to the inner NP “Joe”; no label for the possessive construction!]
40
Possessive Nominal – Cambridge Grammar
[Tree for “Joe’s book”: NPPlain → Subj+Det:NPGen Head:N; fused subject-determiner]
Additional phrase level GF: Subj ~ Complement
H&P allow GFs to be fused – consistent with the grammatical evidence
41
Possessive Nominal – Double R
[Tree for “Joe’s book”: Poss-ORE → RefPt+Spec Head; RefPt → ORE ((Spec+)Head PN “Joe”); Spec → Poss-Mkr “’s” (referential pole); Head → N “book” (relational pole)]
Additional phrase level GF: RefPt ~ Complement
42
Clause without Main Verb – Simpler Syntax
[Simpler Syntax tiers for “the book is on the table” – Syntactic tier: S → NP AUX PP; AUX → TENSEpres VAUX “be” (affix hopping); PP → P “on” NP (D “the”, N “table”); GF tier: Subject; CS tier: BE(THEME:X, ON(THEME:Y))]
43
Clause without Main Verb – Cambridge Grammar
[Tree for “the book is on the table”: Clause → Subj:NP Predicate:VP; Predicate:VP → Predicator:V “is” Comp:PP; Comp:PP → Head:P “on” Obj:NP (Det:D “the”, Head:N “table”)]
“is” is the head of the clause!
44
Clause without Main Verb – Double R
[Tree for “the book is on the table”: SRE → Subj (ORE: Spec D “the”, Head N “book”) Spec (Aux “is”) Head (Pred-Prep: Head P “on”, Obj → ORE (Spec D “the”, Head N “table”))]
“on” is the head of the clause!
45
Clause without Main Verb – Simpler Syntax
[Simpler Syntax tiers for “the book’s on the table” – same structure as the uncontracted version: Syntactic tier S → NP AUX PP with TENSEpres VAUX “be” under AUX; GF tier: Subject; CS tier: BE(THEME:X, ON(THEME:Y))]
46
Clause without Main Verb – Cambridge Grammar
[Tree for “the book’s on the table”: Clause → Subj:NP (fused?) Predicate:VP; Predicate:VP → Predicator:V “’s” Comp:PP (Head:P “on”, Obj:NP “the table”)]
Don’t see how H&P can allow these GFs to be fused – inconsistent with the grammatical evidence
47
Clause without Main Verb – Double R
[Tree for “the book’s on the table”: SRE → Subj+Spec Head; Subj → ORE (Spec D “the”, Head N “book”); Spec → Aux “’s”; Head → Pred-Prep (Head P “on”, Obj → ORE (Spec D “the”, Head N “table”))]
48
Passive Clause – Simpler Syntax
[Simpler Syntax tiers for “the book was taken by Joe” – Syntactic tier: S → NP AUX VPbe; AUX → TENSEpast bev-aux; VPbe → be (head) VP[PASSIVE]; VP[PASSIVE] → V[PASSIVE] “take” (PPby → by NP “Joe”); GF tier: Subject, Object; CS: TAKE(AGENT:X, PATIENT:Y)]
49
Passive Clause – Cambridge Grammar
[Tree for “the book was taken by Joe”: Clause → Subj:NP “the book” Predicate:VP; Predicate:VP → Predicator:V “was” Comp:Clausebare; Clausebare → Predicate:VP → Predicator:V “taken” Comp:PP (Comp:P “by”, Comp:NP “Joe”)]
50
Passive Clause – Double R
[Tree for “the book was taken by Joe”: SRE → Subj (ORE1: Spec D “the”, Head N “book”) Spec (Aux “was”) Head (Pred-Trans-Verb: Head V “taken”, Obj → Bind1, Mod → Pass-By-RE (Head P “by”, Obj → ORE (Head PN “Joe”)))]
The object is bound to the subject ORE1
51
Yes-No-Question – Double R
[Tree for “Did he take it?”: Y-N-Quest-SRE → Operator (Aux “did”) Subj (ORE, Head Pron “he”) Head (Pred-Trans-Verb: Head V “take”, Obj → ORE (Head Pron “it”))]
Additional clause level GF: Operator ~ Specifier
52
Yes-No-Question – Double R
Could he have taken it?
[Tree: Y-N-Quest-SRE → Operator (Aux “could”) Subj (ORE, Head Pron “he”) Spec (Aux “have”) Head (Pred-Trans-Verb: Head V “taken”, Obj → ORE (Head Pron “it”))]
53
Wh-Question – Double R
What did he take?
[Tree: Wh-Quest-SRE → Wh-Focus (Wh-ORE1, Head Wh-Pron “what”) Operator (Aux “did”) Subj (ORE, Head Pron “he”) Head (Pred-Trans-Verb: Head V “take”, Obj → Bind1, bound to the Wh-Focus)]
Additional clause level GF: Wh-Focus ~ Complement
54
Wh-Question – Double R
What could he have taken?
[Tree: Wh-Quest-SRE → Wh-Focus (Wh-ORE1, Head Wh-Pron “what”) Operator (Aux “could”) Subj (ORE, Head Pron “he”) Spec (Aux “have”) Head (Pred-Trans-Verb: Head V “taken”, Obj → Bind1)]
55
Wh-Question + Passive + Ditrans – Double R
What could he have been given?
[Tree: Wh-Quest-SRE → Wh-Focus (Wh-ORE1, Head Wh-Pron(inan) “what”) Operator (Aux “could”) Subj (ORE2, Head Pron(human) “he”) Spec (Aux “have been”) Head (Pred-Ditrans-Verb: Head V “given”, IObj → Bind2, Obj → Bind1)]
Animacy determines binding!
56
Wh-Question + Passive + Ditrans – Double R
Who could it have been given?
[Tree: Wh-Quest-SRE → Wh-Focus (Wh-ORE1, Head Wh-Pron(human) “who”) Operator (Aux “could”) Subj (ORE2, Head Pron(inan) “it”) Spec (Aux “have been”) Head (Pred-Ditrans-Verb: Head V “given”, IObj → Bind1, Obj → Bind2)]
Animacy determines binding!
57
Wh-Question + Passive + Ditrans – Double R
Who could it have been given to?
[Tree: Wh-Quest-SRE → Wh-Focus (Wh-ORE1, Head Wh-Pron “who”) Operator (Aux “could”) Subj (ORE2, Head Pron “it”) Spec (Aux “have been”) Head (Pred-Ditrans-Verb: Head V “given”, Obj → Bind2, Recip → To-LRE (P “to”, Obj → Bind1))]
58
Grammatical Features of Nominals in English
Definiteness – definite, indefinite, universal
Number – singular, plural
Animacy – human, animate, inanimate
Gender – male, female
Person – first, second, third
Case – subj, obj, gen (2)
59
Why We Need Grammatical Features
Definiteness: Give me the ball (definite) vs. Give me a ball (indefinite)
Number: The men (plural) kick the ball (sing). They (plural)…
Animacy: The man (human) kicks the ball (inanimate). It (inanimate)…
Gender: The man (male) likes the woman (female). She (female)…
60
Simple Nominal: the man
[Feature diagram: “the” projects definite to the obj-refer-expr; “man” projects singular, human and male]
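Feature projection of this kind can be sketched in a few lines: each word contributes its grammatical features upward to the referring expression. The lexicon and function below are invented for illustration, not taken from the Double R Model.

```python
# Illustrative feature projection for a simple nominal: the specifier
# projects definiteness, the head noun projects number, animacy and gender.
LEXICON = {
    "the": {"definiteness": "definite"},
    "man": {"number": "singular", "animacy": "human", "gender": "male"},
}

def project_features(words):
    features = {}
    for w in words:
        features.update(LEXICON[w])  # each word projects its features upward
    return features

print(project_features(["the", "man"]))
# {'definiteness': 'definite', 'number': 'singular', 'animacy': 'human', 'gender': 'male'}
```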
61
Grammatical Features of Clauses in English
Tense – present, past, non-finite
Aspect – perfect, progressive
Modality – “could”, “should”, “must”…
Polarity – negative
Voice – active, inactive, passive
62
Simple Clause: …could not have gone
[Feature diagram: finite, present, negative, perfect, active]
“could not” is recognized as a multi-word unit; “could” projects finite present tense and modality
“not” projects negative polarity
“have gone” projects perfect aspect and active voice
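The same projection idea carries over to clause-level features. As on the slide, “could not” is treated as a single multi-word unit here; the table and function are an illustrative sketch, not the actual implementation.

```python
# Illustrative projection of clause-level features for "could not have gone".
# "could not" is treated as a multi-word unit, as described on the slide.
PROJECTIONS = {
    "could not": {"tense": "present", "finiteness": "finite",
                  "modality": "could", "polarity": "negative"},
    "have gone": {"aspect": "perfect", "voice": "active"},
}

def clause_features(units):
    features = {}
    for unit in units:
        features.update(PROJECTIONS[unit])  # each unit projects its features
    return features

feats = clause_features(["could not", "have gone"])
print(feats["polarity"], feats["aspect"])  # negative perfect
```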
63
Summary
Representations matter! Language is complex!
In complex systems, overall coherence is more important than overall simplicity!
Einstein: make your theory as simple as possible, but no simpler!
Computational implementation necessitates coherence
If axioms + logical reasoning → incoherence or a system that is obviously false, then question your axioms or your “logical” reasoning
E.g. if innateness assumptions lead to overly complex representations, then question the innateness assumptions or the reasoning
64
Theoretical Foundations Language Representation and Processing
Double R Grammar – Cognitive Linguistic theory of the grammatical encoding of referential and relational meaning
Double R Process – Psycholinguistic theory of the processing of English text into Double R Grammar based representations
Double R Model – Computational implementation using the ACT-R cognitive architecture and modeling environment
DoubleRTheory.com
65
Double R Process
A serial, incremental, pseudo-deterministic language processor with a non-monotonic context accommodation mechanism (with limited parallelism) that is capable of making modest changes to the evolving representation
A parallel, interactive, highly context sensitive, probabilistic mechanism which uses all available information to make the best choice at each choice point
The processor presents the appearance and efficiency of deterministic processing, but is capable of handling the ambiguity which makes truly deterministic processing impossible
66
Double R Process Construction Driven Language Processing
Activation, Selection and Integration of constructions corresponding to the linguistic input
Lexical items in the input activate constructions
Activation depends on current input, current context, and prior history of use – “give” activates the ditransitive verb construction
The most highly activated construction is selected
The selected construction is integrated with the evolving representation
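The activate-then-select step can be sketched as follows. The association weights and construction names are invented for illustration; in the real model, activations come from ACT-R's declarative memory, not a hand-built table.

```python
# Toy activation model: each lexical item spreads activation to associated
# constructions; the most highly activated construction is selected.
ASSOCIATIONS = {
    "give": {"ditransitive-verb": 2.0, "transitive-verb": 1.0},
    "hit":  {"transitive-verb": 2.0},
}

def select_construction(word, base_activation):
    scores = dict(base_activation)  # prior history of use
    for cxn, weight in ASSOCIATIONS.get(word, {}).items():
        scores[cxn] = scores.get(cxn, 0.0) + weight  # spreading activation
    return max(scores, key=scores.get)

# "give" activates the ditransitive verb construction most highly
print(select_construction("give", {"transitive-verb": 0.5}))  # ditransitive-verb
```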
67
Double R Process
Adhere to well-established cognitive constraints on Human Language Processing
Don’t use any obviously cognitively implausible mechanisms!
Adhering to cognitive constraints may actually facilitate the development of functional NLP systems – it pushes development in directions that are more likely to be successful, given the inherently human nature of language processing
You don’t know what you’re giving up when you adopt cognitively implausible mechanisms!
68
ACT-R Cognitive Architecture
Theory of human cognition based on 40+ years of psychological research (Anderson, 2007)
Computational implementation since 1993
Combines a symbolic procedural memory implemented as a production system with a symbolic frame based declarative memory (DM)
Includes modules for vision, audition, and motor processing
Supports interaction with the external world
69
ACT-R Cognitive Architecture
Procedural memory is the central component
All modules interface to procedural memory via buffers (e.g. goal buffer, retrieval buffer, visual buffer)
Productions have “subsymbolic” utilities and match against the buffers of other modules
The intentional module’s goal buffer is the primary driver of behavior
The matching production with the highest utility is selected for execution
70
ACT-R Cognitive Architecture
DM contains chunks, which are frame based: chunk type + slot-value pairs (aka AVMs)
Chunk types are organized into a single inheritance hierarchy
Chunks have “subsymbolic” activations based on current input, current context and prior history of use
Chunks are retrieved from memory by execution of a production which specifies a retrieval template
The DM chunk with the highest activation that matches the retrieval template is retrieved (soft constraint retrieval)
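Template-based retrieval of the highest-activation chunk can be sketched in miniature. The chunk contents and activation values below are invented; real ACT-R computes activations dynamically from history and context.

```python
# Toy sketch of ACT-R style declarative retrieval: chunks are typed
# slot-value structures; a retrieval template is matched against DM and
# the matching chunk with the highest activation is retrieved.
DM = [
    {"type": "noun", "word": "captain",  "number": "singular", "activation": 1.2},
    {"type": "noun", "word": "captains", "number": "plural",   "activation": 0.4},
    {"type": "verb", "word": "run",      "activation": 0.9},
]

def retrieve(template):
    matches = [c for c in DM
               if all(c.get(slot) == val for slot, val in template.items())]
    # highest-activation matching chunk wins; None models retrieval failure
    return max(matches, key=lambda c: c["activation"], default=None)

print(retrieve({"type": "noun"})["word"])  # captain
```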
71
ACT-R Cognitive Architecture
[Diagram: ACT-R modules and buffers mapped to brain regions – Intentional Module (not identified) with Goal Buffer (DLPFC); Declarative Module (Temporal/Hippocampus) with Retrieval Buffer (VLPFC); production Matching (Striatum), Selection (Pallidum) and Execution (Thalamus) in the Basal Ganglia; Visual Module (Occipital/etc.) with Visual Buffer (Parietal); Manual Module (Motor/Cerebellum) with Manual Buffer (Motor); the visual and manual modules interact with the External World]
72
ACT-R Cognitive Architecture
Supports timing of cognitive processing
Production execution takes 50 ms
DM chunk retrieval time depends on the level of activation of the retrieved chunk
Timing of motor events is based on Fitts’ Law
Used for empirical validation of models
Provides a powerful debugging environment
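The two timing assumptions stated on the slide can be combined into a small sketch. ACT-R's standard retrieval latency equation is F·e^(−A) for a chunk with activation A; the latency factor F used below is an illustrative model parameter, not a fixed constant.

```python
import math

# Sketch of ACT-R timing: each production firing takes 50 ms, and the
# retrieval latency of a DM chunk falls off exponentially with its
# activation A (latency = F * exp(-A)).
PRODUCTION_MS = 50.0
F = 1000.0  # latency-scaling parameter in ms (illustrative value)

def retrieval_ms(activation):
    return F * math.exp(-activation)

def step_ms(activation):
    # one production firing that requests and harvests a retrieval
    return PRODUCTION_MS + retrieval_ms(activation)

# Higher activation => faster retrieval => shorter processing step
print(round(step_ms(2.0), 1))  # 185.3
```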
73
Architectural Constraints
No language specific module – although the buffers and the productions accessing those buffers might be viewed as a module
Forward chaining productions with no backtracking
Limited pattern matching – not full unification
Serial bottleneck – only one production can execute at a time
Modules interact with the production system via buffers – buffers have limited capacity for storing the current context
Activation spreads in parallel
Activation and Utility are subject to noise
74
Constraints on Human Language Processing
Visual World Paradigm (Tanenhaus et al. 1995) – subjects are presented with a visual scene and listen to auditory linguistic input describing the scene
Immediate determination of meaning – subjects look immediately at the referents of linguistic expressions, sometimes before the end of the expression
Incremental processing
Interactive processing (Trueswell et al. 1999) – ambiguous expressions are processed consistent with the scene: “the green…”; “put the arrow on the paper into the box”
75
Constraints on Human Language Processing
According to Crocker (1999), there are three basic mechanisms for dealing with ambiguity in natural language:
Serial processing with backtracking or reanalysis
Deterministic processing with lookahead (Marcus 1980)
Parallel processing with alternative analyses carried forward in parallel (Gibson 1991; MacDonald, Pearlmutter & Seidenberg 1994; Trueswell & Tanenhaus 1994)
According to Lewis (2000), “…existing evidence is compatible only with probabilistic serial-reanalysis models, or ranked parallel models augmented with a reanalysis component.”
According to Gibson & Pearlmutter (2000), “noncompetitive ranked parallel models” are most consistent with the empirical evidence
76
Constraints on Human Language Processing
Serial and deterministic with reanalysis for pathological input
Empirical evidence that we don’t carry forward all representations in parallel – Garden Path sentences: “The horse raced past the barn fell” (Bever 1970); “The old train the young” (Just & Carpenter, 1987)
Empirical evidence that we don’t retract previously built representations (Christianson et al. 2001): “While Mary dressed the baby sat up on the bed” – in a post test, a majority of subjects answered yes to the question “Did Mary dress the baby?”
Processing doesn’t slow down with increasing length of non-pathological input
We are typically only aware of a single interpretation
77
Constraints on Human Language Processing
Parallel and probabilistic with reanalysis for pathological input
Empirical evidence that we may carry forward multiple representations in parallel – Garden Path effects can be eliminated with sufficient context
Empirical evidence that dispreferred representations can affect processing time (Gibson & Pearlmutter 2000)
It’s extremely difficult to empirically falsify either position – the effect could be a parallel slowdown, or an occasional switch between serial alternatives
We don’t have all the answers, but maybe it’s both! A parallel, probabilistic substrate may make a pseudo-deterministic serial processing mechanism possible!
78
Cognitively Implausible Mechanism
Serial processing with algorithmic backtracking
Algorithmically simple, but computationally intractable for NLP, which is highly ambiguous
The context which led to the dead end is retracted on backtracking – why give up the context? And how do we know it’s a dead end?
Practical consequences: no hope for on-line processing in real time in a large coverage NLP system; no hope for integration with a speech recognition system; performance degrades with the length of the input; can’t easily handle degraded or ungrammatical input
79
Cognitively Implausible Mechanism
Multiple pass or multi-stage parsing
Separate passes tokenize and assign part of speech – the full context can’t be used in each pass, and errors get propagated
A separate pass builds structure – typically limited to using the part of speech of words
A separate pass determines meaning
Practical consequences: difficult to do on-line processing in real time; can’t easily integrate with speech recognition; performance degrades with the length of the input; limited context available to handle ambiguity at each stage
80
Outrageously Implausible Mechanism!
Parsing input from right to left (Microsoft NLP system)
May have engineering advantages, but it presumes a staged approach to NLP and completely ignores cognitive plausibility
Practical consequences: impossible to do on-line processing in real time – must wait for the end of the input; nearly impossible to integrate with speech recognition
81
Cognitively Plausible Mechanism?
Deterministic processing with lookahead
Many ambiguities are resolved by looking ahead a few words, but we don’t know how far to look ahead
Cognitive plausibility is improved by limiting the amount of lookahead: 3 constituent lookahead (Marcus 1980); 1 word lookahead (Henderson 2004)
Practical consequences: difficult to use with eager algorithms, for which there is good empirical evidence (immediate determination of meaning); the smaller the lookahead, the less deterministic the processing
82
Cognitively Plausible Mechanism?
Parallel processing with multiple analyses carried forward
“Full parallelism – where every analysis is pursued – is not psychologically possible” (Crocker 1999)
Cognitive plausibility is improved by limiting the number of analyses carried forward, ranking the alternatives (bounded ranked parallelism), and not having the analyses compete
Practical consequences: the longer and more ambiguous the input, the less likely the correct representation is in the parallel spotlight – necessitating a reanalysis mechanism; impractical if multiple representations must be built at each choice point, as opposed to just being selected
83
Cognitively Plausible Mechanism
Pseudo-deterministic, serial processing mechanism with context accommodation operating over a parallel, probabilistic substrate
The parallel, probabilistic substrate proposes the best alternative given the current context
The processor proceeds as though it were serial and deterministic, but accommodates the subsequent input as needed
Integrates the advantages of parallel processing with an essentially serial processing mechanism
Practical consequences: how to accommodate when things go seriously wrong? The mechanism is essentially non-monotonic
84
Cognitively Plausible Mechanism
Serial, pseudo-deterministic processing and context accommodation – uses ACT-R’s production system; builds structure; limited parallelism
Parallel, probabilistic processing – uses ACT-R’s declarative memory; retrieves existing structure from memory
85
Context Accommodation
If the current input is unexpected given the prior context, then accommodate the input: adjust the representation, or coerce the input into the representation
The following example demonstrates the context accommodation mechanism: “no target airspeed or altitude restrictions”
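The incremental accommodation walked through on the following slides can be sketched as a toy loop: each incoming noun is first integrated as the head, and when a further noun arrives, the previous head is shifted to a modifier function (function shift + override) rather than backtracking. This is an illustrative sketch, not the ACT-R implementation.

```python
# Toy sketch of context accommodation for the slide example
# "no target airspeed or altitude restrictions".
def accommodate(tokens):
    expr = {"spec": None, "mods": [], "head": None}
    conj = None
    for tok in tokens:
        if tok == "no":
            expr["spec"] = tok              # "no" projects the expression, functions as specifier
        elif tok == "or":
            conj = tok                      # conjunction awaits its second conjunct
        elif conj:
            # conjunction integrated into the current head
            expr["head"] = f"{expr['head']} {conj} {tok}"
            conj = None
        else:
            if expr["head"] is not None:
                expr["mods"].append(expr["head"])  # function shift: old head becomes modifier
            expr["head"] = tok                     # override: new noun becomes head
    return expr

result = accommodate("no target airspeed or altitude restrictions".split())
print(result)
```

The final state has “restrictions” as head, with “target” and “airspeed or altitude” accommodated as modifiers, giving the appearance of parallel processing without backtracking.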
86
“no” → object referring expression
“no” projects an obj-refer-expr and functions as its specifier; “head-indx” indicates that a head is expected; “bind-indx” provides an index for binding
87
“target” → head: no target (integration)
Tree structures are generated automatically with a dynamic visualization tool (Heiberg, Harris & Ball 2007) based on the phpSyntaxTree software (Eisenberg & Eisenberg)
88
“airspeed” → head: no target airspeed (override, function shift, integration)
Accommodation of the second noun via function shift and overriding
89
“or altitude”
[tree diagram: “no target airspeed or altitude” – conjunction integration]
Conjunction integrated into the noun
90
“restrictions”
[tree diagram: “no target airspeed or altitude restrictions” – integration with override and function shift]
Accommodation of the new head via function shift and override
Appearance of parallel processing!
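The walk-through above can be sketched in code. This is an assumed, minimal illustration rather than the ACT-R implementation: when a new noun arrives and the head slot is already filled, the current head is shifted in place to the modifier function, so the processor never backtracks yet the result looks as though all alternatives had been carried forward in parallel. The word classes are hard-coded for this one example.

```python
def accommodate(words):
    expr = {"specifier": None, "modifiers": [], "head": None}
    awaiting_conjunct = False
    for word in words:
        if word in ("no", "the", "a"):
            expr["specifier"] = word          # specifier projects the
                                              # expression; head expected
        elif word in ("or", "and"):
            expr["head"] += " " + word        # conjunction integrates
            awaiting_conjunct = True          # into the current head
        elif awaiting_conjunct:
            expr["head"] += " " + word        # second conjunct extends
            awaiting_conjunct = False         # the coordinate head
        elif expr["head"] is None:
            expr["head"] = word               # integrate as head
        else:
            # Function shift + override: demote the current head to
            # the modifier function and install the new word as head.
            expr["modifiers"].append(expr["head"])
            expr["head"] = word
    return expr

parsed = accommodate("no target airspeed or altitude restrictions".split())
```

Each word is handled exactly once, so the cost is linear in the length of the input regardless of how many times the head function shifts.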
91
Types of Accommodation
Coercion
  “the hiking of Mt Lemon” – head of nominal; “hiking” construed objectively, arguments not expressed (“of Mt Lemon” functions as a modifier)
  “a Bin Laden supporter” – Proper Noun functions as modifier
  “you’re no Jack Kennedy” – Proper Noun functions as head (following specifier)
  “the newspaper boy porched the newspaper” – nonce expression (H. Clark 1983); “porched” construed as transitive action
92
Types of Accommodation
Override
  Single word vs. Multi-Word Expression (MWE)
    “kicked…” – transitive verb
    “kicked the bucket” – idiomatic expression
    “take…” – transitive verb
    “take a hike”, “take five”, “take time”, “take place”, “take out”, “take my wife, please”, “take a long walk off a short pier” … many idiomatic expressions
    Not possible to carry all forward in parallel
  Morphologically simple vs. complex
    “car…” – noun (sing)
    “carpet…” – noun (sing)
    “carpets” – noun (plur)
    “carpeting” – noun (sing) or verb
93
Types of Accommodation
Grammatical Function Shift
  “he gave it to me” – direct object (initial preference due to inanimacy)
  “he gave it the ball” – direct object (initial preference) shifts to indirect object
  “he gave her the ball” – indirect object (initial preference due to animacy)
  “he gave her to the groom” – indirect object (initial preference) shifts to direct object
94
Types of Accommodation
Nominal Head Override
  “he gave her the dog biscuit” – head = her
  “he gave her dog the biscuit” – head = dog
Grammatical Function “Juggling”
  “he gave the…” – indirect object
  “he gave the very old bone…” – direct object
  “he gave the very old bone collector…” – indirect object
  “he gave the very old dog…” – indirect object
  “he gave the very old dog collar…” – direct object
  “he gave the very old dog to me” – direct object
95
Types of Accommodation
Grammatical Function Shift
  “he said that…” – in the context of “said”, “that” typically functions as a complementizer: “he said that she was happy”
  But subsequent context can cause a function shift from complementizer
    to nominal specifier: “he said that book was funny”
    to nominal head: “he said that.”
96
Types of Accommodation
Grammatical Function Shift
  “pressure” vs. “pressure valve” vs. “pressure valve adjustment” vs. “pressure valve adjustment screw” vs. “pressure valve adjustment screw fastener” vs. “pressure valve adjustment screw fastener part” vs. “pressure valve adjustment screw fastener part number”
  Serial nouns (and verbs) incrementally shift from head to modifier function as each new head is processed
  Functions like lookahead, but isn’t limited
  Not clear if a bounded ranked parallel mechanism can handle this! 2^n possibilities if each word can be head or modifier
97
Types of Accommodation
Modulated Projection
  “the rice” vs. “rice”
  “the” projects a nominal and functions as a specifier
  In the context of “the”, “rice” is integrated as the head of the nominal
  When there is no specifier, “rice” projects a nominal and functions as the head without separate specification
  [tree diagrams: “the rice” – nominal with spec “the” and head “rice”; vs. “rice” – nominal with head “rice” only]
98
Grammatical Feature Accommodation
Grammatical features may be redundantly encoded and may conflict without the expression being ungrammatical
  a(indef+sing) few(indef+plur) books(indef+plur)
  the(def) books(indef+plur)
  some(indef+plur) book(sing)
  a(indef+sing) Ronald Reagan(def+sing) republican(sing)
  he has(pres+act) given(pass+perf) me the ball
  he is(pres+inact) given(pass+perf) the ball
  he has(pres+act) been(pass+perf) given(pass+perf) it
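Tolerant merging of redundantly encoded features can be sketched as follows. This is a toy, hypothetical illustration, not Double R's actual feature machinery: conflicting features, as in “some book” where an indefinite plural specifier meets a singular head, are recorded rather than causing the expression to be rejected as ungrammatical.

```python
def accommodate_features(specifier_feats, head_feats):
    """Merge redundantly encoded grammatical features, recording any
    conflicts instead of failing on them."""
    merged, conflicts = dict(specifier_feats), []
    for name, value in head_feats.items():
        if name in merged and merged[name] != value:
            conflicts.append((name, merged[name], value))
        merged[name] = value    # arbitrarily let the head's value win
    return merged, conflicts

# "some book": plural specifier with a singular head
merged, conflicts = accommodate_features(
    {"definiteness": "indef", "number": "plur"},   # some
    {"number": "sing"},                            # book
)
```

Letting the head's value win on conflict is one arbitrary policy among several; the point is only that the merge is total and non-failing.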
99
Summary of Context Accommodation
Context Accommodation is part and parcel of the pseudo-deterministic processing mechanism
  Not viewed as a repair mechanism (Lewis 1998)
  The processor proceeds as though it were deterministic, but accommodates the input as needed
  Gives the appearance of parallel processing in a serial, deterministic mechanism
100
Combining Serial, Deterministic and Parallel, Probabilistic Mechanisms
The parallel, probabilistic substrate makes a pseudo-deterministic serial processing mechanism possible!
[diagram: processing mechanisms arranged along a range from Parallel, Probabilistic to Serial, Deterministic – Parallel Distributed Processing (PDP); Supertag Stapling (probabilistic LTAG tree supertagging); Construction Activation & Selection with Construction Integration (Double R, in the pseudo-deterministic range); Lexicalized PCFG with Lexical Rule Selection and Rule Application; PCFG with Rule Selection & Application; Nondeterministic CFG]
101
Questions?