Embodied Construction Grammar (ECG): Formalizing Cognitive Linguistics
1. Community Grammar and Core Concepts
2. Deep Grammatical Analysis
3. Computational Implementation
   a. Test Grammars
   b. Applied Projects – Question Answering
4. Map to Connectionist Models, Brain
5. Models of Grammar Acquisition
Simulation specification
The analysis process produces a simulation specification that:
– includes image-schematic, motor-control, and conceptual structures
– provides parameters for a mental simulation
Summary: ECG
Linguistic constructions are tied to a model of simulated action and perception
Embedded in a theory of language processing
– Constrains the theory to be usable
– Basis for models of grammar learning
Precise, computationally usable formalism
– Practical computational applications, such as MT and NLU
– Testing of functionality, e.g. language learning
A shared theory and formalism for different cognitive mechanisms
– Constructions, metaphor, mental spaces, etc.
Reduction to connectionist and neural levels
Constrained Best Fit in Nature
Inanimate:
– physics: lowest energy state
– chemistry: molecular fit
Animate:
– biology: fitness, MEU (neuroeconomics)
– vision: threats, friends
– language: errors (NTL)
– society, politics: framing, compromise
Competition-based analyzer (Johno Bryant)
An analysis is made up of:
– A constructional tree
– A semantic specification
– A set of resolutions
[Diagram: constructional tree for "Bill gave Mary the book" – an A-GIVE-B-X construction with subj, vobj1, and obj2 constituents (Ref-Exp for Bill and Mary, Give for gave), linked to a Give-Action schema whose giver, recipient, and theme roles are filled by @Man, @Woman, and @Book (book01).]
Combined score determines best fit
Syntactic Fit:
– Constituency relations
– Combined with preferences on non-local elements
– Conditioned on syntactic context
Antecedent Fit:
– Ability to find referents in the context
– Conditioned on syntax match, feature agreement
Semantic Fit:
– Semantic bindings for frame roles
– Frame roles' fillers are scored
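To make the combination concrete, here is a minimal sketch, assuming the three fit scores are probability-like values in (0, 1] combined multiplicatively (in log space to avoid underflow). This is an illustration, not the analyzer's actual code; all names are invented:

```python
import math

def combined_score(syntactic_fit: float, antecedent_fit: float,
                   semantic_fit: float) -> float:
    """Combine the three fit scores for one candidate analysis.

    Assumption: each fit behaves like a probability, so the combined
    score is their product; summing logs keeps small values stable.
    """
    return (math.log(syntactic_fit)
            + math.log(antecedent_fit)
            + math.log(semantic_fit))

def best_fit(analyses):
    """Pick the candidate analysis with the highest combined score."""
    return max(analyses, key=lambda a: combined_score(*a["fits"]))

analyses = [
    {"name": "analysis-1", "fits": (0.08, 0.9, 0.8)},
    {"name": "analysis-2", "fits": (0.03, 0.9, 0.4)},
]
print(best_fit(analyses)["name"])  # analysis-1
```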
Chart for "Eve walked into the house" (positions 0–5)
Constructs (with spans):
– NPVP[0] (0,5)
– Eve[3] (0,1)
– ActiveSelfMotionPath[2] (1,5)
– WalkedVerb[57] (1,2)
– SpatialPP[56] (2,5)
– Into[174] (2,3)
– DetNoun[173] (3,5)
– The[204] (3,4)
– House[205] (4,5)
Schema Instances:
– SelfMotionPathEvent[1]
– HouseSchema[66]
– WalkAction[60]
– Person[4]
– SPG[58]
– RD[177] ~ house
– RD[5] ~ Eve
Unification chains and their fillers
Filler Person[4]:
– SelfMotionPathEvent[1].mover
– SPG[58].trajector
– WalkAction[60].walker
– RD[5].resolved-ref
– RD[5].category
Filler SPG[58]:
– SpatialPP[56].m
– Into[174].m
– SelfMotionPathEvent[1].spg
Filler HouseSchema[66]:
– SelfMotionPathEvent[1].landmark
– House[205].m
– RD[177].category
– SPG[58].landmark
Filler WalkAction[60]:
– WalkedVerb[57].m
– WalkAction[60].routine
– WalkAction[60].gait
– SelfMotionPathEvent[1].motion
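A unification chain is essentially an equivalence class of role slots that must share one filler, which a union-find structure captures naturally. The sketch below is illustrative only (slot names follow the slide; the data structure choice is an assumption, not the analyzer's implementation):

```python
class UnificationChains:
    """Union-find over role slots; each chain of unified slots shares one filler."""

    def __init__(self):
        self.parent = {}   # slot -> representative slot
        self.filler = {}   # representative slot -> filler

    def find(self, slot):
        self.parent.setdefault(slot, slot)
        while self.parent[slot] != slot:
            self.parent[slot] = self.parent[self.parent[slot]]  # path halving
            slot = self.parent[slot]
        return slot

    def unify(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        fa, fb = self.filler.get(ra), self.filler.get(rb)
        if fa is not None and fb is not None and fa != fb:
            raise ValueError(f"filler clash: {fa} vs {fb}")
        self.parent[rb] = ra
        self.filler[ra] = fa if fa is not None else fb

chains = UnificationChains()
chains.unify("SelfMotionPathEvent[1].mover", "WalkAction[60].walker")
chains.unify("WalkAction[60].walker", "RD[5].resolved-ref")
chains.filler[chains.find("SelfMotionPathEvent[1].mover")] = "Person[4]"
```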
Productive Argument Omission (Mandarin)
Johno Bryant & Eva Mok
CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)

1. ma1+ma gei3 ni3 zhei4+ge
   mother give 2PS this+CLS
   Mother (I) give you this (a toy).

2. ni3 gei3 yi2
   2PS give auntie
   You give auntie [the peach].

3. ao ni3 gei3 ya
   EMP 2PS give EMP
   Oh (go on)! You give [auntie] [that].

4. gei3
   give
   [I] give [you] [some peach].
Arguments are omitted with different probabilities
– All args omitted: 30.6%
– No args omitted: 6.1%
Analyzing ni3 gei3 yi2 (You give auntie)
Syntactic Fit:
– P(Theme omitted | ditransitive cxn) = 0.65
– P(Recipient omitted | ditransitive cxn) = 0.42
Two of the competing analyses:
– Analysis 1: ni3 → Giver, gei3 → Transfer, yi2 → Recipient, omitted → Theme
  Score: (1 − 0.78) × (1 − 0.42) × 0.65 = 0.08
– Analysis 2: ni3 → Giver, gei3 → Transfer, omitted → Recipient, yi2 → Theme
  Score: (1 − 0.78) × (1 − 0.65) × 0.42 = 0.03
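The two scores are just products of per-role "expressed" vs. "omitted" probabilities. A few lines of Python reproduce the arithmetic (the probabilities come from the slide; the helper function is hypothetical):

```python
P_OMITTED = {"Giver": 0.78, "Recipient": 0.42, "Theme": 0.65}

def omission_score(omitted_roles):
    """Product over roles: P(omitted) if the role is omitted, else 1 - P(omitted)."""
    score = 1.0
    for role, p in P_OMITTED.items():
        score *= p if role in omitted_roles else (1.0 - p)
    return score

print(round(omission_score({"Theme"}), 2))      # analysis 1 -> 0.08
print(round(omission_score({"Recipient"}), 2))  # analysis 2 -> 0.03
```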
Using frame and lexical information to restrict the type of reference
– Lexical Unit gei3: Giver (DNI), Recipient (DNI), Theme (DNI) – where DNI marks definite null instantiation, i.e. an omitted role recoverable from context
– The Transfer Frame: Giver, Recipient, Theme, Manner, Means, Place, Purpose, Reason, Time
Can the omitted argument be recovered from context?
Antecedent Fit:
– Analysis 1: ni3 → Giver, gei3 → Transfer, yi2 → Recipient, omitted → Theme
– Analysis 2: ni3 → Giver, gei3 → Transfer, omitted → Recipient, yi2 → Theme
Discourse & Situational Context: child, mother, peach, auntie, table – which entity fills the omitted role?
How good a theme is a peach? How about an aunt?
Semantic Fit: the Transfer Frame expects
– Giver (usually animate)
– Recipient (usually animate)
– Theme (usually inanimate)
– Analysis 1: ni3 → Giver, gei3 → Transfer, yi2 → Recipient, omitted → Theme
– Analysis 2: ni3 → Giver, gei3 → Transfer, omitted → Recipient, yi2 → Theme
The argument omission patterns shown earlier can be covered with just ONE construction
– Each construction is annotated with probabilities of omission
– A language-specific default probability can be set

Subj → Giver       P(omitted|cxn) = 0.78
Verb → Transfer
Obj1 → Recipient   P(omitted|cxn) = 0.42
Obj2 → Theme       P(omitted|cxn) = 0.65
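One plausible way to represent this single annotated construction as data is a sketch like the following, under the assumption that each constituent simply carries its omission probability (the field names are invented, not the system's):

```python
from dataclasses import dataclass, field

@dataclass
class Constituent:
    form: str               # e.g. "Subj"
    role: str               # e.g. "Giver"
    p_omitted: float = 0.0  # language/construction-specific default

@dataclass
class Construction:
    name: str
    constituents: list = field(default_factory=list)

ditransitive = Construction("Ditransitive-Transfer", [
    Constituent("Subj", "Giver",     p_omitted=0.78),
    Constituent("Verb", "Transfer"),  # the verb itself is not omissible
    Constituent("Obj1", "Recipient", p_omitted=0.42),
    Constituent("Obj2", "Theme",     p_omitted=0.65),
])
```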
Leverage process to simplify representation
The processing model is complementary to the theory of grammar.
By using a competition-based analysis process, we can:
– Find the best-fit analysis with respect to constituency structure, context, and semantics
– Eliminate the need to enumerate allowable patterns of argument omission in the grammar
This is currently being applied in models of language understanding and grammar learning.
Best-fit example with theme omitted
Example 2: ni3 gei3 yi2 (2PS give auntie): "You give auntie [the peach]."
– Subj → Giver: local? → local (ni3)
– Verb → Transfer: gei3
– Obj1 → Recipient: local? → local (yi2)
– Obj2 → Theme: omitted? → omitted
How to recover the omitted argument, in this case the peach?
– Lexical Unit gei3: Giver, Recipient, Theme
– The Transfer Frame: Giver, Recipient, Theme (DNI), plus Manner, Means, Place, Purpose, Reason, Time
– Omitted Obj2 → Theme
– Discourse & Situational Context: child, mother, auntie, peach, table
Best-fit example with recipient and theme omitted
Example 3: ao ni3 gei3 ya (EMP 2PS give EMP): "Oh (go on)! You give [auntie] [that]."
– Subj → Giver: local? → local (ni3)
– Verb → Transfer: gei3
– Obj1 → Recipient: omitted? → omitted
– Obj2 → Theme: omitted? → omitted
How to recover the omitted arguments, in this case the aunt and the peach?
– Lexical Unit gei3: Giver, Recipient, Theme
– The Transfer Frame: Giver, Recipient (DNI), Theme (DNI), plus Manner, Means, Place, Purpose, Reason, Time
– Omitted Obj1 → Recipient; omitted Obj2 → Theme
– Discourse & Situational Context: child, mother, auntie, peach, table
Modeling context for language understanding and learning
Linguistic structure reflects experiential structure:
– Discourse participants and entities
– Embodied schemas: action, perception, emotion, attention, perspective
– Semantic and pragmatic relations: spatial, social, ontological, causal
'Contextual bootstrapping' for grammar learning
The context model tracks accessible entities, events, and utterances
Discourse & Situational Context – Discourse01:
– participants: Eve, Mother
– objects: Hands, ...
– discourse-history: DS01
– situational-history: Wash-Action
Each of the items in the context model has rich internal structure
Participants:
– Eve: category: child, gender: female, name: Eve, age: 2
– Mother: category: parent, gender: female, name: Eve, age: 33
Objects:
– Hands: category: BodyPart, part-of: Eve, number: plural, accessibility: accessible
Situational History:
– Wash-Action: washer: Eve, washee: Hands
Discourse History:
– DS01: speaker: Mother, addressee: Eve, attentional-focus: Hands, content: {"are they clean yet?"}, speech-act: question
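As a sketch, such context records might be modeled as plain data classes. The attribute names follow the slide; the class layout itself is an assumption for illustration:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ContextElement:
    name: str
    category: str
    attributes: dict = field(default_factory=dict)  # gender, age, part-of, ...

@dataclass
class DiscourseSegment:
    speaker: ContextElement
    addressee: ContextElement
    attentional_focus: Optional[ContextElement]
    content: str
    speech_act: str

eve    = ContextElement("Eve", "child", {"gender": "female", "age": 2})
mother = ContextElement("Mother", "parent", {"gender": "female", "age": 33})
hands  = ContextElement("Hands", "BodyPart",
                        {"part-of": "Eve", "number": "plural",
                         "accessibility": "accessible"})
ds01 = DiscourseSegment(mother, eve, hands, "are they clean yet?", "question")
```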
Analysis produces a semantic specification
[Diagram: Utterance, Linguistic Knowledge, World Knowledge, and Discourse & Situational Context feed the Analysis process, which outputs a Semantic Specification.]
Example: "You washed them" → WASH-ACTION (washer: Eve, washee: Hands)
How Can Children Be So Good At Learning Language?
Gold's Theorem: no superfinite class of languages is identifiable in the limit from positive data only.
Principles & Parameters:
– Babies are born as blank slates but acquire language quickly (with noisy input and little correction)
– → Language must be innate: Universal Grammar + parameter setting
But babies aren't born as blank slates! And they do not learn language in a vacuum!
Key ideas for a neural theory of language acquisition
Nancy Chang and Eva Mok
Embodied Construction Grammar
– Opulence of the Substrate: prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge
– Basic Scenes: simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995; Slobin 1985)
– Verb Island Hypothesis: children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Tomasello 1992)
Embodiment and Grammar Learning
– A paradigm problem for Nature vs. Nurture
– The poverty of the stimulus
– The opulence of the substrate
– Intricate interplay of genetic and environmental (including social) factors
Two perspectives on grammar learning
Computational models:
– Grammatical induction: language identification; context-free grammars, unification grammars; statistical NLP (parsing, etc.)
– Word learning models: semantic representations (logical forms; discrete and continuous representations); statistical models
Developmental evidence:
– Prior knowledge: primitive concepts, event-based knowledge, social cognition, lexical items
– Data-driven learning: basic scenes, lexically specific patterns, usage-based learning
Key assumptions for language acquisition
Significant prior conceptual/embodied knowledge:
– A rich sensorimotor/social substrate
Incremental learning based on experience:
– Lexically specific constructions are learned first
Language learning tied to language use:
– Acquisition interacts with comprehension and production; it reflects communication and experience in the world
– Statistical properties of the data affect learning
Analysis draws on constructions and context
[Diagram: Form side: "you", "washed", "them", with "you" before "washed". Meaning side: Addressee, Wash-Action (washer, washee roles), ContextElement. Context: Wash-Action with washer Eve and washee Hands; Discourse Segment with addressee Eve and attentional-focus Hands.]
Learning updates linguistic knowledge based on input utterances
[Diagram: Utterance, World Knowledge, and Discourse & Situational Context feed Analysis, which produces a Partial SemSpec; Learning consumes the Partial SemSpec and updates Linguistic Knowledge.]
Context aids understanding: incomplete grammars yield a partial SemSpec
[Diagram: as before, the form side ("you", "washed", "them") is only partially mapped to meaning (Addressee, Wash-Action with washer/washee, ContextElement); context supplies Eve as washer and Hands as washee.]
Context bootstraps learning: a new construction maps form to meaning
[Diagram: the form relations ("you" before "washed", "washed" before "them") and the context-supported meaning (Wash-Action with washer = addressee Eve, washee = Hands) are paired to hypothesize a new form-meaning mapping.]
Context bootstraps learning: the new construction maps form to meaning

YOU-WASHED-THEM
constituents: YOU, WASHED, THEM
form:
  YOU before WASHED
  WASHED before THEM
meaning: WASH-ACTION
  washer: addressee
  washee: ContextElement
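A sketch of the hypothesize step that would emit such a construction from aligned form and meaning facts. This is a toy illustration under the assumption that word order plus context-derived role bindings suffice; it is not the model's actual code:

```python
def hypothesize(name, words, frame, bindings):
    """Pair word-order facts with context-derived role bindings
    to form a new (lexically specific) construction."""
    return {
        "name": name,
        "constituents": words,
        # adjacent words yield 'before' ordering constraints
        "form": [(a, "before", b) for a, b in zip(words, words[1:])],
        "meaning": {"frame": frame, **bindings},
    }

cxn = hypothesize("YOU-WASHED-THEM", ["YOU", "WASHED", "THEM"],
                  "WASH-ACTION",
                  {"washer": "addressee", "washee": "ContextElement"})
print(cxn["form"])  # [('YOU', 'before', 'WASHED'), ('WASHED', 'before', 'THEM')]
```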
Grammar learning: suggesting new CxNs and reorganizing existing ones
[Diagram: Utterance, World Knowledge, and Discourse & Situational Context feed Analysis, producing a Partial SemSpec. Learning operations on Linguistic Knowledge: hypothesize (map form to meaning, learn contextual constraints), reorganize (merge, join, split), and reinforcement.]
Challenge: How far up to generalize?
Examples: Eat rice, Eat apple, Eat watermelon; Want rice, Want apple, Want chair
[Ontology fragment: Inanimate Object above Manipulable Objects and Unmovable Objects; Food (Fruit: apple, watermelon; Savory: rice) under Manipulable Objects; Furniture (Chair, Sofa).]
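Choosing the generalization level amounts to finding the most specific common ancestor of the attested fillers in the ontology. A toy sketch; the ontology fragment and its attachments are assumptions reconstructed from the slide:

```python
ONTOLOGY = {  # child -> parent (assumed attachments)
    "apple": "Fruit", "watermelon": "Fruit", "rice": "Savory",
    "Fruit": "Food", "Savory": "Food",
    "Chair": "Furniture", "Sofa": "Furniture",
    "Food": "Manipulable Objects", "Furniture": "Unmovable Objects",
    "Manipulable Objects": "Inanimate Object",
    "Unmovable Objects": "Inanimate Object",
}

def ancestors(node):
    """Chain from the node up to the ontology root, inclusive."""
    chain = [node]
    while node in ONTOLOGY:
        node = ONTOLOGY[node]
        chain.append(node)
    return chain

def lowest_common_ancestor(fillers):
    """Most specific category covering all attested fillers."""
    common = set(ancestors(fillers[0]))
    for f in fillers[1:]:
        common &= set(ancestors(f))
    # the earliest common node along the first chain is the most specific
    return next(a for a in ancestors(fillers[0]) if a in common)

print(lowest_common_ancestor(["rice", "apple", "watermelon"]))  # Food
print(lowest_common_ancestor(["rice", "apple", "Chair"]))       # Inanimate Object
```

The challenge on the slide is exactly that the "correct" stopping point (Food vs. Inanimate Object) is underdetermined by the examples alone.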
Challenge: Omissible constituents
In Mandarin, almost anything available in context can be omitted – and often is in child-directed speech.
Intuition: the same context plus two expressions that differ by exactly one constituent → a general construction with that constituent marked omissible (see the sketch below).
May require verbatim memory traces of utterances + "relevant" context.
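The intuition can be sketched as a merge operation over memorized utterance traces: if two traces from the same context differ by exactly one constituent, mark that constituent omissible. Illustrative toy logic only, with invented names:

```python
def propose_omissible(trace_a, trace_b):
    """If two traces (word tuples) from the same context differ by one
    constituent present in A but absent from B, mark it omissible."""
    extra = [w for w in trace_a if w not in trace_b]
    if len(extra) == 1 and all(w in trace_a for w in trace_b):
        return {"pattern": trace_a, "omissible": extra[0]}
    return None

print(propose_omissible(("ni3", "gei3", "yi2"), ("ni3", "gei3")))
# {'pattern': ('ni3', 'gei3', 'yi2'), 'omissible': 'yi2'}
```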
When does the learning stop?
Bayesian Learning Framework: find the most likely grammar given the utterances and context:

  G* = argmax_G P(G) · P(data | G)

– The grammar prior P(G), over schemas + constructions, includes a preference for the "kind" of grammar.
– The likelihood P(data | G) is evaluated via analysis + resolution (producing a SemSpec) and context fitting.
– In practice, take the negative log and minimize cost: Minimum Description Length (MDL).
Intuition for MDL (1)
Suppose that the prior is inversely proportional to the size of the grammar (e.g. the number of rules).

Grammar A (3 rules):
S -> Give me NP
NP -> the book
NP -> a book

Grammar B (4 rules):
S -> Give me NP
NP -> DET book
DET -> the
DET -> a

With only "the book" and "a book" attested, the generalized grammar is larger: it's not worthwhile to make this generalization.
Intuition for MDL (2)
Grammar A (9 rules):
S -> Give me NP
NP -> the book
NP -> a book
NP -> the pen
NP -> a pen
NP -> the pencil
NP -> a pencil
NP -> the marker
NP -> a marker

Grammar B (8 rules):
S -> Give me NP
NP -> DET N
DET -> the
DET -> a
N -> book
N -> pen
N -> pencil
N -> marker

Now the generalized grammar is smaller, so the generalization pays off.
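With the slide's rule-count prior, the comparison reduces to counting rules. A minimal sketch, assuming both grammars cover the data equally well:

```python
grammar_flat = [
    "S -> Give me NP",
    "NP -> the book", "NP -> a book",
    "NP -> the pen", "NP -> a pen",
    "NP -> the pencil", "NP -> a pencil",
    "NP -> the marker", "NP -> a marker",
]
grammar_general = [
    "S -> Give me NP",
    "NP -> DET N",
    "DET -> the", "DET -> a",
    "N -> book", "N -> pen", "N -> pencil", "N -> marker",
]

# MDL with a rule-count prior: prefer the grammar with fewer rules,
# all else (coverage of the data) being equal.
print(len(grammar_flat), len(grammar_general))  # 9 8 -> generalize
```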
Usage-based learning: comprehension and production
[Diagram: the constructicon, world knowledge, and discourse & situational context support both directions of use. Comprehension: utterance → analyze & resolve → analysis → simulation. Production: communicative intent → generate → utterance (response). Learning: hypothesize constructions & reorganize, with reinforcement from usage and from correction on both paths.]