1
Lexicalizing and Combining Paul M. Pietroski University of Maryland Dept. of Linguistics, Dept. of Philosophy
2
Talk: One Slide Version
Lexical Meanings are what Composition Operations need them to be
–LMs are combinable via COs
–LMs exhibit types that COs can operate on
if Composition Operations are rabidly conjunctive, as in neo-Davidsonian semantic theories, then lexicalization has to be creative in ways that are otherwise unexpected
familiar (and otherwise puzzling) facts suggest that lexicalization is creative in these ways
so perhaps lexicalization is at the heart of what is uniquely human about our semantic capacities
3
Large Background Question
What makes humans linguistically special?
(i) Lexicalization: capacity to acquire words
(ii) Combination: capacity to combine words
(iii) Lexicalization and Combination
(iv) Something else entirely: e.g., distinctive representations that are simply paired with signals
4
Outline
background assumptions: Chomskyan
specific proposal (and caveats): neo-medieval
Fregean reminder: an invented language can be used to analyze (“recarve”) prior thoughts in cognitively useful ways
suggestion: acquiring a natural human language is cognitively useful, though in different ways
evidence that in lexicalization, prior concepts are creatively linked to monadic analogs (as required by neo-Davidsonian composition)
5
Composition Constrains Lexicalization
Semantic Composition (in Natural Human Languages): each expression-meaning is determined somehow by the constituent words and their arrangement
Immediate Questions: what do the words contribute? what does their arrangement contribute?
Cognitive Science Project, à la Marr: describe the/a mapping from arrangements of words to the corresponding expression-meanings; and say how this mapping is computed, in terms of representations and operations that expressions can invoke
Semantic Composition Corollary: word meanings are combinable via the (implemented/invokable) operations that correspond to arrangements of words in a naturally acquirable human language
6
Big Background Assumptions
natural human languages are I-Languages in Chomsky’s sense
a lexical meaning (I-meaning) can be described as an instruction to fetch a concept of some sort
a phrasal meaning can be described as an instruction to combine concepts in a certain way
‘I’ is for ‘Intensional’ (Procedural, Algorithmic); contrast with ‘Extensional’ (sets of I/O pairs)
‘I’ is also for ‘Implemented’ and ‘Invokable’; this leaves room for Externalism about concepts/truth
expressions as pairs of instructions
SEM: fetch and combine mental representations (in accord with certain constraints)
?PHON: make and combine articulatory gestures?
7
Composition Constrains Lexicalization
Semantic Composition: lexical meanings are combinable via I-Operations
--implemented and invoked by I-Languages
--on display in Logical Forms determined by LFs
Fido, chase, Felix:
saturation: CHASE(FIDO, FELIX), or with an event variable, CHASE(_, FIDO, FELIX)
conjunction: AGENT(_, FIDO) & CHASE(_) & PATIENT(_, FELIX)
closure: for some x: AGENT(_, x) & FIDO(x) & CHASE(_) & for some x: PATIENT(_, x) & FELIX(x)
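The contrast on this slide can be made concrete with a small runnable sketch (my illustration, not the slides' own code; the toy event encoding and predicate names are assumptions): saturation treats 'chase' as a dyadic concept saturated by FIDO and FELIX, while the neo-Davidsonian route conjoins a monadic event concept with thematic conjuncts and then applies existential closure.

```python
# Minimal sketch (illustrative assumption): saturation vs. neo-Davidsonian
# conjunction + existential closure, over a toy domain of events.

FIDO, FELIX = "fido", "felix"
EVENTS = [
    {"id": "e1", "kind": "chase", "agent": FIDO, "patient": FELIX},
    {"id": "e2", "kind": "flee",  "agent": FELIX, "patient": FIDO},
]

# Saturation: CHASE(FIDO, FELIX) -- a dyadic concept saturated by two arguments.
def chase_dyadic(x, y):
    return any(e["kind"] == "chase" and e["agent"] == x and e["patient"] == y
               for e in EVENTS)

# Neo-Davidsonian: AGENT(_, FIDO) & CHASE(_) & PATIENT(_, FELIX), then closure.
def chase_monadic(e): return e["kind"] == "chase"
def agent(e, x):      return e["agent"] == x
def patient(e, y):    return e["patient"] == y

def closure(pred):
    """Existential closure: some event satisfies the conjoined monadic concept."""
    return any(pred(e) for e in EVENTS)

by_saturation = chase_dyadic(FIDO, FELIX)
by_conjunction = closure(lambda e: agent(e, FIDO) and chase_monadic(e)
                         and patient(e, FELIX))
print(by_saturation, by_conjunction)  # True True
```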
8
Composition Constrains Lexicalization Semantic Composition lexical meanings are combinable via I-Operations --implemented and invoked by I-Languages --on display in Logical Forms determined by LFs Lexicalization Conforms to Composition if I-operations only operate on meanings of certain types, then lexical meanings are meanings of those types, even if the concepts (representations) lexicalized are not Lexicalization Provides what Composition Needs if lexical meanings are instructions to fetch concepts that can be combined via I-operations, then lexicalization may have to be a little creative
9
Foreshadowing: Two Kinds of Creativity
Introduce a language that invokes general and powerful operations (like Function-Application/λ-abstraction)
–lexicalize many concepts “directly”, KICK(x,y) + PF(‘kick’), or indirectly, by “shifting up” KICK(x,y) + PF(‘kick’)
–reanalyze many Subject-Predicate thoughts in polyadic terms: Number(3) ≡df ANCESTRAL[Predecessor(x, y)]
Introduce a language that invokes simple and restrictive operations (like Predicate-Conjunction/Monadicization)
–when lexicalizing nonmonadic concepts, make monadic analogs: KICK(x,y) + PF(‘kick’)
KICK(x,y) ≡df for some e, KICK(e,x,y)
KICK(e,x,y) ≡df AGENT(e, x) & KICK(e) & PATIENT(e, y)
–use this language to construct “neo-medieval” thoughts with many monadic constituents and just a few dyadic/thematic constituents
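As a hedged illustration of the second strategy (the function and role names are mine, not the slides'), adicity reduction can be sketched generically: given a concept lexicalized with n participants, manufacture a monadic event analog plus one dyadic thematic conjunct per participant role.

```python
# Sketch of "monadicization" under the stated assumptions: an n-adic concept
# is traded for a monadic event predicate plus dyadic thematic predicates,
# so that e.g. GIVE(x, y, z) iff for some e:
#   GIVE(e) & AGENT(e, x) & RECIPIENT(e, y) & THEME(e, z).

def monadic_analog(kind, roles):
    """Return (monadic event predicate, {role: dyadic thematic predicate})."""
    event_pred = lambda e: e["kind"] == kind
    thematic = {r: (lambda r: lambda e, x: e.get(r) == x)(r) for r in roles}
    return event_pred, thematic

GIVE, THETA = monadic_analog("give", ["agent", "recipient", "theme"])

e = {"kind": "give", "agent": "brutus", "recipient": "caesar", "theme": "ball"}
assert GIVE(e) and THETA["agent"](e, "brutus") and THETA["theme"](e, "ball")
```

The same recipe yields KICK(e) & AGENT(e, x) & PATIENT(e, y) as the monadic analog of KICK(x, y).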
10
Compositionality is an Explanandum
we can invent languages in which, for any expressions E1 and E2,
Meaning(E1^E2) = SATURATE[Meaning(E1), Meaning(E2)]
Meaning(E1^E2) = CONJOIN[Meaning(E1), Meaning(E2)]
Meaning(E1^E2) = DISJOIN[Meaning(E1), Meaning(E2)]
Meaning(E1^E2) = …E1…E2…if…, and otherwise __E1__E2__
perhaps in human languages, E1 and E2 differ in meaning only if (a) they differ with regard to the arrangement of their atomic parts, or (b) at least one atomic part of E1 differs in meaning from the corresponding atomic part of E2
but a language could meet this weak condition so long as each mode of combining expressions indicates some operation on meanings, even if none of the operations are naturally computable for humans, and none of the atomic meanings are results of human lexicalization
If compositionality is a “supervenience” thesis (see Szabo), we want to know which I-operations realize compositionality in human languages
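To make vivid how weak the bare compositionality condition is, here is a small sketch (my illustration; the operation names follow the slide) of several candidate composition operations, any one of which would make the meaning of E1^E2 a function of the meanings of E1 and E2.

```python
# Sketch (assumed illustration): alternative composition operations, each of
# which yields a "compositional" mapping in the weak, supervenience sense.

def SATURATE(f, a):     # apply a function-meaning to an argument-meaning
    return f(a)

def CONJOIN(p, q):      # conjoin two monadic predicate-meanings
    return lambda x: p(x) and q(x)

def DISJOIN(p, q):      # disjoin two monadic predicate-meanings
    return lambda x: p(x) or q(x)

brown = lambda x: x in {"rat1", "dog2"}
rat   = lambda x: x in {"rat1", "rat2"}

print(CONJOIN(brown, rat)("rat1"))                          # True
print(DISJOIN(brown, rat)("rat2"))                          # True
print(SATURATE(lambda p: p("rat1"), CONJOIN(brown, rat)))   # True
```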
11
Natural Composition is Constrained Natural Semantic Composition is Implemented –in kids –with innate circuitry that had to be evolved Natural Semantic Composition is Systematic –colorful human words combine easily –even if animal concepts are less promiscuous Natural Semantic Composition is Fast –novel expressions are often understood “on line” –as if words were associated with concepts that can be systematically combined, on demand, via simple operations implemented with innate circuitry
12
Foreshadowing: Two Kinds of Composition Function-Application / Saturation – allows for “direct” lexicalization of many concepts KICK(x,y) + PF(‘kick’) –is it implemented/invokable as a recursive I-operation? Predicate-Conjunction / Modification –requires “reformatting” for any prelexical nonmonadic concepts KICK(x,y) + PF(‘kick’) –is it implemented/invokable as a recursive I-operation?
13
Specific Proposal
a verb meaning is an instruction to fetch a monadic concept of “things” (events, states, processes, …) that can have “participants” (agents, patients, instruments, …)
Instance of a more general claim: a lexical meaning is an instruction to fetch a monadic concept of (i) things with participants, or (ii) participants of such things
Consequence of (neo-Davidsonian) composition: a phrasal meaning is an instruction to conjoin monadic concepts corresponding to the constituents
[chase V [a [brown rat N ]]]  →  CHASE(_) & PATIENT(_, ∃: BROWN[_] & RAT[_])
14
Specific Proposal
[chase V [a [brown rat N ]]]  →  CHASE(_) & PATIENT(_, ∃: BROWN[_] & RAT[_])
chase V  →  CHASE(_) & TENSABLE(_)
chase N (as in [cut V [to [the chase N ]]])  →  CHASE(_) & INDEXABLE(_)
kick V (as in [kick V Caesar N ])  →  KICK(_) & TENSABLE(_)
kick N (as in [a [swift kick N ]])  →  KICK(_) & INDEXABLE(_)
V contributes TENSABLE(_); N contributes INDEXABLE(_)
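A sketch of the conjunctive derivation for [chase V [a [brown rat N]]] (an assumed illustration: the toy domain, the treatment of 'a' as introducing an existentially bound participant, and the predicate names are mine, not the slides'):

```python
# Hedged sketch of the conjunctive proposal for "[chase [a [brown rat]]]":
# the phrasal meaning is a conjunction of monadic concepts, with the internal
# argument contributing a PATIENT conjunct whose participant is existentially bound.

THINGS = ["rat1", "rat2", "cat1"]
EVENTS = [{"id": "e1", "kind": "chase", "patient": "rat1"}]

BROWN   = lambda x: x in {"rat1"}
RAT     = lambda x: x in {"rat1", "rat2"}
CHASE   = lambda e: e["kind"] == "chase"
PATIENT = lambda e, x: e["patient"] == x

def conjoin(p, q):
    return lambda x: p(x) and q(x)

brown_rat = conjoin(BROWN, RAT)                    # BROWN(_) & RAT(_)

def a(restrictor):
    """'a' introduces an existentially bound participant of the event."""
    return lambda e: any(PATIENT(e, x) and restrictor(x) for x in THINGS)

chase_a_brown_rat = conjoin(CHASE, a(brown_rat))   # CHASE(_) & PATIENT(_, ∃: ...)
print(any(chase_a_brown_rat(e) for e in EVENTS))   # True
```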
15
Closely Related Questions
Which concepts can be fetched with words?
–Singular, Valence +1
–Monadic, Valence -1
–Dyadic, Valence -2
–Triadic, Valence -3
–???
Which operations get invoked to combine them?
–Saturation of an n-adic Concept (Adicity Reduction)
–Conjunction of Monadic Concepts (-1 & -1 → -1)
–???
Which adicities are exhibited by the “fetchable” concepts?
16
Another Question Even supposing that this is a coherent and defensible conception of adult competence… What does a lexicalizer do if (i) composition principles imply that chase V is an instruction to fetch a monadic concept, but (ii) the only good candidate concept for lexicalization is CHASE(x, y) or some other polyadic concept
17
Another Question What does a lexicalizer do if (i) composition principles imply that kick V is an instruction to fetch a monadic concept, but (ii) the only good candidate concept for lexicalization is KICK(x, y) or some other polyadic concept (of type <e,<e,t>> or higher)
18
Another Question What does a lexicalizer do if (i) composition principles imply that give V is an instruction to fetch a monadic concept, but (ii) the only good candidate concept for lexicalization is GIVE(x, y, z) or some other polyadic concept (of type <e,<e,<e,t>>> or higher)
19
Another Question What does a lexicalizer do if (i) composition principles imply that Caesar N is an instruction to fetch a monadic concept, but (ii) the only good candidate concept for lexicalization is a singular concept (a mental label, of type <e>) like CAESAR
20
A Possible Mind
21
First-Pass Hypothesis about Human Languages/Children chase V fetches CHASE(_) eat V fetches EAT(_) donate V fetches DONATE(_) give V fetches GIVE(_) rain V fetches RAIN(_) surround V fetches SURROUND(_) Caesar N fetches CAESARED(_) even if the concept lexicalized is not monadic because (i) semantic composition principles dictate that (open class) lexical items are instructions to fetch monadic concepts, and (ii) lexicalizers can and do invent monadic analogs of any nonmonadic concepts they lexicalize
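A hedged sketch of the idea that even a name like 'Caesar' can fetch a monadic concept, roughly CAESARED(_) or CALLED(_, 'Caesar'), so that names compose like other nouns; the encoding of name-bearers below is an assumption of mine, not the slides'.

```python
# Sketch: lexicalizing a singular concept (a mental label) by inventing a
# monadic analog true of the label's bearers. Encoding is illustrative.

def monadic_analog_of_label(label):
    """Turn a singular mental label into a monadic concept true of its bearers."""
    return lambda x: label in x.get("names", set())

CAESARED = monadic_analog_of_label("Caesar")
TYLERED  = monadic_analog_of_label("Tyler")

people = [{"names": {"Caesar", "Julius"}}, {"names": {"Tyler"}}, {"names": {"Tyler"}}]
print(sum(TYLERED(p) for p in people))   # 2 -- "There were three Tylers..."-style counting
print(any(CAESARED(p) for p in people))  # True
```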
22
Alternative Hypotheses (for comparison)
chase V fetches CHASE(x, y), the concept lexicalized
Caesar N fetches CAESAR, the concept lexicalized
[chase V Caesar N ] constructs CHASE(x, CAESAR)
Caesar N fetches λX.X(CAESAR)
chase V fetches λΨ.λΦ.Φ{λx.Ψ[λy.CHASE(x,y)]}
[chase V Caesar N ] constructs λΦ.Φ{λx.CHASE(x,CAESAR)}
[every D ] fetches λY.λX.INCLUDES({x:X(x)}, {x:Y(x)})
[dog N ] fetches λx.Dog(x)
[every D dog N ] constructs λX.INCLUDES({x:X(x)}, {x:Dog(x)})
[chase V [every D dog N ]] constructs λΦ.Φ{λx.for every dog y, CHASE(x, y)}
because (i) combining lexical items is an instruction to saturate, and (ii) lexicalizers can and do reanalyze chase V and Caesar N
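For comparison, the saturation-based alternative can also be sketched with higher-order functions (an illustration under assumed toy denotations, not the slides' code): a type-raised Caesar, a verb that takes an object quantifier, and 'every' as a generalized quantifier.

```python
# Sketch of the saturation-based alternative: Caesar is type-raised to
# λX.X(CAESAR); 'every dog' is a generalized quantifier; the verb takes an
# object quantifier and returns a predicate of subjects.

DOMAIN = ["caesar", "fido", "rex"]
DOGS = {"fido", "rex"}
CHASES = {("caesar", "fido"), ("caesar", "rex")}   # (chaser, chased) pairs

chase  = lambda obj_q: lambda x: obj_q(lambda y: (x, y) in CHASES)
caesar = lambda P: P("caesar")                      # λX.X(CAESAR)
dog    = lambda x: x in DOGS
every  = lambda restrictor: lambda scope: all(scope(x) for x in DOMAIN if restrictor(x))

# [chase [every dog]] : λx. for every dog y, CHASE(x, y)
chase_every_dog = chase(every(dog))
print(caesar(chase_every_dog))                      # True: Caesar chases every dog
```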
23
First-Pass Hypothesis about Human Languages/Children chase V fetches CHASE(_) Caesar N fetches CAESARED(_) even if the concept lexicalized is not monadic because (i) semantic composition principles dictate that (open class) lexical items are instructions to fetch monadic concepts, and (ii) lexicalizers can and do invent monadic analogs of any nonmonadic concepts they lexicalize
24
Caveat: Polysemy
To a first approximation, book fetches BOOK(_)
To a second approximation, book fetches one of: −abstract BOOK(_), +abstract BOOK(_)
To a third approximation, book fetches one of: ±abstract BOOK1(_), ±abstract BOOK2(_), …
To a fourth approximation, book N fetches one of …BOOK(_) and conjoins it with INDEXABLE(_)
25
Caveat: Subcategorization
Not saying that a verb meaning is merely an instruction to fetch a (tense-friendly) monadic concept of things that can have participants
Distinguish:
Semantic Composition Adicity Number (SCAN)
(instructions to fetch) singular concepts: +1
(instructions to fetch) monadic concepts: -1
(instructions to fetch) dyadic concepts: -2
…
Property of Smallest Sentential Entourage (POSSE)
zero (indexable) terms, one term, two terms, …
Hypothesis is that
–the SCAN of every verb/noun/adjective/adverb is -1
–but POSSE facts vary: zero, one, two, …
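The SCAN/POSSE distinction can be pictured as two independent fields of a lexical entry (a sketch of mine with assumed field names; the POSSE values other than put's three are illustrative guesses):

```python
# Sketch under the slide's hypothesis: every open-class item has SCAN = -1
# (it fetches a monadic concept), while POSSE -- the size of its smallest
# sentential entourage -- is an independent, item-specific fact.

from dataclasses import dataclass

@dataclass
class LexicalEntry:
    phon: str
    scan: int    # Semantic Composition Adicity Number: -1 for verbs/nouns/adjs/advs
    posse: int   # Property Of Smallest Sentential Entourage: 0, 1, 2, ...

LEXICON = {
    "rain":   LexicalEntry("rain",   scan=-1, posse=0),  # illustrative value
    "arrive": LexicalEntry("arrive", scan=-1, posse=1),  # illustrative value
    "kick":   LexicalEntry("kick",   scan=-1, posse=1),  # illustrative: 'The baby kicked'
    "put":    LexicalEntry("put",    scan=-1, posse=3),  # per the slides
}

def licensed(verb, n_arguments):
    """A clause is licensed only if the verb's entourage requirement is met."""
    return n_arguments >= LEXICON[verb].posse

print(licensed("kick", 1))   # True:  'The baby kicked'
print(licensed("put", 2))    # False: '*Brutus put the ball'
```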
26
Caveats POSSE facts may reflect, among other things (e.g. statistical experience), adicities of concepts lexicalized with verbs, as opposed to adicities of concepts fetched with verbs the verb put V may have a (lexically represented) POSSE of three in part because put V lexicalizes PUT(x, y, z) Not saying that every concept lexicalized is monadic arrive V may lexicalize ARRIVE(x) eat V may lexicalize EAT(x, y) and/or EAT(x) chase V may lexicalize CHASE(x, y) give V may lexicalize GIVE(x, y, z) sell V may lexicalize SELL(x, y, z, w) rain V may lexicalize RAIN(X) surround V may lexicalize god knows what Caesar N may (initially) lexicalize JULIUS
27
Terminology I-Operations: composition operations that are invoked by I-languages I-Concepts: concepts that are combinable via I-operations Human infants may have, and adults may retain, many concepts that are not I-concepts Humans may acquire many I-concepts by lexicalizing prior concepts that we share with other animals
28
(Diagram: prelexical concepts + words → I-concepts, alongside the prelexical concepts)
29
Historical Remark When Frege invented the modern logic that semanticists now take as given, his aim was to recast the Dedekind-Peano axioms (which were formulated with “subject-predicate” sentences, like ‘Every number has a successor’) in a new format, by using a new language that allowed for “fruitful definitions” and “transparent derivations” Frege’s invented language (Begriffsschrift) was a tool for abstracting formally new concepts, not just a tool for signifying existing concepts
30
But Frege…
wanted a fully general Logic for (Ideal Scientific) Polyadic Thought
treated monadicity as a special case of relationality: relations objects bear to truth values
often recast predicates like ‘number’ in higher-order relational terms, as in ‘thing to which zero bears the (identity-or-)ancestral-of-the-predecessor-relation relation’: Number(x) iff {ANCESTRAL[Predecessor(y, z)]}(0, x)
allowed for abstraction by having composition signify function-application, without constraints on atomic types; so his Begriffsschrift respects only a weak (and arguably unhuman) compositionality constraint
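Frege's recasting of 'number' can be sketched computationally over a finite fragment (an illustration of mine, not the slides'): form the (identity-or-)ancestral, i.e. the reflexive-transitive closure, of the predecessor relation, and let NUMBER(x) hold just in case zero bears that relation to x.

```python
# Sketch (illustrative, over a finite fragment): a Frege-style definition of
# 'number' via the (identity-or-)ancestral of the predecessor relation.

DOMAIN = list(range(10)) + ["julius"]         # some numbers plus a non-number
PREDECESSOR = {(n, n + 1) for n in range(9)}  # Predecessor(y, z): y immediately precedes z

def ancestral(relation, domain):
    """Reflexive-transitive closure of a dyadic relation (the 'identity-or-ancestral')."""
    closure = {(x, x) for x in domain} | set(relation)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in relation:
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

ANCESTRAL_PRED = ancestral(PREDECESSOR, DOMAIN)

def NUMBER(x):
    # 'thing to which zero bears the (identity-or-)ancestral-of-the-predecessor relation'
    return (0, x) in ANCESTRAL_PRED

print(NUMBER(7), NUMBER("julius"))            # True False
```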
31
By Contrast, my suggestion is that… I-Languages let us create concepts that formally efface adicity distinctions already exhibited by the concepts we lexicalize (and presumably share with other animals) the payoff lies in creating I-concepts that can be combined quickly, via dumb but implementable operations (like monadic concept-conjunction) FREGEAN ABSTRACTION: use powerful operations to extract logically interesting polyadic concepts from subject-predicate thoughts NEO-DAVIDSONIAN ABSTRACTION: use simple operations to extract logically boring monadic concepts from diverse animal thoughts
32
Related Topic (for another day): Number Neutrality as Effacing Conceptual Distinctions
I chase V those who chase V me
They gave V them the vase I gave V you
Italy surrounds V any pope whose priests surround its towns
chase V fetches CHASE(_); give V fetches GIVE(_); surround V fetches SURROUND(_)
fetchable concepts may be number-neutral even if lexicalized concepts aren’t, because (i) semantic composition principles may dictate that (open class) lexical items are instructions to fetch #-neutral concepts, and (ii) lexicalizers can/do invent #-neutral analogs of any essentially numbered concepts they lexicalize
33
Lexicalization as Concept-Abstraction
(Diagram: before lexicalization, a concept of adicity n; after, the concept of adicity n is linked, via a perceptible signal, to an abstracted concept of adicity k)
34
Lexicalization as Monadic-Concept-Abstraction
(Diagram: before lexicalization, a concept of adicity n; after, the concept of adicity n is linked, via a perceptible signal, to a concept of adicity -1: KICK(_, _) → KICK(event, _, _) → KICK(event))
35
Two Kinds of Facts to Accommodate Flexibilities –Brutus kicked Caesar –Caesar was kicked –The baby kicked –I get a kick out of you –Brutus kicked Caesar the ball Inflexibilities – Brutus put the ball on the table –*Brutus put the ball –*Brutus put on the table
36
Two Pictures of Lexicalization
(Diagram: before lexicalization, a concept of adicity n. Picture one: the concept is linked via a perceptible signal to a word of adicity (SCAN) n. Picture two: the concept is linked via a perceptible signal to a word of adicity -1. Further “flexibility” facts (as for ‘kick’) and further “posse” facts (as for ‘put’) remain to be accommodated.)
37
“Negative” Facts to Accommodate
Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types
<e,<e,<e,<e,t>>>> (instructions to fetch) tetradic concepts
<e,<e,<e,t>>> (instructions to fetch) triadic concepts
<e,<e,t>> (instructions to fetch) dyadic concepts
<e> (instructions to fetch) singular concepts
<<e,t>,<<e,t>,t>> (instructions to fetch) second-order dyadic concepts
38
“Negative” Facts to Accommodate
Brutus sald a car Caesar a dollar (x sold y to z (in exchange) for w)
sald → SOLD(x, w, z, y)
[sald [a car]] → SOLD(x, w, z, a car)
[[sald [a car]] Caesar] → SOLD(x, w, Caesar, a car)
[[[sald [a car]] Caesar] a dollar] → SOLD(x, $, Caesar, a car)
Brutus tweens Caesar Antony
tweens → BETWEEN(x, z, y)
[tweens Caesar] → BETWEEN(x, z, Caesar)
[[tweens Caesar] Antony] → BETWEEN(x, Antony, Caesar)
39
“Negative” Facts to Accommodate
Alexander jimmed the lock a knife
jimmed → JIMMIED(x, z, y)
[jimmed [the lock]] → JIMMIED(x, z, the lock)
[[jimmed [the lock]] [a knife]] → JIMMIED(x, a knife, the lock)
Brutus froms Rome
froms → COMES-FROM(x, y)
[froms Rome] → COMES-FROM(x, Rome)
40
“Negative” Facts to Accommodate
Brutus talls Caesar
talls → IS-TALLER-THAN(x, y)
[talls Caesar] → IS-TALLER-THAN(x, Caesar)
*Julius Caesar
Julius → JULIUS
Caesar → CAESAR
*JULIUS ^ CAESAR
41
Recall: Two Kinds of Creativity
Introduce a language that invokes general and powerful operations (like Function-Application/λ-abstraction)
–lexicalize many concepts “directly”: KICK(x,y) + PF(‘kick’)
–reanalyze many Subject-Predicate thoughts in polyadic terms: Number(3) ≡df ANCESTRAL[Predecessor(x, y)]
Introduce a language that invokes simple and restrictive operations (like Predicate-Conjunction/Monadicization)
–when lexicalizing nonmonadic concepts, make monadic analogs: KICK(x,y) + PF(‘kick’)
–use this language to construct “neo-medieval” thoughts with many monadic constituents and just a few dyadic/thematic constituents
42
Quantifiers also Present Negative Facts
Every boy who arrived (can’t mean that every boy arrived)
Every → INCLUDES(X, Y)
boy → {y: boy[y]}
[Every boy] → INCLUDES(X, {y: boy[y]})
who arrived → {x: arrived[x]}
[[Every boy] [who arrived]] → INCLUDES({x: arrived[x]}, {y: boy[y]})
But why not, if Every is of type <<e,t>,<<e,t>,t>>, boy is of type <e,t>, and who arrived is of type <e,t>?
Maybe Every is not of type <<e,t>,<<e,t>,t>>. Maybe the concept INCLUDES(X, Y) is lexicalized differently. (see Events & Semantic Architecture for a monadic analysis)
43
Quantifiers also Present Negative Facts
Equi → ONE-TO-ONE(X, Y)
[Equi boy] → ONE-TO-ONE(X, {y: boy[y]})
arrived → {x: arrived[x]}
[[Equi boy] arrived] → ONE-TO-ONE({x: arrived[x]}, {y: boy[y]})
no determiner fetches a “nonconservative” second-order dyadic concept, just as no verb fetches a “UTAH-violating” first-order dyadic concept
quase → λy.λx.CHASE(y, x)
[quase Felix] → λx.CHASE(Felix, x)
Fido [quase Felix] → CHASE(Felix, Fido)
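The "nonconservative" point can be made concrete with a small sketch over finite sets (the conservativity definition is the standard one; the code is my illustration, not the slides'): a determiner meaning D is conservative iff D(A, B) = D(A, A∩B) for all A, B. INCLUDES ('every') passes; the hypothetical ONE-TO-ONE ('Equi') fails.

```python
# Sketch: conservativity of determiner meanings over a tiny universe.
# EVERY (set inclusion) is conservative; EQUI (equinumerosity, i.e. a
# one-to-one correspondence between As and Bs) is not.

from itertools import chain, combinations

UNIVERSE = {1, 2, 3}

def subsets(s):
    return [set(c) for c in chain.from_iterable(combinations(s, r)
                                                for r in range(len(s) + 1))]

def EVERY(A, B): return A <= B            # INCLUDES: every A is a B
def EQUI(A, B):  return len(A) == len(B)  # ONE-TO-ONE: as many As as Bs

def conservative(D):
    return all(D(A, B) == D(A, A & B)
               for A in subsets(UNIVERSE) for B in subsets(UNIVERSE))

print(conservative(EVERY))  # True
print(conservative(EQUI))   # False
```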
44
REMEMBER…
There is little to no evidence of any (open class) lexical items fetching supradyadic I-concepts (… SCAN -5, SCAN -4, or even SCAN -3)
Brutus gave Caesar the ball
Brutus kicked Caesar the ball
Brutus gave/kicked the ball to Caesar
Various (e.g., Larsonian) analyses of ditransitive constructions, without ditransitive verbs
45
REMEMBER…
Not even English provides good evidence for (open class) lexical nouns of type <e>. On the contrary, it seems that singular concepts are not lexicalized “straight” with simple tags of type <e>
Every Tyler I saw was a philosopher
Every philosopher I saw was a Tyler
There were three Tylers at the party
That Tyler stayed late, and so did this one
Philosophers have wheels, and Tylers have stripes
The Tylers are coming to dinner
At noon, I saw Tyler Burge
I saw Tyler at noon
I saw Burge at noon
46
But…
If the basic mode of semantic composition is conjunction of monadic concepts, then we can start to explain the absence of lexical meanings like…
SOLD(x, w, z, y)
BETWEEN(x, z, y)
JIMMIED(x, z, y)
COMES-FROM(x, y)
IS-TALLER-THAN(x, y)
TYLER
47
Alternative Hypotheses (for comparison)
chase V fetches CHASE(x, y), the concept lexicalized
Caesar N fetches CAESAR, the concept lexicalized
[chase V Caesar N ] constructs CHASE(x, CAESAR)
Caesar N fetches λX.X(CAESAR)
chase V fetches λΨ.λΦ.Φ{λx.Ψ[λy.CHASE(x,y)]}
[chase V Caesar N ] constructs λΦ.Φ{λx.CHASE(x,CAESAR)}
If we don’t lexicalize “straight,” and if we reformat “monadically” as opposed to other ways, how come?
48
General Acquisition Question For any n… if I-Languages permit lexical items of adicity n, but we don’t lexicalize concepts of adicity n straightforwardly with words of adicity (SCAN) n, then theorists need to ask: How come? Possible Answer: I-languages don’t permit lexical items of adicity n
49
Specific Acquisition Questions For each n such that n ≠ -1 if I-Languages permit lexical items of adicity n, but we don’t lexicalize concepts of adicity n straightforwardly with words of adicity (SCAN) n, then theorists need to ask: How come? Possible Answer: I-languages don’t permit lexical items of any adicity (SCAN) other than -1
50
Related Questions Why do I-Languages have functional vocabulary? prepositions, little ‘v’, … And why are certain grammatical relations like “dedicated prepositions” that invoke certain thematic relations? Possible Answer: I-languages don’t permit (open class) lexical items of any adicity (SCAN) other than -1 And relational concepts cannot be monadicized without some “functional residue”
51
Recall Caveat: Subcategorization
Not saying that a verb meaning is merely an instruction to fetch a (tense-friendly) monadic concept of things that can have participants
Distinguish:
Semantic Composition Adicity Number (SCAN)
(instructions to fetch) singular concepts: +1
(instructions to fetch) monadic concepts: -1
(instructions to fetch) dyadic concepts: -2
…
Property of Smallest Sentential Entourage (POSSE)
zero (indexable) terms, one term, two terms, …
Hypothesis is that
–the SCAN of every verb/noun/adjective/adverb is -1
–but POSSE facts vary: zero, one, two, …
52
Recall: Facts to Accommodate Flexibilities –Brutus kicked Caesar –Caesar was kicked –The baby kicked –I get a kick out of you –Brutus kicked Caesar the ball Inflexibilities – Brutus put the ball on the table –*Brutus put the ball –*Brutus put on the table suggests that ‘kick’ fetches a monadic concept to which thematic conjuncts can be added compatible with ‘put’ fetching a monadic concept to which thematic conjuncts must be added
53
Two Pictures of Lexicalization
(Diagram: before lexicalization, a concept of adicity n. Picture one: the concept is linked via a perceptible signal to a word of adicity n. Picture two: the concept is linked via a perceptible signal to a word of adicity -1. Further “flexibility” facts (as for ‘kick’) and further “posse” facts (as for ‘put’) remain to be accommodated.)
54
What Makes Us Humans Special Linguistically?
Lexicalization and Combination
Lexicalization, but not Combination: I-operations implemented by our cousins; so maybe I-operations are simple and ancient, and lexicalization lets us employ I-operations in new ways
Combination, but not Lexicalization: in which case, our cousins can lexicalize like us
55
(Diagram: Infant → Child. The infant has modules (vision, audition, …), concepts, and the Human Language Faculty in its initial state; via experience and growth, plus lexicalization, the child has modules, concepts plus I-CONCEPTS, and the Human Language Faculty in a mature state: LEXICON and COMBINATORICS.)
56
Talk: One Slide Version
Lexical Meanings are what Composition Operations need them to be
–LMs are combinable via COs
–LMs exhibit types that COs can operate on
if Composition Operations are rabidly conjunctive, as in neo-Davidsonian semantic theories, then lexicalization has to be creative in ways that are otherwise unexpected
familiar (and otherwise puzzling) facts suggest that lexicalization is creative in these ways
so perhaps lexicalization is at the heart of what is uniquely human about our semantic capacities
57
THANKS
58
Ancient Question
(Diagram: prelexical concepts and I-Operations + words → prelexical concepts and I-Operations and I-Concepts)
59
EXTRA SLIDES
60
Marr on my Mind
compatible with a multi-stage conception of how semantic properties are determined/computed
“elementary” semantic composition may deliver only “primal sketches” of thoughts
(Diagram: a word-string’s SEM yields Sketch-1α, which may be elaborated into Sketch-1β, Sketch-2α, Sketch-3α; a homophonous word-string* has its own SEMs, Sketch-1α, …, Sketch-1ω)
61
Question What does a lexicalizer do if (i) composition principles imply that brown rat is an instruction to conjoin monadic concepts, but (ii) rat lexicalizes RAT(x), a (categorial) concept of certain animals, while brown lexicalizes BROWN(s) or BROWN(s, x) a (perhaps relational) concept of certain surfaces
62
DM-ish Description
chase √ + null N → chase N
the D + chase N → [the D chase N ] D
alternatively…
the D + chase √ → [the D chase √ ] D
[the D chase √ ] D = [the D chase N ] D
63
DM-ish Description
chase √ + null N → chase N
the D + chase N → [the D chase N ] D
Caesar √ + null N → Caesar N
that-1 D + Caesar N → [that-1 D Caesar N ] D
chase √ + null V → chase V
chase V + [the D man N ] D → [chase V [the D man N ] D ] V
64
Combination Decomposed
the D + man N → [the D man N ] D
chase V + [the D man N ] D → [chase V [the D man N ] D ] V
CONCATENATE(the D, man N ) → the D ^ man N
LABEL(the D ^ man N ) → [the D man N ] D
CONCATENATE(chase V, [the D man N ] D ) → chase V ^ [the D man N ] D
LABEL(chase V ^ [the D man N ] D ) → [chase V [the D man N ] D ] V
65
Combination Decomposed
chase √ + null N → chase N
CONCATENATE(chase √, null N ) → chase √ ^ null N
LABEL(chase √ ^ null N ) → [chase √ null N ] N
abbreviation: [chase √ null N ] N = chase N
chase √ + null V → chase V
CONCATENATE(chase √, null V ) → chase √ ^ null V
LABEL(chase √ ^ null V ) → [chase √ null V ] V
abbreviation: [chase √ null V ] V = chase V
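The two operations can be sketched as tiny tree-building functions (the tree encoding is an assumption of mine; the operation names follow the slides):

```python
# Sketch of combination decomposed into CONCATENATE and LABEL.

def CONCATENATE(alpha, beta):
    """Pair two expressions: alpha ^ beta."""
    return (alpha, beta)

def LABEL(pair, label):
    """Project a label onto a concatenated pair: [alpha beta]_label."""
    return {"label": label, "parts": pair}

# the_D + man_N  ->  [the_D man_N]_D
the_man = LABEL(CONCATENATE(("the", "D"), ("man", "N")), "D")

# chase_V + [the_D man_N]_D  ->  [chase_V [the_D man_N]_D]_V
chase_the_man = LABEL(CONCATENATE(("chase", "V"), the_man), "V")
print(chase_the_man["label"])   # V
```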
66
Meanings?
chase √ → an instruction to FETCH a monadic concept like CHASE(_)
null N → INDEXABLE(_)
null V → TENSABLE(_)
[chase √ null N ] N → an instruction to FETCH and CONJOIN two monadic concepts, forming a concept like CHASE(_) & INDEXABLE(_)
[chase √ null V ] V → CHASE(_) & TENSABLE(_)
Caesar √ → CALLED(_, PF:Caesar)
[Caesar √ null N ] N → CALLED(_, PF:Caesar) & INDEXABLE(_)
that-1 D → INDEXED(_, 1)
[that-1 D [Caesar √ null N ] N ] D → CALLED(_, PF:Caesar) & INDEXABLE(_) & INDEXED(_, 1)
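Treating SEMs as executable instructions can be sketched with a small interpreter (a hedged illustration; the concept store contents and the instruction format are my assumptions, not the slides'):

```python
# Sketch: a SEM as an instruction to FETCH a monadic concept from a conceptual
# store, or to CONJOIN the results of sub-instructions.

CONCEPT_STORE = {
    "CHASE":         lambda e: e.get("kind") == "chase",
    "TENSABLE":      lambda e: "time" in e,
    "INDEXABLE":     lambda e: "index" in e,
    "CALLED_Caesar": lambda e: e.get("name") == "Caesar",
}

def execute(sem):
    op = sem[0]
    if op == "FETCH":
        return CONCEPT_STORE[sem[1]]
    if op == "CONJOIN":
        parts = [execute(s) for s in sem[1:]]
        return lambda e: all(p(e) for p in parts)
    raise ValueError(f"unknown instruction {op}")

# [chase_root null_V]_V : FETCH and CONJOIN two monadic concepts
chase_V = execute(("CONJOIN", ("FETCH", "CHASE"), ("FETCH", "TENSABLE")))
print(chase_V({"kind": "chase", "time": "past"}))   # True
```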
67
Terminology Meanings: compositional properties of expressions of naturally acquired languages Concepts: composable mental representations Contents: mind&language-independent aspects of the world that we think/talk about Meanings (or if you prefer, “SEMs”) are (i) properties of naturally acquired expressions (ii) compositional in some specific way(s) that theorists have to discover
68
Assumptions
Theorists can focus on I-Languages in Chomsky’s sense
–Implemented Intensions, not Extensions (sets) of expressions
–procedures (algorithms) for generating expressions
–“steady states” of the human cognitive system that supports our natural acquisition and use of expression-generating procedures
–think of expressions as pairs of instructions (PHON, SEM) to/from “articulatory/perceptual” and “conceptual/intentional” systems
The following (Fregean) idea is at least coherent
–the atomic meaningful expressions of an acquired language can do more than merely associate prior concepts with signals
–acquiring words may be a process in which formally new concepts are abstracted from prior concepts: CHASE(x, y) + chase V → CHASE(e, x, y) → CHASE(e)
69
Bloom: “How Children Learn the Meanings of Words”
word meanings are, at least mainly, concepts that kids have prior to lexicalization
learning word meanings is, at least mainly, a process of figuring out which existing concepts are paired with which word-sized signals
in this process, kids draw on many capacities--including those that support recognition of syntactic cues and speaker intentions--but not capacities specific to learning word meanings
70
Lidz, Gleitman, and Gleitman “Clearly, the number of noun phrases required for the grammaticality of a verb in a sentence is a function of the number of participants logically implied by the verb meaning. It takes only one to sneeze, and therefore sneeze is intransitive, but it takes two for a kicking act (kicker and kickee), and hence kick is transitive. Of course there are quirks and provisos to these systematic form-to-meaning-correspondences…”
71
Terminology Language: anything that associates signals of some kind with interpretations of some kind (Human) I-Language: ‘I’ for ‘Intensional’, ‘Implemented’ a state of the language faculty that implements a child-acquirable algorithm for associating signals with concepts in a human way (Human) I-Operation: operation invoked by I-languages with regard to semantics, an invokable operation that permits combination of “fetchable/constructable” concepts (Human) I-Concepts: combinable via I-operations Human infants may have, and adults may retain, many concepts that are not I-concepts Humans may acquire many I-concepts by lexicalizing prior concepts that we share with other animals Brain as Computer metaphor: for any algorithm/program executed, what computational operations are invoked, (and how are they implemented)? if conjunction (as opposed to saturation) is the basic I-operation for semantic composition, that will have consequences for lexicalization