Slide 1: CPSC 503 Computational Linguistics, Lecture 10. Giuseppe Carenini

Slide 2: Knowledge-Formalisms Map (including probabilistic formalisms)
- State machines (and probabilistic versions): finite state automata, finite state transducers, Markov models -> morphology
- Rule systems (and probabilistic versions), e.g., (probabilistic) context-free grammars -> syntax
- Logical formalisms (first-order logics) -> semantics
- AI planners -> pragmatics, discourse and dialogue

Slide 3: Next Three Classes
- What meaning is and how to represent it
- How to map sentences into their meaning
- Meaning of individual words (lexical semantics)
- Computational lexical semantics tasks: word sense disambiguation, word similarity, semantic labeling

Slide 4: Today 16/10
- Semantics / meaning / meaning representations
- Linguistically relevant concepts in first-order predicate calculus (FOPC)
- Semantic analysis

Slide 5: Semantics
Def. Semantics: the study of the meaning of words, intermediate constituents, and sentences.
Def1. Meaning: a representation that expresses the linguistic input in terms of objects, actions, events, time, space, beliefs, attitudes, relationships.
Def2. Meaning: a representation that links the linguistic input to knowledge of the world.
Language independent!

Slide 6: Semantic Relations Involving Sentences
- Paraphrase: same meaning (same truth conditions). "I gave the apple to John" vs. "I gave John the apple"; "I bought a car from you" vs. "You sold a car to me"; "The thief was chased by the police" vs. "The police chased the thief".
- Entailment: "implication". "The park rangers killed the bear" entails "The bear is dead"; "Nemo is a fish" entails "Nemo is an animal".
- Contradiction: "I am in Vancouver" vs. "I am in India".

Slide 7: Meaning Structure of Language
How does language convey meaning?
- Grammaticization
- A partially compositional semantics
- A basic predicate-argument structure (e.g., verb complements)
- Words

Slide 8: Grammaticization
Concepts expressed by affixes:
- Past: -ed
- More than one: -s
- Again: re-
- Negation: in-, un-, de-
Concepts expressed by words from nonlexical categories:
- Obligation: must
- Possibility: may
- Definite, specific: the
- Indefinite, non-specific: a
- Disjunction: or
- Negation: not
- Conjunction: and

Slide 9: Common Meaning Representations
FOL, semantic nets, frames, and conceptual dependency, illustrated on the sentence "I have a car".

Slide 10: Requirements for Meaning Representations
Sample NLP task: giving advice about restaurants.
- Accept queries in natural language
- Generate appropriate responses by consulting a KB
E.g., "Does Maharani serve vegetarian food?" -> Yes. "What restaurants are close to the ocean?" -> C and Monks.

Slide 11: Verifiability
Example: Does LeDog serve vegetarian food?
A knowledge base (KB) expresses our world model in a formal language. Convert the question to the KB language and verify its truth value against the KB content: Yes / No / I do not know.
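A minimal sketch of this verification loop, assuming facts are stored as ground tuples; the KB contents and the closed-world flag are illustrative assumptions, not the lecture's actual system:

```python
# Hypothetical KB of ground facts, stored as (predicate, arg1, arg2) tuples.
KB = {
    ("serve", "LeDog", "VegetarianFood"),
    ("serve", "Maharani", "VegetarianFood"),
}

def verify(fact, kb, closed_world=False):
    """Return 'Yes', 'No', or 'I do not know' for a ground fact."""
    if fact in kb:
        return "Yes"
    # Under the closed-world assumption, absence means falsity;
    # otherwise absence only means the KB cannot decide.
    return "No" if closed_world else "I do not know"

print(verify(("serve", "LeDog", "VegetarianFood"), KB))   # Yes
print(verify(("serve", "LeDog", "Sushi"), KB))            # I do not know
```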

Slide 12: Canonical Form
Paraphrases should be mapped into the same representation:
- Does LeDog have vegetarian dishes?
- Do they have vegetarian food at LeDog?
- Are vegetarian dishes served at LeDog?
- Does LeDog serve vegetarian fare?
- ...

Slide 13: How to Produce a Canonical Form
Words have different senses: "food", "dish", and "fare" each have several senses, but share one overlapping meaning sense.
Meanings of alternative syntactic constructions are systematically related:
- [S [NP Maharani] serves [NP vegetarian dishes]] (server: Maharani; thing being served: vegetarian dishes)
- [S [NP vegetarian dishes] are served at [NP Maharani]] (thing being served: vegetarian dishes; server: Maharani)
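A toy sketch of collapsing the four paraphrases from the previous slide into one canonical form. The sense lexicon and the capitalization heuristic for finding the restaurant name are assumptions chosen for exactly these sentences; a real system would work from parses, as this slide explains:

```python
# Hypothetical sense lexicon: the one sense shared by food / dishes / fare.
LEXICON = {"food": "VegetarianFood", "dishes": "VegetarianFood",
           "fare": "VegetarianFood"}

def canonical_form(sentence):
    tokens = sentence.rstrip("?").split()
    served = next(LEXICON[t.lower()] for t in tokens if t.lower() in LEXICON)
    # Crude cue: the first non-sentence-initial capitalized token is the server.
    server = next(t for t in tokens[1:] if t[0].isupper())
    return f"serve({server}, {served})"

for s in ["Does LeDog have vegetarian dishes?",
          "Do they have vegetarian food at LeDog?",
          "Are vegetarian dishes served at LeDog?",
          "Does LeDog serve vegetarian fare?"]:
    print(canonical_form(s))    # serve(LeDog, VegetarianFood), four times
```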

Slide 14: Inference and Expressiveness
Consider a more complex request: "Can vegetarians eat at Maharani?" vs. "Does Maharani serve vegetarian food?" Why do these result in the same answer?
Inference: the system's ability to draw valid conclusions based on the meaning representations of inputs and its KB, e.g., serve(Maharani, VegetarianFood) => CanEat(Vegetarians, At(Maharani)).
Expressiveness: the system must be able to handle a wide range of subject matter.
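A toy forward-chaining sketch of this inference; the tuple encoding of facts and rules is an assumption made here, only the serve/CanEat rule itself comes from the slide:

```python
facts = {("serve", "Maharani", "VegetarianFood")}

# (premise, conclusion) pairs: if the premise is a known fact, add the conclusion.
rules = [
    (("serve", "Maharani", "VegetarianFood"),
     ("CanEat", "Vegetarians", "At(Maharani)")),
]

changed = True
while changed:                      # iterate until no rule adds a new fact
    changed = False
    for premise, conclusion in rules:
        if premise in facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(("CanEat", "Vegetarians", "At(Maharani)") in facts)  # True
```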

Slide 15: Non Yes/No Questions
Example: "I'd like to find a restaurant where I can get vegetarian food."
The indefinite reference becomes a variable: serve(x, VegetarianFood). Matching succeeds only if the variable x can be replaced by a known object in the KB.
"What restaurants are close to the ocean?" -> C and Monks
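A small sketch of answering such questions by binding the variable against KB objects; the "?x" variable convention and the KB facts are illustrative assumptions:

```python
KB = [
    ("serve", "Maharani", "VegetarianFood"),
    ("serve", "C and Monks", "Seafood"),
    ("close_to", "C and Monks", "Ocean"),
]

def match(query, fact):
    """Return a {variable: value} binding if fact matches query, else None."""
    if len(query) != len(fact):
        return None
    binding = {}
    for q, f in zip(query, fact):
        if q.startswith("?"):       # variables are marked with a leading '?'
            binding[q] = f
        elif q != f:
            return None
    return binding

def answer(query, kb):
    return [b for fact in kb if (b := match(query, fact)) is not None]

print(answer(("serve", "?x", "VegetarianFood"), KB))   # [{'?x': 'Maharani'}]
print(answer(("close_to", "?x", "Ocean"), KB))         # [{'?x': 'C and Monks'}]
```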

Slide 16: Meaning Structure of Language (recap)
How does language convey meaning?
- Grammaticization
- A partially compositional semantics
- A basic predicate-argument structure (e.g., verb complements)
- Words

Slide 17: Predicate-Argument Structure
Predicate-argument structures represent relationships among concepts. Some words act like arguments and some act like predicates:
- Nouns as concepts or arguments: red(ball)
- Adjectives, adverbs, and verbs as predicates: red(ball)
Subcategorization frames specify the number, position, and syntactic category of a verb's arguments. Examples: give NP2 NP1; find NP; sneeze [].
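A minimal sketch of checking complements against a subcategorization frame, encoding the slide's three examples as lists of required categories; the exact-match check is a simplifying assumption:

```python
FRAMES = {
    "give":   ["NP", "NP"],   # give NP2 NP1 (ditransitive)
    "find":   ["NP"],
    "sneeze": [],
}

def satisfies_frame(verb, complements):
    """True if the complement categories match the verb's frame exactly."""
    return FRAMES.get(verb) == complements

print(satisfies_frame("give", ["NP", "NP"]))   # True
print(satisfies_frame("sneeze", ["NP"]))       # False: sneeze takes no complement
```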

Slide 18: Semantic (Thematic) Roles
Semantic roles: participants in an event.
- Agent: George, in both "George hit Bill" and "Bill was hit by George"
- Theme: Bill, in both "George hit Bill" and "Bill was hit by George"
Others: Source, Goal, Instrument, Force, ...
Verb subcategorization can be extended to the realm of semantics: it allows linking arguments in surface structure with their semantic roles.
- Mary [Agent] gave/sent/read a book [Theme] to Ming [Goal]
- Mary [Agent] gave/sent/read Ming [Goal] a book [Theme]

Slide 19: Selectional Restrictions and Non-Verbal Predicate-Argument Structures
Semantic (selectional) restrictions constrain the types of arguments verbs take:
- George assassinated the senator
- *The spider assassinated the fly
Predicate-argument structure is not only verbal, e.g., prepositions: "A Spanish restaurant under the bridge" -> Under(SpanishRestaurant, Bridge).

Slide 20: First-Order Predicate Calculus (FOPC)
FOPC provides a sound computational basis for verifiability, inference, expressiveness, ...
- Supports determination of truth
- Supports canonical form
- Supports compositionality of meaning
- Supports question answering (via variables)
- Supports inference
- Provides predicate-argument structure

Slide 21: Today 16/10
- Semantics / meaning / meaning representations
- Linguistically relevant concepts in FOPC
- Semantic analysis

Slide 22: Linguistically Relevant Concepts in FOPC
- Categories and events (reification)
- Representing time
- Beliefs (optional; read if relevant to your project)
- Aspect (optional; read if relevant to your project)
- Description logics (optional; read if relevant to your project)

Slide 23: Categories and Events
Categories:
- VegetarianRestaurant(Joe's) treats the category as a relation; MostPopular(Joe's, VegetarianRestaurant) needs it as an object.
- Reification: ISA(Joe's, VegetarianRestaurant), AKO(VegetarianRestaurant, Restaurant).
Events, e.g., "make a reservation": Reservation(Speaker, Joe's, Today, 8PM, 2).
Problems:
- Determining the correct number of roles
- Representing facts about the roles associated with an event
- Ensuring that all and only the correct inferences can be drawn
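A short sketch of reified category membership with ISA and AKO, where membership propagates up the AKO chain; the extra Business category is an assumption added just to show transitivity:

```python
ISA = {("Joes", "VegetarianRestaurant")}                 # individual -> category
AKO = {("VegetarianRestaurant", "Restaurant"),           # category -> supercategory
       ("Restaurant", "Business")}

def member_of(individual, category):
    """True if individual ISA category, directly or via AKO ancestors."""
    frontier = {c for i, c in ISA if i == individual}
    seen = set()
    while frontier:
        cat = frontier.pop()
        if cat == category:
            return True
        seen.add(cat)
        frontier |= {g for s, g in AKO if s == cat and g not in seen}
    return False

print(member_of("Joes", "Restaurant"))   # True, via the AKO chain
```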

Slide 24: MUC-4 Example
Source sentence: "On October 30, 1989, one civilian was killed in a reported FMLN attack in El Salvador."
Extracted template (blank slots were unfilled):
INCIDENT: DATE: 30 OCT 89
INCIDENT: LOCATION: EL SALVADOR
INCIDENT: TYPE: ATTACK
INCIDENT: STAGE OF EXECUTION: ACCOMPLISHED
INCIDENT: INSTRUMENT ID:
INCIDENT: INSTRUMENT TYPE:
PERP: INCIDENT CATEGORY: TERRORIST ACT
PERP: INDIVIDUAL ID: "TERRORIST"
PERP: ORGANIZATION ID: "THE FMLN"
PERP: ORG. CONFIDENCE: REPORTED: "THE FMLN"
PHYS TGT: ID:
PHYS TGT: TYPE:
PHYS TGT: NUMBER:
PHYS TGT: FOREIGN NATION:
PHYS TGT: EFFECT OF INCIDENT:
PHYS TGT: TOTAL NUMBER:
HUM TGT: NAME:
HUM TGT: DESCRIPTION: "1 CIVILIAN"
HUM TGT: TYPE: CIVILIAN: "1 CIVILIAN"
HUM TGT: NUMBER: 1: "1 CIVILIAN"
HUM TGT: FOREIGN NATION:
HUM TGT: EFFECT OF INCIDENT: DEATH: "1 CIVILIAN"
HUM TGT: TOTAL NUMBER:

Slide 25: Subcategorization Frames
- I ate
- I ate a turkey sandwich
- I ate a turkey sandwich at my desk
- I ate at my desk
- I ate lunch
- I ate a turkey sandwich for lunch
- I ate a turkey sandwich for lunch at my desk
No fixed "arity"!

Slide 26: Reification Again
Advantages of reification:
- No need to specify a fixed number of arguments for a given surface predicate
- No more roles are postulated than are mentioned in the input
- Logical connections among related examples are specified
"I ate a turkey sandwich for lunch":
∃w Isa(w, Eating) ∧ Eater(w, Speaker) ∧ Eaten(w, TurkeySandwich) ∧ MealEaten(w, Lunch)
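A sketch of the "no fixed arity" payoff: a reified event is just a bag of role assertions about one variable, so roles can be added or dropped freely. The role names follow the slide; the string encoding is an illustration:

```python
def eat_event(**roles):
    """Build the conjuncts for: exists w. Isa(w, Eating) & <Role>(w, filler)..."""
    conjuncts = ["Isa(w, Eating)"]
    conjuncts += [f"{role}(w, {filler})" for role, filler in roles.items()]
    return "exists w. " + " & ".join(conjuncts)

print(eat_event(Eater="Speaker"))                        # "I ate"
print(eat_event(Eater="Speaker", Eaten="TurkeySandwich",
                MealEaten="Lunch"))   # "I ate a turkey sandwich for lunch"
```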

Slide 27: Representing Time
Events are associated with points or intervals in time. We can impose an ordering on distinct events using the notion of "precedes".
Temporal logic notation: (∃w,x,t) Arrive(w,x,t), with constraints on the variable t.
"I arrived in New York": (∃t) Arrive(I, NewYork, t) ∧ precedes(t, Now)

Slide 28: Interval Events
Interval events need a t_start and a t_end. "She was driving to New York until now":
∃ t_start, t_end, e, i: ISA(e, Drive) ∧ Driver(e, She) ∧ Dest(e, NewYork) ∧ IntervalOf(e, i) ∧ Startpoint(i, t_start) ∧ Endpoint(i, t_end) ∧ Precedes(t_start, Now) ∧ Equals(t_end, Now)
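A minimal sketch of interval events, assuming a numeric timeline with Now at 0 purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    start: float
    end: float

def precedes(t1, t2):
    return t1 < t2

NOW = 0.0
drive = Interval(start=-2.0, end=NOW)    # "She was driving to New York until now"

print(precedes(drive.start, NOW))        # True: Precedes(t_start, Now)
print(drive.end == NOW)                  # True: Equals(t_end, Now)
```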

Slide 29: Relation Between Tenses and Time
The relation between simple verb tenses and points in time is not straightforward.
- Present tense used like future: "We fly from Baltimore to Boston at 10"
- Complex tenses: "Flight 1902 arrived late" vs. "Flight 1902 had arrived late"

Slide 30: Reference Point
Reichenbach (1947) introduced the notion of a reference point (R), separated out from utterance time (U) and event time (E).
Example:
- "When Mary's flight departed, I ate lunch"
- "When Mary's flight departed, I had eaten lunch"
The departure event specifies the reference point.

Slide 31: Today 16/10
- Semantics / meaning / meaning representations
- Linguistically relevant concepts in FOPC
- Semantic analysis

Slide 32: Semantic Analysis
Syntax-driven semantic analysis: sentence + meanings of words + meanings of grammatical structures -> literal meaning.
Further analysis (via inference): literal meaning + discourse structure + context + common-sense and domain knowledge -> intended meaning.

Slide 33: Compositional Analysis
Principle of compositionality: the meaning of a whole is derived from the meanings of its parts.
What parts? The constituents of the syntactic parse of the input.
What could it mean for a part to have a meaning?

Slide 34: Compositional Analysis: Example
Worked example: "AyCaramba serves meat" (parse tree shown on the slide).

Slide 35: Augmented Rules
Augment each syntactic CFG rule with a semantic formation rule. Abstractly:
A -> a1 ... an   {f(aj.sem, ..., ak.sem)}
i.e., the semantics of A can be computed from some function f applied to the semantics of its parts. The class of actions performed by f will be quite restricted.

Slide 36: A Simple Extension of FOL: Lambda Forms
A lambda form is a FOL sentence with variables in it that are to be bound, e.g., λx.P(x).
Lambda-reduction: variables are bound by treating the lambda form as a function applied to formal arguments: λx.P(x)(A) => P(A).
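Lambda forms and lambda-reduction map directly onto ordinary function application; a sketch using Python closures, with a simplified Serves(y, x) semantics (the event variable is omitted as an assumption):

```python
# lambda x. lambda y. Serves(y, x)  -- a simplified lexical entry for "serves"
serves_sem = lambda x: lambda y: f"Serves({y}, {x})"

# Lambda-reduction = ordinary function application:
vp_sem = serves_sem("MEAT")      # binds x, yielding lambda y. Serves(y, MEAT)
s_sem = vp_sem("AC")             # binds y

print(s_sem)                     # Serves(AC, MEAT)
```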

Slide 37: Augmented Rules: Example
Easy parts first. Assigning constants:
- PropNoun -> AyCaramba   {AyCaramba}
- MassNoun -> meat        {MEAT}
Copying from daughters up to mothers:
- NP -> PropNoun   {PropNoun.sem}
- NP -> MassNoun   {MassNoun.sem}

Slide 38: Augmented Rules: Example (cont.)
The semantics attached to one daughter is applied to the semantics of the other daughter(s):
- S -> NP VP      {VP.sem(NP.sem)}
- VP -> Verb NP   {Verb.sem(NP.sem)}
- Verb -> serves  {a lambda-form}

Slide 39: Example
- S -> NP VP              {VP.sem(NP.sem)}
- VP -> Verb NP           {Verb.sem(NP.sem)}
- Verb -> serves          {a lambda-form, e.g., λx.λy.Serves(y, x)}
- NP -> PropNoun          {PropNoun.sem}
- NP -> MassNoun          {MassNoun.sem}
- PropNoun -> AyCaramba   {AC}
- MassNoun -> meat        {MEAT}
Applying the attachments bottom-up binds x to MEAT and then y to AC.
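A minimal end-to-end sketch of this derivation: each attachment becomes a function from daughter .sem values to the mother's .sem, applied bottom-up over a hand-built parse tree. The simplified Serves(y, x) lambda-form is an assumption standing in for a fuller event-based form:

```python
serves_sem = lambda x: lambda y: f"Serves({y}, {x})"      # Verb -> serves

ATTACHMENTS = {
    "S -> NP VP":            lambda np, vp: vp(np),        # {VP.sem(NP.sem)}
    "VP -> Verb NP":         lambda v, np: v(np),          # {Verb.sem(NP.sem)}
    "NP -> PropNoun":        lambda pn: pn,                # copy up
    "NP -> MassNoun":        lambda mn: mn,                # copy up
    "PropNoun -> AyCaramba": lambda: "AC",                 # constant
    "MassNoun -> meat":      lambda: "MEAT",               # constant
    "Verb -> serves":        lambda: serves_sem,           # lambda-form
}

def sem(tree):
    """Compute a node's .sem from its rule's attachment and its daughters' .sem."""
    rule, children = tree
    return ATTACHMENTS[rule](*[sem(c) for c in children])

parse = ("S -> NP VP",
         [("NP -> PropNoun", [("PropNoun -> AyCaramba", [])]),
          ("VP -> Verb NP",
           [("Verb -> serves", []),
            ("NP -> MassNoun", [("MassNoun -> meat", [])])])])

print(sem(parse))    # Serves(AC, MEAT)
```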

Slide 40: The Full Story Is More Complex
To deal properly with quantifiers:
- Permit lambda-variables to range over predicates
- Introduce complex terms to remain agnostic about final scoping

Slide 41: Quantifier Scope Ambiguity
As with PP attachment, the number of possible interpretations is exponential in the number of complex terms.
Solution: weak methods to prefer one interpretation over another:
- Likelihood of different orderings
- Mirror the surface ordering
- Domain-specific knowledge
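A tiny sketch of why readings blow up: each ordering of the complex terms is a candidate scoping, so the count grows factorially with the number of terms. The three quantified terms are invented examples:

```python
from itertools import permutations
from math import factorial

terms = ["every_restaurant", "a_meal", "some_critic"]   # hypothetical complex terms

readings = list(permutations(terms))
print(len(readings), "==", factorial(len(terms)))       # 6 == 6
for order in readings:
    print(" > ".join(order))    # each ordering is one candidate scoping
```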

Slide 42: Attachments for a Fragment of English (Sect. 18.5)
Sentences, noun phrases, verb phrases, prepositional phrases. Based on "The Core Language Engine" (1992).

Slide 43: Integration with a Parser
Assume you're using a dynamic-programming style parser (Earley or CYK). Two basic approaches:
- Integrate semantic analysis into the parser (assign meaning representations as constituents are completed)
- Pipeline: assign meaning representations to trees only after they're completed

Slide 44: Pros and Cons
Integration:
- Pro: can use semantic constraints to cut off parses that make no sense
- Con: assigns meaning representations to constituents that don't take part in any correct parse
Pipeline:
- Pro: assigns meaning representations only to constituents that take part in a correct parse
- Con: the parser needs to generate all correct parses

Slide 45: Next Time
Read Chp. 19 (Lexical Semantics).

Slide 46: Non-Compositionality
Unfortunately, there are lots of examples where the meaning of a constituent can't be derived from the meanings of its parts:
- Metaphor (e.g., corporation as person)
- Metonymy (e.g., "the White House" for the administration)
- Idioms
- Irony
- Sarcasm
- Indirect requests, etc.

Slide 47: English Idioms
"buy the farm", "bite the bullet", "bury the hatchet", etc. There are lots of these: constructions where the meaning of the whole is either
- Totally unrelated to the meanings of the parts ("kick the bucket"), or
- Related in some opaque way ("run the show")

Slide 48: The Tip of the Iceberg
"Enron is the tip of the iceberg."
A fixed rule NP -> "the tip of the iceberg" {...} is too rigid, given variants like:
- "the tip of an old iceberg"
- "the tip of a 1000-page iceberg"
- "the merest tip of the iceberg"
Better: NP -> TipNP of IcebergNP {...}, where TipNP is an NP with "tip" as its head and IcebergNP is an NP with "iceberg" as its head.

Slide 49: Handling Idioms
- Mix lexical items and grammatical constituents
- Introduce idiom-specific constituents
- Permit semantic attachments that introduce predicates unrelated to the constituents
NP -> TipNP of IcebergNP {small-part(), beginning(), ...}, where TipNP is an NP with "tip" as its head and IcebergNP is an NP with "iceberg" as its head.
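A toy sketch of this idiom-specific rule: when the daughter NPs have the right heads, the attachment introduces small-part and beginning predicates unrelated to the constituents' literal meanings. The last-word head heuristic and the literal fallback are assumptions for illustration:

```python
def head(np_tokens):
    """Crude head finder: take the last word of the NP."""
    return np_tokens[-1].lower()

def np_sem(tokens):
    if "of" in tokens:
        i = tokens.index("of")
        tip_np, iceberg_np = tokens[:i], tokens[i + 1:]
        if head(tip_np) == "tip" and head(iceberg_np) == "iceberg":
            return "small-part(x) & beginning(x)"      # idiomatic reading
    return f"literal({' '.join(tokens)})"              # fall back to compositional

print(np_sem("the merest tip of the 1000-page iceberg".split()))
# small-part(x) & beginning(x)
print(np_sem("the leg of the table".split()))          # literal(...)
```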

