
1 Semantics (September 8, 2006)

2 Selectional Restrictions
Selectional restrictions: constraints on the types of arguments verbs take.
George assassinated the senator.
*The spider assassinated the fly.
assassinate: intentional (political?) killing
NOTE: dependence on the particular verb being used!

3 So? What about Case in General?
You may or may not see particular cases used in semantic analysis. In the book, they have NOT used the specific cases. But note that the "roles" they use are derived from the general cases identified in Fillmore's work; they make them verb-specific. Semantic analysis is going to take advantage of syntactic regularities and selectional restrictions to identify the role being played by each constituent in a sentence!

4 Representational Schemes
Let's go back to the question: what kind of semantic representation should we derive for a given sentence? We're going to make use of First Order Predicate Calculus (FOPC) as our representational framework. Not because we think it's perfect; all the alternatives turn out to be either too limiting or notational variants. Essentially the important parts are the same no matter which variant you choose!

5 FOPC Allows for…
The analysis of truth conditions: allows us to answer yes/no questions.
The use of variables: allows us to answer questions through the use of variable binding.
Inference: allows us to answer questions that go beyond what we know explicitly.

6 FOPC This choice isn't completely arbitrary or driven by the needs of practical applications. FOPC reflects the semantics of natural languages because it was designed that way by human beings. In particular…

7 Meaning Structure of Language
The semantics of human languages: display a basic predicate-argument structure; make use of variables (e.g., indefinites); make use of quantifiers (e.g., every, some); use a partially compositional semantics (sort of).

8 Predicate-Argument Structure
Events, actions and relationships can be captured with representations that consist of predicates and arguments. Languages display a division of labor where some words and constituents function as predicates and some as arguments. E.g., predicates represent the verb, and the arguments (in the right order) represent the cases of the verb.

9 Predicate-Argument Structure
Predicates: primarily verbs, VPs, PPs, adjectives, sentences; sometimes nouns and NPs.
Arguments: primarily nouns, nominals, NPs; but also everything else; as we'll see, it depends on the context.

10 Example
John gave a book to Mary. Giving(John, Mary, Book)
More precisely: gave conveys a three-argument predicate. The first argument is the giver (agent). The second is the recipient (to-poss), which is conveyed by the NP in the PP. The third argument is the thing given (theme), conveyed by the direct object.

11 More Examples
What about situations with missing or additional cases?
John gave Mary a book for Susan. Giving(John, Mary, Book, Susan)
John gave Mary a book for Susan on Wednesday. Giving(John, Mary, Book, Susan, Wednesday)
John gave Mary a book for Susan on Wednesday in class. Giving(John, Mary, Book, Susan, Wednesday, InClass)
Problem: remember that each of these predicates would be different because of the different number of arguments! Except for the suggestive names of the predicates and arguments, there is nothing that indicates the obvious logical relations among them.

12 Meaning Representation Problems
Assumes that the predicate representing the meaning of a verb has the same number of arguments as are present in the verb's syntactic subcategorization frame. This makes it hard to: determine the correct number of roles for any given event; represent facts about the roles associated with the event; ensure that all and only the correct inferences can be derived from the representation of an event.

13 Better
Turns out this representation isn't quite as useful as it could be: Giving(John, Mary, Book)
Better would be one where the "roles" or "cases" are separated out. E.g., consider:
∃x,y Giving(x) ∧ Giver(x, John) ∧ Givee(x, Mary) ∧ Given(x, y) ∧ Isa(y, Book)
Note: essentially Giver = Agent, Given = Theme, Givee = To-Poss.

14 Predicates The notion of a predicate just got more complicated…
In this example, think of the verb/VP as providing a template like the following: the semantics of the NPs and the PPs in the sentence plug into the slots provided in the template (we'll worry about how in a bit!)

15 Advantages Can have variable number of arguments associated with an event: events have many roles and fillers can be glued on as appear in the input. Specifies categories (e.g., book) so that we can make assertions about categories themselves as well as their instances. E.g., Isa(MobyDick, Novel), AKO(Novel, Book). Reifies events so that they can be quantified and related to other events and objects via sets of defined relations. Can see logical connections between closely related examples without the need for meaning postulates. 11/16/2018

16 Additional Material The following are some aspects covered in the book that will likely not be covered in lecture!

17 FOPC Syntax Terms: constants, functions, variables
Constants: objects in the world, e.g. Huey
Functions: concepts, e.g. sisterof(Huey)
Variables: x, e.g. sisterof(x)
Predicates: symbols that refer to relations that hold among objects in some domain, or properties that hold of some object in a domain: likes(Huey, kibble), cat(Huey)

18 Logical connectives permit compositionality of meaning
kibble(x) ⇒ likes(Huey,x)
cat(Vera) ∧ weird(Vera)
sleeping(Huey) ∨ eating(Huey)
Sentences in FOPC can be assigned truth values, T or F, based on whether the propositions they represent are T or F in the world.
Atomic formulae are T or F based on their presence or absence in a DB (Closed World Assumption?).
Composed meanings are inferred from the DB and the meaning of the logical connectives.
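To make the closed-world evaluation of connectives concrete, here is a minimal Python sketch (not from the slides): a fact database is a set of tuples, atomic formulae are true only if present in it, and composed meanings follow from the connectives. The fact set and predicate names are illustrative assumptions.

```python
# Minimal sketch: evaluating ground formulae against a fact database
# under the closed-world assumption (CWA).

FACTS = {("cat", "Vera"), ("weird", "Vera"), ("likes", "Huey", "kibble")}

def holds(atom):
    """An atomic formula is true iff it is present in the database (CWA)."""
    return atom in FACTS

def conj(*atoms):                       # logical 'and'
    return all(holds(a) for a in atoms)

def disj(*atoms):                       # logical 'or'
    return any(holds(a) for a in atoms)

def implies(antecedent, consequent):    # material implication
    return (not holds(antecedent)) or holds(consequent)

print(conj(("cat", "Vera"), ("weird", "Vera")))        # True
print(disj(("sleeping", "Huey"), ("eating", "Huey")))  # False under the CWA
```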

19 Limitations
cat(Huey)
sibling(Huey,Vera)
sibling(x,y) ∧ cat(x) ⇒ cat(y)
cat(Vera)??
Limitations:
Do 'and' and 'or' in natural language really mean '∧' and '∨'?
Mary got married and had a baby.
Your money or your life!
She was happy but ignorant.
Does '⇒' mean 'if'?
I'll go if you promise to wear a tutu.

20 Quantifiers, Inference, Production Systems
Existential quantification: There is a unicorn in my garden. Some unicorn is in my garden.
Universal quantification: The unicorn is a mythical beast. Unicorns are mythical beasts.
Inference -- modus ponens: from rich(Harry) and ∀x rich(x) ⇒ happy(x), conclude happy(Harry).
Production systems: forward and backward chaining.
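The production-system idea can be illustrated with a toy forward chainer that applies modus ponens until no new facts appear. The rule encoding below (one-place premises and conclusions) is a simplification assumed for the example, not a full FOPC inference engine.

```python
# Toy forward chaining over ground facts, illustrating repeated modus ponens.

facts = {("rich", "Harry")}
# Each rule (premise, conclusion) is read as "for all x: premise(x) -> conclusion(x)".
rules = [("rich", "happy")]

def forward_chain(facts, rules):
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            for pred, arg in list(facts):
                if pred == premise and (conclusion, arg) not in facts:
                    facts.add((conclusion, arg))   # one modus ponens step
                    changed = True
    return facts

print(forward_chain(facts, rules))   # adds ('happy', 'Harry')
```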

21 Temporal Representations
How do we represent time and temporal relationships between events?
Last year Martha Stewart was happy but soon she will be sad.
Where do we get temporal information? Verb tense, temporal expressions, sequence of presentation.
Linear representations: Reichenbach '47.

22 Utterance time: when the utterance occurs
Reference time: the temporal point-of-view of the utterance
Event time: when the events described in the utterance occur
George had intended to eat a sandwich. E - R - U
George is eating a sandwich. E,R,U
George had better eat a sandwich soon. R,U - E

23 Verbs and Event Types: Aspect
Statives: states or properties of objects at a particular point in time
Mary needs sleep. *Mary is needing sleep. *Need sleep. *Mary needs sleep in a week.
Activities: events with no clear endpoint
Harry drives a Porsche. *Harry drives a Porsche in a week.

24 Accomplishments and Achievements
Accomplishments: events with durations and endpoints that result in some change of state
Marlon filled out the form. Marlon stopped filling out the form (Marlon did not fill out the form) vs. Harry stopped driving a Porsche (Harry still drove a Porsche ... for a while)
Achievements: events that change state but have no particular duration
Larry reached the top. *Larry stopped reaching the top. *Larry reached the top for a few minutes.

25 Beliefs, Desires and Intentions
How do we represent internal speaker states like believing, knowing, wanting, assuming, imagining...?
Not well modeled by a simple DB lookup approach.
Truth in the world vs. truth in some possible world.
George imagined that he could dance. George believed that he could dance.
Augment FOPC with special modal operators that take logical formulae as arguments, e.g. believe, know.

26 Semantics A way of representing meaning
Abstracts away from syntactic structure.
Example -- First-Order Logic: watch(I, terrapin)
Can be: "I watched the terrapin" or "The terrapin was watched by me"
I = experiencer; watch the Terrapin = predicate; the Terrapin = patient

27 Compositional Semantics
Association of parts of a proposition with semantic roles; scoping.
[Tree diagram: Proposition with an Experiencer, "I" (1st pers, sg), and a Predicate (perc) whose pred is "saw" and whose patient is "the Terrapin".]

28 Predicate-Argument Structure
Syntactic structures:
I want Turkish food. (NP want NP)
I want to spend less than five dollars. (NP want InfVP)
I want it to be close by here. (NP want NP InfVP)
Verb subcategorization rules allow linking the arguments of syntactic structures with the semantic roles of those arguments in the semantic representation of the sentence. The study of the semantic roles associated with verbs is known as the study of thematic roles. In syntactic structures, there are restrictions on the categories of the arguments; similarly, there are also semantic restrictions on the arguments of predicates. Selectional restrictions specify these semantic restrictions on the arguments of verbs.

29 Representation of Categories
The semantics of the arguments are expressed in the form of selectional restrictions. These selectional restrictions are expressed in terms of semantically-based categories. The most common way to represent a category is to create a unary predicate: VegetarianRestaurant(Anarkali). Here categories are relations (not objects), and it is difficult to make assertions about the categories themselves. We cannot use MostPopular(Anarkali,VegetarianRestaurant), because VegetarianRestaurant is not an object: the arguments of formulas must be terms (predicates cannot be arguments in FOPC).

30 Representation of Categories -- Reification
The solution is to make each category an object. This technique is known as reification. Thus we can define relations between objects and categories, and relations between categories.
A membership relation ISA between objects and categories: ISA(Anarkali,VegetarianRestaurant)
A category inclusion relation AKO between categories: AKO(VegetarianRestaurant,Restaurant)
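A small sketch, under assumed Python data structures (sets of tuples, not part of the original slides), of how reified ISA/AKO facts support membership queries that follow category inclusion upward:

```python
# Sketch of reified categories: ISA links objects to categories, AKO links
# categories to more general categories. Names follow the slide's example;
# the extra "Business" category is a made-up illustration.

ISA = {("Anarkali", "VegetarianRestaurant")}
AKO = {("VegetarianRestaurant", "Restaurant"), ("Restaurant", "Business")}

def is_member(obj, category):
    """True if obj is an instance of category, following AKO links upward."""
    frontier = {cat for (o, cat) in ISA if o == obj}
    while frontier:
        if category in frontier:
            return True
        frontier = {sup for (sub, sup) in AKO if sub in frontier}
    return False

print(is_member("Anarkali", "Restaurant"))   # True
```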

31 Representations of Events
The simplest approach to the predicate-argument representation of a verb is to have the same number of arguments as are present in that verb's subcategorization frame. But this simple approach may cause some difficulties: determining the correct number of arguments; ensuring soundness and completeness.
Example:
I ate. Eating1(Speaker)
I ate a turkey sandwich. Eating2(Speaker,TurkeySandwich)
I ate a turkey sandwich at my desk. Eating3(Speaker,TurkeySandwich,Desk)
I ate at my desk. Eating4(Speaker,Desk)
I ate lunch. Eating5(Speaker,Lunch)
I ate a turkey sandwich for lunch. Eating6(Speaker,TurkeySandwich,Lunch)
I ate a turkey sandwich for lunch at my desk. Eating7(Speaker,TurkeySandwich,Lunch,Desk)

32 Representations of Events -- Another Approach
Using the maximum number of arguments and existential quantifiers will not solve the problem.
I ate at my desk. ∃x,y Eating(Speaker,x,y,Desk)
I ate lunch. ∃x,y Eating(Speaker,x,Lunch,y)
I ate lunch at my desk. ∃x Eating(Speaker,x,Lunch,Desk)
If we know that the 1st and 2nd formulas represent the same event, they can be combined as the 3rd formula. But we cannot do this, because we cannot relate events in this approach.

33 Representations of Events -- A Solution
We employ reification to elevate events to objects.
I ate. ∃x ISA(x,Eating) ∧ Eater(x,Speaker)
I ate a turkey sandwich. ∃x ISA(x,Eating) ∧ Eater(x,Speaker) ∧ Eaten(x,TurkeySandwich)
I ate at my desk. ∃x ISA(x,Eating) ∧ Eater(x,Speaker) ∧ PlaceEaten(x,Desk)
I ate lunch. ∃x ISA(x,Eating) ∧ Eater(x,Speaker) ∧ MealEaten(x,Lunch)
With the reified-event approach: there is no need to specify a fixed number of arguments; roles can be glued on as they appear in the input; and we do not need meaning postulates to relate the different versions of eating.
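As a rough illustration (not part of the original slides), a reified event can be modeled as an object carrying just the roles that the input supplies, so no family of Eating1..Eating7 predicates is needed; the dictionary encoding is an assumption made for the sketch.

```python
# Sketch: a reified event is an object with a category plus whatever role
# fillers appear in the input. Role names mirror the slide's examples.

def make_event(category, **roles):
    return {"ISA": category, **roles}

e1 = make_event("Eating", Eater="Speaker")
e2 = make_event("Eating", Eater="Speaker", Eaten="TurkeySandwich")
e3 = make_event("Eating", Eater="Speaker", Eaten="TurkeySandwich",
                MealEaten="Lunch", PlaceEaten="Desk")

# Roles are simply "glued on" as they appear in the input:
print(e1)
print(e3)
```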

34 Representations of Time
Time flows forward, and events are associated with either points or intervals in time. An ordering among events can be obtained by putting them on a timeline. There can be different schemas for representing this kind of temporal information (the study of temporal logic). The tense of a sentence will correspond to an ordering of the events related to that sentence (the study of tense logic).

35 Representations of Time -- Example
1. I arrived in Ankara. 2. I am arriving in Ankara. 3. I will arrive in Ankara.
All three sentences can be represented with the following formula, without any temporal information:
∃w ISA(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,Ankara)
We can add the following representations of temporal information to represent the tenses of these examples.
1. ∃w,i,e ISA(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,Ankara) ∧ IntervalOf(w,i) ∧ EndPoint(i,e) ∧ Precedes(e,Now)
2. ∃w,i,e ISA(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,Ankara) ∧ IntervalOf(w,i) ∧ MemberOf(i,Now)
3. ∃w,i,e ISA(w,Arriving) ∧ Arriver(w,Speaker) ∧ Destination(w,Ankara) ∧ IntervalOf(w,i) ∧ EndPoint(i,e) ∧ Precedes(Now,e)

36 Representations of Time (cont.)
The relation between simple verb tenses and points in time is not straightforward.
We fly from Ankara to Istanbul. -- present tense refers to a future event
Flight 12 will have been at the gate an hour by now. -- future tense refers to a past event
In some formalisms, the tense of a sentence is expressed with the relation among the times of the events in that sentence, the time of a reference point, and the time of utterance.

37 Reichenbach's Approach to Representing Tenses
Past Perfect (I had eaten.): E - R - U
Past (I ate.): R,E - U
Present (I eat.): U,R,E
Present Perfect (I have eaten.): E - R,U
Future (I will eat.): U,R - E
Future Perfect (I will have eaten.): U - E - R

38 Representation of Aspect
The notion of aspect concerns: whether an event has ended or is ongoing; whether it is conceptualized as happening at a point in time or over some interval; whether a particular state exists because of it.
Event expressions can be divided into four aspectual classes:
Stative -- an event participant having a property at a point of time. I know my departure gate.
Activity -- an event associated with some interval, without a clear end point. I drove a Ferrari.
Accomplishment -- an event with a natural end point that results in a particular state. He booked me a reservation.
Achievement -- an event that results in a particular state, but is instantaneous. He found the gate.

39 Representations of Beliefs
We can represent a belief as follows: I believe that Mary ate Thai food.
∃u,v ISA(u,Believing) ∧ ISA(v,Eating) ∧ Believer(u,Speaker) ∧ Believed(u,v) ∧ Eater(v,Mary) ∧ Eaten(v,ThaiFood)
But from this, we can derive the following (which may not be correct):
∃v ISA(v,Eating) ∧ Eater(v,Mary) ∧ Eaten(v,ThaiFood)
We may think that we can represent this as follows, but it will not be a FOPC formula:
Believing(Speaker, Eating(Mary,ThaiFood))
A solution is to augment FOPC with operators (modal logic with modal operators):
Believing(Speaker, ∃v ISA(v,Eating) ∧ Eater(v,Mary) ∧ Eaten(v,ThaiFood))
Inference will be complicated with modal logic.

40 Frames
We may use other representation languages instead of FOPC, but they will be equivalent to their representations in FOPC. For example, we may use frames to represent our believing example:
BELIEVING
  BELIEVER: Speaker
  BELIEVED: EATING
    EATER: Mary
    EATEN: ThaiFood

41 Semantic Analysis
Semantic analysis -- meaning representations are assigned to linguistic inputs. We need static knowledge from the grammar and the lexicon. How much semantic analysis do we need?
Deep analysis -- through syntactic and semantic analysis of the text, capture all pertinent information in the text.
Information extraction -- does not require complete syntactic and semantic analysis; a cascade of FSAs can produce a robust semantic analyzer.

42 Syntax-Driven Semantic Analysis
Principle of Compositionality -- the meaning of a sentence can be composed from the meanings of its parts. Ordering and grouping will be important.
Anarkali serves meat.
∃e ISA(e,Serving) ∧ Server(e,Anarkali) ∧ Served(e,Meat)
[Parse tree: S -> NP VP; NP -> ProperNoun (Anarkali); VP -> Verb (serves) NP; NP -> MassNoun (meat)]

43 Semantic Augmentation to CFG Rules
CFG rules are attached with semantic attachments. These semantic attachments specify how to compute the meaning representation of a construction from the meanings of its constituent parts. A CFG rule with a semantic attachment will be as follows:
A → α1, …, αn  { f(αj.sem, …, αk.sem) }
The meaning representation of A, A.sem, will be calculated by applying the function f to the semantic representations of some constituents.

44 Naïve Approach
ProperNoun → Anarkali { Anarkali }
MassNoun → meat { Meat }
NP → ProperNoun { ProperNoun.sem }
NP → MassNoun { MassNoun.sem }
Verb → serves { ∃e,x,y ISA(e,Serving) ∧ Server(e,x) ∧ Served(e,y) }
But we cannot propagate this representation to upper levels.

45 Using Lambda Notations
ProperNoun → Anarkali { Anarkali }
MassNoun → meat { Meat }
NP → ProperNoun { ProperNoun.sem }
NP → MassNoun { MassNoun.sem }
Verb → serves { λx λy ∃e ISA(e,Serving) ∧ Server(e,y) ∧ Served(e,x) }   (a lambda expression)
VP → Verb NP { Verb.sem(NP.sem) }   (application of the lambda expression)
S → NP VP { VP.sem(NP.sem) }   (application of the lambda expression)
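Python's own lambdas can mimic these semantic attachments. The sketch below (mine, not the book's) composes the logical form for "Anarkali serves meat" by applying Verb.sem to the object NP and then the resulting VP.sem to the subject NP; representing the logical form as a string is an assumption made for brevity.

```python
# Lambda-style semantic attachments using Python closures.

proper_noun_sem = "Anarkali"        # ProperNoun -> Anarkali { Anarkali }
mass_noun_sem   = "Meat"            # MassNoun -> meat { Meat }

# Verb -> serves { λx λy ∃e ISA(e,Serving) ∧ Server(e,y) ∧ Served(e,x) }
verb_sem = lambda x: lambda y: (
    f"∃e ISA(e,Serving) ∧ Server(e,{y}) ∧ Served(e,{x})"
)

vp_sem = verb_sem(mass_noun_sem)    # VP -> Verb NP { Verb.sem(NP.sem) }
s_sem  = vp_sem(proper_noun_sem)    # S  -> NP VP   { VP.sem(NP.sem) }

print(s_sem)
# ∃e ISA(e,Serving) ∧ Server(e,Anarkali) ∧ Served(e,Meat)
```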

46 Quasi-Logical Form
During semantic analysis, we may use quantified expressions as terms. In this case, our formula will not be a FOPC formula. We call this form of formula a quasi-logical form. A quasi-logical form should be converted into a normal FOPC formula by applying simple syntactic translations.
Server(e, <∃x ISA(x,Restaurant)>)  -- a quasi-logical formula
∃x ISA(x,Restaurant) ∧ Server(e,x)  -- a normal FOPC formula

47 Parse Tree with Logical Forms
S (bertrand writes principia): write(bertrand, principia)
NP (bertrand): bertrand
VP (writes principia): λy.write(y, principia)
V (writes): λx.λy.write(y, x)
NP (principia): principia

48 Lecture 24 Lexical Semantics
September 28, 2005

49 Meaning The meaning of a text or discourse
Traditionally, meaning in language has been studied from three perspectives: the meaning of a text or discourse; the meanings of individual sentences or utterances; the meanings of individual words. We started in the middle; now we'll look at the meanings of individual words.

50 Word Meaning
We didn't assume much about the meaning of words when we talked about sentence meanings: verbs provided a template-like predicate-argument structure, and nouns were practically meaningless constants. There has to be more to it than that: the internal structure of a word determines where it can go and what it can do (syntagmatic).

51 What’s a word? Words?: Types, tokens, stems, roots, inflected forms?
Lexeme: an entry in a lexicon consisting of a pairing of a form with a single meaning representation.
Lexicon: a collection of lexemes.

52 Lexical Semantics
The linguistic study of the systematic, meaning-related structure of lexemes is called lexical semantics. A lexeme is an individual entry in the lexicon. A lexicon is a meaning structure holding the meaning relations of lexemes. A lexeme may have different meanings; each meaning component of a lexeme is known as one of its senses.
Different senses of the lexeme duck: an animal, to lower the head, ...
Different senses of the (Turkish) lexeme yüz: face, to swim, to skin, the front of something, hundred, ...

53 Relations Among Lexemes and Their Senses
Homonymy, polysemy, synonymy, hyponymy, hypernymy.

54 Homonymy
Homonymy is a relation that holds between words that have the same form but unrelated meanings.
Bank -- financial institution vs. river bank. For this, we should create two senses of the lexeme bank.
Bat -- (wooden stick-like thing) vs. (flying scary mammal thing).
The problematic part of understanding homonymy isn't the forms, it's the meanings. Homonymy causes ambiguity. Nothing particularly important would happen to anything else in English if we used a different word for the little flying mammal things.

55 Polysemy
Polysemy is the phenomenon of multiple related meanings within the same lexeme.
Bank -- financial institution, blood bank: these senses are related. Are we going to create a single sense or two different senses? While some banks furnish sperm only to married women, others are less restrictive.
Serve: Which flights serve breakfast? Does America West serve Philadelphia? Does United serve breakfast and San Jose?
Most non-rare words have multiple meanings, and the number of meanings is related to a word's frequency. Verbs tend more to polysemy. Distinguishing polysemy from homonymy isn't always easy (or necessary).

56 Synonymy
Synonymy is the phenomenon of two different lexemes having the same meaning: big and large. More precisely, one of the senses of each of the two lexemes is the same. There aren't any true synonyms.
Two lexemes are synonyms if they can be successfully substituted for each other in all situations. What does "successfully" mean? It preserves the meaning, but may not preserve acceptability based on notions of politeness, slang, ...
Example -- big and large?
That's my big sister / a big plane
That's my large sister / a large plane

57 Hyponymy and Hypernymy
Hyponymy: one lexeme denotes a subclass of the other lexeme. The more specific lexeme is a hyponym of the more general lexeme; the more general lexeme is a hypernym of the more specific lexeme. A hyponymy relation can be asserted between two lexemes when the meanings of the lexemes entail a subset relation. Since dogs are canids, dog is a hyponym of canid and canid is a hypernym of dog. Car is a hyponym of vehicle; vehicle is a hypernym of car.
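If NLTK's WordNet interface is available (pip install nltk plus nltk.download('wordnet')), the dog/canid example can be checked programmatically; the sense identifiers 'dog.n.01' and 'canine.n.02' are assumptions about the current WordNet sense inventory.

```python
from nltk.corpus import wordnet as wn

dog = wn.synset('dog.n.01')
canid = wn.synset('canine.n.02')   # the "canid" sense of canine

# dog is a (transitive) hyponym of canid iff canid appears among
# dog's hypernym ancestors
ancestors = set(dog.closure(lambda s: s.hypernyms()))
print(canid in ancestors)          # expected: True

# a control case: cars are not canids
car = wn.synset('car.n.01')
print(canid in set(car.closure(lambda s: s.hypernyms())))   # expected: False
```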

58 Ontology
The term ontology refers to a set of distinct objects resulting from the analysis of a domain. A taxonomy is a particular arrangement of the elements of an ontology into a tree-like class inclusion structure. A lexicon holds the different senses of lexemes, together with other relations among lexemes.

59 Lexical Resources
There are lots of lexical resources available: word lists, on-line dictionaries, corpora. The most ambitious one is WordNet, a database of lexical relations for English; versions for other languages are under development.

60 WordNet
WordNet is a widely used lexical database for English. It holds the senses of lexemes; relations among nouns such as hypernym, hyponym, MemberOf, ...; and relations among verbs such as hypernym, ... Relations are held for each different sense of a lexeme.

61 WordNet Relations Some of WordNet's relations (for nouns)

62 WordNet Hierarchies Hyponymy chains for the senses of the lexeme bass

63 WordNet - bass
The noun "bass" has 8 senses in WordNet.
1. bass -- (the lowest part of the musical range)
2. bass, bass part -- (the lowest part in polyphonic music)
3. bass, basso -- (an adult male singer with the lowest voice)
4. sea bass, bass -- (the lean flesh of a saltwater fish of the family Serranidae)
5. freshwater bass, bass -- (any of various North American freshwater fish with lean flesh (especially of the genus Micropterus))
6. bass, bass voice, basso -- (the lowest adult male singing voice)
7. bass -- (the member with the lowest range of a family of musical instruments)
8. bass -- (nontechnical name for any of numerous edible marine and freshwater spiny-finned fishes)
The adjective "bass" has 1 sense in WordNet.
1. bass, deep -- (having or denoting a low vocal or instrumental range; "a deep voice"; "a bass voice is lower than a baritone voice"; "a bass clarinet")
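A sense listing like the one above can be reproduced, modulo WordNet version differences, with a few lines of NLTK (assuming the wordnet data has been downloaded):

```python
from nltk.corpus import wordnet as wn

for i, synset in enumerate(wn.synsets('bass'), start=1):
    # synset.pos() distinguishes the noun senses from the adjective sense
    print(i, synset.pos(), synset.name(), '-', synset.definition())
```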

64 WordNet - bass Hyponyms
Results for "Hyponyms (...is a kind of this), full" search of noun "bass": 6 of 8 senses of bass.
Sense 2: bass, bass part -- (the lowest part in polyphonic music)
  => ground bass -- (a short melody in the bass that is constantly repeated)
  => thorough bass, basso continuo -- (a bass part written out in full and accompanied by figures for successive chords)
  => figured bass -- (a bass part in which the notes have numbers under them to indicate the chords to be played)
Sense 4: sea bass, bass -- (the lean flesh of a saltwater fish of the family Serranidae)
  => striped bass, striper -- (caught along the Atlantic coast of the United States)
Sense 5: freshwater bass, bass -- (any of various North American freshwater fish with lean flesh (especially of the genus Micropterus))
  => largemouth bass -- (flesh of largemouth bass)
  => smallmouth bass -- (flesh of smallmouth bass)
Sense 6: bass, bass voice, basso -- (the lowest adult male singing voice)
  => basso profundo -- (a very deep bass voice)
Sense 7: bass -- (the member with the lowest range of a family of musical instruments)
  => bass fiddle, bass viol, bull fiddle, double bass, contrabass, string bass -- (largest and lowest member of the violin family)
  => bass guitar -- (the lowest six-stringed guitar)
  => bass horn, sousaphone, tuba -- (the lowest brass wind instrument)
    => euphonium -- (a bass horn (brass wind instrument) that is the tenor of the tuba family)
    => helicon, bombardon -- (a tuba that coils over the shoulder of the musician)
  => bombardon, bombard -- (a large shawm; the bass member of the shawm family)
Sense 8: bass -- (nontechnical name for any of numerous edible marine and freshwater spiny-finned fishes)
  => freshwater bass -- (North American food and game fish)

65 WordNet - bass Synonyms
Results for "Synonyms, ordered by estimated frequency" search of noun "bass": 8 senses of bass.
Sense 1: bass -- (the lowest part of the musical range)
  => low pitch, low frequency -- (a pitch that is perceived as below other pitches)
Sense 2: bass, bass part -- (the lowest part in polyphonic music)
  => part, voice -- (the melody carried by a particular voice or instrument in polyphonic music; "he tried to sing the tenor part")
Sense 3: bass, basso -- (an adult male singer with the lowest voice)
  => singer, vocalist, vocalizer, vocaliser -- (a person who sings)
Sense 4: sea bass, bass -- (the lean flesh of a saltwater fish of the family Serranidae)
  => saltwater fish -- (flesh of fish from the sea used as food)
Sense 5: freshwater bass, bass -- (any of various North American freshwater fish with lean flesh (especially of the genus Micropterus))
  => freshwater fish -- (flesh of fish from fresh water used as food)
Sense 6: bass, bass voice, basso -- (the lowest adult male singing voice)
  => singing voice -- (the musical quality of the voice while singing)
Sense 7: bass -- (the member with the lowest range of a family of musical instruments)
  => musical instrument, instrument -- (any of various devices or contrivances that can be used to produce musical tones or sounds)
Sense 8: bass -- (nontechnical name for any of numerous edible marine and freshwater spiny-finned fishes)
  => percoid fish, percoid, percoidean -- (any of numerous spiny-finned fishes of the order Perciformes)

66 Internal Structure of Words
Paradigmatic relations connect lexemes together in particular ways but don't say anything about what the meaning representation of a particular lexeme should consist of. Various approaches have been followed to describe the semantics of lexemes: thematic roles in predicate-bearing lexemes; selectional restrictions on thematic roles; decompositional semantics of predicates; feature structures for nouns.

67 Thematic Roles
Thematic roles provide a shallow semantic language for characterizing certain arguments of verbs. For example: Ali broke the glass. Veli opened the door. Ali is the Breaker and the glass is the BrokenThing of a Breaking event; Veli is the Opener and the door is the OpenedThing of an Opening event. These are deep roles of the arguments of events. Both of these events have actors that are the doers of a volitional event, and things affected by the action. A thematic role is a way of expressing this kind of commonality: AGENT and THEME are thematic roles.

68 Some Thematic Roles
AGENT -- the volitional causer of an event -- She broke the door.
EXPERIENCER -- the experiencer of an event -- Ali has a headache.
FORCE -- the non-volitional causer of an event -- The wind blows it.
THEME -- the participant most directly affected by an event -- She broke the door.
INSTRUMENT -- an instrument used in an event -- He opened it with a knife.
BENEFICIARY -- a beneficiary of an event -- I bought it for her.
SOURCE -- the origin of the object of a transfer event -- I flew from Rome.
GOAL -- the destination of the object of a transfer event -- I flew to Ankara.

69 Thematic Roles (cont.) Takes some of the work away from the verbs.
It's not the case that every verb is unique and has to completely specify how all of its arguments uniquely behave. Thematic roles provide a mechanism to organize semantic processing, and they permit us to distinguish near surface-level semantics from deeper semantics.

70 Linking AGENTS are often subjects
Thematic roles, syntactic categories, and their positions in larger syntactic structures are all intertwined in complicated ways. For example:
AGENTS are often subjects.
In a VP -> V NP NP rule, the first NP is often a GOAL and the second a THEME.

71 Deeper Semantics
He melted her reserve with a husky-voiced paean to her eyes.
If we label the constituents He and her reserve as the Melter and Melted, then those labels lose any meaning they might have had. If we make them Agent and Theme, then we don't have the same problems.

72 Selectional Restrictions
A selectional restriction augments thematic roles by allowing lexemes to place certain semantic restrictions on the lexemes and phrases that can accompany them in a sentence.
I want to eat someplace near Bilkent.
Now we can say that eat is a predicate that has an AGENT and a THEME, and that the AGENT must be capable of eating and the THEME must be capable of being eaten. Each sense of a verb can be associated with its own selectional restrictions:
THY serves New York. -- the direct object (theme) is a place
THY serves breakfast. -- the direct object (theme) is a meal
We may use these selectional restrictions to disambiguate a sentence.

73 As Logical Statements For eat…
Eating(e) ∧ Agent(e,x) ∧ Theme(e,y) ∧ Isa(y, Food)
(adding in all the right quantifiers and lambdas)

74 WordNet Use WordNet hyponyms (type) to encode the selection restrictions
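One possible sketch of this idea: treat eat's THEME restriction as "has food among its WordNet hypernym ancestors". The naive any-sense check and the choice of food.n.01/food.n.02 are assumptions made for illustration, not the textbook's algorithm.

```python
from nltk.corpus import wordnet as wn

FOOD = {wn.synset('food.n.01'), wn.synset('food.n.02')}

def can_be_eaten(noun):
    """True if some noun sense has a food synset among its hypernym ancestors."""
    for sense in wn.synsets(noun, pos=wn.NOUN):
        if FOOD & set(sense.closure(lambda s: s.hypernyms())):
            return True
    return False

print(can_be_eaten('sandwich'))   # expected: True
print(can_be_eaten('desk'))       # expected: False
```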

75 Specificity of Restrictions
What can you say about the THEME in each case with respect to the verb? Some will be high up in the WordNet hierarchy, others not so high…
PROBLEMS: unfortunately, verbs are polysemous and language is creative…
… ate glass on an empty stomach accompanied only by water and tea
you can't eat gold for lunch if you're hungry
… get it to try to eat Afghanistan

76 Discovering the Restrictions
Instead of hand-coding the restrictions for each verb, can we discover a verb's restrictions by using a corpus and WordNet?
Parse sentences and find heads.
Label the thematic roles.
Collect statistics on the co-occurrence of particular headwords with particular thematic roles.
Use the WordNet hypernym structure to find the most meaningful level to use as a restriction.

77 Motivation Find the lowest (most specific) common ancestor that covers a significant number of the examples
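A rough sketch of that step using NLTK's lowest_common_hypernyms; folding pairwise over a handful of observed theme synsets and taking the first result at each step is a simplification, and the example synsets are assumptions for illustration.

```python
from nltk.corpus import wordnet as wn

# Hypothetical theme headwords observed with a verb like "eat"
themes = [wn.synset('sandwich.n.01'), wn.synset('pizza.n.01'),
          wn.synset('soup.n.01')]

common = themes[0]
for other in themes[1:]:
    # take the first lowest common hypernym at each step (a simplification)
    common = common.lowest_common_hypernyms(other)[0]

print(common)   # e.g. a food-related ancestor such as Synset('dish.n.02')
```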

78 Word-Sense Disambiguation
Word sense disambiguation refers to the process of selecting the right sense for a word from among the senses that the word is known to have. Semantic selection restrictions can be used to disambiguate: ambiguous arguments to unambiguous predicates; ambiguous predicates with unambiguous arguments; ambiguity all around.

79 Word-Sense Disambiguation
We can use selectional restrictions for disambiguation: He cooked simple dishes. He broke the dishes.
But sometimes selectional restrictions will not be enough to disambiguate: What kind of dishes do you recommend? -- we cannot know which sense is used. There can also be two (or more) lexemes with multiple senses: They serve vegetarian dishes.
Selectional restrictions may also block finding any meaning: If you want to kill Turkey, eat its banks. Kafayı yedim. (a Turkish idiom, literally "I ate the head", i.e. "I went crazy"). These situations leave the system with no possible meanings, and they can indicate a metaphor.

80 WSD and Selection Restrictions
Ambiguous arguments: prepare a dish / wash a dish.
Ambiguous predicates: serve Denver / serve breakfast.
Both: serves vegetarian dishes.

81 WSD and Selection Restrictions
This approach is complementary to the compositional analysis approach. You need a parse tree and some form of predicate-argument analysis derived from the tree and its attachments, plus all the word senses coming up from the lexemes at the leaves of the tree. Ill-formed analyses are eliminated by noting any selection restriction violations.

82 Problems As we saw last time, selection restrictions are violated all the time. This doesn't mean that the sentences are ill-formed or preferred less than others. This approach needs some way of categorizing and dealing with the various ways that restrictions can be violated.

83 WSD Tags
What's a tag? A dictionary sense?
For example, for WordNet an instance of "bass" in a text has 8 possible tags or labels (bass1 through bass8).

84 WordNet Bass
The noun "bass" has 8 senses in WordNet:
1. bass - (the lowest part of the musical range)
2. bass, bass part - (the lowest part in polyphonic music)
3. bass, basso - (an adult male singer with the lowest voice)
4. sea bass, bass - (flesh of lean-fleshed saltwater fish of the family Serranidae)
5. freshwater bass, bass - (any of various North American lean-fleshed freshwater fishes especially of the genus Micropterus)
6. bass, bass voice, basso - (the lowest adult male singing voice)
7. bass - (the member with the lowest range of a family of musical instruments)
8. bass - (nontechnical name for any of numerous edible marine and freshwater spiny-finned fishes)

85 Representations Vectors of sets of feature/value pairs
Most supervised ML approaches require a very simple representation for the input training data: vectors of sets of feature/value pairs, i.e. files of comma-separated values. So our first task is to extract training data from a corpus with respect to a particular instance of a target word. This typically consists of a characterization of the window of text surrounding the target.

86 Representations This is where ML and NLP intersect. If you stick to trivial surface features that are easy to extract from a text, then most of the work is in the ML system. If you decide to use features that require more analysis (say, parse trees), then the ML part may be doing less work (relatively), if these features are truly informative.

87 Surface Representations
Collocational and co-occurrence information.
Collocational: encode features about the words that appear in specific positions to the right and left of the target word; often limited to the words themselves as well as their part of speech.
Co-occurrence: features characterizing the words that occur anywhere in the window, regardless of position; typically limited to frequency counts.

88 Collocational [guitar, NN, and, CJC, player, NN, stand, VVB]
Position-specific information about the words in the window:
guitar and bass player stand
[guitar, NN, and, CJC, player, NN, stand, VVB]
In other words, a vector consisting of [position n word, position n part-of-speech, …]
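A minimal feature extractor for such collocational vectors might look like the following; the hard-coded tagged sentence and the two-word window on each side are assumptions made for the example.

```python
# Sketch: collocational feature vector for a target word from a window of
# (word, POS) pairs, as in the "guitar and bass player stand" example.

tagged = [("guitar", "NN"), ("and", "CJC"), ("bass", "NN"),
          ("player", "NN"), ("stand", "VVB")]

def collocational_features(tagged, target_index, window=2):
    feats = []
    for offset in range(-window, window + 1):
        if offset == 0:
            continue                    # skip the target word itself
        i = target_index + offset
        word, pos = tagged[i] if 0 <= i < len(tagged) else ("<pad>", "<pad>")
        feats.extend([word, pos])
    return feats

print(collocational_features(tagged, target_index=2))
# ['guitar', 'NN', 'and', 'CJC', 'player', 'NN', 'stand', 'VVB']
```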

89 Co-occurrence Information about the words that occur within the window. First derive a set of terms to place in the vector. Then note how often each of those terms occurs in a given window.

90 Classifiers
Once we cast the WSD problem as a classification problem, all sorts of techniques are possible: Naïve Bayes (the right thing to try first), decision lists, decision trees, neural nets, support vector machines, nearest neighbor methods, …

91 Classifiers The choice of technique, in part, depends on the set of features that have been used. Some techniques work better or worse with numerical feature values. Some techniques work better or worse with features that have large numbers of possible values. For example, the feature "the word to the left" has a fairly large number of possible values.

92 Statistical Word-Sense Disambiguation
Choose ŝ = argmax_s P(s|V), where s ranges over the senses and V is the vector representation of the input.
By Bayes' rule: ŝ = argmax_s P(V|s) P(s) / P(V) = argmax_s P(V|s) P(s).
By making an independence assumption among the features, P(V|s) ≈ ∏_j P(v_j|s): the result is the product of the probabilities of the individual features given the sense.
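A toy version of this decision rule, with made-up probabilities purely for illustration; a real system would estimate the priors and likelihoods from sense-tagged training data.

```python
# Toy naive Bayes sense choice: s_hat = argmax_s P(s) * prod_j P(v_j | s).
from math import log

priors = {"bass_fish": 0.4, "bass_music": 0.6}          # invented numbers
likelihoods = {
    "bass_fish":  {"fishing": 0.30, "guitar": 0.01, "player": 0.05},
    "bass_music": {"fishing": 0.01, "guitar": 0.20, "player": 0.15},
}

def choose_sense(features):
    scores = {}
    for sense, prior in priors.items():
        # work in log space to avoid underflow on long feature vectors
        scores[sense] = log(prior) + sum(
            log(likelihoods[sense].get(f, 1e-6)) for f in features)
    return max(scores, key=scores.get)

print(choose_sense(["guitar", "player"]))   # bass_music
```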

93 Problems
Given these general ML approaches, how many classifiers do I need to perform WSD robustly? One for each ambiguous word in the language.
How do you decide what set of tags/labels/senses to use for a given word? It depends on the application.

