CMSC 723: Intro to Computational Linguistics
Lecture 12: Lexical Semantics
November 24, 2004
Bonnie Dorr, Christof Monz

Meaning
So far, we have focused on the structure of language, not on what things mean. We have seen that words have different meanings depending on the context in which they are used.
Everyday language tasks that require some semantic processing:
- Answering an essay question on an exam
- Deciding what to order at a restaurant by reading a menu
- Realizing you've been insulted

Meaning (continued)
Meaning representations are representations that link linguistic forms to knowledge of the world.
We are going to cover:
- What the meaning of a word is
- How we can represent meaning
- What formalisms can be used
- Meaning representation languages

What Can Serve as a Meaning Representation?
Anything that serves the core practical purposes of a program that is doing semantic processing.
- What is a meaning representation language?
- What is semantic analysis?

Requirements for Meaning Representation
- Verifiability
- Unambiguous representation
- Canonical form
- Inference
- Expressiveness

Verifiability
The system can match an input representation against representations in its knowledge base. If it finds a match, it returns Yes; otherwise No.
Example: Does Maharani serve vegetarian food?
Serves(Maharani, VegetarianFood)
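To make this concrete, here is a minimal sketch of verifiability (not from the lecture; all names are illustrative): the KB is a set of ground facts, and verification reduces to membership lookup.

```python
# The KB holds ground facts as tuples: (predicate, arg1, arg2, ...).
KB = {
    ("Serves", "Maharani", "VegetarianFood"),
    ("Restaurant", "Maharani"),
}

def verify(fact):
    """Return 'Yes' if the fact matches a KB entry, 'No' otherwise."""
    return "Yes" if fact in KB else "No"

print(verify(("Serves", "Maharani", "VegetarianFood")))  # Yes
print(verify(("Serves", "Maharani", "Steak")))           # No
```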

Unambiguous Representation
- A single linguistic input can have different meaning representations.
- Each representation unambiguously characterizes exactly one meaning.
Example: "small cars and motorcycles are allowed"
- Reading 1 (small modifies both): car(x) ∧ small(x) ∧ motorcycle(y) ∧ small(y) ∧ allowed(x) ∧ allowed(y)
- Reading 2 (small modifies only cars): car(x) ∧ small(x) ∧ motorcycle(y) ∧ allowed(x) ∧ allowed(y)

Ambiguity and Vagueness
- An expression is ambiguous if, in a given context, it can be disambiguated to have a specific meaning from a number of discrete possible meanings. E.g., bank (financial institution) vs. bank (river bank).
- An expression is vague if it refers to a range of a scalar variable such that, even in a specific context, it is hard to specify the range entirely. E.g., "he's tall", "it's warm".

Representing Similar Concepts
Distinct inputs can have the same meaning:
- Does Maharani have vegetarian dishes?
- Do they have vegetarian food at Maharani?
- Are vegetarian dishes served at Maharani?
- Does Maharani serve vegetarian fare?
Alternatives:
- Four different semantic representations
- Store all possible meaning representations in the KB

Canonical Form
Solution: inputs that mean the same thing get the same meaning representation.
Is this easy? No!
- Vegetarian dishes, vegetarian food, vegetarian fare
- Have, serve
What to do?

How to Produce a Canonical Form
Systematic meaning representations can be derived from a thesaurus: "food", "dish", and "fare" all share one overlapping meaning sense.
We can also systematically relate syntactic constructions:
- [S [NP Maharani] serves [NP vegetarian dishes]]
- [S [NP vegetarian dishes] are served at [NP Maharani]]

Inference
Consider a more complex request: "Can vegetarians eat at Maharani?" vs. "Does Maharani serve vegetarian food?"
Why do these result in the same answer?
Inference: draw conclusions about the truth of propositions not explicitly stored in the KB.
Serves(Maharani, VegetarianFood) ⇒ CanEat(Vegetarians, AtMaharani)
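A minimal forward-chaining sketch of this kind of inference (illustrative names and rules, not the lecture's implementation): rules derive new facts from facts already in the KB until nothing new can be added.

```python
KB = {("Serves", "Maharani", "VegetarianFood")}

# Each rule pairs a premise fact with a conclusion fact.
RULES = [
    (("Serves", "Maharani", "VegetarianFood"),
     ("CanEat", "Vegetarians", "AtMaharani")),
]

def forward_chain(kb, rules):
    """Repeatedly apply rules until no new facts can be derived."""
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in kb and conclusion not in kb:
                kb.add(conclusion)
                changed = True
    return kb

forward_chain(KB, RULES)
print(("CanEat", "Vegetarians", "AtMaharani") in KB)  # True
```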

Non-Yes/No Questions
Example: "I'd like to find a restaurant where I can get vegetarian food."
Serves(x, VegetarianFood)
Matching succeeds only if the variable x can be replaced by a known object in the KB.
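A small sketch of answering such a question (again illustrative, not the lecture's code): a query containing a variable is answered by collecting the constants that can replace it.

```python
VAR = "?x"  # a simple variable marker

KB = {
    ("Serves", "Maharani", "VegetarianFood"),
    ("Serves", "Caprice", "FrenchFood"),
}

def answer(query, kb):
    """Return all constants that can replace the variable in the query."""
    bindings = []
    for fact in kb:
        if len(fact) == len(query) and all(
            q == VAR or q == f for q, f in zip(query, fact)
        ):
            bindings.extend(f for q, f in zip(query, fact) if q == VAR)
    return bindings

print(answer(("Serves", VAR, "VegetarianFood"), KB))  # ['Maharani']
```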

Meaning Structure of Language
Human languages:
- Display a basic predicate-argument structure
- Make use of variables
- Make use of quantifiers
- Display a partially compositional semantics

Compositionality
The compositionality principle is an important principle in formal semantics: the meaning of an expression is a strict function of the meanings of its parts.
It allows us to build meaning representations incrementally.
Standard predicate logic does not adhere to this principle (donkey sentences).
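As an informal illustration of building meaning incrementally, here is a sketch using Python lambdas as a stand-in for the typed lambda calculus (the representation strings are illustrative):

```python
# Lexical meanings: a transitive verb takes an object, then a subject.
serves = lambda obj: lambda subj: f"Serves({subj},{obj})"
maharani = "Maharani"
vegetarian_food = "VegetarianFood"

# The sentence meaning is a strict function of the meanings of its parts:
# [S [NP Maharani] [VP serves [NP vegetarian food]]]
vp = serves(vegetarian_food)   # meaning of the verb phrase
s = vp(maharani)               # meaning of the whole sentence
print(s)                       # Serves(Maharani,VegetarianFood)
```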

Predicate-Argument Structure
Represents concepts and the relationships among them.
Some words act like arguments and some words act like predicates:
- Nouns as concepts or arguments: red(ball)
- Adjectives, adverbs, and verbs as predicates: red(ball)
Subcategorization (argument) frames specify the number, position, and syntactic category of arguments. Examples:
- NP give NP2 NP1
- NP give NP1 to NP2
Both map to give(x,y,z).

Semantic (Thematic) Roles
Semantic roles: participants in an event.
- Agent: George hit Bill. / Bill was hit by George.
- Patient: George hit Bill. / Bill was hit by George.
Semantic (selectional) restrictions constrain the types of arguments verbs take:
- George assassinated the senator.
- *The spider assassinated the fly.
Verb subcategorization allows linking arguments in surface structure with their semantic roles.
Prepositions are like verbs: Under(ItalianRestaurant, $15)

First-Order Predicate Calculus (FOPC)
FOPC provides a sound computational basis for verifiability, inference, and expressiveness:
- Supports determination of truth
- Supports compositionality of meaning
- Supports question answering (via variables)
- Supports inference

FOPC Syntax
Terms:
- Constants: Maharani
- Functions: LocationOf(Maharani)
- Variables: x in LocationOf(x)
Predicates: relations that hold among objects, e.g., Serves(Maharani, VegetarianFood)
Logical connectives permit compositionality of meaning:
"I only have $5 and I don't have a lot of time"
Have(I, $5) ∧ ¬Have(I, LotOfTime)

FOPC Semantics
Sentences in FOPC can be assigned truth values: True or False.

Variables and Quantifiers
Existential (∃): "there exists"
"A restaurant that serves Mexican food near UMD"
∃x Restaurant(x) ∧ Serves(x, MexicanFood) ∧ Near(LocationOf(x), LocationOf(UMD))
Universal (∀): "for all"
"All vegetarian restaurants serve vegetarian food"
∀x VegetarianRestaurant(x) → Serves(x, VegetarianFood)

FOPC Examples
"John gave Mary a book"
- Previously: Give(John, Mary, book)
- Better (event-based): ∃x Giving(x) ∧ Giver(John, x) ∧ Givee(Mary, x) ∧ Given(book, x)
Full definition of Give:
∀w,x,y,z Giving(x) ∧ Giver(w, x) ∧ Givee(z, x) ∧ Given(y, x)

Why Use Variables?
Multiple sentences containing "eat":
- I ate.
- I ate a turkey sandwich.
- I ate a turkey sandwich at my desk.
- I ate at my desk.
- I ate lunch.
- I ate a turkey sandwich for lunch.
- I ate a turkey sandwich for lunch at my desk.
Seven different representations:
- Eating1(Speaker)
- Eating2(Speaker, TurkeySandwich)
- Eating3(Speaker, TurkeySandwich, Desk)
- Eating4(Speaker, Desk)
- Eating5(Speaker, Lunch)
- Eating6(Speaker, TurkeySandwich, Lunch)
- Eating7(Speaker, TurkeySandwich, Lunch, Desk)

Solution with Variables
Use a single predicate Eating(v, w, x, y). The examples revisited:
- ∃w,x,y Eating(Speaker, w, x, y)
- ∃x,y Eating(Speaker, TurkeySandwich, x, y)
- ∃x Eating(Speaker, TurkeySandwich, x, Desk)
- ∃w,x Eating(Speaker, w, x, Desk)
- ∃w,y Eating(Speaker, w, Lunch, y)
- ∃y Eating(Speaker, TurkeySandwich, Lunch, y)
- Eating(Speaker, TurkeySandwich, Lunch, Desk)

Representing Time
Events are associated with points or intervals in time. We can impose an ordering on distinct events using the notion of "precedes".
Temporal logic notation: ∃w,x,t Arrive(w, x, t), with constraints on the variable t.
"I arrived in New York"
∃t Arrive(I, NewYork, t) ∧ precedes(t, Now)

Interval Events
Need t_start and t_end:
"She was driving to New York until now"
∃t_start,t_end Drive(She, NewYork) ∧ precedes(t_start, Now) ∧ Equals(t_end, Now)

Relation Between Tenses and Time
The relation between simple verb tenses and points in time is not straightforward.
- Present tense used like future: "We fly from Baltimore to Boston at 10"
- Complex tenses: "Flight 1902 arrived late" vs. "Flight 1902 had arrived late"

Reference Point
Reichenbach (1947) introduced the notion of a reference point (R), separated out from speech time (S) and event time (E).
Example:
- When Mary's flight departed, I ate lunch.
- When Mary's flight departed, I had eaten lunch.
The departure event specifies the reference point.

Reichenbach Applied to Tenses
[Diagram omitted: each tense is depicted as an ordering of speech time (S), reference point (R), and event time (E) on a timeline.]
We refer to the S,R,E notation as a Basic Tense Structure (BTS).

Logical Inference
The main motivation for using logic as a meaning representation is that it allows for sound and complete inference methods.
In propositional logic, a proposition P containing the propositional variables Q1, …, Qn is valid if P is true for all truth values of Q1, …, Qn.

Logical Inference
Assume we have sentences S1, …, Sn with respective logical representations P1, …, Pn, and we want to determine whether some Q follows from them.
We check whether P1 ∧ … ∧ Pn → Q is logically valid.

Theorem Proving
Considering all possible truth value instantiations is computationally infeasible: for n propositional variables, there are 2^n possible instantiations.
Finding computationally feasible ways to test for validity is the task of the research field of theorem proving (or automated reasoning).
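A minimal sketch of the naive 2^n approach (illustrative, not from the lecture): enumerate every assignment to the variables and check that the formula is true under all of them.

```python
from itertools import product

def is_valid(formula, variables):
    """formula: a function mapping an assignment dict to a bool."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([True, False], repeat=len(variables))
    )

implies = lambda a, b: (not a) or b

# (p ∧ (p → q)) → q  (modus ponens) is valid.
mp = lambda v: implies(v["p"] and implies(v["p"], v["q"]), v["q"])
print(is_valid(mp, ["p", "q"]))                                 # True

# p → q alone is not valid (fails when p=True, q=False).
print(is_valid(lambda v: implies(v["p"], v["q"]), ["p", "q"]))  # False
```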

Definitions
- What is the lexicon? A list of lexemes.
- What is a lexeme? Word orthography + word phonology + word sense.
- What is a word sense?
- What is a dictionary?
- What is a computational lexicon?

Lexical Relations I: Homonymy
What is homonymy?
- A bank holds investments in a custodial account.
- Agriculture is burgeoning on the east bank.
Variants:
- Homophones: "read" vs. "red"
- Homographs: "bass" vs. "bass"

Lexical Relations II: Polysemy
What is polysemy?
- The bank is constructed from red brick.
- I withdrew the money from the bank.
Distinguishing polysemy from homonymy is not straightforward.

Word Sense Disambiguation
For any given lexeme, can its senses be reliably distinguished?
This assumes a fixed set of senses for each lexical item.

Lexical Relations IV: Synonymy
What is synonymy?
- How big is that plane?
- How large is that plane?
Very hard to find true synonyms:
- A big fat apple
- ?A large fat apple
Influences on substitutability:
- Subtle shades of meaning differences
- Polysemy
- Register
- Collocational constraints

Lexical Relations V: Hyponymy
What is hyponymy? It is not symmetric.
- Example: car is a hyponym of vehicle, and vehicle is a hypernym of car.
- Test: "That is a car" implies "That is a vehicle".
What is an ontology? E.g., CAR#1 is an object of type car.
What is a taxonomy? E.g., car is a kind of vehicle; CAR#1 is an object of type car.
What is an object hierarchy?

WordNet
The most widely used hierarchically organized lexical database for English (Fellbaum, 1998).
Demo:
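WordNet can also be browsed programmatically; here is a small usage sketch via NLTK (not part of the lecture; it assumes nltk is installed and the corpus has been fetched with nltk.download('wordnet')).

```python
from nltk.corpus import wordnet as wn

# All senses (synsets) of the noun "bass", with their glosses.
for synset in wn.synsets("bass", pos=wn.NOUN):
    print(synset.name(), "-", synset.definition())
```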

Format of WordNet Entries

Distribution of Senses among WordNet Verbs

Lexical Relations in WordNet

Synsets in WordNet
Example: {chump, fish, fool, gull, mark, patsy, fall guy, sucker, schlemiel, shlemiel, soft touch, mug}
Definition: "a person who is gullible and easy to take advantage of."
Important: this exact synset makes up one sense for each of the entries listed in the synset.
Theoretically, each synset can be viewed as a concept in a taxonomy.
Compare to: ∀w,x,y,z Giving(x) ∧ Giver(w,x) ∧ Givee(z,x) ∧ Given(y,x). WordNet represents "give" as 45 senses, one of which is the synset {supply, provide, render, furnish}.
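A small sketch of inspecting this synset with NLTK (the identifier 'chump.n.01' is an assumption about how the installed WordNet names this sense; the exact lemma list and sense count may vary by WordNet version):

```python
from nltk.corpus import wordnet as wn

chump = wn.synset("chump.n.01")          # hypothetical sense identifier
print(chump.lemma_names())               # e.g., ['chump', 'fool', 'gull', 'mark', ...]
print(chump.definition())

# How many verb senses does WordNet assign to "give"?
print(len(wn.synsets("give", pos=wn.VERB)))
```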

Hyponymy in WordNet

Automated Word Sense Disambiguation
One of the main applications of WordNet is word sense disambiguation.
Supervised WSD:
- A training corpus is manually annotated with WordNet synsets.
- For each phrase-synset pair, a list of words occurring in the context is stored.
- New phrases are classified according to the closest context vector.
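A toy sketch of this supervised scheme (the training data and sense labels are invented for illustration): each sense is represented by the bag of context words seen with it, and a new occurrence gets the sense whose context vector is closest.

```python
from collections import Counter

# Hypothetical training contexts, keyed by sense.
training = {
    "bass.fish":  ["caught", "river", "fishing", "water", "boat"],
    "bass.music": ["guitar", "play", "band", "sound", "amplifier"],
}
sense_vectors = {sense: Counter(words) for sense, words in training.items()}

def cosine(c1, c2):
    """Cosine similarity between two word-count vectors."""
    dot = sum(c1[w] * c2[w] for w in c1)
    norm = lambda c: sum(v * v for v in c.values()) ** 0.5
    return dot / (norm(c1) * norm(c2) or 1.0)

def disambiguate(context_words):
    """Assign the sense whose stored context vector is closest."""
    context = Counter(context_words)
    return max(sense_vectors, key=lambda s: cosine(context, sense_vectors[s]))

print(disambiguate(["he", "played", "bass", "in", "a", "band"]))  # bass.music
```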

Automated Word Sense Disambiguation
Unsupervised WSD:
- Given two phrases, consider all possible synsets.
- Select the two synsets that are closest in the WordNet hierarchy.
Distance can be defined as:
- Number of edges (possibly weighted)
- Word overlap of the glosses
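Both distance notions are available off the shelf in NLTK; a small sketch (standard NLTK calls, not the lecture's own code; assumes the wordnet corpus is downloaded):

```python
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

# Edge-based closeness: path_similarity = 1 / (1 + shortest path length).
dog = wn.synset("dog.n.01")
cat = wn.synset("cat.n.01")
print(dog.path_similarity(cat))

# Gloss-overlap disambiguation (the Lesk algorithm): pick the sense of "bank"
# whose gloss overlaps most with the sentence context.
sentence = "I went to the bank to deposit money".split()
print(lesk(sentence, "bank", pos="n"))
```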

Selectional Preferences
Verbs often exhibit type preferences for their arguments:
- eat (OBJ: food)
- think (SUBJ: intelligent entity)
By analyzing a corpus of verb-argument pairs, it is possible to derive the proper semantic types by looking at the hypernyms of the arguments.
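A minimal sketch of deriving a verb's preferred argument type from hypernyms (the observed objects are invented for illustration, and a real model, e.g. Resnik's selectional association, would weight classes by corpus probabilities; requires NLTK with the wordnet corpus):

```python
from collections import Counter
from nltk.corpus import wordnet as wn

# Hypothetical direct objects of "eat" observed in a corpus.
objects_of_eat = ["pizza", "apple", "bread", "soup"]

hypernym_counts = Counter()
for word in objects_of_eat:
    synsets = wn.synsets(word, pos=wn.NOUN)
    if synsets:
        # Count every hypernym ancestor of the word's first sense.
        for ancestor in synsets[0].closure(lambda s: s.hypernyms()):
            hypernym_counts[ancestor] += 1

# Classes shared by many observed arguments are candidate preferences;
# a food-like class should rank near the top.
for synset, count in hypernym_counts.most_common(5):
    print(count, synset.name())
```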

Readings
J&M, Chapter 17