CS 4705 Semantic Roles and Disambiguation

Today
–Semantic Networks: WordNet
–Thematic Roles
–Selectional Restrictions
–Selectional Association
–Conceptual Dependency

Semantic Networks
Used to represent lexical relationships
–e.g. WordNet (George Miller et al.)
–Most widely used hierarchically organized lexical database for English
–Synset: a set of synonyms, a dictionary-style definition (or gloss), and some examples of use --> a concept
–Databases for nouns, verbs, and modifiers
Applications can traverse the network to find synonyms, antonyms, hierarchies, ...
–Available for download or online use
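The idea of traversing the network upward through hypernym links can be sketched with a toy lexicon (a hand-rolled mini-network, not the real WordNet database or its API):

```python
# Hypothetical mini semantic network: each word maps to its hypernym classes.
# "dish" is ambiguous, so it has two parents (a crockery sense and a food sense).
HYPERNYM = {
    "dish": ["crockery", "food"],
    "crockery": ["artifact"],
    "food": ["substance"],
    "artifact": ["entity"],
    "substance": ["entity"],
}

def all_hypernyms(word):
    """Collect every class reachable upward from `word` via hypernym links."""
    seen, stack = set(), list(HYPERNYM.get(word, []))
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(HYPERNYM.get(c, []))
    return seen

print(sorted(all_hypernyms("dish")))  # climbs both the crockery and food branches
```

An application that wants all ancestors of an ambiguous word gets the union over its senses, which is exactly the traversal behavior later slides rely on.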

Using WN, e.g. in Question-Answering
Pasca & Harabagiu '01 results on TREC corpus
–Parses questions to determine question type and key words (Who invented the light bulb?)
–Person question; key words: invent, light, bulb
–"The modern world is an electrified world. It might be argued that any of a number of electrical appliances deserves a place on a list of the millennium's most significant inventions. The light bulb, in particular, profoundly changed human existence by illuminating the night and making it hospitable to a wide range of human activity. The electric light, one of the everyday conveniences that most affects our lives, was invented in 1879 simultaneously by Thomas Alva Edison in the United States and Sir Joseph Wilson Swan in England."
Finding key words is not enough

Compare the expected answer 'type' to potential answers
–For questions of type person, expect the answer to be a person
–Identify potential person names (NEs) in passages retrieved by IR
–Check in WN to find which of these are hyponyms of the expected type (person)
Or, consider reformulations of the question: Who invented the light bulb?
–For key words in the query, look for WN synonyms
–E.g. Who fabricated the light bulb?
–Use this query for initial IR
Results: improved system accuracy by 147% (on some question types)
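The hyponym check on candidate answers can be sketched with a toy ISA chain (the taxonomy and entity names here are illustrative, not actual WordNet entries):

```python
# Hypothetical mini taxonomy for answer-type checking.
ISA = {
    "Thomas Edison": "inventor",
    "inventor": "person",
    "New Jersey": "state",
    "state": "region",
    "person": "entity",
    "region": "entity",
}

def is_hyponym_of(word, target):
    """Walk ISA links upward; True if `target` is an ancestor of `word`."""
    while word in ISA:
        word = ISA[word]
        if word == target:
            return True
    return False

candidates = ["Thomas Edison", "New Jersey"]           # NEs found by IR
answers = [c for c in candidates if is_hyponym_of(c, "person")]
```

Only candidates whose ISA chain passes through the expected answer type survive the filter, which is the role WordNet plays in the Pasca & Harabagiu pipeline.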

Thematic Roles
∃w,x,y,z Giving(x) ∧ Giver(w,x) ∧ Givee(z,x) ∧ Given(y,x)
A set of roles for each event:
–Agent: volitional causer -- John hit Bill.
–Experiencer: experiencer of the event -- Bill got a headache.
–Force: non-volitional causer -- The concrete block struck Bill on the head.
–Theme/patient: most affected participant -- John hit Bill.
–Result: end product -- Bill got a headache.
–Content: proposition of a propositional event -- Bill thought he should take up martial arts.

–Instrument: instrument used -- John hit Bill with a bat.
–Beneficiary: cui bono -- John hit Bill to avenge his friend.
–Source: origin of the object of a transfer event -- Bill fled from New York to Timbuktu.
–Goal: destination of the object -- Bill fled from New York to Timbuktu.
But there are a lot of verbs, with a lot of frames...
FrameNet encodes frames for many verb categories
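One way to make the role inventory concrete is to store a single event as a mapping from role names to fillers (a minimal sketch; the event structure and field names are illustrative, not FrameNet's actual schema):

```python
# One event's thematic roles as a simple role -> filler mapping.
hit_event = {
    "predicate": "hit",
    "agent": "John",        # volitional causer
    "theme": "Bill",        # most affected participant
    "instrument": "a bat",  # means used to carry out the event
}

def describe(event):
    """Render the role assignment back as a surface sentence."""
    return f'{event["agent"]} {event["predicate"]} {event["theme"]} with {event["instrument"]}'
```

Different verbs license different subsets of these roles, which is exactly why a resource like FrameNet has to record a frame per verb class.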

Thematic Roles and Selectional Restrictions
Selectional restrictions: semantic constraints that a word (lexeme) imposes on the concepts that can go with it
George hit Bill with ... John/a gun/gusto.
Jim killed his philodendron/a fly/Bill.
?His philodendron killed Jim.
The flu/Misery killed Jim.

In practical use:
–Given e.g. a verb and a corpus (plus FrameNet)
–What conceptual roles are likely to accompany it?
–What lexemes are likely to fill those roles?
Examples: assassinate, give, imagine, fall, serve

Disambiguation via Selectional Restrictions
"Verbs are known by the company they keep"
–Different verbs select for different thematic roles
  wash the dishes (takes a washable-thing as patient)
  serve delicious dishes (takes a food-type as patient)
Method: another semantic attachment in the grammar
–Semantic attachment rules are applied as sentences are syntactically parsed
  VP --> V NP
  V --> serve {theme: food-type}
–Selectional restriction violation: no parse
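The attachment-time check can be sketched as follows (a toy grammar fragment; the verb frames and type lexicon are made up for illustration):

```python
# Which (verb, patient-type) pairs the toy grammar licenses.
SELECTS = {
    ("serve", "food"),      # serve takes a food-type patient
    ("wash", "washable"),   # wash takes a washable-thing patient
}
# Semantic types of a few object nouns ("dishes" is ambiguous).
TYPE = {"dishes": {"washable", "food"}, "ragout": {"food"}, "gusto": {"abstract"}}

def attach(verb, obj):
    """Return a VP attachment, or None (no parse) on a restriction violation."""
    if any((verb, t) in SELECTS for t in TYPE.get(obj, ())):
        return {"vp": verb, "patient": obj}
    return None  # selectional restriction violation: no parse

attach("wash", "dishes")   # licensed: dishes can be a washable-thing
attach("serve", "gusto")   # violation: gusto is not food, so no parse
```

Note that because "dishes" carries both types, the verb alone decides which sense survives, which is precisely the disambiguation effect the slide describes.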

But this means we must:
–Write selectional restrictions for each sense of each predicate -- or use FrameNet
  Serve alone has 15 verb senses
–Encode hierarchical type information about each argument (using WordNet)
  How many hypernyms does dish have?
  How many lexemes are hyponyms of dish?
But also:
–Sometimes selectional restrictions don't restrict enough (Which dishes do you like?)
–Sometimes they restrict too much (Eat dirt, worm! I'll eat my hat!)
Can we take a statistical approach?

Selectional Association (Resnik '97)
Selectional preference strength: how much does a predicate tell us about the word class of its argument?
  George is a monster vs. George cooked a steak
–S_R(v): how different is p(c), the probability that any direct object will be a member of some class c, from p(c|v), the probability that a direct object of the specific verb v will fall into that class?
Estimate conditional probabilities of word senses from a parsed corpus, counting how often each predicate occurs with an object argument
–e.g. How likely is dish to be an object of serve? Jane served/V the dish/Obj
Then estimate the strength of association between each predicate and the super-class (hypernym) of the argument in WordNet
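The "how different is p(c|v) from p(c)" question is standardly measured as a KL divergence; a minimal sketch with made-up distributions (the class inventory and probabilities are illustrative, not corpus estimates):

```python
import math

def preference_strength(p_c_given_v, p_c):
    """S_R(v): KL divergence between p(c|v) and the prior p(c), in bits."""
    return sum(p * math.log2(p / p_c[c])
               for c, p in p_c_given_v.items() if p > 0)

# Toy prior over object classes, and two verbs' object distributions.
prior = {"food": 0.3, "person": 0.4, "artifact": 0.3}
p_eat = {"food": 0.9, "person": 0.05, "artifact": 0.05}  # "eat" is picky
p_see = {"food": 0.3, "person": 0.4, "artifact": 0.3}    # "see" takes anything
```

Here `preference_strength(p_eat, prior)` is large while `preference_strength(p_see, prior)` is zero: eat tells us a lot about its object's class, see tells us nothing, matching the monster/steak contrast above.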

–E.g. for each object of serve (e.g. ragout, Mary, dish):
  Look up all its hypernym classes in WordNet (e.g. dish isa piece of crockery, dish isa food item, ...)
  Distribute "credit" for dish (with serve) among all hypernym classes (≈ senses) to which dish belongs (1/n for n classes)
–Pr(c|v) is estimated as count(c,v)/count(v)
–Why does this work? Ambiguous words have many superordinate classes
  John served food/the dish/tuna/curry
  There is a common sense among these which gets "credit" in each instance, eventually dominating the likelihood score
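The 1/n credit-splitting scheme can be sketched directly (the class memberships below are a toy stand-in for WordNet hypernym lookups):

```python
from collections import Counter

# Toy hypernym-class memberships; dish and tuna are ambiguous.
CLASSES = {"dish": ["crockery", "food"], "ragout": ["food"], "tuna": ["food", "fish"]}

def class_probs(objects):
    """Estimate Pr(c|v) = count(c,v)/count(v), splitting each object's
    credit 1/n across its n hypernym classes."""
    counts = Counter()
    for w in objects:
        cs = CLASSES[w]
        for c in cs:
            counts[c] += 1 / len(cs)
    total = len(objects)
    return {c: n / total for c, n in counts.items()}

probs = class_probs(["dish", "ragout", "tuna"])  # observed objects of "serve"
```

Every observed object contributes some credit to "food", so that class ends up with probability 2/3 and dominates, while the spurious crockery and fish classes each get only 1/6: exactly the "common sense wins" effect described above.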

–How can we use this in WSD? Choose the class (sense) of the direct object with the highest probability, given the verb
  Mary served the dish proudly.
Results:
–Baselines: random choice of word sense is 26.8%; choosing the most frequent sense (NB: requires a sense-labeled training corpus) is 58.2%
–Resnik's: 44% correct with only pred/arg relations labeled
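The disambiguation step itself is then a one-line argmax over the object's candidate senses (probabilities and sense inventory below are illustrative, not Resnik's actual estimates):

```python
# Pr(class | verb), e.g. as estimated by the credit-splitting scheme.
P_CLASS_GIVEN_VERB = {
    "serve": {"food": 0.7, "crockery": 0.2, "person": 0.1},
}
# Candidate senses (hypernym classes) of the ambiguous object noun.
SENSES = {"dish": ["food", "crockery"]}

def disambiguate(verb, obj):
    """Pick the sense of `obj` with the highest probability given `verb`."""
    return max(SENSES[obj], key=lambda c: P_CLASS_GIVEN_VERB[verb].get(c, 0))

disambiguate("serve", "dish")  # selects the food sense, not the crockery sense
```

For "Mary served the dish proudly", the food sense wins because serve's object distribution favors it: no sense-labeled training data needed, only the pred/arg relations.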

Schank's Conceptual Dependency
Eleven predicate primitives represent all predicates
–Atrans: abstract transfer of possession or control from x to y
–Ptrans: physical transfer of an object from one place to another
–Mtrans: transfer of mental concepts
–Mbuild: creation of new information within an entity
–Propel, Move, Ingest, Expel, Speak, Attend

Objects are decomposed into primitive categories and modifiers
But having so few predicates results in very complex representations of simple things:
John gave Mary a book
∃x,y Atrans(x) ∧ Actor(x,John) ∧ Object(x,Book) ∧ To(x,Mary) ∧ Ptrans(y) ∧ Actor(y,John) ∧ Object(y,Book) ∧ To(y,Mary)
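The same decomposition can be written as a data structure: one surface verb expands into two primitive acts (a CD-style sketch; the field names are illustrative, not Schank's notation):

```python
# "John gave Mary a book": giving = an abstract transfer of possession
# (ATRANS) plus a physical transfer of the object (PTRANS).
give = [
    {"primitive": "ATRANS", "actor": "John", "object": "book", "to": "Mary"},
    {"primitive": "PTRANS", "actor": "John", "object": "book", "to": "Mary"},
]

primitives = [act["primitive"] for act in give]  # the two acts "give" unpacks into
```

The payoff of so small a primitive inventory is that paraphrases ("give", "hand", "pass") reduce to the same structure; the cost, as the slide notes, is that even a three-word sentence needs two full conceptualizations.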

Next time
Chapter 15:5-6
Homework III (the last, now) assigned