PropBank, VerbNet & SemLink
Edward Loper

PropBank
1M words of WSJ annotated with predicate-argument structures for verbs.
– The location & type of each verb's arguments
Argument types are defined on a per-verb basis.
– Consistent across uses of a single verb (sense)
But the same tags are used (Arg0, Arg1, Arg2, …)
– Arg0 → prototypical agent (Dowty)
– Arg1 → prototypical patient

PropBank Example: cover (smear, put over)
Arguments:
– Arg0 = causer of covering
– Arg1 = thing covered
– Arg2 = covered with
Example: John covered the bread with peanut butter.
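To make the labels above concrete, here is a minimal sketch (in Python) of how this annotated instance could be represented; the roleset id cover.01 and the dictionary layout are assumptions for illustration, not PropBank's actual file format.

```python
# Hypothetical in-memory representation of the PropBank-style annotation above.
# The roleset id "cover.01" is an illustrative assumption.
annotation = {
    "sentence": "John covered the bread with peanut butter.",
    "predicate": {"lemma": "cover", "roleset": "cover.01"},
    "arguments": {
        "Arg0": "John",                 # causer of covering
        "Arg1": "the bread",            # thing covered
        "Arg2": "with peanut butter",   # covered with
    },
}

for label, span in annotation["arguments"].items():
    print(f"{label}: {span}")
```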

PropBank: Trends in Argument Numbering
Arg0 = prototypical agent (Dowty): Agent (85%), Experiencer (7%), Theme (2%), …
Arg1 = prototypical patient: Theme (47%), Topic (23%), Patient (11%), …
Arg2 = Recipient (22%), Extent (15%), Predicate (14%), …
Arg3 = Asset (33%), Theme2 (14%), Recipient (13%), …
Arg4 = Location (89%), Beneficiary (5%), …
Arg5 = Location (94%), Destination (6%)
(Percentages indicate how often argument instances were mapped to each VerbNet role in the PropBank corpus.)

PropBank: Adjunct Tags
Variety of ArgM's (Arg# > 5):
– TMP: when?
– LOC: where at?
– DIR: where to?
– MNR: how?
– PRP: why?
– REC: himself, themselves, each other
– PRD: this argument refers to or modifies another
– ADV: others
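Extending the earlier sketch, adjunct tags would sit alongside the numbered arguments; the sentence and the particular ArgM labels below are invented for illustration.

```python
# Hypothetical example: ArgM adjunct labels added next to the numbered arguments.
annotation = {
    "sentence": "Yesterday John quickly covered the bread with peanut butter "
                "in the kitchen.",
    "predicate": "covered",
    "arguments": {
        "Arg0": "John",
        "Arg1": "the bread",
        "Arg2": "with peanut butter",
        "ArgM-TMP": "Yesterday",        # when?
        "ArgM-MNR": "quickly",          # how?
        "ArgM-LOC": "in the kitchen",   # where at?
    },
}
```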

VerbNet
Organizes verbs into classes that have common syntax/semantics linking behavior.
Classes include…
– A list of member verbs (w/ WordNet senses)
– A set of thematic roles (w/ selectional restrictions)
– A set of frames, which define both syntax & semantics using thematic roles.
Classes are organized hierarchically.
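As a rough sketch, a single VerbNet class entry might be held in a structure like the one below; the member list, restrictions, and frame semantics are simplified assumptions rather than the resource's actual XML schema.

```python
# Simplified, illustrative representation of one VerbNet class entry.
verbnet_class = {
    "id": "contiguous_location-47.8",
    "members": ["cover", "surround", "line"],      # real entries carry WordNet senses
    "thematic_roles": {
        "Theme": ["+concrete"],                    # selectional restrictions (illustrative)
        "Location": ["+concrete"],
    },
    "frames": [
        {
            "syntax": ["Theme", "V", "Location"],        # e.g. "The cloth covered the table."
            "semantics": "contact(E, Theme, Location)",  # predicate shown is illustrative
        },
    ],
    "subclasses": [],                              # classes are organized hierarchically
}
```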

VerbNet - cover contiguous_location-47.8

VerbNet Thematic Roles
Actor, Actor1, Actor2, Agent, Asset, Attribute, Beneficiary, Cause, Destination, Experiencer, Extent, Instrument, Location, Material, Patient, Patient1, Patient2, Predicate, Product, Proposition, Recipient, Source, Stimulus, Theme, Theme1, Theme2, Time, Topic, Value

SemLink: Mapping Lexical Resources
Different lexical resources provide us with different information. To make useful inferences, we need to combine this information. In particular:
– PropBank -- How does a verb relate to its arguments? Includes annotated text.
– VerbNet -- How do verbs w/ shared semantic & syntactic features (and their arguments) relate?
– FrameNet -- How do verbs that describe a common scenario relate?
– WordNet -- What verbs are synonymous?
– …

What do mappings look like?
2 types of mappings:
– Type mappings describe which entries from two resources might correspond, and how their fields (e.g. arguments) relate.
  Potentially many-to-many
  Generated manually or semi-automatically
– Token mappings tell us, for a given sentence or instance, which type mapping applies.
  Can often be thought of as a type of classifier
  Built from a single corpus w/ parallel annotations
  Can also be thought of as word sense disambiguation
  – Because each resource defines word senses differently!

Mapping from PB to VerbNet
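A minimal sketch of what such a PropBank-to-VerbNet mapping could look like in code, split into the type mapping and a token-level application as described on the previous slide; the roleset id, class id, and role correspondences are illustrative assumptions, not the actual SemLink entries.

```python
# Hypothetical type mapping: (PropBank roleset, VerbNet class) -> argument/role alignment.
type_mapping = {
    ("cover.01", "contiguous_location-47.8"): {
        "Arg1": "Location",   # thing covered       (alignment is illustrative)
        "Arg2": "Theme",      # thing covered with  (alignment is illustrative)
    },
}

def apply_mapping(instance, type_mapping):
    """Token-level step: decide which type mapping applies to this instance
    and relabel its PropBank arguments with VerbNet thematic roles."""
    key = (instance["roleset"], instance["vn_class"])
    arg_to_role = type_mapping.get(key, {})
    return {arg_to_role.get(arg, arg): span
            for arg, span in instance["arguments"].items()}

instance = {
    "roleset": "cover.01",
    "vn_class": "contiguous_location-47.8",
    "arguments": {"Arg1": "the bread", "Arg2": "with peanut butter"},
}
print(apply_mapping(instance, type_mapping))
# -> {'Location': 'the bread', 'Theme': 'with peanut butter'}
```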

Mapping Issues
Mappings are often many-to-many
– Different resources focus on different distinctions
Incomplete coverage
– A resource may be missing a relevant lexical item entirely.
– A resource may have the relevant lexical item, but not in the appropriate category or w/ the appropriate sense
Field mismatches
– It may not be possible to map the field information for corresponding entries (e.g., predicate arguments)
  Extra fields
  Missing fields
  Mismatched fields
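One way the many-to-many and coverage issues show up in practice, sketched with invented data; the class lists below are not the real SemLink mapping.

```python
# Hypothetical many-to-many mapping: one PropBank roleset may correspond to
# several VerbNet classes, and some rolesets may have no counterpart at all.
pb_to_vn = {
    "clear.01": ["clear-10.3"],
    "cover.01": ["contiguous_location-47.8", "fill-9.8"],  # class list is illustrative
}

def candidate_classes(roleset):
    # Incomplete coverage: return an empty list rather than failing
    # when a roleset has no mapped VerbNet class.
    return pb_to_vn.get(roleset, [])

print(candidate_classes("cover.01"))       # two candidates: a token-level step must choose
print(candidate_classes("frobnicate.01"))  # no mapping available -> []
```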

Mapping Issues (2): VerbNet verbs mapped to FrameNet
VerbNet clear-10.3 (members: clear, clean, drain, empty) maps to two FrameNet classes: Removing and Emptying.

Mapping Issues (3): VerbNet verbs mapped to FrameNet
FrameNet frame: place
– Frame Elements: Agent, Cause, Theme, Goal
– Examples: …
VN class: put-9.1
– Members: arrange*, immerse, lodge, mount, sling**
– Thematic roles: Agent (+animate), Theme (+concrete), Destination (+loc, -region)
– Frames: …
* different sense
** not in FrameNet