VerbNet – Martha Palmer, University of Colorado, LING 7800/CSCI 7000-017, September 16, 2014



Outline
 Recap
 Levin's Verb Classes
 VerbNet
 PropBank

Recap
 Fillmore – Cases: useful generalizations, fewer sense distinctions
 Jackendoff – Lexical Conceptual Structure: thematic roles are defined by the predicates they are arguments to
 Dowty – Proto-typical Agents and Patients: a bag of "agentive" entailments
 Levin – Verb classes based on syntax: syntactic behavior is a reflection of the underlying semantics

A Preliminary Classification of English Verbs (Beth Levin)
 Based on diathesis alternations
 The range of syntactic variations for a class of verbs is a reflection of the underlying semantics

Levin classes (3,100 verbs)
 47 top-level classes; 193 second- and third-level classes
 Based on pairs of syntactic frames:
John broke the jar. / Jars break easily. / The jar broke.
John cut the bread. / Bread cuts easily. / *The bread cut.
John hit the wall. / *Walls hit easily. / *The wall hit.
 The frames reflect underlying semantic components: contact, directed motion, exertion of force, change of state
 Synonyms, syntactic patterns (conative), relations
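The grouping idea behind the break/cut/hit contrasts can be sketched in a few lines of Python: verbs that permit the same set of diathesis alternations fall into the same candidate class. The alternation signatures below are illustrative toy data, not the actual Levin database.

```python
# Toy sketch of Levin-style classification: each verb is described by the
# diathesis alternations it permits; identical signatures -> same class.
# These signatures are illustrative, not the real Levin lexicon.
ALTERNATIONS = {
    "break": {"transitive", "middle", "inchoative"},  # Jars break easily. / The jar broke.
    "shatter": {"transitive", "middle", "inchoative"},
    "cut": {"transitive", "middle"},                  # Bread cuts easily. / *The bread cut.
    "slice": {"transitive", "middle"},
    "hit": {"transitive"},                            # *Walls hit easily. / *The wall hit.
    "strike": {"transitive"},
}

def candidate_classes(lexicon):
    """Group verbs by their alternation signature."""
    classes = {}
    for verb, frames in lexicon.items():
        classes.setdefault(frozenset(frames), []).append(verb)
    return classes

classes = candidate_classes(ALTERNATIONS)
```

On this toy lexicon, break/shatter, cut/slice, and hit/strike each form their own class, mirroring the slide's three frame patterns.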

Confusions in Levin classes?
 Not semantically homogeneous: {braid, clip, file, powder, pluck, etc.}
 Multiple class listings: homonymy or polysemy?
 Alternation contradictions? Carry verbs disallow the conative, but include {push, pull, shove, kick, draw, yank, tug}, which are also in the push/pull class, which does take the conative

Intersective Levin Classes
 "at" → ¬CH-LOC; "across the room" → CH-LOC; "apart" → CH-STATE
Dang, Kipper & Palmer, ACL 1998

Regular Sense Extensions
 John pushed the chair. (+force, +contact)
 John pushed the chairs apart. (+ch-state)
 John pushed the chairs across the room. (+ch-loc)
 John pushed at the chair. (-ch-loc)
 The train whistled into the station. (+ch-loc)
 The truck roared past the weigh station. (+ch-loc)
AMTA 1998, ACL 1998, TAG+ 1998
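The regular extensions above can be sketched as feature composition: a core verb meaning plus a path or result phrase contributes an extra semantic component. The feature names follow the slide; the lexicon and composition function are a hypothetical illustration.

```python
# Sketch of regular sense extension as feature composition.
# Core features and extension features follow the slide; the
# tiny lexicon and extend() are illustrative assumptions.
CORE = {"push": {"+force", "+contact"}}
EXTENSION = {
    "apart": "+ch-state",       # John pushed the chairs apart.
    "across NP": "+ch-loc",     # John pushed the chairs across the room.
    "at NP": "-ch-loc",         # John pushed at the chair.
}

def extend(verb, phrase):
    """Compose a verb's core features with the phrase's contribution."""
    return CORE[verb] | {EXTENSION[phrase]}

sense = extend("push", "apart")
```

Here `extend("push", "apart")` yields the core pushing features plus +ch-state, matching the slide's second example.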

Intersective Levin Classes
More syntactically and semantically coherent:
 sets of syntactic patterns
 explicit semantic components
 relations between senses
→ VERBNET

VerbNet: Overview
 The purpose of VerbNet is to classify English verbs based on semantic and syntactic regularities (Levin, 1993)
 The classification is used for numerous NLP tasks, primarily semantic role labeling (Schuler, 2002; Shi and Mihalcea, 2005; Yi et al., 2007)
 In each verb class, thematic roles are used to link syntactic alternations to semantic predicates, which can serve as a foundation for further inferences

VerbNet – based on Levin (1993); Kipper et al., LRE 2008
Class entries:
 capture generalizations about verb behavior
 are organized hierarchically
 members share common semantic elements, semantic roles, syntactic frames, and predicates
Verb entries:
 refer to a set of classes (different senses)
 each class member is linked to WordNet synset(s), OntoNotes groupings, PropBank frame files, and FrameNet frames

The Unified Verb Index

VerbNet: An in-depth example
"Behavior of a verb ... is to a large extent determined by its meaning" (Levin, 1993, p. 1)
 Amanda hacked the wood with an ax.
 Amanda hacked at the wood with an ax.
 Craig notched the wood with an ax.
 *Craig notched at the wood with an ax.
Can we move from syntactic behavior back to semantics?

Hacking and Notching
 Same thematic roles: Agent, Patient, Instrument
 Some shared syntactic frames, e.g., basic transitive (Agent V Patient)
Hack: cut-21.1
 cause(Agent, E)
 manner(during(E), Motion, Agent)
 contact(during(E), ?Instrument, Patient)
 degradation_material_integrity(result(E), Patient)

Hacking and Notching
 Same thematic roles: Agent, Patient, Instrument
 Some shared syntactic frames, e.g., basic transitive (Agent V Patient)
Notch: carve-21.2
 cause(Agent, E)
 contact(during(E), ?Instrument, Patient)
 degradation_material_integrity(result(E), Patient)
 physical_form(result(E), Form, Patient)
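The contrast between cut-21.1 and carve-21.2 can be made concrete by encoding each class's predicates as data and taking set differences. The class IDs and predicate names come from the slides; representing predicates as (name, args) tuples is our own illustrative encoding.

```python
# Encode the slide's VerbNet semantic predicates as (name, args) tuples.
# Predicate inventories follow the slides; the representation is ours.
HACK = {  # cut-21.1
    ("cause", ("Agent", "E")),
    ("manner", ("during(E)", "Motion", "Agent")),
    ("contact", ("during(E)", "?Instrument", "Patient")),
    ("degradation_material_integrity", ("result(E)", "Patient")),
}
NOTCH = {  # carve-21.2
    ("cause", ("Agent", "E")),
    ("contact", ("during(E)", "?Instrument", "Patient")),
    ("degradation_material_integrity", ("result(E)", "Patient")),
    ("physical_form", ("result(E)", "Form", "Patient")),
}

shared = HACK & NOTCH       # what hacking and notching have in common
only_hack = HACK - NOTCH    # the motion component licensing "hack at"
only_notch = NOTCH - HACK   # the resulting physical form of notching
```

The shared core (cause, contact, material degradation) explains the shared transitive frame, while the manner-of-motion predicate unique to cut-21.1 lines up with the conative "hacked at" alternation that notch rejects.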

Also Temporal Characteristics
Needed for distinguishing between Verbs of Assuming a Position and Verbs of Spatial Configuration
Semantic predicates are associated with an event variable, e, and often have an additional argument:
 START(e) – in force at the start of the event
 END(e) – in force at the end of the event
 DURING(e) – in force during the related time period for the entire event

VerbNet: send-11.1 (Members: 11, Frames: 5) – includes "ship"
Roles:
 Agent [+animate | +organization]
 Theme [+concrete]
 Source [+location]
 Destination [+animate | [+location & -region]]
Syntactic frame: NP V NP PP.destination
 example: "Nora sent the book to London."
 syntax: Agent V Theme {to} Destination
 semantics: motion(during(E), Theme), location(end(E), Theme, Destination), cause(Agent, E)
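A frame like this can be instantiated mechanically: bind the thematic roles from the parsed sentence, then substitute them into the predicate templates. The templates mirror the slide's send-11.1 semantics; the substitution code is a hypothetical sketch.

```python
# Sketch: instantiate the send-11.1 frame semantics (from the slide)
# for "Nora sent the book to London" by filling role slots in the
# predicate templates. The instantiation machinery is our own.
TEMPLATES = [
    "motion(during(E), {Theme})",
    "location(end(E), {Theme}, {Destination})",
    "cause({Agent}, E)",
]

def instantiate(templates, roles):
    """Substitute role fillers into each predicate template."""
    return [t.format(**roles) for t in templates]

sem = instantiate(
    TEMPLATES,
    {"Agent": "Nora", "Theme": "book", "Destination": "London"},
)
```

The result is the grounded event description: the book is in motion during the event, is located in London at its end, and Nora caused it.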

VerbNet can also provide inferences
 Every path from back door to yard was covered by a grape-arbor, and every yard had fruit trees.
 Where are the grape arbors located?

VerbNet – cover: the fill-9.8 class
Members: fill, ..., cover, ..., staff, ...
Thematic roles: Agent, Theme, Destination
Syntactic frames with semantic roles:
 "The employees staffed the store." / "The grape arbors covered every path."
 Theme V Destination
 location(E, Theme, Destination)
 location(E, grape_arbor, path)

Recovering Implicit Arguments [Palmer et al., 1986; Gerber & Chai, 2010]
[Arg0 The two companies] [REL1 produce] [Arg1 market pulp, containerboard and white paper]. The goods could be manufactured closer to customers, saving [REL2 shipping] costs.
VerbNet was used for subcategorization frames

Implicit arguments
SYNTAX: Agent V Theme {to} Destination
 [AGENT] shipped [THEME] to [DESTINATION]
SEMANTICS:
 CAUSE(AGENT, E)
 MOTION(DURING(E), THEME)
 LOCATION(END(E), THEME, DESTINATION)

Implicit arguments instantiated using coreference
 [AGENT] shipped [THEME] to [DESTINATION]
 [Companies] shipped [goods] to [customers].
SEMANTICS:
 CAUSE(Companies, E)
 MOTION(DURING(E), goods)
 LOCATION(END(E), goods, customers)
Can annotate, semi-automatically!

Limitations of VerbNet as a sense inventory
Concrete criteria for sense distinctions:
 distinct semantic roles – but very fine-grained, which leads to sparse-data problems
 distinct frames
 distinct entailments
But...
 limited coverage of lemmas
 for each lemma, limited coverage of senses
CLEAR – Colorado

Goal of PropBank
 Supply consistent, simple, general-purpose labeling of semantic roles
 Provide consistent argument labels across different syntactic realizations
 Support the training of automatic semantic role labelers
The semantic representation can support...

Training data supporting...
 machine translation
 text editing
 text summarization / evaluation
 question answering

The Problem
 Levin (1993) and others have demonstrated a promising relationship between syntax and semantics
 But the same verb with the same subcategorization can assign different semantic roles
 How can we take advantage of these clear relationships and empirically study how and why syntactic alternations take place?

VerbNet and Real Data
 VerbNet is based on linguistic theory – how useful is it?
 How well does it correspond to syntactic variations found in naturally occurring text?
 Use PropBank to investigate these issues

What is PropBank?
 Semantic information over syntactically parsed (i.e., treebanked) text
 Semantic information → the predicate-argument structure of a verb or relation
 Unlike VerbNet, the predicate-argument structure is specific to the verb or relation in question
PropBank seeks to:
1. provide consistent argument labels across different syntactic realizations of the same verb
2. assign general functional tags to all modifiers or adjuncts of the verb

"1. PB seeks to provide consistent argument labels across different syntactic realizations"
Jin broke the projector.
 Syntax: NP-SUB V NP-OBJ
 Thematic roles: AGENT REL PATIENT
The projector broke.
 Syntax: NP-SUB V
 Thematic roles: PATIENT REL
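The point of consistent labeling can be shown with a toy labeler: whichever realization "break" appears in, the thing broken always gets Arg1. The roleset (Arg0 = breaker, Arg1 = thing broken) follows PropBank's break.01; the labeler itself is an illustrative sketch, not PropBank tooling.

```python
# Toy sketch of PropBank-style consistent labeling for "break"
# (break.01: Arg0 = breaker, Arg1 = thing broken). Real PropBank
# annotation works over treebanked parses; this is an illustration.
def label_break(subject, obj=None):
    """Assign numbered arguments to 'break' across realizations."""
    if obj is not None:
        # Transitive: "Jin broke the projector."
        return {"Arg0": subject, "Arg1": obj}
    # Inchoative: "The projector broke." -- the subject is still Arg1.
    return {"Arg1": subject}

transitive = label_break("Jin", "the projector")
inchoative = label_break("The projector")
```

Note that "the projector" receives Arg1 in both cases, even though it is the syntactic object in one sentence and the subject in the other.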

Why numbered arguments?
 Avoids the lack of consensus concerning a specific set of semantic role labels
 Numbers correspond to labels that are verb-specific
 Arg0 and Arg1 correspond to Dowty's (1991) proto-agent and proto-patient
 Args 2–5 are highly variable

"1. PB seeks to provide consistent argument labels across different syntactic realizations"
Uuuuuusually...
 Arg0 = agent
 Arg1 = patient
 Arg2 = benefactive / instrument / attribute / end state
 Arg3 = start point / benefactive / instrument / attribute
 Arg4 = end point
These correspond to VerbNet thematic roles

"2. PB seeks to assign functional tags to all modifiers or adjuncts of the verb"
A variety of ArgMs:
 TMP – when? (yesterday, 5pm on Saturday, recently)
 LOC – where? (in the living room, on the newspaper)
 DIR – where to/from? (down, from Antarctica)
 MNR – how? (quickly, with much enthusiasm)
 PRP/CAU – why? (because ..., so that ...)
 REC – himself, themselves, each other
 GOL – end point of motion, transfer verbs (to the floor, to Judy)
 ADV – hodge-podge, miscellaneous, "nothing fits!"
 PRD – an argument that refers to or modifies another: ... ate the meat raw

Different verb senses...
 have different subcategorization frames
 PropBank assigns coarse-grained senses to verbs: PropBank "framesets," a lexical resource
 New senses, or "rolesets," are added only when the syntax and semantics of a usage are distinct
 Annotators use "frame files" to assign the appropriate numbered-argument structure

PropBank: sense distinctions?
 Mary left the room.
 Mary left her daughter-in-law her pearls in her will.
Frameset leave.01 "move away from":
 Arg0: entity leaving
 Arg1: place left
Frameset leave.02 "give":
 Arg0: giver
 Arg1: thing given
 Arg2: beneficiary

WordNet: "call" – 28 senses, 9 groups
[Figure: the 28 WordNet senses of "call" clustered into groups: Loud cry, Label, Phone/radio, Bird or animal cry, Request, Call a loan/bond, Visit, Challenge, Bid]

WordNet: "call" – 28 senses, 9 groups, add PB
[Figure: the same sense groups, with PropBank framesets added]

Overlap between Groups and Framesets – 95%
[Figure: the WordNet senses of "develop" partitioned into Frameset1 and Frameset2]
Palmer, Dang & Fellbaum, NLE 2004

Sense Hierarchy (Palmer et al., SNLU04-NAACL04, NLE 2007; Chen et al., NAACL 2006)
 PropBank framesets – ITA >90%; coarse-grained distinctions; 20 Senseval-2 verbs with >1 frameset; MaxEnt WSD system: 90% (73.5% baseline)
 Sense groups (Senseval-2) – ITA 82%; intermediate level (includes Levin classes) – 71.7%
 WordNet – ITA 73%; fine-grained distinctions – 64%
Tagging with groups: ITA 90%, taggers % (SemEval-2007); Chen, Dligach & Palmer, ICSC 2007; Dligach & Palmer, ACL 2011 – 88%

SEMLINK
 Extended VerbNet: 5,391 senses (91% of PropBank)
 Type-to-type mapping: PB/VN, VN/FN
 Semi-automatic mapping of WSJ PropBank instances to VerbNet classes and thematic roles, hand-corrected (now FrameNet also)
 VerbNet class tagging as automatic WSD
 Run SRL, map Arg2 to VerbNet roles; performance on the Brown corpus improves
Yi, Loper & Palmer, NAACL 2007; Brown, Dligach & Palmer, IWCS 2011

Mapping from PropBank to VerbNet (similar mapping for PB–FrameNet)
Frameset id = leave.02, sense = "give", VerbNet class = future-having 13.3
 Arg0 – Giver – Agent/Donor*
 Arg1 – Thing given – Theme
 Arg2 – Benefactive – Recipient
*FrameNet label (Baker, Fillmore & Lowe, COLING/ACL 1998; Fillmore & Baker, WordNet Workshop 2001)
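A SemLink-style mapping like the table above is, operationally, just a lookup keyed on (frameset, numbered argument). The role pairings below are taken from the slide's leave.02 table; the lookup code is an illustrative sketch, not the SemLink distribution format.

```python
# Sketch of a SemLink-style PB -> VerbNet role mapping for leave.02
# (VerbNet class future-having 13.3). Pairings follow the slide;
# the table encoding and lookup function are our own illustration.
PB_TO_VN = {
    ("leave.02", "Arg0"): "Agent",      # Giver
    ("leave.02", "Arg1"): "Theme",      # Thing given
    ("leave.02", "Arg2"): "Recipient",  # Benefactive
}

def vn_role(frameset, arg):
    """Return the VerbNet thematic role for a PropBank argument."""
    return PB_TO_VN.get((frameset, arg))

role = vn_role("leave.02", "Arg2")
```

This is exactly the step that lets an SRL system trained on PropBank's Arg2 output the more informative VerbNet Recipient role, as in the SEMLINK experiments above.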

Mapping from PB to VerbNet: verbs.colorado.edu/~semlink

Generative Lexicon – VerbNet
GL: use(Agent, Entity, Purpose)
 use, sense 1: apply or employ something for a purpose (the most general sense) → Use 105
 use, sense 2: consume or ingest, usually habitually → Eat
 use, sense 3: expend a quantity (e.g., use up something, use something up) → Consume 66


Additional Entailments
 Sense 1 is the most general
 Senses 2 and 3 provide additional specific entailments:
 Sense 2: the Entity is ingested by an animate being, who then undergoes a change of state
 Sense 3: in the process of using the Entity, it is depleted