Layering Semantics (Putting meaning into trees)
Treebank Workshop, Martha Palmer, April 26, 2007

Outline
- Semantic role labeling, verbs and nouns
- Treebank/PropBank discrepancies
- Treebank/NomBank discrepancies
- Dependencies vs. Phrase Structure

A TreeBanked phrase (Marcus et al., 1993)

(S (NP a GM-Jaguar pact)
   (VP would
       (VP give
           (NP the US car maker)
           (NP an eventual 30% stake)
           (PP-LOC in (NP the British company)))))

"A GM-Jaguar pact would give the U.S. car maker an eventual 30% stake in the British company."
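To make the constituent structure concrete, here is a minimal sketch that loads the bracketing above with NLTK's Tree class and lists each constituent with the words it spans. NLTK itself and the exact bracketing are assumptions of the illustration, not part of the original slide.

```python
# Minimal sketch: inspect the TreeBank-style bracketing with NLTK.
# Assumes NLTK is installed; the bracketing mirrors the tree sketched above.
from nltk import Tree

bracketed = """
(S (NP a GM-Jaguar pact)
   (VP would
       (VP give
           (NP the US car maker)
           (NP an eventual 30% stake)
           (PP-LOC in (NP the British company)))))
"""

tree = Tree.fromstring(bracketed.strip())

# Print every constituent with its phrase label and the words it covers.
for subtree in tree.subtrees():
    print(subtree.label(), "->", " ".join(subtree.leaves()))
```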

The same phrase, PropBanked

"A GM-Jaguar pact would give the U.S. car maker an eventual 30% stake in the British company."

  Arg0: a GM-Jaguar pact
  REL:  give
  Arg2: the US car maker
  Arg1: an eventual 30% stake in the British company

  give(GM-J pact, US car maker, 30% stake)
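In code, a PropBanked analysis is just a predicate plus a map from numbered roles to spans. The record type below is an illustrative choice for exposition, not PropBank's own release format.

```python
# Illustrative sketch: the proposition above as a plain Python record.
# Class and field names are assumptions for exposition, not an official format.
from dataclasses import dataclass, field


@dataclass
class Proposition:
    rel: str                                             # predicate lemma
    args: dict[str, str] = field(default_factory=dict)   # role label -> text span


give_prop = Proposition(
    rel="give",
    args={
        "Arg0": "a GM-Jaguar pact",                              # giver
        "Arg2": "the US car maker",                              # entity given to
        "Arg1": "an eventual 30% stake in the British company",  # thing given
    },
)

# Prints: give(a GM-Jaguar pact, the US car maker, an eventual 30% stake ...)
print(f"{give_prop.rel}({', '.join(give_prop.args.values())})")
```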

Frames File example: give
Roles:
  Arg0: giver
  Arg1: thing given
  Arg2: entity given to

Example (double object): The executives gave the chefs a standing ovation.
  Arg0: The executives
  REL:  gave
  Arg2: the chefs
  Arg1: a standing ovation
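Frames files are distributed as XML, and a roleset like this one can be read with the standard library. The snippet below is a simplified approximation of the frame-file format for give (the released files carry more attributes and example elements), included only to show the shape of the data.

```python
# Sketch: read a simplified PropBank-style frames file with the std library.
# The XML here approximates the frame-file format for "give"; real released
# frame files contain additional attributes and example markup.
import xml.etree.ElementTree as ET

FRAME_XML = """
<frameset>
  <predicate lemma="give">
    <roleset id="give.01" name="transfer">
      <roles>
        <role n="0" descr="giver"/>
        <role n="1" descr="thing given"/>
        <role n="2" descr="entity given to"/>
      </roles>
    </roleset>
  </predicate>
</frameset>
"""

root = ET.fromstring(FRAME_XML.strip())
for roleset in root.iter("roleset"):
    print(roleset.get("id"), "-", roleset.get("name"))
    for role in roleset.iter("role"):
        print(f"  Arg{role.get('n')}: {role.get('descr')}")
```

The same role inventory is reused for the nominal predicate gift, as the NomBank frame on the next slide shows.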

NomBank Frame File example: gift (nominalizations, noun predicates, partitives, etc.)
Roles:
  Arg0: giver
  Arg1: thing given
  Arg2: entity given to

Example (nominalization): Nancy's gift from her cousin was a complete surprise.
  Arg0: her cousin
  REL:  gift
  Arg2: Nancy
  Arg1: gift

Status of the TB/PB merge
- Extracted and classified mismatch examples
- Reconciled different policy decisions
  - An example of TB changes: modified the list of verbs that take small clauses and sentential complements (e.g., "keep their markets active")
  - An example of PB changes: a different approach to the annotation of empty categories: John seems [*T* to know the answer]-Arg1
- Some mismatches require manual adjudication (e.g., PP-attachment ambiguity): "The region lacks necessary mechanisms for handling the aid and accounting items."
- Unresolved issues: gapping

Treebank-PropBank Reconciliation
- Problem: one PropBank argument can involve many parse nodes
  Solution: single argument – single parse node analysis
- Problem: different semantic analyses
  Solution: reconcile semantic assumptions
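The "single argument – single parse node" criterion can be checked mechanically: an argument's token span should be exactly the leaf span of some constituent. The sketch below is an illustration of that idea, not the merge pipeline's actual code; it assumes NLTK trees, 0-based end-exclusive token spans, and the flat bracketing shown on the TreeBanked slide above.

```python
# Sketch: does an argument's token span correspond to exactly one parse node?
# Assumes an NLTK Tree and 0-based, end-exclusive token spans; illustrative only.
from nltk import Tree


def constituent_spans(tree):
    """Return the set of (start, end) leaf spans covered by each constituent."""
    spans = set()

    def walk(node, start):
        if isinstance(node, str):      # a leaf covers exactly one token
            return start + 1
        end = start
        for child in node:
            end = walk(child, end)
        spans.add((start, end))
        return end

    walk(tree, 0)
    return spans


def is_single_node(tree, start, end):
    """True if tokens[start:end] is exactly the yield of one constituent."""
    return (start, end) in constituent_spans(tree)


sent = Tree.fromstring(
    "(S (NP a GM-Jaguar pact) (VP would (VP give (NP the US car maker) "
    "(NP an eventual 30% stake) (PP-LOC in (NP the British company)))))"
)

# "the US car maker" (tokens 5-8) is exactly one NP node:
print(is_single_node(sent, 5, 9))    # True
# "an eventual 30% stake in the British company" (tokens 9-16) covers an NP
# plus a sister PP-LOC under this bracketing, so it is not a single node:
print(is_single_node(sent, 9, 17))   # False
```

The second check illustrates the problem named above: under the assumed bracketing, one PropBank argument covers two sister nodes and would need either a tree change or a policy decision to reconcile.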

NomBank Issues
- The Treebank provides no internal structure for noun phrases
- Many nominal event arguments do not correspond to tree nodes

Dependency Trees (DT) vs Phrase Structure (PS)

CATEGORY                           DT   PS
Finding constituent boundaries          √
Noun phrase structure              √
Empty categories                        √
Coordination – what is the head?        √
Phrasal verbs                      √
Discontinuous constituents         √

Dependency Trees vs Phrase Structure (continued)
Much of this really depends on choices made in the guidelines and on the annotation tool's display:
- The Penn Treebank chose not to bracket NPs internally
- Dependency trees can be displayed visually to clarify constituent structure
- Empty categories can be a help or a hindrance, depending on the annotator's task
- Coordination heads are defined in the guidelines
- The Penn Treebank could define phrasal verbs
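Conversion between the two representations makes these choices explicit: head rules decide, for example, what heads a VP or a coordination. Below is a toy head-percolation sketch over an NLTK phrase-structure tree; the head table and fallback directions are assumptions for illustration, not the rules of any released converter.

```python
# Toy sketch: phrase structure -> dependencies by head percolation.
# The head table and fallbacks are illustrative assumptions; real converters
# for the Penn Treebank use much richer head-finding rules.
from nltk import Tree

HEAD_RULES = {
    # phrase label: (preferred head-child labels, search direction for fallback)
    "S":      (["VP"], "left"),
    "VP":     (["VP"], "left"),
    "NP":     ([],     "right"),   # head a bare NP by its rightmost word
    "PP-LOC": ([],     "left"),    # head a PP by the preposition
}


def find_head(node):
    """Return the head word (a leaf string) of a constituent."""
    if isinstance(node, str):
        return node
    prefs, direction = HEAD_RULES.get(node.label(), ([], "left"))
    children = list(node) if direction == "left" else list(reversed(node))
    for label in prefs:
        for child in children:
            if isinstance(child, Tree) and child.label() == label:
                return find_head(child)
    return find_head(children[0])   # fallback: first child in search order


def dependencies(node, deps=None):
    """Collect (dependent_word, governor_word) pairs from the tree."""
    # (A real converter would track token indices; word strings suffice here.)
    if deps is None:
        deps = []
    if isinstance(node, str):
        return deps
    governor = find_head(node)
    for child in node:
        child_head = find_head(child)
        if child_head != governor:
            deps.append((child_head, governor))
        dependencies(child, deps)
    return deps


sent = Tree.fromstring(
    "(S (NP a GM-Jaguar pact) (VP would (VP give (NP the US car maker) "
    "(NP an eventual 30% stake) (PP-LOC in (NP the British company)))))"
)
for dependent, governor in dependencies(sent):
    print(f"{dependent:10s} <- {governor}")
```

Changing the head table, for instance which conjunct heads a coordination or whether the preposition or its object heads a PP, changes the resulting dependency tree, which is exactly the kind of guideline choice the slide refers to.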