Semantic Annotation Meeting April 14, 2005 NomBank & the Down-to-Earth Parts of Pie-in-the-Sky Adam Meyers New York University April 14, 2005.

Presentation transcript:

NomBank & the Down-to-Earth Parts of Pie-in-the-Sky
Adam Meyers, New York University
April 14, 2005

Status of NomBank
– Arguments of common nouns in the Penn Treebank II corpus
– About 200,000 noun instances
– Speed: approximately 25 instances/hour
– About 75% complete
– Annotation agreement is about 86%
– Complete distribution: August 2005
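A back-of-the-envelope calculation, using only the figures on the slide above, gives a sense of the annotation effort these numbers imply (the calculation itself is illustrative, not from the slides):

```python
# Effort estimate from the NomBank status figures above.
total_instances = 200_000   # noun instances in the Penn Treebank II corpus
speed_per_hour = 25         # annotation speed, instances/hour
fraction_done = 0.75        # about 75% complete

total_hours = total_instances / speed_per_hour
remaining_hours = total_hours * (1 - fraction_done)

print(f"Total effort: {total_hours:.0f} annotator-hours")      # 8000
print(f"Remaining:    {remaining_hours:.0f} annotator-hours")  # 2000
```

At 25 instances/hour, the full corpus represents about 8,000 annotator-hours, with roughly 2,000 remaining at the 75% mark.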

NomBank Examples
Verb-related:
– His/ARG0 gift of a book/ARG1 to me/ARG2
Adjective-related:
– The absence of patent lawyers/ARG1 on the court/ARG2
Classes of nouns (16 total):
– Her/ARG1 husband/ARG0 [Relational-defrel]
– A set of tasks/ARG1 [Partitive]
– An Oct. 1/ARG2 date for the attack/ARG1 [Attribute]
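The examples above pair a noun predicate with numbered argument labels. A minimal sketch of how such a proposition might be held in memory (a hypothetical representation for illustration, not the official NomBank file format):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class NomBankProp:
    """One NomBank-style proposition: a noun predicate plus its
    numbered arguments. Hypothetical in-memory representation."""
    predicate: str                             # the annotated noun
    args: dict = field(default_factory=dict)   # role label -> text span
    noun_class: Optional[str] = None           # e.g. "Relational", "Partitive"

# "His/ARG0 gift of a book/ARG1 to me/ARG2"
gift = NomBankProp(
    predicate="gift",
    args={"ARG0": "His", "ARG1": "a book", "ARG2": "me"},
)

# "Her/ARG1 husband/ARG0" (relational noun: the noun itself fills ARG0)
husband = NomBankProp(
    predicate="husband",
    args={"ARG0": "husband", "ARG1": "Her"},
    noun_class="Relational",
)
```

Note how the relational-noun case differs from the verb-related case: for "husband" the predicate noun itself fills ARG0, which is exactly what the [Relational-defrel] class on the slide marks.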

Special NP Phenomena
Support:
– The judge/ARG0 made demands on his staff/ARG2
Transparent nouns:
– His/ARG0 first batch of questions

Future?
NomBank2 – bridging coreference:
– Apple/ARG1 is trying to acquire Disney.
– Rivals/ARG0 Microsoft and IBM claim this is unfair.
ETCBank (adjectives, idioms, comparatives, etc.):
– He/ARG0 was answerable to the state department/ARG1
– He/ARG0 kept tabs on Mary/ARG1
– The WMD was too small/ARGQ for inspectors to see/ARGC __
FactBank – Is that a fact? Who says so?
– Mary said/FACT that John did not find/NOT-FACT WMDs.

Goals for “Pie in the Sky”
The ultimate annotation:
– A complete semantic analysis
– Language-independent
– Analysis anchored by text
– Merges all current annotation schemata
– Includes everything, even quantifier scope
How much Pie in the Sky is Down to Earth?

Pie in the Sky: So Far
We chose 2 consecutive sentences and have been iteratively merging annotation from:
– PropBank I & II (including identity coreference)
– NomBank I & II (including bridging coreference)
– Discourse Treebank
– EtcBank (NYU adjectives, determiners, etc.)
– TimeML & Timex2
– Gazetteer and word-class info
– FactBank (NYU – attribution and FACT/NOT-FACT)

Pie in the Sky
Combine information from all current annotation schemes into one representation
Don’t record everything:
– Resolve conflicts
– Remove redundancies (conservatively)
Expand annotation where appropriate:
– Extend verb-oriented annotation to other parts of speech
Fight task centrism – extend annotation to all “important” information:
– Allocate information to its appropriate place
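The merging discipline above (drop exact redundancies, surface genuine conflicts for resolution) can be sketched in a few lines. This is a hypothetical illustration, with made-up spans and labels, of the merging idea rather than the project's actual merging procedure:

```python
# Each annotation layer maps a (start, end) character span to a label.
# Merging unions the layers, drops exact duplicates (redundancies),
# and collects disagreements (conflicts) for manual resolution.

def merge_layers(layers):
    """layers: dict of layer-name -> {span: label}.
    Returns (merged, conflicts)."""
    merged = {}
    conflicts = []
    for name, annotations in layers.items():
        for span, label in annotations.items():
            if span not in merged:
                merged[span] = label
            elif merged[span] != label:
                # Genuine conflict: record it; resolve by hand.
                conflicts.append((span, merged[span], (name, label)))
            # Identical label on the same span is a redundancy: drop it.
    return merged, conflicts

# Illustrative layers (spans and labels are invented for the example):
layers = {
    "PropBank": {(0, 4): "ARG0", (10, 14): "ARG1"},
    "NomBank":  {(10, 14): "ARG1"},   # redundant with PropBank: dropped
    "TimeML":   {(0, 4): "TIMEX"},    # conflicts with PropBank: queued
}
merged, conflicts = merge_layers(layers)
```

The conservative part is in what the function refuses to do: it never silently overwrites one layer's label with another's, so every disagreement between schemes remains visible.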

2 Sentences from ACE File: NBC
– but Yemen’s president says the FBI has told him the explosive material could only have come from the U.S., Israel or two Arab countries.
– and to a former federal bomb investigator, that description suggests a powerful military-style plastic explosive, C-4, that can be cut or molded into different shapes.

A Former Federal Bomb Investigator

What’s Next?
Frontiers in Corpus Annotation II: Pie in the Sky
– Incorporate everything from the workshop
Possibilities after the Frontiers workshop:
– New text sample?
– Larger piece of text: paragraph or whole file?
– Other phenomena? Idioms, discourse connectives, temporal relations
– Choose a bitext and find international partners