Flow of Semantics in Narratives
Daniel Khashabi, Chris J.C. Burges

Introduction
MCTest-160 [1] contains pairs of narratives and multiple-choice questions. An example of narrative QA:

Story: "Sally had a very exciting summer vacation. She went to summer camp for the first time. She made friends with a girl named Tina. They shared a bunk bed in their cabin. Sally's favorite activity was walking in the woods because she enjoyed nature. Tina liked arts and crafts. Together, they made some art using leaves they found in the woods. Even after she fell in the water, Sally still enjoyed canoeing. …"

Q: Who went to the beach with Sally?
1) her sisters
2) Tina
3) her brothers, mother, and father
4) herself

An early observation: pruning stories helps!
The approach is a two-stage pipeline:
System 1: find the right sentence(s).
System 2: answer the question, given the selected sentence(s).
Sentence scores are normalized with respect to the question length and compared against a threshold. Result: the increased performance comes at the cost of losing substantial coverage! (Figure: accuracy vs. threshold, with accuracy rising as the story size decreases.) Next: can we chunk events based on their animate characters?

Coreference for creating a timeline
Suppose we are given the coreference chains of a story. Can we create a coherent set of event chains for each animate entity?
Experiment:
- Extract the coreference chains for each story.
- Separate the chains into plural and singular chains, and find the connections between them.
- Create a timeline of the singular chains, with the plural chains attached to them.
(Figure: an example timeline; its annotated mistakes include a mention that should have belonged to another chain and a mention with the wrong subject.) On other stories the output can be much noisier. Next: can we use this timeline to improve QA?

Experiment: given a question, find its main character and, using the timeline, choose the subset of sentences related to that character. Result: the pruned stories are, on average, 83.2% of the original length (in number of words), yet the performance almost doesn't change. Promising point: no improvement yet, but the same accuracy from less text.

Creating a timeline with event probabilities
Relgrams: define an event to be a triple (Subject; Relation; Object), e.g. (Abramoff; was indicted in; Florida), and normalize its arguments to types: (Person; be indicted in; State). In [4], the authors gathered co-occurrence information for such normalized events over 1.8M words, collecting about 11M pairs of events.

Results
Accuracies on the baseline errors:
No pruning | Schema 1 | Schema 2
    …%     |  38.77%  |  14.13%

Future work
- Use the timeline information to improve coreference.
- QA on the subset of sentences chosen by the event timeline.
- Neural models for transitions between events.
- SRL-grams.

Acknowledgment
Scott Yih, Erin Renshaw, Andrzej Pastusiak, Matt Richardson, and Chris Meek.

References
[1] Richardson, Matthew, et al. "MCTest: A Challenge Dataset for the Open-Domain Machine Comprehension of Text." EMNLP 2013.
[2] Yih, Wen-tau, Ming-Wei Chang, Christopher Meek, and Andrzej Pastusiak. "Question Answering Using Enhanced Lexical Semantic Models." ACL 2013.
[3] Chang, Kai-Wei, et al. "Inference Protocols for Coreference Resolution." CoNLL-2011 Shared Task.
[4] Balasubramanian, Niranjan, et al. "Generating Coherent Event Schemas at Scale." EMNLP 2013.
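As an illustration of the sentence-selection step described above, here is a minimal sketch of length-normalized score thresholding. The bag-of-words overlap scorer is an assumed stand-in for the actual System 1 model, not the authors' implementation:

```python
def select_sentences(question, sentences, threshold):
    """Keep sentences whose word overlap with the question, normalized by
    the question's length, clears the threshold. An empty result means the
    system abstains, which is how raising the threshold trades coverage
    for accuracy."""
    q_words = set(question.lower().split())
    return [s for s in sentences
            if len(q_words & set(s.lower().split())) / len(q_words) >= threshold]

story = ["Sally went to summer camp for the first time.",
         "Tina liked arts and crafts."]
print(select_sentences("Who went to camp with Sally ?", story, 0.5))
# → ['Sally went to summer camp for the first time.']
```

Raising the threshold keeps only high-confidence sentences, which mirrors the trade-off reported on the poster: better accuracy on the questions that are still answered, at the cost of coverage.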
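The chain separation and timeline construction could be sketched as follows. The plural-pronoun heuristic, the mention representation, and the sentence-overlap attachment rule are all assumptions for illustration, not the actual system:

```python
from collections import defaultdict

# A coreference chain is a list of mentions; a mention is
# (sentence_index, surface_form).
PLURAL_PRONOUNS = {"they", "them", "we", "us", "their", "our"}

def is_plural(chain):
    """Crude heuristic: a chain is plural if any mention is a plural pronoun."""
    return any(form.lower() in PLURAL_PRONOUNS for _, form in chain)

def build_timeline(chains):
    """Order singular chains by their first mention, then attach each plural
    chain to every singular chain that shares a sentence with it."""
    singular = sorted((c for c in chains if not is_plural(c)),
                      key=lambda c: c[0][0])
    attachments = defaultdict(list)
    for p in (c for c in chains if is_plural(c)):
        p_sents = {s for s, _ in p}
        for i, c in enumerate(singular):
            if p_sents & {s for s, _ in c}:
                attachments[i].append(p)
    return singular, attachments

chains = [
    [(0, "Sally"), (2, "she")],   # singular chain
    [(1, "Tina"), (5, "Tina")],   # singular chain
    [(2, "they"), (3, "them")],   # plural chain: Sally and Tina together
]
timeline, attachments = build_timeline(chains)
print([c[0][1] for c in timeline])  # singular chains in story order
# → ['Sally', 'Tina']
```

A real system would of course use gender/number features from the coreference resolver rather than a pronoun list, and the attachment rule is exactly the kind of step where the noisy-timeline mistakes noted on the poster arise.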
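The relgram event normalization can be illustrated with a toy sketch. The tiny type lexicon stands in for a real entity typer and is an assumption, not the resource used in [4]:

```python
# Toy type lexicon standing in for a real named-entity typer
# (an assumption for illustration only).
ENTITY_TYPES = {"Abramoff": "Person", "Florida": "State"}

def normalize(event):
    """Map each argument of a (Subject; Relation; Object) triple to its type
    and crudely lemmatize the copula in the relation."""
    subj, rel, obj = event
    for inflected in ("was ", "were ", "is ", "are "):
        rel = rel.replace(inflected, "be ")
    return (ENTITY_TYPES.get(subj, subj), rel, ENTITY_TYPES.get(obj, obj))

print(normalize(("Abramoff", "was indicted in", "Florida")))
# → ('Person', 'be indicted in', 'State')
```

Normalizing arguments to types is what lets distinct surface events share co-occurrence statistics, so that counts over millions of event pairs become dense enough to estimate transition probabilities.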