Machine Reading

Introduction
Definition: Machine Reading (MR) is a task that deals with the automatic understanding of texts, proposed by the "KnowItAll" research group at the University of Washington [1].
Target: understanding text.

Related work
Question Answering and Information Extraction both utilize supervised learning techniques, which rely on hand-tagged training examples.
Information Extraction often utilizes extraction rules learned from example extractions of each target relation, so it is impractical to generate a set of hand-tagged examples for every relation.

Difference
In MR the relations are not known in advance, making it impractical to generate hand-tagged examples of each relation of interest.
MR is inherently unsupervised: it forges and updates connections between beliefs rather than focusing on isolated "nuggets" obtained from text.

Reference
[1] Etzioni, Oren, Michele Banko, and Michael J. Cafarella. "Machine Reading." AAAI, Vol. 6, 2006.

QA4MRE
Question Answering for Machine Reading Evaluation (QA4MRE), run at CLEF in 2011, 2012, and 2013.

Summary
Main objective: develop a methodology for evaluating Machine Reading systems through Question Answering and Reading Comprehension tests.
Systems should be able to extract knowledge from large volumes of text and use this knowledge to answer questions.
The methodology should allow the comparison of systems' performance and the study of the best approaches.

QA4MRE
Organization, tasks, participants, results, and winners' methods.

CLEF 2011
Host: the University of Amsterdam
Time: 19-22 September 2011
Place: Amsterdam, the Netherlands

CLEF 2012
Host: the University of Rome "La Sapienza"
Time: 17-20 September 2012
Place: Rome, Italy

CLEF 2013
Host: the Technical University of Valencia
Time: 23-26 September 2013
Place: Valencia, Spain

The Task
Reading of single documents and identification of the answers to a set of questions.
Questions are multiple choice, with only one correct answer.
Answering requires both semantic understanding and a reasoning process.

Requirements
Understand the test questions.
Analyze the relations among entities contained in questions and entities expressed by the candidate answers.
Understand the information contained in the documents.
Extract useful pieces of knowledge from the background collections.
Select the correct answer from the five alternatives proposed.

Testing Data
Available in several languages: Arabic, Bulgarian, English, German, Italian, Romanian, and Spanish.
Test sets were divided into topics: AIDS, Climate Change, Music and Society, and Alzheimer's disease.
Background collections, testing documents, and candidate answers are provided.

Background Collections
2011 vs. 2012 & 2013 (comparison table not reproduced)

Evaluation
To improve results, systems might reduce the number of incorrect answers while keeping the proportion of correct ones by leaving some questions unanswered; the evaluation therefore considers the possibility of leaving questions unanswered.
Responses fall into five categories (see the sketch below):
R (answered, right)
W (answered, wrong)
NoA (not answered)
NoA_R (not answered, with a right hypothetical answer)
NoA_W (not answered, with a wrong hypothetical answer)
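A minimal sketch of this categorization, assuming hypothetical per-question fields answered, candidate (the withheld hypothetical answer, None if none was recorded), and gold:

    def categorize(answered, candidate, gold):
        """Map one system response to the QA4MRE response categories."""
        if answered:
            return "R" if candidate == gold else "W"
        if candidate is None:
            return "NoA"      # unanswered, no hypothetical answer recorded
        return "NoA_R" if candidate == gold else "NoA_W"

For example, categorize(False, "b", "b") yields "NoA_R": the system withheld an answer that would have been correct.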

Difference in 2013
Systems able to decide whether all candidate answers are incorrect should be rewarded over systems that just rank answers.
The 2013 tests therefore introduced a portion of questions (39%) where none of the options is correct, and added a new last option to every question: "None of the above answers is correct" (NCA).
A system that always selects NCA answers exactly that 39% of questions correctly, so the baseline becomes 0.39.
When no option is correct, answering NCA is rewarded over simply not answering (NoA).

Evaluation Measure
The primary measure is c@1, which rewards unanswered questions in proportion to the accuracy achieved on the answered ones:
c@1 = (nR + nU * (nR / n)) / n
nR: number of questions correctly answered
nU: number of questions unanswered
n: total number of questions
Baseline: 0.2 (random choice among five options)
A worked computation is sketched below.
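A minimal sketch of the computation, using the c@1 formula reconstructed above (the function name is illustrative):

    def c_at_1(n_right, n_unanswered, n_total):
        """c@1 = (nR + nU * nR / n) / n."""
        if n_total <= 0:
            raise ValueError("n_total must be positive")
        return (n_right + n_unanswered * n_right / n_total) / n_total

    # Example: 40 right and 20 unanswered out of 100 questions.
    print(c_at_1(40, 20, 100))  # 0.48, vs. a plain accuracy of 0.40

Leaving questions unanswered is thus only rewarded insofar as the system answers the remaining questions accurately.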

Evaluation Measure: Secondary Measure
Defined over the same quantities:
nR: number of questions correctly answered
nU: number of questions unanswered
n: total number of questions
Baseline: 0.2

Evaluation Measure: Validation Performance
Additional counts evaluate how well systems validate their own candidate answers (a tally sketch follows):
nUR: number of unanswered questions whose candidate answer was correct
nUW: number of unanswered questions whose candidate answer was incorrect
nUE: number of unanswered questions whose candidate answer was empty
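A minimal sketch of the tally, reusing the hypothetical (answered, candidate, gold) records from the categorization example:

    def validation_counts(records):
        """Count nUR, nUW, nUE over the unanswered questions only."""
        n_ur = n_uw = n_ue = 0
        for answered, candidate, gold in records:
            if answered:
                continue                  # answered questions are not counted
            if candidate is None:
                n_ue += 1                 # no hypothetical answer recorded
            elif candidate == gold:
                n_ur += 1                 # withheld answer was actually right
            else:
                n_uw += 1                 # withheld answer was wrong
        return n_ur, n_uw, n_ue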

Overview of results

Results in 2011

Results in 2012

Results in 2013

Results in 2013 (continued)

The Use of External Knowledge
Run 1: no external knowledge allowed, other than the background collections provided by the organization.

Websites
http://clef2011.clef-initiative.eu/index.php

Thank you