Natural Language Questions for the Web of Data
Mohamed Yahya, Klaus Berberich, Gerhard Weikum (Max Planck Institute for Informatics, Germany); Shady Elbassuoni (Qatar Computing Research Institute); Maya Ramanath (Dept. of CSE, IIT-Delhi, India); Volker Tresp (Siemens AG, Corporate Technology, Munich, Germany)
EMNLP 2012
Introduction
Natural language question qNL: "Which female actor played in Casablanca and is married to a writer who was born in Rome?"
Target SPARQL query:
?x hasGender female . ?x isa actor . ?x actedIn Casablanca_(film) . ?x marriedTo ?w . ?w isa writer . ?w bornIn Rome
Problem: SPARQL is way too difficult for end users.
Target: convert qNL into SPARQL.
Knowledge Base: Yago2 Yago2 is a huge semantic knowledge base, derived from Wikipedia, WordNet and GeoNames.
Framework DEANNA (DEep Answers for maNy Naturally Asked questions)
Framework: Phrase Detection → Phrase Mapping → Q-Unit Generation → Disambiguation of Phrase Mappings → Query Generation
Phrase Detection
A detected phrase p is a pair ⟨Toks, l⟩, where Toks is a sequence of tokens and l ∈ {concept, relation} is a label.
Example qNL: "Which female actor played in Casablanca and is married to a writer who was born in Rome?"
Concept phrases: e.g., "female", "actor", "Casablanca", "writer", "Rome".
Relation phrases: e.g., "played in", "married to", "born in".
Phrase Detection
Concept detection: concept phrases are detected using the Yago2 knowledge base.
Phrase Detection
Relation detection: using ReVerb (Fader et al., 2011), an open-domain relation phrase detector.
qNL: "Which female actor played in Casablanca and is married to a writer who was born in Rome?"
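To make the relation-detection step concrete, here is a minimal sketch of ReVerb-style relation phrase detection. The POS tags are hand-supplied toy data, and the verb-plus-preposition pattern is a simplification of ReVerb's actual syntactic regex; this is not the paper's implementation.

```python
# Simplified sketch of ReVerb-style relation phrase detection.
# Real ReVerb matches a richer POS pattern over tagged text; here we
# hand-tag a toy fragment of the running example (assumed tags).

VERB = {"VB", "VBD", "VBZ", "VBN"}
PREP = {"IN", "TO"}

def detect_relation_phrases(tagged):
    """Return verb(+preposition) spans as candidate relation phrases."""
    phrases, i = [], 0
    while i < len(tagged):
        if tagged[i][1] in VERB:
            j = i + 1
            # extend the span over immediately following prepositions
            while j < len(tagged) and tagged[j][1] in PREP:
                j += 1
            phrases.append(" ".join(tok for tok, _ in tagged[i:j]))
            i = j
        else:
            i += 1
    return phrases

tagged = [("actor", "NN"), ("played", "VBD"), ("in", "IN"),
          ("Casablanca", "NNP"), ("married", "VBN"), ("to", "TO"),
          ("a", "DT"), ("writer", "NN"), ("born", "VBN"), ("in", "IN"),
          ("Rome", "NNP")]
print(detect_relation_phrases(tagged))  # → ['played in', 'married to', 'born in']
```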
Phrase Mapping
The mapping of concept phrases relies on a phrase-concept dictionary derived from the Yago2 knowledge base.
The mapping of relation phrases relies on a corpus of textual-pattern-to-relation mappings.
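A minimal sketch of the mapping step. The dictionary entries below are hypothetical toy data standing in for the Yago2-derived phrase-concept dictionary and the pattern-relation corpus:

```python
# Toy phrase-mapping dictionaries (hypothetical entries; the paper
# derives these from Yago2 and a textual-pattern corpus).
concept_dict = {
    "casablanca": ["Casablanca_(film)", "Casablanca_(city)"],
    "writer": ["wordnet_writer"],
    "rome": ["Rome", "Rome_(TV_series)"],
}
relation_dict = {
    "played in": ["actedIn", "playedFor"],
    "married to": ["marriedTo"],
    "born in": ["bornIn"],
}

def map_phrase(phrase, label):
    """Return the candidate semantic items for a detected phrase."""
    d = concept_dict if label == "concept" else relation_dict
    return d.get(phrase.lower(), [])

print(map_phrase("Casablanca", "concept"))  # ambiguous: film vs. city
print(map_phrase("played in", "relation"))
```

Note that "Casablanca" remains ambiguous at this stage; resolving such ambiguity is deferred to the joint disambiguation step.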
Dependency Parsing & Q-Unit Generation
Dependency parsing identifies triples of tokens, or triploids, ⟨trel, targ1, targ2⟩, where trel, targ1, targ2 ∈ qNL are seed tokens for phrases.
Dependency Parsing & Q-Unit Generation
qNL: "Which female actor played in Casablanca and is married to a writer who was born in Rome?"
Dependency edges: actor → played and played → in → Casablanca.
Triploid: ⟨played, actor, Casablanca⟩.
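The step above can be sketched as follows; the dependency edges are hand-built toy data (a real system would obtain them from a parser), and the label conventions are simplified assumptions:

```python
# Sketch: derive triploids <t_rel, t_arg1, t_arg2> from dependency
# edges. Edges are (head, dependent, label) triples for part of the
# example question (toy data, not real parser output).
edges = [
    ("played", "actor", "nsubj"),
    ("played", "Casablanca", "prep_in"),
]

def triploids(edges):
    """Pair each predicate token's subject-like and object-like dependents."""
    by_head = {}
    for head, dep, label in edges:
        by_head.setdefault(head, {})[label] = dep
    out = []
    for head, deps in by_head.items():
        subj = deps.get("nsubj")
        obj = next((d for l, d in deps.items() if l.startswith("prep")), None)
        if subj and obj:
            out.append((head, subj, obj))
    return out

print(triploids(edges))  # → [('played', 'actor', 'Casablanca')]
```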
Dependency Parsing & Q-Unit Generation
A q-unit is a triple of sets of phrases, ⟨{prel}, {parg1}, {parg2}⟩, where trel ∈ prel, and similarly for arg1 and arg2.
Disambiguation of Phrase Mappings
Disambiguation graph:
- similarity edges Esim ⊆ Vp × Vs
- coherence edges Ecoh ⊆ Vs × Vs
- q-edges Eq ⊆ Vq × Vp × d, where d ∈ {rel, arg1, arg2}
Disambiguation of Phrase Mappings
Disambiguation Graph (Cohsem)
For Yago2, we characterize an entity e by its inlinks InLinks(e): the set of Yago2 entities whose corresponding Wikipedia pages link to e.
Example: InLinks(Taipei_zoo):
Disambiguation of Phrase Mappings
Disambiguation Graph (Cohsem)
For a class c with member entities e, its inlinks are the union of its members' inlinks: InLinks(c) = ∪e∈c InLinks(e).
Example: InLinks(Taiwan):
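A small sketch of the inlink machinery. The inlink sets are toy data, and the Jaccard-style overlap used as a coherence score is an assumption about the exact measure, illustrating the idea that items with overlapping inlinks are semantically coherent:

```python
# Toy inlink sets (hypothetical); InLinks(class) = union over members.
inlinks = {
    "Casablanca_(film)": {"Humphrey_Bogart", "Ingrid_Bergman"},
    "Roman_Holiday": {"Audrey_Hepburn", "Gregory_Peck"},
}
class_members = {"wordnet_movie": ["Casablanca_(film)", "Roman_Holiday"]}

def class_inlinks(c):
    """Inlinks of a class: union of its member entities' inlinks."""
    return set().union(*(inlinks[e] for e in class_members[c]))

def coherence(s1, s2):
    """Inlink-overlap coherence (Jaccard overlap; assumed measure)."""
    a, b = inlinks[s1], inlinks[s2]
    return len(a & b) / len(a | b) if a | b else 0.0

print(sorted(class_inlinks("wordnet_movie")))
```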
Disambiguation of Phrase Mappings
Disambiguation Graph (Cohsem)
For a relation r, its inlinks are defined analogously, over the entities that occur in r's instances.
Disambiguation of Phrase Mappings
Disambiguation Graph (Simsem)
- For entities: how often the phrase refers to that entity in Wikipedia.
- For classes: a normalized prior that reflects the number of members in the class.
- For relations: the maximum n-gram similarity between the phrase and any of the relation's surface forms.
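For the relation case, here is a sketch of a maximum n-gram similarity between a phrase and a relation's surface forms. The Dice-style word-unigram overlap and the toy surface forms are assumptions; the paper's exact similarity may differ:

```python
# Sketch: maximum n-gram similarity (Dice-style overlap of word
# n-grams) between a relation phrase and surface forms (toy data).

def ngrams(s, n):
    toks = s.lower().split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def ngram_sim(a, b, n=1):
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

surface_forms = {"bornIn": ["born in", "was born in", "birthplace"]}

def relation_prior(phrase, rel):
    """Max similarity of the phrase to any surface form of the relation."""
    return max(ngram_sim(phrase, sf) for sf in surface_forms[rel])

print(relation_prior("born in", "bornIn"))  # → 1.0 (exact match)
```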
Disambiguation of Phrase Mappings
The ILP objective function (maximized over binary decision variables, with hyperparameters α, β, γ):
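The slide's formula did not survive extraction. Based on the similarity edges, coherence edges, q-edges, and the hyperparameters α, β, γ mentioned in the evaluation, the objective plausibly has the following shape; this is a hedged reconstruction, not the paper's exact notation:

```latex
\max \;
\alpha \sum_{(p,s) \in E_{sim}} w_{sim}(p,s)\, X_{p,s}
\;+\; \beta \sum_{(s,s') \in E_{coh}} w_{coh}(s,s')\, Y_{s,s'}
\;+\; \gamma \sum_{e \in E_q} Z_e
```

where X, Y, Z are binary variables indicating which similarity edges, coherence edges, and q-edges are selected.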
Disambiguation of Phrase Mappings
Definitions:
Disambiguation of Phrase Mappings
Constraints:
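To make the joint disambiguation concrete, here is a brute-force sketch rather than the paper's ILP solver: pick one candidate per phrase so that the summed similarity plus coherence weight is maximal, subject to the one-candidate-per-phrase constraint. All names and weights are toy assumptions:

```python
from itertools import product

# Toy candidates: phrase -> [(semantic item, similarity weight)]
cands = {
    "Casablanca": [("Casablanca_(film)", 0.6), ("Casablanca_(city)", 0.9)],
    "played in": [("actedIn", 0.8), ("playedFor", 0.7)],
}
# Toy pairwise coherence weights between semantic items
coh = {frozenset({"Casablanca_(film)", "actedIn"}): 0.9,
       frozenset({"Casablanca_(city)", "actedIn"}): 0.1}

def best_assignment(alpha=1.0, beta=1.0):
    """Exhaustively maximize alpha*similarity + beta*coherence."""
    phrases = list(cands)
    best, best_score = None, float("-inf")
    for choice in product(*(cands[p] for p in phrases)):
        sim = sum(w for _, w in choice)
        items = [s for s, _ in choice]
        c = sum(coh.get(frozenset({a, b}), 0.0)
                for i, a in enumerate(items) for b in items[i + 1:])
        score = alpha * sim + beta * c
        if score > best_score:
            best, best_score = dict(zip(phrases, items)), score
    return best

print(best_assignment())
```

Note how coherence overrides the prior: the city sense of "Casablanca" has the higher similarity weight, but the film sense wins because it coheres with actedIn.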
Query Generation
The selected semantic items are grouped by q-unit into subject-property-object triple patterns, yielding the final SPARQL query.
Evaluation
Experiments are based on two collections of questions:
- QALD-1 (27 questions out of 50)
- NAGA (44 questions out of 87)
19 questions from the QALD-1 test set were used for tuning the hyperparameters (α, β, γ) of the ILP objective function.
Evaluation
The output of DEANNA is evaluated at three stages of the processing pipeline:
a) Disambiguation
b) Query Generation
c) Question Answering
At each stage, the output was shown to two human assessors; if they disagreed, a third person resolved the judgment.
Evaluation Define coverage and precision as follows:
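The slide's formulas did not survive extraction. Under one plausible reading (an assumption, not necessarily the paper's exact definitions), coverage is the fraction of all questions for which the stage produced output, and precision is the fraction of produced outputs judged correct:

```python
# Hedged sketch of the evaluation metrics (assumed definitions):
#   coverage  = |questions with an output| / |all questions|
#   precision = |outputs judged correct|  / |questions with an output|

def coverage_precision(results):
    """results: list of (produced: bool, correct: bool), one per question."""
    total = len(results)
    produced = sum(1 for p, _ in results if p)
    correct = sum(1 for p, c in results if p and c)
    cov = produced / total if total else 0.0
    prec = correct / produced if produced else 0.0
    return cov, prec

# Toy judgments for 5 questions: output produced for 4, correct for 3.
print(coverage_precision([(True, True), (True, False), (True, True),
                          (False, False), (True, True)]))  # → (0.8, 0.75)
```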
Evaluation a) Disambiguation
Evaluation b) Query Generation
Evaluation c) Question Answering
Conclusions
DEANNA: a method for translating natural-language questions into structured (SPARQL) queries.