A Noisy-Channel Approach to Question Answering
Authors: Echihabi and Marcu
Date: 2003
Presenters: Omer Percin, Emin Yigit Koksal, Ustun Ozgur
IR 2013

Question Answering Approaches
Matching the same words does not always work:
 Question: "Who is the leader of France?"
 Retrieved sentence: "Hadjenberg, who is the leader of France Jewish Community..."
Solution: some kind of transformation between question and answer is needed.

Noisy Channel Approach
Idea: the question is a noisy version of the answer.
[Diagram: ANSWER → NOISY CHANNEL MODEL → QUESTION]
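The idea can be sketched as a decision rule: among candidate answers, pick the one that most plausibly "degrades" into the observed question, i.e. the argmax of P(Q|A). A minimal sketch, assuming a toy word-overlap channel model (the 0.9/0.1 probabilities, the candidate sentences, and the function names are all illustrative assumptions, not the paper's trained model):

```python
def channel_prob(question_words, answer_words):
    """Toy channel model: each question word is either carried through
    from the answer (prob 0.9) or produced by noise (prob 0.1)."""
    prob = 1.0
    for w in question_words:
        prob *= 0.9 if w in answer_words else 0.1
    return prob

def best_answer(question, candidates):
    """Noisy-channel decision rule: argmax over candidates of P(Q|A)."""
    q_words = question.lower().split()
    scored = [(channel_prob(q_words, c.lower().split()), c) for c in candidates]
    return max(scored)[1]

question = "who is the leader of France"
candidates = [
    "the leader of France is Jacques Chirac",
    "France is a country in Europe",
]
print(best_answer(question, candidates))  # → the leader of France is Jacques Chirac
```

The second candidate shares a few words with the question ("France", "is") but fails to explain the rest, so its channel probability is much lower.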

QA System
Two modules:
 IR Engine: retrieves relevant documents and their sentences.
 Answer Identification Module:
  ◦ From these sentences, identify substrings S(A) as candidate answers.
  ◦ For each substring (answer candidate) and sentence, compute the probability of generating the question text under a generative model.

Answer Identification Module
Generative model: the candidate sentence passes through a series of channel operations to produce the question.
[Diagram: CANDIDATE SENTENCE → CUTTER → ANSWER MARKER → FERTILIZER → REPLACER → PERMUTER → QUESTION, with step probabilities p1 … p5]
P(Q|A) = p1 × p2 × p3 × p4 × p5
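The factorization P(Q|A) = p1 × p2 × p3 × p4 × p5 is just a product of per-step probabilities, one per channel operation. A small sketch of combining them in log space to avoid floating-point underflow (the numeric values are placeholders, not trained parameters):

```python
import math

def generative_score(step_probs):
    """Combine per-step channel probabilities into log P(Q|A).
    Summing logs avoids underflow when many factors are small."""
    return sum(math.log(p) for p in step_probs)

# Placeholder probabilities for the five channel operations.
steps = {
    "cutter":        0.50,  # p1: keep the answer-bearing part of the sentence
    "answer_marker": 0.25,  # p2: mark one substring as the answer
    "fertilizer":    0.10,  # p3: each kept word emits zero or more question words
    "replacer":      0.05,  # p4: translate sentence words into question words
    "permuter":      0.30,  # p5: reorder words into question order
}
log_p = generative_score(steps.values())
print(math.exp(log_p))  # P(Q|A) = 0.5 * 0.25 * 0.1 * 0.05 * 0.3
```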

[Diagram slide illustrating the channel operations: Fertilizer, Replacer, Permuter, Cutter, Answer Marker]

Training and Testing
Both use the generative model.
Training (answer known):
 Deterministic cutter
Testing (given a question):
 Exhaustive cutter: try each cut of each sentence
 Exhaustive answer marker: try each word as a possible answer
 Compute P(Q | Sentence+Cut+Answer) for every combination
 Select the Sentence+Cut+Answer combination with the maximum probability
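The exhaustive test-time search amounts to three nested loops: over sentences, over cuts, over answer marks. The scoring function below is a hypothetical word-overlap stand-in for the trained channel model (its weights 2.0/0.3/0.2 and the example sentences are assumptions for illustration only):

```python
def score(q_words, kept, answer):
    """Hypothetical stand-in for P(Q | Sentence+Cut+Answer): reward kept
    words shared with the question, penalize unrelated words, and prefer
    answers that are not themselves question words."""
    s = 1.0
    for w in kept:
        s *= 2.0 if w in q_words else 0.3
    s *= 0.2 if answer in q_words else 1.0
    return s

def search_answer(question, sentences):
    """Exhaustive search: every sentence x every contiguous cut x every
    word in the cut as the marked answer; return the argmax answer."""
    q_words = set(question.lower().split())
    best_score, best_ans = 0.0, None
    for sent in sentences:                        # each retrieved sentence
        words = sent.lower().split()
        for i in range(len(words)):               # exhaustive cutter:
            for j in range(i + 1, len(words) + 1):    # every contiguous cut
                kept = words[i:j]
                for ans in kept:                  # exhaustive answer marker
                    s = score(q_words, kept, ans)
                    if s > best_score:
                        best_score, best_ans = s, ans
    return best_ans

print(search_answer("who is the leader of france",
                    ["the leader of france is jacques chirac",
                     "france is a country in europe"]))  # → jacques
```

In the real system the deterministic cutter is only available at training time (the answer is known); at test time every cut and every answer mark must be scored, which is what the nested loops make explicit.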

Performance
Competitive with systems built from tens of modules, despite using only two.
Beats QA-base in MRR on the TREC datasets ( vs 0.291).
QA-base was state of the art, ranking between 2nd and 7th at TREC over the previous three years (as of 2003).