1
Question Answering Gideon Mann Johns Hopkins University gsm@cs.jhu.edu
2
Information Retrieval Tasks Retired General Wesley Clark How old is General Clark? How long did Clark serve in the military? Will Clark run for President?
3
Ad-Hoc Queries Prior work has been concerned mainly with answering ad-hoc queries: "General Clark" Typically a few words long, not an entire question What is desired is general information about the subject in question
4
Answering Ad-Hoc Queries Main focus of Information Retrieval over the past 2-3 decades Solution(s): –Vector-based methods –SVD, query expansion, language modeling –Return a page as an answer Resulting systems are extremely useful –Google, Altavista
5
Traditional IR: Query + Document Collection → Document Retrieval → Document Ranking
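To make the vector-based ranking idea concrete, here is a minimal sketch of TF-IDF retrieval with cosine similarity. The toy documents, tokenization, and weighting are invented for illustration and do not reproduce any particular system.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF term-weight vectors for a small document collection."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    idf = {t: math.log(len(docs) / df[t]) for t in df}
    vectors = [{t: tf * idf[t] for t, tf in Counter(toks).items()} for toks in tokenized]
    return vectors, idf

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = math.sqrt(sum(w * w for w in u.values())) * math.sqrt(sum(w * w for w in v.values()))
    return dot / norm if norm else 0.0

docs = ["general clark retired from the military",
        "clark may run for president",
        "the weather in little rock is mild"]
vectors, idf = tfidf_vectors(docs)
query = {t: idf.get(t, 0.0) for t in "general clark".split()}
# Rank documents by cosine similarity to the ad-hoc query, best first.
ranking = sorted(range(len(docs)), key=lambda i: cosine(query, vectors[i]), reverse=True)
print([docs[i] for i in ranking])
```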
6
But not all queries are Ad-Hoc! How old is General Clark? Does not fit well into an ad-hoc paradigm –"How" and "is" are not relevant for appropriate retrieval –Potentially useful cues in the question are ignored by a traditional ad-hoc retrieval system
7
Documents are not Facts Traditional IR systems return pages –Useful when only a vague information need has been identified Insufficient when a fact is desired: –How old is General Clark? 58 –How long did Clark serve in the military? 36 years –Will Clark run for president? Maybe
8
Question Answering as Retrieval Given a document collection and a question: a question answering system should retrieve a short snippet of text which exactly answers the question asked.
9
Question Answering: Query + Document Collection → Document Retrieval → Document Ranking → Answer Extraction (Sentence Ranking) → Ranked Answers
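A rough sketch of the extra answer-extraction stage: after retrieval, rank individual sentences by overlap with the question's content words. The stop-word list and toy document are assumptions for illustration only.

```python
def rank_sentences(question, documents):
    """Rank candidate sentences by overlap with the question's content words."""
    stop = {"how", "old", "is", "did", "the", "a", "an", "will", "for", "in", "of"}
    q_terms = {w for w in question.lower().rstrip("?").split() if w not in stop}
    sentences = [s.strip() for doc in documents for s in doc.split(".") if s.strip()]
    scored = [(len(q_terms & set(s.lower().split())), s) for s in sentences]
    return [s for score, s in sorted(scored, key=lambda x: x[0], reverse=True) if score > 0]

docs = ["General Clark turns 58 this December. He served 36 years in the military."]
print(rank_sentences("How old is General Clark?", docs)[0])
# -> "General Clark turns 58 this December"
```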
10
QA as a Comprehension Task For perfect recall, the answer only has to appear once in the collection. In essence, this forces the QA system to function as a text understanding system. Thus QA may be interesting, not only for retrieval, but also to test understanding.
11
QA as a Stepping Stone Current QA is focused on fact extraction –Answers appear verbatim in the text How old is General Clark? How can we answer questions whose answers don't appear exactly in the text? How long has Clark been in the military? Will Clark run for President? Maybe by building on low-level facts extracted by QA
12
QA Methods Two Main Categories of Methods for Question Answering –Answer Preference Matching –Answer Context Matching
13
Lecture Outline 1. Answer Preferences: Question Analysis, Type Identification, Learning Answer Typing 2. Answer Context: Learning Context Similarity, Alignment, Surface Text Patterns
14
Answer Type Identification From the question itself, infer the likely type of the answer: How old is General Clark? → How old When did Clark retire? → When Who is the NBC war correspondent? → Who
15
NASSLI! April 12 Deadline –Could be extended… –Mail hale@jhu.edu to ask for more time
16
Answer Type Identification From the question itself, infer the likely type of the answer: How old is General Clark? → How old → Age When did Clark retire? → When → Date Who is the NBC war correspondent? → Correspondent → Person
17
Wh-Words Who → Person, Organization, Location When → Date, Year Where → Location In What → Location What → ??
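The wh-word table above can be encoded directly as a lookup. The WH_TYPES dictionary and type labels below are hypothetical; bare "what" questions deliberately fall through to deeper analysis.

```python
# Hypothetical wh-word -> answer-type table, mirroring the slide above.
WH_TYPES = {
    "who": ["PERSON", "ORGANIZATION", "LOCATION"],
    "when": ["DATE", "YEAR"],
    "where": ["LOCATION"],
    "in what": ["LOCATION"],
}

def answer_types(question):
    """Return the candidate answer types for a question, or None when the
    wh-word alone (e.g. bare "what") is not enough to decide."""
    q = question.lower()
    for wh, types in sorted(WH_TYPES.items(), key=lambda kv: -len(kv[0])):
        if q.startswith(wh):
            return types
    return None

print(answer_types("When did Clark retire?"))                     # ['DATE', 'YEAR']
print(answer_types("What is the service ceiling for a PAC750?"))  # None: needs deeper analysis
```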
18
Difficult to Enumerate All Possibilities, Though What is the service ceiling for a PAC750?
19
WordNet [WordNet hierarchy diagram showing terms such as wingspan, length, diameter, radius, altitude, ceiling]
20
WordNet For Answer Typing [the same WordNet hierarchy — wingspan, length, diameter, radius, altitude, ceiling — leading up to NUMBER] What is the service ceiling for a PAC750?
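One way to realize this idea is to walk the hypernym paths of the question's focus word until a concept that maps to an answer type is reached. The sketch below assumes NLTK with the WordNet corpus installed; the CONCEPT_TO_TYPE mapping is invented, and whether "ceiling" actually reaches a measure/magnitude node depends on the WordNet version.

```python
# Assumes NLTK with the WordNet corpus downloaded (nltk.download('wordnet')).
from nltk.corpus import wordnet as wn

# Hypothetical mapping from high-level WordNet concepts to answer types.
CONCEPT_TO_TYPE = {"measure": "NUMBER", "magnitude": "NUMBER", "quantity": "NUMBER",
                   "time_period": "DATE", "person": "PERSON", "location": "LOCATION"}

def type_from_wordnet(focus_word):
    """Walk the hypernym paths of every noun sense of the focus word and return
    the answer type of the first high-level concept found on a path."""
    for synset in wn.synsets(focus_word, pos=wn.NOUN):
        for path in synset.hypernym_paths():
            for ancestor in path:
                concept = ancestor.name().split(".")[0]
                if concept in CONCEPT_TO_TYPE:
                    return CONCEPT_TO_TYPE[concept]
    return None

# Expected to yield NUMBER if a sense of "ceiling" descends from a measure or
# magnitude node in the installed WordNet version.
print(type_from_wordnet("ceiling"))
```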
21
Lecture Outline 1. Answer Preferences: Question Analysis, Type Identification, Learning Answer Typing 2. Answer Context: Learning Context Similarity, Alignment, Surface Text Patterns
22
Answer Typing gives the Preference… From answer typing, we have the preferences imposed by the question. But in order to use those preferences, we must have a way to detect potential candidate answers.
23
Some are Simple… Number: [0-9]+ Date: ($month) ($day) ($year) Age: 0 – 100
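These simple types translate almost directly into regular expressions. The patterns below are rough illustrations rather than production-quality taggers.

```python
import re

# Rough, illustrative patterns for the simple answer types listed above.
MONTH = r"(?:January|February|March|April|May|June|July|August|September|October|November|December)"
PATTERNS = {
    "NUMBER": re.compile(r"\b[0-9]+\b"),
    "DATE":   re.compile(MONTH + r"\s+[0-9]{1,2},?\s+[0-9]{4}"),
    "AGE":    re.compile(r"\b(?:100|[1-9][0-9]|[0-9])\b"),   # 0 - 100
}

def candidates(answer_type, sentence):
    """Return every substring of the sentence that matches the type's pattern."""
    return PATTERNS[answer_type].findall(sentence)

s = "General Clark turns 58 after serving 36 years in the service, this December 23, 2002."
print(candidates("AGE", s))   # ['58', '36', '23'] -- still ambiguous, hence answer context
print(candidates("DATE", s))  # ['December 23, 2002']
```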
24
… Others Complicated Who shot Martin Luther King? –Person preference Requires a Named Entity Identifier Who saved Chrysler from bankruptcy? –Not just confined to people… –Need a Tagger to identify appropriate candidates
25
Use WordNet for Type Identification "What 20th century poet wrote Howl?" [WordNet hierarchy: communicator → writer → poet] Candidate set: Ginsberg, Frost, Rilke
26
Simple Answer Extraction How old is General Clark? → Age Age Tagger applied to: General Clark, from Little Rock, Arkansas, turns 58 after serving 36 years in the service, this December 23, 2002.
27
Lecture Outline 1. Answer Preferences: Question Analysis, Type Identification, Learning Answer Typing 2. Answer Context: Learning Context Similarity, Alignment, Surface Text Patterns
28
Learning Answer Typing What is desired is a model which predicts P(type|question) Usually a variety of possible types –Who Person (“Who shot Kennedy?” Oswald) Organization (“Who rescued Chrysler from bankruptcy?” The Government) Location (“Who won the Superbowl?” New England)
29
What training data? Annotated Questions –"Who shot Kennedy?" [PERSON] Problems: –Expensive to annotate –Must be redone every time the tag set is revised
30
Trivia Questions! Alternatively, use unannotated trivia questions –Q: "Who shot Kennedy?" –A: Lee Harvey Oswald Run your type-tagger over the answers to get tags –A: Lee Harvey Oswald [PERSON]
31
MI Model From tags, you can build an MI model –Predict from the question head word: MI(Question Head Word, Type Tag) = P(Type Tag | Question Head Word) / P(Type Tag) –From this you can judge the fit of a question/word pair –(Mann 2001)
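A toy version of this scorer: estimate P(type tag | head word) and P(type tag) from (head word, tag) pairs harvested from tagged trivia answers. The training data below is made up.

```python
from collections import Counter

def train_mi(pairs):
    """pairs: (question_head_word, type_tag) tuples obtained by running the
    type-tagger over trivia answers. Returns MI(head, tag) = P(tag|head) / P(tag)."""
    tag_counts = Counter(tag for _, tag in pairs)
    head_counts = Counter(head for head, _ in pairs)
    joint = Counter(pairs)
    total = len(pairs)

    def mi(head, tag):
        if not head_counts[head] or not tag_counts[tag]:
            return 0.0
        p_tag_given_head = joint[(head, tag)] / head_counts[head]
        p_tag = tag_counts[tag] / total
        return p_tag_given_head / p_tag

    return mi

# Invented toy training data.
data = [("who", "PERSON")] * 8 + [("who", "ORGANIZATION")] * 2 + [("when", "DATE")] * 5
mi = train_mi(data)
print(mi("who", "PERSON"))   # > 1: "who" questions prefer PERSON answers
print(mi("who", "DATE"))     # 0.0: never seen together
```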
32
MaxEnt Model Rather than using the head word alone, train on the entire set of words and build a Maximum Entropy model to combine features suggested by the whole phrase: "What was the year in which Castro was born?" (Ittycheriah et al. 2001)
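As a rough stand-in for the richer feature set of Ittycheriah et al., here is a maximum-entropy (logistic-regression) type classifier over bag-of-words question features using scikit-learn; the tiny training set is invented.

```python
# Assumes scikit-learn is available.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

questions = ["What was the year in which Castro was born?",
             "When did Clark retire?",
             "Who shot Kennedy?",
             "Who rescued Chrysler from bankruptcy?"]
labels = ["DATE", "DATE", "PERSON", "ORGANIZATION"]

# Logistic regression is the standard maximum-entropy formulation: every word of
# the question contributes a feature, not just the head word.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(questions, labels)
print(model.predict(["In what year was Mozart born?"]))  # likely ['DATE']
```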
33
Maybe you don’t even need training data! Look at occurrences of question words in text and at which types occur next to them. Use these co-occurrence statistics to determine the appropriate type of answer. (Prager et al. 2002)
34
Lecture Outline 1. Answer Preferences: Question Analysis, Type Identification, Learning Answer Typing 2. Answer Context: Learning Context Similarity, Alignment, Surface Text Patterns
35
Is Answer Typing Enough? Even when you’ve found the correct sentence and know the type of the answer, a lot of ambiguity still remains. Some experiments show that a sentence which answers a question typically contains around 2-3 candidates of the appropriate type. For high precision systems, this is unacceptable.
36
Answer Context Who shot Martin Luther King? "Who" carries the answer preference; "shot Martin Luther King" supplies the answer context.
37
Using Context Many systems simply look for an answer of the correct type in a context which seems appropriate –Many matching keywords –Perhaps using query expansion
38
Another Alternative If the question is "Who shot Kennedy?", search for all exact phrase matches "X shot Kennedy" and simple alternations "Kennedy was shot by X" (Brill et al. 2001)
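A sketch of this kind of query reformulation for simple "Who &lt;verb&gt; &lt;object&gt;?" questions. The rewrite rules are toy examples, not the actual rule set from Brill et al.; the passive form only works when past tense and past participle coincide (shot/shot).

```python
def rewrites(question):
    """Generate exact-phrase search strings for simple 'Who <verb> <object>?' questions."""
    words = question.rstrip("?").split()
    if words and words[0].lower() == "who" and len(words) >= 3:
        verb, rest = words[1], " ".join(words[2:])
        yield f'"* {verb} {rest}"'         # matches "X shot Kennedy"
        yield f'"{rest} was {verb} by *"'  # matches "Kennedy was shot by X"

print(list(rewrites("Who shot Kennedy?")))
```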
39
Beyond… The first step beyond simple keyword matching is to use relative position information. One way of doing this is to use alignment information.
40
Lecture Outline 1. Answer Preferences: Question Analysis, Type Identification, Learning Answer Typing 2. Answer Context: Learning Context Similarity, Alignment, Surface Text Patterns
41
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Three Potential Candidates by type
42
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Matching Context Question Head word
43
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Anchor word
44
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Potential alignments
45
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. One Alignment Three Alignment Features :
46
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. One Alignment Three Alignment Features: 1. Dws: distance between the question head word and the anchor in the sentence (here 2)
47
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Three Alignment Features: 2. Dwq: distance between the question head word and the anchor in the question (here 1)
48
Local Alignment Who shot Kennedy? Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby’s Husband. Three Alignment Features: 3. R: has the head word changed position? (headword position flipped)
49
Build a Statistical Model Pr(answer | question, sentence) = Pr(Dws | answer, question, sentence) × Pr(Dwq | answer, question, sentence) × Pr(R | answer, question, sentence) and, if unsure about the type preference, a term for it can be added as well
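To make the three features concrete, the sketch below computes Dws, Dwq, and R for each candidate in the running example, treating the candidate as filling the wh-word slot and "shot" as the anchor word. The exact feature definitions and the probability tables would come from the trained model, so this is illustrative only.

```python
def alignment_features(question, sentence, candidate, anchor, wh_word="Who"):
    """Compute the three alignment features for one candidate answer, treating the
    candidate as filling the wh-word slot and 'anchor' as a question word that
    also appears in the sentence. Feature definitions follow the slides loosely."""
    q = question.rstrip("?").split()
    s = [w.strip(",.'") for w in sentence.split()]
    dws = abs(s.index(candidate) - s.index(anchor))   # candidate-anchor distance in the sentence
    dwq = abs(q.index(wh_word) - q.index(anchor))     # wh-word-anchor distance in the question
    r = (s.index(candidate) > s.index(anchor)) != (q.index(wh_word) > q.index(anchor))
    return dws, dwq, r

question = "Who shot Kennedy?"
sentence = "Jack assassinated Oswald, the man who shot Kennedy, and was Mrs. Ruby's Husband."
for cand in ["Jack", "Oswald", "Ruby's"]:
    # Each feature triple would be scored by Pr(Dws|...) * Pr(Dwq|...) * Pr(R|...).
    print(cand, alignment_features(question, sentence, cand, "shot"))
```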
50
In essence, this local alignment model gives a robust method for using the context of the question to pick out the correct answer from a given sentence containing an answer
51
Surface Text Patterns Categorize the question by what kind of data it is looking for Use templates to build specialized models Use the resulting "surface text patterns" for searching
52
Birthday Templates W. A. Mozart, 1756 I. Newton, 1642 M. Gandhi, 1869 V. S. Naipaul, 1932 Bill Gates, 1951
53
Web Search to Generate Patterns Web pages with "Mozart" "1756" → sentences with "Mozart" "1756" → substrings with "Mozart" "1756"
54
How can we pick good patterns? Frequent ones may be too general; infrequent ones are not that useful. We want precise, specific ones. Use held-out templates to evaluate patterns.
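A sketch of the whole loop: harvest candidate patterns from sentences containing known (name, year) pairs, then score each pattern's precision on held-out pairs so that frequent-but-vague patterns are filtered out. The toy sentences and the "<NAME>"/"<YEAR>" placeholders are assumptions for illustration.

```python
import re
from collections import Counter

def learn_patterns(sentences, pairs):
    """From sentences containing known (name, year) pairs, harvest the text
    between them as candidate surface patterns."""
    patterns = Counter()
    for name, year in pairs:
        for s in sentences:
            m = re.search(re.escape(name) + r"(.{1,30}?)" + re.escape(year), s)
            if m:
                patterns["<NAME>" + m.group(1) + "<YEAR>"] += 1
    return patterns

def precision(pattern, sentences, heldout_pairs):
    """Fraction of held-out (name, year) matches for which the pattern, anchored
    on the name, recovers the correct year -- vague patterns score low."""
    middle = pattern[len("<NAME>"):-len("<YEAR>")]
    hits = tries = 0
    for name, year in heldout_pairs:
        regex = re.escape(name) + re.escape(middle) + r"(\d{4})"
        for s in sentences:
            m = re.search(regex, s)
            if m:
                tries += 1
                hits += (m.group(1) == year)
    return hits / tries if tries else 0.0

train = ["W. A. Mozart was born in 1756 in Salzburg.",
         "I. Newton was born in 1642."]
test  = ["M. Gandhi was born in 1869.", "Bill Gates (1951) founded Microsoft."]
pats = learn_patterns(train, [("W. A. Mozart", "1756"), ("I. Newton", "1642")])
best = max(pats, key=lambda p: precision(p, test, [("M. Gandhi", "1869"), ("Bill Gates", "1951")]))
print(best)  # e.g. '<NAME> was born in <YEAR>'
```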