
1 Opinions in Question Answering
Jan Wiebe, University of Pittsburgh
Claire Cardie, Cornell University
Ellen Riloff, University of Utah

2 Overview
Techniques and tools to support multi-perspective question answering (MPQA)
Goals:
- produce high-level summaries of opinions
- incorporate rich information about opinions extracted from text

3 Overview
Opinion-oriented information extraction:
- Extract opinion frames for individual expressions
- Combine to create opinion-oriented "scenario" templates
[Figure: Opinion Summary Template]

4 MPQA Corpus
Grew out of the 2002 ARDA NRRC Workshop on Multi-Perspective Question Answering
Detailed annotations of opinions
Freely available (thanks to David Day): nrrc.mitre.org/NRRC/publications.htm

5 Collaborations
Interactions with end-to-end system teams
Integrated corpus annotation
Pilot opinion evaluation

6 Outline
Recent activities:
- Subjective sentence identifier
- Clause intensity identifier
- Extended annotation scheme, version 1
- Q&A corpus
- Nested opinions
- Opinion summaries
What's next

7 Subjective Sentence Identifier
Input is unlabeled data
Evaluated on manual annotations of the MPQA corpus
Accuracy as good as supervised systems that classify all sentences

8 Subjective Sentence Identifier
Bootstraps from a known subjective vocabulary, labeling the sentences it can with confidence
An extraction pattern learner finds clues of subjectivity in that corpus
These clues are incorporated into a statistical model trained on the automatically labeled data
Multiple classification strategies:
- 76% accuracy with a 54% baseline
- 80% subj. precision and 66% subj. recall
- 80% obj. precision and 51% obj. recall
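To make the shape of this bootstrapping loop concrete, here is a minimal sketch. It is an illustration only, not the actual system: the seed lexicon, the stand-in "pattern learner," and the labeling rules are assumptions, and the real work trains a statistical classifier on the resulting labels.

```python
# Minimal sketch of the bootstrapping idea on slide 8, not the actual system.
# The seed lexicon, the toy "pattern learner", and the labeling thresholds are
# assumptions meant only to show the shape of the loop.

SEED_LEXICON = {"furious", "criticized", "condemned", "outrageous", "happy"}

def label_with_confidence(sentences, lexicon):
    """High-precision rule labeling: >=2 clues -> subjective, 0 clues -> objective."""
    labeled = []
    for sent in sentences:
        tokens = [t.strip('.,"\'').lower() for t in sent.split()]
        hits = sum(tok in lexicon for tok in tokens)
        if hits >= 2:
            labeled.append((sent, "subjective"))
        elif hits == 0:
            labeled.append((sent, "objective"))
        # sentences with exactly one clue stay unlabeled this round
    return labeled

def learn_new_clues(labeled):
    """Toy stand-in for the extraction-pattern learner: collect words that
    appear only in subjective-labeled sentences."""
    subj_words, obj_words = set(), set()
    for sent, label in labeled:
        words = {t.strip('.,"\'').lower() for t in sent.split()}
        (subj_words if label == "subjective" else obj_words).update(words)
    return subj_words - obj_words

def bootstrap(sentences, lexicon=SEED_LEXICON, rounds=3):
    labeled = label_with_confidence(sentences, lexicon)
    for _ in range(rounds):
        lexicon = lexicon | learn_new_clues(labeled)   # grow the subjective vocabulary
        labeled = label_with_confidence(sentences, lexicon)
    return labeled   # in the real system these labels train a statistical classifier

if __name__ == "__main__":
    corpus = [
        "The minister was furious and condemned the report.",
        "The report was released on Tuesday.",
        "Critics called the outrageous decision a condemned failure.",
    ]
    for sent, label in bootstrap(corpus):
        print(label, "|", sent)
```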

9 Clause-level intensity (strength) identification
Maximum intensity of the opinions in a clause
Classes: neutral, low, medium, high
Evaluated on manual annotations of the MPQA corpus

10 Example
Opinionated sentence: "I am furious that my landlord refused to return my security deposit until I sued them."
[Figure: parse tree of the sentence, with individual clauses labeled High Strength, Medium Strength, and Neutral]

11 Clause-level intensity (strength) identification
Classification and regression learners
Accuracy: how many clauses are assigned exactly the correct class?
Mean squared error (MSE): how close are the answers to the right ones?
Accuracy: classification > regression
- 23-79% over baseline
MSE: regression > classification
- 57-64% over baseline
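The two metrics differ in how they treat near-misses, which is why regression looks better under MSE. A small sketch follows; the ordinal encoding neutral=0, low=1, medium=2, high=3 is an assumption for illustration.

```python
# Sketch of the two evaluation metrics on slide 11, assuming the ordinal
# encoding neutral=0, low=1, medium=2, high=3 (the encoding is an assumption).

INTENSITY = {"neutral": 0, "low": 1, "medium": 2, "high": 3}

def accuracy(gold, predicted):
    """Fraction of clauses assigned exactly the correct intensity class."""
    return sum(g == p for g, p in zip(gold, predicted)) / len(gold)

def mean_squared_error(gold, predicted):
    """Average squared distance between predicted and true intensity values,
    so near-misses (medium vs. high) cost less than large errors."""
    return sum((INTENSITY[g] - INTENSITY[p]) ** 2
               for g, p in zip(gold, predicted)) / len(gold)

gold = ["high", "medium", "neutral", "low"]
pred = ["high", "high",   "neutral", "neutral"]
print(accuracy(gold, pred))             # 0.5  (two exact matches)
print(mean_squared_error(gold, pred))   # (0 + 1 + 0 + 1) / 4 = 0.5
```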

12 Opinion Frames
Example: "The report has been strongly criticized and condemned by many countries."
Direct subjective annotation:
- Span: "strongly criticized and condemned"
- Source:
- Strength (intensity): high
- Attitude: negative toward the report
- Target: report
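An opinion frame can be read as a small record with these fields. The encoding below is hypothetical, chosen only to mirror the slide; it is not the annotation scheme's actual file format, and the field defaults are assumptions.

```python
# Hypothetical encoding of the direct subjective annotation on slide 12;
# field names mirror the slide, not any official MPQA format.
from dataclasses import dataclass

@dataclass
class OpinionFrame:
    span: str                   # text anchoring the opinion expression
    source: str = ""            # opinion holder (left unspecified on the slide)
    intensity: str = "neutral"  # neutral / low / medium / high
    attitude: str = ""          # e.g. "negative toward the report"
    target: str = ""            # what the opinion is about

frame = OpinionFrame(
    span="strongly criticized and condemned",
    intensity="high",
    attitude="negative toward the report",
    target="report",
)
print(frame)
```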

13 Major Attitude Types
- Positive
- Negative
- Arguing for one's world view
- Intention

14 Negative and Positive Example
"People are happy because Chavez has fallen," she said.
Direct subjective annotation:
- span: are happy
- source:
- attitude:
Attitude annotation:
- span: are happy because Chavez has fallen
- type: positive and negative
- positive target: target annotation (span: Chavez has fallen)
- negative target: target annotation (span: Chavez)

15 Arguing for World View Example
Putin remarked that events in Chechnia "could be interpreted only in the context of the struggle against international terrorism."
Direct subjective annotation:
- span: remarked
- source:
- attitude:
Attitude annotation:
- span: could be interpreted only in the context of the struggle against international terrorism
- type: argue for world view
- target: target annotation (span: events in Chechnia)

16 Characteristics of the linguistic realization
Sarcastic: "Great, keep on buying dollars so there'll be more and more poor people in the country," shouted one.
Speculative: Leaders probably held their breath…

17 Q&A Corpus
Includes 98 documents from the NRRC corpus, split into four topics:
- Kyoto Protocol
- 2002 elections in Zimbabwe
- U.S. annual human rights report
- 2002 coup in Venezuela

18 Q&A Corpus
Includes 30 questions
15 questions classified as fact:
- What is the Kyoto Protocol about?
- What is the Kiko Network?
- Where did Mugabe vote in the 2002 presidential election?
15 questions classified as opinion:
- How do European Union countries feel about the US opposition to the Kyoto Protocol?
- Are the Japanese unanimous in their opinion of Bush's position on the Kyoto Protocol?
- What was the American and British reaction to the reelection of Mugabe?

19 Q&A Corpus
Answer annotations added by two annotators:
- Minimal spans that constituted or contributed to an answer
- Confidence
- Partial?

20 Difficulties in Corpus Creation
Annotating answers:
- Difficult to decide what constitutes an answer:
  Q: "Did most Venezuelans support the 2002 coup?"
  A: "Protesters…failed to gain the support of the army." ???
- Not clear which sources to attribute to collective entities:
  European Union: The EU Parliament? Great Britain? The GB government? Tony Blair?
  The Japanese: The Japanese government? Emperor Akihito? Empress Michiko? The Kiko Network?

21 Q&A Corpus
Interannotator agreement:
- 85% on average
- using Wiebe et al.'s agr(a||b) measure
- 78% and 93%, respectively, for each annotator
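The agr(a||b) measure is directional: roughly, the proportion of annotator a's spans that annotator b also marked, which is why the two per-annotator figures differ. A minimal sketch follows; treating any character overlap between spans as a match is an assumption made here for illustration.

```python
# Minimal sketch of a directional agreement measure in the spirit of agr(a||b):
# the fraction of a's answer spans that overlap some span marked by b.
# Counting any character overlap as a match is an illustrative assumption.

def overlaps(x, y):
    """True if spans x=(start, end) and y=(start, end) share any characters."""
    return x[0] < y[1] and y[0] < x[1]

def agr(a_spans, b_spans):
    if not a_spans:
        return 0.0
    matched = sum(any(overlaps(a, b) for b in b_spans) for a in a_spans)
    return matched / len(a_spans)

annotator_a = [(10, 25), (40, 60), (100, 120)]
annotator_b = [(12, 20), (45, 58)]
print(agr(annotator_a, annotator_b))   # 0.667: two of a's three spans overlap b's
print(agr(annotator_b, annotator_a))   # 1.0: both of b's spans overlap a's
```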

22 Evaluating MPQA Opinion Annotations
Answer probability: estimate
- P(opinion answer | opinion question)
- P(fact answer | fact question)
Low-level opinion information is a reliable predictor:
- facts: 78%
- opinions: 93%
Answer rank:
- Sentence-based retrieval
- Filter based on opinion annotations
- Examine rank of first sentence with an answer
- Filtering improves answer rank
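The answer-rank experiment can be pictured as follows. This is a sketch only: the sentence records and the match-the-question-type filter rule are assumptions, while the real evaluation used the MPQA opinion annotations.

```python
# Sketch of the answer-rank idea on slide 22: retrieve sentences, filter by
# whether their opinion/fact type matches the question type, and check the
# rank of the first sentence containing an answer. The data layout is assumed.

def first_answer_rank(ranked_sentences):
    """1-based rank of the first retrieved sentence that contains an answer."""
    for rank, sent in enumerate(ranked_sentences, start=1):
        if sent["has_answer"]:
            return rank
    return None

def filter_by_question_type(ranked_sentences, question_type):
    """Keep only sentences whose annotation type matches the question type."""
    return [s for s in ranked_sentences if s["type"] == question_type]

retrieved = [
    {"text": "The protocol was signed in 1997.",       "type": "fact",    "has_answer": False},
    {"text": "EU officials sharply criticized the US.", "type": "opinion", "has_answer": True},
    {"text": "Japan hosted the conference.",            "type": "fact",    "has_answer": False},
]

print(first_answer_rank(retrieved))                                      # 2 (unfiltered)
print(first_answer_rank(filter_by_question_type(retrieved, "opinion")))  # 1 (after filtering)
```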

23 Summary Representations of Opinions
[Figure: direct subjective annotation (Source, Attitude) feeding an Opinion Summary Template]

24 Reporting in text
Clapp sums up the environmental movement's reaction: "The polluters are unreasonable."
Charlie was angry at Alice's claim that Bob was unhappy.

25 Hierarchy of Perspective & Speech Expressions
Clapp sums up the environmental movement's reaction: "The polluters are unreasonable."
Charlie was angry at Alice's claim that Bob was unhappy.
[Figure: each sentence shown as a tree of nested perspective and speech expressions under the writer's implicit speech event, e.g. implicit speech event > angry > claim > unhappy, and implicit speech event > sums up > reaction]

26 Baseline 1: Only filter through writer
66% correct
[Figure: all expressions (angry, claim, unhappy) attached directly to the writer's implicit speech event]

27 Baseline 2: Dependency Tree
72% correct
[Figure: hierarchy induced from the dependency tree over the expressions (implicit speech event, angry, claim, unhappy)]

28 ML Approach
Features:
- Parse-based
- Positional
- Lexical
- Genre-specific
IND decision trees (MML criterion)
78% correct
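A rough analogue of this setup is sketched below. It is not the authors' system: the original used IND decision trees with an MML criterion, so scikit-learn's DecisionTreeClassifier is a substitution, and the feature vectors and labels are invented for illustration.

```python
# Rough analogue of slide 28: a decision tree over parse-based, positional,
# lexical, and genre-specific features. scikit-learn stands in for IND
# decision trees (MML criterion); the feature values below are invented.
from sklearn.tree import DecisionTreeClassifier

# Each row: [depth_in_parse_tree, token_offset, is_speech_verb, is_newswire]
X = [
    [1, 3, 1, 1],
    [2, 8, 0, 1],
    [3, 15, 1, 0],
    [1, 2, 0, 1],
    [2, 10, 1, 1],
    [3, 20, 0, 0],
]
# Label: which parent perspective/speech expression the opinion attaches to
y = [0, 0, 1, 0, 1, 1]

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)
print(clf.predict([[2, 9, 1, 1]]))   # predicted attachment for a new expression
```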

29 Summary Representations of Opinions
[Figure: direct subjective annotation (Source, Attitude) feeding an Opinion Summary Template]

30 Opinion Summaries
Summaries based on manual annotations:
- Single-document summaries
- Opinion annotations grouped by source and target
- Sources characterized by degree of subjectivity/objectivity
- Simple graph-based graphical interface
  - Overview of entire graph
  - Focus on a portion of the graph
  - Drill-down to opinion annotations (highlighted)
  - Grouping/deleting of sources/targets
  - JGRAPH package
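The grouping step behind these summaries can be sketched as follows. The annotation records and the way they are collected are illustrative assumptions; the slides' interface rendered the resulting source/target graph with the JGRAPH package rather than in Python.

```python
# Sketch of the grouping step behind the opinion summaries on slide 30:
# collect opinion annotations under (source, target) pairs, the structure the
# graph view displays. The record fields here are assumptions.
from collections import defaultdict

annotations = [
    {"source": "EU officials", "target": "US policy",      "attitude": "negative", "intensity": "high"},
    {"source": "EU officials", "target": "US policy",      "attitude": "negative", "intensity": "medium"},
    {"source": "Japan",        "target": "Kyoto Protocol", "attitude": "positive", "intensity": "low"},
]

summary = defaultdict(list)
for ann in annotations:
    summary[(ann["source"], ann["target"])].append((ann["attitude"], ann["intensity"]))

# A GUI (JGRAPH in the slides) would show sources and targets as nodes and
# these grouped opinions as labeled edges between them.
for (source, target), opinions in summary.items():
    print(f"{source} -> {target}: {opinions}")
```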

31 The next 6 months
Identify individual expressions of subjectivity
Perform manual annotations
Extract sources
Opinion summaries with automatic annotations

