Slide 1: Semantic Entailment
Nathaniel Story, Ginger Buckbee, Greg Lorge, Billy Dean
Slide 2: What is it?
Given sentence A, can you infer sentence B?
"iTunes software has seen strong sales in Europe." → "Strong sales for iTunes in Europe." (True)
"Kerry hit Cheney hard on his conduct of the war in Iraq." → "Kerry shot Cheney." (False)
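The decision above can be sketched as a tiny baseline: treat the hypothesis as entailed when enough of its words appear in the text. The 0.6 threshold below is a made-up illustration, not anything from the slides.

```python
import string

def word_overlap(text: str, hypothesis: str) -> float:
    """Fraction of hypothesis words that also appear in the text."""
    strip = str.maketrans('', '', string.punctuation)
    t = set(text.lower().translate(strip).split())
    h = set(hypothesis.lower().translate(strip).split())
    return len(h & t) / len(h) if h else 0.0

def entails(text: str, hypothesis: str, threshold: float = 0.6) -> bool:
    # Hypothetical fixed threshold; a real system would tune it on training data.
    return word_overlap(text, hypothesis) >= threshold
```

Notably, this baseline accepts the iTunes pair (5 of 6 hypothesis words overlap) but also wrongly accepts "Kerry shot Cheney." (2 of 3 words overlap), which is exactly why richer features are needed.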
Slide 3: Challenges
Paraphrasing
Negation
Presuppositions
World knowledge
Juiciness
Slide 4: Paraphrasing
"There is a cat on the table." / "A cat is on the table."
Structurally different, but they convey the same meaning.
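For this particular paraphrase pair, a simple bag-of-words view already captures the equivalence: every word of the second sentence appears in the first, so a word-matching feature fires despite the structural difference. A minimal sketch:

```python
def content_tokens(sentence: str) -> set:
    """Lowercased word set with surrounding punctuation stripped."""
    return {w.strip('.,!?"\'') for w in sentence.lower().split()}

a = content_tokens("There is a cat on the table.")
b = content_tokens("A cat is on the table.")
assert b <= a  # every word of the paraphrase appears in the original
```

Word sets handle this easy case; paraphrases that swap in different vocabulary would still need synonym or phrase-structure features.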
Slide 5: Negation
"I am lazy" vs. "I am not lazy"
Double negation: "I'm not unhappy" ≈ "I'm happy"; "It's not unnecessary" ≈ "It's necessary"
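One crude way to handle the double-negation cases above is to count negation markers and take the parity: an even count suggests the negations cancel. This is a sketch, not a serious polarity detector — the `un-` prefix check would miscount words like "under" or "unique".

```python
def negation_parity(sentence: str) -> int:
    """Return 1 for a net negation, 0 if negations cancel (or none occur).

    Very naive: only 'not', n't contractions, and an 'un-' prefix are
    counted, so prefix false positives (e.g. 'under') are expected.
    """
    flips = 0
    for raw in sentence.lower().split():
        w = raw.strip('.,!?')
        if w == "not" or w.endswith("n't"):
            flips += 1
        elif w.startswith("un") and len(w) > 4:
            flips += 1
    return flips % 2
```

On the slide's examples, "I am not lazy" comes out as a net negation (parity 1), while "I'm not unhappy" and "It's not unnecessary" cancel to parity 0, matching their positive paraphrases.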
Slide 6: Presuppositions
"Bob doesn't think it's raining" vs. "Bob doesn't know it's raining"
"Know" presupposes that it really is raining; "think" does not.
Conversational pragmatics and contextual knowledge come into play.
Slide 7: World Knowledge
"Japan is the only country that currently has an emperor."
"Colombia doesn't have an emperor."
The first sentence entails the second, but only if you know that Colombia is a country.
Slide 8: Approach
Tools: stemmer; parser from Dan Bikel's site; MALLET (maximum-entropy classifier); WordNet (synsets).
Focus on the Comparable Documents task.
Start with simple features such as word matching and synonym matching.
Add more complicated features such as phrase-structure comparisons.
Test the system to see how it works, then continue adding features to improve performance.
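The word-matching and synonym-matching features above might look like the sketch below. The tiny hand-built `SYNONYMS` table is a hypothetical stand-in for WordNet synsets, and the resulting feature dict is what would be handed to a maximum-entropy classifier such as MALLET's.

```python
# Hypothetical hand-built synonym table standing in for WordNet synsets;
# the actual system would query WordNet instead.
SYNONYMS = {
    "revenue": {"sales"},
    "robust": {"strong"},
}

def features(text: str, hypothesis: str) -> dict:
    """Two simple entailment features over a text/hypothesis pair."""
    t = {w.strip('.,').lower() for w in text.split()}
    h = {w.strip('.,').lower() for w in hypothesis.split()}
    exact = len(h & t) / len(h)
    # Of the hypothesis words with no exact match, how many have a synonym in the text?
    syn = sum(1 for w in h - t if SYNONYMS.get(w, set()) & t) / len(h)
    return {"word_match": exact, "synonym_match": syn}
```

For "Sales were strong." / "Revenue was robust." the exact-match feature is 0, but the synonym feature recovers two of the three hypothesis words — the kind of signal word matching alone misses.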
Slide 9: Data
Recognizing Textual Entailment (RTE) Challenge training data set.
The training set is labeled.
It is the best available data set, and it was used in the European competition.
Slide 10: Evaluation
International competition: the best systems achieve roughly 60% accuracy.
We strive for better than 52% accuracy.
System output is compared against an annotated test set.
To improve: print out the incorrectly classified pairs, then look for patterns in the mistakes.
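The evaluation loop described above — accuracy against the annotated set, with misses printed for error analysis — can be sketched as below; `predict` is any entailment function of the kind developed earlier, and the pair format is an assumption.

```python
def evaluate(predict, labeled_pairs):
    """Accuracy over (text, hypothesis, gold) triples, printing each miss."""
    correct = 0
    for text, hypothesis, gold in labeled_pairs:
        guess = predict(text, hypothesis)
        if guess == gold:
            correct += 1
        else:
            # Printing the misclassified pair supports the error analysis step.
            print(f"WRONG (gold={gold}): {text!r} / {hypothesis!r}")
    return correct / len(labeled_pairs)
```

Any `predict(text, hypothesis) -> bool` can be plugged in and scored the same way, which makes it easy to check whether a new feature pushes accuracy past the 52% target.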
Slide 11: The End