MASTERS THESIS DEFENSE
Solving Winograd Schema Challenge: Using Semantic Parsing, Automatic Knowledge Acquisition and Logical Reasoning
School of Computing, Informatics, and Decision Systems Engineering, Arizona State University
BY ARPIT SHARMA
ADVISOR: DR. CHITTA BARAL
OCTOBER 31ST

Presentation Overview
- Background and Motivation
- Problem and Related Work
- The System
  - Semantic Parser & Pronoun Extractor
  - Automatic Background Knowledge Extractor
  - Logical Reasoning Engine
- System Evaluation and Error Analysis
- Contributions and Future Work

Background and Motivation

Background
- One of the goals of AI: simulation of human-level intelligence in machines
- The ability to think and reason, based on commonsense knowledge about things
- How do we measure it?
  - The Turing Test, 1950 (deceive humans in conversation)
  - Not an ideal test
(Image: a conversation with Scott Joel Aaronson, computer scientist at MIT)

Background
- Hector J. Levesque proposed the Winograd Schema Challenge as an alternative to the Turing Test in 2011
- Its aim is not to deceive humans, but to simulate a human-like reasoning process

Winograd Schema Example
The town councilors refused to give the demonstrators a permit because they feared violence.
The town councilors refused to give the demonstrators a permit because they advocated violence.
- A schema contains a pair of sentences that differ in only one or two words
- The sentences contain an ambiguity that is resolved in opposite ways in the two sentences
- Resolution requires the use of world knowledge and reasoning

The Winograd Schema Challenge
- A question answering test
- A collection of 141 Winograd schemas
- 282 total sentences
- A question about each sentence
Example:
The town councilors refused to give the demonstrators a permit because they feared violence. Who feared violence?
The town councilors refused to give the demonstrators a permit because they advocated violence. Who advocated violence?

Motivation
Solving the challenge is helpful in:
- Text Summarization
- Reading Comprehension
- Deep Question Answering
- Ultimately, Thinking Machines

Problem and Related Work

The Problem
The fish ate the worm because it was hungry. Who was hungry?
Quagmire: "Hey Peter, can you answer the above question based on the sentence?"
Peter: "Of course Quagmire, the answer is 'the fish'."
Quagmire: "How did you know? The sentence does not mention it."
Peter: "Ooo!!!! Does that mean I am GOD!!!!"
Quagmire: "No Peter!!! You are just a fat HUMAN!!"

The Problem
- Humans have commonsense or background knowledge about things and events
- How do humans get this knowledge? And from where?

Related Work: Resolving Complex Cases of Definite Pronouns: The Winograd Schema Challenge
- By Altaf Rahman and Vincent Ng, Human Language Technology Research Institute, 2012
- Used statistical techniques and a machine learning framework to combine their results (a ranking-based approach)
- Created a new, Winograd-Schema-Challenge-like corpus: 941 schemas (30% used as the test set)
- 73% accuracy
- The corpus contains redundancy, e.g.:
  John shot Bill and he died.
  The man shot his friend and he died.

Related Work: Resolving Complex Cases of Definite Pronouns (continued)
One of their components issues Google queries:
Lions eat zebras because they are predators.
Queries: "lions are predators" vs. "zebras are predators"
But what if the sentence is "Lions eat zebras because they are hungry"?

Related Work: Narrative Chains
- A "partially ordered set of events centered around a common protagonist" (Nathanael Chambers, 2010)
- Example chain: borrow-s, invest-s, spend-s, pay-s, raise-s, lend-s
Drawbacks:
- Cover only events (verbs)
- Few in number
The fish ate the worm because it was hungry. Who was hungry?

Related Work: Tackling Winograd Schemas by Formalizing Relevance Theory in Knowledge Graphs
- By Peter Schüller, Marmara University, Department of Computer Engineering, 2014
- Converted the given sentence into a dependency graph
- Manually created a background knowledge graph
- Combined both graphs to get the answer
- Showed usability on 4 Winograd schemas

The System

The Workflow
1. The given sentence and question go to the Semantic Parser, which produces semantic representations of both, and to the Pronoun Extractor, which identifies the pronoun to be resolved.
2. The Automatic Background Knowledge Extractor uses these representations to retrieve a background sentence, which is also parsed into its own semantic representation.
3. The Logical Reasoning Module takes the two semantic representations and the pronoun and produces the answer.
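A minimal sketch of this workflow as code; every helper below is a hypothetical stand-in for the corresponding thesis module, not the actual implementation:

from typing import List, Optional

def semantic_parse(text: str) -> dict:
    """Stand-in for the semantic parser: returns a graph for the text."""
    return {"text": text, "edges": []}

def extract_pronoun(sent_graph: dict, q_graph: dict) -> str:
    """Stand-in for the pronoun extractor."""
    return "he_9"

def extract_background(sent_graph: dict, q_graph: dict) -> List[str]:
    """Stand-in for the background knowledge extractor (web search)."""
    return ["She could not even lift her head because she was so weak"]

def reason(sent_graph: dict, bg_graph: dict, pronoun: str) -> Optional[str]:
    """Stand-in for the ASP reasoning module."""
    return None

def solve(sentence: str, question: str) -> str:
    sent_g = semantic_parse(sentence)
    q_g = semantic_parse(question)
    pronoun = extract_pronoun(sent_g, q_g)
    for bg in extract_background(sent_g, q_g):
        answer = reason(sent_g, semantic_parse(bg), pronoun)
        if answer:
            return answer
    return "no answer: background knowledge not found"

print(solve("The man could not lift his son because he was so weak.", "Who was weak?"))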

Semantic Parser & Pronoun Extractor

Semantic Parser
Goal: represent text in an expressive formal representation.
- Preserves grammatical structure: syntactic dependency parse
- Distinguishes words with the same conceptual sense: ontology (WordNet)
- Uses a general set of relations: Knowledge Machine (KM) slot dictionary

Semantic Parser: Syntactic Dependency Parse
Stanford dependency parse of "The man loves his wife":
det(man/NN, The/DT)
nsubj(loves/VBZ, man/NN)
dobj(loves/VBZ, wife/NN)
poss(wife/NN, his/PRP$)
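For reference, a minimal sketch of obtaining such dependency triples, assuming NLTK's CoreNLP wrapper and a Stanford CoreNLP server already running locally on port 9000 (newer CoreNLP versions emit Universal Dependencies labels such as obj and nmod:poss instead of the older dobj and poss shown above):

from nltk.parse.corenlp import CoreNLPDependencyParser

# Assumes a CoreNLP server is running on localhost:9000.
parser = CoreNLPDependencyParser(url="http://localhost:9000")
parse, = parser.raw_parse("The man loves his wife")
for (gov, gov_pos), rel, (dep, dep_pos) in parse.triples():
    print(f"{rel}({gov}/{gov_pos}, {dep}/{dep_pos})")
# e.g. nsubj(loves/VBZ, man/NN), ...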

Semantic Parser: KM Slot Dictionary Mapping
Semantic parse of "The man loves his wife":
agent(loves/VBZ, man/NN)
recipient(loves/VBZ, wife/NN)
possesed_by(wife/NN, his/PRP$)
Stanford dependency relations are mapped to KM slot dictionary relations using intuitive rules (e.g., nsubj maps to agent, dobj to recipient, poss to possesed_by).
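A sketch of the mapping step, assuming the dependency triples above; the rule table is an illustrative subset of the thesis's hand-crafted rules:

# Illustrative mapping from Stanford dependency relations to KM slots,
# applied to the triples for "The man loves his wife". The rule table is
# a toy subset; the thesis uses a larger set of hand-crafted rules.
DEP_TO_KM = {
    "nsubj": "agent",
    "dobj": "recipient",
    "poss": "possesed_by",   # slot name spelled as in the slides
}

triples = [
    ("loves", "nsubj", "man"),
    ("loves", "dobj", "wife"),
    ("wife", "poss", "his"),
]

for head, rel, dep in triples:
    slot = DEP_TO_KM.get(rel)
    if slot:
        print(f"{slot}({head}, {dep})")
# agent(loves, man)
# recipient(loves, wife)
# possesed_by(wife, his)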

Semantic Parser: Ontology Addition
Ontology addition to the semantic parse of "The man loves his wife because she loved him":
agent(loves_3, man_2); recipient(loves_3, wife_5); possesed_by(wife_5, his_4)
caused_by(loves_3, loved_8); agent(loved_8, she_7); recipient(loved_8, him_9)
Added ontology edges include: instance_of(man_2, man), superclass(man, person); instance_of(loves_3, love), superclass(love, emotion); instance_of(wife_5, wife), superclass(wife, person); and similar edges for the remaining nodes.

Pronoun Extractor
The man could not lift his son because he is so weak. Who is weak?
The sentence graph contains agent(lift_5, man_2), recipient(lift_5, son_7), possesed_by(son_7, his_6), negative(lift_5, not_4), and a weak_12 node with participant he_9. The question graph contains trait(weak_3, q_1) with instance_of(weak_3, weak), where q_1 stands for the "who". Matching the question's weak node against weak_12 in the sentence identifies he_9 as the pronoun to be resolved.
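A toy sketch of this matching, under the assumption that graphs are plain edge lists; the edges below are transcribed from this slide's example, and the pronoun list is illustrative:

# Toy pronoun extractor: match the question's predicate node against the
# sentence graph via shared instance_of classes, then return the pronoun
# participating in the matched node. All data is from this slide.
PRONOUNS = {"he", "she", "it", "they", "him", "her", "them"}

sentence_edges = [
    ("lift_5", "agent", "man_2"),
    ("lift_5", "recipient", "son_7"),
    ("weak_12", "participant", "he_9"),
    ("weak_12", "instance_of", "weak"),
]
question_edges = [
    ("weak_3", "trait", "q_1"),
    ("weak_3", "instance_of", "weak"),
]

def pronoun_to_resolve(sent, quest):
    q_classes = {dst for _, rel, dst in quest if rel == "instance_of"}
    matched = {src for src, rel, dst in sent
               if rel == "instance_of" and dst in q_classes}
    for src, rel, dst in sent:
        if src in matched and dst.split("_")[0] in PRONOUNS:
            return dst
    return None

print(pronoun_to_resolve(sentence_edges, question_edges))  # he_9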

Automatic Background Knowledge Extractor

Automatic Background Knowledge Extractor
The idea is to learn the usage of English words and the contexts in which they are used. That is done by:
- Creating a query from the formal representation of the given sentence and the question
- Extracting background knowledge sentences from a big source of raw text

Categorization of Winograd Schemas
- Causal: The fish ate the worm because it was tasty.
- Non-causal: Mary took out her flute and played one of her favorite pieces. She has had it since she was a child.
- Temporal: Jackson was greatly influenced by Arnold, though he lived two centuries earlier.
- Locative: Sam's drawing was hung just above Tina's and it did look much better with another one above it.

Two subtypes of the causal category are solved by the system:
- Type 1, Direct Causal Events. Example: Ann asked Mary what time the library closes, but she had forgotten. Who had forgotten?
- Type 2, Causal Attributive. Example: The man could not lift his son because he is so weak. Who is weak?

Automatic Background Knowledge Extractor: Creating Queries
The man could not lift his son because he was so weak. Who was weak?
Type 1 queries:
- Use the semantic graph of the given sentence and the question
- Trace all nodes of the question into the given sentence (except the "wh" nodes)
- Extract the semantically important words (except entities)
- Also include the connective words
- Combine the words in their order of occurrence in the sentence, joined by wildcards (.*) and wrapped in quotes, as sketched below
Query Set 1 (Q1):
".*not.*lift.*because.*weak.*"
".*not.*lift.*because.*so.*weak.*"
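A sketch of the query construction, assuming the ordered keyword list has already been extracted from the graphs:

import re

def build_query(keywords):
    """Join ordered keywords with wildcards, as in the Q1 queries above."""
    return ".*" + ".*".join(keywords) + ".*"

q1a = build_query(["not", "lift", "because", "weak"])
q1b = build_query(["not", "lift", "because", "so", "weak"])
print(q1a)  # .*not.*lift.*because.*weak.*
assert re.search(q1a, "She could not even lift her head because she was so weak")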

(Diagram: the semantic graphs of the sentence and the question. The question's nodes (q_1, weak_3) are traced into the sentence graph, picking out the words not, lift, so, and weak, which, together with the connective because, yield the queries above.)

Creating Queries (continued)
Type 2 queries:
- Replace the verbs in the Type 1 queries with their synonyms (one possible realization is sketched below)
- Consider all combinations
For example, the Q1 query ".*not.*lift.*because.*weak.*" yields Query Set 2 (Q2): ".*not.*pick.*because.*weak.*"
Final set of queries Q = Q1 ∪ Q2:
".*not.*lift.*because.*weak.*"
".*not.*lift.*because.*so.*weak.*"
".*not.*pick.*because.*weak.*"
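One way the synonym expansion could be realized with WordNet (the thesis's ontology); this assumes NLTK with the WordNet corpus downloaded (nltk.download("wordnet")), and the resulting synonyms are examples, not the thesis's exact expansion:

from nltk.corpus import wordnet as wn

def verb_synonyms(verb, limit=5):
    """Collect distinct single-word verb synonyms of `verb` from WordNet."""
    syns = []
    for synset in wn.synsets(verb, pos=wn.VERB):
        for lemma in synset.lemmas():
            name = lemma.name()
            if name != verb and "_" not in name and name not in syns:
                syns.append(name)
    return syns[:limit]

q1 = ".*not.*lift.*because.*weak.*"
q2 = [q1.replace("lift", syn) for syn in verb_synonyms("lift")]
# e.g. ".*not.*raise.*because.*weak.*", ".*not.*elevate.*because.*weak.*", ...
final_queries = [q1] + q2
print(final_queries)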

Automatic Background Knowledge Extractor: Extracting Background Knowledge Sentences
- Using a big source of raw text
- Using a search engine

Sentences are extracted from the WWW in two ways.
Example query: ".*not.*lift.*because.*weak.*"

Filtering the extracted sentences. A kept sentence:
- Should not contain the original sentence
- Should contain all the words in the query (in any form)
- Should not be a partial sentence
The man could not lift his son because he was so weak.
Query: ".*not.*lift.*because.*weak.*"
Filtered sentences:
- She could not lift it off the floor because she is a weak girl
- She could not even lift her head because she was so weak
- I could not even lift my leg to turn over because the muscles were weak after surgery
- ...
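A sketch of the three filters; the completeness check for partial sentences is a crude stand-in for whatever the thesis actually uses:

import re

def keep(candidate: str, original: str, query: str) -> bool:
    """Apply the three filters above to one extracted sentence."""
    if original.lower() in candidate.lower():            # not the original sentence
        return False
    if not re.search(query, candidate, re.IGNORECASE):   # all query words, in order
        return False
    # Crude completeness heuristic: capitalized start and a minimum length.
    return candidate[:1].isupper() and len(candidate.split()) >= 5

original = "The man could not lift his son because he was so weak."
query = ".*not.*lift.*because.*weak.*"
hits = [
    "She could not lift it off the floor because she is a weak girl",
    "because was so weak",                                       # partial sentence
    "The man could not lift his son because he was so weak.",    # the original
]
print([s for s in hits if keep(s, original, query)])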

Parsing the Background Sentences
Each extracted sentence is parsed with the same semantic parser, e.g.: She could not lift it off the floor because she is a weak girl.

Logical Reasoning Engine

Logical Reasoning Engine
Inputs: the given sentence, the pronoun to be resolved, and the background knowledge sentences. The engine (a set of ASP rules) produces the answer.

- Uses Answer Set Programming (ASP)
- Represents the semantic representations of the given sentence and the background sentence as ASP predicates
- Applies ASP reasoning rules

Logical Reasoning Engine: Representing the Winograd and the Background Sentences
Winograd sentence: Ann asked Mary what time the library closes, but she had forgotten.
has(winograd,asked_2,agent,ann_1).
has(winograd,asked_2,recipient,mary_3).
has(winograd,asked_2,instance_of,ask).
...
Background sentence (as extracted from the web): But you asked me the security question but I forgotten.
has(background,asked_103,agent,you_102).
has(background,asked_103,instance_of,ask).
has(background,ask,superclass,communication).
...
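A sketch of how a semantic graph (as an edge list) could be serialized into these has/4 facts; the edges are copied from the Winograd example above:

# Serialize a semantic graph (edge list) into has/4 ASP facts.
def to_asp_facts(context, edges):
    return "\n".join(f"has({context},{src},{rel},{dst})." for src, rel, dst in edges)

winograd_edges = [
    ("asked_2", "agent", "ann_1"),
    ("asked_2", "recipient", "mary_3"),
    ("asked_2", "instance_of", "ask"),
]
print(to_asp_facts("winograd", winograd_edges))
# has(winograd,asked_2,agent,ann_1).
# has(winograd,asked_2,recipient,mary_3).
# has(winograd,asked_2,instance_of,ask).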

ASP rules capture general properties of the background and Winograd sentences:
- Reachability (transitivity within a context)
- Cross-context siblings (words belonging to the same class in different contexts)
- Negative polarity (words with a negation word associated with them)

Logical Reasoning Engine: Reachability
Example (background sentence): reachableFrom(background,asked_3,forgotten_10)

Reachability is the basic transitivity relationship between event nodes in a particular context.
reachableFrom(C,X,Y) :- has(C,X,REL,Y), context(C), eventRelation(REL).
reachableFrom(C,X,Z) :- reachableFrom(C,X,Y), has(C,Y,REL,Z), context(C), eventRelation(REL), X!=Y, Y!=Z.
Event relations from KM: causes, caused_by, defeats, defeated_by, enables, enabled_by, inhibits, inhibited_by, ... (15 more)

Logical Reasoning Engine: Cross-Context Siblings
Example: crossContextSiblings(asked_2,asked_3) pairs asked_2 in the Winograd sentence with asked_3 in the background sentence.

If words in different sentences (Winograd or background) are instances of the same conceptual class, they are defined as cross-context siblings.
crossContextSiblings(E1,E2) :- has(background,E1,instance_of,C), has(winograd,E2,instance_of,C), E1!=E2.

Logical Reasoning Engine: Negative Polarity
Example: negativePolarity(lift_4)

Words associated with a negation word like "not" are marked by the negativePolarity predicate.
negativePolarity(E) :- has(C,E,negative,_), context(C).

Logical Reasoning Engine: Type-Specific Reasoning
Type 1: Direct Causal Events
Ann asked Mary what time the library closes, but she had forgotten. Who had forgotten?
(Diagram: in the Winograd graph, EVENT1 has participants A and B and is causally linked to EVENT2, whose participant P is the pronoun; in the background graph, the matching events EVENT1' and EVENT2' share a participant X, which shows which role of EVENT1 the pronoun's referent fills.)

Type 1 example: matchingEvents(asked_2,forgotten_13,asked_3,forgotten_10) pairs the two Winograd events with their background counterparts.

Step 1: find events A and B in the Winograd graph such that B is reachable from A, together with their cross-context siblings A1 and B1 in the background graph such that B1 is reachable from A1, subject to the polarity conditions on the paired events.
matchingEvents(A,B,A1,B1) :- crossContextSiblings(A,A1), reachableFrom(winograd,A,B), crossContextSiblings(B,B1), reachableFrom(background,A1,B1), negativePolarity(A), not negativePolarity(B), negativePolarity(A1), not negativePolarity(B1).

Step 2 example: eventSubgraph(winograd,forgotten_13,agent,she_11), where she_11 is the pronoun to be resolved.

Step 2: extract the subgraph of the Winograd sentence that contains the pronoun to be resolved, the event in which it participates, and their relation.
eventSubgraph(winograd,B,S,X) :- matchingEvents(A,B,A1,B1), has(winograd,B,S,X), toBeResolved(X).

Step 3 example (background): eventSubgraph(background,forgotten_110,agent,entity1_109), where forgotten_110 is an instance of forget.

Step 3: extract the subgraph of the background sentence that contains the matching event of the event in which the pronoun participates in the Winograd sentence.
eventSubgraph(background,B1,S,X1) :- eventSubgraph(winograd,B,S,X), matchingEvents(A,B,A1,B1), has(background,B1,S,X1).

Step 4 example (background): the agent entity1_109 of forgotten_110 and the recipient entity1_104 of asked_103 are instances of the same entity class (entity1), with next_event linking asked_103 to forgotten_110, yielding eventPronounRelation(background,asked_103,recipient).

Step 4: extract the event and the relation from the background graph; they are used to derive the final answer.
eventPronounRelation(background,A1,S1) :- matchingEvents(A,B,A1,B1), eventSubgraph(background,B1,S,X1), has(background,A1,S1,X2), has(background,X1,instance_of,X), has(background,X2,instance_of,X).

Step 5 example (Winograd): hasCoreferent(she_11,mary_3)

Step 5: extract the coreferent of the pronoun to be resolved from the Winograd sentence graph.
hasCoreferent(P,X) :- eventPronounRelation(background,A1,S), matchingEvents(A,B,A1,B1), has(winograd,A,S,X), toBeResolved(P), P!=X.
Checking against the example: S = recipient and A = asked_2 give has(winograd,asked_2,recipient,mary_3), so the pronoun she_11 is resolved to mary_3.
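Putting the steps together, one plausible driver (a sketch, not the thesis's actual code) assumes the has/4 facts and the rules above have been written to a file program.lp and that the clingo ASP solver is installed:

import re
import subprocess

# Run clingo on the assembled program and read off the hasCoreferent/2 atom.
# Assumes clingo is on the PATH and program.lp holds the facts and rules.
out = subprocess.run(
    ["clingo", "program.lp", "--outf=0"],
    capture_output=True, text=True,
).stdout
match = re.search(r"hasCoreferent\((\w+),(\w+)\)", out)
if match:
    pronoun, referent = match.groups()
    print(f"{pronoun} -> {referent}")   # e.g. she_11 -> mary_3
else:
    print("no coreferent derived")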

Logical Reasoning Engine. Type 2: Causal Attributive
- The ASP implementation is similar to the Type 1 implementation
- Some additional type-specific rules are used along with the general rules

System Evaluation & Error Analysis

System Evaluation
- 282 total sentences in the WSC
- The causal category has more than 200 of them
- The causal sub-categories Type 1 and Type 2 combined have 100 sentences
Results:
Total number of sentences evaluated: 100
Answered: 80
Background knowledge not found: 20
Answered correctly: 70
Answered incorrectly: 10
Percentage correct: 87.5% of the answered sentences (70 of 80)

Error Analysis
- 20 out of the 100 sentences were not answered
- Suitable background knowledge was not found
Example: Mark ceded the presidency to John because he was less popular.

Error Analysis
- 10 out of the 80 answered sentences were answered incorrectly
- A deeper analysis of the background knowledge is required
Winograd sentence: Bob paid for Charlie's college education, he is very grateful.
Background sentence: I paid the price for my stupidity. How grateful I am.

Contributions and Future Work

Contributions
- Implemented a system to solve the Winograd Schema Challenge using background knowledge
- Implemented an approach to automatically extract commonsense knowledge
- Co-implemented a new semantic representation system (available online)

Future Work
- Solving the other WSC categories
- Participating in Nuance's competition
- Creating a commonsense knowledge base
- Solving reading comprehension and other problems

THANK YOU!!!
