Commonsense Reasoning in and over Natural Language. Hugo Liu and Push Singh, Media Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139, USA. Presented at the 8th International Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES 2004).

1 Commonsense Reasoning in and over Natural Language Hugo Liu Push Singh Media Laboratory Massachusetts Institute of Technology Cambridge, MA 02139, USA

2 Outline
- What is ConceptNet?
- Origin and Structure of ConceptNet
- Commonsense Reasoning in Natural Language?
- Methodology for Reasoning over Natural Language Concepts
- Conclusion

3 What is ConceptNet? (1/5) ConceptNet is the largest freely available, machine-usable commonsense resource. Structured as a network of semi-structured natural language fragments, ConceptNet presently consists of over 250,000 elements of commonsense knowledge.

4 What is ConceptNet? (2/5) As a result, we adopted the semantic network representation of WordNet, but extended the representation in several key ways. First, we extended WordNet's lexical notion of nodes to a conceptual notion of nodes, but kept the nodes in natural language, because one of the primary strengths of WordNet in the textual domain is that its knowledge representation is itself textual.

5 What is ConceptNet? (3/5) Second, we extended WordNet's small ontology of semantic relations, which are primarily taxonomic in nature, to include a richer set of relations appropriate to concept-level nodes.

6 What is ConceptNet? (4/5)

7 What is ConceptNet? (5/5) Third, we supplement the ConceptNet semantic network with a methodology for reasoning over semi-structured natural language fragments. Fourth, we supplement the ConceptNet semantic network with a toolkit and API which support making practical commonsense inferences about text, such as context finding, inference chaining, and conceptual analogy. → "Buy food" ∣ "Purchase groceries"

8 Origin and Structure of ConceptNet (1/4) Origin 1. ConceptNet is a machine-usable resource mined out of the Open Mind Common Sense (OMCS) corpus, a collection of nearly 700,000 English sentences of commonsense facts collected from over 14,000 contributors around the world over the past three years.

9 Origin and Structure of ConceptNet (2/4) 2. ConceptNet is produced by an automatic process that applies a set of 'commonsense extraction rules' to the semi-structured English sentences of the OMCS corpus. 3. A pattern-matching parser uses roughly 40 mapping rules to parse semi-structured sentences into an ontology of predicate relations whose arguments are short fragments of English. → The effect of [falling off a bike] is [you get hurt]
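The mapping-rule idea on this slide can be sketched as a small pattern-matching extractor. The two patterns and relation names below are illustrative assumptions, not the actual ~40 OMCS rules:

```python
import re

# Two illustrative mapping rules in the style the slide describes: a surface
# pattern maps a semi-structured English sentence to a binary predicate over
# two short natural language fragments. (Patterns and relation names are
# assumptions for illustration only.)
RULES = [
    (re.compile(r"^the effect of (.+) is (?:that )?(.+)$", re.I), "EffectOf"),
    (re.compile(r"^you can use (.+) to (.+)$", re.I), "UsedFor"),
]

def extract(sentence):
    """Apply each rule in turn; return (relation, arg1, arg2) or None."""
    sentence = sentence.strip().rstrip(".")
    for pattern, relation in RULES:
        m = pattern.match(sentence)
        if m:
            return (relation, m.group(1).strip(), m.group(2).strip())
    return None

print(extract("The effect of falling off a bike is you get hurt"))
# → ('EffectOf', 'falling off a bike', 'you get hurt')
```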

10 Structure 1. ConceptNet nodes are natural language fragments semi-structured to conform to preferred syntactic patterns which fall into three general classes: Noun Phrases, Attributes, and Activity Phrases. Origin and Structure of ConceptNet (3/4)

11 Origin and Structure of ConceptNet (4/4) 2. ConceptNet edges are described by an ontology of, at present, 19 binary relations, shown below in Table 2.
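A minimal sketch of this node-and-edge structure, assuming a simple adjacency-list graph whose edges carry relation labels (the relation names used in the example are illustrative):

```python
from collections import defaultdict

class ConceptNetGraph:
    """Minimal sketch: nodes are natural language fragments; each edge
    carries one of the binary relation labels."""
    def __init__(self):
        self.outgoing = defaultdict(list)   # node -> [(relation, target)]
        self.incoming = defaultdict(list)   # node -> [(relation, source)]

    def add_edge(self, relation, source, target):
        self.outgoing[source].append((relation, target))
        self.incoming[target].append((relation, source))

g = ConceptNetGraph()
g.add_edge("EffectOf", "fall off a bike", "get hurt")
g.add_edge("SubeventOf", "go to bed", "close eyes")
print(g.outgoing["go to bed"])   # → [('SubeventOf', 'close eyes')]
```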

12 Commonsense Reasoning in Natural Language? (1/4) Where does logic excel? Logic is precise: its symbols are unambiguous, and inference amounts to deductive theorem proving.

13 Commonsense Reasoning in Natural Language? (2/4) How do people represent and reason? 1. Human commonsense reasoning has a number of properties that distinguish it from traditional logical reasoning, and that have inspired various extensions to the logical approach. 2. One reason why humans are such successful inductive reasoners is that we collect a very large number of features in observing objects and events, thus providing a wealth of information from which patterns can emerge.

14 Commonsense Reasoning in Natural Language? (3/4) Natural language as a lingua franca for common sense 1. In our work with ConceptNet, we are exploring natural language as a lingua franca for representing common sense. 2. Consider that Cyc employs logicians to author its knowledge, while ConceptNet was automatically mined from a knowledge base of English sentences contributed over the web by 14,000 people, many of whom have no computer science background.

15 Commonsense Reasoning in Natural Language? (4/4) 3. By posing common sense in natural language, we can benefit from the implicit conceptual framework of human language. 4. Ambiguity is a particular aspect that must be dealt with when reasoning in natural language. 5. Whereas a logical symbol is concise, there may exist many different natural language fragments that mean essentially the same thing and seem equally suitable to include; in this sense, logic can be seen as more economical. → Red apple

16 Methodology for Reasoning over Natural Language Concepts (1/6) Computing conceptual similarity 1. To compute the meaning of a concept, the concept is first decomposed into first-order atomic concepts, while preserving the dependence relationships between them. "buy good cheese" → "buy", "good", "cheese"
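A naive sketch of such a decomposition, assuming a tiny hand-built part-of-speech lexicon in place of the real tagging machinery:

```python
# Naive illustration of decomposing a verb-phrase concept into first-order
# atoms while keeping dependency links. The tiny lexicon and dependency
# labels are assumptions; a real system would use an NLP toolkit.
LEXICON = {"buy": "verb", "good": "adj", "cheese": "noun"}

def decompose(concept):
    words = concept.split()
    atoms = [(w, LEXICON.get(w, "noun")) for w in words]
    # Dependencies: adjectives modify the head noun; the verb takes it as object.
    head = next(w for w, t in reversed(atoms) if t == "noun")
    deps = []
    for w, t in atoms:
        if t == "adj":
            deps.append((w, "modifies", head))
        elif t == "verb":
            deps.append((w, "object", head))
    return [w for w, _ in atoms], deps

atoms, deps = decompose("buy good cheese")
print(atoms)  # → ['buy', 'good', 'cheese']
print(deps)   # → [('buy', 'object', 'cheese'), ('good', 'modifies', 'cheese')]
```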

17 Methodology for Reasoning over Natural Language Concepts (2/6) 2. To compute the similarity of two concepts, we produce a heuristic score by comparing corresponding atoms of the two decomposed concepts using each of the semantic resources. Example: In WordNet's subsumption hierarchy, the inferential distance between "apple" and "food" is proportional to the 3 links separating them. → "apple" IsA "edible fruit" IsA "produce" IsA "food"
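The inferential-distance idea can be illustrated with a toy IsA chain taken directly from the slide's example (real lookups would traverse WordNet's hypernym hierarchy):

```python
# Toy illustration of inferential distance: count IsA steps between two
# concepts in a hand-built hypernym chain copied from the slide's example.
ISA = {
    "apple": "edible fruit",
    "edible fruit": "produce",
    "produce": "food",
}

def inferential_distance(concept, ancestor):
    steps = 0
    while concept != ancestor:
        if concept not in ISA:
            return None            # ancestor not reachable from concept
        concept = ISA[concept]
        steps += 1
    return steps

print(inferential_distance("apple", "food"))  # → 3
```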

18 Methodology for Reasoning over Natural Language Concepts (3/6) Flexible inference 1. Graph-based reasoning is associative and thus not as expressive, exact, or certain as logical inference, but it is much more straightforward to perform and useful for reasoning practically over text. 2. We demonstrate three kinds of flexible inference in ConceptNet: context finding, inference chaining, and conceptual analogy.

19 Methodology for Reasoning over Natural Language Concepts (4/6) Context finding - One task useful across many textual reasoning applications is determining the context around a concept, or around the intersection of several concepts. → GetContext() - The contextual neighborhood around a node is found by performing spreading activation from that source node, radiating outward to include other concepts. "go to bed" → "take off clothes", "go to sleep", "sleep", "lie down", "lay down", "close eyes", "turn off light", "dream", "brush tooth"
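A sketch of GetContext() as spreading activation over an undirected view of the graph; the edge list and the per-hop discount factor are illustrative assumptions:

```python
from collections import defaultdict

# Start with activation 1.0 at the source node and radiate outward,
# discounting per hop. Edges and the 0.5 discount are assumptions for
# illustration; the real toolkit weights edges by relation type.
EDGES = defaultdict(list)
for a, b in [("go to bed", "take off clothes"), ("go to bed", "go to sleep"),
             ("go to sleep", "close eyes"), ("go to sleep", "dream")]:
    EDGES[a].append(b)
    EDGES[b].append(a)    # treat edges as undirected for context finding

def get_context(source, discount=0.5, threshold=0.1):
    scores = {source: 1.0}
    frontier = [source]
    while frontier:
        nxt = []
        for node in frontier:
            for nbr in EDGES[node]:
                s = scores[node] * discount
                if s > scores.get(nbr, 0.0) and s >= threshold:
                    scores[nbr] = s
                    nxt.append(nbr)
        frontier = nxt
    del scores[source]    # report only the neighborhood, not the source
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(get_context("go to bed"))
```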

20 Methodology for Reasoning over Natural Language Concepts (5/6) Inference chaining - Another basic type of inference that can be performed on a graph is building inference chains: traversing the graph from one node to another via some path of connectedness. → FindPathsBetween() - For example, ConceptNet can generate all the temporal chains between "buy food" and "fall asleep." One such chain: "buy food" → "have food" → "eat food" → "feel full" → "feel sleepy" → "fall asleep."
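A sketch of FindPathsBetween() as a breadth-first search; the tiny temporal edge list mirrors the slide's example chain:

```python
from collections import deque

# Temporal edges copied from the slide's example chain.
TEMPORAL_EDGES = {
    "buy food": ["have food"],
    "have food": ["eat food"],
    "eat food": ["feel full"],
    "feel full": ["feel sleepy"],
    "feel sleepy": ["fall asleep"],
}

def find_path(start, goal):
    """Breadth-first search returning one shortest chain, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nbr in TEMPORAL_EDGES.get(path[-1], []):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(path + [nbr])
    return None

print(" → ".join(find_path("buy food", "fall asleep")))
# → buy food → have food → eat food → feel full → feel sleepy → fall asleep
```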

21 Methodology for Reasoning over Natural Language Concepts (6/6) Conceptual analogy - A third practical textual inference task is finding concepts which are structurally analogous. → GetAnalogousConcepts() - Structural analogy is not just a measure of semantic distance. Example concepts: "funeral", "wedding", "bride"
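A sketch of GetAnalogousConcepts(), assuming the simple view that two concepts are structurally analogous when they share outgoing (relation, target) edges; the edge data and relation names are illustrative, not actual ConceptNet content:

```python
# Score structural analogy by the overlap of outgoing labeled edges.
# (Edge data and relation names are assumptions for illustration.)
EDGE_SETS = {
    "wedding": {("LocationOf", "church"), ("IsA", "ceremony"), ("PartOf", "marriage")},
    "funeral": {("LocationOf", "church"), ("IsA", "ceremony")},
    "picnic":  {("LocationOf", "park")},
}

def get_analogous_concepts(concept):
    """Rank other concepts by how many (relation, target) edges they share."""
    base = EDGE_SETS[concept]
    scores = [(other, len(base & edges))
              for other, edges in EDGE_SETS.items() if other != concept]
    return sorted((s for s in scores if s[1] > 0), key=lambda kv: -kv[1])

print(get_analogous_concepts("wedding"))
# → [('funeral', 2)]   ("picnic" shares no edges, so it is dropped)
```

Note that this captures the slide's point that structural analogy differs from semantic distance: "funeral" ranks high because it participates in the same relations, not because it is a near-synonym of "wedding".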

22 Conclusion Presently the largest freely available commonsense resource, ConceptNet comes with a knowledge browser, and a preliminary set of tools to support several kinds of practical inferences over text. To maintain an easy-to-use knowledge representation, while at the same time incorporating more complex higher-order commonsense concepts and relations, we chose to represent concepts as semi-structured natural language fragments.