Knowledge Structure Vijay Meena (99005027) Gaurav Meena (00005020)


Introduction Human beings rely on a great deal of common sense - A child is always younger than its mother - You can push something with a straight stick But computers fail to recognize such facts. We can write programs that exceed the capability of human experts, yet we cannot write a program that matches a three-year-old child at recognizing objects.

Common sense knowledge Computers lack common sense. Why has it been so hard to give computers this common sense? - It involves a great amount of knowledge - It requires many kinds of representation - AI has become a gold mine of techniques - We do not want to give computers knowledge about particular specialist areas; we instead want to give them common sense.

Common sense is a problem of great scale and diversity, with two parts: 1. How do we give millions of pieces of knowledge to a computer? - We need a database. 2. How do we give computers the capacity for common sense reasoning? - Having a database is not enough - We still lack a clear understanding of how to represent, organize and use common sense knowledge

Some projects Cyc – started by Douglas Lenat in 1984 WordNet – started by George A. Miller; described in Fellbaum (1998) ConceptNet – started by an MIT Media Lab team after WordNet

Cyc – Cycorp Goal: to create the world's first true AI, having both common sense and the ability to reason with it. Cyc represents its common sense knowledge in a language called CycL. CycL is a logical language with second-order features such as quantification over predicates.
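To illustrate what "quantification over predicates" means, the following sketch uses Python functions as stand-in predicates. This only illustrates the second-order idea; real CycL is a logical language, not Python, and the toy axiom below is invented for illustration.

```python
# A sketch of second-order quantification, using Python functions as
# predicates. The domain and axioms here are illustrative only.

people = ["alice", "bob"]

def is_person(x):
    return x in people

def is_mortal(x):
    return is_person(x)  # toy axiom: every person is mortal

# First-order logic quantifies over individuals:
assert all(is_mortal(p) for p in people)

# Second-order logic can also quantify over the predicates themselves:
predicates = [is_person, is_mortal]

def holds_for_all(pred, domain):
    """True if predicate `pred` holds for every element of `domain`."""
    return all(pred(x) for x in domain)

print([holds_for_all(p, people) for p in predicates])  # [True, True]
```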

WordNet WordNet is an online lexical reference system It has a semantic network representation Its nodes carry lexical notation English nouns, verbs, adjectives and adverbs are organized into synonym sets (synsets), each representing one underlying lexical concept.
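The synset idea can be sketched in a few lines. The mini-lexicon below is hand-built for illustration (the synset ids imitate WordNet's naming style but are not taken from the real database):

```python
# A minimal sketch of WordNet-style synonym sets (synsets).
# Each synset groups the words that express one underlying lexical
# concept; a polysemous word like "car" appears in several synsets.

synsets = {
    "car.n.01": {"words": {"car", "auto", "automobile"}, "pos": "noun",
                 "gloss": "a motor vehicle with four wheels"},
    "car.n.02": {"words": {"car", "railcar"}, "pos": "noun",
                 "gloss": "a wheeled vehicle adapted to rails"},
}

def senses(word):
    """Return the ids of all synsets in which `word` appears."""
    return sorted(s for s, data in synsets.items() if word in data["words"])

print(senses("car"))      # both senses of "car"
print(senses("railcar"))  # only the rail sense
```

The real WordNet additionally links synsets by semantic relations (hypernymy, antonymy, etc.), which is what makes it a semantic network rather than a plain dictionary.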

ConceptNet A very large semantic network of common sense knowledge Structured as a network of semi-structured natural language fragments Captures a wide range of common sense concepts and relations Ease of use comparable to WordNet.

ConceptNet Presently consists of 250,000 elements of common sense knowledge Combines the best of both Cyc and WordNet Extends the semantic network representation of WordNet.

WordNet vs ConceptNet WordNet: lexical notation of nodes; small ontology of semantic relations. ConceptNet: conceptual notation of nodes; richer set of relations appropriate to concept-level nodes. At present there are 19 semantic relations used in ConceptNet, representing different categories.
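The contrast can be made concrete with typed triples. The relation names below (IsA, UsedFor, CapableOf, LocationOf) are drawn from ConceptNet's relation vocabulary, but the specific triples are illustrative examples, not real ConceptNet data:

```python
# A sketch of ConceptNet-style concept-level relations as typed triples
# (relation, concept1, concept2). Unlike WordNet's lexical relations,
# the arguments are concepts such as activity phrases ("eat food").

triples = [
    ("IsA", "apple", "fruit"),
    ("UsedFor", "fork", "eat food"),
    ("CapableOf", "bird", "fly"),
    ("LocationOf", "san francisco", "california"),
]

def relations_about(concept):
    """All triples in which `concept` appears as either argument."""
    return [t for t in triples if concept in (t[1], t[2])]

print(relations_about("fork"))
```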

Representation of Knowledge Logical representation: unambiguous Natural language representation: ambiguous Maintaining some ambiguity lends us greater flexibility Needs a methodology for reasoning over natural language fragments

ConceptNet Focus on the knowledge representation aspects of ConceptNet

Origin ConceptNet is a machine-usable resource mined from the Open Mind Common Sense (OMCS) corpus It is produced by an automatic process which applies a set of 'commonsense extraction rules' to the semi-structured English sentences of OMCS, mining predicate-argument structures from them.

Structure The semi-structured natural-language fragment nodes fall into three general classes: Noun Phrases (things, places, people) Attributes (modifiers) Activity Phrases (actions, and actions compounded with a noun phrase or prepositional phrase)

Node class       | Portion of the grammar                                          | Examples of valid nodes
Noun Phrases     | NOUN, NOUN NOUN, ADJ NOUN, NOUN PREP NOUN                       | "apple"; "San Francisco"; "fast car"; "life of party"
Attributes       | ADJ, ADV ADJ                                                    | "red"; "very red"
Activity Phrases | VERB, VERB NOUN, ADV VERB, VERB PREP NOUN, VERB NOUN PREP NOUN  | "eat"; "eat cookie"; "quickly eat"; "get into accident"; "eat food with fork"
Grammar for partially structuring natural language concepts
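The grammar above can be sketched as a classifier over POS-tag sequences. The pattern list is a simplification of the full node grammar, and `classify` is a hypothetical helper, not part of ConceptNet itself:

```python
# A sketch of ConceptNet's node grammar: map a sequence of POS tags
# to one of the three node classes, or None if no pattern matches.

NODE_PATTERNS = {
    "Noun Phrase": [("NOUN",), ("NOUN", "NOUN"), ("ADJ", "NOUN"),
                    ("NOUN", "PREP", "NOUN")],
    "Attribute": [("ADJ",), ("ADV", "ADJ")],
    "Activity Phrase": [("VERB",), ("VERB", "NOUN"), ("ADV", "VERB"),
                        ("VERB", "PREP", "NOUN"),
                        ("VERB", "NOUN", "PREP", "NOUN")],
}

def classify(tags):
    """Return the node class whose grammar matches `tags`, or None."""
    for node_class, patterns in NODE_PATTERNS.items():
        if tuple(tags) in patterns:
            return node_class
    return None

print(classify(["ADJ", "NOUN"]))                   # e.g. "fast car"
print(classify(["VERB", "NOUN", "PREP", "NOUN"]))  # e.g. "eat food with fork"
```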

Semantic Relation Types currently in ConceptNet Categories: Things, Events, Actions, Spatial, Goals, Functions, Generic

Methodology for Reasoning over Natural Language Concepts Computing conceptual similarity Flexible Inference: context finding, inference chaining, conceptual analogy.
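Inference chaining can be sketched as a graph search over relation edges. The tiny knowledge base and the relation labels on its edges are illustrative, not real ConceptNet data:

```python
# A sketch of ConceptNet-style inference chaining: find a path of
# relation edges connecting one concept to another via breadth-first
# search over a toy knowledge base.

from collections import deque

edges = {
    "cook food": [("Requires", "buy food")],
    "buy food": [("UsedFor", "supermarket")],
    "supermarket": [("LocationOf", "city")],
}

def chain(start, goal):
    """Return a list of (concept, relation, concept) steps from
    `start` to `goal`, or None if no chain exists."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        concept, path = queue.popleft()
        if concept == goal:
            return path
        for rel, nxt in edges.get(concept, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [(concept, rel, nxt)]))
    return None

print(chain("cook food", "city"))
```

Conceptual similarity and analogy can be built on the same graph, e.g. by comparing the sets of relations two concepts participate in.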

Conclusion To support several kinds of practical inferences over text To maintain an easy-to-use knowledge representation ConceptNet follows the easy-to-use semantic network structure of WordNet, but incorporates a greater diversity of relations and concepts inspired by Cyc.

References Liu, H. & Singh, P. (2004). ConceptNet: a practical commonsense reasoning tool-kit. BT Technology Journal, 22(4).