Natural Language Processing by Reasoning and Learning
Pei Wang, Temple University, Philadelphia, USA


NLP in NARS: basics
- To represent linguistic knowledge in the same form as other knowledge
- To derive linguistic knowledge from the system's experience, using inference rules
- To treat language understanding and production as reasoning
- There is no separate "NLP module"

Knowledge Representation
- A term names a concept in the system
- A term may correspond to a sensation, an action, or a word in a language
- A term may be a compound formed from other terms
- Two terms linked by a copula form a statement indicating their substitutability
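To make this concrete, below is a minimal sketch in Python of terms, compound terms, and copula-linked statements. The class and field names are illustrative assumptions, not the actual NARS implementation; the copula string "-->" follows common Narsese notation for inheritance.

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class Term:
        name: str                          # a term names a concept in the system

    @dataclass(frozen=True)
    class Compound(Term):
        components: Tuple[Term, ...] = ()  # a term formed from other terms

    @dataclass(frozen=True)
    class Statement:
        subject: Term
        copula: str                        # e.g. "-->" (inheritance)
        predicate: Term                    # asserts substitutability of subject for predicate

    # the word "cat" and the concept cat, linked by the represent relation:
    word_cat = Term('"cat"')
    concept_cat = Term("cat")
    sentence = Statement(Compound("(cat * cat)", (word_cat, concept_cat)),
                         "-->", Term("represent"))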

Experience-Grounded Semantics
- The truth-value of a statement is a pair ‹frequency, confidence› in [0, 1] × (0, 1), indicating its evidential support
- Frequency is the proportion of positive evidence among all evidence; confidence is the proportion of currently available evidence among the evidence at an evidential horizon
- The meaning of a term is its experienced relations with other terms
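The pair can be computed from amounts of evidence. A minimal sketch, assuming the standard NAL definitions: with positive evidence w+, total evidence w, and evidential-horizon constant k (k = 1 below), frequency f = w+/w and confidence c = w/(w + k).

    def truth_from_evidence(w_plus: float, w: float, k: float = 1.0):
        frequency = w_plus / w       # proportion of positive evidence among all evidence
        confidence = w / (w + k)     # available evidence relative to the horizon
        return frequency, confidence

    # e.g. 9 observations, all positive, with k = 1:
    print(truth_from_evidence(9.0, 9.0))   # (1.0, 0.9), the <1, 0.9> of the inputs below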

Inference Rules
- NARS has rules for various types of inference, including deduction, induction, abduction, revision, choice, comparison, analogy, compound composition, etc.
- Each inference rule has a truth-value function that calculates the evidential support the premises provide to the conclusion
- Rules are "strong" or "weak" with respect to the confidence of their conclusions
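As an illustration of one strong and one weak rule, here is a sketch of the deduction and induction truth-value functions, in the formulation that reproduces the numbers in the example below (with k = 1); treat it as a reading of the published NAL functions, not a reference implementation.

    K = 1.0  # evidential horizon

    def deduction(f1, c1, f2, c2):
        # strong rule: confidence can approach that of the premises
        return f1 * f2, f1 * f2 * c1 * c2

    def induction(f1, c1, f2, c2):
        # weak rule: conclusion confidence is bounded by 1/(1+K), i.e. 0.5 here
        w = f2 * c1 * c2          # total evidence carried to the conclusion
        w_plus = f1 * w           # its positive part
        return w_plus / w, w / (w + K)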

Memory and Control
- NARS is based on the assumption of insufficient knowledge and resources: the system has finite capacity, works in real time, and is open to unanticipated tasks
- When processing a task, the system uses its knowledge selectively, and each concept involved contributes only part of its meaning
- Tasks are processed case by case (see the selection sketch below)
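Selective use of knowledge under time pressure is realized in NARS with priority-biased probabilistic selection (the "bag" structure described in NARS publications). Below is a toy sketch of the idea, not the actual data structure:

    import random

    def select(items):
        """Pick one entry from (item, priority) pairs; higher priority means
        more likely, but never guaranteed, to be chosen."""
        total = sum(p for _, p in items)
        r = random.uniform(0, total)
        for item, p in items:
            r -= p
            if r <= 0:
                return item
        return items[-1][0]

    tasks = [("parse input", 0.7), ("revise belief", 0.2), ("answer query", 0.1)]
    print(select(tasks))   # usually, but not always, "parse input"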

Memory as a Network
[Figure: memory shown as a network of terms. Internal term IDs (t135, t1978, t8734, t8762) are connected by "represent" links to words in different languages (bird, chicken, crow, raven; 鸟 "bird", 鸡 "chicken", 乌鸦 "crow") and to each other by Inheritance links.]

Architecture and Work Cycle

An Example
[1, input] {cat * cat} → represent ‹1, 0.9›
[2, input] {fish * fish} → represent ‹1, 0.9›
[3, input] {{cat * eat * fish} * ((cat * fish) → food)} → represent ‹1, 0.9›
[4, induction from 1&3] ({$1 * $2} → represent) ⇒ ({{$1 * eat * fish} * (($2 * fish) → food)} → represent) ‹1, 0.45›
[5, induction from 2&4] (({$1 * $2} → represent) ∧ ({$3 * $4} → represent)) ⇒ ({{$1 * eat * $3} * (($2 * $4) → food)} → represent) ‹1, 0.29›
[6, input] {dog * dog} → represent ‹1, 0.9›
[7, input] {meat * meat} → represent ‹1, 0.9›
[8, deduction from 4&6] {{dog * eat * fish} * ((dog * fish) → food)} → represent ‹1, 0.41›
[9, deduction from 5&7] {{dog * eat * meat} * ((dog * meat) → food)} → represent ‹1, 0.26›
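The derived truth-values above follow from the truth functions sketched earlier, restated here in reduced form (all premise frequencies are 1) so the snippet runs on its own:

    K = 1.0
    deduction_c = lambda c1, c2: c1 * c2                       # deduction confidence, f = 1
    induction_c = lambda c1, c2: (c1 * c2) / (c1 * c2 + K)     # induction confidence, f = 1

    print(induction_c(0.9, 0.9))    # ~0.45, step [4]
    print(induction_c(0.9, 0.45))   # ~0.29, step [5]
    print(deduction_c(0.45, 0.9))   # ~0.41, step [8]
    print(deduction_c(0.29, 0.9))   # ~0.26, step [9]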

Features
- Unified treatment of syntax, semantics, and pragmatics
- Does not depend on a given grammar or fixed grammatical categories, but represents grammatical knowledge at multiple levels
- Learning is online, one-shot, incremental, and life-long, and is carried out by reasoning
- Meaning is treated as experience-grounded and context-sensitive

Summary
- Both grammar and vocabulary can be learned from the experience of the system
- The meaning of a word should be determined by experience, rather than by denotation or definition
- It is possible to carry out NLP by a unified reasoning-learning mechanism, rather than by a separate module