CPSC 433 : Artificial Intelligence Tutorials T01 & T02

CPSC 433 Artificial Intelligence CPSC 433 : Artificial Intelligence Tutorials T01 & T02
Andrew “M” Kuipers
note: please include [cpsc 433] in the subject line of any emails regarding this course

CPSC 433 Artificial Intelligence Expert Systems
Designed to function similarly to a human expert operating within a specific problem domain
Used to:
–Provide an answer to a particular problem, or
–Clarify uncertainties where a human expert would normally be consulted
Often created to operate in conjunction with humans working within the given problem domain, rather than as a replacement for them

CPSC 433 Artificial Intelligence Components of an Expert System
Knowledge Base
–Stores the knowledge used by the system, usually represented in a formal logical manner
Inference System
–Defines how existing knowledge may be used to derive new knowledge
Search Control
–Determines which inference to apply at a given stage of the deduction
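
To make the separation of concerns concrete, here is a minimal structural sketch of the three components in Python. It is only an illustration of how they might be organised, not part of the course material; the class and method names (KnowledgeBase, InferenceSystem, SearchControl) are assumptions.

```python
# A minimal structural sketch of the three components of an expert system.
# Class and method names are assumptions made for illustration only.

class KnowledgeBase:
    """Stores the knowledge used by the system (rules and facts)."""
    def __init__(self, rules):
        self.rules = list(rules)

class InferenceSystem:
    """Defines how existing knowledge may be used to derive new knowledge."""
    def candidate_inferences(self, kb):
        raise NotImplementedError

class SearchControl:
    """Determines which candidate inference to apply at a given stage."""
    def choose(self, candidates):
        raise NotImplementedError
```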

CPSC 433 Artificial Intelligence Knowledge Representation
For now, we’ll use a simple If … Then … consequence relation with English semantics
e.g.: If [it is raining] Then [I should wear a coat]
–[it is raining] is the antecedent of the relation
–[I should wear a coat] is the consequent of the relation
Facts can be understood as consequence relations with an empty antecedent
–e.g.: “If [] Then [it is raining]” is equivalent to the fact that [it is raining]
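
A minimal sketch of how such consequence relations could be stored in code is shown below. The representation (a Rule with a tuple of antecedents and a single consequent, where a fact is a rule with an empty antecedent tuple) is an assumption made for illustration, not prescribed by the slides.

```python
# Assumed representation of an If ... Then ... consequence relation.
# A fact is simply a Rule whose antecedents tuple is empty.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    antecedents: tuple   # e.g. ("it is raining",); empty tuple for a plain fact
    consequent: str      # e.g. "I should wear a coat"

coat_rule = Rule(("it is raining",), "I should wear a coat")
rain_fact = Rule((), "it is raining")   # "If [] Then [it is raining]"
```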

CPSC 433 Artificial Intelligence Inferring New Knowledge
New knowledge can be constructed from existing knowledge using inference rules
For instance, the inference rule modus ponens can be used to derive the consequent of a consequence relation, given that its antecedent is true
e.g.:
–k1: If [it is raining] Then [I should wear a coat]
–k2: [it is raining]
–result: [I should wear a coat]
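
As a small illustration, the sketch below applies modus ponens over a list of (antecedents, consequent) pairs. The function name apply_modus_ponens and the data layout are assumptions for illustration only.

```python
def apply_modus_ponens(rules, facts):
    """Return the consequent of every rule whose antecedents all hold."""
    derived = set()
    for antecedents, consequent in rules:
        if all(a in facts for a in antecedents):
            derived.add(consequent)
    return derived

k1 = (("it is raining",), "I should wear a coat")  # If [it is raining] Then [I should wear a coat]
facts = {"it is raining"}                          # k2: [it is raining]
print(apply_modus_ponens([k1], facts))             # {'I should wear a coat'}
```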

CPSC 433 Artificial Intelligence Goal Directed Reasoning
Inference rules are applied to the knowledge base in order to achieve a particular goal
The goal in an expert system is posed as a question, or query, to which we want the answer
e.g.: [I should wear a coat]?
–note: this would read more naturally in English as “should I wear a coat?”, but we want to use the same propositional symbol as in our knowledge base
The goal of the search is to determine an answer to the query, which may be Boolean as above or more complex
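
A query may also contain a variable to be bound by the search, such as Y in [Fritz is colored Y]? in the examples that follow. A toy sketch of matching such a query against a fact is given below; treating single capital letters as variables is an assumption made purely for illustration.

```python
def match(pattern, fact, bindings=None):
    """Bind single-capital-letter variables in pattern so that it equals fact."""
    bindings = dict(bindings or {})
    p_words, f_words = pattern.split(), fact.split()
    if len(p_words) != len(f_words):
        return None                                   # cannot possibly match
    for p, f in zip(p_words, f_words):
        if len(p) == 1 and p.isupper():               # a variable such as X or Y
            if bindings.setdefault(p, f) != f:        # conflicting earlier binding
                return None
        elif p != f:                                  # constant words must agree
            return None
    return bindings

print(match("Fritz is colored Y", "Fritz is colored green"))  # {'Y': 'green'}
```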

CPSC 433 Artificial Intelligence Forward Chaining
Forward chaining is a data-driven method of deriving a particular goal from a given knowledge base and set of inference rules
Inference rules are applied by matching facts to the antecedents of the consequence relations in the knowledge base
The application of inference rules results in new knowledge (from the consequents of the relations matched), which is then added to the knowledge base

CPSC 433 Artificial Intelligence Forward Chaining
Inference rules are successively applied to elements of the knowledge base until the goal is reached
A search control method is needed to select which element(s) of the knowledge base to apply the inference rule to at any point in the deduction
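
Before the worked example on the following slides, here is a minimal forward-chaining sketch under the representation assumed above (rules as (antecedent patterns, consequent pattern) pairs, single capital letters as variables). The names forward_chain, match, substitute and satisfy are illustrative assumptions, the matcher is repeated so the block stands alone, and the search control is simply "apply every applicable rule until nothing new can be derived".

```python
def match(pattern, fact, bindings):
    """Bind single-capital-letter variables in pattern so that it equals fact."""
    bindings = dict(bindings)
    p_words, f_words = pattern.split(), fact.split()
    if len(p_words) != len(f_words):
        return None
    for p, f in zip(p_words, f_words):
        if len(p) == 1 and p.isupper():
            if bindings.setdefault(p, f) != f:
                return None
        elif p != f:
            return None
    return bindings

def substitute(pattern, bindings):
    """Replace bound variables in pattern by their values."""
    return " ".join(bindings.get(w, w) for w in pattern.split())

def satisfy(antecedents, facts, bindings):
    """Yield every way of matching all antecedents against known facts."""
    if not antecedents:
        yield bindings
        return
    for fact in facts:
        extended = match(antecedents[0], fact, bindings)
        if extended is not None:
            yield from satisfy(antecedents[1:], facts, extended)

def forward_chain(rules, facts, goal):
    """Data-driven search: derive new facts until the goal can be answered."""
    facts = set(facts)
    changed = True
    while changed:
        for fact in facts:                         # does a known fact answer the goal?
            answer = match(goal, fact, {})
            if answer is not None:
                return answer
        changed = False
        for antecedents, consequent in rules:
            for bindings in list(satisfy(antecedents, facts, {})):
                new_fact = substitute(consequent, bindings)
                if new_fact not in facts:          # new knowledge: add it to the KB
                    facts.add(new_fact)
                    changed = True
    return None

rules = [
    (("X croaks and eats flies",), "X is a frog"),
    (("X chirps and sings",), "X is a canary"),
    (("X is a frog",), "X is colored green"),
    (("X is a canary",), "X is colored yellow"),
]
print(forward_chain(rules, ["Fritz croaks and eats flies"], "Fritz is colored Y"))
# -> {'Y': 'green'}
```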

CPSC 433 Artificial Intelligence Forward Chaining Example
Knowledge Base:
–If [X croaks and eats flies] Then [X is a frog]
–If [X chirps and sings] Then [X is a canary]
–If [X is a frog] Then [X is colored green]
–If [X is a canary] Then [X is colored yellow]
–[Fritz croaks and eats flies]
Goal:
–[Fritz is colored Y]?

CPSC 433 Artificial Intelligence Forward Chaining Example
Derivation:
–[Fritz croaks and eats flies] matches the antecedent of If [X croaks and eats flies] Then [X is a frog] with X = Fritz, so [Fritz is a frog] is derived and added to the knowledge base
–[Fritz is a frog] matches the antecedent of If [X is a frog] Then [X is colored green], so [Fritz is colored green] is derived and added to the knowledge base
–[Fritz is colored green] matches the goal [Fritz is colored Y]?, so the search succeeds with Y = green

CPSC 433 Artificial Intelligence Backward Chaining
Backward chaining is a goal-driven method of deriving a particular goal from a given knowledge base and set of inference rules
Inference rules are applied by matching the goal of the search against the consequents of the relations stored in the knowledge base
When such a relation is found, its antecedent is added to the list of goals (and not to the knowledge base, as is done in forward chaining)

CPSC 433 Artificial Intelligence Backward Chaining
Search proceeds in this manner until a goal can be matched against a fact in the knowledge base
–Remember: facts are simply consequence relations with empty antecedents, so this is like adding the ‘empty goal’ to the list of goals
As with forward chaining, a search control method is needed to select which goals will be matched against which consequence relations from the knowledge base
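
A backward-chaining sketch over the same assumed representation is given below. Because the query itself contains a variable (Y), a slightly more general two-way matcher (unify) is used; all names are illustrative assumptions, the goal list is explored depth-first as one possible search control, and rule variables are not renamed apart, which is enough for the toy example that follows.

```python
def unify(a, b, bindings):
    """Two-way match of word patterns; single capital letters are variables."""
    bindings = dict(bindings)
    a_words, b_words = a.split(), b.split()
    if len(a_words) != len(b_words):
        return None
    for x, y in zip(a_words, b_words):
        x, y = bindings.get(x, x), bindings.get(y, y)   # follow existing bindings
        if x == y:
            continue
        if len(x) == 1 and x.isupper():
            bindings[x] = y
        elif len(y) == 1 and y.isupper():
            bindings[y] = x
        else:
            return None
    return bindings

def substitute(pattern, bindings):
    """Replace bound variables in pattern by their values."""
    return " ".join(bindings.get(w, w) for w in pattern.split())

def backward_chain(rules, facts, goals, bindings=None):
    """Goal-driven search: return bindings proving all goals, or None."""
    bindings = dict(bindings or {})
    if not goals:                                       # the 'empty goal': success
        return bindings
    goal = substitute(goals[0], bindings)
    for fact in facts:                                  # goal matched by a fact?
        extended = unify(goal, fact, bindings)
        if extended is not None:
            result = backward_chain(rules, facts, goals[1:], extended)
            if result is not None:
                return result
    for antecedents, consequent in rules:               # goal matches a consequent?
        extended = unify(consequent, goal, bindings)
        if extended is not None:                        # antecedents become new goals
            result = backward_chain(rules, facts,
                                    list(antecedents) + list(goals[1:]), extended)
            if result is not None:
                return result
    return None

rules = [
    (("X croaks and eats flies",), "X is a frog"),
    (("X chirps and sings",), "X is a canary"),
    (("X is a frog",), "X is colored green"),
    (("X is a canary",), "X is colored yellow"),
]
print(backward_chain(rules, ["Fritz croaks and eats flies"], ["Fritz is colored Y"]))
# -> {'X': 'Fritz', 'Y': 'green'}
```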

CPSC 433 Artificial Intelligence Backward Chaining Example
Knowledge Base:
–If [X croaks and eats flies] Then [X is a frog]
–If [X chirps and sings] Then [X is a canary]
–If [X is a frog] Then [X is colored green]
–If [X is a canary] Then [X is colored yellow]
–[Fritz croaks and eats flies]
Goals:
–[Fritz is colored Y]?

CPSC 433 Artificial Intelligence Backward Chaining Example
Derivation:
–The goal [Fritz is colored Y] matches the consequent of If [X is a frog] Then [X is colored green], so the antecedent [X is a frog] is added to the list of goals
–The goal also matches the consequent of If [X is a canary] Then [X is colored yellow], so [X is a canary] is added to the list of goals as an alternative
–The goal [X is a frog] matches the consequent of If [X croaks and eats flies] Then [X is a frog], so [X croaks and eats flies] is added to the list of goals
–The goal [X croaks and eats flies] matches the fact [Fritz croaks and eats flies], so the chain of goals is satisfied and the query is answered with X = Fritz, Y = green