Presentation transcript:

Planning: Chapter 7, article 7.4. Production Systems: Chapter 5, article 5.3. RBS: Chapter 7, article 7.2.

2 RBS CS 331/531 Dr M M Awais RBS: Handling Uncertainties. How do we handle vague concepts? Why does vagueness occur? All rules are not 100% deterministic: certain rules are often true but not always (a headache may be caused by flu, but does not always occur), and an expert may not always be sure about certain relations and associations.

3 RBS CS 331/531 Dr M M Awais First Source of Uncertainty: The Representation Language. There are many more states of the real world than can be expressed in the representation language. So, any state represented in the language may correspond to many different states of the real world, which the agent cannot distinguish. [figure: three physically different block arrangements, all described by On(A,B) ∧ On(B,Table) ∧ On(C,Table) ∧ Clear(A) ∧ Clear(C)]

4 RBS CS 331/531 Dr M M Awais First Source of Uncertainty: The Representation Language. 6 propositions On(x,y), where x, y = A, B, C and x ≠ y; 3 propositions On(x,Table), where x = A, B, C; 3 propositions Clear(x), where x = A, B, C. At most 2^12 states can be distinguished in the language [in fact far fewer, because of state constraints such as On(x,y) ⇒ ¬On(y,x)]. But there are infinitely many states of the real world.

5 RBS CS 331/531 Dr M M Awais Second source of Uncertainty: Imperfect Observation of the World. Observation of the world can be: Partial, e.g., a vision sensor can't see through obstacles (lack of percepts). [figure: two rooms R1 and R2; the robot in R1 may not know whether there is dust in room R2]

6 RBS CS 331/531 Dr M M Awais Second source of Uncertainty: Imperfect Observation of the World. Observation of the world can be: Partial, e.g., a vision sensor can't see through obstacles; Ambiguous, e.g., percepts have multiple possible interpretations, such as a percept consistent with On(A,B) ∨ On(A,C).

7 RBS CS 331/531 Dr M M Awais Second source of Uncertainty: Imperfect Observation of the World Observation of the world can be:  Partial, e.g., a vision sensor can’t see through obstacles  Ambiguous, e.g., percepts have multiple possible interpretations  Incorrect

8 RBS CS 331/531 Dr M M Awais Third Source of Uncertainty: Ignorance, Laziness, Efficiency. An action may have a long list of preconditions, e.g.: Drive-Car: P = Have(Keys) ∧ ¬Empty(Gas-Tank) ∧ Battery-Ok ∧ Ignition-Ok ∧ ¬Flat-Tires ∧ ¬Stolen(Car)... The agent's designer may be ignorant of some preconditions, or out of laziness or for efficiency may not want to include all of them in the action representation. The result is a representation that is either incorrect – executing the action may not have the described effects – or that describes several alternative effects.

9 RBS CS 331/531 Dr M M Awais Representation of Uncertainty. There are many models of uncertainty; we will consider two important ones. Non-deterministic model: uncertainty is represented by a set of possible values, e.g., a set of possible worlds or a set of possible effects. Probabilistic model: uncertainty is represented by a probability distribution over a set of possible values.

10 RBS CS 331/531 Dr M M Awais Example: Belief State. In the presence of non-deterministic sensory uncertainty, an agent's belief state represents all the states of the world that it thinks are possible at a given time or at a given stage of reasoning. In the probabilistic model of uncertainty, a probability is associated with each state to measure the likelihood that it is the actual state.
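
To make the contrast concrete, here is a minimal Python sketch (not from the lecture) of the two belief-state representations; the dust-in-room states and the 0.2/0.8 numbers are purely illustrative.

# Minimal sketch (not from the lecture): the two uncertainty models applied
# to the dust-in-room example. State names and numbers are illustrative.

# Non-deterministic model: the belief state is just a set of possible worlds.
belief_state = {"dust_in_R2", "no_dust_in_R2"}

# Probabilistic model: each possible world carries a probability.
belief_distribution = {"dust_in_R2": 0.2, "no_dust_in_R2": 0.8}

assert abs(sum(belief_distribution.values()) - 1.0) < 1e-9
most_likely = max(belief_distribution, key=belief_distribution.get)
print(most_likely)  # no_dust_in_R2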

11 RBS CS 331/531 Dr M M Awais What do probabilities mean? Probabilities have a natural frequency interpretation. The agent believes that if it were able to return many times to a situation where it has the same belief state, then the actual states in this situation would occur at a relative frequency defined by the probability distribution. [figure: a belief state in which one highlighted state is labelled "This state would occur 20% of the time"]

12 RBS CS 331/531 Dr M M Awais Example. Consider a world where a dentist agent D meets a new patient P. D is interested in only one thing: whether P has a cavity, which D models using the proposition Cavity. Before making any observation, D's belief state is: P(Cavity) = p, P(¬Cavity) = 1 - p. This means that D believes that a fraction p of patients have cavities.

13 RBS CS 331/531 Dr M M Awais Where do probabilities come from? Frequencies observed in the past, e.g., by the agent, its designer, or others. Symmetries, e.g.: if I roll a die, each of the 6 outcomes has probability 1/6. Subjectivism, e.g.: if I drive on Highway 280 at 120 mph, I will get a speeding ticket with probability 0.6. Principle of indifference: if there is no knowledge to consider one possibility more probable than another, give them the same probability.

14 RBS CS 331/531 Dr M M Awais Expert System: a system that mimics a human expert. Human experts, in most cases, have some vague (not very precise) ideas about the associations they use. Handling uncertainties is therefore an essential part of an expert system. Expert systems are RBSs with some level of uncertainty incorporated in the system.

15 RBS CS 331/531 Dr M M Awais Choosing a Problem. Costs: choose problems that justify the development cost of the expert system. Technical problems: choose a problem that is highly technical in nature; problems with well-formed knowledge are the best choice; the problem should require a highly specialized expert, and the solution time for the problem should not be short. Cooperation from an expert: experts must be willing to participate in the activity.

16 RBS CS 331/531 Dr M M Awais Choosing a Problem. Problems that are not suitable: problems for which experts are not available at all, or are not willing to participate; problems in which a great deal of common-sense knowledge is involved; problems which involve high physical skill.

17 RBS CS 331/531 Dr M M Awais ES Architecture. [figure: a user interface, explanation system, inference engine, knowledge base editor, case-specific data, and knowledge base, grouped inside an expert system shell]

18 RBS CS 331/531 Dr M M Awais ES Architecture. User interface: uses menus, NLP, etc., and is the component through which users interact with the system.

19 RBS CS 331/531 Dr M M Awais ES Architecture. Explanation system: explains why a decision is taken, using keywords such as HOW, WHY, etc. Inference engine: implements the reasoning methods, generally backward chaining. Knowledge base editor: updates the KB.

20 RBS CS 331/531 Dr M M Awais ES Architecture. Case-specific data: pre-solved problems and frequently referred cases. Knowledge base: collection of facts and rules.
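
Since the inference engine above is described as generally using backward chaining, here is a minimal, hedged sketch of that reasoning method over a toy knowledge base; the rules, facts, and function names are invented for illustration and are not the lecture's code.

# Toy backward-chaining sketch: rules map a conclusion to lists of premises,
# and facts hold the case-specific data.
rules = {
    "flu":       [["fever", "headache"]],   # IF fever AND headache THEN flu
    "stay_home": [["flu"]],                 # IF flu THEN stay_home
}
facts = {"fever", "headache"}

def prove(goal):
    """Try to establish `goal` by chaining backwards through the rules."""
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(prove(p) for p in premises):
            return True
    return False

print(prove("stay_home"))  # True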

21 RBS CS 331/531 Dr M M Awais Shells. A general-purpose toolkit/shell is problem independent. Shells are commercially available; CLIPS is one such shell, and it is freely available.

22 RBS CS 331/531 Dr M M Awais Reasoning with Uncertainty. Case studies: MYCIN, which implements the certainty-factors approach, and INTERNIST (modeling human problem solving), which implements the probability approach.

23 RBS CS 331/531 Dr M M Awais Probability based ES. Probability: degree of belief in a fact x, P(x). P(H): degree of belief in H when supporting evidence is NOT given; H is the hypothesis. P(H|E): degree of belief in H when supporting evidence is given; E is the evidence supporting the hypothesis. P(H|E) is the conditional probability.

24 RBS CS 331/531 Dr M M Awais Conditional Probability. P(H|E), the conditional probability, is given through a law (rule): P(H|E) = P(H ∧ E) / P(E), where P(H ∧ E) is the probability of H and E occurring together (both being TRUE).

25 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. P(H|E): P(Heart Attack | shooting arm pain). Two approaches can be adopted: asking an expert about the frequency with which it happens, or finding the probability from the given data. Second approach (see the sketch below): 1. Collect the data for all the patients complaining about shooting arm pain. 2. Find the proportion of those patients that had a heart attack.
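
A small Python sketch of this second approach, using made-up patient records; the field names and values are illustrative only.

# Estimate P(Heart Attack | shooting arm pain) as a relative frequency.
records = [
    {"arm_pain": True,  "heart_attack": True},
    {"arm_pain": True,  "heart_attack": False},
    {"arm_pain": True,  "heart_attack": True},
    {"arm_pain": False, "heart_attack": False},
]

with_pain = [r for r in records if r["arm_pain"]]        # step 1: patients with the symptom
with_both = [r for r in with_pain if r["heart_attack"]]  # step 2: of those, who had a heart attack

p_h_given_e = len(with_both) / len(with_pain)
print(p_h_given_e)  # 2/3 for this toy data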

26 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. P(H|E): P(Heart Attack | shooting arm pain), the probability of the disease given the symptom, versus P(E|H): P(shooting arm pain | Heart Attack), the probability of the symptom given the disease. Which of the two is easier to find? Perhaps P(E|H) is easier.

27 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. P(H|E): P(Heart Attack | shooting arm pain), the probability of the disease given the symptom. A headache is mostly experienced when a patient suffers from flu, fever, a tumor, etc. To find out whether a patient suffers from a tumor given a headache: collect the data for all the headache patients, and then find the proportion of those patients that have a tumor.

28 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. P(E|H): P(shooting arm pain | Heart Attack), the probability of the symptom given the disease. A headache is mostly experienced when a patient suffers from flu, fever, a tumor, etc. To find out whether a tumor patient suffers from headaches: collect the data for all the tumor patients, and then find the proportion of those patients that have headaches.

29 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. Generally speaking, P(E|H), e.g., P(shooting arm pain | Heart Attack), is easier to find. Therefore we need to express P(H|E) in terms of P(E|H): P(H|E) = P(H ∧ E) / P(E), which gives P(H|E) = [P(E|H) · P(H)] / P(E).
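
A hedged sketch of this conversion in Python; the three input probabilities are invented for illustration, not taken from the lecture.

def bayes(p_e_given_h, p_h, p_e):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

p_pain_given_attack = 0.7   # P(E|H): symptom given disease (easier to estimate)
p_attack = 0.01             # P(H): prior probability of the disease
p_pain = 0.05               # P(E): overall probability of the symptom

print(bayes(p_pain_given_attack, p_attack, p_pain))  # P(H|E) = 0.14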

30 RBS CS 331/531 Dr M M Awais Evaluating Conditional Probability. More than one piece of evidence, assuming independence of the events: P(H|E1 ∧ E2) = P(H ∧ E1 ∧ E2) / P(E1 ∧ E2), which under the independence assumptions becomes P(H|E1 ∧ E2) = [P(E1|H) · P(E2|H) · P(H)] / [P(E1) · P(E2)].
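
The same idea with several pieces of evidence, as a sketch; it relies on the independence assumptions stated above, and the numeric values are illustrative.

def posterior(p_e_given_h, p_e, p_h):
    """P(H | E1..En) = [prod P(Ei|H)] * P(H) / [prod P(Ei)], assuming independence."""
    num, den = p_h, 1.0
    for p_ei_h, p_ei in zip(p_e_given_h, p_e):
        num *= p_ei_h
        den *= p_ei
    return num / den

# Two pieces of evidence E1, E2 for a hypothesis H:
print(posterior(p_e_given_h=[0.8, 0.6], p_e=[0.3, 0.4], p_h=0.1))  # 0.4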

31 RBS CS 331/531 Dr M M Awais Inference through Joint Probability. Start with the joint probability distribution. [the full joint table for the toothache/cavity example is not reproduced in the transcript]

32 RBS CS 331/531 Dr M M Awais Inference by enumeration. Start with the joint probability distribution; to compute the marginal P(toothache), sum the joint entries in which toothache is true: P(toothache) = 0.2.

34 RBS CS 331/531 Dr M M Awais Inference by enumeration. Start with the joint probability distribution; we can also compute conditional probabilities: P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache) = 0.4.
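
The joint-distribution table these slides enumerate over did not survive in the transcript. The sketch below assumes the standard AIMA toothache/catch/cavity values, chosen because they reproduce the 0.2 and 0.4 results quoted above; treat the table entries as an assumption rather than part of the original deck.

joint = {
    # (cavity, toothache, catch): probability  -- assumed AIMA values
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def p(event):
    """Marginal probability of any event, by summing the matching joint entries."""
    return sum(pr for world, pr in joint.items() if event(world))

p_toothache = p(lambda w: w[1])
p_no_cavity_and_toothache = p(lambda w: (not w[0]) and w[1])
print(p_toothache)                              # 0.2
print(p_no_cavity_and_toothache / p_toothache)  # P(not cavity | toothache) = 0.4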

35 RBS CS 331/531 Dr M M Awais Certainty Factors (CF). CF for rules, CF(R): from the experts. CF for preconditions, CF(PC): from the end user. CF for conclusions, CF(cl): CF(cl) = CF(R) * CF(PC).

36 RBS CS 331/531 Dr M M Awais Certainty Factors (CF). CF for rules, CF(R): IF A THEN B, CF(R) = 0.6. CF for preconditions, CF(PC): IF A (0.4) THEN B, CF(A) = 0.4. CF for conclusions, CF(cl): CF(B) = CF(R) * CF(A) = 0.6 * 0.4 = 0.24.

37 RBS CS 331/531 Dr M M Awais Finding Overall CF for PC. If A (0.1) AND B (0.4) AND C (0.5) THEN D: overall CF(PC) = min(CF(A), CF(B), CF(C)) = 0.1. If A (0.1) OR B (0.4) OR C (0.5) THEN D: overall CF(PC) = max(CF(A), CF(B), CF(C)) = 0.5. (See the sketch below.)
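
A short sketch of the certainty-factor bookkeeping from the last three slides; the rule CF of 0.6 attached to the AND example is invented for illustration.

def cf_and(*cfs):
    """CF of an AND'ed precondition: minimum of the parts."""
    return min(cfs)

def cf_or(*cfs):
    """CF of an OR'ed precondition: maximum of the parts."""
    return max(cfs)

def cf_conclusion(cf_rule, cf_premise):
    """CF(cl) = CF(R) * CF(PC)."""
    return cf_rule * cf_premise

# IF A (0.4) THEN B, CF(R) = 0.6  (slide 36): CF(B) = 0.24
print(cf_conclusion(0.6, 0.4))

# IF A (0.1) AND B (0.4) AND C (0.5) THEN D, with an assumed CF(R) = 0.6
print(cf_conclusion(0.6, cf_and(0.1, 0.4, 0.5)))  # 0.06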

38 RBS CS 331/531 Dr M M Awais Combining Certainty Factors. When the conclusions are the same and both certainty factors are positive: CF(R1) + CF(R2) – CF(R1) * CF(R2). When the conclusions are the same and both certainty factors are negative: CF(R1) + CF(R2) + CF(R1) * CF(R2). Otherwise (the conclusions are the same but the CFs have different signs): [CF(R1) + CF(R2)] / [1 – min(|CF(R1)|, |CF(R2)|)].
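
A sketch of these combination rules for two rules that reach the same conclusion; the test values are illustrative.

def combine_cf(cf1, cf2):
    """Combine two certainty factors for the same conclusion."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 - cf1 * cf2
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 + cf1 * cf2
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(combine_cf(0.6, 0.4))    # both positive: 0.76
print(combine_cf(-0.6, -0.4))  # both negative: -0.76
print(combine_cf(0.6, -0.4))   # mixed signs: 0.2 / 0.6 = 0.333...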

39 RBS CS 331/531 Dr M M Awais Example Please see the class handouts

40 RBS CS 331/531 Dr M M Awais