Lahore University of Management Sciences, Lahore, Pakistan
Dr. M.M. Awais, Computer Science Department

Lecture 12: Dealing With Uncertainty
Slide 1 outline: Probabilistic Approach, Frequency Interpretation, Subjective Interpretation

Slide 2: Frequency Interpretation

Under the frequency interpretation, probability is defined over a set of similar events. Suppose S is a set of objects, and an event corresponds to selecting an object from S. Divide S into two subsets P and N such that P ∩ N = ∅ and P ∪ N = S. Then the probability of selecting an object from P is |P|/|S|, and the probability of selecting from N is |N|/|S|, which equals |S−P|/|S|.
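A minimal sketch of this definition in Python; the set contents below are illustrative, not from the lecture:

    S = {"a", "b", "c", "d", "e"}         # set of all objects
    P = {"a", "b"}                        # one subset of S
    N = S - P                             # the complementary subset

    assert P & N == set() and P | N == S  # P and N partition S

    prob_P = len(P) / len(S)              # |P| / |S| = 0.4
    prob_N = len(N) / len(S)              # |N| / |S| = |S - P| / |S| = 0.6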

Slide 3: Subjective Interpretation

A subjective probability expresses a person's degree of belief in a proposition or in the occurrence of an event. For example, a well-prepared student may hold a higher degree of belief in passing the exam. Other probability approaches can be used alongside subjective probability to refine its interpretation.

Slide 4: Certainty Factors

Certainty Factor theory is based on subjective probability and is commonly used in expert systems to handle uncertainty. Define:
MB = measure of belief
MD = measure of disbelief
CF = certainty factor = MB - MD
If CF is 1, the evidence that the hypothesis is true is 100%. If CF is 0, the evidence is 0%. If CF approaches -1, the evidence of disbelief is 100%.

Slide 5: Algebra of CF

A rule's CF reflects the expert's confidence in the rule's reliability. Typical rule: IF A and B THEN C (A and B form the premise; C is the conclusion). Algebra for the premise: for premises P1, P2, ..., the CF values are calculated as follows.
AND operator: CF(P12) = CF(P1 and P2) = MIN(CF(P1), CF(P2)); CF(P12 and P3) = MIN(CF(P12), CF(P3)); and so on.
OR operator: CF(P12) = CF(P1 or P2) = MAX(CF(P1), CF(P2)); CF(P12 or P3) = MAX(CF(P12), CF(P3)); and so on.
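A minimal Python sketch of this premise algebra; the function names cf_and, cf_or, and cf_not are illustrative, and cf_not follows the negation rule CF(not P) = -CF(P) used in Example 1 below:

    def cf_and(*cfs):
        # CF of a conjunction is the minimum of its operands' CFs
        return min(cfs)

    def cf_or(*cfs):
        # CF of a disjunction is the maximum of its operands' CFs
        return max(cfs)

    def cf_not(cf):
        # CF of a negated premise is the negation of its CF
        return -cf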

Slide 6: Algebra

Suppose the rule is IF (P1 and P2) or P3 THEN C1, with CF(P1) = 0.6, CF(P2) = 0.4, CF(P3) = 0.2. The combined CF of the premise is CF(P) = MAX(MIN(CF(P1), CF(P2)), CF(P3)) = MAX(0.4, 0.2) = 0.4. Algebra for the overall rule: CF(Rule) = CF(P) * CF(C1).
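The same computation with the sketch functions above; the conclusion CF of 0.7 is an assumed value, since the slide leaves CF(C1) unspecified:

    cf_p = cf_or(cf_and(0.6, 0.4), 0.2)   # MAX(MIN(0.6, 0.4), 0.2) = 0.4
    cf_c1 = 0.7                           # assumed value for illustration
    cf_rule = cf_p * cf_c1                # 0.4 * 0.7 = 0.28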

Slide 7: Combining Rules

When two or more rules support the same conclusion (all CF values positive):
CF(R1) + CF(R2) - CF(R1)*CF(R2)
When two or more rules argue against the same conclusion (all CF values negative):
CF(R1) + CF(R2) + CF(R1)*CF(R2)
Otherwise (one positive, one negative):
[CF(R1) + CF(R2)] / [1 - MIN(|CF(R1)|, |CF(R2)|)]
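A sketch of these three cases in Python; the name combine_cf is illustrative:

    def combine_cf(cf1, cf2):
        if cf1 > 0 and cf2 > 0:
            # Both rules support the conclusion
            return cf1 + cf2 - cf1 * cf2
        if cf1 < 0 and cf2 < 0:
            # Both rules argue against the conclusion
            return cf1 + cf2 + cf1 * cf2
        # Mixed evidence
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))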

Slide 8: Example: MYCIN

IF the infection is primary_bacteremia (0.6)
AND the site of the culture is one of the sterile sites (0.5)
AND the suspected portal of entry is the gastrointestinal tract (0.8)
THEN there is suggestive evidence that the infection is bacteroid (0.7)

CF(P) = MIN(0.6, 0.5, 0.8) = 0.5
CF(R) = 0.5 * 0.7 = 0.35
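Checking the rule's CF with plain Python:

    cf_p = min(0.6, 0.5, 0.8)   # 0.5
    cf_r = cf_p * 0.7           # 0.35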

Slide 9: Example 1

Suppose we have the following rule:
R1: IF (P1 and P2 and P3) or (P4 and not P5) THEN C1 (0.7) and C2 (-0.5)
and the certainty factors of P1 through P5 are:
CF(P1) = 0.8, CF(P2) = 0.7, CF(P3) = 0.6, CF(P4) = 0.9, CF(P5) = -0.5.
What are the certainty factors associated with conclusions C1 and C2 after using rule R1?

Slide 10: Example 1 (Solution)

For P1 and P2 and P3, the CF is min(CF(P1), CF(P2), CF(P3)) = min(0.8, 0.7, 0.6) = 0.6. Call this CF_A.
For not P5, the CF is -CF(P5) = 0.5. For P4 and not P5, the CF is min(0.9, 0.5) = 0.5. Call this CF_B.
For (P1 and P2 and P3) or (P4 and not P5), the CF is max(CF_A, CF_B) = max(0.6, 0.5) = 0.6.
Thus CF(C1) = 0.7 * 0.6 = 0.42 and CF(C2) = -0.5 * 0.6 = -0.30.
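The same computation using the cf_and, cf_or, and cf_not sketches from Slide 5:

    cf_a = cf_and(0.8, 0.7, 0.6)        # 0.6
    cf_b = cf_and(0.9, cf_not(-0.5))    # min(0.9, 0.5) = 0.5
    cf_p = cf_or(cf_a, cf_b)            # 0.6
    cf_c1 = 0.7 * cf_p                  # 0.42
    cf_c2 = -0.5 * cf_p                 # -0.30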

Slide 11: Example 2

The final answers to the previous question are CF_R1(C1) = 0.42 and CF_R1(C2) = -0.30. Suppose that from another rule R2 we have the following certainty factors for C1 and C2: CF_R2(C1) = 0.7, CF_R2(C2) = -0.4. What are the certainty factors associated with C1 and C2 after combining the evidence from rules R1 and R2?

Slide 12: Example 2 (Solution)

For C1 (both values positive):
CF(C1) = CF_R1(C1) + CF_R2(C1) - CF_R1(C1)*CF_R2(C1) = 0.42 + 0.7 - 0.42*0.7 = 1.12 - 0.294 = 0.826
For C2 (both values negative):
CF(C2) = CF_R1(C2) + CF_R2(C2) + CF_R1(C2)*CF_R2(C2) = -0.3 + (-0.4) + (-0.3)*(-0.4) = -0.7 + 0.12 = -0.58
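The same result from the combine_cf sketch given after Slide 7:

    cf_c1 = combine_cf(0.42, 0.7)    # 0.826
    cf_c2 = combine_cf(-0.3, -0.4)   # -0.58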

Slide 13: Modifying the CF Values

If the CF value is to be revised after new information arrives, then:
CF(revised) = CF(old) + CF(new)*(1 - CF(old))
What is 1 - CF(old)? It is the amount of doubt remaining in the old evidence; the new evidence removes a proportional share of that doubt.
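As a short sketch; the name cf_revise and the sample values are illustrative, and this form assumes positive CFs as on the slide:

    def cf_revise(cf_old, cf_new):
        # New evidence removes a share of the remaining doubt (1 - cf_old)
        return cf_old + cf_new * (1 - cf_old)

    cf_revise(0.6, 0.5)   # 0.6 + 0.5 * 0.4 = 0.8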

Slide 14: Bayesian Approach

For a hypothesis H, P(H) gives the probability that H is true. If some evidence E bears on H, the probability of H given E is written P(H|E). This is referred to as conditional probability, defined as:
P(H|E) = P(H^E) / P(E)
where P(H^E) is the probability that both H and E are true.
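A numeric sketch of the definition; the probabilities are made-up values:

    p_h_and_e = 0.03                 # P(H ^ E), illustrative
    p_e = 0.05                       # P(E), illustrative
    p_h_given_e = p_h_and_e / p_e    # P(H|E) = 0.6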

Slide 15: Gathering Information

Conditional probabilities may be obtained from experts. Example: a doctor can estimate the probability of a heart attack given shooting pain in the arm. In practice it is easier to obtain data on people who have had heart attacks than on everyone who has had shooting pains, so P(E|H) is easier to estimate than P(H|E) directly. Thus we use Bayes' rule, which says:
P(H|E) = [P(E|H) * P(H)] / P(E)
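A minimal sketch of Bayes' rule as a function; the numbers are illustrative, not clinical figures:

    def bayes(p_e_given_h, p_h, p_e):
        # P(H|E) = P(E|H) * P(H) / P(E)
        return p_e_given_h * p_h / p_e

    bayes(0.7, 0.01, 0.05)   # 0.14, with made-up values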

Slide 16: Independence

What happens if more than one piece of evidence is available? Two pieces of evidence may or may not influence each other. If they are independent, then:
P(E1^E2) = P(E1) * P(E2) (example: successive coin tosses)
But generally the evidence items are not independent, so Bayes' rule is applied to the joint evidence:
P(H|E1^E2^...) = [P(E1^E2^...|H) * P(H)] / P(E1^E2^...)
where P(E1^E2^...) is the joint distribution of all the evidence.
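A tiny illustration of the independent case with two fair coin tosses:

    p_e1 = 0.5               # P(first toss is heads)
    p_e2 = 0.5               # P(second toss is heads)
    p_joint = p_e1 * p_e2    # independence: P(E1 ^ E2) = 0.25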

Slide 18: Likelihood Ratios

Prior odds: O(H) = P(H) / (1 - P(H))
Posterior odds: O(H|E) = P(H|E) / (1 - P(H|E))
Likelihood ratio (level of sufficiency): LS = P(E|H) / P(E|¬H)
Using the odds and likelihood-ratio definitions: O(H|E) = LS * O(H)
Multiple pieces of evidence (assumed conditionally independent): O(H|E1^E2^...) = (LS1 * LS2 * ...) * O(H)
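A small Python sketch of the odds-likelihood update; the function names are illustrative, and odds_to_prob uses the standard conversion P = O / (1 + O):

    def prior_odds(p_h):
        # O(H) = P(H) / (1 - P(H))
        return p_h / (1 - p_h)

    def posterior_odds(o_h, *ls_values):
        # O(H|E1^E2^...) = LS1 * LS2 * ... * O(H)
        result = o_h
        for ls in ls_values:
            result *= ls
        return result

    def odds_to_prob(o):
        # Convert odds back to a probability
        return o / (1 + o)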

Slide 19: Example

Suppose we have obtained the following likelihood ratios (LSs):

                 Measles LS   Mumps LS
    spots            15          10
    no spots         ...         ...
    high temp.        4           5
    no temp.         0.8         0.7

The prior odds are 0.1 for measles and 0.05 for mumps. Calculate the posterior odds of each disease for:
- spots and no temperature
- no spots and temperature
- no spots and no temperature

Slide 20: Example (Solution)

O(Measles|spots and no temp) = 0.1 * 15 * 0.8 = 1.2
O(Mumps|spots and no temp) = 0.05 * 10 * 0.7 = 0.35
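Checking this case with the odds sketch above and converting the posterior odds back to probabilities:

    o_measles = posterior_odds(0.1, 15, 0.8)   # 0.1 * 15 * 0.8 = 1.2
    o_mumps = posterior_odds(0.05, 10, 0.7)    # 0.05 * 10 * 0.7 = 0.35
    odds_to_prob(o_measles)                    # 1.2 / 2.2  ≈ 0.545
    odds_to_prob(o_mumps)                      # 0.35 / 1.35 ≈ 0.259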