1 CPE/CSC 481: Knowledge-Based Systems Dr. Franz J. Kurfess Computer Science Department Cal Poly

2 © Franz J. Kurfess Usage of the Slides ❖ these slides are intended for the students of my CPE/CSC 481 “Knowledge-Based Systems” class at Cal Poly SLO  if you want to use them outside of my class, please let me know ❖ I usually put together a subset for each quarter as a “Custom Show”  to view these, go to “Slide Show => Custom Shows”, select the respective quarter, and click on “Show”  in Apple Keynote, I use the “Hide” feature to achieve similar results ❖ To print them, I suggest using the “Handout” option  4, 6, or 9 per page works fine  Black & White should be fine; there are only a few diagrams where color is important

3 © Franz J. Kurfess Overview Reasoning and Uncertainty ❖ Motivation ❖ Objectives ❖ Sources of Uncertainty and Inexactness in Reasoning  Incorrect and Incomplete Knowledge  Ambiguities  Belief and Ignorance ❖ Probability Theory  Bayesian Networks  Certainty Factors  Belief and Disbelief  Dempster-Shafer Theory  Evidential Reasoning ❖ Important Concepts and Terms ❖ Chapter Summary

4 © Franz J. Kurfess Logistics ❖ Introductions ❖ Course Materials  textbooks (see below)  lecture notes  PowerPoint Slides will be available on my Web page  handouts  Web page  ❖ Term Project ❖ Lab and Homework Assignments ❖ Exams ❖ Grading

5 © Franz J. Kurfess Bridge-In

6 © Franz J. Kurfess Pre-Test

7 © Franz J. Kurfess Motivation ❖ reasoning for real-world problems involves missing knowledge, inexact knowledge, inconsistent facts or rules, and other sources of uncertainty ❖ while traditional logic in principle is capable of capturing and expressing these aspects, it is not very intuitive or practical  explicit introduction of predicates or functions ❖ many expert systems have mechanisms to deal with uncertainty  sometimes introduced as ad-hoc measures, lacking a sound foundation

8 © Franz J. Kurfess Objectives ❖ be familiar with various sources of uncertainty and imprecision in knowledge representation and reasoning ❖ understand the main approaches to dealing with uncertainty  probability theory  Bayesian networks  Dempster-Shafer theory  important characteristics of the approaches  differences between methods, advantages, disadvantages, performance, typical scenarios ❖ evaluate the suitability of those approaches  application of methods to scenarios or tasks ❖ apply selected approaches to simple problems

9 © Franz J. Kurfess Evaluation Criteria

10 © Franz J. Kurfess Introduction ❖ reasoning under uncertainty and with inexact knowledge  frequently necessary for real-world problems ❖ heuristics  ways to mimic heuristic knowledge processing  methods used by experts ❖ empirical associations  experiential reasoning  based on limited observations ❖ probabilities  objective (frequency counting)  subjective (human experience) ❖ reproducibility  will observations deliver the same results when repeated?

11 © Franz J. Kurfess Dealing with Uncertainty ❖ expressiveness  can concepts used by humans be represented adequately?  can the confidence of experts in their decisions be expressed? ❖ comprehensibility  representation of uncertainty  utilization in reasoning methods ❖ correctness  probabilities  adherence to the formal aspects of probability theory  relevance ranking  probabilities don’t add up to 1, but the “most likely” result is sufficient  long inference chains  tend to result in extreme (0,1) or not very useful (0.5) results ❖ computational complexity  feasibility of calculations for practical purposes

12 © Franz J. Kurfess Sources of Uncertainty ❖ data  data missing, unreliable, ambiguous,  representation imprecise, inconsistent, subjective, derived from defaults, … ❖ expert knowledge  inconsistency between different experts  plausibility  “best guess” of experts  quality  causal knowledge  deep understanding  statistical associations  observations  scope  only current domain, or more general

13 © Franz J. Kurfess Sources of Uncertainty (cont.) ❖ knowledge representation  restricted model of the real system  limited expressiveness of the representation mechanism ❖ inference process  deductive  the derived result is formally correct, but inappropriate  derivation of the result may take very long  inductive  new conclusions are not well-founded  not enough samples  samples are not representative  unsound reasoning methods  induction, non-monotonic, default reasoning

14 © Franz J. Kurfess Uncertainty in Individual Rules ❖ errors  domain errors  representation errors  inappropriate application of the rule ❖ likelihood of evidence  for each premise  for the conclusion  combination of evidence from multiple premises

15 © Franz J. Kurfess Uncertainty and Multiple Rules ❖ conflict resolution  if multiple rules are applicable, which one is selected?  explicit priorities, provided by domain experts  implicit priorities derived from rule properties  specificity of patterns, ordering of patterns, creation time of rules, most recent usage, … ❖ compatibility  contradictions between rules  subsumption  one rule is a more general version of another one  redundancy  missing rules  data fusion  integration of data from multiple sources

16 © Franz J. Kurfess Basics of Probability Theory ❖ mathematical approach for processing uncertain information ❖ sample space set X = {x1, x2, …, xn}  collection of all possible events  can be discrete or continuous ❖ probability number P(xi) reflects the likelihood that an event xi occurs  non-negative value in [0,1]  total probability of the sample space (sum of probabilities) is 1  for mutually exclusive events, the probability for at least one of them is the sum of their individual probabilities  experimental probability  based on the frequency of events  subjective probability  based on expert assessment

17 © Franz J. Kurfess Compound Probabilities ❖ describes independent events  do not affect each other in any way ❖ joint probability of two independent events A and B  P(A ∩ B) = n(A ∩ B) / n(S) = P(A) * P(B)  where n(S) is the number of elements in S ❖ union probability of two independent events A and B  P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = P(A) + P(B) - P(A) * P(B)
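As an illustration, here is a minimal Python sketch of the two formulas for independent events; the probability values are made-up examples.

```python
# Joint and union probability of two independent events A and B.
def joint_independent(p_a, p_b):
    """P(A ∩ B) = P(A) * P(B) for independent A and B."""
    return p_a * p_b

def union_independent(p_a, p_b):
    """P(A ∪ B) = P(A) + P(B) - P(A) * P(B) for independent A and B."""
    return p_a + p_b - p_a * p_b

p_a, p_b = 0.3, 0.5                    # assumed example probabilities
print(joint_independent(p_a, p_b))     # 0.15
print(union_independent(p_a, p_b))     # 0.65
```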

18 © Franz J. Kurfess Conditional Probabilities ❖ describes dependent events  affect each other in some way ❖ conditional probability of event A given that event B has already occurred P(A|B) = P(A ∩ B) / P(B)
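A small sketch of the definition, estimated from counts over an assumed sample space of equally likely outcomes (one roll of a fair die):

```python
# Conditional probability P(A|B) = P(A ∩ B) / P(B), estimated from counts.
sample_space = {1, 2, 3, 4, 5, 6}   # one roll of a fair die
A = {2, 4, 6}                       # event "even"
B = {4, 5, 6}                       # event "greater than 3"

p_b = len(B) / len(sample_space)             # 0.5
p_a_and_b = len(A & B) / len(sample_space)   # {4, 6} -> 1/3
p_a_given_b = p_a_and_b / p_b
print(round(p_a_given_b, 3))                 # 0.667
```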

19 © Franz J. Kurfess Advantages and Problems: Probabilities ❖ advantages  formal foundation  reflection of reality (a posteriori) ❖ problems  may be inappropriate  the future is not always similar to the past  inexact or incorrect  especially for subjective probabilities  ignorance  probabilities must be assigned even if no information is available  assigns an equal amount of probability to all such items  non-local reasoning  requires the consideration of all available evidence, not only from the rules currently under consideration  no compositionality  complex statements with conditional dependencies can not be decomposed into independent parts

20 © Franz J. Kurfess Bayesian Approaches ❖ derive the probability of a cause given a symptom ❖ has gained importance recently due to advances in efficiency  more computational power available  better methods ❖ especially useful in diagnostic systems  medicine, computer help systems ❖ inverse probability  inverse to conditional probability of an earlier event given that a later one occurred

21 © Franz J. Kurfess Bayes’ Rule for Single Event ❖ single hypothesis H, single event E  P(H|E) = (P(E|H) * P(H)) / P(E)  or ❖ P(H|E) = (P(E|H) * P(H)) / (P(E|H) * P(H) + P(E|¬H) * P(¬H))
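A minimal sketch of the second form of the rule; the prior, sensitivity, and false-positive rate below are hypothetical values chosen only for illustration.

```python
# Bayes' rule for a single hypothesis H and a single piece of evidence E:
# P(H|E) = P(E|H) * P(H) / (P(E|H) * P(H) + P(E|¬H) * P(¬H))
def bayes_single(p_e_given_h, p_h, p_e_given_not_h):
    numerator = p_e_given_h * p_h
    evidence = numerator + p_e_given_not_h * (1.0 - p_h)
    return numerator / evidence

# hypothetical diagnostic numbers: 1% prior, 90% sensitivity, 5% false positives
print(round(bayes_single(0.9, 0.01, 0.05), 4))   # ≈ 0.1538
```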

22 © Franz J. Kurfess Bayes’ Rule for Multiple Events ❖ multiple hypotheses Hi, multiple events E1, …, En  P(Hi | E1, E2, …, En) = (P(E1, E2, …, En | Hi) * P(Hi)) / P(E1, E2, …, En)  or  P(Hi | E1, E2, …, En) = (P(E1|Hi) * P(E2|Hi) * … * P(En|Hi) * P(Hi)) / Σk (P(E1|Hk) * P(E2|Hk) * … * P(En|Hk) * P(Hk))  with independent pieces of evidence Ei
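The following sketch applies the second form to a set of hypotheses with independent pieces of evidence; all hypothesis names, priors, and likelihoods are made up for illustration.

```python
# Posteriors P(Hi | E1, ..., En) for conditionally independent evidence.
def bayes_multi(priors, likelihoods):
    """priors: {hypothesis: P(Hi)}
    likelihoods: {hypothesis: [P(E1|Hi), ..., P(En|Hi)]}"""
    joint = {}
    for h, prior in priors.items():
        p = prior
        for likelihood in likelihoods[h]:
            p *= likelihood
        joint[h] = p
    total = sum(joint.values())                 # = P(E1, ..., En)
    return {h: p / total for h, p in joint.items()}

priors = {"flu": 0.1, "cold": 0.3, "healthy": 0.6}
likelihoods = {                                 # [P(fever|H), P(cough|H)]
    "flu": [0.9, 0.8],
    "cold": [0.4, 0.7],
    "healthy": [0.05, 0.1],
}
print(bayes_multi(priors, likelihoods))         # cold ≈ 0.53, flu ≈ 0.45, healthy ≈ 0.02
```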

23 © Franz J. Kurfess Using Bayesian Reasoning: Spam Filters ❖ Bayesian reasoning was used for early approaches to spam filtering

24 © Franz J. Kurfess Advantages and Problems of Bayesian Reasoning ❖ advantages  sound theoretical foundation  well-defined semantics for decision making ❖ problems  requires large amounts of probability data  sufficient sample sizes  subjective evidence may not be reliable  assumption of independent pieces of evidence often not valid  relationship between hypothesis and evidence is reduced to a number  explanations for the user difficult  high computational overhead

25 © Franz J. Kurfess Certainty Factors ❖ denotes the belief in a hypothesis H given that some pieces of evidence E are observed ❖ no statements about the belief means that no evidence is present  in contrast to probabilities, Bayes’ method ❖ works reasonably well with partial evidence  separation of belief, disbelief, ignorance ❖ shares some foundations with Dempster-Shafer (DS) theory, but is more practical  introduced in an ad-hoc way in MYCIN  later mapped to DS theory

26 © Franz J. Kurfess Belief and Disbelief ❖ measure of belief  degree to which hypothesis H is supported by evidence E  MB(H,E) = 1 if P(H) = 1; (P(H|E) - P(H)) / (1 - P(H)) otherwise ❖ measure of disbelief  degree to which doubt in hypothesis H is supported by evidence E  MD(H,E) = 1 if P(H) = 0; (P(H) - P(H|E)) / P(H) otherwise
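A sketch of both measures following the slide's formulas, with the ratios clipped at 0 so that supporting evidence contributes only to MB and refuting evidence only to MD (the usual MYCIN convention); the prior and posterior values are assumed examples.

```python
# Measure of belief and measure of disbelief in hypothesis H given evidence E.
def mb(p_h, p_h_given_e):
    if p_h == 1.0:
        return 1.0
    return max(p_h_given_e - p_h, 0.0) / (1.0 - p_h)

def md(p_h, p_h_given_e):
    if p_h == 0.0:
        return 1.0
    return max(p_h - p_h_given_e, 0.0) / p_h

print(mb(0.3, 0.8))   # 0.714... : the evidence raises P(H), so belief increases
print(md(0.3, 0.8))   # 0.0      : no support for doubt in H
```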

27 © Franz J. Kurfess Certainty Factor ❖ certainty factor CF  ranges between -1 (denial of the hypothesis H) and +1 (confirmation of H)  allows the ranking of hypotheses ❖ difference between belief and disbelief  CF(H,E) = MB(H,E) - MD(H,E) ❖ combining antecedent evidence  use of premises with less than absolute confidence  CF(H, E1 ∧ E2) = min(CF(H, E1), CF(H, E2))  CF(H, E1 ∨ E2) = max(CF(H, E1), CF(H, E2))  CF(H, ¬E) = -CF(H, E)
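A minimal sketch of the certainty factor and the antecedent combination rules; the CF values used in the calls are hypothetical.

```python
# Certainty factor and combination of antecedent evidence.
def cf(mb_val, md_val):
    """CF(H,E) = MB(H,E) - MD(H,E); ranges from -1 to +1."""
    return mb_val - md_val

def cf_and(cf1, cf2):
    """CF(H, E1 ∧ E2): take the minimum of the antecedent CFs."""
    return min(cf1, cf2)

def cf_or(cf1, cf2):
    """CF(H, E1 ∨ E2): take the maximum of the antecedent CFs."""
    return max(cf1, cf2)

def cf_not(cf1):
    """CF(H, ¬E) = -CF(H, E)."""
    return -cf1

print(cf(0.7, 0.1))        # 0.6
print(cf_and(0.6, 0.9))    # 0.6
print(cf_or(0.6, 0.9))     # 0.9
print(cf_not(0.6))         # -0.6
```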

28 © Franz J. Kurfess Combining Certainty Factors ❖ certainty factors that support the same conclusion ❖ several rules can lead to the same conclusion ❖ applied incrementally as new evidence becomes available ❖ CFrev(CFold, CFnew) =  CFold + CFnew * (1 - CFold) if both > 0  CFold + CFnew * (1 + CFold) if both < 0  (CFold + CFnew) / (1 - min(|CFold|, |CFnew|)) if one of them < 0
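A sketch of this incremental update for two rules that support the same conclusion; the example CF values are hypothetical.

```python
# MYCIN-style combination of certainty factors for the same conclusion.
def cf_combine(cf_old, cf_new):
    if cf_old > 0 and cf_new > 0:
        return cf_old + cf_new * (1 - cf_old)
    if cf_old < 0 and cf_new < 0:
        return cf_old + cf_new * (1 + cf_old)
    return (cf_old + cf_new) / (1 - min(abs(cf_old), abs(cf_new)))

print(cf_combine(0.6, 0.4))    # 0.76   : two supporting rules reinforce each other
print(cf_combine(0.6, -0.4))   # 0.33...: conflicting evidence weakens the belief
```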

29 © Franz J. Kurfess Characteristics of Certainty Factors
  Aspect           Probability       MB   MD   CF
  Certainly true   P(H|E) = 1         1    0   +1
  Certainly false  P(¬H|E) = 1        0    1   -1
  No evidence      P(H|E) = P(H)      0    0    0
❖ Ranges  measure of belief 0 ≤ MB ≤ 1  measure of disbelief 0 ≤ MD ≤ 1  certainty factor -1 ≤ CF ≤ +1

30 © Franz J. Kurfess Advantages and Problems of Certainty Factors ❖ Advantages  simple implementation  reasonable modeling of human experts’ belief  expression of belief and disbelief  successful applications for certain problem classes  evidence relatively easy to gather  no statistical base required ❖ Problems  partially ad hoc approach  theoretical foundation through Dempster-Shafer theory was developed later  combination of non-independent evidence unsatisfactory  new knowledge may require changes in the certainty factors of existing knowledge  certainty factors can become the opposite of conditional probabilities for certain cases  not suitable for long inference chains

31 © Franz J. Kurfess Dempster-Shafer Theory ❖ mathematical theory of evidence  uncertainty is modeled through a range of probabilities  instead of a single number indicating a probability  sound theoretical foundation  allows distinction between belief, disbelief, ignorance (non-belief)  certainty factors are a special case of DS theory

32 © Franz J. Kurfess DS Theory Notation ❖ environment Θ = {O1, O2, ..., On}  set of objects Oi that are of interest ❖ frame of discernment FD  an environment whose elements may be possible answers  only one answer is the correct one ❖ mass probability function m  assigns a value from [0,1] to every item in the frame of discernment  describes the degree of belief in analogy to the mass of a physical object ❖ mass probability m(A)  portion of the total mass probability that is assigned to a specific element A of FD

33 © Franz J. Kurfess Belief and Certainty ❖ belief Bel(A) in a set A  sum of the mass probabilities of all subsets of A, including A itself  all the mass that supports A  likelihood that one of its members is the conclusion  also called support function ❖ plausibility Pls(A)  maximum belief of A  upper bound for the range of belief ❖ certainty Cer(A)  interval [Bel(A), Pls(A)]  also called evidential interval  expresses the range of belief
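A small sketch of computing Bel and Pls from a mass assignment over an assumed three-element frame of discernment; the masses are made up and sum to 1.

```python
# Belief and plausibility over a hypothetical frame of discernment {a, b, c}.
FRAME = frozenset({"a", "b", "c"})
masses = {
    frozenset({"a"}): 0.5,
    frozenset({"a", "b"}): 0.25,
    FRAME: 0.25,         # mass assigned to the whole frame expresses ignorance
}

def bel(a):
    """Sum of the masses of all focal elements contained in A."""
    return sum(m for s, m in masses.items() if s <= a)

def pls(a):
    """Sum of the masses of all focal elements that intersect A."""
    return sum(m for s, m in masses.items() if s & a)

A = frozenset({"a", "b"})
print(bel(A), pls(A))    # 0.75 1.0  -> evidential interval [0.75, 1.0]
```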

34 © Franz J. Kurfess Combination of Mass Probabilities ❖ combining two masses in such a way that the new mass represents a consensus of the contributing pieces of evidence  set intersection puts the emphasis on common elements of evidence, rather than conflicting evidence ❖ m1 ⊕ m2(C) = Σ_{X ∩ Y = C} m1(X) * m2(Y) ❖ with normalization for conflicting evidence: m1 ⊕ m2(C) = Σ_{X ∩ Y = C} m1(X) * m2(Y) / (1 - Σ_{X ∩ Y = ∅} m1(X) * m2(Y))  where X, Y are hypothesis subsets  C is their intersection C = X ∩ Y  ⊕ is the orthogonal or direct sum
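The following sketch implements this combination rule for two mass assignments over the same frame; the masses and hypothesis names are hypothetical.

```python
# Dempster's rule of combination: multiply masses of intersecting focal
# elements, discard mass falling on the empty set, and renormalize.
from itertools import product

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        c = x & y
        if c:
            combined[c] = combined.get(c, 0.0) + mx * my
        else:
            conflict += mx * my
    k = 1.0 - conflict                 # normalization factor
    return {c: m / k for c, m in combined.items()}

FRAME = frozenset({"flu", "cold"})
m1 = {frozenset({"flu"}): 0.6, FRAME: 0.4}    # evidence source 1
m2 = {frozenset({"cold"}): 0.5, FRAME: 0.5}   # evidence source 2
print(combine(m1, m2))   # flu ≈ 0.43, cold ≈ 0.29, whole frame ≈ 0.29
```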

35 © Franz J. Kurfess Differences Probabilities - DS Theory
  Aspect                           Probabilities        Dempster-Shafer
  Aggregate sum                    Σi Pi = 1            m(Θ) ≤ 1
  Subset X ⊆ Y                     P(X) ≤ P(Y)          m(X) > m(Y) allowed
  Relationship X, ¬X (ignorance)   P(X) + P(¬X) = 1     m(X) + m(¬X) ≤ 1

36 © Franz J. Kurfess Evidential Reasoning ❖ extension of DS theory that deals with uncertain, imprecise, and possibly inaccurate knowledge ❖ also uses evidential intervals to express the confidence in a statement  lower bound is called support (Spt) in evidential reasoning, and belief (Bel) in Dempster-Shafer theory  upper bound is plausibility (Pls)

37 © Franz J. Kurfess Evidential Intervals
  Meaning                            Evidential Interval
  Completely true                    [1, 1]
  Completely false                   [0, 0]
  Completely ignorant                [0, 1]
  Tends to support                   [Bel, 1] where 0 < Bel < 1
  Tends to refute                    [0, Pls] where 0 < Pls < 1
  Tends to both support and refute   [Bel, Pls] where 0 < Bel ≤ Pls < 1
 Bel: belief; lower bound of the evidential interval  Pls: plausibility; upper bound

38 © Franz J. Kurfess Advantages and Problems of Dempster-Shafer ❖ advantages  clear, rigorous foundation  ability to express confidence through intervals  certainty about certainty  proper treatment of ignorance ❖ problems  non-intuitive determination of mass probability  very high computational overhead  may produce counterintuitive results due to normalization  usability somewhat unclear

39 © Franz J. Kurfess Post-Test

40 © Franz J. Kurfess Evaluation ❖ Criteria

41 © Franz J. Kurfess Important Concepts and Terms  Bayesian networks  belief  certainty factor  compound probability  conditional probability  Dempster-Shafer theory  disbelief  evidential reasoning  inference  inference mechanism  ignorance ❖ knowledge ❖ knowledge representation ❖ mass function ❖ probability ❖ reasoning ❖ rule ❖ sample ❖ set ❖ uncertainty

42 © Franz J. Kurfess Summary Reasoning and Uncertainty ❖ many practical tasks require reasoning under uncertainty  missing, inexact, inconsistent knowledge ❖ variations of probability theory are often combined with rule-based approaches  works reasonably well for many practical problems ❖ Bayesian networks have gained some prominence  improved methods, sufficient computational power

43 © Franz J. Kurfess