For Monday after Spring Break Read 14.1-14.5. Homework: –Chapter 13, exercises 6 and 8. May be done in pairs.

Program 2 Any questions?

Homework
Water is a liquid between 0 and 100 degrees.
∀w,s  w ∈ Water ∧ (Centigrade(0) < Temperature(w,s) < Centigrade(100)) ⇒ T(w ∈ Liquid, s)
Water boils at 100 degrees.
SBoilingPoint(c,bp) ⇔ ∀x,s  x ∈ c ⇒ (∀t  t = Temperature(x,s) ∧ t > bp ⇒ T(x ∈ Gas, s))
SBoilingPoint(Water, Centigrade(100))
The water in John's water bottle is frozen.
∃b ∃w  w ∈ Water ∧ b ∈ WaterBottles ∧ Has(John,b,Now) ∧ Inside(w,b,Now) ∧ T(w ∈ Solid, Now)
Perrier is a kind of water.
Perrier ⊆ Water

Homework cont.
John has Perrier in his water bottle.
∃b ∃w  w ∈ Water ∧ b ∈ WaterBottles ∧ Has(John,b,Now) ∧ Inside(w,b,Now) ∧ w ∈ Perrier
All liquids have a freezing point.
∀c  RoomTempLiquidSubst(c) ⇒ ∃t  SFreezingPoint(c,t)
A liter of water weighs more than a liter of alcohol.
∀w,a  w ∈ Water ∧ a ∈ Alcohol ∧ Volume(w) = Liters(1) ∧ Volume(a) = Liters(1) ⇒ Mass(w) > Mass(a)

Uncertainty Everyday reasoning and decision making are based on uncertain evidence and inferences. Classical logic only allows conclusions to be strictly true or strictly false. We need to account for this uncertainty and to weigh and combine conflicting evidence.

Coping with Uncertainty A straightforward application of probability theory is impractical, since the large number of conditional probabilities required is rarely, if ever, available. Therefore, early expert systems employed fairly ad hoc methods for reasoning under uncertainty and for combining evidence. More recently, methods more rigorously founded in probability theory, which attempt to reduce the number of conditional probabilities required, have flourished.

Probability Probabilities are real numbers between 0 and 1 representing the a priori likelihood that a proposition is true. P(Cold) = 0.1, P(¬Cold) = 0.9. Probabilities can also be assigned to all values of a random variable (continuous or discrete) with a specific range of values (its domain), e.g. low, normal, high. P(temperature=normal) = 0.99, P(temperature=98.6) = 0.99.

Probability Vectors The vector form gives the probabilities for all values of a discrete variable, i.e., its probability distribution: P(temperature) = ⟨…⟩. This is the prior distribution, in which no other information is known.

Conditional Probability Conditional probability specifies the probability given that the values of some other random variables are known. P(Sneeze | Cold) = 0.8 P(Cold | Sneeze) = 0.6 The probability of a sneeze given a cold is 80%. The probability of a cold given a sneeze is 60%.

Cond. Probability cont. Assumes that the given information is all that is known, so all known information must be given. P(Sneeze | Cold ∧ Allergy) = 0.95. Also allows for conditional distributions: P(X | Y) gives a 2-D array of values for all P(X=x_i | Y=y_j). Defined as P(A | B) = P(A ∧ B) / P(B).
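For example, with the numbers above, P(Cold) = 0.1 and P(Sneeze | Cold) = 0.8, so the definition gives P(Sneeze ∧ Cold) = P(Sneeze | Cold) P(Cold) = 0.8 × 0.1 = 0.08.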

Axioms of Probability Theory All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1. Necessarily true propositions have probability 1, necessarily false ones have probability 0: P(true) = 1, P(false) = 0. The probability of a disjunction is given by P(A ∨ B) = P(A) + P(B) - P(A ∧ B).

Joint Probability Distribution The joint probability distribution for a set of random variables X_1…X_n gives the probability of every combination of values (an n-dimensional array with v^n entries if each variable has v values): P(X_1,…,X_n). Example: a 2×2 joint table over Cold/¬Cold and Sneeze/¬Sneeze. The probability of any event (an assignment of values to some subset of the variables) can be calculated by summing the appropriate entries of the joint distribution. All conditional probabilities can therefore also be calculated.
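Working with such a table is easy to sketch in code. Below is a minimal Python illustration; the joint values are placeholders chosen only to be consistent with P(Cold) = 0.1 and P(Sneeze | Cold) = 0.8 from the earlier slides, not the table's actual entries.

# A small joint distribution over (Cold, Sneeze), keyed by truth values.
# Illustrative numbers only, not the slide's table.
joint = {
    (True,  True):  0.08,   # Cold and Sneeze
    (True,  False): 0.02,   # Cold and no Sneeze
    (False, True):  0.01,   # no Cold, Sneeze
    (False, False): 0.89,   # no Cold, no Sneeze
}

# Marginalize: P(Cold) = sum over the values of Sneeze.
p_cold = sum(p for (cold, _), p in joint.items() if cold)

# Condition: P(Sneeze | Cold) = P(Sneeze and Cold) / P(Cold).
p_sneeze_given_cold = joint[(True, True)] / p_cold

print(p_cold)               # ~0.1
print(p_sneeze_given_cold)  # ~0.8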

Bayes Theorem P(H | e) = P(e | H) P(H) / P(e). Follows from the definition of conditional probability: P(A | B) = P(A ∧ B) / P(B).
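To see why, note that the definition gives both P(H ∧ e) = P(H | e) P(e) and P(H ∧ e) = P(e | H) P(H); setting the two right-hand sides equal and dividing by P(e) yields Bayes' theorem.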

Other Basic Theorems If events A and B are independent, then: P(A ∧ B) = P(A) P(B). If events A and B are mutually exclusive (incompatible), then: P(A ∨ B) = P(A) + P(B).

Simple Bayesian Reasoning If we assume there are n possible disjoint diagnoses d_1 … d_n, then P(d_i | e) = P(e | d_i) P(d_i) / P(e). P(e) may not be known, but the posterior probabilities of the diagnoses must sum to 1, so P(e) acts only as a normalizing constant. Thus, we can determine the most probable diagnosis without knowing P(e).
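Concretely, since the diagnoses are disjoint and together exhaust the possibilities, P(e) = Σ_j P(e | d_j) P(d_j), so P(d_i | e) = P(e | d_i) P(d_i) / Σ_j P(e | d_j) P(d_j); the numerators alone are enough to rank the diagnoses, and dividing by their sum recovers the exact posteriors.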

Efficiency This method requires knowing, for each disease, the probability that it will cause any possible combination of symptoms; the number of possible symptom sets, e, is exponential in the number of basic symptoms. This huge amount of data is usually not available.
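For example, with just 20 binary symptoms there are 2^20 ≈ one million possible symptom combinations, and a conditional probability would be needed for each combination, for every disease.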

Bayesian Reasoning with Independence ("Naïve" Bayes) If we assume that each piece of evidence (symptom) is independent given the diagnosis (conditional independence), then, given evidence e as a set {e_1, e_2, …, e_d} of observations, P(e | d_i) is the product of the probabilities of the individual observations given d_i. The conditional probability of each individual symptom for each possible diagnosis can then be estimated from a data set or by an expert. However, symptoms are usually not independent and frequently correlate, in which case the assumptions of this simple model are violated and it is not guaranteed to give reasonable results.
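Under this assumption, P(d_i | e) ∝ P(d_i) P(e_1 | d_i) P(e_2 | d_i) ··· P(e_d | d_i), and the results are normalized over the diagnoses exactly as before.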

Bayes Independence Example
Imagine there are diagnoses ALLERGY, COLD, and WELL and symptoms SNEEZE, COUGH, and FEVER.

Prob           Well    Cold    Allergy
P(d)           0.9     0.05    0.05
P(sneeze | d)  0.1     0.9     0.9
P(cough | d)   0.1     0.8     0.7
P(fever | d)   0.01    0.7     0.4

If the symptoms are sneeze & cough & no fever:
P(well | e) = (0.9)(0.1)(0.1)(0.99)/P(e) = 0.0089/P(e)
P(cold | e) = (0.05)(0.9)(0.8)(0.3)/P(e) = 0.01/P(e)
P(allergy | e) = (0.05)(0.9)(0.7)(0.6)/P(e) = 0.019/P(e)
Diagnosis: allergy
Normalizing: P(e) = 0.0089 + 0.01 + 0.019 = 0.0379
P(well | e) = 0.23   P(cold | e) = 0.26   P(allergy | e) = 0.50
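The arithmetic above can be checked with a short script. This is a minimal Python sketch of the same naïve Bayes computation, using the numbers from the table; the variable names and structure are mine, not the slide's.

# Naive Bayes check of the diagnosis example above.
# Priors P(d) and per-symptom conditionals P(symptom | d) come from the table.
priors   = {"well": 0.9,  "cold": 0.05, "allergy": 0.05}
p_sneeze = {"well": 0.1,  "cold": 0.9,  "allergy": 0.9}
p_cough  = {"well": 0.1,  "cold": 0.8,  "allergy": 0.7}
p_fever  = {"well": 0.01, "cold": 0.7,  "allergy": 0.4}

# Evidence: sneeze and cough observed, no fever.
unnormalized = {d: priors[d] * p_sneeze[d] * p_cough[d] * (1 - p_fever[d])
                for d in priors}

p_e = sum(unnormalized.values())              # P(e), the normalizing constant
posterior = {d: v / p_e for d, v in unnormalized.items()}

print(max(posterior, key=posterior.get))      # allergy
for d, p in posterior.items():
    print(d, round(p, 2))                     # well 0.23, cold 0.28, allergy 0.49

The posteriors here differ slightly from the slide's 0.26 and 0.50 because the slide rounds the numerators to 0.01 and 0.019 before normalizing; the ranking of the diagnoses is the same.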

Problems with Probabilistic Reasoning If no assumptions of independence are made, then an exponential number of parameters is needed for sound probabilistic reasoning. There is almost never enough data or patience to reliably estimate so many very specific parameters. If a blanket assumption of conditional independence is made, efficient probabilistic reasoning is possible, but such a strong assumption is rarely warranted.