Reasoning under Uncertainty: Conditional Prob., Bayes and Independence
Computer Science CPSC 322, Lecture 25 (Textbook Chpt 6.1.3.1-2)
March 17, 2010

Lecture Overview
– Recap: Semantics of Probability
– Marginalization
– Conditional Probability
– Chain Rule
– Bayes' Rule
– Independence

Recap: Possible World Semantics for Probabilities
– Probability is a formal measure of subjective uncertainty.
– We model the environment with a set of random variables; a probability distribution assigns a measure µ(w) to each possible world w.
– The probability of a proposition f is the sum of the measures of the worlds in which f is true: P(f) = Σ µ(w) over worlds w where f holds.

Joint Distribution and Marginalization
Given a joint distribution, e.g. P(X, Y, Z), we can compute distributions over any smaller set of variables.

cavity   toothache   catch   µ(w)
T        T           T       .108
T        T           F       .012
T        F           T       .072
T        F           F       .008
F        T           T       .016
F        T           F       .064
F        F           T       .144
F        F           F       .576

Summing out catch gives P(cavity, toothache):

cavity   toothache   P(cavity, toothache)
T        T           .12
T        F           .08
F        T           .08
F        F           .72
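Not part of the original slides: a minimal Python sketch of marginalization, assuming the joint is stored as a dict mapping (cavity, toothache, catch) value tuples to µ(w). The `marginalize` helper and the index convention are illustrative choices, not the course's code.

```python
# Hypothetical encoding of the joint from the slide: a dict mapping
# (cavity, toothache, catch) tuples to the measure µ(w) of that world.
joint = {
    (True, True, True): .108, (True, True, False): .012,
    (True, False, True): .072, (True, False, False): .008,
    (False, True, True): .016, (False, True, False): .064,
    (False, False, True): .144, (False, False, False): .576,
}

def marginalize(joint, keep):
    """Sum out every variable whose position is not listed in `keep`."""
    marginal = {}
    for world, p in joint.items():
        key = tuple(world[i] for i in keep)
        marginal[key] = marginal.get(key, 0.0) + p
    return marginal

# P(cavity, toothache): sum out catch (index 2), keep indices 0 and 1.
print(marginalize(joint, keep=(0, 1)))
# ≈ {(T, T): .12, (T, F): .08, (F, T): .08, (F, F): .72}, up to float rounding
```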

Why is it called Marginalization?

             Toothache = T   Toothache = F
Cavity = T   .12             .08
Cavity = F   .08             .72

Summing across each row gives the marginal P(Cavity) = (.2, .8); summing down each column gives P(Toothache) = (.2, .8). These sums were traditionally written in the margins of the table, hence the name.

Lecture Overview –Recap Semantics of Probability –Marginalization –Conditional Probability –Chain Rule –Bayes' Rule –Independence

Conditioning (Conditional Probability)
We model our environment with a set of random variables. Assuming we have the joint, we can compute the probability of any proposition. Are we done with reasoning under uncertainty? What can happen? Think of a patient showing up at the dentist's office. Does she have a cavity?

Conditioning (Conditional Probability)
– Probabilistic conditioning specifies how to revise beliefs based on new information.
– You build a probabilistic model (for now, the joint) taking all background information into account. This gives the prior probability.
– All other information must be conditioned on. If evidence e is all of the information obtained subsequently, the conditional probability P(h|e) of h given e is the posterior probability of h.

Conditioning Example
– Prior probability of having a cavity: P(cavity = T)
– It should be revised if you know that there is a toothache: P(cavity = T | toothache = T)
– It should be revised again if you are informed that the probe did not catch anything: P(cavity = T | toothache = T, catch = F)
– What about P(cavity = T | sunny = T)?

How can we compute P(h|e)?
What happens in terms of possible worlds if we know the value of a random variable (or a set of random variables)?

e = (cavity = T)

cavity   toothache   catch   µ(w)   µe(w)
T        T           T       .108   ?
T        T           F       .012   ?
T        F           T       .072   ?
T        F           F       .008   ?
F        T           T       .016   ?
F        T           F       .064   ?
F        F           T       .144   ?
F        F           F       .576   ?

Some worlds are ruled out: their new measure µe(w) is 0. The others become more likely: their measures are renormalized so that they sum to 1. (The µe(w) column is filled in below.)

Semantics of Conditional Probability
The conditional probability of formula h given evidence e is

P(h | e) = P(h ∧ e) / P(e)

Equivalently, define the conditioned measure µe(w) = µ(w) / P(e) for worlds w in which e is true, and µe(w) = 0 otherwise; then P(h | e) is the sum of µe(w) over the worlds w in which h is true.

Semantics of Conditional Prob.: Example
e = (cavity = T)

cavity   toothache   catch   µ(w)   µe(w)
T        T           T       .108   .54
T        T           F       .012   .06
T        F           T       .072   .36
T        F           F       .008   .04
F        T           T       .016   0
F        T           F       .064   0
F        F           T       .144   0
F        F           F       .576   0

P(h | e) = P(toothache = T | cavity = T) = .54 + .06 = .6
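Again not from the original slides: a sketch of conditioning by renormalizing possible worlds, reusing the hypothetical `joint` dict from the marginalization sketch above (indices: 0 = cavity, 1 = toothache, 2 = catch).

```python
def condition(joint, var, value):
    """Zero out worlds inconsistent with the evidence; renormalize the rest."""
    p_e = sum(p for w, p in joint.items() if w[var] == value)
    return {w: (p / p_e if w[var] == value else 0.0) for w, p in joint.items()}

mu_e = condition(joint, var=0, value=True)             # e = (cavity = T)
p_h_given_e = sum(p for w, p in mu_e.items() if w[1])  # h = (toothache = T)
print(p_h_given_e)                                     # ≈ 0.6, i.e. .12 / .2
```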

Conditional Probability among Random Variables
P(X | Y) = P(X, Y) / P(Y)
P(toothache | cavity) = P(toothache ∧ cavity) / P(cavity)

P(toothache, cavity):
             Toothache = T   Toothache = F
Cavity = T   .12             .08
Cavity = F   .08             .72

P(toothache | cavity):
             Toothache = T   Toothache = F
Cavity = T   .6              .4
Cavity = F   .1              .9
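As a small illustrative computation (hypothetical dict encoding, keys are (cavity, toothache) pairs): the whole conditional table is obtained by dividing each row of the joint by its row sum.

```python
# Joint P(cavity, toothache); keys are (cavity, toothache) pairs.
p_ct = {(True, True): .12, (True, False): .08,
        (False, True): .08, (False, False): .72}
# Marginal P(cavity): sum each row.
p_c = {c: p_ct[(c, True)] + p_ct[(c, False)] for c in (True, False)}
# Conditional P(toothache | cavity): divide each entry by its row sum.
p_t_given_c = {(t, c): p_ct[(c, t)] / p_c[c]
               for c in (True, False) for t in (True, False)}
print(p_t_given_c)
# ≈ .6, .4 for cavity = T and .1, .9 for cavity = F, matching the table above
```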

Product Rule
Definition of conditional probability:
– P(X1 | X2) = P(X1, X2) / P(X2)
The product rule gives an alternative, more intuitive formulation:
– P(X1, X2) = P(X2) P(X1 | X2) = P(X1) P(X2 | X1)
Product rule, general form:
P(X1, …, Xn) = P(X1, …, Xt) P(Xt+1, …, Xn | X1, …, Xt)

Chain Rule
Product rule, general form:
P(X1, …, Xn) = P(X1, …, Xt) P(Xt+1, …, Xn | X1, …, Xt)
The chain rule is derived by successive application of the product rule:
P(X1, …, Xn-1, Xn)
= P(X1, …, Xn-1) P(Xn | X1, …, Xn-1)
= P(X1, …, Xn-2) P(Xn-1 | X1, …, Xn-2) P(Xn | X1, …, Xn-1)
= …
= P(X1) P(X2 | X1) … P(Xn-1 | X1, …, Xn-2) P(Xn | X1, …, Xn-1)
= ∏ i=1..n P(Xi | X1, …, Xi-1)

Chain Rule: Example
P(cavity, toothache, catch) = P(cavity) P(toothache | cavity) P(catch | cavity, toothache)
P(toothache, catch, cavity) = P(toothache) P(catch | toothache) P(cavity | toothache, catch)
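A numeric sanity check of the first decomposition, as a sketch reusing the hypothetical `joint` and `marginalize` from the earlier snippets; the product should reproduce the joint entry .108.

```python
p_cavity = marginalize(joint, keep=(0,))[(True,)]            # ≈ 0.2
p_tooth_given_cavity = 0.12 / 0.2                            # from the marginal table
p_catch_given_both = joint[(True, True, True)] / 0.12        # ≈ 0.9
print(p_cavity * p_tooth_given_cavity * p_catch_given_both)  # ≈ 0.108 = µ(T,T,T)
```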

Lecture Overview –Recap Semantics of Probability –Marginalization –Conditional Probability –Chain Rule –Bayes' Rule –Independence

Bayes' Rule
From the product rule:
– P(X, Y) = P(Y) P(X | Y) = P(X) P(Y | X)
Dividing both sides by P(Y) gives Bayes' rule:
– P(X | Y) = P(Y | X) P(X) / P(Y)
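An illustrative application with the dentistry numbers derived above (not from the slides; note that P(cavity) and P(toothache) happen to be equal in this joint, so the posterior equals the likelihood here):

```python
# Recover the "diagnostic" P(cavity | toothache) from the "causal"
# P(toothache | cavity) via Bayes' rule.
p_cavity, p_toothache = 0.2, 0.2          # both marginals are .2 in this joint
p_tooth_given_cavity = 0.6
p_cavity_given_tooth = p_tooth_given_cavity * p_cavity / p_toothache
print(p_cavity_given_tooth)               # ≈ 0.6
```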

Do you always need to revise your beliefs?
… not when your knowledge of Y's value doesn't affect your belief in the value of X.
DEF. Random variable X is marginally independent of random variable Y if, for all xi ∈ dom(X) and yk ∈ dom(Y),
P(X = xi | Y = yk) = P(X = xi)
Consequence:
P(X = xi, Y = yk) = P(X = xi | Y = yk) P(Y = yk) = P(X = xi) P(Y = yk)
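A sketch of a numeric independence test built directly on this definition (assumed dict encodings; the tolerance guards against float rounding):

```python
from itertools import product

def independent(joint_xy, p_x, p_y, tol=1e-9):
    """True iff P(X = x, Y = y) = P(X = x) P(Y = y) for every value pair."""
    return all(abs(joint_xy[(x, y)] - p_x[x] * p_y[y]) <= tol
               for x, y in product(p_x, p_y))

# Cavity and Toothache are NOT independent: .12 != .2 * .2
print(independent({(True, True): .12, (True, False): .08,
                   (False, True): .08, (False, False): .72},
                  {True: .2, False: .8}, {True: .2, False: .8}))  # False
```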

Marginal Independence: Example
A and B are independent iff:
P(A | B) = P(A) or P(B | A) = P(B) or P(A, B) = P(A) P(B)
That is, new evidence B (or A) does not affect the current belief in A (or B).
Ex: P(Toothache, Catch, Cavity, Weather) = P(Toothache, Catch, Cavity) P(Weather)
With three binary variables and a four-valued Weather, a JPD requiring 32 entries is reduced to two smaller ones (8 and 4 entries).

Learning Goals for today's class
You can:
– Given a joint, compute distributions over any subset of the variables
– Prove the formula to compute P(h|e)
– Derive the Chain Rule and Bayes' Rule
– Define Marginal Independence

Next Class
– Conditional Independence
– Belief Networks …
Assignments
– I will post Assignment 3 this evening.
– Assignment 2: if any of the TAs' feedback is unclear, go to office hours. If you have questions on the programming part, office hours next Tue (Ken).

Plan for this week
– Probability is a rigorous formalism for uncertain knowledge.
– The joint probability distribution specifies the probability of every possible world.
– Probabilistic queries can be answered by summing over possible worlds.
– For nontrivial domains, we must find a way to reduce the size of the joint distribution.
– Independence (rare) and conditional independence (frequent) provide the tools.

Conditional probability (irrelevant evidence)
New evidence may be irrelevant, allowing simplification, e.g.:
– P(cavity | toothache, sunny) = P(cavity | toothache)
– We say that Cavity is conditionally independent of Weather (more on this next class).
This kind of inference, sanctioned by domain knowledge, is crucial in probabilistic inference.