Reasoning Under Uncertainty: Independence CPSC 322 – Uncertainty 3 Textbook §6.2 March 21, 2011.



Lecture Overview

Recap
- Conditioning & Inference by Enumeration
- Bayes Rule & Chain Rule

Independence
- Marginal Independence
- Conditional Independence
- Time-permitting: Rainbow Robot example

Recap: Conditioning

Conditioning: revise beliefs based on new observations.
We need to integrate two sources of knowledge:
- The prior probability distribution P(X): all background knowledge
- New evidence e
Combine the two to form a posterior probability distribution: the conditional probability P(h|e).

Recap: Example for conditioning

You have a prior for the joint distribution of weather and temperature, and the marginal distribution of temperature.
Now you look outside and see that it's sunny. You are certain that you're in world w1, w2, or w3.

Possible world | Weather | Temperature | µ(w)
w1             | sunny   | hot         | 0.10
w2             | sunny   | mild        | 0.20
w3             | sunny   | cold        | 0.10
w4             | cloudy  | hot         | 0.05
w5             | cloudy  | mild        | 0.35
w6             | cloudy  | cold        | 0.20

T    | P(T | W=sunny)
hot  | ?
mild | ?
cold | ?

Recap: Example for conditioning

You have a prior for the joint distribution of weather and temperature, and the marginal distribution of temperature.
Now you look outside and see that it's sunny. You are certain that you're in world w1, w2, or w3.
To get the conditional probability, you simply renormalize to sum to 1:
P(W=sunny) = 0.10 + 0.20 + 0.10 = 0.40

Possible world | Weather | Temperature | µ(w)
w1             | sunny   | hot         | 0.10
w2             | sunny   | mild        | 0.20
w3             | sunny   | cold        | 0.10
w4             | cloudy  | hot         | 0.05
w5             | cloudy  | mild        | 0.35
w6             | cloudy  | cold        | 0.20

T    | P(T | W=sunny)
hot  | 0.10/0.40 = 0.25
mild | 0.20/0.40 = 0.50
cold | 0.10/0.40 = 0.25
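The renormalization step above can be sketched in a few lines of Python (my sketch, not part of the original slides): select the worlds consistent with the observation, then divide by their total mass.

```python
# Joint distribution over (Weather, Temperature), from the table above.
joint = {
    ("sunny", "hot"): 0.10, ("sunny", "mild"): 0.20, ("sunny", "cold"): 0.10,
    ("cloudy", "hot"): 0.05, ("cloudy", "mild"): 0.35, ("cloudy", "cold"): 0.20,
}

def condition_on_weather(joint, w):
    """Return P(T | W=w) by renormalizing the rows where Weather == w."""
    rows = {t: p for (weather, t), p in joint.items() if weather == w}
    total = sum(rows.values())           # P(W=w); 0.40 for w = "sunny"
    return {t: p / total for t, p in rows.items()}

p_t_given_sunny = condition_on_weather(joint, "sunny")
# -> {'hot': 0.25, 'mild': 0.5, 'cold': 0.25} (up to float rounding)
```

The function name and dictionary layout are illustrative choices; any representation of the joint that lets you filter worlds and renormalize works the same way.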

Recap: Conditional probability

P(h | e) = P(h ∧ e) / P(e)

Possible world | Weather | Temperature | µ(w)
w1             | sunny   | hot         | 0.10
w2             | sunny   | mild        | 0.20
w3             | sunny   | cold        | 0.10
w4             | cloudy  | hot         | 0.05
w5             | cloudy  | mild        | 0.35
w6             | cloudy  | cold        | 0.20

T    | P(T | W=sunny)
hot  | 0.10/0.40 = 0.25
mild | 0.20/0.40 = 0.50
cold | 0.10/0.40 = 0.25

Recap: Inference by Enumeration

Great, we can compute arbitrary probabilities now!
Given:
- the prior joint probability distribution (JPD) on a set of variables X
- specific values e for the evidence variables E (a subset of X)
we want to compute the posterior joint distribution of the query variables Y (a subset of X) given evidence e.

Step 1: Condition to get the distribution P(X|e).
Step 2: Marginalize to get the distribution P(Y|e).

Generally applicable, but memory-heavy and slow.
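The two steps can be written as one generic routine (a sketch of mine, not from the slides): the variable names and dictionary encoding are assumptions, but the condition-then-marginalize logic is exactly the procedure above.

```python
# Inference by enumeration over a joint stored as {world-tuple: probability}.
def infer(joint, var_names, query_vars, evidence):
    """Step 1: condition on the evidence; Step 2: marginalize onto query_vars."""
    idx = {v: i for i, v in enumerate(var_names)}
    posterior = {}
    for world, p in joint.items():
        # Step 1: skip worlds inconsistent with the evidence e.
        if any(world[idx[v]] != val for v, val in evidence.items()):
            continue
        # Step 2: sum out non-query variables by projecting onto query_vars.
        key = tuple(world[idx[v]] for v in query_vars)
        posterior[key] = posterior.get(key, 0.0) + p
    z = sum(posterior.values())          # P(e), the normalizing constant
    return {k: v / z for k, v in posterior.items()}

joint_wt = {("sunny", "hot"): 0.10, ("sunny", "mild"): 0.20, ("sunny", "cold"): 0.10,
            ("cloudy", "hot"): 0.05, ("cloudy", "mild"): 0.35, ("cloudy", "cold"): 0.20}
answer = infer(joint_wt, ["Weather", "Temperature"], ["Temperature"], {"Weather": "sunny"})
```

Note the cost: the loop touches every entry of the joint, which is why the method is memory-heavy and slow for many variables.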

Recap: Bayes rule and Chain Rule

Bayes rule:
P(h | e) = P(e | h) P(h) / P(e)

Chain rule:
P(X1, ..., Xn) = P(X1) × P(X2 | X1) × P(X3 | X1, X2) × ... × P(Xn | X1, ..., Xn-1)
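A quick numeric sanity check of Bayes rule on the weather/temperature example (my sketch; the numbers come from the tables earlier in the lecture): computing P(sunny | hot) via Bayes rule must match direct conditioning on the joint.

```python
# P(sunny | hot) two ways, using the weather/temperature joint above.
p_hot_given_sunny = 0.10 / 0.40      # from the conditioning example
p_sunny = 0.40                       # marginal of Weather
p_hot = 0.10 + 0.05                  # marginalize Temperature=hot over Weather

bayes = p_hot_given_sunny * p_sunny / p_hot   # Bayes rule
direct = 0.10 / p_hot                         # P(sunny, hot) / P(hot)
# both equal 2/3 (up to rounding)
```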

Lecture Overview

Recap
- Conditioning & Inference by Enumeration
- Bayes Rule & Chain Rule

Independence
- Marginal Independence
- Conditional Independence
- Time-permitting: Rainbow Robot example

Marginal Independence: example

Some variables are independent:
- Knowing the value of one does not tell you anything about the other.
- Example: variables W (weather) and R (result of a die throw).
Let's compare P(W) vs. P(W | R=6).
What is P(W=cloudy)?

Weather W | Result R | P(W,R)
sunny     | 1        | 0.066
sunny     | 2        | 0.066
sunny     | 3        | 0.066
sunny     | 4        | 0.066
sunny     | 5        | 0.066
sunny     | 6        | 0.066
cloudy    | 1        | 0.1
cloudy    | 2        | 0.1
cloudy    | 3        | 0.1
cloudy    | 4        | 0.1
cloudy    | 5        | 0.1
cloudy    | 6        | 0.1

(Each sunny entry is 0.4/6 ≈ 0.066.)

Marginal Independence: example

What is P(W=cloudy)?
- P(W=cloudy) = Σ_{r ∈ dom(R)} P(W=cloudy, R=r) = 6 × 0.1 = 0.6
What is P(W=cloudy | R=6)?


Marginal Independence: example

Some variables are independent:
- Knowing the value of one does not tell you anything about the other.
Let's compare P(W) vs. P(W | R=6): the two distributions are identical.
Knowing the result of the die does not change our belief in the weather.

Weather W | P(W)
sunny     | 0.4
cloudy    | 0.6

Weather W | P(W | R=6)
sunny     | 0.066/0.166 ≈ 0.4
cloudy    | 0.1/0.166 ≈ 0.6
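This claim can be checked in code (my sketch, not from the slides): build the joint as a product of an independent weather distribution and a fair die, then compare the prior marginal with the distribution conditioned on R=6.

```python
# Joint over (Weather, Result): independent by construction.
joint_wr = {("sunny", r): 0.4 / 6 for r in range(1, 7)}
joint_wr.update({("cloudy", r): 0.6 / 6 for r in range(1, 7)})

prior = {}
for (w, r), p in joint_wr.items():       # marginalize out R to get P(W)
    prior[w] = prior.get(w, 0.0) + p

given_6 = {w: p for (w, r), p in joint_wr.items() if r == 6}
z = sum(given_6.values())                # P(R=6) = 1/6
posterior = {w: p / z for w, p in given_6.items()}
# prior ≈ {'sunny': 0.4, 'cloudy': 0.6}, and posterior is the same distribution
```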

Marginal Independence

Intuitively: if X and Y are marginally independent, then
- learning that Y=y does not change your belief in X,
- and this is true for all values y that Y could take.

Formally: X and Y are marginally independent iff P(X=x | Y=y) = P(X=x) for all values x and y, or equivalently P(X=x, Y=y) = P(X=x) P(Y=y).

For example, the weather is marginally independent of the result of a die throw.

Examples for marginal independence

Results C1 and C2 of two tosses of a fair coin.
Are C1 and C2 marginally independent? (yes / no)

C1    | C2    | P(C1, C2)
heads | heads | 0.25
heads | tails | 0.25
tails | heads | 0.25
tails | tails | 0.25

Examples for marginal independence

Results C1 and C2 of two tosses of a fair coin.
Are C1 and C2 marginally independent?
- Yes. Every entry satisfies the definition: P(C1=c1, C2=c2) = 0.25 = 0.5 × 0.5 = P(C1=c1) P(C2=c2).

C1    | C2    | P(C1, C2)
heads | heads | 0.25
heads | tails | 0.25
tails | heads | 0.25
tails | tails | 0.25

Examples for marginal independence

Are Weather and Temperature marginally independent? (yes / no)

Weather W | Temperature T | P(W,T)
sunny     | hot           | 0.10
sunny     | mild          | 0.20
sunny     | cold          | 0.10
cloudy    | hot           | 0.05
cloudy    | mild          | 0.35
cloudy    | cold          | 0.20

Examples for marginal independence

Are Weather and Temperature marginally independent?
- No. We saw before that knowing the Temperature changes our belief about the Weather.
- E.g. P(hot) = 0.10 + 0.05 = 0.15, but P(hot | cloudy) = 0.05/0.6 ≈ 0.083.

Weather W | Temperature T | P(W,T)
sunny     | hot           | 0.10
sunny     | mild          | 0.20
sunny     | cold          | 0.10
cloudy    | hot           | 0.05
cloudy    | mild          | 0.35
cloudy    | cold          | 0.20
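Both examples reduce to one mechanical test of the definition P(x, y) = P(x) P(y). Here is a small helper of mine (not from the slides) that applies it to the coin-toss joint and to the weather/temperature joint:

```python
# Test marginal independence by comparing each joint entry against the
# product of the two marginals.
def marginally_independent(joint, tol=1e-9):
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p       # marginal of the first variable
        py[y] = py.get(y, 0.0) + p       # marginal of the second variable
    return all(abs(p - px[x] * py[y]) <= tol for (x, y), p in joint.items())

coin_tosses = {(c1, c2): 0.25 for c1 in ("heads", "tails") for c2 in ("heads", "tails")}
weather_temp = {("sunny", "hot"): 0.10, ("sunny", "mild"): 0.20, ("sunny", "cold"): 0.10,
                ("cloudy", "hot"): 0.05, ("cloudy", "mild"): 0.35, ("cloudy", "cold"): 0.20}
# marginally_independent(coin_tosses) is True; weather_temp fails the test,
# because P(sunny) * P(hot) = 0.4 * 0.15 = 0.06 but P(sunny, hot) = 0.10.
```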

Examples for marginal independence

Intuitively (without numbers):
- Boolean random variable "Canucks win the Stanley Cup this season"
- Numerical random variable "Canucks' revenue last season"
- Are the two marginally independent? (yes / no)

Examples for marginal independence

Intuitively (without numbers):
- Boolean random variable "Canucks win the Stanley Cup this season"
- Numerical random variable "Canucks' revenue last season"
- Are the two marginally independent?
  No! Without revenue they cannot afford to keep their best players.

Exploiting marginal independence

If X and Y are marginally independent, the joint distribution factors:
P(X, Y) = P(X) P(Y)
More generally, for marginally independent variables X1, ..., Xn:
P(X1, ..., Xn) = P(X1) × P(X2) × ... × P(Xn)

How many numbers do we need to specify the joint distribution of n Boolean variables? (2n / 2^n / 2+n / n^2)
- In general: 2^n, one probability per possible world.
- If the variables are marginally independent: only n, one per variable.
Exploiting marginal independence thus reduces the representation from exponential to linear in n.

Lecture Overview

Recap
- Conditioning & Inference by Enumeration
- Bayes Rule & Chain Rule

Independence
- Marginal Independence
- Conditional Independence
- Time-permitting: Rainbow Robot example

Follow-up Example

Intuitively (without numbers):
- Boolean random variable "Canucks win the Stanley Cup this season"
- Numerical random variable "Canucks' revenue last season"
- Are the two marginally independent?
  No! Without revenue they cannot afford to keep their best players.
- But they are conditionally independent given the Canucks line-up:
  once we know who is playing, learning their revenue last year won't change our belief in their chances.

Conditional Independence

Intuitively: if X and Y are conditionally independent given Z, then
- learning that Y=y does not change your belief in X when we already know Z=z,
- and this is true for all values y that Y could take and all values z that Z could take.

Formally: X and Y are conditionally independent given Z iff P(X=x | Y=y, Z=z) = P(X=x | Z=z) for all values x, y, and z.

Example for Conditional Independence

Whether light l1 is lit is conditionally independent of the position of switch s2 given whether there is power in wire w0.
Once we know Power(w0), learning values for any other variable will not change our beliefs about Lit(l1).
- I.e., Lit(l1) is independent of any other variable given Power(w0).

Example: conditionally but not marginally independent

ExamGrade and AssignmentGrade are not marginally independent:
- Students who do well on one typically do well on the other.
But conditional on UnderstoodMaterial, they are independent:
- Variable UnderstoodMaterial is a common cause of variables ExamGrade and AssignmentGrade.
- UnderstoodMaterial shields any information we could get from AssignmentGrade.

(Diagram: UnderstoodMaterial with arrows to ExamGrade and AssignmentGrade.)
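The common-cause structure can be verified numerically with a direct check of the definition P(x | y, z) = P(x | z). This is my sketch, and all numeric values in it are made up for illustration; the point is that any joint built as P(U) P(Exam | U) P(Assignment | U) passes the test by construction.

```python
# Check conditional independence of X and Y given Z on a joint {(x, y, z): p}.
def conditionally_independent(joint, tol=1e-9):
    pz, pxz, pyz = {}, {}, {}
    for (x, y, z), p in joint.items():
        pz[z] = pz.get(z, 0.0) + p
        pxz[(x, z)] = pxz.get((x, z), 0.0) + p
        pyz[(y, z)] = pyz.get((y, z), 0.0) + p
    # P(x | y, z) = P(x,y,z) / P(y,z)  vs.  P(x | z) = P(x,z) / P(z)
    return all(abs(p / pyz[(y, z)] - pxz[(x, z)] / pz[z]) <= tol
               for (x, y, z), p in joint.items())

# Hypothetical common-cause model: U -> ExamGrade, U -> AssignmentGrade.
p_u = {True: 0.7, False: 0.3}            # P(UnderstoodMaterial)
p_exam = {True: 0.9, False: 0.3}         # P(ExamGrade=good | U)
p_asst = {True: 0.8, False: 0.4}         # P(AssignmentGrade=good | U)

joint_eau = {}
for u in (True, False):
    for e in (True, False):
        for a in (True, False):
            pe = p_exam[u] if e else 1 - p_exam[u]
            pa = p_asst[u] if a else 1 - p_asst[u]
            joint_eau[(e, a, u)] = p_u[u] * pe * pa
# ExamGrade and AssignmentGrade are independent given U by construction.
```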

Example: marginally but not conditionally independent

Two variables can be marginally but not conditionally independent:
- "Smoking At Sensor" S: a resident smokes a cigarette next to the fire sensor
- "Fire" F: there is a fire somewhere in the building
- "Alarm" A: the fire alarm rings
S and F are marginally independent:
- Learning S=true or S=false does not change your belief in F.
But they are not conditionally independent given Alarm:
- If the alarm rings and you learn S=true, your belief in F decreases.

(Diagram: SmokingAtSensor and Fire with arrows to Alarm.)

Conditional vs. Marginal Independence

Two variables can be:
- Both marginally and conditionally independent:
  CanucksWinStanleyCup and Lit(l1); CanucksWinStanleyCup and Lit(l1) given Power(w0)
- Neither marginally nor conditionally independent:
  Temperature and Cloudiness; Temperature and Cloudiness given Wind
- Conditionally but not marginally independent:
  ExamGrade and AssignmentGrade; ExamGrade and AssignmentGrade given UnderstoodMaterial
- Marginally but not conditionally independent:
  SmokingAtSensor and Fire; SmokingAtSensor and Fire given Alarm

Exploiting Conditional Independence

Example 1: Boolean variables A, B, C
- C is conditionally independent of A given B.
- We can then rewrite P(C | A, B) as P(C | B).

Exploiting Conditional Independence

Example 2: Boolean variables A, B, C, D
- D is conditionally independent of A given C.
- D is conditionally independent of B given C.
- We can then rewrite P(D | A, B, C) as P(D | B, C),
- and can further rewrite P(D | B, C) as P(D | C).

Exploiting Conditional Independence

Recall the chain rule:
P(X1, ..., Xn) = P(X1) × P(X2 | X1) × ... × P(Xn | X1, ..., Xn-1)
With conditional independence, each factor P(Xi | X1, ..., Xi-1) can be simplified to condition only on the variables Xi actually depends on, which shrinks every factor in the product.

Lecture Overview

Recap
- Conditioning & Inference by Enumeration
- Bayes Rule & Chain Rule

Independence
- Marginal Independence
- Conditional Independence
- Time-permitting: Rainbow Robot example

Rainbow Robot Example

P(Pos2 | Pos0, Pos1, Sens1, Sens2)
- What variables is Pos2 conditionally independent of given Pos1?

(Diagram: Pos0 → Pos1 → Pos2, with Sens1 an observation of Pos1 and Sens2 an observation of Pos2.)

Rainbow Robot Example

P(Pos2 | Pos0, Pos1, Sens1, Sens2)
- What variables is Pos2 conditionally independent of given Pos1?
  Pos2 is conditionally independent of Pos0 given Pos1.
  Pos2 is conditionally independent of Sens1 given Pos1.

Rainbow Robot Example (cont'd)

Pos2 is conditionally independent of Pos0 and Sens1 given Pos1:
P(Pos2 | Pos0, Pos1, Sens1, Sens2) = P(Pos2 | Pos1, Sens2)
By Bayes rule:
P(Pos2 | Pos1, Sens2) = P(Sens2 | Pos2, Pos1) P(Pos2 | Pos1) / P(Sens2 | Pos1)
Sens2 is conditionally independent of Pos1 given Pos2, so this simplifies to:
P(Pos2 | Pos1, Sens2) = P(Sens2 | Pos2) P(Pos2 | Pos1) / P(Sens2 | Pos1)
The denominator is a constant (it does not depend on Pos2): the probability just has to sum to 1, so we can normalize at the end instead of computing it.
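The resulting update rule, P(Pos2 | Pos1, Sens2) ∝ P(Sens2 | Pos2) P(Pos2 | Pos1), can be sketched in code. This is my illustration: the position names and all transition/sensor probabilities below are made up, not from the slides.

```python
# Combine a transition model and a sensor model, then normalize.
positions = ["left", "middle", "right"]
# transition[p1][p2] = P(Pos2=p2 | Pos1=p1)   (illustrative numbers)
transition = {"middle": {"left": 0.25, "middle": 0.5, "right": 0.25}}
# sensor[p2] = P(Sens2 = observed reading | Pos2=p2)   (illustrative numbers)
sensor = {"left": 0.1, "middle": 0.7, "right": 0.2}

def posterior_pos2(p1):
    """P(Pos2 | Pos1=p1, Sens2): the denominator P(Sens2 | Pos1) never has
    to be computed explicitly; we just normalize at the end."""
    unnorm = {p2: sensor[p2] * transition[p1][p2] for p2 in positions}
    z = sum(unnorm.values())
    return {p2: u / z for p2, u in unnorm.items()}

belief = posterior_pos2("middle")
# The reading favours "middle", so belief["middle"] is the largest entry.
```

This exploitation of the chain of conditional independences is exactly what makes filtering in hidden Markov models tractable, which is where the next lectures pick up.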


Learning Goals For Today's Class

- Define and use marginal independence.
- Define and use conditional independence.

Assignment 4 is available on WebCT:
- Due in 2 weeks.
- Do the questions early, right after the material for each question has been covered in class. This will help you stay on track.