
QUIZ!!
T/F: Traffic, Umbrella are conditionally independent given raining. TRUE
T/F: Fire, Smoke are conditionally independent given alarm. FALSE
T/F: BNs encode qualitative and quantitative knowledge about the world. TRUE
T/F: You cannot break your BN by adding an arc from any node A to B. FALSE
T/F: Adding too many arcs in a BN might overly restrict your joint PT. FALSE
T/F: An arc from A to B means A caused B. FALSE
T/F: In a BN, each node has a local joint PT including all its parents. FALSE
T/F: In a BN, each node has a local CPT over itself and all its children. FALSE
T/F: In a BN, each node has a local CPT over itself and all its parents. TRUE
T/F: Nodes without parents have no probability table. FALSE
T/F: If your assumptions are correct, you can reconstruct the full joint PT. TRUE

CSE 511: Artificial Intelligence Spring 2013 Lecture 15: Bayes’ Nets II – Independence 3/25/2012 Robert Pless, via Kilian Q. Weinberger via Dan Klein – UC Berkeley

Announcements
- Project 3 due a week from today at midnight.
- Policy clarification: no use of late days for competitive (bonus) parts of projects.
- Final exam will be weighted to count for between 1 and 2 times the midterm, whichever helps you most.

Bayes’ Nets
- A Bayes’ net is an efficient encoding of a probabilistic model of a domain
- Questions we can ask:
  - Inference: given a fixed BN, what is P(X | e)?
  - Representation: given a BN graph, what kinds of distributions can it encode?
  - Modeling: what BN is most appropriate for a given domain?

Bayes’ Net Semantics
- Formalized semantics of a Bayes’ net:
  - A set of nodes, one per variable X
  - A directed, acyclic graph
  - A conditional distribution for each node
    - A collection of distributions over X, one for each combination of parents’ values
    - CPT: conditional probability table
    - Description of a noisy “causal” process

[diagram: parents A_1, ..., A_n with arcs into X]

A Bayes net = Topology (graph) + Local Conditional Probabilities

Example: Alarm Network

[diagram: Burglary → Alarm ← Earthquake; Alarm → John calls; Alarm → Mary calls]

B     P(B)
+b    0.001
¬b    0.999

E     P(E)
+e    0.002
¬e    0.998

B     E     A     P(A|B,E)
+b    +e    +a    0.95
+b    +e    ¬a    0.05
+b    ¬e    +a    0.94
+b    ¬e    ¬a    0.06
¬b    +e    +a    0.29
¬b    +e    ¬a    0.71
¬b    ¬e    +a    0.001
¬b    ¬e    ¬a    0.999

A     J     P(J|A)
+a    +j    0.9
+a    ¬j    0.1
¬a    +j    0.05
¬a    ¬j    0.95

A     M     P(M|A)
+a    +m    0.7
+a    ¬m    0.3
¬a    +m    0.01
¬a    ¬m    0.99
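As a concrete reference, here is one way these tables might be written down in code — a minimal Python sketch, where the variable names and dict layout are illustrative choices, not from the slides:

```python
# Minimal sketch of the alarm network's CPTs as plain Python dicts.
# Names and layout are illustrative choices, not from the slides.

P_B = {True: 0.001, False: 0.999}   # P(Burglary)
P_E = {True: 0.002, False: 0.998}   # P(Earthquake)

# P(Alarm = +a | Burglary, Earthquake), keyed by (b, e);
# P(-a | b, e) is just 1 minus the stored value.
P_A_given_BE = {
    (True,  True):  0.95,
    (True,  False): 0.94,
    (False, True):  0.29,
    (False, False): 0.001,
}

P_J_given_A = {True: 0.90, False: 0.05}   # P(+j | a)
P_M_given_A = {True: 0.70, False: 0.01}   # P(+m | a)
```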

Ex: Conditional Dependence: Alarm

[diagram: Alarm → John calls; Alarm → Mary calls]

For this example, use a prior P(+a) = 0.1 (so P(¬a) = 0.9) together with the P(J|A) and P(M|A) tables from the alarm network above.

A     J     M     P(A,J,M)
+a    +j    +m    0.063
+a    +j    ¬m    0.027
+a    ¬j    +m    0.007
+a    ¬j    ¬m    0.003
¬a    +j    +m    0.00045
¬a    +j    ¬m    0.04455
¬a    ¬j    +m    0.00855
¬a    ¬j    ¬m    0.84645

Summing out A:

J     M     P(J,M)
+j    +m    0.06345
+j    ¬m    0.07155
¬j    +m    0.01555
¬j    ¬m    0.84945

Ex: Conditional Dependence:

J     M     P(J,M)
+j    +m    0.06345
+j    ¬m    0.07155
¬j    +m    0.01555
¬j    ¬m    0.84945

As a 2x2 table:

        +m        ¬m
+j      0.06345   0.07155
¬j      0.01555   0.84945

The marginals are P(+j) = 0.135 and P(+m) = 0.079, but P(+j,+m) = 0.06345 ≠ P(+j)P(+m) = 0.010665, so J and M are dependent: they are only conditionally independent given A.
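These numbers can be checked mechanically. Below is a small Python sketch (variable names are mine) that multiplies out P(A, J, M) from the prior P(+a) = 0.1 and the two CPTs, sums out A, and compares P(+j, +m) against P(+j)P(+m):

```python
from itertools import product

P_A = {True: 0.1, False: 0.9}             # prior P(Alarm) from the slide
P_J_given_A = {True: 0.90, False: 0.05}   # P(+j | a)
P_M_given_A = {True: 0.70, False: 0.01}   # P(+m | a)

def p(table, value, cond):
    """P(var = value | cond), from a table that stores P(var = True | cond)."""
    return table[cond] if value else 1 - table[cond]

# Multiply out the joint P(A, J, M)
joint = {(a, j, m): P_A[a] * p(P_J_given_A, j, a) * p(P_M_given_A, m, a)
         for a, j, m in product([True, False], repeat=3)}

# Sum out A to get P(J, M)
pjm = {}
for (a, j, m), pr in joint.items():
    pjm[(j, m)] = pjm.get((j, m), 0.0) + pr

pj = pjm[(True, True)] + pjm[(True, False)]   # P(+j) = 0.135
pm = pjm[(True, True)] + pjm[(False, True)]   # P(+m) = 0.079
print(pjm[(True, True)], pj * pm)             # 0.06345 vs 0.010665: dependent
```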

Building the (Entire) Joint
- We can take a Bayes’ net and build any entry from the full joint distribution it encodes
- Typically, there’s no reason to build ALL of it
- We build what we need on the fly
- To emphasize: every BN over a domain implicitly defines a joint distribution over that domain, specified by local probabilities and graph structure
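For instance, one entry of the alarm network’s joint can be built on the fly as a product of local probabilities — a sketch of the arithmetic, with the numbers taken from the CPTs above:

```python
# P(+b, -e, +a, +j, +m) = P(+b) P(-e) P(+a | +b, -e) P(+j | +a) P(+m | +a)
entry = 0.001 * 0.998 * 0.94 * 0.90 * 0.70
print(entry)   # ~0.00059
```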

Size of a Bayes’ Net
- How big is a joint distribution over N Boolean variables? 2^N
- How big is an N-node net if nodes have up to k parents? O(N * 2^(k+1))
- Both give you the power to calculate any joint entry
- BNs: Huge space savings!
- Also easier to elicit local CPTs
- Also turns out to be faster to answer queries (coming)
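A quick sanity check of the counting argument, with N and k chosen purely for illustration:

```python
N, k = 30, 3                 # 30 Boolean variables, at most 3 parents each
print(2 ** N)                # full joint table: 1,073,741,824 entries
print(N * 2 ** (k + 1))      # Bayes' net bound: at most 480 numbers
```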

Bayes’ Nets So Far
- We now know:
  - What is a Bayes’ net?
  - What joint distribution does a Bayes’ net encode?
- Now: properties of that joint distribution (independence)
- Key idea: conditional independence
- Last class: assembled BNs using an intuitive notion of conditional independence as causality
- Today: formalize these ideas
- Main goal: answer queries about conditional independence and influence
- Next: how to compute posteriors quickly (inference)

Conditional Independence
- Reminder: independence
  - X and Y are independent if P(x, y) = P(x) P(y) for all x, y
- X and Y are conditionally independent given Z if P(x, y | z) = P(x | z) P(y | z) for all x, y, z
- (Conditional) independence is a property of a distribution

Example: Independence
- For this graph, you can fiddle with the CPTs all you want, but you won’t be able to represent any distribution in which the flips are dependent!

[diagram: two unconnected nodes X_1 and X_2]

X_1: P(h) = 0.5, P(t) = 0.5
X_2: P(h) = 0.5, P(t) = 0.5

[diagram: the distributions this graph can encode, inside the set of all distributions]

Topology Limits Distributions
- Given some graph topology G, only certain joint distributions can be encoded
- The graph structure guarantees certain (conditional) independences
- (There might be more independence)
- Adding arcs increases the set of distributions, but has several costs
- Full conditioning can encode any distribution

[diagrams: three graphs over X, Y, Z with progressively more arcs]

Independence in a BN
- Important question about a BN:
  - Are two nodes independent given certain evidence?
  - If yes, can prove using algebra (tedious in general)
  - If no, can prove with a counterexample
- Example: X → Y → Z
  - Question: are X and Z necessarily independent?
  - Answer: no. Example: low pressure causes rain, which causes traffic.
  - X can influence Z, Z can influence X (via Y)
  - Addendum: for particular CPTs they could still be independent: how?

Three Amigos!

Causal Chains
- This configuration is a “causal chain”: X → Y → Z
  - X: Project due, Y: Autograder down, Z: Students panic
- Is Z independent of X given Y? Yes!
- Evidence along the chain “blocks” the influence

Common Cause
- Another basic configuration: two effects of the same cause: X ← Y → Z
  - Y: Homework due, X: Full attendance, Z: Students sleepy
- Are X and Z independent? No, in general.
- Are X and Z independent given Y? Yes!
- Observing the cause blocks influence between effects.

Common Effect
- Last configuration: two causes of one effect (v-structure): X → Y ← Z
  - X: Raining, Z: Ballgame, Y: Traffic
- Are X and Z independent?
  - Yes: the ballgame and the rain cause traffic, but they are not correlated
  - Still need to prove they must be (try it!)
- Are X and Z independent given Y?
  - No: seeing traffic puts the rain and the ballgame in competition as explanations
- This is backwards from the other cases
- Observing an effect activates influence between possible causes.
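This “explaining away” effect is easy to see numerically. The sketch below uses made-up CPTs (the 0.1 priors and the P(+t | r, b) table are illustrative assumptions, not from the slides): conditioned on traffic, learning there is a ballgame makes rain much less likely.

```python
from itertools import product

# Illustrative numbers (assumptions, not from the slides): rain R and
# ballgame B are independent a priori; either one tends to cause traffic T.
P_R, P_B = 0.1, 0.1
P_T = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}   # P(+t | r, b)

def joint(r, b, t):
    pr = P_R if r else 1 - P_R
    pb = P_B if b else 1 - P_B
    pt = P_T[(r, b)] if t else 1 - P_T[(r, b)]
    return pr * pb * pt

p_t  = sum(joint(r, b, True) for r, b in product([True, False], repeat=2))
p_rt = joint(True, True, True) + joint(True, False, True)
p_tb = joint(True, True, True) + joint(False, True, True)

print(p_rt / p_t)                      # P(+r | +t)      ~= 0.358
print(joint(True, True, True) / p_tb)  # P(+r | +t, +b)  ~= 0.105: explained away
```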

The General Case
- Any complex example can be analyzed using these three canonical cases
- General question: in a given BN, are two variables independent (given evidence)?
- Solution: analyze the graph

[diagrams: causal chain, common cause, (unobserved) common effect]

Reachability
- Recipe: shade evidence nodes
- Attempt 1: if every undirected path between two nodes is blocked by a shaded node, they are conditionally independent
- Almost works, but not quite
- Where does it break?
- Answer: the v-structure at T doesn’t count as a link in a path unless “active”

[diagram: example graph over R, T, B, D, L]

Three Amigos

Inactive Triples: causal chain (middle node observed), common cause (cause observed), common effect (effect and its descendants unobserved)

Reachability (D-Separation)
- Question: Are X and Y conditionally independent given evidence variables {Z}?
- Yes, if X and Y are “separated” by Z
- Look for active paths from X to Y
- No active paths = independence!
- A path is active if each triple is active:
  - Causal chain A → B → C where B is unobserved (either direction)
  - Common cause A ← B → C where B is unobserved
  - Common effect (aka v-structure) A → B ← C where B or one of its descendants is observed
- All it takes to block a path is a single inactive segment

[diagrams: active and inactive triples]
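These three rules are small enough to write down directly. Here is a sketch that classifies a single triple as active or inactive (the parents-dict graph encoding is my own convention); a full path search built on the same idea appears after the Bayes’ Ball slide below:

```python
def triple_active(x, b, y, parents, observed, descendants):
    """Is the path segment x - b - y active, given the observed set?

    parents: dict node -> set of parents; descendants: dict node -> set of
    descendants (precomputed); observed: set of evidence nodes.
    """
    if x in parents.get(b, set()) and y in parents.get(b, set()):
        # common effect (v-structure): active iff b or a descendant is observed
        return b in observed or bool(descendants[b] & observed)
    # causal chain (either direction) or common cause: active iff b unobserved
    return b not in observed
```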

Example

[diagram: graph over R, T, B, T' with independence questions; answer: Yes]

Example

[diagram: graph over R, T, B, D, L, T' with independence questions; answer: Yes]

Example  Variables:  R: Raining  T: Traffic  D: Roof drips  S: I’m sad  Questions: T S D R Yes 26

Causality?
- When Bayes’ nets reflect the true causal patterns:
  - Often simpler (nodes have fewer parents)
  - Often easier to think about
  - Often easier to elicit from experts
- BNs need not actually be causal
  - Sometimes no causal net exists over the domain
  - E.g. consider the variables Traffic and Drips
  - End up with arrows that reflect correlation, not causation
- What do the arrows really mean?
  - Topology may happen to encode causal structure
  - Topology is only guaranteed to encode conditional independence

Example: Traffic
- Basic traffic net: R → T
- Let’s multiply out the joint

R     P(R)
+r    1/4
¬r    3/4

R     T     P(T|R)
+r    +t    3/4
+r    ¬t    1/4
¬r    +t    1/2
¬r    ¬t    1/2

R     T     P(R,T)
+r    +t    3/16
+r    ¬t    1/16
¬r    +t    6/16
¬r    ¬t    6/16

Example: Reverse Traffic
- Reverse causality? T → R

T     P(T)
+t    9/16
¬t    7/16

T     R     P(R|T)
+t    +r    1/3
+t    ¬r    2/3
¬t    +r    1/7
¬t    ¬r    6/7

R     T     P(R,T)
+r    +t    3/16
+r    ¬t    1/16
¬r    +t    6/16
¬r    ¬t    6/16
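Both nets encode the same joint, so the reversed net’s tables can be recovered from it by marginalizing and conditioning. A sketch in Python (exact fractions avoid rounding):

```python
from fractions import Fraction as F

# The shared joint P(R, T), keyed by (r, t)
joint = {(True, True): F(3, 16), (True, False): F(1, 16),
         (False, True): F(6, 16), (False, False): F(6, 16)}

# Marginalize out R to get P(T), then condition to get P(R | T)
p_t = {t: sum(pr for (r, tt), pr in joint.items() if tt == t)
       for t in (True, False)}
p_r_given_t = {(r, t): pr / p_t[t] for (r, t), pr in joint.items()}

print(p_t[True])                   # 9/16
print(p_r_given_t[(True, True)])   # 1/3
print(p_r_given_t[(True, False)])  # 1/7
```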

Example: Coins
- Extra arcs don’t prevent representing independence, just allow non-independence

Independent net (no arc between X_1 and X_2):
X_1: P(h) = 0.5, P(t) = 0.5
X_2: P(h) = 0.5, P(t) = 0.5

Net with arc X_1 → X_2:
X_1: P(h) = 0.5, P(t) = 0.5
X_2: P(h|h) = 0.5, P(t|h) = 0.5, P(h|t) = 0.5, P(t|t) = 0.5

- Adding unneeded arcs isn’t wrong, it’s just inefficient

Changing Bayes’ Net Structure
- The same joint distribution can be encoded in many different Bayes’ nets
- Causal structure tends to be the simplest
- Analysis question: given some edges, what other edges do you need to add?
- One answer: fully connect the graph
- Better answer: don’t make any false conditional independence assumptions

Example: Alternate Alarm

[diagrams: original net (Burglary, Earthquake → Alarm → John calls, Mary calls) vs reversed net (John calls, Mary calls → Alarm → Burglary, Earthquake)]

- If we reverse the edges, we make different conditional independence assumptions
- To capture the same joint distribution, we have to add more edges to the graph

Summary  Bayes nets compactly encode joint distributions  Guaranteed independencies of distributions can be deduced from BN graph structure  D-separation gives precise conditional independence guarantees from graph alone  A Bayes’ net’s joint distribution may have further (conditional) independence that is not detectable until you inspect its specific distribution 33

Example: Alarm Network

[diagram: Burglary → Alarm ← Earthquake; Alarm → John calls; Alarm → Mary calls]

Reachability (the Bayes’ Ball)
- Correct algorithm:
  - Shade in evidence
  - Start at source node
  - Try to reach target by search
- States: pair of (node X, previous state S)
- Successor function:
  - X unobserved:
    - To any child
    - To any parent if coming from a child
  - X observed:
    - From parent to parent
- If you can’t reach a node, it’s conditionally independent of the start node given the evidence

[diagram: the Bayes’ Ball pass/bounce rules]
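A sketch of this search in Python (the parents-dict encoding and all names are my own; every node, including roots, must appear as a key):

```python
from collections import deque

def d_separated(x, y, observed, parents):
    """Bayes' Ball search: True iff x and y are d-separated given observed.

    parents maps every node (roots included) to the set of its parents.
    """
    children = {n: set() for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].add(n)

    # States: (node, how the ball arrived). 'up' = from a child,
    # 'down' = from a parent. Start as if arriving from a child.
    frontier, visited = deque([(x, 'up')]), set()
    while frontier:
        node, arrived = frontier.popleft()
        if (node, arrived) in visited:
            continue
        visited.add((node, arrived))
        if node == y:
            return False                    # reached y: an active path exists
        if node not in observed:
            for c in children[node]:        # unobserved: pass to any child
                frontier.append((c, 'down'))
            if arrived == 'up':             # ...and to parents, if from a child
                for p in parents[node]:
                    frontier.append((p, 'up'))
        elif arrived == 'down':             # observed: bounce parent-to-parent
            for p in parents[node]:
                frontier.append((p, 'up'))
    return True                             # y unreachable: independent

# Example on the alarm network:
parents = {'B': set(), 'E': set(), 'A': {'B', 'E'}, 'J': {'A'}, 'M': {'A'}}
print(d_separated('J', 'M', {'A'}, parents))   # True: independent given Alarm
print(d_separated('B', 'E', {'A'}, parents))   # False: explaining away
```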
