Identifying Conditional Independencies in Bayes Nets Lecture 4.


Getting a Full Joint Table Entry from a Bayes Net Recall: the table entry for X1 = x1, …, Xn = xn is simply P(x1, …, xn), which the Bayes net semantics lets us compute as a product of local conditional probabilities: P(x1, …, xn) = P(x1 | Parents(X1)) · … · P(xn | Parents(Xn)). Recall the example:

Inference Example What is the probability that the alarm sounds, but neither a burglary nor an earthquake has occurred, and both John and Mary call? Using j for JohnCalls, a for Alarm, etc.: P(j, m, a, ¬b, ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
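Plugging in numbers makes the product concrete. A minimal sketch, assuming the CPT values of the standard Russell & Norvig burglary network (P(b) = 0.001, P(e) = 0.002, P(a | ¬b, ¬e) = 0.001, P(j | a) = 0.90, P(m | a) = 0.70) — these are the textbook's values, not taken from this lecture's slides:

```python
# Joint entry P(j, m, a, ~b, ~e) as a product of local conditionals,
# using the textbook CPT values from the Russell & Norvig burglary network.
p_b = 0.001              # P(Burglary)
p_e = 0.002              # P(Earthquake)
p_a_given_nb_ne = 0.001  # P(Alarm | ~Burglary, ~Earthquake)
p_j_given_a = 0.90       # P(JohnCalls | Alarm)
p_m_given_a = 0.70       # P(MaryCalls | Alarm)

p = p_j_given_a * p_m_given_a * p_a_given_nb_ne * (1 - p_b) * (1 - p_e)
print(f"P(j, m, a, ~b, ~e) = {p:.6f}")  # ~ 0.000628
```

One multiplication per node: each factor conditions only on the node's parents, which is why the full joint never has to be materialized.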

Chain Rule A generalization of the product rule, easily proven by repeated application of the product rule. Chain Rule: P(X1, …, Xn) = P(X1) P(X2 | X1) ⋯ P(Xn | X1, …, Xn−1)

Chain Rule and BN Semantics Order the variables so that every node's parents come before it. Each node is conditionally independent of its other predecessors given its parents, so each chain-rule factor P(xi | x1, …, xi−1) reduces to P(xi | Parents(Xi)) — exactly the Bayes net factorization of the joint distribution.

Markov Blanket and Conditional Independence Recall that X is conditionally independent of its predecessors given Parents(X). Markov Blanket of X: the set consisting of the parents of X, the children of X, and the other parents of the children of X. X is conditionally independent of all other nodes in the network given its Markov blanket.
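The blanket can be read straight off the graph. A small sketch, assuming a hypothetical representation of the network as a dictionary mapping each node to its parent set (the node names below are from the burglary example):

```python
def markov_blanket(node, parents):
    """Markov blanket: parents, children, and the children's other parents.
    `parents` maps each node to the set of its parents (a hypothetical
    adjacency representation, not from any particular library)."""
    children = {c for c, ps in parents.items() if node in ps}
    blanket = set(parents.get(node, set())) | children
    for c in children:
        blanket |= parents[c] - {node}   # co-parents of each child
    return blanket

# Burglary network: B -> A <- E, A -> J, A -> M
parents = {"B": set(), "E": set(), "A": {"B", "E"}, "J": {"A"}, "M": {"A"}}
print(markov_blanket("A", parents))  # {'B', 'E', 'J', 'M'}
```

Note that the blanket of B contains E — the co-parent enters through their shared child A, which is exactly the converging-connection effect covered next.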

d-Separation Linear connection (A → B → C): information can flow between A and C if and only if we do not have evidence at B

d-Separation (continued) Diverging connection (A ← B → C): information can flow between A and C if and only if we do not have evidence at B

d-Separation (continued) Converging connection (A → B ← C, where B has descendants D and E): information can flow between A and C if and only if we do have evidence at B or at a descendant of B (such as D or E)

d-Separation An undirected path between two nodes is "cut off" (blocked) if information cannot flow across one of the nodes on the path. Two nodes are d-separated if every undirected path between them is cut off. Two sets of nodes are d-separated if every pair of nodes, one from each set, is d-separated.
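The three connection rules combine into a brute-force d-separation test. A sketch, again assuming a hypothetical node → parent-set dictionary; it enumerates all simple undirected paths, which is exponential in general but fine for tiny networks:

```python
def d_separated(x, y, z, parents):
    """x and y are d-separated given evidence set z iff every undirected
    path between them is blocked. `parents` maps node -> set of parents
    (a hypothetical representation; brute force, for small nets only)."""
    children = {n: {c for c, ps in parents.items() if n in ps} for n in parents}
    z = set(z)

    def descendants(n):                      # nodes reachable via child edges
        out, stack = set(), [n]
        while stack:
            for c in children[stack.pop()]:
                if c not in out:
                    out.add(c)
                    stack.append(c)
        return out

    def paths(a, visited):                   # all simple undirected paths a..y
        if a == y:
            yield [a]
            return
        for n in parents[a] | children[a]:
            if n not in visited:
                for rest in paths(n, visited | {n}):
                    yield [a] + rest

    def active(path):                        # can information flow along path?
        for i in range(1, len(path) - 1):
            prev, mid, nxt = path[i - 1], path[i], path[i + 1]
            if prev in parents[mid] and nxt in parents[mid]:  # converging
                if mid not in z and not (descendants(mid) & z):
                    return False             # blocked: no evidence at B or below
            elif mid in z:                   # linear/diverging blocked by evidence
                return False
        return True

    return not any(active(p) for p in paths(x, {x}))

# Burglary network: B -> A <- E, A -> J, A -> M
parents = {"B": set(), "E": set(), "A": {"B", "E"}, "J": {"A"}, "M": {"A"}}
print(d_separated("B", "E", set(), parents))   # True: converging at A, no evidence
print(d_separated("B", "E", {"J"}, parents))   # False: evidence at A's descendant J
print(d_separated("J", "M", {"A"}, parents))   # True: both paths blocked at A
```

Each interior node of a path is classified as converging or not by checking whether both neighbors on the path are its parents, mirroring the three slide rules exactly.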

An I-Map is a Set of Conditional Independence Statements Write X ⊥ Y | Z to say that the sets of variables X and Y are conditionally independent given Z (that is, given a complete setting of the variables in Z). A set of conditional independence statements K is an I-map for a probability distribution P if and only if the independence statements in K are a subset of the conditional independencies that hold in P. Either K or P can also be a graphical model, rather than a set of independence statements or a distribution.

Note: For Some CPT Choices, More Conditional Independencies May Hold Suppose the network is the chain A → B → C. Then the only conditional independence implied by the graph is A ⊥ C | B. Now choose CPTs such that A must be True, B must take the same value as A, and C must take the same value as B. In the resulting distribution P, every pair of variables is conditionally independent given the third. The Bayes net is still an I-map of P — it just fails to capture all of P's independencies.
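This claim is easy to verify by enumerating the joint table. A sketch, assuming the deterministic CPTs just described (A always True, B copies A, C copies B), with variables indexed 0, 1, 2 for A, B, C:

```python
from itertools import product

# Deterministic CPTs: A always True, B copies A, C copies B.
def joint(a, b, c):
    p_a = 1.0 if a else 0.0
    p_b = 1.0 if b == a else 0.0
    p_c = 1.0 if c == b else 0.0
    return p_a * p_b * p_c

def indep_given(i, j, k):
    """Check X_i is independent of X_j given X_k by full enumeration."""
    vals = [True, False]
    for zk in vals:
        pz = sum(joint(*w) for w in product(vals, repeat=3) if w[k] == zk)
        if pz == 0:
            continue  # conditioning event has probability 0: vacuously satisfied
        for xi, xj in product(vals, repeat=2):
            pxyz = sum(joint(*w) for w in product(vals, repeat=3)
                       if w[i] == xi and w[j] == xj and w[k] == zk)
            pxz = sum(joint(*w) for w in product(vals, repeat=3)
                      if w[i] == xi and w[k] == zk)
            pyz = sum(joint(*w) for w in product(vals, repeat=3)
                      if w[j] == xj and w[k] == zk)
            if abs(pxyz / pz - (pxz / pz) * (pyz / pz)) > 1e-12:
                return False
    return True

# All three pairwise conditional independencies hold in this distribution,
# even though the chain A -> B -> C only guarantees A ⊥ C | B.
print(indep_given(0, 2, 1), indep_given(0, 1, 2), indep_given(1, 2, 0))
```

Since the distribution puts all its mass on the single outcome (True, True, True), every conditional independence check passes, while the graph alone promised only one of them.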

Procedure for BN Construction Choose a relevant set of random variables and order them. While there are variables left: add the next variable X to the network; choose a minimal set Parents(X) from the variables already in the network such that X is conditionally independent of the other already-added variables given Parents(X); then fill in the CPT for X.

Principles to Guide Choices Goal: build a locally structured (sparse) network, in which each component interacts with only a bounded number of other components. Add root causes first, then the variables that they influence, so that parents correspond to direct causes and the CPTs stay compact.