Bayesian Network By Zhang Liliang

Key Points Today
- Intro to Bayesian Networks
- Usage of Bayesian Networks
- Reasoning in BNs: D-separation

Bayesian Network Definition
Difficulty {easy, hard}, Intelligence {low, high}, Grade {A, B, C}, SAT {low_mark, high_mark}, Letter {No, Yes}
A Bayesian network defines a joint distribution in terms of a graph over a collection of random variables. P(D, I, G, S, L) = ?

Joint Distribution of a BN
General form (chain rule):
P(D, I, G, S, L) = P(L|D, I, G, S) * P(S|D, I, G) * P(G|D, I) * P(I|D) * P(D)
Can the formula be simplified?

Conditional Independence
For (sets of) random variables X, Y, Z: X is conditionally independent of Y given Z, denoted (X ⊥ Y | Z), if:
- P(X, Y|Z) = P(X|Z) P(Y|Z)
- P(X|Y, Z) = P(X|Z)
- P(Y|X, Z) = P(Y|Z)
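The three conditions above are equivalent, which is easy to check numerically. Below is a minimal sketch with made-up numbers: we construct a joint over binary X, Y, Z that satisfies X ⊥ Y | Z by building it as P(z)·P(x|z)·P(y|z), then verify two of the equivalent forms.

```python
import itertools

# Illustrative (made-up) numbers: build a joint P(X, Y, Z) that satisfies
# X ⊥ Y | Z by construction, i.e. P(x, y, z) = P(z) P(x|z) P(y|z).
pz = {0: 0.3, 1: 0.7}
px_z = {0: {0: 0.2, 1: 0.8}, 1: {0: 0.6, 1: 0.4}}  # P(X | Z)
py_z = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.1, 1: 0.9}}  # P(Y | Z)

joint = {(x, y, z): pz[z] * px_z[z][x] * py_z[z][y]
         for x, y, z in itertools.product((0, 1), repeat=3)}

def marg(x=None, y=None, z=None):
    """Marginal probability, summing over any unspecified variables."""
    return sum(v for (xx, yy, zz), v in joint.items()
               if (x is None or xx == x)
               and (y is None or yy == y)
               and (z is None or zz == z))

# Form 1: P(X, Y | Z) = P(X | Z) P(Y | Z), checked at x=0, y=1, z=0
lhs = marg(x=0, y=1, z=0) / marg(z=0)
rhs = (marg(x=0, z=0) / marg(z=0)) * (marg(y=1, z=0) / marg(z=0))
assert abs(lhs - rhs) < 1e-12

# Form 2: P(X | Y, Z) = P(X | Z)
assert abs(marg(x=0, y=1, z=0) / marg(y=1, z=0)
           - marg(x=0, z=0) / marg(z=0)) < 1e-12
```

Any CPD numbers would work here; only the factorized construction of `joint` matters for the independence.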

Joint Distribution of a BN
General form (chain rule):
P(D, I, G, S, L) = P(L|D, I, G, S) * P(S|D, I, G) * P(G|D, I) * P(I|D) * P(D)
In a BN, this simplifies to:
P(D, I, G, S, L) = P(D) * P(I) * P(G|D, I) * P(S|I) * P(L|G)
Parameters for the full joint table: 2*2*3*2*2 - 1 = 47
Parameters for the BN: 1 + 1 + 8 + 2 + 3 = 15
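The parameter counts can be computed mechanically from the variable cardinalities and parent sets: the full table needs one free parameter per cell minus one for normalization, while each CPD needs (cardinality − 1) parameters per parent configuration. A short sketch for the student network:

```python
from math import prod

# Cardinalities and parent sets for the student network in the slides
card = {"D": 2, "I": 2, "G": 3, "S": 2, "L": 2}
parents = {"D": [], "I": [], "G": ["D", "I"], "S": ["I"], "L": ["G"]}

# Full joint table: one free parameter per cell, minus 1 for normalization
full_joint_params = prod(card.values()) - 1            # 2*2*3*2*2 - 1 = 47

# BN: each CPD P(V | parents(V)) needs (|V| - 1) free parameters
# per configuration of its parents (prod over an empty parent set is 1)
bn_params = sum((card[v] - 1) * prod(card[p] for p in parents[v])
                for v in card)                         # 1 + 1 + 8 + 2 + 3 = 15
```

The gap between 47 and 15 grows dramatically with more variables, which is the main point of the factorization.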

Bayesian Network Definition (2)
A Bayesian network is a directed acyclic graph (DAG) together with a set of conditional probability distributions (CPDs), one per node given its parents.
P(D, I, G, S, L) = P(D) * P(I) * P(G|D, I) * P(S|I) * P(L|G)
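To make "DAG + CPDs" concrete, here is the student network instantiated with hypothetical CPD numbers (every probability below is made up purely for illustration; only the structure matches the slides). The product of the CPDs along the DAG is itself a proper joint distribution, which we confirm by checking that it sums to 1:

```python
import itertools

# Hypothetical CPDs for the student network; the numbers are illustrative
# only, but each CPD row sums to 1.
P_D = {0: 0.6, 1: 0.4}                                   # Difficulty
P_I = {0: 0.7, 1: 0.3}                                   # Intelligence
P_G = {(0, 0): {0: 0.3, 1: 0.4, 2: 0.3},                 # Grade given (D, I)
       (0, 1): {0: 0.9, 1: 0.08, 2: 0.02},
       (1, 0): {0: 0.05, 1: 0.25, 2: 0.7},
       (1, 1): {0: 0.5, 1: 0.3, 2: 0.2}}
P_S = {0: {0: 0.95, 1: 0.05}, 1: {0: 0.2, 1: 0.8}}       # SAT given I
P_L = {0: {0: 0.1, 1: 0.9}, 1: {0: 0.4, 1: 0.6},         # Letter given G
       2: {0: 0.99, 1: 0.01}}

def joint(d, i, g, s, l):
    """P(D, I, G, S, L) factorized along the DAG."""
    return P_D[d] * P_I[i] * P_G[(d, i)][g] * P_S[i][s] * P_L[g][l]

# The product of CPDs is a proper distribution: it sums to 1
total = sum(joint(d, i, g, s, l)
            for d, i, s, l in itertools.product((0, 1), repeat=4)
            for g in (0, 1, 2))
assert abs(total - 1.0) < 1e-12
```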

Usages of BNs: Reasoning
Three kinds of reasoning: Causal Reasoning, Evidential Reasoning, Intercausal Reasoning

Causal Reasoning: top-down reasoning from causes to effects, e.g., how does knowing the student's Intelligence change our belief about the Letter?

Evidential Reasoning: bottom-up reasoning from effects to causes, e.g., how does observing the Grade change our belief about Difficulty and Intelligence?

Intercausal Reasoning: reasoning between causes of a common effect, e.g., observing both the Grade and Difficulty changes our belief about Intelligence ("explaining away").

How to reason?
Certain cases are tractable:
- a fully observed set of variables
- just one variable unobserved
In general, inference is intractable (NP-complete).
How to deal with the problem? An intuitive first step: D-separation.

Conditional Independence: Revisited
For (sets of) random variables X, Y, Z: X is conditionally independent of Y given Z, denoted (X ⊥ Y | Z), if:
- P(X, Y|Z) = P(X|Z) P(Y|Z)
- P(X|Y, Z) = P(X|Z)
- P(Y|X, Z) = P(Y|Z)
Given an observation of G, is L conditionally independent of D?
Given an observation of I, is G conditionally independent of S?
Given an observation of G, is D conditionally independent of I?
A way to simplify the calculation when reasoning: find more variables that satisfy conditional independence.

Three Simple Network Structures for Conditional Independence: Tail to Tail, Head to Head, Head to Tail

Head to Tail (D → G → L)
(D ⊥ L)? No
(D ⊥ L | G)? Yes

Tail to Tail (G ← I → S)
(G ⊥ S)? No
(G ⊥ S | I)? Yes

Head to Head (D → G ← I)
(D ⊥ I)? Yes
(D ⊥ I | G)? No

D-separation
X and Y are conditionally independent given Z if and only if X and Y are d-separated by Z.
Suppose we have three sets of random variables X, Y and Z. X and Y are d-separated by Z (and therefore conditionally independent given Z) iff every path from any variable in X to any variable in Y is blocked.
A path from variable A to variable B is blocked if it includes a node such that either:
1. the arrows on the path meet head-to-tail or tail-to-tail at the node, and this node is in Z, or
2. the arrows meet head-to-head at the node, and neither the node nor any of its descendants is in Z.
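These blocking rules can be turned directly into code. Below is a minimal sketch for single query variables on the student network: it enumerates all undirected paths between two nodes and applies the two rules to each consecutive triple. This brute-force approach is fine for tiny graphs like this one, though not an efficient general algorithm (Bayes-ball or the reachability procedure would be used in practice):

```python
# Student network from the slides: D -> G, I -> G, I -> S, G -> L
parents = {"D": [], "I": [], "G": ["D", "I"], "S": ["I"], "L": ["G"]}

def children(node):
    return [n for n, ps in parents.items() if node in ps]

def descendants(node):
    """All nodes reachable from `node` via directed edges."""
    out, stack = set(), [node]
    while stack:
        for c in children(stack.pop()):
            if c not in out:
                out.add(c)
                stack.append(c)
    return out

def undirected_paths(x, y, path=None):
    """Yield all simple paths from x to y, ignoring edge direction."""
    path = path or [x]
    if path[-1] == y:
        yield path
        return
    for nbr in parents[path[-1]] + children(path[-1]):
        if nbr not in path:
            yield from undirected_paths(x, y, path + [nbr])

def d_separated(x, y, z):
    """True iff every path from x to y is blocked given observed set z."""
    z = set(z)
    for path in undirected_paths(x, y):
        blocked = False
        for a, b, c in zip(path, path[1:], path[2:]):
            if b in children(a) and b in children(c):
                # head-to-head (collider): blocks unless b or one of
                # its descendants is observed
                if b not in z and not (descendants(b) & z):
                    blocked = True
            elif b in z:
                # head-to-tail or tail-to-tail: blocks when b is observed
                blocked = True
        if not blocked:
            return False
    return True
```

Running it reproduces the three slides above: `d_separated("D", "L", {"G"})` and `d_separated("G", "S", {"I"})` are True, `d_separated("D", "I", set())` is True, and `d_separated("D", "I", {"G"})` is False (the observed collider G couples its parents).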

Summary
Bayesian Network = Directed Acyclic Graph + Conditional Probability Distributions
Joint distribution: P(D, I, G, S, L) = P(D) * P(I) * P(G|D, I) * P(S|I) * P(L|G)
Three types of reasoning in BNs: causal, evidential, intercausal
Conditional Independence & D-separation

References
- Machine Learning course, CMU
- Probabilistic Graphical Models, Stanford, on Coursera
- SamIam: a comprehensive Bayesian network tool, UCLA

Thanks