Causal and Bayesian Networks (Chapter 2). Book: Bayesian Networks and Decision Graphs. Authors: Finn V. Jensen, Thomas D. Nielsen. CSE 655 Probabilistic Reasoning.


Causal and Bayesian Networks (Chapter 2)
Book: Bayesian Networks and Decision Graphs
Authors: Finn V. Jensen, Thomas D. Nielsen
CSE 655 Probabilistic Reasoning
Faculty of Computer Science, Institute of Business Administration
Presented by Quratulain

Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models
9/16/2009 Quratulain

Reasoning Under Uncertainty
Why reason probabilistically?
◦ In many problem domains it is not possible to create complete, consistent models of the world.
◦ If all information is given with certainty, propositional logic (truth tables) can be used.
◦ We want to make rational decisions even when there is not enough information to prove that an action will work.
◦ To deal with uncertain events, we extend the truth values of propositional logic to certainties: numbers between 0 and 1.

Example (the type of reasoning humans do daily): "In the morning, my car will not start." Possible explanations:
◦ I can hear the starter turn, so there must be power in the battery.
◦ Maybe the fuel was stolen overnight.
◦ The spark plugs are dirty.
◦ Maybe there is dirt in the carburetor.
◦ A loose connection in the ignition system, or something more serious.

A Causal Perspective – the Car Example
Construct a graph representing the causal relationships between events; this gives structure to the situation and supports reasoning about it.

Variable (node)     States
Fuel                {yes, no}
CleanSparkPlugs     {yes, no}
FuelMeter           {full, half, empty}
Start               {yes, no}
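As a minimal sketch, the causal graph for the car example can be encoded as a parent map (child to its direct causes). The edge set here is an assumption read off the usual version of this example: Fuel and CleanSparkPlugs cause Start, and Fuel causes the FuelMeter reading.

```python
# Hypothetical encoding of the car-start causal graph: node -> list of parents.
car_model = {
    "Fuel": [],
    "CleanSparkPlugs": [],
    "FuelMeter": ["Fuel"],                    # the meter reading is caused by the fuel level
    "Start": ["Fuel", "CleanSparkPlugs"],     # starting depends on fuel and spark plugs
}

# States of each variable, taken from the table above.
states = {
    "Fuel": ["yes", "no"],
    "CleanSparkPlugs": ["yes", "no"],
    "FuelMeter": ["full", "half", "empty"],
    "Start": ["yes", "no"],
}

def parents(model, node):
    """Direct causes of a node."""
    return model[node]

def children(model, node):
    """Nodes that the given node directly influences."""
    return [c for c, ps in model.items() if node in ps]
```

With this representation, reasoning about the structure (e.g. which nodes a change in Fuel can influence directly) is a dictionary lookup.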

Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models

Causal Networks and d-Separation
A causal network consists of a set of variables and a set of directed links between variables. Mathematically, this structure is called a directed graph. Causal networks can be used to follow how a change of certainty in one variable may change the certainty of other variables.

Three Cases of Evidence Transmission
◦ Serial connections
◦ Diverging connections (common cause)
◦ Converging connections (common effect)
For a serial connection A → B → C with B instantiated: P(C | A, B) = P(C | B).

Serial Connections
Evidence about A will influence the certainty of B, which in turn influences the certainty of C. Similarly, evidence about C will influence the certainty of A through B. If the state of B is known, the channel is blocked and A and C become independent: we say that A and C are d-separated given B.
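The blocking effect in a serial connection can be checked numerically. Below is a sketch of a chain A → B → C with made-up (hypothetical) probability tables; it verifies that P(C | A, B) = P(C | B), i.e. once B is known, A carries no further information about C.

```python
# Hypothetical serial chain A -> B -> C with made-up numbers.
p_a = {"t": 0.3, "f": 0.7}
p_b_given_a = {"t": {"t": 0.9, "f": 0.1}, "f": {"t": 0.2, "f": 0.8}}
p_c_given_b = {"t": {"t": 0.6, "f": 0.4}, "f": {"t": 0.1, "f": 0.9}}
vals = ["t", "f"]

def joint_abc(a, b, c):
    # Chain rule along the serial connection.
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

def p_c_given_ab(c, a, b):
    """P(C = c | A = a, B = b), computed from the joint."""
    return joint_abc(a, b, c) / sum(joint_abc(a, b, c2) for c2 in vals)

def p_c_given_b_only(c, b):
    """P(C = c | B = b), summing A out of the joint."""
    num = sum(joint_abc(a, b, c) for a in vals)
    den = sum(joint_abc(a, b, c2) for a in vals for c2 in vals)
    return num / den
```

Both conditionals come out equal (to the table entry P(C = t | B = t) = 0.6), which is exactly the d-separation claim for serial connections.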

Diverging Connections
Influence can pass between all the children of A as long as A is not known. If A is instantiated, the channels are blocked: B, C, ..., E are d-separated given A. Example: sex (male, female) as the common cause of length of hair (long, short) and stature (<168 cm, ≥168 cm).
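The sex/hair/stature example can be made concrete with hypothetical numbers (all probabilities below are invented for illustration). Marginally, hair length carries information about stature; once sex is instantiated, the channel is blocked and the two become independent.

```python
# Hypothetical numbers for the diverging graph Sex -> HairLength, Sex -> Stature.
p_sex = {"male": 0.5, "female": 0.5}
p_hair_given_sex = {"male": {"long": 0.1, "short": 0.9},
                    "female": {"long": 0.7, "short": 0.3}}
p_stature_given_sex = {"male": {"tall": 0.8, "short": 0.2},
                       "female": {"tall": 0.3, "short": 0.7}}

def p_joint(sex, hair, stature):
    # Chain rule: hair and stature are conditionally independent given sex.
    return p_sex[sex] * p_hair_given_sex[sex][hair] * p_stature_given_sex[sex][stature]

# Marginally, hair length changes the probability of being tall...
p_tall = sum(p_joint(s, h, "tall") for s in p_sex for h in ["long", "short"])
p_tall_given_long = (sum(p_joint(s, "long", "tall") for s in p_sex)
                     / sum(p_joint(s, "long", st) for s in p_sex for st in ["tall", "short"]))

# ...but once sex is known (instantiated), the channel is blocked:
p_tall_given_male = p_stature_given_sex["male"]["tall"]
p_tall_given_male_long = (p_joint("male", "long", "tall")
                          / sum(p_joint("male", "long", st) for st in ["tall", "short"]))
```

With these numbers, P(tall) = 0.55 but P(tall | long hair) = 0.3625, so the children are marginally dependent; given sex = male, long hair changes nothing.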

Converging Connections
If nothing is known about A (the common child), then the parents are independent: evidence about one of them cannot influence the certainties of the others through A. If A, or one of A's descendants, receives evidence, the channel opens and the parents may become dependent.
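A small numeric sketch of a converging connection A → C ← B (all numbers are an assumption for illustration): A and B are independent fair coins and C is deterministically "A or B". Without evidence on C the parents are independent; given C, evidence about one parent changes belief in the other.

```python
# Converging connection A -> C <- B: A, B independent fair coins, C = A or B.
vals = [True, False]

def p_conv(a, b, c):
    """Joint probability of the converging-connection toy model."""
    consistent = (c == (a or b))
    return 0.25 if consistent else 0.0   # 0.5 * 0.5 for (a, b), times an indicator for c

# Without evidence on C, B tells us nothing about A (parents independent):
p_a_given_b = (sum(p_conv(True, True, c) for c in vals)
               / sum(p_conv(a, True, c) for a in vals for c in vals))

# Given C = True, A becomes more likely...
p_a_given_c = (sum(p_conv(True, b, True) for b in vals)
               / sum(p_conv(a, b, True) for a in vals for b in vals))

# ...but learning B = True "explains away" C, dropping A back to its prior:
p_a_given_cb = p_conv(True, True, True) / sum(p_conv(a, True, True) for a in vals)
```

Here P(A | B) = 0.5 = P(A), while P(A | C) = 2/3 and P(A | C, B) = 0.5: the parents are independent until the common effect is observed.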

D-separation
Two distinct variables A and B in a causal network are d-separated if, for every path between A and B, there is an intermediate variable V (distinct from A and B) such that either:
◦ the connection through V is serial or diverging and V is instantiated, or
◦ the connection through V is converging, and neither V nor any of V's descendants have received evidence.

Example
Are B and C independent given A? Are B and C independent given F?

Markov Blanket
The Markov blanket of a variable A is the set consisting of:
◦ the parents of A,
◦ the children of A, and
◦ the variables sharing a child with A.
The Markov blanket has the property that when all of its variables are instantiated, A is d-separated from the rest of the network.
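The three-part definition translates directly into code. A minimal sketch, using the same parent-map representation as above:

```python
def markov_blanket(parents_map, a):
    """Markov blanket of a: parents, children, and co-parents
    (the other parents of a's children)."""
    children = [n for n, ps in parents_map.items() if a in ps]
    blanket = set(parents_map[a]) | set(children)
    for c in children:
        blanket |= set(parents_map[c])   # variables sharing a child with a
    blanket.discard(a)
    return blanket
```

For the car example, the blanket of Fuel contains its children FuelMeter and Start plus CleanSparkPlugs, which shares the child Start with Fuel.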

Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models

Bayesian Network
A Bayesian network consists of the following:
◦ A set of variables and a set of directed edges between variables.
◦ Each variable has a finite set of mutually exclusive states.
◦ The variables together with the directed edges form an acyclic directed graph.
◦ To each variable A with parents B1, ..., Bn, a conditional probability table P(A | B1, ..., Bn) is attached.

Bayesian Network
For the example network, the probabilities to specify are P(A), P(B), P(C | A, B), P(E | C), P(D | C), P(F | E), and P(G | D, E, F). It has been claimed that prior probabilities bias the model; however, prior probabilities are necessary, because prior certainty assessments are an integral part of human reasoning about certainty. The model should not include conditional independences that do not hold in the real world; the d-separation properties can be used to check the conditional independences in the model.

Chain Rule for Bayesian Networks
Let BN be a Bayesian network over U = {A1, ..., An}. Then BN specifies a unique joint probability distribution P(U), given by the product of all conditional probability tables specified in BN:

P(U) = ∏i P(Ai | pa(Ai)),

where pa(Ai) denotes the parents of Ai.
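The chain rule can be sketched on a two-node fragment of the car example, Fuel → Start, with hypothetical numbers: the joint distribution is just the product of the two tables, and it sums to one.

```python
# Hypothetical CPTs for the fragment Fuel -> Start.
p_fuel = {"yes": 0.98, "no": 0.02}
p_start_given_fuel = {"yes": {"yes": 0.95, "no": 0.05},
                      "no":  {"yes": 0.0,  "no": 1.0}}

def joint_fs(fuel, start):
    # Chain rule: P(Fuel, Start) = P(Fuel) * P(Start | Fuel).
    return p_fuel[fuel] * p_start_given_fuel[fuel][start]

# The product of CPTs is a proper joint distribution...
total = sum(joint_fs(f, s) for f in p_fuel for s in ["yes", "no"])

# ...from which any marginal can be read off by summation:
p_start_yes = sum(joint_fs(f, "yes") for f in p_fuel)
```

With these numbers the joint sums to 1 and P(Start = yes) works out to 0.98 · 0.95 = 0.931.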

Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models

Graphical Model
A graphical specification is easy for humans to read and helps focus attention. The basic property of Bayesian networks is the chain rule, which gives a compact representation of the joint probability distribution. A graphical model represents the causal relations in a knowledge domain.

Questions