Causal and Bayesian Networks (Chapter 2)
Book: Bayesian Networks and Decision Graphs
Authors: Finn V. Jensen, Thomas D. Nielsen
CSE 655 Probabilistic Reasoning
Faculty of Computer Science, Institute of Business Administration
Presented by Quratulain
Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models
Reasoning Under Uncertainty
Why reason probabilistically?
◦ In many problem domains it is not possible to create complete, consistent models of the world.
◦ If all information were given with certainty, propositional logic (truth tables) could be used.
◦ We want to make rational decisions even when there is not enough information to prove that an action will work.
◦ To deal with uncertain events, we extend the truth values of propositional logic to certainties: numbers between 0 and 1.
Example (the kind of reasoning humans do daily)
“In the morning, my car will not start.”
Possible explanations:
◦ I can hear the starter turn, so there must be power in the battery.
◦ Maybe the fuel was stolen overnight.
◦ The spark plugs are dirty.
◦ Maybe there is dirt in the carburetor.
◦ A loose connection in the ignition system, or something more serious.
A Causal Perspective – The Car Example
Construct a graph representing the causal relationships between the events; this gives structure to the situation and supports reasoning about it.
Variables (nodes) and their states:
◦ Fuel: {yes, no}
◦ CleanSparkPlugs: {yes, no}
◦ FuelMeter: {full, half, empty}
◦ Start: {yes, no}
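For readers who want to experiment, a minimal sketch of this structure in plain Python follows. The variable names and links follow the book's car example (Fuel and CleanSparkPlugs cause Start, and Fuel also determines the FuelMeter reading); the dictionary representation is just one possible choice, not code from the book.

```python
# Sketch of the car-start causal network: each variable with its states,
# and the directed links as a map from each cause to its direct effects.
states = {
    "Fuel":            ["yes", "no"],
    "CleanSparkPlugs": ["yes", "no"],
    "FuelMeter":       ["full", "half", "empty"],
    "Start":           ["yes", "no"],
}

edges = {
    "Fuel":            ["FuelMeter", "Start"],  # fuel affects the meter reading and starting
    "CleanSparkPlugs": ["Start"],               # spark plugs affect starting
    "FuelMeter":       [],
    "Start":           [],
}
```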
Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models
Causal Networks and d-Separation
A causal network consists of a set of variables and a set of directed links between them; mathematically, this structure is a directed graph.
Causal networks can be used to follow how a change of certainty in one variable may change the certainty of other variables.
Three Cases of Evidence Transmission
◦ Serial connections
◦ Diverging connections (common cause)
◦ Converging connections (common effect)
For a serial connection A → B → C with B instantiated: P(C | A, B) = P(C | B).
Serial Connections
Evidence about A will influence the certainty of B, which then influences the certainty of C. Similarly, evidence about C will influence the certainty of A through B.
If the state of B is known, the channel is blocked and A and C become independent: we say that A and C are d-separated given B.
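A quick numeric illustration of the serial-connection identity P(C | A, B) = P(C | B), using made-up probabilities (not taken from the book): the sketch builds the joint distribution of a chain A → B → C and checks that, once B is fixed, A carries no further information about C.

```python
# Illustrative check that in a serial chain A -> B -> C, P(C | A, B) = P(C | B).
# The numbers below are arbitrary; "y"/"n" are the two states of each variable.
p_a = {"y": 0.3, "n": 0.7}
p_b_given_a = {"y": {"y": 0.8, "n": 0.2}, "n": {"y": 0.1, "n": 0.9}}
p_c_given_b = {"y": {"y": 0.6, "n": 0.4}, "n": {"y": 0.25, "n": 0.75}}

def joint(a, b, c):
    # Chain rule for the serial connection: P(A) * P(B|A) * P(C|B).
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

for a in "yn":
    for b in "yn":
        denom = joint(a, b, "y") + joint(a, b, "n")
        p_c_given_ab = joint(a, b, "y") / denom
        # Both columns agree: A is irrelevant once B is known.
        print(a, b, round(p_c_given_ab, 3), round(p_c_given_b[b]["y"], 3))
```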
Diverging Connections (common cause)
Influence can pass between all the children of A unless the state of A is known. Once A is instantiated, the channels are blocked: B, C, ..., E are d-separated given A.
Example: sex {male, female} as the common cause of length of hair {long, short} and stature {<168 cm, ≥168 cm}.
Converging Connections (common effect)
If nothing is known about A or its descendants, the parents are independent: evidence about one of them cannot influence the certainties of the others through A. If A or one of its descendants receives evidence, the connection opens and the parents can influence each other.
d-Separation
Two distinct variables A and B in a causal network are d-separated if, for every path between A and B, there is an intermediate variable V (distinct from A and B) such that either:
◦ the connection is serial or diverging and V is instantiated, or
◦ the connection is converging and neither V nor any of V's descendants have received evidence.
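This definition can be checked mechanically. The sketch below implements a d-separation test using the standard reachability ("Bayes-ball") formulation rather than the book's own wording; the function name, the parent-map representation, and the example calls are choices made here for illustration.

```python
from collections import deque

def d_separated(x, y, z, parents):
    """Return True if x and y are d-separated given the evidence set z.

    `parents` maps every node to a list of its parents.
    """
    # Derive child lists from the parent lists.
    children = {n: [] for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].append(n)

    # Phase 1: z and all ancestors of z (needed for converging connections).
    ancestors_of_z = set()
    queue = deque(z)
    while queue:
        n = queue.popleft()
        if n not in ancestors_of_z:
            ancestors_of_z.add(n)
            queue.extend(parents[n])

    # Phase 2: breadth-first search over (node, direction) pairs from x.
    # "up" = arrived from a child, "down" = arrived from a parent.
    visited, reachable = set(), set()
    queue = deque([(x, "up")])
    while queue:
        n, direction = queue.popleft()
        if (n, direction) in visited:
            continue
        visited.add((n, direction))
        if n not in z:
            reachable.add(n)
        if direction == "up" and n not in z:
            queue.extend((p, "up") for p in parents[n])
            queue.extend((c, "down") for c in children[n])
        elif direction == "down":
            if n not in z:                      # serial / diverging connection is open
                queue.extend((c, "down") for c in children[n])
            if n in ancestors_of_z:             # converging connection is open
                queue.extend((p, "up") for p in parents[n])

    return y not in reachable

# Example with the car network (FuelMeter <- Fuel -> Start <- CleanSparkPlugs):
car = {"Fuel": [], "CleanSparkPlugs": [], "FuelMeter": ["Fuel"],
       "Start": ["Fuel", "CleanSparkPlugs"]}
print(d_separated("FuelMeter", "Start", {"Fuel"}, car))        # True: diverging, Fuel known
print(d_separated("Fuel", "CleanSparkPlugs", set(), car))      # True: converging, no evidence
print(d_separated("Fuel", "CleanSparkPlugs", {"Start"}, car))  # False: common effect observed
```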
Example (for the network shown on the slide)
◦ Are B and C independent given A?
◦ Are B and C independent given F?
Markov Blanket
The Markov blanket of a variable A is the set consisting of:
◦ the parents of A,
◦ the children of A, and
◦ the variables sharing a child with A.
The Markov blanket has the property that when all of its variables are instantiated, A is d-separated from the rest of the network.
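A small sketch of how the Markov blanket could be computed, using the same hypothetical parent-map representation as the earlier sketches:

```python
def markov_blanket(a, parents):
    """Parents of a, children of a, and variables sharing a child with a."""
    children = [n for n, ps in parents.items() if a in ps]
    co_parents = {p for c in children for p in parents[c] if p != a}
    return set(parents[a]) | set(children) | co_parents

# Example with the car network: the Markov blanket of Fuel is
# {FuelMeter, Start, CleanSparkPlugs}.
car = {"Fuel": [], "CleanSparkPlugs": [], "FuelMeter": ["Fuel"],
       "Start": ["Fuel", "CleanSparkPlugs"]}
print(markov_blanket("Fuel", car))
```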
Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models
Bayesian Networks
A Bayesian network consists of the following:
◦ a set of variables and a set of directed edges between variables;
◦ each variable has a finite set of mutually exclusive states;
◦ the variables together with the directed edges form an acyclic directed graph;
◦ to each variable A with parents B1, ..., Bn, a conditional probability table P(A | B1, ..., Bn) is attached.
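As an illustration, the conditional probability tables for the car example could be stored as nested dictionaries, one row per parent configuration. The layout is a choice made for this sketch, and the numbers are purely illustrative, not figures from the book.

```python
# Sketch of CPT storage for the car example: variable -> parent states -> P(state).
cpts = {
    # P(Fuel) and P(CleanSparkPlugs): no parents, so a single prior row each.
    "Fuel":            {(): {"yes": 0.98, "no": 0.02}},
    "CleanSparkPlugs": {(): {"yes": 0.96, "no": 0.04}},
    # P(FuelMeter | Fuel): one row per state of Fuel.
    "FuelMeter": {
        ("yes",): {"full": 0.39,  "half": 0.60,  "empty": 0.01},
        ("no",):  {"full": 0.001, "half": 0.001, "empty": 0.998},
    },
    # P(Start | Fuel, CleanSparkPlugs): one row per parent configuration.
    "Start": {
        ("yes", "yes"): {"yes": 0.99, "no": 0.01},
        ("yes", "no"):  {"yes": 0.01, "no": 0.99},
        ("no", "yes"):  {"yes": 0.0,  "no": 1.0},
        ("no", "no"):   {"yes": 0.0,  "no": 1.0},
    },
}
```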
Bayesian Networks
For the example network, the probabilities to specify are P(A), P(B), P(C | A, B), P(E | C), P(D | C), P(F | E), and P(G | D, E, F).
It has been claimed that prior probabilities introduce an unwanted bias into the model. However, prior probabilities are necessary, because prior certainty assessments are an integral part of human reasoning about certainty.
The model should not include conditional independences that do not hold in the real world; the d-separation properties can be used to check the conditional independences encoded in the model.
Chain Rule for Bayesian Networks
Let BN be a Bayesian network over U = {A1, ..., An}. Then BN specifies a unique joint probability distribution P(U), given by the product of all conditional probability tables specified in BN:
P(U) = P(A1, ..., An) = Π_i P(Ai | pa(Ai)),
where pa(Ai) denotes the set of parents of Ai.
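A minimal sketch of the chain rule in code, reusing the `cpts` dictionary and the `car` parent map from the sketches above; the `joint_probability` helper is an assumption made here for illustration, not code from the book.

```python
def joint_probability(assignment, parents, cpts):
    """Chain rule: multiply P(A | pa(A)) over every variable A.

    `assignment` maps every variable to a state; `parents` maps each variable
    to an ordered list of its parents; `cpts` follows the layout sketched above.
    """
    p = 1.0
    for var, ps in parents.items():
        parent_states = tuple(assignment[q] for q in ps)
        p *= cpts[var][parent_states][assignment[var]]
    return p

# Example: probability that the car has fuel, clean spark plugs,
# a full fuel-meter reading, and starts (with the illustrative numbers above).
car = {"Fuel": [], "CleanSparkPlugs": [], "FuelMeter": ["Fuel"],
       "Start": ["Fuel", "CleanSparkPlugs"]}
print(joint_probability(
    {"Fuel": "yes", "CleanSparkPlugs": "yes", "FuelMeter": "full", "Start": "yes"},
    car, cpts))  # 0.98 * 0.96 * 0.39 * 0.99 ≈ 0.363
```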
Outline
◦ Reasoning under uncertainty
◦ Causal networks and d-separation
◦ Bayesian networks
◦ Graphical models
Graphical Models
A graphical specification is easy for humans to read and helps focus attention.
The basic property of Bayesian networks is the chain rule, which yields a compact representation of the joint probability distribution.
A graphical model represents the causal relations in a knowledge domain.
Questions