Bayesian Network : An Introduction May 2005 김 진형 KAIST



2 BN = graph theory + probability theory
Qualitative part: graph theory
Directed acyclic graph (DAG)
Nodes: variables
Edges: dependency or influence of one variable on another
Quantitative part: probability theory
A set of conditional probabilities for all variables
Naturally handles the problems of complexity and uncertainty.

3 Bayesian Network is
A framework for representing uncertainty in our knowledge
A graphical modeling framework of causality and influence
A representation of the dependencies among random variables
A compact representation of the joint probability of variables, based on the concept of conditional independence
(Example network: Burglar, Earthquake, Alarm, Radio)

4 Bayesian Network Syntax
A set of nodes, one per variable
A directed, acyclic graph (link = "directly influences")
A conditional distribution for each node given its parents: P(Xi | Parents(Xi))
In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values
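In code, this syntax amounts to storing, for each node, its parent list and a CPT indexed by parent values. A minimal sketch in Python, using the shape of the burglary/earthquake alarm example from the slides; the probability numbers here are illustrative assumptions, not taken from the lecture:

```python
# A Bayesian network as {variable: (parents, CPT)}. Each CPT maps a tuple of
# parent values to P(X = True). Numbers are illustrative only.
network = {
    "Burglary":   ((), {(): 0.001}),
    "Earthquake": ((), {(): 0.002}),
    "Alarm": (("Burglary", "Earthquake"),
              {(True, True): 0.95, (True, False): 0.94,
               (False, True): 0.29, (False, False): 0.001}),
}

def prob(var, value, assignment):
    """P(var = value | parent values taken from assignment)."""
    parents, cpt = network[var]
    p_true = cpt[tuple(assignment[p] for p in parents)]
    return p_true if value else 1.0 - p_true

print(prob("Alarm", True, {"Burglary": True, "Earthquake": False}))  # 0.94
```

Note that each CPT row only needs P(X = True); the complementary probability is derived, which is what keeps the representation compact.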

5 Earthquake Example
I'm at work; neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes the alarm is set off by minor earthquakes. Is there a burglar?
Variables: Burglar, Earthquake, Alarm, JohnCalls, MaryCalls
Network Topology
A burglar can set the alarm off
An earthquake can set the alarm off
The alarm can cause Mary to call
The alarm can cause John to call

6 Earthquake Example

7 Representation of Joint Probability
Joint probability as a product of conditional probabilities
Can dramatically reduce the number of parameters needed for data modeling in Bayesian networks.
(Example network over variables A, B, C, D, E)

8 Causal Networks
Node: event
Arc: causal relationship between two nodes
A → B: A causes B
Causal network for the car start problem [Jensen 01]
(Nodes: Fuel, Clean Spark Plugs, Fuel Meter Standing, Start)

9 Reasoning with Causal Networks
My car does not start.
→ increases the certainty of no fuel and of dirty spark plugs
→ increases the certainty that the fuel meter reads "empty"
The fuel meter reads "half".
→ decreases the certainty of no fuel
→ increases the certainty of dirty spark plugs

10 Structuring Bayesian Network
Initial configuration of a Bayesian network
Root nodes: prior probabilities
Non-root nodes: conditional probabilities given all possible combinations of direct predecessors
Example (edges A→C, A→D, B→D, D→E):
P(a), P(b)
P(c|a), P(c|¬a)
P(d|a,b), P(d|a,¬b), P(d|¬a,b), P(d|¬a,¬b)
P(e|d), P(e|¬d)

11 Structuring Bayesian Network
P(Fu = Yes) = 0.98, P(CSP = Yes) = 0.96
P(FMS | Fu):
Fu       FMS=Full   FMS=Half   FMS=Empty
Yes      …          …          …
No       …          …          …
P(St | Fu, CSP):
(Fu, CSP)     Start=No   Start=Yes
(No, Yes)     1          0
(No, No)      1          0
(Yes, No)     …          …
(Yes, Yes)    …          …
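The reasoning pattern on slide 9 can be checked numerically once the tables are filled in. The sketch below keeps the priors given on this slide (P(Fu=Yes)=0.98, P(CSP=Yes)=0.96) and the fact that a car with no fuel never starts; all other CPT entries were lost from the slide, so the values used here are illustrative assumptions:

```python
# Car-start network (Jensen's example). Priors are from the slide;
# the remaining CPT entries are assumed for illustration.
P_fu_yes = 0.98
P_csp_yes = 0.96
# P(Start=Yes | Fu, CSP); Fu=No rows are 0, as on the slide.
P_st = {(True, True): 0.99, (True, False): 0.01,
        (False, True): 0.0, (False, False): 0.0}
# P(FMS | Fu) over readings "full"/"half"/"empty"; assumed values.
P_fms = {True:  {"full": 0.39,  "half": 0.60,  "empty": 0.01},
         False: {"full": 0.001, "half": 0.001, "empty": 0.998}}

def posterior_no_fuel(fms=None):
    """P(Fu=No | Start=No [, FMS=fms]) by summing out CSP."""
    weights = {}
    for fu in (True, False):
        p = P_fu_yes if fu else 1 - P_fu_yes
        p *= sum((P_csp_yes if csp else 1 - P_csp_yes) * (1 - P_st[(fu, csp)])
                 for csp in (True, False))
        if fms is not None:
            p *= P_fms[fu][fms]
        weights[fu] = p
    return weights[False] / (weights[True] + weights[False])

print(posterior_no_fuel())        # P(no fuel | car does not start)
print(posterior_no_fuel("half"))  # much smaller once the meter reads "half"
```

With these numbers, observing that the car does not start raises P(no fuel) from 0.02 to roughly 0.29, and the additional "half" meter reading drives it back down, exactly the qualitative behavior described on slide 9.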

12 Independence Assumptions & Complexity
Problem with the full joint distribution: 2^n − 1 parameters for n boolean variables
For 5 variables, 31 parameters
Solution by BN: for 5 variables, only 10 parameters
P(a), P(b)
P(c|a), P(c|¬a)
P(d|a,b), P(d|a,¬b), P(d|¬a,b), P(d|¬a,¬b)
P(e|d), P(e|¬d)
Bayesian networks have built-in independence assumptions.

13 Independence Assumptions
Serial connection (A → C → B) and diverging connection (A ← C → B):
A and B are marginally dependent, but conditionally independent given C.
Converging connection (A → C ← B):
A and B are marginally independent, but conditionally dependent given C.

14 Independence Assumptions: Car Start Problem
1. 'Start' and 'Fuel' are dependent on each other.
2. 'Start' and 'Clean Spark Plugs' are dependent on each other.
3. 'Fuel' and 'Fuel Meter Standing' are dependent on each other.
4. 'Fuel' and 'Clean Spark Plugs' are conditionally dependent on each other given the value of 'Start'.
5. 'Fuel Meter Standing' and 'Start' are conditionally independent given the value of 'Fuel'.

15 Quantitative Specification by Probability Calculus
Fundamentals:
Conditional probability
Product rule
Chain rule: a successive application of the product rule
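The formulas themselves did not survive the transcript; the three rules named above, in standard notation, are:

```latex
% Conditional probability (definition)
P(A \mid B) = \frac{P(A, B)}{P(B)}, \qquad P(B) > 0

% Product rule
P(A, B) = P(A \mid B)\, P(B)

% Chain rule (successive application of the product rule)
P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \ldots, X_{i-1})
```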

16 Main Issues in BN
Inference in Bayesian networks
Given an assignment of a subset of variables (evidence) in a BN, estimate the posterior distribution over another subset of unobserved variables of interest.
Learning Bayesian networks from data
Parameter learning: given a data set, estimate the local probability distributions P(Xi | Pa(Xi)) for all variables (nodes) comprising the BN.
Structure learning: given a data set, search for a network structure G (dependency structure) that is best, or at least plausible.

17 Evaluating Networks
Evaluation of a network (inference): computation of each node's conditional probability given the evidence
Types of evaluation:
Exact inference: an NP-hard problem in general
Approximate inference: not exact, but within a small distance of the correct answer

18 Inference Task in Bayesian networks

19 Inference in Bayesian networks
Joint distribution
Definition: for a set of boolean variables (a, b), the joint distribution is P(a,b), P(¬a,b), P(a,¬b), P(¬a,¬b)
Role: the joint distribution gives all the information about the probability distribution.
Ex: P(a|b) = P(a,b) / P(b) = P(a,b) / (P(a,b) + P(¬a,b))
For n random variables, 2^n − 1 independent entries
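The worked example above can be reproduced directly from a joint table. A small sketch with illustrative numbers (any four non-negative values summing to 1 would do):

```python
# Full joint distribution over two boolean variables (a, b).
# Illustrative numbers that sum to 1.
joint = {(True, True): 0.3, (False, True): 0.2,
         (True, False): 0.4, (False, False): 0.1}

# P(a | b) = P(a, b) / P(b) = P(a, b) / (P(a, b) + P(¬a, b))
p_ab     = joint[(True, True)]
p_not_ab = joint[(False, True)]
p_a_given_b = p_ab / (p_ab + p_not_ab)
print(p_a_given_b)  # 0.3 / 0.5 = 0.6
```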

20 Inference in Bayesian networks
The joint distribution of a BN is uniquely defined as the product of the individual conditional distributions of the random variables, using the chain rule, a topological sort, and the dependency structure.
Ex (edges a→c, a→d, b→d, d→e):
P(a,b,c,d,e) = P(a) P(b) P(c|a) P(d|a,b) P(e|d)

21 Inference in Bayesian networks
Example: joint probability P(a,b,c,d,e)
Chain rule (topological order): P(a,b,c,d,e) = P(a) P(b|a) P(c|a,b) P(d|a,b,c) P(e|a,b,c,d)
Independence assumptions:
b is independent of a
c is independent of b given a
d is independent of c given a, b
e is independent of a, b, c given d
Hence P(a,b,c,d,e) = P(a) P(b) P(c|a) P(d|a,b) P(e|d)

22 Exact inference
Two network types:
Singly connected network (polytree): at most one undirected path between any two nodes
Multiply connected network
Complexity depends on the network type: singly connected networks can be solved efficiently.

23 Inference by enumeration
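The enumeration algorithm itself was on a slide image that did not survive the transcript. A self-contained sketch of it on the earthquake/alarm network of slide 5 follows; the CPT values are the ones commonly quoted for this example (an assumption here, not taken from the slides):

```python
# Exact inference by enumeration: sum the factored joint over all
# unobserved variables, in topological order.
NET = {
    "B": ((), {(): 0.001}),          # Burglary
    "E": ((), {(): 0.002}),          # Earthquake
    "A": (("B", "E"), {(True, True): 0.95, (True, False): 0.94,
                       (False, True): 0.29, (False, False): 0.001}),
    "J": (("A",), {(True,): 0.90, (False,): 0.05}),   # JohnCalls
    "M": (("A",), {(True,): 0.70, (False,): 0.01}),   # MaryCalls
}
ORDER = ["B", "E", "A", "J", "M"]  # topological order

def prob(var, value, ev):
    parents, cpt = NET[var]
    p = cpt[tuple(ev[p] for p in parents)]
    return p if value else 1.0 - p

def enumerate_all(vars_, ev):
    if not vars_:
        return 1.0
    first, rest = vars_[0], vars_[1:]
    if first in ev:  # evidence: take its probability and recurse
        return prob(first, ev[first], ev) * enumerate_all(rest, ev)
    return sum(prob(first, v, ev) * enumerate_all(rest, {**ev, first: v})
               for v in (True, False))  # hidden: sum it out

def query(var, ev):
    """Posterior P(var = True | ev), normalized over both values."""
    dist = {v: enumerate_all(ORDER, {**ev, var: v}) for v in (True, False)}
    return dist[True] / (dist[True] + dist[False])

# P(Burglary | JohnCalls, MaryCalls): about 0.284 with these numbers
print(query("B", {"J": True, "M": True}))
```

The running time is exponential in the number of hidden variables, which is why the later slides turn to clustering and to approximate (sampling) methods.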

24 Evaluation Tree

25 Exact inference: Multiply Connected Networks
Multiply connected networks are hard to evaluate: given evidence on D, what is P(C|D)?
D affects C directly, and also indirectly through the other path
Probabilities can be affected both by neighboring nodes and by more distant nodes

26 Exact inference: Multiply Connected Networks (cont.)
A method to evaluate the network exactly: clustering
Combine nodes until the resulting graph is singly connected
Ex: in the sprinkler network, merging Sprinkler and Rain into a single node Spr+Rain (with CPT P(S,R|C)) turns the loop Cloudy → {Sprinkler, Rain} → WetGrass into the chain Cloudy → Spr+Rain → WetGrass

27 Inference by Stochastic Simulation

28 Sampling from an empty network
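The sampling procedure was also on a lost slide image. The standard technique it names, prior (ancestral) sampling from a network with no evidence, generates each variable in topological order given its already-sampled parents. A sketch on the same alarm network, with the usual illustrative CPT values (an assumption):

```python
import random

# "Empty" network = no evidence. Sample each variable in topological
# order, conditioning only on the parent values already drawn.
NET = [
    ("B", (), {(): 0.001}),
    ("E", (), {(): 0.002}),
    ("A", ("B", "E"), {(True, True): 0.95, (True, False): 0.94,
                       (False, True): 0.29, (False, False): 0.001}),
    ("J", ("A",), {(True,): 0.90, (False,): 0.05}),
    ("M", ("A",), {(True,): 0.70, (False,): 0.01}),
]

def prior_sample(rng):
    sample = {}
    for var, parents, cpt in NET:  # topological order
        p_true = cpt[tuple(sample[p] for p in parents)]
        sample[var] = rng.random() < p_true
    return sample

# Estimate P(JohnCalls) by counting over many samples.
rng = random.Random(0)
n = 100_000
freq = sum(prior_sample(rng)["J"] for _ in range(n)) / n
print(freq)  # close to the true value of about 0.052
```

Each sample is drawn from the exact joint distribution, so event frequencies converge to the true probabilities; rejection weighting or likelihood weighting extends this to queries with evidence.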

29 Real World Applications of BN
Intelligent agents: Microsoft Office assistant (Bayesian user modeling)
Medical diagnosis: PATHFINDER (Heckerman, 1992), diagnosis of lymph-node disease, commercialized as INTELLIPATH
Control decision support systems
Speech recognition (HMMs)
Genome data analysis: gene expression, DNA sequences, combined analysis of heterogeneous data
Turbo codes (channel coding)

30 MSBNx
A Bayesian network building tool from Microsoft Research
Freely downloadable
Features:
Graphical editing of Bayesian networks
Exact probability calculations
XML format
MSBN3 ActiveX DLL provides a COM-based API for editing and evaluating Bayesian networks

31 Conclusion
Bayesian networks address the complexity and uncertainty problems of traditional probability theory.
Reasons for using BNs:
BNs require far fewer numbers (parameters) than the full joint distribution
Efficient exact solution methods exist, as well as a variety of approximation schemes