Computer Science CPSC 322 Lecture 19 Bayesian Networks: Construction 1.


Announcements
Assignment 4 was posted last Friday. You can already do Q1, Q2, and part of Q3 after today's lecture.
It is due Friday, April 1, 11:59pm.
Solutions for the assignment will be posted on Wed, April 6 (to account for late days). This will give you plenty of time to study them and discuss them with us before the final.

Lecture Overview
- Recap lecture 18
- Defining Conditional Probabilities in a Bnet
- Considerations on Network Structure
- Inference: Intro

Marginal Independence
Intuitively: if X ╨ Y, then learning that Y = y does not change your belief in X, and this holds for every value y that Y could take.
For example, the weather is marginally independent of the result of a coin toss.

Exploiting marginal independence
Recall the product rule: P(X=x ∧ Y=y) = P(X=x | Y=y) × P(Y=y).
If X and Y are marginally independent, P(X=x | Y=y) = P(X=x).
Thus we have P(X=x ∧ Y=y) = P(X=x) × P(Y=y).
In distribution form: P(X,Y) = P(X) × P(Y).
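As a small illustration (not from the original slides), the sketch below builds the joint distribution of two marginally independent binary variables from their marginals; the variable names and numbers are made up for the example.

```python
# Minimal sketch: for marginally independent variables, the joint
# distribution is just the product of the marginals.
p_rain = {True: 0.3, False: 0.7}   # P(Rain)            (made-up number)
p_coin = {True: 0.5, False: 0.5}   # P(Coin = heads)

# P(Rain, Coin) = P(Rain) * P(Coin)
joint = {(r, c): p_rain[r] * p_coin[c] for r in p_rain for c in p_coin}

print(joint[(True, True)])   # 0.15
print(sum(joint.values()))   # 1.0 (up to floating point)
```

Two marginals (one number per variable) determine all four joint entries; with n independent Boolean variables, n numbers replace the 2^n - 1 entries of the full JPD.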

Exploiting marginal independence
With n marginally independent variables, P(X_1, …, X_n) = P(X_1) × … × P(X_n), so we only need to specify the individual marginals: exponentially fewer numbers than the full JPD!

Conditional Independence
Intuitively: if X and Y are conditionally independent given Z, then learning that Y = y does not change your belief in X when we already know Z = z, and this holds for every value y that Y could take and every value z that Z could take.

Example of Conditional Independence
Whether light l1 is lit (Lit-l1) and the position of switch s2 (Up-s2) are not marginally independent: the position of the switch determines whether there is power in the wire w0 connected to the light.
However, Lit-l1 is conditionally independent of Up-s2 given whether there is power in wire w0 (Power-w0): once we know Power-w0, learning the value of Up-s2 does not change our beliefs about Lit-l1.
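As a quick numeric check (not from the slides), the sketch below encodes the chain Up-s2 → Power-w0 → Lit-l1 with made-up probabilities and verifies that P(Lit-l1 | Power-w0, Up-s2) does not depend on Up-s2:

```python
# Sketch: chain Up(s2) -> Power(w0) -> Lit(l1), with made-up numbers.
p_up = 0.8                              # P(Up-s2 = t)
p_power = {True: 0.95, False: 0.05}     # P(Power-w0 = t | Up-s2)
p_lit = {True: 0.99, False: 0.0}        # P(Lit-l1 = t | Power-w0)

def joint(up, power, lit):
    p = p_up if up else 1 - p_up
    p *= p_power[up] if power else 1 - p_power[up]
    p *= p_lit[power] if lit else 1 - p_lit[power]
    return p

# P(Lit-l1 = t | Power-w0 = t, Up-s2 = u) is the same for both values of u:
for u in (True, False):
    num = joint(u, True, True)
    den = joint(u, True, True) + joint(u, True, False)
    print(u, num / den)   # both print 0.99
```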

Exploiting Conditional Independence
The chain rule allows us to write the JPD as a product of conditional distributions; conditional independence allows us to write those conditionals compactly, so under independence we gain compactness.
For instance, for four variables the chain rule gives
P(A,B,C,D) = P(A) P(B|A) P(C|A,B) P(D|A,B,C)
and if D is conditionally independent of A and B given C, we can rewrite the above as
P(A,B,C,D) = P(A) P(B|A) P(C|A,B) P(D|C)

Bayesian Network Motivation
We want a representation and reasoning system that is based on conditional (and marginal) independence:
- Compact yet expressive representation
- Efficient reasoning procedures
Bayesian (Belief) Networks are such a representation.

Belief (or Bayesian) networks
Def. A belief network consists of:
- a directed acyclic graph (DAG) in which each node is associated with a random variable X_i;
- a domain for each variable X_i;
- a set of conditional probability distributions, one for each node X_i given its parents Pa(X_i) in the graph: P(X_i | Pa(X_i)).
The parents Pa(X_i) of a variable X_i are the variables that X_i directly depends on.
A Bayesian network is a compact representation of the JPD for a set of variables (X_1, …, X_n):
P(X_1, …, X_n) = ∏_{i=1}^{n} P(X_i | Pa(X_i))

How to build a Bayesian network
1. Define a total order over the random variables: (X_1, …, X_n).
2. Apply the chain rule: P(X_1, …, X_n) = ∏_{i=1}^{n} P(X_i | X_1, …, X_{i-1}).
3. For each X_i, select the smallest set of its predecessors (in the total order) Pa(X_i) such that P(X_i | X_1, …, X_{i-1}) = P(X_i | Pa(X_i)), i.e., such that X_i is conditionally independent of all its other predecessors given Pa(X_i).
4. Then we can rewrite P(X_1, …, X_n) = ∏_{i=1}^{n} P(X_i | Pa(X_i)).
This is a compact representation of the initial JPD: a factorization of the JPD based on existing conditional independencies among the variables.

How to build a Bayesian network (cont'd)
5. Construct the Bayesian network (BN):
- Nodes are the random variables.
- Draw a directed arc from each variable in Pa(X_i) to X_i.
- Define a conditional probability table (CPT) for each variable X_i: P(X_i | Pa(X_i)).
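A rough sketch (not from the slides) of step 3 above: given a total order and a domain-supplied conditional-independence test (here a hypothetical oracle `is_cond_indep`), a brute-force search picks the smallest predecessor set for each variable.

```python
from itertools import combinations

def choose_parents(order, is_cond_indep):
    # For each X_i, find the smallest set Pa of its predecessors such that
    # X_i is conditionally independent of its remaining predecessors given Pa.
    # is_cond_indep(x, others, given) is a hypothetical, domain-supplied test.
    parents = {}
    for i, x in enumerate(order):
        preds = order[:i]
        chosen = list(preds)  # fallback: keep all predecessors (plain chain rule)
        for size in range(len(preds) + 1):
            small = [list(pa) for pa in combinations(preds, size)
                     if is_cond_indep(x, [p for p in preds if p not in pa], list(pa))]
            if small:
                chosen = small[0]
                break
        parents[x] = chosen
    return parents

# Hypothetical usage, with an independence test you would have to supply:
# choose_parents(['Fire', 'Tampering', 'Alarm', 'Smoke', 'Leaving', 'Report'],
#                my_independence_test)
```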

Fire Diagnosis Example
We defined a total order over the variables that follows the causal structure of events: Fire, Tampering, Alarm, Smoke, Leaving, Report.
We obtained the Bayesian network shown in the figure (arcs: Tampering → Alarm, Fire → Alarm, Fire → Smoke, Alarm → Leaving, Leaving → Report) and its corresponding, very compact factorization of the original JPD:
P(F,T,A,S,L,R) = P(F) P(T) P(A | F,T) P(S | F) P(L | A) P(R | L)

Example for BN construction: Fire Diagnosis
We are not done yet: we must specify the Conditional Probability Table (CPT) for each variable. All variables are Boolean.
How many probabilities do we need to specify for this Bayesian network?
For instance, how many probabilities do we need to explicitly specify for Fire?
P(Fire): 1 probability, namely P(Fire = t), because P(Fire = f) can be derived from it (1 - P(Fire = t)).

Lecture Overview
- Recap lecture 18
- Defining Conditional Probabilities in a Bnet
- Considerations on Network Structure
- Inference: Intro

Example for BN construction: Fire Diagnosis
We are not done yet: we must specify the CPT for each variable. All variables are Boolean.
For instance, how many probabilities do we need to explicitly specify for Alarm?
A. 1   B. 2   C. 4   D. 8

Example for BN construction: Fire Diagnosis
How many probabilities do we need to explicitly specify for Alarm?
P(Alarm | Tampering, Fire): 4 probabilities, one for each of the 4 instantiations of the parents.

Example for BN construction: Fire Diagnosis

Tampering T   Fire F   P(Alarm=t|T,F)   P(Alarm=f|T,F)
    t           t          0.5              0.5
    t           f
    f           t
    f           f

We don't need to specify P(Alarm=f|T,F) explicitly, since the probabilities in each row must sum to 1. Each row of this table is a conditional probability distribution.

Example for BN construction: Fire Diagnosis
How many probabilities do we need to explicitly specify for the whole Bayesian network?
A. 6   B. 12   C. 20   D. …

Example for BN construction: Fire Diagnosis
12 probabilities in total, compared to the 2^6 - 1 = 63 entries of the JPD for P(T,F,A,S,L,R):

P(Tampering=t) = 0.02
P(Fire=t) = 0.01

Tampering T   Fire F   P(Alarm=t|T,F)
    t           t          0.5
    t           f          0.85
    f           t          0.99
    f           f          0.0001

Fire F   P(Smoke=t|F)
  t          0.9
  f          0.01

Alarm   P(Leaving=t|A)
  t          0.88
  f          0.001

Leaving   P(Report=t|L)
  t          0.75
  f          0.01

Example for BN construction: Fire Diagnosis
How many probabilities do we need to specify for this Bayesian network (all variables Boolean)?
- P(Tampering): 1 probability, P(T = t)
- P(Fire): 1 probability, P(F = t)
- P(Alarm | Tampering, Fire): 4 (independent) probabilities, one for each of the 4 instantiations of the parents
- For each of the other variables, which have a single parent: 2 probabilities, one for the parent being true and one for the parent being false
In total: 1 + 1 + 4 + 2 + 2 + 2 = 12 (compared to 2^6 - 1 = 63 for the full JPD!)

Example for BN construction: Fire Diagnosis
Once we have the CPTs in the network, we can compute any entry of the JPD. Using the CPTs above:
P(Tampering=t, Fire=f, Alarm=t, Smoke=f, Leaving=t, Report=t)
= P(Tampering=t) × P(Fire=f) × P(Alarm=t | Tampering=t, Fire=f) × P(Smoke=f | Fire=f) × P(Leaving=t | Alarm=t) × P(Report=t | Leaving=t)
= 0.02 × (1 - 0.01) × 0.85 × (1 - 0.01) × 0.88 × 0.75 ≈ 0.011
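The following sketch (not from the slides) encodes these CPTs in Python and evaluates a JPD entry as the product of local conditionals; the value 0.0001 for P(Alarm=t | Tampering=f, Fire=f) is assumed from the textbook's fire-alarm example, since it is not shown in this transcript.

```python
# Sketch: the fire-diagnosis Bnet as parent lists plus CPTs, and a function
# that evaluates one entry of the JPD as a product of local conditionals.
parents = {
    'Tampering': [], 'Fire': [],
    'Alarm': ['Tampering', 'Fire'],
    'Smoke': ['Fire'],
    'Leaving': ['Alarm'],
    'Report': ['Leaving'],
}

# P(X = true | parent values); P(X = false | ...) is 1 minus this.
# The 0.0001 entry is assumed from the textbook fire-alarm example.
cpt = {
    'Tampering': {(): 0.02},
    'Fire': {(): 0.01},
    'Alarm': {(True, True): 0.5, (True, False): 0.85,
              (False, True): 0.99, (False, False): 0.0001},
    'Smoke': {(True,): 0.9, (False,): 0.01},
    'Leaving': {(True,): 0.88, (False,): 0.001},
    'Report': {(True,): 0.75, (False,): 0.01},
}

def joint_prob(assignment):
    # P(X1=x1, ..., Xn=xn) = product over i of P(Xi = xi | Pa(Xi))
    p = 1.0
    for var, value in assignment.items():
        pa_values = tuple(assignment[pa] for pa in parents[var])
        p_true = cpt[var][pa_values]
        p *= p_true if value else (1.0 - p_true)
    return p

# The worked example from the slide: P(T=t, F=f, A=t, S=f, L=t, R=t) ≈ 0.011
print(joint_prob({'Tampering': True, 'Fire': False, 'Alarm': True,
                  'Smoke': False, 'Leaving': True, 'Report': True}))
```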

In Summary
In a belief network, the JPD of the variables involved is defined as the product of the local conditional distributions:
P(X_1, …, X_n) = ∏_i P(X_i | X_1, …, X_{i-1}) = ∏_i P(X_i | Pa(X_i))
Any entry of the JPD can be computed given the CPTs in the network.
Once we know the JPD, we can answer any query about any subset of the variables (see the Inference by Enumeration topic).
Thus, a belief network allows us to answer any query about any subset of the variables.
We will see how this can be done efficiently, but first let's think a bit more about network structure.

Lecture Overview
- Recap lecture 18
- Defining Conditional Probabilities in the network
- Considerations on Network Structure
- Inference: Intro

Compactness
In a Bnet, how many rows do we need to explicitly store for the CPT of a Boolean variable X_i with k Boolean parents?
A. k   B. 2k   C. 2^k

Compactness
A CPT for a Boolean variable X_i with k Boolean parents has 2^k rows, one for each combination of parent values.
Each row requires one number p for X_i = true (the number for X_i = false is just 1 - p).
If each variable has no more than k parents, the complete network requires specifying O(n · 2^k) numbers.
For k << n this is a substantial improvement: the number of probabilities required grows linearly with n, vs. O(2^n) for the full joint distribution.
E.g., for a Bnet with 30 Boolean variables, each with at most 5 parents, we need to specify at most 30 × 2^5 = 960 probabilities, but we would need 2^30 (over a billion) entries for the full JPD.
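A minimal sketch (not from the slides) of this parameter count for Boolean variables:

```python
# One CPT row per parent configuration: 2**k rows for a node with k parents.
def bnet_param_count(parent_counts):
    return sum(2 ** k for k in parent_counts)

# 30 Boolean variables, each with 5 parents, vs. the full JPD:
print(bnet_param_count([5] * 30))   # 960
print(2 ** 30)                      # 1073741824 entries for the full JPD
```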

Realistic Bnet: Liver Diagnosis (source: Onisko et al., 1999)
~60 nodes, max 4 parents per node.
Need ~60 × 2^4 = 960 probabilities instead of the 2^60 probabilities of the full JPD.

Compactness
What happens if the network is fully connected, or k ≈ n?
Not much saving compared to the number of parameters needed to specify the full JPD.
Bnets are useful in sparse (or locally structured) domains: domains in which each component interacts with (is related to) only a small fraction of the other components.
What if this is not the case in a domain we need to reason about?
We may need to make simplifying assumptions to reduce the dependencies in the domain.

"Where do the numbers (CPTs) come from?"
From experts: tedious, costly, and not always reliable.
From data => machine learning: there are algorithms to learn both structures and numbers (CPSC 340, CPSC 422), although it can be hard to get enough data.
Still, usually better than specifying the full JPD.

What if we use a different ordering?
What happens if we use the following order: Leaving, Tampering, Report, Smoke, Alarm, Fire?
We end up with a completely different network structure! (Try it as an exercise.)
Which of the two structures is better?
A. The previous structure   B. This structure   C. There is no difference

Which Structure is Better?
The non-causal network is less compact: 1 + 2 + 2 + 4 + 8 + 8 = 25 numbers needed.
Deciding on conditional independence is hard in non-causal directions; causal models and conditional independence seem hardwired for humans!
Specifying the conditional probabilities may be harder than in the causal direction. For instance, we have lost the direct dependency between the alarm and one of its causes, which essentially describes the alarm's reliability (info often provided by the maker).

Example cont'd
Other than that, our two Bnets for the alarm problem are equivalent as long as they represent the same probability distribution:
P(T,F,A,S,L,R) = P(T) P(F) P(A | T,F) P(S | F) P(L | A) P(R | L)
             = P(L) P(T|L) P(R|L) P(S|L,T) P(A|S,L,T) P(F|S,A,T)
i.e., they are equivalent if the corresponding CPTs are specified so that they satisfy the equation above.
(Variable ordering T, F, A, S, L, R vs. variable ordering L, T, R, S, A, F.)

Are there wrong network structures?
Given an order of variables, a network with arcs in excess of those required by the direct dependencies implied by that order is still OK, just not as efficient. For example, adding Report as an extra parent of Smoke gives
P(L) P(T|L) P(R|L) P(S|L,R,T) P(A|S,L,T) P(F|S,A,T) = P(L) P(T|L) P(R|L) P(S|L,T) P(A|S,L,T) P(F|S,A,T)
One extreme: the fully connected network is always correct but rarely the best choice. It corresponds to just applying the chain rule to the JPD, without leveraging conditional independencies to simplify the factorization:
P(L,T,R,S,A,F) = P(L) P(T|L) P(R|L,T) P(S|L,T,R) P(A|S,L,T,R) P(F|S,A,T,L,R)


Are there wrong network structures?
How can a network structure be wrong? If it misses directed edges that are required.
E.g., if the edge from Fire to Alarm were missing below, the network would claim that Fire is conditionally independent of Alarm given Tampering and Smoke.
But they are not: for instance, P(Fire=t | Smoke=f, Tampering=f, Alarm=t) should be higher than P(Fire=t | Smoke=f, Tampering=f).

Are there wrong network structures? (cont'd)
With that edge missing, the structure is wrong: Fire is not conditionally independent of Alarm given {Tampering, Smoke}.
But remember what we said a few slides back: sometimes we may need to make simplifying assumptions, e.g. assume conditional independence when it does not actually hold, in order to reduce complexity.

Summary of Dependencies in a Bayesian Network
In 1, 2 and 3, X and Y are dependent (grey areas represent existing evidence/observations).
(Figures: a chain X → Z → Y, a common cause X ← Z → Y, and, as case 3, a common effect X → Z ← Y; evidence nodes E are shaded.)
In 3, X and Y become dependent as soon as there is evidence on Z or on any of its descendants. This is because knowledge of one possible cause, given evidence of the effect, explains away the other cause.
(Example instances in the figure: Up(s2) → Power(w0) → Lit(l1); Assignment Grade ← Understood Material → Exam Grade; Fire → Alarm ← Smoking At Sensor.)

Conditional Independencies
Or, blocking paths for probability propagation: three ways in which a path between Y and X (or vice versa) can be blocked, given evidence E.
(Figures: the same three configurations as above, with the blocking evidence shaded.)
In 3, X and Y are independent if there is no evidence on their common effect (recall Fire and Tampering in the alarm example).

Conditional Independence in a Bnet
(This and the following questions refer to the larger example network in the figure, which is not included in this transcript.)
Is H conditionally independent of E given I?
A. True   B. False   C. It depends on the evidence

Conditional Independence in a Bnet
Is H conditionally independent of E given I?
True: since we have no evidence on F, changes in the belief of G (caused by changes in the belief of H) do not affect the belief in E.

(In)Dependencies in a Bnet
Is A conditionally independent of I given F?
A. True   B. False   C. It depends on the evidence

(In)Dependencies in a Bnet
Is A conditionally independent of I given F?
False: since we have evidence on F, changes to the belief in G due to changes in I affect the belief in E, which in turn propagates to A via C and B.
What if node C is not there?

(In)Dependencies in a Bnet
If node C is not there, is A conditionally independent of I given F?
A. True   B. False

(In)Dependencies in a Bnet
True: the lack of evidence on D blocks the path that goes through E, D, B.

Learning Goals so Far
- Given a JPD: marginalize over specific variables; compute distributions over any subset of the variables; use inference by enumeration to compute joint posterior probability distributions over any subset of variables given evidence.
- Define and use marginal and conditional independence.
- Build a Bayesian Network for a given domain (structure).
- Specify the necessary conditional probabilities.
- Compute the representational savings in terms of number of probabilities required.
- Identify dependencies/independencies between nodes in a Bayesian Network.
Now we will see how to do inference in Bnets.

Lecture Overview
- Recap lecture 18
- Defining Conditional Probabilities in a Bnet
- Considerations on Network Structure
- Inference: Intro

Where Are We?
(Course map figure: representation and reasoning techniques organized by environment (deterministic vs. stochastic) and problem type (static vs. sequential query and planning), covering search, arc consistency, constraint satisfaction with vars + constraints, logics, STRIPS, belief nets, variable elimination, decision nets, Markov processes, and value iteration.)
We'll focus on Variable Elimination.

Inference in Bayesian Networks
Given:
- a Bayesian network BN,
- observations of a subset of its variables E: E = e,
- a subset of its variables Y that is queried,
compute the conditional probability P(Y | E = e).
We denote observed variables as shaded.
(Example figures: the Understood Material / Assignment Grade / Exam Grade network and the Fire / Smoking At Sensor / Alarm network, each with some nodes shaded as observed.)

Bayesian Networks: Types of Inference
- Diagnostic (Fire → Alarm → Leaving): people are leaving, P(L=t)=1; query P(F | L=t) = ?
- Predictive (Fire → Alarm → Leaving): fire happens, P(F=t)=1; query P(L | F=t) = ?
- Intercausal (Fire → Alarm ← Smoking At Sensor): the alarm goes off, P(A=t)=1, and a person smokes next to the sensor, S=t; query P(F | A=t, S=t) = ?
- Mixed (Fire → Alarm → Leaving): people are leaving, P(L=t)=1, and there is no fire, F=f; query P(A | F=f, L=t) = ?
We will use the same reasoning procedure for all of these types.
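As a brute-force sketch (not from the slides, and far less efficient than variable elimination), the following code answers such queries by inference by enumeration on the fire-diagnosis network; it reuses the CPTs from the earlier sketch, with the 0.0001 entry again assumed from the textbook example.

```python
from itertools import product

# Brute-force inference by enumeration on the fire-diagnosis network.
parents = {'Tampering': [], 'Fire': [], 'Alarm': ['Tampering', 'Fire'],
           'Smoke': ['Fire'], 'Leaving': ['Alarm'], 'Report': ['Leaving']}
cpt = {'Tampering': {(): 0.02}, 'Fire': {(): 0.01},
       'Alarm': {(True, True): 0.5, (True, False): 0.85,
                 (False, True): 0.99, (False, False): 0.0001},
       'Smoke': {(True,): 0.9, (False,): 0.01},
       'Leaving': {(True,): 0.88, (False,): 0.001},
       'Report': {(True,): 0.75, (False,): 0.01}}

def joint(assign):
    p = 1.0
    for v, val in assign.items():
        pt = cpt[v][tuple(assign[pa] for pa in parents[v])]
        p *= pt if val else 1.0 - pt
    return p

def query(var, evidence):
    # P(var | evidence): sum the joint over all unobserved variables, then normalize.
    hidden = [v for v in parents if v != var and v not in evidence]
    dist = {}
    for val in (True, False):
        total = 0.0
        for combo in product((True, False), repeat=len(hidden)):
            assign = dict(zip(hidden, combo), **evidence, **{var: val})
            total += joint(assign)
        dist[val] = total
    norm = sum(dist.values())
    return {v: p / norm for v, p in dist.items()}

# Diagnostic query from the slide: P(Fire | Leaving = t)
print(query('Fire', {'Leaving': True}))
# Predictive query: P(Leaving | Fire = t)
print(query('Leaving', {'Fire': True}))
```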

Bayesian Networks - AISpace
Try queries with the Belief Networks applet in AISpace: load the Fire Alarm example (from the textbook).
Compare the probability of each query node before and after new evidence is observed; try first to predict how it should change using your understanding of the domain.
Make sure you explore and understand the various other example networks in the textbook and AIspace:
- Electrical Circuit example (textbook ex. 6.11)
- Patient's wheezing and coughing example (ex. 6.14); not in the applet, but you can easily insert it
- Several other examples in AIspace

Example
Suppose that we observe people leaving. Should the probability of Tampering:
A. Go up   B. Go down   C. Stay the same

Example
Suppose that, after observing people leaving, we then observe smoke. Now should the probability of Tampering:
A. Go up   B. Go down   C. Stay the same
