Reasoning Under Uncertainty: More on BNets structure and construction


Reasoning Under Uncertainty: More on BNets structure and construction
Computer Science cpsc322, Lecture 28 (Textbook Chpt 6.3)
June 15, 2017

Belief networks Recap
By considering causal dependencies, we order the variables in the joint.
Apply the chain rule and simplify.
Build a directed acyclic graph (DAG) in which the parents of each var X are those vars on which X directly depends.
By construction, a var is independent of its non-descendants given its parents.
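
Spelled out, this is the standard chain-rule factorization (notation ours, consistent with the textbook's treatment):

```latex
P(X_1,\dots,X_n)
  = \prod_{i=1}^{n} P(X_i \mid X_{i-1},\dots,X_1)
  = \prod_{i=1}^{n} P(X_i \mid \mathrm{parents}(X_i))
```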

Belief Networks: open issues
Independencies: Does a BNet encode more independencies than the ones specified by construction?
Compactness: We reduce the number of probabilities from 2^n − 1 (the full JPD over n binary variables) to the sum of the CPT sizes, at most n·2^k numbers when each variable has at most k parents. In some domains we need to do better than that! Still too many, and often there are no data/experts for an accurate assessment.
Solution: make stronger (approximate) independence assumptions.

Lecture Overview
- Implied Conditional Independence relations in a Bnet
- Compactness: Making stronger Independence assumptions
- Representation of Compact Conditional Distributions
- Network structure (Naïve Bayesian Classifier)

Bnets: Entailed (in)dependencies
Indep(Report, Fire, {Alarm})?
Indep(Leaving, SeeSmoke, {Fire})?

Conditional Independencies
Or: blocking paths for probability propagation. There are three ways in which a path between X and Y can be blocked (in 1 and 2, the middle node must be in the evidence E):
1. Chain: X → Z → Y is blocked when Z is observed.
2. Common cause: X ← Z → Y is blocked when Z is observed (this is intuitive).
3. Common effect (v-structure): X → Z ← Y is blocked when neither Z nor any of its descendants is observed.
Note that, in 3, X and Y become dependent as soon as we get evidence on Z or on any of its descendants.

Or… Conditional Dependencies
The same three configurations, read the other way: the path between X and Y is open, and probability can propagate, when
1. Chain: X → Z → Y and Z is not observed.
2. Common cause: X ← Z → Y and Z is not observed (this is intuitive).
3. Common effect: X → Z ← Y and Z, or one of its descendants, is in the evidence E.
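
To make the blocking rules concrete, here is a minimal Python sketch (our own code, not from the slides) of a d-separation test using the classic moralized-ancestral-graph criterion, which is equivalent to the path rules above: X is independent of Y given Z iff X and Y are disconnected in the moral graph of the ancestors of {X, Y} ∪ Z once Z is removed. The usage example assumes the lecture's running fire-alarm network (Fire → Alarm ← Tampering, Fire → Smoke → SeeSmoke, Alarm → Leaving → Report).

```python
def d_separated(parents, x, y, z):
    """parents: dict node -> set of parent nodes (a DAG).
    Returns True iff x is independent of y given the evidence set z."""
    # 1. Restrict attention to x, y, z and all of their ancestors.
    relevant, stack = set(), [x, y, *z]
    while stack:
        n = stack.pop()
        if n not in relevant:
            relevant.add(n)
            stack.extend(parents.get(n, ()))

    # 2. Moralize: link each node to its parents, and "marry" co-parents
    #    of a common child (this captures rule 3: observing a common
    #    effect couples its causes).
    adj = {n: set() for n in relevant}
    for child in relevant:
        ps = [p for p in parents.get(child, ()) if p in relevant]
        for p in ps:
            adj[child].add(p); adj[p].add(child)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                adj[p].add(q); adj[q].add(p)

    # 3. Drop the evidence nodes and check reachability from x to y.
    seen, stack = {x}, [x]
    while stack:
        n = stack.pop()
        if n == y:
            return False            # connected => d-connected
        for m in adj[n] - seen - set(z):
            seen.add(m); stack.append(m)
    return True

# Assumed structure of the lecture's fire-alarm example:
parents = {"Fire": set(), "Tampering": set(),
           "Alarm": {"Fire", "Tampering"}, "Smoke": {"Fire"},
           "SeeSmoke": {"Smoke"}, "Leaving": {"Alarm"},
           "Report": {"Leaving"}}
print(d_separated(parents, "Report", "Fire", {"Alarm"}))     # True  (rule 1)
print(d_separated(parents, "Leaving", "SeeSmoke", {"Fire"})) # True  (rule 2)
print(d_separated(parents, "Fire", "Tampering", {"Alarm"}))  # False (rule 3)
```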

In/Dependencies in a Bnet: Example 1
[Figure: example network, not recoverable from the transcript]
Is A conditionally independent of I given F?
FALSE, because evidence from A reaches E via C.

In/Dependencies in a Bnet: Example 3
[Figure: example network, not recoverable from the transcript]
Is H conditionally independent of E given I?
Yes.

Lecture Overview
- Implied Conditional Independence relations in a Bnet
- Compactness: Making stronger Independence assumptions
- Representation of Compact Conditional Distributions
- Network structure (Naïve Bayesian Classifier)

More on Construction and Compactness: Compact Conditional Distributions
Once we have established the topology of a Bnet, we still need to specify the conditional probabilities.
How? From data, or from experts.
To facilitate acquisition, we aim for compact representations for which data/experts can provide accurate assessments.

More on Construction and Compactness: Compact Conditional Distributions
From the joint PD (2^n − 1 numbers) to a product of CPTs. But still, a CPT grows exponentially with the number of parents.
In a semi-realistic model of internal medicine with 448 nodes and 906 links, 133,931,430 values are required! And often there are no data/experts for an accurate assessment.
Goal: reduce to 8,254.
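
As a sanity check on the sizes (standard counts, not from the slides): a Boolean effect with k Boolean parents needs one number per parent assignment in a full CPT, but only one inhibitor probability per parent under the noisy-OR model introduced below:

```latex
\text{full CPT: } 2^{k} \text{ numbers}
\qquad
\text{noisy-OR: } k \text{ numbers } (q_1,\dots,q_k)
```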

Effect with multiple non-interacting causes
What do we need to specify? A full CPT for Fever given its parents Malaria, Flu, and Cold:

Malaria Flu Cold | P(Fever=T|..) | P(Fever=F|..)
T       T   T    | ?             | ?
T       T   F    | ?             | ?
T       F   T    | ?             | ?
T       F   F    | ?             | ?
F       T   T    | ?             | ?
F       T   F    | ?             | ?
F       F   T    | ?             | ?
F       F   F    | ?             | ?

What do you think data/experts could easily tell you? It is more difficult to get the info needed to assess the more complex conditioning cases…

Solution: Noisy-OR Distributions
Models multiple non-interacting causes: logic OR with a probabilistic twist. In the purely logical OR conditional probability table, Fever is true whenever at least one cause is true:

Malaria Flu Cold | P(Fever=T|..) | P(Fever=F|..)
T       T   T    | 1             | 0
T       T   F    | 1             | 0
T       F   T    | 1             | 0
T       F   F    | 1             | 0
F       T   T    | 1             | 0
F       T   F    | 1             | 0
F       F   T    | 1             | 0
F       F   F    | 0             | 1

Solution: Noisy-OR Distributions
The Noisy-OR model allows for uncertainty in the ability of each cause to generate the effect (e.g., one may have a cold without a fever).
Two assumptions:
1. All possible causes are listed.
2. For each of the causes, whatever inhibits it from generating the target effect is independent of the inhibitors of the other causes.

Noisy-OR: Derivations
Causes C1, …, Ck with effect Effect. For each of the causes, whatever inhibits it from generating the target effect is independent of the inhibitors of the other causes. Independent probability of failure qi for each cause alone:
P(Effect=F | Ci=T, and no other causes) = qi
Question: P(Effect=F | C1=T, …, Cj=T, Cj+1=F, …, Ck=F) = ?

Noisy-OR: Derivations
For each of the causes, whatever inhibits it from generating the target effect is independent of the inhibitors of the other causes. Independent probability of failure qi for each cause alone:
P(Effect=F | Ci=T, and no other causes) = qi
P(Effect=F | C1=T, …, Cj=T, Cj+1=F, …, Ck=F) = ∏_{i=1}^{j} qi
P(Effect=T | C1=T, …, Cj=T, Cj+1=F, …, Ck=F) = 1 − ∏_{i=1}^{j} qi

Noisy-OR: Example
P(Fever=F | Cold=T, Flu=F, Malaria=F) = 0.6
P(Fever=F | Cold=F, Flu=T, Malaria=F) = 0.2
P(Fever=F | Cold=F, Flu=F, Malaria=T) = 0.1
Using P(Effect=F | C1=T, …, Cj=T, Cj+1=F, …, Ck=F) = ∏_{i=1}^{j} qi, the full CPT follows:

Malaria Flu Cold | P(Fever=T|..) | P(Fever=F|..)
T       T   T    | 0.988         | 0.1 × 0.2 × 0.6 = 0.012
T       T   F    | 0.98          | 0.1 × 0.2 = 0.02
T       F   T    | 0.94          | 0.1 × 0.6 = 0.06
T       F   F    | 0.9           | 0.1
F       T   T    | 0.88          | 0.2 × 0.6 = 0.12
F       T   F    | 0.8           | 0.2
F       F   T    | 0.4           | 0.6
F       F   F    | 0.0           | 1.0

The number of probabilities is linear in the number of parents (one qi per cause).
Model of internal medicine: 133,931,430 → 8,254.
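
A short Python sketch (our own code, not from the slides) that rebuilds the table above from the three per-cause inhibitor probabilities:

```python
from itertools import product

# q_i = P(Fever=F | only cause i present), taken from the slide.
q = {"Malaria": 0.1, "Flu": 0.2, "Cold": 0.6}

def noisy_or_cpt(q):
    """Full CPT for P(Effect | causes) under noisy-OR:
    P(Effect=F | active causes) = product of their q_i
    (empty product = 1, so with no active cause the effect is false)."""
    cpt = {}
    for row in product([True, False], repeat=len(q)):
        p_false = 1.0
        for qi, active in zip(q.values(), row):
            if active:
                p_false *= qi
        cpt[row] = (1.0 - p_false, p_false)   # (P(T | row), P(F | row))
    return cpt

for row, (pt, pf) in noisy_or_cpt(q).items():
    flags = ", ".join(f"{c}={'T' if a else 'F'}" for c, a in zip(q, row))
    print(f"{flags}:  P(T)={pt:.3f}  P(F)={pf:.3f}")
# First line: Malaria=T, Flu=T, Cold=T:  P(T)=0.988  P(F)=0.012
```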

Lecture Overview
- Implied Conditional Independence relations in a Bnet
- Compactness: Making stronger Independence assumptions
- Representation of Compact Conditional Distributions
- Network structure (Naïve Bayesian Classifier)

Naïve Bayesian Classifier
A very simple and successful kind of Bnet that allows us to classify entities into a set of classes C, given a set of attributes.
Example: determine whether an email is spam (only two classes, spam=T and spam=F). Useful attributes of an email?
Assumptions:
1. The value of each attribute depends on the classification.
2. (Naïve) The attributes are independent of each other given the classification:
P("bank" | "account", spam=T) = P("bank" | spam=T)
Bnets do not need to be complex to be useful!
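
Written out, the two assumptions give the usual naïve Bayes posterior (standard form; notation ours):

```latex
P(C \mid A_1,\dots,A_n) \;\propto\; P(C)\,\prod_{i=1}^{n} P(A_i \mid C)
```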

What is the structure?
Assumptions:
1. The value of each attribute depends on the classification.
2. (Naïve) The attributes are independent of each other given the classification.
[Figure: three candidate structures A, B, C over the class node "Email Spam" and the attribute nodes Email contains "free", Email contains "money", Email contains "ubc", Email contains "midterm"]
Bnets do not need to be complex to be useful!

Naïve Bayesian Classifier for Email Spam
Assumptions:
1. The value of each attribute depends on the classification.
2. (Naïve) The attributes are independent of each other given the classification.
Structure: the class node "Email Spam" is the single parent of each attribute node (Email contains "free" / "money" / "ubc" / "midterm").
Number of parameters? With four Boolean attributes: 1 for P(Spam) plus 2 per attribute for P(word | Spam=T) and P(word | Spam=F), i.e. 9.
Easy to acquire? Yes, if you have a large collection of emails for which you know whether they are spam or not…
Bnets do not need to be complex to be useful!

NB Classifier for Email Spam: Usage
Most likely class given a set of observations. Is a given email E spam? E.g., "free money for you now".
Observed attributes: contains "free" = T, contains "money" = T, contains "ubc" = F, contains "midterm" = F.
Email E is spam if P(spam=T) ∏_i P(a_i | spam=T) > P(spam=F) ∏_i P(a_i | spam=F).
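
A minimal naïve Bayes decision sketch in Python (our own code; all the probability values below are made-up placeholders, not values from the lecture):

```python
priors = {"spam": 0.4, "not_spam": 0.6}          # P(class), illustrative
p_word = {                                        # P(email contains w | class), illustrative
    "spam":     {"free": 0.50, "money": 0.40, "ubc": 0.02, "midterm": 0.01},
    "not_spam": {"free": 0.05, "money": 0.05, "ubc": 0.30, "midterm": 0.20},
}

def classify(words_present):
    """Return the class maximizing P(class) * prod_i P(attribute_i | class),
    where attribute_i is 'email contains word w_i' (observed T or F)."""
    scores = {}
    for c in priors:
        score = priors[c]
        for w, p in p_word[c].items():
            score *= p if w in words_present else (1.0 - p)
        scores[c] = score
    return max(scores, key=scores.get), scores

# "free money for you now" -> contains "free" and "money", not "ubc"/"midterm"
label, scores = classify({"free", "money"})
print(label, scores)   # 'spam' wins: the evidence words are far likelier under spam
```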

For another example of a naïve Bayesian Classifier, see textbook Example 6.16: a help system that determines which help page a user is interested in, based on the keywords they give in a query.

Learning Goals for this class
You can:
- Given a Belief Net, determine whether one variable is conditionally independent of another variable, given a set of observations.
- Define and use Noisy-OR distributions. Explain their assumptions and benefit.
- Implement and use a naïve Bayesian classifier. Explain its assumptions and benefit.

Next Class
Bayesian Networks Inference: Variable Elimination
Course Elements:
- Work on Practice Exercises 6A and 6B.
- Assignment 3 is due on Tue the 20th!
- Assignment 4 will be available the same day, and due TBA as soon as I know when the final will be.