UNIVERSITY OF SOUTH CAROLINA Department of Computer Science and Engineering CSCE 580 Artificial Intelligence Ch.6 [P]: Reasoning Under Uncertainty Sections 6.2 and 6.3: Independence and Belief (Bayesian) Networks


UNIVERSITY OF SOUTH CAROLINA Department of Computer Science and Engineering CSCE 580 Artificial Intelligence Ch.6 [P]: Reasoning Under Uncertainty Sections 6.2 and 6.3: Independence and Belief (Bayesian) Networks Fall 2009 Marco Valtorta Probability does not exist. --Bruno de Finetti, 1970 It is remarkable that a science which began with the consideration of games of chance should become the most important object of human knowledge... The most important questions of life are, for the most part, really only problems of probability... The theory of probabilities is at bottom nothing but common sense reduced to calculus. --Pierre Simon de Laplace, 1812

Acknowledgment The slides are based on the textbook [P] and other sources, including other fine textbooks: –[AIMA-2] –David Poole, Alan Mackworth, and Randy Goebel. Computational Intelligence: A Logical Approach. Oxford, 1998. A second edition (by Poole and Mackworth) is under development; Dr. Poole allowed us to use a draft of it in this course. –Ivan Bratko. Prolog Programming for Artificial Intelligence, Third Edition. Addison-Wesley, 2001. The fourth edition is under development. –George F. Luger. Artificial Intelligence: Structures and Strategies for Complex Problem Solving, Sixth Edition. Addison-Wesley, 2009

Conditional Independence

Conditional Independence Is Symmetric
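The symmetry can be checked numerically. Below is a minimal sketch using a hypothetical joint distribution over three boolean variables (all numbers invented for illustration), built so that X and Y are conditionally independent given Z; enumerating the joint confirms the independence in both directions.

```python
from itertools import product

# Hypothetical joint over booleans X, Y, Z, constructed so that
# X and Y are conditionally independent given Z.
pz = {True: 0.3, False: 0.7}
px_z = {True: 0.9, False: 0.2}   # P(X = true | Z = z)
py_z = {True: 0.6, False: 0.1}   # P(Y = true | Z = z)

def joint(x, y, z):
    px = px_z[z] if x else 1 - px_z[z]
    py = py_z[z] if y else 1 - py_z[z]
    return pz[z] * px * py

def cond(target, given):
    """P(target = True | given), by summing the joint distribution."""
    num = den = 0.0
    for x, y, z in product([True, False], repeat=3):
        world = {'x': x, 'y': y, 'z': z}
        if all(world[v] == val for v, val in given.items()):
            den += joint(x, y, z)
            if world[target]:
                num += joint(x, y, z)
    return num / den

# X is independent of Y given Z: P(x | y, z) == P(x | z) ...
assert abs(cond('x', {'y': True, 'z': True}) - cond('x', {'z': True})) < 1e-12
# ... and, symmetrically, P(y | x, z) == P(y | z).
assert abs(cond('y', {'x': True, 'z': True}) - cond('y', {'z': True})) < 1e-12
```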

Example domain (diagnostic assistant)

Examples of conditional independence The identity of the queen of Canada is independent of whether light l1 is lit given whether there is outside power. Whether there is someone in a room is independent of whether a light l2 is lit given the position of switch s3. Whether light l1 is lit is independent of the position of light switch s2 given whether there is power in wire w0. Every other variable may be independent of whether light l1 is lit given whether there is power in wire w0 and the status of light l1 (if it's ok, or if not, how it's broken). Conditional independence is defined using numbers, but it can often be established by qualitative arguments.

Idea of belief (Bayesian) networks

Bayesian networks The method of strata for constructing a Bayesian network is given above Usually, a Bayesian network is defined to also include probabilities, as in the following slide.

Components of a Bayesian network A belief network consists of: –a directed acyclic graph with nodes labeled with random variables –a domain for each random variable –a conditional probability table for each variable given its parents (including prior probabilities for nodes with no parents).
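These three components can be written down directly as plain data. The sketch below uses a hypothetical three-variable network (Power, SwitchOK, Lit; the names and numbers are illustrative, not from the text) and shows how the tables determine a full joint distribution.

```python
# A belief network's three components as plain data (a sketch with
# invented variables and numbers).
# 1. The DAG: each node mapped to its parents.
parents = {
    'Power': [],
    'SwitchOK': [],
    'Lit': ['Power', 'SwitchOK'],
}
# 2. A domain for each random variable.
domains = {v: [True, False] for v in parents}
# 3. A conditional probability table per variable: P(var = True | parents).
# For nodes with no parents, the single entry is the prior.
cpt = {
    'Power': {(): 0.95},
    'SwitchOK': {(): 0.99},
    'Lit': {(True, True): 0.98, (True, False): 0.01,
            (False, True): 0.0,  (False, False): 0.0},
}

def prob(var, value, assignment):
    """P(var = value | parent values taken from assignment)."""
    key = tuple(assignment[p] for p in parents[var])
    p_true = cpt[var][key]
    return p_true if value else 1 - p_true

def joint(assignment):
    """Joint probability of a complete assignment: the product of one
    table entry per variable (the factorization the network encodes)."""
    result = 1.0
    for v in parents:
        result *= prob(v, assignment[v], assignment)
    return result
```

For example, `joint({'Power': True, 'SwitchOK': True, 'Lit': True})` multiplies the two priors and one entry of Lit's table.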

Example BN (6.10) Suppose we want to use the diagnostic assistant to diagnose whether there is a fire in a building based on noisy sensor information and possibly conflicting explanations of what could be going on. The agent receives a report about whether everyone is leaving the building. Suppose the report sensor is noisy: it sometimes reports leaving when there is no exodus (a false positive), and sometimes does not report when everyone is leaving (a false negative). Suppose the fire alarm going off can cause the leaving, but this is not a deterministic relationship. Either tampering or fire could affect the alarm. Fire also causes smoke to rise from the building.

Example BN (6.11) Let’s select an ordering where the causes of a variable are before the variable in the ordering. For example, the variable for whether a light is lit comes after variables for whether the light is working and whether there is power coming into the light.
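The cause-before-effect ordering for the fire example can be sketched as follows. The edges come from the description above (tampering and fire affect the alarm, fire causes smoke, the alarm causes leaving, leaving causes the report), and an ordering in which every cause precedes its effects is computed topologically.

```python
# Structure of the fire-diagnosis example (edges read cause -> effect);
# this sketches the graph only, with no probabilities attached.
parents = {
    'Tampering': [],
    'Fire': [],
    'Alarm': ['Tampering', 'Fire'],   # tampering or fire can set it off
    'Smoke': ['Fire'],                # fire also causes smoke
    'Leaving': ['Alarm'],             # the alarm can cause everyone to leave
    'Report': ['Leaving'],            # noisy sensor of the leaving
}

def causal_order(parents):
    """Return an ordering where causes come before their effects, by
    repeatedly placing any variable whose parents are already placed."""
    order, placed = [], set()
    while len(order) < len(parents):
        for v in parents:
            if v not in placed and all(p in placed for p in parents[v]):
                order.append(v)
                placed.add(v)
    return order

order = causal_order(parents)
for v in parents:
    for p in parents[v]:
        assert order.index(p) < order.index(v)  # causes precede effects
```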

Independence Assumptions in BNs

Example for D-Separation

D-Separation: Converging Connections

D-Separation: Diverging Connections

D-Separation: Serial Connections
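The behavior of these connection types can be verified by brute-force enumeration. The sketch below (with illustrative numbers, not from the text) checks the serial and converging cases; a diverging connection A ← B → C behaves like the serial one, in that observing the middle variable blocks the path.

```python
from itertools import product

def cond(joint_fn, names, target, given):
    """P(target = True | given), by enumerating the joint over boolean vars."""
    num = den = 0.0
    for vals in product([True, False], repeat=len(names)):
        w = dict(zip(names, vals))
        if all(w[k] == v for k, v in given.items()):
            p = joint_fn(w)
            den += p
            if w[target]:
                num += p
    return num / den

# Serial connection A -> B -> C (illustrative numbers).
pa, pb_a, pc_b = 0.4, {True: 0.9, False: 0.2}, {True: 0.7, False: 0.1}
def serial(w):
    p = pa if w['A'] else 1 - pa
    p *= pb_a[w['A']] if w['B'] else 1 - pb_a[w['A']]
    p *= pc_b[w['B']] if w['C'] else 1 - pc_b[w['B']]
    return p

# Observing B blocks the path: A and C are independent given B ...
assert abs(cond(serial, 'ABC', 'C', {'A': True, 'B': True})
           - cond(serial, 'ABC', 'C', {'B': True})) < 1e-12
# ... but dependent when B is unobserved.
assert abs(cond(serial, 'ABC', 'C', {'A': True})
           - cond(serial, 'ABC', 'C', {})) > 1e-3

# Converging connection A -> C <- B (A and B independent fair coins).
pc_ab = {(True, True): 0.95, (True, False): 0.8,
         (False, True): 0.7, (False, False): 0.05}
def converging(w):
    pc = pc_ab[(w['A'], w['B'])]
    return 0.25 * (pc if w['C'] else 1 - pc)

# With C unobserved, A and B are independent ...
assert abs(cond(converging, 'ABC', 'A', {'B': True})
           - cond(converging, 'ABC', 'A', {})) < 1e-12
# ... but observing C makes them dependent ("explaining away").
assert abs(cond(converging, 'ABC', 'A', {'B': True, 'C': True})
           - cond(converging, 'ABC', 'A', {'C': True})) > 1e-3
```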

Using BNs: Diagnostic Assistant The power network can be used by the diagnostic assistant in a number of ways: Conditioning on the status of the switches and circuit breakers, whether there is outside power, and the position of the switches, you can simulate the lighting. Given values for the switches, the outside power, and whether the lights are lit, you can determine the posterior probability that each switch or circuit breaker is ok or not. Given some switch positions, some outputs, and some intermediate values, you can determine the probability of any other variable in the network.

Updating Probabilities by Conditioning
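As a one-step sketch (with illustrative numbers for a fire/smoke pair of variables, invented for this example), conditioning replaces the prior on fire with its posterior given the observed smoke, via Bayes' rule:

```python
# Updating a belief by conditioning on evidence (Bayes' rule),
# with illustrative numbers.
p_fire = 0.01                             # prior P(fire)
p_smoke_fire = {True: 0.9, False: 0.01}   # P(smoke | fire)

def posterior_fire_given_smoke():
    # P(fire | smoke) = P(smoke | fire) P(fire) / P(smoke)
    num = p_smoke_fire[True] * p_fire
    p_smoke = num + p_smoke_fire[False] * (1 - p_fire)
    return num / p_smoke

print(round(posterior_fire_given_smoke(), 3))  # → 0.476
```

Observing smoke raises the belief in fire from 1% to roughly 48%, even though the prior strongly favored no fire.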

Some features and properties of BNs A belief network is a directed acyclic graph (DAG) whose nodes are random variables; it is automatically acyclic by construction. The parents of a node n are those variables on which n directly depends. A belief network is a graphical representation of dependence and independence: –A variable is independent of its non-descendants given its parents.

Constructing BNs To represent a domain in a belief network, you need to consider: What are the relevant variables? –What will you observe? –What would you like to find out (query)? –What other features make the model simpler? What values should these variables take? What is the relationship between them? This should be expressed in terms of local influence. How does the value of each variable depend on its parents? This is expressed in terms of the conditional probabilities.

Example 5.30 and 6.14 The process of diagnosis is carried out by conditioning on the observed symptoms and deriving posterior probabilities of the faults or diseases.

Help System Example A Naïve Bayesian Classifier: P(H|F1,F2,…,Fn) = K × P(H) × P(F1,F2,…,Fn|H) = K × P(H) × P(F1|H) × P(F2|H) × … × P(Fn|H), where K = 1/P(F1,F2,…,Fn) is a normalizing constant and the features Fi are assumed to be conditionally independent given H.
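A runnable sketch of this classifier, with hypothetical priors and feature likelihoods (the page names and all numbers are invented for illustration): H ranges over help pages, and each F_i is a boolean feature (e.g., whether a word appears in the user's query).

```python
# Naive Bayesian classifier sketch: P(h | features) is proportional to
# P(h) * product of P(f_i | h), normalized over all h.
def naive_bayes_posterior(prior, likelihoods, observed):
    scores = {}
    for h in prior:
        s = prior[h]
        for i, f in enumerate(observed):
            p = likelihoods[h][i]       # P(F_i = true | h)
            s *= p if f else 1 - p
        scores[h] = s
    k = 1 / sum(scores.values())        # the normalizing constant K
    return {h: k * s for h, s in scores.items()}

prior = {'page_a': 0.6, 'page_b': 0.4}                  # hypothetical priors
likelihoods = {'page_a': [0.8, 0.1], 'page_b': [0.3, 0.7]}
post = naive_bayes_posterior(prior, likelihoods, [True, True])
assert abs(sum(post.values()) - 1.0) < 1e-12            # posteriors sum to 1
```

Note that K never needs to be computed from P(F1,…,Fn) directly: normalizing the unscaled scores over all hypotheses yields the same result.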