Artificial Intelligence Chapter 19


Artificial Intelligence Chapter 19: Reasoning with Uncertain Information

(C) 2000, 2001 SNU CSE Biointelligence Lab

Outline
Review of Probability Theory
Probabilistic Inference
Bayes Networks
Patterns of Inference in Bayes Networks
Uncertain Evidence
D-Separation
Probabilistic Inference in Polytrees

19.1 Review of Probability Theory
Random variables
Joint probability
Ex. (B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE))
(True, True, True, True): 0.5686
(True, True, True, False): 0.0299
(True, True, False, True): 0.0135
(True, True, False, False): 0.0007
…

19.1 Review of Probability Theory
Marginal probability
Conditional probability
Ex. The probability that the battery is charged given that the arm does not move.
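Marginalization and conditioning can be sketched directly over an explicit joint table. A minimal Python sketch, reduced to two variables, B ("battery OK") and M ("arm moves"); the table entries are illustrative numbers, not values from the text:

```python
# A joint distribution as an explicit table over (B, M).
# The numbers are illustrative only.
joint = {
    (True,  True):  0.72,
    (True,  False): 0.18,
    (False, True):  0.01,
    (False, False): 0.09,
}

def marginal(joint, index, value):
    """p(variable at `index` == value), summing out the other variables."""
    return sum(p for assign, p in joint.items() if assign[index] == value)

def conditional(joint, index, value, given_index, given_value):
    """p(var[index] = value | var[given_index] = given_value)."""
    num = sum(p for a, p in joint.items()
              if a[index] == value and a[given_index] == given_value)
    return num / marginal(joint, given_index, given_value)

print(marginal(joint, 0, True))               # p(B) = 0.9
print(conditional(joint, 0, True, 1, False))  # p(B | ~M) = 0.18 / 0.27
```

Conditioning is just marginalization restricted to the rows consistent with the evidence, renormalized by the evidence's marginal.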

19.1 Review of Probability Theory

19.1 Review of Probability Theory
Chain rule: p(V1, …, Vk) = p(Vk | Vk-1, …, V1) ⋯ p(V2 | V1) p(V1)
Bayes' rule: p(Vi | Vj) = p(Vj | Vi) p(Vi) / p(Vj)
Abbreviation: p(X) stands for p(X = True), and p(¬X) for p(X = False).

19.2 Probabilistic Inference
The probability that some variable Vi has value vi given the evidence ε = e.
p(P, Q, R) = 0.3
p(P, Q, ¬R) = 0.2
p(P, ¬Q, R)
p(P, ¬Q, ¬R) = 0.1
p(¬P, Q, R) = 0.05
p(¬P, Q, ¬R)
p(¬P, ¬Q, R)
p(¬P, ¬Q, ¬R) = 0.0
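Any such query can be answered by enumeration over the joint table. A sketch: three of the table's entries did not survive in the text, so the values marked "assumed" below are invented just so the table sums to 1; they are purely illustrative.

```python
# Inference by enumeration over a joint table for (P, Q, R).
joint = {
    (True,  True,  True):  0.30,
    (True,  True,  False): 0.20,
    (True,  False, True):  0.10,  # assumed (value missing in the text)
    (True,  False, False): 0.10,
    (False, True,  True):  0.05,
    (False, True,  False): 0.10,  # assumed
    (False, False, True):  0.15,  # assumed
    (False, False, False): 0.00,
}

def query(joint, var, evidence):
    """p(var = True | evidence), where evidence maps variable index -> value."""
    def matches(a):
        return all(a[i] == v for i, v in evidence.items())
    den = sum(p for a, p in joint.items() if matches(a))
    num = sum(p for a, p in joint.items() if matches(a) and a[var])
    return num / den

print(query(joint, 1, {0: True}))  # p(Q | P) = (0.3 + 0.2) / 0.7
```

Enumeration is exact but exponential in the number of variables, which is what motivates the network representations below.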

Statistical Independence
Conditional independence: p(Vi | Vj, Vk) = p(Vi | Vk)
Mutually conditional independence: each variable in a set is conditionally independent of the others, given Vk.
Unconditional independence: p(Vi | Vj) = p(Vi)

19.3 Bayes Networks
A directed, acyclic graph (DAG) whose nodes are labeled by random variables.
Characteristics of Bayesian networks:
Node Vi is conditionally independent of any subset of nodes that are not descendants of Vi, given its parents.
Prior probability
Conditional probability table (CPT)
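The chapter's example network has arcs B → M, B → G, and L → M, so the joint factors as p(B, L, M, G) = p(B) p(L) p(M | B, L) p(G | B). A sketch; the CPT numbers are assumptions, chosen because they reproduce the joint entries listed in §19.1 (0.5686, 0.0299, …):

```python
from itertools import product

# Arcs: B -> M, B -> G, L -> M.
# CPT numbers are assumed, consistent with the joint entries in 19.1.
p_B = 0.95                                      # p(B = True)
p_L = 0.7                                       # p(L = True)
p_M = {(True, True): 0.9, (True, False): 0.05,  # p(M = True | B, L)
       (False, True): 0.0, (False, False): 0.0}
p_G = {True: 0.95, False: 0.1}                  # p(G = True | B)

def joint(b, l, m, g):
    """Joint probability from the network's factorization."""
    pb = p_B if b else 1 - p_B
    pl = p_L if l else 1 - p_L
    pm = p_M[(b, l)] if m else 1 - p_M[(b, l)]
    pg = p_G[b] if g else 1 - p_G[b]
    return pb * pl * pm * pg

# The factorization defines a proper distribution:
total = sum(joint(*a) for a in product([True, False], repeat=4))
print(total)                          # 1.0
print(joint(True, True, True, True))  # 0.568575, matching the 0.5686 entry
```

The point of the representation: four priors/CPT rows here determine all sixteen joint entries, instead of storing the joint explicitly.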

19.3 Bayes Networks

19.4 Patterns of Inference in Bayes Networks
Causal or top-down inference: using a cause to infer its effects.
Ex. The probability that the arm moves given that the block is liftable:
p(M | L) = p(M | B, L) p(B) + p(M | ¬B, L) p(¬B)
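Causal inference just sums out the cause's unobserved co-parent: p(M | L) = Σ_b p(M | b, L) p(b | L) = Σ_b p(M | b, L) p(b), since B and L are independent in the network. A sketch with the CPT numbers assumed earlier (consistent with the §19.1 joint):

```python
# Top-down inference of p(M | L). CPT numbers are assumptions
# consistent with the joint entries in 19.1.
p_B = 0.95
p_M = {(True, True): 0.9, (True, False): 0.05,   # p(M = True | B, L)
       (False, True): 0.0, (False, False): 0.0}

p_M_given_L = sum(p_M[(b, True)] * (p_B if b else 1 - p_B)
                  for b in (True, False))
print(p_M_given_L)  # 0.9 * 0.95 + 0.0 * 0.05 = 0.855
```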

19.4 Patterns of Inference in Bayes Networks
Diagnostic or bottom-up inference: using an effect (or symptom) to infer a cause.
Ex. The probability that the block is not liftable given that the arm does not move:
p(¬L | ¬M) = p(¬M | ¬L) p(¬L) / p(¬M)   (Bayes' rule)
where p(¬M | ¬L) is obtained by causal reasoning.
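The diagnostic computation can be sketched end to end: compute p(¬M | ¬L) causally, compute the prior p(¬M), and apply Bayes' rule. CPT numbers as assumed earlier:

```python
# Diagnostic inference: p(~L | ~M) = p(~M | ~L) p(~L) / p(~M).
# CPT numbers are assumptions consistent with the joint entries in 19.1.
p_B, p_L = 0.95, 0.7
p_M = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}  # p(M = True | B, L)

# p(~M | ~L), obtained by causal (top-down) reasoning
p_notM_given_notL = 1 - sum(p_M[(b, False)] * (p_B if b else 1 - p_B)
                            for b in (True, False))
# prior p(~M), summing over both B and L
p_notM = 1 - sum(p_M[(b, l)] * (p_B if b else 1 - p_B) * (p_L if l else 1 - p_L)
                 for b in (True, False) for l in (True, False))

p_notL_given_notM = p_notM_given_notL * (1 - p_L) / p_notM
print(round(p_notL_given_notM, 4))  # 0.7379
```

So the failure to move raises the probability that the block is not liftable well above its prior of 0.3.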

19.4 Patterns of Inference in Bayes Networks
Explaining away: ¬B explains ¬M, making ¬L less certain.
p(¬L | ¬B, ¬M)
= p(¬B, ¬M | ¬L) p(¬L) / p(¬B, ¬M)   (Bayes' rule)
= p(¬M | ¬B, ¬L) p(¬B | ¬L) p(¬L) / p(¬B, ¬M)   (def. of conditional prob.)
= p(¬M | ¬B, ¬L) p(¬B) p(¬L) / p(¬B, ¬M)   (structure of the Bayes network)
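Explaining away is easiest to see numerically: compare p(¬L | ¬M) with p(¬L | ¬M, ¬B). A sketch by enumeration, with the same assumed CPT numbers:

```python
# Explaining away: observing ~B accounts for ~M, so ~L falls back to its prior.
# CPT numbers are assumptions consistent with the joint entries in 19.1.
p_B, p_L = 0.95, 0.7
p_M = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}  # p(M = True | B, L)

def joint(b, l, m):
    pm = p_M[(b, l)] if m else 1 - p_M[(b, l)]
    return (p_B if b else 1 - p_B) * (p_L if l else 1 - p_L) * pm

def p_notL_given(**ev):
    """p(~L | evidence) by enumeration; ev may fix b and/or m."""
    worlds = [(b, l, m) for b in (True, False)
              for l in (True, False) for m in (True, False)
              if all({'b': b, 'm': m}[k] == v for k, v in ev.items())]
    den = sum(joint(*w) for w in worlds)
    num = sum(joint(b, l, m) for b, l, m in worlds if not l)
    return num / den

print(p_notL_given(m=False))           # ~0.738: ~M alone makes ~L likely
print(p_notL_given(m=False, b=False))  # 0.3: ~B explains ~M, back to prior p(~L)
```

With ¬B observed, the arm cannot move regardless of L, so ¬M carries no further information about L.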

19.5 Uncertain Evidence
Evidence nodes must represent propositions about whose truth or falsity we can be certain.
Each uncertain evidence node should therefore have a child node about which we can be certain.
Ex. Suppose the robot is not certain that its arm did not move.
Introduce M′: "The arm sensor says that the arm moved." We can be certain that this proposition is either true or false.
Use p(¬L | ¬B, ¬M′) instead of p(¬L | ¬B, ¬M).
Ex. Suppose we are uncertain about whether or not the battery is charged.
Introduce G: "The battery gauge reads charged."
Use p(¬L | ¬G, ¬M′) instead of p(¬L | ¬B, ¬M′).

19.6 D-Separation
ε d-separates Vi and Vj if, for every undirected path in the Bayes network between Vi and Vj, there is some node Vb on the path having one of the following three properties:
1. Vb is in ε, and both arcs on the path lead out of Vb.
2. Vb is in ε, and one arc on the path leads in to Vb and one arc leads out.
3. Neither Vb nor any descendant of Vb is in ε, and both arcs on the path lead in to Vb.
Vb blocks the path given ε when any one of these conditions holds for the path.
Two nodes Vi and Vj are conditionally independent given a set of nodes ε if ε d-separates Vi and Vj.
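The three blocking rules translate directly into code. A sketch of a d-separation checker for small networks (it enumerates all simple undirected paths, which is fine at this scale but not for large graphs):

```python
def d_separated(edges, x, y, given):
    """True iff every undirected path between x and y is blocked given `given`.

    `edges` is a set of (parent, child) arcs; `given` is a set of node names.
    """
    children, adj = {}, {}
    for p, c in edges:
        children.setdefault(p, []).append(c)
        adj.setdefault(p, set()).add(c)
        adj.setdefault(c, set()).add(p)

    def descendants(node):
        out, stack = set(), [node]
        while stack:
            for c in children.get(stack.pop(), []):
                if c not in out:
                    out.add(c)
                    stack.append(c)
        return out

    def blocked(path):
        for i in range(1, len(path) - 1):
            prev, node, nxt = path[i - 1], path[i], path[i + 1]
            if (prev, node) in edges and (nxt, node) in edges:
                # Rule 3: both arcs lead in; blocks unless node or a
                # descendant of node is in the evidence set.
                if node not in given and not (descendants(node) & given):
                    return True
            elif node in given:
                return True  # Rules 1-2: chain or diverging node, observed
        return False

    def paths(cur, path, seen):
        if cur == y:
            yield path
        else:
            for nb in adj.get(cur, ()):
                if nb not in seen:
                    yield from paths(nb, path + [nb], seen | {nb})

    return all(blocked(p) for p in paths(x, [x], {x}))

edges = {("B", "G"), ("B", "M"), ("L", "M")}  # the chapter's example network
print(d_separated(edges, "G", "L", set()))    # True: M is an unobserved collider
print(d_separated(edges, "G", "L", {"B"}))    # True: I(G, L | B), by rule 1
print(d_separated(edges, "G", "L", {"M"}))    # False: observing M opens the path
```

The last query illustrates the flip side of rule 3: conditioning on a common effect creates dependence between its causes.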

19.6 D-Separation

19.6 D-Separation
Ex. I(G, L | B), by rules 1 and 3:
By rule 1, B blocks the (only) path between G and L, given B.
By rule 3, M also blocks this path given B.
I(G, L): by rule 3, M blocks the path between G and L.
I(B, L): by rule 3, M blocks the path between B and L.
Even using d-separation, probabilistic inference in Bayes networks is, in general, NP-hard.

19.7 Probabilistic Inference in Polytrees
A polytree is a DAG in which there is just one path, along arcs in either direction, between any two nodes.

19.7 Probabilistic Inference in Polytrees
A node is above Q if it is connected to Q only through Q's parents.
A node is below Q if it is connected to Q only through Q's immediate successors.
Three types of evidence:
All evidence nodes are above Q.
All evidence nodes are below Q.
There are evidence nodes both above and below Q.

Evidence Above (1)
Bottom-up recursive algorithm
Ex. p(Q | P5, P4), expanded using the structure of the Bayes network and d-separation.
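The core step of the bottom-up pass can be sketched generically: condition Q on its parents and weight each parent configuration by the parents' probabilities given the evidence above, which the algorithm has already computed recursively. The node names and all numbers below are hypothetical:

```python
from itertools import product

def evidence_above(cpt_q, parents_given_evidence):
    """p(Q = True | e+) for a polytree node Q.

    cpt_q[config] = p(Q = True | parent config);
    parents_given_evidence[i] = p(parent_i = True | e+), computed
    recursively further up the tree.
    """
    total = 0.0
    for config in product([True, False], repeat=len(parents_given_evidence)):
        weight = 1.0
        for value, p_true in zip(config, parents_given_evidence):
            weight *= p_true if value else 1 - p_true
        total += weight * cpt_q[config]
    return total

# Hypothetical numbers: Q has two parents (the chapter calls them P6 and P7)
# whose posteriors given the evidence above were computed recursively.
cpt_q = {(True, True): 0.9, (True, False): 0.4,
         (False, True): 0.5, (False, False): 0.1}
print(evidence_above(cpt_q, [0.8, 0.6]))  # 0.628
```

In a polytree the parents are d-separated from each other given the evidence above, which is what justifies multiplying their probabilities independently.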

Evidence Above (2)
Calculating p(P7 | P4) and p(P6 | P5).
Calculating p(P5 | P1): here the evidence is "below," so we use Bayes' rule.

Evidence Below (1)
Top-down recursive algorithm

Evidence Below (2)

Evidence Above and Below
Split the evidence into e+ (evidence above Q) and e− (evidence below Q); then p(Q | e+, e−) ∝ p(e− | Q) p(Q | e+).

A Numerical Example (1)

A Numerical Example (2)
Other techniques:
Bucket elimination
Monte Carlo methods
Clustering
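Of these, the Monte Carlo approach is the easiest to sketch: sample each variable in topological order from its CPT (forward/logic sampling) and estimate probabilities by counting. A sketch on the chapter's B, L, M variables, with the CPT numbers assumed earlier:

```python
import random

# Forward (logic) sampling. CPT numbers are assumptions consistent
# with the joint entries in 19.1.
random.seed(0)
p_B, p_L = 0.95, 0.7
p_M = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}  # p(M = True | B, L)

def sample():
    b = random.random() < p_B
    l = random.random() < p_L
    m = random.random() < p_M[(b, l)]
    return b, l, m

n = 100_000
estimate = sum(m for _, _, m in (sample() for _ in range(n))) / n
exact = sum(p_M[(b, l)] * (p_B if b else 1 - p_B) * (p_L if l else 1 - p_L)
            for b in (True, False) for l in (True, False))
print(estimate, exact)  # estimate close to the exact p(M) = 0.61275
```

Conditioning on evidence needs more care (rejection or likelihood weighting), since naively discarding samples that contradict rare evidence wastes almost all of them.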

Additional Readings
[Feller 1968] Probability theory
[Goldszmidt, Morris & Pearl 1990] Non-monotonic inference through probabilistic methods
[Pearl 1982a; Kim & Pearl 1983] Message-passing algorithm
[Russell & Norvig 1995, pp. 447ff] Polytree methods

Additional Readings
[Shachter & Kenley 1989] Bayesian networks for continuous random variables
[Wellman 1990] Qualitative networks
[Neapolitan 1990] Probabilistic methods in expert systems
[Henrion 1990] Probabilistic inference in Bayesian networks

Additional Readings
[Jensen 1996] Bayesian networks: the HUGIN system
[Neal 1991] Relationships between Bayesian networks and neural networks
[Heckerman 1991; Heckerman & Nathwani 1992] PATHFINDER
[Pradhan et al. 1994] CPCSBN

Additional Readings
[Shortliffe 1976; Buchanan & Shortliffe 1984] MYCIN: uses certainty factors
[Duda, Hart & Nilsson 1987] PROSPECTOR: uses sufficiency and necessity indices
[Zadeh 1975; Zadeh 1978; Elkan 1993] Fuzzy logic and possibility theory
[Dempster 1968; Shafer 1979] Dempster-Shafer combination rule

Additional Readings
[Nilsson 1986] Probabilistic logic
[Tversky & Kahneman 1982] Humans generally lose consistency when facing uncertainty
[Shafer & Pearl 1990] Collected papers on uncertain inference
Proceedings & journals:
Uncertainty in Artificial Intelligence (UAI)
International Journal of Approximate Reasoning