Probabilistic Horn abduction and Bayesian Networks


Probabilistic Horn Abduction and Bayesian Networks
David Poole
Presented by Hrishikesh Goradia, Computer Science and Engineering, University of South Carolina, 11/24/2018

Introduction
Logic-based systems for diagnostic problems:
- Too many logical possibilities to handle
- Many of the diagnoses are not worth considering
Bayesian networks:
- Probabilistic analysis
Probabilistic Horn abduction:
- A framework for logic-based abduction that associates probabilities with assumptions
- Extends pure Prolog in a simple way to include probabilities
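The core idea can be sketched concretely. In the framework, hypotheses are grouped into "disjoint declarations" of mutually exclusive alternatives whose probabilities sum to 1; an explanation is a set of hypotheses, and since hypotheses from different declarations are independent, its prior is the product of its members' priors. A minimal sketch in Python (the battery/starter diagnosis domain and all names are hypothetical, not from the paper):

```python
from math import prod

# Each disjoint declaration lists mutually exclusive hypotheses
# whose probabilities sum to 1 (hypothetical diagnosis domain).
disjoint = {
    "battery": {"battery_ok": 0.95, "battery_dead": 0.05},
    "starter": {"starter_ok": 0.98, "starter_broken": 0.02},
}

def explanation_prior(explanation):
    """Hypotheses drawn from different declarations are independent,
    so an explanation's prior is the product of its members' priors."""
    prior = {h: p for decl in disjoint.values() for h, p in decl.items()}
    return prod(prior[h] for h in explanation)

# Prior of the explanation {battery_dead, starter_ok}: 0.05 * 0.98
p = explanation_prior({"battery_dead", "starter_ok"})
```

Posterior reasoning then amounts to summing these priors over the explanations consistent with the observations.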

Motivating Example
[Example shown as figures in the original slides; not recoverable from the transcript.]

Probabilistic Horn Abduction Theory
[Definitions and formulas shown as figures in the original slides; not recoverable from the transcript.]

Assumptions and Constraints
- A hypothesis cannot appear in more than one disjoint declaration.
- All atoms in a disjoint declaration share the same variables.
- Hypotheses cannot appear as the head of a rule.
- There are no cycles in the knowledge base.
- The rules in the knowledge base are both covering and disjoint.
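Two of these constraints are easy to check mechanically: hypotheses must not head any rule, and the rule graph must be acyclic. A small sketch, assuming rules are stored as a map from head atoms to lists of bodies (all atom names hypothetical):

```python
# rules: head atom -> list of bodies (each body a list of atoms)
rules = {"alarm": [["tampering"], ["fire"]],
         "smoke": [["fire"]]}
hypotheses = {"tampering", "fire"}

def check_constraints(rules, hypotheses):
    """Raise if a hypothesis heads a rule or the rule graph has a cycle."""
    assert not hypotheses & rules.keys(), "a hypothesis heads a rule"
    seen = set()          # atoms fully explored
    in_progress = set()   # atoms on the current depth-first path
    def visit(atom):
        if atom in in_progress:
            raise ValueError(f"cycle through {atom}")
        if atom in seen or atom not in rules:
            return
        in_progress.add(atom)
        for body in rules[atom]:
            for b in body:
                visit(b)
        in_progress.discard(atom)
        seen.add(atom)
    for head in rules:
        visit(head)

check_constraints(rules, hypotheses)  # no error for this knowledge base
```

The covering and disjointness of rules is a semantic condition on the intended interpretation, so it cannot be verified this mechanically.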

Bayesian Networks to Probabilistic Horn Abduction Theory
A discrete Bayesian network is represented by probabilistic Horn abduction rules relating each random variable a to its parents {a1, …, an}. The conditional probabilities for the random variable are translated into probabilistic assertions: one disjoint declaration per assignment of values to the parents, with the probabilities taken from the corresponding row of the conditional probability table. (The rules and assertions were shown as formulas in the original slides.)
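As a concrete sketch of the translation (node names and syntax hypothetical): a Boolean node alarm with parent fire gets a schematic rule of the form "alarm(V) if fire(W) and c_alarm(W, V)", and each row of its CPT becomes a disjoint declaration over the introduced c_alarm hypotheses. In Python data structures:

```python
# Hypothetical CPT for Boolean node 'alarm' with parent 'fire':
# P(alarm = true | fire = pval) for each parent value.
cpt = {("fire", True): 0.9, ("fire", False): 0.01}

def cpt_to_disjoint(child, cpt):
    """One disjoint declaration per parent assignment: the hypotheses
    'child takes value v in this parent context' are mutually exclusive
    and their probabilities (one CPT row) sum to 1."""
    decls = []
    for (parent, pval), p_true in cpt.items():
        decls.append({f"c_{child}({pval},true)": p_true,
                      f"c_{child}({pval},false)": 1.0 - p_true})
    return decls

declarations = cpt_to_disjoint("alarm", cpt)  # two declarations, one per row
```

Each declaration sums to 1, matching the requirement on disjoint declarations.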


Probabilistic Horn Abduction Theory to Bayesian Networks
- Each disjoint declaration maps to a random variable.
- Each atom defined by rules also corresponds to a random variable.
- For each rule, arcs go from the body random variables to the head random variable.
- Probabilities in the disjoint declarations map directly to the conditional probabilities for the random variables.
- Additional optimizations are possible.
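The structural part of this reverse mapping can be sketched in a few lines (atom names hypothetical): each rule head gets an arc from every atom appearing in any of its bodies.

```python
# rules: head atom -> list of bodies (each body a list of atoms)
rules = {"alarm": [["tampering"], ["fire"]],
         "smoke": [["fire"]]}

def rules_to_bn_arcs(rules):
    """Return the parent set of each rule-defined random variable."""
    return {head: sorted({atom for body in bodies for atom in body})
            for head, bodies in rules.items()}

arcs = rules_to_bn_arcs(rules)
# arcs == {'alarm': ['fire', 'tampering'], 'smoke': ['fire']}
```

The acyclicity constraint on the knowledge base guarantees that the resulting arc set defines a directed acyclic graph, as a Bayesian network requires.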

Discussion – Independence and Dependence
Can the world be represented such that all of the hypotheses are independent?

Discussion – Independence and Dependence
Can the world be represented such that all of the hypotheses are independent? The author claims that it is possible, citing Reichenbach's principle of the common cause: "If coincidences of two events A and B occur more frequently than their independent occurrence, … then there exists a common cause for these events …"

Discussion – Abduction and Prediction
Is abducing to causes, and then predicting from those assumptions, the right logical analogue of the independence assumptions in Bayesian networks?

Discussion – Abduction and Prediction
Is abducing to causes, and then predicting from those assumptions, the right logical analogue of the independence assumptions in Bayesian networks? The author claims that it is: the approach is analogous to Pearl's network-propagation scheme for computing conditional probabilities.

Discussion – Causation
A common problem with logical formulations of causation: "If c1 is a cause for a and c2 is a cause for ¬a, then from c1 we can infer ¬c2." Does probabilistic Horn abduction overcome this?

Discussion – Causation
A common problem with logical formulations of causation: "If c1 is a cause for a and c2 is a cause for ¬a, then from c1 we can infer ¬c2." Does probabilistic Horn abduction overcome this? The author claims that it does: in the Bayesian network represented by the theory, c1 and c2 appear as disjoint random variables.

Summary
- Presents a simple framework for Horn-clause abduction, with probabilities associated with hypotheses.
- Establishes a relationship between logical and probabilistic notions of evidential reasoning.
- Provides a useful representation language that offers a compromise between heuristic and epistemic adequacy.