1 Department of Computer Science and Engineering, University of South Carolina
Issues for Discussion and Work, Jan 2007
• Choose meeting time for Sp07: Tuesday (or earlier, if compatible with Dr. Huhns).
• "MEBN logic includes FOL as a subset" [Laskey 2006, section 5 (p. 36)]: explain and prove this claim.
• Continue work on technical report.
• Upgrade Magellan with ACH v2.0.3:
  – [Marco] Contact PARC for source code of the ACH version.
  – [Jingshan] Prepare Magellan for the update.
  – [JH] Show JW and SL how Magellan works and how it is organized.

2 Department of Computer Science and Engineering, University of South Carolina
Nodes
• Nodes in a Bayesian network are in one-to-one correspondence with (random) variables.
• Variables map states (also known as values) to subsets of the event space.
• The probability of a variable having a certain value is the probability of all the events consistent with that variable having that value.
• Variables represent propositions about which the system reasons; they are therefore sometimes called propositional variables, even though they may take values other than true and false.
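A minimal sketch (not from the slides) of this mapping, using a two-coin event space: each value of a variable NumHeads picks out the subset of events consistent with it, and its probability is the sum of the probabilities of those events. The variable name and numbers are illustrative only.

from itertools import product

# Event space: all outcomes of tossing two fair coins, each with probability 1/4.
events = {outcome: 0.25 for outcome in product(["H", "T"], repeat=2)}

def num_heads_value(outcome):
    # The variable NumHeads assigns a value to each event (outcome).
    return sum(1 for coin in outcome if coin == "H")

def prob_of_value(value):
    # P(NumHeads = value) is the total probability of all consistent events.
    return sum(p for outcome, p in events.items() if num_heads_value(outcome) == value)

for v in (0, 1, 2):
    print(f"P(NumHeads = {v}) = {prob_of_value(v)}")   # 0.25, 0.5, 0.25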

3 Department of Computer Science and Engineering, University of South Carolina
Attributes
• Each variable has a set of identifying attributes.
• Attributes "play the role of variables in a logic programming language" [Laskey and Mahoney, UAI-97].
• Attributes identify a particular instance of a random variable.
• Attributes are used to combine fragments: fragments can be combined only if their attributes unify.
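A hypothetical illustration of the last bullet, assuming a very simple representation of attributes as name/value bindings (this is not Magellan or PR-OWL code): two fragment instances can be combined only if they agree on every attribute they share.

def unify_attributes(attrs_a, attrs_b):
    """Return the merged attribute bindings, or None if they clash."""
    merged = dict(attrs_a)
    for key, value in attrs_b.items():
        if key in merged and merged[key] != value:
            return None          # conflicting bindings: the fragments cannot be combined
        merged[key] = value
    return merged

frag1 = {"PatientID": "p42", "Time": "t1"}
frag2 = {"PatientID": "p42", "Location": "ward3"}
frag3 = {"PatientID": "p7"}

print(unify_attributes(frag1, frag2))  # {'PatientID': 'p42', 'Time': 't1', 'Location': 'ward3'}
print(unify_attributes(frag1, frag3))  # None: different patients, no combination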

4 Department of Computer Science and Engineering, University of South Carolina
Fragments As Templates
• Fragments are template models: "A template model is appropriate for problem domains in which the relevant variables, their state spaces, and their probabilistic relationships do not vary from problem instance to problem instance" [L&M, UAI-97].
• A scenario is a combination of instantiated template models.
• The attributes are used to identify and combine fragment instances, but the probabilistic relationships do not change from instance to instance: the probability distribution described in the Bayesian network is a joint distribution on the nodes only, not on the nodes and the attributes.
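An assumed Python sketch of the template idea (the class, names, and numbers are made up, not Laskey and Mahoney's formulation): the template fixes the variables and local distributions, while an instance adds only an attribute binding, so the probabilistic relationships are identical across instances.

from dataclasses import dataclass

@dataclass
class FragmentTemplate:
    name: str
    variables: tuple    # node names shared by every instance
    cpts: dict          # local distributions, identical for all instances

    def instantiate(self, **attributes):
        # An instance pairs the unchanged template with a specific attribute binding.
        return {"template": self, "attributes": attributes}

fever_frag = FragmentTemplate(
    name="FeverFragment",
    variables=("Infection", "Fever"),
    cpts={"Fever": {("Infection", True): 0.8, ("Infection", False): 0.1}},
)

# Two instances for two patients: same structure and CPTs, different attributes.
inst_a = fever_frag.instantiate(PatientID="p42", Time="t1")
inst_b = fever_frag.instantiate(PatientID="p7", Time="t1")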

5 Department of Computer Science and Engineering, University of South Carolina
Medical Illustration
• [A] medical diagnosis template network would contain variables representing background information about a patient, possible medical conditions the patient might be experiencing, and clinical findings that might be observed.
• The network encodes probabilistic relationships among these variables. To perform diagnosis on a particular patient, background information and findings for the patient are entered as evidence and the posterior probabilities of the possible medical conditions are reported.
• Although values of the evidence variables vary from patient to patient, the relevant variables and their probabilistic relationships are assumed to be the same for all patients. It is this assumption that justifies the use of template models.
(Direct quote from [Laskey and Mahoney, UAI-97].)
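A toy, hand-rolled version of the quoted workflow, with an assumed Background -> Condition -> Finding structure and made-up probabilities (not the network from the paper): evidence is entered for one patient and the posterior over the condition is reported by enumerating the joint distribution.

p_background = {True: 0.3, False: 0.7}                 # P(Background)
p_condition = {True: {True: 0.4, False: 0.6},          # P(Condition | Background)
               False: {True: 0.1, False: 0.9}}
p_finding = {True: {True: 0.9, False: 0.1},            # P(Finding | Condition)
             False: {True: 0.2, False: 0.8}}

def joint(b, c, f):
    # Chain rule for this three-node network.
    return p_background[b] * p_condition[b][c] * p_finding[c][f]

def posterior_condition(background, finding):
    # Enter the evidence, then normalize over the unobserved Condition variable.
    scores = {c: joint(background, c, finding) for c in (True, False)}
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

# Evidence for one patient: background risk factor present, finding observed.
print(posterior_condition(background=True, finding=True))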

6 Department of Computer Science and Engineering, University of South Carolina
Guidance for Selection of Nodes and Attributes
• Nodes represent the variables on which the assessment of a situation depends. For example:
  – State and hypothesis variables
  – Observation and test variables
  – Intermediate and theoretical variables
  – Setting factors
• Attributes identify a particular situation, e.g.:
  – Location
  – Time
  – Name
  – Case ID
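As a concrete (made-up) illustration of this split, an instance record might carry the identifying attributes separately from the assessment variables; none of the names or values below come from the slides.

situation_instance = {
    "attributes": {                  # identify the particular situation
        "Location": "Columbia, SC",
        "Time": "2007-01-23T10:00",
        "CaseID": "case-0042",
    },
    "nodes": {                       # variables the assessment depends on
        "HypothesisState": None,     # state / hypothesis variable (to be inferred)
        "SensorReading": 0.73,       # observation / test variable
        "IntermediateScore": None,   # intermediate / theoretical variable
        "WeatherSetting": "clear",   # setting factor
    },
}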

7 Department of Computer Science and Engineering, University of South Carolina
Use of MEBNs in Magellan and Evolution of MEBNs
• In Magellan:
  – No provision is made for the combination of multiple instances of the same fragment.
  – This simplifies the specification of local probability distributions.
• In later versions of MEBNs:
  – A language is provided for the description of local probability distributions.
  – Multiple instances of the same fragment can be used.
  – Local probability distributions depend on the values of attributes.
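A hypothetical illustration of the last point, that in later MEBN versions a local distribution may depend on attribute values; the node name, attribute names, and probabilities are invented for the example and are not taken from any MEBN implementation.

def local_distribution(node, attributes):
    """Return P(node = true) as a function of the instance's attributes."""
    if node == "ThreatPresent":
        # The prior differs by the Location attribute of the fragment instance.
        return {"airport": 0.05, "harbor": 0.02}.get(attributes.get("Location"), 0.01)
    raise KeyError(node)

print(local_distribution("ThreatPresent", {"Location": "airport", "Time": "t3"}))  # 0.05
print(local_distribution("ThreatPresent", {"Location": "rural"}))                  # 0.01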

8 Department of Computer Science and Engineering, University of South Carolina
MEBNs As a System Integrating First-Order Logic and Probability
• Paulo C. G. da Costa and Kathryn B. Laskey. "Multi-Entity Bayesian Networks without Multi-Tears." [Costa, 2005]
• Kathryn B. Laskey. "First-Order Bayesian Logic." [Laskey, 2005]
• Kathryn B. Laskey. "MEBN: A Logic for Open-World Probabilistic Reasoning." [Laskey, 2006]

9 Department of Computer Science and Engineering, University of South Carolina Sample BN Fragments [Laskey, 2005]

10 Department of Computer Science and Engineering, University of South Carolina
Using MEBNs
• Bayesian Network Fragment (BNF): the basic unit. Each network fragment consists of a set of related variables together with knowledge about the probabilistic relationships among the variables.
• Multi-Entity Bayesian Network (MEBN): a collection of BNFs specifying a probability distribution over attributes of, and relationships among, a collection of interrelated entities.
• Situation-Specific Network (SSN): an ordinary, finite Bayesian network constructed from an MEBN knowledge base to reason about a specific target hypothesis given particular evidence.
[Laskey, 2005]
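The three levels on this slide can be mirrored with assumed Python data structures (not PR-OWL or any actual MEBN implementation); the SSN construction here is deliberately naive, simply gathering the fragments that mention the target or evidence variables into one finite network, without the instance retrieval and unification a real system would perform.

from dataclasses import dataclass

@dataclass
class BNFragment:
    name: str
    variables: tuple          # related variables in this fragment
    parents: dict             # child -> tuple of parents (probabilistic relationships)

@dataclass
class MEBN:
    fragments: list           # collection of BNFs

    def build_ssn(self, target, evidence):
        """Collect the fragments mentioning the target or evidence variables
        into one ordinary, finite Bayesian network (the SSN)."""
        relevant = [f for f in self.fragments
                    if target in f.variables or any(e in f.variables for e in evidence)]
        nodes = sorted({v for f in relevant for v in f.variables})
        edges = [(p, child) for f in relevant
                 for child, parents in f.parents.items() for p in parents]
        return {"nodes": nodes, "edges": edges, "evidence": evidence, "target": target}

kb = MEBN(fragments=[
    BNFragment("Background", ("Background", "Condition"), {"Condition": ("Background",)}),
    BNFragment("Finding", ("Condition", "Finding"), {"Finding": ("Condition",)}),
])
print(kb.build_ssn(target="Condition", evidence={"Finding": True}))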

11 Department of Computer Science and Engineering, University of South Carolina
Formal Specifications
• First-Order Bayesian Logic: a logical foundation that fully integrates classical first-order logic with probability theory.
• Because first-order Bayesian logic contains classical first-order logic as a deterministic subset, it is a natural candidate as a universal representation for integrating domain ontologies expressed in languages based on classical first-order logic or subsets thereof.
[Laskey, 2005]

12 Department of Computer Science and Engineering, University of South Carolina Logic in BN Fragments [Laskey, 2005]

13 Department of Computer Science and Engineering, University of South Carolina A Simple Bayesian Network [Laskey, 2005]

14 Department of Computer Science and Engineering, University of South Carolina A Conditional Probability Table [Laskey, 2005]

15 Department of Computer Science and Engineering, University of South Carolina Multiple Instances [Laskey, 2005]

16 Department of Computer Science and Engineering, University of South Carolina Temporal Repetition [Laskey, 2005]

17 Department of Computer Science and Engineering, University of South Carolina A Fragment (MFrag) [Laskey, 2005]

18 Department of Computer Science and Engineering, University of South Carolina An Instance of an MFrag [Laskey, 2005]

19 Department of Computer Science and Engineering, University of South Carolina A Temporal MFrag [Laskey, 2005]

20 Department of Computer Science and Engineering, University of South Carolina Temporal Situation-Specific BN [Laskey, 2005]

21 Department of Computer Science and Engineering, University of South Carolina
Other Issues in [Laskey, 2005]
• Generative Theories
• Composition Algorithm
• Related Research:
  – HMMs
  – DBNs
  – Plates
  – Object-Oriented BNs
  – Probabilistic Relational Models
• Learning
• Decision Making:
  – Multiple-entity decision graphs (MEDGs) are to influence diagrams what MEBNs are to Bayesian networks.
• OWL-P:
  – A planned MEBN-based extension to OWL.