Object-Oriented Bayesian Networks: An Overview. Presented by: Asma Sanam Larik. Course: Probabilistic Reasoning.


Limitations of BNs: The standard BN representation makes it hard to construct, update, reuse, learn, and reason with complex models.

Scaling up: Our goal is to scale BNs to more complex domains: large-scale diagnosis; monitoring complex processes (highway traffic, military situation assessment); controlling intelligent agents in complex environments (smart robots, intelligent buildings).

Problem: Knowledge Engineering. The main reuse mechanism is cut & paste. How is the model updated? How do we construct large BNs?

Problem: BN Inference. BN inference can be exponential, and inference complexity depends on subtle properties of the BN structure. Will a large BN support efficient inference?
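To make the cost concrete, here is a minimal sketch (not from the slides; the chain structure, variable names, and numbers are illustrative assumptions) comparing brute-force enumeration, which touches all 2^n joint assignments, with inference that exploits a simple chain structure and sums out one variable at a time:

```python
# Illustrative chain BN X1 -> X2 -> ... -> Xn with binary variables.
from itertools import product

n = 12
prior = {0: 0.8, 1: 0.2}                       # P(X1)
cpt = {0: {0: 0.9, 1: 0.1},                    # P(X_{i+1} = v | X_i = u) = cpt[u][v]
       1: {0: 0.3, 1: 0.7}}

def brute_force():
    """Sum over all 2^n joint assignments (exponential in n)."""
    total = 0.0
    for x in product((0, 1), repeat=n):
        p = prior[x[0]]
        for i in range(1, n):
            p *= cpt[x[i - 1]][x[i]]
        if x[-1] == 1:
            total += p
    return total

def sum_out_chain():
    """Exploit the chain structure: sum out one variable at a time (linear in n)."""
    belief = dict(prior)                       # current marginal P(X_i)
    for _ in range(1, n):
        belief = {v: sum(belief[u] * cpt[u][v] for u in (0, 1)) for v in (0, 1)}
    return belief[1]

print(brute_force(), sum_out_chain())          # identical marginals P(X_n = 1)
```

Both routines return the same marginal, but only the second scales with n; networks whose structure does not admit such a cheap elimination order are far more expensive, which is why inference cost depends on subtle structural properties.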

Approach 1: Network fragments (proposed by Laskey). A network fragment is a set of related variables together with knowledge about the probabilistic relationships among those variables. Two types of fragments are identified: input fragments and result fragments. Input fragments are composed to form a result fragment; to join input fragments, an influence combination rule is needed to compute the local probabilities.
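A hedged sketch of the composition step, assuming a noisy-OR influence combination rule (one common choice, not necessarily the only rule Laskey describes); the fragment and variable names are hypothetical:

```python
def noisy_or(cause_strengths, active_causes, leak=0.0):
    """P(effect = true | the listed causes are active), noisy-OR combination."""
    p_no_effect = 1.0 - leak
    for cause in active_causes:
        p_no_effect *= 1.0 - cause_strengths[cause]
    return 1.0 - p_no_effect

# Two hypothetical input fragments, each relating one cause to the same effect.
fragment_a = {"sprinkler": 0.8}    # P(grass_wet | sprinkler alone)
fragment_b = {"rain": 0.9}         # P(grass_wet | rain alone)
strengths = {**fragment_a, **fragment_b}

# The result fragment's local probability for any subset of active causes.
print(noisy_or(strengths, ["sprinkler"]))           # 0.8
print(noisy_or(strengths, ["sprinkler", "rain"]))   # 1 - 0.2 * 0.1 = 0.98
```

The point is that neither input fragment alone fixes the effect's conditional table; the combination rule produces it when the fragments are joined into the result fragment.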

Exploit structure! The Architecture of Complexity [Herbert Simon, 1962]: many complex systems have a nearly decomposable, hierarchic structure; hierarchic systems are usually composed of only a few different kinds of subsystems; by appropriate recoding, the redundancy that is present but unobvious in the structure of a complex system can often be made patent.

Our goal? A more expressive representation language that has rigorous probabilistic semantics, is model-based, supports hierarchical structure & redundancy, and exploits structure for effective inference.

Object-Oriented Bayesian Networks
Classes represent types of objects
– Attributes of a class are represented as OOBN nodes
– Input nodes refer to instances of another class
– Output nodes can be referred to by other classes
– Encapsulated nodes are private: they are conditionally independent of other objects given the input and output nodes
Classes may have subclasses
– A subclass inherits the attributes of its superclass
– A subclass may have additional attributes not in the superclass
Classes may be instantiated
– Instances represent particular members of the class
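A minimal Python sketch of these ideas (the class and node names are assumptions for illustration, not part of any OOBN tool):

```python
from dataclasses import dataclass, field

@dataclass
class OOBNClass:
    name: str
    inputs: set = field(default_factory=set)        # referenced from outside
    outputs: set = field(default_factory=set)       # visible to other classes
    encapsulated: set = field(default_factory=set)  # private nodes

    def subclass(self, name, extra_inputs=(), extra_outputs=(), extra_encapsulated=()):
        """A subclass inherits all attributes and may add new ones."""
        return OOBNClass(name,
                         self.inputs | set(extra_inputs),
                         self.outputs | set(extra_outputs),
                         self.encapsulated | set(extra_encapsulated))

    def instantiate(self, instance_name):
        """An instance represents one particular member of the class."""
        return {f"{instance_name}.{node}" for node in
                self.inputs | self.outputs | self.encapsulated}

# Hypothetical example: a generic Cow class and a milk-cow subclass.
cow = OOBNClass("Cow", inputs={"food"}, outputs={"weight"}, encapsulated={"metabolism"})
milk_cow = cow.subclass("MilkCow", extra_outputs={"milk"})
print(milk_cow.instantiate("cow1"))
```

Note that every node of an instance gets a name qualified by the instance, which is what later makes it possible to unroll an OOBN into an ordinary flat BN.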

Example. Reference: F. V. Jensen and T. D. Nielsen, Bayesian Networks and Decision Graphs, 2nd ed., Springer, 2007.

OOBN. An OOBN models a domain with hierarchical structure & redundancy. An OOBN consists of a set of objects: simple objects are random variables; complex objects have attributes, which are enclosed objects.

Inter-Object Interaction. Related objects can influence each other via imports and exports. If X imports A from Y, then the value of X can depend on the value of A. If X exports A, then objects related to X can import A from X.

Imports and Exports / Input and Output Variables. The value of an object depends probabilistically on the values of its imports. A simple object is associated with a conditional probability distribution (table) over its values given the values of its imports. The value of a complex object X is composed of the values of its attributes; its probabilistic model is defined recursively from the models of its attributes.
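A hedged sketch of this recursive definition, with discrete variables and hypothetical names (health, milk, food); the complex object multiplies its attributes' local distributions, with each attribute's imports resolved either from the enclosing object's imports or from sibling attributes:

```python
from math import prod

class SimpleObject:
    def __init__(self, name, imports, cpt):
        self.name, self.imports, self.cpt = name, imports, cpt

    def p(self, value, env):
        """P(self = value | values of its imports, looked up in env)."""
        return self.cpt[tuple(env[i] for i in self.imports)][value]

class ComplexObject:
    def __init__(self, name, attributes):
        self.name, self.attributes = name, attributes

    def p(self, values, imports):
        """P(attribute values | imports): product of the attributes' local models."""
        env = {**imports, **values}
        return prod(a.p(values[a.name], env) for a in self.attributes)

# Hypothetical cow: milk yield depends on food (an import) and health (a sibling).
health = SimpleObject("health", [], {(): {"good": 0.7, "bad": 0.3}})
milk = SimpleObject("milk", ["food", "health"],
                    {("grass", "good"): {"high": 0.8, "low": 0.2},
                     ("grass", "bad"):  {"high": 0.3, "low": 0.7}})
cow = ComplexObject("cow", [health, milk])
print(cow.p({"health": "good", "milk": "high"}, {"food": "grass"}))  # 0.7 * 0.8
```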

Semantics. Theorem: the probabilistic model for an object X defines a conditional probability distribution P(value of X | imports into X from the enclosing object).
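One way to spell this out (a sketch in the spirit of Koller & Pfeffer's semantics, not a quotation): if a complex object X has attributes A_1, …, A_k, each with its own imports drawn from X's imports or from sibling attributes, then

```latex
P\bigl(\mathrm{value}(X) \mid \mathrm{imports}(X)\bigr)
  \;=\; \prod_{i=1}^{k} P\bigl(\mathrm{value}(A_i) \mid \mathrm{imports}(A_i)\bigr),
```

with the base case that a simple object's distribution is just its conditional probability table.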

Old MacDonald Case Study. Reference: O. Bangsø and P.-H. Wuillemin, "Top-down construction and repetitive structures representation in Bayesian networks," Proceedings of the 13th International Florida Artificial Intelligence Research Society Conference (FLAIRS-2000), pp. 282–286, AAAI Press, 2000.

Subclassing and Inheritance. For a class C' to be a subclass of a class C, it must hold that: the set of input variables of C is a subset of the input variables of C'; and the set of output variables of C is a subset of the output variables of C'.
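A minimal sketch of checking this interface condition (plain dictionaries, hypothetical class names); it is what makes an instance of the subclass usable wherever an instance of the superclass is expected:

```python
def is_valid_subclass(sub, sup):
    """sub and sup are dicts with 'inputs' and 'outputs' sets of variable names."""
    return sup["inputs"] <= sub["inputs"] and sup["outputs"] <= sub["outputs"]

cow      = {"inputs": {"food"},          "outputs": {"weight"}}
milk_cow = {"inputs": {"food", "water"}, "outputs": {"weight", "milk"}}
print(is_valid_subclass(milk_cow, cow))   # True: MilkCow can stand in for Cow
```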

Reference: F. V. Jensen and T. D. Nielsen, Bayesian Networks and Decision Graphs, 2nd ed., Springer, 2007.

OOBN Inference. The OOBN representation allows us to construct large, complex models easily. Can we do inference in these models? The constructed BN can be very large… does it still support efficient inference?

Approaches to Inference. Convert the OOBN to a normal BN and use standard inference techniques. Convert the OOBN to an MSBN (multiply sectioned Bayesian network) and apply the MSBN inference approach. By exploiting the modularity we can obtain good results; algorithms are still being developed in this area.
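A hedged sketch of the first approach, unrolling: give every node a name qualified by its instance path and flatten nested instances into one ordinary BN structure, after which any standard inference algorithm applies. The class definitions and the way input bindings are wired here are simplifying assumptions, not a real OOBN tool:

```python
def unroll(class_def, path=""):
    """Flatten nested instances into {qualified node name: list of parent names}."""
    flat = {}
    prefix = path + "." if path else ""
    for node, parents in class_def.get("nodes", {}).items():
        flat[prefix + node] = [prefix + p for p in parents]
    for inst_name, (sub_class, bindings) in class_def.get("instances", {}).items():
        sub = unroll(sub_class, prefix + inst_name)
        # bindings wire the instance's input nodes to nodes of the enclosing object
        for inp, src in bindings.items():
            sub[f"{prefix}{inst_name}.{inp}"] = [prefix + src]
        flat.update(sub)
    return flat

cow_class  = {"nodes": {"food": [], "milk": ["food"]}}
farm_class = {"nodes": {"weather": []},
              "instances": {"cow1": (cow_class, {"food": "weather"}),
                            "cow2": (cow_class, {"food": "weather"})}}
for node, parents in unroll(farm_class).items():
    print(node, "<-", parents)
```

The repeated cow structure shows where the modularity mentioned above comes from: the same class is unrolled many times, so the flat network contains many identical pieces.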

Conclusion. In essence, where Bayesian networks contain two types of knowledge, relevance relationships and conditional probabilities, OOBNs contain a third type of knowledge: organizational structure. OOBNs can model static situations, but they cannot model situations in which the set of instances changes over time.

References
D. Koller and A. Pfeffer. Object-Oriented Bayesian Networks. Proceedings of the Thirteenth Annual Conference on Uncertainty in Artificial Intelligence, August 1-3, 1997, Brown University, Providence, Rhode Island, USA. Morgan Kaufmann Publishers Inc., San Francisco, 1997.
K. B. Laskey and S. M. Mahoney. Network Fragments: Representing Knowledge for Constructing Probabilistic Models. Proceedings of the Thirteenth Annual Conference on Uncertainty in Artificial Intelligence. Morgan Kaufmann Publishers Inc., San Francisco, 1997.
O. Bangsø and P.-H. Wuillemin. Top-down construction and repetitive structures representation in Bayesian networks. Proceedings of the 13th International Florida Artificial Intelligence Research Society Conference (FLAIRS-2000), pp. 282–286, AAAI Press, 2000.
M. Neil, N. Fenton and L. M. Nielsen (2000). Building Large-Scale Bayesian Networks. The Knowledge Engineering Review 15(3): 257–284.
J. Pearl (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Series in Representation and Reasoning, Morgan Kaufmann Publishers, San Mateo, CA.
M. Julia Gallego. Bayesian networks inference: Advanced algorithms for triangulation and partial abduction. Ph.D. dissertation, Departamento de Sistemas Informáticos, University of Castilla-La Mancha (UCLM), 2005.
U. B. Kjaerulff and A. L. Madsen. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis. Springer, 2008.
F. V. Jensen and T. D. Nielsen. Bayesian Networks and Decision Graphs, 2nd ed., Springer, 2007.
Hugin Tutorial.
H. Simon. "The Architecture of Complexity". Proceedings of the American Philosophical Society, 1962.