1 WHY MAKING BAYESIAN NETWORKS BAYESIAN MAKES SENSE. Dawn E. Holmes Department of Statistics and Applied Probability University of California, Santa Barbara CA 93106, USA

2 Subjective Probability
- Rational degrees of belief.
- Keynes's consensual rational degrees of belief.

3 What is a Bayesian Network?
- A directed acyclic graph:
  - Nodes are variables (discrete or continuous).
  - Arcs indicate dependence between variables.
- Conditional probabilities (local distributions).
- Missing arcs imply conditional independence.
- Independencies + local distributions => specification of a joint distribution (see the factorization below).
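For reference, this is the standard factorization (stated here for completeness, not taken verbatim from the slides): for variables $X_1, \dots, X_n$ with $\mathrm{pa}(X_i)$ denoting the parents of $X_i$ in the DAG,

$$P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{pa}(X_i)\bigr),$$

so the independencies encoded by missing arcs, together with the local conditional distributions, fully determine the joint distribution.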

4 Bayesian Networks
- In classical Bayesian network theory a prior distribution must be specified.
- When information is missing, we are able to find a minimally prejudiced prior distribution using MaxEnt (formulated below).
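In the usual formulation (standard MaxEnt notation, added here for reference), the minimally prejudiced prior is the distribution that maximizes the Shannon entropy subject to whatever constraints the available information imposes:

$$\max_{p} \; H(p) = -\sum_{i} p_i \log p_i \quad \text{subject to} \quad \sum_{i} p_i = 1, \qquad \sum_{i} p_i f_j(x_i) = F_j, \; j = 1, \dots, m,$$

where the $f_j$ encode the known partial information (e.g. expert-supplied marginals or conditionals) and everything left unconstrained is spread as evenly as possible.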

5 A Simple Bayesian Network

6 Priors for Bayesian Networks
- Using frequentist probabilities results in a rigid network.
- The results obtained using Bayesian networks are only as good as their prior distribution.
- The maximum entropy formalism.

7 Maximum Entropy and the Principle of Insufficient Reason
- The principle of maximum entropy is a generalization of the principle of insufficient reason.
- It is capable of determining a probability distribution for any combination of partial knowledge and partial ignorance (see the sketch below).
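A minimal numerical sketch of this point (my own illustration, not code from the talk; the 4.5 mean is a hypothetical expert constraint): with no information beyond normalization, MaxEnt reproduces insufficient reason's uniform distribution over a die's six faces, while a single known expectation tilts the distribution no further than that constraint forces.

```python
# MaxEnt over a six-sided die: partial ignorance vs. partial knowledge.
import numpy as np
from scipy.optimize import minimize

outcomes = np.arange(1, 7)

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)    # avoid log(0) at the boundary
    return np.sum(p * np.log(p))  # minimizing -H(p) maximizes entropy

def maxent(extra_constraints):
    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}] + extra_constraints
    res = minimize(neg_entropy, np.full(6, 1.0 / 6.0),
                   bounds=[(0, 1)] * 6, constraints=cons)
    return res.x

# Partial ignorance only: MaxEnt returns the uniform distribution, p_i = 1/6.
print(np.round(maxent([]), 3))

# Partial knowledge: a (hypothetical) expert asserts the mean outcome is 4.5;
# MaxEnt shifts mass toward the high faces, but no more than required.
mean_is_45 = {"type": "eq", "fun": lambda p: p @ outcomes - 4.5}
print(np.round(maxent([mean_is_45]), 3))
```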

8 Two Results
- An iterative algorithm for updating probabilities in a multivalued multiway tree is given.
- A Lagrange multiplier technique is used to find the probability of an arbitrary state in a Bayesian tree using only MaxEnt (the general form of this step is sketched below).
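For reference, the generic Lagrange multiplier step in any MaxEnt problem (standard machinery; the tree-specific derivation from the talk is not reproduced here) introduces a multiplier $\lambda_0$ for normalization and $\lambda_j$ for each expectation constraint:

$$\mathcal{L}(p, \lambda) = -\sum_i p_i \log p_i - \lambda_0 \Bigl(\sum_i p_i - 1\Bigr) - \sum_j \lambda_j \Bigl(\sum_i p_i f_j(x_i) - F_j\Bigr).$$

Setting $\partial \mathcal{L} / \partial p_i = 0$ gives the familiar exponential-family solution

$$p_i = \frac{1}{Z(\lambda)} \exp\Bigl(-\sum_j \lambda_j f_j(x_i)\Bigr), \qquad Z(\lambda) = \sum_i \exp\Bigl(-\sum_j \lambda_j f_j(x_i)\Bigr),$$

with the $\lambda_j$ determined by the constraints themselves.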

9 Fragment of State Table

10 A Simple Bayesian Network

11 A Simple BN with Maximum Entropy

12 Maximum Entropy in Bayesian Networks
- Maximum entropy provides a technique for eliciting knowledge from incomplete information.
- We use the maximum entropy formalism to optimally estimate the prior distribution of a Bayesian network.
- All and only the information provided by expert knowledge is used (see the sketch below).
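A small sketch of what "all and only" means in practice (my own illustration with hypothetical expert numbers, not from the talk): for a two-node network A -> B, suppose the expert supplies P(A=1)=0.3 and P(B=1|A=1)=0.8 and nothing else. MaxEnt fills in the rest of the joint, and the unspecified conditional P(B=1|A=0) comes out at exactly 0.5, the least prejudiced value.

```python
# MaxEnt prior for a two-node network A -> B from partial expert knowledge.
import numpy as np
from scipy.optimize import minimize

# Joint states ordered (A,B) = (0,0), (0,1), (1,0), (1,1).
def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},               # normalization
    {"type": "eq", "fun": lambda p: p[2] + p[3] - 0.3},           # P(A=1) = 0.3
    {"type": "eq", "fun": lambda p: p[3] - 0.8 * (p[2] + p[3])},  # P(B=1|A=1) = 0.8
]
res = minimize(neg_entropy, np.full(4, 0.25),
               bounds=[(0, 1)] * 4, constraints=cons)
p = res.x
print("joint:", np.round(p, 3))                       # [0.35, 0.35, 0.06, 0.24]
print("P(B=1|A=0):", round(p[1] / (p[0] + p[1]), 3))  # 0.5, by MaxEnt alone
```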

13 What Use are Subjective Bayesian Prior Distributions?
- Why determine the prior distribution for a Bayesian network using maximum entropy?
- Any problem involving probabilities can be represented by a Bayesian network.

14 Independence
- Proofs must not use techniques outside of MaxEnt.
- Proofs have already been given elsewhere.

15 Thank You!