Probabilistic Reasoning with Uncertain Data
Yun Peng, Zhongli Ding, Rong Pan, Shenyong Zhang

Uncertain Evidence
► Causes of uncertainty in evidence
  – Observation error
  – Inability to observe the precise state the world is in
► Two types of uncertain evidence
  – Virtual evidence: evidence with uncertainty, e.g., "I am not sure about my observation that A = a1."
  – Soft evidence: evidence of uncertainty, e.g., "I cannot observe the state of A, but I have observed the distribution of A to be R(A) = (0.7, 0.3)."

Virtual Evidence
► Represent the uncertainty of a VE by a likelihood ratio
  L(A) = P(obs | a1) : P(obs | a2) : ... : P(obs | an)
  – This ratio shall be preserved (invariant) in belief update
► Implemented by adding a VE node (a numeric sketch follows below)
  – It is a leaf node with A as its only parent
  – Its CPT conforms to the likelihood ratio
  – Many BN engines accept likelihood ratios directly
  – Multiple VEs are not a problem
[Figure: network fragments with VE nodes ve_A and ve_B attached to evidence nodes A and B]
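As an illustration, here is a minimal numpy sketch of a likelihood-ratio update on a tiny two-node network A → B; the code and all numbers are invented for this example, not taken from the authors.

```python
# Minimal sketch: virtual evidence on a two-node network A -> B.
import numpy as np

P_A = np.array([0.6, 0.4])              # P(A = a1), P(A = a2)
P_B_given_A = np.array([[0.9, 0.1],     # P(B | A = a1)
                        [0.3, 0.7]])    # P(B | A = a2)
joint = P_A[:, None] * P_B_given_A      # joint P(A, B)

# Virtual evidence on A with likelihood ratio L(A) = P(obs|a1) : P(obs|a2).
L = np.array([0.8, 0.2])

# Belief update: weight each a-row by L(a), then renormalize.
Q = joint * L[:, None]
Q /= Q.sum()

print("Q(A) =", Q.sum(axis=1))          # posterior odds = prior odds * L
print("Q(B) =", Q.sum(axis=0))
```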

Soft Evidence
► Represent the uncertainty of an SE by a distribution R(A)
  – R(A) itself is believed without uncertainty and must be preserved (invariant) in belief update
► Reasoning with a single SE: Jeffrey's rule
  – For the given SE R(A):
    Q(A) = R(A) for the evidence variable A
    Q(x) = Σ_i P(x | a_i) R(a_i) for the rest of the variables
  – For a BN: convert the SE to a VE with likelihood ratio
    L(A) = R(a1)/P(a1) : R(a2)/P(a2) : ... : R(an)/P(an)
    (a numeric sketch follows below)
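Continuing the invented A → B example from above, the sketch below converts a soft evidence R(A) into the equivalent virtual-evidence likelihood ratio and checks that the update reproduces Jeffrey's rule:

```python
# Sketch: soft evidence as a virtual-evidence ratio L(a_i) = R(a_i) / P(a_i).
import numpy as np

P_A = np.array([0.6, 0.4])
P_B_given_A = np.array([[0.9, 0.1],
                        [0.3, 0.7]])
joint = P_A[:, None] * P_B_given_A      # joint P(A, B)

R_A = np.array([0.7, 0.3])              # soft evidence: observed distribution of A
L = R_A / P_A                           # equivalent virtual-evidence ratio

Q = joint * L[:, None]
Q /= Q.sum()

print("Q(A) =", Q.sum(axis=1))          # exactly R(A): the SE is preserved
print("Q(B) =", Q.sum(axis=0))          # Jeffrey's rule: sum_i P(B|a_i) R(a_i)
print("check:", R_A @ P_B_given_A)      # same numbers, computed directly
```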

Multiple Soft Evidences
► Problem: cannot satisfy all SEs at once
  – Updating one variable's distribution to its target value (the observed distribution) can push the other variables' distributions off their targets
[Figure: network fragment with SE nodes se_A and se_B attached to A and B]
► Solution: IPFP
  – A procedure that modifies a distribution to fit one or more distributions over subsets of its variables

Jeffrey's Rule
Jeffrey's rule (J-conditioning) (R. Jeffrey, 1983)
– Given SE R(a), any other variable c is updated by Q(c) = Σ_i P(c | a_i) R(a_i)
– Extending Jeffrey's rule to the entire distribution: Q(x) = P(x | a_i) R(a_i) for every x consistent with a_i
– Q(a) = R(a)
– Among all JPDs satisfying R(a), Q(x) has the smallest KL distance (I-divergence) to the original P(x)
– Q(x) is called an I-projection of P(x) on R(a) (stated compactly below)
What if we have more than one SE?
– R1(educ) and R2(smoker) (constraints)
– How do we make a minimum change to P(x) that satisfies ALL constraints?
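For reference, the I-projection property behind Jeffrey's rule can be stated compactly (a standard formulation, assuming amsmath; not copied verbatim from the slides):

```latex
% Among all Q' that satisfy the soft evidence, Jeffrey's rule yields the
% unique minimizer of the I-divergence (KL distance) to the original P.
\[
  Q \;=\; \operatorname*{arg\,min}_{Q'\,:\,Q'(a) = R(a)} I(Q' \parallel P),
  \qquad
  I(Q' \parallel P) \;=\; \sum_x Q'(x) \log \frac{Q'(x)}{P(x)} .
\]
\[
  \text{Closed form: }\;
  Q(x) \;=\; P(x \mid a_i)\, R(a_i) \;=\; P(x)\, \frac{R(a_i)}{P(a_i)}
  \quad \text{for each } x \text{ consistent with } a_i .
\]
```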

IPFP
We can try Jeffrey's rule repeatedly
– First on P(x) using R1 → Q1(x)
– Then on Q1(x) using R2 → Q2(x)
– Q2(x) satisfies R2 but, in general, no longer R1
Iterative Proportional Fitting Procedure (IPFP)
– Proposed by R. Kruithof (1937); convergence proved by I. Csiszar (1975)
– Loops over the set of constraints; each step fits one constraint R_i(y):
  Q_k(x) = Q_{k-1}(x) · R_i(y) / Q_{k-1}(y), where y is the subset of variables constrained by R_i
– Converges to Q*(x), the I-projection of P(x) on the set of given constraints (a sketch follows below)
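Here is a minimal IPFP sketch on an explicit joint table over two binary variables; the constraints and starting point are invented:

```python
# Minimal IPFP: alternately fit each marginal constraint, rescaling the joint.
import numpy as np

Q = np.full((2, 2), 0.25)                 # starting joint Q0(a, b); uniform here
R1 = np.array([0.7, 0.3])                 # constraint on a
R2 = np.array([0.6, 0.4])                 # constraint on b

for _ in range(100):                      # loop over constraints until convergence
    Q *= (R1 / Q.sum(axis=1))[:, None]    # fit R1: Q_k = Q_{k-1} * R1(a) / Q_{k-1}(a)
    Q *= (R2 / Q.sum(axis=0))[None, :]    # fitting step for R2(b)

print(Q.sum(axis=1), Q.sum(axis=0))       # both marginals now match R1 and R2
```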

IPFP
[Figure: IPFP as alternating I-projections. Starting from P, each step projects onto the set of all JPDs satisfying R1 or onto the set of all JPDs satisfying R2, producing Q1, Q2, Q3, ... and converging to Q*.]

IPFP
Problems with IPFP
– Very slow: each iteration (fitting step) has complexity O(2^|x|) on the full joint
  – Remedy: factorization → Bayesian network (BN)
– Inconsistent constraints: if no JPD satisfies all constraints, IPFP won't converge (it oscillates)

BN Belief Update with SE
BN belief update with hard evidence (HE)
– HE: a = A1, b = B3
– Clamp node a to A1 and node b to B3
– Calculate P(c | A1, B3) for all c
Virtual evidence
– Captures uncertainty about the HE (the observation)
– Represented as a likelihood ratio L(a)
– Virtual node ve_a, with a conditional probability table calculated from L(a) (a sketch of this construction follows below)
– When ve_a is clamped to "true", P(a) is updated so that its likelihood ratio equals L(a)
[Figure: BN fragments showing evidence nodes a, b with attached VE nodes ve_a, ve_b]
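One standard way to fill in the VE node's CPT from a likelihood ratio is sketched below (the helper name is ours; any positive scaling of L gives the same posterior once ve is clamped to true, so we scale by max(L) to keep every entry in [0, 1]):

```python
# Hypothetical helper: build P(ve = true | a_i) from a likelihood ratio L(a).
import numpy as np

def ve_cpt_from_ratio(L):
    L = np.asarray(L, dtype=float)
    p_true = L / L.max()                      # P(ve=true | a_i), proportional to L(a_i)
    return np.stack([p_true, 1.0 - p_true])   # row 0: ve=true, row 1: ve=false

print(ve_cpt_from_ratio([0.8, 0.2]))
# [[1.   0.25]    <- P(ve=true | a1), P(ve=true | a2)
#  [0.   0.75]]
```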

BN Belief Update with SE
Convert an SE to a VE
– Likelihood ratio L(a) = R1(a1)/P(a1) : ... : R1(an)/P(an)
– Belief update with ve_a yields Q(a) = R1(a)
This does not work with multiple SEs
– When both se_a and se_b are applied, Q(a) != R1(a) and Q(b) != R2(b)
[Figure: BN fragment with SE nodes se_a and se_b attached to a and b]
Solution: combine VE with IPFP

BN Belief Update with SE
V-IPFP: at the k-th iteration
– Pick an SE, say R1(a), and create a new VE node ve_{i,j} with likelihood ratio L(a) = R1(a1)/Q_k(a1) : ... : R1(an)/Q_k(an)
– Apply ve_{i,j} to update the entire network (a sketch follows below)
[Figure: the sequence of VE nodes se_{a,1}, se_{b,1}, se_{a,2}, ... added over the iterations]
Convergence
– Converges to the I-projection of the original distribution on all constraints
Cost
– Space: small
– Time: large for large BNs
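Below is a joint-table sketch of the V-IPFP loop; in a real BN the marginal Q_k(a) would come from the inference engine and the ratio would be attached as a fresh VE node rather than multiplied into an explicit joint. Numbers are invented.

```python
# Sketch of the V-IPFP loop over two soft evidences, on an explicit joint.
import numpy as np

Q = np.array([[0.54, 0.06],
              [0.12, 0.28]])                # initial joint P(a, b) from the BN
constraints = [(0, np.array([0.7, 0.3])),   # se_a: target R1(a)
               (1, np.array([0.6, 0.4]))]   # se_b: target R2(b)

for k in range(50):
    axis, R = constraints[k % 2]            # pick the next soft evidence
    marginal = Q.sum(axis=1 - axis)         # current belief Q_k on that variable
    L = R / marginal                        # likelihood ratio of the new VE node
    shape = [1, 1]
    shape[axis] = 2
    Q = Q * L.reshape(shape)                # applying the VE rescales the joint
    Q /= Q.sum()

print(Q.sum(axis=1), Q.sum(axis=0))         # both targets are met at convergence
```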

Inconsistent Constraints
SMOOTH:
– Phase I: apply IPFP until oscillation is detected; this pulls Q into the neighborhood of the solution
– Phase II: continue IPFP, but modify the constraint at each step: a new constraint (the current constraint adjusted by influences from the other constraints) is generated at each step, and the original constraints are gradually phased out
Serialized GEMA:
– New constraints are generated based only on the current constraint and the new constraint carrying influences from the other constraints
– Incorporating it into V-IPFP for BN reasoning is straightforward

BN Learning with Uncertain Data
Modify a BN by a set of low-dimensional PDs (constraints)
– Approach 1:
  – Compute the JPD P(x) from the BN
  – Modify P(x) to Q*(x) by the constraints, using IPFP
  – Construct a new BN from Q*(x) (it may have a different structure than the original BN)
– Our approach:
  – Keep the BN structure unchanged; modify only the CPTs
  – We developed a localized version of IPFP
– Next steps:
  – Dealing with inconsistency
  – Changing structure (the minimum necessary)
  – Learning both structure and CPTs from mixed data (samples plus low-dimensional PDs)

Remarks
Wide potential applications
– Probabilistic resources are everywhere (survey data, databases, probabilistic knowledge bases of different kinds)
– This line of research may lead to effective ways to connect them
Problems with the IPFP-based approaches
– Computationally expensive
– Hard to do mathematical proofs

References:
[1] Peng, Y., Zhang, S., Pan, R.: "Bayesian Network Reasoning with Uncertain Evidences", International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 18 (5), 2010.
[2] Pan, R., Peng, Y., and Ding, Z.: "Belief Update in Bayesian Networks Using Uncertain Evidence", in Proceedings of the IEEE International Conference on Tools with Artificial Intelligence (ICTAI-2006), Washington, DC, Nov 13-15, 2006.
[3] Peng, Y. and Ding, Z.: "Modifying Bayesian Networks by Probability Constraints", in Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence (UAI-2005), Edinburgh, Scotland, July 26-29, 2005.