A Variational Approach for Approximating Bayesian Networks by Edge Deletion
Arthur Choi and Adnan Darwiche, UCLA ({aychoi,darwiche}@cs.ucla.edu)
Slides used for the plenary presentation at UAI-06. Updated 09/21/2006.

The Idea (figure: a network over variables A, B, C, D, before and after deleting an edge)
Approximate inference: exact inference in an approximate model.
Approximate model: obtained by deleting edges.
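To make "exact inference in an approximate model" concrete, here is a minimal sketch, assuming a made-up diamond network A → B, A → C, B → D, C → D with made-up CPTs; it computes a marginal by brute-force enumeration, then again after severing C → D. The naive replacement CPT for D is only a placeholder, since the paper's edge parametrization appears on later slides.

```python
# Brute-force marginal in a diamond network, before and after deleting
# the edge C -> D. All CPTs are made-up illustrative assumptions.
from itertools import product

p_a = {0: 0.5, 1: 0.5}
p_b = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.7}  # P(b | a)
p_c = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.1, (1, 1): 0.9}  # P(c | a)
p_d = {  # P(d | b, c), keyed by (d, b, c)
    (0, 0, 0): 0.9, (1, 0, 0): 0.1, (0, 0, 1): 0.5, (1, 0, 1): 0.5,
    (0, 1, 0): 0.4, (1, 1, 0): 0.6, (0, 1, 1): 0.2, (1, 1, 1): 0.8,
}

def marginal_d(d, d_table, uses_c):
    # Sum the joint over all instantiations of A, B, C.
    total = 0.0
    for a, b, c in product([0, 1], repeat=3):
        cpt = d_table[(d, b, c)] if uses_c else d_table[(d, b)]
        total += p_a[a] * p_b[(b, a)] * p_c[(c, a)] * cpt
    return total

print(marginal_d(1, p_d, uses_c=True))  # exact P(D = 1)

# Approximate model: sever C -> D by averaging C out of D's CPT
# (a naive stand-in, not the paper's edge parametrization).
p_d_cut = {(d, b): 0.5 * (p_d[(d, b, 0)] + p_d[(d, b, 1)])
           for d in [0, 1] for b in [0, 1]}
print(marginal_d(1, p_d_cut, uses_c=False))  # P'(D = 1) after edge deletion
```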

The Idea (figure: the network over A, B, C, D with auxiliary nodes X and Y added where an edge was deleted)
Approximate inference: exact inference in an approximate model.
Approximate model: obtained by deleting edges.
Specifying auxiliary parameters: Method 1: BP. Method 2: KL.

The Idea (figure: the original network and the approximate network)

Deleting an Edge (figure: an edge U → X)

Deleting an Edge: The Clone (figure: the edge U → X is deleted, and X instead gets a clone U' of U as its parent)

Deleting an Edge: The Soft Evidence (figure: soft evidence s' is asserted on U, and the clone U' is the parent of X)
New edge parameters are needed for each new query.
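A hedged sketch of this construction, on an assumed chain V → U → X with made-up CPTs: the edge U → X is deleted, X gets the clone U' as its parent with prior theta, and soft evidence lam is asserted on U. The uniform values of theta and lam below are placeholders; the paper chooses these edge parameters per query.

```python
# The edge-deletion construction: delete U -> X, give X a clone parent U'
# with prior theta, and assert soft evidence lam on U. The chain
# V -> U -> X, its CPTs, theta, and lam are illustrative assumptions.
from itertools import product

p_v = {0: 0.7, 1: 0.3}
p_u = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}    # P(u | v)
p_x = {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.25, (1, 1): 0.75}  # P(x | parent)

theta = {0: 0.5, 1: 0.5}  # prior on the clone U' (an edge parameter)
lam = {0: 1.0, 1: 1.0}    # soft-evidence weight on U (an edge parameter)

def approx_posterior_u(x_evidence):
    """P'(U | x) in the edge-deleted network, by brute-force enumeration."""
    scores = {0: 0.0, 1: 0.0}
    for v, u, u_clone in product([0, 1], repeat=3):
        # X now depends on the clone U'; lam(u) stands in for the
        # information that used to flow back to U along the deleted edge.
        w = p_v[v] * p_u[(u, v)] * theta[u_clone] * p_x[(x_evidence, u_clone)] * lam[u]
        scores[u] += w
    z = scores[0] + scores[1]
    return {u: s / z for u, s in scores.items()}

print(approx_posterior_u(x_evidence=1))
```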

Specifying the Approximation
How do we parametrize edges? Compensate for the missing edge (quality of approximation).
Which edges do we delete? (computational complexity)

A First Approach: ED-BP (Edge Deletion-Belief Propagation)
Choose edge parameters that satisfy fixed-point conditions (figure: deleted edge U → X with clone U' and soft evidence s').
The conditions can be used as update equations: initialize the parameters randomly, then iterate until a fixed point is reached.
To be presented at AAAI-06.

Belief Propagation as Edge Deletion
Theorem: IBP corresponds to ED-BP (figure: deleted edge U → X with clone U' and soft evidence s').

Belief Propagation as Edge Deletion
IBP in the original network is equivalent to ED-BP in a disconnected approximation.
To be presented at AAAI-06.
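Since this slide identifies IBP with ED-BP on a disconnected approximation, a minimal sketch of iterative (loopy) belief propagation may help fix ideas. The pairwise model on a 4-cycle, the random potentials, and the convergence threshold are all illustrative assumptions, not the networks or updates from the paper.

```python
# Iterative (loopy) belief propagation on a small pairwise model:
# initialize messages, iterate the sum-product updates to a fixed point.
import numpy as np

nodes = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")]

rng = np.random.default_rng(0)
pot = {e: rng.random((2, 2)) + 0.1 for e in edges}  # positive pairwise potentials
unary = {n: np.ones(2) for n in nodes}              # uniform unary potentials

def neighbors(n):
    return [e for e in edges if n in e]

def potential(e, frm):
    # Orient the pairwise table so axis 0 indexes the sender `frm`.
    m = pot[e]
    return m if e[0] == frm else m.T

# msgs[(e, frm)] is the message sent along edge e by node `frm`.
msgs = {(e, n): np.ones(2) for e in edges for n in e}

for it in range(100):
    new = {}
    for e in edges:
        for frm in e:
            # Multiply unary potential by messages arriving on the
            # sender's other edges, then push through the pairwise table.
            incoming = unary[frm].copy()
            for e2 in neighbors(frm):
                if e2 != e:
                    sender = e2[0] if e2[1] == frm else e2[1]
                    incoming *= msgs[(e2, sender)]
            m = potential(e, frm).T @ incoming
            new[(e, frm)] = m / m.sum()
    delta = max(np.abs(new[k] - msgs[k]).max() for k in msgs)
    msgs = new
    if delta < 1e-9:  # fixed point reached
        break

for n in nodes:  # node beliefs at the fixed point
    b = unary[n].copy()
    for e in neighbors(n):
        other = e[0] if e[1] == n else e[1]
        b *= msgs[(e, other)]
    print(n, b / b.sum())
```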

Edge Recovery using Mutual Information
Score each deleted edge by MI(U; U' | e') (figure: deleted edge U → X with clone U' and soft evidence s').
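As a hedged illustration of this recovery heuristic, the sketch below ranks deleted edges by the mutual information between U and its clone U'. The joint tables are made-up placeholders for P'(u, u' | e'); in ED-BP they would be computed in the approximate network.

```python
# Rank deleted edges by the mutual information of U and its clone U'.
# The joint tables below are made-up placeholders for P'(u, u' | e').
import numpy as np

def mutual_information(joint):
    """MI of a 2-D joint distribution table (rows: U, columns: U')."""
    joint = joint / joint.sum()
    pu = joint.sum(axis=1, keepdims=True)
    pu_prime = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (pu @ pu_prime)[mask])).sum())

candidate_edges = {
    ("U1", "X1"): np.array([[0.40, 0.10], [0.10, 0.40]]),  # U and U' strongly agree
    ("U2", "X2"): np.array([[0.25, 0.25], [0.25, 0.25]]),  # independent: MI = 0
}
ranked = sorted(candidate_edges,
                key=lambda e: mutual_information(candidate_edges[e]),
                reverse=True)
print(ranked)  # recover the highest-scoring edges first
```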

A First Approach: ED-BP (Edge Deletion-Belief Propagation)
How do we parametrize edges? ED-BP subsumes BP as a degenerate case.
Which edges do we delete? Recover edges using mutual information.

A Second Approach Based on the KL-Divergence

A Simple Bound on the KL-Divergence (figure: a Bayesian network with edge U → X; an extended network that adds a clone U' with CPT q(u'|u) = 1 iff u' = u; and the approximation obtained by deleting the edge)
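For reference, a minimal sketch of the KL-divergence between an approximate and an exact posterior, each given as a table over the same states; the two tables below are illustrative stand-ins for P'(X | e') and P(X | e), not distributions from the paper.

```python
# KL(P' || P) between an approximate and an exact posterior, each a
# normalized table over the same states. The tables are stand-ins.
import numpy as np

def kl_divergence(p_approx, p_exact):
    p_approx = p_approx / p_approx.sum()
    p_exact = p_exact / p_exact.sum()
    mask = p_approx > 0  # 0 * log(0) contributes nothing
    return float((p_approx[mask] * np.log(p_approx[mask] / p_exact[mask])).sum())

p_exact = np.array([0.5, 0.3, 0.15, 0.05])     # stand-in for P(X | e)
p_approx = np.array([0.45, 0.35, 0.12, 0.08])  # stand-in for P'(X | e')
print(kl_divergence(p_approx, p_exact))
```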

Identifying Edge Parameters: ED-KL
Theorem 1: Edge parameters are a stationary point of the KL-divergence if and only if they satisfy a first fixed-point condition (figure: deleted edge U → X with clone U' and soft evidence s').
Theorem 2: Edge parameters are a stationary point of the KL-divergence if and only if they satisfy a second, equivalent fixed-point condition.

Deleting a Single Edge
When a single edge is deleted, we can compute its KL-divergence efficiently and iterate efficiently.

Identifying Edges to Delete (figure: candidate edges scored by single-edge KL-divergence, kl1 through kl6)

Comparing ED-BP & ED-KL: each approach is characterized by its own fixed-point condition on the edge parameters.

Quality of Approximation (plots: approximation quality ranging from the fully disconnected approximation to exact inference, with belief propagation marked for comparison)

Quality of Approximation, Extreme Cases

Approximating MAP
Consider the MAP explanation: argmax_m P(m, e), maximizing over the MAP variables M while summing out the remaining variables.
MAP is hard even when marginals are easy! Computing P(e) is exponential only in treewidth; MAP is exponential in the constrained treewidth.
Delete edges to reduce the constrained treewidth!
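A brute-force sketch of the contrast, on an assumed chain A → B → C with made-up CPTs: computing P(e) sums out every hidden variable, while MAP maximizes over the MAP variables M and sums out only the rest.

```python
# Contrast P(e) (sum over all hidden variables) with MAP (max over the
# MAP variables M, sum over the rest Y). The chain A -> B -> C and its
# CPTs are illustrative assumptions.
from itertools import product

p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}  # (c, b)

def joint(a, b, c):
    return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

e_c = 1  # evidence: C = 1

# P(e): sum out both A and B.
p_e = sum(joint(a, b, e_c) for a, b in product([0, 1], repeat=2))

# MAP over M = {A}, summing out Y = {B}: argmax_a sum_b P(a, b, e).
scores = {a: sum(joint(a, b, e_c) for b in [0, 1]) for a in [0, 1]}
map_a = max(scores, key=scores.get)
print(p_e, map_a, scores[map_a])
```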

Quality of MAP Approximations (plots)

Complexity of Approximation

Summary
Approximate inference: exact inference in an approximate model; trade off approximation quality against computational resources by deleting edges.
Parametrizing deleted edges: ED-BP subsumes belief propagation (a new understanding of belief propagation); ED-KL is a variational approach.
Choosing which edges to delete: ED-BP recovers edges using mutual information; ED-KL deletes edges by (single-edge) KL-divergence. ED-BP combined with deleting edges by KL is surprisingly good!