Bucket Renormalization for Approximate Inference
Speaker: Michael Chertkov¹. Joint work with Sungsoo Ahn², Adrian Weller³, and Jinwoo Shin². ¹Los Alamos National Laboratory (LANL), ²Korea Advanced Institute of Science and Technology (KAIST), ³University of Cambridge. July 12th, 2018.
Goal: Partition Function Approximation in GMs
A graphical model (GM) is a family of distributions that factorizes according to a graph, used in computer vision [Freeman et al., 2000], social science [Scott, 2017], and deep learning [Hinton & Salakhutdinov, 2006]. The partition function Z is essential for inference and normalization. However, Z is NP-hard to compute, so approximation algorithms are used:
Markov chain Monte Carlo (MCMC): running Markov chains to draw samples from the GM.
- Asymptotically exact, but slow to converge.
Variational inference: casting the computation of Z as an optimization problem.
- Moderately fast and accurate in general, but can output poor results when it fails to converge, e.g., belief propagation (BP).
Approximate elimination: approximately summing out variables one by one.
- Terminates after a fixed number of elimination steps (fastest of the three), but with poor approximation quality, e.g., mini-bucket elimination (MBE).
A brute-force illustration of computing Z appears below.
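As a concrete illustration (my own minimal sketch, not from the talk; all names are illustrative), here is a brute-force computation of Z for a small Ising model. It shows why exact computation is intractable: the sum runs over all 2^n spin configurations.

```python
import itertools
import numpy as np

def ising_log_partition(J, h):
    """Brute-force log Z = log sum_s exp(0.5 * s^T J s + h^T s), s in {-1,+1}^n."""
    n = len(h)
    Z = 0.0
    for spins in itertools.product([-1, 1], repeat=n):
        s = np.array(spins, dtype=float)
        Z += np.exp(0.5 * s @ J @ s + h @ s)  # one of 2^n configurations
    return np.log(Z)

rng = np.random.default_rng(0)
n = 10                                   # 2^10 = 1024 terms; n = 100 is hopeless
J = rng.normal(scale=0.1, size=(n, n))
J = np.triu(J, 1) + np.triu(J, 1).T      # symmetric couplings, zero diagonal
h = rng.normal(scale=0.1, size=n)
print("log Z =", ising_log_partition(J, h))
```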
Contribution: Bucket Renormalization
We propose a new approximate elimination framework, inspired by mini-bucket elimination [Dechter & Rish, 2003] and tensor network renormalization [Levin & Nave, 2007].
Mini-bucket renormalization (MBR): repeatedly renormalizes mini-buckets via rank-1 approximations.
Global-bucket renormalization (GBR): calibrates MBR based on an explicit approximation error.
[Figure: methods placed on an approximation-quality vs. speed plane: MCMC, variational inference, approximate elimination, MBR, GBR]
Empirical observations: MBR ran faster than belief propagation (BP) while attaining smaller log-Z error, and GBR further reduced the error of MBR.
Mini-Bucket Renormalization (MBR)
Repeatedly renormalizing mini-buckets via rank-1 approximations.
Mini-Bucket Renormalization (MBR)
(Exact) variable elimination computes Z by iteratively summing out variables. Its time and memory cost is exponential in the induced width of the elimination order (the slide illustrates this on its example graph). MBR approximates each elimination step, as in the sketch below.
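For intuition, here is a minimal sketch (my own, assuming a chain-structured model with pairwise k x k factors) of exact variable elimination: summing out variables one at a time reduces the cost from O(k^n) to O(n k^2).

```python
import itertools
import numpy as np

def chain_partition_function(factors):
    """Z for a chain GM; factors[i] is the k x k table f_i(x_i, x_{i+1})."""
    message = np.ones(factors[0].shape[0])   # trivial message into variable 0
    for f in factors:
        message = message @ f                # sum out the left variable: O(k^2)
    return message.sum()                     # finally sum out the last variable

k, n = 3, 6
rng = np.random.default_rng(1)
factors = [rng.random((k, k)) + 0.1 for _ in range(n - 1)]

# Sanity check against brute force over all k^n configurations.
Z_brute = sum(
    np.prod([factors[i][c[i], c[i + 1]] for i in range(n - 1)])
    for c in itertools.product(range(k), repeat=n)
)
print(chain_partition_function(factors), Z_brute)
```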
Mini-Bucket Renormalization (MBR)
Idea 1. Splitting variables, then adding singleton factors for compensation. Splitting a variable allows the resulting mini-buckets to be marginalized separately in tractable time (the slide states the complexity for its running example). The singleton factors try to 'compensate' for the splitting of the variable. The number of splits can be controlled by a memory budget (the ibound parameter); see the sketch after this slide. Idea 2. Choosing the compensating factors by comparing to the optimal compensation.
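The following toy sketch (my construction; variable and factor names are illustrative, not from the paper's code) shows what splitting buys: once a shared variable x is split into copies x1 and x2, each mini-bucket can be summed out on its own, at the price of replacing the exact table by an outer product.

```python
import numpy as np

k = 4
rng = np.random.default_rng(2)
f1 = rng.random((k, k))          # f1[x, a]
f2 = rng.random((k, k))          # f2[x, b]

# Exact: sum_x f1(x, a) * f2(x, b) couples a and b -- and becomes
# exponentially expensive when many factors share x.
exact = np.einsum('xa,xb->ab', f1, f2)

# After splitting x into x1, x2 with singleton compensation factors r1, r2:
# sum_{x1} r1(x1) f1(x1, a) * sum_{x2} r2(x2) f2(x2, b),
# an outer-product (rank-1) approximation of the exact table.
r1 = np.ones(k)                  # crude compensation; choosing r1, r2 well
r2 = np.ones(k) / k              # is exactly Idea 2 (rank-1 approximation)
approx = np.outer(r1 @ f1, r2 @ f2)
print("relative error:", np.linalg.norm(exact - approx) / np.linalg.norm(exact))
```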
Mini-Bucket Renormalization (MBR)
Idea 2. Choosing the compensating factors by comparing to the optimal compensation. The error is measured in terms of marginalization over the scope of comparison (i.e., the mini-bucket): the compensating factors are chosen to minimize the L2-difference between the original and the compensated mini-bucket (the slide's formula is omitted here); a hedged reconstruction follows below.
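A hedged reconstruction of Idea 2, under my reading of the method: by the Eckart-Young theorem, the truncated SVD of the mini-bucket table gives its best rank-1 fit in the L2 (Frobenius) sense, so compensating factors can be read off from the leading singular vectors. The shapes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
k = 4
# Mini-bucket reshaped as a matrix: rows index the split variable x,
# columns index the joint state of the rest of the mini-bucket's scope.
B = rng.random((k, k * k))

U, s, Vt = np.linalg.svd(B, full_matrices=False)
B_rank1 = s[0] * np.outer(U[:, 0], Vt[0])    # best rank-1 approximation in L2

# The L2-difference that the optimal compensation achieves:
print("min L2 error:", np.linalg.norm(B - B_rank1))
```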
Global-Bucket Renormalization (GBR)
Calibrating MBR based on an explicit approximation error.
Global-Bucket Renormalization (GBR)
Idea. Increasing the scope of comparison to its maximum (the global-bucket). As the scope grows, the L2-difference approaches the difference in Z itself. However, marginalizing over the maximal scope is as hard as computing Z, so MBR is applied again to approximate the marginalization; a toy illustration follows below.
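A toy numerical illustration of the global-bucket idea (my own construction, not the paper's code): when the scope of comparison is the entire model, the marginalization difference reduces exactly to the difference in Z, which is the gap GBR targets; in practice GBR uses MBR itself to approximate this global quantity.

```python
import numpy as np

rng = np.random.default_rng(4)
k = 3
f1, f2, f3 = (rng.random(k) + 0.1 for _ in range(3))  # three factors sharing x

Z_exact = np.sum(f1 * f2 * f3)          # global marginalization = Z

# MBR-style split: decouple f3's copy of x, compensated by a singleton r.
r = np.full(k, 1.0 / k)                 # crude initial compensation
Z_split = np.sum(f1 * f2) * np.sum(r * f3)

# Over the global bucket, the comparison collapses to |Z - Z~|;
# GBR re-fits r to shrink exactly this gap.
print("|Z - Z~| before calibration:", abs(Z_exact - Z_split))
```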
Experiments
Comparing performance and speed to existing algorithms.
Experiments in Ising GMs
Comparing against approximate elimination and variational inference (VI) methods: Approximate elimination: mini-bucket elimination (MBE) and weighted MBE (WMBE). Variational inference: mean-field (MF), belief propagation (BP), and generalized BP (GBP). Accuracy: MBR and GBR outperform all of these methods. Speed: MBR is as fast as existing approximate elimination methods; GBR improves on MBR but requires more computation (slower).
[Figure: complete graph; log-Z error vs. interaction strength, memory budget, and elapsed time]
Experiments in Ising GMs
The same comparison on a grid graph yields the same conclusions: MBR and GBR are the most accurate, MBR matches the speed of existing approximate elimination methods, and GBR trades extra computation for lower error.
[Figure: grid graph; log-Z error vs. interaction strength, memory budget, and elapsed time]
Experiments in Real-World Models
Datasets from the UAI 2014 Inference Competition: Promedus (medical diagnosis) and Linkage (genetic linkage). MBR and GBR are less accurate than GBP on Promedus, but outperform the other methods in all remaining cases.
[Figure: results on the Linkage and Promedus datasets]