Belief Propagation Revisited Adnan Darwiche
Graphical Models: [Figure: car-diagnosis Bayesian network with nodes Battery Age, Alternator, Fan Belt, Battery, Charge Delivered, Battery Power, Starter, Radio, Lights, Engine Turn Over, Gas Gauge, Gas, Fuel Pump, Fuel Line, Distributor, Spark Plugs, Engine Start]
Graphical Models: [Figure: same network, highlighting the CPT of Lights (ON/OFF) given Battery Power (OK/WEAK/DEAD)]
Probabilistic Reasoning in… diagnosis, planning, learning, channel coding, vision, speech recognition, language comprehension, bioinformatics, …
Treewidth w
Larger shift → closer object; smaller shift → further object
Input: left & right images. Output: depth map. The images define a Bayesian network; reasoning in that Bayesian network estimates depth.
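The shift-to-depth relation behind these slides is standard pinhole-stereo triangulation; a minimal sketch (the focal length and baseline below are hypothetical values, not from the talk):

```python
# Pinhole-stereo triangulation: depth = focal_length * baseline / disparity.
# Camera parameters here are illustrative assumptions.
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Return depth in meters from a pixel disparity (shift between images)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Larger shift -> closer object; smaller shift -> further object:
near = depth_from_disparity(40.0)  # large shift, ~2.1 m
far = depth_from_disparity(5.0)    # small shift, ~16.8 m
assert near < far
```

In a full stereo pipeline, a reasoning algorithm first estimates the disparity at each pixel; this formula then converts disparities into the depth map.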
Belief Propagation
Belief Propagation: What if there are loops?
Loopy Belief Propagation p.235
The Merit of Loopy Belief Propagation: revolutionary error-correcting codes. Turbo codes (Berrou & Glavieux 1993) and LDPC codes (Gallager 1962; MacKay & Neal 1995) can closely approach the theoretical limit of communication over noisy channels. Turbo & LDPC decoders are loopy BP in Bayesian networks! (McEliece, MacKay & Cheng 1998)
Stereo Vision: Two Images → Depth Map
Stereo Vision: the top 4 highest-ranking methods use loopy BP or extend loopy BP
Edge Deletion Semantics (Joint work with Arthur Choi) Energy-Based Semantics (Statistical Physics)
The Idea: [Figure: example network over variables A, B, C, D]
[Figure: the network before and after deleting an edge]
[Figure: the deleted edge replaced by clone variables X and Y] New edge parameters for each query
Specifying the Approximation: How do we parametrize edges? (quality of approximation) Which edges do we delete? (quality of approximation, computational complexity)
Parametrizing Edges: ED-BP. [Figure: deleted edge U → X, with clone U' of U and soft evidence s' on U] Choose the edge parameters (the clone's prior and the soft-evidence likelihoods) to satisfy the ED-BP consistency conditions.
Parametrizing Edges Iteratively: ED-BP Iteration t = 0 Initialization
Parametrizing Edges Iteratively: ED-BP Iteration t = 1
Parametrizing Edges Iteratively: ED-BP Iteration t = 2
Parametrizing Edges Iteratively: ED-BP Iteration t_c: convergence
Belief Propagation as Edge Deletion Iteration t
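A fixed point of these edge-parameter iterations coincides with loopy BP's message-passing updates. As a concrete illustration, here is standard synchronous sum-product loopy BP on a toy pairwise model (not the ED-BP parametrization itself; the 3-cycle topology and potentials are illustrative assumptions, not from the talk):

```python
import numpy as np

# Loopy sum-product BP on a 3-cycle of binary variables (illustrative model).
edges = [(0, 1), (1, 2), (2, 0)]
psi = {e: np.array([[1.0, 0.5], [0.5, 1.0]]) for e in edges}  # pairwise potentials

# msgs[(i, j)] = message from node i to node j, initialized uniform
msgs = {(i, j): np.ones(2) / 2 for i, j in edges}
msgs.update({(j, i): np.ones(2) / 2 for i, j in edges})

def neighbors(i):
    return [j for (a, j) in msgs if a == i]

for _ in range(50):  # iterate updates until (hoped-for) convergence
    new = {}
    for (i, j) in msgs:
        # orient the potential so rows index x_i and columns index x_j
        pot = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T
        # product of messages into i, excluding the one coming back from j
        prod = np.ones(2)
        for k in neighbors(i):
            if k != j:
                prod *= msgs[(k, i)]
        m = pot.T @ prod          # sum out x_i
        new[(i, j)] = m / m.sum() # normalize for numerical stability
    msgs = new

# approximate marginal (belief) of node 0
belief = np.ones(2)
for k in neighbors(0):
    belief *= msgs[(k, 0)]
belief /= belief.sum()
```

On this symmetric model the belief converges to [0.5, 0.5]; the ED-BP view interprets each message as an edge parameter of an exactly solvable network obtained by deleting edges.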
Which Edges To Delete? [Figure: deleted edge U → X, with clone U' and soft evidence s']
ED-BP: Improving on the Quality of IBP [Figure: spectrum of approximations between BP and exact inference]
ED-BP: Potentially Bad Approximations. [Figure: spectrum between BP and exact inference] An unimproved, but costly, approximation.
ED-BP: Improving on the Convergence Rate
ED-BP: Improving on Running Time
Edge Deletion in Undirected Models Original Network Approximate Network
Correcting the Partition Function I. Theorem: If MI(X_i, X_j) = 0 in the ED-BP network M', then the exact partition function Z can be recovered from Z' by a per-edge correction factor.
Deleting Many Edges This will yield the Bethe free energy approximation!
Correcting the Partition Function II. Theorem: For any ED-BP network M', the partition function Z' can be corrected using joint marginals over the endpoints of deleted edges.
Which Edges Do We Recover? [EC2] Recover edges with the largest MI(X_i X_j ; X_k X_l)!
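These mutual-information scores can be computed directly from (approximate) joint marginals. A hypothetical sketch of the simpler pairwise variant (ranking deleted edges by MI(X_i; X_j), as in the "MI" heuristic of the experiments; all names and distributions below are illustrative):

```python
import math

def mutual_information(joint):
    """MI in nats from a joint distribution given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log(p / (px[x] * py[y]))
    return mi

# Hypothetical joint marginals over the endpoints of two deleted edges:
candidates = {
    ("A", "B"): {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4},      # correlated
    ("C", "D"): {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25},  # independent
}
# Recover edges with the largest MI first:
ranked = sorted(candidates, key=lambda e: mutual_information(candidates[e]), reverse=True)
# ranked[0] == ("A", "B"): the correlated pair is recovered first
```

The EC2 heuristic on this slide generalizes the score to MI between pairs of variables, but the ranking step is the same.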
Experiment: Random Grid. [Figure: |log Z - log Z'| vs. number of edges recovered, for EC1/EC2 with random, MI, and 2MI recovery heuristics, plus the Bethe and exact baselines]
Beyond Treewidth… Exact inference: exploit non-structural independence. Approximate inference: exact inference on an approximate network obtained by deleting edges.
What next? Constant factors! Guarantees/bounds on approximations; edge-recovery heuristics (getting the most out of the extra time); controlling the tradeoff between quality & complexity; dynamic models; logical reasoning (survey propagation).