1
Belief Propagation Revisited
Adnan Darwiche
2
Graphical Models
[Figure: car-diagnosis Bayesian network with nodes Battery Age, Alternator, Fan Belt, Battery, Charge Delivered, Battery Power, Starter, Radio, Lights, Engine Turn Over, Gas Gauge, Gas, Fuel Pump, Fuel Line, Distributor, Spark Plugs, Engine Start]
3
Graphical Models
[Figure: the same car-diagnosis network, highlighting the CPT of Lights given Battery Power]

Battery Power | Lights = ON | Lights = OFF
OK            | .99         | .01
WEAK          | .20         | .80
DEAD          | 0           | 1
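To make the CPT concrete, a minimal sketch of how it might be stored and queried in code (the dictionary layout and function name are illustrative, not from the slides):

```python
# P(Lights | Battery Power) from the slide, as a nested dictionary:
# outer key = Battery Power state, inner key = Lights state.
cpt_lights = {
    "OK":   {"ON": 0.99, "OFF": 0.01},
    "WEAK": {"ON": 0.20, "OFF": 0.80},
    "DEAD": {"ON": 0.00, "OFF": 1.00},
}

def p_lights(lights, battery_power):
    """Return P(Lights = lights | Battery Power = battery_power)."""
    return cpt_lights[battery_power][lights]

print(p_lights("OFF", "WEAK"))  # 0.8
```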
4
Graphical Models Battery Age Alternator Fan Belt Battery Charge Delivered Battery Power Starter Radio LightsEngine Turn Over Gas Gauge Gas Fuel Pump Fuel Line Distributor Spark Plugs Engine Start
5
Probabilistic Reasoning in… diagnosis, planning, learning, channel coding, vision, speech recognition, language comprehension, bioinformatics, …
6
Treewidth w
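Why treewidth is the bottleneck (a standard fact about exact inference, not spelled out on this slide; n is the number of variables, d the maximum number of values per variable, and w the treewidth):

```latex
% Time/space complexity of jointree (junction-tree) inference:
O\!\left(n \cdot d^{\,w+1}\right)
```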
7
Larger shift: closer object. Smaller shift: farther object.
8
Input: left & right images. Output: depth map.
The images define a Bayesian network; reasoning in that Bayesian network estimates depth.
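A common way this is formalized (my sketch of the usual pairwise-MRF setup for BP-based stereo, not taken from the slides): each pixel p gets a disparity variable d_p, and inference trades off a per-pixel matching cost against smoothness between neighboring pixels:

```latex
% Pairwise MRF energy commonly used for BP-based stereo:
% D_p = matching cost of assigning disparity d_p at pixel p,
% V   = smoothness penalty between neighboring pixels p, q.
E(d) \;=\; \sum_{p} D_p(d_p) \;+\; \sum_{(p,q)\in\mathcal{N}} V(d_p, d_q)
```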
9
Belief Propagation
11
Belief Propagation: What if there are loops?
13
Loopy Belief Propagation p.235
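For concreteness, a minimal sketch of loopy (sum-product) belief propagation on a small pairwise model with a single loop; the three-variable network, potentials, and iteration count are illustrative choices, not taken from the talk:

```python
import numpy as np

# Toy pairwise model on a loop 0 - 1 - 2 - 0, each variable binary.
unary = {0: np.array([0.6, 0.4]),
         1: np.array([0.5, 0.5]),
         2: np.array([0.3, 0.7])}
pairwise = {(0, 1): np.array([[1.0, 0.5], [0.5, 1.0]]),
            (1, 2): np.array([[1.0, 0.5], [0.5, 1.0]]),
            (0, 2): np.array([[1.0, 0.5], [0.5, 1.0]])}

def edge_pot(i, j):
    """Pairwise potential oriented as (x_i, x_j)."""
    return pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T

neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

# messages[(i, j)] = message from node i to node j, initialized uniform.
messages = {(i, j): np.ones(2) / 2 for i in neighbors for j in neighbors[i]}

for _ in range(50):  # iterate; on loopy graphs convergence is not guaranteed
    new = {}
    for (i, j) in messages:
        # product of i's unary potential and incoming messages, excluding j
        incoming = unary[i].copy()
        for k in neighbors[i]:
            if k != j:
                incoming = incoming * messages[(k, i)]
        msg = edge_pot(i, j).T @ incoming   # sum over x_i
        new[(i, j)] = msg / msg.sum()       # normalize for numerical stability
    messages = new

# Approximate marginals (beliefs).
for i in neighbors:
    b = unary[i].copy()
    for k in neighbors[i]:
        b = b * messages[(k, i)]
    print(i, b / b.sum())
```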
14
The Merit of Loopy Belief Propagation
Revolutionary error-correcting codes:
Turbo codes (Berrou & Glavieux 1993)
LDPC codes (MacKay & Neal 1995; Gallager 1962)
They closely approach the theoretical limit of communication over noisy channels.
Turbo & LDPC decoders: loopy BP in BNs! (McEliece, MacKay & Cheng 1998)
15
Stereo Vision
[Figure: two input images and the resulting depth map]
16
Stereo Vision
http://vision.middlebury.edu/stereo/eval/
The four highest-ranking methods use loopy BP or extend loopy BP.
17
Edge Deletion Semantics (joint work with Arthur Choi)
Energy-Based Semantics (statistical physics)
18
The Idea
[Figure: small example network over variables A, B, C, D]
19
[Figure: the example network over A, B, C, D shown next to an edge-deleted version of it]
20
[Figure: the edge-deleted network over A, B, C, D, with auxiliary nodes X and Y standing in for the deleted edge]
New edge parameters for each query.
21
Specifying the Approximation
How do we parametrize edges? (quality of approximation)
Which edges do we delete? (quality of approximation, computational complexity)
22
Parametrizing Edges: ED-BP
[Figure: deleted edge U → X replaced by a clone U' (new parent of X) and a soft-evidence node s' on U]
Choose parameters that satisfy:
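The conditions themselves did not survive extraction. The following is a reconstruction based on the published ED-BP formulation (Choi & Darwiche), which may differ in notation from the slide: Pr' is the distribution of the edge-deleted network, e is the evidence, and beta > 0 is an arbitrary constant.

```latex
% Reconstruction (not verbatim from the slide) of the ED-BP conditions
% for a deleted edge U -> X, with clone U' and soft-evidence node s' on U.
\theta_{u'} \;=\; \Pr\nolimits'(u \mid e \setminus s')
\qquad\text{and}\qquad
\theta_{s' \mid u} \;=\; \beta \,\Pr\nolimits'(e \setminus s' \mid u)
```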
23
Parametrizing Edges Iteratively: ED-BP Iteration t = 0 Initialization
24
Parametrizing Edges Iteratively: ED-BP Iteration t = 1
25
Parametrizing Edges Iteratively: ED-BP Iteration t = 2
26
Parametrizing Edges Iteratively: ED-BP Iteration t_c (convergence)
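A sketch of the iterative scheme as I understand it from the ED-BP papers (not the slide's own equations): initialize the edge parameters at t = 0, then repeatedly recompute them by exact inference in the edge-deleted network under the current parameters, stopping when they no longer change at some iteration t_c:

```latex
% Assumed fixed-point updates for a deleted edge U -> X with clone U'
% and soft-evidence node s'; Pr'_t uses the iteration-t parameters.
\theta_{u'}^{\,t+1} \;=\; \Pr\nolimits'_{t}(u \mid e \setminus s'),
\qquad
\theta_{s' \mid u}^{\,t+1} \;=\; \beta\, \Pr\nolimits'_{t}(e \setminus s' \mid u)
```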
27
Belief Propagation as Edge Deletion Iteration t
31
Which Edges To Delete?
[Figure: a deleted edge U → X, with clone U' and soft-evidence node s']
32
Which Edges To Delete?
[Figure: a deleted edge U → X, with clone U' and soft-evidence node s']
33
ED-BP: Improving on the Quality of IBP
[Figure: spectrum of ED-BP approximations ranging from BP to exact inference]
34
ED-BP: Improving on the Quality of IBP
[Figure: spectrum of ED-BP approximations ranging from BP to exact inference]
35
ED-BP: Potentially Bad Approximations
[Figure: spectrum of ED-BP approximations ranging from BP to exact inference]
An unimproved, but costly, approximation.
36
ED-BP: Improving on the Convergence Rate
37
ED-BP: Improving on Running Time
38
Edge Deletion in Undirected Models
[Figure: original network and its approximate (edge-deleted) network]
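How deletion works in the undirected case, as I read it from the corresponding partition-function paper (a sketch; the slide itself only shows the two networks): the pairwise potential of a deleted edge is dropped and replaced by two auxiliary single-node potentials, one per endpoint, whose values are then set by the same fixed-point conditions.

```latex
% Assumed form of edge deletion in a Markov network: the pairwise
% potential of a deleted edge {i, j} is replaced by node potentials.
\psi(x_i, x_j) \;\longrightarrow\; \theta(x_i)\,\theta(x_j)
```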
39
Correcting the Partition Function I
[Figure: deleted edge between nodes i and j]
Theorem: If MI(X_i, X_j) = 0 in the ED-BP network M', then: … where …
40
Deleting Many Edges This will yield the Bethe free energy approximation!
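For reference, the Bethe free energy alluded to here, in its standard form (Yedidia, Freeman, and Weiss), which the slide does not reproduce: b_a and b_i are factor and variable beliefs, f_a are the factors, and d_i is the number of factors touching variable i; ln Z is then approximated by -F_Bethe at BP fixed points.

```latex
% Bethe free energy (standard form).
F_{\mathrm{Bethe}}
  \;=\; \sum_{a}\sum_{x_a} b_a(x_a)\,\ln\frac{b_a(x_a)}{f_a(x_a)}
  \;-\; \sum_{i}(d_i - 1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i)
```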
41
Correcting the Partition Function II
[Figure: deleted edge between nodes i and j]
Theorem: For an ED-BP network M', we have: … where …
42
Which Edges Do We Recover? EC2?
Recover edges with the largest MI(X_i, X_j ; X_k, X_l)!
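To illustrate ranking deleted edges by mutual information (my own sketch: in practice the joint beliefs would come from inference in the approximate network M'; here they are just example numbers):

```python
import numpy as np

def mutual_information(joint):
    """MI of two discrete variables given their (normalized) joint table."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of the first variable
    py = joint.sum(axis=0, keepdims=True)   # marginal of the second variable
    mask = joint > 0                        # skip zero cells to avoid log(0)
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# Example: joint beliefs over the endpoints of two deleted edges
# (illustrative numbers only).
candidate_edges = {
    ("A", "B"): np.array([[0.40, 0.10], [0.10, 0.40]]),
    ("C", "D"): np.array([[0.25, 0.25], [0.25, 0.25]]),
}

# Recover the edges with the largest mutual information first.
ranked = sorted(candidate_edges,
                key=lambda e: mutual_information(candidate_edges[e]),
                reverse=True)
print(ranked)  # [('A', 'B'), ('C', 'D')]
```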
43
Experiment: Random Grid
[Plot: |log Z - log Z'| versus number of edges recovered, comparing EC1 and EC2 under random, MI, and 2MI recovery heuristics, along with the Bethe approximation and exact inference]
44
Beyond Treewidth…
Exact inference: exploit non-structural independence.
Approximate inference: exact inference on an approximate network obtained by deleting edges.
45
What next?
Constant factors!
Guarantees/bounds on approximations.
Edge recovery heuristics: getting the most out of the extra time.
Controlling the tradeoff between quality & complexity.
Dynamic models.
Logical reasoning: survey propagation.
46
http://reasoning.cs.ucla.edu