1
MPE, MAP AND APPROXIMATIONS
Lecture 10: Statistical Methods in AI/ML
Vibhav Gogate, The University of Texas at Dallas
Readings: AD Chapter 10
2
Recap: AND/OR Search
[Figure: an AND/OR search tree, the corresponding AND/OR search graph, and the guiding pseudo tree for a pairwise Markov network over variables A, B, C, D.]
3
Recap: Formula-based Probabilistic Inference
- Convert the graphical model into weighted logic.
- Perform AND/OR search in weighted logic with Boolean constraint propagation.
- Delete all formulas that are valid or unsatisfiable.
- In practice, the complexity is often much lower than the worst-case complexity of exp(w*), where w* is the treewidth.
- Formula-based inference: AND/OR search over formula assignments rather than variable assignments.
4
What will we cover?
- MPE (most probable explanation): the tuple with the highest probability in the joint distribution Pr(X | e).
- MAP (maximum a posteriori): given a subset of variables Y, the tuple with the highest probability in the distribution Pr(Y | e).
- Exact algorithms: variable elimination; DFS / branch and bound search.
- Approximations: upper bounds; local search.
5
Running Example: Cheating in the UTD CS Population
Variables: Sex (S), Cheating (C), two Tests (T1 and T2), and Agreement (A).
6
Most likely instantiations
A person takes the tests and the test administrator reports that the two tests agree (A = true).
What is the most likely group that the individual belongs to?
Query: the most likely instantiation of Sex and Cheating given evidence A = true.
Is this a MAP or an MPE problem?
Answer: Sex = male and Cheating = no.
7
MPE is a special case of MAP
The most likely instantiation of all variables given A = true is S = female, C = no, T1 = negative, T2 = negative.
The MPE projected onto the MAP variables does not yield the correct answer: S = female, C = no is incorrect; S = male, C = no is correct.
We will therefore distinguish between MPE and MAP probabilities, and between MPE and MAP instantiations.
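To make the distinction concrete, here is a minimal Python sketch with made-up numbers (a hypothetical two-variable distribution, not the lecture's cheating network) showing that projecting the MPE tuple onto the MAP variables can disagree with the true MAP assignment:

```python
# Hypothetical joint over a MAP variable Y and a summed-out variable Z;
# the numbers are illustrative only.
joint = {
    ('y0', 'z0'): 0.40, ('y0', 'z1'): 0.00,
    ('y1', 'z0'): 0.30, ('y1', 'z1'): 0.30,
}

# MPE: the most probable full instantiation (Y, Z).
mpe = max(joint, key=joint.get)                      # ('y0', 'z0'), probability 0.40

# MAP over Y: maximize the marginal P(Y) = sum_Z P(Y, Z).
marginal = {}
for (y, _), p in joint.items():
    marginal[y] = marginal.get(y, 0.0) + p
map_y = max(marginal, key=marginal.get)              # 'y1', marginal probability 0.60

print("MPE tuple:", mpe, "-> projection onto Y:", mpe[0])
print("MAP over Y:", map_y)                          # differs from the projection
```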
8
Bucket Elimination for MPE
Same schematic algorithm as before; replace the "elimination operator" (summation) by a "maximization operator".

  S       C     Value
  male    yes   0.05
  male    no    0.95
  female  yes   0.01
  female  no    0.99

To maximize out S, collect all instantiations that agree on all variables other than S (here, on C) and compute the maximum value for each.
9
Bucket Elimination for MPE
Same schematic algorithm as before; replace the "elimination operator" (summation) by a "maximization operator".

  S       C     Value
  male    yes   0.05
  male    no    0.95
  female  yes   0.01
  female  no    0.99

Maximizing out S yields a factor over C: collect all instantiations that agree on all variables other than S and return the maximum value among them.

  C     Value
  yes   0.05
  no    0.99
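In code, maximizing out S is a small fold over the table; a minimal sketch, assuming the factor is stored as a Python dictionary keyed by (S, C) assignments (the numbers are the ones on the slide):

```python
# The factor over (S, C) from the slide, keyed by (s, c) assignments.
factor_SC = {
    ('male',   'yes'): 0.05, ('male',   'no'): 0.95,
    ('female', 'yes'): 0.01, ('female', 'no'): 0.99,
}

# Maximize out S: for each value of C, keep the largest entry over S.
max_out_S = {}
for (s, c), value in factor_SC.items():
    max_out_S[c] = max(max_out_S.get(c, 0.0), value)

print(max_out_S)   # {'yes': 0.05, 'no': 0.99}
```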
10
Bucket elimination: order (S, C, T1, T2)
[Figure: the buckets for S, C, T1, and T2 processed along this order, with the messages passed between them; the final value is the MPE probability.]
11
Bucket elimination: Recovering the MPE tuple
[Figure: the same buckets revisited in reverse order, assigning each variable its maximizing value to recover the MPE tuple along with the MPE probability.]
12
Bucket elimination: MPE vs PE (Z)
Maximization vs summation; the complexity is the same. Time and space are exponential in the induced width w of the given order: O(n exp(w+1)).
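The following sketch makes the "maximization vs summation" point operational: one elimination routine parameterized by the combination operator, run with max for MPE and with addition for PE. The (vars, table) factor representation and the two-variable model are illustrative assumptions, not the course's code.

```python
from itertools import product

# A factor is a pair (vars, table): `vars` is a tuple of variable names and
# `table` maps a tuple of their values (in the same order) to a number.
def eliminate(factors, var, domains, combine):
    """Eliminate `var`: multiply the factors that mention it, then collapse
    `var` with `combine` (max for MPE, addition for PE)."""
    bucket = [f for f in factors if var in f[0]]
    rest   = [f for f in factors if var not in f[0]]

    scope = sorted({v for vs, _ in bucket for v in vs})
    out_scope = tuple(v for v in scope if v != var)

    table = {}
    for values in product(*(domains[v] for v in scope)):
        env = dict(zip(scope, values))
        val = 1.0
        for vs, t in bucket:
            val *= t[tuple(env[v] for v in vs)]
        key = tuple(env[v] for v in out_scope)
        table[key] = val if key not in table else combine(table[key], val)
    return rest + [(out_scope, table)]

# A made-up two-variable model: P(A) and P(B | A).
domains = {'A': ['0', '1'], 'B': ['0', '1']}
factors = [(('A',), {('0',): 0.6, ('1',): 0.4}),
           (('A', 'B'), {('0', '0'): 0.9, ('0', '1'): 0.1,
                         ('1', '0'): 0.5, ('1', '1'): 0.5})]

for name, combine in [("MPE probability", max), ("partition function", lambda a, b: a + b)]:
    fs = factors
    for v in ['B', 'A']:
        fs = eliminate(fs, v, domains, combine)
    print(name + ":", fs[0][1][()])   # 0.6 * 0.9 = 0.54, and 1.0, respectively
```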
13
BE and Hidden Markov Models
BE_MPE along the order S1, S2, S3, ... is equivalent to the Viterbi algorithm.
BE_PE along the same order is equivalent to the forward algorithm.
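As a check on this equivalence, here is a small sketch of the Viterbi and forward recursions on a toy 2-state HMM; the transition, emission, and prior numbers are made up for illustration. The only difference between the two passes is max versus sum:

```python
import numpy as np

# A toy 2-state HMM with made-up parameters (purely illustrative).
T = np.array([[0.7, 0.3],      # transition P(S_t | S_{t-1})
              [0.4, 0.6]])
E = np.array([[0.9, 0.1],      # emission   P(O_t | S_t)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])      # prior      P(S_1)
obs = [0, 1, 1]                # observed symbols

def hmm_pass(reduce_fn):
    """One recursion serves both algorithms: np.max -> Viterbi, np.sum -> forward."""
    msg = pi * E[:, obs[0]]
    for o in obs[1:]:
        msg = reduce_fn(msg[:, None] * T, axis=0) * E[:, o]
    return msg

print("Viterbi (MPE value):  ", hmm_pass(np.max).max())
print("Forward (P(evidence)):", hmm_pass(np.sum).sum())
```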
14
OR search for MPE
At the leaf nodes, compute probabilities by taking the product of the factors.
Select the path with the highest leaf probability.
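A brute-force version of this OR search is easy to write down; a sketch assuming the same (vars, table) factor representation used earlier, with a made-up two-variable model as the usage example:

```python
from itertools import product

def or_search_mpe(variables, domains, factors):
    """Exhaustive OR search: visit every leaf (full assignment), take the
    product of all factors there, and keep the best leaf."""
    best_prob, best_leaf = 0.0, None
    for values in product(*(domains[v] for v in variables)):
        env = dict(zip(variables, values))
        prob = 1.0
        for vs, table in factors:
            prob *= table[tuple(env[v] for v in vs)]
        if prob > best_prob:
            best_prob, best_leaf = prob, env
    return best_leaf, best_prob

# Made-up example: P(A) and P(B | A).
domains = {'A': ['0', '1'], 'B': ['0', '1']}
factors = [(('A',), {('0',): 0.6, ('1',): 0.4}),
           (('A', 'B'), {('0', '0'): 0.9, ('0', '1'): 0.1,
                         ('1', '0'): 0.5, ('1', '1'): 0.5})]
print(or_search_mpe(['A', 'B'], domains, factors))
# best leaf A='0', B='0' with probability 0.6 * 0.9 = 0.54
```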
15
Branch and Bound Search
An oracle gives an upper bound on the MPE probability.
Prune nodes whose upper bound is smaller than the probability of the current best MPE solution.
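A minimal branch-and-bound sketch follows. The oracle here is a deliberately simple stand-in (the product of each factor's largest entry consistent with the partial assignment), not the mini-bucket bound introduced later; the model is the same made-up two-variable example.

```python
def mpe_branch_and_bound(variables, domains, factors):
    """Depth-first search over assignments in `variables` order, pruning a
    node when its upper bound cannot beat the best leaf found so far."""
    best = {"prob": 0.0, "assignment": None}

    def upper_bound(env):
        # For each factor, take the largest entry consistent with the partial
        # assignment `env`; the product upper-bounds any completion of `env`.
        bound = 1.0
        for vs, table in factors:
            bound *= max(val for key, val in table.items()
                         if all(env.get(v, key[i]) == key[i] for i, v in enumerate(vs)))
        return bound

    def dfs(i, env):
        if upper_bound(env) <= best["prob"]:
            return                                     # prune this subtree
        if i == len(variables):                        # improving leaf found
            best["prob"], best["assignment"] = upper_bound(env), dict(env)
            return
        var = variables[i]
        for value in domains[var]:
            env[var] = value
            dfs(i + 1, env)
            del env[var]

    dfs(0, {})
    return best

domains = {'A': ['0', '1'], 'B': ['0', '1']}
factors = [(('A',), {('0',): 0.6, ('1',): 0.4}),
           (('A', 'B'), {('0', '0'): 0.9, ('0', '1'): 0.1,
                         ('1', '0'): 0.5, ('1', '1'): 0.5})]
print(mpe_branch_and_bound(['A', 'B'], domains, factors))
# best assignment A='0', B='0' with probability 0.6 * 0.9 = 0.54
```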
16
Mini-Bucket Approximation: Idea
Split a bucket into mini-buckets to bound the complexity.
17
Mini-Bucket Elimination (max mini-bucket size = 3 variables)
[Figure: the buckets over C, S, T1, and T2 with one bucket split into mini-buckets; the result is an upper bound on the MPE probability.]
18
Mini-Buckets (i-bounds)
- A parameter i controls the size of (number of variables in) each mini-bucket; a partitioning sketch follows below.
- The algorithm is exponential in i: O(n exp(i)). For example, i = 2 is quadratic, i = 3 is cubic, and so on.
- The higher the i-bound, the tighter the upper bound.
- In practice, i-bounds as high as 22-25 can be used.
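The partitioning step itself is simple; below is a sketch of a greedy first-fit splitter that keeps each mini-bucket's combined scope within the i-bound. The bucket contents are placeholders loosely modeled on the running example (empty tables, illustrative scopes), and greedy first-fit is just one reasonable partitioning choice.

```python
def mini_buckets(bucket, i_bound):
    """Greedily partition a bucket's factors into mini-buckets whose combined
    scope has at most `i_bound` variables (first-fit placement)."""
    partitions = []                      # each entry: [scope_set, factor_list]
    for vs, table in bucket:
        for part in partitions:
            if len(part[0] | set(vs)) <= i_bound:
                part[0] |= set(vs)
                part[1].append((vs, table))
                break
        else:
            partitions.append([set(vs), [(vs, table)]])
    return partitions

# Placeholder bucket with illustrative scopes (tables omitted); with i_bound=3
# the first two factors share a mini-bucket and the third gets its own.
bucket = [(('C', 'S'), {}), (('S', 'T1'), {}), (('S', 'T2'), {})]
for scope, fs in mini_buckets(bucket, i_bound=3):
    print(sorted(scope), "->", [vs for vs, _ in fs])
```

Each mini-bucket is then processed exactly (the variable is maximized out of the product of just its factors), and since max_X prod_k f_k <= prod_k max_X f_k, the product of the mini-bucket messages upper-bounds the exact bucket message.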
19
Branch and Bound Search
Oracle = MBE(i) at each search node.
Prune nodes whose upper bound is smaller than the probability of the current best MPE solution.
20
Computing MAP Probabilities: Bucket Elimination
Given MAP variables M:
- Compute the MAP probability using bucket elimination by first summing out all non-MAP variables and then maximizing out the MAP variables (a brute-force sketch of this sum-then-max computation follows below).
- Summing out the non-MAP variables effectively computes the joint marginal Pr(M, e) in factored form.
- Maximizing out the MAP variables M effectively solves an MPE problem over the resulting marginal.
- The variable order used in BE_MAP is constrained: the MAP variables M must appear last in the order.
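The sketch below makes the sum-then-max semantics explicit by brute force; bucket elimination computes the same quantity, just in factored form along a constrained order. The (vars, table) factor representation matches the earlier sketches, and the toy model is made up.

```python
from itertools import product

def map_probability(map_vars, other_vars, domains, factors):
    """Brute-force MAP value: for each instantiation of the MAP variables,
    sum the joint over all non-MAP variables, then take the maximum."""
    best = 0.0
    for m in product(*(domains[v] for v in map_vars)):
        total = 0.0
        for rest in product(*(domains[v] for v in other_vars)):
            env = {**dict(zip(map_vars, m)), **dict(zip(other_vars, rest))}
            prob = 1.0
            for vs, table in factors:
                prob *= table[tuple(env[v] for v in vs)]
            total += prob
        best = max(best, total)
    return best

# Made-up model P(A) * P(B | A): the MAP value over {A} is max_a sum_b P(a, b).
domains = {'A': ['0', '1'], 'B': ['0', '1']}
factors = [(('A',), {('0',): 0.6, ('1',): 0.4}),
           (('A', 'B'), {('0', '0'): 0.9, ('0', '1'): 0.1,
                         ('1', '0'): 0.5, ('1', '1'): 0.5})]
print(map_probability(['A'], ['B'], domains, factors))   # max(0.6, 0.4) = 0.6
```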
21
MAP and Constrained Width
Treewidth = 2; MAP variables = {Y1, ..., Yn}.
Any order that eliminates the MAP variables last has width greater than or equal to n.
Hence BE_MPE is linear in n while BE_MAP is exponential in n.
22
MAP and Constrained Width (continued)
23
MAP by Branch and Bound Search
MAP can be solved using depth-first branch-and-bound search, just as we did for MPE. Algorithm BB_MAP resembles the one for computing MPE, with two exceptions:
- Exception 1: the search space consists only of the MAP variables.
- Exception 2: we use a version of MBE_MAP for computing the bounds, ordering all MAP variables after the non-MAP variables.
24
MAP by Local Search
Given a network with n variables and an elimination order of width w, the complexity is O(r n exp(w+1)), where r is the number of local search steps.
- Start with an initial random instantiation of the MAP variables.
- The neighbors of an instantiation m are the instantiations obtained by changing the value of one variable in m.
- The score of a neighbor m is Pr(m, e).
- How do we compute Pr(m, e)? Bucket elimination. (A sketch of the search loop follows below.)
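A sketch of the local search loop, where the score callback stands in for "compute Pr(m, e) with bucket elimination"; the step limit, seeding, and the greedy single-flip move are illustrative choices, not the lecture's exact algorithm.

```python
import random

def map_local_search(map_vars, domains, score, max_steps=100, seed=0):
    """Greedy local search over MAP assignments.  `score(m)` should return
    Pr(m, e); in the lecture it is computed with bucket elimination."""
    rng = random.Random(seed)
    m = {v: rng.choice(domains[v]) for v in map_vars}        # random start
    best = score(m)
    for _ in range(max_steps):
        improved = False
        for v in map_vars:                                   # single-flip neighbours
            for value in domains[v]:
                if value == m[v]:
                    continue
                neighbour = {**m, v: value}
                s = score(neighbour)
                if s > best:
                    m, best, improved = neighbour, s, True
        if not improved:
            break                                            # local optimum reached
    return m, best

# Toy usage: a hand-written score table stands in for Pr(m, e).
table = {('0', '0'): 0.1, ('0', '1'): 0.3, ('1', '0'): 0.2, ('1', '1'): 0.4}
score = lambda m: table[(m['Y1'], m['Y2'])]
print(map_local_search(['Y1', 'Y2'], {'Y1': ['0', '1'], 'Y2': ['0', '1']}, score))
# converges to Y1='1', Y2='1' with score 0.4
```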
25
MAP: Local search algorithm
26
Recap
Exact MPE and MAP:
- Bucket elimination
- Branch and bound search
Approximations:
- Mini-bucket elimination
- Branch and bound search
- Local search