Probabilistic Graphical Models: Inference
Maximum a Posteriori (MAP): Overview
Daphne Koller
Maximum a Posteriori (MAP)

Evidence: E = e
Query: all other variables Y, i.e. Y = {X_1, ..., X_n} - E
Task: compute MAP(Y | E=e) = argmax_y P(Y = y | E = e)
– Note: there may be more than one maximizing assignment (a brute-force sketch follows below)

Applications
– Message decoding: most likely transmitted message
– Image segmentation: most likely segmentation
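A minimal sketch of the MAP query by brute-force enumeration, assuming a toy joint distribution stored as a Python dictionary; the three binary variables and all probability values here are illustrative, not from the lecture:

```python
from itertools import product

# Hypothetical joint distribution P(X1, X2, E) over three binary variables.
# The probability values are illustrative only.
joint = dict(zip(product([0, 1], repeat=3),
                 [0.05, 0.10, 0.15, 0.05, 0.20, 0.12, 0.25, 0.08]))

def map_query(joint, evidence_index, evidence_value):
    """Return the argmax over the non-evidence variables of P(Y, E=e).

    Maximizing P(Y | E=e) is equivalent to maximizing P(Y, E=e),
    since P(E=e) is a constant once the evidence is fixed.
    """
    consistent = {assignment: p for assignment, p in joint.items()
                  if assignment[evidence_index] == evidence_value}
    return max(consistent, key=consistent.get)

# MAP assignment given that the third variable is observed to be 1
print(map_query(joint, evidence_index=2, evidence_value=1))   # (1, 0, 1)
```

Enumeration works only for tiny models; the rest of the lecture is about doing this maximization without enumerating all assignments.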
MAP ≠ Max over Marginals

[Figure: network A → B with CPD tables P(A) and P(B|A); the joint factorizes as P(A, B) = P(B|A) P(A)]

Joint distribution (checked in the sketch below)
– P(a0, b0) = 0.04
– P(a0, b1) = 0.36
– P(a1, b0) = 0.3
– P(a1, b1) = 0.3
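To see why the MAP assignment differs from maximizing each marginal separately, the numbers on this slide can be checked directly; a minimal sketch using the slide's joint distribution (the dictionary representation is just for illustration):

```python
# Joint distribution P(A, B) from the slide
joint = {
    ("a0", "b0"): 0.04,
    ("a0", "b1"): 0.36,
    ("a1", "b0"): 0.30,
    ("a1", "b1"): 0.30,
}

# MAP: the single assignment with highest joint probability
map_assignment = max(joint, key=joint.get)

# Max over marginals: maximize P(A) and P(B) independently
p_a = {a: sum(p for (ai, _), p in joint.items() if ai == a) for a in ("a0", "a1")}
p_b = {b: sum(p for (_, bi), p in joint.items() if bi == b) for b in ("b0", "b1")}
marginal_argmax = (max(p_a, key=p_a.get), max(p_b, key=p_b.get))

print(map_assignment, joint[map_assignment])      # ('a0', 'b1') 0.36
print(marginal_argmax, joint[marginal_argmax])    # ('a1', 'b1') 0.3
```

The assignment built from the individual marginal maxima, (a1, b1), has probability 0.3, whereas the true MAP assignment (a0, b1) has probability 0.36, so maximizing marginals separately does not give a coherent highest-probability assignment.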
NP-Hardness

The following problems are NP-hard:
– Given a PGM P, find the joint assignment x with highest probability P(x)
– Given a PGM P and a probability p, decide whether there is an assignment x such that P(x) > p
Max-Product

[Figure: student network with variables C, D, I, G, S, L, J, H, illustrating the max-product factorization]
Evidence: Reduced Factors

[Figure: network over variables A, B, C, D, with factors reduced to the observed evidence]
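A minimal sketch of reducing a factor to observed evidence before running max-product inference, assuming factors are represented as dictionaries keyed by assignment tuples; the representation, function name, and numbers are illustrative, not the lecture's notation:

```python
def reduce_factor(factor, variables, evidence):
    """Drop rows inconsistent with the evidence and remove the observed variables.

    factor:    dict mapping assignment tuples to factor values
    variables: tuple of variable names, in the same order as the assignment tuples
    evidence:  dict {variable: observed value}
    Returns (reduced_factor, remaining_variables).
    """
    keep = [i for i, v in enumerate(variables) if v not in evidence]
    reduced = {}
    for assignment, value in factor.items():
        if all(assignment[i] == evidence[v]
               for i, v in enumerate(variables) if v in evidence):
            reduced[tuple(assignment[i] for i in keep)] = value
    return reduced, tuple(variables[i] for i in keep)

# Example: phi(A, B) reduced by observing B = 1 (values are illustrative)
phi = {(0, 0): 0.3, (0, 1): 0.7, (1, 0): 0.9, (1, 1): 0.1}
print(reduce_factor(phi, ("A", "B"), {"B": 1}))   # ({(0,): 0.7, (1,): 0.1}, ('A',))
```

After reduction, the MAP computation only maximizes over the unobserved variables, exactly as in the sum-product case.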
Max-Product
Algorithms: MAP

Push maximization into the factor product
– Variable elimination (a sketch follows below)
Message passing over a graph
– Max-product belief propagation
Using methods for integer programming
For some networks: graph-cut methods
Combinatorial search
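A minimal sketch of one elimination step in max-product variable elimination (pushing the max into the factor product), using the same illustrative dictionary-based factors as above; a full MAP solver would also record argmax choices for a traceback, which is omitted here for brevity:

```python
from itertools import product

def factor_product(f1, vars1, f2, vars2, domain=(0, 1)):
    """Multiply two factors given as dicts keyed by assignment tuples over vars1 / vars2."""
    all_vars = tuple(dict.fromkeys(vars1 + vars2))   # union of scopes, order preserved
    out = {}
    for assignment in product(domain, repeat=len(all_vars)):
        full = dict(zip(all_vars, assignment))
        out[assignment] = (f1[tuple(full[v] for v in vars1)] *
                           f2[tuple(full[v] for v in vars2)])
    return out, all_vars

def max_out(factor, variables, var):
    """Eliminate `var` by maximizing (instead of summing) over its values."""
    keep = [i for i, v in enumerate(variables) if v != var]
    out = {}
    for assignment, value in factor.items():
        key = tuple(assignment[i] for i in keep)
        out[key] = max(out.get(key, 0.0), value)
    return out, tuple(variables[i] for i in keep)

# Eliminate B from phi1(A, B) * phi2(B, C); factor values are illustrative
phi1 = {(0, 0): 0.5, (0, 1): 0.8, (1, 0): 0.1, (1, 1): 0.3}
phi2 = {(0, 0): 0.2, (0, 1): 0.9, (1, 0): 0.6, (1, 1): 0.4}
prod, pvars = factor_product(phi1, ("A", "B"), phi2, ("B", "C"))
tau, tvars = max_out(prod, pvars, "B")
print(tvars, tau)   # ('A', 'C') with tau(A, C) = max_B phi1(A, B) * phi2(B, C)
```

The structure mirrors sum-product variable elimination: only the marginalization operation changes from a sum to a max, which is why the same elimination orderings and clique-tree machinery apply.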
Summary

MAP: a single coherent assignment of highest probability
– Not the same as maximizing individual marginal probabilities
Computed by maximizing over the factor product
A combinatorial optimization problem
Many exact and approximate algorithms exist