Slide 1: Overview
Maximum a Posteriori (MAP)
Probabilistic Graphical Models: Inference
Daphne Koller
Slide 2: Maximum a Posteriori (MAP)
- Evidence: E = e
- Query: all other variables Y, i.e. Y = {X1, …, Xn} − E
- Task: compute MAP(Y | E = e) = argmax_y P(Y = y | E = e)
  - Note: there may be more than one maximizing assignment
- Applications:
  - Message decoding: most likely transmitted message
  - Image segmentation: most likely segmentation
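As a concrete illustration of the query above, here is a minimal brute-force sketch. The distribution and variable layout are made up for illustration, not taken from the lecture; it uses the fact that P(y | e) is proportional to P(y, e), so maximizing the joint over the non-evidence variables suffices.

```python
# A toy joint distribution P(X1, X2, E) over binary variables.
# These numbers are invented for illustration only.
P = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.05,
    (0, 1, 0): 0.20, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.25,
    (1, 1, 0): 0.10, (1, 1, 1): 0.10,
}

def map_query(P, evidence_value):
    """Brute-force MAP(Y | E = e): maximize P(y, e) over all y.

    Since P(y | e) = P(y, e) / P(e) and P(e) is constant in y,
    maximizing the joint gives the same argmax as the posterior."""
    candidates = {y[:2]: p for y, p in P.items() if y[2] == evidence_value}
    best_p = max(candidates.values())
    # MAP may be non-unique, so return every maximizing assignment.
    return [y for y, p in candidates.items() if p == best_p]

print(map_query(P, 1))  # [(1, 0)]: X1=1, X2=0 maximizes P(y, E=1)
```

Brute force enumerates all 2^n assignments, which is exactly what the NP-hardness result later in the lecture says cannot be avoided in general; the structured algorithms below exploit the factorization instead.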
Slide 3: MAP ≠ Max over Marginals
Example network A → B, with:
- P(A): P(a0) = 0.4, P(a1) = 0.6
- P(B | A): P(b0 | a0) = 0.1, P(b1 | a0) = 0.9; P(b0 | a1) = 0.5, P(b1 | a1) = 0.5
Joint distribution P(A, B) = P(A) P(B | A):
- P(a0, b0) = 0.04
- P(a0, b1) = 0.36
- P(a1, b0) = 0.30
- P(a1, b1) = 0.30
The MAP assignment is (a0, b1), with probability 0.36. But maximizing each marginal separately picks a1 (P(a1) = 0.6) and b1 (P(b1) = 0.66), giving (a1, b1), whose joint probability is only 0.30.
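The slide's numbers can be checked directly. This short sketch (variable names chosen for illustration) builds the joint from P(A) and P(B | A), then compares the MAP assignment against the assignment assembled from the individual marginals:

```python
# CPDs from the slide's example network A -> B.
P_A = {"a0": 0.4, "a1": 0.6}
P_B_given_A = {("a0", "b0"): 0.1, ("a0", "b1"): 0.9,
               ("a1", "b0"): 0.5, ("a1", "b1"): 0.5}

# Joint: P(a, b) = P(a) * P(b | a).
joint = {(a, b): P_A[a] * P_B_given_A[(a, b)]
         for a in P_A for b in ("b0", "b1")}

# MAP: the single assignment with highest joint probability.
map_assignment = max(joint, key=joint.get)

# Assignment built by maximizing each marginal independently.
P_B = {b: sum(joint[(a, b)] for a in P_A) for b in ("b0", "b1")}
best_marginals = (max(P_A, key=P_A.get), max(P_B, key=P_B.get))

print(map_assignment, joint[map_assignment])    # ('a0', 'b1') ~0.36
print(best_marginals, joint[best_marginals])    # ('a1', 'b1') ~0.30
```

The two answers disagree: the marginal-by-marginal assignment is not even locally optimal here, which is why MAP needs its own algorithms rather than reusing marginal inference.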
Slide 4: NP-Hardness
The following problems are NP-hard:
- Given a PGM P, find the joint assignment x with highest probability P(x)
- Given a PGM P and a probability p, decide whether there is an assignment x such that P(x) > p
Slide 5: Max-Product
[Figure: the example network over variables C, D, I, S, G, L, J, H]
Slide 6: Evidence: Reduced Factors
[Figure: example network over variables A, B, C, D, with factors reduced by the evidence]
Slide 7: Max-Product
Slide 8: Algorithms: MAP
- Push maximization into the factor product: variable elimination
- Message passing over a graph: max-product belief propagation
- Methods from integer programming
- For some networks: graph-cut methods
- Combinatorial search
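The first option above, pushing the max into the factor product, can be sketched as max-product variable elimination on a toy chain A - B - C. The factors and their numbers below are invented for illustration; eliminating a variable takes a max instead of a sum, and traceback (argmax) pointers recover the maximizing assignment at the end:

```python
# Toy pairwise factors on the chain A - B - C (illustrative numbers).
phi_AB = {(0, 0): 0.5, (0, 1): 0.8, (1, 0): 0.7, (1, 1): 0.2}
phi_BC = {(0, 0): 0.1, (0, 1): 0.9, (1, 0): 0.6, (1, 1): 0.4}

# Eliminate A: tau1(b) = max_a phi_AB(a, b), remembering the argmax.
tau1, back_A = {}, {}
for b in (0, 1):
    scores = {a: phi_AB[(a, b)] for a in (0, 1)}
    back_A[b] = max(scores, key=scores.get)
    tau1[b] = scores[back_A[b]]

# Eliminate B: tau2(c) = max_b tau1(b) * phi_BC(b, c).
tau2, back_B = {}, {}
for c in (0, 1):
    scores = {b: tau1[b] * phi_BC[(b, c)] for b in (0, 1)}
    back_B[c] = max(scores, key=scores.get)
    tau2[c] = scores[back_B[c]]

# Decode: pick the best C, then follow the traceback pointers.
c_star = max(tau2, key=tau2.get)
b_star = back_B[c_star]
a_star = back_A[b_star]
print((a_star, b_star, c_star), tau2[c_star])  # (1, 0, 1), score ~0.63
```

Each elimination step here touches only the factors mentioning the eliminated variable, which is what makes the algorithm exponential in the induced width rather than in the total number of variables.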
Slide 9: Summary
- MAP: a single coherent assignment of highest probability
  - Not the same as maximizing individual marginal probabilities
- Computed by maximizing over the factor product
- A combinatorial optimization problem
- Many exact and approximate algorithms