Overview: Maximum a Posteriori (MAP). Probabilistic Graphical Models: Inference. Daphne Koller.

Presentation transcript:

Overview: Maximum a Posteriori (MAP)
Probabilistic Graphical Models: Inference
Daphne Koller

Maximum a Posteriori (MAP)
Evidence: E = e
Query: all other variables Y (Y = {X1, …, Xn} - E)
Task: compute MAP(Y | E = e) = argmax_y P(Y = y | E = e)
– Note: there may be more than one possible solution
Applications:
– Message decoding: most likely transmitted message
– Image segmentation: most likely segmentation
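To make the task concrete, here is a minimal brute-force sketch (added here, not part of the original slides); the joint table, variable layout, and function name are all invented for illustration:

```python
# Hypothetical joint P(Y1, Y2, E) over binary variables, keyed as (y1, y2, e).
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.05,
    (0, 1, 0): 0.20, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.25,
    (1, 1, 0): 0.10, (1, 1, 1): 0.10,
}

def map_assignment(joint, e):
    """argmax_y P(Y = y | E = e). The normalizer P(E = e) is constant
    in y, so maximizing the joint restricted to E = e suffices."""
    best = max((k for k in joint if k[-1] == e), key=joint.get)
    return best[:-1], joint[best]

print(map_assignment(joint, e=1))  # -> ((1, 0), 0.25)
```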

MAP ≠ Max over Marginals
Joint distribution:
– P(a0, b0) = 0.04
– P(a0, b1) = 0.36
– P(a1, b0) = 0.3
– P(a1, b1) = 0.3
(Network: A → B, with factors P(A) and P(B|A). Recovered from the joint above: P(a0) = 0.4, P(a1) = 0.6; P(b0|a0) = 0.1, P(b1|a0) = 0.9; P(b0|a1) = P(b1|a1) = 0.5.)
The MAP assignment is (a0, b1), with probability 0.36. Maximizing each marginal separately picks a1 (P(a1) = 0.6) and b1 (P(b1) = 0.66), yet P(a1, b1) = 0.3, so the per-variable maximizers do not form the most likely joint assignment.
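A quick numerical check of this slide's claim (added here; the probabilities are exactly those listed above):

```python
# Joint distribution from the slide, keyed as (a, b) with a0 = 0, b1 = 1, etc.
joint = {(0, 0): 0.04, (0, 1): 0.36, (1, 0): 0.30, (1, 1): 0.30}

# Single-variable marginals.
p_a = {a: sum(p for (ai, b), p in joint.items() if ai == a) for a in (0, 1)}
p_b = {b: sum(p for (a, bi), p in joint.items() if bi == b) for b in (0, 1)}

map_ab = max(joint, key=joint.get)                        # (0, 1): the MAP
marg_ab = (max(p_a, key=p_a.get), max(p_b, key=p_b.get))  # (1, 1)

print(map_ab, joint[map_ab])    # (0, 1) 0.36
print(marg_ab, joint[marg_ab])  # (1, 1) 0.3
```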

NP-Hardness
The following problems are NP-hard:
– Given a PGM PΦ, find the joint assignment x with the highest probability PΦ(x)
– Given a PGM PΦ and a probability p, decide whether there is an assignment x such that PΦ(x) > p

Max-Product
[Figure: a network over variables C, D, I, G, S, L, J, H (the extended student network used throughout the course); the slide's factor annotations were not captured in this transcript.]

Evidence: Reduced Factors
[Figure: a small network over variables A, B, C, D, illustrating how observing evidence restricts each factor to the entries consistent with E = e; the slide's tables were not captured in this transcript.]
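The reduction step the slide illustrates can be written out as follows (added here; this is the standard definition rather than the slide's own uncaptured notation):

```latex
% Reducing a factor \phi to evidence E = e: keep only the entries
% consistent with e and drop the evidence variables from the scope.
\phi[e](\mathbf{y}) = \phi(\mathbf{y}, \mathbf{e}),
\qquad
\mathrm{MAP}(\mathbf{Y} \mid \mathbf{E} = \mathbf{e})
  = \arg\max_{\mathbf{y}} \prod_k \phi_k[e](\mathbf{y})
```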

Max-Product
[The slide's derivation was not captured in this transcript; a reconstruction follows.]
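The derivation this slide presents is, in the course's usual notation (reconstructed here, since the formulas did not survive extraction): the partition function Z is a constant, so maximizing the distribution is the same as maximizing the unnormalized product of factors, hence "max-product".

```latex
\arg\max_{\mathbf{x}} P_\Phi(\mathbf{x})
  = \arg\max_{\mathbf{x}} \frac{1}{Z} \prod_k \phi_k(\mathbf{D}_k)
  = \arg\max_{\mathbf{x}} \prod_k \phi_k(\mathbf{D}_k)
  = \arg\max_{\mathbf{x}} \tilde{P}_\Phi(\mathbf{x})
```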

Algorithms: MAP
Push maximization into the factor product:
– Variable elimination (see the sketch after this list)
Message passing over a graph:
– Max-product belief propagation
Using methods from integer programming
For some networks: graph-cut methods
Combinatorial search
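As a rough illustration of the first option, here is a minimal max-product variable elimination sketch in Python (added here, not from the slides); the factor representation, helper names, and the restriction to binary variables are all choices made for this example:

```python
from itertools import product

# A factor is (vars, table): vars is a tuple of names, and table maps
# full assignments (tuples ordered like vars) to nonnegative reals.
# Variables are assumed binary purely to keep the sketch short.

def multiply(f, g):
    """Pointwise product of two factors over the union of their scopes."""
    fv, ft = f
    gv, gt = g
    uv = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for assign in product((0, 1), repeat=len(uv)):
        a = dict(zip(uv, assign))
        table[assign] = (ft[tuple(a[v] for v in fv)] *
                         gt[tuple(a[v] for v in gv)])
    return uv, table

def max_out(f, var):
    """Eliminate var by maximization, remembering the maximizing value
    of var for every assignment to the remaining scope (for decoding)."""
    fv, ft = f
    rv = tuple(v for v in fv if v != var)
    table, argmax = {}, {}
    for assign, p in ft.items():
        a = dict(zip(fv, assign))
        key = tuple(a[v] for v in rv)
        if key not in table or p > table[key]:
            table[key], argmax[key] = p, a[var]
    return (rv, table), argmax

def map_by_elimination(factors, order):
    """Max-product variable elimination over all variables in `order`,
    then decode the MAP assignment by walking the order backwards."""
    trace = []
    for var in order:
        touching = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = touching[0]
        for f in touching[1:]:
            prod = multiply(prod, f)
        reduced, argmax = max_out(prod, var)
        trace.append((var, reduced[0], argmax))
        factors = rest + [reduced]
    value = 1.0
    for _, table in factors:   # all remaining factors are scalars
        value *= table[()]
    assignment = {}
    for var, scope, argmax in reversed(trace):
        assignment[var] = argmax[tuple(assignment[v] for v in scope)]
    return assignment, value

# The A -> B network from the slide above: P(A) and P(B|A) as factors.
phi_a = (("A",), {(0,): 0.4, (1,): 0.6})
phi_b_given_a = (("A", "B"),
                 {(0, 0): 0.1, (0, 1): 0.9, (1, 0): 0.5, (1, 1): 0.5})
print(map_by_elimination([phi_a, phi_b_given_a], order=["B", "A"]))
# -> ({'A': 0, 'B': 1}, 0.36)  (up to floating-point rounding)
```

Run on the A → B network from the earlier slide, this recovers the MAP assignment (a0, b1) with value 0.36, matching the hand computation; a real implementation would also need a good elimination ordering, which is itself a hard problem.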

Summary
MAP: a single coherent assignment of highest probability
– Not the same as maximizing individual marginal probabilities
Computed by maximizing over the factor product
A combinatorial optimization problem
Many exact and approximate algorithms

END