Probabilistic Graphical Models: Inference
Conditional Probability Queries: Overview
Daphne Koller
Inference in a PGM
A PGM encodes the joint distribution P(X_1, …, X_n).
It can therefore be used to answer any query over the joint distribution.
Conditional Probability Queries
Evidence: E = e
Query: a subset of variables Y
Task: compute P(Y | E = e)
Applications
– Medical / fault diagnosis
– Pedigree analysis
NP-Hardness
The following problems are all NP-hard:
– Given a PGM P_Φ, a variable X, and a value x ∈ Val(X), compute P_Φ(X = x)
  (or even decide whether P_Φ(X = x) > 0)
– Let ε < 0.5. Given a PGM P_Φ, a variable X, a value x ∈ Val(X), and an observation e ∈ Val(E), find a number p such that |P_Φ(X = x | E = e) - p| < ε
Sum-Product
[Figure: the student network over variables C, D, I, G, S, L, J, H, illustrating the sum-product computation: a marginal is obtained by summing the product of the network's factors over all other variables.]
Sum-Product
[Figure: a small network over variables A, B, C, D used to illustrate the same sum-product computation.]
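To make the sum-product computation concrete, here is a minimal numpy sketch on a small chain of binary variables; the chain structure A - B - C - D and the factor values are illustrative assumptions, not taken from the slide.

    import numpy as np

    # Hypothetical pairwise factors over binary variables (values are illustrative).
    phi_AB = np.array([[10.0, 0.1], [0.1, 10.0]])   # phi_AB[a, b]
    phi_BC = np.array([[5.0, 0.2], [0.2, 5.0]])     # phi_BC[b, c]
    phi_CD = np.array([[1.0, 2.0], [3.0, 1.0]])     # phi_CD[c, d]

    # Unnormalized joint: the product of all factors, broadcast over axes (a, b, c, d).
    joint = (phi_AB[:, :, None, None] *
             phi_BC[None, :, :, None] *
             phi_CD[None, None, :, :])

    # Sum-product: the marginal over D is the sum of the factor product over A, B, C,
    # followed by normalization by the partition function.
    p_D = joint.sum(axis=(0, 1, 2))
    p_D = p_D / p_D.sum()
    print(p_D)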
Evidence: Reduced Factors
[Figure: the same A, B, C, D network, now with evidence; each factor that mentions an observed variable is reduced to the entries consistent with the observation.]
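A sketch of how evidence reduces the factors, continuing the same illustrative chain: observing C = 1 simply restricts every factor that mentions C to the slice consistent with the observation, after which the sum-product runs over the remaining variables only.

    import numpy as np

    # Same illustrative factors as above.
    phi_AB = np.array([[10.0, 0.1], [0.1, 10.0]])
    phi_BC = np.array([[5.0, 0.2], [0.2, 5.0]])
    phi_CD = np.array([[1.0, 2.0], [3.0, 1.0]])

    c_obs = 1                              # observed value of C
    phi_BC_red = phi_BC[:, c_obs]          # reduced factor, now over B only
    phi_CD_red = phi_CD[c_obs, :]          # reduced factor, now over D only

    # Unnormalized P(D, C = c_obs): sum of the reduced factor product over A and B.
    joint_red = (phi_AB[:, :, None] *
                 phi_BC_red[None, :, None] *
                 phi_CD_red[None, None, :])
    p_D_and_c = joint_red.sum(axis=(0, 1))
    print(p_D_and_c)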
Evidence: Reduced Factors
[Figure: the student network over variables C, D, I, G, S, L, J, H, with I and H observed.]
P(J, I = i, H = h) = the sum over C, D, G, S, L of the product of the network's factors, each reduced by the evidence I = i, H = h
Sum-Product
Compute P(Y, E = e) by summing the product of the reduced factors over the remaining variables, and renormalize:
P(Y | E = e) = P(Y, E = e) / P(E = e), where P(E = e) = Σ_y P(Y = y, E = e)
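A small self-contained sketch of the renormalization step; the unnormalized numbers stand in for the output of a sum-product computation and are purely illustrative.

    import numpy as np

    # Hypothetical unnormalized values of P(Y, E = e), one per value of the query
    # variable Y, as would be produced by summing the reduced factor product.
    p_Y_and_e = np.array([0.12, 0.03, 0.05])

    # Renormalize: dividing by the sum over Y, which equals P(E = e), gives P(Y | E = e).
    p_Y_given_e = p_Y_and_e / p_Y_and_e.sum()
    print(p_Y_given_e)   # sums to 1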
Algorithms: Conditional Probability
Push summations into the factor product
– Variable elimination (sketched below)
Message passing over a graph
– Belief propagation
– Variational approximations
Random sampling of instantiations
– Markov chain Monte Carlo (MCMC)
– Importance sampling
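As an example of pushing summations into the factor product, here is a minimal variable-elimination sketch; the factor representation, variable names, and elimination order are assumptions made for illustration, reusing the small chain from the earlier example.

    import numpy as np

    # A factor is a dict with a tuple of variable names and a numpy table whose
    # axes follow that tuple. All variables here are binary.

    def lift(factor, scope):
        """Broadcast a factor's table onto the axis order given by `scope`."""
        table = factor["table"]
        missing = [v for v in scope if v not in factor["vars"]]
        table = table.reshape(table.shape + (1,) * len(missing))
        order = list(factor["vars"]) + missing
        return table.transpose([order.index(v) for v in scope])

    def product(f, g):
        """Factor product: align the two scopes, broadcast, multiply pointwise."""
        scope = list(f["vars"]) + [v for v in g["vars"] if v not in f["vars"]]
        return {"vars": tuple(scope), "table": lift(f, scope) * lift(g, scope)}

    def eliminate(factors, var):
        """Multiply all factors that mention `var`, sum `var` out, keep the rest."""
        touching = [f for f in factors if var in f["vars"]]
        rest = [f for f in factors if var not in f["vars"]]
        prod = touching[0]
        for f in touching[1:]:
            prod = product(prod, f)
        axis = prod["vars"].index(var)
        return rest + [{"vars": tuple(v for v in prod["vars"] if v != var),
                        "table": prod["table"].sum(axis=axis)}]

    # Illustrative factors over the A - B - C - D chain used earlier.
    factors = [
        {"vars": ("A", "B"), "table": np.array([[10.0, 0.1], [0.1, 10.0]])},
        {"vars": ("B", "C"), "table": np.array([[5.0, 0.2], [0.2, 5.0]])},
        {"vars": ("C", "D"), "table": np.array([[1.0, 2.0], [3.0, 1.0]])},
    ]

    for var in ("A", "B", "C"):            # an arbitrary elimination order
        factors = eliminate(factors, var)

    # A single factor over D remains; normalize it to obtain P(D).
    result = factors[0]
    print(result["vars"], result["table"] / result["table"].sum())

Each elimination step multiplies only the factors that mention the eliminated variable, which is what pushing the summation inward buys over forming the full joint first.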
Summary
Conditional probability queries ask for the distribution over a subset of variables given evidence on others
Computed by summing over a factor product
Evidence simply reduces the factors
Many exact and approximate algorithms exist