Probabilistic Graphical Models: Inference
Conditional Probability Queries: Overview
Daphne Koller
Inference in a PGM
A PGM encodes the joint distribution P(X_1, …, X_n)
It can be used to answer any query over the joint distribution
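To make this concrete, here is a minimal sketch (not from the slides; the variables D and G and all probability values are invented for illustration) of a two-variable Bayesian network in Python whose joint distribution is the product of its CPD factors.

```python
import itertools

# Hypothetical two-variable network D -> G, both binary.
# CPDs stored as dictionaries mapping assignments to probabilities.
P_D = {0: 0.6, 1: 0.4}                              # P(D)
P_G_given_D = {(0, 0): 0.3, (0, 1): 0.7,            # P(G = g | D = d), keyed by (d, g)
               (1, 0): 0.8, (1, 1): 0.2}

# The PGM encodes the joint P(D, G) as the product of its factors.
joint = {(d, g): P_D[d] * P_G_given_D[(d, g)]
         for d, g in itertools.product((0, 1), repeat=2)}

assert abs(sum(joint.values()) - 1.0) < 1e-9        # a valid joint sums to 1
```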
Conditional Probability Queries
Evidence: E = e
Query: a subset of variables Y
Task: compute P(Y | E = e)
Applications
– Medical/fault diagnosis
– Pedigree analysis
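As an illustration only (the joint table below is the hypothetical P(D, G) from the previous sketch), a brute-force way to answer such a query is to clamp the evidence, sum out everything except the query, and renormalize.

```python
def conditional_query(joint, query_index, evidence):
    """Compute P(Y | E = e) from a joint given as {assignment_tuple: probability}.

    query_index: position of the (single) query variable in each assignment tuple.
    evidence: {position: observed value} for the evidence variables.
    """
    # Keep only assignments consistent with the evidence; their mass is P(Y, e).
    unnormalized = {}
    for assignment, p in joint.items():
        if all(assignment[i] == v for i, v in evidence.items()):
            y = assignment[query_index]
            unnormalized[y] = unnormalized.get(y, 0.0) + p
    z = sum(unnormalized.values())                   # z = P(e)
    return {y: p / z for y, p in unnormalized.items()}

# Hypothetical joint P(D, G) over binary D (index 0) and G (index 1).
joint = {(0, 0): 0.18, (0, 1): 0.42, (1, 0): 0.32, (1, 1): 0.08}
print(conditional_query(joint, query_index=1, evidence={0: 1}))  # P(G | D = 1) -> {0: 0.8, 1: 0.2}
```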
NP-Hardness
The following problems are all NP-hard:
– Given a PGM P_Φ, a variable X, and a value x ∈ Val(X), compute P_Φ(X = x)
  – or even decide whether P_Φ(X = x) > 0
– Let ε < 0.5. Given a PGM P_Φ, a variable X, a value x ∈ Val(X), and an observation e ∈ Val(E), find a number p such that |P_Φ(X = x | E = e) - p| < ε
Sum-Product
[Figure: example network over variables C, D, I, S, G, L, J, H]
Sum-Product
[Figure: small example network over variables A, B, C, D]
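A minimal sketch of the sum-product computation on a small A, B, C, D chain like the one pictured here; the pairwise factor values are made up for illustration. The (unnormalized) marginal of D is obtained by summing the product of all the factors over the other variables.

```python
import itertools

# Hypothetical pairwise factors over binary variables for a chain A - B - C - D.
phi_AB = {(a, b): 1.0 + a + 2 * b for a in (0, 1) for b in (0, 1)}
phi_BC = {(b, c): 1.0 + 2 * b + c for b in (0, 1) for c in (0, 1)}
phi_CD = {(c, d): 1.0 + c + 3 * d for c in (0, 1) for d in (0, 1)}

# Sum-product: P(D) is proportional to sum_{A,B,C} phi_AB * phi_BC * phi_CD.
unnormalized = {0: 0.0, 1: 0.0}
for a, b, c, d in itertools.product((0, 1), repeat=4):
    unnormalized[d] += phi_AB[(a, b)] * phi_BC[(b, c)] * phi_CD[(c, d)]

Z = sum(unnormalized.values())          # the partition function
P_D = {d: v / Z for d, v in unnormalized.items()}
print(P_D)
```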
Evidence: Reduced Factors
[Figure: small example network over variables A, B, C, D]
Evidence: Reduced Factors
[Figure: example network over variables C, D, I, S, G, L, J, H]
P(J, I = i, H = h) is computed by summing the product of the factors, reduced to the evidence I = i and H = h, over the remaining variables.
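A minimal sketch of factor reduction (hypothetical factor table and variable names): each factor that mentions an observed variable is restricted to the entries consistent with the observation, and the sum-product then runs over these reduced factors.

```python
def reduce_factor(factor, scope, evidence):
    """Drop rows of a factor table that disagree with the evidence.

    factor: {assignment_tuple: value}, scope: variable names in tuple order,
    evidence: {variable_name: observed_value}.
    """
    keep = {}
    for assignment, value in factor.items():
        if all(assignment[scope.index(var)] == val
               for var, val in evidence.items() if var in scope):
            keep[assignment] = value
    return keep

# Hypothetical factor over (C, D) with C observed to be 1.
phi_CD = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 3.0, (1, 1): 5.0}
print(reduce_factor(phi_CD, scope=["C", "D"], evidence={"C": 1}))
# {(1, 0): 3.0, (1, 1): 5.0}
```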
Sum-Product
Compute the unnormalized P(Y, E = e) by summing the product of the reduced factors over the remaining variables, then renormalize to obtain P(Y | E = e)
Algorithms: Conditional Probability
Push summations into the factor product
– Variable elimination (see the sketch below)
Message passing over a graph
– Belief propagation
– Variational approximations
Random sampling of instantiations
– Markov chain Monte Carlo (MCMC)
– Importance sampling
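A minimal sketch of variable elimination on the same hypothetical A, B, C, D chain used earlier: rather than summing the full joint product, each summation is pushed inward and a variable is summed out of only the factors that mention it.

```python
# Hypothetical pairwise factors over binary variables for a chain A - B - C - D.
phi_AB = {(a, b): 1.0 + a + 2 * b for a in (0, 1) for b in (0, 1)}
phi_BC = {(b, c): 1.0 + 2 * b + c for b in (0, 1) for c in (0, 1)}
phi_CD = {(c, d): 1.0 + c + 3 * d for c in (0, 1) for d in (0, 1)}

# Eliminate A: tau_B(b) = sum_a phi_AB(a, b).
tau_B = {b: sum(phi_AB[(a, b)] for a in (0, 1)) for b in (0, 1)}
# Eliminate B: tau_C(c) = sum_b tau_B(b) * phi_BC(b, c).
tau_C = {c: sum(tau_B[b] * phi_BC[(b, c)] for b in (0, 1)) for c in (0, 1)}
# Eliminate C: tau_D(d) = sum_c tau_C(c) * phi_CD(c, d).
tau_D = {d: sum(tau_C[c] * phi_CD[(c, d)] for c in (0, 1)) for d in (0, 1)}

Z = sum(tau_D.values())
P_D = {d: v / Z for d, v in tau_D.items()}   # same marginal as the brute-force sum-product
print(P_D)
```

Each intermediate tau factor has scope only over the neighbors of the eliminated variable, which is what makes this cheaper than summing the full product over all assignments.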
Summary
Conditional probability queries ask for the distribution of a subset of variables given evidence on the others
The answer is obtained by summing over a product of factors
Evidence simply reduces the factors
Many exact and approximate algorithms exist