Daphne Koller, Probabilistic Graphical Models (Inference): Overview of Conditional Probability Queries



Inference in a PGM
A PGM encodes the joint distribution P(X1, …, Xn) and can be used to answer any query over that joint distribution.

Conditional Probability Queries
Evidence: E = e
Query: a subset of variables Y
Task: compute P(Y | E = e)
Applications:
– Medical and fault diagnosis
– Pedigree analysis
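As a concrete illustration of such a query, the sketch below computes a posterior by brute-force enumeration over a tiny two-variable model. The model (a disease D and a test result T) and all numbers are hypothetical, chosen only to illustrate the mechanics:

```python
# A minimal sketch of a conditional probability query by brute-force
# enumeration. The model (a disease D and a test result T) and all
# numbers are hypothetical, for illustration only.

p_d = {True: 0.01, False: 0.99}                      # prior P(D)
p_t_given_d = {True: {True: 0.90, False: 0.10},      # P(T | D = true)
               False: {True: 0.05, False: 0.95}}     # P(T | D = false)

def joint(d, t):
    # P(D = d, T = t) = P(D = d) * P(T = t | D = d)
    return p_d[d] * p_t_given_d[d][t]

def query_d_given_t(t_obs):
    # P(D | T = t): the unnormalized P(D, T = t), renormalized by P(T = t).
    unnorm = {d: joint(d, t_obs) for d in (True, False)}
    z = sum(unnorm.values())  # P(T = t_obs)
    return {d: p / z for d, p in unnorm.items()}

posterior = query_d_given_t(True)  # P(D | T = true)
```

Enumeration scales exponentially in the number of variables, which is exactly why the algorithms later in this lecture exist.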

NP-Hardness
The following problems are all NP-hard:
– Given a PGM P_Φ, a variable X, and a value x ∈ Val(X), compute P_Φ(X = x), or even decide whether P_Φ(X = x) > 0.
– Let ε < 0.5. Given a PGM P_Φ, a variable X, a value x ∈ Val(X), and an observation e ∈ Val(E), find a number p such that |P_Φ(X = x | E = e) − p| < ε.

Sum-Product
[Figure: the student Bayesian network over variables C, D, I, G, S, L, J, H]

Sum-Product
[Figure: a small Markov network over variables A, B, C, D]

Evidence: Reduced Factors
[Figure: the Markov network over A, B, C, D with its factors reduced by evidence]

Evidence: Reduced Factors
[Figure: the student network over C, D, I, G, S, L, J, H with evidence I = i, H = h]
P(J, I = i, H = h) = Σ over the remaining variables (C, D, G, S, L) of the product of the factors, each reduced to the context I = i, H = h.
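The reduction step itself can be sketched in a few lines: keep only the rows of a factor's table consistent with the observed values, and drop the evidence variables from the scope. The factor over hypothetical variables B and C, and its values, are made up for illustration:

```python
# Sketch of reducing a factor by evidence: keep only the rows consistent
# with the observed values and drop the evidence variables from the scope.
# The factor over hypothetical variables B and C is made up for illustration.

def reduce_factor(scope, table, evidence):
    """Return the factor with the evidence variables fixed and removed."""
    keep = [i for i, v in enumerate(scope) if v not in evidence]
    new_scope = tuple(scope[i] for i in keep)
    new_table = {}
    for assign, value in table.items():
        if all(assign[scope.index(v)] == e for v, e in evidence.items()):
            new_table[tuple(assign[i] for i in keep)] = value
    return new_scope, new_table

phi_bc = (("B", "C"), {(0, 0): 1.0, (0, 1): 5.0, (1, 0): 2.0, (1, 1): 1.0})
scope, table = reduce_factor(*phi_bc, {"C": 1})  # observe C = 1
```

Note that reduction never creates new entries; it only selects and relabels existing ones, which is why evidence "simply reduces the factors".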

Sum-Product
Compute the unnormalized P(Y, E = e) by sum-product over the reduced factors, then renormalize:
P(Y | E = e) = P(Y, E = e) / P(E = e), where P(E = e) = Σ_y P(Y = y, E = e).
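A minimal sketch of this compute-and-renormalize recipe, on a toy chain A - B - C with made-up factor values: reduce by the evidence C = c, sum out the hidden variable B, and divide by the normalization constant:

```python
# Sketch of the compute-and-renormalize recipe on a toy chain A - B - C:
# reduce by the evidence C = c, sum out the hidden variable B, and
# renormalize. Factor values are made up for illustration.

vals = (0, 1)
phi_ab = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 2.0, (1, 1): 4.0}
phi_bc = {(0, 0): 1.0, (0, 1): 5.0, (1, 0): 2.0, (1, 1): 1.0}

def query_a_given_c(c_obs):
    # Unnormalized P(A, C = c) = sum_b phi_ab(a, b) * phi_bc(b, c_obs)
    unnorm = {a: sum(phi_ab[(a, b)] * phi_bc[(b, c_obs)] for b in vals)
              for a in vals}
    z = sum(unnorm.values())  # proportional to P(C = c_obs)
    return {a: p / z for a, p in unnorm.items()}

posterior = query_a_given_c(0)  # P(A | C = 0)
```

The renormalization at the end is what turns the unnormalized sum-product result into a proper conditional distribution.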

Algorithms: Conditional Probability
Push summations into the factor product:
– Variable elimination
Message passing over a graph:
– Belief propagation
– Variational approximations
Random sampling of instantiations:
– Markov chain Monte Carlo (MCMC)
– Importance sampling
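The first of these, variable elimination, can be sketched compactly: repeatedly pick a variable, multiply together all factors that mention it, and sum it out. The factor representation, the chain A - B - C, and all numbers below are illustrative assumptions, not the slides' notation:

```python
from itertools import product

# A minimal variable elimination sketch. Factors are (scope, table) pairs
# over binary variables; the chain A - B - C and its numbers are
# illustrative, not from the slides.

VALS = (0, 1)

def multiply(f1, f2):
    """Factor product over the union of the two scopes."""
    s1, t1 = f1
    s2, t2 = f2
    scope = s1 + tuple(v for v in s2 if v not in s1)
    table = {}
    for assign in product(VALS, repeat=len(scope)):
        a = dict(zip(scope, assign))
        table[assign] = (t1[tuple(a[v] for v in s1)] *
                         t2[tuple(a[v] for v in s2)])
    return scope, table

def sum_out(var, factor):
    """Marginalize var out of a factor."""
    scope, table = factor
    new_scope = tuple(v for v in scope if v != var)
    new_table = {}
    for assign, value in table.items():
        a = dict(zip(scope, assign))
        key = tuple(a[v] for v in new_scope)
        new_table[key] = new_table.get(key, 0.0) + value
    return new_scope, new_table

def eliminate(factors, order):
    """Eliminate the variables in order; return the remaining factor."""
    for var in order:
        touching = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = touching[0]
        for f in touching[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(var, prod)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

phi_ab = (("A", "B"), {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 2.0, (1, 1): 4.0})
phi_bc = (("B", "C"), {(0, 0): 1.0, (0, 1): 5.0, (1, 0): 2.0, (1, 1): 1.0})
marginal = eliminate([phi_ab, phi_bc], ["B", "C"])  # unnormalized P(A)
```

Only the factors touching the eliminated variable are multiplied at each step; this is exactly the "push summations into the factor product" idea, and the elimination order determines the cost.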

Summary
– A conditional probability query asks for the distribution of a subset of variables given evidence on others.
– Answering it amounts to summing over a product of factors.
– Evidence simply reduces the factors.
– Many exact and approximate algorithms exist.

END