Simple Sampling: Sampling Methods, Inference, Probabilistic Graphical Models

Presentation transcript:

Probabilistic Graphical Models: Inference / Sampling Methods / Simple Sampling

Sampling-Based Estimation
Given a dataset D = {x[1], …, x[M]} sampled IID from P:
If P(X=1) = p, an estimator for p is  T_D = (1/M) Σ_m x[m].
More generally, for any distribution P and function f:  E_P[f] ≈ (1/M) Σ_m f(x[m]).
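A minimal sketch of this estimator in Python (NumPy assumed available); the value of p and the function f below are illustrative placeholders, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Suppose P is Bernoulli(p) with p = 0.3 (an illustrative choice).
p = 0.3
M = 10_000
samples = rng.random(M) < p          # M IID draws of X in {0, 1}

# Estimator for p: the empirical mean of the samples.
p_hat = samples.mean()

# More generally, estimate E_P[f(X)] by averaging f over the samples.
f = lambda x: 2 * x - 1              # any function of X
f_hat = f(samples.astype(float)).mean()

print(p_hat, f_hat)
```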

Sampling from a Discrete Distribution
Val(X) = {x1, …, xk}, with P(X = xi) = θi.
Partition the unit interval [0, 1] into consecutive segments of lengths θ1, …, θk, draw u ~ Uniform[0, 1], and return the value xi whose segment contains u.
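The interval-partition picture corresponds to inverse-CDF sampling; a minimal sketch, with the θ values chosen only for illustration:

```python
import numpy as np

def sample_discrete(theta, rng):
    """Sample an index i with probability theta[i] by partitioning [0, 1]
    into consecutive intervals of lengths theta[0], ..., theta[k-1]."""
    u = rng.random()                     # u ~ Uniform[0, 1]
    cum = np.cumsum(theta)               # right endpoints of the intervals
    return int(np.searchsorted(cum, u))  # index of the interval containing u

rng = np.random.default_rng(0)
theta = [0.2, 0.5, 0.3]                  # example distribution over {x1, x2, x3}
counts = np.bincount([sample_discrete(theta, rng) for _ in range(10_000)],
                     minlength=3)
print(counts / counts.sum())             # should be close to theta
```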

Sampling-Based Estimation: Bounds
Hoeffding Bound:  P(T_D ∉ [p-ε, p+ε]) ≤ 2 exp(-2Mε²)
  For an additive bound ε on the error with probability > 1-δ:  M ≥ ln(2/δ) / (2ε²)
Chernoff Bound:  P(T_D ∉ [p(1-ε), p(1+ε)]) ≤ 2 exp(-Mpε²/3)
  For a multiplicative bound ε on the error with probability > 1-δ:  M ≥ 3 ln(2/δ) / (p ε²)
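As a sanity check, the two bounds can be inverted to give the number of samples M required for a target ε and δ; a small sketch using the standard forms quoted above:

```python
import math

def m_additive(eps, delta):
    """Samples needed so that |T_D - p| <= eps with prob > 1 - delta (Hoeffding)."""
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

def m_multiplicative(eps, delta, p):
    """Samples needed so that T_D is within a factor (1 +/- eps) of p
    with prob > 1 - delta (Chernoff); note the dependence on 1/p."""
    return math.ceil(3 * math.log(2 / delta) / (p * eps ** 2))

print(m_additive(0.01, 0.05))              # additive error 0.01
print(m_multiplicative(0.1, 0.05, 0.001))  # rare event p = 0.001 -> very large M
```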

Forward Sampling from a BN (the student network)

P(Difficulty):    d0 = 0.6, d1 = 0.4
P(Intelligence):  i0 = 0.7, i1 = 0.3

P(Grade | Intelligence, Difficulty):
            g1     g2     g3
  i0, d0    0.3    0.4    0.3
  i0, d1    0.05   0.25   0.7
  i1, d0    0.9    0.08   0.02
  i1, d1    0.5    0.3    0.2

P(SAT | Intelligence):
            s0     s1
  i0        0.95   0.05
  i1        0.2    0.8

P(Letter | Grade):
            l0     l1
  g1        0.1    0.9
  g2        0.4    0.6
  g3        0.99   0.01

To generate one sample, sample the variables in topological order (Difficulty, Intelligence, Grade, SAT, Letter), conditioning each CPD on the values already drawn for its parents.
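A minimal forward-sampling sketch for this student network; the variable names, helper functions, and CPD encoding below are mine, while the probability tables are the ones above. Each variable is sampled in topological order given its parents' sampled values:

```python
import numpy as np

rng = np.random.default_rng(0)

def draw(probs):
    """Sample an index according to the probability vector `probs`."""
    return int(rng.choice(len(probs), p=probs))

# CPDs from the slide (index 0 = d0 / i0 / g1 / s0 / l0, and so on).
P_D = [0.6, 0.4]
P_I = [0.7, 0.3]
P_G = {(0, 0): [0.3, 0.4, 0.3],    # (i0, d0)
       (0, 1): [0.05, 0.25, 0.7],  # (i0, d1)
       (1, 0): [0.9, 0.08, 0.02],  # (i1, d0)
       (1, 1): [0.5, 0.3, 0.2]}    # (i1, d1)
P_S = {0: [0.95, 0.05], 1: [0.2, 0.8]}
P_L = {0: [0.1, 0.9], 1: [0.4, 0.6], 2: [0.99, 0.01]}

def forward_sample():
    d = draw(P_D)                  # roots first ...
    i = draw(P_I)
    g = draw(P_G[(i, d)])          # ... then children given sampled parents
    s = draw(P_S[i])
    l = draw(P_L[g])
    return {"D": d, "I": i, "G": g, "S": s, "L": l}

samples = [forward_sample() for _ in range(10_000)]
# Example query without evidence: estimate P(Letter = l1).
print(np.mean([x["L"] == 1 for x in samples]))
```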

Forward Sampling for Querying
Goal: estimate P(Y=y).
  Generate M samples from the BN by forward sampling.
  Compute the fraction of samples where Y=y.
For an additive bound ε on the error with probability > 1-δ:  M ≥ ln(2/δ) / (2ε²)
For a multiplicative bound ε on the error with probability > 1-δ:  M ≥ 3 ln(2/δ) / (P(y) ε²)

Queries with Evidence
Goal: estimate P(Y=y | E=e).
Rejection sampling algorithm:
  Generate samples from the BN.
  Throw away all those where E ≠ e.
  Compute the fraction of remaining samples where Y=y.
The expected fraction of samples kept is ~ P(e), so the number of samples needed grows exponentially with the number of observed variables.
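A sketch of rejection sampling as an estimator for P(Y=y | E=e). The generic estimator works with any forward sampler that returns an assignment; the tiny two-variable network and its probabilities here are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def rejection_estimate(sample_fn, query, evidence, num_samples):
    """Estimate P(query | evidence) by forward sampling and discarding
    samples that disagree with the evidence.  `query` and `evidence` are
    dicts mapping variable names to values."""
    kept = hits = 0
    for _ in range(num_samples):
        x = sample_fn()
        if any(x[v] != val for v, val in evidence.items()):
            continue                        # reject: inconsistent with evidence
        kept += 1
        hits += all(x[v] == val for v, val in query.items())
    return (hits / kept if kept else float("nan")), kept

# Illustrative two-variable network Rain -> WetGrass (made-up probabilities).
def toy_sample():
    rain = rng.random() < 0.2
    wet = rng.random() < (0.9 if rain else 0.1)
    return {"Rain": rain, "Wet": wet}

est, kept = rejection_estimate(toy_sample, {"Rain": True}, {"Wet": True}, 50_000)
print(est, kept)   # only roughly a fraction P(e) of the samples survive
```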

Summary
Generating samples from a BN is easy.
(ε, δ)-bounds exist, but their usefulness is limited:
  Additive bounds: useless for low-probability events.
  Multiplicative bounds: the number of samples grows as 1/P(y).
With evidence, the number of required samples grows exponentially with the number of observed variables.
Forward sampling is generally infeasible for Markov networks, since there is no ordering of the variables under which each can be sampled from a local conditional distribution.

END