Importance Sampling
ICS 276, Fall 2007
Rina Dechter

Outline
- Gibbs Sampling
- Advances in Gibbs sampling: Blocking, Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering

Importance Sampling Theory

Given a distribution Q, called the proposal distribution (such that P(Z=z, e) > 0 implies Q(Z=z) > 0), the quantity w(Z=z) = P(Z=z, e) / Q(Z=z) is called the importance weight.
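The weight is exactly what makes an average under Q match the quantity of interest; a short derivation (standard importance sampling, spelling out what the slide states):

```latex
P(e) = \sum_{z} P(Z=z, e)
     = \sum_{z} Q(Z=z)\,\frac{P(Z=z, e)}{Q(Z=z)}
     = \mathbb{E}_{Q}\bigl[ w(Z) \bigr]
\approx \frac{1}{N} \sum_{k=1}^{N} w(z^{k}),
\qquad z^{k} \sim Q .
```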

Importance Sampling Theory Underlying principle: approximate an average over a set of numbers by an average over a sampled subset of those numbers.

Importance Sampling (Informally)
- Express the problem as computing the average over a set of real numbers.
- Sample a subset of those numbers.
- Approximate the true average by the sample average.
True average: average of (0.11, 0.24, 0.55, 0.77, 0.88, 0.99) = 0.59
Sample average over 2 samples: average of (0.24, 0.77) = 0.505
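A minimal sketch of this idea in Python; the six numbers are the ones from the slide, everything else is illustrative:

```python
import random

values = [0.11, 0.24, 0.55, 0.77, 0.88, 0.99]

# True average over all six numbers.
true_avg = sum(values) / len(values)   # 0.59

# Approximate it by averaging a random subset of 2 samples.
subset = random.sample(values, 2)
sample_avg = sum(subset) / len(subset)

print(true_avg, subset, sample_avg)
```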

How to generate samples from Q
Express Q in product form: Q(Z) = Q(Z1) Q(Z2|Z1) ... Q(Zn|Z1,...,Zn-1), then sample along the order Z1,...,Zn.
Example (binary variables; each conditional table lists the row for parent value 0, then the row for parent value 1):
Q(Z1) = (0.2, 0.8)
Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9), i.e., Q(Z2|Z1=0) = (0.2, 0.8) and Q(Z2|Z1=1) = (0.1, 0.9)
Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)

How to sample from Q
Generate a random number r uniformly between 0 and 1.
Q(Z1) = (0.2, 0.8); Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9); Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)
Which value to select for Z1? The domain of each variable is {0,1}: partition the unit interval according to Q(Z1), so select Z1 = 0 if r < 0.2 and Z1 = 1 otherwise.

How to sample from Q?
For each sample Z = z:
- Sample Z1 = z1 from Q(Z1)
- Sample Z2 = z2 from Q(Z2|Z1=z1)
- Sample Z3 = z3 from Q(Z3|Z1=z1)
Generate N such samples.
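A minimal sketch of this sampler in Python, using the example tables above; the numbers are from the slides, the nested-list encoding of the tables is my own:

```python
import random

def draw(dist):
    """Return 0 or 1 by partitioning [0,1] according to dist = (q0, q1)."""
    return 0 if random.random() < dist[0] else 1

Q1 = (0.2, 0.8)                    # Q(Z1)
Q2 = [(0.2, 0.8), (0.1, 0.9)]      # Q(Z2 | Z1), one row per value of Z1
Q3 = [(0.5, 0.5), (0.3, 0.7)]      # Q(Z3 | Z1); Z3 is independent of Z2

def sample_from_Q():
    z1 = draw(Q1)
    z2 = draw(Q2[z1])
    z3 = draw(Q3[z1])
    return (z1, z2, z3)

N = 1000
samples = [sample_from_Q() for _ in range(N)]
```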

Likelihood weighting: Q = the prior distribution, i.e., the product of the CPTs of the Bayesian network.

Likelihood weighting example
[Figure: Bayesian network over Smoking (S), lung Cancer (C), Bronchitis (B), X-ray (X), Dyspnoea (D), with CPTs P(S), P(C|S), P(B|S), P(X|C,S), P(D|C,B)]
P(S, C, B, X, D) = P(S) P(C|S) P(B|S) P(X|C,S) P(D|C,B)

Likelihood weighting example
[Same network as above, with evidence B = 0; only the non-evidence variables S, C, D are sampled]
Q = Prior: Q(S,C,D) = Q(S) Q(C|S) Q(D|C,B=0) = P(S) P(C|S) P(D|C,B=0)
- Sample S = s from P(S)
- Sample C = c from P(C|S=s)
- Sample D = d from P(D|C=c, B=0)
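A sketch of likelihood weighting on this network in Python. The CPT numbers below are made up for illustration (the slides do not give them), and both B = 0 and X = 1 are assumed to be evidence so that only S, C, D are sampled:

```python
import random

def draw(dist):
    return 0 if random.random() < dist[0] else 1

# Hypothetical CPTs (not from the slides), indexed by parent values.
P_S = (0.7, 0.3)                               # P(S)
P_C = [(0.95, 0.05), (0.8, 0.2)]               # P(C | S)
P_B = [(0.9, 0.1), (0.6, 0.4)]                 # P(B | S)
P_X = {(0, 0): (0.9, 0.1), (0, 1): (0.4, 0.6), # P(X | C, S)
       (1, 0): (0.2, 0.8), (1, 1): (0.1, 0.9)}
P_D = {(0, 0): (0.9, 0.1), (0, 1): (0.3, 0.7), # P(D | C, B)
       (1, 0): (0.2, 0.8), (1, 1): (0.1, 0.9)}

evidence_B, evidence_X = 0, 1

def lw_sample():
    # Sample the non-evidence variables from the prior.
    s = draw(P_S)
    c = draw(P_C[s])
    d = draw(P_D[(c, evidence_B)])
    # Weight = probability of the evidence given its sampled parents.
    w = P_B[s][evidence_B] * P_X[(c, s)][evidence_X]
    return (s, c, d), w

samples = [lw_sample() for _ in range(10000)]
p_evidence = sum(w for _, w in samples) / len(samples)  # estimates P(B=0, X=1)
```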

The Algorithm
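The slide's pseudocode did not survive the transcript; a minimal sketch of generic importance sampling for P(e), assuming helper functions sample_from_Q, Q_prob (proposal probability of a sample), and P_joint (P(z, e) from the network's CPTs) are available:

```python
def importance_sampling(N, sample_from_Q, Q_prob, P_joint):
    """Estimate P(e) as the average importance weight over N samples."""
    total = 0.0
    for _ in range(N):
        z = sample_from_Q()              # z ~ Q
        total += P_joint(z) / Q_prob(z)  # w(z) = P(z, e) / Q(z)
    return total / N
```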

How to solve belief updating?
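The slide's content is not in the transcript; the standard answer is to estimate the posterior as a ratio of two weighted sums over the same samples:

```latex
\widehat{P}(x_i \mid e)
  = \frac{\sum_{k=1}^{N} w(z^{k})\,\delta(z^{k}_i = x_i)}
         {\sum_{k=1}^{N} w(z^{k})},
\qquad z^{k} \sim Q,
```

where the indicator is 1 if sample z^k assigns value x_i to X_i and 0 otherwise.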

Difference between estimating P(E=e) and P(X_i=x_i|E=e): the sample-average estimate of P(E=e) is unbiased, while the estimate of P(X_i=x_i|E=e) is only asymptotically unbiased, because it is a ratio of two estimates.

Proposal Distribution: Which is better?
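The comparison itself is not in the transcript; the guiding fact from standard importance sampling theory is that the variance vanishes when the proposal equals the posterior:

```latex
Q^{*}(z) = P(z \mid e)
\;\Longrightarrow\;
w(z) = \frac{P(z, e)}{P(z \mid e)} = P(e) \quad \text{for every } z,
```

so every sample carries the identical weight P(e) and the estimate has zero variance; a proposal is better the closer it is to P(z | e).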

Outline
- Gibbs Sampling
- Advances in Gibbs sampling: Blocking, Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering

Research Issues in Importance Sampling: better proposal distributions
- Likelihood weighting (Fung and Chang, 1990; Shachter and Peot, 1990)
- AIS-BN (Cheng and Druzdzel, 2000)
- Iterative Belief Propagation (Changhe and Druzdzel, 2003)
- Iterative Join Graph Propagation and variable ordering (Gogate and Dechter, 2005)

Research Issues in Importance Sampling: Adaptive Importance Sampling (Cheng and Druzdzel, 2000)

General case: given k proposal distributions, take N samples from each distribution and combine them to approximate P(e).
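The slides do not show the combination rule; one natural choice, valid because each per-proposal estimate is unbiased, is to average the k per-proposal estimates:

```latex
\widehat{P}(e) = \frac{1}{k} \sum_{j=1}^{k}
                 \frac{1}{N} \sum_{i=1}^{N}
                 \frac{P(z^{i,j}, e)}{Q_j(z^{i,j})},
\qquad z^{i,j} \sim Q_j .
```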

Estimating Q'(z)

Cutset importance sampling: divide the set of variables into two parts, a cutset C and the remaining variables R; sample C from the proposal and handle R by exact inference (Gogate and Dechter, 2005; Bidyuk and Dechter, 2006).
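Filling in the idea (standard Rao-Blackwellised importance sampling, not necessarily the slides' exact notation): the weight of a cutset sample is

```latex
w(C=c) = \frac{P(c, e)}{Q(c)},
\qquad
P(c, e) = \sum_{r} P(C=c, R=r, e),
```

where the sum over R is computed exactly (e.g., by variable elimination), which is tractable when conditioning on C leaves a low-width network.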

Outline
- Gibbs Sampling
- Advances in Gibbs sampling: Blocking, Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering

Dynamic Belief Networks (DBNs)
[Figure: the Bayesian network at time t (state Xt, observation Yt), the network at time t+1 (Xt+1, Yt+1), and the transition arcs between slices; below, the DBN unrolled for t = 0 to t = 10, with states X0, X1, ..., X10 and observations Y0, Y1, ..., Y10]

Query
Compute P(X0:t | Y0:t) or P(Xt | Y0:t). Example: P(X0:10 | Y0:10) or P(X10 | Y0:10). Hard over a long time period! Approximate: sample!

Particle Filtering (PF) = "condensation" = "sequential Monte Carlo" = "survival of the fittest". PF can handle any type of probability distribution, non-linearity, and non-stationarity; PFs are powerful sampling-based inference/learning algorithms for DBNs.

Particle Filtering (worked on the white board)
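Since the particle-filter derivation was done on the white board, here is a minimal bootstrap-filter sketch. The interfaces are placeholders I am assuming, not from the slides: init() samples from P(X0), transition(x) samples Xt given Xt-1, likelihood(y, x) evaluates P(Yt = y | Xt = x), and observations are y1, ..., yT:

```python
import random

def particle_filter(observations, init, transition, likelihood, N=1000):
    """Bootstrap particle filter: propagate, weight, resample at each step."""
    particles = [init() for _ in range(N)]
    for y in observations:
        # Propagate each particle through the transition model.
        particles = [transition(x) for x in particles]
        # Weight each particle by the likelihood of the observation.
        weights = [likelihood(y, x) for x in particles]
        # Resample N particles in proportion to their weights
        # ("survival of the fittest").
        particles = random.choices(particles, weights=weights, k=N)
    return particles  # approximate samples from P(Xt | Y1:t)
```

With these particles, P(Xt = x | Y1:t) is estimated by the fraction of returned particles equal to x.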