Importance Sampling
ICS 276, Fall 2007
Rina Dechter
Outline
- Gibbs Sampling
- Advances in Gibbs sampling
  - Blocking
  - Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Importance Sampling Theory
Given a proposal distribution Q over the unobserved variables Z (such that P(Z=z, e) > 0 implies Q(Z=z) > 0), write
P(e) = Σ_z P(z, e) = Σ_z [P(z, e) / Q(z)] Q(z) = E_Q[w(Z)],
where w(Z=z) = P(Z=z, e) / Q(Z=z) is called the importance weight.
Importance Sampling Theory
Underlying principle: approximate the average over a set of numbers by the average over a sampled subset of them.
Importance Sampling (Informally)
- Express the problem as computing the average of a set of real numbers
- Sample a subset of those numbers
- Approximate the true average by the sample average
True average: average of (0.11, 0.24, 0.55, 0.77, 0.88, 0.99) = 0.59
Sample average over 2 samples: average of (0.24, 0.77) = 0.505
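A minimal sketch of this example in Python (the two-element subset is drawn at random here, so the sample average varies from run to run):

```python
import random

numbers = [0.11, 0.24, 0.55, 0.77, 0.88, 0.99]
true_avg = sum(numbers) / len(numbers)      # = 0.59

subset = random.sample(numbers, 2)          # e.g. [0.24, 0.77]
sample_avg = sum(subset) / len(subset)      # e.g. 0.505
```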
How to generate samples from Q
Express Q in product form: Q(Z) = Q(Z1) Q(Z2|Z1) ... Q(Zn|Z1,...,Zn-1)
Sample along the order Z1,...,Zn
Example (binary variables):
Q(Z1) = (0.2, 0.8)
Q(Z2|Z1) = (0.2, 0.8, 0.1, 0.9)
Q(Z3|Z1,Z2) = Q(Z3|Z1) = (0.5, 0.5, 0.3, 0.7)
How to sample from Q
The domain of each variable is {0, 1}. Generate a random number r uniformly between 0 and 1. Which value to select for Z1? Since Q(Z1) = (0.2, 0.8), partition the interval [0, 1] at 0.2: select Z1 = 0 if r < 0.2, and Z1 = 1 otherwise.
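In code, assuming the binary domain shown on the slide:

```python
import random

# Q(Z1) = (0.2, 0.8) over {0, 1}: a single uniform draw picks the value
r = random.random()
z1 = 0 if r < 0.2 else 1
```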
How to sample from Q?
For each sample Z = z:
- Sample Z1 = z1 from Q(Z1)
- Sample Z2 = z2 from Q(Z2|Z1 = z1)
- Sample Z3 = z3 from Q(Z3|Z1 = z1)
Generate N such samples.
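A runnable sketch of this procedure for the example proposal above; `draw` generalizes the single-uniform-draw selection to any discrete distribution:

```python
import random

def draw(probs):
    """Draw an index from a discrete distribution given as probabilities."""
    r, acc = random.random(), 0.0
    for value, p in enumerate(probs):
        acc += p
        if r < acc:
            return value
    return len(probs) - 1  # guard against floating-point round-off

# The example proposal from the slides, over binary variables Z1, Z2, Z3.
Q1 = (0.2, 0.8)
Q2 = {0: (0.2, 0.8), 1: (0.1, 0.9)}   # Q(Z2 | Z1)
Q3 = {0: (0.5, 0.5), 1: (0.3, 0.7)}   # Q(Z3 | Z1, Z2) = Q(Z3 | Z1)

def sample_from_Q():
    z1 = draw(Q1)
    z2 = draw(Q2[z1])
    z3 = draw(Q3[z1])
    return (z1, z2, z3)

samples = [sample_from_Q() for _ in range(1000)]   # N = 1000 samples
```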
Likelihood weighting
Q = the prior distribution, i.e., the product of the CPTs of the Bayesian network.
Likelihood weighting example
[Figure: Bayesian network over Smoking (S), lung Cancer (C), Bronchitis (B), X-ray (X), and Dyspnoea (D), with CPTs P(S), P(C|S), P(B|S), P(X|C,S), P(D|C,B).]
P(S, C, B, X, D) = P(S) P(C|S) P(B|S) P(X|C,S) P(D|C,B)
Likelihood weighting example
Q = Prior. With evidence B = 0:
Q(S, C, D) = Q(S) Q(C|S) Q(D|C, B=0) = P(S) P(C|S) P(D|C, B=0)
- Sample S = s from P(S)
- Sample C = c from P(C|S=s)
- Sample D = d from P(D|C=c, B=0)
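A sketch of this sampler in Python. The CPT numbers below are placeholders (the lecture's actual CPTs were in the slide figures), and an X-ray reading is assumed observed alongside B = 0, since Q is defined only over S, C, and D:

```python
import random

# Placeholder CPTs -- illustrative numbers only, not from the lecture.
P_S = 0.3                                         # P(S=1)
P_C = {0: 0.05, 1: 0.20}                          # P(C=1 | S)
P_B = {0: 0.10, 1: 0.40}                          # P(B=1 | S)
P_X = {(0, 0): 0.05, (0, 1): 0.10,
       (1, 0): 0.80, (1, 1): 0.90}                # P(X=1 | C, S)
P_D = {(0, 0): 0.10, (0, 1): 0.60,
       (1, 0): 0.70, (1, 1): 0.90}                # P(D=1 | C, B)

def bern(p):
    """Draw a 0/1 value that is 1 with probability p."""
    return 1 if random.random() < p else 0

def lw_sample(b_obs=0, x_obs=1):
    """One likelihood-weighted sample with evidence B = b_obs, X = x_obs
    (x_obs = 1 is an assumed observation, for illustration only)."""
    s = bern(P_S)                  # sample S from P(S)
    c = bern(P_C[s])               # sample C from P(C | S=s)
    d = bern(P_D[(c, b_obs)])      # sample D from P(D | C=c, B=b_obs)
    # Evidence variables are clamped, not sampled; each contributes its
    # CPT probability to the importance weight.
    w = P_B[s] if b_obs == 1 else 1 - P_B[s]
    w *= P_X[(c, s)] if x_obs == 1 else 1 - P_X[(c, s)]
    return (s, c, d), w
```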
The Algorithm
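The algorithm itself was shown as a figure on this slide; below is a sketch of the standard importance-sampling loop for estimating P(e), reusing `lw_sample` from the previous sketch (the estimate is simply the average importance weight):

```python
def estimate_P_e(n_samples=10000):
    """Estimate P(e) as the average importance weight over N samples."""
    total = 0.0
    for _ in range(n_samples):
        _, w = lw_sample()
        total += w
    return total / n_samples
```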
How to solve belief updating?
Estimate P(Xi = xi | e) as a ratio of two importance-sampling estimates, P(Xi = xi, e) / P(e), using the same weighted samples for numerator and denominator.
Difference between estimating P(E=e) and P(Xi = xi | E=e)
- The estimate of P(E=e) (the average weight) is unbiased.
- The estimate of P(Xi = xi | E=e) is a ratio of two estimates, so it is only asymptotically unbiased.
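A sketch of the ratio estimator, again reusing `lw_sample`; for illustration it targets P(S = 1 | e):

```python
def estimate_posterior_S(n_samples=10000):
    """Ratio estimate of P(S=1 | e): weighted fraction of samples with S=1."""
    num = den = 0.0
    for _ in range(n_samples):
        (s, _, _), w = lw_sample()
        num += w * s       # numerator: weights of samples where S = 1
        den += w           # denominator: all weights (estimates P(e))
    return num / den
```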
Proposal Distribution: Which is better?
The closer Q is to the posterior P(Z|e), the lower the variance of the importance weights; Q must also remain positive wherever P(Z=z, e) > 0.
Outline
- Gibbs Sampling
- Advances in Gibbs sampling
  - Blocking
  - Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Research Issues in Importance Sampling: Better Proposal Distributions
- Likelihood weighting (Fung and Chang, 1990; Shachter and Peot, 1990)
- AIS-BN (Cheng and Druzdzel, 2000)
- Iterative Belief Propagation (Yuan and Druzdzel, 2003)
- Iterative Join Graph Propagation and variable ordering (Gogate and Dechter, 2005)
Research Issues in Importance Sampling
Adaptive Importance Sampling (Cheng and Druzdzel, 2000)
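The AIS-BN details were in the slide figures; the toy sketch below conveys only the adaptive idea (periodically nudging a proposal parameter toward a weighted empirical frequency) and is not the published AIS-BN update rule. It reuses `bern` and the placeholder `P_S` from the likelihood-weighting sketch:

```python
def adapt_proposal(q=0.5, eta=0.4, rounds=10, batch=1000):
    """Toy adaptive scheme in the spirit of AIS-BN: after each batch of
    samples, blend the proposal parameter q = Q(S=1) toward the weighted
    frequency of S = 1, with learning rate eta."""
    for _ in range(rounds):
        num = den = 0.0
        for _ in range(batch):
            s = bern(q)                                   # sample S from Q
            # weight corrects for sampling S from Q instead of P
            # (evidence terms are omitted in this toy version)
            w = (P_S if s == 1 else 1 - P_S) / (q if s == 1 else 1 - q)
            num += w * s
            den += w
        q = (1 - eta) * q + eta * (num / den)             # learning-rate blend
    return q
```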
General case
- Given k proposal distributions
- Take N samples from each distribution
- Approximate P(e) by combining the per-proposal estimates
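A sketch under the assumption that the k per-proposal estimates are combined by a plain average (the slide's exact combination rule is not recoverable from the text); `samplers` is a hypothetical list of callables, each returning the weight w = P(z, e) / Q_j(z) of one fresh sample:

```python
def estimate_with_k_proposals(samplers, n_per=1000):
    """Estimate P(e) from k proposals: average the k per-proposal
    importance-sampling estimates, each built from n_per samples."""
    estimates = []
    for sampler in samplers:
        weights = [sampler() for _ in range(n_per)]
        estimates.append(sum(weights) / n_per)
    return sum(estimates) / len(estimates)
```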
Estimating Q'(z)
Cutset importance sampling (Gogate and Dechter, 2005; Bidyuk and Dechter, 2006)
Divide the set of variables into two parts: a cutset C and the remaining variables R. Sample only C from Q, and handle R by exact inference for each sampled cutset instantiation (Rao-Blackwellisation), which reduces the variance of the estimate.
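A sketch of the resulting estimator; `sample_cutset`, `Q_prob`, and `exact_prob` are hypothetical problem-specific callables (the last one standing in for exact inference over R, e.g. bucket elimination):

```python
def cutset_importance_sampling(sample_cutset, Q_prob, exact_prob, n=10000):
    """Sketch of cutset importance sampling: sample_cutset() draws c ~ Q,
    Q_prob(c) returns Q(c), and exact_prob(c) returns P(c, e) computed
    exactly over the remaining variables R."""
    total = 0.0
    for _ in range(n):
        c = sample_cutset()
        total += exact_prob(c) / Q_prob(c)   # importance weight of the cutset sample
    return total / n                         # estimate of P(e)
```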
Outline
- Gibbs Sampling
- Advances in Gibbs sampling
  - Blocking
  - Cutset sampling (Rao-Blackwellisation)
- Importance Sampling
- Advances in Importance Sampling
- Particle Filtering
Dynamic Belief Networks (DBNs)
[Figure: a two-slice DBN showing the Bayesian network at time t and at time t+1, with state variables X_t, X_{t+1}, observation variables Y_t, Y_{t+1}, and transition arcs between slices; and the DBN unrolled from t = 0 to t = 10, with nodes X_0,...,X_10 and Y_0,...,Y_10.]
Query
Compute P(X_{0:t} | Y_{0:t}) or P(X_t | Y_{0:t}).
Example: P(X_{0:10} | Y_{0:10}) or P(X_10 | Y_{0:10}).
Exact computation is hard over a long time period, so approximate: sample!
Particle Filtering (PF) = "condensation" = "sequential Monte Carlo" = "survival of the fittest"
- PF can handle any type of probability distribution, non-linearity, and non-stationarity.
- PFs are powerful sampling-based inference/learning algorithms for DBNs.
Particle Filtering
Worked through on the white board; a code sketch follows.
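Since the derivation was done on the whiteboard, here is a minimal bootstrap particle filter sketch; `init`, `transition`, and `obs_lik` are hypothetical model callables:

```python
import random

def particle_filter(init, transition, obs_lik, observations, n_particles=1000):
    """Bootstrap particle filter: init() draws x_0, transition(x) draws
    x_{t+1} given x_t, and obs_lik(y, x) returns P(y | x)."""
    particles = [init() for _ in range(n_particles)]
    for y in observations:
        # propagate each particle through the transition model
        particles = [transition(x) for x in particles]
        # weight each particle by the observation likelihood
        weights = [obs_lik(y, x) for x in particles]
        # resample in proportion to the weights ("survival of the fittest")
        particles = random.choices(particles, weights=weights, k=n_particles)
    return particles   # approximate sample from P(X_t | Y_{0:t})
```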