1
Improved Cross Entropy Method For Estimation. Presented by: Alex & Yanna
2
This presentation is based on the paper "Improved Cross-Entropy Method for Estimation" by Dirk P. Kroese & Joshua C. Chan.
3
Rare Events Estimation
4
We wish to estimate $\ell = \mathbb{P}(S(\mathbf{X}) \ge \gamma)$, where $\mathbf{X} \sim f$ is a random vector taking values in some set $\mathcal{X}$ and $S$ is a function on $\mathcal{X}$.
5
Rare Events Estimation We can rewrite it as $\ell = \mathbb{E}_f\left[\mathbb{I}\{S(\mathbf{X}) \ge \gamma\}\right]$ and estimate it with the crude Monte Carlo estimator $\hat{\ell}_{CMC} = \frac{1}{N}\sum_{k=1}^{N}\mathbb{I}\{S(\mathbf{X}_k) \ge \gamma\}$, where $\mathbf{X}_1,\dots,\mathbf{X}_N \overset{\text{iid}}{\sim} f$.
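A minimal sketch of the crude Monte Carlo estimator above. The concrete setup (iid Exp(1) components, $S(\mathbf{x}) = \sum_i x_i$, and the threshold value) is an illustrative assumption, not the slides' example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: X has n iid Exp(1) components, S(x) = sum(x),
# and we estimate ell = P(S(X) >= gamma) by crude Monte Carlo.
n, gamma, N = 10, 30.0, 100_000

X = rng.exponential(scale=1.0, size=(N, n))  # X_1, ..., X_N ~ f
hits = X.sum(axis=1) >= gamma                # I{S(X_k) >= gamma}
print("ell_hat =", hits.mean(), "with", hits.sum(), "hits out of", N)
# For rare events almost no sample hits the event, so the relative
# error of the crude estimator explodes; this motivates what follows.
```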
6
Rare Events Estimation Let's say, for example, that we compare the value of $\ell$ obtained by direct calculation with the simulated (crude Monte Carlo) estimate.
7
Rare Events Estimation
9
Importance Sampling
10
And the importance sampling estimator will be $\hat{\ell}_{IS} = \frac{1}{N}\sum_{k=1}^{N}\mathbb{I}\{S(\mathbf{X}_k) \ge \gamma\}\,\frac{f(\mathbf{X}_k)}{g(\mathbf{X}_k)}$, where $\mathbf{X}_1,\dots,\mathbf{X}_N \overset{\text{iid}}{\sim} g$.
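A hedged sketch of the importance sampling estimator; the one-dimensional target $\mathbb{P}(X \ge 4)$ for $X \sim \mathcal{N}(0,1)$ and the proposal $g = \mathcal{N}(4,1)$ are illustrative choices, not taken from the slides:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
gamma, N = 4.0, 100_000

# Proposal g = N(gamma, 1) concentrates its mass on the rare event.
Y = rng.normal(loc=gamma, scale=1.0, size=N)   # Y_1, ..., Y_N ~ g
w = norm.pdf(Y) / norm.pdf(Y, loc=gamma)       # likelihood ratio f/g
ell_hat = np.mean((Y >= gamma) * w)

print(ell_hat, "exact:", norm.sf(gamma))       # sf(4) is about 3.17e-5
```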
11
Importance Sampling What would be a good choice for the importance density $g$?
12
Importance Sampling We shall take a look at the Kullback–Leibler divergence $\mathcal{D}(g^*, f(\cdot;\mathbf{v})) = \mathbb{E}_{g^*}\!\left[\ln\frac{g^*(\mathbf{X})}{f(\mathbf{X};\mathbf{v})}\right]$, where $g^*(\mathbf{x}) = f(\mathbf{x})\,\mathbb{I}\{S(\mathbf{x})\ge\gamma\}/\ell$ is the zero-variance density and $f(\cdot;\mathbf{v})$ is the density from the parametric family with parameter $\mathbf{v}$.
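Spelling out the reduction the slide relies on: the first term of the divergence does not depend on $\mathbf{v}$, so minimizing the KL divergence over the family is equivalent to a maximization in which the unknown constant $\ell$ plays no role:

```latex
\mathcal{D}\bigl(g^{*}, f(\cdot;\mathbf{v})\bigr)
  = \underbrace{\mathbb{E}_{g^{*}}\!\left[\ln g^{*}(\mathbf{X})\right]}_{\text{constant in } \mathbf{v}}
    \;-\; \mathbb{E}_{g^{*}}\!\left[\ln f(\mathbf{X};\mathbf{v})\right]
\quad\Longrightarrow\quad
\mathbf{v}^{*} = \operatorname*{argmax}_{\mathbf{v}}
  \,\mathbb{E}_{g^{*}}\!\left[\ln f(\mathbf{X};\mathbf{v})\right].
```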
13
CE Algorithm In the article, 2 problematic issues were mentioned regarding the multilevel CE: The parametric family within which the optimal importance density g is obtained might not be large enough when the dimension of the problem is large, the likelihood ratio involved in obtaining becomes unstable. In the article, 2 problematic issues were mentioned regarding the multilevel CE: The parametric family within which the optimal importance density g is obtained might not be large enough when the dimension of the problem is large, the likelihood ratio involved in obtaining becomes unstable. Importance Sampling
14
Solution Sample directly from g*
15
Importance Sampling Our goal is to find $\mathbf{v}^*$. Deterministic version: $\mathbf{v}^* = \operatorname{argmax}_{\mathbf{v}} \mathbb{E}_{g^*}\left[\ln f(\mathbf{X};\mathbf{v})\right]$. Stochastic version: $\hat{\mathbf{v}}^* = \operatorname{argmax}_{\mathbf{v}} \frac{1}{N}\sum_{k=1}^{N} \ln f(\mathbf{X}_k;\mathbf{v})$, with $\mathbf{X}_1,\dots,\mathbf{X}_N \sim g^*$.
16
Importance Sampling But how the hell are we supposed to sample from $g^*$?
17
Importance Sampling Observe that $g^*(\mathbf{x}) \propto f(\mathbf{x})\,\mathbb{I}\{S(\mathbf{x}) \ge \gamma\}$ is known up to a normalizing constant, and its conditional densities are often available. This observation grants us the opportunity to apply the useful tool of Gibbs sampling.
18
Gibbs Sampler In Brief
19
Gibbs Sampler In Brief
- An algorithm to generate a sequence of samples from a joint probability distribution.
- Gibbs sampling is a special case of the Metropolis–Hastings algorithm, and thus an example of a Markov chain Monte Carlo algorithm.
- Gibbs sampling is applicable when the joint distribution is not known explicitly, but the conditional distribution of each variable is known.
- It can be shown that the sequence of samples constitutes a Markov chain, and the stationary distribution of that Markov chain is just the sought-after joint distribution.
20
Gibbs Sampler In Brief The Gibbs sampler algorithm: given an initial state $\mathbf{X}^{(0)} = (X_1^{(0)},\dots,X_n^{(0)})$, for $t = 1,\dots,T$ generate $X_i^{(t)} \sim \pi\bigl(x_i \mid X_1^{(t)},\dots,X_{i-1}^{(t)},X_{i+1}^{(t-1)},\dots,X_n^{(t-1)}\bigr)$ for each $i = 1,\dots,n$; return $\mathbf{X}^{(1)},\dots,\mathbf{X}^{(T)}$.
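A minimal Gibbs sampler sketch targeting $g^*(\mathbf{x}) \propto e^{-\sum_i x_i}\,\mathbb{I}\{\sum_i x_i \ge \gamma\}$, i.e. iid Exp(1) components conditioned on the rare event (this concrete target is an illustrative assumption). Each full conditional is an Exp(1) truncated to $[\max(0,\gamma - \sum_{j \ne i} x_j), \infty)$, which by memorylessness is just the truncation point plus a fresh Exp(1) draw:

```python
import numpy as np

def gibbs_gstar(n, gamma, T, rng):
    """Gibbs sampler for g*(x) ~ exp(-sum(x)) * I{sum(x) >= gamma}."""
    x = np.full(n, gamma / n)        # start inside the rare-event region
    out = np.empty((T, n))
    for t in range(T):
        for i in range(n):
            rest = x.sum() - x[i]
            # Conditional of x_i: Exp(1) truncated to [max(0, gamma - rest), inf);
            # by memorylessness this is the truncation point plus Exp(1).
            x[i] = max(0.0, gamma - rest) + rng.exponential()
        out[t] = x
    return out

rng = np.random.default_rng(2)
draws = gibbs_gstar(n=10, gamma=30.0, T=1000, rng=rng)
print(draws.mean(axis=0))            # componentwise means under g*
```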
21
Improved Cross Entropy
22
Improved Cross Entropy The Improved CE consists of 3 steps: 1. Generate, via the Gibbs sampler, $N$ random vectors $\mathbf{X}_1,\dots,\mathbf{X}_N \sim g^*$. 2. Solve $\hat{\mathbf{v}}^* = \operatorname{argmax}_{\mathbf{v}} \sum_{k=1}^{N} \ln f(\mathbf{X}_k;\mathbf{v})$. 3. Estimate $\ell$ via importance sampling with $f(\cdot;\hat{\mathbf{v}}^*)$, as sketched below.
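Putting the three steps together in one hedged sketch, using the same illustrative Exp(1)-sum target as above; the parametric family of independent exponentials with mean vector $\mathbf{v}$ is an assumption for illustration, chosen so that the step-2 maximization has the closed-form solution "componentwise sample mean":

```python
import numpy as np

rng = np.random.default_rng(3)
n, gamma = 10, 30.0

# Step 1: N draws from g* via the Gibbs sampler (same as the earlier sketch).
def gibbs_gstar(n, gamma, T, rng):
    x = np.full(n, gamma / n)
    out = np.empty((T, n))
    for t in range(T):
        for i in range(n):
            rest = x.sum() - x[i]
            x[i] = max(0.0, gamma - rest) + rng.exponential()
        out[t] = x
    return out

G = gibbs_gstar(n, gamma, T=1200, rng=rng)[200:]   # drop burn-in

# Step 2: for f(x; v) = prod_i (1/v_i) exp(-x_i / v_i), the maximizer of
# sum_k ln f(X_k; v) is the componentwise sample mean of the g* draws.
v = G.mean(axis=0)

# Step 3: importance sampling with f(.; v).
N1 = 100_000
Y = rng.exponential(scale=v, size=(N1, n))         # Y_k ~ f(.; v)
log_w = -Y.sum(axis=1) + (Y / v).sum(axis=1) + np.log(v).sum()  # ln f - ln f_v
ell_hat = np.mean((Y.sum(axis=1) >= gamma) * np.exp(log_w))
print(ell_hat)   # compare with P(Gamma(10,1) >= 30), about 7e-6
```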
23
Improved Cross Entropy Consider a concrete model $\mathbf{X} \sim f$ with threshold $\gamma$, where we would like to estimate $\ell = \mathbb{P}(S(\mathbf{X}) \ge \gamma)$ under the improved cross-entropy scheme.
24
Improved Cross Entropy Let's set the problem parameters and apply the newly proposed algorithm.
25
Improved Cross Entropy Step 1 – generate RVs from $g^*$. First we need to find the conditional densities of $g^*$.
26
Improved Cross Entropy Step 1 – generate RVs from $g^*$, cont. Set an initial state inside the event region; then, for each sweep and for $i = 1,\dots,n$, generate $X_i$ from its conditional under $g^*$ and set the $i$-th coordinate to the new draw.
27
Improved Cross Entropy Step 2 – solve the optimization problem $\hat{\mathbf{v}}^* = \operatorname{argmax}_{\mathbf{v}} \sum_{k=1}^{N} \ln f(\mathbf{X}_k;\mathbf{v})$.
28
Improved Cross Entropy Step 3 – estimate $\ell$ via importance sampling: $\hat{\ell} = \frac{1}{N_1}\sum_{k=1}^{N_1} \mathbb{I}\{S(\mathbf{Y}_k) \ge \gamma\}\,\frac{f(\mathbf{Y}_k)}{f(\mathbf{Y}_k;\hat{\mathbf{v}}^*)}$, where $\mathbf{Y}_1,\dots,\mathbf{Y}_{N_1} \sim f(\cdot;\hat{\mathbf{v}}^*)$.
29
Improved Cross Entropy Multilevel CE vs. Improved CE
30
Improved Cross Entropy
31
Multilevel CE vs. Improved CE:
- Multilevel CE: N = 10000 per iteration, 4 iterations, total budget 40000.
- Gibbs sampler (improved CE): 10 parallel chains, each of length 1000, total budget 10000.
33
Obligors There are $n$ obligors; $p_i$ is the probability that obligor $i$ defaults, determined by a given threshold, and $c_i$ is the monetary loss if obligor $i$ defaults.
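A hedged sketch of the loss variable in such a threshold model; the helper name and the placeholder latent draws are illustrative assumptions (the actual latent-variable mechanics come from the t-copula model on the next slides):

```python
import numpy as np

# n obligors: obligor i defaults when its latent variable X_i exceeds
# a threshold x_i (calibrated so that P(X_i > x_i) = p_i), and a
# default incurs the monetary loss c_i.
def total_loss(latent, thresholds, losses):
    """L = sum_i c_i * I{X_i > x_i}."""
    return np.sum(losses * (latent > thresholds))

rng = np.random.default_rng(4)
n = 5
c = np.ones(n)                        # c_i, unit losses
x = rng.normal(2.0, 0.1, size=n)      # x_i, default thresholds
X = rng.normal(size=n)                # X_i, placeholder latent variables
print(total_loss(X, x, c))
```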
35
t Copula Model
36
Known methods for the rare-event estimation:
- Exponential Change of Measure (ECM): bounded relative error; needs to generate RVs from a non-standard distribution.
- Hazard Rate Twisting: logarithmically efficient; 10 times more variance reduction than ECM.
37
The Improved CE for Estimating the Prob. of a Rare Loss
38
Step I – Sampling from g*
39
Sampling From g* Now we will show how to find the conditional densities of $g^*$, so that the Gibbs sampler can be applied to generate RVs from $g^*$.
40
Sampling From g* Define the relevant quantities and arrange them in ascending order. Let the subscript $(i)$ denote the $i$-th ordered value, with its corresponding loss. Then the event of interest occurs iff the latent variable exceeds the appropriate ordered threshold, and the resulting conditional can be drawn via the inverse transform.
42
Sampling From g* The joint conditional is a multivariate truncated normal distribution. Sequentially draw each component from its univariate conditional: if the truncation constraint is active, draw from the truncated normal; else, draw from the untruncated normal.
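A hedged sketch of the "sequentially draw" step: each component of the multivariate truncated normal is drawn from a univariate conditional, and a one-sided truncated normal can be drawn in closed form via the inverse transform. The conditional means, variances, truncation points, and the active/inactive pattern below are placeholders; in the actual model they come from the t-copula factor structure:

```python
import numpy as np
from scipy.stats import norm

def trunc_normal_left(mu, sigma, a, rng):
    """Draw Z ~ N(mu, sigma^2) conditioned on Z >= a, via inverse transform."""
    lo = norm.cdf((a - mu) / sigma)   # probability mass cut off on the left
    u = rng.uniform()
    return mu + sigma * norm.ppf(lo + u * (1.0 - lo))

rng = np.random.default_rng(5)

mus, sigmas = np.zeros(3), np.ones(3)         # placeholder conditionals
active = [True, False, True]                  # which constraints bind
a = np.array([1.5, 0.0, 2.0])                 # truncation points

z = np.empty(3)
for i in range(3):
    if active[i]:
        z[i] = trunc_normal_left(mus[i], sigmas[i], a[i], rng)  # truncated
    else:
        z[i] = rng.normal(mus[i], sigmas[i])                    # untruncated
print(z)
```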
43
After we have obtained the sample from $g^*$, we are ready to move on to the next step…
44
Step II – Solving the Opt. Problem
45
Solving Opt. Problem In our model, the parametric family $f(\cdot;\mathbf{v})$ is a product of univariate densities with parameter vector $\mathbf{v}$.
46
Solving Opt. Problem Since any member of the family is a product of densities, standard maximum likelihood techniques can be applied to find the optimal $\mathbf{v}^*$, as sketched below.
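Because each member of the family factorizes, the program $\operatorname{argmax}_{\mathbf{v}} \sum_k \ln f(\mathbf{X}_k;\mathbf{v})$ splits into independent univariate MLE problems. A hedged sketch assuming a product-of-normals family with free means and standard deviations (an illustrative choice, not necessarily the exact family used in the paper):

```python
import numpy as np

def fit_product_normal(G):
    """Coordinatewise MLE for f(x; v) = prod_i N(x_i; mu_i, sigma_i^2),
    fitted to the Gibbs draws G of shape (N, n)."""
    return G.mean(axis=0), G.std(axis=0)   # sample means and (MLE) stds

rng = np.random.default_rng(6)
G = rng.normal(loc=2.0, scale=0.5, size=(1000, 4))  # stand-in for g* draws
mu_hat, sigma_hat = fit_product_normal(G)
print(mu_hat, sigma_hat)
```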
47
Solving Opt. Problem Once we obtain the optimal importance density $f(\cdot;\hat{\mathbf{v}}^*)$, we move on to step 3.
48
Step III – Importance Sampling
49
Importance Sampling
50
Some Results
52
Pros and Cons
Improved CE pros:
- Handles rare events.
- 3 basic steps.
- Appropriate in multi-dimensional settings.
- Less simulation effort than the multilevel CE.
Cons:
- Problematic for a general performance function: finding the conditionals is not trivial.
- The Gibbs sampler requires warm-up (burn-in) time.
53
Further research
- A Gibbs sampler for the general performance function.
- Applying Sequential Monte Carlo methods for sampling from $g^*$.