Bayesian Monte-Carlo and Experimental Uncertainties


Bayesian Monte-Carlo and Experimental Uncertainties
M. Fleming, UK Atomic Energy Authority
OECD NEA Data Week, JEFF Decay Data and Fission Yields Working Group
Paris, 27 April 2016

Some starting comments
Total Monte Carlo (TMC) in this presentation refers to: random sampling of nuclear data files through input-parameter variation, followed by simulation with each sampled random nuclear data file. TMC necessarily involves many random files which will never be validated (or read by a human?). The quality of a TMC result is only as good as the quality of these perturbed-parameter files, which you expect to look well-behaved (figure) but which may in fact look pathological (figure).
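As an illustration of the TMC recipe just described, here is a minimal sketch in which toy stand-ins replace the nuclear-data generation and inventory-simulation steps (sample_parameters and toy_observable are hypothetical, not part of any real code system):

```python
# Minimal TMC sketch (illustrative only): the GEF/inventory-code coupling is
# replaced by a toy model; all function names here are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def sample_parameters(mean, sigma):
    """Draw one random set of model parameters (Gaussian perturbation)."""
    return rng.normal(mean, sigma)

def toy_observable(params):
    """Stand-in for 'generate a random yield file, then run the simulation'."""
    return params[0] ** 2 + 0.5 * params[1]

mean = np.array([1.0, 2.0])
sigma = np.array([0.05, 0.10])

samples = [toy_observable(sample_parameters(mean, sigma)) for _ in range(1000)]
print(f"observable = {np.mean(samples):.4f} +/- {np.std(samples, ddof=1):.4f}")
```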

Bayesian TMC
Bayesian TMC in this presentation refers to: use of an optimisation algorithm to select the best input parameters for a nuclear-data-generating code, so as to best match some experimental or evaluated data. This approach has been developed by several groups to produce sets of fission yield files for TMC uncertainty calculations (cf. https://tendl.web.psi.ch/tendl_2015/randomYields.html). The complexity lies in the definition of a fitness function, the choice of optimisation algorithm and the parameter-updating method.
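A minimal sketch of such an optimisation loop follows, assuming a greedy accept-on-improvement rule and a chi-square-like fitness; the actual work uses more sophisticated Bayesian updating, and model_yields here is a toy stand-in for a GEF-like calculation:

```python
# Sketch of one possible parameter-optimisation loop (hedged: this greedy
# random search is an assumption, not the method used in the talk).
import numpy as np

rng = np.random.default_rng(0)

def model_yields(params):
    """Toy stand-in for a GEF-like yield calculation."""
    return np.array([params[0], params[0] * params[1]])

def fitness(params, y_eval, unc_eval):
    """Chi-square-like distance between model yields and evaluated yields."""
    return np.sum(((model_yields(params) - y_eval) / unc_eval) ** 2)

y_eval = np.array([0.9, 1.8])       # 'evaluated' yields (toy values)
unc_eval = np.array([0.05, 0.10])   # their evaluated uncertainties

params = np.array([1.0, 1.0])
best = fitness(params, y_eval, unc_eval)
for _ in range(5000):
    trial = params + rng.normal(0.0, 0.02, size=params.shape)
    f = fitness(trial, y_eval, unc_eval)
    if f < best:                    # greedy accept: keep any improvement
        params, best = trial, f
print(params, best)
```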

A few cautionary thoughts
Fission yield simulation codes must be run to convergence, depending on how far down the chains you wish to simulate. The number of randomly sampled files required to converge some observables is not a priori obvious: 10 per parameter? 20? More? GEF may be able to match the yields of evaluated files quite well, since these are (hopefully) physically consistent, but how do we reconcile uncertainties which are not based on model/theory? As just mentioned, a few 'unique' files may slip through generation and some simple checks, but they will skew results, particularly variances, which are sensitive to outliers.
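The outlier point can be made concrete in a few lines: one pathological file among a thousand dominates the sample standard deviation.

```python
# Illustration of the outlier sensitivity mentioned above (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
clean = rng.normal(1.0, 0.02, size=999)   # well-behaved sampled yields
with_outlier = np.append(clean, 2.0)      # one 'unique' file slips through

print(f"std without outlier: {clean.std(ddof=1):.4f}")
print(f"std with outlier:    {with_outlier.std(ddof=1):.4f}")
```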

Convergence with TMC
1. Running each GEF calculation to convergence. Below: convergence of GEF calculations for U235_th nFY, from 1.0E+05 to 1.0E+08 simulated fission events (figure).

Convergence with TMC
2. Performing enough samplings to converge the observables of interest. Below: nFY TMC with PWR UO2 assembly-averaged Nd148 at shutdown, after 40 GWd/tn burn-up (figure; how many samples per parameter?).
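One practical convergence check, assumed here rather than taken from the talk, is to track the running mean and standard deviation of the observable (e.g. assembly-averaged Nd148) as TMC samples accumulate:

```python
# Running-statistics convergence check over a toy ensemble of TMC results.
import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(1.0, 0.03, size=2000)   # toy stand-in for Nd148 results

for n in (50, 200, 800, 2000):
    sub = samples[:n]
    print(f"n={n:5d}  mean={sub.mean():.5f}  std={sub.std(ddof=1):.5f}")
```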

GEF vs Evaluated
There are some fundamentally different approaches which are difficult to reconcile:
- Build the best physical model, based on a marriage of theory and semi-empirical models
- Perform sophisticated statistical analysis of experimental data and generate a best fit
- Match some experiments from your reactor of interest
It will be very challenging for a physically consistent model to reproduce tweaked values, and impossible for it to reproduce uncertainties with extreme discontinuities. However, legacy approaches will continue to be used, and users want UQP, so attempting to fit GEF results to evaluated data has value.

GEF and Evaluated Unc.
To fit the evaluated yields there is a natural fitness function. There are at least two sources of uncertainties to match: evaluation or experiment. Experiment is preferable for many reasons, but it is not for the faint-hearted: unwinding cumulative yields, differences between decay files, experimental biases for good or bad… I am not so brave. Choosing the evaluated data is a much easier task. Moreover, reactor operators are (probably) not going to prefer GEF-calculated uncertainties for their fission products of interest, irrespective of quality. For this we define a fitness function (see the reconstruction below).
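The slide's formula does not survive in the transcript. A plausible chi-square-style reconstruction (an assumption, not the talk's exact definition), with $Y_i^{\mathrm{GEF}}(\vec{p})$ the GEF yields for parameter set $\vec{p}$, $Y_i^{\mathrm{eval}}$ the evaluated yields and $\sigma_i^{\mathrm{eval}}$ the evaluated uncertainties, is

$$F(\vec{p}) = \sum_i \left(\frac{Y_i^{\mathrm{GEF}}(\vec{p}) - Y_i^{\mathrm{eval}}}{\sigma_i^{\mathrm{eval}}}\right)^2,$$

with an analogous fitness over sampled standard deviations when the target is the evaluated uncertainties themselves:

$$F_\sigma = \sum_i \left(\frac{\sigma_i^{\mathrm{GEF}} - \sigma_i^{\mathrm{eval}}}{\sigma_i^{\mathrm{eval}}}\right)^2.$$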

Yield sensitivities
To best fit the evaluated variances, some updating algorithm for the parameter variances is required, using the sensitivity of the yields to the input parameters (sketched below).
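The sensitivity expression is likewise lost in the transcript; a standard finite-difference form (assumed here), with $Y_i$ the yields and $p_j$ the GEF input parameters, would be

$$S_{ij} = \frac{\partial Y_i}{\partial p_j} \approx \frac{Y_i(p_j + \Delta p_j) - Y_i(p_j - \Delta p_j)}{2\,\Delta p_j}.$$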

Updating
The approach prototyped is quite simple:
- Mean values fixed as the end-of-optimisation values from the DR BMC method
- Variances taken as the default ratios of the default GEF parameters
- Sets of files generated with Gaussian samples over all parameters
- Statistical collapse of the sampled files compared with the evaluated data
- Variances gently nudged based on the sampled-to-evaluated fitness of all reasonably converged yields
- Continue until the update has no effect
The result depends on the fitness function, the path to the minimum and the target. A sketch of this loop, under assumed details, follows.
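```python
# Sketch of the prototype variance-update loop described above (hedged: the
# actual nudge rule and fitness are not given in the slides; this is one
# reasonable reading, with toy stand-ins for GEF and the evaluation).
import numpy as np

rng = np.random.default_rng(3)

def sampled_yield_std(param_mean, param_std, n_files=500):
    """Gaussian-sample parameters, 'run GEF' (toy model), collapse to a std."""
    params = rng.normal(param_mean, param_std, size=(n_files, param_mean.size))
    yields = params[:, 0] * (1.0 + 0.1 * params[:, 1])   # toy yield model
    return yields.std(ddof=1)

param_mean = np.array([1.0, 0.5])   # fixed at end-of-optimisation values
param_std = np.array([0.05, 0.05])  # starting (default-ratio) variances
sigma_eval = 0.02                   # evaluated uncertainty to reproduce

for it in range(50):
    sigma_gef = sampled_yield_std(param_mean, param_std)
    ratio = sigma_eval / sigma_gef
    if abs(ratio - 1.0) < 1e-3:     # update has no further effect: stop
        break
    param_std *= ratio ** 0.2       # gently nudge toward the target
print(f"iterations={it}, sampled std={sigma_gef:.4f}, target={sigma_eval:.4f}")
```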

Variance update prototype
Parameter-updating algorithms can be designed to push the input variances toward reproducing the evaluated uncertainties, but only as far as the physical model can cooperate with those evaluated uncertainties…

Comments on covariances
Independent-yield covariances are intuitive, since they emerge directly from the simulation of fission events (figure: independent-yield correlation chart for Nd148, GEFY-5.3, U5_th).

Comments on covariances
Cumulative covariances, and covariances from full irradiation scenarios, show completely different trends (figure: assembly at 40 GWd/tn).
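For reference, sample correlations like those plotted on these two slides can be formed from any TMC ensemble by stacking the per-file yields and taking the sample correlation matrix; the ensemble below is synthetic, not from GEF:

```python
# Forming a correlation matrix from a TMC ensemble (generic sketch).
import numpy as np

rng = np.random.default_rng(4)
# toy ensemble: rows = random yield files, columns = nuclides of interest
yields = rng.multivariate_normal(
    mean=[1.0, 0.8, 0.6],
    cov=[[0.010, 0.004, -0.002],
         [0.004, 0.020,  0.001],
         [-0.002, 0.001, 0.015]],
    size=1000,
)
corr = np.corrcoef(yields, rowvar=False)
print(np.round(corr, 3))
```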

FISPACT-II UQP
FISPACT-II can provide full, reaction-pathway-based production uncertainties for nuclide inventories, as well as TMC methods. Typically, inventory codes are tuned to follow selected FPs, pseudo-FPs, set fission yields, ratios of fissions, etc. FISPACT-II can be reined in or let loose per user requirements: you can probe as far/deep as you dare to go.

FISPACT-II and nFY TMC
FISPACT-II can be used to fully sample random independent yield files with any decay library, propagating uncertainties through the full fuel life-cycle.
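A generic campaign driver for such a run might look like the sketch below; the executable name, file layout and output parsing are all hypothetical placeholders and do not describe the actual FISPACT-II interface:

```python
# Generic TMC campaign driver (a sketch under stated assumptions: the
# "inventory_code" executable, file names and the parser are placeholders).
import pathlib
import shutil
import subprocess

def parse_total_activity(output_file: pathlib.Path) -> float:
    """Placeholder parser: pull one observable out of a run's output."""
    for line in output_file.read_text().splitlines():
        if line.startswith("TOTAL ACTIVITY"):
            return float(line.split()[-1])
    raise ValueError(f"observable not found in {output_file}")

results = []
for i, yf in enumerate(sorted(pathlib.Path("random_yields").glob("*.dat"))):
    run_dir = pathlib.Path(f"run_{i:04d}")
    run_dir.mkdir(exist_ok=True)
    shutil.copy(yf, run_dir / "fission_yields.dat")  # swap in sampled yields
    subprocess.run(["inventory_code", "input.i"], cwd=run_dir, check=True)
    results.append(parse_total_activity(run_dir / "output.out"))
```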

Conclusions
- Fission yields for reactor operation have uncertainties due to measurement techniques, interest in particular nuclides and choices of normalisation methods
- Two different evaluations (ENDF vs JEFF) not only disagree, but will intentionally, always disagree for the nuclides of importance to them
- A physically-faithful code with natural parameter variation cannot be reconciled with the discontinuities of evaluated uncertainties
- A physically-faithful code with massaged parameter variation may complement uncertainties and correlations for evaluated methods…
- To couple with full, unconstrained nuclear data we must have open, unconstrained simulation tools, such as FISPACT-II