CRASH UQ Program: Overview & Results
James Paul Holloway, CRASH Annual Review, Fall 2010

We predict what we have not yet measured
- Do calibration and validation experiments in Years 1-3
- Do code runs to characterize and improve predictions around those experiments
- Do code runs around Year 4 & 5 experiments
- Use physics (the code) and our characterization of uncertainties in a new region of inputs to predict the Year 4 & 5 experiments
[Diagram: Year 1-3 experiments, Year 4 & 5 experiments, simulations]

We have several outputs & inputs
Outputs:
- Shock location (SL)
- Axial centroid of dense Xe (AC)
- Area of dense Xe (A)
- Shock breakout time (BOT)
Inputs:
- Observation time of shock location, axial centroid, and area
- Laser energy
- Be disk thickness
- Xe fill gas pressure
Calibration parameters (theta): vary with model
[Figure: shock location, centroid of dense Xe, and area of dense Xe extracted over a fixed window]

Using a variety of methods we have…
- Explored sensitivity of SL to screen important inputs (influence plots, GPM correlations)
- Explored SL output surfaces to understand sensitivity

Source               SL uncertainty
Be gamma             ~0.15 mm
Initialization       ~0.10 mm
Discrepancy          ~0.10 mm
Be disk thickness    ~0.10 mm
Xe fill pressure     ~0.04 mm
Laser energy         ~0.01 mm
Exp. uncert.         ~0.10 mm

- The importance of the electron flux limiter led to the definition of the 2009 calibration experiment
- The sensitivity of the triple point location led to new integrated metrics
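The screening idea can be illustrated with a crude one-at-a-time perturbation study. This is a hypothetical sketch, not the influence-plot or GPM-correlation machinery the project actually used; `model`, the input names, and the perturbation sizes are all placeholders:

```python
def one_at_a_time_sensitivity(model, nominal, deltas):
    """Crude one-at-a-time sensitivity screen (illustrative only).

    Perturb each input by +/- delta about the nominal point and record
    half the resulting change in the output (e.g. shock location).
    """
    base = model(nominal)
    effects = {}
    for name, delta in deltas.items():
        up, dn = dict(nominal), dict(nominal)
        up[name] += delta
        dn[name] -= delta
        # central-difference estimate of the input's effect
        effects[name] = (model(up) - model(dn)) / 2.0
    return base, effects
```

Inputs whose effects are small compared with experimental uncertainty can then be dropped from further study.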

Integrated metrics: shock location
- Extract shock location from piecewise-constant fits over a fixed region (window) of the radiograph
- Four-segment fit representing unshocked material, shocked disk, entrained Xe annulus, and trailing plasma
- Knot locations optimized for minimal MSE
- The first knot is a predicted output (SL)
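A minimal sketch of the four-segment piecewise-constant fit, assuming a 1D axial profile extracted from the radiograph window. The brute-force knot search is illustrative only; the slide does not specify the actual optimizer or grid:

```python
import numpy as np
from itertools import combinations

def piecewise_constant_sse(profile, knots):
    """Sum of squared errors of a piecewise-constant fit with the given
    interior knots (each segment is fit by its mean)."""
    edges = [0, *knots, len(profile)]
    return sum(((profile[a:b] - profile[a:b].mean()) ** 2).sum()
               for a, b in zip(edges[:-1], edges[1:]))

def fit_four_segments(profile, step=4):
    """Brute-force search for the three interior knots (four segments)
    that minimize the fit error; the first knot plays the role of SL."""
    candidates = range(step, len(profile) - step, step)
    return min(combinations(candidates, 3),
               key=lambda k: piecewise_constant_sse(profile, k))
```

On a synthetic step profile the first returned knot recovers the step position, mimicking how the first knot serves as the shock-location output.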

Integrated metrics: mask to large optical depth
[Slide to be added: 104 CRASH runs windowed to a 100 micron radius and 2 mm length]

Integrated metrics: dense Xe centroid & area
- Define a threshold based on the unshocked Xe optical depth
- Extract the axial centroid of Xe above the threshold (insensitive to the threshold over a wide range)
- Extract the area of Xe above the threshold (varies smoothly with the threshold)
- Analogous quantities are defined for the radial metrics
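The thresholded centroid and area can be sketched as follows, assuming the radiograph is a 2D optical-depth array with rows indexed by axial position; the function name and arguments are illustrative, not from the slides:

```python
import numpy as np

def dense_xe_metrics(optical_depth, z, threshold, pixel_area):
    """Axial centroid and area of the region above an optical-depth threshold.

    optical_depth : 2D array (rows = axial positions, cols = radial positions)
    z             : 1D array of axial coordinates, one per row
    """
    mask = optical_depth > threshold
    weights = mask.sum(axis=1)                      # above-threshold pixels per row
    area = mask.sum() * pixel_area                  # total area above threshold
    centroid = (z * weights).sum() / weights.sum()  # pixel-weighted axial centroid
    return centroid, area
```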

The 1024-point run set
- Hyades and CRASH 2.0 in 1D
- 6D input space (4 x's and 2 thetas)
- Orthogonal Latin hypercube design (LHD) with a space-filling criterion
- Best estimates of the x uncertainties at the time of problem definition (we know more now)
- We also have a 104-point run set in 2D
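For illustration, here is a plain Latin hypercube sample over a box of input bounds. The actual run set used an orthogonal LHD chosen under a space-filling criterion, which this minimal sketch does not reproduce:

```python
import numpy as np

def latin_hypercube(n_runs, bounds, rng=None):
    """Simple (non-orthogonal) Latin hypercube sample over the given
    per-dimension (low, high) bounds: each dimension is stratified into
    n_runs equal slices, with one point per slice."""
    rng = np.random.default_rng(rng)
    n_dim = len(bounds)
    # independently permuted stratum indices per dimension, jittered within strata
    strata = rng.permuted(np.tile(np.arange(n_runs), (n_dim, 1)), axis=1).T
    u = (strata + rng.random((n_runs, n_dim))) / n_runs
    lo, hi = np.array(bounds, dtype=float).T
    return lo + u * (hi - lo)

# e.g. 1024 runs over 6 inputs (4 experimental x's and 2 calibration thetas)
design = latin_hypercube(1024, bounds=[(0.0, 1.0)] * 6, rng=0)
```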

We have experiments for calibration and experiments for characterizing uncertainty
- 2008: shock location measurements at 13, 14 and 16 ns
- 2009: shock breakout time (BOT) measurements
- 2010: shock location at 20 and 26 ns (SL2010)
Currently we are predicting SL2010 using:
- BOT for calibration
- SL from 2008 to characterize predictive error
The process involves using a pair of Kennedy-O'Hagan models and moving data from one to the next.

We use a model structure for calibration, validation & uncertainty assessment (Kennedy & O'Hagan 2000, 2001)
- Field data: measured in calibration experiments with specific x and unknown theta (few of these)
- Code output: computed with specific values of x and theta (lots of these)
- Discrepancy term: models the difference between reality and the code; speaks to validation
- Replication error
- An emulator fits the code over the input space
- x is an experimental input; theta is a physics or calibration input
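In the Kennedy-O'Hagan structure, a field measurement at experimental input x is modeled as the emulated code output at the true but unknown calibration input theta, plus a discrepancy and replication error:

```latex
y(x) \;=\; \eta(x, \theta) \;+\; \delta(x) \;+\; \varepsilon
```

Here \(\eta\) is the code (emulated by a Gaussian process over the input space), \(\delta\) is the discrepancy between reality and the code, and \(\varepsilon\) is replication error; a small \(\delta\) over the validation region is what "speaks to validation."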

Leave-one-out predictions tell us how we are doing
[Plots: leave-one-out predictions for the 2008 SL experiments and the 2009 BOT experiments]
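Leave-one-out predictions from a Gaussian process emulator have a well-known closed form. A minimal numpy sketch, assuming a zero-mean GP with a squared-exponential kernel (a simplified stand-in for the emulators actually used; hyperparameters are placeholders):

```python
import numpy as np

def gp_loo_residuals(X, y, length_scale=1.0, noise=1e-6):
    """Leave-one-out residuals y_i - mu_{-i}(x_i) for a zero-mean GP with a
    squared-exponential kernel, via the closed form alpha_i / (K^{-1})_{ii}."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    K = np.exp(-0.5 * d2 / length_scale**2) + noise * np.eye(len(y))
    Kinv = np.linalg.inv(K)
    alpha = Kinv @ y
    return alpha / np.diag(Kinv)
```

Large or systematically signed residuals relative to their predictive uncertainty flag inputs where the emulator (or the model structure) is doing poorly.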

Calibration using breakout time (BOT)
- Predict SL at 20 and 26 ns
- Assess the shock location (SL) prediction: prediction and estimate of uncertainty
- Move the discrepancy and replication error to the new region of inputs
- A small discrepancy means the model calibrates

The posterior distribution of the electron flux limiter is useful for other outputs
- Consistent with the BMARS-based calibration of BOT by Stripling (see poster)

The posterior distribution of the laser energy scale factor is useful for other outputs

Predictive study
- Use calibration experiments (2009) and validation experiments (2008) with CRASH to construct the model
- Use the model to predict at 20 and 26 ns
- Sample 50 sets of x values; for each x, sample 200 theta values
- Sample shock location from the model
- Construct predictive intervals for:
  - Code alone (red)
  - Entire model: code, discrepancy, replication error (blue)
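The nested sampling scheme above can be sketched as follows. The three sampler arguments are placeholders for draws from the x uncertainty distributions, the theta posterior, and the fitted model's predictive distribution; none of these names come from the slides:

```python
import numpy as np

def predictive_intervals(sample_x, sample_theta, sample_sl,
                         n_x=50, n_theta=200, level=0.95, rng=None):
    """Nested-sampling sketch of the predictive study: n_x draws of x,
    n_theta draws of theta for each x, and one shock-location draw per
    (x, theta) pair; returns the central predictive interval."""
    rng = np.random.default_rng(rng)
    draws = []
    for _ in range(n_x):
        x = sample_x(rng)
        for _ in range(n_theta):
            theta = sample_theta(rng)
            draws.append(sample_sl(x, theta, rng))
    lo, hi = np.quantile(draws, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi
```

Running it once with draws from the code emulator alone, and once with the discrepancy and replication error included, yields the two intervals (red and blue) on the slide.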

We have 95% predictive intervals
[Table: median SL at 20 ns and 26 ns; values not recoverable from the transcript]
- Repeat this predictive study using the 104 runs initialized with Hyades 2D and 2D CRASH

Future studies need to cope with finite computational resources
- Use simulations of varying fidelity in calibration and prediction
- Because computational costs are high, we need to be strategic about which runs we do:
  - Highly resolved 2D multigroup and 2D gray
  - Well resolved 3D gray
  - Lower resolution 3D multigroup
- A first study can be tried with 1D CRASH and 2D CRASH