Simulation Tool Benchmarking and Verification

Brent Dixon, Idaho National Laboratory
Technical Workshop on Fuel Cycle Simulation

FCS Verification – What to test
- Basic features or essential functionalities
  - These are the building blocks of the code
  - May also be considered "core capability" blocks
- Integral features that bind the basic features together and produce useful results
  - Considering the core capabilities as an integrated whole produces a complete result
  - Limited to basic capabilities common to most codes
- Exemplary features that enable study of particular sensitivities or other capabilities
  - May not be present in every tool
  - Provide the "full features" of a code

FCS Verification – How to test
General approaches:
- Manual verification of individual equations, up to a highly simplified model
- Spreadsheet verification of a simple model
- Benchmarking of single or multiple scenarios with a set of codes
- Auto-verification of code functions/methods
Scenario-specific approaches:
- Manual or spreadsheet verification of key results
- Cross-comparison to results from other FCS codes

Unit Tests – Parts of Code or Full Code
Parts example:
- Enrichment module provides correct NU, LEU, and DU masses and SWUs (see the sketch after this list)
Full code examples:
- Run one reactor at steady state
- Start up, run, and retire one reactor
- Run a homogeneous fleet at steady state
- Run a homogeneous fleet with simple growth
Items reviewed in full code tests are just the basics:
- Number of reactors, basic material flows
- May omit mining, conversion, transportation, storage, reprocessing, waste disposal, etc.
- Includes no exemplary features such as decay
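To make the parts-level example concrete, here is a minimal sketch in Python of such a unit test. The enrich() function and its signature are hypothetical stand-ins for whatever the enrichment module under test actually exposes; the mass-balance and separative-work relations are the standard ones, and the expected values were computed by hand (i.e., manual verification of the individual equations).

```python
import math
import pytest

def separation_potential(x: float) -> float:
    """Standard value function V(x) = (2x - 1) * ln(x / (1 - x))."""
    return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

def enrich(product_kg: float, x_p: float, x_f: float, x_w: float):
    """Hypothetical enrichment module: returns (feed_kg, tails_kg, swu).

    Mass balance:    F = P * (x_p - x_w) / (x_f - x_w),  W = F - P
    Separative work: SWU = P*V(x_p) + W*V(x_w) - F*V(x_f)
    """
    feed_kg = product_kg * (x_p - x_w) / (x_f - x_w)
    tails_kg = feed_kg - product_kg
    swu = (product_kg * separation_potential(x_p)
           + tails_kg * separation_potential(x_w)
           - feed_kg * separation_potential(x_f))
    return feed_kg, tails_kg, swu

def test_enrichment_masses_and_swu():
    # 1 kg of 4.5% LEU product from natural uranium (0.711%) at 0.25% tails
    feed, tails, swu = enrich(1.0, x_p=0.045, x_f=0.00711, x_w=0.0025)
    assert feed == pytest.approx(9.219, rel=1e-3)   # NU feed mass
    assert tails == pytest.approx(8.219, rel=1e-3)  # DU tails mass
    assert feed == pytest.approx(1.0 + tails)       # mass balance closes
    assert swu == pytest.approx(6.871, rel=1e-3)    # hand-verified SWU
```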

Code to Code Testing
- Purpose is to produce exactly the same results, or to be able to explain any differences
- Reference result may be calculated with a different tool (e.g., a spreadsheet)
- Differences may result in:
  - Evolution of the specification to eliminate multiple interpretations
  - Changes to codes to correct any errors discovered
  - Detailed explanation of differences due to different code architectures, time step sizes, reporting times (e.g., first of the year vs. end of the year), rounding, etc.
A minimal sketch of the comparison step follows.
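This sketch assumes a simple parameter-by-year data layout; the names, data, and tolerance are illustrative, not a standard. Every flagged entry is a difference that must then be explained or eliminated, per the list above.

```python
# Compare annual results from two FCS codes (or a code and a spreadsheet
# reference) and flag every parameter/year that differs beyond a tight
# tolerance. In a code-to-code test each flagged entry must be explained:
# spec ambiguity, code error, time-step or reporting convention, rounding...
import math

def compare_series(ref: dict, other: dict, rel_tol: float = 1e-6):
    """ref/other map parameter name -> {year: value}. Returns flagged diffs."""
    flagged = []
    for param, ref_by_year in ref.items():
        for year, ref_val in ref_by_year.items():
            other_val = other.get(param, {}).get(year)
            if other_val is None or not math.isclose(ref_val, other_val,
                                                     rel_tol=rel_tol):
                flagged.append((param, year, ref_val, other_val))
    return flagged

# Example: identical except one year, which must then be explained.
spreadsheet = {"swu_demand": {2030: 6.87e6, 2031: 7.02e6}}
code_a      = {"swu_demand": {2030: 6.87e6, 2031: 7.15e6}}
for diff in compare_series(spreadsheet, code_a):
    print("explain:", diff)
```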

Benchmarking
- Full code tests
- May be based on one scenario or a family of scenarios
- Typically will assess more parameters than a code-to-code test
- Often used to prove a new code is ready for production work
- Purpose is typically to produce similar behavior, not exact matches
  - Only significant differences require explanation
  - Typically these trace back to unmodeled features in less mature codes, etc.
- May involve disabling exemplary features to improve matches
- May be blind or open
  - Blind tests do not allow for iteration of the specification if it is found to allow multiple interpretations

Benchmarking Examples: MIT Benchmark (2009) [3]
- Included multiple scenarios
  - One was iterated for close matches, with advanced features disabled
  - Others allowed all features to operate, including automated decision-making
- Four mature codes compared

Benchmarking Examples: NEA/OECD Benchmark (2012) [4]
- Involved a single scenario with three stages
- Five mature codes were compared

Benchmarking Examples: IAEA/INPRO GAINS (2013) [5]
- Involved three simple scenarios
- Six codes compared (several not previously benchmarked)

Benchmarking Observations
- Results are more understandable with simple scenarios and tight specifications
  - Must design around the intersection of code capabilities, not the union
- Expect differences in results
  - Just because results do not match does not mean one code is "wrong"
- However, allow time for iterations
  - When running a new type of problem, a code may reveal a bug that needs fixing
- Expect strong similarities in general trends (see the sketch after this list)
  - A transition may not occur in the same year, but should occur near the same time
  - A slope may not be equal, but should be close (e.g., all trending slightly up)
- Use the experience to improve your code
  - Tune up your calculations, or add a new capability you saw in another code
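As a minimal sketch of trend-level checking: the data are hypothetical, the "transition year" is defined here as the first year the advanced-fleet share exceeds 50%, and the tolerances are illustrative rather than standard values.

```python
# Check that two codes agree on trends, not exact values.
def transition_year(share_by_year: dict, threshold: float = 0.5) -> int:
    """First year the advanced-fleet share reaches the threshold."""
    return min(y for y, s in share_by_year.items() if s >= threshold)

def mean_slope(series: dict) -> float:
    """Average rate of change over the whole series."""
    years = sorted(series)
    return (series[years[-1]] - series[years[0]]) / (years[-1] - years[0])

code_a = {2050: 0.30, 2060: 0.48, 2070: 0.62, 2080: 0.75}
code_b = {2050: 0.28, 2060: 0.45, 2070: 0.58, 2080: 0.72}

# Transitions should occur near the same time (here: within 10 years)...
assert abs(transition_year(code_a) - transition_year(code_b)) <= 10
# ...and slopes should trend the same direction and be close (within 25%).
sa, sb = mean_slope(code_a), mean_slope(code_b)
assert sa * sb > 0 and abs(sa - sb) / abs(sa) < 0.25
```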

References
[1] N.R. Brown, B.W. Carlsen, B.W. Dixon, B. Feng, H.R. Greenberg, R.D. Hays, S. Passerini, M. Todosow, A. Worrall, "Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle", Annals of Nuclear Energy 96 (2016) 88-95.
[2] B. Feng, B. Dixon, E. Sunny, A. Cuadra, J. Jacobson, N.R. Brown, J. Powers, A. Worrall, S. Passerini, R. Gregg, "Standardized verification of fuel cycle modeling", Annals of Nuclear Energy 94 (2016) 300-312.
[3] L. Guerin, B. Feng, P. Hejzlar, B. Forget, M.S. Kazimi, L. Van Den Durpel, A. Yacout, T. Taiwo, B.W. Dixon, G. Matthern, L. Boucher, M. Delpech, R. Girieud, M. Meyer, "A Benchmark Study of Computer Codes for System Analysis of the Nuclear Fuel Cycle", Center for Advanced Nuclear Energy Systems, Massachusetts Institute of Technology, MIT-NFC-TR-105, April 2009.
[4] "Benchmark Study on Nuclear Fuel Cycle Transition Scenarios Analysis Codes", Expert Group on Fuel Cycle Transition Scenario Studies, Nuclear Energy Agency, NEA/NSC/WPFC/DOC(2012)16, Paris, June 2012.
[5] "Framework for Assessing Dynamic Nuclear Energy Systems for Sustainability: Final Report of the INPRO Collaborative Project GAINS", International Atomic Energy Agency, NP-T-1.14, Vienna, 2013.
[6] M. Gidden, A. Scopatz, P. Wilson, "Developing standardized, open benchmarks and a corresponding specification language for the simulation of dynamic fuel cycle", Trans. Am. Nucl. Soc. 108 (2013) 127-130.

Discussion
- What issues have you encountered in code verification?
- What have you been surprised to find or learn?
- What weaknesses in current approaches have you noted, and what ideas do you have to address them?