The Center for Astrophysical Thermonuclear Flashes
An Advanced Simulation & Computing (ASC) Academic Strategic Alliances Program (ASAP) Center at The University of Chicago

IO Architecture Notes

The ASC/Alliances Center for Astrophysical Thermonuclear Flashes, The University of Chicago

IO Scalar Quantities
 Scalar quantities are information the simulation needs that is neither grid data nor a runtime parameter (e.g., the simulation time)
 Each unit writes whatever scalar values it owns through sendOutputScalars, which is called by IO
 Output scalars are read back during IO_init, which is called from Driver_initFlash
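FLASH itself is written in Fortran; purely as an illustration, the ownership pattern above can be sketched in Python, where each unit registers a callback that reports the scalars it owns and IO collects them at output time. All names below are hypothetical, not the actual FLASH API:

```python
# Hypothetical sketch of the "each unit owns its scalars" pattern.
# In FLASH, IO calls each unit's sendOutputScalars; here, units
# register callbacks that hand their scalars to a central collector.

class ScalarIO:
    def __init__(self):
        self._senders = []   # one callback per unit
        self._scalars = {}   # name -> value, written at output time

    def register_unit(self, send_output_scalars):
        """A unit registers the callback that reports its scalars."""
        self._senders.append(send_output_scalars)

    def set_scalar(self, name, value):
        self._scalars[name] = value

    def write_scalars(self):
        """Called at checkpoint time: ask every unit for its scalars."""
        for send in self._senders:
            send(self)
        return dict(self._scalars)   # what would go into the file

# Example: the Driver unit owns the simulation time.
io = ScalarIO()
io.register_unit(lambda io: io.set_scalar("simulation_time", 0.25))
io.register_unit(lambda io: io.set_scalar("nstep", 120))

checkpoint_scalars = io.write_scalars()
print(checkpoint_scalars)   # {'simulation_time': 0.25, 'nstep': 120}
```

The inversion here mirrors the slide's point: IO never needs to know which scalars exist, only which units to ask.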

Integral Quantities
 Quantities are integrated by volume over the grid
 Cartesian geometries are supported by default, as is 2D cylindrical
 Frequently overridden to provide additional functionality
 Does not use any of the IO unit's output machinery; if you override the file, you are responsible for your own MPI communication
 It is recommended that you use Flash_mpi.h and FLASH_REAL in MPI calls
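As a language-neutral illustration of integrating a quantity by volume over the grid (a Python sketch, not FLASH's Fortran implementation; in the real code each rank integrates its local blocks and the partial sums are combined with an MPI reduction):

```python
# Sketch: volume-weighted integral of a cell-centered quantity.
# Each cell contributes q * dV; in 2D cylindrical (r, z) geometry
# the volume element picks up a factor of 2*pi*r.
import math

def integrate_cartesian(q, dx, dy, dz):
    """Sum q*dV over a uniform Cartesian block (q is a 3D nested list)."""
    dv = dx * dy * dz
    return sum(sum(sum(col) for col in plane) for plane in q) * dv

def integrate_cylindrical_2d(q, r_centers, dr, dz):
    """Sum q * 2*pi*r * dr*dz over a uniform 2D (r, z) block."""
    total = 0.0
    for i, r in enumerate(r_centers):
        for qij in q[i]:
            total += qij * 2.0 * math.pi * r * dr * dz
    return total

# A 2x2x2 block of unit density with unit cells integrates to 8.
q3 = [[[1.0, 1.0], [1.0, 1.0]], [[1.0, 1.0], [1.0, 1.0]]]
print(integrate_cartesian(q3, 1.0, 1.0, 1.0))   # 8.0

# Unit density over r in [0, 2], z in [0, 1]: volume pi*r^2*h = 4*pi.
print(integrate_cylindrical_2d([[1.0], [1.0]], [0.5, 1.5], 1.0, 1.0))
```

After the local sum, the equivalent of `MPI_Reduce` with `MPI_SUM` would combine per-rank totals; that step is omitted here.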

IO Caveats and Gotchas
 IO has a few exceptions to our general architecture rules
   It can directly access the Grid unit's data
 Not all of IO is included by default
   IOMain and IOParticles are separate subunits!
 Restarts do interact with the environment
   Runtime parameters from the Config files and flash.par trump values stored in a checkpoint
   A runtime parameter's checkpoint value is kept only as its previous value
 Initialization order matters
   IO_init must be called before any scalar quantities are used
   This includes sim_time!
 Logfiles are their own unit; they are not part of the IO unit
 Diagnostic IO: watch who is writing to which file
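The restart precedence rule above (flash.par and Config values trump what the checkpoint stored, with the checkpoint's value surviving only as the "previous" value) can be sketched like this; the function and parameter names are hypothetical:

```python
# Hypothetical sketch of runtime-parameter precedence on restart:
# flash.par / Config values override what the checkpoint stored,
# and the checkpoint's value survives only as the "previous" value.

def merge_restart_params(checkpoint_params, flashpar_params):
    merged = {}
    for name, chk_value in checkpoint_params.items():
        if name in flashpar_params:
            # flash.par wins; checkpoint value is demoted to "previous".
            merged[name] = {"value": flashpar_params[name],
                            "previous": chk_value}
        else:
            merged[name] = {"value": chk_value, "previous": chk_value}
    # Parameters that only appear in flash.par are taken as-is.
    for name, value in flashpar_params.items():
        merged.setdefault(name, {"value": value, "previous": None})
    return merged

params = merge_restart_params(
    {"cfl": 0.8, "nend": 1000},   # stored in the checkpoint
    {"cfl": 0.4},                 # set in flash.par for the restart
)
print(params["cfl"])   # {'value': 0.4, 'previous': 0.8}
```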

IO and Grid Variables
 The reading and writing of Grid variables are handled differently depending on type
   Unknowns are both written and read by default
   Plot files contain only the variables currently selected for plotting
   Face-centered data are checkpointed if declared
   Fluxes are not checkpointed at all
   Scratch variables may be checkpointed for diagnostic purposes, but are never read back in
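The per-type rules above amount to a small policy table. A Python sketch, illustrative only (the table entries restate the slide; the helper name is hypothetical):

```python
# Sketch of the per-type checkpoint policy described above:
# whether each class of Grid variable is written, and whether it
# is read back on restart.

CHECKPOINT_POLICY = {
    # type:           (written,           read on restart)
    "unknown":        (True,              True),   # written and read by default
    "face_centered":  ("if declared",     "if declared"),
    "flux":           (False,             False),  # never checkpointed
    "scratch":        ("for diagnostics", False),  # never read back in
}

def is_read_on_restart(var_type):
    """True only when the type is unconditionally read back."""
    return CHECKPOINT_POLICY[var_type][1] is True

print(is_read_on_restart("unknown"))   # True
print(is_read_on_restart("scratch"))   # False
```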