
1 An Advanced Simulation & Computing (ASC) Academic Strategic Alliances Program (ASAP) Center at The University of Chicago
The Center for Astrophysical Thermonuclear Flashes
Inner Workings and Architecture of FLASH
Anshu Dubey & Katie Antypas
June 4, 2006

2 Overview
- What is FLASH
- Basics of FLASH architecture
  - Similarities and differences between versions 2 and 3
  - Units
  - Inheritance
  - Naming
- Basics behind a problem setup
- Goal: enough of an idea of these concepts to be able to look at a sample setup and understand what is happening

3 The FLASH Code
Applications: cellular detonation, compressed turbulence, helium burning on neutron stars, Richtmyer-Meshkov instability, laser-driven shock instabilities, nova outbursts on white dwarfs, Rayleigh-Taylor instability, flame-vortex interactions, gravitational collapse/Jeans instability, wave breaking on white dwarfs, Orszag-Tang MHD vortex, Type Ia supernovae, intracluster interactions, magnetic Rayleigh-Taylor. Shortly: relativistic accretion onto neutron stars.
The FLASH code:
1. Parallel, adaptive-mesh simulation code
2. Designed for compressible reactive flows
3. Has a modern CS-influenced architecture
4. Can solve a broad range of (astro)physics problems
5. Portable: runs on many massively parallel systems
6. Scales and performs well
7. Is available on the web: http://flash.uchicago.edu

4 What FLASH Provides
Physics:
- Hydrodynamics: PPM, MHD, relativistic PPM
- Equation of state: ideal gas, multigamma, Helmholtz
- Nuclear physics
- Gravity
- Cosmology
- Particles
Infrastructure:
- Setup
- Mesh management: AMR (Paramesh), uniform grid (UG)
- Parallel I/O: HDF5, PnetCDF
- Monitoring
- Runtime and post-processing visualization
- Regular testing toolkit
- Unit test framework

5 A Little FLASH History
- FLASH0: Paramesh2, Prometheus, and EOS/Burn
- FLASH1: smoothing out the smash; first form of module architecture & inheritance
- FLASH2: untangling modules from each other (Grid); dBase; concept of levels of users
- FLASH3: stricter interface control & unit architecture; taming the database; better encapsulation and isolation; community-supported and developed code

6 FLASH Audiences
- FLASH developer: works on just about everything; Grid development, data access, architecture
- Application developer: develops physics units; publishes an API; uses other units' APIs
- Application programmer: selects units and their specific implementations; implements functions specific to an application if they are not provided by FLASH
- End user: sets initial conditions, boundary conditions, and runtime parameters

7 FLASH Code Basics
- An application code, composed of units/modules; particular modules are set up together to run different physics problems
- Design goals: performance, testing, usability, portability
- Fortran, C, Python, ...
  - More than 560,000 lines of code in the internal release (75% code, 25% comments)
- Very portable
- Scales to tens of thousands of processors

8 FLASH on BG/L: Sod Weak Scaling
[Plot: total time vs. number of processors; hydro time stays roughly constant with constant work per processor using AMR. Source: K.M. Riley, Argonne National Lab]

9 Hydro and Particles Scaling

10 Basic Computational Unit: Block
- The grid is composed of blocks
- Most often all blocks have the same dimensions
- Blocks cover different fractions of the physical domain
- In AMR, blocks at different levels of refinement have different grid spacing
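
To make the refinement relationship concrete, here is a small standalone sketch (the block size, domain, and variable names are illustrative, not FLASH's internals): each refinement level halves a block's physical extent, and hence its cell spacing.

    ! Sketch: cell spacing of an AMR block as a function of refinement
    ! level, for a 1D domain [xmin, xmax] covered by nblockx root blocks
    ! of nxb cells each. Names are illustrative, not FLASH variables.
    program block_spacing
      implicit none
      real, parameter    :: xmin = 0.0, xmax = 1.0
      integer, parameter :: nblockx = 1   ! root-level blocks across the domain
      integer, parameter :: nxb     = 8   ! cells per block, fixed at compile time
      integer :: lrefine
      real    :: dx
      do lrefine = 1, 6
         ! each new level halves the block size, hence the spacing
         dx = (xmax - xmin) / real(nblockx * nxb * 2**(lrefine-1))
         print '(a,i2,a,es12.4)', ' level ', lrefine, '   dx = ', dx
      end do
    end program block_spacing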

11 FLASH Architecture: Four Cornerstones
- Setup tool: assembles the simulation
- Config files: tell setup how to assemble the simulation
- Driver: organizes interactions between units
- Unit architecture: API, inheritance, data management

12 What's a FLASH Unit?
- The basic architectural unit of FLASH
- A component of the FLASH code providing a particular functionality
- Different combinations of units are used for particular problem setups
- Publishes a public interface for other units' use
- Examples: Driver, Grid, Hydro, IO, etc.
- Inheritance is emulated ("faked") through the directory structure
- Interaction between units is governed by the Driver
- Not all units are included in all applications

13 FLASH Units
[Diagram: infrastructure and monitoring units (Driver, I/O, Runtime Parameters, Grid, Profiling, Logfile, Simulation) alongside physics units (Hydro, MHD, Burn, Gravity)]

14 Unit Architecture
[Diagram: the Driver calls a unit (e.g. Grid) through its top-level API; the unit keeps its state (block data, time step, etc.) in a data module and wraps its computational kernel behind wrapper routines; a unit test sits alongside]

15 Unit Hierarchy
[Diagram: Unit (API/stubs) at the top; UnitMain holds the common API implementation, with specific implementations Impl_1, Impl_2, and Impl_3 each providing the remaining API and its kernel, plus a common directory for code shared among implementations; sibling subunits such as UnitSomething carry their own API implementation and kernel]

16 Inheritance Through Directories: Eos
- Eos: stub implementations of the three functions (Eos, Eos_init, Eos_wrapped) at the top level
- Eos/EosMain: replaces the stubs with an implementation common to all formulations of EOS; there is only one subunit
- Eos/EosMain/Gamma: the specific implementation of the Gamma versions of Eos and Eos_init
- Eos/EosMain/Multigamma: another implementation, which has its own Eos, Eos_init, etc.
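
A minimal sketch of how that inheritance might look on disk (the file layout below is illustrative rather than copied from the FLASH source tree); the setup tool picks the most specific version of each file, so an implementation's file overrides the stub above it:

    Eos/
      Eos.F90               (stub)
      Eos_init.F90          (stub)
      Eos_wrapped.F90       (stub)
      EosMain/
        Eos_wrapped.F90     (common to all EOS formulations)
        Gamma/
          Eos.F90           (gamma-law implementation)
          Eos_init.F90
        Multigamma/
          Eos.F90           (multigamma implementation)
          Eos_init.F90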

17 The Config File
- Declares solution variables and fluxes
- Declares runtime parameters and sets their defaults
- Lists required, requested, or exclusive units
- Config files are additive down the directory tree: no replacements
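
For illustration, a small Config file for a hypothetical setup might look like this (the keywords follow the general Config style described above; the specific units, variables, and parameters are invented):

    # Units this setup requires
    REQUIRES Driver
    REQUIRES physics/Hydro
    REQUIRES physics/Eos/EosMain/Gamma

    # Solution variables stored on the grid
    VARIABLE dens
    VARIABLE pres

    # Runtime parameters and their defaults
    PARAMETER rho_left   REAL 1.0
    PARAMETER rho_right  REAL 0.125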

18 FLASH Setup Script: Implements the Architecture
Python code links together the needed physics and tools for a problem. The setup script:
- Parses Config files to
  - determine a self-consistent set of units to include
  - find out which implementation to include when a unit has multiple implementations
  - get the list of parameters from the units
- Determines solution data storage
- Creates a file defining global constants set at build time
- Builds the infrastructure for mapping runtime parameters to constants as needed
- Configures Makefiles properly

19 Pulling It All Together
- Choose a problem simulation
- Run setup to configure that problem
  - everything goes into a new top-level directory, 'object'
- Make
- Run
  - flash.par supplies runtime parameters
  - defaults are already set by the particular units included
A representative command sequence follows.
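
In practice this is just a few commands from the FLASH root directory; a hedged sketch (the problem name and setup flags are representative, and the exact executable name depends on the FLASH version):

    # configure a problem into the new top-level 'object' directory
    ./setup Sod -auto
    cd object
    # build
    make
    # run; runtime parameters are read from flash.par
    mpirun -np 4 ./flash3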

20 Adding a New Simulation
A basic problem setup consists of:
- Config file: lists the required physics modules
- flash.par: default runtime parameter configuration
- Simulation_initBlock: sets the initial conditions for the problem block by block (a sketch follows below)
- Many other possible files: driver, refinement algorithms, user-defined boundary conditions
Any file placed in the setup directory takes precedence. More about setting up your own problem in the hands-on session later this afternoon.
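
A heavily simplified sketch of block-by-block initialization in the spirit of Simulation_initBlock (real FLASH3 code would obtain coordinates and store variables through the Grid unit API; here the arrays are passed directly, purely for illustration):

    ! Sketch: initialize one block's density/pressure with a left/right
    ! (Sod-like) state. Not FLASH's actual Simulation_initBlock signature.
    program init_block_sketch
      implicit none
      integer, parameter :: nxb = 8
      real :: xCenter(nxb), dens(nxb), pres(nxb)
      integer :: i
      xCenter = [( (i-0.5)/nxb, i = 1, nxb )]   ! cell centers on [0,1]
      call sim_initBlock(nxb, xCenter, dens, pres)
      print *, dens
    contains
      subroutine sim_initBlock(n, x, d, p)
        integer, intent(in)  :: n
        real,    intent(in)  :: x(n)
        real,    intent(out) :: d(n), p(n)
        integer :: j
        do j = 1, n
           if (x(j) < 0.5) then          ! left state
              d(j) = 1.0;   p(j) = 1.0
           else                          ! right state
              d(j) = 0.125; p(j) = 0.1
           end if
        end do
      end subroutine sim_initBlock
    end program init_block_sketch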

21 Provided Driver
Provided: second order, state form, Strang split.
FLASH2:
  flash.F90:
    Initialize
    Loop over timesteps
      evolve
      adjust dt
      output, visualize
    End loop
    Finalize
  evolve.F90:
    set time step
    do physics
    set time step, repeat physics
    Mesh_updateGrid
FLASH3 (Driver unit):
  Flash.F90:
    Driver_initFlash
    Driver_evolveFlash
    Driver_finalizeFlash
  Driver_evolveFlash.F90:
    Loop over timesteps
      set time step
      call physics
      set time step, repeat physics
      Grid_updateRefinement
      adjust dt
      output
    End loop

22 FLASH Grids: Paramesh
- FLASH's original design was Paramesh-specific
- Block structured: all blocks have the same dimensions
- Blocks at different refinement levels have different grid spacings and thus cover different fractions of the physical domain
- Fixed-size blocks, specified at compile time; every section of the code assumed knowledge of the block size
- Good for capturing shocks
- The fixed-block-size requirement is currently being removed from the code to open the door for other mesh packages such as a uniform grid, squishy grid, and patch-based grid
In choosing Paramesh, the original FLASH code architects chose the simplicity of the Paramesh structure over a patch-based mesh.

23 Infrastructure: Mesh Packages in FLASH
Paramesh:
- Block structured
- Fixed-size blocks, specified at compile time
- No more than one level jump at fine-coarse boundaries
Uniform Grid:
- One block per processor
- No AMR-related overhead

24 Building and Running an Application
[Diagram: the setup tool combines the Simulation directory with the Driver, physics units, Grid/mesh, I/O, runtime inputs, and profiler to build an application]

25 I/O and Visualization Outline
- I/O challenges in scientific computing: speed, portability, file size; serial vs. parallel multi-file vs. MPI-IO
- I/O libraries: HDF5, Parallel-NetCDF
- FLASH I/O specifics: parameter input file, checkpointing and restarts, plotfiles
- Visualization: runtime visualization, fidlr3, flashView

26 I/O Challenges: Portability
- Big endian vs. little endian byte order
- Sizes of types vary
- Data alignment in memory varies
[Diagram: the same 2-byte int, 4-byte int, and double laid out at different byte offsets in memory]
Taking a basic binary output file from one machine and trying to use it on another often causes problems.

27 I/O Challenges: File Size and Speed
- File size
  - Files are huge: in FLASH a typical checkpoint file is ~5 GB, and up to 50 GB for a research simulation (plotfiles ~2 GB)
  - Need access to large storage areas
  - Data transfer takes lots of time (~0.5 MB/sec from the labs)
- I/O can be a major performance bottleneck
  - slow file systems
  - writing to disk takes time
It isn't hard to have speed, portability, or usability alone. It is hard to have speed, portability, and usability in the same implementation.

28 Serial I/O
[Diagram: processors 0-5 all send their data to processor 0, which writes a single file]
- Each processor sends its data to the master, which then writes the data to a file
- Advantages: doesn't need a parallel file system; simple
- Disadvantages: not scalable; not efficient
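
A minimal MPI sketch of this funnel-through-the-master pattern (array sizes and the file name are arbitrary; error handling omitted for brevity):

    ! Sketch: serial I/O -- every rank sends to rank 0, which alone
    ! touches the file system. Simple, but rank 0 is a bottleneck.
    program serial_io
      use mpi
      implicit none
      integer, parameter :: nlocal = 4
      double precision :: buf(nlocal)
      double precision, allocatable :: all(:)
      integer :: rank, nprocs, ierr

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      buf = dble(rank)                      ! fill with something recognizable
      allocate(all(nlocal*nprocs))          ! only really needed on rank 0

      ! all data funnels through the master...
      call MPI_Gather(buf, nlocal, MPI_DOUBLE_PRECISION, &
           all, nlocal, MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)

      ! ...which writes the single output file
      if (rank == 0) then
         open(unit=10, file='data.bin', form='unformatted', access='stream')
         write(10) all
         close(10)
      end if
      call MPI_Finalize(ierr)
    end program serial_io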

29 Parallel I/O, Multi-file
[Diagram: processors 0-5 each write their own separate file]
- Each processor writes its own data to a separate file
- Advantages: fast!
- Disadvantages: can quickly accumulate many files; hard to manage; requires post-processing

30 Parallel I/O, Single-file
[Diagram: processors 0-5 write to disjoint regions of one shared file]
- Each processor writes its own data to the same file using an MPI-IO mapping
- Advantages: single file; scalable (to how many processors is debatable...)
- Disadvantages: requires an MPI-IO mapping or other higher-level libraries

31 Parallel I/O, Single File: Offsets
[Diagram: a global data array divided into contiguous sections owned by processors 0-5]
Each processor writes to a section of a data array. Each must know its offset from the beginning of the array and the number of elements to write.
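
A minimal Fortran sketch of that offset computation with MPI-IO (assumes equal-sized local arrays, so rank r's byte offset is simply r * nlocal * 8; uneven sizes would need a prefix sum such as MPI_Exscan; error handling omitted):

    ! Sketch: each rank writes its nlocal doubles at its own byte offset
    ! within one shared file.
    program pio_offset
      use mpi
      implicit none
      integer, parameter :: nlocal = 4
      double precision :: buf(nlocal)
      integer :: rank, fh, ierr
      integer(kind=MPI_OFFSET_KIND) :: offset

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      buf = dble(rank)

      call MPI_File_open(MPI_COMM_WORLD, 'data.bin', &
           ior(MPI_MODE_CREATE, MPI_MODE_WRONLY), MPI_INFO_NULL, fh, ierr)
      ! bytes occupied by the lower ranks' sections (8 bytes per double)
      offset = int(rank, MPI_OFFSET_KIND) * nlocal * 8
      call MPI_File_write_at(fh, offset, buf, nlocal, &
           MPI_DOUBLE_PRECISION, MPI_STATUS_IGNORE, ierr)
      call MPI_File_close(fh, ierr)
      call MPI_Finalize(ierr)
    end program pio_offset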

32 I/O Libraries
- FLASH works with two different I/O libraries: HDF5 and Parallel-NetCDF
- Both use MPI-IO mappings
- Both are portable libraries and can translate data between systems
- Scientific data is mostly stored in multidimensional arrays
- FLASH3 also supports basic direct Fortran I/O -- use it only as a last resort!

33 FLASH I/O
- FLASH can use either HDF5 or Parallel-NetCDF for I/O
- We were initially very optimistic about PnetCDF because it appeared to be ~twice as fast as HDF5
[Plot source: J.B. Gallagher on ASC White]

34 Parameter Input File: flash.par
- To change parameters in a simulation, edit the flash.par file
- Defaults are set in Config files; any value in flash.par overrides the Config value
- A parameter MUST be declared in a Config file!
- Controls parameters for all units, for example:
  - Grid parameters: lrefine_max, lrefine_min, geometry
  - Physics parameters: flame_speed
  - I/O parameters: restart, checkpointFileNumber
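
For example, a minimal flash.par might look like this (parameter names taken from the examples above; the values are illustrative):

    # Grid
    lrefine_min = 1
    lrefine_max = 6          # maximum AMR refinement level
    geometry    = "cartesian"

    # I/O
    restart              = .false.
    checkpointFileNumber = 0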

35 FLASH Checkpoint (Restart) Files
A checkpoint file saves all the data you need to restart your simulation from the exact state where it left off:
- Grid data: refinement level, node type (parent, child), bounding box, coordinates, unknown variables (density, pressure, temperature, etc.)
- Simulation data: time, dt, timestep
- Metadata: time stamp, build directory, setup problem, etc.
For any number of reasons your simulation could be interrupted: the machine crashes, the queue window closes, the machine runs out of disk space. Rather than starting again from the beginning, you can restart your problem from a FLASH checkpoint file.

36 Checkpoint Parameters
There are a number of ways to control the occurrence and spacing of checkpoint dumps (names shown as clarified for FLASH3):
- Timesteps between dumps (parameter: checkpointFileIntervalStep)
- Simulation-time seconds between dumps (parameter: checkpointFileIntervalTime)
- Wall-clock seconds between dumps (parameter: wall_clock_checkpoint)
  - very useful when running in a queue, or when only a certain amount of time is available on a machine
- Creating a file called .dump_checkpoint forces a checkpoint file to be dumped immediately
- Creating a .dump_restart file forces a checkpoint and halts the code
- Rolling checkpoints (parameter: rolling_checkpoint)
  - checkpoint files are big (5-50 GB for a large science run); this feature keeps only a specified number of checkpoints to save disk space
The FLASH code support website gives detailed explanations of all runtime parameters.
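
For instance, to checkpoint every 50 timesteps and every half hour of wall-clock time while keeping only the two most recent files, the flash.par entries might read (names as above, values illustrative):

    checkpointFileIntervalStep = 50      # every 50 timesteps
    wall_clock_checkpoint      = 1800.   # and every 1800 wall-clock seconds
    rolling_checkpoint         = 2       # keep only the 2 newest checkpoints

From the shell, creating the touch file (touch .dump_checkpoint in the run directory) forces an immediate dump, as noted above.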

37 Restarting a Simulation
- Because checkpoint files save the exact state of the simulation, you can restart from any FLASH checkpoint file
- You can restart with parallel I/O even if the previous run was written serially
- You can restart with a different number of processors than the previous run
To restart a simulation, in your flash.par file set:
- the runtime parameter restart = .true.
- the checkpointFileNumber parameter to the desired checkpoint number
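
Putting that together, the flash.par changes for a restart are just (the checkpoint number here is illustrative):

    restart              = .true.
    checkpointFileNumber = 12    # resume from checkpoint file 12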

38 Restarting on a Different Number of Processors
[Diagram: the same global data array, divided among six processors in the original run, is re-divided among three processors on restart]

39 FLASH Plotfiles and Particle Files
Plotfiles and particle files output data for visualization and analysis purposes.
- Plotfiles are smaller than checkpoints: they output only the variables needed for visualization or analysis, and the data is stored in single precision
- Control plotfile frequency with the runtime parameters plotfileNumber, plotfileIntervalStep, and plotfileIntervalTime
- Particle files are also stored in single precision
- Control particle file frequency with the runtime parameters particleFileNumber, particleFileIntervalStep, and particleFileIntervalTime

40 Visualization

41 Visualization
- Runtime visualization
- fidlr3
- FlashView

42 Runtime Visualization
- The runtime visualization module is not automatically included: add it to your Modules file to set up a problem with runtime viz
- Gives the user a quick-and-dirty view of the simulation
- Only requires the widely available png library
- Creates png files in the current directory during the run

43 fidlr3 (FLASH IDL Routines)
Fidlr is an application written in IDL to visualize FLASH checkpoint files and plotfiles.
- IDL tool for 2D visualization
- Available with FLASH directly
- Can step through plots to view sequences or see movies
- Can visualize 3D data by looking at slices
- Makes histograms
- Additionally, if you have access to IDL and want to do more analysis, you can use the routines to read in data and do statistical analysis (max/min values, etc.)

44 Example of a 3D slice taken by FIDLR3

45 FlashView
FlashView is a 3D visualization package developed for FLASH data at Argonne National Lab by Mike Papka and Randy Hudson.
Features:
- Reads native FLASH plotfiles
- XYZ cut-plane features, so the entire range of data for a variable can be projected on a plane
- Renders 3D iso-surfaces of FLASH data
- Fully parallel (both reading the file and calculations); the parallelism makes it possible to look at very large data sets interactively
- Displays multiple (up to 5) variables at the same time
- Various color maps, zoom, rotation, translation, etc.
- Shows grid structure
- Movie making
- Currently only supports the HDF5 format
Notes:
- Yes... we think it is difficult to install too...
- The key is to install the exact versions of the recommended packages
- Ask Dan if you have questions

46 FlashView

47 Example of an isosurface (single variable)

48 Another example with two variables

49 Another example: isosurface with cut plane

50 Isosurface with Grid

51 Issues with Cell-centered vs. Corner Interpolated Data

52 Issues with cell-centered vs. corner interpolated data

