1 An Accelerated Strategic Computing Initiative (ASCI) Academic Strategic Alliances Program (ASAP) Center at The University of Chicago
The Center for Astrophysical Thermonuclear Flashes
The FLASH Code: From Design to Applications, From Applications to Design
Tomek Plewa, on behalf of almost countless contributors
The ASCI Flash Center, Dept. of Astronomy & Astrophysics, The University of Chicago
July 22, 2002
http://flash.uchicago.edu

2 The ASCI/Alliances Center for Astrophysical Thermonuclear Flashes, The University of Chicago
What Is the Flash Center?
- Supported by the DOE ASCI/Alliances Program
- Over $1,000,000 p.a. budget
- 20 core researchers, 30 contributors at different levels
- Access to the most advanced computer technology
- Close partners at UofC, ANL MCS, UCSC, UIUC
- Collaboration with LLNL, LANL, LBNL, Sandia, ORNL
- Links to MPA Garching, Arizona, Palermo, Torino

3 What Is the FLASH Code?
- Has a modern architecture
  - modular with interfaces
  - configurable with a parameter/variable database
- Is highly portable
  - across software platforms: most Unices, including Linux
  - across hardware platforms: MPI for intra- and inter-box communication; scalar cache-based systems
  - provides strong tests of software (operating system, compiler, language interoperability) and hardware (network, storage)
  - parallel I/O with HDF5 for large data sets
- Can solve a broad range of problems
  - adaptive mesh discretization in 3-D with PARAMESH
  - compressible hydrodynamics, MHD, SRHD, elliptic operators, explicit diffusion, complex EOS, particle tracking
- Future developments
  - extended framework through IBEAM (parallel solvers for large linear systems)
  - formal interface specification (for solvers and the mesh component)
  - front-tracking component
  - elements of the TSTT/CCA forums
  - interoperability with other AMR software
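
The parameter/variable database that makes the code configurable can be sketched as follows. This is a minimal illustration of the pattern, not FLASH's actual API; all class and method names here are hypothetical.

```python
class VariableDatabase:
    """Toy sketch of a parameter/variable database (names hypothetical).

    Modules register the solution variables they need at configuration
    time and read runtime parameters through one interface, so physics
    code never touches the mesh layout directly.
    """

    def __init__(self):
        self._vars = {}
        self._params = {}

    def register_variable(self, name, attributes=None):
        # attributes might flag a variable as advected, conserved, etc.
        self._vars[name] = {"attributes": attributes or {}, "data": None}

    def set_parameter(self, name, value):
        self._params[name] = value

    def get_parameter(self, name):
        return self._params[name]

    def put_data(self, name, data):
        self._vars[name]["data"] = data

    def get_data(self, name):
        return self._vars[name]["data"]


db = VariableDatabase()
db.register_variable("dens", {"advected": True, "conserved": True})
db.set_parameter("cfl", 0.8)
db.put_data("dens", [1.0, 2.0, 3.0])
```

A hydro module would then ask the database for "dens" rather than holding its own copy, which is what makes the modules interchangeable.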

4 Primary Applications
- X-ray bursts on neutron star surfaces
- Novae
- Type Ia supernovae
The common elements:
- The underlying stars are compact
- Members of close binary systems
- Physical processes: hydrodynamics, complex EOS, gravitation, nuclear burning
- Radiation hydrodynamics important at late times
- Initial conditions involve long timescales (implicit solve)
- Rapid evolution during the final event (explicit solve)

5 Primary Applications: Towards Understanding
- What is the environment for the explosion?
- How does it form?
- What happens during the explosion?
- Where are complex elements produced?
- How big is the Universe?
- How old is the Universe?

6 Primary Applications: Examples
- Helium burning on neutron stars
- R-M instability
- Laser-driven shock instabilities
- Magnetic and hydrodynamic Rayleigh-Taylor instabilities
- Flame-vortex interactions
- Wave breaking on white dwarfs
- Type Ia supernova
- Cellular detonation
- Landau-Darrieus instability

7 Primary Applications: Examples
- 3-D Rayleigh-Taylor instability
- Flame-vortex interactions
- Wave breaking on white dwarfs

8 Additional Applications
- Jeans instability
- Relativistic accretion onto NS
- Intracluster interactions
- Non-relativistic accretion onto BH

9 The FLASH 2+ Code: at a Glance
- General description: parallel block-structured adaptive mesh refinement code
- Solvers constantly developed
  - hyperbolic: hydrodynamics, MHD, and SRHD
  - elliptic: self-gravity
  - parabolic: thermal conduction
  - ODE: nuclear burning
- Architecture undergoes substantial changes
  - modular, fine-grained, SPMD (patch-based), efficient parallelism
  - cache-based scalar architectures (most of the market)
  - mostly F90 with elements in C; no language restrictions
- Testing: one of the finest and most mature elements
  - used to control development and prevent major design flaws
  - compile → run → compare, applied daily on several platforms, centralized database
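
The compare stage of the daily compile → run → compare cycle could look roughly like the sketch below: each run's integral quantities are checked against stored benchmark values within a tolerance. This is an illustration of the idea only; the function and quantity names are hypothetical, not FLASH's test harness.

```python
import math


def compare_to_benchmark(result, benchmark, rel_tol=1e-9):
    """Compare a run's output quantities against stored benchmark values.

    Returns a list of (key, actual, expected) tuples for every quantity
    that is missing or falls outside the relative tolerance; an empty
    list means the regression test passed.
    """
    failures = []
    for key, expected in benchmark.items():
        actual = result.get(key)
        if actual is None or not math.isclose(actual, expected, rel_tol=rel_tol):
            failures.append((key, actual, expected))
    return failures


benchmark = {"total_mass": 1.0, "total_energy": 2.5e4}
good_run = {"total_mass": 1.0 + 1e-12, "total_energy": 2.5e4}
bad_run = {"total_mass": 1.1, "total_energy": 2.5e4}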

10 The FLASH 2+ Code: Physics Modules
- Compressible hydro: PPM, WENO, Tadmor central-difference
- MHD: 2nd-order TVD
- SRHD: 2nd-order Godunov, Roe solver, R-K stepping
- Source terms: nuclear burning, with a variety of reaction networks
- Gravitational field: externally imposed; self-gravity (multipole, multigrid, single-level FFT)
- Diffusion: thermal conduction

11 The FLASH 2+ Code: Component Model
1. Meta-data (configuration info): variable/parameter registration; variable attributes; module requirements; role in driver (?)
2. Interface wrapper: exchange with variable database
3. Physics module(s): single-patch, single-processor functions written in any language; can be sub-classed
A FLASH application: driver + mesh + database + a collection of FLASH components
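
The wrapper layer in this component model can be sketched as below: the physics kernel works only on plain arrays, and the wrapper is the sole layer that exchanges data with the variable database. All names and the placeholder physics are hypothetical, for illustration only.

```python
# Meta-data a component might declare at configuration time (hypothetical).
CONFIG = {"variables": ["dens", "temp"], "requires": ["Materials/eos"]}


def burn_kernel(density, temperature, dt):
    """Single-patch physics kernel: plain arrays in, plain arrays out.

    Knows nothing about the mesh or the database, so it could be
    written in any language. The scaling here is a stand-in for a
    real burning update; dt is unused in this toy.
    """
    return [d * 0.99 for d in density], [t * 1.01 for t in temperature]


def burn_wrapper(db, dt):
    """Interface wrapper: the only layer that touches the database."""
    dens, temp = burn_kernel(db["dens"], db["temp"], dt)
    db["dens"], db["temp"] = dens, temp


# A dict stands in for the variable database of the previous slides.
db = {"dens": [1.0, 2.0], "temp": [1.0e8, 2.0e8]}
burn_wrapper(db, dt=1e-6)
```

Because the kernel signature carries no framework types, swapping a different burning network in behind the same wrapper leaves the rest of the application untouched.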

12 The FLASH 2+ Code: Application Builder
- Modules: Gravity, Source Terms, Materials, Hydro, Particles, I/O, Vis, MHD
- The configuration tool (setup) assembles the selected modules with the Database, Mesh, and Driver
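
One core job of such a setup tool is resolving module dependencies: the user asks for a few physics modules, and the tool pulls in everything they require. A minimal sketch of that step, with a hypothetical requirements table (not FLASH's actual setup logic):

```python
# Hypothetical module-requirements table for illustration.
MODULE_REQUIREMENTS = {
    "hydro": ["mesh", "database"],
    "gravity": ["mesh", "database", "solvers"],
    "io": ["database"],
    "solvers": ["mesh"],
    "mesh": [],
    "database": [],
}


def resolve_modules(requested):
    """Return the requested modules plus everything they transitively require."""
    selected, stack = set(), list(requested)
    while stack:
        mod = stack.pop()
        if mod not in selected:
            selected.add(mod)
            stack.extend(MODULE_REQUIREMENTS.get(mod, []))
    return sorted(selected)


modules = resolve_modules(["hydro", "io"])
```

After resolution, a real tool would go on to generate the build configuration (Makefile fragments, registered variables) for exactly this module set.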

13 The FLASH 2+ Code: Application Example
- Driver.f90: evolve(); timeStep()
- Evolve.f90: hydro3D(); burn()
- Burn.f90: burner(arglist)
- Hydro3d.f90: computeRHS(arglist); eulerStep(arglist)
- Kernels: PPM.f77, NetInt.f77, EulerStep.c
- AMR library (PARAMESH); Variable Database sets solution-variable descriptions in the database
- Data flow: class accessor methods, global memory, parameter lists
- Framework: standard interfaces; physics modules easily interchanged
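
The evolve()/timeStep() pairing in the driver can be sketched as a loop that asks for a stable step size and then applies the operator-split physics updates. This is a toy illustration with hypothetical names, using a simple CFL-style estimate in place of FLASH's actual timeStep().

```python
def time_step(velocity, dx, cfl=0.8):
    """Hypothetical CFL-style step estimate for an explicit solver.

    In the real code each physics module contributes a limit and the
    driver takes the minimum; here a single advection limit stands in.
    """
    vmax = max(abs(v) for v in velocity)
    return cfl * dx / vmax


def evolve(velocity, dx, t_end):
    """Driver loop: estimate dt, advance the operator-split updates."""
    t, steps = 0.0, 0
    while t < t_end:
        dt = min(time_step(velocity, dx), t_end - t)
        # hydro3D(), burn(), ... would update the state here.
        t += dt
        steps += 1
    return t, steps


t_final, n_steps = evolve([1.0, -2.0, 0.5], dx=0.1, t_end=1.0)
```

Clamping the last step to t_end - t, as above, is the usual way an explicit driver lands exactly on the requested end time.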

14 Structure of FLASH Modules
- Mesh: init(), fill_guardcells(), test_refinement(), refine_derefine()
- Hydro: init(), tstep(), hydro3d(); Explicit/Implicit; Split (PPM) / Unsplit (WENO)
- Gravity: init(), tstep(), grav3d(); constant, point_mass, Poisson (Multigrid, Multipole)
- Source_terms: init(), tstep(), src_terms(); burn (e.g. iso13), cool, heat
- Materials: eos3d(), eos1d(), eos(); Gamma, Helmholtz, ...
- Variable Database: dBase_init(), dBaseGetData(), dBasePutData(), dBaseProperty()
- Particles: init(), advance()
- Visualization: init(), render()
- IO: init(), write()
- MHD: init()
- Driver: init(), dBaseGetData(), dBasePutData(), dBaseProperty()
- Diffusion: init(), fill_guardcells(), test_refinement(), refine_derefine()

15 The FLASH 2+ Code: Directory Structure
- driver: steady; time dep (rk3, strang)
- source terms: stir, burn (aprox13, ppcno), heat, cool
- io: HDF4.0, HDF5.0, f77_unf
- gravity: planpar, constant, poisson
- mhd: 2nd-order TVD
- hydro: implicit; explicit (split: PPM, MUSCL; unsplit: delta form, PPM, Kurganov, WENO)
- materials: composition; eos (gamma, helmholtz, nadozhin)
- particles: passive, active
- solvers: poisson (multigrid, multipole)
- runtime visualization: PVTK (scripted, compiled)
- mesh: AMR (paramesh), uniform; MPI; mesh database

16 The FLASH 2+ Code: Additional Features
- External libraries
  - MPI for parallel execution
  - PARAMESH for adaptive discretization
  - HDF5 for efficient I/O
  - pVTK for remote visualization
- External tools
  - Python for configuration
  - gmake for code compilation
- http://flash.uchicago.edu
  - available with no major restrictions
  - looking to expand the user base
  - support with short response time

17 Summary
- FLASH aspires to become a community code
- FLASH is freely available
- Major emphases: performance, portability, testing, usability, support (secured for at least the next 5 years)
- Interest, skill, and organization guarantee success
- Future FLASH: implicit hydro solvers; front-tracking mesh component; solver and mesh interfaces; FLASH component model; FLASH developer's guide

18 The FLASH Code: From Design to Applications
Questions and Discussion

19 The FLASH 2+ code
