
PARAMESH: A PARALLEL, ADAPTIVE GRID TOOL FOR THE SPACE SCIENCES Kevin Olson (NASA/GSFC and GEST/Univ. of MD, Baltimore) Presented at the AISRP PI Meeting, April 2005, NASA/Ames Research Center

COLLABORATORS Peter MacNeice (NASA/GSFC and Drexel U.), Joan Centrella (NASA/GSFC), Don Lamb (U. of Chicago). Other developers include: C. Mobarry, R. DeFainchtein, M. Gehmeyer, M. Bhat, C. Packer, M. Rilee, J. van Meter, D. Choi (NASA/GSFC); R. DeVore (NRL); M. Zingale, J. Dursi, K. Riley, A. Siegel, T. Linde, D. Sheeler (U. Chicago). Initial funding provided by NASA/ESTO-CT.

TALK OUTLINE I. AN OVERVIEW OF PARAMESH II. AISRP GOALS III. APPLICATIONS OF PARAMESH IV. PROGRESS TOWARD GOALS

An Overview of PARAMESH

PARAMESH: what is it?
● A package designed to ease the task of adding parallelization and dynamic, adaptive mesh refinement (AMR) to an already existing uniform-mesh, serial code
● A library of subroutines and accessible data structures
● The basic parallelization and AMR tool for several important space science applications
● Written in Fortran90 (NAG, Lahey, Intel, Portland Group, HP-Compaq, IBM, and SGI compilers)
● Interprocessor communication using MPI
● Version 3.3 (beta) released January 2005; version 3.3 (stable) and 3.4 (beta) to be released in June
● PARAMESH web site:
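Because PARAMESH is a library rather than a framework, applications call its subroutines from their own driver. The sketch below shows a typical call sequence; the amr_* names follow the PARAMESH documentation, but the argument lists are simplified and advance_solution/set_refine_flags are hypothetical user-supplied routines, so treat this as a sketch rather than working code.

    ! Schematic driver for a PARAMESH-based application (a sketch; the
    ! amr_* names follow the PARAMESH documentation, but argument lists
    ! are simplified, and advance_solution/set_refine_flags are
    ! hypothetical user-supplied routines).
    program amr_driver
      implicit none
      include 'mpif.h'
      integer :: mype, ierr, istep

      call amr_initialize()               ! set up the tree and blocks (and MPI)
      call MPI_Comm_rank(MPI_COMM_WORLD, mype, ierr)

      do istep = 1, 100
         call amr_guardcell(mype, 1, 2)   ! fill guard cells from neighbor blocks
         call advance_solution()          ! user-supplied physics update
         call amr_flux_conserve(mype, 1)  ! match fluxes at refinement jumps
         call set_refine_flags()          ! user marks blocks to refine/derefine
         call amr_refine_derefine()       ! adapt the mesh and rebalance blocks
      end do

      call amr_close()                    ! shut down PARAMESH (and MPI)
    end program amr_driver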

PARAMESH implements a subset of the Berger-Oliger block-adaptive scheme: the computational volume is recursively bisected into 'blocks', forming a tree data structure.
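In practice such a tree can be held in flat, per-block arrays. The sketch below follows the spirit of the tree metadata described in the PARAMESH documentation; the exact array names and shapes here should be read as assumptions, not the library's actual declarations.

    ! A sketch of per-block tree metadata for a block-adaptive mesh
    ! (names follow the PARAMESH documentation in spirit; exact shapes
    ! are assumptions). Each link stores a (local block id, processor id)
    ! pair so the tree can span processors.
    module tree_sketch
      implicit none
      integer, parameter :: maxblocks = 1000  ! per-processor block limit
      integer, parameter :: mchild = 8        ! 2**ndim children in 3-D
      integer, parameter :: mfaces = 6        ! face neighbors in 3-D
      integer :: parent(2, maxblocks)         ! link to parent block
      integer :: child(2, mchild, maxblocks)  ! links to child blocks
      integer :: neigh(2, mfaces, maxblocks)  ! links to same-level neighbors
      integer :: lrefine(maxblocks)           ! refinement level of each block
      integer :: nodetype(maxblocks)          ! leaf, parent, or ancestor
      real    :: coord(3, maxblocks)          ! block-center coordinates
      real    :: bsize(3, maxblocks)          ! physical block size
    end module tree_sketch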

Blocks are ordered and distributed to processors using a space-filling curve, as in particle tree codes.
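PARAMESH uses a Morton-style ordering. A minimal sketch of the idea: interleaving the bits of a block's integer coordinates yields a one-dimensional key, and sorting blocks by key and cutting the sorted list into nearly equal contiguous pieces gives each processor a set of spatially nearby blocks. (Illustrative only; the library's actual ordering code differs in detail.)

    ! Minimal Morton-key sketch: interleave the bits of the integer block
    ! coordinates (ix, iy, iz) into a single key. Sorting blocks by this
    ! key traces a space-filling curve through the mesh.
    function morton_key(ix, iy, iz, nbits) result(key)
      implicit none
      integer, intent(in) :: ix, iy, iz   ! integer block coordinates
      integer, intent(in) :: nbits        ! bits per coordinate (<= 10 here)
      integer :: key, b
      key = 0
      do b = 0, nbits - 1                 ! interleave one bit at a time
         key = ior(key, ishft(ibits(ix, b, 1), 3*b    ))
         key = ior(key, ishft(ibits(iy, b, 1), 3*b + 1))
         key = ior(key, ishft(ibits(iz, b, 1), 3*b + 2))
      end do
    end function morton_key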

Each block is a logically Cartesian, uniform mesh of cells. Each cell in a block can store user-specified data at cell centers, corners, edges, or faces. [Figures: a 2-D block of cells; a single mesh cell in 3-D]
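Per-block solution storage then reduces to ordinary Fortran arrays padded with guard cells. The declarations below sketch that layout with made-up block dimensions; the real library keeps analogous face-, edge-, and corner-centered arrays alongside the cell-centered one.

    ! Sketch of per-block storage with guard cells (block dimensions and
    ! array names here are assumptions, not PARAMESH's actual layout).
    module block_sketch
      implicit none
      integer, parameter :: nvar = 5                   ! variables per cell
      integer, parameter :: nxb = 8, nyb = 8, nzb = 8  ! interior cells per block
      integer, parameter :: ng = 2                     ! guard-cell layers per side
      integer, parameter :: maxblocks = 500
      ! Cell-centered data, padded with guard cells on every side:
      real :: unk(nvar, 1-ng:nxb+ng, 1-ng:nyb+ng, 1-ng:nzb+ng, maxblocks)
      ! Face-centered data on x-faces (one extra point in x):
      real :: facex(nvar, 1-ng:nxb+ng+1, 1-ng:nyb+ng, 1-ng:nzb+ng, maxblocks)
    end module block_sketch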

Support for consistent fluxes at jumps in refinement, ensuring conservation for finite-volume schemes. Support for averaging data at cell edges to ensure consistent circulation integrals around cell faces.
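The flux-consistency idea can be stated in a few lines: where a coarse block face abuts refined neighbors, the flux the coarse block uses is replaced by the average (area-weighted, in general) of the fluxes its fine neighbors computed on the same face, so the same amount leaves one side as enters the other. A 2-D sketch, not PARAMESH's actual routine:

    ! Flux matching at a refinement jump, 2-D sketch: each coarse face
    ! segment is covered by two fine faces, so the conservative coarse
    ! flux is the average of the two fine fluxes.
    subroutine match_coarse_flux(flux_fine, flux_coarse, n)
      implicit none
      integer, intent(in)  :: n               ! coarse cells along the face
      real,    intent(in)  :: flux_fine(2*n)  ! fine fluxes, two per coarse cell
      real,    intent(out) :: flux_coarse(n)
      integer :: j
      do j = 1, n
         flux_coarse(j) = 0.5 * (flux_fine(2*j-1) + flux_fine(2*j))
      end do
    end subroutine match_coarse_flux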

AISRP GOALS

AISRP Goals
● Extend and improve PARAMESH
– Parallel I/O (HDF5, NetCDF) and I/O formats for graphics packages such as ChomboVis and Tecplot
– C interface
– Improved support for multigrid and linear-system solvers
– Improved divergence-of-B control
– Improved user interface for interpolation
– Improved support for non-Cartesian coordinate systems

AISRP Goals
● Effective open source development
– Develop coding standards
– Improve automatic testing procedure
– Self-documenting comments using 'Robodoc'
– Developers' Guide
– System for bug and feature-request tracking
– More formal release and patching strategy
● Integrate new versions of PARAMESH into actual, working space science applications

SOME SPACE SCIENCE APPLICATIONS USING PARAMESH

NRL AMRMHD3D R. DeVore (NRL), P. MacNeice, K. Olson (NASA/GSFC)
Solves the equations of MHD (DeVore, 1991). This is the code for which PARAMESH was originally developed. Used for solar physics applications. Numerical schemes: FCT with constrained transport for the MHD, and multigrid for an implicit formulation of non-linear thermal diffusion.

CORONAL MASS EJECTIONS (P. MacNeice et al., Drexel U. and NASA/GSFC)

General Relativity, HAHNDOL J. Centrella, D. Choi, B. Imbiriba, J. Baker, D. Fiske, & J. van Meter (NASA/GSFC), D. Brown, L. Lowe (N.C. State)
Solves the Einstein equations. Goal: to simulate the gravitational waves produced by collisions of super-massive black holes, in order to help interpret data from the LISA mission (to be launched in 2011). Numerical schemes: multigrid, finite difference. A major user of the Columbia system.

Gravitational Wave Propagation (J. van Meter, NASA/GSFC; movie by C. Henze, NASA/Ames)

FLASH ASTROPHYSICS CODE FLASH code team (Fryxell et al. 2000, ApJS, 131, 273)
Implements various CFD schemes, MHD, nuclear reactions, stellar equations of state, and self-gravity using multigrid. Designed to model astrophysical thermonuclear 'flashes' (X-ray bursts, novae, and Type Ia supernovae). Awarded the 2000 Gordon Bell Prize (0.25 Tflops on 6,420 processors of ASCI 'Red').

Simulation of an X-ray burst due to a detonation in the He atmosphere of a neutron star. (Mike Zingale, U.C. Santa Cruz)

Type Ia Supernova due to an off-center explosion in a white dwarf. (T. Plewa, A. Calder, and D. Lamb, U. of Chicago)

Other Space Science Applications
● CASIM (M. Benna and P. Mahaffy at GSFC) – MHD application for modeling the comet-solar wind interaction
● YDFCT (D. Odstrcil at NOAA) – MHD application for modeling multiple interacting CMEs, integrated into the CCMC
● ZeusAMR (W. Abbett et al. at U.C. Berkeley) – Combination of the Zeus MHD code and PARAMESH for modeling magnetic flux emergence from the Sun
● IBEAM (D. Swesty et al. at UIUC/NCSA and SUNY-SB) – Modern astrophysics framework, radiation hydrodynamics for modeling gamma-ray burst fireballs
● Plus others; the list continues to grow!

PROGRESS TOWARD GOALS

Progress Toward Goals
● Parallel I/O checkpointing capability added to PARAMESH using HDF5 (version 3.3); a sketch of this style of I/O follows below
● Capability added to write files which can be viewed with the ChomboVis graphics and analysis package (version 3.3)
● C interface work begun
● Divergence-of-B control working and in the currently released version; its user interface is being improved
● Non-Cartesian coordinates are working in tests now and will be released in a beta version this coming fall
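To make the first item concrete, here is a minimal parallel-HDF5 checkpoint using the standard HDF5 Fortran API, in which every rank writes its slice of one shared file. The file name, dataset name, and layout are hypothetical, not PARAMESH's actual checkpoint format.

    ! Minimal parallel-HDF5 checkpoint sketch (standard HDF5 Fortran API;
    ! the file name, dataset name, and layout are hypothetical, not
    ! PARAMESH's actual format). Each rank writes its own hyperslab of a
    ! single shared dataset.
    program checkpoint_sketch
      use hdf5
      implicit none
      include 'mpif.h'
      integer, parameter :: nblk = 4        ! data items per rank (assumed)
      integer :: mype, nprocs, ierr
      integer(hid_t) :: fapl, file_id, filespace, memspace, dset
      integer(hsize_t) :: dims(1), count(1), offset(1)
      real(kind=8) :: blockdata(nblk)

      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, mype, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      blockdata = real(mype, 8)             ! stand-in for real block data

      call h5open_f(ierr)
      call h5pcreate_f(H5P_FILE_ACCESS_F, fapl, ierr)
      call h5pset_fapl_mpio_f(fapl, MPI_COMM_WORLD, MPI_INFO_NULL, ierr)
      call h5fcreate_f('checkpoint.h5', H5F_ACC_TRUNC_F, file_id, ierr, &
                       access_prp=fapl)

      dims(1) = int(nblk*nprocs, hsize_t)   ! global dataset size
      call h5screate_simple_f(1, dims, filespace, ierr)
      call h5dcreate_f(file_id, 'blocks', H5T_NATIVE_DOUBLE, filespace, &
                       dset, ierr)

      offset(1) = int(nblk*mype, hsize_t)   ! this rank's slice of the file
      count(1)  = int(nblk, hsize_t)
      call h5sselect_hyperslab_f(filespace, H5S_SELECT_SET_F, offset, &
                                 count, ierr)
      call h5screate_simple_f(1, count, memspace, ierr)
      call h5dwrite_f(dset, H5T_NATIVE_DOUBLE, blockdata, count, ierr, &
                      mem_space_id=memspace, file_space_id=filespace)

      call h5dclose_f(dset, ierr)
      call h5sclose_f(filespace, ierr)
      call h5sclose_f(memspace, ierr)
      call h5pclose_f(fapl, ierr)
      call h5fclose_f(file_id, ierr)
      call h5close_f(ierr)
      call MPI_Finalize(ierr)
    end program checkpoint_sketch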

Progress Toward Goals
● PARAMESH is being developed as an open source project through the sourceforge.net web site; we are actively seeking developers!
● Sourceforge.net is used for bug and feature-request tracking
● Robodoc documentation is partially complete and will form a portion of the developers' guide
● A release policy has been established for patches and major releases
● The developers' guide is under development

Progress Toward Goals
● HAHNDOL is using the latest version of PARAMESH
● FLASH 3.0 development has begun and will incorporate the latest version of PARAMESH

CONCLUSIONS Parallel, adaptive mesh refinement has wide applicability in the space sciences. PARAMESH provides a useful and flexible tool for adding parallel AMR to a wide variety of applications, allowing the efficient solution of 'real' problems. We are making good progress toward the goals we promised for the AISRP project.