Center for Magnetic Reconnection Studies: The Magnetic Reconnection Code within the FLASH Framework. Timur Linde, Leonid Malyshkin, Robert Rosner, and Andrew Siegel.

Center for Magnetic Reconnection Studies: The Magnetic Reconnection Code within the FLASH Framework. Timur Linde, Leonid Malyshkin, Robert Rosner, and Andrew Siegel, University of Chicago. June 5, 2003, Princeton, NJ.

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
Overview
- The FLASH project in general
- FLASH's role in Magnetic Reconnection Code (MRC) development

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
What is FLASH? What is MRC?
- Initially: an AMR code for astrophysics problems on ASCI machines (compressible hydro + burning)
- FLASH evolved into two things:
  - A more general application code
  - A framework for building/hosting new problems
- FLASH physics modules + FLASH framework = FLASH application code
- Hall MHD modules + FLASH framework = Magnetic Reconnection Code (see the sketch below)
Next:
- What physics modules does FLASH contain?
- What services does the FLASH framework contain?
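To make the "physics modules + FLASH framework = application code" equation concrete, here is a minimal sketch, in Python (one of FLASH's implementation languages), of a framework assembling registered physics modules into an application. The names (PhysicsModule, HallMHD, Framework) are hypothetical illustrations, not FLASH's actual API, whose module system is Fortran 90 based.

    # Illustrative sketch only; names are hypothetical, not FLASH's API.
    class PhysicsModule:
        """A unit of physics (e.g. hydro, Hall MHD) that the framework can host."""
        name = "base"
        def advance(self, state, dt):
            raise NotImplementedError

    class HallMHD(PhysicsModule):
        name = "hall_mhd"
        def advance(self, state, dt):
            # ... update fields by one Hall MHD step (omitted) ...
            return state

    class Framework:
        """Framework services shown here: module registry and time stepping."""
        def __init__(self):
            self.modules = []
        def register(self, module):
            self.modules.append(module)
        def evolve(self, state, dt, nsteps):
            for _ in range(nsteps):
                for m in self.modules:
                    state = m.advance(state, dt)
            return state

    # "Hall MHD modules + FLASH framework = Magnetic Reconnection Code"
    mrc = Framework()
    mrc.register(HallMHD())
    final_state = mrc.evolve(state={}, dt=1e-3, nsteps=10)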

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
FLASH breakdown
- Physics modules: (in)compressible hydro, relativistic hydro/MHD, resistive MHD, 2-D Hall MHD, (nuclear) reaction networks, time-dependent ionization, various equations of state, particles, self-gravity, Boltzmann transport, subgrid models, front tracking
- Framework: block-structured AMR (PARAMESH), parallel I/O (HDF5; see the sketch below), runtime visualization (pVTK), runtime performance monitoring (PAPI), generic linear solvers tied to the mesh, syntax/tool for building new solvers
- Code support (public, web-based):
  - flash_test
  - flash_benchmark
  - coding standard verification
  - bug/feature tracker
  - user support schedule
  - download:
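As a small illustration of one framework service (HDF5 output), the sketch below writes a block-structured checkpoint with h5py. The dataset names and layout are invented for illustration; they are not FLASH's actual checkpoint format.

    # Illustrative only: dataset names/layout are hypothetical, not FLASH's format.
    import numpy as np
    import h5py

    nblocks, nx, ny = 4, 8, 8
    density = np.random.rand(nblocks, nx, ny)   # one nx-by-ny patch per AMR block
    refine_level = np.array([1, 2, 2, 3])

    with h5py.File("checkpoint_0000.h5", "w") as f:
        f.attrs["time"] = 0.0
        f.attrs["nstep"] = 0
        f.create_dataset("density", data=density)
        f.create_dataset("refine_level", data=refine_level)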

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
General features of FLASH
- Three major releases over four years
- 300,000+ lines (F90 / C / Python)
- Good performance
  - Scalable on ASCI machines to 5K processors
  - Gordon Bell prize (2000)
- Emphasis on portability and interoperability
  - Standardization of AMR output format, data sharing via CCA
- Flash 2.3: new release, scheduled June 1, 2003
  - Optimized multigrid solver
  - Significant improvements in documentation
  - Ported to Compaq Tru64
  - 2-D runtime visualization
  - Optimized uniform grid
  - Support for different mesh geometries
  - FFT on uniform grid
  - Optimized multigrid on uniform grid
  - PARAMESH 3.0
  - Parallel NetCDF I/O module
  - Implicit diffusion
- Flash 2.4: final 2.x version (Sept 2004)

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
FLASH foci
- Four initial major emphases:
  - Performance
  - Testing
  - Usability
  - Portability
- Later progress in extensibility/reuse (Flash 3.x):
  - Generalized mesh variable database (see the sketch below)
  - FLASH component model
  - FLASH Developer's Guide
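The "generalized mesh variable database" refers to modules accessing solution variables by name rather than by hard-coded array indices. The sketch below is a hypothetical Python illustration of that idea, not the actual Flash 3.x interface.

    # Hypothetical sketch of name-based access to mesh variables (not FLASH's API).
    import numpy as np

    class MeshVariableDB:
        def __init__(self, block_shape):
            self._vars = {}
            self._shape = block_shape
        def add_variable(self, name):
            self._vars[name] = np.zeros(self._shape)
        def get(self, name):
            return self._vars[name]      # module code never touches raw indices
        def put(self, name, data):
            self._vars[name][...] = data

    db = MeshVariableDB(block_shape=(8, 8))
    db.add_variable("dens")
    db.put("dens", np.ones((8, 8)))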

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
The future of Flash
- Take this a step further: identify the "actors"
  - A. End users: run an existing problem
  - B. Module/problem contributors: use the database/module interface but are unaware of FLASH internals
  - C. FLASH developers: work on general framework issues, utility modules, performance, portability, etc., according to the needs of astrophysics and (laboratory) code validation
- Flash development has successively focused on these three areas:
  - Flash 1.x: emphasis on A
  - Flash 2.x: expand emphasis to B
  - Flash 3.x: expand emphasis to C
- Note:
  - Application scientists lean toward A and B; programmers/software engineers lean toward C; computer scientists can be involved at any level
  - Everybody contributes to the design process; the software architect must make the final decisions on how to implement the plan

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
FLASH and CMRS
- Follows the typical pattern of FLASH collaborations
  - Prototyping, testing, and results initially external to FLASH if desired
  - Iowa AMR-based Hall MHD (Kai Germaschewski)
  - No "commitment" to FLASH
- Interoperability strategy agreed upon; the answers lead to a component model (see the interface sketch below):
  - How are solvers packaged?
  - What data structures are used?
  - What operations must the mesh support?
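A hedged sketch of what answers to those three questions could look like as a component contract: a solver is packaged against an abstract mesh interface, so that either the FLASH mesh or the Iowa mesh could stand behind it. The names below are illustrative assumptions, not an agreed CCA or FLASH interface.

    # Illustrative component-style contract (names hypothetical).
    from abc import ABC, abstractmethod

    class Mesh(ABC):
        """Operations a packaged solver is allowed to assume any mesh provides."""
        @abstractmethod
        def block_data(self, varname): ...
        @abstractmethod
        def fill_guard_cells(self, varname): ...
        @abstractmethod
        def restrict(self, varname): ...   # fine -> coarse
        @abstractmethod
        def prolong(self, varname): ...    # coarse -> fine

    class EllipticSolver(ABC):
        """How a solver is packaged: it talks only to the Mesh contract."""
        @abstractmethod
        def solve(self, mesh, rhs_var, soln_var): ...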

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
CMRS/Flash strategy
- Move portable components between FLASH and the local framework as needs warrant
- People strategy:
  - The FLASH developer leading the FLASH single-fluid MHD work (Timur Linde) leads the Chicago MRC development
  - CMRS supports a postdoctoral fellow (Leonid Malyshkin) fully engaged in developing/testing the MRC
  - We also support a new graduate student (Claudio Zanni, U. Torino) working on the MRC and its extensions
- Science strategy:
  - The immediate target of our efforts is reconnection
  - Specifically: what is the consequence of relaxing the "steady state" assumption of reconnection? Can one have fast reconnection in time-dependent circumstances under conditions in which steady reconnection cannot occur? (See the scaling note below.)
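For background on why "fast" versus "steady" matters (context added here, not from the slide): in steady resistive Sweet-Parker reconnection the inflow rate scales as

    $v_{\rm in}/v_A \sim S^{-1/2}$, where $S = L v_A / \eta$ is the Lundquist number,

which is far too slow at the very large $S$ of astrophysical plasmas; the question above is whether time dependence (together with Hall physics) yields a rate that is effectively independent of $S$.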

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
Using FLASH
Some advantages of FLASH:
- Tested nightly
- Constantly ported to new platforms
- I/O optimized independently
- Visualization developed independently
- Documentation manager
- User support
- Bug database
- Performance measured regularly
- AMR (tested/documented independently)
- Coding standards enforcement scripts
- Debugged frequently (lint, forcheck)
- Sophisticated versioning and repository management
- Possible interplay with other physics modules (particles, etc.)

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
Where are we now?
- We have a working 3-D resistive/viscous AMR MHD code
  - It has already been used by R. Fitzpatrick in his study of compressible reconnection
- MRC v1.0 exists
  - FLASH and 2-D Hall MHD have been joined and are being tested
  - This required elliptic solves for the Helmholtz and Poisson equations (i.e., multigrid; a generic sketch follows below)
  - Based on reusable components
  - This was done by importing the Iowa Hall MHD code as a "module" but using our own Poisson and Helmholtz solvers; hence we solve exactly the same equations as the Iowa "local framework"
  - We are now running comparisons of the MRC against the Iowa Hall MHD code
- The next steps are:
  - Inclusion of full 3-D Hall MHD, again implemented in a staged manner (almost completed)
  - More flexible geometry: cylindrical, toroidal
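Since the coupled code needed Poisson and Helmholtz solves via multigrid, the following is a minimal sketch of a textbook geometric multigrid V-cycle for the 1-D Poisson problem -u'' = f with homogeneous Dirichlet boundaries. It is a generic illustration in Python/NumPy, not the FLASH or MRC solver.

    # Generic geometric multigrid V-cycle for -u'' = f on (0,1), u(0)=u(1)=0.
    # Illustration only; not the FLASH/MRC multigrid module.
    import numpy as np

    def smooth(u, f, h, sweeps=3, omega=2.0/3.0):
        """Weighted-Jacobi relaxation on interior points."""
        for _ in range(sweeps):
            un = u.copy()
            un[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (
                u[:-2] + u[2:] + h * h * f[1:-1])
            u = un
        return u

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
        return r

    def restrict(r):
        """Full-weighting restriction: 2m+1 fine points -> m+1 coarse points."""
        return np.concatenate(([0.0],
            0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2], [0.0]))

    def prolong(uc, n_fine):
        """Linear interpolation: m+1 coarse points -> 2m+1 fine points."""
        uf = np.zeros(n_fine)
        uf[::2] = uc
        uf[1::2] = 0.5 * (uc[:-1] + uc[1:])
        return uf

    def v_cycle(u, f, h):
        if u.size == 3:                    # coarsest grid: exact solve
            u[1] = 0.5 * h * h * f[1]
            return u
        u = smooth(u, f, h)                # pre-smoothing
        r = residual(u, f, h)
        ec = v_cycle(np.zeros((r.size - 1) // 2 + 1), restrict(r), 2 * h)
        u = u + prolong(ec, u.size)
        return smooth(u, f, h)             # post-smoothing

    # Usage: solve -u'' = pi^2 sin(pi x); the exact solution is sin(pi x).
    n = 65
    x = np.linspace(0.0, 1.0, n)
    f = np.pi ** 2 * np.sin(np.pi * x)
    u = np.zeros(n)
    for _ in range(10):
        u = v_cycle(u, f, x[1] - x[0])
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())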

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
Concluding remarks
- Code emphases:
  - Standards of interoperability
    - Simple: common I/O formats, so postprocessing tools can be reused (see the reader sketch below)
    - More complex: reusing solvers from one meshing package in another (libAMR, Colella)
    - More complex: a standard interface for meshing packages
  - Robustness, performance, portability, ease of use
- Science emphases:
  - The focus is on an astrophysically interesting and central problem
  - The problem is also highly susceptible to laboratory verification
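As a tiny example of the "common I/O formats" point: any postprocessing tool can read back the illustrative HDF5 checkpoint sketched earlier with a few lines of h5py (the dataset names are the made-up ones used above, not a standardized layout).

    import h5py

    with h5py.File("checkpoint_0000.h5", "r") as f:
        print("time =", f.attrs["time"])
        density = f["density"][...]       # (nblocks, nx, ny) array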

Center for Magnetic Reconnection Studies (Univ. of Chicago branch)
… which brings us to: questions and discussion