Presentation transcript:

Inner Workings and Architecture of FLASH
Anshu Dubey & Katie Antypas, June 4, 2006
The Center for Astrophysical Thermonuclear Flashes, an Advanced Simulation & Computing (ASC) Academic Strategic Alliances Program (ASAP) Center at The University of Chicago

Overview
- What is FLASH
- Basics of FLASH architecture
  - Similarities and differences between versions 2 and 3
  - Units
  - Inheritance
  - Naming
- Basics behind a problem setup
- Enough of an idea of these concepts to be able to look at a sample setup and understand what is happening

The FLASH Code
Applications: cellular detonation, compressed turbulence, helium burning on neutron stars, Richtmyer-Meshkov instability, laser-driven shock instabilities, nova outbursts on white dwarfs, Rayleigh-Taylor instability, flame-vortex interactions, gravitational collapse/Jeans instability, wave breaking on white dwarfs, Orszag-Tang MHD vortex, Type Ia supernovae, intracluster interactions, magnetic Rayleigh-Taylor; shortly: relativistic accretion onto neutron stars.
The FLASH code:
1. Parallel, adaptive-mesh simulation code
2. Designed for compressible reactive flows
3. Has a modern, CS-influenced architecture
4. Can solve a broad range of (astro)physics problems
5. Portable: runs on many massively parallel systems
6. Scales and performs well
7. Is available on the web

What FLASH Provides
- Physics
  - Hydrodynamics: PPM, MHD, relativistic PPM
  - Equation of state: ideal gas, multigamma, Helmholtz
  - Nuclear physics
  - Gravity
  - Cosmology
  - Particles
- Infrastructure
  - Setup
  - Mesh management: AMR (Paramesh), UG (uniform grid)
  - Parallel I/O: HDF5, PnetCDF
  - Monitoring
  - Runtime and post-processing visualization
  - Regular testing toolkit
  - Unit test framework

A Little FLASH History
- FLASH0: Paramesh2, Prometheus, and EOS/Burn
- FLASH1: smoothing out the smash; first form of module architecture & inheritance
- FLASH2: untangle modules from each other (Grid); dBase; concept of levels of users
- FLASH3: stricter interface control & unit architecture; taming the database; better encapsulation and isolation; community-supported and -developed code

FLASH Audiences
- FLASH developer: works on just about everything (Grid development, data access, architecture)
- Application developer: develops a physics unit, publishes its API, uses other units' APIs
- Application programmer: selects units and their specific implementations; implements functions specific to an application if they are not provided by FLASH
- End user: initial conditions, boundary conditions, runtime parameters

FLASH Code Basics
- An application code composed of units/modules; particular modules are set up together to run different physics problems
- Performance, testing, usability, portability
- Fortran, C, Python, ...
  - More than 560,000* lines of code (75% code, 25% comments)
- Very portable
- Scales to tens of thousands of processors
* internal release

FLASH on BG/L: Sod Weak Scaling
[Plot: hydro time with constant work per processor using AMR; total time vs. number of processors.]
Source: K. M. Riley, Argonne National Laboratory

Hydro and Particles Scaling

Basic Computational Unit: the Block
- The grid is composed of blocks
- Most often all blocks have the same dimensions
- Blocks cover different fractions of the physical domain
- In AMR, blocks at different levels of refinement have different grid spacing
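As a purely illustrative sketch (the struct and its field names are invented for this transcript, not FLASH's actual data layout), a block amounts to a fixed-size array of cells plus metadata saying where it sits in the physical domain and in the refinement hierarchy:

  /* block_sketch.c: illustrative only; field names are hypothetical */
  #include <stdio.h>

  #define NXB  8     /* cells per block in x (fixed at compile time) */
  #define NYB  8     /* cells per block in y */
  #define NVAR 5     /* solution variables per cell (dens, pres, ...) */

  typedef struct {
      int    level;                   /* refinement level */
      double bbox_lo[2], bbox_hi[2];  /* physical bounding box of the block */
      double dx, dy;                  /* grid spacing; halves at each finer level */
      double data[NVAR][NYB][NXB];    /* cell-centered solution data */
  } Block;

  int main(void) {
      Block b = { .level = 3,
                  .bbox_lo = {0.0, 0.0}, .bbox_hi = {0.125, 0.125},
                  .dx = 0.125 / NXB,     .dy = 0.125 / NYB };
      printf("level %d block, dx = %g\n", b.level, b.dx);
      return 0;
  }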

FLASH Architecture: Four Cornerstones
- Setup tool: assembles the simulation
- Config files: tell setup how to assemble the simulation
- Driver: organizes interactions between units
- Unit architecture: API, inheritance, data management

What's a FLASH Unit?
- FLASH's basic architectural unit: a component of the FLASH code providing a particular functionality
- Different combinations of units are used for particular problem setups
- Publishes a public interface for other units' use
- Examples: Driver, Grid, Hydro, IO, etc.
- "Fake" inheritance by use of the directory structure
- Interaction between units is governed by the Driver
- Not all units are included in all applications

FLASH Units
- Infrastructure: Driver, Grid, Simulation, I/O, Runtime Parameters
- Monitoring: Logfile, Profiling
- Physics: Hydro, MHD, Burn, Gravity

Unit Architecture (example: the Grid unit)
- Top level: the API, called by the Driver
- Unit test
- Data module: block data, time step, etc.
- Wrapper
- Kernel

Unit Hierarchy
  Unit            (API / stubs)
    UnitMain      (common API implementation)
      Impl_1      (remaining API implementation, kernel)
      Impl_2      (remaining API implementation, kernel)
      Impl_3      (remaining API implementation)
      common      (more common implementation)
    UnitSomething (API implementation, kernel)

Inheritance Through Directories: Eos
- Eos (top level): stub implementations of the three functions Eos_init, Eos, and Eos_wrapped; there is only one subunit, EosMain
- Eos/EosMain: replaces the stubs with an implementation common to all formulations of EOS
- Eos/EosMain/Gamma: a specific implementation; implements the gamma-law versions of Eos_init and Eos
- Eos/EosMain/Multigamma: another implementation, which has its own Eos, Eos_init, etc.
(See the directory sketch below.)
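Laid out on disk, the slide's example corresponds to a tree like the following (the directory names come from the slide; the exact placement of each routine, e.g. Eos_wrapped under EosMain, is an assumption):

  Eos/            stubs for Eos_init, Eos, Eos_wrapped
    EosMain/      the only subunit; code common to all EOS formulations (e.g. Eos_wrapped)
      Gamma/      gamma-law implementations of Eos_init and Eos
      Multigamma/ its own Eos, Eos_init, etc.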

The Config File
- Declares solution variables and fluxes
- Declares runtime parameters and sets their defaults
- Lists required, requested, or exclusive units
- Config files are additive down the directory tree; there are no replacements
(An illustrative fragment follows below.)
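A hypothetical Config fragment, following the pattern the slide describes; the unit paths, variable names, and parameter names are made up, and the exact keyword syntax should be checked against the FLASH User's Guide:

  # units this problem needs
  REQUIRES Driver
  REQUIRES physics/Hydro
  REQUESTS IO

  # solution variables
  VARIABLE dens
  VARIABLE pres

  # runtime parameters and their defaults
  PARAMETER sim_rhoAmbient REAL 1.0
  PARAMETER sim_useShock   BOOLEAN FALSE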

FLASH Setup Script: Implements the Architecture
Python code links together the physics and tools needed for a problem:
- Parses Config files to
  - determine a self-consistent set of units to include
  - find out, if a unit has multiple implementations, which implementation to include
  - get the list of runtime parameters from the units
- Determines solution data storage
- Creates a file defining global constants set at build time
- Builds infrastructure for mapping runtime parameters to constants as needed
- Configures Makefiles properly

Pulling It All Together
- Choose a problem simulation
- Run setup to configure that problem; everything goes into a new top-level directory, 'object'
- Make
- Run, with flash.par supplying runtime parameters (defaults are already set by the particular units)
(A sketch of the workflow follows below.)
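A hypothetical session from the top of a FLASH3 source tree; the problem name, setup flag, and executable name are examples rather than prescriptions:

  ./setup Sod -auto     # assemble the 'object' directory for the Sod problem
  cd object
  make                  # build the executable
  # edit flash.par as needed, then run, e.g. on four processors:
  mpirun -np 4 ./flash3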

Adding a New Simulation
A basic problem setup contains:
- Config file: the required physics units
- flash.par: the default runtime parameter configuration
- Simulation_initBlock: initial conditions for the problem, set block by block
- Many other files are possible: Driver, refinement algorithms, user-defined boundary conditions
- Any files in the setup directory take precedence (see the sketch below)
More about setting up your own problem in the hands-on session later this afternoon.
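Concretely, a new problem directory might look like this; the SimulationMain location and the .F90 extension follow the usual FLASH3 conventions, and the problem name is made up:

  Simulation/SimulationMain/MyProblem/
    Config                     units and runtime parameters this problem requires
    flash.par                  default runtime parameter values
    Simulation_initBlock.F90   block-by-block initial conditions
    ...                        optional: custom Driver, refinement, or boundary conditions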

Driver Unit: the Provided Driver
Provided: second order, state form, Strang split.
FLASH2:
- flash.F90: initialize; loop over timesteps (evolve, adjust dt, output, visualize); end loop; finalize
- evolve.F90: set time step; do physics; set time step; repeat physics; Mesh_updateGrid
FLASH3:
- Flash.F90: Driver_initFlash; Driver_evolveFlash; Driver_finalizeFlash
- Driver_evolveFlash.F90: loop over timesteps (set time step; call physics; set time step; repeat physics; Grid_updateRefinement; adjust dt; output); end loop

FLASH Grids: Paramesh
- FLASH's original design was Paramesh-specific
- Block-structured: all blocks have the same dimensions; blocks at different refinement levels have different grid spacings and thus cover different fractions of the physical domain
- Fixed-size blocks, specified at compile time; every section of the code assumed knowledge of the block size
- Good for capturing shocks
- Currently removing the fixed-block-size requirement from the code to open the door for other mesh packages such as a uniform grid, a squishy grid, and a patch-based grid
In choosing Paramesh, the original FLASH code architects chose the simplicity of the Paramesh structure over a patch-based mesh.

Infrastructure: Mesh Packages in FLASH
- Paramesh: block-structured; fixed-size blocks specified at compile time; no more than one level jump at fine-coarse boundaries
- Uniform Grid: one block per processor; no AMR-related overhead

Building and Running an Application
Units assembled by setup: Driver, Grid (mesh), I/O, Runtime Inputs, Profiler, Physics, Simulation/setup.

I/O and Visualization Outline
- I/O challenges in scientific computing: speed, portability, file size; serial, parallel multi-file, MPI-IO
- I/O libraries: HDF5, parallel-netCDF
- FLASH I/O specifics: parameter input file, checkpointing and restarts, plotfiles
- Visualization: runtime visualization, fidlr3, flashView

I/O Challenges: Portability
- Big endian vs. little endian byte order
- Sizes of types vary (e.g. a 4-byte int vs. a 2-byte int, int vs. double)
- Data alignment in memory
Taking a basic binary output file from one machine and trying to use it on another often causes problems.
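A small C check, written for this transcript, makes the point concrete: the same source prints different type sizes and byte orders on different machines, so a raw binary dump is not portable between them.

  /* portability_check.c: why raw binary dumps are machine-dependent */
  #include <stdio.h>

  int main(void) {
      unsigned int x = 1;
      unsigned char *p = (unsigned char *)&x;

      printf("sizeof(int)    = %zu\n", sizeof(int));
      printf("sizeof(long)   = %zu\n", sizeof(long));
      printf("sizeof(double) = %zu\n", sizeof(double));
      /* if the first byte of x is 1, the machine is little endian */
      printf("byte order     = %s endian\n", p[0] == 1 ? "little" : "big");
      return 0;
  }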

I/O Challenges: File Size and Speed
- Files are huge: in FLASH a typical checkpoint file is ~5 GB, and up to 50 GB for a research simulation (plotfiles ~2 GB)
- Need access to large storage areas
- Data transfer takes a lot of time (~0.5 MB/s from the labs)
- I/O can be a major performance bottleneck: slow file systems; writing to disk takes time
It isn't hard to have speed, portability, or usability. It is hard to have speed, portability, and usability in the same implementation.

Serial I/O
- Each processor sends its data to the master, which then writes the data to a file
- Advantages: doesn't need a parallel file system; simple
- Disadvantages: not scalable; not efficient

Parallel I/O, Multi-file
- Each processor writes its own data to a separate file
- Advantages: fast!
- Disadvantages: can quickly accumulate many files; hard to manage; requires post-processing

Parallel I/O, Single-file
- Each processor writes its own data to the same file using an MPI-IO mapping
- Advantages: a single file; scalable (to how many processors is debatable...)
- Disadvantages: requires MPI-IO mapping or other higher-level libraries

Parallel I/O, Single File: Offsets
Each processor writes to a section of a data array. Each must know its offset from the beginning of the array and the number of elements to write.
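For illustration only (this is not FLASH's I/O code), the sketch below does that offset arithmetic with raw MPI-IO in C: each rank owns LOCAL_N doubles and writes them into one shared file at a byte offset determined by its rank.

  /* mpiio_offsets.c: each rank writes its slice of a global array to one shared file */
  #include <mpi.h>

  #define LOCAL_N 1024   /* elements owned by each rank (illustrative) */

  int main(int argc, char **argv) {
      int rank;
      double buf[LOCAL_N];
      MPI_File fh;
      MPI_Offset offset;

      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);

      for (int i = 0; i < LOCAL_N; i++)   /* fill with recognizable values */
          buf[i] = rank + i / (double)LOCAL_N;

      /* offset of this rank's section from the start of the file, in bytes */
      offset = (MPI_Offset)rank * LOCAL_N * sizeof(double);

      MPI_File_open(MPI_COMM_WORLD, "data.bin",
                    MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
      MPI_File_write_at_all(fh, offset, buf, LOCAL_N, MPI_DOUBLE,
                            MPI_STATUS_IGNORE);
      MPI_File_close(&fh);

      MPI_Finalize();
      return 0;
  }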

I/O Libraries
- FLASH works with two different I/O libraries: HDF5 and Parallel-NetCDF
- Both use MPI-IO mappings
- Both are portable libraries and can translate data between systems
- Scientific data are mostly stored in multidimensional arrays
- FLASH3 also supports basic direct Fortran I/O; use it only as a last resort!
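To illustrate the multidimensional-array model (a serial HDF5 sketch for brevity; the file name and dataset name are made up, and FLASH's own writers go through the parallel drivers instead), a 2-D array is written to a named dataset that any HDF5-aware machine can read back:

  /* hdf5_minimal.c: write a 2-D double array to a portable HDF5 file */
  #include <hdf5.h>

  #define NX 4
  #define NY 6

  int main(void) {
      double data[NX][NY];
      hsize_t dims[2] = {NX, NY};

      for (int i = 0; i < NX; i++)
          for (int j = 0; j < NY; j++)
              data[i][j] = i * NY + j;               /* sample values */

      hid_t file  = H5Fcreate("sample.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
      hid_t space = H5Screate_simple(2, dims, NULL);  /* dataspace: 4 x 6 */
      hid_t dset  = H5Dcreate2(file, "density", H5T_NATIVE_DOUBLE, space,
                               H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

      H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

      H5Dclose(dset);
      H5Sclose(space);
      H5Fclose(file);
      return 0;
  }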

FLASH I/O
- FLASH can use either HDF5 or Parallel-NetCDF for I/O
- We were initially very optimistic about PnetCDF because it appeared to be roughly twice as fast as HDF5
Source: J. B. Gallagher, on ASC White

Parameter Input File: flash.par
- To change parameters in a simulation, edit the flash.par file
- Defaults are set in Config files; any value in flash.par overrides the Config value
- A parameter MUST be declared in a Config file!
- Controls parameters for all units, for example:
  - Grid parameters: lrefine_max, lrefine_min, geometry
  - Physics parameters: flame_speed
  - I/O parameters: restart, checkpointFileNumber
(An illustrative fragment follows below.)
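A hypothetical flash.par fragment using parameter names quoted on this and the neighboring slides; the values are arbitrary examples:

  # grid
  lrefine_min = 2
  lrefine_max = 6
  geometry    = "cartesian"

  # I/O
  restart                    = .false.
  checkpointFileIntervalTime = 0.01
  plotfileIntervalTime       = 0.002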

FLASH Checkpoint (Restart) Files
A checkpoint file saves all the data you need to restart your simulation from the exact state where it left off:
- Grid data: refinement level, node type (parent, child), bounding box, coordinates, unknown variables (density, pressure, temperature, etc.)
- Simulation data: time, dt, timestep
- Metadata: time stamp, build directory, setup problem, etc.
For any number of reasons your simulation can be interrupted: the machine crashes, the queue window closes, the machine runs out of disk space, and so on. Rather than starting again from the beginning of your simulation, you can restart your problem from a FLASH checkpoint file.

Checkpoint Parameters
There are a number of ways to control the occurrence and spacing of checkpoint dumps (the names have been clarified for FLASH3):
- Timesteps between dumps (parameter: checkpointFileIntervalStep)
- Simulation-time seconds between dumps (parameter: checkpointFileIntervalTime)
- Wall-clock seconds between dumps (parameter: wall_clock_checkpoint); very useful when running in a queue or when only a certain amount of time is available on a machine
- Creating a file called .dump_checkpoint forces a checkpoint file to be dumped immediately
- Creating a .dump_restart file forces a checkpoint and halts the code
- Rolling checkpoints (parameter: rolling_checkpoint): checkpoint files are big (GBs for a large science run); this feature lets you keep only a specified number of checkpoints to save disk space
The FLASH code support website gives detailed explanations of all runtime parameters.

Restarting a Simulation
- Because checkpoint files save the exact state of the simulation, you can restart from any FLASH checkpoint file
- You can restart with parallel I/O even if the previous run was written serially
- You can restart with a different number of processors than the previous run
To restart a simulation, in your flash.par file set:
- the runtime parameter restart = .true.
- the checkpointFileNumber parameter to the desired checkpoint number
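In flash.par this comes down to two lines; the checkpoint number here is an arbitrary example:

  restart              = .true.
  checkpointFileNumber = 12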

Restarting on a Different Number of Processors
[Diagram: the same global data array is written by one set of processors and, on restart, redistributed across a different number of processors.]

FLASH Plotfiles and Particle Files
Plotfiles and particle files output data for visualization and analysis purposes.
- Plotfiles are smaller than checkpoints: they contain only the variables needed for visualization or analysis, and the data are stored in single precision
- Control plotfile frequency with the runtime parameters plotfileNumber, plotfileIntervalStep, and plotfileIntervalTime
- Particle files are also stored in single precision
- Control particle file frequency with the runtime parameters particleFileNumber, particleFileIntervalStep, and particleFileIntervalTime

Visualization

Visualization
- Runtime visualization
- Fidlr3
- FlashView

Runtime Visualization
- The runtime visualization module is not automatically included; add it to your Modules file to set up a problem with runtime visualization
- Gives the user a quick-and-dirty view of the simulation
- Only requires the widely available PNG library
- Creates PNG files in the current directory during the run

Fidlr3 (FLASH IDL Routines)
Fidlr is an application written in IDL to visualize FLASH checkpoint files and plotfiles.
- IDL tool for 2-D visualization
- Available with FLASH directly
- Can step through plots to view sequences or see movies
- Can visualize 3-D data by looking at slices
- Can make histograms
- Additionally, if you have access to IDL and want to do more analysis, you can use the routines to read in data, do statistical analysis, find max/min values, etc.

Example of a 3-D slice taken with Fidlr3

FlashView
FlashView is a 3-D visualization package developed for FLASH data at Argonne National Laboratory by Mike Papka and Randy Hudson.
Features:
- Reads native FLASH plotfiles
- XYZ cut planes, so the entire range of data for a variable can be projected onto a plane
- Renders 3-D isosurfaces of FLASH data
- Fully parallel (both reading the file and calculations); the parallelism makes it possible to look at very large data sets interactively
- Displays multiple (up to 5) variables at the same time
- Various color maps, zoom, rotation, translation, etc.
- Shows the grid structure
- Movie making
- Currently only supports the HDF5 format
Notes: yes, we think it is difficult to install too; the key is to install the exact versions of the recommended packages; ask Dan if you have questions.

FlashView

Example of an isosurface (single variable)

Another example with two variables

Another example: isosurface with a cut plane

Isosurface with Grid

Issues with Cell-centered vs. Corner Interpolated Data

Issues with cell-centered vs. corner interpolated data