Slide 1
LCSE – NCSA Partnership Accomplishments, FY01
Paul R. Woodward
Laboratory for Computational Science & Engineering, University of Minnesota
October 17, 2001
Slide 2
Simulations of Compressible Turbulence
Who: Paul Woodward’s LCSE team at the Univ. of Minnesota.
How: PPM gas dynamics code running out of core with LCSE’s SHMOD (Shared Memory On Disk) library.
Where: Prototype Itanium cluster at NCSA.
Why: Study the dynamics of homogeneous, compressible turbulence and compare the computed behavior with proposed turbulence models.
What: 2 TB of data on the transition to turbulence on a 1-billion-cell grid. Record: the most detailed data on compressible turbulence to date.
What now: Move to the new Itanium cluster and an 8-billion-cell grid to study fully developed compressible turbulence over the widest possible range of length and time scales. Record: this will be the largest fluid dynamics simulation to date.
Performance: Around 480 Mflop/s per CPU (64-bit arithmetic).
Reference: Conference article at www.lcse.umn.edu/mexico.
(The background of this slide shows the shock fronts in the developing turbulent flow.)
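SHMOD’s actual interface is not shown in this deck; the following is a minimal, hypothetical Python sketch of the shared-memory-on-disk idea using numpy.memmap: the grid lives in a disk file far larger than RAM and is processed one in-core brick at a time, so an out-of-core code like PPM touches only a working set that fits in memory. File name, brick size, and the update itself are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: SHMOD's real API is not public here. This only illustrates
# the "shared memory on disk" idea: the grid lives in a disk file and is
# processed one in-core brick at a time.
NX = 256       # cells per side for the sketch (the real runs used a 1024^3 grid)
BRICK = 64     # cells per side of one in-core brick

# Disk-backed array: only bricks actually touched are paged into RAM.
grid = np.memmap("density.dat", dtype=np.float64, mode="w+",
                 shape=(NX, NX, NX))

def process_brick(i0, j0, k0):
    """Read one brick into memory, update it, and write it back to disk."""
    sl = np.s_[i0:i0 + BRICK, j0:j0 + BRICK, k0:k0 + BRICK]
    brick = np.array(grid[sl])    # disk -> RAM
    brick += 1.0                  # stand-in for the PPM update of this brick
    grid[sl] = brick              # RAM -> disk

# Sweep the whole out-of-core grid brick by brick.
for i in range(0, NX, BRICK):
    for j in range(0, NX, BRICK):
        for k in range(0, NX, BRICK):
            process_brick(i, j, k)
grid.flush()
```

A real out-of-core gas dynamics sweep would also carry halo cells between neighboring bricks; that bookkeeping is omitted here for brevity.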
Slide 3
A series of snapshots of the vorticity in part of a thin slice of the problem domain illustrates the transition to turbulence in the PPM simulation (panels 1–4).
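For readers who want to reproduce this kind of diagnostic, here is a hedged sketch (plain NumPy, hypothetical variable names) of computing the vorticity component normal to a 2D slice of the velocity field with centered differences, the quantity such snapshots typically display:

```python
import numpy as np

def vorticity_slice(u, v, dx=1.0, dy=1.0):
    """Vorticity normal to a 2D slice: w = dv/dx - du/dy.

    u, v: in-plane velocity components on a uniform grid
    (array axis 0 = y, axis 1 = x, matching np.gradient's ordering).
    """
    dv_dx = np.gradient(v, dx, axis=1)   # centered differences in the
    du_dy = np.gradient(u, dy, axis=0)   # interior, one-sided at the edges
    return dv_dx - du_dy

# Sanity check on synthetic solid-body rotation, whose vorticity is exactly 2:
y, x = np.mgrid[-1:1:256j, -1:1:256j]
u, v = -y, x
w = vorticity_slice(u, v, dx=2 / 255, dy=2 / 255)
```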
Slide 4
Simulations of Convection in Red Giant Stars
Who: Paul Woodward’s LCSE team at the Univ. of Minnesota.
How: PPM gas dynamics code running on a 256-CPU Origin 2000.
Why: To study the interaction of convection, turbulence, and stellar pulsation and its impact on stellar evolution.
What: 2 TB of data on a 1-billion-cell grid of the turbulent dipolar convection flow past the stellar core, which has the hydrogen-burning shell on its surface. Record: the largest and most detailed stellar convection simulation; both the convection and the turbulence are well resolved.
What now: Adding an improved treatment of the escape of radiation at the photosphere, for comparison with observations, and also of the gas equation of state, including ionization effects.
Why this: Study the interaction of pulsations driven by the dipolar convection flow and by ionization effects. An observable result of the dipolar flow can be the geometry of a planetary nebula.
Performance: Around 93% parallel efficiency on 256 processors.
Reference: Conference article at www.lcse.umn.edu/convsph.
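For context on the quoted figure, parallel efficiency is conventionally speedup divided by processor count, E = T1 / (N · TN). A tiny sketch with illustrative timings only (not measurements from this run):

```python
def parallel_efficiency(t1, tn, n):
    """E = speedup / N = t1 / (n * tn)."""
    return t1 / (n * tn)

# Illustrative numbers: 93% efficiency on 256 CPUs corresponds to a
# speedup of about 0.93 * 256 ~ 238x, not the ideal 256x.
print(parallel_efficiency(1000.0, 1000.0 / 238.0, 256))   # ~0.93
```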
Slide 5
PPM simulation on a one-billion-cell grid of convection and pulsation in the extended envelope of a model red giant star of 3 solar masses. At the left we see the fluctuations in the temperature in the back half of the stellar envelope. A global dipolar convection pattern has been set up, with a Mach-1/3 stream of cooler gas becoming heated as it passes over the central core. At the right we see the magnitude of the vorticity in a central slice of the star.
Slide 6
Collaboration of the Environmental Hydrology Team with the LCSE and Rice ET Team Members
Who: Bob Wilhelmson’s COMMAS meteorology code team at UIUC, Paul Woodward’s LCSE team at the University of Minnesota, and Ken Kennedy’s compiler technology team at Rice University.
Why: To transfer parallel code implementation technology and movie-rendering technology from the ET teams to the AT team.
Goal: A high-performance run of the restructured COMMAS code on a one-billion-cell grid to simulate a tornado or a hurricane in unprecedented detail.
What: LCSE implemented a simplified version of COMMAS in its SHMOD library to run on the Itanium cluster in the same manner as the PPM code. The Rice team devised an even more aggressive “time skewing” technique. LCSE restructured the advection portion (50%) of the COMMAS code, via aggressive loop fusion, to run at high performance on single Intel CPUs (see the sketch after this slide). The Rice team developed the capability to perform automatically key code transformations so far carried out only manually by LCSE. LCSE modified its A3D and HVR visualization software to work from the HDF5 and MM5 data formats, so that the AT team can use it.
What now: The COMMAS team works with the Rice and LCSE teams to determine the best means of invoking code transformations by the Rice precompiler. LCSE and Rice will work with the COMMAS team to restructure the rest of the code. LCSE will add the capability to make movies from nested-grid data. Rice and LCSE will experiment with more aggressive “time skewing.”
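Neither the restructured COMMAS loops nor the Rice precompiler output appear in the slides; the following is a minimal Python sketch, on a toy 1D stencil with made-up names, of the two ideas named above. The fused single-pass update in step() stands in for the loop-fused advection sweeps, and time_skewed() shows a simplified variant of time skewing (overlapped “ghost zone” tiling), in which a block of cells is carried through several timesteps while it is still in cache, at the cost of some redundant halo work.

```python
import numpy as np

def step(a):
    """One explicit update of a 1D 3-point stencil (simple smoothing),
    computed in a single fused pass rather than separate flux/update
    sweeps; boundary cells are held fixed."""
    out = a.copy()
    out[1:-1] = 0.25 * a[:-2] + 0.5 * a[1:-1] + 0.25 * a[2:]
    return out

def naive(a, T):
    """Baseline: T full-grid sweeps; each sweep streams the whole
    array through the cache before the next one starts."""
    for _ in range(T):
        a = step(a)
    return a

def time_skewed(a, T, B=64):
    """Simplified time skewing: each block of B cells is read with a
    halo of T cells on each side and advanced all T steps while it is
    in cache. Halo values are recomputed redundantly; the payoff is
    that each cell is touched T times per trip through memory."""
    n = len(a)
    out = np.empty_like(a)
    for s in range(0, n, B):
        lo, hi = max(0, s - T), min(n, s + B + T)
        blk = a[lo:hi].copy()
        for _ in range(T):
            blk = step(blk)          # usable halo shrinks by 1 per step
        out[s:min(s + B, n)] = blk[s - lo:s - lo + min(B, n - s)]
    return out
```

For any input a0, np.allclose(naive(a0, 8), time_skewed(a0, 8)) should hold, since the T-cell halo exactly covers the stencil’s domain of dependence over T steps; production time-skewed codes avoid the redundant halo work with skewed tile shapes instead.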
Slide 7
[Figure panels: Precipitation, Cloud Moisture, Vertical Vorticity, Side View of Cloud Moisture]
LCSE’s A3D allows the COMMAS team to visualize any function of the archived flow variables, and HVR produces movie images over the Grid.
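A3D’s query interface is not documented in this deck; as a loose illustration of the “any function of archived flow variables” idea, here is a hypothetical Python/h5py sketch that reads archived fields from an HDF5 dump (file and dataset names invented, arrays assumed ordered z, y, x on a uniform grid) and evaluates a derived quantity, vertical vorticity, for rendering:

```python
import numpy as np
import h5py

# Hypothetical names; a real COMMAS HDF5 dump will differ.
with h5py.File("commas_dump_0042.h5", "r") as f:
    u = f["u"][...]                 # x-velocity, shape (nz, ny, nx)
    v = f["v"][...]                 # y-velocity

# Derived field evaluated on the fly: vertical vorticity dv/dx - du/dy
# (unit grid spacing assumed), then normalized for volume rendering.
zeta = np.gradient(v, axis=2) - np.gradient(u, axis=1)
brightness = np.abs(zeta) / np.abs(zeta).max()
```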