Slide 1: Capability Computing: Performance, Usage, Tflops
Peter R. Taylor, San Diego Supercomputer Center
taylor@sdsc.edu
http://www.sdsc.edu/~taylor
NPACI Site Visit, July 21-22, 1999

Slide 2: Accelerating Large Research Projects
- Strategic Applications Collaborations (SACs)
- NPACI staff work with researchers to improve the performance of their codes
- Aim to improve performance, ideally using generic techniques that can be reused
- Initial groups:
  - Hernquist (UCSC/Harvard; Astrophysics)
  - Kollman (UCSF; Biochemistry)
  - Peskin (NYU; Biomedical Engineering)

Slide 3: SAC Successes
- AMBER molecular dynamics code
  - T3E: 1.7x faster on 2 procs to 1.3x on 64 procs
  - SP: 1.4x faster on 1 proc to 1.2x on 32 procs
- PULSE3D code for fluid-structure heart analysis
  - T90: 1.3x faster (400 Mflops on 1 proc)
  - SP: 12x speedup of kernel (250 Mflops on 1 proc)
- SCF code for galactic modeling
  - T3E: 1.8x faster on 1 proc to 2.6x faster on 32 procs
- PARTREE code for galactic modeling
  - SP: 2.0x faster on 1 proc
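
A note on reading these figures (an interpretive aid, not something stated on the slide): the "Nx faster" values are presumably ratios of the original code's runtime to the tuned code's runtime at the same processor count, so for AMBER on the T3E, for example:

\[
S_{\text{tuning}}(p) \;=\; \frac{T_{\text{original}}(p)}{T_{\text{tuned}}(p)},
\qquad
S_{\text{tuning}}(2)\approx 1.7,
\quad
S_{\text{tuning}}(64)\approx 1.3 .
\]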

Slide 4: New SACs (FY00 Plans: New Strategic Applications Collaborations)
- Tim Barnett (SIO), Warren Washington (NCAR): Climate modeling
- Toichiro Kinoshita (Cornell): Particle physics
- Ed Givelberg (U Michigan): Cochlea modeling
- Bob Sugar (UCSB): Quantum chromodynamics
- Neal Bertram (UCSD): Magnetic recording
- James Bower (Caltech): Neuron modeling
- Peter Hauschildt (U Georgia), Eddie Baron (Oklahoma U): Stellar atmospheres

Slide 5: System Utilization
- Large jobs dominate the job mix
- URLs:
  - http://intranet.npaci.edu/Resources/Systems/Snupi
  - http://intranet.npaci.edu/Resources/Webnewu

Slide 6: System Utilization
- Overall utilization very high
- Percentage based on availability to NPACI

Slide 7: Performance
- HPM figures for the T90:
  - Mittal, CFD: 1.24 Gflops (average)
  - Hu, Physics: 965 Mflops
  - Eberly, Optics: 791 Mflops
  - Taylor, Chemistry: 748 Mflops
- Case-by-case for the T3E and SP:
  - Quinn, Astro: 95% efficiency on 256 procs (90% on 512)
  - Suen, Astro: 98+% on 256 (95% on 1024!)
  - Wunsch, Climate: 90+% on 256
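
For readers outside HPC, a short reminder of what the efficiency figures imply, assuming they are standard parallel efficiencies (the slide itself gives only the percentages):

\[
E(p) \;=\; \frac{S(p)}{p} \;=\; \frac{T(1)}{p\,T(p)},
\qquad
\text{e.g. } E(256)=0.95 \;\Rightarrow\; S(256)=0.95\times 256 \approx 243 .
\]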

Slide 8: Allocations by Discipline
Summary: All platforms

Slide 9: Allocations by Discipline
Summary: SDSC SP

Slide 10: Allocations by Discipline
Summary: SDSC T3E

Slide 11: Allocations by Discipline
Summary: SDSC T90

Slide 12: Allocations
- Switched from four quarterly allocations to annual allocations
- Turnaround-time problems in spring on all platforms
- Two NRAC and two PRAC meetings per year
- Developing policy for large Tflops allocations through NRAC
  - Joint process with NCSA

Slide 13: Allocations

Slide 14: User Advisory Committee
- Lars Hernquist, Harvard
- Peter Kollman, UCSF
- Aron Kuppermann, Caltech
- David Landau, U Georgia
- Jim McWilliams, UCLA
- Alan Needleman, Brown
- Lee Panetta, Texas A&M
- Charles Peskin, NYU
- Bob Sugar, UCSB
- Susan Graham, Paul Messina, Peter Taylor

Slide 15: User Advisory Committee Has Met Twice
- Meetings in November 1998 and June 1999
- Discussion of:
  - Resources available to the program
  - User representation
  - Allocations procedure
  - Interaction with vendors
  - Tflops machine
- John Levesque (IBM ACTC) presentation on the Tflops system and software

Slide 16: Levesque Briefing
- IBM Advanced Computing Technology Center (14+ staff)
- Works with customers on customer codes to improve performance
- Discussed hardware features of the Winterhawk and Nighthawk nodes
- Discussed software for support of the Nighthawk

Slide 17: Tflops Software
- Enormous improvement in IBM schedules for all of these software developments: a real partnership with IBM and their ACTC
- Compilers (xlf and xlc) will support all of OpenMP in the next release (late '99); see the sketch after this slide
  - Also support all CRAY multitasking directives
- Cray2ibm script for CRAY vector codes
- SCILIB compatibility
- Thread-safe MASS and VMASS
- Distributed-memory programming: MPI and LAPI
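
To make the OpenMP bullet concrete, here is a minimal sketch of the kind of shared-memory parallel loop that full OpenMP support enables. It is a generic illustration, not code from any NPACI application; the array size, loop body, and compile command are assumptions.

    /* Generic OpenMP sketch: a daxpy-style loop spread over the threads of
     * one SMP node.  Compile with any OpenMP-capable C compiler, e.g.
     * "cc -fopenmp example.c" on a modern system (flag is illustrative). */
    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    static double x[N], y[N];

    int main(void)
    {
        double a = 2.0;

        for (int i = 0; i < N; i++) {   /* initialize the vectors */
            x[i] = 1.0;
            y[i] = 2.0;
        }

        #pragma omp parallel for        /* iterations divided among threads */
        for (int i = 0; i < N; i++)
            y[i] = a * x[i] + y[i];

        printf("up to %d threads, y[0] = %f\n", omp_get_max_threads(), y[0]);
        return 0;
    }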

Slide 18: Tflops Software (cont.)
- shmem get and put support on top of LAPI
  - T3E users should use shmem (a put/get sketch follows this slide)
- MPI will use shared memory within a node on our machine
- 64-bit addressing in MPI in late '99 (was '01!)
- Limit on MPI tasks/node fixed in the next AIX release (1Q00)
- NCAR CCM: 6.3x speedup on an 8-processor Nighthawk with compiler defaults
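
The one-sided put/get style referred to above can be pictured with a short sketch. It is written against the modern OpenSHMEM interface purely for illustration (the Cray SHMEM of 1999 spelled initialization and some calls differently), and the MAX_PES bound is an assumption; none of this comes from the slides.

    /* One-sided communication in the shmem put/get style: each PE writes its
     * rank directly into a symmetric array on PE 0, with no matching receive.
     * On the SP, LAPI supplies the same one-sided semantics underneath. */
    #include <stdio.h>
    #include <shmem.h>

    #define MAX_PES 1024            /* assumed upper bound on job size */

    static long slots[MAX_PES];     /* symmetric: same address on every PE */

    int main(void)
    {
        shmem_init();
        int me   = shmem_my_pe();
        int npes = shmem_n_pes();   /* assumed to be <= MAX_PES */

        long value = (long)me;
        shmem_long_put(&slots[me], &value, 1, 0);   /* one-sided put to PE 0 */

        shmem_barrier_all();        /* all puts complete before proceeding */

        if (me == 0) {
            long sum = 0;
            for (int i = 0; i < npes; i++)
                sum += slots[i];
            printf("PE 0 collected %d ranks, sum = %ld\n", npes, sum);
        }

        shmem_finalize();
        return 0;
    }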

Slide 19: Users Selected for Early Use of the Tflops System
- Hernquist, Kollman, Peskin (SACs)
- Barnett & Washington, Hauschildt & Baron, Kinoshita, Sugar, Givelberg (new SACs)
- McWilliams (Climate modeling)
- Abraham (Materials science)
- Baldridge (Quantum chemistry)
- Johnson (MPIRE rendering)

