NATIONAL PARTNERSHIP FOR ADVANCED COMPUTATIONAL INFRASTRUCTURE

Capability Computing: Performance, Usage, Tflops
Peter R. Taylor, San Diego Supercomputer Center
NPACI review, NSF, Washington DC, July 21-22, 1999

Accelerating Large Research Projects
Strategic Applications Collaborations (SACs)
- NPACI staff work with researchers to improve the performance of their codes, ideally using generic techniques that can be re-used
- Initial groups:
  - Hernquist (UCSC/Harvard; astrophysics)
  - Kollman (UCSF; biochemistry)
  - Peskin (NYU; biomedical engineering)

SAC Successes
- AMBER molecular dynamics code
  - T3E: 1.7x faster on 2 procs to 1.3x on 64 procs
  - SP: 1.4x faster on 1 proc to 1.2x on 32 procs
- PULSE3D code for fluid-structure heart analysis
  - T90: 1.3x faster (400 Mflops on 1 proc)
  - SP: 12x speedup of kernel (250 Mflops on 1 proc)
- SCF code for galactic modeling
  - T3E: 1.8x faster on 1 proc to 2.6x faster on 32 procs
- PARTREE code for galactic modeling
  - SP: 2.0x faster on 1 proc

New SACs (FY00 Plans: New Strategic Applications Collaborations)
- Tim Barnett (SIO), Warren Washington (NCAR): climate modeling
- Toichiro Kinoshita (Cornell): particle physics
- Ed Givelberg (U Michigan): cochlea modeling
- Bob Sugar (UCSB): quantum chromodynamics
- Neal Bertram (UCSD): magnetic recording
- James Bower (Caltech): neuron modeling
- Peter Hauschildt (U Georgia), Eddie Baron (Oklahoma U): stellar atmospheres

System Utilization
- Large jobs dominate the job mix
- URLs:

System Utilization
- Overall utilization very high
- Percentage based on availability to NPACI
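(Our gloss, not spelled out on the slide: "based on availability to NPACI" presumably means the standard ratio

    \text{utilization} = \frac{\text{CPU hours delivered to NPACI jobs}}{\text{CPU hours available to NPACI}}

so time the machines spend outside the NPACI pool does not count against the percentage.)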

Performance
- HPM figures for the T90:
  - Mittal, CFD: 1.24 Gflops (av)
  - Hu, Physics: 965 Mflops
  - Eberly, Optics: 791 Mflops
  - Taylor, Chemistry: 748 Mflops
- Case-by-case for the T3E and SP:
  - Quinn, Astro: 95% efficiency on 256 procs (90% on 512)
  - Suen, Astro: 98+% on 256 (95% on 1024!)
  - Wunsch, Climate: 90+% on 256
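(For reference, these figures follow the usual definition of parallel efficiency; the slide does not define it, so this is our gloss:

    E(p) = \frac{T(1)}{p\,T(p)}, \qquad S(p) = \frac{T(1)}{T(p)} = p\,E(p)

so Quinn's 95% efficiency on 256 procs corresponds to a speedup of about 0.95 × 256 ≈ 243 over one processor.)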

Allocations by Discipline
Summary: All platforms (chart)

Allocations by Discipline
Summary: SDSC SP (chart)

Allocations by Discipline
Summary: SDSC T3E (chart)

Allocations by Discipline
Summary: SDSC T90 (chart)

Allocations
- Switched to annual allocations from four quarterly ones
- Turnaround-time problems in spring on all platforms
- Two NRAC and two PRAC meetings per year
- Developing policy for large Tflops allocations through NRAC; joint process with NCSA

Allocations (chart)

User Advisory Committee
- Lars Hernquist, Harvard
- Peter Kollman, UCSF
- Aron Kuppermann, Caltech
- David Landau, U Georgia
- Jim McWilliams, UCLA
- Alan Needleman, Brown
- Lee Panetta, Texas A&M
- Charles Peskin, NYU
- Bob Sugar, UCSB
- Susan Graham, Paul Messina, Peter Taylor

User Advisory Committee Met Twice
- Meetings in November 1998 and June 1999
- Discussion of:
  - Resources available to the program
  - User representation
  - Allocations procedure
  - Interaction with vendors
  - Tflops machine
- John Levesque (IBM ACTC) presented on the Tflops system and software

Levesque Briefing
- IBM Advanced Computing Technology Center (14+ staff)
- Works with customers on their codes to improve performance
- Discussed hardware features of the Winterhawk and Nighthawk nodes
- Discussed software support for the Nighthawk

Tflops Software
Enormous improvement in IBM schedules for all these software developments; a real partnership with IBM and their ACTC:
- Compilers (xlf and xlc) will support all of OpenMP in the next release (late '99)
- Also support all CRAY multitasking directives
- Cray2ibm script for CRAY vector codes
- SCILIB compatibility
- Thread-safe MASS and VMASS
- Distributed-memory programming: MPI and LAPI
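(To illustrate the directive style the compilers were adding, here is a minimal OpenMP loop in C; a generic sketch, not code from the talk. On an 8-way Nighthawk node it would run with up to 8 threads.)

    #include <stdio.h>
    #include <omp.h>

    /* Static arrays keep the example self-contained (8 MB each). */
    static double a[1000000], b[1000000];

    int main(void) {
        const int n = 1000000;
        double sum = 0.0;

        /* Serial initialization. */
        for (int i = 0; i < n; i++) {
            a[i] = (double)i;
            b[i] = 2.0;
        }

        /* Shared-memory parallel loop; the reduction clause gives
           each thread a private partial sum, combined at the end. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < n; i++)
            sum += a[i] * b[i];

        printf("dot product = %.1f using up to %d threads\n",
               sum, omp_get_max_threads());
        return 0;
    }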

Tflops Software (cont.)
- shmem get and put support on top of LAPI; T3E users should use shmem
- MPI will use shared memory within a node on our machine
- 64-bit addressing in MPI in late '99 (was '01!)
- Limit on MPI tasks per node fixed in the next AIX release (1Q00)
- NCAR CCM: 6.3x speedup on an 8-processor Nighthawk with compiler defaults
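(For T3E users, the one-sided shmem style referred to above looks like the following minimal C sketch using the classic Cray SHMEM calls; we assume the LAPI-based port exposes this standard interface.)

    #include <stdio.h>
    #include <shmem.h>   /* classic Cray SHMEM interface */

    /* Symmetric data objects: same address on every PE. */
    static long src[4], dst[4];

    int main(void) {
        start_pes(0);                    /* initialize; one PE per process */
        int me   = shmem_my_pe();
        int npes = shmem_n_pes();

        for (int i = 0; i < 4; i++)
            src[i] = 10L * me + i;

        /* One-sided put: write src into dst on the next PE, with no
           matching receive call needed on the target side. */
        shmem_long_put(dst, src, 4, (me + 1) % npes);

        shmem_barrier_all();             /* ensure all puts are complete */
        printf("PE %d received %ld %ld %ld %ld\n",
               me, dst[0], dst[1], dst[2], dst[3]);
        return 0;
    }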

Hernquist SCF Code
(Table of times in seconds on 2, 4, and 8 procs for the Nighthawk*, the SP (P2SC), and the T3E)
* 4 MPI tasks on 2 nodes

Monte Carlo Photon Transport
(Table of times in seconds vs. number of procs for OpenMP threads alone, threads over 2 MPI tasks, and threads over 4 MPI tasks)
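(The thread/MPI-task combinations in the table reflect the hybrid MPI-plus-OpenMP style the SP supports: MPI tasks across nodes, threads within a node. A generic Monte Carlo sketch in that style, illustrative only and not the actual photon-transport code:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <mpi.h>
    #include <omp.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, ntasks;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &ntasks);

        const long samples = 1000000;   /* per MPI task */
        long local_hits = 0;

        /* Threads split this task's samples; a real transport code
           would track photons here instead of estimating pi. */
        #pragma omp parallel reduction(+:local_hits)
        {
            unsigned int seed = 1234u + 97u * rank + omp_get_thread_num();
            #pragma omp for
            for (long i = 0; i < samples; i++) {
                double x = rand_r(&seed) / (double)RAND_MAX;
                double y = rand_r(&seed) / (double)RAND_MAX;
                if (x * x + y * y <= 1.0) local_hits++;
            }
        }

        /* Combine across MPI tasks outside the parallel region. */
        long total_hits = 0;
        MPI_Reduce(&local_hits, &total_hits, 1, MPI_LONG,
                   MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("pi ~= %f (%d tasks x up to %d threads)\n",
                   4.0 * total_hits / (samples * (double)ntasks),
                   ntasks, omp_get_max_threads());

        MPI_Finalize();
        return 0;
    }

On an 8-way node, the table's configurations would correspond to, e.g., 8 threads on one task versus 4 threads on each of 2 tasks.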

Pulse3D Scaling (1 Node)
(Scaling chart)

Users Selected for Early Use of the Tflops System
- Hernquist, Kollman, Peskin (SACs)
- Barnett & Washington, Hauschildt & Baron, Kinoshita, Sugar, Givelberg (new SACs)
- McWilliams (climate modeling)
- Abraham (materials science)
- Baldridge (quantum chemistry)
- Johnson (MPIRE rendering)