
NATIONAL PARTNERSHIP FOR ADVANCED COMPUTATIONAL INFRASTRUCTURE
Capability Computing: Performance, Usage, Tflops
Peter R. Taylor, San Diego Supercomputer Center
NPACI Site Visit, July 21-22, 1999

Accelerating Large Research Projects
Strategic Applications Collaborations (SACs)
NPACI staff work with researchers to improve the performance of their codes
Aim is ideally to use generic techniques that can be reused across codes
Initial groups:
  Hernquist (UCSC/Harvard; Astrophysics)
  Kollman (UCSF; Biochemistry)
  Peskin (NYU; Biomedical Engineering)

SAC Successes
AMBER molecular dynamics code
  T3E: 1.7x faster on 2 procs to 1.3x on 64 procs
  SP: 1.4x faster on 1 proc to 1.2x on 32 procs
PULSE3D code for fluid-structure heart analysis
  T90: 1.3x faster (400 Mflops on 1 proc)
  SP: 12x speedup of kernel (250 Mflops on 1 proc)
SCF code for galactic modeling
  T3E: 1.8x faster on 1 proc to 2.6x faster on 32 procs
PARTREE code for galactic modeling
  SP: 2.0x faster on 1 proc
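The speedup factors quoted above are simple ratios of wall-clock time before and after tuning. A minimal sketch of the arithmetic, using hypothetical timings (not the actual SAC measurements):

```python
def speedup(t_before, t_after):
    """Improvement factor from an optimization: time before tuning / time after."""
    return t_before / t_after

# Hypothetical example: an AMBER-like run that drops from 170 s to 100 s
# on 2 processors would be the "1.7x faster" quoted for the T3E above.
print(speedup(170.0, 100.0))  # 1.7
```

Note that the gain typically shrinks at higher processor counts (1.7x on 2 procs vs. 1.3x on 64), since communication time, which tuning of the compute kernels does not touch, takes a growing share of the run.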

New SACs
FY00 Plans: New Strategic Applications Collaborations
  Tim Barnett (SIO), Warren Washington (NCAR): Climate modeling
  Toichiro Kinoshita (Cornell): Particle physics
  Ed Givelberg (U Michigan): Cochlea modeling
  Bob Sugar (UCSB): Quantum chromodynamics
  Neal Bertram (UCSD): Magnetic recording
  James Bower (Caltech): Neuron modeling
  Peter Hauschildt (U Georgia), Eddie Baron (Oklahoma U): Stellar atmospheres

System Utilization
Large jobs dominate the job mix
URLs:

System Utilization
Overall utilization very high
Percentage based on availability to NPACI

Performance
HPM figures for T90:
  Mittal, CFD: 1.24 Gflops (avg)
  Hu, Physics: 965 Mflops
  Eberly, Optics: 791 Mflops
  Taylor, Chemistry: 748 Mflops
Case-by-case for T3E and SP:
  Quinn, Astro: 95% efficiency on 256 procs (90% on 512)
  Suen, Astro: 98+% on 256 (95% on 1024!)
  Wunsch, Climate: 90+% on 256
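The efficiency figures quoted for the T3E and SP are speedup divided by processor count, E = T1 / (p * Tp). A minimal sketch with made-up timings (not the actual measurements):

```python
def parallel_efficiency(t_serial, t_parallel, procs):
    """Parallel efficiency: speedup / processor count = T1 / (p * Tp)."""
    return t_serial / (procs * t_parallel)

# Hypothetical example: a job needing 25600 s on one processor that
# finishes in 105 s on 256 processors runs at roughly 95% efficiency,
# comparable to the figures quoted above.
print(round(parallel_efficiency(25600.0, 105.0, 256), 3))  # 0.952
```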

Allocations by Discipline
Summary: All platforms

Allocations by Discipline
Summary: SDSC SP

Allocations by Discipline
Summary: SDSC T3E

Allocations by Discipline
Summary: SDSC T90

Allocations
Switched from four quarterly allocations to annual allocations
Turnaround-time problems in spring on all platforms
Two NRAC and two PRAC meetings per year
Developing policy for large Tflops allocations through NRAC
  Joint process with NCSA

Allocations

User Advisory Committee
  Lars Hernquist, Harvard
  Peter Kollman, UCSF
  Aron Kuppermann, Caltech
  David Landau, U Georgia
  Jim McWilliams, UCLA
  Alan Needleman, Brown
  Lee Panetta, Texas A&M
  Charles Peskin, NYU
  Bob Sugar, UCSB
Susan Graham, Paul Messina, Peter Taylor

User Advisory Committee Has Met Twice
Meetings in November 1998 and June 1999
Discussion of:
  Resources available to the program
  User representation
  Allocations procedure
  Interaction with vendors
  Tflops machine
John Levesque (IBM ACTC) presentation on Tflops system and software

Levesque Briefing
IBM Advanced Computing Technology Center (14+ staff)
  Works with customers on their codes to improve performance
Discussed hardware features of Winterhawk and Nighthawk nodes
Discussed software support for the Nighthawk

Tflops Software
Enormous improvement in IBM schedules for all these software developments: a real partnership with IBM and their ACTC
Compilers (xlf and xlc) will support all of OpenMP in next release (late '99)
  Also support all CRAY multitasking directives
Cray2ibm script for CRAY vector codes
SCILIB compatibility
Thread-safe MASS and VMASS
Distributed-memory programming: MPI and LAPI

Tflops Software (cont.)
shmem get and put support on top of LAPI
  T3E users should use shmem
MPI will use shared memory within a node on our machine
64-bit addressing in MPI in late '99 (was '01!)
Limit on MPI tasks/node fixed in next AIX release (1Q00)
NCAR CCM: 6.3x speedup on 8-processor Nighthawk with compiler defaults
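The CCM figure (6.3x on 8 processors, about 79% efficiency) can be read through Amdahl's law: given speedup S on p processors, S = 1 / ((1 - f) + f/p), where f is the parallelizable fraction of the work. A sketch that inverts the formula (illustrative only; the actual CCM scaling may not follow a pure Amdahl model):

```python
def parallel_fraction(speedup, procs):
    """Invert Amdahl's law, S = 1 / ((1 - f) + f / p), to recover f."""
    return (1.0 / speedup - 1.0) / (1.0 / procs - 1.0)

# 6.3x on 8 processors implies roughly 96% of the work parallelizes
print(round(parallel_fraction(6.3, 8), 3))  # 0.961
```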

Users Selected for Early Use of the Tflops System
Hernquist, Kollman, Peskin (SACs)
Barnett & Washington, Hauschildt & Baron, Kinoshita, Sugar, Givelberg (new SACs)
McWilliams (Climate modeling)
Abraham (Materials science)
Baldridge (Quantum chemistry)
Johnson (MPIRE rendering)