
Blue Waters: An Extraordinary Computing Resource for Advancing Science and Engineering
National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign
Thom Dunning, Rob Pennington,* Bill Kramer, Marc Snir, Bill Gropp, Wen-mei Hwu and Ed Seidel*
* Currently at NSF
CI Days, University of Kentucky, 22 February 2010

Background: The March to Petascale Computing
Performance of the #1 system on the Top 500 list: 1 GF in the late 1980s, 1 TF in the late 1990s, 1 PF in 2008.

Background: NSF's Strategy for High-End Computing
NSF supports three resource levels:
- Track 3: university-owned and -operated systems; 10s to 100s of TF
- Track 2: several NSF-funded supercomputer and specialized computing centers (TeraGrid); 500-1,000 TF, with ~100+ TB of memory
- Track 1: the NSF-funded leading-edge computing center; see the following slides

Blue Waters Project: Fielding a Sustained Petascale Computing System

Blue Waters Project: Technology Selection, Who We Consulted
- D. Baker, University of Washington: Protein structure refinement and determination
- M. Campanelli, RIT: Computational relativity and gravitation
- D. Ceperley, UIUC: Quantum Monte Carlo molecular dynamics
- J. P. Draayer, LSU: Ab initio nuclear structure calculations
- P. Fussell, Boeing: Aircraft design optimization
- C. C. Goodrich: Space weather modeling
- M. Gordon and T. Windus, Iowa State University: Electronic structure of molecules
- S. Gottlieb, Indiana University: Lattice quantum chromodynamics
- V. Govindaraju: Image processing and feature extraction
- M. L. Klein, University of Pennsylvania: Biophysical and materials simulations
- J. B. Klemp et al., NCAR: Weather forecasting/hurricane modeling
- R. Luettich, University of North Carolina: Coastal circulation and storm surge modeling
- W. K. Liu, Northwestern University: Multiscale materials simulations
- M. Maxey, Brown University: Multiphase turbulent flow in channels
- S. McKee, University of Michigan: Analysis of ATLAS data
- M. L. Norman, UCSD: Simulations in astrophysics and cosmology
- J. P. Ostriker, Princeton University: Virtual universe
- J. P. Schaefer, LSST Corporation: Analysis of LSST datasets
- P. Spentzouris, Fermilab: Design of new accelerators
- W. M. Tang, Princeton University: Simulation of fine-scale plasma turbulence
- A. W. Thomas and D. Richards, Jefferson Lab: Lattice QCD for hadronic and nuclear physics
- J. Tromp, Caltech/Princeton: Global and regional seismic wave propagation
- P. R. Woodward, University of Minnesota: Astrophysical fluid dynamics

Blue Waters Project: Attributes of a Sustained Petascale System
- Maximum core performance, to minimize the number of cores needed for a given performance level and to lessen the impact of sections of code with limited scalability (see the Amdahl's law sketch after this list)
- Low-latency, high-bandwidth interconnect, to enable science and engineering applications to scale to tens to hundreds of thousands of cores
- Large, fast memories, to solve the most memory-intensive problems
- Large, fast I/O system and data archive, to solve the most data-intensive problems
- Reliable operation, to enable the solution of Grand Challenge problems
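The first two attributes are essentially an argument about strong scaling: any serial or poorly scaling portion of a code caps the speedup achievable from more cores. The short sketch below is my own illustration, not from the slides; the serial fractions are hypothetical, and only the >300,000-core figure comes from the Blue Waters description.

```python
# Amdahl's law: ideal speedup on p cores when a fraction of the runtime
# is serial (or otherwise does not scale). Illustrative only; the serial
# fractions below are hypothetical, not measurements of any real code.

def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for serial_fraction in (0.001, 0.01, 0.05):
    for cores in (1_000, 10_000, 300_000):
        speedup = amdahl_speedup(cores, serial_fraction)
        print(f"serial fraction {serial_fraction:.3f}, {cores:>7,} cores: "
              f"speedup ~{speedup:,.0f}x")

# Even 1% non-scaling work caps the speedup near 100x no matter how many
# cores are added, which is why per-core performance and interconnect
# latency/bandwidth matter so much at the >300,000-core scale of Blue Waters.
```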

Blue Waters Project: Track 1 System (Blue Waters)

System Attribute                  Track 2 (TACC)   Track 1 (NCSA)   Track 1 : Track 2
Vendor                            Sun              IBM
Processor                         AMD Barcelona    IBM Power7
Peak Performance (PF)             0.579
Sustained Performance (PF)        ~0.05            ~1               >20
Number of Cores/Chip              4                8                2
Number of Processor Cores         62,976           >300,000         >3
Amount of Memory (TB)             123              >1,000           >6
Amount of Disk Storage (PB)       1.73 (s)         >10              >5
Amount of Archival Storage (PB)   2.5 (20)         >500             >200
External Bandwidth (Gbps)                                           >10

Blue Waters Project: Building Blue Waters

Power7 chip:
- 8 cores, 32 threads
- L1, L2 and L3 cache (32 MB)
- Up to 256 GF (peak)
- 45 nm technology

Multi-chip module (MCM):
- 4 Power7 chips
- 128 GB memory
- 512 GB/s memory bandwidth
- 1 TF (peak)

Router:
- 1,128 GB/s bandwidth

IH server node:
- 8 MCMs (256 cores)
- 1 TB memory
- 8 TF (peak)
- Fully water cooled

Blue Waters building block:
- 32 IH server nodes
- 32 TB memory
- 256 TF (peak)
- 4 storage systems
- 10 tape drive connections

Blue Waters:
- ~1 PF sustained
- >300,000 cores
- >1 PB of memory
- >10 PB of disk storage
- ~500 PB of archival storage
- >100 Gbps connectivity

Blue Waters is built from components that can also be used to build systems with a wide range of capabilities, from deskside to beyond Blue Waters. Blue Waters will be the most powerful computer in the world for scientific research when it comes on line.
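The component counts above nest consistently from chip to building block. The sketch below is a quick roll-up of those figures; it is my own check, not part of the original slides, and uses only the nominal peak numbers quoted above.

```python
# Roll-up of the Blue Waters building blocks quoted on the slide.
# Peak figures are the slide's nominal values; this is an illustrative check only.

CORES_PER_CHIP = 8            # Power7 chip
CHIPS_PER_MCM = 4             # multi-chip module
MCMS_PER_NODE = 8             # IH server node
NODES_PER_BUILDING_BLOCK = 32

PEAK_GF_PER_CHIP = 256        # "up to 256 GF (peak)"

cores_per_mcm = CORES_PER_CHIP * CHIPS_PER_MCM                        # 32
cores_per_node = cores_per_mcm * MCMS_PER_NODE                        # 256
cores_per_building_block = cores_per_node * NODES_PER_BUILDING_BLOCK  # 8,192

peak_tf_per_mcm = PEAK_GF_PER_CHIP * CHIPS_PER_MCM / 1_000            # ~1 TF
peak_tf_per_node = peak_tf_per_mcm * MCMS_PER_NODE                    # ~8 TF
peak_tf_per_bb = peak_tf_per_node * NODES_PER_BUILDING_BLOCK          # ~256 TF

print(f"MCM:            {cores_per_mcm} cores, ~{peak_tf_per_mcm:.0f} TF peak")
print(f"IH node:        {cores_per_node} cores, ~{peak_tf_per_node:.0f} TF peak")
print(f"Building block: {cores_per_building_block:,} cores, ~{peak_tf_per_bb:.0f} TF peak")

# ">300,000 cores" implies roughly 37 or more such building blocks.
print(f"Building blocks for >300,000 cores: ~{300_000 / cores_per_building_block:.0f}")
```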

Blue Waters Project: Selected Unique Features of Blue Waters

Shared/distributed-memory computing system:
- Powerful shared-memory multi-chip module (QCM)
- New high-performance fabric interconnecting all nodes
- Hardware support for global shared memory

I/O and data archive systems:
- High-performance I/O subsystems
- On-line disks fully integrated with the archival storage system

Natural growth path:
- Full range of systems, from servers to supercomputers
- Facilitates software development
- Addresses science and engineering problems at all levels

Blue Waters Project: National Petascale Computing Facility

Modern data center:
- 90,000+ ft² total
- 30,000 ft² raised floor
- 20,000 ft² machine room

Energy efficiency:
- LEED Gold certified (goal: Platinum)
- PUE < 1.2 (< 1.1 for much of the year)

Partners:
- EYP MCF/Gensler
- IBM
- Yahoo!
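PUE (power usage effectiveness) is total facility power divided by the power delivered to the IT equipment, so PUE < 1.2 means less than 20% overhead for cooling and power distribution. The sketch below just works that arithmetic through; the megawatt figures are made up for illustration and are not from the slides.

```python
# Power usage effectiveness: total facility power / IT equipment power.
# The megawatt figures below are hypothetical, chosen only to illustrate
# what a PUE below 1.2 implies about cooling and distribution overhead.

def pue(total_facility_mw: float, it_equipment_mw: float) -> float:
    return total_facility_mw / it_equipment_mw

it_load_mw = 10.0                      # hypothetical IT load
for overhead_mw in (3.0, 2.0, 1.0):    # hypothetical cooling/distribution overhead
    value = pue(it_load_mw + overhead_mw, it_load_mw)
    print(f"IT load {it_load_mw:.0f} MW + overhead {overhead_mw:.0f} MW -> PUE {value:.2f}")

# A PUE of 1.2 corresponds to 2 MW of overhead per 10 MW of IT load;
# 1.1 corresponds to only 1 MW of overhead per 10 MW.
```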

Blue Waters Project: Great Lakes Consortium for Petascale Computation

Goal: Facilitate the widespread and effective use of petascale computing to address frontier research questions in science, technology and engineering at research, educational and industrial organizations across the region and nation.

Charter members:
- Argonne National Laboratory
- Fermi National Accelerator Laboratory
- Illinois Mathematics and Science Academy
- Illinois Wesleyan University
- Indiana University*
- Iowa State University
- Krell Institute, Inc.
- Louisiana State University
- Michigan State University*
- Northwestern University*
- Parkland Community College
- Pennsylvania State University*
- Purdue University*
- The Ohio State University*
- Shiloh Community Unit School District #1
- Shodor Education Foundation, Inc.
- SURA (60-plus universities)
- University of Chicago*
- University of Illinois at Chicago*
- University of Illinois at Urbana-Champaign*
- University of Iowa*
- University of Michigan*
- University of Minnesota*
- University of North Carolina–Chapel Hill
- University of Wisconsin–Madison*
- Wayne City High School

* CIC universities

Science & Engineering Research on Blue Waters

Blue Waters Project: Computational Science and Engineering
Petascale computing will enable advances in a broad range of science and engineering disciplines: molecular science, weather and climate forecasting, earth science, astronomy, and health.

Petascale Computing Resource Allocations (PRAC): NSF Solicitation

Selection criteria:
- A compelling science or engineering research question
- A question that can only be answered using a system of the scale of Blue Waters (cycles, memory, I/O bandwidth, etc.)
- Evidence, or a convincing argument, that the application code can make effective use of Blue Waters
- A source (or sources) of funds to support the research work and any needed code development effort

Funding:
- Allocation or provisional allocation of time on Blue Waters
- Travel funds to enable teams to work closely with the Blue Waters project team

Next due date: March 17, 2010 (annually thereafter)

Blue Waters Engagement with Research Teams

Provide details on the Blue Waters system.

Provide assistance with Blue Waters software (see the message-passing sketch below):
- Numerical libraries
- MPI, OpenMP, ARMCI/Global Arrays, LAPI, OpenSHMEM, etc.
- Compilers (Fortran, C, UPC, Co-array Fortran)

Provide assistance with Blue Waters hardware:
- Chip and network simulators
- Staged access to Power7 hardware

Provide training:
- On-line documentation
- Webinars, tutorials and on-line courses (~8 per year)
- Workshops (~2 per year)
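Of the programming models listed above, MPI is the one most application teams start from. The fragment below is a minimal message-passing sketch in that style; it uses the mpi4py Python bindings purely for illustration, which is my choice rather than anything the slide prescribes (codes targeting Blue Waters would more typically use Fortran or C MPI).

```python
# Minimal MPI example: each rank computes a partial sum and rank 0 collects
# the total. Illustrative only; run with e.g. "mpiexec -n 4 python this_file.py".
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank sums its own slice of 0..999,999 (a stand-in for real local work).
n = 1_000_000
local_sum = sum(range(rank, n, size))

# Combine the partial results on rank 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ranks computed sum(0..{n - 1}) = {total}")
```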

For More Information:
- Blue Waters website
- IBM Power Systems website
- PRAC solicitation

Questions?