HPCx: Multi-Teraflops in the UK. A World-Class Service for World-Class Research. Dr Arthur Trew, Director.

Presentation transcript:

HPCx: Multi-Teraflops in the UK. A World-Class Service for World-Class Research. Dr Arthur Trew, Director

HPCx rationale
 UK academic research is increasingly dependent upon high-end compute facilities
 recent technological and Grid advances highlighted the need to upgrade UK resources
 HPCx objectives are thus:
– aim "to deliver optimum service resulting in world-leading science"
– address "the problems involved in scaling to the capability levels required"

What is HPCx?
 Consortium of leading UK organisations committed to creating and managing the new resource for the next 6 years
– led by the University of Edinburgh
 multi-stage project to deliver a world-class academic computing resource, the largest in Europe
 £54M/$100M budget
 Grid-enabled, a key component in the UK e-Science programme

HPCx Consortium Members
 Daresbury Laboratory, CCLRC
 IBM
 EPCC, University of Edinburgh
 UoE HPCx Ltd – wholly-owned subsidiary of the University of Edinburgh and lead contractor

 the University of Edinburgh is one of the top 5 research universities in the UK
 EPCC is the leading computer centre in Europe, bridging the gap between academia and industry
 … and provides both HPC and novel computing solutions to a wide range of problems and users
 long experience of providing national HPC services, including:
– Meiko Computing Surfaces
– Thinking Machines CM200
– Cray T3D/T3E

EPCC overview
 Academic:
o National HPC facilities
o Research
o Support
 Training:
o Academia
o Industry
 European leadership:
o Visitor programmes
o Technology Transfer
o Strategic Planning
 Industry:
o Projects
o Consultancy
o Standards
 Technology Transfer

Daresbury Laboratory
 A multi-disciplinary research lab with over 500 people
 Provides large-scale research facilities for both UK academic and industrial research communities
 Runs the UK's Collaborative Computational Projects
 Daresbury hosts the HPCx hardware

 IBM provides the technology for HPCx
 Long-standing involvement in HPC, including the development of a number of ASCI machines and 4 of the top dozen machines in the 21st TOP500 list:
– ASCI White: Rmax = 7.3 TFlop/s
– SP Power3 (6,656 processors): Rmax = 7.3 TFlop/s
– xSeries (1,920 processors): Rmax = 6.6 TFlop/s
– HPCx (1,280 processors): Rmax = 3.2 TFlop/s
 IBM has the long-term technology road map essential to a 6-year project such as HPCx

HPCx in Place

HPCx: Phase 1
 The system will be commissioned in three main stages; Phase 1 consists of:
– 40 Regatta-H SMP nodes, 1,280 processors
– Peak performance 6.6 TFlop/s, with 3.4 TFlop/s sustained on Linpack, currently 16th in the TOP500 (a rough cross-check of these figures follows below)
– 1.28 TB total memory capacity
– Over 50 TB of storage capacity
– Double-plane Colony switch with total peak bandwidth of 250 MB/s per processor
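As a rough cross-check (my own arithmetic; the slide does not state the clock speed, so the assumption here is 1.3 GHz POWER4 processors each completing 4 floating-point operations per cycle via two fused multiply-add units):

\[ 1280 \times 1.3\ \mathrm{GHz} \times 4\ \mathrm{flops/cycle} \approx 6.66\ \mathrm{TFlop/s}, \]

which matches the quoted peak; the sustained Linpack figure then corresponds to an efficiency of roughly \(3.4/6.6 \approx 52\%\) of peak.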

The e-Science Grid
[Figure: map of the UK e-Science Grid, with CeSC (Cambridge) among the labelled centres]

HPCx Phases 2 & 3  Phase 2 ( ) –aiming for 6 TFlop/s sustained on Linpack and 2.5 TFlop/s on sPPM –O(48) Regatta-H+ SMP nodes –interconnect upgraded to Federation switch –doubling of I/O and storage already built a cluster with 8 Regatta-H+ frames and a pre- release Federation switch undertaking a phased upgrade during 1H2004  Phase 3 ( ) –target of 12 TFlop/s sustained on Linpack –may be additional nodes or alternative technology

HPCx Science Support
 18 staff in 5 dual-centre functional support teams, covering:
– Terascaling: capability applications, scalable algorithms, performance optimisation
– Applications Support: helpdesk, training, liaising with users
– Technology: software engineering, underpinning technology, Grid/e-Science, systems & networking
– Outreach: life sciences, new applications
 overall aims: a flexible and responsive capability computing service, with smooth transitions between phases

HPCx Status: Usage

HPCx Status: Application Areas
 currently 28 active consortia, and over 450 users
 Life Sciences outreach activity supported by IBM

Atomic and Molecular Physics
 The UK Multiphoton, Electron Collision and Bose-Einstein Condensates (MECBEC) HPC Consortium
 Two flagship projects model two-electron atoms (helium) and molecules (hydrogen) exposed to intense, ultra-short laser pulses
 Modelling involves the grid solution of multi-dimensional, time-dependent partial differential equations (an illustrative form is given below)
 Visualisation techniques are crucial in extracting information
 Requires large amounts of computing power
[Figure: simulation of double ionization of laser-driven helium, performed at Queen's University Belfast]
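For orientation, an illustrative form of the governing equation (my notation, not reproduced from the slides): the flagship helium calculations solve the time-dependent Schrödinger equation for the two-electron wavefunction, which in atomic units and the dipole approximation reads

\[ i\,\frac{\partial \Psi(\mathbf{r}_1,\mathbf{r}_2,t)}{\partial t} = \Bigl[ -\tfrac{1}{2}\nabla_1^2 - \tfrac{1}{2}\nabla_2^2 - \frac{2}{r_1} - \frac{2}{r_2} + \frac{1}{|\mathbf{r}_1-\mathbf{r}_2|} + \mathbf{E}(t)\cdot(\mathbf{r}_1+\mathbf{r}_2) \Bigr] \Psi(\mathbf{r}_1,\mathbf{r}_2,t). \]

Discretising both electronic coordinates on a spatial grid and propagating this equation through a femtosecond-scale laser pulse is what drives the memory and processor requirements.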

Environmental Science: POLCOMS
 POLCOMS is a multi-disciplinary model developed at the Proudman Oceanographic Laboratory
 … a 3-D hydrodynamic model integrating coasts and oceans using a wide range of associated models (a generic illustration follows below)
 POLCOMS is a step towards real-time modelling of coastal zones, enabling better analysis of impacts to, and sustainability of, the marine environment
[Figure: a simulation of chlorophyll density in UK waters using the POLCOMS model]
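As a hedged illustration (a generic textbook form, not POLCOMS' actual formulation), coupled hydrodynamic-ecosystem models of this kind advance tracer fields such as the chlorophyll concentration \(C\) with an advection-diffusion-reaction equation,

\[ \frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C = \nabla\cdot\bigl(K\,\nabla C\bigr) + S(C), \]

where \(\mathbf{u}\) is the velocity field supplied by the hydrodynamic core, \(K\) a turbulent diffusivity, and \(S(C)\) the biological source and sink terms from the associated ecosystem models.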

Materials Science: CRYSTAL
 CRYSTAL computes the electronic structure and related properties of periodic systems
 Developed jointly by Daresbury and the University of Turin
 A Fortran 90 and MPI program that performs Hartree-Fock, density functional and related calculations (the underlying equations are sketched below)
 On HPCx, CRYSTAL has been used to calculate the structure of the crambin molecule, the largest Hartree-Fock calculation ever converged (1,284 atoms)
– next, the rusticyanin molecule (6,284 atoms)
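For context (the standard textbook formulation rather than anything CRYSTAL-specific), a Hartree-Fock calculation in a finite basis reduces to solving the self-consistent Roothaan-Hall equations,

\[ \mathbf{F}(\mathbf{C})\,\mathbf{C} = \mathbf{S}\,\mathbf{C}\,\boldsymbol{\varepsilon}, \]

where \(\mathbf{F}\) is the Fock matrix (itself dependent on the orbital coefficients \(\mathbf{C}\)), \(\mathbf{S}\) the basis-function overlap matrix and \(\boldsymbol{\varepsilon}\) the orbital energies. The cost is dominated by the two-electron integrals that enter \(\mathbf{F}\) and grow steeply with system size, which is why converging a 1,284-atom system is a capability-class calculation.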

Engineering: UKTC
 The UK Turbulence Consortium is developing world-leading turbulence simulation codes using HPCx
 it is essential that scientists can easily make use of the resulting data
 the calculated data are transferred from HPCx to a remote site for analysis (a minimal sketch of this step follows below)
 so the Grid is becoming increasingly important
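A minimal sketch of the transfer step described above, assuming the GridFTP command-line client (globus-url-copy) and a valid Grid credential are available on the submitting host; the hostnames, file paths and helper name are hypothetical and not taken from the UKTC workflow:

import subprocess

def transfer_results(src_url: str, dst_url: str) -> None:
    """Copy one simulation output file between GridFTP endpoints."""
    # Basic usage: globus-url-copy <source-URL> <destination-URL>.
    # gsiftp:// URLs address remote GridFTP servers; file:// addresses local paths.
    subprocess.run(["globus-url-copy", src_url, dst_url], check=True)

if __name__ == "__main__":
    # Hypothetical example: pull a turbulence dataset from HPCx to a local analysis machine.
    transfer_results(
        "gsiftp://hpcx.example.ac.uk/work/uktc/channel_flow_0420.dat",
        "file:///data/analysis/channel_flow_0420.dat",
    )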

Chemistry: RealityGrid
 RealityGrid is a UK collaboration
– which aims to grid-enable the realistic modelling and simulation of complex condensed-matter structures at the mesoscale and nanoscale
 the HPCx terascaling team has worked on the parallelisation & optimisation of RealityGrid codes, such as LB3D (the underlying lattice Boltzmann update is sketched below)
 RealityGrid also uses the Grid for data transfer, computational steering and remote visualisation
– using Globus on HPCx
– the TeraGyroid project won an award at SC2003
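As background (the standard single-relaxation-time lattice Boltzmann update, given here for illustration rather than quoted from the LB3D source), codes such as LB3D evolve particle distribution functions \(f_i\) on a regular lattice:

\[ f_i(\mathbf{x} + \mathbf{c}_i\,\Delta t,\ t + \Delta t) = f_i(\mathbf{x}, t) - \frac{\Delta t}{\tau}\,\bigl[f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t)\bigr], \]

where the \(\mathbf{c}_i\) are a small set of discrete lattice velocities and \(\tau\) a relaxation time. The collision step is purely local and the streaming step touches only nearest neighbours, which is what makes the method scale well on large machines and lends itself to interactive computational steering.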

Summary
 HPCx is the new UK HPC resource
– capability computing for world-leading science
– largest academic computer in Europe
 Series of IBM pSeries clusters
– 6.6 TF → 12 TF → 22 TF
– remaining very competitive until
 Very successful first year
HPCx: Supporting the Future of UK Research

Summary: Website
 For further information:
[Figure: the front page of the HPCx website]