
Computational Science at Edinburgh: From Excellence to Enterprise
Dr Arthur Trew, Director

on the shoulders of giants
 for centuries science has relied on experiment and theory, but
–theory is best suited to a reductionist approach
–experiment is limited in its range of applicability
 there is a need for a complementary approach: simulation
–for problems which are too large/small, too fast/slow, too complex or too expensive
 the study of emergent phenomena
[timeline: Theory – Greece, 400 BC; Experiment – Italy, 1500 AD; Simulation – Edinburgh, 1980 AD]

becoming the best
 in 1990 Edinburgh established EPCC as its focus for HPC simulation
–following a decade of research in Physics
 in 1992 EPCC set itself a mission “to be Europe’s top HPC centre within 5 years”
–we succeeded
–today, that success is benefiting Scottish research and industry
 in 2001 Edinburgh & Glasgow created NeSC to extend that vision
–over the past 18 months EPCC & NeSC have brought in £65M of research grants/contracts, of which £2.3M from SHEFC for eDIKT
–Edinburgh & Glasgow backed this with £3M
–… and foundation departments lead major UK Grid projects

what’s that to me?
 as computer performance improves, the range of applications increases
–from drug design, protein structures, nanostructures, cells and organs, through materials design and astroplasmas, to solar weather, eddy-resolving oceans, whole-aircraft and whole-earth climate modelling
 HPCx £53M: 3 machines

linking to data
 but simulation is only part of the story …
 linking computer modelling to experiment is vital to the interpretation of many large experiments, eg the LHC
 … and with big experiments set to generate PBytes/year, data management is critical
 … but we have the web, don’t we?

EPCC overview
 Academic: national HPC facilities, research, support
 Training: academia, industry
 European leadership: visitor programmes, technology transfer, strategic planning
 Industry: projects, consultancy, standards, technology transfer
 70 staff, £3M turnover
 … possible through the “win-win-win” model

HPCx rationale
 UK academic research is increasingly dependent upon high-end compute facilities
 recent technological and Grid advances highlighted the need to upgrade UK resources
 HPCx objectives are thus:
–to “deliver optimum service resulting in world-leading science”
–to address “the problems involved in scaling to the capability levels required”

What is HPCx?
 consortium of leading UK organisations committed to creating and managing the new resource for the next 6 years
–led by the University of Edinburgh
 multi-stage project to deliver a world-class academic computing resource, the largest in Europe
 £54M/$100M budget
 Grid-enabled: a key component in the UK e-Science programme

HPCx Consortium Members
 Daresbury Laboratory, CCLRC
 IBM
 EPCC, University of Edinburgh
 UoE HPCx Ltd – wholly-owned subsidiary of the University of Edinburgh and lead contractor

 the University of Edinburgh is one of the top 5 research universities in the UK
 EPCC is the leading computer centre in Europe, bridging the gap between academia and industry
 … and provides both HPC and novel computing solutions to a wide range of problems and users
 long experience of providing national HPC services, including:
–Meiko Computing Surfaces
–Thinking Machines CM-200
–Cray T3D/T3E

Daresbury Laboratory
 a multi-disciplinary research lab with over 500 people
 provides large-scale research facilities for both UK academic and industrial research communities
 runs the UK’s Collaborative Computational Projects
 Daresbury hosts the HPCx hardware

 IBM provides the technology for HPCx
 long-standing involvement in HPC, including the development of a number of ASCI machines and 4 of the top dozen machines in the 21st TOP500 list:
–ASCI White: Rmax = 7.3 TFlop/s
–SP Power3 (6656 processors): Rmax = 7.3 TFlop/s
–xSeries (1920 processors): Rmax = 6.6 TFlop/s
–HPCx (1280 processors): Rmax = 3.2 TFlop/s
 IBM has the long-term technology roadmap essential to a 6-year project such as HPCx

HPCx in Place

HPCx: Phase 1
 the system will be commissioned in three main stages; Phase 1 consists of:
–40 Regatta-H SMP nodes, 1280 processors
–peak performance 6.6 TFlop/s, with 3.4 TFlop/s sustained on Linpack – currently 16th in the TOP500
–1.28 TB total memory capacity
–over 50 TB of storage capacity
–double-plane Colony switch with total peak bandwidth of 250 MB/s per processor
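These figures can be sanity-checked with simple arithmetic. A minimal sketch follows; the 1.3 GHz clock rate and 4 floating-point operations per cycle are assumptions based on the POWER4 processor, not stated on the slide:

```python
# Back-of-the-envelope check of the quoted Phase 1 performance.
# Assumed: 1.3 GHz POWER4 processors with two fused multiply-add units,
# i.e. 4 floating-point operations per cycle (not stated on the slide).
processors = 1280
clock_hz = 1.3e9
flops_per_cycle = 4

peak_tflops = processors * clock_hz * flops_per_cycle / 1e12
print(f"theoretical peak:   {peak_tflops:.1f} TFlop/s")  # ~6.7, matching the quoted 6.6

linpack_tflops = 3.4  # sustained Linpack figure from the slide
print(f"Linpack efficiency: {linpack_tflops / peak_tflops:.0%}")  # ~51%
```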

The e-Science Grid
[map of the UK e-Science Grid centres, including CeSC (Cambridge)]

HPCx Phases 2 & 3
 Phase 2
–aiming for 6 TFlop/s sustained on Linpack and 2.5 TFlop/s on sPPM
–O(48) Regatta-H+ SMP nodes
–interconnect upgraded to the Federation switch
–doubling of I/O and storage
–already built a cluster with 8 Regatta-H+ frames and a pre-release Federation switch
–undertaking a phased upgrade during 1H2004
 Phase 3
–target of 12 TFlop/s sustained on Linpack
–may be additional nodes or alternative technology

HPCx Science Support
 18 staff in 5 dual-centre functional support teams:
–Applications Support: helpdesk, training, liaising with users
–Terascaling: capability applications, scalable algorithms, performance optimisation
–Software Engineering: underpinning technology, Grid/e-Science
–Systems & Networking: flexible and responsive capability computing service, smooth transitions between phases
–Outreach: life sciences, new applications

HPCx Status: Usage

HPCx Status: Application Areas
 currently 28 active consortia, and over 450 users
 Life Sciences outreach activity supported by IBM

Atomic and Molecular Physics
 the UK Multiphoton, Electron Collision and Bose-Einstein Condensates (MECBEC) HPC Consortium
 two flagship projects model two-electron atoms (helium) and molecules (hydrogen) exposed to intense, ultra-short laser pulses
 modelling involves the grid solution of multi-dimensional, time-dependent partial differential equations
 requires large amounts of computing power
 visualisation techniques are crucial in extracting information
[image: simulation of double ionization of laser-driven helium, performed at Queen’s University Belfast]
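To give a flavour of what the grid solution of a time-dependent PDE means in practice, here is a minimal sketch in one dimension: a time-dependent Schrödinger equation for a model atom in a laser field, advanced with the split-step Fourier method. This is not the MECBEC consortium's code; the soft-core potential and all parameter values are illustrative assumptions, and the real calculations are multi-dimensional and massively parallel.

```python
import numpy as np

# 1-D grid in atomic units (sizes chosen purely for illustration)
n, L, dt = 1024, 200.0, 0.05
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)        # conjugate momentum grid

psi = np.exp(-x**2)                             # initial Gaussian wavepacket
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)     # normalise
V0 = -1.0 / np.sqrt(x**2 + 2.0)                 # soft-core "atomic" potential

def field(t, E0=0.05, omega=0.057):             # assumed laser-pulse parameters
    return E0 * np.sin(omega * t)

for step in range(2000):
    V = V0 + x * field(step * dt)               # dipole coupling to the laser
    psi *= np.exp(-0.5j * dt * V)               # half-step in the potential
    psi = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(psi))  # kinetic step
    psi *= np.exp(-0.5j * dt * V)               # second potential half-step

print("norm:", np.sum(np.abs(psi)**2) * dx)     # stays ~1: evolution is unitary
```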

Environmental Science: POLCOMS
 POLCOMS is a multi-disciplinary model developed at the Proudman Oceanographic Laboratory
 … a 3-D hydrodynamic model integrating coasts and oceans using a wide range of associated models
 POLCOMS is a step towards real-time modelling of coastal zones, enabling better analysis of impacts to, and sustainability of, the marine environment
[image: a simulation of chlorophyll density in UK waters using the POLCOMS model]

Material Science: CRYSTAL
 CRYSTAL computes the electronic structure and related properties of periodic systems
 developed jointly by Daresbury and the University of Turin
 a Fortran90/MPI program that performs Hartree-Fock, density functional and other approximate calculations
 on HPCx, CRYSTAL has been used to calculate the structure of the crambin molecule – at 1284 atoms, the largest Hartree-Fock calculation ever converged
–next, the rusticyanin molecule (6284 atoms)
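The computational core of any Hartree-Fock code is the self-consistent-field (SCF) iteration. The sketch below shows that loop in its simplest form, assuming an orthonormal basis and precomputed one- and two-electron integrals; CRYSTAL itself works with Gaussian basis sets under periodic boundary conditions, and spends much of its time on the integrals this sketch takes as given.

```python
import numpy as np

def scf(h, eri, n_occ, max_iter=50, tol=1e-8):
    """Schematic restricted Hartree-Fock SCF loop (orthonormal basis).
    h: (n, n) core Hamiltonian; eri: (n, n, n, n) integrals (ij|kl)."""
    dm, energy = np.zeros_like(h), 0.0          # zero density-matrix guess
    for _ in range(max_iter):
        J = np.einsum("ijkl,kl->ij", eri, dm)   # Coulomb matrix
        K = np.einsum("ikjl,kl->ij", eri, dm)   # exchange matrix
        fock = h + J - 0.5 * K
        _, C = np.linalg.eigh(fock)             # diagonalise the Fock matrix
        occ = C[:, :n_occ]                      # occupy the lowest orbitals
        dm = 2.0 * occ @ occ.T                  # rebuild the density matrix
        new_energy = 0.5 * np.sum(dm * (h + fock))  # HF electronic energy
        if abs(new_energy - energy) < tol:      # self-consistency reached
            break
        energy = new_energy
    return new_energy, dm
```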

Engineering: UKTC
 the UK Turbulence Consortium is developing world-leading turbulence simulation codes on HPCx
 it is essential that scientists can easily make use of the resulting data
 the calculated data are transferred from HPCx to a remote site for analysis
 so, the Grid is becoming increasingly important
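In practice a transfer like this would use GridFTP, the Grid's bulk data-movement protocol. A hedged sketch follows: globus-url-copy is the standard Globus Toolkit client, but the host names and paths here are invented for illustration.

```python
import subprocess

# Hypothetical endpoints -- real HPCx paths and analysis hosts would differ.
src = "gsiftp://hpcx.example.ac.uk/scratch/uktc/turbulence_field.dat"
dst = "gsiftp://analysis.example.ac.uk/data/turbulence_field.dat"

# "-p 4" asks GridFTP for four parallel TCP streams to fill the WAN link.
subprocess.run(["globus-url-copy", "-p", "4", src, dst], check=True)
```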

Chemistry: RealityGrid
 RealityGrid is a UK collaboration
–which aims to grid-enable the realistic modelling and simulation of complex condensed matter structures at the meso- and nanoscale levels
 the HPCx terascaling team has worked on the parallelisation & optimisation of RealityGrid codes, such as LB3D
 RealityGrid also uses the Grid for data transfer, computational steering, and remote visualisation
–using Globus on HPCx
–the TeraGyroid experiment won an award at SC2003
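Computational steering means letting a scientist adjust a running simulation rather than killing and resubmitting it. RealityGrid did this with a dedicated steering library over Grid services; the fragment below is only a minimal illustration of the pattern, with the file-based handshake and all names invented for the sketch.

```python
import json, os

STEER_FILE = "steer.json"                 # hypothetical: written by a remote client
params = {"coupling": 0.1, "timestep": 0.01}   # illustrative steerable parameters

def poll_steering(params):
    """Apply any steered parameter changes, consuming the request file."""
    if os.path.exists(STEER_FILE):
        with open(STEER_FILE) as f:
            params.update(json.load(f))   # apply the steered values mid-run
        os.remove(STEER_FILE)             # acknowledge by removing the file
    return params

state = 0.0
for step in range(100_000):
    state += params["coupling"] * params["timestep"]  # stand-in for the physics
    if step % 1000 == 0:                              # poll at a coarse interval
        params = poll_steering(params)
```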

The Vision
 we want Edinburgh to lead the e-Science revolution
–to become the European equivalent of one of the big US centres, eg the San Diego Supercomputer Center