Supercomputing and Grid Joint Use Case
Peter Kunszt, ETH Zürich / Swiss National Supercomputing Centre CSCS
EGI_DS meeting, Istanbul

Careful with the Terminology

High Performance Computing (HPC)
–Layman's definition: LINPACK Top500 ranking of hardware
–Supercomputing community definition: very few jobs completely max out a very large resource – resource-hungry applications
–Cluster computing community definition: usage of powerful modern clusters in general – a large number of applications
Grid Computing
–Sold as: ubiquitous computing, like electric power from the outlet
–Advantages praised as: standard interfaces, applications are not resource-bound
–Common misconception: it is for free
–Also includes high-throughput computing, collaborative computing, cloud computing, ...

Terminology suggestion

Avoid the terms HPC and Grid, as they have too many conflicting connotations. Use instead:
–Supercomputing: very resource-hungry, highly machine-specific optimized simulations on cutting-edge systems
–Cluster computing: applications able to run on general-purpose mainstream systems with ever-increasing capacity and capability
–Collaborative computing: user domains in need of several heterogeneous, geographically distributed systems
–On-demand computing: usage of third-party resources on a pay-per-use basis (mostly to deal with spike overload of local resources)
–Desktop computing: exploitation of spare desktop cycles

EGI has to consider all of these COMPLEMENTARY resources.
This talk focuses on the supercomputing case.

SC: The Scientific Case

Weather, climatology, Earth science
–Degree of warming, scenarios for our future climate
–Understanding and predicting ocean properties and variations
–Weather and flood events
Astrophysics, elementary particle physics, plasma physics
–Systems and structures spanning a large range of length and time scales
–Quantum field theories such as QCD; ITER
Materials science, chemistry, nanoscience
–Understanding complex materials, complex chemistry, nanoscience
–Determination of electronic and transport properties
Life science
–Systems biology, chromatin dynamics, large-scale protein dynamics, protein association and aggregation, supramolecular systems, medicine
Engineering
–Complex helicopter simulation, biomedical flows, gas turbines and internal combustion engines, forest fires, green aircraft
–Virtual power plant

The PRACE Mission

Create a persistent pan-European HPC service:
–Provide European researchers with world-class computing resources
–Establish the top level of the European HPC ecosystem, involving national, regional and topical HPC centres
–Deploy several leadership computing systems at selected tier-0 centres

14 European countries participate; several are willing to fund and operate a tier-0 centre.

PRACE Roadmap (timeline figure)
–DEISA, eDEISA, DEISA2 projects
–PRACE Initiative: legal form, funding and peer review defined; beginning of close cooperation; prototype procurement
–PRACE Project: creation of the legal entity; procurement of the first tier-0 systems
–PRACE Research Infrastructure: providing the European tier-0 HPC service

Supercomputing Use Case Categories

Instrument data analysis: one or several sources of experimental data to be analyzed
–Astrophysics, geophysics, satellites, nanosciences
Simulation: simulation of large systems
–Plasma physics, fluid dynamics, solid-state physics, QCD, computational chemistry, life sciences, climate modeling
Time-critical applications: urgent computing
–Earthquake and tsunami predictions, weather models
Complex workflows: correlation of complex data
–Life sciences, nanosciences, molecular dynamics

Additional Needs

The same four use case categories (instrument data analysis, simulation, time-critical applications, complex workflows) additionally require:
–Collaborative services
–Clusters
–Storage
–Network transfer
–Security
–Optimization

How people work today

Projects are formed and scientific proposals are assembled, requesting CPU time and storage space – the equivalent of a grant
A peer review process allocates time, based on technical feasibility and scientific merit and value
Project duration: 1-3 years
Many projects have long histories – communities that have been working together for years and have established processes
Very close collaboration with resource providers
SSH access to machines, direct queue submission (a sketch follows below)
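As an illustration of the last point (SSH access and direct queue submission), here is a minimal sketch of how a project member might submit a run from a workstation, assuming a SLURM-like batch system on the remote machine. The host name, project/account ID, queue directives and the simulation binary are all hypothetical, and many sites of that era ran PBS, LoadLeveler or NQS instead of SLURM, so the directives would differ.

#!/usr/bin/env python3
"""Hypothetical sketch: log in to an allocated supercomputer over SSH and
submit a job directly to the local batch queue, as described in the slide
above. All names (host, project ID, binary) are made up for illustration."""

import subprocess
import textwrap

HOST = "login.supercomputer.example.org"   # hypothetical login node
PROJECT = "s123"                           # hypothetical allocation/project ID

# A small SLURM-style job script; directives vary per site and scheduler.
JOB_SCRIPT = textwrap.dedent(f"""\
    #!/bin/bash
    #SBATCH --job-name=climate_run
    #SBATCH --account={PROJECT}
    #SBATCH --nodes=64
    #SBATCH --time=06:00:00
    srun ./my_simulation --input run_042.nml
    """)

def submit() -> None:
    """Write the job script on the remote machine and hand it to the queue."""
    remote_cmd = "cat > job.sbatch && sbatch job.sbatch"
    result = subprocess.run(
        ["ssh", HOST, remote_cmd],
        input=JOB_SCRIPT, text=True, capture_output=True, check=True,
    )
    # sbatch typically answers with something like "Submitted batch job 987654"
    print(result.stdout.strip())

if __name__ == "__main__":
    submit()

The point of the sketch is that the entire workflow runs against one specific, peer-review-allocated machine, with no Grid middleware in between.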

How people work today: DEISA

5-10% of each site's resources are dedicated to the DEISA open call
Submission through UNICORE is possible but not mandatory
Possibility to optimize resources by matching the application to the resource
'Local' resources are usually dedicated to the groups
Groups have dedicated scientific computing experts working on their computing issues

Differences to current vision

Project-oriented, not VO-oriented
–No continuity beyond the project
–Very clear process for how to apply and enter
–Resources are allocated by peer review, not 'taken with you' through an NGI
Inherently NOT matching the Grid 'ubiquity'
–Not at all like 'computing power from the outlet'
–Specialized optimization and knowledge are necessary to build supercomputing applications
Supercomputing resources are nonstandard
–Restricted usage of custom and specialized resources
–They will rarely adhere to established standards

Questions to address

SSC
–What role does it have concerning supercomputing resources?
Virtual Organisations
–What is the motivation to build one? To join one?
–How is one established and dismantled?
Project – VO – SSC – NGI – EGI relations
–Who is responsible for what? Who provides what resource?
–Peer review vs. my own resource vs. NGI resource