Distributed EU-wide Supercomputing Facility as a New Research Infrastructure for Europe

Gabrielle Allen, Albert-Einstein-Institut, Germany
Jarek Nabrzyski, PSNC, Poland
Ed Seidel, Albert-Einstein-Institut, Germany
Maciej Stroiński, PSNC, Poland

Computational Needs Not Met

There is great scientific and engineering talent in the EU, and many EU programs with a training mission are now beginning, but:
- No EU-wide facilities
- No collaborative infrastructure
- No HPC training (except do-it-yourself)
- Researchers must rely on US connections

What do others do? In many cases they work on toy problems, and EU science and engineering fall behind. Germany, the UK, and France are slight exceptions, but even they operate at no large scale. Many countries (mainly the Newly Associated States, but not only) are simply cut off from modern computational science.

Discovery Channel Movie (aired June 3, 2002!)

3,000 frames of volume rendering from terabytes of simulation data. The simulation, from an EU project, had to be computed and visualized in the US!

How Did We Do This EU Calculation?

- Used the largest academic machines for the simulation: LBL/NERSC (US DOE) and the NCSA Platinum cluster
- High-speed backbone for data transfer
- Flew students from Berlin to Illinois
- 3 weeks of analysis and visualization, using special facilities unavailable to our project in the EU
- Brought 1 TB of data back on 6 hard disks purchased in the US
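A back-of-the-envelope comparison shows why six disks on a plane beat the network in 2002, and what the 100 Gb/s target mentioned on a later slide would change. All link speeds and the courier time below are illustrative assumptions, not figures from the project:

```python
# Rough transfer-time comparison for 1 TB of simulation output.
# The sustained throughputs and the courier time are assumptions
# made for illustration, not measured project figures.

TB_BITS = 1e12 * 8  # 1 TB expressed in bits

links = {
    "~20 Mb/s sustained (assumed shared transatlantic TCP, ca. 2002)": 20e6,
    "1 Gb/s dedicated": 1e9,
    "100 Gb/s (the facility's target)": 100e9,
}

for name, bps in links.items():
    seconds = TB_BITS / bps
    if seconds < 3600:
        print(f"{name}: {seconds:.0f} seconds")
    else:
        print(f"{name}: {seconds / 3600:.0f} hours")

print("Courier with 6 disks: ~48 hours door to door (assumed), any data size")
```

At the assumed 2002 rates the disks win comfortably; at 100 Gb/s the same terabyte would move in about 80 seconds, which is exactly the point of the proposed backbone.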

US Programs

- Available to academic researchers from every US state, and occasionally to EU researchers
- NSF PACI Program: started 20 years ago; 5 national centers, now 3, combining to form the TeraGrid
- US DOE: LBL/NERSC, LANL, etc.
- NASA: Goddard, JPL, Ames, etc.

The US is far ahead of the EU in this regard. But even there it is widely recognized that the facilities are woefully small for today's and tomorrow's science and engineering.

Other Efforts

This is not a unique idea! ARCADE, CERN, ENACTS, GRIDSTART, other projects, and many of the ideas presented today point in the same direction.

Requirements:
- Must be application oriented
- Build both infrastructure and expertise centers
- Leverage Grid projects, computer science efforts, national centers, networks, etc.
- Work united with other project organizers
- A long-term funding process is required!

Our Vision for the EU Facility

Science and engineering drivers:
- Expertise centers (CFD, astrophysics, computational biology, aerospace, etc.), shepherding and building communities across Europe
- Present science and engineering calculations require terabytes and Tflop/s, yet are just scratching the surface; much more is needed to enable new science (see the sizing sketch below)

Computational science drivers:
- Need Grid software infrastructure
- Cycles, networks, visualization centers
- Partnerships with existing centers, networks, etc.
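As a sanity check on the "terabytes and Tflop/s" claim, here is an illustrative sizing of a single large grid-based run such as a black-hole evolution. The grid size, variable count, and cost per point are round-number assumptions for the sketch, not project measurements:

```python
# Illustrative sizing of one large 3D grid-based simulation.
# All constants below are assumed round numbers, not project figures.

points = 1000**3          # 10^9 grid points (a 1000^3 mesh)
variables = 100           # evolved + auxiliary fields per point (assumed)
bytes_per_value = 8       # double precision

state_tb = points * variables * bytes_per_value / 1e12
print(f"In-memory state: {state_tb:.1f} TB")

flops_per_point_step = 5000   # assumed update cost per point per step
steps = 100_000               # assumed length of a full evolution
total_flops = points * flops_per_point_step * steps

for tflops in (1, 10):
    days = total_flops / (tflops * 1e12) / 86400
    print(f"Sustained {tflops} Tflop/s: {days:.1f} days per run")
```

Even with these modest assumptions, one run fills roughly a terabyte of memory and occupies a sustained teraflop machine for days; parameter studies multiply that by tens of runs, hence "just scratching the surface."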

Vision of the Facility: Distributed Nature

- Pushes networking technologies through applications
- Works to integrate European efforts: scientific, cultural, etc.
- East and West connected at 100 Gb/s! Multi-lambda backplane (GÉANT?)
- A unique EU facility in the HPC world: leapfrog the TeraGrid with a big machine, but also draw on unique, more diverse EU research strengths
- The beginning of a worldwide Grid effort, with unique challenges in integrating diverse EU grids; already strong support from US, NSF, and DOE principals
- Scientists and engineers will know they have an expanding facility over the coming decades, which can shape how they plan their projects

Our Vision

[Map of Europe showing the proposed major centers and satellite sites]

Worldwide Grid

The main black-hole (BH) simulation starts at one site; all analysis tasks are spawned automatically to free resources worldwide. The user only has to invoke the "Spawner" thorn.
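The "Spawner" is a Cactus thorn (a plug-in module); the real system is written in C/Fortran and driven by parameter files. The Python sketch below only illustrates the pattern the slide describes: the main loop keeps evolving while analysis jobs are farmed out to free remote resources. All names in it (hosts, functions) are invented for illustration:

```python
# Conceptual sketch of the spawning pattern: the simulation submits
# analysis tasks without waiting for them, so free resources anywhere
# can absorb the work.  Hosts and functions here are hypothetical.

from concurrent.futures import ThreadPoolExecutor

FREE_RESOURCES = ["psnc.pl", "aei.mpg.de", "ncsa.edu"]  # hypothetical hosts

def analyze(step, host):
    # Stand-in for shipping one timestep's output to a remote host
    # and running a visualization/analysis job there.
    return f"step {step} analyzed on {host}"

def main_simulation(n_steps=6):
    with ThreadPoolExecutor() as pool:
        futures = []
        for step in range(n_steps):
            # ... evolve the simulation one step here ...
            host = FREE_RESOURCES[step % len(FREE_RESOURCES)]
            futures.append(pool.submit(analyze, step, host))  # spawn, don't wait
        for f in futures:
            print(f.result())

main_simulation()
```

The essential point is the non-blocking submit: the simulation never waits for its analysis tasks, which is what makes idle resources worldwide usable for them.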