Jürgen Knobloch/CERN Slide 1
Grids for Science in Europe
by Jürgen Knobloch, CERN IT Department
Presented at the Gridforum.nl Annual Business Day, Eindhoven, April 2, 2008
Planning for sustainability

CERN
Annual budget: ~1000 MSFr (~600 M€)
Staff members: 2650
Member states, fellows, associates, and CERN users
Basic research into fundamental questions
High-energy accelerator: a giant microscope (p = h/λ)
Generates new particles (E = mc²)
Creates Big Bang conditions

Large Hadron Collider (LHC)
27 km circumference
Cost ~3000 M€ (plus detectors)
Proton-proton (or lead-ion) collisions at 7 + 7 TeV
Proton bunches cross every 25 ns
600 million collisions/sec
Physics questions:
– Origin of mass (Higgs?)
– Dark matter?
– Matter-antimatter symmetry
– Forces – supersymmetry
– Early universe – quark-gluon plasma
– …
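The "600 million collisions/sec" figure follows from the machine parameters. A back-of-the-envelope sketch, assuming the LHC design luminosity of 10³⁴ cm⁻² s⁻¹ and an approximate inelastic proton-proton cross-section of ~60 mb (neither number is on this slide):

```python
# Back-of-the-envelope check of the "600 million collisions/sec" figure.
# Assumed values (not from the slides): design luminosity L = 1e34 cm^-2 s^-1,
# inelastic pp cross-section ~60 mb.

LUMINOSITY = 1e34          # cm^-2 s^-1 (LHC design value)
SIGMA_INELASTIC_MB = 60.0  # millibarn, approximate
MB_TO_CM2 = 1e-27          # 1 mb = 1e-27 cm^2

# Event rate = luminosity x cross-section
rate = LUMINOSITY * SIGMA_INELASTIC_MB * MB_TO_CM2  # collisions per second
print(f"{rate:.1e} collisions/sec")  # ~6e8, i.e. ~600 million
```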

LHC gets ready …

LHC Computing Challenge

Requirements Match?
Tape and disk requirements exceed 10 times what CERN alone can provide
Substantial computing is required outside CERN

Timeline – Grids
Grid projects leading to WLCG: EU DataGrid; GriPhyN, iVDGL, PPDG; LCG 1 and LCG 2; EGEE 1 and EGEE 2; GRID 3 and OSG, with inter-operation between them
Milestones on the way to first physics: Data Challenges, Service Challenges, CCRC08 (Common Computing Readiness Challenge), cosmics

WLCG Collaboration

Events at LHC
Luminosity: 10³⁴ cm⁻² s⁻¹
Bunch crossings at 40 MHz – every 25 ns
~20 overlapping events per crossing

Trigger & Data Acquisition

Data Recording
Peak recording rates in GB/sec during ion runs

Tier-0, Tier-1, Tier-2
Tier-0 (CERN): data recording, first-pass reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (>250 centres): simulation, end-user analysis
Target: 2 GB/sec data transfer out of Tier-0
Note: for a site to be reliable, many things have to work simultaneously!
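To get a feel for the 2 GB/sec Tier-0 export target, a quick conversion to daily volume:

```python
# Daily export volume implied by the 2 GB/sec Tier-0 transfer target.
TARGET_GB_PER_SEC = 2
SECONDS_PER_DAY = 24 * 3600

daily_tb = TARGET_GB_PER_SEC * SECONDS_PER_DAY / 1000  # GB -> TB
print(f"Tier-0 export: {daily_tb:.1f} TB/day")  # 172.8 TB/day
```

Sustaining roughly 173 TB/day to the Tier-1 centres is why so many services must work simultaneously for a site to count as reliable.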

Middleware
Security
– Virtual Organization Management (VOMS)
– MyProxy
Data management
– File Catalogue (LFC)
– File Transfer Service (FTS)
– Storage Element (SE)
– Storage Resource Management (SRM)
Job management
– Workload Management System (WMS)
– Logging and Bookkeeping (LB)
– Computing Element (CE)
– Worker Nodes (WN)
Information system
– BDII (Berkeley Database Information Index) and R-GMA (Relational Grid Monitoring Architecture) aggregate service information from multiple Grid sites; now moved to SAM (Site Availability Monitoring)
– Monitoring & visualization (GridView, Dashboard, GridMap, etc.)
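As an illustrative sketch of how the security and job-management pieces line up in a typical submission flow: `voms-proxy-init` and `glite-wms-job-submit` are the actual gLite command names, but the wrapper function, the VO name, and the JDL file path below are invented for illustration, and the code only builds and prints the command lines rather than executing them:

```python
# Illustrative sketch: a VOMS-authenticated submission to the WMS.
# voms-proxy-init / glite-wms-job-submit are real gLite CLI names;
# the VO "myvo.example.org" and "hello.jdl" are placeholders.

def build_submit_flow(vo: str, jdl_path: str) -> list[list[str]]:
    """Return the command lines for a VOMS-authenticated WMS submission."""
    return [
        ["voms-proxy-init", "--voms", vo],         # security: obtain VOMS proxy
        ["glite-wms-job-submit", "-a", jdl_path],  # job management: submit via WMS
    ]

for cmd in build_submit_flow("myvo.example.org", "hello.jdl"):
    print(" ".join(cmd))
```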

Increasing Workloads
A factor-3 capacity ramp-up is in progress for LHC operation
More than one third of the workload is already non-LHC

Many Grid Applications
At present there are about 20 applications from more than 10 domains on the EGEE Grid infrastructure:
– Astronomy & astrophysics: MAGIC, Planck
– Computational chemistry
– Earth sciences: Earth observation, solid-earth physics, hydrology, climate
– Fusion
– High-energy physics: the 4 LHC experiments (ALICE, ATLAS, CMS, LHCb), BaBar, CDF, DØ, ZEUS
– Life sciences: bioinformatics (drug discovery, Xmipp_MLrefine, etc.)
– Condensed-matter physics
– Computational fluid dynamics
– Computer science/tools
– Civil protection
– Finance (through the Industry Task Force)

Grid Applications
Examples: medical imaging, seismology, chemistry, astronomy, fusion, particle physics

Computing at the Tera-Scale
Tier-1 centres: TRIUMF (Canada); GridKa (Germany); IN2P3 (France); CNAF (Italy); SARA/NIKHEF (NL); Nordic Data Grid Facility (NDGF); ASCC (Taipei); RAL (UK); BNL (US); FNAL (US); PIC (Spain)

Evolution
e-Infrastructure evolving from national to European to global

Sustainability

European Grid Initiative
Goal: long-term sustainability of grid infrastructures in Europe
Approach: establishment of a new federated model bringing together NGIs to build the EGI Organisation
EGI Organisation: coordination and operation of a common multi-national, multi-disciplinary Grid infrastructure
– To enable and support international Grid-based collaboration
– To provide support and added value to NGIs
– To liaise with corresponding infrastructures outside Europe

EGI_DS Consortium Members
– Johannes Kepler Universität Linz (GUP)
– Greek Research and Technology Network S.A. (GRNET)
– Istituto Nazionale di Fisica Nucleare (INFN)
– CSC – Scientific Computing Ltd. (CSC)
– CESNET, z.s.p.o. (CESNET)
– European Organization for Nuclear Research (CERN)
– Verein zur Förderung eines Deutschen Forschungsnetzes – DFN-Verein (DFN)
– Science & Technology Facilities Council (STFC)
– Centre National de la Recherche Scientifique (CNRS)

EGI – European Grid Initiative
The EGI Design Study started in September 2007 to establish a sustainable pan-European grid infrastructure after the end of EGEE-III in 2010
The main foundations of EGI are 37 National Grid Initiatives (NGIs)
The project is funded by the European Commission's 7th Framework Programme

EGI_DS Schedule
Duration: 27 months
Milestones: EU call; start of the EGI Design Study; submission and start of EGEE-III; development of the EGI proposal; final draft of the EGI blueprint proposal; EGI blueprint proposal; NGIs signing the proposal; deadline for the EGI proposal; EGEE-III transition to an EGI-like structure; EGI entity in place; EGI operational
Context: EGEE-II (2 years), EGEE-III (2 years)

EGI Functionality Overview
– Management, outreach & dissemination – representation of EU Grid efforts
– Operations, resource provisioning & security
– Application support & training
– Middleware (build & test, component selection/validation/deployment)
– Standardisation & policies
– Industry take-up

EGI Management Structure
– EGI Council and Strategy Committee
– EGI Organisation: Director + staff
– CTO (Development) – Development Group
– COO (Operations) – Operations Group
– CAO (Admin + PR) – Admin + PR Group
– Advisory committees
– Liaison with political bodies such as e-IRG, EU, ESFRI, national bodies, …

Characteristics of NGIs
Each NGI …
– should be a recognized national body with a single point of contact
– should mobilise national funding and resources
– should ensure an operational national grid infrastructure
– should support user communities (application-independent, and open to new user communities and resource providers)
– should contribute and adhere to international standards and policies
Responsibilities between NGIs and EGI are split to be federated and complementary
Work in progress! Things may change!

EGI Operations
– NGIs operate their national grid infrastructures, including monitoring and accounting, following a general SLA
– Each NGI has full responsibility towards the other NGIs but acts independently
– EGI provides a coordination level to guarantee effective, integrated operations through the adoption of standard services
Work in progress! Things may change!
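A minimal sketch of the kind of monitoring aggregation an NGI might report under such an SLA, in the spirit of SAM-style availability figures. The site names, probe results, and the aggregation function itself are all invented for illustration:

```python
# Minimal sketch (all names and numbers invented): aggregating per-site
# probe results into an availability fraction, SAM-style.

def availability(probe_results: dict[str, list[bool]]) -> dict[str, float]:
    """Fraction of passed probes per site (sites with no probes are skipped)."""
    return {site: sum(r) / len(r) for site, r in probe_results.items() if r}

results = {
    "SITE-A": [True, True, True, False],  # 3 of 4 probes passed
    "SITE-B": [True, True, True, True],   # all probes passed
}
for site, frac in availability(results).items():
    print(f"{site}: {frac:.0%}")
```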

EGI Resource Strategy
– EGI owns very few hardware resources
– Resource provisioning is the responsibility of the NGIs, which are free to choose which resources to offer
– Development of a strategy for simple entry of new Virtual Organizations
Work in progress! Things may change!

EGI Middleware
– Convergence of middleware stacks towards full interoperability (through general standards interfaces)
– Distinction between core and high-level functions; EGI coordinates the core services
– Standards-compliance tests for core services provided by EGI
– A common build & test system is needed in Europe
Work in progress! Things may change!

EGI Application Support
– Application support is implemented and provided by the NGIs
– EGI performs a coordination role: validation of user support by NGIs, facilitating the exchange of experts between NGIs, organization of user forums
– EGI coordinates training events
Work in progress! Things may change!

EGI Funding
The EGI organisation has three major and separate cost centres:
– general central services, funded through contributions from the NGIs
– service provisioning, funded through service charges
– developments, funded through project grants
In general, no cross-subsidies are possible between these cost centres.
Work in progress! Things may change!

Issues Being Discussed
– Small vs. large central EGI
– Division of roles between the "core EGI" and the NGIs
– Transition from an (EU) project-funded grid to self-sustained operations
– Maintaining the available expertise
– Choice of legal structure and location

EGI – European Grid Initiative
The future EGI Organisation will be the "glue" between the various grid communities in Europe and beyond
EGI_DS defines the required mechanisms and functionalities of the EGI Organisation
Towards a sustainable environment for the application communities that use grid infrastructures in their everyday work