Towards a HEP(++) SSC WLCG Overview Board, June 2009 Jamie Shiers Grid Support Group IT Department, CERN

Purpose
The purpose of this presentation is to inform you of what is being done with respect to a proposed "support centre" for the HEP community and beyond in the "EGI era" – June 2010 for 3 years?
Timeline: deadline for proposals is November 2009.
Proposed methodology: a "bottom-up" approach addressing the requirements of LHC data taking & analysis (plus the needs of other communities).

Background
As we have still not formed the "consortium" that would prepare the proposal, nor discussed in depth with the target communities, it is premature to give (too) concrete information on the proposed activities.
Think of them as an (approximate) "continuation of EIS / ARDA", but refocused on the top priorities of 2009 / 2010 and beyond → your input is needed to define the direction!
There have been discussions in EGI_DS, EGEE NA4 etc. on the roles and functions of an SSC in general:
– Most recently at a workshop in Athens in May;
– Follow-up meeting in Orsay early July;
– Barcelona during EGEE'09;
– "Final" meeting before submission: CERN early November.

Preparation Plan
We propose to follow the steps outlined in the Hyperion FP7 training course held recently at CERN.
A mailing list exists and we have held 2 phone calls.
Tentatively, a 1-day workshop at CERN on Friday 26th June (with EVO)
– The training proposes an agenda for such a meeting, plus a timeline for preparing the proposal!
Draft "internal" and "lobby" 1-pagers have been produced.
More calls / meetings will follow as we move forward with the proposal…

Agenda for Consortium Meeting
Agree on Aim of Project (one-page proposal)
Refine Idea; address Evaluators' Questions
Plan for Writing the Proposal
Writing Team
Prepare Proposal
Review Plan
[ Not the first time we have prepared an EU proposal! ]

Introduction – Athens Slides
From the CERN / WLCG perspective, a "grid support" team has been functioning for many years, focussing on the needs of the LHC experiments but also offering (in)valuable services to other HEP experiments, related disciplines and other projects.
From the EGEE perspective, a HEP cluster has existed for 5+ years.
→ Clearly we are not starting from zero but building on what we have established over many years.

A HEP(++) SSC
CERN expects to be a lead partner in establishing such an SSC and hopes that other players can be identified very soon (if not already)
– By end May 2009 would be optimal
– Prague(?), DESY and INFN are understood to be interested – others?
– Input from Bob, others, this (Athens) workshop etc. required!
We would expect to produce a draft – but already fairly complete – proposal no later than July 2009, with further iteration by the time of EGEE'09
– Semi-public presentation in BCN; discussions with the experiments from now
Further iteration prior to final submission – as part of a complete proposal – for November 2009.
Our assumption is that any personnel would be in place from 1 May 2010 – continuity with the existing team(s) will be an important issue and will require "inventive solutions".
From a technical perspective, this timeline looks readily achievable.
("++" includes both "related disciplines" as well as those using common solutions.)

Main Theme(s)
Centres of expertise (excellence?) at CERN and at other sites to match the needs of the user communities supported
– There is a minimum size below which it does not make sense to go
These teams should work together – sharing knowledge, tools and experience
– Sizing based on existing experience: something like 2 people per LHC (or other) VO would really allow us (Europe) to fully exploit the "facilities" (Grid, LHC) that we have built up over O(decade) (or two)
– It is expected that the people at CERN would be integrated into the existing Grid Support group: in other words, CERN would match or most likely exceed the staffing available through the "SSC"
Contacts: myself for "management level" issues; Patricia for technical ones.

Summary
As part of CERN's position wrt EGI, it is expected (at least by CERN + WLCG management) that CERN lead the formation of a HEP(++) SSC.
We are actively looking for partners – some already identified – with whom we can start to work immediately.
We believe that by working together – also with those involved in other proto-SSCs – we can achieve "the Greater Grid".

Possible Overview
WP1: Project Management
WP2: discipline 1; …; WPn: HEP; WPx: photon science; WPy: fusion; WPy: astro particle; WPz: international communities; …
Cross-WP activities: Training; Dissemination; Common Tools (Ganga, Diane, Dashboard framework, Amga, experiment frameworks?, …); Operations & Support (CIC, GOCDB, other OPS tools, GGUS, m/w release(s) – per area?)
EnviroGRIDS etc. Does HEP include GEANT? SIXT? Others?

Recommended Workplan Strategy
Manage by workpackage, not by partner.
WP for coordination & exploitation / dissemination.
Identify leader + partners.
AVOID:
– Everyone in every WP
– One partner per WP
– Doing work in all WPs
– "Floating partners"
Ideal: lead one, be in 2 others.

A Specialised Support Centre for Large International Grid Communities (SLIC) – INFRA (+ others??)
The aim of this proposal is to establish a specialised support centre (SSC) focusing on large international communities that use or will use the EGEE → EGI Grid infrastructure. The work that will be carried out by the SSC will be predominantly support for production exploitation of the Grid, following on from the successful adaptation of numerous major applications and their communities to the Grid environment, ensuring that the return on past investment is maximized. Ease of use, and low cost of entry and of ownership, will be key priorities.

SLIC – Background
The European Community has funded three phases of the Enabling Grids for E-sciencE project, which has resulted in large-scale production use of world-class Grid-based solutions by many key communities. It has established Europe's leadership in this area and – in the particular case of the Large Hadron Collider (LHC) at CERN and its computational and storage needs – brought us to the brink of scientific exploitation of this world facility. To capitalize on this investment, and to ensure that Europe retains its leading role not only in Grid technologies but also in the scientific disciplines that depend on them, continued support for these communities is required. In the short to medium term this is expected to lead to significant advances in our basic understanding of the Universe around us, whereas in the longer term major spin-offs can be expected, related both to the advances in science and to those in Information Technology.

Expected Results + Lead Users
The expected results of this work are a dramatic increase in the number of Grid users, as usage expands from the data processing activities that have dominated until now into the realm of data analysis, scientific discovery and publication. This can only be achieved by a significant simplification of an end-user's interaction with the Grid, through further adoption of existing tools such as Ganga, and by a flexible and scalable end-user support model. This includes the establishment of community support, whereby the communities are encouraged and enabled to be largely self-supporting, with expert guidance to establish and optimize the support structures and associated tools. This is essential not only to deal with the large expansion in the number of users but also for long-term sustainability. [ Numbers ??? ] …
Expected Budget: €10,000,000
Framework 7 contribution: €5,000,000
Duration: 36 months
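To illustrate the kind of simplification meant by "further adoption of existing tools such as Ganga": a minimal sketch of a Ganga-style job submission, where a few lines typed into the Ganga session replace hand-written JDL and direct middleware commands. The class and backend names used here (Job, Executable, LCG) follow Ganga's public documentation of the period and are shown for illustration only, not as part of this proposal.

    # Run inside a Ganga session (started with the `ganga` command), where the
    # GPI classes used below are available without explicit imports.

    j = Job(name='hello-grid')                    # create a new job object
    j.application = Executable(exe='/bin/echo',   # a trivial test application
                               args=['Hello from the Grid'])
    j.backend = LCG()                             # target the EGEE/gLite Grid;
                                                  # swap for Local() to test on the desktop
    j.submit()                                    # Ganga generates the JDL, submits and tracks the job

    jobs                                          # interactive listing of all jobs and their status

The same few lines run unchanged against a local backend, which is the point: the end-user sees one interface whether testing on a laptop or running analysis across the Grid.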

The Model – S.W.O.T. Analysis
Strengths: This model has been proven to work over many years: "experiment integration" → (+) production support → (+) analysis support.
Weaknesses: Funding is drying up; people are leaving; some have already gone and more are in the pipeline…
Opportunities: We have built up a worldwide production-quality Grid system over many years: now is the time to exploit it fully – and to respond to Fabiola's "challenge".
Threats: We cannot afford to dilute the effort beyond reasonable limits: a team of ~2 FTEs is already on the low edge for an SSC. On the other hand, a team of ~2 "EIS FTEs" per LHC VO has been shown to be effective.
→ [ The service ] "should not limit ability of physicist to exploit performance of detectors nor LHC's physics potential"
→ "…whilst being stable, reliable and easy to use"
(Timeline on slide: Past → now → Future)

Next Steps (imho)
We need to move ahead systematically and relatively rapidly in all aspects of preparing a proposal.
We need not only buy-in from the communities that we propose to support, but also their active involvement in all of the above.
This includes the sites / countries / regions involved.
A coherent view – balancing our needs in other areas (e.g. m/w, operations) – is required.

Possible Funds (Extract)
(Cal Loomis) had discussions with people at the European Commission concerning the FP7 Infrastructure "Capacities" call that will be targeted by the EGI and EGI-related proposals. As a reminder, the expected calls are:
– INFRA Distributed Computing Infrastructure (DCI): Creation of EGI; Service/user support for existing international ("heavy") grid users; Middleware; New user communities for DCIs
– INFRA Simulation software and services
– INFRA Virtual Research Communities
– INFRA First impl. phase of the European High Performance Computing (HPC) service PRACE
– INFRA Coordination actions, conferences, and studies supporting policy development, including international cooperation
The money available in the calls is:
– = 25M€
– = 25M€
– = 12M€
– = 23M€
– = 20M€
– 3.3 = 10M€
In any case, it looks like the SSCs will have to be "split" in some way to take advantage of all of the money in the various calls. Whether this split is purely on scientific discipline or on support activities is up to us. The EC was not concerned about having a scientific community appear in proposals in both calls so long as work was not duplicated between those proposals. I would suggest that for the short term we develop each SSC independently and worry about the placement/splitting/synergies until after the scope and work program of the individual SSCs is clear.

Discussion

Tentative Conclusions
In the above scenario, CERN would expect to lead the HEP WP and to be involved in (at least) the International Communities one.
It would also coordinate the overall effort – at least up to the stage of proposal submission – of the agreed WPs (+ i/f to Cal & other SSCs?)
– This is arguably enough! (Other WP leads?)
Need to agree partners and PoW rather soon!
Checkpoints: July, September, then monthly
– We still have the target of a draft prior to the Paris meeting! (Clearly not all details, but key directions & goals)

BACKUP SLIDES

Without a HEP SSC?
We have already cut the number of staff supporting the LHC experiments – including in the critical area of analysis support – and further cuts are in the pipeline (budget cuts imposed CERN-wide).
The EIS team – generously supported in significant part by INFN – would most likely collapse into the most rudimentary support.
The reality is: we will drop from 8 (2008) to 2.5 (early 2010) to <1 (post EGEE III).
There is nowhere else that we can make up these resources – other key areas are also subject to the same level of drastic reduction!
As an enabling factor (using the grid, empowering the VO, …) this has to be one of the best ROI areas (IMHO…)
– You can and should design for resilience; you can and should automate; but you cannot replace or substitute this area with anything other than the most skilled and dedicated personnel – such as those we have today.
– (This was not the original timeline / planning…)

With a (HEP) SSC
We can really deliver on the promises we have made to our funding agencies – and to our user communities.
We can – at extremely low (or no) cost – assist other important / visible projects.
We can deliver the "added value" – the real(?) reason that much of this research is funded ("science & society").
There are many examples of the "added value" of an "SSC-like" team at CERN: and the costs are very modest compared to other types of "petascale computing".

Pre-GDB – May 12
Short 10' presentation by each Tier 1 (and CERN): CERN, Italy, UK, Germany, France, Spain, Netherlands, Nordic.
Which services you currently provide for WLCG (via EGEE) that you will commit to continue to support (see attached slide)
– What is the level of effort you currently provide for these (separated into operation, maintenance and development)?
Which services you will not be able to continue to support, or where the level of effort may be significantly decreased such that it may slow developments, bug fixes, etc.
What is the state of the planning for the NGI:
– Will it be in place (and fully operational!) by the end of EGEE-III?
– What is the management structure of the NGI, and how do the Tier 1 and Tier 2s fit into that structure?
– How will the effort that today is part of the ROCs (e.g. COD, TPM, etc.) for supporting WLCG operations evolve? How will daily operations support be provided?
– Does the country intend to sign the Letter of Intent and MoU expressing the intention to be a full member of EGI?
– Which additional services could the Tier 1 offer if other Tier 1s are unable to provide them?
– Other issues particular to the country, or general problems to be addressed.
– What are the plans to maintain the WLCG service if the NGI is not in place by May 2010, or if EGI.org is not in place?
For ASGC and TRIUMF it would be useful to hear their plans in the absence of EGEE ROC support – i.e. do they have plans to continue or build local support centres?
For BNL and FNAL I assume that nothing will really change on the timescale of the next year.
If there are other countries with Tier 2s that would like to mention the state of their planning, that would also be welcome within the constraints of the time.
Other issues that need to be discussed include how the support for non-EU, non-US sites will be managed, for example sites in Latin America and others which are currently supported by the CERN ROC.

EGEE Services needed by WLCG (Plan B)
GGUS
– Relies on connections to local support ticket systems – today in ROCs and sites → Tier 1 and Tier 2 sites?
COD, TPM
– Operations and Service coordination: CERN + EGEE ROCs
ROCs:
– Support effort (TPM, COD) → moves to Tier 1s?
EIS team – CERN (largely LCG funded)
ENOC
– Coordination of OPN operations – currently by IN2P3
Deployment support:
– m/w deployment / testing / rollout / support
– Pre-production testing – effort and resources
Operational Security coordination
Policy development
Accounting:
– APEL – infrastructure/DB and service
– NB Italy uses DGAS and publishes into APEL; OSG + ARC publish into APEL
– Portal – CESGA
GOCDB: configuration DB
– Important for all configurations and definitions of sites and services
CIC Portal:
– Contact information, VO-ID cards, broadcast tool, automated reporting, …
Availability/Reliability:
– SAM framework (and migration to Nagios); SAM tests
– Gridview / algorithms etc.
– GridMap
– MSG
Dashboards
– Service, framework and common services
– Experiment-specifics
Middleware...
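For context on the availability/reliability item above: a minimal sketch of how such figures are derived from measured up/down time. The formulas follow the commonly quoted WLCG reporting definitions and the numbers are hypothetical; this is not a description of the actual Gridview/SAM algorithms.

    # Toy availability/reliability calculation (illustrative sketch, not the Gridview code).
    # Assumed definitions:
    #   availability = time_up / total_time
    #   reliability  = time_up / (total_time - scheduled_downtime)

    def availability_reliability(up_h, sched_down_h, unsched_down_h):
        """All arguments are hours within the reporting period."""
        total = up_h + sched_down_h + unsched_down_h
        availability = up_h / total
        reliability = up_h / (total - sched_down_h)
        return availability, reliability

    # Hypothetical month of 744 hours: 700 h up, 24 h scheduled and 20 h unscheduled downtime.
    a, r = availability_reliability(700, 24, 20)
    print(f"availability = {a:.1%}, reliability = {r:.1%}")   # ~94.1% and ~97.2%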