Slide 1: Beyond GridPP2
Tony Doyle, University of Glasgow
Collaboration Board, 12 January 2005
Slide 2: Outline
Beyond GridPP2: what are the estimated resource requirements in the LHC exploitation era?
Background:
- PMB preliminary discussions in September
- PPAP presentation in October: resources needed in the medium to long term?
  - Exploitation, medium term (09/07-08/10)
  - Exploitation, long term (09/10-08/14)
- Focus on resources needed in 2008
- GridPP Oversight Committee outline on Monday
- Initial ideas for discussion here
Slide 3: Tier 0 and LCG: Foundation Programme
Aim: build upon Phase 1.
- Ensure the development programmes are linked
- Project management: GridPP and LCG
- Shared expertise
LCG establishes the global computing infrastructure, allowing all participating physicists to exploit LHC data. Earmarked UK funding is being reviewed.
Required foundation: LCG deployment. Component F: LHC Computing Grid Project (LCG Phase 2) [review].
Slide 4: Tier 0 and LCG: RRB Meeting (October)
Jos Engelen put a proposal to RRB members (Richard Wade [UK]) on how a 20 MCHF shortfall for LCG Phase II can be funded:
- Spain to fund ~2 staff. Others at this level?
- Funding from the UK (£1m), France, Germany and Italy for 5 staff. Others?
It is now vitally important that the LCG effort established predominantly via UK funding (40%) is sustained at this level (~10%).
- Proposal to SC in preparation
- Value to the UK?
Required foundation: LCG deployment.
Slide 5: Grid and e-Science Support in 2008
What areas require support? (Layers: I. Experiment Layer; II. Application Middleware; III. Grid Middleware; IV. Facilities and Fabrics)
- IV: Running the Tier-1 Data Centre
- IV: Annual hardware upgrade
- IV: Contribution to Tier-2 system-management effort; (non-PPARC) hardware
- IV: Front-end Tier-2 hardware
- IV: Contribution to Tier-0 support
- III: One M/S/N expert in each of 6 areas
- III: Production manager and four Tier-2 coordinators
- II: Application/Grid experts (UK support)
- I: ATLAS computing MoU commitments and support
- I: CMS computing MoU commitments and support
- I: LHCb core tasks and computing support
- I: ALICE computing support
- I: Future experiments adopt e-Infrastructure methods
No GridPP management (assumes production mode is established and management is devolved to institutes).
Slide 6: PPARC Financial Input: GridPP1 Outturn
- LHC Computing Grid Project (LCG): applications, fabrics, technology and deployment
- European DataGrid (EDG): middleware development
- UK Tier-1/A Regional Centre: hardware and manpower
- Grid application development: LHC and US experiments + Lattice QCD
- Management, travel etc.
Slide 7: PPARC Financial Input: GridPP2 Components
A. Management, travel, operations
B. Middleware, security and network development
C. Grid application development: LHC and US experiments + Lattice QCD + phenomenology
D. Tier-2 deployment: 4 regional centres; M/S/N support and system management
E. Tier-1/A deployment: hardware, system management, experiment support
F. LHC Computing Grid Project (LCG Phase 2) [review]
Slide 8: IV. Hardware Support
UK Tier-1, 2008 (earlier estimate in parentheses):
- CPU total (MSI2k): 5.2 (7.8)
- Disk total (PB): 1.6 (3.8)
- Tape total (PB): 1.6 (2.3)
UK Tier-2, 2008:
- CPU total (MSI2k): 8.0
- Disk total (PB): 1.0
1. Between October and December the UK Tier-1 LHC estimates were reduced (see Dave's talk): they are now more realistic.
2. In October the global Tier-1 shortfall was -13% for CPU and -55% for disk.
3. The UK Tier-1 input estimated in December now corresponds to ~20% of global disk and ~7% of global CPU (see the sketch below).
4. LCG MoU commitments are required by April 2005.
5. UK Tier-2 CPU and disk resources are significant.
6. Rapid physics-analysis turnaround is a necessity.
7. The priority is to ensure that ALL required software (experiment, middleware, OS) is routinely deployed on this hardware well before 2008.
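Point 3 can be inverted to recover the implied global Tier-1 totals. A minimal arithmetic sketch, assuming only the UK figures and the ~7%/~20% shares quoted above; the global totals it prints are derived, not numbers from the talk:

```python
# Derive the implied global Tier-1 totals from the UK December estimates
# and the share percentages quoted on this slide (point 3).
uk_cpu_msi2k = 5.2   # UK Tier-1 CPU, stated to be ~7% of the global total
uk_disk_pb = 1.6     # UK Tier-1 disk, stated to be ~20% of the global total

global_cpu_msi2k = uk_cpu_msi2k / 0.07   # implied global CPU, ~74 MSI2k (derived)
global_disk_pb = uk_disk_pb / 0.20       # implied global disk, ~8 PB (derived)

print(f"Implied global Tier-1 CPU:  ~{global_cpu_msi2k:.0f} MSI2k")
print(f"Implied global Tier-1 disk: ~{global_disk_pb:.0f} PB")
```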
Slide 9: III. Middleware, Security and Network
M/S/N (middleware, security and networking) builds upon UK strengths as part of international development. The six areas:
- Configuration management
- Storage interfaces
- Network monitoring
- Security
- Information services
- Grid data management
Some support expertise is required in each of these areas in order to maintain the Grid.
Slide 10: II. Application Middleware
- GANGA
- SAMGrid
- Lattice QCD
- AliEn
- CMS
- BaBar
- Phenomenology
Some support expertise is required in each of these areas in order to maintain the Grid applications. e-Infrastructure portals need to be developed for new experiments starting up in the exploitation era.
Slide 11: ATLAS UK e-Science Forward Look (Roger Jones)
Current core and infrastructure activities (~11 FTEs, mainly ATLAS e-science with some GridPP and HEFCE):
- Run-time testing and validation framework, tracking and trigger instantiations
- Provision of ATLAS distributed analysis and production tools
- Production management
- GANGA development
- Metadata development
- ATLFast simulation
- ATLANTIS event display
- Physics software tools
Current tracking and trigger e-science:
- Alignment effort: ~6 FTEs
- Core software: ~2.5 FTEs
- Tracking tools: ~6 FTEs
- Trigger: ~2 FTEs
Both will move from development to optimisation and maintenance:
- The current e-science funding will (at best) only take us to first data.
- Expertise is required for the real-world problems and for maintenance.
- Note that for the HLT, installation and commissioning will continue into the running period because of staging.
Need ~15 FTE (beyond the existing rolling grant) in 2007/9: continued e-science/GridPP support.
Slide 12: CMS UK e-Science Forward Look (Dave Newbold)
NB: 'first look' estimates; these will inevitably change as we approach running.

Work area                              Current FTEs  FTEs 2007-9                       FTEs 2009-
Computing sys/support (e-science WP1)  2.0           3.0 (ramp-up for running phase)   3.0 (steady state)
Monitoring/DQM (e-science WP3)         2.5           2.0 (initial running)             1.5 (support/maintenance)
Tracker software (e-science WP4)       2.0           1.5 (initial deployment/running)  1.0 (support/maintenance)
ECAL software (e-science WP5)          2.0           1.5 (initial running)             1.0 (support/maintenance)
Data management (GridPP2)              1.5           1.5 (final deployment/support)    1.5 (support/maintenance)
Analysis system (GridPP2)              1.5           1.0 (final deployment/support)    1.0 (support/maintenance)
Total                                  11.5          10.5                              9.0

Work area notes:
- Computing system/support: development and tuning of the computing model and system; user support for T1/T2 centres (globally); liaison with LCG operations.
- Monitoring/DQM: online data gathering and 'expert systems' for the CMS tracker and trigger.
- Tracker/ECAL software: installation and calibration support; low-level reconstruction code.
- Data management: PhEDEx system for bulk offline data movement and tracking; system-level metadata; movement of HLT farm data online (new area).
- Analysis system: CMS-specific parts of the distributed analysis system on LCG.
Need ~9 FTE (beyond the existing rolling grant) in 2007/9: continued e-science/GridPP support.
Slide 13: LHCb UK e-Science Forward Look (Nick Brook)
Current core activities (~10 FTEs, mainly GridPP, e-science and studentships, with some HEFCE support):
- GANGA development
- Provision of DIRAC and production tools
- Development of the conditions DB
- The production bookkeeping DB
- Data management and metadata
- Tracking
- Data challenge production manager
Will move from development to a maintenance phase: the UK pro-rata share of LHCb core computing activities is ~5 FTEs.
Current RICH and VELO e-science:
- RICH: the UK provides the bulk of the RICH software team, including the software coordinator (~7 FTEs, roughly 50:50 e-science funding and rolling grant/HEFCE).
- VELO: the UK provides the bulk of the VELO software team, including the software coordinator (~4 FTEs, roughly 50:50 e-science funding and rolling grant/HEFCE).
- ALL essential alignment activities for both detectors run through e-science funding.
- Will move from development to maintenance and operational alignment: ~3 FTEs for alignment in 2007-9.
Need ~9 FTE (core + alignment + UK support) in 2007/9: continued e-science support.
Slide 14: Grid and e-Science Funding Requirements
Simple model.
Slide 15: Priorities in the Context of a Financial Snapshot in 2008
Grid and e-Science funding requirements: Grid (£5.6m p.a.) and e-Science (£2.7m p.a.), assuming no GridPP project management. To be compared with the Road Map. Not a bid: preliminary input (a cross-check of the totals follows below).
Savings?
- EGEE Phase 2 (2006-08) may contribute.
- The UK e-Science context is:
  1. NGS (National Grid Service)
  2. OMII (Open Middleware Infrastructure Institute)
  3. DCC (Digital Curation Centre)
Timeline?
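As a simple cross-check, the Grid and e-Science lines on this slide sum to the £8.3m p.a. quoted in the summary (slide 17). A minimal sketch of that arithmetic; the Road Map figure is the ~£6m p.a. from the summary, and the gap line is derived, not a number from the talk:

```python
# Back-of-envelope check of the 2008 funding snapshot (figures from slides 15 and 17).
grid_pa = 5.6       # £m p.a., Grid
escience_pa = 2.7   # £m p.a., e-Science
roadmap_pa = 6.0    # £m p.a., current Road Map (approximate)

total_pa = grid_pa + escience_pa   # 8.3 £m p.a., matching the summary slide
gap_pa = total_pa - roadmap_pa     # ~2.3 £m p.a. above the Road Map (derived)

print(f"Estimated 2008 requirement: £{total_pa:.1f}m p.a.")
print(f"Gap relative to Road Map:   £{gap_pa:.1f}m p.a.")
```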
Slide 16: Grid and e-Science Exploitation: Timeline?
- PPAP initial input: Oct 2004
- Science Committee initial input; PPARC call assessment (2007-2010): 2005
- Science Committee outcome: Oct 2005
- PPARC call: Jan 2006
- PPARC close of call: May 2006
- Assessment: Jun-Dec 2006
- PPARC outcome: Dec 2006
- Institute recruitment/retention: Jan-Aug 2007
- Grid and e-Science exploitation: Sep 2007 onwards
Note: if the assessment from PPARC internal planning differs significantly from this preliminary advice from PPAP and SC, then earlier planning is required.
Slide 17: Summary
- What resources are needed for Grid and e-Science in the medium to long term?
- Current Road Map: ~£6m p.a.
- Resources needed in 2008 are estimated at £8.3m p.a.
- The timeline for decision-making has been outlined.
- A strategy supported by the particle physics community is required.