Beyond GridPP2. Tony Doyle, University of Glasgow. Collaboration Board, 12 January 2005.

Outline
Beyond GridPP2: what are the estimated resource requirements in the LHC exploitation era?
Background: PMB preliminary discussions in September; PPAP presentation in October.
Resources needed in the medium to long term: exploitation, medium term (09/07-08/10); exploitation, long term (09/10-08/14).
Focus on resources needed in 2008.
GridPP Oversight Committee outline on Monday; initial ideas for discussion here.

Tier 0 and LCG: Foundation Programme
Aim: build upon Phase 1 and ensure the development programmes are linked.
Project management: GridPP and LCG, with shared expertise.
LCG establishes the global computing infrastructure, allowing all participating physicists to exploit LHC data.
Earmarked UK funding is being reviewed.
Required Foundation: LCG Deployment (F. LHC Computing Grid Project, LCG Phase 2) [review]

Tier 0 and LCG: RRB meeting (October)
Jos Engelen put a proposal to RRB members (Richard Wade [UK]) on how a 20 MCHF shortfall for LCG Phase II can be funded:
Spain to fund ~2 staff. Others at this level?
Funding from the UK (£1m), France, Germany and Italy for 5 staff. Others?
It is now vitally important that the LCG effort established predominantly via UK funding (40%) is sustained at this level (~10%).
Proposal to SC in preparation. Value to the UK?
Required Foundation: LCG Deployment

Grid and e-Science Support in 2008
What areas require support?
IV. Facilities and Fabrics: running the Tier-1 Data Centre; annual hardware upgrade; contribution to Tier-2 sysman effort for (non-PPARC) hardware; frontend Tier-2 hardware; contribution to Tier-0 support.
III. Grid Middleware: one M/S/N expert in each of 6 areas; production manager and four Tier-2 coordinators.
II. Application Middleware: application/Grid experts (UK support).
I. Experiment Layer: ATLAS Computing MoU commitments and support; CMS Computing MoU commitments and support; LHCb Core Tasks and Computing Support; ALICE Computing support; future experiments adopt e-Infrastructure methods.
No GridPP management (assume production mode is established and management is devolved to Institutes).
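The same layer-to-area mapping, written out as a small Python structure purely to make the grouping explicit; the area names are copied from the slide, and the grouping under each Roman-numeral layer follows the labels above.

```python
# Support areas required in 2008, grouped by GridPP layer (labels from the slide).
support_areas = {
    "I. Experiment Layer": [
        "ATLAS Computing MoU commitments and support",
        "CMS Computing MoU commitments and support",
        "LHCb Core Tasks and Computing Support",
        "ALICE Computing support",
        "Future experiments adopt e-Infrastructure methods",
    ],
    "II. Application Middleware": [
        "Application/Grid experts (UK support)",
    ],
    "III. Grid Middleware": [
        "One M/S/N expert in each of 6 areas",
        "Production manager and four Tier-2 coordinators",
    ],
    "IV. Facilities and Fabrics": [
        "Running the Tier-1 Data Centre",
        "Annual hardware upgrade",
        "Contribution to Tier-2 sysman effort for (non-PPARC) hardware",
        "Frontend Tier-2 hardware",
        "Contribution to Tier-0 support",
    ],
}

for layer, areas in support_areas.items():
    print(layer)
    for area in areas:
        print(f"  - {area}")
```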

PPARC Financial Input: GridPP1 Outturn
LHC Computing Grid Project (LCG): applications, fabrics, technology and deployment.
European DataGrid (EDG): middleware development.
UK Tier-1/A Regional Centre: hardware and manpower.
Grid Application Development: LHC and US experiments + Lattice QCD.
Management, travel etc.

PPARC Financial Input: GridPP2 Components
A. Management, Travel, Operations
B. Middleware, Security and Network Development
C. Grid Application Development: LHC and US experiments + Lattice QCD + phenomenology
D. Tier-2 Deployment: 4 Regional Centres, M/S/N support and system management
E. Tier-1/A Deployment: hardware, system management, experiment support
F. LHC Computing Grid Project (LCG Phase 2) [review]

IV. Hardware Support
UK Tier-1 (October estimates in parentheses, December estimates after):
CPU total (MSI2k): (7.8) 5.2
Disk total (PB): (3.8) 1.6
Tape total (PB): (2.3) 1.6
UK Tier-2:
CPU total (MSI2k): 8.0
Disk total (PB): 1.0
1. Between October and December the UK Tier-1 LHC estimates were reduced (see Dave's talk): now more realistic.
2. The global shortfall of Tier-1 CPU was -13% and of disk -55% in October.
3. The UK Tier-1 estimated input in December now corresponds to ~20% of global disk and ~7% of global CPU.
4. LCG MoU commitments are required by April.
5. UK Tier-2 CPU and disk resources are significant.
6. Rapid physics analysis turnaround is a necessity.
7. The priority is to ensure that ALL required software (experiment, middleware, OS) is routinely deployed on this hardware well before 2008.
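As a quick cross-check, the quoted shares can be inverted to give the implied global Tier-1 requirements. A minimal sketch, assuming the ~7% (CPU) and ~20% (disk) shares refer to the December UK Tier-1 figures above; the global totals are derived here, not quoted on the slide.

```python
# Invert the quoted UK shares to get the implied global Tier-1 requirements.
# Assumption: the ~7% / ~20% shares refer to the December UK estimates above.
uk_tier1 = {"CPU (MSI2k)": 5.2, "Disk (PB)": 1.6}    # December UK Tier-1 estimates
uk_share = {"CPU (MSI2k)": 0.07, "Disk (PB)": 0.20}  # ~7% of global CPU, ~20% of global disk

for resource, uk_value in uk_tier1.items():
    implied_global = uk_value / uk_share[resource]
    print(f"{resource}: UK {uk_value} -> implied global ~{implied_global:.0f}")

# CPU (MSI2k): UK 5.2 -> implied global ~74
# Disk (PB):   UK 1.6 -> implied global ~8
```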

III. Middleware, Security and Network
M/S/N builds upon UK strengths as part of international development.
Areas: Configuration Management, Storage Interfaces, Network Monitoring, Security, Information Services, Grid Data Management (grouped under the Security, Middleware and Networking strands).
Some support expertise is required in each of these areas in order to maintain the Grid.

II. Application Middleware
GANGA, SAMGrid, Lattice QCD, AliEn, CMS, BaBar, Phenomenology.
Some support expertise is required in each of these areas in order to maintain the Grid applications.
e-Infrastructure portals need to be developed for new experiments starting up in the exploitation era.

ATLAS UK e-science forward look (Roger Jones)
Current core and infrastructure activities (~11 FTEs, mainly ATLAS e-science with some GridPP and HEFCE):
- Run Time Testing and Validation Framework, tracking and trigger instantiations
- Provision of ATLAS Distributed Analysis and production tools
- Production management
- GANGA development
- Metadata development
- ATLFast simulation
- ATLANTIS Event Display
- Physics Software Tools
Current tracking and trigger e-science: alignment effort ~6 FTEs; core software ~2.5 FTEs; tracking tools ~6 FTEs; trigger ~2 FTEs.
Both will move from development to optimisation and maintenance.
The current e-science funding will only take us (at best) to first data; expertise is required for real-world problems and maintenance.
Note that for the HLT, installation and commissioning will continue into the running period because of staging.
Need ~15 FTE (beyond the existing rolling grant) in 2007/9: continued e-science/GridPP support.

CMS UK e-science forward look (Dave Newbold)
NB: 'first look' estimates; these will inevitably change as we approach running.
Work areas:
- Computing system / support (e-science WP1): ramp up for the running phase, then steady state.
- Monitoring / DQM (e-science WP3): initial running (1.5), then support / maintenance.
- Tracker software (e-science WP4): initial deployment and running, then support / maintenance.
- ECAL software (e-science WP5): initial running (1.0), then support / maintenance.
- Data management (GridPP2): final deployment / support, then support / maintenance.
- Analysis system (GridPP2): final deployment / support, then support / maintenance.
What the areas cover:
- Computing system / support: development and tuning of the computing model and system; management; user support for T1/T2 centres (globally); liaison with LCG operations.
- Monitoring / DQM: online data gathering and 'expert systems' for the CMS tracker and trigger.
- Tracker / ECAL software: installation and calibration support; low-level reconstruction codes.
- Data management: PhEDEx system for bulk offline data movement and tracking; system-level metadata; movement of HLT farm data online (new area).
- Analysis system: CMS-specific parts of the distributed analysis system on LCG.
Need ~9 FTE (beyond the existing rolling grant) in 2007/9: continued e-science/GridPP support.

LHCb UK e-science forward look (Nick Brook)
Current core activities (~10 FTEs, mainly GridPP, e-science and studentships with some HEFCE support):
- GANGA development
- Provision of DIRAC and production tools
- Development of the conditions DB
- The production bookkeeping DB
- Data management and metadata
- Tracking
- Data Challenge production manager
Will move from development to a maintenance phase; the UK pro rata share of LHCb core computing activities is ~5 FTEs.
Current RICH and VELO e-science:
- RICH: the UK provides the bulk of the RICH s/w team, including the s/w coordinator; ~7 FTEs, about 50:50 e-science funding and rolling grant/HEFCE.
- VELO: the UK provides the bulk of the VELO s/w team, including the s/w coordinator; ~4 FTEs, about 50:50 e-science funding and rolling grant/HEFCE.
ALL essential alignment activities for both detectors are through e-science funding.
Will move from development to maintenance and operational alignment; ~3 FTEs for alignment.
Need ~9 FTE (core + alignment + UK support) in 2007/9: continued e-science support.

Grid and e-Science funding requirements: simple model.

Priorities in the context of a financial snapshot in 2008
Grid (£5.6m p.a.) and e-Science (£2.7m p.a.); assumes no GridPP project management.
Savings? EGEE Phase 2 may contribute.
The UK e-Science context is: 1. NGS (National Grid Service); 2. OMII (Open Middleware Infrastructure Institute); 3. DCC (Digital Curation Centre).
Timeline?
Grid and e-Science funding requirements are to be compared with the Road Map.
Not a bid: preliminary input.
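A trivial arithmetic check of the snapshot, assuming it is simply the sum of the two headline figures; the gap relative to the Road Map is derived here, not quoted in the talk.

```python
# 2008 snapshot figures quoted in the talk (all £m per annum).
grid_pa = 5.6       # Grid support
e_science_pa = 2.7  # e-Science support
road_map_pa = 6.0   # approximate current Road Map provision (summary slide)

required_pa = grid_pa + e_science_pa  # 8.3
gap_pa = required_pa - road_map_pa    # ~2.3

print(f"Estimated 2008 requirement: ~£{required_pa:.1f}m p.a.")
print(f"Gap relative to Road Map:   ~£{gap_pa:.1f}m p.a.")
```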

Grid and e-Science Exploitation: timeline?
PPAP initial input: Oct 2004
Science Committee initial input
PPARC call assessment: 2005
Science Committee outcome: Oct 2005
PPARC call: Jan 2006
PPARC close of call: May 2006
Assessment: Jun-Dec 2006
PPARC outcome: Dec 2006
Institute recruitment/retention: Jan-Aug 2007
Grid and e-Science exploitation: Sep ...
Note: if the assessment from PPARC internal planning differs significantly from this preliminary advice from PPAP and SC, then earlier planning is required.

Summary
Resources are needed for Grid and e-Science in the medium to long term.
The current Road Map provides ~£6m p.a.; resources needed in 2008 are estimated at £8.3m p.a.
The timeline for decision-making has been outlined.
A PP community-supported strategy is required.