Tony Doyle - University of Glasgow, 8 July 2005, Collaboration Board Meeting: GridPP Report


Beyond GridPP

Funding from September 2007 will be incorporated as part of PPARC's request for planning input for LHC exploitation from the LHC experiments and GridPP. This will be considered by a Panel consisting of Prof. G. Lafferty (Chair), Prof. S. Watts and Dr. P. Harris, meeting over the summer to provide input to Science Committee in the autumn.

1. An important issue to note is the need to ensure matching funding is fully in place for the full term of EGEE-2, anticipated to run from 1st April 2006 to 31st March 2008. Such funding for SA1 and JRA1 is currently provided by PPARC through GridPP2, but under current arrangements this will terminate at the end of GridPP2 in August 2007. There is thus a 7-month gap for which matching funding is currently not in place. This needs to be resolved, with some urgency, before the proposal is submitted this summer.
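The 7-month gap quoted on the slide can be checked with a few lines of date arithmetic. This is a minimal sketch, assuming GridPP2 funding ends in August 2007 and EGEE-2 ends in March 2008 (the EGEE-2 end year is inferred from the stated gap):

```python
from datetime import date

# Assumed end dates (see lead-in): GridPP2 funding and EGEE-2 term.
gridpp2_end = date(2007, 8, 31)
egee2_end = date(2008, 3, 31)

# Whole calendar months between the two end dates
gap_months = (egee2_end.year - gridpp2_end.year) * 12 \
    + (egee2_end.month - gridpp2_end.month)

print(gap_months)  # 7
```

September 2007 through March 2008 inclusive gives the 7 months for which matching funding would be missing.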

Enabling Grids for E-science in Europe (EGEE)

Deliver a 24/7 Grid service to European science:
–build a consistent, robust and secure Grid network that will attract additional computing resources
–continuously improve and maintain the middleware in order to deliver a reliable service to users
–attract new users from industry as well as science and ensure they receive the high standard of training and support they need

100 million euros over 4 years, funded by the EU; >400 software engineers + service support; 70 European partners.

Overview

EGEE is the Grid Infrastructure Project in Europe:
–Take the lead in developing roadmaps, white papers, collaborations
–Organise European flagship events
–Collaborate with other projects (including CPS)
–Start date = 1 April 2004

UK partners: CCLRC + NeSC + PPARC (+TCD) (n.b. UK e-Science, not only HEP)
–NeSC: Training, Dissemination & Applications
–NeSC: Networking
–CCLRC: Grid Operations, Support & Management
–CCLRC: Middleware Engineering (R-GMA)

UK "3rd parties": Glasgow, ICSTM, Leeds, Manchester, Oxford
–Funded effort dedicated to deploying regional grids
–UK T2 coordinators

Beyond GridPP2: LHC Exploitation Planning Review

Input is requested from the UK project spokespersons: for ATLAS and CMS for each of the financial years 2008/9 to 2011/12, and for LHCb, ALICE and GridPP for 2007/8 to 2011/12.

Physics programme: please give a brief outline of the planned physics programme, and indicate how this planned programme could be enhanced with additional resources. In total this should be no more than 3 sides of A4. The aim is to understand the incremental physics return from increasing resources.

Input will be based upon the PPAP roadmap input "E-Science and LCG-2" (26 Oct 2004) and feedback from the CB (12 Jan & 7 July 2005).

Grid and e-Science funding requirements (simple model). [Chart not reproduced in the transcript.]

Priorities: GridPP2 Proposal

1. Tier-1/A staff – National Grid Centre
2. Tier-1/A hardware – International Role
3. Tier-2 staff – UK e-Science Grid
4. Applications – Grid Integration (GridPP2); Development (experiments' proposals)
5. Middleware – EU-wide development
6. Tier-2 hardware – non-PPARC funding
7. CERN staff – quality assurance
8. CERN hardware – pro-rata contribution

Established entering the proposal-writing phase... ALL of these are required to address the LHC Computing Challenge.

Grid and e-Science funding requirements (simple model). [Chart not reproduced in the transcript.]

UK Analysis for the LHC Experiments I

The basic functionality of the Tier-1 is:
–ALICE: reconstruction, chaotic analysis
–ATLAS: reconstruction, scheduled analysis/skimming, calibration
–CMS: reconstruction
–LHCb: reconstruction, scheduled stripping, chaotic analysis

The basic functionality of the Tier-2s is:
–ALICE: MC production, chaotic analysis
–ATLAS: simulation, analysis, calibration
–CMS: analysis for physicists, all simulation production
–LHCb: MC production, no analysis

UK Analysis for the LHC Experiments II

UK Tier-1 (~7% of the global Tier-1) and UK Tier-2 (pre-SRIF3). [Resource tables not reproduced in the transcript.]

Priorities in the context of a financial snapshot in 2008

Grid (£5.6m p.a.) and e-Science (£2.7m p.a.); assumes no GridPP project management.

Savings?
–EGEE Phase 2 (2006–2008) may contribute
–The UK e-Science context is:
1. NGS (National Grid Service)
2. OMII (Open Middleware Infrastructure Institute)
3. DCC (Digital Curation Centre)

Grid and e-Science funding requirements to be compared with the Road Map. Not a bid – preliminary input.

Management?

The current proposed model is low cost:
–SCAP (PPARC committee) provides overview (comment: PPARC could e.g. appoint a project leader)
–Production manager (PPARC) + 4 Tier-2 coordinators (EU) + Operations Centre (comment: EU funding is likely from 2008 on, but will it fund these people?)
–No PMB or CB – devolution to institutes

Some concerns that this will not work.

FEC?

Computers funded via SRIF3 (+ SRIF4?) – OK up to 2010 [see Steve's slides].

However, in future full economic costing would charge for:
–power usage
–support staff time
–maintenance (routine/emergency)
–space charges
–a share of the replacement capital item cost (if so, the effect comes earlier than the end of SRIF)

FEC? Back of the envelope

Estimated costs per annum:
–power usage (200 CPUs + disk): ~£50k
–support staff time: ~£50k
–maintenance (routine/emergency): ~?
–space charges: ~?
–replacement capital item cost: ~£50k
–FEC total: ~£150k

[None of this "scales", but...] 5,000 CPUs ~ £4m p.a.

Current proposed model = £2m via SRIF + £1.3m PPARC + £0.7m Institutes (hardware, manpower, power, space). FEC model = ??

Comment: dual funding was excellent value for HEP.
Comment: Tier-2 functionality is needed (wherever it resides).
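The back-of-envelope arithmetic above can be reproduced in a few lines. This is a sketch of the slide's own rough figures, not a costing model: maintenance and space charges were left unquantified on the slide, so they are omitted, and the linear scaling to 5,000 CPUs is exactly the naive extrapolation the slide warns about.

```python
# Per-annum cost estimates for a 200-CPU + disk installation,
# in GBP, as quoted on the slide (unquantified items omitted).
costs_per_200_cpus = {
    "power usage": 50_000,
    "support staff time": 50_000,
    "replacement capital": 50_000,
}

fec_200 = sum(costs_per_200_cpus.values())  # ~£150k, matching the slide

# Naive linear extrapolation to the ~5,000-CPU Grid
scale = 5_000 / 200
fec_5000 = fec_200 * scale

print(f"FEC for 200 CPUs: ~£{fec_200:,} p.a.")
print(f"Linear scaling to 5,000 CPUs: ~£{fec_5000:,.0f} p.a.")
```

The extrapolation gives ~£3.75m p.a., consistent with the slide's "~£4m p.a." once the unquantified maintenance and space charges are allowed for.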