GridPP
Steve Lloyd, Chair of the GridPP Collaboration Board
PPAP, July 2014

A Brief History of GridPP

2001 – 2004  GridPP1 – From Web to Grid
2004 – 2007  GridPP2 – From Prototype to Production
2007 – 2008  GridPP2+ – One-year extension
2008 – 2011  GridPP3 – From Production to Exploitation
2011 – 2015  GridPP4 – Computing in the LHC era
2015 – 2016  GridPP4+ – One-year extension
?            GridPP5

GridPP provides UK computing resources for the LHC experiments, other HEP experiments and other activities.

Project Management

GridPP exists only to facilitate analysis of data from the experiments. This is reflected in the management of the project, which has the LHC experiments directly represented.

[Diagram: Oversight Committee (OC), Collaboration Board (CB), Project Management Board (PMB) and Operations Team (Ops-Team), linked to the experiments through Liaison, Review, Operation, Utilisation and Provision.]

GridPP Activities

[Chart: breakdown of GridPP activities]
Tier-1 Hardware   23%
Tier-1 Staff      20%
Tier-2 Hardware   10%
Tier-2 Staff      23%
Operations Staff  15%
Manage/Travel     10%

Hardware Costs

- STFC have signed an MoU with WLCG for the UK to provide Tier-1 and Tier-2 resources.
- The LHC experiments provide estimates of their required future resources.
- These are scrutinised and approved by the CERN Computing Resource Review Board (CRRB).
- We multiply by the ~UK authorship fraction: 2% of ALICE, 12.5% of ATLAS, 8%/5% of CMS and 31.5%/21.5% of LHCb [Tier-1/Tier-2].
- We add ~10% for other experiments, then multiply by the estimated unit cost (see the sketch after this list).
- This gives the total hardware cost per year at the Tier-1 and Tier-2s – NO DISCRETION!
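As a worked illustration of this bookkeeping, here is a minimal Python sketch that applies the slide's Tier-1 authorship fractions to a set of invented resource requests and an invented unit cost; the CRRB figures and the price per kHS06 are placeholders, not GridPP numbers.

    # Sketch of the Tier-1 hardware cost bookkeeping described above.
    # Authorship fractions are from the slide; the resource requests and
    # unit cost are invented placeholders, not real CRRB/GridPP figures.

    uk_fraction_t1 = {"ALICE": 0.02, "ATLAS": 0.125, "CMS": 0.08, "LHCb": 0.315}

    # Assumed CRRB-approved global Tier-1 CPU requests (kHS06) -- placeholders
    crrb_cpu_request = {"ALICE": 200, "ATLAS": 400, "CMS": 350, "LHCb": 150}

    cost_per_khs06 = 1000.0    # assumed cost in GBP per kHS06 -- placeholder
    other_expts_factor = 1.10  # add ~10% for other experiments

    # UK share of CPU: each experiment's request times the UK authorship fraction
    uk_cpu = sum(crrb_cpu_request[e] * uk_fraction_t1[e] for e in uk_fraction_t1)

    # Scale for other experiments, then convert to money -- no discretion involved
    total_cost = uk_cpu * other_expts_factor * cost_per_khs06
    print(f"UK Tier-1 CPU share: {uk_cpu:.1f} kHS06, cost: £{total_cost:,.0f}")

The same multiplication is repeated per resource type (CPU, disk, tape) and again for the Tier-2s using the Tier-2 authorship fractions.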

RAL Tier-1

5,760 logical CPUs | 59,641 HEPSPEC06 | 12 PB disk | 12 PB tape (see the quick check after this list)

The UK Tier-1 at RAL provides:
- CPU and disk resources to meet the UK's WLCG Tier-1 MoU requirements;
- an archival (robotic) tape service to preserve raw LHC data;
- manpower to maintain and operate the disk and tape systems, Grid middleware, Oracle databases, fabric and day-to-day production;
- embedded manpower to interface directly with ATLAS, CMS and LHCb;
- excellent reliability, a high level of availability and rapid responsiveness;
- excellent network connections, including a direct optical private network (OPN) connection to CERN.
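A trivial cross-check one can do on those headline figures is the average benchmark score per core (both numbers taken from the slide):

    # Quick arithmetic on the RAL Tier-1 headline figures above
    logical_cpus = 5760
    hepspec06_total = 59641
    print(f"{hepspec06_total / logical_cpus:.1f} HS06 per logical CPU")  # ~10.4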

Tier-1 Delivery

[Chart: Tier-1 CPU Delivery]

Tier-2s

The UK Tier-2s provide:
- CPU and disk resources to meet the UK's WLCG Tier-2 MoU requirements;
- manpower to support Group Analysis Sites, with large amounts of disk and excellent network connections, for ATLAS and CMS;
- manpower to provide all experiments with CPU and disk for opportunistic user analysis and Monte Carlo simulation;
- a distributed ecosystem to support UK physicists doing their analysis;
- the majority of the deployment, operations and support staff who transform the distributed resources into a coherent Grid infrastructure;
- hardware resources for testing middleware releases and new technologies, and for running some core services;
- a successful framework for leveraging local resources and support;
- opportunities for reaching out to other communities in support of STFC's impact agenda.

Tier-2 Resources

[Map of UK Tier-2 sites, grouped as London, ScotGrid, NorthGrid and SouthGrid]
- Large sites (green) have ~2 FTE staff
- Medium sites (yellow) have ~1 FTE staff
- Small sites (blue) have ~0 FTE staff

Some redistribution followed the mid-term review.

Tier-2 Delivery

[Chart: Tier-2 CPU Delivery]

UK Tier-2 Delivery

[Chart: UK Tier-2 CPU Delivery]

External Activities

GridPP is part of WLCG, which combines:
- EGI (European Grid Infrastructure);
- OSG (Open Science Grid) in the US;
- NorduGrid in the Nordic countries.

There are new initiatives looking towards Horizon 2020 etc., such as:
- EU-T0 – an initiative to federate national centres into a European Computing Centre for Experimental Data Management, formed by major EU funding agencies including STFC;
- HEP Software Foundation – a collaborative framework to develop and maintain all major HEP-related software;
- VLData – a generic platform for distributed computing integrating existing Grid, cloud and other computing and storage resources;
- UK Project Directors Group;
- e-infrastructure academic user community;
- JOIN – a joint DiRAC (HPC)–GridPP Authentication and Authorization Infrastructure (AAI) initiative.

The Future: GridPP5

The Future

Hardware costs will hopefully continue to fall (Moore's Law etc.), but:
- Data accumulates, trigger rates are going up (x2.5 for Run 2, x10 for Run 3?) and events are more complex (pile-up), requiring much more hardware (see the rough projection after this list).
- The Grid is continually evolving – multi-core CPUs, new architectures, clouds, storage technologies, WAN access etc.
- Manpower requirements may decrease slightly, but not dramatically, in the face of increased resources and complexity.

GridPP will continue to be a necessary component of UK particle physics for many years, for ALL experiments. The distinction between Tier-1 and Tier-2 is blurring. LUX/ZEPLIN are bidding for computing and storage that will be run by GridPP, leveraging existing support and expertise. This could be a good model for other communities (T2K, Hyper-K, ILC, NA62 etc.).
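To make the cost tension concrete, here is a purely illustrative back-of-envelope projection. Only the x2.5 and x10 capacity factors come from the slide; the price-halving period and the timeline are assumptions.

    # Does price/performance improvement keep up with trigger-rate growth?
    # Assumptions (not GridPP planning numbers): hardware cost per unit of
    # capacity halves every 3 years (Moore's Law-style, but slowing), and
    # Run 2 / Run 3 sit roughly 3 / 7 years after the Run-1 baseline.

    halving_period_years = 3.0  # assumed price/performance halving time

    def relative_budget(capacity_factor: float, years_ahead: float) -> float:
        """Budget needed relative to the Run-1 baseline, after price decline."""
        price = 0.5 ** (years_ahead / halving_period_years)
        return capacity_factor * price

    for run, (factor, years) in {"Run 2": (2.5, 3), "Run 3": (10, 7)}.items():
        print(f"{run}: x{factor} capacity -> "
              f"{relative_budget(factor, years):.2f} x baseline budget")

Under these assumptions a flat budget falls about 25% short for Run 2 and roughly a factor of two short by Run 3; that gap is the "much more hardware" pressure the slide describes.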