UK Status and Plans
Scientific Computing Forum, 27th Oct 2017
Prof. David Britton, GridPP Project Leader, University of Glasgow
Outline
- Current GridPP sites and supported science
- UK funding agency structure and computing strategy
- UKT0 initiative
- Future evolution of the funding agency
- Future evolution of GridPP sites
GridPP Status
- ~10% of WLCG; 18 sites hosting hardware.
- Tier-1 at RAL and 4 distributed Tier-2 centres: ScotGrid, NorthGrid, SouthGrid and London.
Network connectivity:
- RAL Tier-1: 20 + 10 Gb/s active/active LHCOPN connection; 2 x 40 Gb/s active/standby links to the UK core network.
- Tier-2s: typically a 10-40 Gb/s dedicated link to the UK core network.
- Currently no need for LHCONE (additional expense and complexity?).
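As a side note on the figures above, the two redundancy models quoted for the Tier-1 yield different usable capacities. A minimal sketch (illustrative only, not GridPP tooling) of the arithmetic, using the link capacities from the slide:

```python
# Illustrative sketch: usable bandwidth under the two failover models
# quoted for the RAL Tier-1 (figures taken from the slide above).

def usable_gbps(links_gbps, mode):
    """Usable capacity in Gb/s for a set of redundant links.

    active/active  -> all links carry traffic, so capacities sum;
    active/standby -> only one link carries traffic at a time.
    """
    if mode == "active/active":
        return sum(links_gbps)
    if mode == "active/standby":
        return max(links_gbps)
    raise ValueError(f"unknown mode: {mode}")

# LHCOPN: 20 + 10 Gb/s active/active -> 30 Gb/s usable
print(usable_gbps([20, 10], "active/active"))
# UK core network: 2 x 40 Gb/s active/standby -> 40 Gb/s usable
print(usable_gbps([40, 40], "active/standby"))
```

So the active/standby pair, despite 80 Gb/s of installed capacity, offers no more than a single 40 Gb/s link at any one time.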
Janet Core Network
[map of the Janet core network]
VO Usage (Normalised CPU Time), April 2017 to September 2017
- LHC VOs: 93% of usage.
- Policy: 10% of CPU and 5% of storage available to "non-LHC VOs".
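The usage and policy figures above imply a simple check: with LHC VOs at 93%, non-LHC VOs consumed 7% of CPU, inside the 10% allocation. A minimal sketch of that arithmetic (an assumed helper for illustration, not a GridPP accounting tool):

```python
# Illustrative sketch: compare reported non-LHC CPU usage against the
# 10% policy allocation, using the figures from the slide above.

def non_lhc_share(lhc_fraction):
    """Fraction of normalised CPU time used by non-LHC VOs."""
    return 1.0 - lhc_fraction

share = non_lhc_share(0.93)   # slide reports LHC VOs at 93%
policy_cpu = 0.10             # 10% of CPU reserved for non-LHC VOs

print(f"non-LHC usage: {share:.0%}, policy allocation: {policy_cpu:.0%}")
print("within policy" if share <= policy_cpu else "over policy")
```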
Funding Agency (STFC) Structure
STFC has five Directorates:
- Finance
- Strategy, Performance & Communications
- Business & Innovation
- National Facilities Directorate: runs the national facilities (Diamond Light Source, ISIS (neutrons), Central Laser Facility, ...).
- Programmes Directorate: supports programmes in HEP, Astronomy, Astro-Particle and Nuclear physics.
Major Computing Within/Supported by STFC
- Scientific Computing Department (SCD): computing for the National Facilities.
- DiRAC: computing for Theory and Cosmology.
- GridPP: computing for the LHC (+ other HEP).
SCD is internal to STFC; GridPP and DiRAC are external (funded at universities), except that the Tier-1 is run within SCD. Up until now these have been separate computing "facilities" (but with very good informal co-operation).
STFC Computing Strategic Review
"All STFC programme areas anticipate an order of magnitude increase over the next five years in data volumes, with implications for requirements for computing hardware, storage and network bandwidth. Meeting this challenge is essential to ensure the UK can continue to produce world leading science, but in the current financial climate it is clear there will be funding limitations. Long-term planning will be critical, as will more efficient use of resources. The scientific disciplines supported by STFC should work more closely together to find ways of sharing computing infrastructure, with new projects encouraged to make use of existing expertise and infrastructure."
In the future we expect funding to be available only for joined-up computing ... but to be clear, we wanted to do this anyway.
In particular, we need to work closely with SKA, EUCLID, aLIGO, ...
UKT0 – A Community Initiative
- UKT0 is an initiative to bring STFC computing interests together.
- It was formed bottom-up by the science communities, long before the Strategic Review.
- Aims: harmonisation, sharing, cost-effectiveness, avoiding duplication, ...
- It is not a project seeking to find users; it is users wanting to work together.
- Co-operation across disciplines is now embedded: joint GridPP posts with SKA, LSST and LZ; joint working meetings.
Progress so far:
- DiRAC sharing the RAL tape store ✔
- Astronomy jobs run on GridPP ✔
- Lux-Zeplin in production ✔
- Recent aLIGO expansion to use RAL ✔
- Fusion @ Culham Lab ✔
Examples of Important UK Astronomy and Particle-Astro Computing Interests
- LSST: Data Access Centre; Data Challenge 2 in 2018.
- Advanced LIGO: Run 3 with increased sensitivity in 2018.
- Lux-Zeplin: Mock Data Challenge 1 in 2017; MDC2 in 2018.
- EUCLID: galaxy shear simulations in 2018.
- SKA: HQ in Manchester; responsible for the Science Data Processor (P. Alexander); developing a European Science Regional Centre (SRC); Cambridge, Manchester & STFC involvement in AENEAS (H2020); WLCG-SKA meetings, CERN-SKA accord, CERN-SKA "Big Data" workshop 2018 @ Alan Turing Institute.
UKT0 – A Community Initiative
UKT0 brings together:
- SCD: computing for the National Facilities.
- DiRAC: computing for Theory and Cosmology.
- GridPP: computing for the LHC (+ other HEP).
- Plus computing for Astronomy, Particle-Astro and Nuclear physics.
Change of Funding Agency Structure in the UK
- Following a national review (the "Nurse Review"), it was decided to bring all UK Research Councils (= funding agencies) into a single organisation.
- UKRI (UK Research and Innovation) will be born in April 2018.
- A UKRI-wide group has been working for some time towards a National eInfrastructure for Research.
- We are currently making a case to our ministry (BEIS) for investment in UKRI eInfrastructure.
So the direction of travel in the UK is:
- joined-up (shared) computing across STFC (relevant to this meeting);
- working much more closely with large astronomy projects;
- progress towards a National eInfrastructure for Research;
- a push towards the "Cloud" where applicable.
GridPP Tier-2 Site Evolution
- Envisage consolidating LHC disk at 4 sites for ATLAS and 1 for CMS.
- Other sites may choose to host disk for other VOs.
- Smaller sites will be CPU-dominated, with an appropriately sized disk cache.
- For HEP, the Tier-1 and the 5 large Tier-2 sites become a single, but distributed, UK "data lake" (HSF CWP)?
(2016 snapshot)