UK Status and Plans Scientific Computing Forum 27th Oct 2017


UK Status and Plans
Scientific Computing Forum, 27th Oct 2017
Prof. David Britton, GridPP Project Leader, University of Glasgow

Outline
- Current GridPP sites and supported science
- UK funding agency structure and computing strategy
- UKT0 Initiative
- Future evolution of the funding agency
- Future evolution of GridPP sites

GridPP Status
~10% of WLCG; 18 sites hosting hardware. Tier-1 at RAL and 4 distributed Tier-2 centres: ScotGrid, NorthGrid, SouthGrid and London.
Network connectivity:
- RAL Tier-1: 20 + 10 Gb/s active/active LHCOPN connection; 2 x 40 Gb/s active/standby to the UK core network.
- Tier-2s: typically 10-40 Gb/s dedicated link to the UK core network.
- Currently no need for LHCONE (additional expense and complexity?).

Janet Core Network

VO Usage (Normalised CPU Time), April 2017 to September 2017
LHC VOs: 93% of normalised CPU time.
Policy: 10% of CPU and 5% of storage available to "non-LHC VOs".
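As an illustration of how a share figure like the "LHC: 93%" above is derived from accounting data, the minimal Python sketch below sums normalised CPU hours per VO and compares the non-LHC fraction with the 10% policy. The VO names and hour counts are illustrative placeholders, not the real April to September 2017 accounting figures.

```python
# Minimal sketch: LHC vs non-LHC share of normalised CPU time.
# The VO names and hour counts are illustrative placeholders only,
# not the real GridPP accounting figures for Apr-Sep 2017.
usage_hours = {
    "atlas": 120e6, "cms": 60e6, "lhcb": 45e6, "alice": 10e6,  # LHC VOs
    "lsst": 3e6, "lz": 2e6, "pheno": 1.5e6, "biomed": 0.5e6,   # non-LHC VOs
}
lhc_vos = {"atlas", "cms", "lhcb", "alice"}

total = sum(usage_hours.values())
lhc_share = sum(h for vo, h in usage_hours.items() if vo in lhc_vos) / total
non_lhc_share = 1.0 - lhc_share

print(f"LHC share:     {lhc_share:6.1%}")
print(f"non-LHC share: {non_lhc_share:6.1%} (policy: up to 10% of CPU)")
```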

Funding agency (STFC) structure
STFC has five Directorates:
- Finance
- Strategy, Performance, Comms
- Business & Innovation
- National Facilities Directorate: runs the national facilities (Diamond Light Source, ISIS (neutrons), Central Laser Facility).
- Programmes Directorate: supports programmes in HEP, Astronomy, Astro-Particle and Nuclear physics.

Major computing within/supported by STFC
Overlaid on the five STFC Directorates (the same structure as the previous slide), the main computing activities are:
- Scientific Computing Department (SCD): computing for the national facilities.
- DiRAC: computing for theory and cosmology.
- GridPP: computing for the LHC (+ other HEP).
SCD is internal to STFC; GridPP and DiRAC are external (funded at universities), except that the Tier-1 is run within SCD.
Up until now these have been separate computing "facilities" (but with very good informal co-operation).

STFC computing Strategic Review
"All STFC programme areas anticipate an order of magnitude increase over the next five years in data volumes, with implications for requirements for computing hardware, storage and network bandwidth. Meeting this challenge is essential to ensure the UK can continue to produce world leading science, but in the current financial climate it is clear there will be funding limitations. Long-term planning will be critical, as will more efficient use of resources. The scientific disciplines supported by STFC should work more closely together to find ways of sharing computing infrastructure, with new projects encouraged to make use of existing expertise and infrastructure."
In the future we expect funding only to be available for joined-up computing. But to be clear: we wanted to do this anyway.
In particular, we need to work closely with SKA, EUCLID, aLIGO, and others.

UKT0 – a community initiative
UKT0 is an initiative to bring STFC computing interests together. It was formed bottom-up by the science communities, long before the Strategic Review. Its aims are harmonisation, sharing, cost-effectiveness and avoiding duplication. It is not a project seeking to find users; it is users wanting to work together.
Co-operation across disciplines is now embedded: joint GridPP posts with SKA, LSST and LZ; joint working meetings.
- DiRAC sharing the RAL tape store ✔
- Astronomy jobs run on GridPP ✔
- Lux-Zeplin in production ✔
- Recent aLIGO expansion to use RAL ✔
- Fusion at the Culham Lab ✔

Examples of important UK Astronomy and Particle-Astro computing interests
- LSST: Data Access Centre; Data Challenge 2 in 2018.
- Advanced LIGO: Run 3 with increased sensitivity in 2018.
- Lux-Zeplin: Mock Data Challenge 1 in 2017; MDC2 in 2018.
- EUCLID: galaxy shear simulations in 2018.
- SKA: HQ in Manchester; responsible for the Science Data Processor (P. Alexander); developing a European Science Regional Centre (SRC); Cambridge, Manchester & STFC involvement in AENEAS (H2020); WLCG-SKA meetings, CERN-SKA accord, CERN-SKA "Big Data" workshop in 2018 at the Alan Turing Institute.

UKT0 – a community initiative
The same five-Directorate STFC structure as before, now with UKT0 spanning the existing computing activities: the Scientific Computing Department (SCD), computing for theory and cosmology, computing for national facilities, and computing for the LHC (+ other HEP), plus computing for Astronomy, Particle-Astro and Nuclear physics.

Change of funding agency structure in the UK
Following a national review (the "Nurse Review") it was decided to bring all UK Research Councils (i.e. funding agencies) into a single organisation: UKRI (UK Research and Innovation), which will be born in April 2018.
A UKRI-wide group has been working for some time towards a National eInfrastructure for Research. Currently we are making a case to our ministry (BEIS) for investment in UKRI eInfrastructure.
So the direction of travel in the UK is:
- joined-up (shared) computing across STFC (relevant to this meeting)
- working much more closely with large Astronomy projects
- progress towards a National eInfrastructure for Research
- a push towards the "Cloud" where applicable.

GridPP Tier-2 Site Evolution
- Envisage consolidating LHC disk at 4 sites for ATLAS and 1 for CMS. Other sites may choose to host disk for other VOs.
- Smaller sites will be CPU-dominated with an appropriately sized disk cache (see the sketch below).
- For HEP, the Tier-1 and the 5 large Tier-2 sites become a single, but distributed, UK data lake (HSF CWP)?
(2016 snapshot)
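To make the "CPU-dominated site with a disk cache" idea concrete, the sketch below shows how a job at such a site might read data, falling back to a remote data-lake endpoint over XRootD when a file is not in the local cache. This is an illustration only: the endpoint, file path and cache directory are hypothetical, and the slide does not prescribe XRootD or any particular implementation for the UK data lake.

```python
# Minimal sketch of cache-or-remote data access for a CPU-dominated site.
# Assumptions: the XRootD Python bindings (pyxrootd) are installed; the
# endpoint, file path and cache directory below are hypothetical placeholders.
import os
from XRootD import client
from XRootD.client.flags import OpenFlags

CACHE_DIR = "/scratch/cache"                      # hypothetical local disk cache
REMOTE = "root://datalake.example.ac.uk:1094"     # hypothetical data-lake endpoint
LFN = "/store/atlas/example/AOD.root"             # hypothetical logical file name

def read_first_bytes(lfn, nbytes=1024):
    """Read from the local cache if present, otherwise stream from the data lake."""
    local_copy = os.path.join(CACHE_DIR, lfn.lstrip("/"))
    if os.path.exists(local_copy):
        with open(local_copy, "rb") as f:
            return f.read(nbytes)

    # Cache miss: read the file remotely over the WAN via XRootD.
    with client.File() as f:
        status, _ = f.open(REMOTE + "/" + lfn, OpenFlags.READ)
        if not status.ok:
            raise RuntimeError(f"remote open failed: {status.message}")
        status, data = f.read(offset=0, size=nbytes)
        if not status.ok:
            raise RuntimeError(f"remote read failed: {status.message}")
        return data

if __name__ == "__main__":
    print(f"read {len(read_first_bytes(LFN))} bytes of {LFN}")
```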