1 Pete Clarke, University of Edinburgh
GridPP36, Pitlochry, 12th April 2016
Computing future: News & Views

2 Backdrop
– UK-T0
– SKA
– UK investment in e-Infrastructure
– EOSC

3 Reminder of backdrop
Consolidation and sharing of e-Infrastructure
– Many, many discussions going on around consolidation and sharing of e-Infrastructure
– It's fractal: within WLCG; within STFC across PPAN and facilities (UK-T0); within the UK at RCUK and BIS level; within Europe
Computing Strategy Review
– Published by the last meeting
CSR outcome
– At the PP Town Meeting (Sussex IoP) it was said we face flat cash again -> this will be hard
– This means the pressures of the last few years will continue for the next few years
– Confirms that if nothing else changes, the tension between Consolidated Grants and GridPP will remain
– So the backdrop for STFC communities to get together is stronger than ever

4 Computing Strategic Review
Recommendations cover:
– PPAN
– Facilities
– Engineering
– External
– Hartree Centre
– Cross Programme

5 UK-T0 summary
UK-T0 is established
– STFC management are fully aware of the initiative
– Astronomy activities are fully aware, and open to collaboration
– There is a good spirit of "let's work together"
Meetings so far have proved useful for information sharing
– First face-to-face meeting was Oct 21/22 at RAL (approx. 50 people)
– Monthly telephone meetings: 25th November, 27th January, 24th February
Short-term new community activities, "using what is there in a GridPP-like way"
– DiRAC tape store ✔
– LSST (>160,000 core hours used) ✔
– CCFE ✔
– EUCLID starting
The longer term requires investment
– To develop some national-scale solutions
– To make infrastructure services more agnostic to the user activities
– To provide the activity-specific staff needed by other activities
– To provide the marginal costs of capacity hardware, operations and services staff
– ...

6 Tom's recent summary
#### Communities engaged during GridPP4+
##### Research users
* GalDyn: Contact re-established with the GalDyn group (UCLan) - the contact has now submitted their thesis and is looking to continue research by moving to large-scale simulations on the grid after renewing their expired grid certificate.
* PRaVDA: Contact re-established with Tony Price of the PRaVDA group. It turns out that they have successfully used grid resources for their simulations and have been working on the actual device itself - hence the lack of updates. Simulations to resume. Great support from the local Ganga team (MS/MW), currently looking at integrating file catalogues with DIRAC and Ganga.
* SuperNEMO: looking to resurrect the VO - ongoing. The major barrier seems to be lack of time/resource from the new user(s).
* Climate change (Oxford): the GRIDPP-SUPPORT list is working well to support a new user from a climate change group at Oxford. Some issues with using a GridPP CernVM from the university network - NAT networking not working (see LIGO below). To bypass this they have been given an account on the Oxford cluster (thanks to Ewan M!). Testing now proceeding as planned.
* SNO+: Matt M (QMUL) has been given considerable assistance via the GRIDPP-SUPPORT mailing list with setting up a GridFTP endpoint in the SNO+ cavern to allow data transfers to the grid (i.e. the outside world). Test transfers have been conducted; now waiting for it to be used in anger.
* LSST: Thanks to Alessandra, the LSST effort is now going great guns, with thousands of jobs being submitted to DIRAC using the Ganga interface (see the sketch below).
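The LSST item above mentions bulk job submission to DIRAC through the Ganga interface. As a rough illustration only (this is not the actual LSST workflow; the job name, executable and splitter arguments are hypothetical placeholders), a minimal Ganga script for the DIRAC backend might look like the following, assuming a Ganga installation configured for the GridPP DIRAC service and a valid grid proxy:

```python
# Minimal sketch: submit a batch of subjobs to DIRAC via Ganga.
# Intended to run inside a Ganga session (e.g. `ganga submit_demo.py`),
# where Job, Executable, Dirac and ArgSplitter are already in scope;
# the import below is only needed when Ganga is used as a Python module.
from ganga import Job, Executable, Dirac, ArgSplitter

j = Job(name='lsst-style-demo')                 # hypothetical job name
j.application = Executable(exe='/bin/echo')     # stand-in for the real payload
j.splitter = ArgSplitter(args=[[str(i)] for i in range(100)])  # 100 subjobs
j.backend = Dirac()                             # route submission to DIRAC
j.submit()
```

Ganga handles the splitting into subjobs and the bookkeeping; the DIRAC workload management system then matches the subjobs to available GridPP sites.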

7 Science domains remain "sovereign" where appropriate; share in common where it makes sense to do so
[Diagram: experiment layers (LHC VO management, reconstruction, data management, analysis; SKA; LSST) sit above shared infrastructure: federated HTC clusters, federated data storage, a tape archive, AAI and VO tools, common services (monitoring, accounting, incident reporting), and public & commercial cloud access.]

8 It's fractal - it applies at the UK level
Many discussions going on in this context (PDG, RCUK, NEI meetings)
[Diagram: the same shared-infrastructure picture at UK level, with particle physics and bio communities above federated HTC clusters, federated data storage, a tape archive, AAI and VO tools, common services (monitoring, accounting, incident reporting), and public & commercial cloud access.]

9 SKA: a particular opportunity to collaborate
SKA is set to become one of the largest STFC data-intensive projects
SKA is currently working out what its regional data centres should look like
The European Science Data Centre (ESDC) is important to STFC
– H2020 project submitted (AENEAS)
– Response to a targeted H2020 call to design the SKA ESDC (European Science Data Centre)
– The key work package is led by Manchester astronomers
– SCD/RAL are partners and helped in the preparation
– The SKA work package leader is keen to collaborate with GridPP
This is an opportunity to cross boundaries, and be very helpful.

10 Case for e-Infrastructure investment
PPAN science + STFC facilities need substantial investment in e-Infrastructure
– We are involved in making the case "up the chain" for e-Infrastructure funds
– This is through: the Project Directors Group (PDG), the RCUK e-Infrastructure Group, UK NEI meetings
The famous spreadsheet
– In mid January a spreadsheet was submitted to BIS via RCUK
– This contained a substantive line indicating the requirements for PPAN computing for 5 years
– Many 10s of £M
– Included a staff element
– [This was an update of a previous version which had separate GridPP/SKA/Astro-archive... lines]
Timeline
– Jan: STFC in agreement
– Jan 22: RCUK+PDG+BIS meeting -> wants cases
– March: CSR outcome known -> unclear if any of this is expected to be within CSR allocations
– March 14th: PDG-RCUK-NEI meeting to discuss further
– April 26th: next PDG-RCUK-NEI meeting
BIS have the information. It is essential to keep pushing on this.

11 European Open Science Cloud (EOSC)
Seen as an important EU theme
– Many discussions/workshops etc. have been held
However, there is still much uncertainty as to what it is:
– A widely scoped thing to serve all possible communities
– Mainly about a data cloud, and making data more available
– Nothing to do with cloud infrastructure as we mean it
It remains unclear what is expected in response to the current call INFRADEV-4 "European Open Science Cloud": 10M Euro
Unclear who has the mandate/will/ability to lead it
– Many discussions with other EU funding agencies (through EU-T0), and with RCUK
– A set of discussions with EU funding agencies, CERN, EGI and EUDAT
– ...
Best guess
– This won't affect any computing infrastructure for LHC any time soon
– It won't provide any co-funding any time soon
– We (CERN, STFC, and us) are tightly involved in trying to "direct it"
– We have to position ourselves to be good, supportive partners

12 Hybrid Cloud infrastructure
Hybrid Clouds - CERN are pushing this

13 Discussion - Questions?