Slide 1: GridPP Access for non-LHC Activities
Pete Clarke, University of Edinburgh
PPAP Meeting, Imperial, 24/25th Sep 2015

Slide 2: GridPP Status (see talk from Dave Britton)
GridPP5 was recently renewed in the PPGP round; resources were awarded at ~90% of flat cash.
Features:
– The Tier-1 site at RAL remains.
– Tier-2 sites will be consolidated into ~5 large ones.
– The other Tier-2 sites are retained at a minimal level of staff support.
GridPP strongly wishes to continue supporting non-LHC activities.

Slide 3: Non-LHC usage of GridPP today
[Diagram of current non-LHC users and their workloads]
Communities: T2K, SNO+, NA62, ILC, Hone, phenomenology, biomed, fusion and others.
Activities: neutrino cross-section modelling; neutrino decay and detector simulations; Monte Carlo studies of kaon decays; phenomenology studies and testing of MC generators; beam studies; fusion MC and plasma studies; processing for the Timepix hybrid silicon pixel detector; simulations; medical image analysis; drug discovery; bioinformatics.

Slide 4: Slide from D. Britton's talk yesterday
Between Jan 2012 and Dec 2014, 32 non-LHC VOs used 9% of Tier-2 CPU and 4% of Tier-1 CPU.

Slide 5: The changing landscape
Data rates are increasing very significantly across the science domains:
– No longer just the LHC: SKA will be a major data source, as will others (DLS, telescopes, ...).
– It is a challenge to work out how STFC can support all of these.
Funding realities:
– Flat cash or less?
– All countries are facing this.
EU-T0:
– European funding agencies (STFC, IN2P3, INFN, SARA, CEA, ...) have formed a consortium.
– They all want to see more harmonisation across the communities they support.
UK-T0: an initiative to join up STFC computing across science and facilities (slide at end).
H2020, CSR: if funds are going to be accessible for computing, it will only be for a more joined-up approach.
⇒ Do more, do it for less, be more joined up.

Slide 6: Non-LHC activities: future
All of the foregoing leads to an increased mandate for GridPP to support non-LHC activities.
– It is part of the GridPP5 brief from Swindon.
– This is great: it has always been the spirit of GridPP anyway.
Formal position:
– GridPP welcomes non-LHC activities to discuss sharing the resources.
– You are welcome to raise this through your local GridPP contacts if you have them.
– You can contact me or Jeremy Coles.
– It is helpful if you can provide a document of a few pages describing your computing requirement and your resource-requirement profile.
– A technical recipe is already available on the GridPP website.
– GridPP staff will then liaise with you to discuss timescales and get you going.
– We will assemble a description of all of this for PIs on the website.
Resources:
– To get going, resources are provided within the ~10% allocation for non-LHC work.
– If you have a particularly large CPU and storage requirement, then in due course you will need to seek funding for its marginal cost (see later slide).

Slide 7: Non-LHC support: some of the common services
– GridPP DIRAC (job submission framework) + Ganga (for bulk operations) (see the job-submission sketch below)
– CVMFS (software repository)
– Site resources (hardware at incremental cost)
– GGUS (support/help desk), documentation, examples, user interface
– VOMS (authorisation)
– CA (authentication)
– FTS (bulk file transfers)
– APEL (accounting/usage)
– VO Nagios (monitoring)
Plus access to GridPP expertise and experience.
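As an illustration of how a small community typically drives the GridPP DIRAC service listed above, the sketch below submits a single job through the DIRAC Python API. It is a minimal example only, not taken from the talk: it assumes a configured GridPP DIRAC client and a valid VOMS proxy, and the job name, executable and CPU-time request are placeholders.

```python
# Minimal sketch (not from the talk): submit one job through the DIRAC
# Python API.  Assumes a configured GridPP DIRAC client and a valid
# VOMS proxy; the job name, executable and CPU-time request are placeholders.
from DIRAC.Core.Base.Script import parseCommandLine
parseCommandLine()  # initialise the DIRAC configuration system

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("example_non_lhc_job")
job.setExecutable("/bin/echo", arguments="hello from GridPP")
job.setCPUTime(3600)  # requested CPU time in seconds

dirac = Dirac()
result = dirac.submitJob(job)
if result["OK"]:
    print("Submitted job with ID %s" % result["Value"])
else:
    print("Submission failed: %s" % result["Message"])
```

In practice a new community would first be registered as a VO on the GridPP VOMS server and enabled on the DIRAC instance; the technical recipe on the GridPP website covers those steps.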

Slide 8: Ease of access for new communities
Under the wider "UK-T0" banner it is clear that enabling new and smaller communities in the future will also require development of:
– a "single sign-on" type AAA system (using university credentials);
– a "cloud" deployment, i.e. a facility for you to deploy your virtual environment (a hypothetical sketch follows this slide);
– easy-to-use services for managing and moving even larger data volumes.
No resources were awarded under GridPP5 to develop all of this, but at the margins we are trying:
– some marginal RAL SCD effort, since SCD has responsibilities for all of STFC science;
– H2020 projects such as AARC (authentication) and DataCloud (cloud/virtualisation);
– EGI-funded staff work on community services;
– shared GridPP-SKA and GridPP-LSST posts already in place.
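The talk does not say which cloud stack a "cloud" deployment would use. Purely as an illustration of what "deploy your virtual environment" could look like, here is a hedged sketch using the openstacksdk Python client against a hypothetical OpenStack-style cloud; the cloud name, image, flavour and network are all invented for the example.

```python
# Hypothetical sketch only: boot a community VM on an OpenStack-style
# cloud with openstacksdk.  The cloud, image, flavour and network names
# are placeholders, not real GridPP resources.
import openstack

# Credentials come from clouds.yaml or OS_* environment variables.
conn = openstack.connect(cloud="example-community-cloud")

image = conn.compute.find_image("community-analysis-vm")   # placeholder image
flavor = conn.compute.find_flavor("m1.medium")             # placeholder flavour
network = conn.network.find_network("private")             # placeholder network

server = conn.compute.create_server(
    name="analysis-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print("Server active:", server.name, server.status)
```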

Slide 9: Non-LHC activities ramping up
[Diagram of activities now ramping up]
Communities: DIRAC, LIGO, LOFAR, LSST, QCD, GalDyn, PRaVDA (proton radiotherapy), LZ (data centre at IC), GHOST.
Activities: setting up for TDR simulations; Geant4 Monte Carlo code to fully model the PRaVDA pCT device; GHOST Geant4 simulation of X-ray dose deposition; full-chain analysis for single-orbit simulations; running scalar analysis on ILDG data; backing up >5 PB of data; simulations.

Slide 10: Non-LHC support: LSST
Pre-LSST:
– Pilot activity using DES shear analysis at Manchester (see the Ganga sketch below).
– Joe Zuntz (LSST) and Alessandra Forti (GridPP).
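The talk gives no code for the DES shear pilot, but it is the kind of "many similar jobs over many input files" workload for which Ganga (listed on slide 7 for bulk operations) is intended. The sketch below is hypothetical: the wrapper script, catalogue names and splitter configuration are placeholders, and it assumes a Ganga installation with the GangaDirac plugin pointing at the GridPP DIRAC service.

```python
# Hypothetical bulk-submission sketch with Ganga: one subjob per input
# catalogue, all routed to the DIRAC backend.  Run inside a Ganga session
# (e.g. `ganga bulk_submit.py`); names and paths are placeholders.
from ganga import Job, Executable, ArgSplitter, Dirac

catalogues = ["catalogue_%03d.fits" % i for i in range(10)]

j = Job(name="des_shear_pilot")
j.application = Executable(exe="run_shear.sh")            # placeholder wrapper, assumed available to the job
j.splitter = ArgSplitter(args=[[c] for c in catalogues])  # one subjob per catalogue
j.backend = Dirac()                                       # GridPP DIRAC backend via GangaDirac
j.submit()
```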

Slide 11: Non-LHC support: DiRAC
DiRAC storage:
– Use of the STFC RAL tape store for the DiRAC HPC facility (a hedged transfer sketch follows below).
– Lydia Heck (Durham) and GridPP staff enabled this.
– Excellent co-operation between GridPP and DiRAC.
DiRAC systems: Blue Gene (Edinburgh), Cosmos (Cambridge), Complexity (Leicester), Data Centric (Durham), Data Analytic (Cambridge).
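The slide does not say how the DiRAC data reaches the RAL tape store, but FTS (listed among the common services on slide 7) is the standard tool for managed bulk transfers. The following is a minimal sketch using the fts3 "easy" Python bindings; the FTS endpoint and the source/destination URLs are invented for the example, and a valid X.509 proxy is assumed.

```python
# Minimal sketch: submit one managed transfer to an FTS server.
# The endpoint and URLs are placeholders; a valid grid proxy is assumed.
import fts3.rest.client.easy as fts3

endpoint = "https://fts3.example.ac.uk:8446"   # hypothetical FTS server
context = fts3.Context(endpoint)

source = "gsiftp://dirac-site.example.ac.uk/data/snapshot_001.tar"
destination = "srm://srm-ral.example.ac.uk/tape/backup/snapshot_001.tar"

transfer = fts3.new_transfer(source, destination)
job = fts3.new_job([transfer], verify_checksum=True, retry=3)

job_id = fts3.submit(context, job)
print("Submitted FTS job %s" % job_id)
print("Current state: %s" % fts3.get_job_status(context, job_id)["job_state"])
```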

Slide 12: GHOST: Geant Human Oncology Simulation Tool
One of our most recent use cases comes from the STFC-funded GHOST project, which evaluates late toxicity risk for radiotherapy (RT) patients through Geant4 simulation of X-ray dose deposition (see this talk from GridPP35).
[Figure: "The approach ..."]

Slide 13: UK-T0 meeting
UK-T0 is an initiative to bring STFC science communities together to address future computing and data-centre needs.
The first meeting, arranged for non-pure-PP communities, is on Oct 21/22 at RAL (pure PP communities are already part of GridPP: T2K, NA62, ILC, ...).
To discuss:
– sharing of the infrastructure and services where this makes sense;
– how to ease access for smaller communities;
– how to go for funding opportunities in both the UK and the EU.
Contacted so far: LOFAR, LSST, EUCLID, Advanced LIGO, SKA, DiRAC, Fusion (Culham), LZ, CTA, Facilities computing.
If other experiments/projects/activities are interested, please contact me at the end of the meeting.

Slide 14: Practicalities and caveats
There is no magic wand.
GridPP has been at flat cash for 8 years ⇒ a 19% reduction in resources.
Non-LHC activities are typically not awarded computing capital resources by PPRP, and in some cases are asked to talk to GridPP.
The incremental capital cost of CPU and storage for these activities falls between the cracks:
– If 10 non-LHC activities each required 10% of GridPP, that would double the resource requirement (10 × 10% = 100% extra). This is mitigated by leverage at Tier-2 sites.
– This is as yet an unsolved situation, but we have ideas.
Some key software services that would have helped other smaller communities have had their support cut (e.g. Ganga).
GridPP is aggressively seeking capital resources from outside the science line:
– lobbying for a CSR capital injection;
– working hard to be involved in H2020 bids.

Slide 15: Questions?