Slide 1: GridPP
Prof. David Britton, GridPP Project Leader, University of Glasgow
UK-T0 Meeting, 21st Oct 2015

Slide 2: Outline
- Overview of GridPP
- Past and future growth
- GridPP5
- Key messages

Slide 3: GridPP in 2015
GridPP is ~10% of WLCG: 18 sites; 57k CPU; 32 PB disk; 14 PB tape.
WLCG overall: 170+ sites in 42 countries; 515k CPU; 290 PB disk; 150 PB tape.
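The ~10% figure follows directly from the numbers on the slide; a quick arithmetic check (a minimal sketch, taking the slide's rounded values at face value):

```python
# Sanity-check the "GridPP is ~10% of WLCG" claim from the quoted figures.
gridpp = {"cpu": 57_000, "disk_pb": 32, "tape_pb": 14}
wlcg   = {"cpu": 515_000, "disk_pb": 290, "tape_pb": 150}

for resource in gridpp:
    share = gridpp[resource] / wlcg[resource]
    print(f"{resource}: {share:.0%}")  # cpu: 11%, disk_pb: 11%, tape_pb: 9%
```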

Slide 4: Evolving Architecture
Evolution of computing models: from hierarchy to mesh. Network capabilities and data-access technologies have significantly improved our ability to use resources independent of location. Jobs still go to the data, but jobs can fall back to remote data access if local copies are not available. Tier-1 and Tier-2 sites still offer different capabilities and levels of service.
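To illustrate the fall-back pattern just described, a minimal sketch of the logic a job wrapper might apply: prefer a local replica, otherwise read (or stage in) the file over the WAN through an XRootD federation. The mount point, redirector hostname, and function names are hypothetical illustrations, not GridPP's actual configuration; real jobs resolve replicas through the experiment's data-management system.

```python
import os
import subprocess

LOCAL_PREFIX = "/grid/data"                       # hypothetical site storage mount
REMOTE_REDIRECTOR = "root://xrootd.example.org/"  # hypothetical XRootD redirector

def resolve_input(lfn: str) -> str:
    """Return a local path if a replica exists at this site,
    otherwise an XRootD URL for remote (WAN) access."""
    local_path = os.path.join(LOCAL_PREFIX, lfn.lstrip("/"))
    if os.path.exists(local_path):
        return local_path               # jobs-to-data: use the local copy
    return REMOTE_REDIRECTOR + lfn      # fall back to remote data access

def stage_in(lfn: str, scratch: str) -> str:
    """Variant: copy a remote file to local scratch with xrdcp before running."""
    source = resolve_input(lfn)
    if source.startswith("root://"):
        subprocess.run(["xrdcp", source, scratch], check=True)
        return scratch
    return source
```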

Slide 5: GridPP Hardware
Based mostly on off-the-shelf hardware:
- Compute: typically x86_64 Intel Xeon architecture; minimum 2 GB RAM per core; 30-50 GB scratch per core; 1 or 10 Gbps networking.
- Storage: 2/4 TB disks in RAID-6 configuration; 10 Gbps networking (see the capacity sketch below).
Dedicated tape storage facility at the RAL Tier-1.
~12 PB disk at the Tier-1 and ~20 PB at the Tier-2 sites.
~16k logical CPUs at the Tier-1 and ~40k at the Tier-2 sites.
The Tier-1 has dedicated (redundant) 2x10 Gbps optical links to CERN (OPN) and 2x30 Gbps links to JANET; Tier-2 sites typically have 10 Gbps connections to JANET.
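To make the storage specification concrete, a worked RAID-6 capacity calculation (RAID-6 dedicates the equivalent of two disks per array to parity; the 16-bay node is an illustrative assumption, not a stated GridPP configuration):

```python
def raid6_usable_tb(n_disks: int, disk_tb: float) -> float:
    """RAID-6 stores dual parity, costing the equivalent of two disks,
    so usable capacity is (n - 2) * disk size."""
    assert n_disks >= 4, "RAID-6 requires at least 4 disks"
    return (n_disks - 2) * disk_tb

# Illustrative: a 16-bay node populated with the 4 TB disks quoted above.
print(raid6_usable_tb(16, 4.0))  # 56.0 TB usable per node
```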

Slide 6: RAL Tier-1 Resource Growth
[Chart: CPU capacity growth over time, annotated with LHC turn-on, Higgs discovery, the shutdown, and scale tests.] Similar growth in storage.

Slide 7: Scale of Global Operation (ATLAS Experiment)

Slide 8: LHC and More
9% of Tier-2 CPU and 4% of Tier-1 CPU were delivered to 32 non-LHC VOs between Jan 2012 and Dec 2014.

Slide 9: Looking Ahead
[Chart: projected growth in data volume.]

Slide 10: Complexity
[Event displays: a Z→μμ decay with 25 vertices (April 15th 2012), and a simulated event display at 140 pile-up (102 vertices).]

Slide 11: Our Challenge
Volume: >200 PB at present (on disk for ATLAS alone); this will grow by a factor of >10 by Run-4.
Complexity: pile-up increasing from 23 to 140 by Run-4 increases the computational problem super-linearly.
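A toy estimate of what "super-linearly" implies for CPU needs. The quadratic exponent below is an illustrative assumption (combinatorial costs in reconstruction are often modelled as worse than linear in pile-up), not a figure quoted in this talk:

```python
# Toy model: reconstruction cost per event ~ (pile-up)^k with k > 1.
# k = 2 is an illustrative assumption, not a measured LHC/GridPP number.
pu_now, pu_run4 = 23, 140
k = 2
cost_ratio = (pu_run4 / pu_now) ** k
print(f"~{cost_ratio:.0f}x more CPU per event")  # ~37x with k = 2
```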

Slide 12: Service Decomposition
GridPP's value is the support and provision of a service layer that lies between the hardware and the software and enables the LHC computing models. The majority of GridPP effort is not used to run hardware, and GridPP staff do not write experiment software. (AiBM)

Slide 13: EGI Services
Services required by WLCG that are provided via the EGI project: the UK leads two of these, co-leads a third, and contributes to two more.

Slide 14: WLCG Services
WLCG services not provided via EGI: the UK contributes to 13 services and co-leads two. Most WLCG partners contribute to most of these shared international tasks.

Slide 15: UK Services
These are services that every country needs to provide for the benefit of its own infrastructure: 14 tasks that are core business for GridPP. Some of these services do (and increasingly can) benefit other UK infrastructures.

Slide 16
[Diagram: federated HTC clusters and federated data storage (note: this does NOT mean centralised), plus a tape archive, under a virtual platform of shared services (monitoring, accounting, incident reporting, AAAI, VO tools). The LHC VOs (management, reconstruction, data management, analysis) use these alongside other communities such as LSST and SKA. Caption: "Share in common where it makes sense to do so".]

Slide 17: GridPP5
The GridPP5 proposal has just been funded (April 2016; 4 years). We presented ambitious plans to evolve our Tier-2 infrastructure to a sustainable model based on 5 large sites, complemented by smaller sites that can be run with little (0.5 FTE) manpower. We recognised the opportunity, particularly at the Tier-1, to optimise productivity by sharing the e-Infrastructure to improve cost efficiency and reduce duplication. We believe there are mutually beneficial opportunities in engaging with existing and emerging groups that require "e-infrastructure", under the UK-T0 and EU-T0 monikers.

Slide 18: Key Messages - 1
GridPP runs an HTC grid infrastructure primarily for LHC data, but the hardware is sufficiently generic that it can be widely useful, and the middleware is becoming less bespoke. There is a lot of knowledge and skill within GridPP, particularly around handling large volumes of data (transport; metadata; processing). There is a lot of experience within the LHC experiments in running global-scale, complex infrastructures. We don't suggest that others should necessarily do things our way, but we do have expert knowledge in some, but not all, areas that might be exploited.

Slide 19: Key Messages - 2
We have some limited hardware capacity that you can use for free to develop, test, and scale-test things. We have some (very) limited manpower that can work with you to get you going. We are keen to explore economies of scale by aligning our infrastructure with yours where it makes mutual sense to do so. We are keen to enter peer relationships with other groups, particularly STFC-funded projects, under the auspices of UK-T0, to show a return on investment in GridPP (aka "Impact") that goes beyond the production of our headline science.