Report from Computing Advisory Panel


Arttu Rajantie (Imperial College London)
21/12/2016

Computing Advisory Panel
The purpose of the Computing Advisory Panel (CAP) is to advise the STFC Executive on the strategy for, and management of, the provision of computing resources (including data handling, data storage, software and hardware provision, skills, and developments in high performance and high throughput computing) in support of programmes either funded or delivered by STFC.

CAP Membership
- Professor Stephen Fairhurst, Cardiff University (Chair)
- Professor David Colling, Imperial College London
- Professor Martin Dove, Queen Mary University of London
- Dr David Fergusson, The Francis Crick Institute
- Dr John Pasley, University of York
- Professor Arttu Rajantie, Imperial College London
- Dr Debora Sijacki, University of Cambridge
- Dr Stuart Sim, Queen's University Belfast
- Mr Ash Vadgama, AWE
- Professor Daniel Watts, University of Edinburgh

STFC Computing Strategy
Computing Strategic Review, December 2015:
http://www.stfc.ac.uk/files/corporate-publications/computing-strategic-review-december-2015/

STFC Computing
- PPAN (Particle Physics, Astronomy and Nuclear Physics) research
- STFC facilities (Central Laser Facility, ISIS, Diamond, ...)
- Computing facilities hosted by STFC
- Hartree Centre (collaborative R&D with industry)

PPAN Computing
- Rapidly increasing computing requirements
- Included in the ongoing Balance of Programmes exercise
- Experiments: GridPP (LHC), SKA, LSST, CTA
- DiRAC (Distributed Research using Advanced Computing)
- Grant-funded computing

Particle Theory Computing
- Lattice QCD
- Particle phenomenology
- Particle cosmology
- Other areas?

DiRAC
- HPC (High Performance Computing) facility for particle physics, astronomy and nuclear physics
- DiRAC 1 launched in 2009 with £12M Government funding
- DiRAC 2 in 2012 with £15M Government funding
- Supports over 400 researchers at 35 UK HEIs and £50M of funded research grants across PPAN (DiRAC Annual Report 2013)
- Time awarded by the DiRAC Resource Allocation Committee

DiRAC 2
Five systems designed for different physics problems:
- Cambridge HPCS Data Analytic Cluster (9600 cores, 4 GB RAM/core, 0.75 PB)
- Cambridge COSMOS Shared Memory Service (1856 cores, 14.8 TB RAM, 146 TB)
- Leicester Complexity Cluster (4352 cores, 8 GB RAM/core, 0.8 PB)
- Durham Data Centric Cluster (6740 cores, 8 GB RAM/core, 2 PB)
- Edinburgh BlueGene/Q (98304 cores, 1 GB RAM/core, 1.2 PB)
Now over four years old
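For a rough sense of the aggregate scale, the short sketch below (an illustration added here, not part of the original slides) simply sums the figures quoted above; it assumes the per-core RAM values apply uniformly across each machine and treats the COSMOS memory and disk as quoted system totals.

# Illustrative aggregation of the DiRAC 2 figures quoted above (assumption: per-core RAM is uniform per machine)
per_core_systems = {
    # name: (cores, RAM in GB per core, disk in PB)
    "Cambridge HPCS Data Analytic Cluster": (9600, 4, 0.75),
    "Leicester Complexity Cluster": (4352, 8, 0.8),
    "Durham Data Centric Cluster": (6740, 8, 2.0),
    "Edinburgh BlueGene/Q": (98304, 1, 1.2),
}
# COSMOS is quoted with total RAM (14.8 TB) and 146 TB (0.146 PB) of disk
cosmos_cores, cosmos_ram_tb, cosmos_disk_pb = 1856, 14.8, 0.146

total_cores = sum(c for c, _, _ in per_core_systems.values()) + cosmos_cores
total_ram_tb = sum(c * gb for c, gb, _ in per_core_systems.values()) / 1024 + cosmos_ram_tb
total_disk_pb = sum(pb for _, _, pb in per_core_systems.values()) + cosmos_disk_pb

print(f"DiRAC 2 combined: ~{total_cores} cores, ~{total_ram_tb:.0f} TB RAM, ~{total_disk_pb:.1f} PB disk")
# -> DiRAC 2 combined: ~120852 cores, ~235 TB RAM, ~4.9 PB disk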

Future of DiRAC
DiRAC 3?
- Roughly 10x increase in computing power
- Hoped for in 2015

Future of DiRAC
DiRAC 2.5: keep current systems operational for now
- Upgrade to Cambridge
- Ex-Hartree hardware to Edinburgh and Durham
DiRAC 3? Dependent on government funding

Other Issues
- Open data: Data Management Plans; data storage, sharing and curation
- People: training; software engineers and data scientists to make the best use of hardware

Your Input
Let us know your views, ideas and concerns regarding STFC computing.
My email: a.rajantie@imperial.ac.uk