– please look at our wiki!
Part of the new national e-Infrastructure.
Provides advanced IT services for theoretical and simulation research in Particle Physics, Particle Astrophysics, Cosmology, Astrophysics and Solar System Science [90%], and for Industry, Commerce and the Public Sector [10%].
Given the mission of investigating and deploying cutting-edge technologies (hardware and software) in a production environment.
Erk! We are a play pen as well as a research facility.

Community Run
DiRAC uses the standard Research Council Facility model:
– Academic community oversight and supervision of technical staff
– Regular reporting to the Project Management Board, including science outcomes
– Outcome-driven resource allocation: no research outcomes, no research
Sound familiar?

Structure
Standard Facility Model:
– Oversight Committee (meets twice yearly); Chair: Foster (CERN)
– Project Management Board (meets monthly); Chair: Davies (Glasgow)
– Technical Working Group (meets fortnightly); Chair: Boyle (Edinburgh)
– Project Director: Yates (UCL)
The PMB sets strategy and policy and considers reports on equipment usage and research outcomes (10 members). TWG members deliver HPC services and undertake projects for the Facility (8 members).

Computational Methods
– Monte Carlo: particle-based codes for CFD and the solution of PDEs
– Matrix methods: Navier-Stokes and Schrödinger equations
– Integrators: Runge-Kutta methods for ODEs (sketched below)
– Numerical lattice QCD calculations using Monte Carlo methods
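The slide only names the method families; as a purely illustrative sketch (not one of the DiRAC production codes), here is a classical fourth-order Runge-Kutta step in Python for an ODE dy/dt = f(t, y), with a small decay-equation example.

```python
# Minimal illustration of the "Integrators: Runge-Kutta" item above.
# Not DiRAC code: just a textbook RK4 step for dy/dt = f(t, y).

def rk4_step(f, t, y, h):
    """Advance y(t) by one step of size h using classical RK4."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# Example: exponential decay dy/dt = -y with y(0) = 1, integrated to t = 1.
t, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
print(t, y)  # y should be close to exp(-1) ≈ 0.3679
```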

Where are we going – DiRAC-2
– 22 September 2011: DiRAC invited to submit a one-page proposal to BIS to upgrade systems and make them part of the baseline service for the new National e-Infrastructure. Assisted by STFC (Trish, Malcolm and Janet).
– Awarded £14M for compute and £1M for storage. Strict payment deadlines (31/03/2012) imposed.
– Reviewed by the Cabinet Office under the Gateway Review mechanism; payment profiles agreed with BIS.

How has it gone?
– Owning kit and paying for admin support at our sites works best; a hosting solution seems to deliver by far the most.
  – Rapid deployment of 5 systems using local HEI procurement.
– Buying access via SLA/MoU is simply not as good: you are just another user, and SLAs don't always exist!
– Excellent research outcomes: papers are flowing from the service.
– Had to consolidate from 14 systems to 5 systems.

New Systems III
Final system specs below. Costs include some Data Centre capital works.

System (supplier)        Tflop/s               Connectivity  RAM    PFS          Cost /£M
BG Q (IBM)               540 (total now 1300)  5D Torus      16TB   1PB          6.0
SMP (SGI)                42                    NUMA          16TB   200TB        1.8
Data Centric (OSF/IBM)   135                   QDR IB        56TB   2PB usable   3.7
Data Analytic (DELL)     50% of 200 Tflop/s    FDR (Mell)    38TB   2PB usable   1.5
Complexity (HP)          90                    FDR (Mell)    36TB   0.8PB        2.0

User Access to Resources
– Now have an independent peer review system: people apply for time, just like experimentalists do! First call: 21 proposals – heavily contended.
– Central budget from STFC for power and support staff (4 FTE). Will need to leverage users' local sysadmin support to assist with DiRAC.
– We do need a cadre of developers to parallelise and optimise existing codes and to develop new applications.
– Working with vendors and industry to attract more funds.
– Created a common login and reporting environment for DiRAC using the EPCC-DL SAFE system – crude identity management.

TWG Projects
– In progress: authentication/access for all our users to the allowed resources, using an SSH and database-updating kludge (a hypothetical sketch follows below).
– In progress: usage and system monitoring, using SAFE initially.
– Network testing in concert with Janet.
– GPFS multi-clustering: multi-clustering enables compute servers outside the cluster serving a file system to remotely mount that file system.
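The slides give no detail of the SSH/database mechanism, so the following is only a hypothetical sketch of what such a kludge might look like: approved users' public keys are pulled from a central database and written into authorized_keys on each host. The schema (dirac_users, ssh_pubkey, active) and paths are invented for illustration.

```python
# Hypothetical sketch, not the actual DiRAC mechanism: sync a user's
# authorized_keys from keys held in a central (here SQLite) database.
import sqlite3
from pathlib import Path

def sync_authorized_keys(db_path: str, user: str, home: str = "/home") -> None:
    """Rewrite a user's authorized_keys from active keys in the database."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT ssh_pubkey FROM dirac_users WHERE username = ? AND active = 1",
        (user,),
    ).fetchall()
    conn.close()

    ssh_dir = Path(home) / user / ".ssh"
    ssh_dir.mkdir(mode=0o700, parents=True, exist_ok=True)
    keys_file = ssh_dir / "authorized_keys"
    keys_file.write_text("".join(key + "\n" for (key,) in rows))
    keys_file.chmod(0o600)

# Example (placeholder paths):
# sync_authorized_keys("/srv/dirac/users.db", "alice")
```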

GPFS Multi-Clustering
– Why might we want it? You can use ls, cp, mv etc. Much more intuitive for humble users, and no ftp-ing involved.
– Does it work over long distances (WAN)? Weeell – perhaps.
– Offer from IBM to test between Edinburgh and Durham, both DiRAC GPFS sites. Would like to test simple data replication workflows (see the sketch after this list).
– Understand and quantify identity and UID mapping issues. How yuk are they? Can SAFE help sort these out, or do we need something else?
– Extend to the Hartree Centre, another GPFS site. Perform more complex workflows.
– Extend to Cambridge and Leicester – non-IBM sites. IBM want to solve interoperability issues.
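As a rough illustration of the "simple data replication workflow" idea: with multi-clustering the remote file system appears as an ordinary mounted path, so replication reduces to standard file operations. The mount points and file pattern below are assumptions for the example, not the project's actual layout.

```python
# Illustrative sketch only, not DiRAC tooling: copy simulation snapshots to a
# remotely mounted GPFS file system exposed via multi-clustering.
import shutil
from pathlib import Path

LOCAL_GPFS = Path("/gpfs/edinburgh/cosmo_runs")   # assumed local mount point
REMOTE_GPFS = Path("/gpfs/durham/cosmo_replica")  # assumed remote mount via multi-cluster

def replicate(pattern: str = "*.h5") -> None:
    """Copy any snapshot files missing (or incomplete) at the remote site; no ftp needed."""
    REMOTE_GPFS.mkdir(parents=True, exist_ok=True)
    for src in LOCAL_GPFS.glob(pattern):
        dst = REMOTE_GPFS / src.name
        if not dst.exists() or dst.stat().st_size != src.stat().st_size:
            shutil.copy2(src, dst)  # preserves timestamps; UID mapping issues remain

if __name__ == "__main__":
    replicate()
```

The point of the sketch is the design choice the slide highlights: once the file system is remotely mounted, ordinary tools (ls, cp, mv, or a few lines of Python) replace bespoke transfer machinery, while identity/UID mapping remains the open question.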

The Near Future
– New management structure: Project Director (me) in place and Project Manager now funded at the 0.5 FTE level; 1 FTE of system support at each of the four hosting sites.
– Need to sort out sustainability of the Facility – tied to the establishment of the N E-I LC's 10-year capital and recurrent investment programme (~April 2014?).
– We can now perform research-domain leadership simulations in the UK.
– Need to federate access/authentication/monitoring systems: middleware that actually works, is easily usable AND is easy to manage.
– Need to federate storage to allow easier workflows and data security.

Some Theoretical Physics – The Universe (well, 12.5% of it)