Oxford Update - HEPiX, October 2014. Pete Gronbech, GridPP Project Manager.


Oxford Particle Physics - Overview
Oxford University has one of the largest Physics Departments in the UK: ~450 staff and 1100 students. The Particle Physics sub-department is the largest in the UK.
We now support Windows, Linux (Ubuntu) and Mac on the desktop across the department.
– Ubuntu: Cobbler is used to install the base system, Cfengine for package control and configuration.
Two computational clusters for the PP physicists:
– Grid Cluster, part of the SouthGrid Tier-2
– Local Cluster (AKA Tier-3)
A common Cobbler and Puppet system is used to install and maintain all the SL6 systems.
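As a minimal, hedged sketch of how such a Cobbler-managed inventory can be inspected programmatically (assuming the read-only get_systems() call of Cobbler's XML-RPC API; the server hostname below is a hypothetical placeholder, not an actual Oxford machine):

```python
# Hedged sketch: list systems registered on a Cobbler provisioning server.
# Assumes Cobbler's XML-RPC endpoint at /cobbler_api and its read-only
# get_systems() call; the hostname is a hypothetical placeholder.
import xmlrpc.client

cobbler = xmlrpc.client.ServerProxy("http://cobbler.example.ac.uk/cobbler_api")
for system in cobbler.get_systems():
    # Each entry is a dict of system attributes, e.g. name and assigned profile.
    print(system["name"], system.get("profile", "?"))
```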

UK GridPP Computing Structure
One Tier-1 centre: RAL.
18 university Particle Physics departments, each part of a regional Grid Tier-2 centre. Most have some local computing clusters (AKA Tier-3).
Oxford is part of SouthGrid, which comprises all the non-London sites in the South of the UK: Birmingham, Bristol, Cambridge, JET, Oxford, RAL PPD and Sussex.

UKI Tier-1 & Tier-2 contributions The UK the second largest contributor to the WLCG. (~11% cf. 28% for USA) Accounting for the last year. Tier-1 accounts for~31% Tier-2s share as below 4 Lincoln - October 2014

UK Grid Job Mix

UK Tier-2 reported CPU – historical view to present, comparing with the last update in 2012.

SouthGrid sites accounting, as reported by APEL.

Oxford Grid Cluster
Upgrades over the last year to increase the storage capacity, plus some modest CPU upgrades.
15 Dell R720xd servers (12 × 4 TB drives each, raw capacity). Note we used SATA rather than SAS this time; we are unlikely to do so in future (SAS is becoming the default and SATA support costs extra).
The SE is running DPM.
Three 'twin-squared' Viglen Supermicro HX525T2i worker-node chassis have been installed. Intel E5-2650v2 8-core CPUs (16 hyper-threaded cores each) provide 384 job slots with 2 GB RAM per slot (arithmetic sketched below).
Two thirds of the Grid Cluster is now running HTCondor behind an ARC CE; the remaining third runs legacy Torque/Maui driven by a CREAM CE.
Current capacity: 16,768 HS06, … TB.
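As a quick check of those figures (a minimal sketch; the assumption that each 'twin-squared' chassis holds four dual-socket nodes is the usual layout and is not stated explicitly on the slide):

```python
# Job-slot and storage arithmetic behind the figures on this slide.
chassis = 3                 # 'twin-squared' Viglen Supermicro HX525T2i chassis
nodes_per_chassis = 4       # assumed: a twin-squared chassis holds 4 nodes
cpus_per_node = 2           # assumed: dual-socket E5-2650v2 nodes
ht_cores_per_cpu = 16       # 8 physical cores with hyper-threading enabled

job_slots = chassis * nodes_per_chassis * cpus_per_node * ht_cores_per_cpu
print(job_slots)            # 384 job slots, as quoted

raw_storage_tb = 15 * 12 * 4  # 15 Dell R720xd servers, 12 x 4 TB drives each
print(raw_storage_tb)         # 720 TB of new raw disk
```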

Intel E5-2650 v2 SL6 HEPSPEC06: average result, and % improvement over the 2650 v1 on SL5.

Power usage – twin-squared chassis: max 1165 W, idle 310 W.
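For context (a rough, assumption-laden figure: it takes the 128 job slots per chassis implied by the previous slide), the peak draw works out to roughly 9 W per job slot:

```python
# Rough per-slot power, assuming 128 job slots per twin-squared chassis
# (384 slots across the three chassis described on the earlier slide).
slots_per_chassis = 384 // 3
print(round(1165 / slots_per_chassis, 1))  # ~9.1 W per slot at full load
print(round(310 / slots_per_chassis, 1))   # ~2.4 W per slot when idle
```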

Oxford's Grid Cluster

Begbroke Computer Room

Local computer room, showing the PP cluster and cold-aisle containment
Very similar hardware to the Grid Cluster, with the same Cobbler and Puppet management setup.
Lustre is used for the larger groups.
Capacity: … HS06, 716 TB.

Networking
~900 MB/s ≈ 7.2 Gbit/s.
The University had a 10 Gbit link to JANET with a 10 Gbit failover link. A third link was added, and the Grid traffic was routed exclusively down that in August.
Plots from March 2014 (ATLAS transfer from BNL).
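The quoted rate is simply the byte-to-bit conversion; a one-line check:

```python
# Convert the observed transfer rate from MB/s to Gbit/s (1 byte = 8 bits).
rate_mb_per_s = 900
print(rate_mb_per_s * 8 / 1000)  # 7.2 Gbit/s, matching the figure on the slide
```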

Other Oxford Work
CMS Tier-3
– Supported by RALPPD's PhEDEx server.
– Useful for CMS, and for us, keeping the site busy in quiet times.
– However, it can block ATLAS jobs, which is not so desirable during the accounting period.
ALICE Support
– There is a need to supplement the support given to ALICE by Birmingham.
– It made sense to keep this within SouthGrid, so Oxford has deployed an ALICE VO box.
UK Regional Monitoring
– Oxford runs the Nagios-based WLCG monitoring for the UK.
– This includes the Nagios server itself and its support nodes: SE, MyProxy and WMS/LB.
– Multi-VO Nagios monitoring was added two years ago.
IPv6 Testing
– We take a leading part in the IPv6 testing; many services have been enabled and tested by the community (see the sketch below).
– perfSONAR is IPv6-enabled; the RIPE Atlas probe is also on IPv6.
Cloud Development
– OpenStack test setup (has run ATLAS jobs).
– VAC setup (LHCb, ATLAS and GridPP DIRAC server jobs).
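As a minimal illustration of the kind of dual-stack check such IPv6 testing involves (the hostname and port below are hypothetical placeholders, not actual Oxford services):

```python
# Verify that a service publishes an AAAA record and accepts TCP connections
# over IPv6. Hostname and port are hypothetical placeholders.
import socket

host, port = "perfsonar.example.ac.uk", 443

# Restricting getaddrinfo to AF_INET6 returns only IPv6 (AAAA) results,
# or raises socket.gaierror if the host has no IPv6 address.
for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
        host, port, socket.AF_INET6, socket.SOCK_STREAM):
    with socket.socket(family, socktype, proto) as sock:
        sock.settimeout(5)
        sock.connect(sockaddr)   # OSError here means unreachable over IPv6
        print(f"{host} reachable over IPv6 at {sockaddr[0]}")
        break
```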

Conclusions
Recent hardware purchases have provided both storage capacity and CPU performance improvements.
Good network connectivity.
Solid computer rooms.
A medium-sized Grid site, but with involvement in many development projects.

HEPiX Spring 2015 is coming to Oxford.

Other Oxford Attractions!

Including Oxford Physics.