UKI-SouthGrid Overview
Pete Gronbech, SouthGrid Technical Coordinator
GridPP 25 - Ambleside, 25th August 2010

Seven(-teen) Sisters

UK Tier 2 reported CPU – Historical View to present

SouthGrid Sites Accounting as reported by APEL
Sites upgrading to SL5 and recalibration of published SI2K values

Site Resources
Per-site table of HEPSPEC06, CPU (kSI2K, converted from the HEPSPEC06 benchmarks) and Storage (TB) for EDFA-JET, Birmingham, Bristol, Cambridge, Oxford and RALPPD, with totals.
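The CPU column above was derived from HEPSPEC06 benchmark runs. A minimal sketch of that conversion, assuming the commonly quoted WLCG factor of 4 HEPSPEC06 per kSI2K (the factor is an assumption here, not taken from the slide, and the site totals are illustrative only):

```python
# Sketch of converting benchmarked HEPSPEC06 totals into published kSI2K,
# as referenced in the Site Resources table. The factor of 4 HS06 per kSI2K
# is the commonly quoted WLCG conversion and is assumed, not taken from the
# slide; the example totals are illustrative only.
HEPSPEC06_PER_KSI2K = 4.0

def hepspec06_to_ksi2k(hs06):
    """Convert a HEPSPEC06 total into the equivalent kSI2K figure."""
    return hs06 / HEPSPEC06_PER_KSI2K

if __name__ == "__main__":
    example_totals = {"SiteA": 1772.0, "SiteB": 6600.0}  # illustrative HS06 totals
    for site, hs06 in example_totals.items():
        print("%s: %.0f HS06 -> %.0f kSI2K" % (site, hs06, hepspec06_to_ksi2k(hs06)))
```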

GridPP3 h/w generated MoU for 2010, 2011 and 2012
Table of pledged storage (TB) and HS06 by year for bham, bris, cam, ox and RALPPD.

JET
Stable operation (SL5 WNs). Could handle more opportunistic LHC work.
1772 HS06, 1.5 TB

Birmingham
– Just purchased 40 TB of storage – total storage goes to 10 TB + 6*20 + 2*40 = 210 TB in a week or two
– Two new 64-bit servers: (SL5) site BDII + monitoring VMs, (SL5) DPM head node
– Everything (except mon) is SL5
– Both clusters have dual lcg-CE/CreamCE front ends
– Sluggish response/instabilities with GPFS on the Shared Cluster – installed a 4 TB NFS-mounted file server for experiment software/middleware/user areas
Taken on someone else's proprietary (non-SL5) smart phone – he couldn't get a signal in there either.

Birmingham

Bristol LCG
– StoRM SE with GPFS, 102 TB, 90% full of CMS data
– StoRM developers are finishing testing on SL5 64-bit and plan to provide builds for both slc4 ia32 and sl5 x86_64 to Early Adopters this month (August). Bristol is waiting for a stable, well-tested StoRM v1.5 SL5 64-bit release.
– In the meantime Bristol's StoRM v1.3 (32-bit on SL4) is working very well! On a 1 Gbps network, getting good bandwidth utilisation (see the rough calculation below); servers (StoRM & gridftp) very responsive despite load.
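As a rough idea of what good utilisation of a 1 Gbps link means for an SE of this size, here is a back-of-envelope sketch; the 80% efficiency figure is an assumption for illustration, not a Bristol measurement:

```python
# Back-of-envelope sketch: sustained throughput on a 1 Gbps link and the time
# it would take to move ~100 TB at that rate. The 80% efficiency figure is an
# assumption for illustration, not a Bristol measurement.
LINK_GBPS = 1.0
EFFICIENCY = 0.8        # assumed fraction of line rate actually sustained
SE_TB = 102.0           # size of the Bristol StoRM SE quoted above

bytes_per_s = LINK_GBPS * 1e9 / 8.0 * EFFICIENCY      # ~100 MB/s
days_to_move = SE_TB * 1e12 / bytes_per_s / 86400.0
print("~%.0f MB/s sustained -> ~%.0f days to move %.0f TB"
      % (bytes_per_s / 1e6, days_to_move, SE_TB))
```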

HDFS with StoRM
– Prior WNs: Intel Xeon 2.0 GHz; Dec 2009 new WNs: AMD 2.4 GHz
– Each AMD WN has 2 x 1 TB drives; part of one disk is used as WN space
– Dr Metson is experimenting with HDFS using the rest of that disk plus the 2nd disk, and working with INFN on the possibility of StoRM on top of HDFS
– Also experimenting with using Hadoop to process CMS data (a small sketch of driving the hadoop CLI follows below)
In Other News...
– Swingeing IT staff cuts are being planned at U Bristol (and downgrades for the few remaining)
– Started planning for SouthGrid to take over Bristol LCG site admin from April 2011
– Consolidate & reduce PP servers so the Astro admin can inherit them
– PP staff will best-effort support the Bristol AFS server (IS won't)
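A small sketch of the kind of HDFS bookkeeping involved in the experiment above, driving the standard `hadoop fs` CLI from Python; the /cms path is hypothetical and this is not Bristol's actual tooling:

```python
# Rough sketch of inspecting an experimental HDFS pool with the standard
# `hadoop fs` CLI driven from Python. The /cms path is hypothetical and the
# hadoop client is assumed to be on PATH; this is illustrative only, not the
# actual Bristol setup.
import subprocess

def hdfs_usage(path="/cms"):
    """Return the `hadoop fs -du -s` summary for a path, or None on error."""
    try:
        result = subprocess.run(["hadoop", "fs", "-du", "-s", path],
                                capture_output=True, text=True, check=True)
        return result.stdout.strip()
    except (OSError, subprocess.CalledProcessError):
        return None

if __name__ == "__main__":
    summary = hdfs_usage()
    print(summary if summary else "could not query HDFS")
```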

Bristol
Plan to try to run the CEs and other control nodes on virtual machines using an identical setup to Oxford, to enable remote management. The StoRM SE on GPFS will be run by Bob Cregan on site.

Cambridge
– 32 CPU cores installed April 2010, bought from GridPP3 tranche 2
– Server to host several virtual machines (BDII, Mon, etc.) just delivered
– Network upgraded last November to provide gigabit ethernet to all grid systems
– Storage is still 140 TB; CPU will increase thanks to the purchase above
– ATLAS production is the main VO running on this site
– Investigating current under-utilisation; possible accounting issues?

RALPP
– We believe we are now through all the messing about with air conditioning; the machine room is now running on the refurbished/upgraded AC plant. Happy days, except for the leaks shortly after it was turned on!
– We've been running well below nominal capacity for most of this year, but are pretty much back now
– Joining with the Tier 1 for the tender process
– Testing Argus and glexec
– R-GMA and site BDII now moved to SL5 VMs
– Working on setting up a test instance of dCache with the Tier 1, using Tier 2 hardware

Oxford
– For the last 6 months the cluster has been running with very high utilisation
– Completed the tender for new kit and placed orders in July. Unfortunately the orders had to be cancelled due to manufacturing delays on the particular motherboard we ordered and a pricing problem. Now re-evaluating all suppliers with updated quotes.
– New Argus server installed (report by Kashif):
– Installing Argus was easy and configuring it was also OK once I understood the basic concept of policies, but it took me considerable time because of a bug in Argus which is partly due to the old style of host certificate issued by the UK CA. The same issue was responsible for the GridPP VOMS server problem. I have reported this to the UK CA.
– Argus works with glexec on the WN; the glexec installed on t2wn41 is being tested (see the sketch below).
– Details on the GridPP wiki
– Oxford has become an early adopter for CREAM and Argus
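A minimal sketch of the sort of glexec identity-switch test referred to for t2wn41; the glexec install path and proxy location are assumptions and will differ per site, and this is not the actual test procedure used at Oxford:

```python
# Minimal sketch of a glexec identity-switch test on a worker node, of the
# kind referred to for t2wn41. The glexec path and the proxy location are
# assumptions; GLEXEC_CLIENT_CERT / GLEXEC_SOURCE_PROXY are the variables
# glexec conventionally reads the payload credentials from.
import os
import subprocess

GLEXEC = "/opt/glite/sbin/glexec"          # assumed install path on a gLite-era WN
PAYLOAD_PROXY = "/tmp/payload_proxy.pem"   # hypothetical proxy of the payload user

env = dict(os.environ)
env["GLEXEC_CLIENT_CERT"] = PAYLOAD_PROXY
env["GLEXEC_SOURCE_PROXY"] = PAYLOAD_PROXY

# If the authorisation (via Argus) permits the mapping, `id` runs under the
# mapped pool account rather than the pilot account running this script.
result = subprocess.run([GLEXEC, "/usr/bin/id"],
                        env=env, capture_output=True, text=True)
print(result.returncode, result.stdout.strip(), result.stderr.strip())
```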

Oxford Grid Cluster setup (diagram)
CREAM CE & pilot setup: t2ce02 and t2ce06 (CREAM, gLite 3.2, SL5), worker nodes with t2wn41 glexec-enabled, and the t2argus02 Argus server.

gridppnagios
Oxford runs the UKI Regional Nagios monitoring site. The Operations dashboard takes information from this.
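A simplified sketch (not the actual SAM/Nagios probe code) of the kind of availability check such a regional Nagios instance runs against a site service; the hostname is hypothetical:

```python
# Simplified sketch of a Nagios-style availability probe: check that a site
# BDII accepts connections on its usual LDAP port (2170). The hostname is
# hypothetical and this is not the real SAM/Nagios probe implementation.
import socket
import sys

SITE_BDII = ("site-bdii.example.ac.uk", 2170)   # hypothetical host, standard site-BDII port
TIMEOUT_S = 10

try:
    with socket.create_connection(SITE_BDII, timeout=TIMEOUT_S):
        print("OK - %s:%d is accepting connections" % SITE_BDII)
        sys.exit(0)    # Nagios exit status 0 = OK
except OSError as exc:
    print("CRITICAL - cannot reach %s:%d (%s)" % (SITE_BDII + (exc,)))
    sys.exit(2)        # Nagios exit status 2 = CRITICAL
```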

Oxford Dashboard
Thanks to Glasgow for the idea / code

Oxford's ATLAS dashboard

Conclusions
– SouthGrid sites' utilisation is generally improving
– Many sites had recent hardware upgrades using the GridPP3 second tranche; others are putting out tenders, with some delays following vendor issues at Oxford
– RALPPD is back to full strength following the AC upgrade
– Monitoring for production running is improving
– Concerns over reduced manpower at sites as we move into GridPP4

Future Meetings
Look forward to GridPP 26 in Sheffield next April. If you look in the right places the views are as good as here in the Lakes.