Southgrid Status Report – Pete Gronbech, February 2005, GridPP 12, Brunel

Southgrid Member Institutions
Oxford, RAL PPD, Cambridge, Birmingham, Bristol, Warwick

Status at Warwick
No change since GridPP 11. Third-line institute – no resources as yet, but remains interested in being involved in the future. Will not receive GridPP resources and so does not need to sign the MoU yet.

Operational Status
RAL PPD, Cambridge, Bristol, Birmingham, Oxford

Status at RAL PPD
Always on the leading edge of software deployment (a benefit of co-location with the RAL Tier 1). SL3 cluster on worker nodes increasing; legacy LCG service on RH7.3 (winding down).
CPUs: GHz, GHz – 100% dedicated to LCG
0.5 TB storage – 100% dedicated to LCG

Status at Cambridge
Currently LCG on RH7.3. Parallel install of SL3 using yaim.
CPUs: GHz – increase to 40 soon – 100% dedicated to LCG
3 TB storage – 100% dedicated to LCG

Status at Bristol
Status
– LCG involvement limited (“black dot”) for the previous six months due to lack of manpower
– New resources and posts now on the horizon!
Existing resources
– 80-CPU BaBar farm to be switched to LCG
– ~2 TB of storage resources to be LCG-accessible
– LCG head nodes installed by the SouthGrid support team
New resources
– Funding now confirmed for a large University investment in hardware
– Includes CPU, high-quality and scratch disk resources
Humans
– New system manager post (RG) being filled
– New SouthGrid support / development post (GridPP / HP) being filled
– HP keen to expand industrial collaboration – suggestions?

Status at Birmingham
Currently LCG 2.2 (since August). Currently installing SL3 on the GridPP front-end nodes; will use yaim to install LCG-2_3_0.
CPUs: GHz Xeon (+48 soon) – 100% LCG
2 TB storage awaiting “Front End Machines” – 100% LCG
SouthGrid’s “Hardware Support Post”: Yves Coppens appointed.

Status at Oxford
Currently LCG on RH7.3. Parallel SL3 install; will use yaim to install as soon as possible.
CPUs: GHz – 100% LCG
1.5 TB storage – upgrade to 3 TB planned – 100% LCG

Oxford Tier 2 centre for LHC
Two racks, each containing 20 Dell dual 2.8 GHz Xeons with SCSI system disks and a 1.6 TB SCSI disk array. The systems are loaded with the LCG2 software. The SCSI disks and Broadcom Gigabit Ethernet caused some problems with installation initially. The systems have been heavily used by the LHCb Data Challenge.

Problems: Space, Power and Cooling
The first rack is in the very crowded computer room (650); the second rack is currently temporarily located in the theoretical physics computer room. Room 650 is at the limit of its power, and the air conditioning is not reliable. A proposal for a new purpose-built computer room on Level 1 (underground) is in progress.

CERN Computer Room

Site on Level 1 for proposed computer room
An ideal location:
– Lots of power (5000 A)
– Underground (no heat from the sun, and very secure)
– Lots of headroom (false floor/ceiling for cooling systems)
– Basement (so no floor-loading limit)
A false floor, large air-conditioning units and power for the planned racks are to be provided. A rack full of 1U servers can create 12 kW of heat and use 50 A of power (see the rough arithmetic below). Space will also be offered to other Oxford University departments.
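As a rough cross-check of those rack figures (assuming the UK's nominal 230 V single-phase supply, which the slide does not state), 50 A of current corresponds to

$$P = I \times V \approx 50\ \mathrm{A} \times 230\ \mathrm{V} \approx 11.5\ \mathrm{kW},$$

and since essentially all of that electrical power is dissipated as heat, this is consistent with the ~12 kW heat load quoted per rack of 1U servers.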

DWB computer room project. 26-Nov-2004

Centre of the Racks

Resource Summary
CPU (3 GHz equivalent): 155.2 total
Storage: 7 TB total
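The “3 GHz equivalent” total is presumably the clock-scaled sum of each site's processors; a plausible reading of the metric (the slide itself does not define it) is

$$N_{3\,\mathrm{GHz}} = \sum_i n_i \,\frac{f_i}{3\ \mathrm{GHz}},$$

where $n_i$ is the number of CPUs running at clock speed $f_i$, summed over the SouthGrid sites.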

LCG2 Administrators’ Course
A lot of interest in a repeat, especially once the 8.5 “Hardware Support” posts are filled (suggestions welcome). PXE/kickstart install vs Quattor…?

Ongoing Issues
Complexity of the installation – the new yaim scripts have helped enormously.
Difficulty sharing resources – almost all of the resources listed are 100% dedicated to LCG because sharing is difficult.
How will we manage clusters without LCFGng? Quattor has a learning curve; the course showed that it is very modular, but PXE/kickstart + yaim is the preferred option at the moment (a sketch of that route follows).
Grid certificates and supported browsers.
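A minimal sketch of what the PXE/kickstart + yaim route can look like in practice: a small script that writes one pxelinux.cfg entry per worker node, pointing each at a Scientific Linux 3 kickstart file (which would then run the yaim configuration in its %post section). The hostnames, MAC addresses, paths and kickstart URL below are illustrative placeholders, not the configuration actually used at any SouthGrid site.

```python
#!/usr/bin/env python
# Generate per-node pxelinux.cfg entries for kickstart installs.
# pxelinux looks up a config file named "01-" plus the node's MAC address
# (with dashes) under the TFTP pxelinux.cfg/ directory, so we write one
# file per worker node. All names, MACs and URLs are illustrative.

import os

TFTP_CFG_DIR = "/tftpboot/pxelinux.cfg"                 # assumed TFTP layout
KS_URL = "http://install.example.ac.uk/ks/sl3-wn.cfg"   # illustrative kickstart URL

# Illustrative worker-node inventory: hostname -> MAC address.
WORKER_NODES = {
    "wn01.example.ac.uk": "00:11:22:33:44:01",
    "wn02.example.ac.uk": "00:11:22:33:44:02",
}

ENTRY_TEMPLATE = """default install
label install
  kernel sl3/vmlinuz
  append initrd=sl3/initrd.img ks={ks_url} ksdevice=eth0 text
"""


def config_filename(mac):
    """pxelinux expects '01-' (ARP type 1) plus the MAC with dashes, lower case."""
    return "01-" + mac.lower().replace(":", "-")


def write_configs():
    for host, mac in sorted(WORKER_NODES.items()):
        path = os.path.join(TFTP_CFG_DIR, config_filename(mac))
        with open(path, "w") as f:
            f.write(ENTRY_TEMPLATE.format(ks_url=KS_URL))
        print("wrote %s for %s" % (path, host))


if __name__ == "__main__":
    write_configs()
```

The kickstart file itself (partitioning, SL3 package set, and the call to the yaim install/configure scripts in %post) is site-specific and not shown here.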