Southgrid Status Pete Gronbech: 27th June 2006 GridPP 16 QMUL.

SouthGrid Member Institutions
Oxford, RAL PPD, Cambridge, Birmingham, Bristol, HP-Bristol, Warwick

Status at RAL PPD
– SL3 cluster on glite
– CPUs: GHz Xeons
– 6.4 TB of RAID disks (dCache)
– 24 VOs supported
– Network monitoring machine installed and running
– Currently installing new kit: 52 dual-core, dual-Opteron nodes, 64 TB of storage, and Nortel switches with a 10 Gb/s-capable uplink
– Hope to get a 10 Gb/s link to the RAL Tier 1 this summer

Status at Cambridge
– Currently glite_3.0.1 on SL3
– CPUs: GHz Xeons
– 3.2 TB storage
– Network monitoring box requested
– Condor batch system; lack of Condor support from the LCG teams
– Problems installing experiment software with the VO sgm account, as this does not map well to condoruser1/2 (see the illustrative sketch below); currently overcome by local intervention, but a long-term solution is needed
– Future: upgrades to CAMgrid in Autumn 06 could mean an additional 160 KSI2K plus 4-5 TB of disk storage
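For context on that mapping problem: grid jobs arrive with a VO-level identity (here the experiment software manager, "sgm", role) that has to be translated into one of the local Unix accounts the Condor batch system knows about. The sketch below is a minimal, hypothetical illustration of such a translation; the account names and mapping rules are assumptions for illustration only, not the actual LCG/Condor configuration used at Cambridge.

# Minimal sketch (not the actual LCG/Condor code): map a grid VO/role pair to a
# local Unix account. The sgm role needs a dedicated account so that installed
# experiment software keeps consistent ownership; folding it into the shared
# condoruser1/condoruser2 pool is what causes the problem described above.
# All account names and rules here are illustrative assumptions.

ROLE_ACCOUNTS = {
    ("atlas", "lcgadmin"): "atlassgm",   # hypothetical dedicated sgm account
    ("lhcb", "lcgadmin"): "lhcbsgm",
}
POOL_ACCOUNTS = ["condoruser1", "condoruser2"]  # generic shared accounts

def map_to_local_account(vo: str, role: str, job_index: int) -> str:
    """Return the local account a grid job should run under."""
    dedicated = ROLE_ACCOUNTS.get((vo, role))
    if dedicated:                        # sgm-style roles get their own account
        return dedicated
    return POOL_ACCOUNTS[job_index % len(POOL_ACCOUNTS)]  # others round-robin

if __name__ == "__main__":
    print(map_to_local_account("atlas", "production", job_index=7))  # -> condoruser2
    print(map_to_local_account("atlas", "lcgadmin", job_index=7))    # -> atlassgm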

Status at Bristol
– Status: running glite on SL3
– Existing resources:
  – GridPP nodes plus local cluster nodes used to bring the site online
  – Now have 16 WN CPUs (an eight-fold increase since the last GridPP meeting)
  – Network monitoring box installed and running
– New resources:
  – 2 TB storage increase in the SE next week
  – University investment in hardware, including CPU, high-quality and scratch disk resources
  – Installation commences this summer (512 cores to be installed in August, 2048 by Jan 07)
  – 1 Gbps private link to RAL for the Grid cluster next year
– Staff: new SouthGrid support/development post (GridPP/HP) being filled

Status at Birmingham
– Currently SL3 with glite_3.0.1
– CPUs: GHz Xeon ( MHz )
– 1.9 TB DPM installed Oct 2005
– BaBar farm runs SL3, with the Bristol farm integrated
– Running the Pre-Production Service
– Network monitoring box installed and running
– ALICE VO box installed

Status at Oxford
– Currently glite_3.0.1 on SL
– WN CPUs running
– CPUs: GHz
– Total 3.0 TB DPM storage on one server node and one pool node
– Some further air conditioning problems, now resolved for Room 650; second rack in the overheating basement
– Five dual 2.8 GHz servers used for the SouthGrid Testzone have been used for pre-release testing of and glite 3
– Network monitoring box in place and running
– Oxford was the first SouthGrid site to install glite 3
– OPS VO enabled
– Also in talks with the Oxford CampusGrid and NGS

SouthGrid Q1/2 2006

SouthGrid Site Shares Q1/ Q4 2005

Oxford Tier 2 GridPP Cluster, June 2006
– Running and queued jobs by VO (LHCb, ATLAS, Biomed); the running-jobs plot shows the glite_3.0.0 upgrade
– Have just reduced the Biomed allocation as there are waiting LHC jobs

RAL PPD: supporting more VOs = better utilization

SouthGrid Utilization 2005/2006 (Oxford, RAL PPD, Birmingham, Bristol, Cambridge)

SC4
– All sites now on gigabit Ethernet to at least the SEs
– All sites exceeded 250 Mbps in the throughput tests (see the rough scale calculation below)
– 4/5 sites have the network monitoring box installed and running; the remaining site is expecting theirs soon
– All sites now supporting the OPS VO
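To give a rough sense of scale for the 250 Mbps figure, the short calculation below (purely illustrative; not part of the SC4 test tooling) converts a sustained network rate into an approximate single-file transfer time.

# Illustrative only: approximate time to move a file at a sustained network rate.
def transfer_time_seconds(file_size_gb: float, rate_mbps: float) -> float:
    """file_size_gb in gigabytes (10**9 bytes), rate_mbps in megabits per second."""
    size_bits = file_size_gb * 1e9 * 8        # bytes -> bits
    return size_bits / (rate_mbps * 1e6)      # bits / (bits per second)

if __name__ == "__main__":
    # At 250 Mbps a 1 GB file takes roughly 32 seconds.
    print(round(transfer_time_seconds(1.0, 250)))  # -> 32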

Stability, Throughput and Involvement
– SouthGrid continues to perform well, with good stability and functionality
– Bristol PP additional nodes
– All five sites running glite 3 by 22nd June
– Support for many VOs across the Tier 2, including non-LHC VOs such as biomed, zeus, hone, ilc and pheno
– All sites SRM-enabled (1 dCache, 4 DPM) by Oct 05
– 5 sites running an LFC

Summary & Outlook
– SouthGrid continues to maintain good momentum; all sites are running the latest release and have SRM-enabled SEs
– SouthGrid was the joint first Tier 2 to be running glite 3
– The new systems administrator at Bristol, Winnie Lacesso, has helped make progress at Bristol more rapid
– RAL PPD is installing a large upgrade
– Cambridge is expecting an upgrade in Autumn
– Bristol will have a percentage of the new campus cluster
– Birmingham will have a percentage of the new campus cluster
– Oxford will be able to expand resources once the new computer room is built
– Yves Coppens continues to provide valuable help across SouthGrid