SouthGrid Status 2001-2011. Pete Gronbech, 30th August 2007, GridPP 19, Ambleside.

SouthGrid Status. Pete Gronbech, 30th August 2007, GridPP 19, Ambleside.

RAL PPD
Original cluster: Xeon cpus and 6.5TB of storage, supplemented by the upgrade in mid 2006.
2006: installed a large upgrade of 200 (Opteron 270) cpu cores, equivalent to an extra 260 KSI2k, plus 86TB of storage. 50TB of this was loaned to the RAL Tier 1 and is now being returned.
10Gb/s connection to the RAL backbone.
2007 upgrade, disk and CPU (a rough usable-capacity check follows below):
–13 x 6TB SATA disk servers, 3Ware RAID controllers, 14 x 500GB WD disks each
–32 x dual Intel 5150 dual-core CPU nodes with 8GB RAM
–Will be installed in the Atlas Centre, due to power/cooling issues in R.
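As a rough, illustrative consistency check of the disk-server figures above (the slide does not state the RAID level; RAID 6, which costs two disks' worth of capacity, is assumed here):

    # Rough consistency check for the 2007 RAL PPD disk servers (illustrative only;
    # the RAID level is not stated on the slide, so RAID 6 is an assumption).
    disks_per_server = 14
    disk_size_tb = 0.5        # 500 GB WD disks
    parity_disks = 2          # RAID 6 assumption: two disks' worth lost to parity
    servers = 13

    usable_per_server = (disks_per_server - parity_disks) * disk_size_tb
    print(f"usable per server: {usable_per_server:.1f} TB")                 # 6.0 TB, matching the slide
    print(f"total across {servers} new servers: {servers * usable_per_server:.0f} TB")  # 78 TB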

RAL PPD (2)
Supports 22 different VOs, of which 18 have run jobs in the last year. RAL PPD has always supported a large number of VOs, which has helped ensure the cluster is fully utilised.
Yearly upgrades are planned for the foreseeable future. The current computer room will have incremental upgrades to house the increased capacity; the RAL Tier 1 computer room can be used for overflow when needed.

Status at Cambridge
Early cluster: 1.3GHz P3 machines for the EDG testbed, plus storage.
2004: CPUs: 32 * 2.8GHz Xeon.
2005: DPM enabled.
2006: local computer room upgraded; at Christmas, Intel Woodcrest servers were added, giving 128 cpu cores, equivalent to 358 KSI2k.
Jun 2007: storage upgrade of 40TB, running DPM on SL4 64-bit.
Condor is being used as the batch system.

Cambridge Futures
CamGrid:
–430 cpus across campus, mainly running Debian & Condor. Further upgrades are expected.
Special projects:
–CAMONT VO supported at Cambridge, Oxford and Birmingham; job submission by Karl Harrison and David Sinclair.
–LHCb on Windows project (Ying Ying Li): code ported to Windows, running on a 4-node HEP cluster and a 4-node MS Research Lab cluster (Windows compute cluster). Code is also running on a server at Oxford, with expansion planned on the OERC Windows cluster.

Bristol History
Early days with Owen Maroney on the UK EDG testbed.
Bristol started in GridPP in July 2005 with 4 LCG service nodes & one 2-CPU WN, since upgraded to 8 dual-CPU WN (more may be added), together with a storage upgrade of several TB.
When LCG is integrated with the Bristol HPC cluster (Blue Crystal) very soon, there will be a new CE & SE, providing access to several thousand cores, and StoRM will be used to make over 50TB of GPFS storage available to the Grid. This is a ClusterVision / IBM SRIF-funded project.
The HPC worker node count should be closer to 3712 cores: 96 WNs with 2 x dual-core Opterons (4 cores/WN) plus WNs with 2 x quad-core Opterons (8 cores/WN); see the worked check below.
A large water-cooled computer room is being built on the top floor of the Physics building.
Currently integrating the first phase of the HPC cluster (Baby Blue) with the LCG software.
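A quick arithmetic check of the Blue Crystal core count quoted above; the number of quad-core worker nodes is missing from the transcript, so it is inferred here from the 3712-core total (illustrative sketch only):

    # Cross-check of the Blue Crystal core count quoted on the slide.
    # The number of quad-core WNs is not stated; it is inferred from the total.
    total_cores = 3712
    dual_core_wns = 96            # WNs with 2 x dual-core Opterons -> 4 cores each
    cores_per_dual_wn = 4
    cores_per_quad_wn = 8         # WNs with 2 x quad-core Opterons -> 8 cores each

    remaining = total_cores - dual_core_wns * cores_per_dual_wn
    quad_core_wns = remaining // cores_per_quad_wn
    print(quad_core_wns)          # 416 quad-core WNs would account for the remaining cores
    assert dual_core_wns * cores_per_dual_wn + quad_core_wns * cores_per_quad_wn == total_cores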

Status at Bristol: the current GridPP cluster as of August 2007.

Status at Birmingham
Currently SL3 with gLite 3.
CPUs: Xeon.
10TB DPM storage service.
The BaBar farm will be phased out as the new HPC cluster comes online.
Runs the Pre-Production Service, which is used for testing new versions of the middleware.
SouthGrid hardware support (Yves Coppens) is based here.

Birmingham Futures
Large SRIF-funded ClusterVision / IBM University HPC cluster, named Blue Bear.
It has 64-bit AMD Opteron 2.6GHz dual-core sockets (1024 processing cores) with 8.0GB RAM each.
GridPP should get at least 10% of the cluster usage (see the sketch below).
A second phase is planned for 2008.
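A small illustrative calculation based on the Blue Bear figures above; the socket count is missing from the transcript and is derived here from the dual-core / 1024-core numbers, and the GridPP share uses the slide's "at least 10%":

    # Derive the implied socket count and the minimum GridPP share of Blue Bear.
    total_cores = 1024
    cores_per_socket = 2               # dual-core Opterons
    sockets = total_cores // cores_per_socket
    gridpp_share = 0.10                # "at least 10% of the cluster usage"
    print(sockets)                             # 512 dual-core sockets implied
    print(int(total_cores * gridpp_share))     # at least ~102 cores' worth of usage for GridPP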

Early beginnings at Oxford
grid.physics.ox.ac.uk, circa 2000, following attendance at a RAL course in June 2000 given by Ian Foster's Globus team.
Initial installations on the grid test system used Globus, installed using the Globus method (Aug - Oct 2000). Later reinstalled from Andrew McNab's RPMs (July 2001) and with a UK host certificate (Nov 2001).
The grid machine was modified to be a front end for LHCb Monte Carlo; OpenAFS, Java, OpenPBS and OpenSSH were installed (Nov 2001).
First attempt used the kickstart (RH6.2) method, which crashed with anaconda errors.
–Read on the TB-support mail list that the kickstart method was no longer supported.
Decided to try the manual EDG method.
–Pulled all the CE rpms to my NFS server. Tried a simple rpm -i *.rpm, which failed.
Converted to using the LCFG method.

Oxford goes to production status
Early 2004 saw the arrival of two racks of Dell equipment providing 2.8GHz Xeon CPUs and 3.2TB of disk storage (£60K investment, local Oxford money):
–Compute Element: 37 Worker Nodes, 74 job slots, 67 KSI2K; 37 dual 2.8GHz P4 Xeon, 2GB RAM.
–DPM SRM Storage Element: 2 disk servers, 3.2TB disk space (a 1.6TB DPM server plus a second 1.6TB DPM disk pool node).
–Mon, LFC and UI nodes.
–GridMon network monitor.
1Gb/s connectivity to the Oxford backbone; Oxford is currently connected at 1Gb/s to TVM.
Submission from the Oxford CampusGrid via the NGS VO is possible; working towards NGS affiliation status.
Planned upgrades for 05 and 06 were hampered by the lack of a decent computer room with sufficient power and cooling.

Oxford Upgrade
CPU: 22 servers, 44 cpus, 176 cores; the Intel 5345 Clovertown cpus provide ~350 KSI2K (see the sketch below).
Storage: 11 servers, each providing 9TB usable storage after RAID 6, total ~99TB.
Two racks, 4 redundant management nodes, 4 PDUs, 4 UPSs.
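A back-of-envelope check of the upgrade figures quoted above (nothing here beyond simple division of the stated totals):

    # Sanity check of the Oxford 2007 upgrade figures quoted above.
    servers, cpus, cores = 22, 44, 176
    ksi2k_total = 350                      # "~350 KSI2K" from the Intel 5345 Clovertown cpus
    print(cores // servers)                # 8 cores per server (2 quad-core sockets)
    print(round(ksi2k_total / cores, 2))   # ~1.99 KSI2K per core

    storage_servers = 11
    usable_per_server_tb = 9               # usable per server after RAID 6
    print(storage_servers * usable_per_server_tb)   # 99 TB in total, matching "~99TB"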

Two new computer rooms will provide excellent infrastructure for the future
The new computer room being built at Begbroke Science Park, jointly for the Oxford Supercomputer and the Physics department, will provide space for 55 (11kW) computer racks, 22 of which will be for Physics. Up to a third of these can be used for the Tier 2 centre (rough budget sketch below).
This £1.5M project is funded by SRIF and a contribution of ~£200K from Oxford Physics.
All new Physics HPC clusters, including the Grid, will be housed here when it is ready in October / November 2007.
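A rough budget sketch from the Begbroke figures above; the Tier 2 rack count simply takes a third of the 22 Physics racks, and the power figure multiplies that by the 11kW per-rack rating:

    # Illustrative rack/power budget for the Tier 2 share of the Begbroke room.
    total_racks, rack_kw = 55, 11
    physics_racks = 22
    tier2_racks_max = physics_racks // 3       # "up to a third" of the Physics racks
    print(tier2_racks_max)                     # up to 7 racks for the Tier 2 centre
    print(tier2_racks_max * rack_kw)           # roughly 77 kW of rack power for Tier 2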

Local Oxford DWB computer room
Completely separate from the Begbroke Science Park, a computer room with 100kW cooling and >200kW power is being built (~£150K of Oxford Physics money).
The local Physics department infrastructure computer room (100kW) has been agreed and will be complete next week (Sept 2007).
This will relieve local computer rooms and house T2 equipment until the Begbroke room is ready; racks that are currently in unsuitable locations can be re-housed.

Summary
SouthGrid is set for substantial expansion following significant infrastructure investment at all sites.
Birmingham: the existing HEP and PPS clusters are running well; the new University cluster will be utilised shortly.
Bristol: the small cluster is stable; the new University HPC cluster is starting to come online.
Cambridge: cluster upgraded as part of the CamGrid SRIF3 bid.
Oxford: resources will be upgraded in the coming weeks and installed into the new local computer room.
RAL PPD: expanded last year and this year, well above what was originally promised in the MoU; continued yearly expansion planned.
SouthGrid: striding out into the future, to reach the summit of our ambitions for the Grid users of the future!

Enjoy your walks, and recruit some new GridPP members.