UK Testbed Report GridPP 9 Steve Traylen

Topics: Current testbeds. Resources at GridPP sites (corrections welcome): –ScotGrid –LondonGrid –NorthGrid –SouthGrid –Tier1/A. Future testbeds and grid facilities.

EDG: No big changes within EDG 2.1.X, only bug fixes and security updates. Released last November. 19 sites in total, with 8 from the UK. Software freeze, 4th February. New sites freeze, 9th February. EDG review on 19th–20th February. EDG will continue as is through March. OS security policy based on Fedora Legacy.

LCG1: Released in October 2003; since then only a few security updates. In limited use by experiments, but the deployment procedure is now well established, as are plans for the next data challenge. Three sites in the UK out of 28 in total. No support for managed or mass storage access. Security updates provided by the CERN Linux group.

How to join LCG. The procedure is quite formal but appears to work. –A How2Start document exists. A questionnaire must be filled in stating your intentions: –Number of WNs. –Storage capacity. –VOs you plan to support. All installation support for LCG goes via your primary LCG site; for the UK this is RAL. The existing tb-support list can also be used. LCG2 is similar, although the GOC will be collecting information via a GridSite-enabled web form.
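
As a rough, hypothetical illustration of the information such a questionnaire asks a site to supply (the field names and values below are invented for the sketch, not taken from the actual form):

    # Hypothetical sketch of an LCG joining declaration; field names and
    # values are illustrative only, not the real questionnaire.
    site_declaration = {
        "site_name": "UKI-EXAMPLE",                      # hypothetical site
        "worker_nodes": 40,                              # number of WNs committed
        "storage_capacity_tb": 2.0,                      # disk offered to the grid
        "supported_vos": ["atlas", "cms", "lhcb", "babar"],
        "primary_support_site": "RAL",                   # primary LCG site for UK installs
    }

    for field, value in site_declaration.items():
        print(f"{field}: {value}")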

ScotGrid (Glasgow): Previously within EDG; EDG2 being installed. Various WP2 test machines. New CE, SE and UI being arranged. Will join LCG2 with these and new resources. 6 IBM x335s and 29 blades, funded by eDIKT for bioinformatics, are being added. A CDF-SAM/JIM front end exists. Possibly a UK e-Science front end run by the NeSC Hub. Fraser Speirs is now the technical coordinator for ScotGrid.

ScotGrid (Durham): EDG installed; Ganglia to be added. New GridPP hardware will be installed in EDG/LCG. Extra front ends will be made available from ScotGrid.

NorthGrid (Lancaster): Predominantly SAMGrid. EDG CE, SE, MON, WNs. A six-figure sum of hardware is arriving; this shiny farm will appear within LCG and NorthGrid. Ganglia is in use and liked for debugging and ease of install. EDG 2.1.8

NorthGrid (Sheffield): Most recent full member of the EDG testbed. More WNs will now be added. Grid jobs could be fed to the existing 68-CPU farm, which will continue to grow. Ganglia is fantastic. Plans exist for a 400-CPU cluster for use by GridPP. EDG

NorthGrid (Manchester): VO services for GridPP, Babar and the experimental MICE, CALICE and DESY VOs. WWW for GridPP. Very active with the Babar grid. Will join LCG1/2. Alessandra will be the NorthGrid coordinator. Ganglia in use throughout the department; a GridPP Ganglia view is also likely to be created. A CPU farm is being commissioned this year.

NorthGrid (Liverpool): 940 Dell P4s installed. Around 80 will be committed to LCG2; this figure can vary once actual demand is known. A number of nodes are available for grid front ends. Plans to take part in the ATLAS and LHCb data challenges as well as the Babar grid.

ScotGrid (Edinburgh): Joining EDG2 now. Plan to join LCG and/or Babar. Grid jobs could be fed to a 17-node Babar farm. Grid access to some of the 150TB of SRIF storage is planned. Phil Clark will be coordinating ScotGrid activities in Edinburgh.

LondonGrid (UCL): UCL is a full member of the EDG2 testbed. An SRIF-funded cluster with 192 CPUs for Grid and e-Science projects is being installed now; a large portion will be made available within LCG2. Will be used for ATLAS DC2. EDG 2.1.8

LondonGrid (QMUL): A full member of the EDG2 testbed. Plan to front-end the SRIF farm; CPUs are in the tender process now. Ganglia in use. Unhappy with ScalablePBS/Maui; will consider Sun Grid Engine v6 once released. EDG 2.1.8

LondonGrid (Imperial College): Full member of LCG1 and EDG2. Eagerly awaiting LCG2 for the CMS data challenge. Both the CPU and disk dedicated to LCG and EDG will be increased. A UI for LCG2--. The monitoring/job-tracking map is now an applet. EDG LCG

SouthGrid (Bristol): VDT-based CE within the Babar grid, fronting an existing 40-WN Babar farm. There is a GridPP replica catalogue. Ganglia running. A six-month plan to be running Babar SP production lies ahead. Lots of work for CMS DC04, such as GMcat. VDT

SouthGrid (Birmingham): EDG testbed installed. Possibly integrate with existing farms later. New on-site hardware expected, not dedicated to UK HEP. EDG 2.1.8

SouthGrid (RAL PPD): A full member of EDG2, but will ramp down to join LCG2. The WP3 testbed also includes 2 R-GMA Nagios nodes. Ganglia in use; Nagios being considered. CPUs and 5TB of disk are being arranged. Supporting SouthGrid sites installing EDG and then LCG. EDG R-GMA, EDG

SouthGrid (Oxford): A full member of the EDG testbed. The Oxford e-Science centre has a Condor and JISC infrastructure testbed. Ganglia in use elsewhere, other than on the EDG cluster. Hardware is being sourced now for the 2004 data challenges. All resources will appear within SouthGrid. EDG

SouthGrid (Cambridge): 20 nodes in total, including one for testing and an NM. Plan to feed jobs to existing e-Science farms. 3TB and 20 more CPUs to be deployed in two months' time. Ganglia in use and liked. EDG LCG

RAL Tier1/A (EDG App): RAL runs EDG core services, such as the R-GMA catalogue and RLS, for some VOs. EDG

RAL Tier1/A (EDG and Others): EDG Dev: CE, 2 x SE, MON and RLS. R-GMA: CE, SE, MON and IC. EDG-SE: 4 x SEs. Public-access UIs for apptb and devtb. Gatekeeper into the main production farm for the Babar grid. Central SRB MCAT server.

RAL Tier1/A (LCG1/2--): LCG1: UI, CE, SE, BDII, WN and West GIIS. –Skeleton service to be terminated ASAP. LCG2: UI, CE, WN, BDII, PROXY, RB and West GIIS. –Babar and DZero VOs added to the standard LHC VOs. –R-GMA added to LCG2 as a proof of concept. –Nagios running across LCG2.

Common Problems/Requests: More reliable or informative monitoring. Firewall requirements: one site was nearly charged because of the required complexity. Strange Globus TCP traffic was considered illegal by a couple of firewalls and blocked, rightly or wrongly. Mistakes/omissions in the instructions. –Too many possible errors in the system to document them all.
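
A minimal sketch of the kind of connectivity check this pushes sites towards, assuming a hypothetical CE host name; the ports listed are the usual Globus defaults (gatekeeper 2119, MDS 2135, GridFTP 2811), and any ephemeral range would be whatever GLOBUS_TCP_PORT_RANGE is set to at the site:

    # Probe whether a CE's standard Globus ports are reachable through the
    # firewall. The host name is hypothetical; the ports are common defaults.
    import socket

    CE_HOST = "ce.example.ac.uk"       # hypothetical compute element
    PORTS = [2119, 2135, 2811]         # gatekeeper, MDS, GridFTP control

    def port_open(host, port, timeout=3.0):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port in PORTS:
        state = "open" if port_open(CE_HOST, port) else "blocked or closed"
        print(f"{CE_HOST}:{port} {state}")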

Arrival of LCG2. LCG2 is appearing now. With this release it is a good time to commit resources. –Data challenges are all eager to start. –The Tier1 will initially move 70 high-end nodes into this. Installation is by LCFG, or manual by-hand instructions exist for already-installed/shared resources. Likely to be in place for some time, perhaps for the remainder of this year.

DCache SRM. What is it with respect to the LCG? –One solution to SRM-fronted disk. –DCache is a GridFTP server with a very flexible backend. –An SRM implementation exists. RAL is packaging, testing and configuring it with LCFG. Testing has hit a brick wall due to minor, but critical, differences in the SRM implementation: –The SRM does not create directories automagically. –As a classic SE, gridftp-mkdir is not implemented.
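
A minimal sketch of the client-side workaround this implies: create the target directory explicitly before the copy. It assumes the EDG GridFTP client wrapper (edg-gridftp-mkdir) and globus-url-copy are available on the path, and the host and paths are hypothetical:

    # Work around an SRM/SE that does not create directories automatically:
    # make the remote directory explicitly, then copy. Host, paths and the
    # presence of the edg-gridftp-mkdir wrapper are assumptions for the sketch.
    import subprocess

    SE_HOST = "dcache.example.ac.uk"                  # hypothetical dCache SE
    REMOTE_DIR = "/pnfs/example.ac.uk/data/dteam"     # hypothetical target directory
    LOCAL_FILE = "/tmp/testfile"                      # local file to upload

    # Create the remote directory first, since the SRM will not do it for us.
    subprocess.run(
        ["edg-gridftp-mkdir", f"gsiftp://{SE_HOST}{REMOTE_DIR}"],
        check=False,   # tolerate "already exists" failures in this sketch
    )

    # Then copy the file over plain GridFTP.
    subprocess.run(
        ["globus-url-copy",
         f"file://{LOCAL_FILE}",
         f"gsiftp://{SE_HOST}{REMOTE_DIR}/testfile"],
        check=True,
    )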

Plans for the EDG testbed. Lots of speculation. After the review the software may move closer to LCG2++. Officially the EDG testbed will become a development/demonstration testbed for EGEE. From early April people inside EGEE will coordinate and run this resource. This may become the SA1 testbed; it will be the same people running it. A partial Quattor install of EDG is also being worked on.

Testbeds for EGEE. JRA1, Middleware Engineering and Integration: –Similar role to the EDG dev testbed, but for EGEE middleware. –Rather more controlled and closed. –Resources at CERN, NIKHEF and RAL, plus five testers at CERN. –Release frequency up to once per week. SA1, EU Grid Operations, Support, and Management: –Testing ground for applications with EGEE middleware. –Release frequency of around 3 months. WP3 testbed: –Will continue into the future for EGEE in some form.

Conclusions: GridPP sites still make up the most significant portion of EDG. EDG will become smaller and somehow migrate to EGEE; as it stands it is probably too large for the early stages of EGEE. Running EDG is very good practice for LCG. Significant resources are wanted, and are best suited to LCG. Tier2s appear to be working closely together on the same things.