Quarterly report: Southern Tier-2 (SouthGrid), Quarter 3 2005. P.D. Gronbech.


Quarterly report: Southern Tier-2 (SouthGrid), Quarter 3 2005. P.D. Gronbech

September 2005 Quarterly report: SouthGrid. Current site status data

Site | Service nodes | Worker nodes | Local network connectivity | Site connectivity | SRM | Days SFT failed | Days in scheduled maintenance | Security incidents this quarter impacting the Grid
Birmingham | SL304 LCG2.6.0 | SL304 LCG2.6.0 | Mb/s | 1 Gb/s | No | 3 | 2 | 0
Bristol | SL304 LCG2.6.0 | SL304 LCG2.6.0 | Mb/s | 1 Gb/s | No | 18 | 19 | 0
Cambridge | SL305 LCG2.6.0 | SL305 LCG2.6.0 | Mb/s | 2.5 Gb/s | No | 7 | 5 | 0
Oxford | SL304 LCG2.6.0 | SL304 LCG2.6.0 | Mb/s | 2.5 Gb/s | No | 7 | 3 | 0
RAL PPD | SL303 LCG2.6.0 | SL304 LCG2.6.0 | Gb/s | 2 Gb/s | dCache | 3 | 0 | 0

1) Local network connectivity is that to the site SE.
2) It is understood that SFT failures do not always result from site problems, but it is the best measure currently available. Results are based on the old SFT page, as this contains history; a site is only deemed to have failed if it did not have a good set of results for a particular day.
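Note 2's failure rule (a site only counts as failed for a day if it produced no good SFT result at all that day) can be made concrete with a small sketch. The record format and the sample data below are assumptions for illustration, not the actual SFT page layout:

```python
from collections import defaultdict
from datetime import date

# Hypothetical SFT records: (day, site, passed). The real data would come
# from the old SFT history pages; this format is assumed for illustration.
sft_records = [
    (date(2005, 7, 5), "Bristol", False),
    (date(2005, 7, 5), "Bristol", True),   # a later good run on the same day
    (date(2005, 7, 6), "Bristol", False),
]

def days_sft_failed(records):
    """Count, per site, the days with no good SFT result at all."""
    good_days = defaultdict(set)
    all_days = defaultdict(set)
    for day, site, passed in records:
        all_days[site].add(day)
        if passed:
            good_days[site].add(day)
    return {site: len(all_days[site] - good_days[site]) for site in all_days}

print(days_sft_failed(sft_records))   # {'Bristol': 1}
```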

September 2005 Quarterly report: SouthGrid. All GridPP resources

Site | Promised: integrated kSI2K hours until this quarter, CPU (kSI2K), storage (TB) | Actual: integrated kSI2K hours until this quarter, CPU (kSI2K), storage (TB)
Birmingham * | 9.3
Bristol * | 1.9
Cambridge |
Oxford ** | 18.5
RAL PPD *** | 5.8
Total |

1) The GridPP Tier-2 MoUs made reference to integrated CPU over the 3 years of GridPP2. Under "Promised: integrated kSI2K hours until this quarter" an estimate is provided of what the Tier-2 would have been expected to provide up to this quarter on the basis of planned installations. "Static kSI2K" shows what would currently be expected if all purchases planned to this quarter had been made and implemented. The Actual columns show what has been delivered.
2) RAL PPD delayed its purchase due to lack of use earlier in the year.
* The Bristol BaBar cluster was transferred to Birmingham.
** Delayed purchasing due to lack of a computer room.
*** Delayed purchasing due to earlier lack of use.
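Note 1's "integrated kSI2K hours" is CPU capacity integrated over time. A minimal sketch of that bookkeeping, assuming a hypothetical list of planned installations (the date each tranche comes online and the kSI2K it adds) and that each tranche stays online once installed:

```python
from datetime import date

# Hypothetical planned installations: (date online, kSI2K added).
planned = [(date(2004, 9, 1), 50.0), (date(2005, 4, 1), 100.0)]

def integrated_ksi2k_hours(installations, until):
    """Integrate installed capacity (kSI2K) over time up to 'until',
    returning kSI2K-hours, assuming each tranche stays online."""
    total = 0.0
    for online, ksi2k in installations:
        if online < until:
            hours = (until - online).total_seconds() / 3600.0
            total += ksi2k * hours
    return total

# Expected delivery up to the end of this quarter under the planned schedule.
print(integrated_ksi2k_hours(planned, date(2005, 9, 30)))
```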

September 2005 Quarterly report: SouthGrid. LCG resources

Site | Estimated for LCG: total job slots, CPU (kSI2K), storage (TB) | Currently delivering to LCG: total job slots, CPU (kSI2K), storage (TB)
Birmingham ** | 54
Bristol |
Cambridge |
Oxford * |
RAL PPD |
Total *** |

1) The estimated figures are those that were projected for LCG planning purposes.
2) Current total job slots are those reported by the EGEE/LCG gstat page.
* This figure also includes the older SE, which is not currently reported on the gstat pages.
** 50% of the ATLAS farm is available to LCG.
*** The total estimate above comes from the MoU spreadsheet; the total from the LCG projected planning spreadsheet was 200 kSI2K and 7 TB. The shortfall is due to delayed purchasing at RAL and Oxford.

September 2005 Quarterly report: SouthGrid. VOs supported by site

Site | ALICE | ATLAS | BABAR | BIOMED | CDF | CMS | DTEAM | DZERO | HONE | ILC | LHCB | NA48 | PHENO | SIXT | ZEUS | Total
Birmingham
Bristol
Cambridge
Oxford
RAL PPD
Total

0 => not supported, 1 => supported

September 2005 Quarterly report: SouthGrid. Resources used per VO over the quarter (kSI2K hours)

Site CPU | ALICE | ATLAS | BABAR | BIOMED | CMS | DTEAM | DZERO | HONE | ILC | LHCB | PHENO | ZEUS | Total
Birmingham
Bristol
Cambridge (N/A)
Oxford
RAL PPD
Total

1) Information currently available from APEL; please note these pages are still under development. NB: this could be automated with an SQL/R-GMA query.
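The automation suggested in note 1 would boil down to a per-site, per-VO aggregate query over the accounting records. The sketch below uses an in-memory SQLite table as a stand-in for the APEL/R-GMA data; the table name, column names and sample figures are assumptions for illustration, not the real APEL schema:

```python
import sqlite3

# Mock accounting table standing in for the APEL/R-GMA records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage_records (site TEXT, vo TEXT, ksi2k_hours REAL)")
conn.executemany(
    "INSERT INTO usage_records VALUES (?, ?, ?)",
    [("Oxford", "lhcb", 1200.0), ("Oxford", "biomed", 300.0),
     ("Birmingham", "atlas", 800.0)],
)

# The per-site, per-VO summary in the slide's table could then come from a
# single aggregate query of this shape.
rows = conn.execute(
    "SELECT site, vo, SUM(ksi2k_hours) FROM usage_records GROUP BY site, vo"
).fetchall()
for site, vo, hours in rows:
    print(f"{site:12s} {vo:8s} {hours:10.1f}")
```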

September 2005 Quarterly report: SouthGrid. Usage by VO for the Tier-2

Jobs (July 2005 / Aug 2005 / Sep 2005):
alice 0 4 2
atlas
babar
biomed
cms
dteam
dzero 7 4 5
hone
ilc 0 2 0
lhcb
na48 0 0 0
pheno 09337
zeus

Data taken from the GOC accounting pages.

NormSumCPU (CPU hours normalised to a 1 kSPECint2000 CPU), July 2005 / Aug 2005 / Sep 2005:
alice 0 0 0
atlas
babar
biomed
cms
dteam
dzero 0 0 0
hone
ilc 0 0 0
lhcb
na48 0 0 0
pheno
zeus
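NormSumCPU is raw CPU time rescaled to a reference 1 kSPECint2000 processor. A minimal sketch of that normalisation, where the per-node SpecInt2000 rating is an assumed input:

```python
def normalised_cpu_hours(raw_cpu_hours, node_si2k):
    """Scale raw CPU hours to the equivalent on a 1 kSI2K (1000 SpecInt2000)
    reference CPU: hours * (node rating / 1000)."""
    return raw_cpu_hours * (node_si2k / 1000.0)

# Example: 500 CPU hours on worker nodes rated at 1400 SpecInt2000 each
# (the rating is a made-up figure for illustration).
print(normalised_cpu_hours(500.0, 1400.0))   # 700.0
```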

September 2005 Quarterly report: SouthGrid. Storage resources in use per VO (TB)

Site storage | ALICE | ATLAS | BABAR | BIOMED | CMS | DTEAM | DZERO | HONE | ILC | LHCB | NA48 | PHENO | ZEUS | Total
Birmingham
Bristol
Cambridge
Oxford
RAL PPD
Total

It is difficult to provide this for the whole period, but we can at least show *current* usage. The numbers need to be provided by the site admins (du -sh) for now, but this will change under dCache.
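A rough sketch of the per-VO check a site admin could script around du-style accounting, assuming each VO's data sits under its own directory on the SE (the /storage path and the VO directory names are hypothetical):

```python
import os

def vo_usage_tb(vo_root):
    """Roughly what `du -s <vo_root>` reports: total bytes under the VO's
    directory tree, converted to TB."""
    total_bytes = 0
    for dirpath, _dirnames, filenames in os.walk(vo_root):
        for name in filenames:
            try:
                total_bytes += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # file vanished or is unreadable; skip it
    return total_bytes / 1e12

# Assumed layout: /storage/<vo> on the site SE.
for vo in ("atlas", "lhcb", "biomed"):
    path = os.path.join("/storage", vo)
    if os.path.isdir(path):
        print(f"{vo:8s} {vo_usage_tb(path):6.2f} TB")
```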

September 2005 Quarterly report: SouthGrid. Usage by VO (CPU)

September 2005 Quarterly report: SouthGrid. Usage by VO (jobs). NB: this can be extracted from APEL.

September 2005 Quarterly report: SouthGrid. Progress over the last quarter

Birmingham
Successes: upgraded to LCG 2.6.0; took part in pre-release testing of 2.6.0; installed the BaBar clusters formerly at Bristol.

Bristol
Successes: site installed and connected on 5th July, now maintained by Y. Coppens and P. Gronbech; finalising the HP-funded post; recruited a sysadmin on the rolling grant.
Problems/Issues: lack of manpower has hindered progress.

Cambridge
Successes: upgraded to LCG 2.6.0; integrated the LCG cluster into the CamGrid Condor cluster.
Problems/Issues: still some ownership problems with respect to Condor user accounts versus the LCG pool UIDs; APEL accounting does not yet support Condor.

Oxford
Successes: upgraded to SL 304; installed LCG 2.6.0; new sysadmin started.
Problems/Issues: the new computer room is some way off and some nodes are overheating; SRIF funding has been secured to build it, and the upgrade of the LCG cluster can commence as soon as the room is ready, as funds are reserved.

RAL PPD
Successes: upgraded to LCG 2.6.0; installed dCache.

September 2005 Quarterly report: SouthGrid. Tier-2 risks

General risks: lack of use by casual users; feedback from jobs that go astray is not user friendly.
Mitigating actions: better training for users and more usable software are needed; UIs have been integrated with local clusters for ease of use.

Institute-specific risks: lack of adequate computer room space at Oxford is still a problem and several nodes are running hot; slow progress on building the new computer room will delay the upgrade of Oxford's resources; the use of Condor at Cambridge may prevent the gathering of monitoring statistics; Bristol manpower; lack of Bristol resources.
Mitigating actions: Condor support to be added to APEL; the new Bristol sysadmin is due to start on 1st October; the local cluster will be used, with the SRIF-funded eScience cluster to follow with luck.

September 2005 Quarterly report: SouthGrid. Tier-2 planning for next quarter

Set up and purchase an integration test bed for SouthGrid use, and coordinate use of this cluster within the UK Testzone.
Install LCG 2.7.0.
Install SRM at all sites.
Support some non-LHC VOs.
Investigate DPM at Birmingham, then install either dCache or DPM at the other sites depending on the results of the testing.
Possible new hardware purchases.
Start some inter-site performance tests.
Prepare for SC4 involvement.

September 2005 Quarterly report: SouthGrid. Objectives and deliverables for last quarter

Objective/deliverable | Due date | Metric/output
Install LCG 2.6.0 at all sites | Late July 2005 | Completed
Testing of disk-to-disk transfers in preparation for Service Challenge 4 | 30th September 2005 | Not yet started
First sites to install SRM | 30th September 2005 | RAL installed dCache; Birmingham started installing DPM
Support non-LHC VOs | 30th September 2005 | Birmingham, Cambridge, Oxford and RAL supported Biomed and were the largest UK T2 contributor

September 2005 Quarterly report: SouthGrid. Objectives and deliverables for next quarter

Objective/deliverable | Due date | Metric/output
Install LCG 2.7.0 at all sites | Late October 2005 |
Testing of disk-to-disk transfers in preparation for Service Challenge 4 | 31st December 2005 |
All sites to install SRM | 31st December 2005 |
Continue to support non-LHC VOs | 31st December 2005 |

September 2005 Quarterly report: SouthGrid. Meetings, papers and effort

Tier-2 coordinator effort | Comments
3 months |

For the Tier-2 coordinator:
Area | Description
Talks |
Conferences | GridPP 14, Birmingham
Publications |

September 2005 Quarterly report: SouthGrid. Summary and outlook

The SouthGrid technical meeting held in August continued to focus sites on rapid upgrades. All sites are running the latest release and will start installing SRMs in preparation for SC4. There are continuing manpower issues at Bristol, but these will ease shortly as the new systems administrator starts in October; the partly HP-funded post has also been finalised. Oxford will be able to expand its resources once the new computer room is built, for which SRIF funding has been obtained. Oxford's new sysadmin is in place, allowing the T2C more time to coordinate. Yves Coppens is providing valuable help across SouthGrid. Bristol is now online, and it is hoped to expand its cluster once the local sysadmin is in place.