UK Testbed Status and EDG Testbed Two. Steve Traylen, GridPP 7, Oxford.

Outline
–Status of the UK sites.
–Release of EDG 2.
–UK certificates.
–Grid monitoring in the UK.

Manchester
–CE SE(1.5TB) EDG xWN; CE SE(5TB) EDG xWN; CE SE EDG 1.4 9xWN (the EDG Testbed, BaBar Farm and DZero Farm).
–GridPP and BaBar VO servers. User Interface.
–Plan that the DZero farm will join LCG.
–SRIF bid in place for significant HEP resources for the end of the year.

UCL
–EDG Testbed: CE, SE, EDG 1.4, 1xWN.
–Network Monitors for WP7 development.
–SRIF bid in place for 200 cpus for the end of the year to join LCG1.

RAL PPD
–EDG Testbed: CE, SE, EDG 1.4, 9xWN.
–RGMA Testbed: CE, SE, MON, EDG 2.0, 1xWN.
–User Interface.
–Plan to be a portion of the Southern Tier2 Centre within LCG1. 50 cpus and 5TB of disk expected for the end of the year.

Birmingham
–EDG Testbed: CE, SE, EDG 1.4, 1xWN.
–Expansion to 60 cpus and 4TB. Expect to participate within LCG1/EDG2.
Liverpool
–EDG Testbed: CE, SE, EDG 1.4, 1xWN.
–Currently unmaintained. Plan to follow EDG 2, possibly integrating the BaBar farm.

RAL
–EDG Testbed: CE, SE, EDG 1.4, 5xWN.
–Tier1/A: CE, EDG, xWN.
–RGMA Testbed: CE, SE, MON, EDG 2.0.
–EDG Dev Testbed: CE, SE, MON, EDG 2.0, 1xWN.
–LCG0 Testbed: CE, SE, 1xWN.
–SE for the ADS.
–UI within CSF. NM for EDG2. Top level MDS for EDG. Various WP3 and WP5 dev nodes. VOMS for the Dev TB.

Cambridge
–EDG Testbed: CE, SE, EDG, xWN.
–Farm shared with local NA-48 and GANGA users.
–Some RH73 WNs for the ongoing Atlas challenge.
–3TB GridFTP SE.
–Plan to join LCG1/EDG2 later in the year with an extra 50 cpus.
–EDG jobs will soon be fed into the local e-Science farm.

Bristol
–EDG Testbed: CE, SE, EDG 1.4, 1xWN.
–RGMA Testbed: CE, SE, MON, EDG 2.0, 1xWN.
–CMS/LHCb Farm: CE, SE, CMS-LCG0, 24xWN.
–BaBar Farm: CE, SE, EDG 1.4, 78xWN.
–GridPP RC. Plan to join EDG2 and LCG1.

Imperial College
–EDG Testbed: CE, SE, EDG 1.4, WNs.
–BaBar Farm: CE, EDG 1.4, WNs.
–CMS-LCG0: CE, SE, WN.
–RGMA Testbed: CE, SE, MON, EDG 2.0, 1xWN.
–RB and BD-II for EDG 1.4. RB and BD-II for EDG 2.0.
–Plan to be in LCG1 and other testbeds.

Queen Mary
–EDG Testbed: CE, SE, EDG 1.4, 1xWN.
–The CE also feeds EDG jobs to a 32-node e-Science farm (32xWN).
–Plan to have LCG1/EDG2 running for the end of the year.
–Expansion with SRIF grants (64WN + 2TB in Jan 2004, 100WN + 8TB in Dec 2004).

Oxford
–EDG Testbed: CE, SE, EDG 1.4, 2xWN.
–Plan to join EDG2/LCG1.
–Nagios monitoring has been set up. (RAL is also evaluating Nagios.)
–Planning to send EDG jobs into the 10 WN CDF farm.
–128-node cluster being ordered now.

Glasgow
–ScotGRID: CE, SE, EDG 1.4, 59xWN.
–RGMA Testbed: CE, SE, MON, EDG 2.0.
–New hardware expected soon. WNs on a private network with outbound NAT in place.
–As ScotGRID grows, plans to be part of LCG.
–Various WP2 development boxes.

UK Overview
–Now significant resources within EDG.
–Integrating EDG with an existing farm has been repeated many times, but it is difficult.
–Sites are keen to take part within LCG1 or EDG2.
–By the end of the year many HEP farms plan to be contributing resources to LCG1.

EDG 2.0
–Now in a permanent state of imminent release.
–Since 27th May: 25 pre-releases and 295 configuration changes, ranging from a typo to a new resource broker.

Criteria for cutting EDG 2.0
For EDG 2.0 to be cut, the following must be satisfied:
–50 sequential jobs: 98% success.
–250 jobs run by 1 RB: 80% success.
–5 jobs with a 2GB i/o sandbox: 80% success.
–25 jobs requiring two proxy renewals: 80% success.
–Upload and register a 1GB file to an SE, then replicate it to a mass storage device.
–Register 1000 files in less than 1000s.
–Match a job against three files on an SE.
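The job-based criteria amount to submitting a batch of jobs from a User Interface and counting how many reach a successful final state. The sketch below is one rough way to drive that from Python; the JDL file name, the parsing of edg-job-submit/edg-job-status output and the "Done (Success)" status string are assumptions about the EDG UI tools, not anything taken from these slides.

```python
#!/usr/bin/env python
# Minimal sketch: submit a batch of jobs from an EDG User Interface and report
# the success rate, in the spirit of the acceptance criteria above.  Assumes
# edg-job-submit/edg-job-status are on the PATH, that the job id is printed as
# an https:// line, and that final states appear as "Done (Success)" etc.
import subprocess
import time

JDL = "hostname.jdl"   # hypothetical JDL describing a trivial test job
N_JOBS = 50            # the 50 sequential-job test
TARGET = 0.98          # 98% success required

def submit(jdl):
    """Submit one job; return its job id (a line starting with https://), or None."""
    proc = subprocess.run(["edg-job-submit", jdl], capture_output=True, text=True)
    if proc.returncode != 0:
        return None
    ids = [l.strip() for l in proc.stdout.splitlines() if l.strip().startswith("https://")]
    return ids[-1] if ids else None

def finished_ok(job_id):
    """Poll edg-job-status until the job reaches a final state; True on success."""
    while True:
        out = subprocess.run(["edg-job-status", job_id],
                             capture_output=True, text=True).stdout
        if "Done (Success)" in out:
            return True
        if any(s in out for s in ("Done (Failed)", "Aborted", "Cancelled")):
            return False
        time.sleep(60)

if __name__ == "__main__":
    ok = 0
    for _ in range(N_JOBS):
        job_id = submit(JDL)
        if job_id and finished_ok(job_id):
            ok += 1
    rate = float(ok) / N_JOBS
    print("%d/%d succeeded (%.0f%%), target %.0f%%: %s"
          % (ok, N_JOBS, 100 * rate, 100 * TARGET,
             "PASS" if rate >= TARGET else "FAIL"))
```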

Installation of an EDG2 Testbed
–LCFGng is the recommended installation method; no manual install instructions yet.
–Significantly better than LCFG: the configuration (site-cfg.h) is less cryptic and less hand installation is required.
–Remaining manual steps:
 –Install host certificates.
 –PBS server.
 –MySQL tables.
 –mkgridmap.conf.
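The manual steps are the usual trip-ups after an LCFGng install, so a quick sanity check is worth running on a new node. The sketch below is one way to do that; the file locations (certificates under /etc/grid-security, an edg-mkgridmap.conf under /opt/edg/etc) and the use of qstat -B and mysqladmin ping are assumptions about a typical EDG 2.0 node, not something specified on the slide.

```python
#!/usr/bin/env python
# Rough post-install sanity check for the manual steps listed above.
# Paths and commands are assumptions about a typical EDG 2.0 install.
import os
import subprocess

def cmd_ok(args):
    """Return True if the command runs and exits 0."""
    try:
        return subprocess.run(args, capture_output=True).returncode == 0
    except OSError:                 # command not installed on this node type
        return False

checks = {
    "host certificate": os.path.exists("/etc/grid-security/hostcert.pem"),
    "host key":         os.path.exists("/etc/grid-security/hostkey.pem"),
    "PBS server":       cmd_ok(["qstat", "-B"]),            # CE only
    "MySQL server":     cmd_ok(["mysqladmin", "ping"]),     # MON/SE only
    "mkgridmap config": os.path.exists("/opt/edg/etc/edg-mkgridmap.conf"),
}

for name, ok in checks.items():
    print("%-20s %s" % (name, "OK" if ok else "MISSING"))
```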

Integration after 2.0
–Use gcc 3.2.2 throughout. Currently used by the RB and the APIs the RB uses.
–GridFTP access to Castor.
–Integration of VOMS. Currently ongoing in parallel; has no impact on existing software.
–This will be EDG 2.1.

Required Nodes
–CE: gatekeeper, MDS, gin, ...
–SE: GridFTP, WP5-SE, gin, ...
–WN: PBS batch worker + client tools.
–MON: servlets for a site, GOut for the RB. Also collects fabric monitoring information. On small sites these can be moved to the CE.
–Generally the configuration is more modular.
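One way to picture the more modular layout is to restate the slide as data; the sketch below does that and shows the small-site case where the MON services are co-hosted on the CE. The service lists are simply those named above (with the MON servlets read as the R-GMA servlets), not a complete manifest.

```python
# Node roles for an EDG 2.0 site, restated from the slide above.
NODE_SERVICES = {
    "CE":  ["gatekeeper", "MDS", "gin"],
    "SE":  ["GridFTP", "WP5-SE", "gin"],
    "WN":  ["PBS batch worker", "client tools"],
    "MON": ["R-GMA servlets", "GOut for the RB", "fabric monitoring"],
}

def site_layout(small_site=False):
    """Return node -> services; a small site hosts the MON services on the CE."""
    layout = {node: list(svcs) for node, svcs in NODE_SERVICES.items()}
    if small_site:
        layout["CE"] += layout.pop("MON")
    return layout

print(site_layout(small_site=True))
```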

LCG1 or EDG2
Which testbed should I join?
–Significant resources are best suited to LCG1.
–Small dynamic testbeds can contribute to continued development of Testbed Two.

UK Certificates
–The UK e-Science CA was added to the production EDG testbed 3 weeks ago.
–The UK HEP CA will stop issuing certificates. Existing certificates will still be valid for the remainder of their lifetime.

Ratio of UK HEP to e-Science Certs

e-Science Certs by OU

VO Membership + EDG Guidelines

Ganglia
–Ganglia provides time plots of system metrics. In use at RAL, Cambridge and QMUL.
–By default: load, network i/o, memory.
–Trivial to add new metrics, e.g. active MySQL connections for CMS.
–Expansion across the UK is possible via LCFG objects and instructions; however, WP4 tools might be used instead.
–Data could be collected centrally for a UK view.
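Adding a metric really is a small wrapper around gmetric; the sketch below pushes the active MySQL connection count mentioned above into Ganglia. It assumes the mysql client and gmetric are installed locally and that Threads_connected is the counter wanted; the gmetric flags used are the standard ones but should be treated as assumptions.

```python
#!/usr/bin/env python
# Sketch of a custom Ganglia metric: active MySQL connections.
# Assumes the mysql client and Ganglia's gmetric are installed locally.
import subprocess

def mysql_threads_connected():
    """Read Threads_connected from the local MySQL server."""
    out = subprocess.run(
        ["mysql", "-N", "-B", "-e", "SHOW STATUS LIKE 'Threads_connected'"],
        capture_output=True, text=True, check=True).stdout
    return int(out.split()[-1])        # output is "Threads_connected\t<n>"

def publish(value):
    """Hand the value to Ganglia via gmetric."""
    subprocess.run(["gmetric", "--name", "mysql_connections",
                    "--value", str(value), "--type", "uint32",
                    "--units", "connections"], check=True)

if __name__ == "__main__":
    publish(mysql_threads_connected())   # run from cron every minute or so
```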

GridPP Map
Checks HEP sites every 6(?) hours for:
–Ping.
–Globus submission.
–EDG job submission via the Imperial RB.
–EDG job submission via the Lyon RB.
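The first two checks can be reproduced at any site with the standard client tools. The sketch below runs them for a list of gatekeepers; the host names are placeholders, a valid grid proxy is assumed, and the full EDG submission through a particular RB is left out since it depends on the UI configuration.

```python
#!/usr/bin/env python
# Sketch of the simpler GridPP map checks: ping and a Globus test job.
# Requires a valid grid proxy and the Globus client tools; host names below
# are placeholders, not the real list of monitored sites.
import subprocess

GATEKEEPERS = ["ce.example.ac.uk", "gw.example2.ac.uk"]   # hypothetical

def ping(host):
    """ICMP reachability check (single packet)."""
    return subprocess.run(["ping", "-c", "1", host],
                          capture_output=True).returncode == 0

def globus_submit(host):
    """Run /bin/hostname on the remote gatekeeper via globus-job-run."""
    return subprocess.run(["globus-job-run", host, "/bin/hostname"],
                          capture_output=True).returncode == 0

for host in GATEKEEPERS:
    print("%-25s ping=%s globus=%s"
          % (host, "OK" if ping(host) else "FAIL",
             "OK" if globus_submit(host) else "FAIL"))
```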

GridPP RB
–Imperial publishes service status and the times for LDAP queries of resources.
–Imperial also submits test jobs, more sophisticated than the map's, e.g. checking for the existence of a CloseSE.
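Timing the information-system queries is straightforward to reproduce: the sketch below wraps an ldapsearch against a site GRIS and reports how long it took. The port (2135) and base DN (mds-vo-name=local,o=grid) are the common EDG/MDS defaults and the host name is a placeholder; treat all three as assumptions to be checked per site.

```python
#!/usr/bin/env python
# Sketch: time an LDAP query against a site's MDS/GRIS, similar in spirit to
# the GridPP RB monitoring.  Host is a placeholder; port and base DN are the
# common EDG defaults and should be checked for a given site.
import subprocess
import time

HOST = "ce.example.ac.uk"   # hypothetical gatekeeper/GRIS host
PORT = "2135"
BASE = "mds-vo-name=local,o=grid"

start = time.time()
proc = subprocess.run(
    ["ldapsearch", "-x", "-h", HOST, "-p", PORT, "-b", BASE, "-s", "base"],
    capture_output=True, text=True)
elapsed = time.time() - start

status = "OK" if proc.returncode == 0 else "FAIL"
print("%s: %s in %.2f s" % (HOST, status, elapsed))
```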

Monitoring
–Currently lots of monitoring, but no central location.
–Most monitoring only shows the current state.
–The Grid Operations Centre can coordinate much of this.