
UK Testbed Deployment
Andrew McNab - Manchester HEP - 10 May 2002

Aim of this talk is to answer the questions:
–What are other sites doing?
–What are the steps between the situation today, and all interested UK sites being part of the Testbed?

Software Releases
Deployment is largely driven/limited by EDG software release status and cycles.
–Three major releases, to coincide with the three yearly Testbeds 1, 2 and 3.
–Minor releases every 2 months, and patch-level releases between those: currently frozen to allow concentration on 1.2.
We currently have a mixture of sites with:
–some version of the Globus gatekeeper, for Gavin's Green Dots Map
–old Globus installations
–Globus 2.0/2.0beta installations, including EDG installations (using 2.0beta), usually just a Computing Element
–BaBar installations of the EDG CE
The following table gives an overview of the testbed (not intended as per-site progress reports!)

Status Overview
(columns: Green Dot, G1.1.3, G2.0(b), EDG-CE, BaBar-CE)
Birmingham: y y y
Bristol: y y y y y
Brunel: y y
Cambridge: y
Edinburgh: y y
Glasgow: y y
Imperial: y y y
Lancaster: y y
Liverpool: y y
Manchester: y y y y y
Oxford: y y
QMUL: y y y
RAL: y y y y
RHUL: y y
UCL: y
(Based on a straw poll - apologies for any omissions)

UK WP6 Sites
CLRC/RAL
–LCFG server; UI (accessible as part of CSF); 8 CPUs running EDG (usually full now); 2 development boxes; CE, SE and WN with EDG 1.2alpha5 from Monday; WP7 Network Monitoring Element; GridPP MDS server.
Bristol
–LCFG server; several boxes with EDG
IC
–LCFG server; GridPP Resource Broker, CE and WNs with EDG
Manchester
–LCFG server; CE with manually installed EDG 1.1.4; GridPP and BaBar VO servers.

UK Tier1/A
RAL Hardware Purchase (installed March):
–156 Dual 1.4GHz, 1GB RAM, 30GB disks (312 CPUs)
–26 disk servers (Dual 1.266GHz), 1.9TB disk each
–Expand the capacity of the tape robot by 35TB
Current EDG TB setup:
–14 Dual 1GHz PIII, 500MB RAM, 40GB disks
–Compute Element (CE), Storage Element (SE), User Interfaces (UI), Information Node (IN) + Worker Nodes (WN)
Existing Central Facilities (non-Grid):
–250 CPUs, 10TB disk, 35TB tape (robot capacity 330TB)

Constituencies involved in deployment
Sysadmins (represented in Sysman and by WP6)
–Care about the integrity of their site (firewalls etc) and the maintenance load of Grid Testbed equipment; may be under pressure to get a green dot on the relevant map.
Experiments (represented by the GridPP EB)
–Want sites up and running for application development and to run data challenges. Especially concerned about sites hosting their farms.
Grid developers (represented by the GridPP TB)
–Concerned about getting functionality running in the UK, especially at sites where the experts are.
Local HEP groups (represented on the GridPP CB)
–Want a transparent process that includes everyone.

UK Deployment Plan
Start with UK WP6 people (+ other key experts)
–Use the mailing list, which anyone can join and is archived.
Once we have some WP6 sites up and a procedure that should work for any site, ask more sites to test the installation procedure, docs etc.
Once this has stabilised, invite all interested sites to install the Testbed software: by this point, the installation instructions should be clear and not require previous grid experience, ad-hoc fixes etc.
Support will then be provided by the tb-support mailing list on a best-effort basis, and by the Grid Support Centre with a formal ticket-based system.
–(Likely that the mailing list will be the best place to get a quick answer.)
Need to repeat this every 2 months for EDG releases, although not all sites may have commitments (eg from experiments) to do this.
A formal document will follow.

Installation Procedure
This needs to be based around LCFG because of the complex configuration process, which is published as automatic LCFG configuration objects.
Sites will almost certainly need to dedicate machines to be installed this way.
LCFG itself has been non-trivial to install and configure. However, once your local LCFG server is working, installing a CE etc is automated (similar to RedHat kickstart).
Steve Traylen has written up the procedure used to install LCFG at RAL, and contributed updated scripts back to WP6. So the situation is much better now.
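As a rough illustration of what "configuration objects" meant in practice (the file names below are placeholders, not the exact EDG header names), a per-machine LCFG source profile pulled in the published configuration with cpp-style includes:

```
/* ce01 -- example LCFG source profile (file names illustrative) */
#define HOSTNAME ce01
#include "site-cfg.h"              /* local settings each site edits */
#include "ComputingElement-cfg.h"  /* published EDG configuration objects */
```

The LCFG server compiled such sources into per-machine profiles that client nodes fetched and applied automatically.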

install.gridpp.ac.uk
Prototype installation procedure for LCFG:
–modified RedHat HTTP install: you take a standard install floppy and point it at the server, using an extra LCFG server option (additional to the NFS server option etc)
–just takes the published LCFG server installation procedure
–adds any necessary fixes
–puts all the commands into a small number of scripts (eg 1)
Would like to have a turnkey installation of local LCFG servers for each site. Sites will still need to edit the local site config and generate profiles for each machine.
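The per-site step that remains after the turnkey server install can be sketched as follows. This is illustrative only: the host names are placeholders, `mkxprof` is the LCFG profile compiler of the era, and the commands are echoed rather than executed.

```shell
# Sketch: after editing the local site config, regenerate each node's
# profile so LCFG clients pick up the change on their next fetch.
LCFG_SOURCE=/var/obj/conf/server/source   # per-host source profiles live here

cmds=""
for host in lcfg-ce01 lcfg-se01 lcfg-wn01; do
    cmd="mkxprof -v $host"                # compile source -> XML profile
    echo "$cmd"
    cmds="$cmds $cmd;"
done
```

In a real deployment the loop would run the commands directly on the LCFG server, once per machine named in the site config.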

Testbed Joining Procedure
The procedure for joining the EDG testbed is being defined by WP6. For UK sites, WP6-UK will propose sites to the rest of WP6.
WP6-UK will provide help with testing UK sites:
–We have the GridPP VO, Resource Broker and MDS, so we can add sites to the UK Testbed and validate functionality ourselves before trying in the EDG Testbed.
–We don't want to propose sites without testing them ourselves; and cautious sites can put themselves up for validation within GridPP without risking embarrassment at a European level. Will use the same validation criteria as WP6.
The formal procedure + test suites will appear online (after the TB etc has seen it).
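A site's own pre-validation can start with something as simple as a job round-trip through the gatekeeper. `globus-job-run` is the standard Globus job-submission client; the CE host name below is a placeholder, and the check is guarded so the sketch is safe to run on a machine without Globus installed.

```shell
# Minimal liveness check against a candidate site's Computing Element.
CE="gatekeeper.example.ac.uk"

if command -v globus-job-run >/dev/null 2>&1; then
    # a returned worker-node hostname means the CE accepts and runs jobs
    result=$(globus-job-run "$CE" /bin/hostname)
else
    result="globus-job-run not available here"
fi
echo "CE check: $result"
```

The full WP6 validation suite goes well beyond this, but a trivial job round-trip catches most gatekeeper and authorisation misconfigurations early.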

More information
WP6 website
GridPP Technical Board mailing list
The -install-cookbook/ page has improved LCFG installation instructions.
Also:
–mailing list information
–Steve's recipe for LCFG-installing a CE etc.
–will have a link to the formal joining procedure document and deployment plan.