A Tier1 Centre at RAL and more
John Gordon, eScience Centre, CLRC-RAL
HEPiX/HEPNT - Catania, 19th April 2002

What is the Grid?
- The Grid means different things at different times to different people.
- For me it means a way of allowing a loosely-connected and geographically-distributed group of people to share a distributed set of resources with a minimum of restrictions.
- This group of people is known as a 'Virtual Organisation' (VO), and an HEP experiment is a good example of a VO.
- If the sets of software used by different VOs overlap heavily, with a lot of re-use, then we can say we have a Grid.
- Most current grids base their work on the Globus Toolkit.
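In Globus-based grids of this era, a site authorised VO members by mapping the distinguished name (DN) on a user's certificate to a local account through a grid-mapfile. A minimal sketch of that lookup; the file path follows the usual Globus convention, and the DN and account name are hypothetical:

```python
# Minimal sketch of VO authorisation via a Globus-style grid-mapfile,
# which maps a certificate DN to a local account, one entry per line:
#   "/C=UK/O=eScience/OU=CLRC/L=RAL/CN=example user" exampleacct
# The DN and account below are hypothetical.

def parse_gridmap(path):
    """Return a dict mapping certificate DNs to local account names."""
    mapping = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            dn, _, account = line.rpartition(" ")
            mapping[dn.strip('"')] = account
    return mapping

if __name__ == "__main__":
    grid_map = parse_gridmap("/etc/grid-security/grid-mapfile")
    dn = "/C=UK/O=eScience/OU=CLRC/L=RAL/CN=example user"  # hypothetical
    print(grid_map.get(dn, "no mapping: access denied"))
```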

HEP Grids
(diagram slide; image not preserved in the transcript)


MONARC Model of Regional Centres
(diagram: CERN as Tier-0; Tier-1 centres such as FNAL, RAL, IN2P3 and INFN; Tier-2 centres; labs, university departments and desktops below, connected by links of 155 Mbps to 2.5 Gbps)

Regional Centres - a More Realistic Topology!
(diagram: the same tiers as the MONARC picture - CERN Tier-0, Tier-1 centres at FNAL, INFN, IN2P3 and RAL, Tier-2 centres, labs, departments and desktops - with 155 Mbps to 2.5 Gbps network links plus inter-Tier-1 connections, and 'DHL' marking data shipped by courier)
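The 'DHL' label is only half a joke: at the link speeds quoted in these diagrams, bulk transfers take hours, so couriered tapes were a real alternative. A back-of-the-envelope sketch; the 1 TB dataset size and the assumption of full link utilisation are mine, only the link speeds come from the slide:

```python
# Transfer time for a bulk dataset over the link speeds shown in the
# diagrams. The 1 TB dataset and 100% utilisation are assumptions.

LINKS_MBPS = {
    "2.5 Gbps backbone": 2500,
    "622 Mbps Tier-1 link": 622,
    "155 Mbps Tier-2 link": 155,
}
DATASET_BYTES = 1e12  # assumed: 1 TB

for name, mbps in LINKS_MBPS.items():
    hours = DATASET_BYTES * 8 / (mbps * 1e6) / 3600
    print(f"{name:22s} -> {hours:5.1f} h per TB")
# ~0.9 h on the backbone, ~3.6 h at 622 Mbps, ~14.3 h at 155 Mbps:
# slow enough that couriered tape was a genuine competitor in 2002.
```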

LHC Computing Grid
(diagram: CERN at the centre; Tier-1 centres in the USA (FermiLab, Brookhaven), UK, France, Italy, NL and Germany; below them Tier-2 centres, labs, university physics departments and desktops)


How will the UK participate?
- Tier1 (and BaBar Tier-A) at RAL
- UK plans approximately 4 Tier2 centres; it is not yet clear which
  - Candidates include Imperial/UCL/QMW, Manchester/Liverpool/Lancaster, Bristol, Cambridge, Oxford, Birmingham, ScotGrid
  - Regional?
- Tier2 centres are likely to be shared use of node farms.

UK Tier1/A Status
2003 hardware purchase (installed March):
- 156 dual 1.4 GHz nodes, 1 GB RAM, 30 GB disks (312 CPUs)
- 26 disk servers (dual 1.266 GHz), 1.9 TB disk each
- tape robot capacity expanded by 35 TB
Current EDG testbed setup:
- 14 dual 1 GHz PIII, 500 MB RAM, 40 GB disks
- Compute Element (CE), Storage Element (SE), User Interfaces (UI), Information Node (IN) + Worker Nodes (WN)
Existing central facilities (non-Grid):
- 250 CPUs, 10 TB disk, 35 TB tape (robot capacity 330 TB)
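Summing the figures quoted above gives a sense of the combined scale. A trivial sketch; the grouping follows the slide and the totals are plain arithmetic:

```python
# Combined scale of the Tier1/A from the figures on this slide.
new_purchase = {"cpus": 312, "disk_tb": 26 * 1.9}  # 156 dual-CPU nodes; 26 servers
existing = {"cpus": 250, "disk_tb": 10.0}          # non-Grid central facilities

total_cpus = new_purchase["cpus"] + existing["cpus"]
total_disk_tb = new_purchase["disk_tb"] + existing["disk_tb"]
print(f"{total_cpus} CPUs, {total_disk_tb:.1f} TB of disk")  # 562 CPUs, 59.4 TB
```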

Projected Staff Effort [SY]
- WP1 Workload Management: 0.5 [IC], 2.0 [IC]
- WP2 Data Management: 1.5++ [Ggo], 1.0 [Oxf]
- WP3 Monitoring Services: 5.0++ [RAL, QMW], 1.0 [HW]
- Security: ++ [RAL], 1.0 [Oxf]
- WP4 Fabric Management: 1.5 [Edin., L'pool]
- WP5 Mass Storage: 3.5++ [RAL, L'pool]
- WP6 Integration Testbed: 5.0++ [RAL/M'cr/IC/Bristol]
- WP7 Network Services: 2.0 [UCL/M'cr], 1.0 [UCL]
- WP8 Applications: 17.0
  - ATLAS/LHCb (Gaudi/Athena): 6.5 [Oxf, Cam, RHUL, B'ham, RAL]
  - CMS: 3.0 [IC, Bristol, Brunel]
  - CDF/D0 (SAM): 4.0 [IC, Ggo, Oxf, Lanc]
  - BaBar: 2.5 [IC, M'cr, Bristol]
  - UKQCD: 1.0 [Edin.]
- Tier1/A: 13.0 [RAL]
Total >= 80++

Future Resources
- GridPP is a three-year project
- We will spend similar amounts on hardware at the end of 2002 and 2003
  - but hope to get more for the money

UK Tier-2 Example Site - ScotGRID
ScotGrid processing nodes at Glasgow:
- 59 IBM X Series 330, dual 1 GHz Pentium III, 2 GB memory
- 2 IBM X Series 340, dual 1 GHz Pentium III, 2 GB memory, dual ethernet
- 3 IBM X Series 340, dual 1 GHz Pentium III, 2 GB memory and Mbit/s ethernet
- 1 TB disk
- LTO/Ultrium tape library
- Cisco ethernet switches
ScotGrid storage at Edinburgh:
- IBM X Series 370, PIII Xeon, 512 MB memory
- 32 x 512 MB RAM
- 70 x 73.4 GB IBM FC hot-swap HDD
BaBar UltraGrid system at Edinburgh:
- 4 UltraSparc 80 machines in a rack
- 450 MHz CPUs in each, 4 MB cache, 1 GB memory
- Fast Ethernet and MirrorNet switching
CDF equipment at Glasgow:
- 8 x 700 MHz Xeon IBM xSeries, GB memory, 1 TB disk
Griddev test rig at Glasgow:
- 4 x 233 MHz Pentium II
One of (currently) 10 GridPP sites running in the UK

Network
- Tier1 internal networking will be a hybrid of:
  - 100 Mb to nodes of the CPU farms, with 1 Gb up from the switches
  - 1 Gb to disk servers
  - 1 Gb to tape servers
- UK academic network SuperJANET4
  - 2.5 Gbit backbone, upgrading to 20 Gb in 2003
- RAL has 622 Mb into SJ4
- SJ4 has a 2.5 Gb interconnect to Geant
- New 2.5 Gb link to ESnet and Abilene, just for research users
- UK involved in networking development
  - internally with Cisco on QoS
  - externally with DataTAG
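With farm nodes at 100 Mb and a 1 Gb uplink per switch, the uplink is the shared bottleneck. A sketch of the worst-case oversubscription; the link speeds are from the slide, but the nodes-per-switch count is an assumption for illustration:

```python
# Worst-case oversubscription of a farm switch uplink: N nodes at
# 100 Mb/s sharing one 1 Gb/s uplink. Link speeds are from the slide;
# the 24-nodes-per-switch figure is an assumption for illustration.
nodes_per_switch = 24  # assumed
node_mbps, uplink_mbps = 100, 1000
ratio = nodes_per_switch * node_mbps / uplink_mbps
print(f"uplink oversubscription {ratio:.1f}:1")  # 2.4:1 for 24 nodes
```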

Network Monitoring
- RAL is part of the HEP network monitoring groups
- Monitoring based on UK, SLAC, EDG and RIPE work
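Much of the SLAC-style monitoring referred to here (the PingER project) was built on nothing more exotic than ICMP round-trip measurements. A minimal sketch of that idea, assuming a Unix-style ping on the path; the target hostname is hypothetical:

```python
# Minimal sketch of PingER-style monitoring: measure the average
# round-trip time to a remote site with the system ping command.
# Assumes a Unix-style ping; the target hostname is hypothetical.
import re
import subprocess

def avg_rtt_ms(host, count=4):
    """Return the average RTT to host in ms, or None if unparsable."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    # e.g. "rtt min/avg/max/mdev = 11.2/12.5/14.0/0.9 ms" on Linux
    match = re.search(r"= [\d.]+/([\d.]+)/", out)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    print(avg_rtt_ms("monitor.gridpp.example.ac.uk"))  # hypothetical host
```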

Certification Authority - status & plans
- UKHEP CA has been signing certificates since October 2000
  - trusted by EDG
  - trusted by DoE
    - recent transatlantic transfers by D0 between FNAL and the UK were publicised by PPDG as the first external use of the DoE CA
- UK Grid Support Centre is setting up a UK CA for UK eScience
  - based on OpenCA
  - HEP users will migrate to it over 2002
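In practice, a CA being "trusted" means that relying parties check user certificates against the CA's root certificate. A minimal sketch of that check using the standard openssl CLI from Python; the file names are hypothetical:

```python
# Minimal sketch: check a user certificate against a CA root with the
# standard openssl CLI. File names here are hypothetical.
import subprocess

result = subprocess.run(
    ["openssl", "verify", "-CAfile", "ukhep-ca.pem", "usercert.pem"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # "usercert.pem: OK" if the chain verifies
```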

GridPP Deployment
- Provide architecture and middleware
- Use the Grid with simulated data (future LHC experiments)
- Use the Grid with real data (running US experiments)
- Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments

DataGrid Deployment in the UK
- RAL and Manchester in EDG TB1
- Now expanding to a core of 4 sites (Manchester, Bristol, Imperial, RAL), led by Manchester
- EDG TB1 presence at most UK HEP sites over the next few months
- Expand the RAL testbed to include production facilities if required
- Include substantial resources at other UK sites
- ...including non-HEP centres

Other Grid Deployment
- But GridPP will not just be the EDG testbed

D0
(diagram slide; image not preserved in the transcript)

CDF
(diagram slide; image not preserved in the transcript)


Planned Testbed Use
- Testbeds:
  - EDG testbeds 1, 2, 3
  - EDG development testbed
  - DataTAG/GRIT/GLUE
  - LCG testbeds
  - other UK testbeds
- Data Challenges:
  - ALICE, ATLAS, CMS and LHCb have confirmed they will use RAL
- Production:
  - BaBar and others

Short-term Plans
- Integrate resources as closely as possible
  - to avoid different hardware/software for different VOs
  - this could be difficult (e.g. RH6/7 and Objectivity), as could supporting different grid projects
- Start with the existing infrastructure, and develop EDG testbeds on the side
- Move CPU and disk around logically, not physically
- Provide additional front-ends where required
- Move existing experiments to grid-based tools
  - e.g. for remote job submission (see the sketch below)
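In the Globus Toolkit 2 era, "grid-based remote job submission" typically meant running a command against a site's gatekeeper with globus-job-run. A minimal sketch wrapping that from Python; the gatekeeper hostname is hypothetical:

```python
# Minimal sketch of GT2-style remote job submission: run a command on
# a remote site's gatekeeper with globus-job-run (a standard Globus
# Toolkit 2 client). The gatekeeper hostname is hypothetical.
import subprocess

gatekeeper = "gatekeeper.gridpp.example.ac.uk"  # hypothetical host
result = subprocess.run(
    ["globus-job-run", gatekeeper, "/bin/hostname"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # the remote worker's hostname on success
```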

Involvement in Grid Middleware Projects
- EDG
- DataTAG
- BaBar Grid
- SAM
- Gaudi
EDG - UK contributions: Architecture, Testbed-1, Network Monitoring, Certificates & Security, Storage Element, R-GMA, LCFG, MDS deployment, FTREE, GridSite, SlashGrid, Spitfire…

Experiment Grid Deployment
(diagram slide; image not preserved in the transcript)

Summary
- The UK will put substantial resources into building a Grid for particle physics.
- The Tier1 at RAL and several Tier2s will form the backbone of a grid which will reach into all PP groups and other national centres.
- This grid will be the main source of UK computing resources for the next few years
- and it will be used as part of many grid projects.