GridPP – Project Elements
Tony Doyle, University of Glasgow
UK e-Science All Hands Conference, Sheffield, 3 September 2002

GridPP – Project Elements
- From Web to Grid…
- e-Science = Middleware
- LHC Computing Challenge
- Infrastructure
  – Tiered Computer Centres
  – Network
- BaBar – a running experiment
- Non-technical issues
- …Building the Next IT Revolution
- UK GridPP
- EU DataGrid
  – Middleware Development
  – Operational Grid
- DataGrid Testbed Status: 25 Jun :38:47 GMT
- GridPP Testbed
- Grid Job Submission
- Things Missing, Apparently…
- …From Grid to Web

GridPP EDG - UK Contributions
- Architecture
- Testbed-1
- Network Monitoring
- Certificates & Security
- Storage Element
- R-GMA
- LCFG
- MDS deployment
- GridSite
- SlashGrid
- Spitfire…
Applications (start-up phase): BaBar, CDF/D0 (SAM), ATLAS/LHCb, CMS, (ALICE), UKQCD
£17m 3-year project funded by PPARC, with CERN - LCG (start-up phase) funding for staff and hardware...
[Chart: budget split across CERN, DataGrid, Tier-1/A, Applications and Operations – amounts as listed: £3.78m, £5.67m, £3.66m, £1.99m, £1.88m]

GridPP
- Provide architecture and middleware
- Future LHC experiments: use the Grid with simulated data
- Running US experiments: use the Grid with real data
- Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments

Who are we?
Nick White /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Nick White member
Roger Jones /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Roger Jones member
Sabah Salih /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Sabah Salih member
Santanu Das /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=Santanu Das member
Tony Cass /O=Grid/O=CERN/OU=cern.ch/CN=Tony Cass member
David Kelsey /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey member
Henry Nebrensky /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Henry Nebrensky member
Paul Kyberd /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Paul Kyberd member
Peter Hobson /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Peter R Hobson member
Robin Middleton /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Robin Middleton member
Alexander Holt /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alexander Holt member
Alasdair Earl /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alasdair Earl member
Akram Khan /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Akram Khan member
Stephen Burke /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Stephen Burke member
Paul Millar /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Paul Millar member
Andy Parker /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=M.A.Parker member
Neville Harnew /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Neville Harnew member
Pete Watkins /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Watkins member
Owen Maroney /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Owen Maroney member
Alex Finch /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Alex Finch member
Antony Wilson /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Antony Wilson member
Tim Folkes /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Tim Folkes member
Stan Thompson /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=A. Stan Thompson member
Mark Hayes /O=Grid/O=UKHEP/OU=amtp.cam.ac.uk/CN=Mark Hayes member
Todd Huffman /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=B. Todd Huffman member
Glenn Patrick /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=G N Patrick member
Pete Gronbech /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Pete Gronbech member
Nick Brook /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Nick Brook member
Marc Kelly /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Marc Kelly member
Dave Newbold /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Dave Newbold member
Kate Mackay /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Catherine Mackay member
Girish Patel /O=Grid/O=UKHEP/OU=ph.liv.ac.uk/CN=Girish D. Patel member
David Martin /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=David J. Martin member
Peter Faulkner /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Faulkner member
David Smith /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=David Smith member
Steve Traylen /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Steve Traylen member
Ruth Dixon del Tufo /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Ruth Dixon del Tufo member
Linda Cornwall /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Linda Cornwall member
/O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Yee-Ting Li member
Paul D. Mealor /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul D Mealor member
/O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul A Crosby member
David Waters /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=David Waters member
Bob Cranfield /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Bob Cranfield member
Ben West /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Ben West member
Rod Walker /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Rod Walker member
/O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Philip Lewis member
Dave Colling /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Dr D J Colling member
Alex Howard /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Alex Howard member
Roger Barlow /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Roger Barlow member
Joe Foster /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Joe Foster member
Alessandra Forti /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Alessandra Forti member
Peter Clarke /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Peter Clarke member
Andrew Sansum /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Andrew Sansum member
John Gordon /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=John Gordon member
Andrew McNab /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Andrew McNab member
Richard Hughes-Jones /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Richard Hughes-Jones member
Gavin McCance /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Gavin McCance member
Tony Doyle /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle admin
Alex Martin /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=A.J.Martin member
Steve Lloyd /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=S.L.Lloyd admin

GridPP Vision
From Web to Grid - Building the next IT Revolution
Premise: The next IT revolution will be the Grid. The Grid is a practical solution to the data-intensive problems that must be overcome if the computing needs of many scientific communities and industry are to be fulfilled over the next decade.
Aim: The GridPP Collaboration aims to develop and deploy a large-scale science Grid in the UK for use by the worldwide particle physics community.
Many challenges: a shared distributed infrastructure for all applications.

GridPP Objectives
1. SCALE: GridPP will deploy open source Grid software (middleware) and hardware infrastructure to enable the testing of a prototype of the Grid for the LHC of significant scale.
2. INTEGRATION: The GridPP project is designed to integrate with the existing Particle Physics programme within the UK, thus enabling early deployment and full testing of Grid technology and efficient use of limited resources.
3. DISSEMINATION: The project will disseminate the GridPP deliverables in the multi-disciplinary e-science environment and will seek to build collaborations with emerging non-PPARC Grid activities both nationally and internationally.
4. UK PHYSICS ANALYSES (LHC): The main aim is to provide a computing environment for the UK Particle Physics Community capable of meeting the challenges posed by the unprecedented data requirements of the LHC experiments.
5. UK PHYSICS ANALYSES (OTHER): The process of creating and testing the computing environment for the LHC will naturally provide for the needs of the current generation of highly data-intensive Particle Physics experiments: these will provide a live test environment for GridPP research and development.
6. DATAGRID: Open source Grid technology is the framework used to develop this capability. Key components will be developed as part of the EU DataGrid project and elsewhere.
7. LHC COMPUTING GRID: The collaboration builds on the strong computing traditions of the UK at CERN. The CERN working groups will make a major contribution to the LCG research and development programme.
8. INTEROPERABILITY: The proposal is also integrated with developments from elsewhere in order to ensure the development of a common set of principles, protocols and standards that can support a wide range of applications.
9. INFRASTRUCTURE: Provision is made for facilities at CERN (Tier-0), RAL (Tier-1) and use of up to four Regional Centres (Tier-2).
10. OTHER FUNDING: These centres will provide a focus for dissemination to the academic and commercial sector and are expected to attract funds from elsewhere such that the full programme can be realised.

GridPP Project Map - Elements

Rare Phenomena – Huge Background
[Plot: the Higgs signal against all interactions – 9 orders of magnitude!]

LHC Computing Challenge
[Diagram: the tiered LHC computing model]
- One bunch crossing per 25 ns; 100 triggers per second; each event is ~1 Mbyte
- Online System → Offline Farm, ~20 TIPS
- Tier 0: CERN Computer Centre, >20 TIPS
- Tier 1: RAL, US, French and Italian Regional Centres
- Tier 2: Tier2 Centres, ~1 TIPS each, e.g. ScotGRID++
- Tier 3: Institute servers, ~0.25 TIPS; Tier 4: physicists' workstations
- Data rates in the diagram: ~PBytes/sec off the detector, ~100 MBytes/sec, ~Gbits/sec or Air Freight to the regional centres, ~Gbits/sec between tiers, Mbits/sec to workstations
- Physicists work on analysis channels; each institute has ~10 physicists working on one or more channels; data for these channels should be cached by the institute server (physics data cache)
- 1 TIPS = 25,000 SpecInt95; PC (1999) = ~15 SpecInt95
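Two sanity checks that follow directly from these figures (my arithmetic, not on the original slide):

\[ 100\ \text{triggers/s} \times 1\ \text{MByte/event} \approx 100\ \text{MBytes/s into the offline farm} \]
\[ 20\ \text{TIPS} = 20 \times 25{,}000\ \text{SpecInt95} = 5\times10^{5}\ \text{SpecInt95} \approx 5\times10^{5}/15 \approx 33{,}000\ \text{PCs (1999 vintage)} \]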

Tier-0 - CERN
- Commodity processors + IBM (mirrored) EIDE disks
- Scale: ~1,000 CPUs, ~5 PBytes
- Grid node types: Compute Element (CE), Storage Element (SE), User Interface (UI), Information Node (IN), storage systems..

UK Tier-1 - RAL
New Computing Farm:
- 4 racks holding 156 dual 1.4GHz Pentium III CPUs; each box has 1GB of memory, a 40GB internal disk and 100Mb ethernet
- 50TByte disk-based Mass Storage Unit after RAID 5 overhead
- PCs are clustered on network switches with up to 8x1000Mb ethernet out of each rack
- Tape Robot, upgraded last year, uses 60GB STK 9940 tapes: 45TB current capacity, could hold 330TB
Scale: 1,000 CPUs, 0.5 PBytes

Regional Centres – SRIF Infrastructure
- Local perspective: consolidate research computing; optimisation of number of nodes? 4, with relative size dependent on funding dynamics
- Global perspective: a very basic Grid skeleton

UK Tier-2 - ScotGRID
ScotGrid processing nodes at Glasgow:
- 59 IBM X Series 330: dual 1 GHz Pentium III, 2GB memory
- 2 IBM X Series 340: dual 1 GHz Pentium III, 2GB memory, dual ethernet
- 3 IBM X Series 340: dual 1 GHz Pentium III, 2GB memory, Mbit/s ethernet
- 1TB disk; LTO/Ultrium Tape Library; Cisco ethernet switches
ScotGrid storage at Edinburgh:
- IBM X Series 370: PIII Xeon with 512 MB memory, 32 x 512 MB RAM, 70 x 73.4 GB IBM FC Hot-Swap HDD
CDF equipment at Glasgow:
- 8 x 700 MHz Xeon IBM xSeries, GB memory, 1 TB disk
Griddev testrig at Glasgow:
- 4 x 233 MHz Pentium II
BaBar UltraGrid System at Edinburgh:
- 4 UltraSparc 80 machines in a rack, 450 MHz CPUs in each, 4Mb cache, 1 GB memory, Fast Ethernet and Myrinet switching
2004 scale: 300 CPUs, 0.1 PBytes

Network
- Internal networking is currently a hybrid of:
  – 100Mb(ps) to nodes of CPU farms
  – 1Gb to disk servers
  – 1Gb to tape servers
- UK: academic network SuperJANET4 – 2.5Gb backbone, upgrading to 20Gb in 2003
- EU: SJ4 has a 2.5Gb interconnect to Geant
- US: new 2.5Gb link to ESnet and Abilene for researchers
- UK involved in networking development:
  – internal with Cisco on QoS
  – external with DataTAG

Grid Issues – Coordination
- The technical part is not the only problem. Sociological problems? Resource sharing means short-term productivity loss but long-term gain.
- Key? Communication/coordination between people/centres/countries – this kind of worldwide close coordination across multi-national collaborations has never been done in the past.
- We need mechanisms to make sure that all centres are part of global planning, in spite of different conditions of funding, internal planning, timescales etc.
- The Grid organisation mechanisms should be complementary to, not parallel with or conflicting with, existing experiment organisation: LCG-DataGRID-eSC-GridPP; BaBar-CDF-D0-ALICE-ATLAS-CMS-LHCb-UKQCD.
- Local perspective: build upon existing strong PP links in the UK to build a single Grid for all experiments.

Experiment Deployment

DataGrid Middleware Work Packages
- Collect requirements for middleware, taking into account requirements from application groups
- Survey current technology, for all middleware
- Core Services testbed – Testbed 0: Globus (no EDG middleware)
- First Grid testbed release – Testbed 1: first release of EDG middleware
- WP1: workload – job resource specification & scheduling
- WP2: data management – data access, migration & replication
- WP3: grid monitoring services – monitoring infrastructure, directories & presentation tools
- WP4: fabric management – framework for fabric configuration management & automatic software installation
- WP5: mass storage management – common interface for Mass Storage Systems
- WP7: network services – network services and monitoring

DataGrid Architecture
[Diagram: layered architecture]
- Local Application / Local Database
- Grid Application Layer: Data Management, Job Management, Metadata Management, Object to File Mapping
- Collective Services: Information & Monitoring, Replica Manager, Grid Scheduler
- Underlying Grid Services: Computing Element Services, Authorization/Authentication and Accounting, Replica Catalog, Storage Element Services, SQL Database Services, Service Index
- Fabric Services: Configuration Management, Node Installation & Management, Monitoring and Fault Tolerance, Resource Management, Fabric Storage Management
- Grid Fabric / Local Computing

Authentication/Authorization
Authentication (CA Working Group):
- 11 national certification authorities; policies & procedures → mutual trust
- users identified by CAs' certificates
- CAs: CERN, CESNET, CNRS, DataGrid-ES, GridPP, Grid-Ireland, INFN, LIP, NIKHEF, NorduGrid, Russian DataGrid
Authorization (Authorization Working Group):
- based on Virtual Organizations (VOs)
- management tools for LDAP-based membership lists
- 6+1 VOs: ALICE, ATLAS, CMS, LHCb, Earth Obs., Biomedical (+ Guidelines)
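As an illustration (not from the slide): each DN in the membership list above is the subject of an X.509 certificate issued by one of these CAs, and can be inspected with standard OpenSSL commands; the file paths here are hypothetical.

    # Print the subject DN of a user certificate (usual ~/.globus layout assumed)
    openssl x509 -in ~/.globus/usercert.pem -noout -subject

    # Check that it chains to a trusted CA certificate (hypothetical CA file)
    openssl verify -CAfile /etc/grid-security/certificates/ukhep-ca.pem \
        ~/.globus/usercert.pem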

WP7 - EDG Authorisation: grid-mapfile generation
[Diagram: mkgridmap flow]
- VO Directory (o=testbed, dc=eu-datagrid, dc=org), with branches ou=People, ou=Testbed1, ou=???, holding entries such as CN=Franz Elmer and CN=John Smith
- Authorization Directory (o=xyz, dc=eu-datagrid, dc=org), holding local users and a ban list, e.g. CN=Mario Rossi
- mkgridmap reads both directories and, together with the user's Authentication Certificate, generates the grid-mapfile
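To make the end product concrete: a minimal sketch of a grid-mapfile in the standard Globus format (the path and local account names are hypothetical; the DNs reuse names from the diagram).

    # /etc/grid-security/grid-mapfile (hypothetical path)
    # One line per authorised user: certificate DN -> local account.
    # mkgridmap regenerates entries like these from the VO's LDAP directories.
    "/O=Grid/O=CERN/OU=cern.ch/CN=Franz Elmer" datagrid001
    "/O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=John Smith" datagrid002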

Current User Base
- The GridPP (UKHEP) CA uses primitive technology:
  – it works but takes effort
  – 201 personal certs issued
  – 119 other certs issued
- The Grid Support Centre (GSC) will run a CA for UK e-Science:
  – uses openCA; Registration Authority uses the web
  – we plan to use it
  – namespace identifies the RA, not the project
  – authentication, not authorisation
- Through the GSC we have access to the skills of the CLRC eSC; use the helpdesk to formalise support later in the rollout
[Image: UK e-Science Certification Authority – Scale]

EDG TestBed 1 Status: 30 Aug :38
Web interface showing the status of the (~400) servers at Testbed 1 sites; production centres highlighted.

GridPP Context (Externally) – Neil Geddes – Interoperability

Interoperability – Trust Relationships

GridPP Sites in Testbed(s)

GridPP Sites in Testbed: Status 30 Aug :38

From Grid to Web… using GridSite

Documentation
- GridPP Web Site
- EDG User Guide
- EDG User Guide: A biomedical user point of view
- JDL HowTo
- GDMP Guide

Job Submission
1. Authentication: grid-proxy-init
2. Job submission to DataGrid: dg-job-submit
3. Monitoring and control: dg-job-status, dg-job-cancel, dg-job-get-output
4. Data publication and replication: globus-url-copy, GDMP
5. Resource scheduling: JDL, sandboxes, storage elements
Linux text interfaces implemented; GUIs next..
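A minimal end-to-end session along these lines, as a sketch: the JDL file name is hypothetical, and the -i/-o job-ID file options are assumptions extrapolated from the submit example on the next slide.

    # 1. Authentication: create a short-lived proxy from the user's certificate
    grid-proxy-init

    # 2. Job submission: describe the job in JDL, keep the returned job ID
    dg-job-submit myjob.jdl -o myjob.ids

    # 3. Monitoring and control
    dg-job-status -i myjob.ids
    dg-job-cancel -i myjob.ids      # only if the job must be stopped

    # 4. Retrieve the output sandbox (stdout, stderr, declared output files)
    dg-job-get-output -i myjob.ids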

Job Submission Example
dg-job-submit /home/evh/sicb/sicb/bbincl jdl -o /home/evh/logsub/

bbincl jdl:
#
Executable = "script_prod";
Arguments = " ,v235r4dst,v233r2";
StdOutput = "file output";
StdError = "file err";
InputSandbox = {"/home/evhtbed/scripts/x509up_u149", "/home/evhtbed/sicb/mcsend", "/home/evhtbed/sicb/fsize", "/home/evhtbed/sicb/cdispose.class", "/home/evhtbed/v235r4dst.tar.gz", "/home/evhtbed/sicb/sicb/bbincl sh", "/home/evhtbed/script_prod", "/home/evhtbed/sicb/sicb dat", "/home/evhtbed/sicb/sicb dat", "/home/evhtbed/sicb/sicb dat", "/home/evhtbed/v233r2.tar.gz"};
OutputSandbox = {"job txt", "D ", "file output", "file err", "job txt", "job txt"};
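The digits in the original file names were lost in transcription; for orientation, a cleaned-up JDL of the same shape might read as follows (all names hypothetical, with <run> standing in for the lost numeric argument).

    # Minimal JDL sketch, same structure as the example above
    Executable    = "script_prod";
    Arguments     = "<run>,v235r4dst,v233r2";
    StdOutput     = "fileoutput";
    StdError      = "fileerr";
    InputSandbox  = {"/home/evhtbed/script_prod", "/home/evhtbed/v235r4dst.tar.gz"};
    OutputSandbox = {"fileoutput", "fileerr"};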

GUI - today

GUI - future? Web Services: access via Grid Certificate

GridPP – Achievements and Issues
1st Year Achievements:
- Complete Project Map – Applications : Middleware : Hardware
- Fully integrated with EU DataGrid and LCG projects
- Rapid middleware deployment/testing
- Integrated US-EU applications development, e.g. BaBar+EDG
- Roll-out document for all sites in the UK (Core Sites, Friendly Testers, User Only)
- Testbed up and running at 15 sites in the UK
- Tier-1 deployment
- 200 GridPP certificates issued
- First significant use of Grid by an external user (LISA simulations) in May 2002
- Web page development (GridSite)
Issues for Year 2:
- Status: 19 Jul :52 GMT – keep monitoring and improve testbed deployment efficiency
- Importance of EU-wide development of middleware
- Integrated testbed for use/testing by all applications
- Reduce the integration layer between middleware and application software
- Integrated US-EU applications development
- Tier-1 Grid production mode
- Tier-2 definitions and deployment
- Integrated Tier-1 + Tier-2 testbed
- Transfer to UK e-Science CA
- Integration with other UK projects, e.g. AstroGrid, MyGrid…

GridPP Sites in Testbed: Status 19 Jul :52 – Project Map – software releases at each site

GridPP – An Operational Grid
- From Web to Grid: fit into UK e-Science structures
- LHC computing: particle physicists will use their experience in distributed computing to build and exploit the Grid
- Infrastructure: tiered computing down to the physicist desktop; importance of networking
- Existing experiments have immediate requirements
- Non-technical issues = recognising/defining roles (at various levels)
- UK GridPP started 1/9/01; EU DataGrid first middleware ~1/9/01
- Development requires a testbed with feedback → an operational Grid. Status: 25 Jun :38:47 GMT – a day in the life..
- GridPP Testbed is relatively small scale – migration plans required, e.g. for the CA
- Grid jobs are being submitted today.. the user feedback loop is important.. Grid tools web page development by a VO
- Next stop: Web services…

Summary
A vision is only useful if it's shared: Grid success is fundamental for PP.
1. Scale in UK? 0.5 PBytes and 2,000 distributed CPUs – GridPP in Sept
2. Integration – ongoing..
3. Dissemination – external and internal
4. LHC analyses – ongoing feedback mechanism..
5. Other analyses – closely integrated using EDG tools
6. DataGrid – major investment = must be (and is so far) successful
7. LCG – Grid as a Service
8. Interoperability – sticky subject
9. Infrastructure – Tier-A/1 in place, Tier-2s to follow…
10. Finances – (very well) under control
Next steps on Framework VI.. CERN = the EU's e-science centre?
Co-operation required with other disciplines/industry, esp. AstroGrid.