GridPP: Building a UK Computing Grid for Particle Physics
Professor Steve Lloyd, Queen Mary, University of London
Chair of the GridPP Collaboration Board
Public Service Summit, 22 September 2004
Slide 2: Outline
- Why? The CERN LHC and the Data Deluge
- What? GridPP and the Grid
  – What is the Grid?
  – Applications and Middleware
  – Tier-1 and Tier-2 Regional Centres
- How? GridPP Management
- Summary and Challenges
Slide 3: What is GridPP?
- 19 UK Universities, CCLRC (RAL & Daresbury) and CERN
- Funded by the Particle Physics and Astronomy Research Council (PPARC)
- GridPP1, 2001-2004, £17m: "From Web to Grid"
- GridPP2, 2004-2007, £15m: "From Prototype to Production"
Slide 4: The CERN LHC
- The world's most powerful particle accelerator, due to switch on in 2007
- 4 large experiments
Slide 5: LHC Experiments (e.g. ATLAS)
- More than 10^8 electronic channels
- 8×10^8 proton-proton collisions per second
- 2×10^-4 Higgs per second
- 10 Petabytes of data a year (10 million GBytes = 14 million CDs)
- Searching for the Higgs particle and exciting new physics
- Starting from this event, looking for this 'signature'
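A quick back-of-envelope check of these rates (a sketch in Python; the ~10^7 seconds of effective running per year is an assumed rule of thumb, not a figure from the slide) shows why so much data must be recorded to find so few Higgs events:

```python
# Back-of-envelope check of the event rates quoted on this slide.
# ASSUMPTION (not from the slide): ~1e7 seconds of effective LHC running per year.

COLLISIONS_PER_SEC = 8e8      # proton-proton collisions per second (from the slide)
HIGGS_PER_SEC = 2e-4          # Higgs bosons produced per second (from the slide)
RUN_SECONDS_PER_YEAR = 1e7    # assumed effective running time per year

higgs_fraction = HIGGS_PER_SEC / COLLISIONS_PER_SEC    # ~2.5e-13
higgs_per_year = HIGGS_PER_SEC * RUN_SECONDS_PER_YEAR  # ~2000

print(f"About 1 Higgs in every {1 / higgs_fraction:.0e} collisions")
print(f"Roughly {higgs_per_year:.0f} Higgs events per year of running")
```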
Slide 6: What is the Grid?
- On a single PC, programs (Word/Excel, email/web, games, your program) run on an operating system, which manages the CPU and disks.
- On the Grid, your program runs on middleware, which manages CPU clusters, disk clusters, a user interface machine, a resource broker and an information service.
- Middleware is the operating system of a distributed computing system.
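To make the "middleware as operating system" analogy concrete, here is a purely illustrative Python sketch; the ResourceBroker and InformationService classes are placeholders mirroring the boxes in the slide's diagram, not the real EDG/LCG interfaces.

```python
# Illustrative sketch of the "middleware as operating system" analogy.
# These classes are placeholders mirroring the diagram, NOT real Grid middleware APIs.

class InformationService:
    """Knows which clusters exist and how busy they are."""
    def __init__(self, clusters):
        self.clusters = clusters

    def least_loaded(self):
        return min(self.clusters, key=lambda c: c["load"])

class ResourceBroker:
    """Matches a user's job to a suitable cluster, like an OS scheduler."""
    def __init__(self, info_service):
        self.info = info_service

    def submit(self, job):
        cluster = self.info.least_loaded()
        print(f"Running '{job}' on {cluster['name']}")

# The user only talks to the middleware, never to individual machines,
# just as a desktop program talks to the OS rather than the raw CPU and disks.
clusters = [{"name": "CPU cluster A", "load": 0.8},
            {"name": "CPU cluster B", "load": 0.3}]
ResourceBroker(InformationService(clusters)).submit("your program")
```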
Slide 7: What is the Grid? (illustration: from this... to this)
Slide 8: International Collaboration
- EU DataGrid (EDG), 2001-2004: middleware development project
- US and other Grid projects: interoperability
- LHC Computing Grid (LCG): Grid deployment project for the LHC
- EU Enabling Grids for e-Science in Europe (EGEE), 2004-2006: Grid deployment project for all disciplines
Slide 9: The LCG Grid
Slide 10: Grid Snapshot
Slide 11: GridPP1 Areas
- LHC Computing Grid Project (LCG): applications, fabrics, technology and deployment
- European DataGrid (EDG): middleware development
- UK Tier-1/A Regional Centre: hardware and manpower
- Grid application development: LHC and US experiments + Lattice QCD
- Management, travel etc.
Slide 12: GridPP2 Areas
- Management, travel etc.
- UK Tier-1/A Regional Centre: hardware
- UK Tier-1/A: manpower
- UK Tier-2 Regional Centres: manpower
- LHC Computing Grid Project (LCG): manpower
- Middleware, security and networking: manpower
- Grid application development: LHC and US experiments + Lattice QCD, phenomenology and a generic portal
Slide 13: Application Development
- GANGA
- SAMGrid
- Lattice QCD
- AliEn → ARDA
- CMS
- BaBar
Slide 14: Middleware Development
- Configuration management
- Storage interfaces
- Network monitoring
- Security
- Information services
- Grid data management
Slide 15: UK Tier-1/A Centre
- High quality data services
- National and international role
- UK focus for international Grid development
- 700 dual-CPU machines, 80 TB disk, 60 TB tape (capacity 1 PB)
- Grid Operations Centre
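Setting the prototype hardware quoted here against the eventual data volume from slide 5 (10 PB per year) shows why no single centre can hold everything and why a tiered, distributed model is needed. A rough Python sketch, illustrative only, since it compares 2004 prototype capacity with 2007-era data rates:

```python
# Rough comparison of the Tier-1/A capacity on this slide with the 10 PB/year
# LHC data volume from slide 5. Illustrative only: the hardware figures are the
# 2004 prototype, while 10 PB/year refers to LHC running from 2007 onwards.

ANNUAL_DATA_TB = 10_000       # 10 PB/year, from slide 5
DISK_TB = 80                  # Tier-1/A disk
TAPE_CAPACITY_TB = 1_000      # tape "capacity 1 PB"

print(f"Disk holds {DISK_TB / ANNUAL_DATA_TB:.1%} of one year's data")
print(f"Full tape capacity holds {TAPE_CAPACITY_TB / ANNUAL_DATA_TB:.0%} of one year's data")
# Hence the tiered model: Tier-0 at CERN plus national Tier-1 and regional
# Tier-2 centres sharing the load across the Grid.
```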
Slide 16: UK Tier-2 Centres
- ScotGrid: Durham, Edinburgh, Glasgow
- NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield
- SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
- LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL
- Mostly funded by HEFCE
Slide 17: GridPP in Context
Diagram (not to scale) relating GridPP to the UK Core e-Science Programme, the Institutes, the Tier-1/A and Tier-2 Centres, CERN, LCG and EGEE, with GridPP activities spanning middleware, security and networking, applications development and integration, the experiments and the Grid Support Centre.
Slide 18: Management
- Collaboration Board
- Project Management Board
- Project Leader and Project Manager
- Deployment Board and Deployment Team
- User Board
- Tier-1 Board and Tier-2 Board
- Production Manager
- Dissemination Officer
- EGEE Leader
- CERN LCG Liaison
- Project Map and Risk Register
Slide 19: Summary
- 2001: separate experiments, resources and multiple accounts (BaBar, D0, CDF, ATLAS, CMS, LHCb and ALICE across 19 UK Institutes, the RAL Computer Centre and the CERN Computer Centre, with SAMGrid, BaBarGrid, EDG and GANGA)
- 2004: prototype Grids (UK Prototype Tier-1/A Centre, 4 UK Prototype Tier-2 Centres and the CERN Prototype Tier-0 Centre, with LCG, EGEE and ARDA)
- 2007: 'one' production Grid (UK Tier-1/A Centre, 4 UK Tier-2 Centres and the CERN Tier-0 Centre, running LCG)
Slide 20: Challenges
(Illustration comparing heights: Concorde (15 km), a CD stack with 1 year of LHC data (~20 km), "we are here" (1 km).)
- Scaling to full size: ~10,000 → 100,000 CPUs
- Stability, robustness etc.
- Security
- Sharing resources (in an RAE environment!)
- International collaboration
- Continued funding beyond the start of the LHC!
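The "~20 km CD stack" is easy to reproduce from the numbers already on slide 5; a minimal sketch, assuming roughly 700 MB of data and 1.2 mm of thickness per CD (neither figure is on the slides):

```python
# Reproducing the "CD stack with 1 year LHC data (~20 km)" figure.
# ASSUMPTIONS (not from the slides): ~700 MB per CD, ~1.2 mm per CD in a stack.

DATA_PER_YEAR_GB = 10_000_000   # 10 PB/year = 10 million GB (slide 5)
CD_CAPACITY_GB = 0.7            # assumed CD capacity
CD_THICKNESS_MM = 1.2           # assumed thickness of one CD in a stack

cds = DATA_PER_YEAR_GB / CD_CAPACITY_GB        # ~14 million CDs, as quoted
stack_km = cds * CD_THICKNESS_MM / 1_000_000   # ~17 km, i.e. the "~20 km" stack

print(f"{cds / 1e6:.0f} million CDs, stacked ~{stack_km:.0f} km high")
```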
Slide 21: Further Info
http://www.gridpp.ac.uk