Slide 1: Computing at the Petascale
Grid Computing, by Jürgen Knobloch, CERN IT Department
Presented at "Physics at the Terascale", DESY, Hamburg, December 4, 2007
Slide 2: LHC gets ready …
Slide 3: … what about Computing?
– The challenge
– Starting to get the picture
– Going for the Grid
– Are we there?
– What other use is the Grid?
– Where do we go from here?
Slide 4: LHC Computing Challenge
Slide 5: Timeline, LHC Computing (1994-2008)
Milestones: LHC approved; ATLAS & CMS approved; ALICE approved; LHCb approved; ATLAS & CMS Computing Technical Proposal (CTP); "Hoffmann" Review; Computing TDRs; LHC start.
Estimated ATLAS (or CMS) requirements for the first year at design luminosity:
– CTP: 10⁷ MIPS, 100 TB disk
– "Hoffmann" Review: 7×10⁷ MIPS, 1,900 TB disk
– Computing TDRs: 55×10⁷ MIPS, 70,000 TB disk (140 MSi2K)
Slide 6: Options as seen in 1996
Slide 7: Evolution of CPU Capacity at CERN (CSC 2007, LCG)
[Chart: CPU capacity installed at CERN across accelerator eras, from the SC (0.6 GeV), PS (28 GeV), ISR (300 GeV), SPS (400 GeV), ppbar (540 GeV), LEP (100 GeV) and LEP II (200 GeV) to the LHC (14 TeV). Costs in 2007 Swiss Francs, including infrastructure (computer centre, power, cooling, ...) and physics tapes.]
Slide from Les Robertson
Slide 8: Requirements Match?
Tape and disk requirements exceed what CERN alone can provide by more than a factor of 10; substantial computing capacity is required outside CERN.
Data compiled by Les Robertson
Slide 9: Timeline, Grids (1994-2008)
Grid projects: GriPhyN, iVDGL, PPDG; EU DataGrid; LCG 1 and LCG 2; EGEE 1, 2 and 3; GRID 3, then OSG; WLCG.
Milestones: Data Challenges, Service Challenges, Cosmics, CCRC08 (Common Computing Readiness Challenge), First physics.
Slide 10: Centers around the World form a Global Computer System
The EGEE and OSG projects are the basis of the Worldwide LHC Computing Grid project (WLCG). Inter-operation between grids is working!
Slide 11: WLCG Collaboration
Slide 12: Events at LHC
Luminosity: 10³⁴ cm⁻² s⁻¹
Bunch crossings at 40 MHz, i.e. every 25 ns
About 20 overlapping (pile-up) events per crossing
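To see the scale these numbers imply, here is a back-of-the-envelope rate calculation in Python (the trigger rate and event size in the second part are illustrative assumptions, not figures from the slide):

    # Rates implied by the nominal LHC design values on this slide.
    crossing_rate_hz = 40e6   # bunch crossings per second (every 25 ns)
    pileup = 20               # overlapping pp events per crossing

    interactions_per_s = crossing_rate_hz * pileup
    print(f"pp interactions per second: {interactions_per_s:.1e}")  # 8.0e+08

    # Only a tiny fraction survives the trigger. Assuming a ~200 Hz rate
    # to tape at ~1.6 MB per raw event (illustrative values) gives the
    # order of magnitude of the recording rate:
    print(f"raw data rate: {200 * 1.6:.0f} MB/s")                   # 320 MB/s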
Slide 13: Trigger & Data Acquisition
Slide 14: Tier-0 Recording
Recording rate: 1.25 GB/sec (heavy ions)
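A quick conversion shows what sustaining 1.25 GB/s means in volume and in network terms (a sketch; the duty cycle of real data taking is ignored):

    # Implications of a sustained 1.25 GB/s recording rate.
    rate_gb_s = 1.25
    daily_tb = rate_gb_s * 86_400 / 1_000
    print(f"volume per day: {daily_tb:.0f} TB")      # ~108 TB/day

    # 1.25 GB/s is 10 Gbit/s: a fully loaded 10-Gbit link, which is
    # why dedicated optical links to the Tier-1s (slide 19) matter.
    print(f"link load: {rate_gb_s * 8:.0f} Gbit/s")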
Slide 15: Tier-0, Tier-1, Tier-2
Tier-0 (CERN): data recording, first-pass reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (>200 centres): simulation, end-user analysis
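The division of labour fits in a few lines; a toy model of the tier roles (names and structure are mine, purely illustrative):

    # Toy model of the WLCG tier roles listed above.
    TIER_ROLES = {
        "Tier-0": {"recording", "first-pass reconstruction", "distribution"},
        "Tier-1": {"permanent storage", "re-processing", "analysis"},
        "Tier-2": {"simulation", "end-user analysis"},
    }

    def tiers_for(task: str) -> list[str]:
        """Return the tier(s) responsible for a given task."""
        return [t for t, roles in TIER_ROLES.items() if task in roles]

    print(tiers_for("simulation"))   # ['Tier-2']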
Slide 16: Requirements 2006
[Chart: CPU and disk requirements by site; CERN provides only ~10% of the total.]
Slide 17: Moore's Law Delivered for CPU & Disk
Projections on slides from 1996: expectations fulfilled! (Chart marker: "We are now here.")
Slide 18: Network was a Concern …
[Chart with marker: "We are now here."]
Slide 19: Tier-0/Tier-1: Optical Private Network
Slide 20: OPN Traffic
Slide 21: Data Transfer out of Tier-0
Slide 22: Middleware
Security:
– Virtual Organization Membership Service (VOMS)
– MyProxy
Data management:
– File Catalogue (LFC)
– File Transfer Service (FTS)
– Storage Element (SE)
– Storage Resource Management (SRM)
Job management:
– Workload Management System (WMS)
– Logging and Bookkeeping (LB)
– Computing Element (CE)
– Worker Nodes (WN)
Information system:
– BDII (Berkeley Database Information Index) and R-GMA (Relational Grid Monitoring Architecture) aggregate service information from multiple grid sites; this has since moved to SAM (Site Availability Monitoring)
– Monitoring & visualization: Gridview, Dashboard, Gridmap, etc.
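To make the job-management chain concrete, a minimal sketch of submitting one job from a gLite User Interface node, assuming the gLite 3.x command-line tools are installed; the JDL is the classic hello-world example, and the output parsing is illustrative only:

    # Sketch: one job through the chain VOMS -> WMS -> CE -> WN,
    # with state tracked by Logging & Bookkeeping (LB).
    import subprocess

    with open("hello.jdl", "w") as f:
        f.write('Executable = "/bin/hostname";\n'
                'StdOutput = "std.out";\n'
                'StdError = "std.err";\n'
                'OutputSandbox = {"std.out", "std.err"};\n')

    # 1. Create a VOMS proxy carrying the user's VO membership.
    subprocess.run(["voms-proxy-init", "--voms", "atlas"], check=True)

    # 2. Submit via the WMS; it matches the job against CEs found in
    #    the BDII ("-a" delegates the proxy automatically).
    out = subprocess.run(["glite-wms-job-submit", "-a", "hello.jdl"],
                         capture_output=True, text=True, check=True).stdout
    job_id = out.strip().splitlines()[-1]  # the job URL (parsing is naive)

    # 3. Query the job state as recorded by LB.
    subprocess.run(["glite-wms-job-status", job_id], check=True)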
Slide 23: Site Reliability
Slide 24: Site Reliability
Note: for a site to be reliable, many things have to work simultaneously!
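A quick illustration of why this is demanding: if all n services must work at once and failures are independent (a simplifying assumption), site reliability is the product of the per-service reliabilities:

    # Reliability of a chain of services that must all work at once
    # (simplified model assuming independent failures).
    def site_reliability(per_service: float, n_services: int) -> float:
        return per_service ** n_services

    # Ten services at 99% each already drop the site to ~90%:
    print(f"{site_reliability(0.99, 10):.3f}")   # 0.904
    # For a 95% site across ten services, each needs ~99.5%:
    print(f"{0.95 ** (1 / 10):.4f}")             # 0.9949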
Slide 25: ARDA Dashboard
Slide 26: Gridmap
Slide 27: Increasing Workloads
About one third of the workload is non-LHC.
Data compiled by Ian Bird
Slide 28: Many Grid Applications
At present there are about 20 applications from more than 10 domains on the EGEE Grid infrastructure:
– Astronomy & Astrophysics: MAGIC, Planck
– Computational Chemistry
– Earth Sciences: Earth Observation, Solid Earth Physics, Hydrology, Climate
– Fusion
– High Energy Physics: the 4 LHC experiments (ALICE, ATLAS, CMS, LHCb), BaBar, CDF, DØ, ZEUS
– Life Sciences: Bioinformatics (Drug Discovery, GPS@, Xmipp_MLrefine, etc.)
– Condensed Matter Physics
– Computational Fluid Dynamics
– Computer Science/Tools
– Civil Protection
– Finance (through the Industry Task Force)
Slide 29: Grid Applications
Examples shown: medical applications, seismology, chemistry, astronomy, fusion, particle physics.
Slide 30: Available EGEE Infrastructure
Data compiled by Ian Bird
Slide 31: Ramp-up Needed for Startup
[Charts: installed capacity, pledges, usage and target usage from mid-2006/mid-2007 to April 2008; the required ramp-up factors range from 2.3x to 3.7x.]
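What such factors mean as a sustained rate, assuming smooth exponential growth over the roughly nine months from July 2007 to April 2008 (the interval is read off the chart axes):

    # Monthly growth rate implied by an overall ramp-up factor.
    months = 9   # assumed: Jul-07 to Apr-08
    for factor in (2.3, 2.9, 3.0, 3.7):
        monthly = factor ** (1 / months)
        print(f"{factor}x overall -> {monthly:.3f}x per month "
              f"(~{monthly - 1:.0%} growth each month)")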
Slide 32: 3D, Distributed Deployment of Databases for LCG
– Oracle streaming with downstream capture (ATLAS, LHCb)
– Squid/Frontier web caching (CMS)
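The Frontier approach serves database queries over HTTP so that ordinary Squid proxies can cache the answers close to the clients. A toy read-through cache capturing the idea (illustrative only, not the real protocol):

    # Toy read-through cache: repeated identical queries are answered
    # locally instead of travelling back to the central Oracle server.
    from typing import Callable

    class ReadThroughCache:
        def __init__(self, backend: Callable[[str], bytes]):
            self._backend = backend
            self._store: dict[str, bytes] = {}

        def get(self, query: str) -> bytes:
            if query not in self._store:       # miss: ask the backend
                self._store[query] = self._backend(query)
            return self._store[query]          # hit: served locally

    def central_db(query: str) -> bytes:       # stand-in for Oracle
        print(f"(backend hit) {query}")
        return b"...conditions data..."

    cache = ReadThroughCache(central_db)
    cache.get("conditions for run 42")   # first call reaches the backend
    cache.get("conditions for run 42")   # second call is a cache hit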
Slide 33: The Next Step
Slide 34: EGI Consortium Members
– Johannes Kepler Universität Linz (GUP)
– Greek Research and Technology Network S.A. (GRNET)
– Istituto Nazionale di Fisica Nucleare (INFN)
– CSC - Scientific Computing Ltd. (CSC)
– CESNET, z.s.p.o. (CESNET)
– European Organization for Nuclear Research (CERN)
– Verein zur Förderung eines Deutschen Forschungsnetzes, DFN-Verein (DFN)
– Science & Technology Facilities Council (STFC)
– Centre National de la Recherche Scientifique (CNRS)
Slide 35: EGI, the European Grid Initiative
The EGI Design Study started in September 2007 to establish a sustainable pan-European grid infrastructure after the end of EGEE-3 in 2010. The main foundations of EGI are 37 National Grid Initiatives (NGIs). The project is funded by the European Commission's 7th Framework Programme.
www.eu-egi.org
Slide 36: Computing at the Terascale
Tier-1 centers: TRIUMF (Canada); GridKa (Germany); IN2P3 (France); CNAF (Italy); SARA/NIKHEF (NL); Nordic Data Grid Facility (NDGF); ASCC (Taipei); RAL (UK); BNL (US); FNAL (US); PIC (Spain).
The Grid is now in operation; work continues on reliability, scaling up, and sustainability.