1
The Worldwide LHC Computing Grid
Dr Ian Bird, LCG Project Leader
25th April 2012
2
150 million sensors deliver data … 40 million times per second
3
The LHC Computing Challenge
Signal/Noise: 10^-13 (10^-9 offline)
Data volume: high rate * large number of channels * 4 experiments -> 15 PetaBytes of new data each year (22 PB in 2011)
Compute power: event complexity * number of events * thousands of users -> 200k CPUs (250k CPU cores in 2011), 45 PB of disk storage (150 PB in 2011)
Worldwide analysis & funding: computing funded locally in major regions & countries; efficient analysis everywhere -> GRID technology
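The annual data volume on this slide can be sanity-checked with back-of-envelope arithmetic. A minimal sketch (the ~10^7 seconds of accelerator live time per year is an assumed round figure, not from the slides):

```python
def annual_volume_pb(rate_mb_per_s, live_seconds=1.0e7):
    """PB of data written per year at a sustained rate of rate_mb_per_s.

    live_seconds is an assumed figure for accelerator live time per year
    (~10^7 s is a common rough value, not taken from the slides).
    """
    return rate_mb_per_s * live_seconds / 1e9  # MB -> PB

# A single experiment writing ~450 MB/s accumulates ~4.5 PB/year;
# an aggregate of ~1.5 GB/s across the experiments is the order of
# the "15 PetaBytes of new data each year" quoted on the slide.
print(annual_volume_pb(450))   # -> 4.5
print(annual_volume_pb(1500))  # -> 15.0
```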
4
Tier 0 at CERN: acquisition, first-pass reconstruction, storage & distribution
1.25 GB/sec (ions)
2011: 400-500 MB/sec
2011: 4-6 GB/sec
5
WLCG – what and why?
A distributed computing infrastructure to provide the production and analysis environments for the LHC experiments
Managed and operated by a worldwide collaboration between the experiments and the participating computer centres
The resources are distributed – for funding and sociological reasons
Our task is to make use of the resources available to us – no matter where they are located
Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (~130 centres): simulation, end-user analysis
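The tier roles listed above amount to a simple routing table from task type to responsible tier. A minimal illustrative sketch (this is not WLCG software; the function and role names are hypothetical):

```python
# Tier roles as stated on the slide (Tier-0 / Tier-1 / Tier-2).
TIER_ROLES = {
    "Tier-0": {"data recording", "initial reconstruction", "data distribution"},
    "Tier-1": {"permanent storage", "re-processing", "analysis"},
    "Tier-2": {"simulation", "end-user analysis"},
}

def tiers_for(task):
    """Return the tiers whose role set includes the given task."""
    return sorted(t for t, roles in TIER_ROLES.items() if task in roles)

print(tiers_for("simulation"))     # -> ['Tier-2']
print(tiers_for("re-processing"))  # -> ['Tier-1']
```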
6
WLCG Grid Sites: Tier 0, Tier 1, Tier 2
Today: >140 sites, >250k CPU cores, >150 PB disk
7
WLCG Collaboration Status: Tier 0; 11 Tier 1s; 68 Tier 2 federations
Tier 1 centres: Lyon/CCIN2P3, Barcelona/PIC, DE-FZK, US-FNAL, CA-TRIUMF, NDGF, CERN, US-BNL, UK-RAL, Taipei/ASGC, Amsterdam/NIKHEF-SARA, Bologna/CNAF
Today we have 49 MoU signatories, representing 34 countries: Australia, Austria, Belgium, Brazil, Canada, China, Czech Rep, Denmark, Estonia, Finland, France, Germany, Hungary, Italy, India, Israel, Japan, Rep. Korea, Netherlands, Norway, Pakistan, Poland, Portugal, Romania, Russia, Slovenia, Spain, Sweden, Switzerland, Taipei, Turkey, UK, Ukraine, USA.
8
WLCG 2010-11
CPU corresponds to >>150k cores permanently running; peak job loads around 200k concurrent jobs (1 M jobs/day)
In 2010 WLCG delivered ~100 CPU-millennia!
Data traffic in Tier 0 and to the grid larger than 2010 values: up to 4 GB/s from DAQs to tape; up to 2 PB of data to tape per month
[Charts: data to tape/month by experiment (ALICE, AMS, ATLAS, CMS, Compass, LHCb); CPU delivered (HS06-hours/month), including Tier 1 re-processing of 2010 pp data and heavy-ion data]
This capacity enables the rapid delivery of physics results
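The ~100 CPU-millennia figure is consistent with the quoted core count: N cores running for a year deliver N core-years, i.e. N/1000 CPU-millennia. A rough check (the average utilisation factor is an assumption for illustration, not from the slide):

```python
def cpu_millennia(cores, utilisation=1.0):
    """Core-millennia delivered in one year by `cores` at the given
    average utilisation (utilisation is an assumed illustrative factor)."""
    return cores * utilisation / 1000.0

# ~150k cores at roughly two-thirds average utilisation is the order of
# the ~100 CPU-millennia WLCG delivered in 2010.
print(round(cpu_millennia(150_000, 2 / 3)))  # -> 100
```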
9
CPU – around the Tiers
The grid really works
All sites, large and small, can contribute – and their contributions are needed!
Significant use of Tier 2s for analysis
10
LHC Networking
Relies on OPN, GEANT, US-LHCNet, and the NRENs & other national & international providers
11
Broader Impact of the LHC Computing Grid
WLCG has been leveraged on both sides of the Atlantic to benefit the wider scientific community:
Europe: Enabling Grids for E-sciencE (EGEE) 2004-2010; European Grid Infrastructure (EGI) 2010--
USA: Open Science Grid (OSG) 2006-2012 (+ extension?)
Many scientific applications: archeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …
12
Grid in Latvia
Latvia took part in the Baltic Grid projects (ended 2010): Riga Technical University, the Institute of Mathematics and Computer Science, and the University of Latvia
Latvia (Latvia Grid) is now a partner in the European Grid Infrastructure