The LHC Computing Grid
Visit of Mtro. Enrique Agüera Ibáñez, Rector of the Benemérita Universidad Autónoma de Puebla, Mexico
Monday 22nd June 2009
Frédéric Hemmer, IT Department Head
The ATLAS experiment
7,000 tons and 150 million sensors, generating data 40 million times per second, i.e. a petabyte per second.
A collision at the LHC
The Data Acquisition
Tier 0 at CERN: Acquisition, First-Pass Processing, Storage & Distribution
The next two slides illustrate what happens to the data as it moves out from the experiments. CMS and ATLAS each produce data at the rate of one DVD's worth every 15 seconds or so, while the rates for LHCb and ALICE are somewhat lower. However, during the part of the year when the LHC accelerates lead ions rather than protons, ALICE (an experiment dedicated to this kind of physics) alone will produce data at over 1 Gigabyte per second – 1.25 GB/sec, or one DVD every 4 seconds.

Initially the data is sent to the CERN Computer Centre – the Tier 0 – for storage on tape. Storage also implies guardianship of the data for the long term: the lifetime of the LHC, at least 20 years. This is not passive guardianship but requires migrating data to new technologies as they arrive. We need large-scale, sophisticated mass storage systems that not only manage the incoming data streams, but also allow for evolution of technology (tapes and disks) without hindering access to the data. The Tier 0 centre provides the initial level of data processing: calibration of the detectors and the first reconstruction of the data.
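As a quick sanity check on the rates just quoted, here is a back-of-the-envelope sketch in Python; the 4.7 GB single-layer DVD capacity is an assumption, not a figure from the slides:

```python
# Back-of-the-envelope check of the data rates quoted above.
# Assumption: one "DVD" means a single-layer disc of ~4.7 GB.
DVD_GB = 4.7

# CMS/ATLAS: one DVD's worth every ~15 seconds.
pp_rate = DVD_GB / 15                  # ~0.31 GB/s per experiment

# ALICE during heavy-ion running: 1.25 GB/s sustained.
ion_rate = 1.25                        # GB/s
seconds_per_dvd = DVD_GB / ion_rate    # ~3.8 s, i.e. "one DVD every 4 seconds"

print(f"CMS/ATLAS: ~{pp_rate:.2f} GB/s each")
print(f"ALICE (ions): one DVD every {seconds_per_dvd:.1f} s")
```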
The LHC Data Challenge
The accelerator will take data in 2009 and run for years.
Experiments will produce about 15 million Gigabytes of data each year (about 20 million CDs!).
LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors.
This requires many cooperating computer centres, as CERN can only provide ~20% of the capacity.
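These figures can be cross-checked against each other; a minimal sketch, assuming a ~700 MB CD (a capacity not stated on the slide):

```python
# Cross-check the annual data volume quoted above.
# Assumption: one CD holds ~700 MB (not stated on the slide).
CD_GB = 0.7
annual_gb = 15e6                  # 15 million gigabytes per year, i.e. ~15 PB

cds_per_year = annual_gb / CD_GB  # ~21 million CDs, matching "about 20 million"
cern_share = 0.20                 # CERN provides ~20% of the capacity...
external = 1 - cern_share         # ...so ~80% must come from partner centres

print(f"{cds_per_year / 1e6:.0f} million CDs/year; {external:.0%} of capacity off-site")
```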
Solution: the Grid
Use the Grid to unite the computing resources of particle physics institutes around the world.
The World Wide Web provides seamless access to information stored in many millions of different geographical locations.
The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.
How does the Grid work?
It makes multiple computer centres look like a single system to the end user.
Advanced software, called middleware, automatically finds the data the scientist needs, and the computing power to analyse it.
Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
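To make the middleware idea concrete, here is a toy sketch of the matchmaking it performs: pick a site that holds the needed data and currently carries the lightest load. The site names, loads and dataset identifiers are invented for illustration; this is not the actual LCG middleware.

```python
# Toy illustration of grid matchmaking: route a job to a site that
# holds the required dataset and has the lowest current load.
# Site names, loads and datasets below are invented for illustration.
sites = {
    "CERN":    {"load": 0.9, "datasets": {"run123", "run124"}},
    "Tier1-A": {"load": 0.4, "datasets": {"run123"}},
    "Tier1-B": {"load": 0.2, "datasets": {"run124"}},
}

def match_site(dataset: str) -> str:
    """Return the least-loaded site that already holds the dataset."""
    candidates = [name for name, s in sites.items() if dataset in s["datasets"]]
    if not candidates:
        raise LookupError(f"no site holds {dataset}")
    return min(candidates, key=lambda name: sites[name]["load"])

print(match_site("run123"))  # -> "Tier1-A": has the data, lighter load than CERN
```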
Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN): data recording, initial data reconstruction, data distribution.
Tier-1 (11 centres): permanent storage, re-processing, analysis.
Tier-2 (~130 centres): simulation, end-user analysis.

The Tier 0 centre at CERN stores the primary copy of all the data. A second copy is distributed between the 11 so-called Tier 1 centres. These are large computer centres in different geographical regions of the world that also have a responsibility for long-term guardianship of the data. The data is sent from CERN to the Tier 1s in real time over dedicated network connections; to keep up with the data coming from the experiments, this transfer must be capable of running at around 1.3 GB/s continuously, equivalent to a full DVD every 3 seconds. The Tier 1 sites also provide the second level of data processing and produce data sets which can be used to perform the physics analysis. These data sets are sent from the Tier 1 sites to the around 130 Tier 2 sites. A Tier 2 is typically a university department or physics laboratory; Tier 2s are located all over the world, in most of the countries that participate in the LHC experiments. Often, Tier 2s are associated with a Tier 1 site in their region. It is at the Tier 2s that the real physics analysis is performed.

Tier-2s in the UK: London Tier 2, NorthGrid, ScotGrid, SouthGrid.
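The tier structure and the sustained CERN-to-Tier-1 rate quoted above can be captured in a few lines; an illustrative sketch, with the roles paraphrased from the slide and the 4.7 GB DVD size again assumed:

```python
# A minimal sketch of the LCG tier model described above, plus a check
# of the quoted CERN-to-Tier-1 transfer rate. Roles are paraphrased
# from the slide; the 4.7 GB DVD capacity is an assumption.
TIER_MODEL = {
    "Tier-0": {"sites": 1,   "roles": ("data recording", "initial reconstruction", "distribution")},
    "Tier-1": {"sites": 11,  "roles": ("permanent storage", "re-processing", "analysis")},
    "Tier-2": {"sites": 130, "roles": ("simulation", "end-user analysis")},
}

RATE_GBPS = 1.3  # sustained CERN -> Tier-1 transfer rate, GB/s
DVD_GB = 4.7
print(f"One DVD every {DVD_GB / RATE_GBPS:.1f} s")           # ~3.6 s, "every 3 seconds"
print(f"~{RATE_GBPS * 86400 / 1000:.0f} TB shipped per day")  # ~112 TB/day sustained
```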
Main outstanding issues related to service/site reliability

From the APEL accounting portal for Aug. '08 to Jan. '09 (figures in MSI2k):

          ALICE    ATLAS     CMS     LHCb     Total
Tier-1s    6.24    32.03    30.73     2.50    71.50  (34.3%)
Tier-2s    9.61    52.23    55.04    20.14   137.02  (65.7%)
Total     15.85    84.26    85.77    22.64   208.52
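The table's totals and percentage shares can be recomputed directly from the per-experiment figures; a short consistency check:

```python
# Recompute the totals and percentage shares in the accounting table above.
usage = {                      # MSI2k per experiment: ALICE, ATLAS, CMS, LHCb
    "Tier-1s": [6.24, 32.03, 30.73, 2.50],
    "Tier-2s": [9.61, 52.23, 55.04, 20.14],
}
row_totals = {tier: sum(vals) for tier, vals in usage.items()}
grand = sum(row_totals.values())

for tier, total in row_totals.items():
    print(f"{tier}: {total:.2f} MSI2k ({total / grand:.1%})")
print(f"Total: {grand:.2f} MSI2k")
# -> Tier-1s: 71.50 (34.3%), Tier-2s: 137.02 (65.7%), Total: 208.52
```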
Impact of the LHC Computing Grid in Europe
LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE).
EGEE is now a global effort, and the largest Grid infrastructure worldwide.
It is co-funded by the European Commission (cost: ~130 M€ over 4 years, of which ~70 M€ comes from the EU).
EGEE is already used for more than 100 applications, including bio-informatics, education and training, and medical imaging.
EGEE is a Grid infrastructure project co-funded by the European Commission, now in its 3rd phase, with partners in 45 countries.

240 sites, 45 countries, 45,000 CPUs, 12 Petabytes of storage, more than 5,000 users, more than 100 Virtual Organisations (VOs), more than 100,000 jobs/day.

Application domains: Archaeology, Astronomy, Astrophysics, Civil Protection, Computational Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …

In Europe, EGEE – which started in 2004 from the service that LCG had built over the preceding 18 months – has now grown to be the world's largest scientific grid infrastructure. While the LHC is by far its biggest user, it supports a whole range of other scientific applications, from life sciences to physics, climate modelling and many others. It is itself a worldwide infrastructure, with partners in around 45 countries.
For more information about the Grid:
Thank you for your kind attention!