1
Computing for LHC
Dr. Wolfgang von Rüden, CERN, Geneva
ISEF students visit CERN, 28th June - 1st July 2009
2
LHC Computing
3
The LHC Computing Challenge
Signal/Noise: 10⁻⁹
Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
Compute power: event complexity × number of events × thousands of users → 100k of (today's) fastest CPUs and 45 PB of disk storage
Worldwide analysis & funding: computing funded locally in major regions & countries, efficient analysis everywhere → GRID technology
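As a rough sanity check on the figures above, the sketch below estimates the yearly data volume from an assumed per-experiment recording rate and event size. The inputs (300 Hz recording rate, 1.5 MB per event, 10⁷ seconds of running per year) are illustrative assumptions, not official experiment parameters.

```python
# Back-of-the-envelope check of the LHC data-volume figure quoted above.
# All numeric inputs below are assumptions for illustration only.

recorded_event_rate_hz = 300        # events/s written to storage per experiment (assumed)
event_size_bytes = 1.5e6            # ~1.5 MB per recorded event (assumed)
seconds_of_running_per_year = 1e7   # ~10^7 s of data taking per year (assumed)
experiments = 4                     # ALICE, ATLAS, CMS, LHCb

bytes_per_year = (recorded_event_rate_hz * event_size_bytes
                  * seconds_of_running_per_year * experiments)
petabytes_per_year = bytes_per_year / 1e15

print(f"Estimated raw data volume: {petabytes_per_year:.0f} PB/year")
# With these assumptions the result is ~18 PB/year, the same order of
# magnitude as the ~15 PB/year quoted on the slide.
```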
4
Particle collisions in the centre of a detector
5
Massive Online Data Reduction
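"Online data reduction" here refers to the trigger chain that keeps only a tiny fraction of collisions for storage. Below is a minimal toy sketch of a two-stage, threshold-based selection; the event fields, thresholds, and random background model are purely illustrative and are not the experiments' actual trigger logic.

```python
import random

def level1_trigger(event):
    """Fast hardware-style cut: keep events with enough deposited energy.
    The 'energy_gev' field and the 20 GeV threshold are illustrative."""
    return event["energy_gev"] > 20.0

def high_level_trigger(event):
    """Slower software-style cut: require at least two reconstructed tracks.
    Again, purely an assumed toy criterion."""
    return event["n_tracks"] >= 2

# Toy event stream: most collisions are low-energy background.
events = [{"energy_gev": random.expovariate(1 / 5.0),
           "n_tracks": random.randint(0, 4)}
          for _ in range(100_000)]

kept = [e for e in events if level1_trigger(e) and high_level_trigger(e)]
print(f"kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.2f}%)")
# Only a percent-level fraction survives, illustrating why the recorded
# data rate is so much smaller than the raw collision rate.
```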
6
Tier 0 at CERN: acquisition, first-pass processing, storage & distribution
7
Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (~130 centres): simulation, end-user analysis
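A small sketch of the tier hierarchy above as a data structure, tracing how a dataset flows from Tier-0 through Tier-1 to Tier-2. The Tier-1 and Tier-2 site names are examples only (the real infrastructure has 11 Tier-1 and ~130 Tier-2 centres), and the routing function is a simplification of the actual distribution machinery.

```python
# Minimal sketch of the tiered topology; site lists are illustrative.

TIERS = {
    "Tier-0": {"sites": ["CERN"],
               "roles": ["data recording", "initial reconstruction",
                         "data distribution"]},
    "Tier-1": {"sites": ["FZK", "IN2P3", "RAL"],       # examples of the 11 centres
               "roles": ["permanent storage", "re-processing", "analysis"]},
    "Tier-2": {"sites": ["DESY", "GRIF", "ScotGrid"],  # examples of ~130 centres
               "roles": ["simulation", "end-user analysis"]},
}

def distribution_path(dataset):
    """Follow a dataset from the detector down the tier hierarchy."""
    path = [f"{dataset}: recorded and reconstructed at Tier-0 (CERN)"]
    for t1 in TIERS["Tier-1"]["sites"]:
        path.append(f"{dataset}: replicated to Tier-1 {t1} for permanent storage")
    path.append(f"{dataset}: served to Tier-2 sites for end-user analysis")
    return path

for step in distribution_path("RAW run 001"):
    print(step)
```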
8
Recent grid activity
These workloads are at the level anticipated for 2009 data.
In readiness testing, WLCG ran more than 10 million jobs/month (1 job is ~8 hours' use of a single processor), roughly 350k jobs/day.
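To connect these job counts with the "100k CPUs" figure quoted earlier, the sketch below converts jobs per month into an equivalent number of continuously busy processors. The 30-day month is an assumption; the job count and job length come from the slide.

```python
# Rough conversion of the grid-activity figures into an equivalent
# number of continuously busy processors.

jobs_per_month = 10_000_000      # "more than 10 million jobs/month"
hours_per_job = 8                # "1 job is ~8 hours' use of a single processor"
hours_per_month = 30 * 24        # assuming a 30-day month

cpu_hours = jobs_per_month * hours_per_job
equivalent_busy_cpus = cpu_hours / hours_per_month

print(f"{cpu_hours:.2e} CPU-hours/month "
      f"≈ {equivalent_busy_cpus:,.0f} processors busy around the clock")
# ≈ 111,000 processors, consistent with the ~100k CPUs quoted earlier.
```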
9
Data transfer out of Tier 0
Full experiment rate needed is 650 MB/s.
Desired capability: sustain twice that, to allow Tier 1 sites to shut down and recover.
Rates far in excess of that have been demonstrated: all experiments exceeded the required rates for extended periods, and simultaneously; all Tier 1s achieved (or exceeded) their target acceptance rates.
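To put the 650 MB/s export rate into perspective, the following sketch converts the required rate and the 2x target into daily data volumes. Only the rates come from the slide; the decimal unit conventions (1 TB = 10¹² bytes) are assumed.

```python
# What the Tier-0 export rates on this slide mean in daily volume.

required_rate_mb_s = 650                    # MB/s out of Tier-0 (from the slide)
target_rate_mb_s = 2 * required_rate_mb_s   # headroom for Tier-1 downtime recovery
seconds_per_day = 24 * 3600

daily_tb_required = required_rate_mb_s * 1e6 * seconds_per_day / 1e12
daily_tb_target = target_rate_mb_s * 1e6 * seconds_per_day / 1e12

print(f"650 MB/s sustained  ≈ {daily_tb_required:.0f} TB/day")
print(f"1300 MB/s sustained ≈ {daily_tb_target:.0f} TB/day")
# Roughly 56 TB/day required and ~112 TB/day at the 2x target.
```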
10
WLCG depends on two major science grid infrastructures:
EGEE - Enabling Grids for E-sciencE
OSG - US Open Science Grid
... as well as many national grid projects.
Interoperability & interoperation are vital; significant effort has gone into building the procedures to support them.
11
Enabling Grids for E-sciencE (EGEE-II, INFSO-RI-031688)
240 sites, 45 countries, 45,000 CPUs, 12 PetaBytes
> 5,000 users, > 100 VOs, > 100,000 jobs/day
Application areas: archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high-energy physics, life sciences, multimedia, material sciences, ...
Grid infrastructure project co-funded by the European Commission, now in its 3rd phase with over 100 partners.