Slide 1: LHC Computing Review: Resources
ATLAS Resource Issues
John Huth, Harvard University
24 March 2000
Slide 2: ATLAS Computing Organization
[Organization chart: Physics/Computing Oversight Board; Computing Steering Group; National Computing Board; Technical Group; Architecture team; coordinators and QC groups for simulation, reconstruction, and database; event filter; detector systems.]
Slide 3: Scales of Effort
The best benchmarks are the Tevatron collider experiments (CDF, D0). Scaling to the LHC:
- CPU: factor of ~1000 (event complexity)
- Data volume: 10x to 100x
- User/developer community: 5x
- Distribution effort: 5x
Slide 4: The ATLAS Computing Model
Data sizes per event (CTP numbers):
- RAW: 1 MB (at 100 Hz)
- ESD: 100 kB (moving up)
- AOD: 10 kB
- TAG: 100 B
Data held at each tier:
- Tier-0: RAW, ESD, AOD, TAG
- Tier-1: ESD, AOD, TAG
- Tier-2: AOD, TAG
This might be different for the first year(s). (A back-of-envelope check of the implied annual volumes follows below.)
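As a quick sanity check, a minimal sketch of the annual data volumes these per-event sizes imply. The 1e7 seconds of effective running per year is my assumption (a standard HEP rule of thumb), not a number from the slides.

```python
# Annual data volumes implied by the slide's per-event sizes.
# ASSUMPTION: 1e7 s of effective live running per year (not on the slides).
EVENT_RATE_HZ = 100          # trigger output rate, from the slide
LIVE_SECONDS_PER_YEAR = 1e7  # assumed effective running time

sizes_bytes = {"RAW": 1e6, "ESD": 1e5, "AOD": 1e4, "TAG": 1e2}

events_per_year = EVENT_RATE_HZ * LIVE_SECONDS_PER_YEAR  # 1e9 events/year
for name, size in sizes_bytes.items():
    tb = events_per_year * size / 1e12
    print(f"{name}: {tb:,.1f} TB/year")
# RAW: 1,000.0  ESD: 100.0  AOD: 10.0  TAG: 0.1  (TB/year)
```

The ~1 PB/year of RAW data is consistent with the petabyte-scale tape stores quoted on the later slides.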
Slide 5: The U.S. ATLAS Model as an Example
[Diagram slide; content not recoverable from the transcript.]
Slide 6: Data Grid Hierarchy
[Diagram: Tier 0 at CERN feeds Tier 1 centers (e.g., FNAL/BNL); each Tier 1 serves Tier 2 regional centers, with Tier 3 and Tier 4 layers below.]
Slide 7: ATLAS Milestones
- 2001: The number and locations of Tier-1 centers should be known.
- 2002: The basic worldwide computing strategy should be defined.
- 2003: Typical sizes for the Tier-0 and Tier-1 centers should be proposed.
- 2003: The role of Tier-2 centers in the GRID should be known.
Slide 8: Facilities Architecture: USA as Example
- US ATLAS Tier-1 Computing Center at BNL: national in scope, at ~20% of Tier-0 (see notes at end)
- US ATLAS Tier-2 Computing Centers: regional in scope, at ~20% of Tier-1 each; likely one of them at CERN
- US ATLAS institutional computing facilities
- US ATLAS individual desktop systems
Slide 9: U.S. ATLAS as Example
Total US ATLAS facilities in 2005 should include:
- 10,000 SPECint95 for re-reconstruction
- 85,000 SPECint95 for analysis
- 35,000 SPECint95 for simulation
- 190 TB/year of on-line (disk) storage
- 300 TB/year of near-line (robotic tape) storage
- Dedicated OC12 (622 Mbit/s) connectivity from the Tier-1 to each Tier-2
- Dedicated OC12 (622 Mbit/s) to CERN
Slide 10: US ATLAS: Integrated Capacities by Year
[Table slide; the yearly figures are not recoverable from the transcript.]
Slide 11: Muon Level-2 Trigger
[Figure: radius-of-curvature map for muons.]
Slide 12: Neutron Background Studies
[Figure: total neutron flux, in kHz/cm².]
Slide 13: Resource Estimates for the First Year: Assumptions
- 100 Hz event rate
- Two passes through reconstruction
- Low-luminosity running (1.0E+33 cm⁻²s⁻¹)
- Two-pass calibration
- Costing at 2000 prices, adjusted by Moore's law
Note: some estimates are "bottom-up", using ATLAS Physics TDR numbers.
Slide 14: ATLAS and the Regional Centre Hierarchy
Intentions to set up a local Tier-1 have already been expressed in:
- Canada (ATLAS, Tier-1/2)
- France (LHC)
- Germany (LHC, or multinational at CERN?)
- Italy (ATLAS?)
- Japan (ATLAS, Tier-1/2)
- Netherlands (LHC)
- Russia (LHC)
- UK (LHC)
- USA (ATLAS)
Slide 15: CTP Estimate: Tier-1 Center
A Tier-1 regional centre should have at startup (at least):
- 30,000 SPECint95 for analysis
- 20,000 SPECint95 for simulation
- 100 TB/year of on-line (disk) storage
- 200 TB/year of near-line (mass) storage
- 100 Mbit/s connectivity to CERN
Assume no major raw-data processing or handling outside of CERN; re-reconstruction happens partially in the regional centres.
Slide 16: Calibration Assumptions
- Muon system: 100 Hz of "autocalibration" data at 200 SI95/event; 2nd pass at 20 Hz for alignment
- Inner Detector: 10 Hz at 1 SI95/event for calibration (muon tracks); 2nd pass for alignment
- EM calorimeter: 0.2 Hz at 10 SI95/event (Z -> e+e-); 2nd pass repeats the analysis
- Hadronic calorimeter: 1 Hz at 100 SI95/event (isolated tracks); 2nd pass repeats with found tracks
Slide 17: Calibration Numbers
- CPU required: 24,000 SI95
- Data storage: 1.3 PB (assuming the data from this pass are stored and fed into the raw data store)
(A sketch reproducing the CPU figure from the Slide 16 assumptions follows below.)
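A rough check of the 24,000 SI95 figure from the Slide 16 assumptions. I read "N SI95/event" as N SI95-seconds per event, so that rate times cost gives the sustained capacity needed; that interpretation is mine, not stated on the slides.

```python
# Sustained calibration CPU from the Slide 16 rates and per-event costs.
# ASSUMPTION: "N SI95/event" means N SI95-seconds per event.
first_pass = {            # (rate in Hz, cost in SI95*s per event)
    "muon autocalibration": (100, 200),
    "inner detector":       (10, 1),
    "EM calorimeter":       (0.2, 10),
    "hadronic calorimeter": (1, 100),
}
second_pass = {           # only the muon second-pass rate (20 Hz) is given
    "muon alignment": (20, 200),
}

total = sum(r * c for r, c in {**first_pass, **second_pass}.values())
print(f"{total:,.0f} SI95")  # ~24,112 SI95, matching the quoted 24,000
```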
Slide 18: Reconstruction
Two passes. Per-event CPU cost by system:
- Muon: 200 SI95
- Hadronic + EM calorimeters: 10 SI95
- Inner Detector: 100 SI95
NB: at high luminosity the Inner Detector numbers may rise drastically, and the numbers may vary substantially by 2006.
- Total CPU: 64,000 SI95 (Robertson: 65,000)
- Robotic store: 2 PB
- Reprocessing: 128k SI95 (one pass per 3 months)
(See the arithmetic sketch below.)
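The prompt-reconstruction total follows from the per-system costs, the 100 Hz rate, and the two passes; again I read "SI95/event" as SI95-seconds per event. The 128k SI95 reprocessing figure is quoted as-is; the slides do not give enough detail to rederive it.

```python
# Check of the 64,000 SI95 prompt-reconstruction figure.
# ASSUMPTION: "N SI95/event" means N SI95-seconds per event.
per_event_si95s = {"muon": 200, "calorimeters": 10, "inner detector": 100}
RATE_HZ, PASSES = 100, 2

total = sum(per_event_si95s.values()) * RATE_HZ * PASSES
print(f"{total:,} SI95")  # 62,000 SI95, consistent with the quoted 64,000
```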
Slide 19: Generation and Simulation
- "Astrophysical" uncertainties: highly model-dependent, in particular the scale of Geant4 activity vs. fast simulation (CDF vs. D0 models)
- Assume 1% of the total data volume is simulated via Geant4, at 3000 SI95/event; data store: 10 TB
- The remainder (10x) via fast simulation: 30(?) TB, negligible CPU
(A volume check appears below.)
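A minimal check of the Geant4 numbers, assuming the ~1e9 events/year from the earlier slides and a RAW-like 1 MB per simulated event; both the event size and the 1e7 s year are my assumptions.

```python
# Geant4 simulation volume and sustained CPU.
# ASSUMPTIONS: 1e9 events/year (100 Hz x 1e7 s) and 1 MB per simulated event.
EVENTS_PER_YEAR = 1e9
G4_FRACTION = 0.01
G4_COST_SI95S = 3000   # SI95*s per event, from the slide
EVENT_SIZE_MB = 1.0    # assumed RAW-like size

g4_events = EVENTS_PER_YEAR * G4_FRACTION
print(f"store: {g4_events * EVENT_SIZE_MB / 1e6:.0f} TB")     # 10 TB, as quoted
print(f"CPU:   {g4_events * G4_COST_SI95S / 1e7:,.0f} SI95")  # ~3,000 SI95 sustained
```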
Slide 20: Analysis
- 130,000 SI95 from the ATLAS CTP; MONARC has pushed this number up
- Depends strongly on the assumptions. Example: the U.S. estimate is 85k SI95, which would suggest a minimum of 500k SI95 for all of ATLAS, but with large uncertainties
- 300 TB of storage per regional centre
Slide 21: Resources
CERN (Tier-0):
- Raw data store
- Two passes of reconstruction
- Calibration
- Reprocessing
- Analysis etc. assumed to be part of contributions (e.g., a regional centre at CERN)
Tier-1s:
- Each has 20% of the CERN capacity in CPU/tape/disk (reconstruction, ...)
- Monte Carlo, calibration, and analysis
Costing at 2000 prices, adjusted by Moore's law (1.4/year for CPU, 1.18/year for tape, 1.35/year for disk).
Slide 22: CPU
- CERN: 216,000 SI95 (calibration, reconstruction, and reprocessing only)
- Single Tier-1: 130k SI95 (U.S. example)
- Total: 1,500 kSI95
NB: uncertainties in the analysis model and in reprocessing times can dominate these estimates.
(The sketch below backs out the implied number of Tier-1 centres.)
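The slides give CERN, per-Tier-1, and total CPU; backing out the implied Tier-1 count is my own arithmetic, not a number that appears on any slide.

```python
# Implied number of Tier-1 centres from the Slide 22 CPU figures.
CERN_SI95 = 216_000
TIER1_SI95 = 130_000
TOTAL_SI95 = 1_500_000

n_tier1 = (TOTAL_SI95 - CERN_SI95) / TIER1_SI95
print(f"~{n_tier1:.1f} Tier-1 centres")  # ~9.9, i.e. about 10,
# roughly consistent with the candidate countries listed on Slide 14
```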
Slide 23: Data Storage (Tape)
- CERN: 2 PB (was 1 PB in the TP)
- Each Tier-1: 400 TB (U.S. estimate)
- Total: 4.0 PB
Slide 24: Disk Storage
More uncertainty here (usage of compressed data, etc.). Figure of merit: disk at 25% of robotic tape.
- CERN: 540 TB (100 TB in the ATLAS Computing TP)
- U.S. estimate: 100 TB
- Sum of CERN + Tier-1s: 1,540 TB
(A consistency check follows below.)
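A sketch checking the "disk = 25% of robotic tape" figure of merit against the tape numbers from Slide 23; the count of 10 Tier-1s is my assumption, carried over from the CPU arithmetic above.

```python
# Disk as 25% of robotic tape, using the Slide 23 tape figures.
# ASSUMPTION: 10 Tier-1 centres (backed out from the CPU totals).
DISK_FRACTION = 0.25
cern_tape_tb, tier1_tape_tb, n_tier1 = 2000, 400, 10

cern_disk = DISK_FRACTION * cern_tape_tb    # 500 TB vs. the quoted 540 TB
tier1_disk = DISK_FRACTION * tier1_tape_tb  # 100 TB, as quoted
print(cern_disk + n_tier1 * tier1_disk)     # 1,500 TB vs. the quoted 1,540 TB
```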
Slide 25: Multipliers
- CPU: 70 CHF/SI95 in 2000; factor of 10 from Moore's law
- Robotic tape: 2,700 CHF/TB in 2000; factor of 2.5 from Moore's law
- Disk: 50,000 CHF/TB in 2000; factor of 5 from Moore's law
- Networking: 20% of the sum of the other hardware costs (a decent rule of thumb)
(Effective unit prices are sketched below.)
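Applying the cumulative Moore's-law factors to the 2000 unit prices gives the effective prices at purchase time. The cumulative factors are roughly the per-year rates from Slide 21 (1.4, 1.18, 1.35) compounded over the ~6 years to startup.

```python
# Effective unit prices after the slide's cumulative Moore's-law factors.
prices_2000 = {"cpu (CHF/SI95)": 70, "tape (CHF/TB)": 2700, "disk (CHF/TB)": 50000}
moore_factor = {"cpu (CHF/SI95)": 10, "tape (CHF/TB)": 2.5, "disk (CHF/TB)": 5}

for item, price in prices_2000.items():
    print(f"{item}: {price / moore_factor[item]:,.0f}")
# cpu: 7 CHF/SI95, tape: 1,080 CHF/TB, disk: 10,000 CHF/TB
```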
Slide 26: Costs
- CPU: CERN 15 MCHF; total 106 MCHF (Tier-1s + CERN)
- Tape: CERN 5.4 MCHF; total 11 MCHF
- Disk: CERN 27 MCHF; total 77 MCHF
- Networking: 37 MCHF
(The roll-up below reproduces these from the earlier capacity and price figures.)
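These figures appear to be at 2000 prices, before the Moore's-law reductions on the next slide; that reading is mine, but the arithmetic supports it.

```python
# Roll-up of Slide 26 from the earlier capacities and 2000 unit prices.
CPU_CHF_PER_SI95 = 70
TAPE_CHF_PER_TB = 2700
DISK_CHF_PER_TB = 50000

cpu_mchf  = 1_500_000 * CPU_CHF_PER_SI95 / 1e6   # 105  (quoted: 106)
tape_mchf = 4000 * TAPE_CHF_PER_TB / 1e6         # 10.8 (quoted: 11)
disk_mchf = 1540 * DISK_CHF_PER_TB / 1e6         # 77   (quoted: 77)
net_mchf  = 0.20 * (cpu_mchf + tape_mchf + disk_mchf)  # ~38.6 (quoted: 37)
print(f"total: {cpu_mchf + tape_mchf + disk_mchf + net_mchf:.0f} MCHF")
# ~231 MCHF, matching the sum of the quoted figures (106 + 11 + 77 + 37)
```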
Slide 27: Moore's Law
Costs after the Moore's-law reduction:
- CPU: CERN 2 MCHF; total 11 MCHF (Tier-1s + CERN)
- Tape: CERN 2.2 MCHF; total 4.3 MCHF
- Disk: CERN 1.9 MCHF; total 5.5 MCHF
- Networking: 7.1 MCHF
Comment: one cannot buy everything at the last moment.
Slide 28: Commentary
- Comparisons: ATLAS TP, Robertson
- Unit costs show wide variation (unit cost of SI95 now, robotic tape, disk)
- Moore's law: varying assumptions
- Requirements can vary widely: ATLAS, CMS, MONARC, etc.
- These numbers should not be taken as cast in stone; within ATLAS there are variations in CPU/event, Monte Carlo methodology, and analysis models
- Nonetheless, this serves as a starting point.