1
Simulation in IceCube
Science Advisory Committee Meeting
Paolo Desiati, UW-Madison
March 1st-2nd, 2007
2
introduction
- offline software based on a custom-made framework: "IceTray"
  - modular and configurable at runtime
  - modules (specific tasks) and services (shared tasks)
- structure of simulation software
- developing simulation
- testing simulation
- benchmarking simulation
- producing simulation data
- using simulation data
3
structure of simulation software
the core: offline-software tools
- framework functionalities
- input/output
- data structure
- visualization tools
simulation software based on the C++ framework "IceTray"
- some use of third-party code (FORTRAN, Java) through provided interfaces to IceTray
- modules and services grouped in projects
- projects grouped in a meta-project
- SVN version control system
4
structure of simulation software
simulation modules
- physics generators
  - CORSIKA - cosmic muons
  - neutrino-generator, JULIeT - neutrinos at HE and EHE
  - MMC - neutrino generator
  - simple-generator - muon/cascade generator for benchmarking
- event propagators
  - MMC - muon propagator in media
  - JULIeT - propagator for EHE events
- detector response simulation
  - ice properties & photon density
  - PMT-noise simulation
  - PMT & DOM / TWR simulation
  - trigger simulation: IceCube, IceTop and AMANDA
5
structure of simulation software
simulation services
- c2j service (C++ → Java)
- random number generator
- access to the IceCube database
  - event time generator
  - AMANDA calibration
  - IceCube calibration and detector status
legacy: AMANDA simulation in transition from the C/FORTRAN-based AMASIM to IceTray
6
developing simulation
project-level test, verification and release
- a collaboration-wide effort
- authorship responsibility for each project
- ensure stability within the meta-project
- difficult to coordinate: an iterative process
- code review policy
  - ensures clean code and correct usage of standards, units, …
meta-project-level test, verification and release
- mainly coordinated by Alex Olivas (UMD)
- guarantees stable releases for production
- reduce delegation of problems to this level
documentation of code
- more efficient understanding of code written by others
7
testing simulation
- unit tests for each project
- integrated tests
unit tests
- one or more scoped tests: a source and a test module
- verify that modules/services perform tasks as designed
- verify that they provide what is expected within the known constraints
- detect bugs
integrated tests
- physics-oriented tests for higher-level code verification
- if unit tests work properly, no need for a specific integrated test
test example:
8
benchmarking simulation
speed and memory performance
- time-profiling tools provided by IceTray
- use of third-party profilers such as Sun Studio 11
- find bottlenecks in the simulation chain
- detect odd runtime behavior; helps in detecting bugs
automatic code compilation on different architectures
- includes tests
- possibly include well-scoped benchmarking
9
producing simulation data
neutrino telescopes have to win against backgrounds
- cosmic muon events: ×10^6
- coincident cosmic muon events: ×
- atmospheric neutrinos: ×1
- select a pure atmospheric neutrino sample
- requires a detailed description of the cosmic muon background
producing the cosmic muon background is demanding
- uses importance sampling
- requires large computing resources
10
producing simulation data
simulation production on tagged software releases
- keep track of project versions
- keep track of the entire production history
integrated tests at production level
- certain bugs stay invisible until production
- often difficult to find the origin of bugs and strange behavior
- more effort needed to catch problems earlier: testing at production level is not efficient
- need a more efficient distribution of tasks at the production level
11
producing simulation data
IceCube distributed computing resources
- Tier 1 institutions to contribute to production/processing
- Tier 2 institutions to contribute if available
- each site to provide a local responsible person and to become expert in production testing, troubleshooting and maintenance
production tools
- custom-developed (Juan Carlos D-V)
- use diverse batch systems
- flexible and configurable
- production database, logging, runtime monitoring
12
producing simulation data
use of dedicated clusters and of grid systems
- GLOW at UW-Madison
- SWEGrid in Sweden
- LHC Grid at DESY
generating background for IceCube currently
- Dual-Core AMD Opteron processor 280, 2.4 GHz (npx2)
- CORSIKA pre-generation: 1-2 h livetime in 1 d execution time (×100 with importance sampling)
- trigger: 0.5 events/sec (assuming IceCube only & current trigger settings)
  - 300 CPUs in IC9 for real-time production (10 MB/file)
  - 900 CPUs in IC (65 MB/file)
  - 3200 CPUs in IC (230 MB/file)
- double-coincident muons: 0.4 events/sec
  - 5 CPUs in IC9 for real-time production (25 MB/file)
13
producing simulation data
generating background for IceCube-9
- cosmic muon events: 16 h livetime in 5 d execution time and 240 d CPU time (UMD)
- coincident muon events: 5 d livetime in 8 h execution time and 48 d CPU time (UW)
generating signal for IceCube-9 (E^-1 spectrum)
- 400 kevents in 1 d execution time and 43 d CPU time (UW)
generating background for IceCube-80
- 3 h livetime in 30 d execution time and 430 d CPU time (SWEGRID)
14
producing simulation data
location             batch system     cores        type     CPU equivalent (GHz)
UW (US)              Condor           2128 (272)   32-bit   (573)
                                      388 (0)      64-bit   573 (0)
                     PBS              256 (256)             614 (614)
UMD (US)                              70 (70)               188 (188)
Mons (Belgium)                        24 (?)                44 (?)
Southern (US)        OpenPBS          56 (56)               212 (212)
LBNL (US)            SGE              257 (16)              694 (48)
Stockholm (Sweden)   SweGrid          600 (~60)             1680 (168)
Chiba (Japan)                         30 (?)                42 (?)
                                      10 (10)               27 (27)
DESY (Germany)                        400 (?)               1080 (?)
Brussels (Belgium)   Condor/OpenPBS   90 (?)                112 (?) / 302 (?)
Wuppertal (Germany)  ALiCEnext        450 (225)             1080 (540)
Aachen (Germany)                      20 (?)                28 (?)
15
producing simulation data
separate detector configurations to be used
- IceCube in-ice strings
- IceTop surface stations
- IceCube/IceTop coincidence events
- IceCube/AMANDA merged events
- use of real detector snapshots from the online calibration and detector status database
need to increase production efficiency
- merge detector configurations into the same production process
- make better use of importance sampling for CORSIKA
- reduce file size
- increase simulation speed and production-handling speed
need to increase computing resources
- use and contribute to grids more massively (e.g. GLOW)
16
using simulation data
first data analyses with IceCube-9 (John Pretz, UMD)
17
using simulation data: simulation verification
comparison with experimental data at basic levels
- simulation agreement at trigger level and various filter levels
- ice properties treatment and implementation
- IceCube and AMANDA have different issues
  - PMT response simulation (1 spe)
  - OM response and waveforms: TWR and DOM
  - OM individual sensitivities induced by local hole ice
- AMASIM still in use for current AMANDA analyses
- detector simulation quality still improving