Simulation in IceCube
Science Advisory Committee Meeting
Paolo Desiati

Presentation transcript:

Simulation in IceCube
Science Advisory Committee Meeting
Paolo Desiati, UW-Madison
March 1st-2nd 2007

introduction
- offline software based on the custom-made framework "IceTray"
  - modular and configurable at runtime
  - modules (specific tasks) and services (shared tasks); a minimal module sketch follows below
- outline
  - structure of simulation software
  - developing simulation
  - testing simulation
  - benchmarking simulation
  - producing simulation data
  - using simulation data
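To make the module/service pattern concrete, here is a minimal sketch of what an IceTray module looks like in C++. ExampleModule and its Threshold parameter are hypothetical, and exact base-class signatures vary between IceTray releases:

    #include <icetray/I3Module.h>
    #include <icetray/I3Frame.h>

    // Hypothetical example module; not part of the real simulation chain.
    class ExampleModule : public I3Module {
    public:
      explicit ExampleModule(const I3Context& context)
        : I3Module(context), threshold_(10.0) {
        // Parameters declared here become configurable at runtime.
        AddParameter("Threshold", "Illustrative cut value", threshold_);
      }
      void Configure() {
        GetParameter("Threshold", threshold_);
      }
      void Physics(I3FramePtr frame) {
        // Act on the event frame here, then pass it down the chain.
        PushFrame(frame);
      }
    private:
      double threshold_;
    };

    I3_MODULE(ExampleModule);

Declaring parameters with AddParameter is what makes a module configurable at runtime without recompilation.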

structure of simulation software
- the core: offline-software tools
  - framework functionalities
  - input/output
  - data structures
  - visualization tools
- simulation software based on the C++ framework "IceTray"
  - some use of third-party code (FORTRAN, Java), with provided interfaces to IceTray
- modules and services grouped in projects; projects grouped in a meta-project
- SVN version control system
- http://wiki.icecube.wisc.edu/index.php/Offline_Software_Main_Page#Simulation_Information

structure of simulation software
- simulation modules (a chain sketch follows below)
  - physics generators
    - CORSIKA – cosmic muons
    - neutrino-generator, JULIeT – HE and EHE neutrinos
    - MMC – neutrino generator
    - simple-generator – muon/cascade generator for benchmarking
  - event propagators
    - MMC – muon propagator in media
    - JULIeT – propagator for EHE events
  - detector response simulation
    - ice properties & photon density
    - PMT noise simulation
    - PMT & DOM / TWR simulation
    - trigger simulation
    - IceCube, IceTop and AMANDA
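A full simulation pass chains such modules together. The sketch below shows the general pattern for assembling and running a chain with I3Tray from C++; the generator, propagator, detector and trigger module names are placeholders, not the actual class names of the projects listed above:

    #include <icetray/I3Tray.h>

    int main() {
      I3Tray tray;
      // Placeholder names illustrating the chain order:
      // generation -> propagation -> detector response -> trigger -> output.
      tray.AddModule("CosmicMuonGenerator", "generator");
      tray.AddModule("MuonPropagator",      "propagator");
      tray.AddModule("DetectorResponse",    "detector");
      tray.AddModule("TriggerSim",          "trigger");
      tray.AddModule("I3Writer",            "writer");
      tray.SetParameter("writer", "Filename", "simulated_events.i3");
      tray.Execute();
      return 0;
    }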

structure of simulation software
- simulation services
  - c2j service (C++ → Java)
  - random number generator (a lookup sketch follows below)
  - access to the IceCube database
  - event time generator
  - AMANDA calibration
  - IceCube calibration and detector status
- legacy: AMANDA simulation in transition from the C/FORTRAN-based AMASIM to IceTray
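Services are shared through the module context rather than instantiated per module, so, for example, all modules can draw from a single random-number stream. A minimal lookup sketch, assuming the I3RandomService interface (the lookup key and header path may differ by release):

    #include <icetray/I3Context.h>
    #include <phys-services/I3RandomService.h>

    // Fetch the shared random service from a context and use it.
    double one_uniform_draw(const I3Context& context) {
      I3RandomServicePtr rng =
          context.Get<I3RandomServicePtr>("I3RandomService");
      return rng->Uniform(0.0, 1.0);  // one uniform pseudo-random number
    }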

developing simulation
- project-level test, verification and release
  - a collaboration-wide effort
  - authorship responsibility for each project
  - ensures stability within the meta-project
  - difficult to coordinate: an iterative process
- code review policy
  - ensures clean code and correct usage of standards, units, …
- meta-project-level test, verification and release
  - mainly coordinated by Alex Olivas (UMD)
  - guarantees stable releases for production
  - goal: reduce the delegation of problems to this level
- documentation of code
  - more efficient understanding of code written by others

testing simulation
- unit tests for each project
  - one or more scoped test source and test module (see the toy example after this slide)
  - verify that modules/services perform their tasks as designed
  - verify that they provide what is expected within the known constraints
  - detect bugs
- integrated tests
  - physics-oriented tests for higher-level code verification
  - if the unit tests work properly, no specific integrated test is needed
- test example: http://wiki.icecube.wisc.edu/index.php/Noise_Generator
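The Noise_Generator page linked above is one real example; the self-contained toy below illustrates the same pattern of feeding a unit known input and verifying its output statistically, with purely illustrative numbers and no IceTray dependency:

    #include <cassert>
    #include <cmath>
    #include <random>

    // Toy stand-in for a noise generator: Poisson hits in a time window.
    int count_noise_hits(double rate_hz, double window_s, std::mt19937& rng) {
      std::poisson_distribution<int> poisson(rate_hz * window_s);
      return poisson(rng);
    }

    int main() {
      std::mt19937 rng(42);
      const double rate = 700.0;   // Hz, illustrative dark-noise scale
      const double window = 1.0;   // s
      const int trials = 1000;
      long total = 0;
      for (int i = 0; i < trials; ++i) total += count_noise_hits(rate, window, rng);
      const double mean = static_cast<double>(total) / trials;
      // The mean hit count must match rate*window within a few standard errors.
      assert(std::fabs(mean - rate * window) <
             5.0 * std::sqrt(rate * window / trials));
      return 0;
    }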

benchmarking simulation
- speed and memory performance
  - time-profiling tools provided by IceTray
  - use of third-party profilers such as Sun Studio 11
  - find bottlenecks in the simulation chain (a minimal timing sketch follows below)
  - detect odd runtime behavior
  - help in detecting bugs
- automatic code compilation on different architectures
  - includes the tests
  - possibly also well-scoped benchmarking
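Independent of the profilers named above, the basic pattern for finding bottlenecks is to time each stage of the chain; a minimal self-contained sketch:

    #include <chrono>
    #include <cstdio>
    #include <functional>
    #include <string>
    #include <utility>
    #include <vector>

    // Run each named stage once and report its wall-clock time.
    void time_stages(
        const std::vector<std::pair<std::string, std::function<void()> > >& stages) {
      for (const auto& stage : stages) {
        auto t0 = std::chrono::steady_clock::now();
        stage.second();
        auto t1 = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        std::printf("%-12s %10.3f ms\n", stage.first.c_str(), ms);
      }
    }

    int main() {
      time_stages({{"generator",  []{ /* stand-in for generation  */ }},
                   {"propagator", []{ /* stand-in for propagation */ }}});
      return 0;
    }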

producing simulation data
- neutrino telescopes must win against their backgrounds:
  - cosmic muon events: ×10^6
  - coincident cosmic muon events: ×10^2-10^4
  - atmospheric neutrinos: ×1
- selecting a pure atmospheric neutrino sample requires a detailed description of the cosmic muon background
- producing the cosmic muon background is demanding
  - uses importance sampling (sketched after this slide)
  - requires large computing resources
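The importance-sampling idea mentioned above, in a minimal self-contained sketch (the spectral indices and energy bounds are illustrative, not the production settings): generate primaries from a harder spectrum than the physical one, so rare high-energy events are produced more often, and attach a per-event weight that restores the physical spectrum.

    #include <cmath>
    #include <random>

    // Sample E from a power law dN/dE ~ E^-gamma on [emin, emax] (gamma != 1).
    double sample_power_law(double gamma, double emin, double emax,
                            std::mt19937& rng) {
      std::uniform_real_distribution<double> u01(0.0, 1.0);
      const double a = std::pow(emin, 1.0 - gamma);
      const double b = std::pow(emax, 1.0 - gamma);
      return std::pow(a + u01(rng) * (b - a), 1.0 / (1.0 - gamma));
    }

    int main() {
      std::mt19937 rng(1);
      const double gamma_true = 2.7;  // physical cosmic-ray index (illustrative)
      const double gamma_gen  = 2.0;  // harder generation index (illustrative)
      const double e = sample_power_law(gamma_gen, 1e3, 1e9, rng);  // GeV
      // Per-event weight = physical pdf / generation pdf, up to normalization.
      const double weight = std::pow(e, gamma_gen - gamma_true);
      (void)weight;
      return 0;
    }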

producing simulation data
- simulation production runs on tagged software releases
  - keeps track of project versions
  - keeps track of the entire production history
- integrated tests at the production level
  - certain bugs stay invisible until production
  - the origin of bugs and strange behavior is often difficult to find at this stage
  - testing at the production level is not efficient: more effort is needed to catch problems earlier
- need a more efficient distribution of tasks at the production level

producing simulation data
- IceCube distributed computing resources
  - Tier-1 institutions contribute to production/processing
  - Tier-2 institutions contribute if resources are available
  - each site provides a local responsible person, who becomes the expert in production testing, troubleshooting and maintenance
- production tools
  - custom-developed (Juan Carlos D-V)
  - drive diverse batch systems (a hypothetical interface sketch follows below)
  - flexible and configurable
  - production database, logging, runtime monitoring
- http://internal.icecube.wisc.edu/simulation
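One way production tools can stay flexible across diverse batch systems is to program against a thin common interface with one backend per site. The classes below are a hypothetical sketch of that design, not the actual IceCube production code:

    #include <string>

    // Common interface the production tool programs against.
    struct BatchSystem {
      virtual ~BatchSystem() {}
      virtual std::string submit(const std::string& job_script) = 0;  // -> job id
      virtual bool is_finished(const std::string& job_id) = 0;
    };

    // Per-site backends wrap the local submission command.
    struct CondorBackend : BatchSystem {
      std::string submit(const std::string& job_script) override {
        // would shell out to condor_submit here
        return "condor:" + job_script;
      }
      bool is_finished(const std::string&) override { return false; }
    };

    struct PbsBackend : BatchSystem {
      std::string submit(const std::string& job_script) override {
        // would shell out to qsub here
        return "pbs:" + job_script;
      }
      bool is_finished(const std::string&) override { return false; }
    };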

producing simulation data
- use of dedicated clusters and of grid systems
  - GLOW at UW-Madison
  - SWEGrid in Sweden
  - LHC Grid at DESY
- generating background for IceCube, currently on Dual-Core AMD Opteron 280, 2.4 GHz (npx2)
  - CORSIKA pre-generation: 1-2 h livetime in 1 d execution time (×100 with importance sampling)
  - trigger: 0.5 events/s (assuming IceCube only & current trigger settings; a back-of-envelope check follows below)
    - 300 CPUs in IC9 for real-time production (10 MB/file)
    - 900 CPUs in IC22 (65 MB/file)
    - 3200 CPUs in IC80 (230 MB/file)
  - double-coincident muons: 0.4 events/s
    - 5 CPUs in IC9 for real-time production (25 MB/file)
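The quoted core counts can be sanity-checked from the slide's own numbers (a deduction, not an official figure): real-time production needs roughly (detector trigger rate) / (simulated triggered-event rate per core) cores.

    #include <cstdio>

    int main() {
      const double sim_rate_per_core = 0.5;  // triggered events/s per core (slide)
      const double cores_ic9 = 300.0;        // quoted IC9 requirement (slide)
      // The 300-core figure implies an IC9 trigger rate of roughly:
      std::printf("implied IC9 trigger rate ~ %.0f Hz\n",
                  cores_ic9 * sim_rate_per_core);  // -> 150 Hz
      return 0;
    }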

producing simulation data
- generating background for IceCube-9
  - cosmic muon events: 16 h livetime in 5 d execution time and 240 d CPU time (UMD)
  - coincident muon events: 5 d livetime in 8 h execution time and 48 d CPU time (UW)
- generating signal for IceCube-9 (E^-1 spectrum)
  - 400 kevents in 1 d execution time and 43 d CPU time (UW)
- generating background for IceCube-80
  - 3 h livetime in 30 d execution time and 430 d CPU time (SWEGRID)
- http://internal.icecube.wisc.edu/simulation/time

producing simulation data

location              batch system      cores         type    CPU equivalent (GHz)
UW (US)               Condor            2128 (272)    32bit   6140.80 (573)
                                        388 (0)       64bit   573 (0)
                      PBS               256 (256)             614 (614)
UMD (US)                                70 (70)               188 (188)
Mons (Belgium)                          24 (?)                44 (?)
Southern (US)         OpenPBS           56 (56)               212 (212)
LBNL (US)             SGE               257 (16)              694 (48)
Stockholm (Sweden)    Swegrid           600 (~60)             1680 (168)
Chiba (Japan)                           30 (?)                42 (?)
                                        10 (10)               27 (27)
DESY (Germany)                          400 (?)               1080 (?)
Brussels (Belgium)    Condor/OpenPBS    90 (?)                112 (?) / 302 (?)
Wuppertal (Germany)   ALiCEnext         450 (225)             1080 (540)
Aachen (Germany)                        20 (?)                28 (?)

producing simulation data
- separate detector configurations to be used
  - IceCube in-ice strings
  - IceTop surface stations
  - IceCube/IceTop coincident events
  - IceCube/AMANDA merged events
- use of real detector snapshots from the online calibration and detector-status database
- need to increase production efficiency
  - merge detector configurations in the same production process
  - make better use of importance sampling for CORSIKA
  - reduce file sizes
  - increase simulation speed and production-handling speed
- need to increase computing resources
  - use and contribute to grids more massively (e.g. GLOW)

using simulation data
- first data analyses with IceCube-9 (John Pretz, UMD)

using simulation data
- simulation verification: comparison with experimental data at basic levels (a toy comparison sketch follows below)
  - simulation agreement at trigger level and at the various filter levels
- ice properties: treatment and implementation
- IceCube and AMANDA have different issues:
  - PMT response simulation (1 SPE)
  - OM response and waveforms: TWR and DOM
  - individual OM sensitivities induced by local hole ice
- AMASIM still in use for current AMANDA analyses
- detector simulation quality still improving
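A toy sketch of the comparison pattern at trigger level: fill the same binned distribution for experimental data and weighted simulation, then inspect the bin-by-bin ratio (variable and binning are illustrative):

    #include <cstdio>
    #include <vector>

    // Bin-by-bin data/MC ratio; bins with no MC prediction are reported as 0.
    std::vector<double> data_mc_ratio(const std::vector<double>& data,
                                      const std::vector<double>& mc) {
      std::vector<double> r(data.size(), 0.0);
      for (std::size_t i = 0; i < data.size(); ++i)
        if (mc[i] > 0.0) r[i] = data[i] / mc[i];
      return r;
    }

    int main() {
      // Toy trigger-level histograms (illustrative contents).
      std::vector<double> data = {120, 340, 210, 55};
      std::vector<double> mc   = {115, 352, 198, 60};
      for (double x : data_mc_ratio(data, mc)) std::printf("%.2f ", x);
      std::printf("\n");
      return 0;
    }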