Bulk production of Monte Carlo

Presentation transcript:

Bulk production of Monte Carlo
MICE Collaboration
Dimitrije Maletic, Institute of Physics, University of Belgrade
MICE Project Board, 7th of March 2017.

Outline
- Software used for MICE MC simulations
- MCproduction on the grid
- Information about finished MCproductions
- Resources available for MCproduction
- Conclusions
MICE Project Board, 7th of March 2017.

Divide and conquer Monte Carlo
The full MICE experiment simulation is split into manageable components for speed and versatility:
- Target hadroproduction simulation
- MICE beam line simulation
- MICE experiment simulation
MICE runs with many beam line and experimental configurations and needs a wide range of simulations:
- Tunable currents in the dipoles, quads and solenoids
- Choice of proton absorber thickness
- Variable absorber material and diffuser thickness
- Adjustable geometry during commissioning (detectors move)
This requires a robust infrastructure to handle many MC production jobs.
MICE Project Board, 7th of March 2017.

Target hadroproduction and G4BeamLine
All secondary particle species are sampled from a parent distribution originally produced by tracking 1.97 × 10^9 POT using G4Beamline:
- Geant4 physics-based simulation.
- It would be very inefficient to simulate from the target each time (see the sketch below).
G4BeamLine simulation steps:
1. Sample p+, π+, π−, e+, e−, γ from the secondary parent distribution.
2. Match through the quad (Q1–Q3) fields.
3. Most pions decay in flight in the decay solenoid between the two bending magnets. Propagation through the fields and scattering are handled by G4Beamline; particles that hit the magnet bores are killed.
4. Store a file with a list of particles and their parameters 1 m downstream of D2.
MICE Project Board, 7th of March 2017.
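The stored parent distribution exists so that the expensive proton-on-target tracking is done once and then reused for every production. A minimal Python sketch of that reuse idea, assuming a hypothetical JSON file of stored secondaries (the file name, format and field names are illustrative; the real sampling is performed inside G4Beamline from its own output files):

```python
# Illustrative sketch only -- not MICE code.  It shows why a pre-generated parent
# distribution is reused: sampling a beam from a stored file is cheap compared with
# re-tracking ~2e9 protons on target for every production.
import json
import random

def load_parent_distribution(path):
    """Load the pre-generated list of target secondaries (one dict per particle)."""
    with open(path) as f:
        return json.load(f)

def sample_beam(parents, n_particles, species=("pi+", "pi-", "e+", "e-", "gamma")):
    """Draw n_particles of the wanted species from the parent distribution."""
    wanted = [p for p in parents if p.get("pid_name") in species]
    return random.sample(wanted, min(n_particles, len(wanted)))

if __name__ == "__main__":
    parents = load_parent_distribution("target_secondaries.json")  # hypothetical file
    beam = sample_beam(parents, n_particles=10000)
    print("sampled %d particles for one G4Beamline chunk" % len(beam))
```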

MICE Analysis User Software
Downstream of D2, the MICE Analysis User Software (MAUS) takes over:
- Uses the G4BL "chunks" as an input beam
- Geant4-based simulation
- Handles complex detector geometries and field interpolation
- Robust connection with the reconstruction software to ensure consistency between truth and digitized MC
MICE Project Board, 7th of March 2017.

MICE on the grid organisation
MICE MC production on the grid block schema (image by Henry Nebrensky).
- MICE on the grid has a CERN-like organisation into Tiers.
- MC production runs on sites supporting the MICE VO.
- MAUS or G4BL output (or a replica) is copied to the Imperial SE for http access.
- A copy of the aggregated output files should also be on RAL T1 tape.
MICE Project Board, 7th of March 2017.

About MICE MC production
MCproduction is regularly discussed at the Grid and Data Mover meetings, as part of the Software and Computing Infrastructure project. Talks are also given at Analysis meetings and MICE CMs.
The MC production on the grid starts with a request on the request page. The production manager (me) should be informed about the request. I discuss the request with Durga, then insert the entry about the MC production into the CDB and submit the grid jobs (this needs a valid grid certificate and membership of the MICE VOMS).
The MCproduction uses the MAUS software installed on CVMFS on the grid. The necessary information for an MC simulation is the http/srm list of G4Beamline chunks, the MAUS software version, and the simulation datacard details.
Information about MC production and output links is placed on the MCproduction page, linked from the MICE Software home page. A sketch of this workflow follows below.
MICE Project Board, 7th of March 2017.
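A minimal sketch of this workflow in Python, for orientation only: it is not the mice-mc-batch-submission code from Launchpad, and the helper names, request fields and chunk-list format are assumptions made for illustration.

```python
# Hypothetical workflow sketch: request -> CDB MCSerialNumber entry -> one grid job
# per G4Beamline chunk, with output going to the Imperial SE.  Not MICE code.
from dataclasses import dataclass
from typing import List

@dataclass
class ProductionRequest:
    mc_serial_number: int      # CDB entry inserted before submission (hypothetical field)
    maus_version: str          # e.g. "MAUS-v2.8.4", as installed on CVMFS
    g4bl_chunks: List[str]     # http/srm URLs of the G4Beamline chunks
    datacard: str              # simulation datacard contents

def submit_grid_job(chunk_url: str, request: ProductionRequest) -> str:
    """Placeholder for the real grid submission, which needs a valid grid
    certificate and MICE VOMS membership."""
    job_id = "job-for-" + chunk_url.rsplit("/", 1)[-1]
    print("submitting %s with %s" % (job_id, request.maus_version))
    return job_id

def submit_production(request: ProductionRequest) -> List[str]:
    # One grid job per G4Beamline chunk.
    return [submit_grid_job(url, request) for url in request.g4bl_chunks]

if __name__ == "__main__":
    req = ProductionRequest(
        mc_serial_number=1,                                       # hypothetical value
        maus_version="MAUS-v2.8.4",
        g4bl_chunks=["http://example.org/10_140_M3/chunk_001"],   # hypothetical URL
        datacard="geometry_download_run_number = 8154",
    )
    print(submit_production(req))
```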

MCproduction on the grid information pages (1/2)
- Information about finished and running MCproductions on the grid: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProduction
- Information about (and examples of) MCproduction requests: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProductionRequests
- Information about MCproduction request entries in the CDB: you can check MCSerialNumber entries at http://147.91.87.158/cgi-bin/get_mcserial (see the sketch after this list)
- The scripts used for MC production on the grid are available on Launchpad: https://launchpad.net/mice-mc-batch-submission
MICE Project Board, 7th of March 2017.
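As a quick way to inspect the MCSerialNumber listing linked above, the CGI page can simply be fetched over HTTP; a minimal sketch (the response format is not documented here, so the reply is just printed verbatim):

```python
# Minimal sketch: fetch the CDB MCSerialNumber listing from the get_mcserial CGI page.
import urllib.request

URL = "http://147.91.87.158/cgi-bin/get_mcserial"

with urllib.request.urlopen(URL, timeout=30) as response:
    print(response.read().decode("utf-8", errors="replace"))
```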

MCproduction request page
G4BeamLine libraries:
- 3-140+M3 (8161): 3_140_M3.json
- 6-140+M3 (8775): 6_140_M3.json
- 10-140+M3 (8773): 10_140_M3.json
- 3-170+M3 (8046): 3_170_M3.json
- 3-200+M3 (8163): 3_200_M3.json
- 6-200+M3 (8794): 6_200_M3.json
- 10-200+M3 (8818): 10_200_M3.json
- 3-240+M3 (8180): 3_240_M3.json
- 6-240+M3 (8873): 6_240_M3.json
- 10-240+M3 (8907): 10_240_M3.json
MAUS MC grid production does not simulate every run from the experiment, but one of the runs with the same beam line and cooling channel configuration.
7469 Emittance production (Done)
Cards:
-geometry_download_by "run_number"
-geometry_download_run_number 7469
-TOF_calib_by = "date"
-TOF_cabling_by = "date"
-TOF_calib_date_from = '2015-10-07 14:00:00'
-TOF_cabling_date_from = '2015-10-07 14:00:00'
-TOFscintLightSpeed = 134.5 # mm/ns
G4BL input: http://www.ppe.gla.ac.uk/~rbayes/MICE/batch_scripts/3182Mu.txt
Software: MAUS-v2.5.0
8154 140 MeV ECE FC44 (Done)
-geometry_download_run_number 8154
G4BL input: http://micewww.pp.rl.ac.uk/attachments/download/6378/3140_p1.txt
Software: MAUS-v2.6.0
A sketch of these cards as a datacard file follows below.
MICE Project Board, 7th of March 2017.
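MAUS reads its configuration from a Python datacard file, so the cards listed above translate almost directly into one. A minimal sketch for the 7469 emittance production, assuming the card names are used verbatim as datacard variables (the grouping into a single file is illustrative):

```python
# Sketch of the 7469 emittance-production cards as a MAUS Python datacard.
# Card names and values are those listed on the request page above.
geometry_download_by = "run_number"
geometry_download_run_number = 7469

TOF_calib_by = "date"
TOF_cabling_by = "date"
TOF_calib_date_from = '2015-10-07 14:00:00'
TOF_cabling_date_from = '2015-10-07 14:00:00'

TOFscintLightSpeed = 134.5  # mm/ns
```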

MCproduction on the grid information pages (2/2)
Http access: http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/
33 productions done since March 2016.
- New MCproductions: with MCSerialNumber entries in the CDB.
- Old MCproductions: with no MCSerialNumber entries in the CDB, only in preprodcdb.
MICE Project Board, 7th of March 2017.

Information about finished MCproductions
- Number of jobs started since March 2016: 52811.
- Parallel running jobs: mean 45.5, maximum 1382.
- Storage space used for MCproduction: 642 GB.
- There were 33 productions / 365 days.
- Simulations were processed in a few hours, up to 10 h a day (depending on the queues at the grid sites).
- Processing time for MCproduction is not an issue.
MICE Project Board, 7th of March 2017.

Resources available for MCproduction

Status of available slots for grid jobs for the MICE VO (MICE VO view, 27.02.2017 12:00):

Running  Waiting  Total  Free  Queue
-------------------------------------------------------------------------------
      0        0      0   216  arc-ce01.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   208  arc-ce02.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   212  arc-ce03.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   185  arc-ce04.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   728  ce-01.roma3.infn.it:8443/cream-pbs-fastgrid
      0        0      0   728  ce-01.roma3.infn.it:8443/cream-pbs-grid
      ...
      0        0      0     6  ceprod08.grid.hep.ph.ic.ac.uk:8443/cream-sge-grid.q
      0        0      0   140  cream2.ppgrid1.rhul.ac.uk:8443/cream-pbs-mice
      0        0   1169   226  dc2-grid-21.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0   1080    77  dc2-grid-22.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0     38    14  dc2-grid-25.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0   1029   576  dc2-grid-26.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0    764     6  dc2-grid-28.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0      0   764  hepgrid2.ph.liv.ac.uk:2811/nordugrid-Condor-grid
      0        0      0   461  heplnv146.pp.rl.ac.uk:2811/nordugrid-Condor-grid
   4047      250   4297   985  svr009.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4036      244   4280   996  svr010.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4050      255   4305   982  svr011.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4048      270   4318   984  svr019.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d

Total (Running/Waiting): 21281   Free: 18718   MCproduction mean of parallel jobs: 45

Available storage space for the MICE VO (GB):

Free    Used    Reserved  Free      Used      Reserved  Tag             SE
Online  Online  Online    Nearline  Nearline  Nearline
224333  833672      0         0         0         0     -               dc2-grid-64.brunel.ac.uk
...
106387   42797      0         0         0         0     -               gfe02.grid.hep.ph.ic.ac.uk
 51770    8225      0         0         0         0     -               heplnx204.pp.rl.ac.uk
...
177832   82794      0         0         0         0     -               se01.dur.scotgrid.ac.uk
 12018   13338      0         0         0         0     -               se2.ppgrid1.rhul.ac.uk
  6589    4593  11183         0         0         0     -               srm-mice.gridpp.rl.ac.uk
  6589    4593  11183         0         0         0     MICE_MISC_TAPE  srm-mice.gridpp.rl.ac.uk
  6589    4593  11183         0         0         0     MICE_RECO       srm-mice.gridpp.rl.ac.uk
  6589    4593  11183     14314     15747     30061     -               srm-mice.gridpp.rl.ac.uk
  6589    4593  11183     14314     15747     30061     MICE_RAW_TAPE   srm-mice.gridpp.rl.ac.uk
  6589    4593  11183      6602      4002     10604     -               srm-mice.gridpp.rl.ac.uk
  6589    4593  11183      6602      4002     10604     MICE_RAW_TAPE2  srm-mice.gridpp.rl.ac.uk
...

Totals: Free 626526 GB, Used 1254470 GB.
Imperial SE: Free 106387 GB, Used 42797 GB; MCproduction usage 640 GB.
At RAL (srm-mice only): Disk used: 4,593 GB (41.08 %), Disk total: 11,183 GB, Tape used: 20,054 GB, Tape to go to 300 TB.

MICE Project Board, 7th of March 2017.

Conclusions
- MAUS MC production, using the already-made G4BL libraries, has been done on request with no delays since March last year. As of last month, G4BeamLine production has been restarted.
- 33 MCproductions were processed on the grid, using 642 GB of storage space. Processing time for MCproduction is not an issue.
- Available processing power and storage capacity are not a bottleneck for running many more productions per year (more than an order of magnitude more).
- My availability is not an issue: I have permanent employment. The management of my institute and my colleagues from my laboratory are very happy that we are part of an international collaboration on a UK-based experiment.
- We expect, and are ready for, an increase in requests for MC production after the February/March ISIS user run cycle finishes, using the new G4BeamLine libraries and MAUS-v2.8.4 on CVMFS.
MICE Project Board, 7th of March 2017.

THANK YOU!