Bulk production of Monte Carlo
MICE Collaboration
Dimitrije Maletic, Institute of Physics, University of Belgrade
MICE Project Board, 7th of March 2017.
Outline
- Introduction
- MCproduction on the grid information pages
- Information about finished MCproductions
- Resources available for MCproduction
- Conclusions
Introduction (1/3)
MC production is regularly discussed at the Grid and Data Mover meetings, as part of the Software and Computing Infrastructure project. Talks are also given at Analysis meetings and MICE CMs.
MC production on the grid includes G4BeamLine simulation and Geant4 cooling-channel simulation using MAUS.
MAUS MC production, using the already-made G4BL libraries, has been done on request with no delays since March last year.
As of last month, G4BeamLine production was restarted on the grid, and grid production of Version 4 of the G4BeamLine libraries has started.
Introduction (2/3)
MICE MC production on the grid: block schema (image by Henry Nebrensky).
- MC production runs on sites supporting the MICE VO.
- MAUS or G4BL output (or a replica) is copied to the Imperial SE for HTTP access.
- A copy of the aggregated output files should also be kept on RAL T1 tape.
Introduction (3/3)
MC production on the grid starts with a request on the request page. The production manager (me) is informed about the request and discusses it with Durga. I then insert an entry for the MC production into the CDB and submit the grid jobs.
To carry out MC production, the production manager has to have a valid grid certificate and be included in the MICE VOMS.
MC production uses the MAUS software installed on CVMFS on the grid. The information needed for an MC simulation is the http/srm list of G4BeamLine chunks, the MAUS software version, and the simulation datacard details. MAUS accesses the CDB to get the appropriate configuration and calibrations, as defined by the datacard.
Each request/start of MC production is tagged with a unique MCSerialNumber (the row number in the CDB table). Information about the MC production and links to the output are placed on the MCproduction page, linked from the MICE Software home page.
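As an illustration of the datacard mentioned above: a MAUS datacard is a small Python fragment read at job start-up. A minimal sketch, with illustrative values only (not the actual production configuration), could look like this:

```python
# Minimal MAUS datacard sketch -- illustrative values, not the real
# production configuration. MAUS reads this fragment to configure the
# simulation and to select the matching CDB configuration/calibrations.
simulation_geometry_filename = "Stage4.dat"  # geometry description to simulate
spill_generator_number_of_spills = 1000      # number of spills per job
verbose_level = 1                            # logging verbosity
# The http/srm URL of the G4BeamLine chunk used as beam input is
# supplied per grid job from the chunk list.
```

Each grid job would then combine a datacard like this with one G4BeamLine chunk from the http/srm list.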
MCproduction on the grid information pages (1/2)
- Information about finished and running MCproductions on the grid: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProduction
- Information about (and examples of) MCproduction requests: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProductionRequests
- MCproduction request entries in the CDB: MCSerialNumber entries can be checked at http://147.91.87.158/cgi-bin/get_mcserial
- The scripts used for MC production on the grid are available on Launchpad: https://launchpad.net/mice-mc-batch-submission
MCproduction on the grid information pages (2/2)
HTTP access: http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/
33 productions done up to March 2016.
- New MCproductions: with MCSerialNumber entries in the CDB.
- Old MCproductions: with no MCSerialNumber entries in the CDB, only in preprodcdb.
Information about finished MCproductions
Latest production: number of outputs stored per ten-minute interval:

Interval   Outputs
17:40-49   48
17:50-59   135
18:00-09   56
18:10-19   135
18:20-29   132
18:30-39   89
18:40-49   138
18:50-59   138
19:00-09   67
19:10-19   2
19:20-29   8
19:30-39   2
19:40-49   3
19:50-59   4
20:00-09   2
20:20-29   1
20:30-39   1
20:40-49   1

Number of jobs started since March 2016: 52811. Parallel running jobs: mean 45.5, maximum 1382.
Storage space used for MCproduction: 642 GB. Processing time for MCproduction is not an issue.

Jobs per Logging and Bookkeeping (LB) server used by the WMS:
 5632  https://lcglb01.gridpp.rl.ac.uk
 5643  https://lcglb02.gridpp.rl.ac.uk
 3185  https://svr024.gla.scotgrid.ac.uk
 6926  https://wmslb01.grid.hep.ph.ic.ac.uk
31378  https://wmslb02.grid.hep.ph.ic.ac.uk

Job-count statistics: 322, 45.45342, 111.53472, 14636, 1, 8, 1382.
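The per-interval tallies above can be cross-checked with a short script; this is just a sketch that recomputes the total and the peak interval from the numbers transcribed off the slide:

```python
# Outputs stored per ten-minute interval for the latest production
# (counts transcribed from the slide).
interval_counts = {
    "17:40-49": 48, "17:50-59": 135, "18:00-09": 56, "18:10-19": 135,
    "18:20-29": 132, "18:30-39": 89, "18:40-49": 138, "18:50-59": 138,
    "19:00-09": 67, "19:10-19": 2, "19:20-29": 8, "19:30-39": 2,
    "19:40-49": 3, "19:50-59": 4, "20:00-09": 2, "20:20-29": 1,
    "20:30-39": 1, "20:40-49": 1,
}

total = sum(interval_counts.values())
# max() with a key returns the first interval holding the largest count
# (18:40-49 here, tied with 18:50-59 at 138 outputs each).
peak_interval = max(interval_counts, key=interval_counts.get)

print(f"total outputs: {total}")          # -> 962
print(f"peak interval: {peak_interval}")  # -> 18:40-49
```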
Resources available for MCproduction
Status of available slots for grid jobs for the MICE VO (MICE VO view, 27.02.2017 12:00):

Running  Waiting  Total  Free  Queue
      0        0      0   216  arc-ce01.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   208  arc-ce02.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   212  arc-ce03.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   185  arc-ce04.gridpp.rl.ac.uk:2811/nordugrid-Condor-grid3000M
      0        0      0   728  ce-01.roma3.infn.it:8443/cream-pbs-fastgrid
      0        0      0   728  ce-01.roma3.infn.it:8443/cream-pbs-grid
      …
      0        0      0     6  ceprod08.grid.hep.ph.ic.ac.uk:8443/cream-sge-grid.q
      0        0      0   140  cream2.ppgrid1.rhul.ac.uk:8443/cream-pbs-mice
      0        0   1169   226  dc2-grid-21.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0   1080    77  dc2-grid-22.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0     38    14  dc2-grid-25.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0   1029   576  dc2-grid-26.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0    764     6  dc2-grid-28.brunel.ac.uk:2811/nordugrid-Condor-default
      0        0      0   764  hepgrid2.ph.liv.ac.uk:2811/nordugrid-Condor-grid
      0        0      0   461  heplnv146.pp.rl.ac.uk:2811/nordugrid-Condor-grid
   4047      250   4297   985  svr009.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4036      244   4280   996  svr010.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4050      255   4305   982  svr011.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d
   4048      270   4318   984  svr019.gla.scotgrid.ac.uk:2811/nordugrid-Condor-condor_q2d

Total (running/waiting): 21281; free: 18718.

Available storage space for the MICE VO (all sizes in GB):

Online (Free / Used / Reserved)  Nearline (Free / Used / Reserved)  Tag             SE
224333   833672   0              0       0       0                  -               dc2-grid-64.brunel.ac.uk
…
106387   42797    0              0       0       0                  -               gfe02.grid.hep.ph.ic.ac.uk
51770    8225     0              0       0       0                  -               heplnx204.pp.rl.ac.uk
…
177832   82794    0              0       0       0                  -               se01.dur.scotgrid.ac.uk
12018    13338    0              0       0       0                  -               se2.ppgrid1.rhul.ac.uk
6589     4593     11183          0       0       0                  -               srm-mice.gridpp.rl.ac.uk
6589     4593     11183          0       0       0                  MICE_MISC_TAPE  srm-mice.gridpp.rl.ac.uk
6589     4593     11183          0       0       0                  MICE_RECO       srm-mice.gridpp.rl.ac.uk
6589     4593     11183          14314   15747   30061              -               srm-mice.gridpp.rl.ac.uk
6589     4593     11183          14314   15747   30061              MICE_RAW_TAPE   srm-mice.gridpp.rl.ac.uk
6589     4593     11183          6602    4002    10604              -               srm-mice.gridpp.rl.ac.uk
6589     4593     11183          6602    4002    10604              MICE_RAW_TAPE2  srm-mice.gridpp.rl.ac.uk
…

Totals: free 626526 GB, used 1254470 GB.
Imperial SE: free 106387 GB, used 42797 GB.
At RAL (srm-mice only): disk used 41.08 % (4,593 GB of 11,183 GB total); tape used 20,054 GB; tape to grow to 300 TB.
Conclusions
- Processed 33 MCproductions on the grid, using 642 GB of storage space. Processing time for MCproduction is not an issue.
- The available processing power and storage capacity are not a bottleneck for running more productions per year (much more than an order of magnitude more).
- My availability is not an issue: I am permanently employed, and the management of my institute and my colleagues are very happy that we are part of an international collaboration on a UK-based experiment.
- Expecting, and ready for, an increase in MC production requests after the February/March ISIS user run cycle finishes, using the new G4BeamLine libraries and MAUS-v2.8.4 on CVMFS.
THANK YOU!