MC production on the grid


MICE Collaboration
MC production on the grid
Dimitrije Maletic, Institute of Physics, University of Belgrade
MICE collaboration meeting 47, 13th of February 2017

Outline
- Introduction
- About MC production on the grid
- MC production on the grid information pages
- Page about done or running MC productions
- MC production request page
- MC production scripts on Launchpad
- HTTP access to MC productions
- Restarting of the grid G4BL production

Introduction (1/2)
MC production on the grid comprises G4BeamLine (G4BL) simulation and Geant4 detector simulation using MAUS.
- MAUS G4 detector simulation production, using already generated G4BL libraries, has been done on request with no delays since March last year.
- As of last week, G4BeamLine production has been restarted on the grid, and grid production of Version 4 of the G4BeamLine libraries can be started.
- So far, one G4BL library with 1k chunks has been produced.

Introduction (2/2)
MC production on the grid starts with a request, which includes:
- the information to be included in the CDB datacard (used by the simulation)
- the list of G4BL input file paths
- the MAUS software version
- a comment
The production manager should be informed about the request. The production manager inserts an entry for the MC production into the CDB and submits the grid jobs. To carry out MC production, the production manager must hold a valid grid certificate and be registered in the MICE VOMS (a proxy sketch follows).
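For illustration, creating a grid proxy with the MICE VO attributes would look roughly like this (the 12-hour lifetime is an arbitrary choice, not taken from the talk):

    # Create a VOMS proxy carrying MICE VO attributes (12-hour lifetime shown)
    voms-proxy-init --voms mice --valid 12:00
    # Inspect the proxy and confirm the VO attributes before submitting jobs
    voms-proxy-info --all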

About MC production on the grid (1/2)
- MC production uses the MAUS software installed on CVMFS at grid sites that support the MICE VO.
- The MAUS bin/utilities directory contains a Python script for MC detector simulation: execute_MC.py.
- A <some>.sh wrapper script, with the settings for running in the grid environment, is passed to the grid (WMS) by submitting a grid job. This script executes execute_MC.py, passing it the MCSerialNumber and the run number (a sketch follows).
- The information needed to run the MC simulation is: an http/srm link to the list of G4BeamLine chunks (where the line number in this list represents the run number), the <SW_Version>, and the simulation datacard details.
- MAUS reads the datacard and accesses the CDB to get the appropriate configuration and calibrations.
- Each MC production request/start is tagged with a unique MCSerialNumber, which is a row number in the CDB.
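A minimal sketch of what such a wrapper might do; the script name, CVMFS path, and argument form are assumptions, not the actual production script:

    #!/bin/bash
    # Hypothetical grid wrapper: set up MAUS from CVMFS, then run one chunk.
    # Usage: run_mc.sh <MCSerialNumber> <RunNumber>
    MC_SERIAL=$1
    RUN_NUMBER=$2
    # Source the MAUS environment (CVMFS repository path is an assumption)
    source /cvmfs/mice.egi.eu/sw/MAUS-v2.6.0/env.sh
    # Run the detector simulation; the positional argument form is an assumption
    python "$MAUS_ROOT_DIR/bin/utilities/execute_MC.py" "$MC_SERIAL" "$RUN_NUMBER"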

About MC production on the grid (2/2)
- MAUS writes a tarball named <RunNumber>_mc.tar for each chunk in the G4BL list.
- All <RunNumber>_mc.tar files are stored on the Imperial SE for HTTP access: http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/
- To access the files from the grid, the LFN of <RunNumber>_mc.tar is /grid/mice/Simulation/MCproduction/<TenThousands>/<Century>/<MCSerialNumber>/<RunNumber>_mc.tar
- When all jobs are done, one final job gathers all the created <RunNumber>_mc.tar files and builds <MCSerialNumber>_mc.tar.
- The <MCSerialNumber>_mc.tar is to be stored on RAL Castor.
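Judging by the example path later in the talk (.../000000/000000/000027 for MCSerialNumber 27), <TenThousands> and <Century> appear to be zero-padded subdivisions of the serial number; a hedged sketch of building the LFN under that assumption:

    #!/bin/bash
    # Hypothetical helper: build the LFN directory for an MCSerialNumber,
    # assuming <TenThousands> = serial/10000 and <Century> = (serial%10000)/100,
    # each zero-padded to six digits (so serial 27 gives 000000/000000/000027).
    MC_SERIAL=$1
    TEN_THOUSANDS=$(printf "%06d" $((MC_SERIAL / 10000)))
    CENTURY=$(printf "%06d" $(((MC_SERIAL % 10000) / 100)))
    SERIAL_DIR=$(printf "%06d" "$MC_SERIAL")
    echo "/grid/mice/Simulation/MCproduction/${TEN_THOUSANDS}/${CENTURY}/${SERIAL_DIR}"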

MC production on the grid information pages
- Information about finished and running MC productions on the grid: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProduction
- Information about (and examples of) MC production requests: http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProductionRequests
- Information about MC production request entries in the CDB; MCSerialNumber entries can be checked at http://147.91.87.158/cgi-bin/get_mcserial
- The scripts used for MC production on the grid are available on Launchpad: https://launchpad.net/mice-mc-batch-submission

Page about done or running MC productions
Linked from the software page http://micewww.pp.rl.ac.uk/projects/maus/
http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProduction contains information about MC production and lists the productions already done.
Example: run number 8681, 200 MeV, status: Done
- MCSerialNumber: 27
- comment: Download by run number 8681. 200 MeV. G4BL 3200_p1.
- software: 2.6.0
- datacard: geometry_download_by="run_number" geometry_download_run_number=8681
- G4BL input list from http://micewww.pp.rl.ac.uk/attachments/download/6379/3200_p1.txt
- output:
  HTTP: http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/000000/000000/000027/
  LFN: /grid/mice/Simulation/MCproduction/000000/000000/000027
  SRM: srm://gfe02.grid.hep.ph.ic.ac.uk/pnfs/hep.ph.ic.ac.uk/data/mice/Simulation/MCproduction/000000/000000/000027
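As an illustration of retrieving a tarball from this production via two of the listed access methods (the tarball name 8681_mc.tar is an assumption following the <RunNumber>_mc.tar convention):

    # Via plain HTTP
    wget http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/000000/000000/000027/8681_mc.tar
    # Via SRM with gfal-copy (needs a valid VOMS proxy, see above)
    gfal-copy \
      srm://gfe02.grid.hep.ph.ic.ac.uk/pnfs/hep.ph.ic.ac.uk/data/mice/Simulation/MCproduction/000000/000000/000027/8681_mc.tar \
      file://$PWD/8681_mc.tar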

MC production request page
http://micewww.pp.rl.ac.uk/projects/analysis/wiki/MCProductionRequests
Example: 7469 Emittance production (Done)
Cards:
 -geometry_download_by "run_number"
 -geometry_download_run_number 7469
 -TOF_calib_by = "date"
 -TOF_cabling_by = "date"
 -TOF_calib_date_from = '2015-10-07 14:00:00'
 -TOF_cabling_date_from = '2015-10-07 14:00:00'
 -TOFscintLightSpeed = 134.5 # mm/ns
G4BL input: http://www.ppe.gla.ac.uk/~rbayes/MICE/batch_scripts/3182Mu.txt
Software: MAUS-v2.5.0
Example: 8154 140 MeV ECE FC44 (Done)
Cards:
 -geometry_download_run_number 8154
G4BL input: http://micewww.pp.rl.ac.uk/attachments/download/6378/3140_p1.txt
Software: MAUS-v2.6.0
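For illustration, the cards from the 7469 request could be collected into a datacard file along these lines (MAUS datacards are Python-style assignments; the file name and this wrapping are assumptions):

    # Hypothetical: write the 7469 request's cards into a datacard file
    cat > mc_7469_datacard.py <<'EOF'
    geometry_download_by = "run_number"
    geometry_download_run_number = 7469
    TOF_calib_by = "date"
    TOF_cabling_by = "date"
    TOF_calib_date_from = '2015-10-07 14:00:00'
    TOF_cabling_date_from = '2015-10-07 14:00:00'
    TOFscintLightSpeed = 134.5  # mm/ns
    EOF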

HTTP access to MC productions
http://gfe02.grid.hep.ph.ic.ac.uk:8301/Simulation/MCproduction/
- 30 productions done up to March 2016.
- New MC productions have MCSerialNumber entries in the CDB.
- Old MC productions have no MCSerialNumber entries in the CDB; they exist only in preprodcdb.

MC production scripts on Launchpad
The scripts used for MC production on the grid are available on Launchpad: https://launchpad.net/mice-mc-batch-submission
Short description: MICE MC simulation using G4BL JSON files as input.
- Grid job submission scripts: scripts that help submit and monitor the status of grid jobs, developed for MC simulations that use G4BeamLine JSON files as input to MAUS.
- There is also a detailed README file.
- Submission to the grid and monitoring of the jobs are done with bash scripts and a local SQLite database stored on the submitter's UI.
- A cron job is set up to start the checking of grid job status (a sketch follows).
- There are scripts to create job files and submit them, a script for checking job status, and some utilities.
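A hedged sketch of such a cron entry; the script name, paths, and schedule are assumptions rather than the repository's actual setup:

    # Hypothetical crontab entry on the submitter's UI: check grid job
    # status every 30 minutes and append the output to a log file.
    */30 * * * * $HOME/mice-mc-batch-submission/check_job_status.sh >> $HOME/logs/job_status.log 2>&1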

Restarting of the grid G4BL production
Durga created a script and sent the files needed to run the G4BL simulation locally:
- simulate_only_G4BL.bash -> main bash driver script; takes 3 arguments: deck-name, chunk-number, run-number
- simulate_mice_G4BL.py -> the Python script that runs G4BL
- run_g4bl.py -> the MAUS module map driver
- 3140M3.json -> the G4BL deck for generation
Usage: ./simulate_only_G4BL.bash deck-name chunk-number run-number
e.g. ./simulate_only_G4BL.bash 3140M3 1 8681
simulate_only_G4BL.bash was modified to work in the grid environment, along with the submission script.
Options for running the grid MC simulation (a sketch of the first option follows):
- Use several decks, 1k G4BL chunks each -> generate the "G4BL chunk lists" -> run MC against these chunk lists using the MCProductionRequest mechanism.
- Generate one G4BL chunk -> run the simulation using that chunk -> store the G4BL chunk and the simulation output in the appropriate places.
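A hedged sketch of the first option's chunk-list generation; the loop, output naming, and list format are assumptions built around the documented driver invocation:

    #!/bin/bash
    # Hypothetical: produce 1000 G4BL chunks for one deck and record them
    # in a chunk list (the list's line number later serves as the run number).
    DECK=3140M3
    RUN=8681
    LIST="${DECK}_chunk_list.txt"
    : > "$LIST"
    for CHUNK in $(seq 1 1000); do
        ./simulate_only_G4BL.bash "$DECK" "$CHUNK" "$RUN"
        # Output naming below is an assumption about the driver's behaviour
        echo "${DECK}_chunk_${CHUNK}.json" >> "$LIST"
    done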

THANK YOU!
MICE collaboration meeting 47, 13th of February 2017