Batch Production and Monte Carlo + CDB work status Janusz Martyniak, Imperial College London MICE CM37 Analysis, Software and Reconstruction.

MICE Data Handling on the Grid
- Raw Data Mover – upload raw data files onto safe tape storage as soon as the data have been written out by the DAQ.
- MAUS reconstruction:
  - offline reconstruction (triggered by raw data upload to RAL Castor),
  - batch reconstruction (reprocessing).
22/02/2014 J. Martyniak, MICE CM38

Raw Data Mover
- No major changes since the last run.
- To simplify the data-moving workflow, a copy to the PPD SE will no longer be made.
- A special program written at IC – the Grid Distribution Agent (GDA, aka Ray’s tool) – can be used to automatically download files from RAL Castor as soon as they appear there. It also works for the reco data and is used at IC.
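The GDA's "download files as soon as they appear" behaviour is essentially a polling loop. A minimal sketch of one polling pass, where `list_remote` and `download` are placeholders standing in for the real grid operations (listing the Castor directory and performing the transfer):

```python
def sync_once(list_remote, download, seen):
    """One polling pass of a GDA-style agent: list the remote storage,
    fetch every file not previously seen, and remember it.
    list_remote() returns file names; download(name) performs the copy
    (e.g. a gsiftp transfer in the real tool)."""
    new_files = [name for name in list_remote() if name not in seen]
    for name in new_files:
        download(name)
        seen.add(name)
    return new_files
```

Calling `sync_once` from a timer loop reproduces the automatic-download behaviour; keeping the `seen` set outside the function lets it persist across passes.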

Raw Data Mover (1)
- Plan to move to SL6 – waiting for a box.
- The new grid UI has to be tested on SL6: the Python APIs used by the mover stopped working on SL5, so we had to downgrade the UI.
- If the APIs turn out to be unusable, the mover will have to be modified to use the grid command-line tools instead.
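The command-line fallback could shell out to the gLite tools from the mover's Python code. A sketch using `lcg-cr`, which copies a file to a storage element and registers the LFN in the LFC in one step (the VO name, destination SE, and exact option usage here are assumptions, not the mover's actual configuration):

```python
import subprocess

def build_lcg_cr_cmd(local_path, lfn, dest_se, vo="mice"):
    """Assemble an lcg-cr invocation: copy the file to the storage
    element and register the logical file name in the LFC."""
    return ["lcg-cr", "--vo", vo,
            "-d", dest_se,            # destination storage element
            "-l", lfn,                # logical file name for the LFC
            "file://%s" % local_path]

def upload_file(local_path, lfn, dest_se):
    """Run the command; lcg-cr prints the GUID of the new replica."""
    out = subprocess.run(build_lcg_cr_cmd(local_path, lfn, dest_se),
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()
```

Separating command construction from execution keeps the grid-specific part testable without a grid UI installed.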

Raw Data Mover Workflow (as a reminder)
[Workflow diagram: the DATE Event Builder in the Control Room produces tar files; the Data Mover computes md5sum checksums and copies each file via gsiftp to Castor tape and a disk SE, registering the run number, LFN, file size and adler32 checksum in the MICE Metadata DB and the LFC.]
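The workflow records an md5sum and an adler32 checksum for each tar file before upload. A minimal sketch of that checksum step (the chunked reading and the 8-hex-digit adler32 rendering are my assumptions about the implementation, not the mover's actual code):

```python
import hashlib
import zlib

def file_checksums(path, chunk_size=1 << 20):
    """Compute the md5 and adler32 checksums of a file in one pass,
    reading in 1 MiB chunks so large raw-data tars fit in memory."""
    md5 = hashlib.md5()
    adler = 1  # adler32 seed value
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            adler = zlib.adler32(chunk, adler)
    # adler32 rendered as 8 lowercase hex digits, the usual grid-catalogue form
    return md5.hexdigest(), "%08x" % (adler & 0xFFFFFFFF)
```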

Raw Data Mover Summary
- Developed in summer 2011, running since Oct 2011.
- Successfully used to store data on Castor to date.
- Uses a Python API which requires a rather old gLite UI; a newer UI broke the APIs, which to my knowledge were never fixed.
- Have to test a new SL6 UI and move the code to new SL6 hardware.
- Should use a robot certificate; currently my personal certificate is used.

MAUS Reconstruction on the Grid
- Grid MAUS installation via CVMFS at RAL, propagated to Brunel and Imperial.
- Builds and compiles on SL6; the last build used is MAUS_v0.6.0.
- All data were reprocessed with this version using 2 sites within 2 weeks.
- No major changes to the framework (see my talk for CM37).

MAUS Grid Reconstruction - Reprocessing
- MAUS – built and tested on an SL6 node at IC.
- Problem with installation on the Grid: the installation procedure requires a Grid job submission to RAL T1.
- The s/w, once installed, automatically propagates to the client sites (Brunel, Imperial).
- RAL made some changes to the CVMFS and….

MAUS Grid Reprocessing (1)
- My installation jobs never finish, and I can’t look at the output.
- Investigation at RAL shows that the file permissions within MAUS are not set properly (this should have been done by the installation job).
- For now we have corrected this manually and will test the propagated MAUS software at IC and Brunel.
- We will further investigate the problem with the installation job using a different MAUS distro location while running the reprocessing in parallel.
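The manual fix amounts to walking the installed tree and setting world-readable permissions. A sketch of what such a repair could look like (the exact modes the installation job should set are an assumption; this is not the script actually run at RAL):

```python
import os
import stat

def fix_permissions(root):
    """Make an installed software tree world-readable: directories
    become 0755, files become 0644, except files that already carry the
    owner execute bit, which become 0755 to stay runnable."""
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, 0o755)
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IXUSR:
                os.chmod(path, 0o755)
            else:
                os.chmod(path, 0o644)
```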

Configuration Database project takeover status
- To make it easier to work on this complex project, a fully functional version of the system was installed from scratch on a laptop:
  - Postgres DBMS installed and configured,
  - Tomcat installed and configured,
  - MICE CDB (Java) compiled from source and deployed on Tomcat as a web service,
  - Java and Python clients tested.

Configuration Database (contd.)
- The portable version allows potential users to test their software against a “private” DB, even with no Internet connection.
- It allows me and others to write and test code without interacting with the preprod DB.
- It helps to resolve CDB-related issues locally.
- The installation turned out to be a simple process, so I can help anyone willing to get one on their own computer (Mac, Linux).
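Switching client code between the "private" laptop instance and the shared server only requires pointing it at a different endpoint. A hypothetical helper illustrating the idea (the URLs and the `MICE_CDB_SERVER` environment variable are illustrative assumptions, not part of the real CDB client):

```python
import os

# Candidate endpoints: the local Tomcat deployment and a shared server.
# Both URLs are assumptions for illustration.
DEFAULT_SERVERS = {
    "local": "http://localhost:8080/cdb",
    "preprod": "http://preprodcdb.example.ac.uk/cdb",
}

def cdb_endpoint(target=None):
    """Resolve the CDB web-service URL: an explicit target wins, then
    the MICE_CDB_SERVER environment variable, then the local copy, so
    the same client code runs offline against the private DB."""
    if target in DEFAULT_SERVERS:
        return DEFAULT_SERVERS[target]
    return os.environ.get("MICE_CDB_SERVER", DEFAULT_SERVERS["local"])
```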

Configuration DB work in the near future
- Request from Pierrick to provide an API to store/retrieve data for the cooling channel.
- The relevant table exists in the DB but might need some tweaking.
- The Web Service interface is deployed and a Python API exists.
- This requires creating a façade to interface the data format provided by Pierrick to the existing Python API.
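Such a façade would translate the incoming cooling-channel records into the shape the existing Python API accepts. A sketch in which the input format, the field names, and the `set_coolingchannel` call are all illustrative assumptions, since the actual formats are not given here:

```python
class CoolingChannelFacade:
    """Hypothetical façade: adapts cooling-channel data (assumed here
    to arrive as a list of per-magnet dicts) to the flat record an
    existing CDB-style Python client is assumed to expect."""

    def __init__(self, api):
        self._api = api  # existing Python API client instance

    def store(self, magnets):
        record = [{"magnet": m["name"],
                   "polarity": m.get("polarity", 1),
                   "coils": m["coils"]}   # e.g. (coil name, current) pairs
                  for m in magnets]
        return self._api.set_coolingchannel(record)
```

The façade keeps the translation logic in one place, so neither Pierrick's code nor the existing API needs to change if the format evolves.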

Configuration DB (contd.)
- Create a new table to store the relation between the batch iteration number used by the Grid reconstruction system and MAUS.
- This will allow the data to be reprocessed to accommodate changes not related to the MAUS version itself.
- The batch reconstruction number will determine the datacards used by MAUS; the datacards will be stored in the DB.
- See:
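The proposed relation can be modelled roughly as follows. This uses sqlite purely for illustration (the real CDB runs on Postgres, and the table and column names here are assumptions):

```python
import sqlite3

# Illustrative model of the proposed table linking a batch iteration
# number to the MAUS version and the datacards it should run with.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE batch_iteration (
                    iteration    INTEGER PRIMARY KEY,
                    maus_version TEXT NOT NULL,
                    datacards    TEXT NOT NULL)""")

def register_iteration(iteration, maus_version, datacards):
    """Record which MAUS version and datacards define this iteration."""
    conn.execute("INSERT INTO batch_iteration VALUES (?, ?, ?)",
                 (iteration, maus_version, datacards))

def datacards_for(iteration):
    """Return (maus_version, datacards) for a batch iteration number,
    i.e. what the Grid reconstruction should pass to MAUS."""
    return conn.execute(
        "SELECT maus_version, datacards FROM batch_iteration "
        "WHERE iteration = ?", (iteration,)).fetchone()
```

With this mapping, rerunning iteration N always reproduces the same datacards even when the MAUS version itself is unchanged.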