Gridifying the LHCb Monte Carlo simulation system


Gridifying the LHCb Monte Carlo simulation system
Eric van Herwijnen, Joel Closier
Friday, 1 March 2002
WP8 Demo for DataGrid Review

Contents
- LHCb distributed computing environment
- Demonstration of DataGrid middleware:
  - Authentication
  - Job submission to DataGrid
  - Monitoring and control
  - Data replication
  - Resource scheduling – use of CERN MSS

LHCb
- LHC collider experiment: 10^9 events * 1 MB = 1 PB
- We have a distributed computing model
- We need to create, distribute and keep track of data automatically
- We need the Grid to handle our distributed computing environment
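For reference, the storage estimate follows directly from the event count and event size (decimal/SI units assumed here):

  $10^{9}\ \mathrm{events} \times 1\ \mathrm{MB/event} = 10^{15}\ \mathrm{bytes} = 1\ \mathrm{PB}$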

[Diagram: LHCb production workflow. Jobs are submitted remotely via a Web interface and executed on a farm; the data are transferred to the mass store, the bookkeeping database is updated, and a data quality check is performed.]

1. Authentication

grid-proxy-init
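A minimal sketch of how this step might look on the testbed User Interface, assuming a personal Grid certificate is already installed under ~/.globus; grid-proxy-init creates the short-lived proxy credential that the dg-job-* and GDMP commands below rely on, and the grid-proxy-info check is optional:

  # create a proxy credential; prompts for the certificate passphrase
  grid-proxy-init
  # optional: verify the proxy and see how long it remains valid (in seconds)
  grid-proxy-info -timeleft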

2. Job submission

dg-job-submit /home/evh/sicb/sicb/bbincl1600061.jdl -o /home/evh/logsub

bbincl1600061.jdl:

Executable = "script_prod";
Arguments = "1600061,v235r4dst,v233r2";
StdOutput = "file1600061.output";
StdError = "file1600061.err";
InputSandbox = {
    "/home/evhtbed/scripts/x509up_u149",
    "/home/evhtbed/sicb/mcsend",
    "/home/evhtbed/sicb/fsize",
    "/home/evhtbed/sicb/cdispose.class",
    "/home/evhtbed/v235r4dst.tar.gz",
    "/home/evhtbed/sicb/sicb/bbincl1600061.sh",
    "/home/evhtbed/script_prod",
    "/home/evhtbed/sicb/sicb1600061.dat",
    "/home/evhtbed/sicb/sicb1600062.dat",
    "/home/evhtbed/sicb/sicb1600063.dat",
    "/home/evhtbed/v233r2.tar.gz"
};
OutputSandbox = {
    "job1600061.txt",
    "D1600063",
    "file1600061.output",
    "file1600061.err",
    "job1600062.txt",
    "job1600063.txt"
};
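To make the role of each JDL attribute explicit, here is a stripped-down, annotated version of the job description. It is illustrative only: the "#" comment syntax is assumed, and only a couple of representative sandbox files are kept.

  # program run on the worker node, and its arguments
  Executable   = "script_prod";
  Arguments    = "1600061,v235r4dst,v233r2";
  # where the job's stdout and stderr are captured
  StdOutput    = "file1600061.output";
  StdError     = "file1600061.err";
  # files shipped from the User Interface to the worker node with the job
  InputSandbox = {"/home/evhtbed/script_prod", "/home/evhtbed/sicb/sicb1600061.dat"};
  # files returned to the User Interface when the job finishes
  OutputSandbox = {"file1600061.output", "file1600061.err"};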

3. Monitoring and control

3. Monitoring and control

dg-job-status
dg-job-cancel
dg-job-get-output
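A hedged sketch of a typical monitoring cycle; it assumes these commands accept -i <file> to read the job identifier(s) saved by dg-job-submit -o on the submission slide (the exact option names may differ):

  # query the current state of the job (submitted, scheduled, running, done, ...)
  dg-job-status -i /home/evh/logsub
  # abort the job if something has gone wrong
  dg-job-cancel -i /home/evh/logsub
  # once the job is done, retrieve the files listed in OutputSandbox
  dg-job-get-output -i /home/evh/logsub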

[Diagram: CERN testbed data flow. A job on a Compute Element writes its data to local disk, copies it to the CERN Storage Element with globus-url-copy, registers it with register-local-file and publishes it to the replica catalog at Nikhef; jobs on the rest of the Grid fetch the data to their local Storage Element with replica-get.]

4. Publish data on storage element

Copy data file to storage element:

globus-url-copy file:///${chemin}/L69999 gsiftp://lxshare0219.cern.ch/flatfiles/SE1/lhcb/L69999

Register stored data in the catalog:

/opt/globus/bin/globus-job-run lxshare0219.cern.ch /bin/bash -c "export GDMP_CONFIG_FILE=/opt/edg/lhcb/etc/gdmp.conf; /opt/edg/bin/gdmp_register_local_file -d /flatfiles/SE1/lhcb"

Publish catalog:

/opt/globus/bin/globus-job-run lxshare0219.cern.ch /bin/bash -c "export GDMP_CONFIG_FILE=/opt/edg/lhcb/etc/gdmp.conf; /opt/edg/bin/gdmp_publish_catalogue -n"
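The three steps lend themselves to a small wrapper script. This is a sketch only: the script name and variables are illustrative, and it assumes the data file name is passed as an argument and that ${chemin} holds the local directory as on the slide; the commands, host and paths are the ones shown above.

  #!/bin/bash
  # publish_to_se.sh <file>  -- hypothetical helper
  FILE=$1                                  # e.g. L69999
  SE_HOST=lxshare0219.cern.ch
  SE_DIR=/flatfiles/SE1/lhcb
  GDMP_ENV="export GDMP_CONFIG_FILE=/opt/edg/lhcb/etc/gdmp.conf"

  # 1. copy the local data file to the Storage Element
  globus-url-copy file:///${chemin}/${FILE} gsiftp://${SE_HOST}${SE_DIR}/${FILE}

  # 2. register the stored file in the GDMP catalogue on the SE
  /opt/globus/bin/globus-job-run ${SE_HOST} /bin/bash -c \
    "${GDMP_ENV}; /opt/edg/bin/gdmp_register_local_file -d ${SE_DIR}"

  # 3. publish the catalogue so the new replica becomes visible Grid-wide
  /opt/globus/bin/globus-job-run ${SE_HOST} /bin/bash -c \
    "${GDMP_ENV}; /opt/edg/bin/gdmp_publish_catalogue -n"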

5. Resource scheduling – use of CERN MSS

Copy output to MSS:

rfcp L1600061 /castor/cern.ch/lhcb/mc/L1600061

JDL to specify use of CERN MSS:

Executable = "script_prod";
Arguments = "1600061,v235r4dst,v233r2";
StdOutput = "file1600061.output";
StdError = "file1600061.err";
OutputSE = "lxshare0219.cern.ch";
InputSandbox = {
    "/home/evhtbed/scripts/x509up_u149",
    "/home/evhtbed/sicb/mcsend",
    "/home/evhtbed/sicb/fsize",
    "/home/evhtbed/sicb/cdispose.class",
    "/home/evhtbed/v235r4dst.tar.gz",
    "/home/evhtbed/sicb/sicb/bbincl1600061.sh",
    "/home/evhtbed/script_prod",
    "/home/evhtbed/sicb/sicb1600061.dat",
    "/home/evhtbed/sicb/sicb1600062.dat",
    "/home/evhtbed/sicb/sicb1600063.dat",
    "/home/evhtbed/v233r2.tar.gz"
};
OutputSandbox = {
    "job1600061.txt",
    "D1600063",
    "file1600061.output",
    "file1600061.err",
    "job1600062.txt",
    "job1600063.txt"
};
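As we understand the EDG Resource Broker, the OutputSE attribute is the hint that steers this job towards a Compute Element close to the named Storage Element (here the CERN SE in front of the MSS). For illustration, a sketch of the end-of-job staging step the production script might then perform; rfcp and the CASTOR directory are taken from the slide, while the wrapper and variable name are hypothetical:

  #!/bin/bash
  # hypothetical end-of-job fragment of script_prod
  DATAFILE=L1600061                                  # data file produced by the simulation
  # copy the output into the CERN mass storage system (CASTOR)
  rfcp ${DATAFILE} /castor/cern.ch/lhcb/mc/${DATAFILE}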

6. Inspection of replica catalog

Inspect the replica catalog (hosted at Nikhef) to check that the newly published files are visible to the rest of the Grid.

Data Quality

Analysis of the testbed data: histograms published on the Web.