Short- to middle-term GRID deployment plan for LHCb

Short- to middle-term GRID deployment plan for LHCb
Eric van Herwijnen, Wednesday, 5 July 2000

Overview
- GRID: a unique opportunity to unify CPU resources in the collaborating institutes
- Need a common deployment plan
- Mini project
- EU GRID project submitted

LHCb WP8 application (F. Harris)
- MAP farm (300 CPUs) at Liverpool to generate 10^7 events over 4 months.
- Data volumes transferred between facilities (see the rate estimate below):
  - Liverpool to RAL: 3 TB (RAW, ESD, AOD, TAG)
  - RAL to Lyon/CERN: 0.3 TB (AOD and TAG)
  - Lyon to LHCb institutes: 0.3 TB (AOD and TAG)
  - RAL to LHCb institutes: 100 GB (ESD for systematic studies)
- Physicists run jobs at the regional centre, or move the AOD and TAG data to their local institute and run jobs there. Also copy the ESD for 10% of events for systematic studies.
- Formal production is scheduled from the start of 2002 to mid-2002 (EU schedule), BUT we are pushing ahead to gain experience so we can define project requirements.
- Aim for a 'production' run by end 2000.
- On the basis of this experience, we will give input on HEP application needs to the middleware groups.
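As a rough illustration of what these volumes imply, here is a back-of-envelope sketch of the sustained rates involved, assuming each sample is moved steadily over the 4-month production window (the steady-spreading assumption and decimal units are illustrative, not part of the plan):

```python
# Back-of-envelope sustained rates implied by the WP8 data volumes above.
# The volumes and the 4-month window come from the slide; the assumption
# that each sample is moved steadily over the whole window is illustrative.

SECONDS_PER_MONTH = 30 * 24 * 3600
window_s = 4 * SECONDS_PER_MONTH

transfers_tb = {
    "Liverpool -> RAL (RAW, ESD, AOD, TAG)": 3.0,
    "RAL -> Lyon/CERN (AOD, TAG)": 0.3,
    "Lyon -> LHCb institutes (AOD, TAG)": 0.3,
    "RAL -> LHCb institutes (ESD, 10%)": 0.1,
}

for route, tb in transfers_tb.items():
    rate_mb_s = tb * 1e6 / window_s   # decimal TB -> MB, per second
    print(f"{route}: ~{rate_mb_s:.2f} MB/s sustained")
```

Even the largest flow comes out well under 1 MB/s sustained; the benchmarking step in the mini project below would show how real transfers compare.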

LHCb GRID working group
- Members: Glenn Patrick, Chris Brew, Frank Harris, Ian McArthur, Nick Brook, Girish Patel, Themis Bowcock, Eric van Herwijnen
- First meeting: 14 June at RAL

Mini project (1)
- Install Globus 1.1.3 at CERN, RAL and Liverpool:
  - CERN: installation completed on a single Linux node (pcrd25.cern.ch) running RedHat 6.1; not yet available to the public
  - RAL: installed but not yet tested
  - MAP: being installed
- Members to get access to the respective sites:
  - Eric and Glenn have access to CERN
  - Eric, Glenn and Chris have access to RAL
  - Girish has access to MAP
- Timescale: 2nd week in July

Mini project (2)
- 1st checkpoint during the LHCb software week, Wednesday 5 July:
  - inform the collaboration
  - invite other members to join
- Run SICBMC at CERN, RAL and Liverpool:
  - Eric to prepare a script using Globus commands to run sicbmc and copy the data back to the host from which the job was submitted (a sketch follows below)
  - other partners to test the script from their sites
- Timescale: end of July
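A minimal sketch of what such a wrapper could look like, written here in Python for readability. The gatekeeper host names (apart from pcrd25.cern.ch), the sicbmc options and the copy-back command are placeholders to be replaced by whatever the Globus 1.1.3 installations actually provide; this is not the agreed LHCb script.

```python
# Illustrative wrapper for the "run sicbmc remotely and copy the data back"
# step.  The RAL/MAP gatekeeper hosts, the sicbmc command line and the
# copy-back tool are placeholders, not agreed names.
import subprocess
import sys

GATEKEEPERS = {
    "CERN": "pcrd25.cern.ch",          # node quoted on the installation slide
    "RAL": "ral-gatekeeper.example",   # placeholder hostname
    "MAP": "map-gatekeeper.example",   # placeholder hostname
}

def run_sicbmc(site, n_events, output_file):
    """Submit a sicbmc job to a remote site and fetch the output back."""
    host = GATEKEEPERS[site]

    # globus-job-run submits the job to the remote gatekeeper and streams
    # standard output back to the submitting host.
    submit = ["globus-job-run", host, "sicbmc", str(n_events), output_file]
    subprocess.run(submit, check=True)

    # Copy the produced data file back to the submitting host.  The exact
    # copy mechanism (globus-rcp, GASS, ftp, ...) is still to be decided.
    copy_back = ["globus-rcp", f"{host}:{output_file}", output_file]
    subprocess.run(copy_back, check=True)

if __name__ == "__main__":
    run_sicbmc(sys.argv[1], int(sys.argv[2]), sys.argv[3])
```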

Mini project (3)
- Verify that data can be shipped back to CERN and written onto tape:
  - use Globus commands
  - some scripting required to use SHIFT and update the bookkeeping database
- Timescale: early August
- Second checkpoint meeting at Liverpool in the 2nd half of August
- Benchmark sustained data transfer rates (see the timing sketch below)
- Report the completed project at the LHCb week in Milano
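A simple way to get a first benchmark number is to time whichever transfer command is chosen and divide by the file size; a sketch follows, in which the example command and file size in the comments are hypothetical:

```python
# Time an arbitrary transfer command and report the achieved rate.
# Nothing here is Globus-specific; the command and the number of bytes it
# moves are supplied by the caller.
import subprocess
import time

def measure_rate(cmd, nbytes):
    """Run a transfer command and return the achieved rate in MB/s."""
    start = time.time()
    subprocess.run(cmd, check=True)
    elapsed = time.time() - start
    return nbytes / 1e6 / elapsed

# Hypothetical example: a 2 GB SICBMC output file copied back from RAL.
# rate = measure_rate(["globus-rcp", "ralhost:/data/run42.dat", "run42.dat"],
#                     2_000_000_000)
# print(f"sustained rate ~ {rate:.1f} MB/s")
```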

Integration of the GRID between CERN, RAL and MAP (1)
- Eric, Chris, Glenn, Girish and Gareth attended a course at RAL on 21-22 June:
  - the Globus toolkit has a C API (easy to integrate with Gaudi)
  - there are Globus commands for remotely running scripts (or executables), recovering data from standard output, and saving/consulting metadata through LDAP
- The gains:
  - everyone uses the same executable
  - everyone uses the same scripts
  - data is handled in a uniform way
- Batch systems (LSF, PBS, Condor) to be discussed in Lyon on 30 June

Integration of the GRID between CERN, RAL and MAP (2)
- Explore the use of LDAP for our bookkeeping database (a query sketch follows below):
  - an API in C would solve the current Oracle -> Gaudi interface problem
  - it would simplify DB updating by MC production
  - everybody is heading in this direction
  - Oracle has an LDAP server product; someone should investigate it
- Java job submission tools should be modified to create Globus jobs
- Timescale: October (to be done in parallel with the current NT production)
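To make the LDAP option concrete, here is a minimal sketch of what a bookkeeping query could look like from a script, using the python-ldap bindings (from Gaudi the same query would go through the LDAP C API). The server host, directory tree, object class and attribute names are invented for illustration; no bookkeeping schema has been agreed yet.

```python
# Illustrative LDAP query against a hypothetical bookkeeping directory.
# Host, base DN, filter and attribute names are made up for the example.
import ldap

conn = ldap.initialize("ldap://bookkeeping.example.cern.ch")
conn.simple_bind_s()  # anonymous read-only bind

# Find all Monte Carlo jobs of a given event type and list where their
# output datasets ended up.
results = conn.search_s(
    "ou=jobs,o=lhcb",
    ldap.SCOPE_SUBTREE,
    "(&(objectClass=mcJob)(eventType=minbias))",
    ["jobId", "nEvents", "datasetLocation"],
)

for dn, attrs in results:
    print(dn, attrs.get("nEvents"), attrs.get("datasetLocation"))

conn.unbind_s()
```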

Integration of the GRID between CERN, RAL and MAP (3)
- The RAL NT farm is to be converted to Linux this autumn
- MAP uses Linux
- CERN have been asked to install Globus on the Linux batch service (Tony Cass)
- Should LHCb abandon NT for MC production? To be discussed with the collaboration at the next software week
- A full MC production run using Globus is aimed for in December

Extension to other institutes
Establish a "GRID" architecture:
- Intel PCs running Linux RedHat 6.1
- LSF, PBS or Condor for job scheduling
- Globus 1.1.3 for managing the GRID
- LDAP for our bookkeeping database
- Java tools for connecting production, bookkeeping & GRID
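As an illustration only, a candidate institute could check itself against this list with a small script such as the one below; the probed command names (bsub, qsub, condor_submit, globus-job-run, java) are the usual front-ends of the listed systems, and the RedHat release check is a simplifying assumption rather than an agreed procedure.

```python
# Rough self-check of a candidate LHCb GRID node against the architecture
# on this slide.  Purely illustrative.
import shutil

def check_node():
    # Linux RedHat 6.1
    try:
        with open("/etc/redhat-release") as f:
            print("OS:", f.read().strip())
    except FileNotFoundError:
        print("OS: not a RedHat system")

    # One of LSF (bsub), PBS (qsub) or Condor (condor_submit) for scheduling
    schedulers = [s for s in ("bsub", "qsub", "condor_submit") if shutil.which(s)]
    print("Batch front-ends found:", schedulers or "none")

    # Globus client commands and a Java runtime for the submission tools
    print("Globus job submission available:", bool(shutil.which("globus-job-run")))
    print("Java runtime available:", bool(shutil.which("java")))

if __name__ == "__main__":
    check_node()
```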