LHCb GRID Meeting, 11/12 Sept 2000 (T. Bowcock)



AGENDA
9:30  LHCb MC Production
 - Points: SICB Processing Req., Data Storage, Data Transfer, Quality Checking, Bookkeeping??
 - Discussion
 - Points of Action
11:00 Tea
12:00 Departure for

Action Items
Responsibilities
 - SICb: Eric, Martin (L), David S. (RAL)
Production
 - Bbar inclusive: 500k events in DST2 format without trigger requirements (RAL)
 - 2M events in mini DST2 format (600 kB/evt) after L0/L1 cuts (Liverpool)
CPU estimates
 - Need to generate about 20M events at 210 secs/event
 - Reconstruct 2M at 130 secs/event
 - DST2: min bias 2M at 100 secs/evt
 - Min bias reconstruction 2M at 50 secs/evt
 - Pileup 2M at 50 secs/evt
 - Total: 20M at 250 secs/event (6 months on MAP)
Target 500 events/file; merge files if necessary
Total data sample (2M evts) is 1.2 TB
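The CPU budget above can be sanity-checked with a little arithmetic. The farm size is not stated in the minutes; a MAP farm of roughly 300 CPUs is assumed here for illustration.

```python
# Back-of-envelope check of the quoted CPU budget: 20M events at
# 250 secs/event should take about 6 months on the MAP farm.
def cpu_months(n_events, secs_per_event, n_cpus):
    """Wall-clock months (of ~30 days) to process n_events on n_cpus CPUs."""
    seconds = n_events * secs_per_event / n_cpus
    return seconds / (30 * 24 * 3600)

# Farm size (~300 CPUs) is an assumption, not from the slides.
print(f"{cpu_months(20e6, 250, 300):.1f} months")  # ~6.4 months
```

With ~300 CPUs the estimate comes out near the 6 months quoted on the slide; the 1.2 TB figure likewise follows from 2M events at 600 kB/evt.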

Action Items
Data storage/transfer
 - Data rate generated: 60 kB/sec
 - Network bandwidth Liverpool-CERN: 500 kB/s overnight (50% duty cycle)
 - Bandwidth Liverpool-RAL to be measured
 - Data at MAP (disk); copy to RAL; copy RAL to CERN, where central bookkeeping is updated
 - Data sample corresponds to 50 x 20 GB tapes (5 kSFr)
 - Try direct transfer from Liverpool to CERN
 - Eric (CERN), Martin (Liverpool), Dave S (RAL)
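The quoted rates imply how long shipping the full sample would take; a rough estimate (illustrative only, using the slide's numbers) is:

```python
# Transfer-time estimate for the 1.2 TB sample over the Liverpool-CERN
# link: 500 kB/s, usable overnight only (50% duty cycle).
def transfer_days(data_bytes, rate_bytes_per_s, duty_cycle):
    """Days to move data_bytes at the given rate and duty cycle."""
    effective_rate = rate_bytes_per_s * duty_cycle
    return data_bytes / effective_rate / 86400

sample = 1.2e12   # 1.2 TB total sample
rate = 500e3      # 500 kB/s
print(f"{transfer_days(sample, rate, 0.5):.0f} days")  # ~56 days
```

Note that the effective 250 kB/s link comfortably exceeds the 60 kB/s production rate, so transfers can keep pace with generation even if the full-sample copy is slow.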

Data Quality Checking
 - SICbCHK now exists (summer student)
 - Histograms available, viewable from the web
 - AJ (data quality manager) responsible
 - No formal responsibility for data quality checking at Liverpool
 - Random numbers: seeds? Chris to contact Eric with the DELPHI scheme.
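The seed question is left open in the minutes, and the DELPHI scheme referred to is not described. As a purely hypothetical sketch, one common approach is to derive each job's seed deterministically from a production name and job index, so results are reproducible and jobs never share a seed:

```python
# Hypothetical per-job seeding sketch (NOT the DELPHI scheme, which is
# not given in the minutes): hash (production, job index) to a 31-bit seed.
import hashlib

def job_seed(production_id: str, job_index: int) -> int:
    """Deterministic, well-separated 31-bit seed for one production job."""
    key = f"{production_id}:{job_index}".encode()
    digest = hashlib.sha256(key).digest()
    return int.from_bytes(digest[:4], "big") % (2**31 - 1)

seeds = [job_seed("bbar-inclusive-2000", i) for i in range(4)]
assert len(set(seeds)) == 4  # distinct seeds for distinct jobs
```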

Bookkeeping
 - For production, use the same mechanism as for generating scripts for submitting batch jobs
   - Event type, size of dataset, etc.
 - Send back log files (at least in the short term), incl. a new log file with bookkeeping information
 - Update the database at CERN on receipt of the data
 - Database has an entry with the name of the dataset at the place where it was generated
 - Joint development needed (Eric and new postdoc at Liverpool)
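The update-on-receipt step can be sketched as follows. The table and field names are invented for illustration; the real CERN bookkeeping schema is not given in the minutes.

```python
# Minimal sketch of the bookkeeping update: on receipt of a dataset at
# CERN, record its name and the site where it was generated.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE datasets (
    name TEXT PRIMARY KEY,   -- dataset name assigned at the production site
    site TEXT,               -- where it was generated (e.g. Liverpool, RAL)
    event_type TEXT,
    n_events INTEGER)""")

def register_dataset(conn, name, site, event_type, n_events):
    """Called when the data (and its log file) arrive at CERN."""
    conn.execute("INSERT INTO datasets VALUES (?, ?, ?, ?)",
                 (name, site, event_type, n_events))
    conn.commit()

register_dataset(conn, "bbar_dst2_000001", "Liverpool", "bbar-inclusive", 500)
```

A 500-events-per-file dataset, as targeted on slide 3, would then be one row per file in such a table.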