LHCb Development. Glenn Patrick, Raja Nandakumar. GridPP18, 20 March 2007.

2 LHCb, December 2006. Detector layout diagram: VELO, RICH1, Magnet, Trackers, RICH2, Calorimeters and Muon system, with the p-p collision point at the VELO.

3 LHCb Computing Model. Trigger chain: 40 MHz collisions, Level-0 (hardware) down to 1 MHz, Level-1 (software) down to 40 kHz, then the HLT (software); the diagram also quotes 2 kB/event and 60 MB/s.

4 DIRAC Production and Analysis. Architecture diagram: user interfaces (GANGA UI, user CLI, job monitor, bookkeeping query web page, FileCatalog browser) sit on top of the DIRAC services (Job Management Service, JobMonitorSvc, JobAccountingSvc with its AccountingDB, InformationSvc, FileCatalogSvc, MonitoringSvc, BookkeepingSvc); jobs from the production manager and from users are dispatched to the DIRAC resources: the DIRAC CE, DIRAC site agents (CE 1, CE 2, CE 3), the LCG Resource Broker, and DIRAC storage (disk files accessed via gridftp, bbftp, rfio). Details in the next talk.
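
To make the user-facing path concrete (GANGA or a command-line client talking to the DIRAC Job Management Service), here is a minimal job-submission sketch in the style of the DIRAC client API quoted in contemporary DIRAC talks; the application version, file names, LFN and the setInputSandbox/setInputData/setOutputSandbox/submit calls are illustrative assumptions rather than the documented interface.

from DIRAC.Client.Dirac import *   # client API as quoted in DIRAC talks of the period

# Build a DaVinci analysis job. Beyond Job(), Dirac() and setApplication(),
# everything below is an assumption for illustration: the version string,
# file names and the extra setter/submit methods are hypothetical placeholders.
job = Job()
job.setApplication('DaVinci', 'v17r6')                        # hypothetical version
job.setInputSandbox(['myDaVinci.opts'])                       # assumed: ship the options file with the job
job.setInputData(['LFN:/lhcb/production/DC06/example.dst'])   # assumed: placeholder LFN, resolved via the FileCatalogSvc
job.setOutputSandbox(['DVHistos.root'])                       # assumed: bring histograms back to the user

dirac = Dirac()
job_id = dirac.submit(job)                                    # assumed submission call to the Job Management Service
print('Submitted DIRAC job %s' % job_id)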

5 DIRAC Workload Management. Job-flow diagram: the Job Receiver accepts the job JDL and input sandbox and records them in the JobDB; the Data Optimizer looks up replicas in the LFC (getReplicas) and places the job in the Task Queue; the Agent Director checks queued jobs and submits Pilot Jobs through the LCG Resource Broker to site CEs, while the Agent Monitor checks the pilots. On the worker node the Pilot Agent obtains the user proxy from the WMS Admin service (getProxy), asks the Matcher for a matching job JDL, retrieves the sandbox (getSandbox) and forks a Job Wrapper, which executes the user application under glexec; output data are uploaded to the SE, with a request placed on the VO-box (putRequest) as fall-back, and the Job Monitor tracks the job state. DIRAC services, LCG services and the workload on the WN are drawn as separate layers.
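
The pull model in this diagram (pilots submitted through the Resource Broker; on the worker node the Pilot Agent asks the Matcher for a job, forks a Job Wrapper that runs the user application, uploads the output to an SE and falls back to a VO-box request when the storage is unavailable) can be summarised with the minimal, self-contained sketch below; every class and function name is a hypothetical stand-in for the corresponding DIRAC component, not its real API.

import subprocess

class StorageUnavailable(Exception):
    """Stand-in for a failure to write the output to the destination SE."""

def request_job(capabilities):
    """Stand-in for the Matcher: return a job description, or None if nothing fits."""
    return {'JobID': 42, 'Executable': 'echo', 'Arguments': 'running user application',
            'OutputData': ['DVHistos.root']}

def upload_to_se(files):
    """Stand-in for the output upload; may raise StorageUnavailable."""

def put_failover_request(files):
    """Stand-in for queueing a transfer request on the VO-box (putRequest)."""

def pilot_agent_cycle():
    job = request_job({'Site': 'LCG.RAL.uk'})            # 1. pull a matching job JDL
    if job is None:
        return                                            #    no work: the pilot simply exits
    # 2. The Job Wrapper runs the payload (the glexec identity switch is omitted here).
    status = subprocess.call([job['Executable'], job['Arguments']])
    # 3. Upload the output; if the SE is down, queue a fail-over request instead.
    try:
        upload_to_se(job['OutputData'])
    except StorageUnavailable:
        put_failover_request(job['OutputData'])
    # 4. Report the final state back to the Job Monitor.
    print('Job %s: %s' % (job['JobID'], 'Done' if status == 0 else 'Failed'))

if __name__ == '__main__':
    pilot_agent_cycle()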

6 DIRAC3 Revision and Roadmap. Motivation: operation in a multi-platform environment (various Linux flavours, 32-bit/64-bit, even Windows!); the need to separate generic and LHCb-specific behaviour; the need for new functionality affecting multiple components (e.g. the job state machinery). DIRAC3 will be the result of this major code revision and reorganisation. Timeline: Dec 2006, brainstorming meeting at CERN amongst developers; Jan 2007, Barcelona workshop; Feb-April 2007, re-implementation of the code base according to the new design; May 2007, integration of the DIRAC3 system and thorough testing; June 2007, release of DIRAC3. Gennady Kuznetsov (RAL).

7 AMGA-Bookkeeping Architecture. Diagram of the bookkeeping components: an Oracle DB accessed through a JDBC driver by the BK Service (BookkeepingSvc, BookkeepingQuery); a Tomcat servlet serving read queries from a web browser; a Jython server giving read access over XML-RPC to GANGA applications; and AMGA clients reading and writing through AMGA on the production machine volhcb01. Carmine Cioffi (Oxford): AMGA is now used in production and the old system has been retired. New production machine (volhcb01) for bookkeeping.
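
As a rough illustration of the read path used by GANGA in this diagram (XML-RPC queries served by the Jython server in front of the bookkeeping), the sketch below uses Python's standard xmlrpclib; the host, port and query method are assumptions made up for the example, not the real service interface.

import xmlrpclib

# Hypothetical endpoint: the slide only names the machine (volhcb01), so the
# full URL, port and method below are illustrative assumptions.
bk = xmlrpclib.ServerProxy('http://volhcb01.cern.ch:8080/RPC2')

# Assumed query method: ask for the DST files belonging to one production.
files = bk.getFilesForProduction({'Production': 1234, 'FileType': 'DST'})
for lfn in files:
    print(lfn)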

8 Software Modules and Data Flow. Application chain: simulation (Gauss, producing MC truth), digitisation (Boole, producing raw data), reconstruction (Brunel, producing rDST), stripping and analysis (DaVinci, producing event tag collections and DST+RAW). A Monte-Carlo production job is built from steps: a Gauss step (SoftwareInstallation module, GaussApplication module, BookkeepingUpdate module) followed by a Boole step (SoftwareInstallation module, BooleApplication module, BookkeepingUpdate module).
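
A minimal sketch of how such a production job could be assembled from steps, each carrying the three modules named on the slide; the Module, Step and ProductionJob classes here are hypothetical stand-ins that mirror the structure shown, not the real workflow library.

class Module(object):
    def __init__(self, name):
        self.name = name

class Step(object):
    def __init__(self, application):
        # Every step installs its software, runs the application and then
        # reports what it produced to the bookkeeping.
        self.modules = [Module('SoftwareInstallation'),
                        Module('%sApplication' % application),
                        Module('BookkeepingUpdate')]

class ProductionJob(object):
    def __init__(self, steps):
        self.steps = steps

    def describe(self):
        for i, step in enumerate(self.steps, 1):
            print('Step %d: %s' % (i, ' -> '.join(m.name for m in step.modules)))

# A simulation plus digitisation job as on the slide (Gauss then Boole);
# reconstruction (Brunel) and stripping (DaVinci) would follow the same pattern.
job = ProductionJob([Step('Gauss'), Step('Boole')])
job.describe()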

9 Last Month Activity. Plot of running jobs at CERN, CNAF, GRIDKA, IN2P3, NIKHEF, PIC, RAL and all sites combined: a record of 9715 jobs running simultaneously and an average of 7.5k running jobs over the last month. Bugs were found that required stopping and restarting the production. Temporary problems at PIC and RAL. Raja Nandakumar (RAL).

10 CPU Use since December 2006

Country        CPU use (%)
UK             41.1
CERN           12.1
Italy           9.6
Germany         8.1
France          7.7
Spain           6.6
Greece          3.8
Netherlands     3.1
Poland          2.4
Russia          2.0
Hungary         0.9

A pie chart shows the same shares by country.

11 CPU Use since December 2006, by site

Site             CPU time (%)
Manchester       17
CERN             11
QMUL              8
CNAF              7
GRIDKA            7
IN2P3             4
Brunel (UK)       4
RAL               3
Glasgow           3
NIKHEF            3
USC               3
Lancashire        2
HG-06 (Greece)    2
Barcelona         2
PIC               2
Other sites      20

Accompanying pie charts show CPU use by site and for the Tier-1s.

12 Reconstruction since December 2006

Tier-1      Events reconstructed (%)
PIC         32.4
CNAF        20.1
CERN        18.6
IN2P3       16.5
RAL         10.4
GRIDKA       5.3
NIKHEF       0.1

Data access problems were the main cause of delays to the reconstruction. RAL dCache has been unstable since December, with problems with file staging through SRM and some GridFTP problems. A new staging component is being added to DIRAC.
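
A minimal sketch of the idea behind the new staging component, assuming the obvious shape (request a bring-online of the input files, poll until they are staged, then release the job); both helper functions and the SURL are hypothetical stand-ins for the real SRM interaction.

import time

def request_stage(surls):
    """Stand-in for issuing an SRM bring-online request for the listed files."""

def all_staged(surls):
    """Stand-in for polling the staging status; the real component queries SRM."""
    return True

def stage_then_release(job_id, surls, poll_interval=60):
    request_stage(surls)
    while not all_staged(surls):
        time.sleep(poll_interval)            # wait for the tape recalls to complete
    print('Inputs online, releasing job %s for reconstruction' % job_id)

stage_then_release(1001, ['srm://some-se.example/lhcb/data/example.rdst'])  # placeholder SURL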

13 Data Transfer 1. Problems with transfers: when a job fails to transfer data to one or more Tier-1s, the transfer request is queued through the VO-box; storage is not always available at the Tier-1s, so the number of pending transfer requests grows. Plot: failed transfers.

14 Data Transfer 2. Improvements: temporary replication to a fail-over SE (available at all Tier-1s); replication to the final destination queued in the VO-box; the VO-box retries until the transfer succeeds. Extremely reliable (a multi-threaded transfer agent is required). Plot: successful transfers.
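
The retry logic described here can be sketched as follows: copy the output to a fail-over SE first so the data is safe, queue the request on the VO-box, and let a multi-threaded agent keep retrying the transfer to the final destination; all functions and destination names are hypothetical stand-ins for the DIRAC transfer agent, shown only to make the structure concrete.

import threading
import time

def copy_to_failover_se(lfn):
    """Stand-in for the temporary replication to a fail-over SE at a Tier-1."""

def transfer_to_destination(lfn, destination):
    """Stand-in for the transfer attempt; the real agent may fail here repeatedly."""
    return True

def retry_until_done(lfn, destination, retry_interval=1):
    copy_to_failover_se(lfn)                        # data is safe even if the Tier-1 is down
    while not transfer_to_destination(lfn, destination):
        time.sleep(retry_interval)                  # the VO-box agent simply tries again later

# One worker thread per queued request, as in a multi-threaded transfer agent.
requests = [('LFN:/lhcb/data/example_1.dst', 'CERN-disk'),
            ('LFN:/lhcb/data/example_2.dst', 'PIC-disk')]
threads = [threading.Thread(target=retry_until_done, args=req) for req in requests]
for t in threads:
    t.start()
for t in threads:
    t.join()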

15 Next Steps. Castor-2: dCache-to-Castor migration, with LHCb tests progressing at RAL (Raja); once jobs run and are stable, the aim is to switch and replicate the existing data from dCache to Castor; the end-of-June deadline for Castor is approaching fast! Data stripping: delayed to ~June because of the late availability of the high-performance pre-selection algorithms; stripped DSTs to be shipped to all Tier-1 centres; analysis using Ganga; output used for the LHCb “Physics Book”.

16 Alignment Challenge. First release of the alignment framework: March. First Alignment Challenge using the tracking detectors: end of April for production of datasets, ~June for the alignment demonstration. Second Alignment Challenge using all sub-detectors: September? The VELO is the most precise device in LHCb, but it moves: it is retracted by ~3 cm between fills; 21 tracking stations, 4 sensors per station (r/φ). Different configurations: magnet OFF with VELO open, magnet OFF with VELO closed, magnet ON with VELO open, magnet ON with VELO closed. Grid test of the Conditions Database: streaming of data constants and running of LHCb applications.

Timetable for 2007. January-March: DC06 production phase. April: DAQ to Tier 0 throughput tests. End of April - June: First Alignment Challenge. June: release of DIRAC3. June/July: full-chain DAQ to Tier 0 to Tier 1 tests. September: re-reconstruction of b and minimum-bias events; Second Alignment Challenge. November: first data!