LHCb is Beautiful? Glenn Patrick, GridPP19, 29 August 2007

2 In the beginning…

3 LHCb – GridPP1 Era (May 2002) Empty!

4 LHCb – GridPP2 Era (Mar 2005) Not Beautiful!

5 LHCb – December 2006. Getting Pretty! [Detector layout along the pp beam line: VELO, RICH1, Magnet, Trackers, RICH2, Calorimeters, Muon system.]

6 Summer 2008 – Beauty at Last? 1000 million B mesons/year from 2008. Suddenly Beautiful! [Diagram of B0 and anti-B0 mesons with their b and d quark content.]

7 …and so it is with the Grid? Origins of the Grid for LHCb – GridPP at the NeSC Opening, 25 April 2002. [Diagram: a job runs on a Compute Element in the CERN testbed, writes to local disk, copies output with globus-url-copy to Storage Elements (NIKHEF Amsterdam, rest-of-Grid, mass storage), and registers and retrieves files through the Replica Catalogue via publish, register-local-file and replica-get.]

8 DIRAC WMS Evolution (2006). [Architecture diagram] Incoming jobs (JDL, sandbox, input data) arrive at the Job Receiver and are stored in the JobDB; the Data Optimizer checks input data against the LFC (getReplicas) and places each job in the Task Queue. The Agent Director submits Pilot Jobs through the LCG Resource Broker (RB) to a Compute Element; on the worker node the Pilot Agent calls the Matcher to pull a matching job, obtains the owner's proxy from the WMS Admin service, and executes the user application inside a Job Wrapper (run via glexec). Output data is uploaded to an SE, with failover requests (putRequest) placed on the VO-box; the Agent Monitor and Job Monitor track pilots and jobs (checkPilot, getSandbox). The diagram distinguishes DIRAC services, LCG services, and the workload on the WN. A schematic sketch of this pull pattern follows.
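The pilot-based "pull" scheduling described above can be summarised in a few lines of Python. This is only a minimal sketch of the pattern under illustrative names (TaskQueue, match, pilot_agent); it is not the real DIRAC API.

# Illustrative sketch of the pilot-agent "pull" scheduling pattern
# (hypothetical names, not the real DIRAC code).
from dataclasses import dataclass, field
from typing import Optional
import subprocess


@dataclass
class JobDescription:
    job_id: int
    owner: str
    requirements: dict                       # e.g. {"Site": "LCG.RAL.uk"}
    executable: list = field(default_factory=list)


class TaskQueue:
    """Central queue of waiting jobs, filled by the Job Receiver/Optimizer."""

    def __init__(self):
        self._jobs = []

    def add(self, job: JobDescription) -> None:
        self._jobs.append(job)

    def match(self, resource: dict) -> Optional[JobDescription]:
        # The Matcher: hand out the first job whose requirements are met
        # by the resource description sent in by a pilot.
        for job in self._jobs:
            if all(resource.get(k) == v for k, v in job.requirements.items()):
                self._jobs.remove(job)
                return job
        return None


def pilot_agent(matcher: TaskQueue, resource: dict) -> None:
    """Runs on the worker node once the pilot job has started:
    pull a real job, execute it, and report the outcome."""
    job = matcher.match(resource)
    if job is None:
        return                               # nothing suitable; pilot exits
    result = subprocess.run(job.executable, capture_output=True)
    print(f"job {job.job_id} for {job.owner} finished, rc={result.returncode}")

The value of the pull model is that the matching decision is only taken after a pilot has started and checked its worker node, so broken or misconfigured sites consume pilots rather than user jobs.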

9 DIRAC Production & Analysis. [Architecture diagram] The production manager and users (via the GANGA UI or the command-line interface) submit work to the DIRAC Job Management Service, which dispatches jobs either to DIRAC sites through site agents (CE 1, CE 2, CE 3) or to LCG via the Resource Broker. Supporting DIRAC services: JobMonitorSvc, JobAccountingSvc (AccountingDB), InformationSvc, FileCatalogSvc, MonitoringSvc and BookkeepingSvc, with a BK query web page and a FileCatalog browser as user interfaces. DIRAC storage (disk, file) is reached via gridftp, bbftp and rfio. GridPP effort: Gennady Kuznetsov (RAL) – DIRAC production tools. Timeline: DIRAC1 started …; DIRAC3 (data ready) due 2007.
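A client such as the production manager or the Ganga UI effectively hands a JDL job description to the DIRAC Job Management Service. As a rough illustration, the sketch below shows such a call over XML-RPC; the transport, endpoint URL, method name and return structure are all assumptions made for the example, not the actual DIRAC interface.

# Hedged sketch: handing a JDL job description to a DIRAC-style Job
# Management Service over XML-RPC (endpoint and method are invented).
import xmlrpc.client

EXAMPLE_JDL = """
Executable    = "DaVinci.sh";
Arguments     = "v19r2";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err", "DVNtuple.root"};
"""


def submit(jdl: str,
           endpoint: str = "https://dirac.example.org:9132/JobManager") -> int:
    """Send the JDL to the (hypothetical) service and return the job ID."""
    server = xmlrpc.client.ServerProxy(endpoint)
    result = server.submitJob(jdl)            # assumed remote method
    if not result.get("OK", False):           # assumed OK/Message-style reply
        raise RuntimeError(result.get("Message", "submission failed"))
    return result["JobID"]


if __name__ == "__main__":
    print("Submitted job", submit(EXAMPLE_JDL))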

10 GANGA: Gaudi ANd Grid Alliance. First ideas – Pere Mato, LHCb Workshop, Bologna, 15 June 2001. [Diagram: the Ganga GUI sits between the GAUDI program (job options, algorithms, histograms, monitoring, results) and the collective and resource Grid services.] GridPP effort: Alexander Soroko (Oxford), Karl Harrison (Cambridge), Ulrik Egede (Imperial), Alvin Tan (Birmingham).

11 Ganga Evolution: one interface replaces experiment-specific submission tools.
Applications – ATLAS: Athena (simulation/digitisation/reconstruction/analysis), AthenaMC (production); LHCb: Gauss/Boole/Brunel/DaVinci (simulation/digitisation/reconstruction/analysis); experiment-neutral: Executable.
Backends – Local, LSF, PBS, OSG, NorduGrid, PANDA, US-ATLAS WMS, LHCb WMS.
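In practice Ganga exposes this as a small Python interface: the application object says what to run and the backend object says where, so the same job moves between systems by changing one attribute. The session below is indicative of the 2007-era interface and is meant to be typed inside the ganga shell (where Job, DaVinci, Local and Dirac are pre-defined); exact class and attribute names may differ between releases.

# Sketch of a Ganga session (run inside the `ganga` shell; names indicative).
j = Job()
j.name = "DaVinci-test"
j.application = DaVinci(version="v19r2", optsfile="myAnalysis.opts")
j.backend = Local()          # debug the job on the local machine first
j.submit()

# Once happy, only the backend changes for a Grid submission:
j2 = j.copy()
j2.backend = Dirac()         # LHCb's workload management via DIRAC
j2.submit()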

12 Ganga 2007: Elegant Beauty? [Screenshots of the Ganga GUI: CERN, September 2005; Cambridge, January 2006; Edinburgh, January 2007. Panels: job builder, job details, logical folders, job monitoring, scriptor, log window.]

13 Ganga Users. Unique users since 1 Jan 2007: LHCb = 162. [Pie chart of unique users by experiment: ATLAS, LHCb, Other.]

14 Ganga by Domain. [Pie chart of users by domain: CERN, Other.]

15 LHCb Grid – circa 2001. Initial LHCb-UK testbed institutes (existing and planned): RAL CSF (120 Linux CPUs, IBM 3494 tape robot), RAL DataGrid Testbed, RAL (PPD), Liverpool MAP (300 Linux CPUs), CERN (pcrd25.cern.ch, lxplus009.cern.ch), Glasgow/Edinburgh proto-Tier 2, Bristol, Cambridge, Imperial College, Oxford.

16 LHCb Computing Model. [Trigger/data-flow diagram: 40 MHz, Level-0 (hardware), 1 MHz, Level-1 (software), HLT (software), 40 kHz; 2 kB/event, 60 MB/s.]

17 Monte Carlo Simulation 2007. Record of 9715 simultaneous jobs over 70+ sites on 28 Feb 2007 (Raja Nandakumar, RAL). 700M events simulated since May; …M jobs submitted.

18 Reconstruction & Stripping. …but it is not so often that we get all Tier 1 centres working together. Peak of 439 concurrent jobs. [Plot of running jobs by site: CNAF, NIKHEF, RAL, IN2P3, CERN.]

19 Data Management. Production jobs upload output to their associated Tier 1 SE (RAL in the UK). Multiple failover SEs and multiple VO boxes are used in case of failure, with replication handled via FTS and a centralised Transfer DB (see the sketch below). eScience PhD: Andrew Smith (Edinburgh).
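As a rough illustration of the failover scheme just described, the sketch below tries the associated Tier 1 SE first, falls back through a list of failover SEs, and records a replication request so that the file can later be moved to its intended destination. The helper functions are hypothetical stand-ins, not the real DIRAC data-management code.

# Hypothetical sketch of failover upload logic (stand-in helpers only).
from typing import Sequence


class UploadError(Exception):
    pass


def copy_and_register(lfn: str, local_path: str, se: str) -> None:
    # Stand-in for the real SRM copy plus catalogue registration;
    # a real implementation would raise UploadError on failure.
    print(f"copy {local_path} -> {se}, register {lfn}")


def request_replication(lfn: str, source_se: str, target_se: str) -> None:
    # Stand-in for recording a transfer request on the VO-box Transfer DB.
    print(f"request replication of {lfn}: {source_se} -> {target_se}")


def upload_output(lfn: str, local_path: str,
                  tier1_se: str, failover_ses: Sequence[str]) -> str:
    """Upload to the associated Tier 1 SE, falling back to failover SEs.

    Returns the SE actually used; if a failover SE was used, a replication
    request towards the Tier 1 SE is recorded for later handling by FTS.
    """
    for se in [tier1_se, *failover_ses]:
        try:
            copy_and_register(lfn, local_path, se)
        except UploadError:
            continue                          # try the next SE in the list
        if se != tier1_se:
            request_replication(lfn, source_se=se, target_se=tier1_se)
        return se
    raise UploadError(f"all SEs failed for {lfn}")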

20 Data Transfer. RAW data is replicated from Tier 0 to one of the six Tier 1 sites, using gLite FTS for T0–T1 replication. Transfers trigger automated job submission for reconstruction (sketched below). A sustained total rate of 40 MB/s is required (and has been achieved). Further DAQ–T0–T1 throughput tests at a 42 MB/s aggregate rate are scheduled for later in 2007.
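The "transfers trigger reconstruction" step can be pictured as a simple polling loop: once a RAW file has arrived at a Tier 1, a reconstruction job is submitted for it. The sketch below is purely schematic; fts_finished_transfers() and submit_reconstruction_job() are invented stand-ins for the real FTS status query and DIRAC job submission.

# Schematic loop: submit reconstruction when a RAW transfer completes.
import time


def fts_finished_transfers():
    # Stand-in: return newly completed transfers as
    # {"lfn": ..., "tier1": ...} dictionaries.
    return []


def submit_reconstruction_job(lfn: str, site: str) -> None:
    # Stand-in for submitting a reconstruction job to the WMS.
    print(f"submitting reconstruction of {lfn} at {site}")


def watch_transfers(poll_seconds: int = 60) -> None:
    processed = set()
    while True:
        for transfer in fts_finished_transfers():
            if transfer["lfn"] in processed:
                continue                      # already handled this file
            submit_reconstruction_job(transfer["lfn"], transfer["tier1"])
            processed.add(transfer["lfn"])
        time.sleep(poll_seconds)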

21 Bookkeeping (2007). GridPP effort: Carmine Cioffi (Oxford). [Architecture diagram: on volhcb01, a Tomcat servlet (BookkeepingSvc / BookkeepingQuery) serves web-browser and client reads; queries go through an AMGA client, with AMGA providing read/write access to the Oracle DB behind the BK service.]
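From the client side, the servlet interface above amounts to an HTTP query returning bookkeeping metadata. The sketch below shows the shape of such a call; the host, path and parameter names are invented for illustration and do not correspond to the real LHCb bookkeeping API.

# Hypothetical client-side bookkeeping query against a servlet-style
# HTTP interface (URL and parameters invented for illustration).
import urllib.parse
import urllib.request


def query_bookkeeping(configuration: str, event_type: str,
                      base_url: str = "http://volhcb01.example.org:8080/BookkeepingQuery") -> str:
    params = urllib.parse.urlencode({
        "configuration": configuration,       # e.g. a production name
        "eventType": event_type,              # e.g. an event-type code
    })
    with urllib.request.urlopen(f"{base_url}?{params}") as response:
        return response.read().decode()       # e.g. an XML list of LFNs


if __name__ == "__main__":
    print(query_bookkeeping("DC06", "12345678"))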

22 LHCb CPU Use
    Country        CPU use (%)
    UK             34.0
    Italy          16.1
    Switzerland    13.7
    France          9.8
    Spain           7.1
    Germany         4.8
    Greece          4.0
    Netherlands     4.0
    Russia          2.0
    Poland          1.8
    Hungary         0.6
[Pie chart of CPU share: CERN, UK, Italy, Switzerland, France, Spain, Germany.]
Many thanks to: Birmingham, Bristol, Brunel, Cambridge, Durham, Edinburgh, Glasgow, Imperial, Lancaster, Liverpool, Manchester, Oxford, QMUL, RAL, RHUL, Sheffield and all others.

23 UKI Evolution for LHCb. [Plot by region: Tier 1, NorthGrid, London, ScotGrid, SouthGrid.]

24 GridPP3: Final Crucial Step(s) GridPP Beauty!

25 Some Milestones
Sustain DAQ–T0–T1 throughput tests at 40+ MB/s.
Reprocessing (second pass) of data at Tier 1 centres.
Prioritisation of analysis, reconstruction and stripping jobs (all at Tier 1 for LHCb). CASTOR has to work reliably for all service classes!
Ramp-up of hardware resources in the UK.
Alignment: Monte Carlo is done with perfectly positioned detectors… reality will be different!
Calibration: Monte Carlo is done with well-understood detectors… reality will be different! The distributed Conditions Database plays a vital role.
Analysis: increasing load from individual users.

26 The End (and the Start). Lyn Evans, EPS Conference on High Energy Physics, Manchester, 23 July 2007. GridPP3.