BaBarGrid UK Distributed Analysis Roger Barlow Montréal collaboration meeting June 22nd 2006

Sites
RAL Tier 1/A
Manchester Tier 2 – 1000 nodes, 400 TB
RAL Tier 2
Manchester BaBar farm (40 nodes)
QMUL (large)
Liverpool (large)
Massive opportunity

$BbgUtils/BbgWhere
List of all sites available to the BaBar VO
===========================================
ce-fzk.gridka.de: with 1614 CPUs available
gridce.pi.infn.it: with 38 CPUs available
gridba2.ba.infn.it: with 134 CPUs available
lcgce01.gridpp.rl.ac.uk: with 1062 CPUs available
grid0.fe.infn.it: with 24 CPUs available
gridce.pg.infn.it: with 90 CPUs available
t2-ce-01.mi.infn.it: with 62 CPUs available
a gridka.de: with 1614 CPUs available
ce1.pp.rhul.ac.uk: with 140 CPUs available
grid002.ca.infn.it: with 32 CPUs available
griditce01.na.infn.it: with 30 CPUs available
prod-ce-01.pd.infn.it: with 80 CPUs available
ce2.egee.unile.it: with 28 CPUs available
ce1-gla.scotgrid.ac.uk: with 190 CPUs available
ce.epcc.ed.ac.uk: with 6 CPUs available
gridit-ce-001.cnaf.infn.it: with 10 CPUs available
gw39.hep.ph.ic.ac.uk: with 60 CPUs available
spaci01.na.infn.it: with 0 CPUs available
atlasce01.na.infn.it: with 32 CPUs available
ce01.esc.qmul.ac.uk: with 1462 CPUs available
gw-2.ccc.ucl.ac.uk: with 360 CPUs available
dgc-grid-40.brunel.ac.uk: with 126 CPUs available
dgc-grid-35.brunel.ac.uk: with 4 CPUs available
heplnx201.pp.rl.ac.uk: with 58 CPUs available
mars-ce.mars.lesc.doc.ic.ac.uk: with 178 CPUs available
epgce1.ph.bham.ac.uk: with 28 CPUs available
hepgrid2.ph.liv.ac.uk: with 492 CPUs available
e5grid05.physik.uni-dortmund.de: with 54 CPUs available
e5grid06.physik.uni-dortmund.de: with 4 CPUs available
prod-ce-02.pd.infn.it: with 80 CPUs available
t2ce02.physics.ox.ac.uk: with 68 CPUs available
glite-ce-01.cnaf.infn.it: with 10 CPUs available
helmsley.dur.scotgrid.ac.uk: with 106 CPUs available
fal-pygrid-18.lancs.ac.uk: with 362 CPUs available
ce01.tier2.hep.manchester.ac.uk: with 556 CPUs available
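The listing above lends itself to simple post-processing, for example ranking sites by free CPUs before choosing where to submit. A minimal Python sketch, assuming only the line format shown above (an illustration, not part of the BbgUtils package):

import re

# Lines look like "host: with N CPUs available", as in the BbgWhere output above
SITE_RE = re.compile(r"^(\S+):\s+with\s+(\d+)\s+CPUs available")

def parse_bbgwhere(text):
    """Return (compute element, CPU count) pairs from BbgWhere-style output."""
    sites = []
    for line in text.splitlines():
        m = SITE_RE.match(line.strip())
        if m:
            sites.append((m.group(1), int(m.group(2))))
    return sites

def best_sites(text, n=5):
    """The n sites reporting the most available CPUs."""
    return sorted(parse_bbgwhere(text), key=lambda s: s[1], reverse=True)[:n]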

bbrbsub
Extended by Giuliano Castelli
Adds grid functionality to the standard bbrbsub
Can submit from RAL to RAL
Working on RAL to MAN (basic grid stuff OK; database OK; no data to read. Manchester uses dCache but is moving away from it)
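For context, grid submission of this kind ultimately reduces to shipping a job description plus an input sandbox to a compute element. A minimal Python sketch that composes an EDG/LCG-style JDL file; the attribute names are standard JDL, but the executable, tcl file and file names are hypothetical, and this is not the actual bbrbsub implementation:

def make_jdl(executable, tcl_file, log="analysis.log"):
    """Compose a minimal EDG/LCG-style JDL description for a sandbox-based job.
    The attributes used (Executable, InputSandbox, ...) are standard JDL;
    the concrete values are illustrative only."""
    return "\n".join([
        'VirtualOrganisation = "babar";',
        'Executable = "%s";' % executable,
        'Arguments = "%s";' % tcl_file,
        'StdOutput = "%s";' % log,
        'StdError = "%s";' % log,
        'InputSandbox = {"%s", "%s"};' % (executable, tcl_file),
        'OutputSandbox = {"%s"};' % log,
    ])

# Write the description out; it can then be handed to the usual LCG submission tools.
open("bbr_job.jdl", "w").write(make_jdl("run_analysis.sh", "MyAnalysis.tcl"))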

bbrbsub (contd)
Have afs and sandbox I/O versions
May access BaBar software at the remote site or at the local site (afs or tarball)
Progress ongoing. Data at MAN is the bottleneck. Setting up an xrootd server (see the sketch below).
To be incorporated into the standard Simple Job Manager framework (Will Roethel)
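For orientation, remote data access through an xrootd server looks roughly like the following from the client side (PyROOT); the server name, file path and ntuple name are placeholders, not the actual Manchester setup:

import ROOT

# Open a file served by an xrootd daemon; ROOT understands the root:// protocol natively.
# "xrootd.example.ac.uk", the path and the tree name are hypothetical placeholders.
f = ROOT.TFile.Open("root://xrootd.example.ac.uk//store/babar/run1234.root")
if f and not f.IsZombie():
    tree = f.Get("ntp1")
    print("Entries:", tree.GetEntries() if tree else "no tree found")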

easyGrid
James is still developing easyGrid (also easyApp and easyRoot) for grid analysis
Works for him. Not yet widely taken up