GridPP: Experiment Status & User Feedback
Dan Tovey, University of Sheffield

Introduction
This talk will be in two parts:
1. The good news: details of Grid use by the experiments.
2. The less good news: feedback from the experiments regarding their experiences.

ATLAS Grid Use
Almost all resources for ATLAS are Grid-based:
– Three Grid flavours to work with: LCG-2, NorduGrid, Grid3.
– Considerable issues of interoperability/federation.
Next large exercise is the Rome Physics Workshop in June:
– Generation: mixture of Grid and non-Grid, but much non-Grid for convenience.
– Simulation/Digitisation/Reconstruction: all on Grid.
– Analysis: some Grid-based analysis already; distributed analysis being rolled out in Spring. Rome will use a mixture of Grid and non-Grid analysis.

ATLAS Issues
Interoperability:
– Currently the production system has to layer job scheduling over the system for each deployment.
– Absolute need for a unified file catalogue system; currently an additional catalogue is layered over the others (a minimal sketch of this layering idea follows below).
Information system/policy:
– Inaccurate advertisement of sites.
– SE saturation:
  Internal: the production system needs better clean-up and more robust back-up.
  An SE should advertise if it is really for storage! SCR$MONTH class required?
– Lines of reporting need to be improved/clarified.
LCG issues:
– LCG-Castor failures.
– RLS corruption.
Resource issues:
– Still trying to ensure the required resources for 2007/2008.
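
The catalogue layering mentioned above can be pictured with a minimal sketch, purely illustrative and not ATLAS production code: a thin lookup layer queried by the production system, which in turn fans out to one catalogue per Grid flavour. All class and method names here are assumptions for illustration, not real LCG-2/NorduGrid/Grid3 interfaces.

```python
# Illustrative sketch only: a thin "unified" catalogue layered over
# per-Grid replica catalogues. All names are hypothetical.

class ReplicaCatalogue:
    """One catalogue per Grid flavour (e.g. LCG-2, NorduGrid, Grid3)."""
    def __init__(self, name, entries=None):
        self.name = name
        self.entries = entries or {}   # logical file name -> list of replica URLs

    def lookup(self, lfn):
        return self.entries.get(lfn, [])


class LayeredCatalogue:
    """Layers an additional catalogue over the per-deployment ones,
    giving the production system a single lookup interface."""
    def __init__(self, catalogues):
        self.catalogues = catalogues

    def lookup(self, lfn):
        replicas = []
        for cat in self.catalogues:
            replicas.extend(cat.lookup(lfn))
        return replicas


if __name__ == "__main__":
    lcg = ReplicaCatalogue("LCG-2", {"lfn:evgen.0001": ["srm://ral.example/evgen.0001"]})
    ng = ReplicaCatalogue("NorduGrid", {"lfn:evgen.0001": ["gsiftp://oslo.example/evgen.0001"]})
    g3 = ReplicaCatalogue("Grid3", {})

    unified = LayeredCatalogue([lcg, ng, g3])
    print(unified.lookup("lfn:evgen.0001"))   # replicas found across all Grids
```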

General CMS Outlook
Tool development:
– Going very well: leading contributions from the UK in the most important areas.
– Integration between tools is starting.
– Moving away from LCG-style data management, for now.
– Our modular approach can re-integrate LCG tools later on if needed (see the sketch below).
Collaboration status / plans:
– Computing Model now blessed and publicly available.
– Computing TDR well under way.
– UK making a strong contribution.
– Use of Tier-1 / Tier-2 resources in the UK will start to grow rapidly as Grid-enabled DST analysis begins.
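
The "modular approach" point can be illustrated with a hedged sketch, not CMS code: higher-level tooling programs against a small data-management interface, so an interim non-LCG backend can later be swapped for an LCG-based one without touching the rest. All names are hypothetical.

```python
# Illustrative sketch only: a pluggable data-management layer that would
# let LCG tools be re-integrated later. Names are hypothetical.

from abc import ABC, abstractmethod

class DataMover(ABC):
    """Interface the rest of the tooling codes against."""
    @abstractmethod
    def copy(self, source_url, dest_url):
        ...

class SimpleMover(DataMover):
    """Stand-in for the interim, non-LCG data management."""
    def copy(self, source_url, dest_url):
        print(f"copying {source_url} -> {dest_url} with plain transfer tools")

class LcgStyleMover(DataMover):
    """Placeholder showing where LCG tools could be plugged back in."""
    def copy(self, source_url, dest_url):
        print(f"copying {source_url} -> {dest_url} via LCG-style data management")

def transfer_dst(mover: DataMover, dataset_files):
    # Higher-level code is unchanged whichever backend is plugged in.
    for src, dst in dataset_files:
        mover.copy(src, dst)

if __name__ == "__main__":
    files = [("site-a.example/dst/file1.root", "tier1.example/dst/file1.root")]
    transfer_dst(SimpleMover(), files)     # today
    transfer_dst(LcgStyleMover(), files)   # later, if re-integrated
```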

LHCb DC04 Phase 1
LHCb Production Desktop: provides control and decomposes complex workflows into steps and modules.
DC04 Phase 1: 186M events produced, 424 CPU years; ~50% run on LCG resources; LCG efficiency = 61%.
Grid share over time (production plot annotated 'DIRAC alone', 'LCG in action', 'LCG paused', 'LCG restarted'; rates of 3-5M events/day and 1.8M events/day marked):
– May 2004: DIRAC 89%, LCG 11%.
– Aug 2004: DIRAC 27%, LCG 73%.
UK second largest producer (25%) after CERN:
– LCG(UK): Tier 1 7.7%, Tier 2 London 3.9%, Tier 2 South 2.3%, Tier 2 North 1.4%.
– DIRAC(UK): Imperial 2.0%, Liverpool 3.1%, Oxford 0.1%, ScotGrid 5.1%.
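
As a quick consistency check on the figures above (plain arithmetic only, assuming 1 CPU year ≈ 3.15×10^7 s):
– UK share: 7.7 + 3.9 + 2.3 + 1.4 = 15.3% via LCG(UK), plus 2.0 + 3.1 + 0.1 + 5.1 = 10.3% via DIRAC(UK), i.e. ≈ 25.6%, consistent with the quoted ~25%.
– Average CPU cost per event: 424 CPU years ≈ 1.34×10^10 CPU seconds; divided by 1.86×10^8 events this is ≈ 72 CPU seconds per event on the resources used.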

LHCb 2005
DC05 – Real Time Data Challenge:
– Mimic LHC data taking; test HLT, reconstruction and streaming software on pre-simulated data.
– Production phase (150M events) April–June 05.
DC04 Phase 2 – Stripping (ongoing):
– Using the Production Desktop developed in the UK (G. Kuznetsov).
– Data reduction ('stripping') on 65 TB distributed over Tier 1 sites, using DIRAC with input data and LCG for data access (a toy sketch of the idea follows below).
– SRM was on the critical path: available at CERN, PIC, CNAF; the production version was unavailable at RAL, so the UK is not participating.
DC04 Phase 3 – End User Analysis:
– Use the GANGA Grid interface (UK project); improvements Jan–March 05 (A. Soroko at CERN).
– User training started (e.g. Cambridge event funded by GridPP in December 04).
– Distributed analysis from March 2005 with datasets replicated to RAL.
(Slide graphic: Production Desktop.)
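
"Stripping" here means a preselection pass that reduces the simulated data to the events worth keeping for analysis. A toy sketch of the idea follows; the event fields and cut values are purely illustrative, not the LHCb stripping code.

```python
# Toy sketch of a "stripping" (data reduction) pass: read events, keep
# only those passing preselection cuts, write the much smaller output.
# Event fields and cut values are illustrative only.

def passes_preselection(event):
    # Illustrative cuts on quantities an event record might carry.
    return event.get("n_tracks", 0) >= 2 and event.get("pt_max", 0.0) > 1.5  # GeV

def strip(events):
    """Yield only the events retained for analysis."""
    for event in events:
        if passes_preselection(event):
            yield event

if __name__ == "__main__":
    simulated = [
        {"n_tracks": 5, "pt_max": 2.3},
        {"n_tracks": 1, "pt_max": 4.0},   # fails the track-multiplicity cut
        {"n_tracks": 8, "pt_max": 0.9},   # fails the pt cut
    ]
    kept = list(strip(simulated))
    print(f"kept {len(kept)} of {len(simulated)} events")   # kept 1 of 3
```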

Infrastructure
QCDgrid is primarily a data grid, aimed at providing a storage and processing infrastructure for QCDOC (a 5 Tflop-sustained QCD simulation facility).
QCDOC is now installed and being shaken down in Edinburgh, along with a 'Tier 1' 50 Tbyte store.
'Tier 2' storage nodes have been installed in Edinburgh, Liverpool, Swansea and Southampton (4 x 12.5 Tbyte).
Additional storage/access nodes are operating at RAL and Columbia.
Processing clusters at Edinburgh (QCDOC FE), Liverpool, Southampton, …
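
For scale, the storage figures quoted above add up as follows (simple arithmetic on the numbers given):
– 'Tier 2' nodes: 4 × 12.5 Tbyte = 50 Tbyte.
– Together with the 'Tier 1' 50 Tbyte store: ≈ 100 Tbyte of QCDgrid storage, before counting the RAL and Columbia nodes.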

Usage and Uptake
Run by a Grid administrator + local sysadmins + users; mainly at Edinburgh and Liverpool.
All UKQCD primary data is already stored, and all secondary data is produced by grid retrieval and (currently non-grid) processing.
Secondary data is also stored back on QCDgrid (metadata markup not yet automated; a hedged sketch of what such markup might look like follows below).
All QCDOC data is to be archived on QCDgrid with NO tape copies.
Job submission software allows submission to any grid-enabled system (only requires Globus).
The number of actual users (~8) is quite low at the moment because production data from QCDOC has not (quite) started to flow.
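
The "metadata markup not yet automated" step could, in principle, be scripted along these lines. This is a hedged sketch with an entirely hypothetical metadata schema, file names and put_on_grid() call, not the actual QCDgrid XML schema or tools.

```python
# Hedged sketch: generating a metadata record for a secondary data file
# before it is stored back on the data grid. Element names and the
# put_on_grid() step are hypothetical, not the real QCDgrid schema/tools.

import xml.etree.ElementTree as ET

def make_metadata(lfn, ensemble, beta, volume, source_lfns):
    root = ET.Element("qcd_data")                     # hypothetical schema
    ET.SubElement(root, "logical_name").text = lfn
    ET.SubElement(root, "ensemble").text = ensemble
    ET.SubElement(root, "beta").text = str(beta)
    ET.SubElement(root, "volume").text = volume
    deps = ET.SubElement(root, "derived_from")
    for src in source_lfns:
        ET.SubElement(deps, "configuration").text = src
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    xml_record = make_metadata(
        lfn="propagator_0001.dat",
        ensemble="example-ensemble",
        beta=5.29,
        volume="24x24x24x48",
        source_lfns=["config_0001.dat"],
    )
    print(xml_record)
    # In real use the record would be registered in the metadata catalogue
    # and the file stored via a (hypothetical) put_on_grid() command.
```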

ZEUS MC Grid
– In % of ZEUS MC is being produced via Grid.
– RAL, UCL and ScotGrid-Glasgow accept the ZEUS VO.
– 27% of ZEUS Grid MC comes from the UK.

ZEUS MC Grid
– Grid production is integrated with the previous MC production.
– Now with the Grid, on target for 458 Million events.
– Monte Carlo data from the Grid is being used in ZEUS physics analysis.
(Chart: ZEUS MC production, Million Events.)

User Feedback
Degree of engagement between GridPP and the experiments questioned by the OsC.
Questionnaire distributed to all experiments asking for views.
Put simply (and bluntly): the results suggest strong barriers to successful take-up of the Grid in general, and LCG in particular, by most experiments.
Dissatisfaction especially with:
– stability,
– support,
– site configuration,
– data management and movement.
More work needed by LCG and GridPP to address these issues → encouraging discussion yesterday of some of these issues.

Usage
Statistics collected for Grid use:
– Overall
– GridPP-supported overall
– GridPP-supported in the UK
Some reason for optimism: some experiments are using the Grid significantly.
Still large spikes at ~0…

General User Feedback
Perception that Grid techniques are being forced upon experiments through e.g. the switch to Grid-only access to the Tier-1.
Problem of conflict between UK Grid strategy and the priorities of the wider international collaborations:
– This could potentially harm the UK physics return.
Concern that some experiments are having to integrate complex existing software infrastructure with the Grid with little or no available effort or ear-marked financial support:
– It is clear that the Portal project is going to be key.
A shift in emphasis is needed towards a more pro-active approach aimed at helping experiments achieve their 'real-world' data processing goals → GridPP2.