GridPP Dirac Service
The 4th Dirac User Workshop, 26-28 May 2014, CERN
Janusz Martyniak, Imperial College London

Why a GridPP Dirac Service?
- GridPP (UK computing for Particle Physics) also supports non-LHC ("small") VOs
- Dirac is considered as a replacement for the EMI WMS
- Also as a replacement for the LFC, which is still widely used by small VOs
- One instance is installed and maintained at Imperial College London

Supported VOs
First installation in April.
"Technical" VOs:
- vo.londongrid.ac.uk
- vo.northgrid.ac.uk
- vo.gridpp.ac.uk
Small experiments:
- na62.vo.gridpp.ac.uk
- mice (planned)
- vo.landslides.mossaic.org
- t2k.org
- cernatschool.org
- snoplus

Resources available
- Resource support by the GridPP Dirac service is customer-driven
- We support most of the UK EMI grid sites (CEs, SEs)
- We also support VAC virtual machines at Manchester and Oxford

NA62 support
- Modify NA62 jobs to run via Dirac: define sites, SEs and the DFC
- Submit and monitor jobs (see the sketch after this list)
- Use Dirac to perform file management: local SE stage-out with failover, copy to RAL and CERN using asynchronous FTS requests
- Use the Dirac File Catalog (DFC) as a file and, potentially, metadata catalogue
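A minimal sketch of what job submission and monitoring through the Dirac Python API looks like for a small VO. The wrapper script name, its arguments and the sandbox contents are placeholders rather than the real NA62 payload; the API calls themselves (Job, Dirac, submitJob, getJobStatus) are the standard Dirac client interface.

    # Minimal sketch: submit and monitor a job via the DIRAC Python API.
    # The executable and sandbox contents are placeholders, not the NA62 payload.
    from DIRAC.Core.Base.Script import parseCommandLine
    parseCommandLine()  # initialise the DIRAC configuration first

    from DIRAC.Interfaces.API.Dirac import Dirac
    from DIRAC.Interfaces.API.Job import Job

    job = Job()
    job.setName('na62-example')
    job.setExecutable('run_na62mc.sh', arguments='1000')  # hypothetical wrapper
    job.setInputSandbox(['run_na62mc.sh'])
    job.setOutputSandbox(['std.out', 'std.err'])

    dirac = Dirac()
    result = dirac.submitJob(job)
    if result['OK']:
        job_id = result['Value']
        print('Submitted job %s' % job_id)
        print(dirac.getJobStatus(job_id))  # poll the status the same way later
    else:
        print('Submission failed: %s' % result['Message'])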

MICE
- Currently use the Grid to move raw data to Castor at RAL
- Use a similar mechanism to NA62 for offline reconstruction (gLite tools + custom FTS movement tools)
- Plan to move to Dirac for workload management, the DFC and asynchronous FTS file movement (see the sketch after this list)
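A hedged sketch of what the Dirac-based replacement for the raw-data movement could look like, using the Dirac data management client (DataManager in recent releases; older releases expose the same operations via ReplicaManager). The LFN, local path and SE names are invented for illustration, not the actual MICE configuration.

    # Sketch: upload to a local SE, register in the DFC, then replicate to RAL.
    # LFN, local path and SE names are invented examples.
    from DIRAC.Core.Base.Script import parseCommandLine
    parseCommandLine()

    from DIRAC.DataManagementSystem.Client.DataManager import DataManager

    dm = DataManager()
    lfn = '/mice/raw/2014/run012345.tar'    # hypothetical logical file name
    local_file = '/data/raw/run012345.tar'  # hypothetical local path

    # Upload to a local SE and register the replica in the DFC in one step.
    result = dm.putAndRegister(lfn, local_file, 'IMPERIAL-disk')
    if not result['OK']:
        print('Upload failed: %s' % result['Message'])

    # Replicate the registered file to the RAL SE; with the Dirac
    # request/FTS machinery this step can instead be queued asynchronously.
    print(dm.replicateAndRegister(lfn, 'RAL-tape'))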

Landslides VO
- A small VO studying landslides
- They used their own Dirac instance at Bristol
- Moved to the GridPP Dirac without major problems
- The project is in hibernation at the moment

Cernatschool VO
- Just started: added to the GridPP Dirac recently
- Enabled at a few sites in the UK
- Asked for catalogue support and SEs (see the metadata sketch after this list)
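Since the catalogue request is likely to involve user-defined metadata, here is a minimal sketch of the DFC metadata interface through FileCatalogClient. The directory path, field name and query value are invented for illustration.

    # Sketch of DFC metadata usage; path and field names are invented examples.
    from DIRAC.Core.Base.Script import parseCommandLine
    parseCommandLine()

    from DIRAC.Resources.Catalog.FileCatalogClient import FileCatalogClient

    fc = FileCatalogClient()

    # Declare a searchable metadata field and tag a directory with it.
    fc.addMetadataField('detectorID', 'VARCHAR(64)')
    fc.setMetadata('/cernatschool.org/data/2014', {'detectorID': 'A-0042'})

    # Find all files whose metadata matches the query.
    result = fc.findFilesByMetadata({'detectorID': 'A-0042'})
    if result['OK']:
        for lfn in result['Value']:
            print(lfn)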

T2k.org
- Main interest expressed by a group at TRIUMF
- They required SE definitions for Imperial and RAL
- Use of the LFC
- Not much activity recently

Dirac for small VOs: future plans
- Move workload management, file management and replication entirely to Dirac for NA62 and MICE
- Add support for cernatschool (WMS + DFC)
- Extend the end-user documentation in the GridPP wiki