A Dutch LHC Tier-1 Facility

A Dutch LHC Tier-1 Facility
Jeff Templon, PDP Group, NIKHEF
RECFA Visit, 23 September 2005

LHC Computing in NL
- SARA / NIKHEF: Tier-1 for ALICE, ATLAS, LHCb
  - SARA: bulk of resources, all of long-term mass store
  - NIKHEF: significant resources, bridge to experiments & middleware groups
- Contribution to experiments scales roughly as our presence: 4 : 2 : 1 for LHCb : ATLAS : ALICE
  - Fraction of world Tier-1 resources in Amsterdam: 23% : 11.5% : 5.75%
- Tier-1 forms large part of National e-Science Center
  - n.b. LHC middleware must be generic enough for others!
- NIKHEF: gridified analysis facility with rough scale of a Tier-2
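For illustration, the quoted Amsterdam fractions are simply the 4 : 2 : 1 presence ratio applied to the LHCb share; a minimal sketch of that arithmetic, assuming only the numbers quoted above:

```python
# Sanity check: the Amsterdam share of world Tier-1 resources per experiment
# follows the 4 : 2 : 1 presence ratio quoted on this slide.
presence = {"LHCb": 4, "ATLAS": 2, "ALICE": 1}
lhcb_share = 0.23                 # quoted LHCb fraction of world Tier-1 resources
per_unit = lhcb_share / presence["LHCb"]

for experiment, weight in presence.items():
    print(f"{experiment}: {weight * per_unit:.2%}")
# -> LHCb: 23.00%, ATLAS: 11.50%, ALICE: 5.75%
```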

Numbers
- NIKHEF today: ~230 kSI2k, 10 TB disk
- 3 kSI2k == latest dual-CPU machine
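A rough conversion of that capacity into machine equivalents, using only the two numbers on this slide (a sketch, not an inventory):

```python
# Convert the quoted NIKHEF capacity into "latest dual-CPU machine" equivalents.
nikhef_ksi2k = 230      # ~230 kSI2k at NIKHEF today
machine_ksi2k = 3       # 3 kSI2k per latest dual-CPU machine

print(f"~{nikhef_ksi2k / machine_ksi2k:.0f} dual-CPU machines")   # ~77
```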

How We Got Here
- D0 distributed computing (since 2000)
  - First farm at NIKHEF
  - NIKHEF led the 1st HEP reconstruction on the Grid
- One of the principal project partners in EDG
  - Third site to join the EDG grid
  - Remained one of the five 'core sites' throughout the project
  - Contributions in this project earned us a fantastic position in the grid world
- Bring LHC and the "rest of world" closer together
  - Active participation builds consensus and makes an e-Science center with an LHC Tier-1 a real possibility
- External (personnel) funds now from the EGEE and VL-E projects

Local Highlights
- Positioned well in LCG & EGEE
  - Enthusiastically accepted as Tier-1
  - One of 'best run' sites
  - One of first sites (#3 in EDG; compare #4 in WWW)
- Membership on:
  - Middleware Design Team (US collaboration here too)
  - Middleware Security Group
  - Various other working groups and RTAGs
- D. Groep chairs the world-recognized EUGridPMA and IGTF
- K. Bos chairs the LHC Grid Deployment Board
- Profit from close collaboration with the best network in the world (SURFnet) and experts to exploit it (UvA)

People
- PDP group (grid): 10 staff (5 permanent), 9.5 FTE
  - Three arms: operations / applications / security
- CT group (everything else; ~70% of PDP also in CT): 21 staff (16 permanent)
  - 15% support / management
  - 35% system & hardware administration
  - 50% technical (grid, data acquisition, etc.)

Budget
- Tier-1
  - Budget for new investments:
    - Short term (2006 – 2007): 2.8 M€ from NCF; Tier-1 part ~0.6 M€
    - Mid term (2007 – 2010): 2 M€ / yr from NCF; part of NCF long-range plan but funds not yet secured
  - Budget for maintenance included in the above figures
- Desktops, NIKHEF analysis facility
  - New investments: 170 k€
  - Maintenance & overhead: 160 k€
  - Extra funding for analysis facility from NIKHEF discretionary budget if necessary

Progress at NIKHEF: LHCb DC '04
- Up to 3000 simultaneous "jobs" per experiment
- 2.2 million CPU-hours (250 years) used in one month
- Total data volume > 25 TB
- For LHCb: NIKHEF ~6% of global total
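The "(250 years)" figure is just the CPU-hours re-expressed as CPU-years; a quick check using only the number on the slide:

```python
# Check: 2.2 million CPU-hours expressed as CPU-years.
cpu_hours = 2.2e6
hours_per_year = 24 * 365

print(f"~{cpu_hours / hours_per_year:.0f} CPU-years")   # ~251, i.e. roughly 250 years
```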

Progress at SARA
- LCG Service Challenge II: "The Dutch Contribution"