Large Scale Computing. Jeff Templon, Nikhef Amsterdam, Physics Data Processing Group. Nikhef Jamboree, Utrecht, 10 December 2012.

Slide 2: Group overview. [Organigram residue; recoverable labels: CT, directorate, intern, PhD, Retired, SPYOps, Devel, Secur, Users, Ops.]

Slide 3: Facilities overview
Nikhef Grid Facilities: 3500 cores, ≈2 petabyte of storage, 200 Gb/s networking. Tier-1 for ATLAS, LHCb, ALICE. Site for WLCG, BiG Grid, EGI. Stoomboot (PDP & CT).

Slide 4: Tier-1 performance (LHCb)
Measured centrally by WLCG! 2% unavailable means about 7 days out of the year.
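That downtime figure is simple arithmetic; a minimal sketch (only the 2% figure comes from the slide above, the rest is illustration):

    # Convert a WLCG unavailability fraction into downtime per year.
    def downtime_days(unavailable_fraction, days_per_year=365):
        return unavailable_fraction * days_per_year

    print(downtime_days(0.02))  # -> 7.3: "2% unavailable" is about a week per year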

Slide 5: Tier-1 performance (ATLAS)

Slide 6: Local Heroes
1. dgeerts: 35 years, 117k jobs
2. vdgeer: 15 months, 6k jobs
3. templon (?): 18 days, 3k jobs
4. rvdleeuw: 13 hours, 3k jobs

Slide 7: New Hardware

Slide 8: End of Horizontal Scaling?
Manycore:
◦ LHC experiments "experimenting" (see the sketch below)
◦ Virgo group (general relativity simulations)
◦ LOFAR
◦ Biobanking
Hadoop? CERN gives a lot of courses these days.
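A minimal sketch of what "experimenting with manycore" typically means in practice: event-level parallelism within one node rather than one single-threaded job per core. The process_event function is a hypothetical stand-in, not code from any experiment:

    # Fan independent events out over all cores of a manycore node.
    from multiprocessing import Pool
    import os

    def process_event(event_id):
        # stand-in for real per-event reconstruction/analysis work
        return sum(i * i for i in range(event_id % 1000))

    if __name__ == "__main__":
        events = range(100_000)
        with Pool(os.cpu_count()) as pool:  # one worker per core
            results = pool.map(process_event, events, chunksize=1000)
        print(len(results), "events processed")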

Slide 9: Cloud Computing
◦ Lots at SARA; much internal use of "cloud"
◦ Great for services
◦ Not ready for HTC
[Figure: scalability of BiG Grid HPC at SARA; ATLAS analysis, H.C. Lee]

Slide 10: "Security"
"Operational security":
◦ Going strong; funded for continuation by EGI
"Middleware":
◦ Funding ends → maintenance
"Identity Federations":
◦ NL: work with SURFnet
◦ Extend use to other services … standardize by ubiquity

Slide 11: Hard-Core CS: Formal Methods (from last year; D. Remenska)

Slide 12: [figure-only slide]

Slide 13: CHEP 2013
1. Data Acquisition, Trigger and Controls
2. Event Processing, Simulation and Analysis
3. Distributed Processing and Data Handling
4. Data Stores, Data Bases, and Storage Systems
5. Software Engineering, Parallelism & Multi-Core
6. Facilities, Production Infrastructures, Networking and Collaborative Tools

Slide 14: The Rest …
…% of cluster, but that is 210 cores, 24x7. And this is just at Nikhef … there are other BiG Grid sites: SARA, Philips Eindhoven, Life Science Grid.
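For scale, a back-of-the-envelope sketch against the 3500-core figure from slide 3; the percentage on the slide itself did not survive, so the 6% below is an inference, not a number from the deck:

    # Rough scale of "210 cores, 24x7" against the 3500-core facility.
    cluster_cores = 3500          # slide 3: Nikhef Grid Facilities
    share_cores = 210             # slide 14
    print(share_cores / cluster_cores)  # -> 0.06, about 6% of the cluster
    print(share_cores * 24 * 365)       # -> 1,839,600 core-hours per year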

Slide 15: BiG Grid Projects 2012
◦ HEP analysis on Cloud
◦ Network provisioning for Cloud
◦ R portal
◦ Medical Imaging (AMC)
◦ Leiden Grid Initiative
◦ DANS (humanities)
◦ Identity Federations (Max Planck Inst. Psycholinguistics)
BiG Grid and Beyond in the next talk.