"Grid" Computing
J. Templon, Nikhef Amsterdam, Physics Data Processing Group
SAC, 26 April 2012


Mission & Research Questions

Mission: operation of state-of-the-art computing resources for Nikhef physicists, participation in national and international distributed computing infrastructures, and R&D on large-scale scientific computing.

Fundamental Research
◦ Application of formal methods to distributed systems (with a PhD student & Vrije Universiteit CS)
◦ World-class "security" group
◦ Operational security in distributed infrastructures
◦ Data management in distributed infrastructures
◦ Scaling in real-world distributed systems

Engineering Research
◦ Operating part of the Netherlands T1 in-house: 3500 cores, 1.7 PB, 200 Gb/s backbone

Operations
◦ Physics users (WLCG and Nikhef)
◦ BiG Grid & other users (Life Sciences, Humanities, Astro, …)
◦ Applications

Conclusions
◦ T1 operationally in good shape
◦ BiG Grid thriving (final year)
◦ Reappraising group activities
◦ Funding during the next 2 years is a challenge

NWO Evaluation (19 Sept 2011)
◦ The computing infrastructure at Nikhef enables the science from the LHC and from the other areas of the Nikhef scientific programme. The committee endorses continued support for the computing operations teams and for the associated computing R&D.
◦ Nikhef has been playing a leadership role in HEP computing. We recommend that Nikhef pursue a senior scientist with expertise in scientific computing to succeed Kors Bos.
◦ The group has a successful history in strategic planning for distributed computing at both the national and international level. Now that the stabilization of grid operations is on track and operations responsibilities have transitioned to other organizations, we recommend that Nikhef update its long-term strategic plan for research and development in the area.
◦ The computing infrastructure (Tier-1, Stoomboot) must be upgraded to follow the increasing amount of data produced by the LHC.

Research programme: Grid. Current research programme leader: Jeff Templon. Tenured staff: 3.0; other personnel: 0 postdocs, 2 × 0.5 PhD students, 10 technical staff. Publications: 17.

Scores: Quality 5, Productivity 5, Relevance 5, Vitality 4.5, Overall 5.

Nikhef CPU last year

Usage per group

Non-collider usage

Not Just LHC

BiG Grid (2006 – 31 dec 2012)
◦ Last SAC: worries about lack of users
◦ 28 proposals in so far in 2012
◦ No active user-scrounging activity
◦ Main conclusions of the March AH meeting: happy users; need better internal communication

R&D activities

Security:
◦ Maintenance of products (!! WLCG, OSG, …)
◦ Federated identity (hot, but in flux)

Operations:
◦ Virtualization ("clouds")
◦ High-performance networking

Data Access & Management:
◦ Caching & content delivery
◦ Data access on clouds
◦ Formal methods & distributed systems

Computer Science: PhD project of Daniela Remenska. Formal-methods analysis of a distributed system; inconsistencies were found and then confirmed in the real system.
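The flavour of this kind of analysis can be shown with a toy sketch (purely illustrative; the slide does not name the tooling or the system, and `run`/`find_lost_update` below are hypothetical names): an explicit-state exploration of all interleavings of two processes doing a read-then-write on a shared counter, searching for an execution whose outcome violates the intended invariant.

```python
# Toy explicit-state exploration: exhaustively check interleavings of two
# "read counter, then write counter+1" processes for a lost update.
from itertools import permutations

def run(schedule):
    """Execute one interleaving and return the final counter value."""
    counter = 0
    regs = {}                       # per-process local register
    for pid, op in schedule:
        if op == "read":
            regs[pid] = counter     # read shared counter into register
        else:                       # "write": store register + 1 back
            counter = regs[pid] + 1
    return counter

def find_lost_update():
    """Return a counterexample interleaving where the final count != 2."""
    ops = [(0, "read"), (0, "write"), (1, "read"), (1, "write")]
    for sched in set(permutations(ops)):
        # only keep schedules where each process reads before it writes
        if (sched.index((0, "read")) < sched.index((0, "write")) and
                sched.index((1, "read")) < sched.index((1, "write"))):
            if run(sched) != 2:
                return sched
    return None

print(find_lost_update() is not None)  # True: a lost-update interleaving exists
```

A real model checker does the same state-space search symbolically and at scale; here the point is only that exhaustive exploration, unlike testing, is guaranteed to surface the bad interleaving if one exists.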

Funding slide from last SAC

National Funding
◦ e-infra ambitions & expected funding sources
◦ Requested Roadmap funding: not approved
◦ Good news: from 2014 onwards, 8 M€ of funding becomes budgeted (not project-based)

LHC Upgrade Roadmap
◦ To be submitted in 2013; funding starts in …
◦ Contains 0.9 M€/yr (~ hardware investments) for the T1
◦ Total estimated cost of the T1: 2.5 M€/yr (2011)

External Manpower Funding
◦ BiG Grid, ends 31 Dec 2012: 350 k€ / 4 FTE; 2 FTE (S&D) probably continue
◦ IGE, ends 31 March 2013: 70 k€ / 1.5 FTE
◦ EMI, ends 30 April 2013: 60 k€ / 1.5 FTE
◦ EGI-InSPIRE, ends 30 April 2014: 60 k€ / 2 FTE
◦ Reduction of 5 FTE of external funding post-April; … no new EU calls in …
◦ NLeSC?? Funded, but no projects to Nikhef so far …

Funding Gap Strategy
◦ Keep base infrastructure running (maintenance on enclosures, network, server park)
◦ Extend life of older CPU hardware (5% decay per year)
◦ Creative financing of running costs
◦ Replace disks if possible
◦ Upgrade disk space if possible
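The 5% annual decay assumption above implies geometric attrition of the farm. A minimal sketch (the 5% rate is from this slide and the ~3500-core figure from the mission slide; the flat-rate model itself is a simplification):

```python
# Rough illustration: cores still in service after n years of flat
# 5% annual attrition (failures / retirements), with no replacement.

def remaining_cores(initial_cores, years, decay=0.05):
    """Effective core count after `years` of geometric decay."""
    return round(initial_cores * (1 - decay) ** years)

# e.g. a ~3500-core farm after a 2-year funding gap:
print(remaining_cores(3500, 2))  # 3500 * 0.95**2, about 3159 cores
```

Over a 2-year gap the loss is modest (~10%), which is what makes "extend the life of older hardware" a viable bridge strategy rather than a permanent one.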

Conclusions
◦ T1 operationally in good shape
◦ BiG Grid thriving (final year)
◦ Reappraising group activities
◦ Funding during the next 2 years is a challenge

Backup slides

Ambitions