The Worldwide LHC Computing Grid. Frédéric Hemmer, IT Department Head. Visit of INTEL ISEF CERN Special Award Winners 2012, Thursday 21st June 2012.


The LHC Data Challenge
- The accelerator will run for 20 years
- Experiments are producing about 20 million gigabytes of data each year (about 3 million DVDs – 700 years of movies!)
- LHC data analysis requires a computing power equivalent to ~100,000 of today's fastest PC processors
- Requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
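A quick back-of-envelope check of these figures (a sketch in Python; the per-DVD capacity is an assumption, not taken from the slide):

    # Sanity-check the data-volume comparisons quoted above.
    annual_data_gb = 20e6                 # ~20 million gigabytes per year
    dvd_capacity_gb = 4.7                 # assumed single-layer DVD capacity
    print(annual_data_gb / dvd_capacity_gb / 1e6)   # ~4.3 million DVDs/year
    # (the slide's "3 million" implies a larger per-disc figure,
    #  e.g. dual-layer discs at ~8.5 GB give ~2.4 million)
    print(annual_data_gb * 20 / 1e9)      # ~0.4 exabytes over a 20-year run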

WLCG – what and why?
- A distributed computing infrastructure to provide the production and analysis environments for the LHC experiments
- Managed and operated by a worldwide collaboration between the experiments and the participating computer centres
- The resources are distributed – for funding and sociological reasons
- Our task was to make use of the resources available to us – no matter where they are located

Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (~130 centres): simulation, end-user analysis
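The tier split above fits in a few lines of data; a minimal sketch (roles and approximate centre counts taken from the slide):

    # The WLCG tier model: each tier's centre count and responsibilities.
    wlcg_tiers = {
        "Tier-0": (1,   ["data recording", "initial data reconstruction",
                         "data distribution"]),                 # CERN itself
        "Tier-1": (11,  ["permanent storage", "re-processing", "analysis"]),
        "Tier-2": (130, ["simulation", "end-user analysis"]),   # ~130 centres
    }
    for tier, (centres, roles) in wlcg_tiers.items():
        print(f"{tier}: {centres} centre(s): {', '.join(roles)}")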

Global Lambda Integrated Facility

Data acquired in 2012
- Data written: 9.4 PB in total to end of May; >3 PB in May (cf. 2 PB/month in 2011)
- Data accessed from tape, 2012
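Those tape-writing figures imply roughly the following rates (a sketch; assumes the 9.4 PB accumulated over the five months January to May):

    # Average and peak tape-writing rates implied by the 2012 figures.
    total_pb, months = 9.4, 5                           # 9.4 PB written, Jan-May 2012
    print(f"{total_pb / months:.1f} PB/month average")  # ~1.9 PB/month
    print(f"{3.0 / 2.0:.1f}x the 2011 rate in May")     # >3 PB vs 2 PB/month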

Data transfers
- Global transfers > 10 GB/s (1 day)
- Global transfers (last month)
- CERN → Tier-1s (last 2 weeks)
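For scale, a sustained 10 GB/s works out as follows (a short sketch):

    # Daily volume implied by a sustained 10 GB/s global transfer rate.
    rate_gb_s = 10
    daily_tb = rate_gb_s * 24 * 3600 / 1000
    print(f"~{daily_tb:.0f} TB moved per day")   # ~864 TB/day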

WLCG – no stop for computing: activity on 3rd January

Problem - Technology Explosion with NGS (next-generation sequencing)

Problem - Data Growth & Storage Costs
- Tiered storage (2x disk, 1x tape)
- Investment: +40% p.a.
- Disk price: -20% p.a.
- New storage: 2x every 15 months
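These rates roughly balance, which is the crux of the slide: a budget growing 40% a year spent on disks getting 20% cheaper buys 1.4 / 0.8 = 1.75x the capacity each year, almost exactly matching data that doubles every 15 months. A sketch of the compounding (assuming the percentages compose multiplicatively):

    # Affordable storage capacity vs. data growth, per year.
    budget_growth = 1.40                  # invest +40% p.a.
    price_decline = 0.80                  # disk price -20% p.a.
    print(f"x{budget_growth / price_decline:.2f} capacity per year")  # x1.75
    print(f"x{2 ** (12 / 15):.2f} data per year")   # x1.74 (2x every 15 months)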

Sequence Production & IT Infrastructure at EMBL
- Compute power: CPU cores, 6+ TB RAM
- Storage: 1+ PB high-performance disk
- 4 x Illumina HiSeq, producing TBs of data each week
- 2 x Illumina GAIIx

NGS - The Big Picture
- ~8.7 million species in the world (estimate); ~7 billion people
- Sequencers exist in both large centres & small research groups
- >200 Illumina HiSeq sequencers in Europe alone => capacity to sequence 1,600 human genomes / month
- Largest centre: Beijing Genomics Institute (BGI), 167 sequencers (130 HiSeq), 2,000 human genomes / day
- HiSeq devices worldwide today: 3-6 PB / day, i.e. 1.1 – 2.2 exabytes / year
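A quick consistency check of these throughput figures (a sketch; the per-device rate is derived from the slide's numbers, not stated on it):

    # Cross-check the sequencing throughput numbers quoted above.
    print(f"{3 * 365 / 1000:.1f} - {6 * 365 / 1000:.1f} EB/year")  # from 3-6 PB/day
    print(f"{1600 / 200:.0f} genomes per HiSeq per month")         # Europe-wide average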

World Map of High-throughput Sequencers

The CERN Data Centre in Numbers
Data Centre Operations (Tier 0):
- 24x7 operator support and system administration services to support 24x7 operation of all IT services
- Hardware installation & retirement: ~7,000 hardware movements/year; ~1,800 disk failures/year
- Management and automation framework for large-scale Linux clusters

High-speed routers (640 Mbps → 2.4 Tbps): 24
Ethernet switches: 350
10 Gbps ports: 2,000
Switching capacity: 4.8 Tbps
1 Gbps ports: 16,939
10 Gbps ports: 558
Racks: 828
Servers: 8,938
Processors: 15,694
Cores: 64,238
HEPSpec06: 482,507
Disks: 64,109
Raw disk capacity (TiB): 63,289
Memory modules: 56,014
Memory capacity (TiB): 158
RAID controllers: 3,749
Tape drives: 160
Tape cartridges: 45,000
Tape slots: 56,000
Tape capacity (TiB): 34,000
IT power consumption: 2,456 kW
Total power consumption: 3,890 kW
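One figure worth deriving from the table is the power usage effectiveness (PUE), the ratio of total to IT power; a sketch using the values above:

    # PUE and average drive size implied by the table above.
    print(f"PUE = {3890 / 2456:.2f}")           # total / IT power, ~1.58
    print(f"{63289 / 64109:.2f} TiB per disk")  # average raw drive capacity, ~1 TiB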

Scaling CERN Data Centre(s) to anticipated needs
- The CERN Data Centre dates back to the 1970s; now optimizing the current facility (cooling automation, temperatures, infrastructure)
- Renovation of the "barn" to accommodate 450 kW of "critical" IT loads – an EN, FP, GS, HSE, IT joint venture
- Exploitation of 100 kW of a remote facility downtown – understanding costs, remote dynamic management, ensuring business continuity
- Exploitation of a remote data centre in Hungary – 100 Gbps connections; agile infrastructure – virtualization
