The LHC Computing Grid – February 2008. The Challenges of LHC Computing. Dr Ian Bird, LCG Project Leader. 6th October 2009, Telecom 2009 Youth Forum.

Presentation transcript:

The LHC Computing Grid – February 2008. The Challenges of LHC Computing. Dr Ian Bird, LCG Project Leader. 6th October 2009, Telecom 2009 Youth Forum.

Collisions at the LHC: summary

pp collisions at 14 TeV at 10^34 cm^-2 s^-1. How to extract this (a Higgs → 4 muons event) from this: ~20 proton-proton collisions overlap in every bunch crossing, and this repeats every 25 ns… A very difficult environment (compare a Z event at LEP, e+e−).
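A rough back-of-the-envelope sketch of these rates, written as a minimal Python calculation using only the round numbers on the slide (25 ns bunch spacing, ~20 overlapping collisions); the script itself is illustrative, not from the talk:

    # Rough rate arithmetic for LHC pp running, using the slide's round numbers.
    bunch_spacing_s = 25e-9                   # 25 ns between bunch crossings
    crossing_rate_hz = 1 / bunch_spacing_s    # ~40 MHz crossing rate
    pileup = 20                               # ~20 pp collisions overlap per crossing
    collision_rate_hz = crossing_rate_hz * pileup   # ~8e8 pp collisions per second

    print(f"Bunch-crossing rate: {crossing_rate_hz:.1e} Hz")
    print(f"pp collision rate:   {collision_rate_hz:.1e} Hz")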

The LHC Computing Challenge
- Signal/Noise: ~10^-13 (10^-9 offline)
- Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
- Compute power: event complexity × number of events × thousands of users → 100k of (today's) fastest CPUs and 45 PB of disk storage
- Worldwide analysis & funding: computing is funded locally in major regions & countries, and efficient analysis is needed everywhere → GRID technology
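To see how a data volume of this order arises, here is a minimal sketch under assumed per-experiment inputs (a ~300 Hz recorded-event rate, ~1.5 MB per event, ~10^7 live seconds per year); none of these inputs appear on the slide, they are typical round values used only for illustration:

    # Illustrative arithmetic: high rate x large event size x 4 experiments.
    event_size_bytes = 1.5e6     # assumed average raw event size (~1.5 MB)
    trigger_rate_hz  = 300       # assumed rate of events written out per experiment
    live_seconds     = 1e7       # assumed effective seconds of data taking per year
    experiments      = 4

    bytes_per_year = event_size_bytes * trigger_rate_hz * live_seconds * experiments
    print(f"Raw data per year: {bytes_per_year / 1e15:.0f} PB")  # ~18 PB, the order of the 15 PB quoted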

A collision at LHC

The Data Acquisition

Tier 0 at CERN: acquisition, first-pass processing, storage & distribution. [Figure: data rates into the Tier 0 in GB/sec (ions).]

Tier 0 – Tier 1 – Tier 2
- Tier-0 (CERN): data recording; initial data reconstruction; data distribution
- Tier-1 (11 centres): permanent storage; re-processing; analysis
- Tier-2 (~130 centres): simulation; end-user analysis
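The tier roles above can be captured in a small data structure; a minimal Python sketch (the layout is purely illustrative, while the roles and centre counts are those listed on the slide):

    # The WLCG tier model as stated on the slide.
    tiers = {
        "Tier-0": {"sites": 1,   "roles": ["data recording", "initial data reconstruction", "data distribution"]},  # CERN
        "Tier-1": {"sites": 11,  "roles": ["permanent storage", "re-processing", "analysis"]},
        "Tier-2": {"sites": 130, "roles": ["simulation", "end-user analysis"]},  # ~130 centres
    }

    for name, info in tiers.items():
        print(f"{name}: {info['sites']} site(s) – " + ", ".join(info["roles"]))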

(w)LCG – Project and Collaboration
LCG was set up as a project in two phases:
- Phase I – development & planning; prototypes. At the end of this phase the computing Technical Design Reports were delivered (one for LCG and one per experiment).
- Phase II – deployment & commissioning of the initial services; programme of data and service challenges.
During Phase II, the WLCG Collaboration was set up as the mechanism for the longer term:
- via an MoU – signatories are CERN and the funding agencies;
- it sets out conditions and requirements for Tier 0, Tier 1 and Tier 2 services, reliabilities etc. ("SLA");
- it specifies resource contributions, with a 3-year outlook.

WLCG today: Tier 0; 11 Tier 1s; 61 Tier 2 federations (121 Tier 2 sites).
Tier 0: CERN. Tier 1 centres: DE-FZK, US-FNAL, CA-TRIUMF, NDGF, Barcelona/PIC, Lyon/CCIN2P3, US-BNL, UK-RAL, Taipei/ASGC, Amsterdam/NIKHEF-SARA, Bologna/CNAF.
Today we have 49 MoU signatories, representing 34 countries: Australia, Austria, Belgium, Brazil, Canada, China, Czech Rep., Denmark, Estonia, Finland, France, Germany, Hungary, Italy, India, Israel, Japan, Rep. Korea, Netherlands, Norway, Pakistan, Poland, Portugal, Romania, Russia, Slovenia, Spain, Sweden, Switzerland, Taipei, Turkey, UK, Ukraine, USA.

Preparation for accelerator start-up
Since 2004, WLCG has been running a series of challenges to demonstrate aspects of the system, with increasing targets for:
- data throughput
- workloads
- service availability and reliability
Recent significant challenges:
- May 2008 – Combined Readiness Challenge: all 4 experiments running realistic work (simulating data taking); demonstrated that we were ready for real data.
- June 2009 – Scale Testing: stress and scale testing of all workloads, including massive analysis loads.
In essence, the LHC Grid service has been running for several years.

Data transfer
- The full experiment rate needed is 650 MB/s.
- We want the capability to sustain twice that, to allow Tier 1 sites to shut down and recover.
- We have demonstrated far in excess of that: all experiments exceeded the required rates for extended periods, and simultaneously, and all Tier 1s have exceeded their target acceptance rates.
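Converting those sustained rates into daily volumes makes the target concrete; a minimal sketch using only the 650 MB/s figure and the factor-of-two headroom from the slide:

    # Daily transfer volumes implied by the slide's rates.
    required_mb_s   = 650        # full experiment rate needed (MB/s)
    headroom_factor = 2          # desired capability: twice the nominal rate
    seconds_per_day = 86_400

    nominal_tb_day = required_mb_s * seconds_per_day / 1e6   # MB -> TB
    target_tb_day  = nominal_tb_day * headroom_factor
    print(f"Nominal:  {nominal_tb_day:.0f} TB/day")   # ~56 TB/day
    print(f"Headroom: {target_tb_day:.0f} TB/day")    # ~112 TB/day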

Grid Activity – distribution of CPU delivered
The distribution of work across Tier 0 / Tier 1 / Tier 2 really illustrates the importance of the grid system:
- the Tier 2 contribution is ~50%;
- >85% of the CPU is delivered outside CERN.
[Chart: CPU delivered by Tier 2 sites vs. Tier 0 + Tier 1 sites.]
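A small sketch of what those two percentages imply for the split of delivered CPU; the Tier-1 share is simply inferred from the slide's numbers and is an approximation, not a figure from the talk:

    # Rough CPU-delivery split implied by the slide (~50% Tier-2, >85% outside CERN).
    tier2_share    = 0.50                        # Tier-2 contribution ~50%
    external_share = 0.85                        # >85% delivered outside CERN
    cern_share     = 1 - external_share          # => CERN (Tier-0) delivers under ~15%
    tier1_share    = external_share - tier2_share    # remainder at the 11 Tier-1s, ~35%

    print(f"CERN ~{cern_share:.0%}, Tier-1s ~{tier1_share:.0%}, Tier-2s ~{tier2_share:.0%}")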

First events

WLCG depends on two major science grid infrastructures:
- EGEE – Enabling Grids for E-sciencE
- OSG – US Open Science Grid
Interoperability & interoperation is vital; significant effort has gone into building the procedures to support it.

EGEE-III (Enabling Grids for E-sciencE) scale:
- users; LCPUs (cores); 25 PB disk; 39 PB tape
- 12 million jobs/month (+45% in a year)
- 268 sites (+5% in a year)
- 48 countries (+10% in a year)
- 162 VOs (+29% in a year)
(Figures from "Technical Status", Steven Newhouse, EGEE-III First Review, June 2009.)

CERN Computing – in numbers
- CPU: 5,700 systems = … cores (plus 3,000 planned systems, … cores); used for CPU servers, disk servers and general services.
- Disk: … TB on disk drives (plus planned … TB on additional drives).
- Tape: … TB on tape cartridges; … tape slots in robots; 160 tape drives.
- Computer centre: 2.9 MW usable power, plus ~1.5 MW for cooling.
