Ian Bird, LHC Computing Grid Project Leader
LHC Grid Fest, 3rd October 2008
A worldwide collaboration

Introduction
The LHC Grid Service is a worldwide collaboration between:
– the 4 LHC experiments,
– ~140 computer centres that contribute resources, and
– international grid projects providing software and services.
The collaboration is brought together by an MoU that:
– commits resources for the coming years, and
– agrees a certain level of service availability and reliability.
As of today, 33 countries have signed the MoU:
– CERN (Tier 0) + 11 large Tier 1 sites
– 130 Tier 2 sites in 60 "federations"
Other sites are expected to participate, but without formal commitment.

The LHC Computing Challenge
– Signal/Noise: only a tiny fraction of recorded collisions contain the physics of interest
– Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
– Compute power: event complexity × number of events × thousands of users → ~100k of today's fastest CPUs and 45 PB of disk storage
– Worldwide analysis & funding: computing is funded locally in major regions & countries, yet efficient analysis is needed everywhere → GRID technology
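As a sanity check on the ~15 PB/year figure, here is a minimal back-of-envelope sketch. The event size, trigger rate, and annual beam time are illustrative assumptions, not official experiment parameters; only the 4-experiment count and the ~15 PB/year total come from the slide.

```python
# Back-of-envelope check of the ~15 PB/year figure.
# All inputs are illustrative assumptions, not official experiment parameters.

SECONDS_OF_BEAM_PER_YEAR = 1e7   # typical accelerator "live" time in a year
EVENT_SIZE_MB = 1.5              # assumed raw event size after triggering
EVENTS_PER_SECOND = 250          # assumed rate written to storage, per experiment
N_EXPERIMENTS = 4                # from the slide

per_experiment_pb = (EVENT_SIZE_MB * EVENTS_PER_SECOND
                     * SECONDS_OF_BEAM_PER_YEAR) / 1e9   # MB -> PB
total_pb = per_experiment_pb * N_EXPERIMENTS
print(f"~{per_experiment_pb:.2f} PB per experiment, ~{total_pb:.0f} PB total per year")
```

With these round numbers the total lands at ~15 PB/year, matching the slide.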

Tier 0 at CERN: Acquisition, First-pass processing, Storage & Distribution
1.25 GB/sec (ions)

Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (~130 centres): simulation, end-user analysis
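The tier model can be summarised as a simple data structure. The site counts and role lists below come straight from the slide; the encoding itself (and the "host" descriptions) is just an illustrative sketch.

```python
# Illustrative encoding of the WLCG tier model described above.
# Counts and roles are from the slide; host descriptions are assumptions.
TIER_MODEL = {
    "Tier-0": {"sites": 1,   "host": "CERN",
               "roles": ["data recording", "initial reconstruction",
                         "data distribution"]},
    "Tier-1": {"sites": 11,  "host": "large national centres",
               "roles": ["permanent storage", "re-processing", "analysis"]},
    "Tier-2": {"sites": 130, "host": "universities and labs",
               "roles": ["simulation", "end-user analysis"]},
}

for tier, info in TIER_MODEL.items():
    print(f"{tier} ({info['sites']} site(s)): {', '.join(info['roles'])}")
```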

Evolution of Grids
[Timeline: the EU DataGrid and the US GriPhyN, iVDGL and PPDG projects evolve through LCG 1 and LCG 2, EGEE 1, EGEE 2 and EGEE, and GRID 3 into OSG, converging on WLCG; milestones along the way include the Data Challenges, the Service Challenges, cosmics running and first physics]

Preparation for accelerator start-up
Since 2004 WLCG has been running a series of challenges to demonstrate aspects of the system, with increasing targets for:
– data throughput
– workloads
– service availability and reliability
This culminated in a 1-month challenge in May, with all 4 experiments running realistic work (simulating what will happen in data taking), which demonstrated that we were ready for real data.
In essence, the LHC Grid service has been running for several years.

Recent grid use
The grid concept really works – all contributors, large & small, add to the overall effort!
[Chart: share of recent grid work – Tier 2: 54%, Tier 1: 35%, CERN: 11%; ~350k jobs/day]

Data transfer out of Tier 0
Full experiment rate needed is 650 MB/s
Desire the capability to sustain twice that, to allow for Tier 1 sites to shut down and recover
Have demonstrated far in excess of that:
– all experiments exceeded required rates for extended periods, and simultaneously
– all Tier 1s achieved (or exceeded) their target acceptance rates
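To see what these rates mean in volume, here is a small sketch. The 650 MB/s figure and the factor-of-two headroom are from the slide; the rest is plain arithmetic.

```python
# What the quoted Tier-0 export rates mean in daily volume.
# 650 MB/s and the 2x headroom are from the slide; the rest is arithmetic.

required_mb_s = 650
headroom = 2                 # sustain twice the nominal rate
seconds_per_day = 86_400

nominal_tb_day = required_mb_s * seconds_per_day / 1e6   # MB -> TB
target_tb_day = nominal_tb_day * headroom
print(f"nominal export: ~{nominal_tb_day:.0f} TB/day")
print(f"with recovery headroom: ~{target_tb_day:.0f} TB/day "
      f"({required_mb_s * headroom} MB/s sustained)")
```

At the nominal rate that is roughly 56 TB leaving CERN every day, around the clock.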

Production Grids
WLCG relies on a production-quality infrastructure
– Requires standards of availability/reliability, performance and manageability
– Will be used 365 days a year... (has been for several years!)
– Tier 1s must store the data for at least the lifetime of the LHC, ~20 years
  Not passive – requires active migration to newer media
Vital that we build a fault-tolerant and reliable system
– one that can deal with individual sites being down, and recover
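The "not passive" point is worth a rough illustration. The 20-year retention is from the slide; the assumed lifetime of a storage-media generation is a hypothetical round number for the sketch.

```python
# Why archival storage is "not passive": rough media-migration arithmetic.
# 20-year retention is from the slide; the 5-year media lifetime is assumed.

retention_years = 20
media_generation_years = 5   # assumed useful life of one tape generation
archive_pb_per_year = 15     # one year's new data, from the earlier slide

migrations = retention_years // media_generation_years - 1
print(f"each year's ~{archive_pb_per_year} PB must be copied forward "
      f"~{migrations} times over {retention_years} years")
```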

WLCG depends on two major science grid infrastructures:
– EGEE – Enabling Grids for E-sciencE
– OSG – US Open Science Grid
... as well as many national grid projects.
Interoperability & interoperation are vital: significant effort has gone into building the procedures to support them.

Enabling Grids for E-sciencE (EGEE-II INFSO-RI)
– sites in 45 countries
– 45,000 CPUs
– 12 PetaBytes
– > 5000 users
– > 100 VOs
– > 100,000 jobs/day
Applications: Archeology, Astronomy, Astrophysics, Civil Protection, Comp. Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …
Grid infrastructure project co-funded by the European Commission – now in its 2nd phase, with 91 partners in 32 countries

OSG Project: supported by the Department of Energy & the National Science Foundation
Partnering with:
– US LHC: Tier-1s, Tier-2s, Tier-3s
– Campus Grids: Clemson, Wisconsin, Fermilab, Purdue
– Regional & National Grids: TeraGrid, New York State Grid, EGEE, UK NGS
– International Collaboration: South America, Central America, Taiwan, Korea, UK
Access to 45,000 cores, 6 Petabytes disk, 15 Petabytes tape
> 15,000 CPU-days/day:
– ~85% physics: LHC, Tevatron Run II, LIGO
– ~15% non-physics: biology, climate, text mining, …
– including ~20% opportunistic use of others' resources
Virtual Data Toolkit: common software developed between Computer Science & applications, used by OSG and others

A worldwide collaboration
– Has been in production for several years
– Is now being used for real data
– Is ready to face the computing challenges as the LHC gets up to full speed