Jürgen Knobloch/CERN Slide 1 Grid Computing – Computing at the Petascale. By Jürgen Knobloch, CERN IT Department. Presented at Physics at the Terascale, DESY, Hamburg, December 4, 2007.

Jürgen Knobloch/CERN Slide 2 LHC gets ready …

Jürgen Knobloch/CERN Slide 3 … what about Computing?
– The challenge
– Starting to get the picture
– Going for the Grid
– Are we there?
– What other use is the Grid?
– Where do we go from here?

LHC Computing Challenge Jürgen Knobloch/CERN Slide 4

Jürgen Knobloch/CERN Slide 5 Timeline – LHC Computing
– LHC approved; ATLAS & CMS approved; ALICE approved; LHCb approved
– ATLAS & CMS CTP: 10^7 MIPS, 100 TB disk
– "Hoffmann" Review: 7×10^7 MIPS, 1,900 TB disk – ATLAS (or CMS) requirements for the first year at design luminosity
– Computing TDRs: 55×10^7 MIPS, 70,000 TB disk (140 MSi2K)
– LHC start
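For scale, the growth between the 1996 CTP estimates and the Computing TDR figures on this slide works out as follows (a quick check using only the numbers above):

```python
# Growth of the estimated LHC computing requirements, using the
# figures on this slide (CTP ~1996 vs. Computing TDRs).
ctp = {"cpu_mips": 1e7, "disk_tb": 100}
tdr = {"cpu_mips": 55e7, "disk_tb": 70_000}

for key in ctp:
    print(f"{key}: grew by a factor of {tdr[key] / ctp[key]:.0f}")
# cpu_mips: grew by a factor of 55
# disk_tb: grew by a factor of 700
```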

Options as seen in 1996 Jürgen Knobloch/CERN Slide 6

Jürgen Knobloch/CERN Slide 7 Evolution of CPU Capacity at CERN [chart: capacity over the accelerator eras SC (0.6 GeV), PS (28 GeV), ISR (300 GeV), SPS (400 GeV), ppbar (540 GeV), LEP (100 GeV), LEP II (200 GeV), LHC (14 TeV)]. Costs (in 2007 Swiss Francs) include infrastructure (computer centre, power, cooling, …) and physics tapes. Slide from Les Robertson.

Jürgen Knobloch/CERN Slide 8 Requirements Match?
– Tape & disk requirements: more than 10 times what CERN alone can provide
– Substantial computing required outside CERN
Data compiled by Les Robertson.

Jürgen Knobloch/CERN Slide 9 Timeline – Grids
– Grid projects: EU DataGrid; GriPhyN, iVDGL, PPDG; GRID 3, later OSG; LCG 1 and LCG 2, later WLCG; EGEE 1 and EGEE 2
– Milestones: Data Challenges, Service Challenges, Cosmics, First physics
– CCRC08: Common Computing Readiness Challenge

Jürgen Knobloch/CERN Slide 10 Centers around the world form a global computer system. The EGEE and OSG projects are the basis of the Worldwide LHC Computing Grid project (WLCG). Inter-operation between Grids is working!

WLCG Collaboration Jürgen Knobloch/CERN Slide 11

Jürgen Knobloch/CERN Slide 12 Events at LHC – Luminosity: 10^34 cm^-2 s^-1; bunch crossings at 40 MHz – every 25 ns; ~20 overlapping events per crossing.
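Those numbers hang together; here is a quick back-of-envelope check (the ~100 mb inelastic pp cross-section is an assumed textbook value, not from the slide):

```python
# Back-of-envelope check of the pile-up figure on this slide.
luminosity = 1e34          # cm^-2 s^-1, LHC design luminosity
sigma_inelastic = 100e-27  # cm^2 (100 mb; assumed inelastic pp cross-section)
crossing_rate = 40e6       # Hz, bunch crossings every 25 ns

interactions_per_second = luminosity * sigma_inelastic  # ~1e9 per second
pileup = interactions_per_second / crossing_rate        # ~25 events per crossing

print(f"{interactions_per_second:.1e} interactions/s, ~{pileup:.0f} events per crossing")
```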

Trigger & Data Acquisition Jürgen Knobloch/CERN Slide 13

Jürgen Knobloch/CERN Slide 14 Tier-0 Recording [chart: recording rates out of the Tier-0 in GB/sec, including the heavy-ion (Ions) case]

Jürgen Knobloch/CERN Slide 15 Tier Structure
– Tier-0 (CERN): data recording, first-pass reconstruction, data distribution
– Tier-1 (11 centres): permanent storage, re-processing, analysis
– Tier-2 (>200 centres): simulation, end-user analysis
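The division of labour on this slide can be written down as a tiny lookup structure; a minimal sketch, where the tier names, centre counts and roles come from the slide and the dispatch function is invented for illustration:

```python
# Minimal sketch of the WLCG tier model described on this slide.
TIERS = {
    "Tier-0": {"centres": 1,   "roles": {"recording", "first-pass reconstruction", "distribution"}},
    "Tier-1": {"centres": 11,  "roles": {"permanent storage", "re-processing", "analysis"}},
    "Tier-2": {"centres": 200, "roles": {"simulation", "end-user analysis"}},
}

def tiers_for(task: str) -> list:
    """Return the tiers whose declared roles include the given task."""
    return [name for name, info in TIERS.items() if task in info["roles"]]

print(tiers_for("simulation"))     # ['Tier-2']
print(tiers_for("re-processing"))  # ['Tier-1']
```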

Jürgen Knobloch/CERN Slide 16 Requirements 2006 [charts: CPU and disk requirements by site; CERN provides ~10% of the total]

Jürgen Knobloch/CERN Slide 17 Moore's Law delivered for CPU & disk. Slides from 1996 – expectations fulfilled! (Chart annotation: "We are now here.")

Jürgen Knobloch/CERN Slide 18 Network was a Concern … (Chart annotation: "We are now here.")

Tier-0-1: Optical Private Network Jürgen Knobloch/CERN Slide 19

OPN Traffic Jürgen Knobloch/CERN Slide 20

Data Transfer out of Tier-0 Jürgen Knobloch/CERN Slide 21

Jürgen Knobloch/CERN Slide 22 Middleware
Security:
– Virtual Organization Management (VOMS)
– MyProxy
Data management:
– File catalogue (LFC)
– File Transfer Service (FTS)
– Storage Element (SE)
– Storage Resource Management (SRM)
Job management:
– Workload Management System (WMS)
– Logging and Bookkeeping (LB)
– Computing Element (CE)
– Worker Nodes (WN)
Information system:
– Monitoring: BDII (Berkeley Database Information Index) and R-GMA (Relational Grid Monitoring Architecture) aggregate service information from multiple Grid sites; this role has now moved to SAM (Site Availability Monitoring)
– Monitoring & visualization: GridView, Dashboard, GridMap, etc.
(A toy sketch of how a job passes through these services follows below.)
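To make the interplay of these services concrete, here is a minimal, illustrative sketch in Python. The component names come from the slide; the classes, methods and simplified flow are invented for this sketch and are not the actual gLite API.

```python
# Toy model of a job's path through the middleware stack listed above.
# Real component names, invented interfaces: illustration only.
from dataclasses import dataclass, field

@dataclass
class LoggingAndBookkeeping:          # LB: records job state transitions
    events: list = field(default_factory=list)

    def log(self, job_id: str, state: str) -> None:
        self.events.append((job_id, state))

@dataclass
class ComputingElement:               # CE: gateway to a site's Worker Nodes
    site: str

    def execute(self, job_id: str) -> str:
        # Input data would be located via the LFC catalogue and read from
        # a Storage Element through SRM; omitted in this toy model.
        return f"{job_id}: output produced at {self.site}"

class WorkloadManagementSystem:       # WMS: matches jobs to suitable CEs
    def __init__(self, ces: list, lb: LoggingAndBookkeeping):
        self.ces, self.lb = ces, lb

    def submit(self, job_id: str, required_site: str) -> str:
        ce = next(c for c in self.ces if c.site == required_site)  # BDII lookup, simplified
        self.lb.log(job_id, "submitted")
        result = ce.execute(job_id)
        self.lb.log(job_id, "done")
        return result

lb = LoggingAndBookkeeping()
wms = WorkloadManagementSystem([ComputingElement("CERN"), ComputingElement("RAL")], lb)
print(wms.submit("reco-job-001", "RAL"))
print(lb.events)
```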

Jürgen Knobloch/CERN Slide 23 Site reliability

Jürgen Knobloch/CERN Slide 24 Site Reliability. Note: for a site to be reliable, many things have to work simultaneously!
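That note has a simple quantitative side: if a site needs n independent services to work at once, its overall reliability is roughly the product of the individual reliabilities. A minimal sketch, with the 99% per-service figure assumed purely for illustration:

```python
# If a site needs n independent services to all work at once, the chance
# that everything works is the product of the individual reliabilities.
# The 99% figure is an assumption for illustration, not from the slide.
service_reliability = 0.99
for n_services in (5, 10, 20):
    site_reliability = service_reliability ** n_services
    print(f"{n_services:2d} services at 99% each -> site reliability {site_reliability:.1%}")
# 20 services at 99% each already drop the site below 82%.
```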

Jürgen Knobloch/CERN Slide 25 ARDA Dashboard

Gridmap Jürgen Knobloch/CERN Slide 26

Jürgen Knobloch/CERN Slide 27 Increasing workloads – about ⅓ of the workload is non-LHC. Data compiled by Ian Bird.

Jürgen Knobloch/CERN Slide 28 Many Grid Applications
At present there are about 20 applications from more than 10 domains on the EGEE Grid infrastructure:
– Astronomy & Astrophysics: MAGIC, Planck
– Computational Chemistry
– Earth Sciences: Earth Observation, Solid Earth Physics, Hydrology, Climate
– Fusion
– High Energy Physics: the 4 LHC experiments (ALICE, ATLAS, CMS, LHCb), BaBar, CDF, DØ, ZEUS
– Life Sciences: Bioinformatics (Drug Discovery, Xmipp_MLrefine, etc.)
– Condensed Matter Physics
– Computational Fluid Dynamics
– Computer Science/Tools
– Civil Protection
– Finance (through the Industry Task Force)

Jürgen Knobloch/CERN Slide 29 Grid Applications [image montage: Medical, Seismology, Chemistry, Astronomy, Fusion, Particle Physics]

Jürgen Knobloch/CERN Slide 30 Available EGEE Infrastructure. Data compiled by Ian Bird.

Jürgen Knobloch/CERN Slide 31 Ramp-up Needed for Startup [charts: pledged vs. installed capacity and target vs. actual usage across April/July/September milestones; ramp-up factors of roughly 2.9× and 3.7× still needed]

Jürgen Knobloch/CERN Slide 32 3D – Distributed Deployment of Databases for LCG
– Oracle streaming with downstream capture (ATLAS, LHCb)
– SQUID/FRONTIER web caching (CMS), sketched below
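The CMS entry amounts to putting an ordinary web cache between clients and the database, so repeated read-only queries never reach Oracle. A toy sketch of that caching idea (the query strings and cache policy are invented for illustration):

```python
# Toy model of the SQUID/FRONTIER idea on this slide: read-only database
# queries are served through a cache, so repeated queries never hit the
# central database. All names and queries are illustrative.
cache = {}

def backend_query(sql: str) -> str:
    print(f"  (hit central database for: {sql})")
    return f"result-of({sql})"

def cached_query(sql: str) -> str:
    if sql not in cache:          # cache miss: go to the backend once
        cache[sql] = backend_query(sql)
    return cache[sql]             # cache hit: no load on the database

cached_query("SELECT * FROM calibration WHERE run=42")  # hits the database
cached_query("SELECT * FROM calibration WHERE run=42")  # served from cache
```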

The Next Step Jürgen Knobloch/CERN Slide 33

Jürgen Knobloch/CERN Slide 34 EGI – Consortium members
– Johannes Kepler Universität Linz (GUP)
– Greek Research and Technology Network S.A. (GRNET)
– Istituto Nazionale di Fisica Nucleare (INFN)
– CSC – Scientific Computing Ltd. (CSC)
– CESNET, z.s.p.o. (CESNET)
– European Organization for Nuclear Research (CERN)
– Verein zur Förderung eines Deutschen Forschungsnetzes – DFN-Verein (DFN)
– Science & Technology Facilities Council (STFC)
– Centre National de la Recherche Scientifique (CNRS)

Jürgen Knobloch/CERN Slide 35 EGI – European Grid Initiative. The EGI Design Study started in September 2007 to establish a sustainable pan-European grid infrastructure after the end of EGEE-3 in 2010. The main foundations of EGI are 37 National Grid Initiatives (NGIs). The project is funded by the European Commission's 7th Framework Programme.

Jürgen Knobloch/CERN Slide 36 Computing at the Tera-Scale
Tier-1 Centres: TRIUMF (Canada); GridKa (Germany); IN2P3 (France); CNAF (Italy); SARA/NIKHEF (NL); Nordic Data Grid Facility (NDGF); ASCC (Taipei); RAL (UK); BNL (US); FNAL (US); PIC (Spain).
The Grid is now in operation, working on: reliability, scaling up, sustainability.