LHC Computing Grid Project


LHC Computing Grid Project
GridPP Collaboration Meeting
Edinburgh, November 2001
Les Robertson, CERN - IT Division
les.robertson@cern.ch

The lectures will survey technologies that could be used for storing and managing the many PetaBytes of data that will be collected and processed at LHC. This will cover current mainline hardware technologies, their capacity, performance and reliability characteristics, the likely evolution in the period from now to LHC, and their fundamental limits. It will also cover promising new technologies, including both products emerging from the home and office computing environment (such as DVDs) and more exotic techniques. The importance of market acceptance and production volume as cost factors will be mentioned. Robotic handling systems for mass storage media will also be discussed. After summarising the mass storage requirements of LHC, some suggestions will be made of how these requirements may be met with the technology that will be available.

The Requirements

The Large Hadron Collider Project

4 detectors: ATLAS, CMS, LHCb, ALICE

Storage:
- raw recording rate 0.1 – 1 GBytes/sec
- accumulating at 5-8 PetaBytes/year
- 10 PetaBytes of disk

Processing:
- 200,000 of today's fastest PCs
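As a rough cross-check of these figures, here is a minimal back-of-the-envelope sketch in Python. The recording rates come from the slide above; the effective data-taking time of about 1e7 seconds per year is an assumed value, not taken from the slides. The result brackets the stated accumulation of 5-8 PetaBytes/year.

    # Back-of-the-envelope check of the storage figures above.
    # The recording rates (0.1 and 1 GBytes/sec) come from the slide;
    # the effective data-taking time of ~1e7 seconds/year is an assumption.
    GB = 1e9   # bytes
    PB = 1e15  # bytes

    live_seconds_per_year = 1e7  # assumed effective data-taking time

    for rate_gb_per_s in (0.1, 1.0):
        volume_pb = rate_gb_per_s * GB * live_seconds_per_year / PB
        print(f"{rate_gb_per_s:4.1f} GBytes/sec -> {volume_pb:5.1f} PetaBytes/year")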

Worldwide distributed computing system

- Small fraction of the analysis at CERN
- ESD analysis using 12-20 large regional centres
  - how to use the resources efficiently
  - establishing and maintaining a uniform physics environment
- Data exchange with tens of smaller regional centres, universities, labs
- Importance of cost containment
  - components & architecture
  - utilisation efficiency
  - maintenance, capacity evolution
  - personnel & management costs
  - ease of use (usability efficiency)

From Distributed Clusters to Fabrics & Grids

Distributed Computing

Distributed computing – 1990's – locally distributed systems:
- clusters
- parallel computers (IBM SP)
- advances in local area networks and cluster management techniques → 1,000-way clusters widely available

Distributed computing – 2000's:
- giant clusters → fabrics: a new level of automation required
- geographically distributed systems → computational Grids

Key areas for R&D:
- fabric management
- Grid middleware
- high-performance networking

[Diagram: mass storage, application servers, WAN, data cache]

The MONARC Multi-Tier Model (1999)

[Diagram: a hierarchy of centres – CERN (Tier 0); Tier 1 centres such as FNAL, RAL and IN2P3; Tier 2 centres (Lab a, Uni b, Lab c, … Uni n); down to department and desktop level – connected by wide-area links of 155 Mbps, 622 Mbps and 2.5 Gbps.]

MONARC report: http://home.cern.ch/~barone/monarc/RCArchitecture.html
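To give a feel for what these link speeds imply, here is a minimal sketch; the 1 TByte example dataset and the assumption of a dedicated, overhead-free link are illustrative choices, not from the slides.

    # Illustrative transfer times over the link speeds quoted in the
    # MONARC diagram. The 1 TByte dataset and the absence of protocol
    # overhead or contention are assumptions for illustration only.
    TB_BYTES = 1e12

    links_bits_per_sec = {
        "155 Mbps": 155e6,
        "622 Mbps": 622e6,
        "2.5 Gbps": 2.5e9,
    }

    for name, bps in links_bits_per_sec.items():
        hours = TB_BYTES * 8 / bps / 3600
        print(f"{name}: ~{hours:.1f} hours to move 1 TByte")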

LHC Computing Model 2001 – evolving

The opportunity of Grid technology

[Diagram: the evolving multi-tier model – the CERN Tier 0 centre (the LHC Computing Centre) and a CERN Tier 1; Tier 1 centres in France, Germany, Italy, UK, USA, …; Tier 2 centres at labs and universities organised in regional groups; Tier 3 physics department resources and desktops – serving the experiments (ATLAS, CMS, LHCb) and their physics groups.]

The Project

The LHC Computing Grid Project

Two phases:
- Phase 1 – 2002-04: development and prototyping. Approved by CERN Council, 20 September 2001.
- Phase 2 – 2005-07: installation and operation of the full world-wide initial production Grid.

The LHC Computing Grid Project – Phase 1 Goals

Prepare the LHC computing environment:
- provide the common tools and infrastructure for the physics application software
- establish the technology for fabric, network and grid management (buy, borrow, or build)
- develop models for building the Phase 2 Grid
- validate the technology and models by building progressively more complex Grid prototypes
- operate a series of data challenges for the experiments
- maintain reasonable opportunities for the re-use of the results of the project in other fields

Deploy a 50% model* production Grid including the committed LHC Regional Centres.

Produce a Technical Design Report for the full LHC Computing Grid to be built in Phase 2 of the project.

* 50% of the complexity of one of the LHC experiments

Funding of Phase 1 at CERN

Funding for R&D activities at CERN during 2002-2004 comes partly through special contributions from member and associate states.

Major funding – people and materials – from:
- United Kingdom, as part of PPARC's GridPP project
- Italy – INFN

Personnel and some materials at CERN also promised by Austria, Belgium, Bulgaria, Czech Republic, France, Germany, Greece, Hungary, Israel, Spain and Switzerland.

Industrial funding: CERN openlab
European Union: DataGrid, DataTAG

Funded so far: all of the personnel, ~1/3 of the materials.

Areas of Work

Computing System:
- physics data management
- fabric management
- physics data storage
- LAN management
- wide-area networking
- security
- internet services

Grid Technology:
- grid middleware
- scheduling
- data management
- monitoring
- error detection & recovery
- standard application services layer

Applications Support & Coordination:
- application software infrastructure – libraries, tools
- object persistency, data models
- common frameworks – simulation, analysis, …
- adaptation of physics applications to the Grid environment
- grid tools, portals

Grid Deployment:
- data challenges
- integration of the Grid & physics environments
- regional centre coordination
- network planning
- grid operations

Synchronised with DataGrid Prototypes

Time constraints

[Timeline, 2001-2006: continuing R&D programme; prototyping; pilot technology selection; pilot service; system software selection, development, acquisition; hardware selection, acquisition; 1st production service.]

Organisation

The LHC Computing Grid Project Structure

[Organisation chart: the Common Computing RRB (resource matters) and review bodies (reviews, reports) relate to the LHC Computing Grid Project; within the project, the Project Overview Board, the Project Manager with the Project Execution Board and its implementation teams, and the Software and Computing Committee (SC2) (requirements, monitoring) with its RTAGs.]

The LHC Computing Grid Project Structure

[The same organisation chart, additionally showing the external bodies the project works with: the EU DataGrid Project, other Computing Grid projects, other HEP Grid projects, and other labs.]

A few of the Grid Technology Projects

Data-intensive projects:
- DataGrid – 21 partners, coordinated by CERN (Fabrizio Gagliardi)
- CrossGrid – 23 partners, complementary to DataGrid (Michal Turala)
- DataTAG – funding for transatlantic demonstration Grids (Olivier Martin)

European national HEP-related projects:
- GridPP (UK); INFN Grid; Dutch Grid; NorduGrid; Hungarian Grid; ……

US HEP projects:
- GriPhyN – NSF funding; HEP applications
- PPDG – Particle Physics Data Grid – DoE funding
- iVDGL – international Virtual Data Grid Laboratory

Global coordination:
- Global Grid Forum
- InterGrid – ad hoc HENP Grid coordination (Larry Price)

Grid Technology for the LHC Grid

An LHC collaboration needs a usable, coherent computing environment – a Virtual Computing Centre, a worldwide Grid.

Already – even in the HEP community – there are several Grid technology development projects with similar but different goals, and many of these overlap with other communities.

How do we achieve and maintain compatibility and provide one usable computing system? A common architecture? APIs? Protocols? … while remaining open to external, industrial solutions.

This will be a significant challenge for the LHC Computing Grid Project.

The LHC Computing Grid Project Structure – Project Overview Board

- Chair: CERN Director for Scientific Computing
- Secretary: CERN IT Division Leader
- Membership:
  - spokespersons of the LHC experiments
  - CERN Director for Colliders
  - representatives of countries/regions with a Tier-1 centre: France, Germany, Italy, Japan, United Kingdom, United States of America
  - 4 representatives of countries/regions with a Tier-2 centre, from CERN Member States
- In attendance: Project Leader, SC2 Chairperson

The LHC Computing Grid Project Structure – Software and Computing Committee (SC2) (preliminary)

The SC2:
- sets the requirements
- approves the strategy & workplan
- monitors progress and adherence to the requirements
- gets technical advice from short-lived, focused RTAGs (Requirements & Technology Assessment Groups)

- Chair: to be appointed by the CERN Director General
- Secretary
- Membership:
  - 2 coordinators from each LHC experiment
  - representative from the CERN EP Division
  - technical managers from centres in each region represented in the POB
  - Leader of the CERN Information Technology Division
  - Project Leader
- Invited: POB Chairperson

The LHC Computing Grid Project Structure – Project Execution Board

The PEB:
- gets agreement on milestones, schedule, resource allocation
- manages the progress and direction of the project
- ensures conformance with SC2 recommendations
- identifies areas for study/resolution by SC2

Membership (preliminary – POB approval required):
- Project Management Team:
  - Project Leader
  - area coordinators for:
    - applications
    - fabric & basic computing systems
    - grid technology – from worldwide grid projects
    - grid deployment, regional centres, data challenges
- empowered representative from each LHC experiment
- project architect
- resource manager
- leaders of major contributing teams

Constrained to 15-18 members.

Startup

- Collaborations to appoint board members by 12 November
- Hope to start POB, SC2, PEB meetings in November
- Kick-off workshop in February