1 LHC Computing Grid Project
General Presentation – November 2001
Les Robertson, CERN – IT Division

The lectures will survey technologies that could be used for storing and managing the many PetaBytes of data that will be collected and processed at LHC. This will cover current mainline hardware technologies; their capacity, performance and reliability characteristics; the likely evolution in the period from now to LHC; and their fundamental limits. It will also cover promising new technologies, including both products emerging from the home and office computing environment (such as DVDs) and more exotic techniques. The importance of market acceptance and production volume as cost factors will be mentioned. Robotic handling systems for mass storage media will also be discussed. After summarising the mass storage requirements of LHC, some suggestions will be made of how these requirements may be met with the technology that will be available.

2 The Requirements

3 The Large Hadron Collider Project
4 detectors: ATLAS, CMS, LHCb, ALICE
Storage – raw recording rate 0.1–1 GBytes/sec, accumulating at 5–8 PetaBytes/year, 10 PetaBytes of disk
Processing – 200,000 of today's fastest PCs
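A back-of-envelope check (not from the slide) of how the raw recording rate maps onto the quoted yearly volume. The ~1e7 seconds of effective data-taking per year is an assumed, commonly used accelerator figure, and the function name is purely illustrative:

def petabytes_per_year(rate_gb_per_s, live_seconds=1.0e7):
    # Yearly volume in PetaBytes for a raw recording rate in GBytes/sec,
    # assuming ~1e7 seconds of effective data-taking per year (an assumption).
    return rate_gb_per_s * live_seconds / 1.0e6   # 1 PB = 1e6 GB (decimal units)

for rate in (0.1, 0.5, 1.0):
    print(f"{rate:4.1f} GB/s -> {petabytes_per_year(rate):5.1f} PB/year")

At 0.1 GB/sec this gives about 1 PB/year and at 1 GB/sec about 10 PB/year, bracketing the 5–8 PB/year quoted on the slide.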

4 Worldwide distributed computing system
Small fraction of the analysis at CERN
ESD analysis – using large regional centres
how to use the resources efficiently
establishing and maintaining a uniform physics environment
Data exchange – with tens of smaller regional centres, universities, labs
Importance of cost containment
components & architecture
utilisation efficiency
maintenance, capacity evolution
personnel & management costs
ease of use (usability efficiency)

5 From Distributed Clusters to Fabrics & Grids

6 Distributed Computing
Distributed Computing – 1990s: locally distributed systems
Clusters
Parallel computers (IBM SP)
Advances in local area networks, cluster management techniques → 1,000-way clusters widely available
Distributed Computing – 2000s
Giant clusters → fabrics
New level of automation required
Geographically distributed systems → Computational Grids
Key areas for R&D
Fabric management
Grid middleware
High-performance networking
Grid operation
[Diagram: application servers, data cache, WAN, mass storage]

7 The MONARC Multi-Tier Model (1999)
[Diagram – MONARC multi-tier model (MONARC report):
Tier 0: CERN
Tier 1: regional centres, e.g. FNAL, RAL, IN2P3 – links of 622 Mbps – 2.5 Gbps
Tier 2: Lab a, Uni b, Lab c, … Uni n – links of 155 Mbps
Department, Desktop]
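To give a feel for the link speeds in the tier diagram, a small sketch (purely illustrative; the 1 TB dataset size and the use of the full nominal bandwidth are assumptions) of how long a bulk transfer would take over each class of link:

def transfer_hours(data_tb, link_mbps):
    # Hours needed to move data_tb TeraBytes over a link of link_mbps megabits/sec,
    # assuming the full nominal bandwidth is usable end to end.
    bits = data_tb * 1.0e12 * 8                 # decimal TB -> bits
    return bits / (link_mbps * 1.0e6) / 3600.0  # seconds -> hours

for name, mbps in [("Tier 2 link, 155 Mbps", 155),
                   ("Tier 1 link, 622 Mbps", 622),
                   ("Tier 1 link, 2.5 Gbps", 2500)]:
    print(f"{name}: {transfer_hours(1.0, mbps):.1f} hours per TB")

Under these assumptions a TeraByte takes roughly 14 hours at 155 Mbps, 3.6 hours at 622 Mbps and under an hour at 2.5 Gbps.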

8 LHC Computing Model 2001 - evolving
The opportunity of Grid technology
[Diagram – LHC Computing Model 2001, evolving:
CERN Tier 0 – The LHC Computing Centre
CERN Tier 1 and Tier 1 centres in Germany, USA, UK, France, Italy, ……….
Tier 2: Lab a, Uni a, Lab b, Uni b, Lab c, Uni x, Uni y, Lab m, Uni n, …
Tier 3: physics department, desktop
regional groups, physics groups; experiments: ATLAS, CMS, LHCb]

9 The Project

10 The LHC Computing Grid Project
Goal – Prepare and deploy the LHC computing environment
applications – tools, frameworks, environment
computing system → services
cluster → fabric
collaborating computer centres → grid
CERN-centric analysis → global analysis environment
foster collaboration, coherence of LHC computing centres
This is not yet another grid technology project – it is a grid deployment project

11 The LHC Computing Grid Project
Two phases
Phase 1 – Development and prototyping
Approved by CERN Council 20 September 2001
Phase 2 – Installation and operation of the full world-wide initial production Grid
Costs (materials + staff) included in the LHC cost-to-completion estimates

12 The LHC Computing Grid Project
Phase 1 Goals –
Prepare the LHC computing environment
provide the common tools and infrastructure for the physics application software
establish the technology for fabric, network and grid management (buy, borrow, or build)
develop models for building the Phase 2 Grid
validate the technology and models by building progressively more complex Grid prototypes
operate a series of data challenges for the experiments
maintain reasonable opportunities for the re-use of the results of the project in other fields
Deploy a 50% model* production Grid including the committed LHC Regional Centres
Produce a Technical Design Report for the full LHC Computing Grid to be built in Phase 2 of the project
* 50% of the complexity of one of the LHC experiments

13 Areas of Work
Applications Support & Coordination
Computing System
Grid Technology
Grid Deployment

14 Applications Support & Coordination
Application Software Infrastructure – libraries, tools
Object persistency, data management tools, data models
Common Frameworks – Simulation, Analysis, ..
Adaptation of Physics Applications to Grid environment
Grid tools, Portals

15 Computing System
Physics Data Storage and Management
Fabric Management
LAN Management
Wide-area Networking
Security
Internet Services

16 Grid Technology
Grid middleware
Scheduling
Data Management
Monitoring
Error Detection & Recovery
Standard application services layer
Inter-project coherence/compatibility

17 Grid Technology for the LHC Grid
An LHC collaboration needs a usable, coherent computing environment – a Virtual Computing Centre – a Worldwide Grid
Already – even in the HEP community – there are several Grid technology development projects, with similar but different goals

18 A few of the Grid Technology Projects
Data-intensive projects
DataGrid – 21 partners, coordinated by CERN (Fabrizio Gagliardi)
CrossGrid – 23 partners, complementary to DataGrid (Michal Turala)
DataTAG – funding for transatlantic demonstration Grids (Olivier Martin)
European national HEP-related projects
GridPP (UK); INFN Grid; Dutch Grid; NorduGrid; Hungarian Grid; ……
US HEP projects
GriPhyN – NSF funding; HEP applications
PPDG – Particle Physics Data Grid – DoE funding
iVDGL – international Virtual Data Grid Laboratory
Global Coordination
Global Grid Forum
InterGrid – ad hoc HENP Grid coordination (Larry Price)

19 Grid Technology for the LHC Grid
An LHC collaboration needs a usable, coherent computing environment – a Virtual Computing Centre – a Worldwide Grid
Already – even in the HEP community – there are several Grid technology development projects, with similar but different goals
And many of these overlap with other communities
How do we achieve and maintain compatibility, and provide one usable computing system?
architecture? APIs? protocols? …… while remaining open to external, industrial solutions
This will be a significant challenge for the LHC Computing Grid Project
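One way to read the "standard application services layer" of slide 16 is as a thin common interface that hides the differences between middleware stacks. The sketch below is purely conceptual; all class and method names are hypothetical and do not correspond to any actual project or middleware API:

from abc import ABC, abstractmethod

class JobService(ABC):
    # What an LHC application would code against, independent of the middleware underneath.

    @abstractmethod
    def submit(self, executable, input_files):
        # Submit a job and return a middleware-independent job identifier.
        ...

    @abstractmethod
    def status(self, job_id):
        # Return a coarse status: 'queued', 'running', 'done' or 'failed'.
        ...

class DataGridAdapter(JobService):
    # Adapter mapping the common interface onto one middleware (illustrative only;
    # a real adapter would call that middleware's own submission and status services).

    def submit(self, executable, input_files):
        return f"datagrid-{abs(hash((executable, tuple(input_files)))) % 10**6}"

    def status(self, job_id):
        return "queued"

def run_analysis(service):
    # Application code depends only on JobService, so middleware stacks can be
    # swapped without touching the physics software.
    job_id = service.submit("reco.exe", ["raw_run_001.dat"])
    print(job_id, service.status(job_id))

run_analysis(DataGridAdapter())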

20 Grid Deployment
Data Challenges
Grid Operations
Integration of the Grid & Physics Environments
Network Planning
Regional Centre Coordination
Security & access policy

21 Synchronised with DataGrid Prototypes

22 Time constraints
[Timeline: continuing R&D programme; prototyping; pilot technology selection; pilot service; system software selection, development, acquisition; hardware selection, acquisition; 1st production service]

23 Organisation

24 The LHC Computing Grid Project Structure
[Organisation chart – The LHC Computing Grid Project:
Project Overview Board
Project Execution Board (Project Manager) → implementation teams
Software and Computing Committee (SC2) → requirements, monitoring, via RTAGs
Common Computing RRB → resource matters
reviews and reports to the oversight bodies]

25 The LHC Computing Grid Project Structure
[Organisation chart as on slide 24, with external links to the EU DataGrid Project, other Computing Grid Projects, other HEP Grid Projects and other Labs]

26 The LHC Computing Grid Project Structure
Project Overview Board
Chair: CERN Director for Scientific Computing
Secretary: CERN IT Division Leader
Membership:
Spokespersons of the LHC experiments
CERN Director for Colliders
Representatives of countries/regions with a Tier-1 centre: France, Germany, Italy, Japan, United Kingdom, United States of America
4 representatives of countries/regions with a Tier-2 centre, from CERN Member States
In attendance: Project Leader, SC2 Chairperson
[Organisation chart as on slide 24]

27 The LHC Computing Grid Project Structure
Software and Computing Committee (SC2) – preliminary
Sets the requirements
Approves the strategy & workplan
Monitors progress and adherence to the requirements
Gets technical advice from short-lived, focused RTAGs (Requirements & Technology Assessment Groups)
Chair: to be appointed by the CERN Director General
Secretary
Membership:
2 coordinators from each LHC experiment
Representative from CERN EP Division
Technical Managers from centres in each region represented in the POB
Leader of the CERN Information Technology Division
Project Leader
Invited: POB Chairperson
[Organisation chart as on slide 24]

28 The LHC Computing Grid Project Structure
Project Execution Board
Gets agreement on milestones, schedule, resource allocation
Manages the progress and direction of the project
Ensures conformance with SC2 recommendations
Identifies areas for study/resolution by SC2
Membership (preliminary – POB approval required):
Project Management Team: Project Leader; Area Coordinators for Applications, Fabric & basic computing systems, Grid technology (from worldwide grid projects), and Grid deployment, regional centres, data challenges
Empowered representative from each LHC Experiment
Project architect
Resource manager
Leaders of major contributing teams
Constrained to 15–18 members
[Organisation chart as on slide 24, with LHCC reviews]

29 Status

30 Funding of Phase 1 at CERN
Funding for R&D activities at CERN during Phase 1:
partly through special contributions from member and associate states – Austria, Belgium, Bulgaria, Czech Republic, France, Germany, Greece, Hungary, Israel, Italy, Spain, Switzerland, United Kingdom
Industrial funding – CERN openlab: Intel, Enterasys, KPNQwest
European Union – DataGrid, DataTAG; further possibilities (FP6)
Funded so far – all of the personnel, ~40% of the materials

31 Project Startup
Collaborations have named their representatives in the various committees
First pre-POB meetings being scheduled – will not be fully populated
PEB – 23 November
SC2 – beginning of December
Kick-off workshop in February?

