LHC Computing at CERN and elsewhere
The LHC Computing Grid Project as approved by Council on September 20, 2001
M. Kasemann, FNAL
October 30, 2001, ATLAS PCAP
See:

LHC Computing Grid Project (approved by Council, September 20, 2001)
- Following the recommendations of the LHC Computing Review in 2000, CERN proposes the LHC Computing Grid Project.
- It involves a two-phased approach to the problem, covering the years 2001 to 2007:
  - Phase 1: development and prototyping at CERN and in Member States and Non-Member States from 2001 to 2004, requiring expert manpower and some investment to establish a distributed production prototype at CERN and elsewhere that will be operated as a platform for the data challenges of the experiments.
  - The experience acquired towards the end of this phase will allow the elaboration of a Technical Design Report, which will serve as a basis for agreeing the relations between the distributed Grid nodes and their co-ordinated deployment and exploitation.
  - Phase 2: installation and operation of the full worldwide initial production Grid system in the years 2005 to 2007, requiring continued manpower efforts and substantial material resources.
- Milestones and activities can be defined precisely for the immediately following years, becoming progressively less certain for the more distant future.

Formal Project Structure
- A formal project structure will ensure the achievement of the required functionality and performance of the overall system with an efficient use of the allocated resources.
- Participation in the project structure by the LHC experiments and the emerging regional centres will ensure the formulation of a work plan addressing the fundamental needs of the LHC experimental programme.
- The formulation of the work plan as a set of work packages, schedules and milestones will facilitate contributions by collaborating institutes and by pre-existing projects, in particular the EU DataGrid and other Grid projects.
- Appropriate liaisons with these pre-existing projects as well as with industry will be put in place to promote efficient use of resources, avoid duplication of work and preserve possibilities for technology transfer.
- Leadership provided by CERN would have a clear executive role in the CERN developments and in the provision of the application infrastructure, and would provide co-ordination for the prototyping and development of the Regional Centres.

The LHC Computing Grid Project Structure
[Organisation chart: the project is overseen by the LHCC and, for resource matters, by the Common Computing RRB. Under the Project Leader sit the Project Overview Board, the Project Execution Board and the Software and Computing Committee (SC2), which drives work-plan definition through work packages, RTAGs, reports and reviews. The project liaises with e-Science, other computing and HEP Grid projects, the EU DataGrid project, industry, the Regional Centres and the host labs.]

The LHC Computing Grid Project Structure: Project Overview Board
Chair: CERN Director for Scientific Computing
Secretary: CERN Information Technology Division Leader
Membership:
- Spokespersons of the LHC experiments
- CERN Director for Colliders
- Representatives of countries/regions with a Tier-1 centre: France, Germany, Italy, Japan, United Kingdom, United States of America
- 4 representatives of countries/regions with Tier-2 centres from the CERN Member States
In attendance: Project Leader, SC2 Chairperson

The LHC Computing Grid Project Structure: Software and Computing Committee (SC2) (preliminary)
Chair: to be appointed by the CERN Director General
Secretary:
Membership:
- 2 coordinators from each LHC experiment
- Representative from the CERN EP Division
- Technical managers from centres in each region represented in the POB
- Leader of the CERN Information Technology Division
- Project Leader
Invited: POB Chairperson

The LHC Computing Grid Project Structure: Project Execution Board (preliminary; POB approval required)
- Constrained to 15-18 members
- Project Management Team:
  - Project Leader
  - Project Architect
  - Area Coordinators: applications; fabric & basic computing systems; Grid technology; Grid deployment, regional centres and data challenges
- Empowered representative from each LHC experiment
- Leaders of major contributing teams

CERN openlab concept
A collaborative forum between the public sector and industry to solve a well-defined problem through open integration of technologies, aiming at open standards (for example, the Web with HTML and XML).
- Create synergies between basic research and industry
- Research provides the challenge; industry brings advanced items and concepts into the collaborative forum
- Participation fee

The project requires collaboration
- The CERN activity is part of the wider programme of work that must be undertaken as a close collaboration between:
  - CERN: all computing activities
  - the Regional Centres
  - the institutes participating in the experiments
- Scope: to develop, test and build the full LHC Computing Grid.
- Once the project has begun and both CERN and the Regional Centres have completed more detailed plans, it may be appropriate to change the balance of the investment and activities made at CERN and in other centres.
- This balance should be reviewed at regular intervals during the life of the project.

Phase 1: High-Level Goals
- Provide a common infrastructure that LHC experimentalists can use to develop and run their applications efficiently on a grid of computing fabrics.
- Execute successfully a series of data challenges satisfying the requirements of the experiments.
- Provide methodologies, technical guidelines and costing models for building the high-throughput, data-intensive computing fabrics and grids that will be required for Phase 2 of the project.
- Validate the methodologies, guidelines and models by building a series of prototypes of the Tier-0 and distributed Tier-1 facility of increasing capacity and complexity, demonstrating operation as an integrated Grid computing system with the required levels of reliability and performance.
- Provide models that can be used by various institutions to build the remaining parts of the tiered model (Tier-2 and below).
- Maintain reasonable opportunities for the re-use of the results of the project in other fields, particularly in science.
- Produce a Technical Design Report for the LHC Computing Grid to be built in Phase 2 of the project.

Project Scope: Phase 1 (-2004)
- Build a prototype of the LHC Computing Grid, with capacity and performance satisfying the needs of the forthcoming data challenges of the LHC experiments:
  - Develop the system software, middleware and expertise required to manage very large-scale computing fabrics to be located at various locations.
  - Develop the Grid middleware to organise the interaction between such computing fabrics installed at geographically remote locations, creating a single coherent computing environment.
  - Acquire experience with high-speed wide-area network and data management technologies, developing appropriate tools to achieve the required levels of performance and reliability for migration, replication and caching of large data collections between computing fabrics (a toy illustration follows this slide).
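An aside to make the data-management goal concrete: the following Python sketch shows, in toy form, what a replica catalogue with a crude cost model looks like. It is a minimal illustration under invented assumptions; the names (Site, ReplicaCatalog, best_source) and the bandwidth figures are hypothetical and do not correspond to any actual DataGrid or LCG component.

from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    wan_mbps: float  # nominal WAN bandwidth towards this site (hypothetical)

@dataclass
class ReplicaCatalog:
    # dataset name -> set of names of sites holding a copy
    replicas: dict = field(default_factory=dict)

    def register(self, dataset: str, site: Site) -> None:
        # Record that `site` now holds a copy of `dataset`.
        self.replicas.setdefault(dataset, set()).add(site.name)

    def best_source(self, dataset: str, sites: dict) -> Site:
        # Pick the replica at the site with the highest bandwidth -- a
        # stand-in for a real cost model (load, latency, locality).
        holders = self.replicas.get(dataset, set())
        if not holders:
            raise LookupError(f"no replica of {dataset} registered")
        return max((sites[name] for name in holders), key=lambda s: s.wan_mbps)

# Usage: register a dataset at two sites and pick a transfer source.
sites = {s.name: s for s in (Site("CERN", 2500.0), Site("FNAL", 622.0))}
catalog = ReplicaCatalog()
catalog.register("dc1/raw", sites["CERN"])
catalog.register("dc1/raw", sites["FNAL"])
print(catalog.best_source("dc1/raw", sites).name)  # -> CERN

A real replica manager would additionally deal with consistency, staging from mass storage, caching policy and authorisation; the sketch only captures the catalogue-plus-cost-model shape of the problem.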

Project Scope: Phase 1 (-2004), continued
- Develop a detailed model for distributed data analysis for LHC, refining previous work by the MONARC collaboration and the LHC Computing Review, providing detailed estimates of data access patterns to enable realistic modelling and prototyping of the complex Grid environment (a back-of-the-envelope sketch follows this slide).
- Adapt LHC applications to exploit the fabric and Grid environment.
- Progressively deploy at CERN and in a number of future Tier-1 and Tier-2 centres a half-scale (for a single LHC experiment) prototype of the LHC Computing Grid, demonstrating the required functionality, usability, performance and production-quality reliability.
- Define the characteristics of the initial full production facility, including the CERN Tier-0 centre, the Tier-1 environment distributed across CERN and the Regional Centres, and the integration with the Tier-2 installations.
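To illustrate what "detailed estimates of data access patterns" feed into, here is a back-of-the-envelope model of the kind MONARC-style studies produce: given an event rate, an event size and a re-read factor, it estimates the sustained WAN bandwidth a centre would need. All numbers are invented placeholders, not parameters of any LHC experiment.

# Every figure below is an invented placeholder, not a real LHC parameter.

def required_bandwidth_mbps(events_per_day: float,
                            event_size_mb: float,
                            reread_factor: float) -> float:
    # Average WAN bandwidth needed to ship one day's events, counting
    # repeated reads of the same data by analysis jobs.
    megabytes_per_day = events_per_day * event_size_mb * reread_factor
    return megabytes_per_day * 8 / 86_400  # megabits per second

# Hypothetical example: 10 million 1 MB events per day, each read 3 times.
print(f"{required_bandwidth_mbps(10e6, 1.0, 3.0):.0f} Mbps")  # ~2778 Mbps

With the hypothetical figures above the estimate comes out near 2.8 Gbps sustained; realistic modelling would refine each input per experiment and per tier.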

Project Scope: Phase 1 (-2004), continued
- Complete the development of the first versions of the physics application software and enable these for the distributed computing Grid model:
  - Develop and support common libraries, tools and frameworks to support the development of the application software, particularly in the areas of simulation and analysis.
  - In parallel with this, the LHC collaborations must develop and deploy the first versions of their core software.

Project Scope: Phase 1 (-2004), continued
- This work will be carried out in a close collaboration between CERN, the Tier-1 and Tier-2 centres, and the LHC collaborations and their participating institutes.
- It is assumed that for Phase 1 the national funding agencies in the Member States and non-Member States will ensure the construction of the prototype Tier-1 and Tier-2 centres and their share of the distributed computing infrastructure.
- It is further assumed that the experimental collaborations will ensure that the software requirements are met.

Project Scope: Phase 2 (2005-7)
- Construct the initial full production version of the LHC Computing Grid (2005-2007) according to the experience gained in the years of prototyping.
  - The resources required are still not known to a sufficient degree of precision, but will be defined as part of the Phase 1 activity.
  - A specific proposal for Phase 2 will be prepared during 2003, in line with the Computing Technical Design Reports of the LHC experiments.
- Regular later updates (after 2007) of the LHC Computing Grid should be foreseen, according to the accumulation of data and the evolving needs of the experiments.

Human Resources required at CERN

Estimated Material Cost at CERN

Additional Resources required at CERN
[Resource table; figures not transcribed. Recoverable notes: Phase 1 totals shown; Phase 2 under discussion with the FC; the Phase 2 total remains, including 10 MCHF contingency.]

Summary of Milestones (1)
- March 2002: Prototype I
  - Performance and scalability testing of components of the computing fabric (clusters, disk storage, mass storage system, system installation, system monitoring) using straightforward physics applications.
  - Testing of job scheduling and data replication software.
  - Operation of a straightforward Grid including several Tier-1 centres.
- March 2003: Prototype II
  - Prototyping of the integrated local computing fabric, with emphasis on scaling, reliability and resilience to errors.
  - Performance testing of LHC applications at about 50% of the final prototype scale.
  - Stable operation of a distributed Grid for data challenges including a number of Tier-1 and Tier-2 centres.

Summary of Milestones (2)
- Dec. 2003: Phase 2 Proposal
  - A detailed proposal for the construction of the full LHC Computing Grid, as Phase 2 of the project, including resource estimates.
- March 2004: Prototype III
  - Testing of the complete LHC computing model with fabric management and Grid management software for Tier-0 and Tier-1 centres, with some Tier-2 components.
  - This is the prototype system that will be used to define the parameters for the acquisition of the initial LHC production system.
  - This will use the final software delivered by the DataGrid project.

Summary of Milestones (3)
- Dec. 2004: Production Prototype
  - Model of the initial phase of the production services, including final selections of the software and hardware implementations, demonstrating appropriate reliability and performance characteristics.
- Dec. 2004: TDR
  - Technical Design Report for the LHC Computing Grid to be built in Phase 2 of the project.

Status of DataGrid
- Started on 1/1/2001; 9.8 M euros of EU funding; 21 partners (main partners from CERN Member State funding agencies and institutions: PPARC, INFN, CNRS, NIKHEF, ESA).
- Programme of work concentrated on middleware, test beds and applications (90% HEP/LHC focussed).
- Project in good shape, ready for the first test bed to be released to applications by October; formal EU deadline at the end of the year; first EU review in March 2002.
- Other EU projects: CrossGrid, DataTAG and, later, the Dissemination Cluster.

Collaboration with other Grid activities
[Logos: iVDGL, INFN Grid, CrossGrid, DataTAG, ...]
- Intensive collaboration with other Grid projects in Europe (EU projects, GridPP, INFN-Grid, etc.) and elsewhere (mostly in the US: GriPhyN, PPDG, DTF, iVDGL) to ensure interoperability of the various Grid efforts.
- Major role in international bodies: GGF (Global Grid Forum) and InterGrid (coordination among HEP Grid projects).
- Starting effort devoted to the LHC Computing Grid.
- Monitor and support (within capabilities) similar activities in other sciences.

Next Steps
- POB, PEB, SC2: first meetings in November or December
  - Identify dates
  - Write letters to the experiments requesting proposals for participants
  - Write letters to the contributors requesting proposals for participants
- Names required by mid-November, so that the boards can hold their first meetings this year
- Early next year: launching workshop for all those doing the work (later, perhaps one "LHC Computing Grid" week per year?)