The Software and Computing Committee (SC2) in the LHC Computing Grid Project
M Kasemann, FNAL
DOE/NSF review of US LHC S&C projects, November 27, 2001

Presentation transcript:

Slide 1: The Software and Computing Committee (SC2) in the LHC Computing Grid Project
M Kasemann, FNAL

Slide 2: LHC Computing Grid Project (APPROVED by Council, September 20, 2001)
- Following the recommendations of the LHC Computing Review in 2000, CERN proposes the LHC Computing Grid Project.
- It involves a two-phased approach to the problem, covering the years 2001 to 2007:
  - Phase 1: Development and prototyping at CERN and in Member States and non-Member States from 2001 to 2004, requiring expert manpower and some investment to establish a distributed production prototype at CERN and elsewhere that will be operated as a platform for the data challenges of the experiments.
  - The experience acquired towards the end of this phase will allow the elaboration of a Technical Design Report, which will serve as a basis for agreeing the relations between the distributed Grid nodes and their co-ordinated deployment and exploitation.
  - Phase 2: Installation and operation of the full world-wide initial production Grid system in the years 2005 to 2007, requiring continued manpower efforts and substantial material resources.
- Milestones and activities can be defined precisely for the immediately following years, becoming progressively less certain for the more distant future.

Slide 3: Formal Project Structure
- A formal project structure will ensure the achievement of the required functionality and performance of the overall system with an efficient use of the allocated resources.
- Participation in the project structure by the LHC experiments and the emerging regional centres will ensure the formulation of a work plan addressing the fundamental needs of the LHC experimental programme.
- The formulation of the work plan as a set of work packages, schedules and milestones will facilitate contributions by collaborating institutes and by pre-existing projects, in particular the EU DataGrid and other Grid projects.
- Appropriate liaisons with these pre-existing projects as well as with industry will be put in place to promote efficient use of resources, avoid duplication of work and preserve possibilities for technology transfer.
- Leadership provided by CERN would have a clear executive role in the CERN developments and in the provision of the application infrastructure, and would provide co-ordination for the prototyping and development of the Regional Centres.

Slide 4: The LHC Computing Grid Project Structure
[Organisation chart: the LHC Computing Grid Project, reviewed by the LHCC and reporting resource matters to the Common Computing RRB; within the project, the Project Overview Board, the Project Execution Board, the Software and Computing Committee (SC2) responsible for work plan definition (work packages, RTAGs, WP reports), and the Project Leader; external links to e-Science, other computing Grid projects, other HEP Grid projects, the EU DataGrid project, industry, Regional Centres and host labs.]

Slide 5: The LHC Computing Grid Project Structure (Project Overview Board)
[Project structure chart as on slide 4.]
Project Overview Board:
- Chair: CERN Director for Scientific Computing
- Secretary: CERN Information Technology Division Leader
- Membership:
  - Spokespersons of the LHC experiments
  - CERN Director for Colliders
  - Representatives of countries/regions with a Tier-1 centre: France, Germany, Italy, Japan, United Kingdom, United States of America
  - 4 representatives of countries/regions with Tier-2 centres from CERN Member States
- In attendance: Project Leader, SC2 Chairperson

Slide 6: The LHC Computing Grid Project Structure (Software and Computing Committee)
[Project structure chart as on slide 4.]
Software and Computing Committee (SC2) (preliminary):
- Chair: to be appointed by the CERN Director-General
- Secretary
- Membership:
  - 2 coordinators from each LHC experiment
  - Representative from the CERN EP Division
  - Technical managers from centres in each region represented in the POB
  - Leader of the CERN Information Technology Division
  - Project Leader
- Invited: POB Chairperson

Slide 7: The LHC Computing Grid Project Structure (Project Execution Board)
[Project structure chart as on slide 4.]
Project Execution Board (preliminary, POB approval required), constrained to 15 to 18 members:
- Project Management Team:
  - Project Leader
  - Project architect
  - Area coordinators for: Applications; Fabric and basic computing systems; Grid technology; Grid deployment, regional centres and data challenges
- Empowered representative from each LHC experiment
- Leaders of major contributing teams

Slide 8: The Project Requires Collaboration
- The CERN activity is part of the wider programme of work that must be undertaken as a close collaboration between:
  - CERN
  - Regional Centres
  - institutes participating in the experiments
  - Grid projects
- Scope: to develop, test and build the full LHC Computing Grid.
- Once the project has begun and both CERN and the Regional Centres have completed more detailed plans, it may be appropriate to change the balance of the investment and activities made at CERN and in other centres.
- This balance should be reviewed at regular intervals during the life of the project.

Slide 9: Phase 1: High-Level Goals
- Provide a common infrastructure that LHC experimentalists can use to develop and run their applications efficiently on a grid of computing fabrics.
- Execute successfully a series of data challenges satisfying the requirements of the experiments.
- Provide methodologies, technical guidelines and costing models for building the high-throughput, data-intensive computing fabrics and grids that will be required for Phase 2 of the project.
- Validate the methodologies, guidelines and models by building a series of prototypes of the Tier-0 and distributed Tier-1 facility of increasing capacity and complexity, demonstrating operation as an integrated Grid computing system with the required levels of reliability and performance.
- Provide models that can be used by various institutions to build the remaining parts of the tiered model (Tier-2 and below); a minimal sketch of this tiered structure follows this slide.
- Maintain reasonable opportunities for the re-use of the results of the project in other fields, particularly in science.
- Produce a Technical Design Report for the LHC Computing Grid to be built in Phase 2 of the project.
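To make the tiered model referred to above more concrete, here is a minimal Python sketch of a Tier-0/Tier-1/Tier-2 hierarchy. It is purely illustrative: the class, the field names and the example centre names are assumptions made for this sketch, not part of the project plan.

from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Centre:
    """One node in the tiered LHC computing model (illustrative only)."""
    name: str
    tier: int  # 0 = CERN Tier-0, 1 = regional Tier-1, 2 = Tier-2 and below
    children: list[Centre] = field(default_factory=list)

    def add(self, child: Centre) -> Centre:
        """Attach a lower-tier centre and return it for chaining."""
        self.children.append(child)
        return child

def walk(centre: Centre, indent: int = 0) -> None:
    """Print the hierarchy, starting from the Tier-0 at CERN."""
    print("  " * indent + f"Tier-{centre.tier}: {centre.name}")
    for child in centre.children:
        walk(child, indent + 1)

if __name__ == "__main__":
    # Hypothetical topology; the centre names are examples, not an agreed list.
    tier0 = Centre("CERN Tier-0", tier=0)
    regional = tier0.add(Centre("Example Regional Centre", tier=1))
    regional.add(Centre("Example University Cluster", tier=2))
    walk(tier0)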

Slide 10: Project Scope: Phase 1 (to 2004)
- Build a prototype of the LHC Computing Grid, with capacity and performance satisfying the needs of the forthcoming data challenges of the LHC experiments:
  - Develop the system software, middleware and expertise required to manage very large-scale computing fabrics located at various sites.
  - Develop the grid middleware to organise the interaction between such computing fabrics installed at geographically remote locations, creating a single coherent computing environment.
  - Acquire experience with high-speed wide-area network and data management technologies, developing appropriate tools to achieve the required levels of performance and reliability for migration, replication and caching of large data collections between computing fabrics; a toy replica-catalogue sketch follows this slide.
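As a rough illustration of the replication and caching of data collections between computing fabrics mentioned in the last item, the following toy Python sketch keeps a replica catalogue and copies a dataset to a site before serving a read there. The class, function and site names are hypothetical, and none of the project's actual middleware is represented.

from collections import defaultdict

class ReplicaCatalogue:
    """Toy catalogue recording which sites hold a copy of each dataset."""

    def __init__(self) -> None:
        self._replicas = defaultdict(set)  # dataset name -> set of site names

    def register(self, dataset: str, site: str) -> None:
        self._replicas[dataset].add(site)

    def sites_with(self, dataset: str) -> set:
        return set(self._replicas[dataset])

def read(catalogue: ReplicaCatalogue, dataset: str, site: str) -> None:
    """Serve a read at `site`, replicating (caching) the dataset there first if needed."""
    holders = catalogue.sites_with(dataset)
    if not holders:
        raise LookupError(f"{dataset} is not registered at any site")
    if site not in holders:
        source = sorted(holders)[0]        # placeholder choice of source site
        print(f"copying {dataset}: {source} -> {site}")
        catalogue.register(dataset, site)  # the new copy becomes a cached replica
    print(f"reading {dataset} at {site}")

if __name__ == "__main__":
    catalogue = ReplicaCatalogue()
    catalogue.register("raw-run-001", "CERN-Tier0")
    read(catalogue, "raw-run-001", "Example-Tier1")  # triggers a copy, then reads
    read(catalogue, "raw-run-001", "Example-Tier1")  # second read is served locally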

Slide 11: Project Scope: Phase 1 (to 2004), continued
  - Develop a detailed model for distributed data analysis for the LHC, refining previous work by the MONARC collaboration and the LHC Computing Review, and providing detailed estimates of data access patterns to enable realistic modelling and prototyping of the complex grid environment; a toy estimate of this kind follows this slide.
  - Adapt LHC applications to exploit the fabric and grid environment.
  - Progressively deploy at CERN and in a number of future Tier-1 and Tier-2 centres a half-scale (for a single LHC experiment) prototype of the LHC Computing Grid, demonstrating the required functionality, usability, performance and production-quality reliability.
  - Define the characteristics of the initial full production facility, including the CERN Tier-0 centre, the Tier-1 environment distributed across CERN and the Regional Centres, and the integration with the Tier-2 installations.
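At its simplest, the kind of data-access-pattern estimate mentioned in the first item above could be a back-of-the-envelope formula such as the hedged Python sketch below. Both the parameter names and the example numbers are placeholders, not measured LHC values.

def daily_transfer_estimate(jobs_per_day: int,
                            events_per_job: int,
                            event_size_mb: float,
                            remote_fraction: float) -> float:
    """Rough estimate, in GB/day, of data moved between centres.

    remote_fraction is the assumed share of reads that cannot be served from a
    local replica and therefore cross the wide-area network.
    """
    total_read_mb = jobs_per_day * events_per_job * event_size_mb
    return total_read_mb * remote_fraction / 1024.0

if __name__ == "__main__":
    # Example parameters only; realistic values would come from the access-pattern
    # studies described on the slide, not from this sketch.
    print(f"{daily_transfer_estimate(1000, 10000, 1.0, 0.2):.0f} GB/day")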

Slide 12: Project Scope: Phase 1 (to 2004), continued
- Complete the development of the first versions of the physics application software and enable these for the distributed computing grid model:
  - Develop and support common libraries, tools and frameworks to support the development of the application software, particularly in the areas of simulation and analysis.
  - In parallel with this, the LHC collaborations must develop and deploy the first versions of their core software.

Slide 13: Project Scope: Phase 1 (to 2004), continued
- This work will be carried out in close collaboration between CERN, the Tier-1 and Tier-2 centres, and the LHC collaborations and their participating institutes.
- It is assumed that for Phase 1 the national funding agencies in the Member States and non-Member States will ensure the construction of the prototype Tier-1 and Tier-2 centres and their share of the distributed computing infrastructure.
- It is further assumed that the experimental collaborations will ensure that the software requirements are met.

Slide 14: Project Scope: Phase 2 (2005-2007)
- Construct the initial full production version of the LHC Computing Grid according to the experience gained in the years of prototyping.
  - The resources required are not yet known to a sufficient degree of precision, but will be defined as part of the Phase 1 activity.
  - A specific proposal for Phase 2 will be prepared during 2003, in line with the Computing Technical Design Reports of the LHC experiments.
- Regular later updates (after 2007) of the LHC Computing Grid should be foreseen, according to the accumulation of data and the evolving needs of the experiments.

Slide 15: SC2: Roles
- The SC2 sets the REQUIREMENTS for the project.
- The project team, in consultation with the SC2, takes the requirements and CREATES a work plan, identifying the high-level activities with associated goals and milestones.
- The SC2 APPROVES the high-level work plan.
- The SC2 MONITORS the project:
  - It receives regular status reports.
  - It organises peer reviews.

Slide 16: SC2: Members
- Chair: to be appointed by the Director-General
- Secretary
- Representatives from regional centres
- CERN EP representative
- CERN IT Leader
- Project Leader
- Proposed: Grid projects representation
- Invited: POB Chairperson
- Nominated by the experiments:
  - ALICE: Federico Carminati, Wisla Carena
  - ATLAS: Norman McCubbin, Gilbert Poulard
  - CMS: Paolo Capiluppi, David Stickland
  - LHCb: Nick Brook, John Harvey

Slide 17: SC2: Next Steps
- Finalise membership:
  - Need to get regional centre representation:
    - Representatives should have a solid technical background relevant to regional centre implementation, operation and Grid software deployment.
    - Representatives of countries/regions with Tier-1 centres (France, Germany, Italy, Japan, UK, US); I believe it would be beneficial to get 2 US representatives.
    - 4 representatives of countries with Tier-2 centres from CERN Member States.
  - Propose: get representation covering the major Grid projects.
- First pre-SC2 meeting scheduled for December 7:
  - Start organising the work and the RTAGs.
- Early next year: launching workshop for all those doing the work (later, perhaps one "LHC Computing Grid" week per year?).