The LHC Computing Grid Project – Preparing for LHC Data Analysis
NorduGrid Workshop, Stockholm, 11 November 2002
Les Robertson, IT Division, CERN

Project Goals
- applications – tools, frameworks, environment, persistency
- computing system → global grid service
  - cluster → automated fabric
  - collaborating computer centres → grid
  - CERN-centric analysis → global analysis environment
- central role of data challenges

Goal – prepare and deploy the LHC computing environment for the analysis and management of the data coming from the detectors.
This is not another grid technology project – it is a grid deployment project.

Background
- Recommendations of the LHC Computing Review – CERN-LHCC, 20 February 2001
  - common solutions and support for applications
  - estimates of total requirements and costs
  - distributed computing environment using Grid technology
  - data recording and reconstruction at CERN
  - analysis in Regional Centres and at CERN (CERN provides only a small fraction of the analysis capacity)
  - simulation in Regional Centres
- Launch Committee – CERN/2379/Rev – Council, 20 September 2001
  - organisation – integrating and coordinating work done by experiments, regional centres and grid projects
  - separating requirements setting (SC2) from implementation (PEB)
  - reviewed for technical & scientific matters by the LHC Committee (LHCC), and for resources by the Computing Resource Review Board (with representatives of all funding agencies)

Project Resources – six components
- national funding for resources at Regional Centres – funded as resources for the LHC experiments
- Grid projects – suppliers and maintainers of middleware, funded by the EU, NSF, DoE, and national and regional investments
- CERN personnel and materials from the base budget – IT and EP
- special contributions of people and materials at CERN by member and observer states during Phase I of the project
- resources from other institutes – signing up for common applications projects, and providing infrastructure services for operating the grid, supporting users, maintaining systems software, ...
- CERN openlab industrial contributions

- CERN will provide the data reconstruction & recording service (Tier 0) – but only a small part of the analysis capacity (Tier 1)
- current planning for capacity at the principal Regional Centres + CERN:
  - 2002: 650 kSI2000 – less than 1% of the capacity required in 2008
  - 2005: 6,600 kSI2000 – less than 10% of the 2008 capacity
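
A quick back-of-the-envelope check (not from the slides) shows that the two fractions above imply consistent lower bounds on the total 2008 requirement – a minimal sketch in Python:

  # Implied 2008 capacity requirement, from the two planning figures above.
  capacity_2002 = 650      # kSI2000, "< 1% of the capacity required in 2008"
  capacity_2005 = 6_600    # kSI2000, "< 10% of the 2008 capacity"

  print(capacity_2002 / 0.01)   # 65000.0 -> 2008 requirement > 65,000 kSI2000
  print(capacity_2005 / 0.10)   # 66000.0 -> 2008 requirement > 66,000 kSI2000
  # Both bounds agree: the 2008 requirement is of order 10^5 kSI2000.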

LCG Project Organisation
Four work areas:
- Applications
- Grid Deployment
- Grid Technology
- Fabrics (management & technology of large computing clusters)

Applications Area
- base support for the development process, infrastructure, tools, libraries
- frameworks for simulation and analysis
- object persistency and data management
- projects common to several experiments
  - everything that is not an experiment-specific component is a potential candidate for a common project
  - long-term advantages in use of resources, support, maintenance

Applications Work Packages
- Software process and infrastructure (SPI) – Alberto Aimar
- Persistency framework (POOL) – Dirk Duellmann
- Math libraries – Fred James
- Core tools and services – Pere Mato
- Physics interfaces (launching)
- Detector description (requirements agreed)
- Event generators (requirements agreed)
- Simulation (requirements stage)
- Analysis tools, distributed analysis (next priority)

Domain Decomposition
[diagram by Torre Wenaus – products mentioned are examples, not a comprehensive list]

Simulation
- first set of formal requirements for LCG for MC generators (October) and simulation (November)
- support is needed for both GEANT4 and FLUKA
- GEANT4
  - an independent collaboration, including HEP institutes, LHC and other experiments, and other sciences
  - significant CERN and LHC-related resources
  - MoU being re-discussed now
  - proposal to create an HEP User Requirements Committee chaired by an LHC physicist
  - need to ensure long-term support

Grid Deployment
Planning, building, commissioning and operating a stable, reliable, manageable Grid for the Data Challenges:
- Distributed Production
- Distributed Analysis
Integrating services from many Regional Centres around the world – a permanent service, on which the Data Challenges are scheduled.

Grid Deployment – Current Status
- experiments are doing their event production using distributed resources, with a variety of solutions:
  - classic distributed production – send jobs to specific sites, with simple bookkeeping (see the sketch below)
  - some use of Globus, and some of the HEP Grid tools
  - vertically integrated solutions (AliEn)
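
As an illustration of the "classic distributed production" pattern above – push jobs to a fixed list of sites and keep a simple bookkeeping record – here is a minimal Python sketch. The site contact strings and the production script path are hypothetical, and the use of the Globus Toolkit 2 globus-job-run submission command is an assumption about the era's tooling, not taken from the slides:

  import subprocess

  # Classic distributed production: submit the same jobs to a fixed list
  # of sites and keep a simple record of the outcomes.
  sites = [
      "tier1.example-fzk.de/jobmanager-pbs",   # hypothetical contact strings
      "tier1.example-cnaf.it/jobmanager-lsf",
  ]

  bookkeeping = {site: [] for site in sites}   # site -> list of job records

  for site in sites:
      for job_id in range(10):                 # ten jobs per site
          # globus-job-run <contact> <executable> [args] blocks until done
          result = subprocess.run(
              ["globus-job-run", site, "/opt/prod/simulate.sh", str(job_id)],
              capture_output=True, text=True)
          bookkeeping[site].append({"job": job_id, "rc": result.returncode})

  for site, jobs in bookkeeping.items():
      ok = sum(1 for j in jobs if j["rc"] == 0)
      print(f"{site}: {ok}/{len(jobs)} jobs succeeded")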

Data Challenges in 2002
[results shown as maps/plots: 6 million events produced at ~20 sites; grid tools used at 11 sites]

Grid Deployment
The hard problem for distributed computing is data analysis – ESD and AOD:
- chaotic workload
- unpredictable data access patterns
This is the problem that the LCG has to solve, and this is where Grid technology should really help. After two years of grid development we are just at the beginning of grid services.

Deploying the LHC Grid
- a priority now is to move from testbeds to a SERVICE
- we need to learn how to OPERATE a Grid Service
Quality is the key to acceptance of Grids. Reliable OPERATION will be the factor that limits the practical size of Grids.

Centres taking part in LCG-1 – around the world, around the clock

Centres taking part in LCG-1
Tier 1 Centres:
- FZK Karlsruhe, CNAF Bologna, Rutherford Appleton Lab (UK), IN2P3 Lyon, University of Tokyo, Fermilab, Brookhaven National Lab
Other Centres:
- GSI, Moscow State University, NIKHEF Amsterdam, Academia Sinica (Taipei), NorduGrid, Caltech, University of Florida, Ohio Supercomputing Centre, Tata Institute (India), Torino, Milano, Legnaro, ...

Grid Deployment – The Strategy
Get a basic grid service into production so that we know what works, what doesn't, and what the priorities are – and evolve from there to the full LHC service.
- decide on a common set of middleware to be used for the first LCG grid service – LCG-1
- target – LCG-1 in operation by mid-2003, in full service by the end of 2003
- this will be conservative – stability before functionality – and will not satisfy all of the stated requirements
- but it must be sufficient for the data challenges scheduled in 2004

LCG-1 as a service for LHC experiments
- mid-2003
  - 5–8 of the larger regional centres
  - available as one of the services used for simulation campaigns
- second half of 2003
  - add more capacity at operational regional centres
  - add more regional centres
  - initiate an operations centre and user support infrastructure
- early 2004
  - principal service for the LHC physics data challenges

Grid Technology in LCG
LCG expects to obtain Grid technology from projects (well-)funded by national and regional e-science initiatives – and, later, from industry. The LCG project will concentrate on deploying a global grid service.

Strategy
- press for complementary middleware developments (worldwide) – parallel developments are a problem until standards emerge
- reduce unnecessary divergence and complexity
- ensure at least one complete and supported solution for LHC
- understand & resolve middleware support issues
- as a last resort – develop a plan to meet requirements not taken on by other projects

Grid Technology Status
- a base set of requirements has been defined (HEPCAL) – 43 use cases; currently funded projects say they will satisfy ~2/3 of these in 2003
- good experience of experiments working with Grid projects in Europe and the United States
- practical results from testbeds used for physics simulation campaigns
- everyone builds on the Globus toolkit – which is undergoing a radical re-design
- GLUE initiative – working on integration of the (European) DataGrid project and the (US) Virtual Data Toolkit (VDT)
- reliability and scalability are major concerns
- long-term maintenance of tools developed by R&D projects is unclear – quality and commitment
- evolution needed from Research & Development → Engineering

Grid Technology – Next Steps
- leverage the massive investments being made – proposals are being prepared for the EU 6th Framework Programme, the NSF ITR funding round, and national science infrastructure funding
- priority: a basic set of reliable, maintainable middleware
  - hardening / re-engineering of current prototypes
  - including all the functions that must be common, and at least one example of the other essential functions
- complementary (or at least coordinated) developments of higher-level functionality by different projects and experiments
- prepare for major architectural changes before things mature – do not become too attached to current implementations

Proposal for an integrated project to harden/re-engineer basic middleware:
- industrial management – s/w engineering focus
- a few development centres – funded from different sources
- a design group – representative of all m/w projects

Challenges
- integrating all of the players towards a common goal – Regional Centres, Grid projects, physics institutes, CERN
- influencing external providers to satisfy LCG requirements – GEANT4, ROOT, ..., Grid projects
- Grid technology – the funding is excellent, but it is not so obvious that it will provide solid infrastructure suitable for LHC
- LCG will be a global grid service – Europe, Asia, America – national ambitions and different agendas make it very hard to navigate; a major contrast with the WWW!
- we are only at the beginning of understanding Grid operation – Grids imply operation and management by the community: a federation, not empires

The Work Plan
- get a production Grid into operation
- deliver a service to LHC experiments
- understand & fix the real problems