LCG Project Status & Plans (with an emphasis on applications software) Torre Wenaus, BNL/CERN LCG Applications Area Manager


LCG Project Status & Plans (with an emphasis on applications software) Torre Wenaus, BNL/CERN LCG Applications Area Manager US ATLAS PCAP Review November 14, 2002

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 2 The LHC Computing Grid (LCG) Project  Approved (3 years) by CERN Council, September 2001  meanwhile extended by 1 year due to LHC delay  Injecting substantial new facilities and personnel resources  Scope:  Common software for physics applications  Tools, frameworks, analysis environment  Computing for the LHC  Computing facilities (fabrics)  Grid middleware, deployment  Deliver a global analysis environment Goal – Prepare and deploy the LHC computing environment

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 3 Goal of the LHC Computing Grid Project - LCG Phase 1 – development of common applications, libraries, frameworks, prototyping of the environment, operation of a pilot computing service Phase 2 – acquire, build and operate the LHC computing service

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 4  CERN will provide the data reconstruction & recording service (Tier 0) -- but only a small part of the analysis capacity  current planning for capacity at CERN + principal Regional Centres  2002: 650 KSI2000  <1% of capacity required in 2008  2005: 6,600 KSI2000  <10% of 2008 capacity Non-CERN Hardware Need

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 5 LHC Manpower needs for Core Software (2000) – from the LHC Computing Review (FTEs); only computing professionals counted
  Experiment   Have   (Missing)
  ALICE         12     (5)
  ATLAS         23     (8)
  CMS           15     (10)
  LHCb          14     (5)
  Total         64     (28)

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 6 The LHC Computing Grid Project Structure [organisation chart: Project Overview Board; Software and Computing Committee (SC2) – requirements, work plan, monitoring; Project Execution Board (PEB) and Project Leader; RTAGs and project work packages; the Grid projects]

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 7 Funding LCG  National funding of Computing Services at Regional Centres  Funding from the CERN base budget  Special contributions of people and materials at CERN during Phase I of the project  Germany, United Kingdom, Italy, ……  Grid projects  Institutes taking part in applications common projects  Industrial collaboration  Institutes providing grid infrastructure services  operations centre, user support, training, ……

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 8 Project Execution Board Decision taking – as close as possible to the work, by those who will be responsible for the consequences. Two bodies set up to coordinate & take decisions  Architects Forum  software architect from each experiment and the applications area manager  makes common design decisions and agreements between experiments in the applications area  supported by a weekly applications area meeting open to all participants  Grid Deployment Board  representatives from the experiments and from each country with an active Regional Centre taking part in the LCG Grid Service  forges the agreements, takes the decisions, defines the standards and policies that are needed to set up and manage the LCG Global Grid Services  coordinates the planning of resources for physics and computing data challenges

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 9 LCG Areas of Work Fabric (Computing System)  Physics Data Management  Fabric Management  Physics Data Storage  LAN Management  Wide-area Networking  Security  Internet Services Grid Technology  Grid middleware  Standard application services layer  Inter-project coherence/compatibility Physics Applications Software  Application Software Infrastructure – libraries, tools  Object persistency, data management tools  Common Frameworks – Simulation, Analysis,..  Adaptation of Physics Applications to Grid environment  Grid tools, Portals Grid Deployment  Data Challenges  Grid Operations  Network Planning  Regional Centre Coordination  Security & access policy

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 10 Fabric Area  CERN Tier 0+1 centre  Automated systems management package  Evolution & operation of CERN prototype – integration into the LCG grid  Tier 1,2 centre collaboration  develop/share experience on installing and operating a Grid  exchange information on planning and experience of large fabric management  look for areas for collaboration and cooperation  Technology tracking & costing  new technology assessment nearing completion  re-costing of Phase II will be done next year

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 11 Grid Technology (GTA) in LCG Quote from a recent Les Robertson slide: LCG expects to obtain Grid Technology from projects funded by national and regional e-science initiatives -- and from industry, concentrating ourselves on deploying a global grid service. All true, but there is a real role for the GTA in LCG, not just deployment: ensuring that the needed middleware is/will be there, tested, selected and of production grade. The message seems to be getting through; a (re)organization is in progress to create an active GTA, which has been dormant and subject to an EDG conflict of interest up to now. New leader: David Foster, CERN

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 12 A few of the Grid Projects with strong HEP collaboration [logos of US and European projects] Many national, regional Grid projects -- GridPP (UK), INFN-grid (I), NorduGrid, Dutch Grid, …

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 13 Grid Technology Area This area of the project is concerned with  ensuring that the LCG requirements are known to current and potential Grid projects  active lobbying for suitable solutions – influencing plans and priorities  evaluating potential solutions  negotiating support for tools developed by Grid projects  developing a plan to supply solutions that do not emerge from other sources  BUT this must be done with caution –  important to avoid HEP-special solutions  important to migrate to standards as they emerge (avoid emotional attachment to prototypes)

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 14 Grid Technology Status  A base set of requirements has been defined (HEPCAL)  43 use cases  ~2/3 of which should be satisfied by ~2003 by currently funded projects  Good experience of working with Grid projects in Europe and the United States  Practical results from testbeds used for physics simulation campaigns  Built on the Globus toolkit  GLUE initiative – working on integration of the two main HEP Grid project groupings –  around the (European) DataGrid project  subscribing to the (US) Virtual Data Toolkit - VDT

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 15 Grid Deployment Area  New leader Ian Bird, CERN, formerly Jefferson Lab  Job is to set up and operate a Global Grid Service  stable, reliable, manageable Grid for – Data Challenges and regular production work  integrating computing fabrics at Regional Centres  learn how to provide support, maintenance, operation  Short term (this year):  consolidate (stabilize, maintain) middleware – and see it used for some physics  learn what a “production grid” really means by working with the Grid R&D projects

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 16 Grid Deployment Board  Computing management of regional centers and experiments  Decision forum for planning, deploying and operating the LCG grid  Service and resource scheduling and planning  Registration, authentication, authorization, security  LCG Grid operations  LCG Grid user support  One person from each country with an active regional center  Typically a senior manager of a regional center  This role is currently [inappropriately] admixed with a broader responsibility:  Middleware selection  For political/expediency reasons (dormancy of GTA)  Not harmful because it is being led well (David Foster)  Should be corrected during/after LCG-1

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 17 Medium term (next year): Target June 03 - deploy a Global Grid Service (LCG-1)  sustained 24 X 7 service  including sites from three continents  identical or compatible Grid middleware and infrastructure  several times the capacity of the CERN facility  and as easy to use Having stabilised this base service – progressive evolution –  number of nodes, performance, capacity and quality of service  integrate new middleware functionality  migrate to de facto standards as soon as they emerge Priority: Move from testbeds to a reliable service

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 18 Centers taking part in LCG-1 Tier 1 Centres  FZK Karlsruhe, CNAF Bologna, Rutherford Appleton Lab (UK), IN2P3 Lyon, University of Tokyo, Fermilab, Brookhaven National Lab Other Centres  GSI, Moscow State University, NIKHEF Amsterdam, Academia Sinica (Taipei), NorduGrid, Caltech, University of Florida, Ohio Supercomputing Centre, Torino, Milano, Legnaro, ……

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 19 LCG-1 as a service for LHC experiments  Mid-2003  5-10 of the larger regional centres  available as one of the services used for simulation campaigns  2H03  add more capacity at operational regional centres  add more regional centres  activate operations centre, user support infrastructure  Early 2004  principal service for physics data challenges

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 20 Applications Area Projects  Software Process and Infrastructure (operating)  Librarian, QA, testing, developer tools, documentation, training, …  Persistency Framework (operating)  POOL hybrid ROOT/relational data store  Mathematical libraries (operating)  Math and statistics libraries; GSL etc. as NAGC replacement  Core Tools and Services (just launched)  Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid enabled services  Physics Interfaces (being initiated)  Interfaces and tools by which physicists directly use the software. Interactive (distributed) analysis, visualization, grid portals  Simulation (coming soon)  Geant4, FLUKA, virtual simulation, geometry description & model, …  Generator Services (coming soon)  Generator librarian, support, tool development

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 21 Applications Area Organization [organisation chart: Apps Area Leader – overall management, coordination, architecture; Architects Forum; Project Leaders; Work Package Leaders; each project broken into work packages] Direct technical collaboration between experiment participants, IT, EP, ROOT, LCG personnel

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 22 Candidate RTAG timeline from March Blue: RTAG/activity launched or (light blue) imminent

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 23 LCG Applications Area Timeline Highlights [quarterly timeline chart, Applications and LCG tracks; milestones include: LCG launch week; architectural blueprint complete; POOL V0.1 internal release; Hybrid Event Store available for general users; distributed production using grid services; First Global Grid Service (LCG-1) available; full Persistency Framework; distributed end-user interactive analysis; LCG-1 reliability and performance targets; "50% prototype" (LCG-3); LCG TDR]

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 24 Personnel status  15 new LCG hires in place and working; a few more soon  Manpower ramp is on schedule  Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US  ~10 FTEs from IT (DB and API groups) also participating  ~7 FTEs from experiments (CERN EP and outside CERN) also participating, primarily in persistency project at present  Important experiment contributions also in the RTAG process

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 25 Software Architecture Blueprint RTAG  Established early June 2002  Goals  Integration of LCG and non LCG software to build coherent applications  Provide the specifications of an architectural model that allows this, i.e. a ‘blueprint’  Mandate  Define the main domains and identify the principal components  Define the architectural relationships between these ‘frameworks’ and components, identify the main requirements for their inter-communication, and suggest possible first implementations.  Identify the high level deliverables and their order of priority.  Derive a set of requirements for the LCG

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 26 Architecture Blueprint Report  Executive summary  Response of the RTAG to the mandate  Blueprint scope  Requirements  Use of ROOT  Blueprint architecture design precepts  High level architectural issues, approaches  Blueprint architectural elements  Specific architectural elements, suggested patterns, examples  Domain decomposition  Schedule and resources  Recommendations After 14 RTAG meetings, much ... A 36-page final report, accepted by SC2 in October

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 27 Architecture requirements  Long lifetime: support technology evolution  Languages: LCG core sw in C++ today; support language evolution  Seamless distributed operation  TGV and airplane work: usability off-network  Modularity of components  Component communication via public interfaces  Interchangeability of implementations

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 28 Architecture Requirements (2)  Integration into coherent framework and experiment software  Design for end-user’s convenience more than the developer’s  Re-use existing implementations  Software quality at least as good as any LHC experiment  Meet performance, quality requirements of trigger/DAQ software  Platforms: Linux/gcc, Linux/icc, Solaris, Windows

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 29 Software Structure [layer diagram: Applications on top; Simulation, Reconstruction and Visualization Frameworks and Other Frameworks (ROOT, Qt, …) built on a Basic Framework providing implementation-neutral services; underneath, Foundation Libraries (STL, ROOT libs, CLHEP, Boost, …) and Optional Libraries (Grid middleware, …)]

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 30 Component Model  Granularity driven by component replacement criteria; development team organization; dependency minimization  Communication via public interfaces  Plug-ins  Logical module encapsulating a service that can be loaded, activated and unloaded at run time  APIs targeted not only to end-users but to embedding frameworks and internal plug-ins
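To make the plug-in notion on the slide above concrete, here is a minimal hypothetical C++ sketch (IMessageService, ConsoleMessageService and PluginRegistry are invented names, not actual SEAL/LCG interfaces): clients and embedding frameworks see only the public abstract interface, and an implementation is selected and activated at run time through a registry standing in for a dynamic plug-in loader.

```cpp
// Hypothetical sketch of the plug-in idea described above (not actual SEAL/LCG code).
// A component is used only through its public abstract interface, so an
// implementation can be replaced or loaded at run time without client changes.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <utility>

// Public interface: the only thing clients and embedding frameworks see.
class IMessageService {
public:
    virtual ~IMessageService() = default;
    virtual void report(const std::string& msg) = 0;
};

// One concrete implementation; others could live in separately loaded libraries.
class ConsoleMessageService : public IMessageService {
public:
    void report(const std::string& msg) override { std::cout << "[console] " << msg << '\n'; }
};

// A minimal plug-in registry standing in for run-time loading/activation.
class PluginRegistry {
public:
    using Factory = std::function<std::unique_ptr<IMessageService>()>;
    void add(const std::string& name, Factory f) { factories_[name] = std::move(f); }
    std::unique_ptr<IMessageService> create(const std::string& name) const {
        auto it = factories_.find(name);
        return it != factories_.end() ? it->second() : nullptr;
    }
private:
    std::map<std::string, Factory> factories_;
};

int main() {
    PluginRegistry registry;
    registry.add("console", [] { return std::make_unique<ConsoleMessageService>(); });
    // The client selects an implementation by name at run time.
    auto svc = registry.create("console");
    if (svc) svc->report("plug-in activated");
}
```

In a real plug-in system the factories would be registered by dynamically loaded libraries rather than in main(), which is what allows a module to be loaded, activated and unloaded at run time without relinking its clients.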

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 31 Distributed Operation  Architecture should enable but not require the use of distributed resources via the Grid  Configuration and control of Grid-based operation via dedicated services  Making use of optional grid middleware services at the foundation level of the software structure  Insulating higher level software from the middleware  Supporting replaceability  Apart from these services, Grid-based operation should be largely transparent  Services should gracefully adapt to ‘unplugged’ environments  Transition to ‘local operation’ modes, or fail informatively
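A minimal sketch of the graceful-adaptation point, under assumed names (IFileAccess, GridFileAccess and LocalFileAccess are illustrative, not real LCG interfaces): higher-level code depends only on an abstract service, the grid-backed implementation is an optional piece at the foundation level, and an unplugged environment either transitions to a local-operation mode or fails with an informative error.

```cpp
// Hypothetical illustration of the "graceful fallback" idea (not actual LCG middleware code).
// Higher-level code talks to an abstract service; the grid-backed implementation is
// optional, and an unplugged environment transparently falls back to local operation.
#include <iostream>
#include <memory>
#include <stdexcept>
#include <string>

class IFileAccess {
public:
    virtual ~IFileAccess() = default;
    virtual std::string locate(const std::string& logicalName) = 0;
};

class LocalFileAccess : public IFileAccess {
public:
    std::string locate(const std::string& logicalName) override {
        return "/local/data/" + logicalName;           // 'local operation' mode
    }
};

class GridFileAccess : public IFileAccess {
public:
    std::string locate(const std::string& logicalName) override {
        // In a real system this would query a replica catalogue via grid middleware.
        throw std::runtime_error("grid middleware not reachable");
    }
};

// Factory that insulates higher layers from the middleware choice.
std::unique_ptr<IFileAccess> makeFileAccess(bool gridConfigured) {
    if (gridConfigured) return std::make_unique<GridFileAccess>();
    return std::make_unique<LocalFileAccess>();        // 'unplugged' environment
}

int main() {
    auto access = makeFileAccess(/*gridConfigured=*/false);
    try {
        std::cout << access->locate("run1234.events") << '\n';
    } catch (const std::exception& e) {
        std::cerr << "failing informatively: " << e.what() << '\n';
    }
}
```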

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 32 Managing Objects  Object Dictionary  To query a class about its internal structure (Introspection)  Essential for persistency, data browsing, interactive rapid prototyping, etc.  The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI)  To be used by LCG, ROOT and CINT  Timescale ~1 year  Object Whiteboard  Uniform access to application-defined transient objects
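The introspection idea can be illustrated with a small hypothetical sketch; this is not the ROOT/CINT dictionary nor the planned common dictionary interface, just an indication of the kind of information a dictionary records (class names, member names, types, byte offsets) and how a generic client such as a persistency service or data browser might query it at run time.

```cpp
// Minimal, hypothetical sketch of the introspection idea: a dictionary records class
// structure so generic code (persistency, browsers) can query it at run time.
// Illustrative only, not the LCG/ROOT dictionary interface.
#include <cstddef>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct MemberInfo {
    std::string name;
    std::string type;
    std::size_t offset;   // byte offset within the object, for generic I/O
};

struct ClassInfo {
    std::string name;
    std::vector<MemberInfo> members;
};

class Dictionary {
public:
    void define(const ClassInfo& info) { classes_[info.name] = info; }
    const ClassInfo* lookup(const std::string& name) const {
        auto it = classes_.find(name);
        return it != classes_.end() ? &it->second : nullptr;
    }
private:
    std::map<std::string, ClassInfo> classes_;
};

struct Track { double px, py, pz; };

int main() {
    Dictionary dict;
    dict.define({"Track", {
        {"px", "double", offsetof(Track, px)},
        {"py", "double", offsetof(Track, py)},
        {"pz", "double", offsetof(Track, pz)}}});

    // A persistency service could now stream any 'Track' without compile-time knowledge.
    if (const ClassInfo* c = dict.lookup("Track"))
        for (const auto& m : c->members)
            std::cout << c->name << "::" << m.name << " : " << m.type << '\n';
}
```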

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 33 Other Architectural Elements  Python-based Component Bus  Plug-in integration of components providing a wide variety of functionality  Component interfaces to bus derived from their C++ interfaces  Scripting Languages  Python and CINT (ROOT) to both be available  Access to objects via object whiteboard in these environments  Interface to the Grid  Must support convenient, efficient configuration of computing elements with all needed components
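The component-bus idea can be sketched in plain C++ (in LCG the bus itself would be Python-based, with bindings derived from the C++ interfaces; Histogrammer and ComponentBus below are invented names, not LCG software): a component's interface is exposed as named operations that an interactive or scripting layer can invoke without compile-time knowledge of the component.

```cpp
// Hypothetical sketch of the component-bus idea (not the actual LCG Python bus):
// a binding layer exposes a component's C++ interface as named operations, so an
// interactive/scripting environment can drive it without compile-time knowledge.
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <utility>

// An ordinary C++ component with a public interface.
class Histogrammer {
public:
    void fill(double x) { sum_ += x; ++count_; }
    double mean() const { return count_ ? sum_ / count_ : 0.0; }
private:
    double sum_ = 0.0;
    long   count_ = 0;
};

// A toy "bus": operations derived from the C++ interface, invokable by name with
// string arguments, much as a scripting binding generator might produce them.
class ComponentBus {
public:
    using Operation = std::function<std::string(const std::string&)>;
    void expose(const std::string& name, Operation op) { ops_[name] = std::move(op); }
    std::string call(const std::string& name, const std::string& arg = "") {
        auto it = ops_.find(name);
        return it != ops_.end() ? it->second(arg) : "<unknown operation>";
    }
private:
    std::map<std::string, Operation> ops_;
};

int main() {
    Histogrammer h;
    ComponentBus bus;
    bus.expose("hist.fill", [&h](const std::string& a) { h.fill(std::stod(a)); return ""; });
    bus.expose("hist.mean", [&h](const std::string&)   { return std::to_string(h.mean()); });

    // What an interactive session might do through the bus:
    bus.call("hist.fill", "1.0");
    bus.call("hist.fill", "3.0");
    std::cout << "mean = " << bus.call("hist.mean") << '\n';
}
```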

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 34 (LHCb) Example of LCG–Experiment SW Mapping [diagram mapping LHCb software components onto LCG components: LCG POOL, LCG DDD, other LCG services, LCG CLS, HepPDT]

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 35 Domain Decomposition [diagram; products mentioned are examples, not a comprehensive list; grey: not in common project scope (also event processing framework, TDAQ)]

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 36 Use of ROOT in LCG Software  Among the LHC experiments  ALICE has based its applications directly on ROOT  The 3 others base their applications on components with implementation-independent interfaces  Look for software that can be encapsulated into these components  All experiments agree that ROOT is an important element of LHC software  Leverage existing software effectively and do not unnecessarily reinvent wheels  Therefore the blueprint establishes a user/provider relationship between the LCG applications area and ROOT  Will draw on a great ROOT strength: users are listened to very carefully!  The ROOT team has been very responsive to needs for new and extended functionality coming from the persistency effort

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 37 Personnel Resources – Required and Available [chart: estimate of required effort in FTEs per quarter, Sep-02 through Mar-05, broken down by project – SPI, Math libraries, Physics interfaces, Generator services, Simulation, Core Tools & Services, POOL; blue = available effort] FTEs today: 15 LCG, 10 CERN IT, 7 CERN EP + experiments. Future estimate: 20 LCG, 13 IT, 28 EP + experiments

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 39 RTAG Conclusions and Recommendations  Use of ROOT as described  Start common project on core tools and services  Start common project on physics interfaces  Start RTAG on analysis, including distributed aspects  Tool/technology recommendations  CLHEP, CINT, Python, Qt, AIDA, …  Develop a clear process for adopting third party software

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 40 Core Libraries and Services Project  Pere Mato (CERN/LHCb) is leading the new Core Libraries and Services (CLS) Project  Project being launched now, developing immediate plans over the next week or so and a full work plan over the next couple of months  Scope:  Foundation, utility libraries  Basic framework services  Object dictionary  Object whiteboard  System services  Grid enabled services  Many areas of immediate relevance to POOL  Clear process for adopting third party libraries will be addressed early in this project

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 41 Physics Interfaces Project  Launching it now, led by Vincenzo Innocente (CMS)  Covers the interfaces and tools by which physicists will directly use the software  Should be treated coherently, hence coverage by a single project  Expected scope once analysis RTAG concludes:  Interactive environment  Analysis tools  Visualization  Distributed analysis  Grid portals

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 42 POOL  Pool of persistent objects for LHC, currently in prototype  Targeted at event data but not excluding other data  Hybrid technology approach  Object level data storage using file-based object store (ROOT)  RDBMS for meta data: file catalogs, object collections, etc (MySQL)  Leverages existing ROOT I/O technology and adds value  Transparent cross-file and cross-technology object navigation  RDBMS integration  Integration with Grid technology (eg EDG/Globus replica catalog)  network and grid decoupled working modes  Follows and exemplifies the LCG blueprint approach  Components with well defined responsibilities  Communicating via public component interfaces  Implementation technology neutral
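As a purely conceptual illustration of the hybrid approach, here is a hedged C++ sketch (FileCatalogue, Token and readObject are invented names, not the POOL API): the catalogue plays the RDBMS role of mapping logical to physical files, the file-based object store plays the ROOT I/O role, and a token carries enough information to navigate to an object across files and technologies.

```cpp
// Hypothetical sketch of the hybrid-store idea (illustrative only, not the POOL API):
// a relational-style catalogue maps logical names to files, while the objects
// themselves live in a file-based object store; navigation goes through tokens.
#include <iostream>
#include <map>
#include <string>

// Stand-in for the RDBMS side: file catalogue and collections metadata.
class FileCatalogue {
public:
    void registerFile(const std::string& lfn, const std::string& pfn) { lfnToPfn_[lfn] = pfn; }
    std::string resolve(const std::string& lfn) const {
        auto it = lfnToPfn_.find(lfn);
        return it != lfnToPfn_.end() ? it->second : std::string{};
    }
private:
    std::map<std::string, std::string> lfnToPfn_;
};

// A token identifies an object independently of where it is stored.
struct Token {
    std::string logicalFile;   // resolved through the catalogue
    std::string container;     // e.g. a tree/branch in the object store
    long        entry;         // position within the container
};

// Stand-in for the file-based object store (the ROOT I/O role in POOL).
void readObject(const FileCatalogue& catalogue, const Token& t) {
    const std::string pfn = catalogue.resolve(t.logicalFile);
    if (pfn.empty()) { std::cerr << "unknown logical file: " << t.logicalFile << '\n'; return; }
    std::cout << "open " << pfn << ", read " << t.container << " entry " << t.entry << '\n';
}

int main() {
    FileCatalogue catalogue;
    catalogue.registerFile("lfn:higgs-sim-001", "/data/higgs-sim-001.root");
    readObject(catalogue, {"lfn:higgs-sim-001", "Events", 42});
}
```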

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 43 Pool Release Schedule  End September - V0.1 (Released on schedule)  All core components for navigation exist and interoperate  Assumes ROOT object (TObject) on read and write  End October - V0.2  First collection implementation  End November - V0.3 (First public release)  EDG/Globus FileCatalog integrated  Persistency for general C++ classes (not instrumented by ROOT)  Event meta data annotation and query  June 2003 – Production release

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 44 Four Experiments, Four Viewpoints, Two Paths  Four viewpoints (of course) among the experiments; two basic positions  ATLAS, CMS, LHCb similar views; ALICE differing  The blueprint establishes the basis for a good working relationship among all  LCG applications software is developed according to the blueprint  To be developed and used by ATLAS, CMS and LHCb, with ALICE contributing mainly via ROOT  ALICE continues to develop their line making direct use of ROOT as their software framework

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 45 What next in LCG? Much to be done in middleware functionality - Job Scheduling, Workflow Management, Database access, Replica Management, Monitoring, Global Resource Optimisation, Evolution to Web Services, …. But we also need an evolution from Research & Development → Engineering  Effective use of high-bandwidth Wide Area Networks – work on protocols, file transfer  High quality computer centre services → high quality Grid services – sociology as well as technology

Torre Wenaus, BNL/CERN PCAP Review, November 14, 2002 Slide 46 Concluding Remarks  Good engagement and support from the experiments and CERN  Determination to make the LCG work  LCG organizational structure seems to be working, albeit slowly  Facilities money problematic; manpower on track  Grid Technology and Grid Deployment areas still in flux  Important for US to make sure interests are represented and experience injected  Applications area going well  Architecture laid out  Working relationship among the four experiments  Many common projects  First software deliverable (POOL) on schedule