
1 LCG Applications Area
Torre Wenaus, BNL/CERN
LCG Applications Area Manager
http://cern.ch/lcg/peb/applications
DOE/NSF Review of US LHC Physics and Computing Projects
January 14, 2003

2 The LHC Computing Grid Project Structure
[Organization chart showing: Project Overview Board; Software and Computing Committee (SC2) — requirements, work plan, monitoring — with its RTAGs; Project Execution Board (PEB) under the Project Leader, with the project work packages (WPs); and the Grid Projects]

3 LCG Areas of Work
Fabric (Computing System)
- Physics Data Management
- Fabric Management
- Physics Data Storage
- LAN Management
- Wide-area Networking
- Security
- Internet Services
Grid Technology
- Grid middleware
- Standard application services layer
- Inter-project coherence/compatibility
Physics Applications Software
- Application Software Infrastructure – libraries, tools
- Object persistency, data management tools
- Common Frameworks – Simulation, Analysis, ...
- Adaptation of Physics Applications to Grid environment
- Grid tools, Portals
Grid Deployment
- Data Challenges
- Grid Operations
- Network Planning
- Regional Centre Coordination
- Security & access policy

4 Applications Area Organization
[Organization diagram: Apps Area Leader (overall management, coordination, architecture), Architects Forum, Project Leaders and Work Package Leaders, with the projects and their work packages below]
Direct technical collaboration between experiment participants, IT, EP, ROOT and LCG personnel

5 Focus on Experiment Need
- Project structured and managed to ensure a focus on real experiment needs
- SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves
- Architects Forum to involve experiment architects in day-to-day project management and execution
- Openness of information flow and decision making
- Direct participation of experiment developers in the projects
- Tight, iterative feedback loop to gather user feedback from frequent releases
- Early deployment and evaluation of LCG software in experiment contexts
- Success defined by experiment adoption and production deployment

6 Applications Area Projects
- Software Process and Infrastructure (SPI) (operating – A. Aimar)
  - Librarian, QA, testing, developer tools, documentation, training, ...
- Persistency Framework (POOL) (operating – D. Duellmann)
  - POOL hybrid ROOT/relational data store
- Mathematical libraries (operating – F. James)
  - Math and statistics libraries; GSL etc. as NAG C replacement
  - Group in India will work on this (workplan in development)
- Core Tools and Services (SEAL) (operating – P. Mato)
  - Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
- Physics Interfaces (PI) (launched – V. Innocente)
  - Interfaces and tools by which physicists directly use the software: interactive (distributed) analysis, visualization, grid portals
- Simulation (launch planning in progress)
  - Geant4, FLUKA, simulation framework, geometry model, ...
- Generator Services (launch as part of simulation)
  - Generator librarian, support, tool development
Bold items: recent developments (last 3 months)

7 Project Relationships
[Diagram: relationships among the LCG Applications Area projects — Software Process & Infrastructure (SPI), Core Libraries & Services (SEAL), Persistency (POOL), Physicists Interface (PI), Math Libraries, ... — and their links to other LCG projects in other areas and to the LHC experiments]

8 Candidate RTAG timeline from March
[Chart: candidate RTAG timeline; blue: RTAG/activity launched; light blue: imminent]

9 LCG Applications Area Timeline Highlights
[Timeline chart, 2002–2005 by quarter, with LCG and Applications milestone tracks. Milestones shown include: LCG launch week; architectural blueprint complete; POOL V0.1 internal release; hybrid event store available for general users; first Global Grid Service (LCG-1) available; distributed production using grid services; full Persistency Framework; LCG-1 reliability and performance targets; distributed end-user interactive analysis; "50% prototype" (LCG-3); LCG TDR]

10 Architecture Blueprint
RTAG established in June; after 14 meetings and much email, the 36-page final report was accepted by SC2 on October 11.
http://lcgapp.cern.ch/project/blueprint/
Contents:
- Executive summary
- Response of the RTAG to the mandate
- Blueprint scope
- Requirements
- Use of ROOT
- Blueprint architecture design precepts
  - High-level architectural issues, approaches
- Blueprint architectural elements
  - Specific architectural elements, suggested patterns, examples
- Domain decomposition
- Schedule and resources
- Recommendations

11 Component Model
- Granularity driven by component replacement criteria, development team organization, and dependency minimization
- Communication via public interfaces
- Plug-ins
  - Logical modules encapsulating a service that can be loaded, activated and unloaded at run time
- APIs targeted not only to end users but to embedding frameworks and internal plug-ins
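As an illustration of the component model described above, here is a minimal, hypothetical C++ sketch of a plug-in that is reached only through an abstract public interface and can be loaded, activated and unloaded at run time. All names (IComponent, PluginManager, HistogramService) are invented for this sketch and are not the actual LCG/SEAL API.

```cpp
// Hypothetical sketch of the component/plug-in idea from the blueprint;
// names are illustrative, not the real LCG/SEAL interfaces.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>

// Public interface: clients depend only on this, never on a concrete class.
class IComponent {
public:
  virtual ~IComponent() = default;
  virtual void activate() = 0;     // called when the plug-in is switched on
  virtual void deactivate() = 0;   // called before unloading
};

// A concrete plug-in; in a real system this would live in a shared library.
class HistogramService : public IComponent {
public:
  void activate() override   { std::cout << "HistogramService active\n"; }
  void deactivate() override { std::cout << "HistogramService stopped\n"; }
};

// Very small plug-in manager: components register a factory under a name
// and are instantiated on demand, which keeps compile-time dependencies minimal.
class PluginManager {
public:
  using Factory = std::function<std::unique_ptr<IComponent>()>;
  void registerFactory(const std::string& name, Factory f) { factories_[name] = std::move(f); }
  std::unique_ptr<IComponent> create(const std::string& name) {
    auto it = factories_.find(name);
    return it != factories_.end() ? it->second() : nullptr;
  }
private:
  std::map<std::string, Factory> factories_;
};

int main() {
  PluginManager pm;
  pm.registerFactory("HistogramService", [] { return std::make_unique<HistogramService>(); });
  auto svc = pm.create("HistogramService");          // "load"
  if (svc) { svc->activate(); svc->deactivate(); }   // activate ... unload
}
```

The point of the sketch is the replacement criterion: swapping HistogramService for another implementation only requires registering a different factory; clients of IComponent are untouched.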

12 Software Structure
[Layer diagram: Applications built on the Simulation, Reconstruction and Visualization Frameworks and Other Frameworks, which sit on a Basic Framework providing implementation-neutral services; underpinned by Foundation Libraries (STL, ROOT libs, CLHEP, Boost, ...) and Optional Libraries (grid middleware, ROOT, Qt, ...)]

13 Distributed Operation
- Architecture should enable but not require the use of distributed resources via the Grid
- Configuration and control of Grid-based operation via dedicated services
  - Making use of optional grid middleware services at the foundation level of the software structure
  - Insulating higher-level software from the middleware
  - Supporting replaceability
- Apart from these services, Grid-based operation should be largely transparent
- Services should gracefully adapt to 'unplugged' environments
  - Transition to 'local operation' modes, or fail informatively
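A minimal, hypothetical C++ sketch of the graceful-adaptation idea above: a file-lookup service tries a grid replica catalogue first and transitions to a local mode, reporting informatively, when the middleware is unreachable. The interfaces and names (IFileCatalog, GridCatalog, LocalCatalog) are illustrative only, not an actual LCG API.

```cpp
// Illustrative sketch only: these are hypothetical names, not real LCG/POOL interfaces.
#include <iostream>
#include <memory>
#include <optional>
#include <stdexcept>
#include <string>

class IFileCatalog {                       // public interface seen by higher-level code
public:
  virtual ~IFileCatalog() = default;
  virtual std::optional<std::string> lookup(const std::string& lfn) = 0;
};

class GridCatalog : public IFileCatalog {  // would wrap grid middleware (e.g. a replica catalogue)
public:
  std::optional<std::string> lookup(const std::string&) override {
    throw std::runtime_error("grid middleware not reachable");  // simulate 'unplugged'
  }
};

class LocalCatalog : public IFileCatalog { // local-operation fallback
public:
  std::optional<std::string> lookup(const std::string& lfn) override {
    return "/local/data/" + lfn;           // e.g. resolve against a local file list
  }
};

// Higher-level code is insulated from the middleware: it only sees IFileCatalog,
// and the service degrades gracefully instead of propagating middleware errors.
std::optional<std::string> resolve(const std::string& lfn,
                                   IFileCatalog& grid, IFileCatalog& local) {
  try {
    return grid.lookup(lfn);
  } catch (const std::exception& e) {
    std::cerr << "grid catalogue unavailable (" << e.what() << "), using local mode\n";
    return local.lookup(lfn);
  }
}

int main() {
  GridCatalog grid;
  LocalCatalog local;
  auto pfn = resolve("run1234.events", grid, local);
  std::cout << "resolved to: " << (pfn ? *pfn : "<not found>") << '\n';
}
```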

14 Managing Objects
- Object Dictionary
  - To query a class about its internal structure
  - Essential for persistency, data browsing, etc.
  - The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (timescale ~1 yr?)
  - Will contact Stroustrup, who has started implementation
- Object Whiteboard
  - Uniform access to application-defined transient objects, including in the ROOT environment
  - What this will be (how similar to Gaudi, StoreGate?) is not yet defined
- Object definition based on C++ header files
  - Now that ATLAS as well as CMS will use this approach, it is being addressed in a common way via the LCG AA
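To make the object dictionary idea above concrete, here is a minimal hypothetical C++ sketch of a class description that can be queried at run time for its data members — the kind of information a dictionary provides to persistency and browsing tools. The names (MemberInfo, ClassInfo, registry, TrackHit) are invented for illustration and are not the ROOT/LCG dictionary API; a real dictionary would be generated from the C++ header files rather than filled by hand.

```cpp
// Hypothetical reflection sketch; not the actual ROOT/LCG dictionary interface.
#include <cstddef>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct MemberInfo {            // one data member of a described class
  std::string name;
  std::string type;
  std::size_t offset;          // byte offset inside the object
};

struct ClassInfo {             // what a dictionary knows about a class
  std::string name;
  std::vector<MemberInfo> members;
};

// A global registry keyed by class name; in a real system it would be filled
// from C++ header files by a dictionary generator, not by hand.
std::map<std::string, ClassInfo>& registry() {
  static std::map<std::string, ClassInfo> r;
  return r;
}

// Example user class and its hand-written dictionary entry.
struct TrackHit {
  double x, y, z;
  int detectorId;
};

void fillTrackHitDictionary() {
  registry()["TrackHit"] = ClassInfo{
    "TrackHit",
    { {"x", "double", offsetof(TrackHit, x)},
      {"y", "double", offsetof(TrackHit, y)},
      {"z", "double", offsetof(TrackHit, z)},
      {"detectorId", "int", offsetof(TrackHit, detectorId)} } };
}

int main() {
  fillTrackHitDictionary();
  // A persistency service or browser can now query the class structure generically.
  for (const auto& m : registry()["TrackHit"].members)
    std::cout << m.type << ' ' << m.name << " @ offset " << m.offset << '\n';
}
```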

15 Dictionary: Reflection / Population / Conversion
[Diagram of the dictionary reflection, population and conversion chain; some elements marked 'in progress', others 'new in POOL 0.3']

16 Other Architectural Elements
- Python-based Component Bus
  - Plug-in integration of components providing a wide variety of functionality
  - Component interfaces to the bus derived from their C++ interfaces
- Scripting Languages
  - Python and CINT (ROOT) will both be available
  - Access to objects via the object whiteboard in these environments
- Interface to the Grid
  - Must support convenient, efficient configuration of computing elements with all needed components
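As a rough illustration of the component-bus idea above (sketched here in C++ rather than Python, with invented names), a component can expose part of its C++ interface as named callables on a bus, so that other components or a scripting layer can invoke them by name rather than by C++ type.

```cpp
// Illustrative only: 'ComponentBus' and the registration scheme are invented for
// this sketch; the blueprint's actual bus is Python-based and is not shown here.
#include <functional>
#include <iostream>
#include <map>
#include <string>

class ComponentBus {
public:
  using Command = std::function<std::string(const std::string&)>;
  void expose(const std::string& name, Command c) { commands_[name] = std::move(c); }
  std::string call(const std::string& name, const std::string& arg) {
    auto it = commands_.find(name);
    return it != commands_.end() ? it->second(arg) : "<no such command: " + name + ">";
  }
private:
  std::map<std::string, Command> commands_;
};

// A component exposing part of its C++ interface on the bus.
class FileCatalogComponent {
public:
  explicit FileCatalogComponent(ComponentBus& bus) {
    bus.expose("catalog.lookup", [this](const std::string& lfn) { return lookup(lfn); });
  }
  std::string lookup(const std::string& lfn) { return "/data/" + lfn; }
};

int main() {
  ComponentBus bus;
  FileCatalogComponent catalog(bus);
  // A script or another component drives the bus by name, not by C++ type.
  std::cout << bus.call("catalog.lookup", "run42.root") << '\n';
}
```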

17 Domain Decomposition
[Domain decomposition chart. Products mentioned are examples, not a comprehensive list. Grey: not in common project scope (also the event processing framework and TDAQ)]

18 Use of ROOT in LCG Software
- Among the LHC experiments
  - ALICE has based its applications directly on ROOT
  - The three others base their applications on components with implementation-independent interfaces
    - They look for software that can be encapsulated into these components
- All experiments agree that ROOT is an important element of LHC software
  - Leverage existing software effectively and do not unnecessarily reinvent wheels
- Therefore the blueprint establishes a user/provider relationship between the LCG Applications Area and ROOT
  - LCG AA software will make use of ROOT as an external product
  - This draws on a great ROOT strength: users are listened to very carefully!
- So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL

19 Blueprint RTAG Outcomes
- SC2 decided in October:
  - The blueprint is accepted
  - The RTAG recommendations are accepted, namely to
    - Start a common project on core tools and services
    - Start a common project on physics interfaces

20 Applications Area Personnel Status
- 18 LCG apps hires in place and working; +2 in January and February
  - Manpower ramp is on target (expected to reach 20-23)
  - Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US
- ~10 FTEs from IT (DB and API groups) also participating
- ~8 FTEs from experiments (CERN EP and outside CERN) also participating, mainly in POOL, SEAL, SPI
- CERN established a new software group as the EP home of the LCG Applications Area (EP/SFT)
  - Led by John Harvey. Taking shape well. Localized in B.32
- Fraction of the experiment contribution that is US-supported (CERN or US resident) is currently ~30%
- US fraction of total effort is <10%

21 LHC Manpower Needs for Core Software
From the LHC Computing ('Hoffmann') Review; FTEs, only computing professionals counted.

           2000 have (miss)   2001    2002    2003    2004    2005
ALICE      12 (5)             17.5    16.5    17      17.5    16.5
ATLAS      23 (8)             36      35      30      28      29
CMS        15 (10)            27      31      33      33      33
LHCb       14 (5)             25      24      23      22      21
Total      64 (28)            105.5   106.5   103     100.5   99.5

22 Personnel Resources – Required and Available
[Chart: estimate of required effort in FTEs (0–60) per quarter, Sep-02 through Mar-05, broken down by activity: SPI, Math libraries, Physics interfaces, Generator services, Simulation, Core Tools & Services, POOL. Blue = available effort; 'Now' marker at the current quarter.]
FTEs today: 18 LCG, 10 CERN IT, 8 CERN EP + experiments
Future estimate: 20-23 LCG, 13 IT, 28 EP + experiments

23 Current Personnel Distribution

24

25 U.S. Leadership
- Direct leadership and financial contribution: T. Wenaus as AA manager
  - In addition to contributions via ATLAS and CMS
  - A 0.75 FTE job requiring CERN residence
  - Salary support from the BNL base program (is this fair?)
  - CERN residency and US travel costs borne by CERN
- Together with the strong U.S. presence in CMS and ATLAS computing leadership, this role gives the U.S. a strong voice in the LCG Applications Area
  - Not a dominating influence, of course; e.g. at this point all the applications area project leaders are Europeans
- Presence at CERN is very important, like it or not
  - Its importance is increased by the utterly deplorable state of the CERN infrastructure for both audio and video conferencing
  - The U.S. should put up the money to fix this if no one else will; it is in our own vital interest

26 Schedule and Resource Tracking (example)

27 Apps Area Planning Materials
- Planning page linked from the applications area page
- Applications area plan spreadsheet: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  - High-level schedule, personnel resource requirements
- Applications area plan document: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  - Incomplete draft
- Personnel spreadsheet
  - http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
  - Currently active/foreseen apps area personnel and activities
- WBS, milestones, assigned personnel resources
  - http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
- Follow the Applications Area planning link on the review web page

28 Core Libraries and Services (SEAL) Project
- Launched in October, led by Pere Mato (CERN/LHCb)
- 6-member (~3 FTE) team initially; M. Marino from ATLAS
- Scope:
  - Foundation, utility libraries
  - Basic framework services
  - Object dictionary
  - Grid-enabled services
- Many areas of immediate relevance to POOL; these are given priority
- Users of this project are software developers in the other projects and the experiments
- Establishing the initial plan, reviewing existing libraries and services
- The process for adopting third-party code will be addressed in this project
- Initial workplan will be presented to SC2 on Jan 10
- Milestone 2003/3/31: SEAL V1 essentials in alpha

29 SEAL Work Packages
- Foundation and utility libraries
  - Boost, CLHEP, ..., complementary in-house development
- Component model and plug-in manager
  - The core expression in code of the component architecture described in the blueprint. Mainly in-house development.
- LCG object dictionary
  - Already an active effort in POOL; being moved to SEAL (wider scope than persistency). Will include filling the dictionary from C++ header files.
- Basic framework services
  - Object whiteboard, message reporting, component configuration, 'event' management
- Scripting services
- Grid services: common interface to middleware
- Education and documentation
  - Assisting experiments with integration

30 Physicist Interface (PI) Project
- Led by Vincenzo Innocente (CERN/CMS)
- Covers the interfaces and tools by which physicists will directly use the software
- Planned scope:
  - Interactive environment: the physicist's desktop
  - Analysis tools
  - Visualization
  - Distributed analysis, grid portals
    - Very poorly defined and understood at present
- Currently surveying the experiments on their needs and interests
- In more of an 'RTAG mode' than project mode initially, to flesh out plans and try to clarify the grid area
- Will present initial plans (and possibly an analysis RTAG proposal) to SC2 on Jan 29

31 Software Process and Infrastructure (SPI)
http://spi.cern.ch
Components available:
- Code documentation, browsing: Doxygen, LXR, ViewCVS
- Testing framework: CppUnit, Oval
- Memory leaks: Valgrind
- Automatic builds: probably the ATLAS system
- Coding and design guidelines: RuleChecker
- CVS organization
- Configuration/release management: SCRAM
- Software documentation templates
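As a small illustration of the kind of unit test the CppUnit framework listed above supports, here is a minimal sketch; the tested function and fixture (vectorMagnitude, MagnitudeTest) are invented examples, not LCG code, and the sketch assumes CppUnit is installed (link with -lcppunit).

```cpp
// Minimal CppUnit sketch; the fixture and tested function are hypothetical examples.
#include <cmath>
#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/ui/text/TestRunner.h>

// Function under test.
double vectorMagnitude(double x, double y, double z) {
  return std::sqrt(x * x + y * y + z * z);
}

class MagnitudeTest : public CppUnit::TestFixture {
  CPPUNIT_TEST_SUITE(MagnitudeTest);
  CPPUNIT_TEST(testPythagoreanTriple);
  CPPUNIT_TEST(testZeroVector);
  CPPUNIT_TEST_SUITE_END();

public:
  void testPythagoreanTriple() {
    // 3-4-12-13 quadruple: sqrt(9 + 16 + 144) == 13
    CPPUNIT_ASSERT_DOUBLES_EQUAL(13.0, vectorMagnitude(3.0, 4.0, 12.0), 1e-12);
  }
  void testZeroVector() {
    CPPUNIT_ASSERT_DOUBLES_EQUAL(0.0, vectorMagnitude(0.0, 0.0, 0.0), 1e-12);
  }
};

int main() {
  CppUnit::TextUi::TestRunner runner;
  runner.addTest(MagnitudeTest::suite());   // suite() is generated by the macros above
  return runner.run() ? 0 : 1;              // run() returns true if all tests pass
}
```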

32 SPI Services
- CVS repositories
  - One repository per project
  - Standard repository structure and #include conventions being finalized this week
  - Will eventually move to the IT CVS service when it is proven
- AFS delivery area, Software Library
  - /afs/cern.ch/sw/lcg
  - Installations of LCG-developed and external software
  - Installation kits for offsite installation
  - LCG Software Library 'toolsmith' started in December
- Build servers
  - Machines with various Linux and Solaris configurations available for use
- Project portal (similar to SourceForge): http://lcgappdev.cern.ch
  - Very nice new system using Savannah (savannah.gnu.org)
  - Used by CMS as well as LCG; ATLAS will probably be using it soon
  - Bug tracking, project news, FAQ, mailing lists, download area, CVS access, ...

33

34 POOL
- Pool of persistent objects for LHC, currently in prototype
  - Targeted at event data but not excluding other data
- Hybrid technology approach
  - Object-level data storage using a file-based object store (ROOT)
  - RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
- Leverages existing ROOT I/O technology and adds value
  - Transparent cross-file and cross-technology object navigation
  - RDBMS integration
  - Integration with Grid technology (e.g. EDG/Globus replica catalog)
  - Network- and grid-decoupled working modes
- Follows and exemplifies the LCG blueprint approach
  - Components with well-defined responsibilities
  - Communicating via public component interfaces
  - Implementation-technology neutral
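A hypothetical C++ sketch of the hybrid-store idea described above: object payloads live in file-based object stores while a relational catalogue maps logical file names to physical files, so a persistent reference can be resolved transparently across files. All names (FileCatalog, PersistentRef, navigate) are invented for illustration and are not the real POOL API.

```cpp
// Conceptual sketch only; not the POOL API. The 'catalogue' stands in for the
// RDBMS/grid file catalogue, the 'object store' for the ROOT file layer.
#include <iostream>
#include <map>
#include <optional>
#include <string>

// Relational-catalogue role: logical file name -> physical file name.
class FileCatalog {
public:
  void registerFile(const std::string& lfn, const std::string& pfn) { lfnToPfn_[lfn] = pfn; }
  std::optional<std::string> physicalName(const std::string& lfn) const {
    auto it = lfnToPfn_.find(lfn);
    return it != lfnToPfn_.end() ? std::optional<std::string>(it->second) : std::nullopt;
  }
private:
  std::map<std::string, std::string> lfnToPfn_;
};

// A persistent reference: which logical file, and where the object lives inside it.
struct PersistentRef {
  std::string lfn;         // logical file name, resolved through the catalogue
  std::string objectPath;  // location within the file (e.g. a tree/branch/entry)
};

// Navigation: resolve the reference through the catalogue, then (in a real
// system) open the ROOT file and read the object; here we only report it.
void navigate(const PersistentRef& ref, const FileCatalog& catalog) {
  if (auto pfn = catalog.physicalName(ref.lfn))
    std::cout << "would read '" << ref.objectPath << "' from " << *pfn << '\n';
  else
    std::cout << "file " << ref.lfn << " not known to the catalogue\n";
}

int main() {
  FileCatalog catalog;
  catalog.registerFile("lfn:events-run1234", "/castor/cern.ch/data/run1234.root");
  navigate(PersistentRef{"lfn:events-run1234", "Events/42"}, catalog);
}
```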

35 POOL Release Schedule
- End September – V0.1 (released Oct 2)
  - All core components for navigation exist and interoperate
  - Assumes ROOT object (TObject) on read and write
- End October – V0.2 (released Nov 15)
  - First collection implementation
- End November – V0.3 (released Dec 18)
  - First public release
  - EDG/Globus FileCatalog integrated
  - Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  - Event metadata annotation and query
- June 2003 – production release

36 POOL Milestones

37 Simulation Project
- Mandated by SC2 to initiate the simulation project following the RTAG
- Project being organized now
- Expected to cover:
  - Generic simulation framework
    - Multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average tracking, utilities
    - ALICE virtual MC as a starting point if it meets requirements
  - Geant4 development and integration
  - FLUKA (development and) integration
  - Physics validation
  - Simulation test and benchmark suite
  - Fast (shower) parameterisation
  - Generator services

38 Comment on the Grid Technology Area (GTA)
Quote from a slide of Les: "LCG expects to obtain Grid Technology from projects funded by national and regional e-science initiatives -- and from industry, concentrating ourselves on deploying a global grid service."
All true, but there is a real role for the GTA in LCG, not just deployment:
- Ensuring that the needed middleware is/will be there, tested, selected and of production grade
- A (re)organization is in progress to create an active GTA along these lines
Important for the Applications Area: AA distributed software will be robust and usable only if the grid middleware it uses is so.

39 Concluding Remarks
- Essentially the full expected AA scope is covered by the anticipated activities of the projects now defined
- Manpower is in quite good shape
- Buy-in by the experiments, apart from ALICE, is good
  - Substantial direct participation in leadership, development, prompt testing and evaluation, RTAGs
  - U.S. CMS is represented well because of its strong presence in computing management and in CERN-based personnel
  - U.S. ATLAS representation will improve with D. Quarrie's relocation to CERN as Software Leader; further increases in CERN presence are being sought
- Groups remote from CERN are contributing, but it isn't always easy
  - We have pushed to lower the barriers, but still it isn't easy
- The new CERN EP/SFT group is taking shape well as a CERN hub for applications area activities
- POOL and SPI are delivering, and the other projects are ramping up
  - First persistency prototype released in 2002, as targeted in March 2002

