U.S. ATLAS Computing - DOE/U.S. ATLAS Computing Review, Sept. 9, 1999 (presentation transcript)

Slide 1: U.S. ATLAS Computing
 Overview
 Status of ATLAS computing
 U.S. ATLAS Project Management Organization
 Status of efforts
   – Core software
   – Subsystems
   – Facilities
 Schedule
 FY 00 funding request
 Summary

Slide 2: Scale of Computing Effort
 Rough scaling of factors of 5 to 1E+3 in relevant parameters from Tevatron experiments:
   – Manpower: x5
   – CPU/event: x1E+3 (event complexity)
   – Data volume: x10 to x1E+2 (channel count)
   – Distribution of data: x10
 U.S. effort comparable to the scale of a Tevatron experiment
 Effort $15M/year
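
To show how these multiplicative factors compose into a requirements estimate, here is a minimal C++ sketch; the Tevatron-era baseline values in it are invented placeholders for illustration, not numbers from this presentation.

```cpp
// Illustrative only: applies the scaling factors quoted above to
// hypothetical Tevatron-era baselines (the baseline values are invented
// placeholders, not figures from this presentation).
#include <cstdio>

int main() {
    const double tev_cpu_per_event  = 1.0;   // hypothetical unit of CPU/event
    const double tev_data_volume_tb = 10.0;  // hypothetical raw data, TB/year
    const double tev_manpower_fte   = 20.0;  // hypothetical software FTEs

    std::printf("CPU/event:   x1000 -> %.0f units\n", tev_cpu_per_event * 1e3);
    std::printf("Data volume: x10..x100 -> %.0f..%.0f TB/year\n",
                tev_data_volume_tb * 10, tev_data_volume_tb * 100);
    std::printf("Manpower:    x5 -> %.0f FTEs\n", tev_manpower_fte * 5);
    return 0;
}
```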

Slide 3: Scales from experience [table/chart not transcribed]

Slide 4: Goals for the next year
 Project organization
   – Management
   – Identify areas of responsibility
 Integration of efforts into ATLAS
 Inception/development of software
 U.S. support facilities
   – Planning/development of infrastructure
 Prepare for reviews

Slide 5: International ATLAS
 New Computing Coordinator: Norman McCubbin (RAL)
   – Available full time in November
   – Approval vote: ATLAS CB, June 10th
   – Responsibility: core software
 New Physics Coordinator: Fabiola Gianotti (CERN)
   – Approval vote: ATLAS CB, June 10th
   – Detector-specific sim/reconstruction organized within subsystems

Slide 6: Architecture Task Force
 Software partitioned into work packages
 Katsuya Amako, KEK
 Laurent Chevalier, CEA
 Andrea Dell'Acqua, CERN
 Fabiola Gianotti, CERN
 Steve Haywood, RAL (Chair)
 Jurgen Knobloch, CERN
 Norman McCubbin, RAL
 David Quarrie, LBL
 R.D. Schaffer, LAL
 Marjorie Shapiro, LBNL
 Valerio Vercesi, Pavia

Slide 7: Architecture T.F. Status
 Three meetings so far
 Directions:
   – Language: C++ (allow for migration to others, e.g. Java)
   – Examine the GAUDI (LHCb) architecture
   – Adoption of "use cases"
 Goals for October:
   – Outline of the architecture design
   – Appointment of a Chief Architect
   – Commission work on prototyping parts of the design
   – Create use cases and a requirements document
   – Define packages and their relations (package diagram)
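
Since the task force is examining GAUDI, here is a simplified, self-contained C++ imitation of the GAUDI-style algorithm lifecycle. The class and method names follow GAUDI's general pattern, but this is an illustrative sketch, not the real GAUDI API (in GAUDI proper, Algorithm, StatusCode, etc. come from the framework libraries).

```cpp
// Simplified imitation of the GAUDI-style algorithm pattern, for flavor only.
#include <cstdio>

enum class StatusCode { SUCCESS, FAILURE };

// Every unit of reconstruction/analysis work is an "algorithm" with a fixed
// lifecycle driven by the framework; user code never owns the event loop.
class Algorithm {
public:
    virtual ~Algorithm() = default;
    virtual StatusCode initialize() = 0;  // once, before the event loop
    virtual StatusCode execute()    = 0;  // once per event
    virtual StatusCode finalize()   = 0;  // once, after the event loop
};

class TrackFitAlg : public Algorithm {
public:
    StatusCode initialize() override { std::puts("book histograms");      return StatusCode::SUCCESS; }
    StatusCode execute()    override { std::puts("fit tracks in event");  return StatusCode::SUCCESS; }
    StatusCode finalize()   override { std::puts("write summary");        return StatusCode::SUCCESS; }
};

int main() {  // stand-in for the framework's event loop
    TrackFitAlg alg;
    alg.initialize();
    for (int event = 0; event < 3; ++event) alg.execute();
    alg.finalize();
}
```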

Slide 8: Stages in Software Management [diagram not transcribed]

Slide 9: Quality Control
 Recommend software performance specifications and review process
 Makoto Asai, Hiroshima
 Dario Barberis, Genoa
 Martine Bosman, Barcelona
 Bob Jones, CERN
 Jean-Francois LaPorte, CEA
 Helge Meinhard, CERN
 Maya Stavrianakou, CERN

Slide 10: Action on Other Groups
 National Board
   – Supported platforms
   – Regional centers
 Training
   – Network of national contacts for training
   – C++, OO programming
   – GEANT4
   – ATLAS-specific

Slide 11: ATLAS/CERN Schedule '00
 Sept. '99: start of Cashmore/Hoffmann review
 Oct. '99: report of architecture T.F.; commissioning of prototyping code
 Jan. '00: start preparations for bilateral agreements
 Fall '00: report of Cashmore/Hoffmann review; MOU preparations

Slide 12: U.S. Participation
 Frank Paige - Co-convenor of SUSY working group
 David Malon - Co-leader of database group
 Craig Tull - Architecture Group
 Ian Hinchliffe - Leader of Event Generator group
 David Quarrie, Marjorie Shapiro - Architecture Task Force
 John Parsons - Co-convenor of Top working group
 Misha Leltchouk - LAr simulation coordinator
 Michael Shupe - Convenor of Background working group
 Fred Luehring - TRT software coordinator
 Steve Goldfarb - Muon Database Coordinator
 Tom LeCompte - Tilecal Database Coordinator
 Krzys Sliwa - Chair of ATLAS World-wide Computing Group
 Frank Merritt - Training contact, Tilecal reconstruction coordinator
 Bruce Gibbard - Regional center contact
 John Huth - National Board contact

Slide 13: U.S. ATLAS Computing
 NSF, DOE: LHC computing activities are now "projectized"
 Implications for U.S. ATLAS:
   – Direct reporting lines through the Project Manager (Bill Willis) and BNL Directorate (Tom Kirk)
   – Appointment of an Associate Project Manager for Computing and Physics (John Huth)
 Implications for agencies:
   – Must clarify reporting lines, operations

Slide 14: Proposed Management [organization chart not transcribed]

Slide 15: Management Structure
 Reflects flow of deliverables to and from ATLAS
 Appointments (2-year renewable terms):
   – Physics: Ian Hinchliffe (LBNL)
   – Facilities: Bruce Gibbard (BNL) + deputy
 Issues:
   – Software manager: availability within U.S. ATLAS - hire?
   – Flatter structure for the time being?
   – Bring on soon? Most desirable!

Slide 16: Management Plan
 Associate Project Manager:
   – Member of E.C.
   – Develop and execute the project plan
   – Establish and maintain project organization and tracking
   – Develop annual budget requests
   – Liaison to ATLAS Computing Management
   – Appoint L2 managers
   – Review and approve MOUs with CERN and institutes
   – Exercise change control authority
   – Establish advisory committees where appropriate
   – Provide reports and organize reviews

Slide 17: Implications for APM
 APM is a time-consuming job
 Actions for John Huth:
   – Relief from MDT electronics coordinator role (Jay Chapman, U. Michigan)
   – Relief from U.S. ATLAS Institute Board Chair (Jim Siegrist, LBNL)
   – Teaching relief for spring terms '00 and '01, granted by Harvard University

Slide 18: Level 2 Managers
 Appointed by APM with the concurrence of the Exec. Comm.
 Members of E.C. (+ APM, + deputy)
 Two-year renewable terms
 Generic responsibilities:
   – Develop definitions of milestones and deliverables
   – Define, with the APM, the organizational substructure of level 2
   – Develop, with the APM, annual budget proposals
   – Identify resource imbalances within subprojects and recommend adjustments
   – Deliver the scope of the subproject on time and within budget
   – Maintain cost and schedule
   – Provide reports to APM, PM
   – Liaison with counterparts at CERN

Slide 19: Specific Responsibilities
 Physics Manager
   – Generators, physics objects, benchmark studies, mock data challenge
 Software
   – Core
   – Detector-specific sim/recon
   – Training
 Facilities
   – Tier 1, Tier 2, networking, support

Slide 20: Project Engineer
 Same role as the project engineer's on the construction project:
   – Tracking
   – Reviews, oversight
   – Reporting
   – Technical input

Slide 21: Proposed Names
 Physics: Ian Hinchliffe (LBNL)
   – Contacted, accepted
   – Policy question: physicists on project
 Software: search underway
 Facilities: Bruce Gibbard (+ deputy)
   – Deputy: Jim Shank

Slide 22: Facilities Manager
 Bruce Gibbard (BNL)
 Proposal to add a U.S. ATLAS deputy
   – U.S. ATLAS and RHIC deputies
   – General agreement with Bruce, Tom Kirk
 Begin to fill in other areas:
   – Networking
   – Tier 1
   – Remote sites
   – Support

Slide 23: Detector Contacts
 LAr - Srini Rajagopalan (BNL)
 Tilecal - Frank Merritt (U. Chicago)
 ID - Laurent Vacavant (LBNL)
 Muon - Bing Zhou (U. Michigan)
 TRT - Keith Baker (Hampton)
 Trigger/DAQ - Andy Lankford (UCI)

Slide 24: Planning Activities
 Writing/preparation assignments for review - next week:
   – L2 managers where appropriate (i.e. facilities, physics)
   – Core
   – Sim/recon
   – Training
   – Management (PMP for computing)
   – MRE/IT "team"
 Review in October

Slide 25: WBS Structure
 Should be flexible while project definition is underway (level 3 and beyond)
 Level 2s should be fixed now
 Commensurate with management structure
 Adopt lead number "2"

Slide 26: High Levels of WBS
 Draft WBS:
   – 2.1 Physics: generators, benchmarks, mock data challenges, physics objects
   – 2.2 Software
     – 2.2.1 Core: control/framework, database, event model, analysis tools
     – 2.2.2 Detector-specific simulation and reconstruction
     – 2.2.3 Collaborative tools
     – 2.2.4 Training
   – 2.3 Facilities: regional center, remote sites, networking, support

Slide 27: Near-Term Activities/Issues
 U.S. ATLAS web site
 Weekly video conferences
 Support role of BNL
 Gathering FY 00 requests
 Advisory group appointment
 Writing assignments for proposal
 NSF MRE/IT proposal - Tier 2 centers
 Discussions of deliverables with ATLAS
 Interactions with agencies
   – JOG, computing review

Slide 28: Software
 Core software:
   – Control/framework (Tull)
   – Database, Tilecal pilot project (Malon)
   – Event model (Rajagopalan)
 Detector-specific sim/reconstruction:
   – Representatives from subsystems chosen
 Training (Merritt):
   – Establishment of OO courses (BNL, U. Chicago)

Slide 29: General Requirements
 Software must last over the lifetime of the experiment, yet track language changes
 Well-defined interface layers (see the sketch below)
 Maintainability and engineering are critical
   – Number of users, use of software professionals
 Adaptability to distributed environments
 Learn from experiments working on OO (BaBar, D0, CDF, STAR)
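
A minimal C++ sketch of the "well-defined interface layer" idea: client code programs against an abstract interface, so the concrete implementation can be replaced over the experiment's lifetime without touching clients. All names here are invented for illustration, not taken from ATLAS code.

```cpp
// Minimal sketch of an interface layer (illustrative names, not ATLAS code).
#include <cstdio>
#include <memory>

class IMagneticField {                        // stable interface layer
public:
    virtual ~IMagneticField() = default;
    virtual double bz(double x, double y, double z) const = 0;  // tesla
};

class UniformField : public IMagneticField {  // today's simple implementation
public:
    double bz(double, double, double) const override { return 2.0; }
};

// A future implementation (field map, external service, ...) derives from
// the same interface; callers like this one never need to change.
void printField(const IMagneticField& field) {
    std::printf("Bz at origin: %.1f T\n", field.bz(0, 0, 0));
}

int main() {
    std::unique_ptr<IMagneticField> field = std::make_unique<UniformField>();
    printField(*field);
}
```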

Slide 30: Database
 David Malon (ANL)
 Tilecal pilot project:
   – Tilecal testbeam data in an object database
   – Testbed for ATLAS technologies and strategies
   – Early feedback to developers
   – Generalized to other subsystems
 Database core software:
   – Transient and persistent object mapping (sketch below)
   – Definition of the database/control interface
   – Specifications
   – Examine alternatives to Objectivity
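
A rough C++ sketch of the transient/persistent split (the classes are hypothetical; the real ATLAS design was still being defined at this point): physics code sees only the transient object, a converter maps it to a storage-side form, and so the storage technology, Objectivity or an alternative, can be swapped behind the converter.

```cpp
// Rough sketch of transient/persistent object mapping (hypothetical classes).
// The converter is the only place that knows both representations, isolating
// the choice of database technology.
#include <cstdio>

struct CaloCell {            // transient object used by reconstruction code
    int    id;
    double energy;           // GeV
};

struct PersistentCaloCell {  // storage-side representation (could differ:
    int   id;                // packed fields, lower precision, etc.)
    float energy;
};

class CaloCellConverter {
public:
    static PersistentCaloCell toPersistent(const CaloCell& c) {
        return { c.id, static_cast<float>(c.energy) };
    }
    static CaloCell toTransient(const PersistentCaloCell& p) {
        return { p.id, static_cast<double>(p.energy) };
    }
};

int main() {
    CaloCell cell{42, 13.7};
    PersistentCaloCell stored = CaloCellConverter::toPersistent(cell);
    CaloCell back = CaloCellConverter::toTransient(stored);
    std::printf("cell %d: %.2f GeV (round-tripped)\n", back.id, back.energy);
}
```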

Slide 31: Tilecal Model [diagram not transcribed]

Slide 32: Database Schedule [chart not transcribed]

Slide 33: Database Milestones
 Jan. '00: survey of mapping strategies
 Feb. '00: infrastructure for developers deployed
 April '00: validation of strategies in testbed
 July '00: database management infrastructure defined
 Oct. '00: infrastructure for distributed access deployed
 Jan. '01: scalability test
 Oct. '01: beta release

Slide 34: Control/Framework
 Craig Tull (LBNL)
 Working on user requirements document (with Hinchliffe, Shapiro, Vacavant)
 Market survey of framework systems:
   – Object component model
   – AC++
 Compatibility with ATLAS architecture
 Resource-loaded work plan exists
 Work with the A.T.F. on design requirements
 Already have tested prototype designs

Slide 35: Framework Milestones [chart not transcribed]

Slide 36: Framework Schedule [chart not transcribed]

Slide 37: One Framework Model (diagram layer labels)
 Software bus (e.g. CORBA)
 Component C++ classes/objects
 Component class adapters
 Scripting interface (e.g. Tcl, …)
 Command marshalling (e.g. SWIG, …)
 GUI interface (e.g. Tk, …)
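
A toy C++ sketch of the component/adapter layering in this model. It is entirely illustrative: the real design would use CORBA for the bus and SWIG-generated wrappers for the Tcl scripting layer, while here both are reduced to a string-command interface, and all names are invented.

```cpp
// Toy sketch of the component/adapter layering: plain C++ components are
// wrapped by adapters exposing a uniform command interface that a "software
// bus" and scripting layer can drive. Not CORBA/SWIG, just the structure.
#include <cstdio>
#include <map>
#include <string>

class Tracker {                        // plain component C++ class
public:
    void reconstruct() { std::puts("reconstructing tracks"); }
};

class IComponentAdapter {              // uniform face shown to the bus
public:
    virtual ~IComponentAdapter() = default;
    virtual void command(const std::string& cmd) = 0;  // marshalled call
};

class TrackerAdapter : public IComponentAdapter {
public:
    void command(const std::string& cmd) override {
        if (cmd == "reconstruct") tracker_.reconstruct();
        else std::printf("unknown command: %s\n", cmd.c_str());
    }
private:
    Tracker tracker_;
};

int main() {
    // "Software bus": components registered by name and driven by text
    // commands, as a scripting interface would issue them.
    std::map<std::string, IComponentAdapter*> bus;
    TrackerAdapter tracker;
    bus["tracker"] = &tracker;
    bus["tracker"]->command("reconstruct");
}
```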

Slide 38: Event Model
 Type of objects that are stored
 Client's view of the event
 Physical organization of the event
 Client's interface to the event
 Mechanism to navigate between objects
 Transient-to-persistent object mapping
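
To make these bullets concrete, a hypothetical C++ sketch of a client's view of the event (names invented for illustration): clients get typed collections and navigate links between related objects, while the physical organization and the transient-to-persistent mapping stay hidden behind the interface.

```cpp
// Hypothetical sketch of a client's view of the event (invented names).
#include <cstdio>
#include <vector>

struct Cluster { double energy; };

struct Track {
    double pt;
    const Cluster* matched = nullptr;   // navigation between stored objects
};

class Event {                            // client's interface to the event
public:
    std::vector<Cluster> clusters;
    std::vector<Track>   tracks;
};

int main() {
    Event evt;
    evt.clusters.push_back({25.3});
    evt.tracks.push_back({24.9, &evt.clusters[0]});

    for (const Track& t : evt.tracks)    // client code: typed access + links
        if (t.matched)
            std::printf("track pt=%.1f matched to cluster E=%.1f\n",
                        t.pt, t.matched->energy);
}
```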

Slide 39: Event Model Milestones
 Nov. '99: review of models in other experiments; requirements documents
 Dec. '99: resource-loaded schedule
 Mar. '00: baseline design
 June '00: alpha release
 Aug. '00: review experience
 FY 01: beta release

Slide 40: Framework Options [chart not transcribed]

Slide 41: Some Detector Activities
 TRT/ID:
   – Put full TRT simulation into GEANT4 (see the sketch below)
 LAr:
   – Coil, cryostats in GEANT4 (Nevis)
   – Accordion structure in GEANT4 (BNL)
 Tilecal:
   – Pilot project
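
For flavor, a minimal GEANT4-style detector construction in C++: a generic solenoid coil modelled as an aluminium tube inside an air-filled world. The geometry, materials, and dimensions are invented for this sketch; this is not the actual Nevis/BNL geometry work cited above.

```cpp
// Minimal GEANT4 detector construction sketch (invented toy geometry).
#include "G4VUserDetectorConstruction.hh"
#include "G4Material.hh"
#include "G4Box.hh"
#include "G4Tubs.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4SystemOfUnits.hh"

class ToyCoilConstruction : public G4VUserDetectorConstruction {
public:
    G4VPhysicalVolume* Construct() override {
        // Simple single-element materials (name, Z, atomic mass, density).
        auto* air = new G4Material("Air", 7., 14.01*g/mole, 1.29*mg/cm3);
        auto* al  = new G4Material("Aluminium", 13., 26.98*g/mole, 2.70*g/cm3);

        // World volume: a 10 m air-filled box.
        auto* worldBox  = new G4Box("World", 5*m, 5*m, 5*m);
        auto* worldLog  = new G4LogicalVolume(worldBox, air, "World");
        auto* worldPhys = new G4PVPlacement(nullptr, G4ThreeVector(),
                                            worldLog, "World",
                                            nullptr, false, 0);

        // Coil: aluminium tube, 1.2-1.3 m radius, 5 m long, full azimuth.
        auto* coilTube = new G4Tubs("Coil", 1.2*m, 1.3*m, 2.5*m, 0., 360.*deg);
        auto* coilLog  = new G4LogicalVolume(coilTube, al, "Coil");
        new G4PVPlacement(nullptr, G4ThreeVector(), coilLog, "Coil",
                          worldLog, false, 0);

        return worldPhys;
    }
};
```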

Slide 42: Some Detector Activities (continued)
 Muon:
   – Study of noise in Higgs -> 4 muons
   – Combined performance of ID + muon system (A reconstruction)
   – CSC into simulation
 Trigger/DAQ:
   – Comparison of switching architectures
 Background studies:
   – Optimization of shielding (100 MeV muon background)

Slide 43: Training
 New paradigm of OO programming
 Training courses (F. Merritt):
   – Course offered at BNL (near future)
   – Course offered at Chicago
 Successful programs seen at other experiments (CDF, D0, BaBar)
 Ongoing need for training throughout the course of the experiment
 Documentation
 ATLAS-specific

Slide 44: Software Development
 Asymptotic level: est. 10 software professionals
 Peak load (circa 2003): est. 16 S.P.s
 Extrapolations based on existing experiments, proposed areas of responsibility, and the fraction of U.S. participation
 Choice of technology can strongly influence actual needs (e.g. BaBar, STAR in database)

Slide 45: Planned Training
 All by Object Mentor (BaBar, others)
 Organized by Frank Merritt
 Courses approximately 1 week long:
   – Aug. 9 - BNL - OO Design - 13 people
   – Sept. 20 - U.C. - OO Design - 15 people
   – Oct. 18 - ANL or BNL - Advanced OO - 10 people
   – Nov. 8 - FNAL - GEANT4 - 14 people

Slide 46: Facilities
 BNL ramping up support facility
   – Taps into the RHIC Computing Facility
 Major issue of Tier 1/Tier 2 facilities:
   – Scale of "Tier 2s"
   – Size of support staff, infrastructure
   – Computing model for U.S. (e.g. grids)
   – Being addressed in NSF MRE/IT proposal
 In the process of developing policy on usage and support of platforms at institutions

Slide 47: Facilities
 Tier 1 (Regional Center): BNL
   – Leverages RCF; ATLAS-specific needs, however
 Primary support function for U.S.:
   – Code release, support
   – Major processing, event store
 Personnel scale estimate:
   – Roughly linear ramp from 2 FTEs (now) to 22 or more (depending on computing model)

Slide 48: MONARC
 Models of Networked Analysis at Regional Centres (ATLAS + CMS)
 Alexander Nazarenko, Tufts hire
 Tasks:
   – Validate simulation models
   – Perform first simulations of LHC architectures
   – After Dec. '99, focus on planning for regional centers
 Model validation: end of September
 Understanding of U.S. computing facilities
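
As a toy illustration of the kind of question such architecture simulations answer, here is a back-of-envelope C++ sketch, not the MONARC simulation tool itself, with all parameters invented: given a WAN link and a CPU farm, is a reconstruction pass transfer-limited or compute-limited?

```cpp
// Back-of-envelope sketch (invented parameters; not the MONARC tool):
// estimates the wall time for one pass when jobs at a remote center must
// pull event data from the regional center over a shared WAN link.
#include <algorithm>
#include <cstdio>

int main() {
    const double events          = 1e7;   // events in the sample
    const double event_size_mb   = 1.0;   // MB per event
    const double wan_mb_per_s    = 12.5;  // shared 100 Mbit/s link
    const double cpu_s_per_event = 0.5;   // per-event processing time
    const int    cpus            = 256;   // nodes at the remote center

    double transfer_s = events * event_size_mb / wan_mb_per_s;
    double compute_s  = events * cpu_s_per_event / cpus;

    // With streaming, the pass is limited by the slower of the two stages.
    double wall_days = std::max(transfer_s, compute_s) / 86400.0;
    std::printf("transfer-limited: %s, wall time ~ %.1f days\n",
                transfer_s > compute_s ? "yes" : "no", wall_days);
}
```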

Slide 49: NSF MRE/IT Proposal
 Tier 2 centers:
   – Approx. 5 total
   – 256-node systems
   – 100 TB tape system
   – Low maintenance
 Linked by a computing grid
 Computing professionals
   – Dual role: users/developers

Slide 50: Review Process
 Send proposals to the Advisory Group (Aug. 4th)
 Charge:
   – Identify areas of overlap/commonality
   – Suggest simplifications/savings
   – Coherency with the ATLAS effort
   – Prioritize
 Meet with agencies to establish the scale of the initial FY 00 request (now)
 Confer with the Advisory Group on feedback
 Prepare and review documentation for the Dec. review (mid-to-late Sept.)

Slide 51: Priorities
 Critical personnel
   – People who would otherwise be lost, fulfilling a critical role
 Core software effort
   – Prerequisite to inclusion of sim/recon software
   – Yet cannot commit to a major ramp (no MOUs)
 Support of U.S. efforts (facilities)
 Critical studies
 Transition to OO

Slide 52: Priorities (continued)
 Coherency in development of plan
 Matching of facilities scope to usage
   – E.g. database effort, simulations
 Contiguous/overlapping areas
   – E.g. event model, database, control/framework

Slide 53: FY 00 Needs
 Starting Oct. 1st, will need more than a simple continuation of the present level:
   – Support functions at BNL
   – Deputy Facilities Manager
   – Core support: database
   – Estimate: roughly a 4-FTE increment
 Remaining needs made part of the review process for Dec.
 Estimates for needs beyond December should still be given now

Slide 54: Facilities
 Support for U.S. users is a necessary precondition for effective U.S. participation (like training)
 Use of RCF leverages existing facilities
 Requesting 3 FTEs Oct. 1 (deputy + support)
 $50K of equipment Oct. 1 (500 SpecInt95)
 $500K (CPU, disk, mass storage) after review

Slide 55: Training
 A necessary precondition to effective U.S. participation
 Must be done now (trained group of physicists)
 Substantial pay-back (experience in industry)
 NSF request of $75K to subsidize courses ($1K/student + setup)

Slide 56: Leveraging Funds
 Groups that have applied for internal funds:
   – BNL: already supporting 2 FTEs for FY 00 (core software)
   – LBNL: request for support on control/framework
   – U. Chicago: advance support for training; request for a software professional to start

Slide 57: FTE Request [table not transcribed]

Slide 58: Schedule
 July:
   – Propose management structure to E.C., PM
   – Collaboration meeting
   – Tier 1/2 scoping
   – Plans for FY 00 reviewed
   – MRE "white paper"
 August:
   – Present FY 00 plans to agencies
   – Outline and writing assignments for proposal (Dec.)

Slide 59: Schedule (continued)
 September:
   – First drafts of proposal
   – Management PMP
   – Software: core and recon/sim
   – Facilities
   – Training, collaborative tools
 October:
   – Revise proposal, review
 November:
   – Meeting to prepare for Dec. review

Slide 60: Schedule (continued)
 December (January?):
   – Agency review
 January:
   – Revise funding plan for FY 00
   – Begin work on bilateral agreements
 Ongoing, and beyond January:
   – Prototyping code
   – Progress toward baselining
   – Filling in management slots
   – Bilateral agreements

Slide 61: Summary
 Project organization
   – Management
   – Identify areas of responsibility
 Integration of efforts into ATLAS
 Inception/development of software
 U.S. support facilities
   – Planning/development of infrastructure
 Prepare for reviews

Slide 62: Summary
 Major points:
   – Oct. 1st FY 00 needs are more than ongoing (4 FTE)
   – Hardware to augment RCF
   – Training physicists in OO design
   – Continue to fill in management structure

