
Slide 1: U.S. ATLAS Overview
- Project Management: J. Huth
- Software: T. Wenaus
- Architecture: D. Quarrie
- Physics: I. Hinchliffe
- Data Management: D. Malon
- Facilities Overview: B. Gibbard / R. Baker
- Grid Computing, ATLAS Testbed: R. Gardner / K. De
- Project Management Perspective: H. Gordon

Slide 2: U.S. ATLAS Project Overview
John Huth, Harvard University
LHC Computing Mid-Term Review, NSF Headquarters, June 2002

Slide 3: Major Changes Since Nov. Review
- Research Program launched – M+O and Computing considered as one "program"
  - Funding shortfall relative to guidance
  - Extreme tension on the M+O and Computing split
- LCG Project launched
  - Major US ATLAS participation
- LHC turn-on 2007
- Data Challenge delays
- Hybrid ROOT/RDB solution adopted
- Supercomputing demonstrations (iVDGL/GriPhyN/PPDG)
- Work toward university-based ITR proposal

Slide 4: International ATLAS
- Issue of competing frameworks has been settled
- New simulation coordinator: Andrea Dell'Acqua
- Major new releases of Athena (control/framework)
- Lack of infrastructure remains a problem
  - Intl. ATLAS management is determined to solve it!
- Ramp-up of grid activities, in the context of the Data Challenges
  - P. Nevski (BNL) in support role
  - Delay of Data Challenge 1, phase 2, by approx. 6 months
- J. Shank named muon software coordinator
- CDF geometry model the leading candidate (J. Boudreau)
- Hybrid database decision (LCG RTAG – see talks)

Slide 5: FTE Fraction of Core SW (chart)

Slide 6: Project Core SW FTE (chart)

Slide 7: ATLAS Detector/Task Matrix (Chair / Reconstruction / Simulation / Database)
- Overall: Offline Coordinator N. McCubbin / Reconstruction D. Rousseau / Simulation A. Dell'Acqua / Database D. Malon
- Inner Detector: D. Barberis / D. Rousseau / F. Luehring / S. Bentvelsen
- Liquid Argon: J. Collot / S. Rajagopalan / M. Leltchouk / H. Ma
- Tile Calorimeter: A. Solodkov / F. Merritt / A. Solodkov / T. LeCompte
- Muon: J. Shank / J.F. Laporte / A. Rimoldi / S. Goldfarb
- LVL 2 Trigger / Trigger DAQ: S. George / S. Tapprogge / M. Wielers / A. Amorim
- Event Filter: F. Touchard / M. Bosman
Physics Coordinator: F. Gianotti; Chief Architect: D. Quarrie

Slide 8: LCG Project Structure (organization chart)
Bodies shown: LHCC; Project Overview Board; Software and Computing Committee (SC2) – requirements, monitoring; Project Execution Board with Project Manager and project execution teams; Resources Board – resource issues; Technical Study Groups – reports, reviews; links to other labs, HEP grid projects, and computing grid projects.

Slide 9: LHC Computing Grid Project
- Three main bodies:
  - Project Oversight Board – yet to meet (J. Huth, US rep.)
  - SC2 – gives requirements (L. Bauerdick, US rep.)
  - Project Execution Board – T. Wenaus, Applications leader
- Database RTAG gives recommendation (see talks)
  - US ATLAS contributing (consonant with ATLAS goals)
- Phase I until 2004 (R&D), Phase II afterward
- 20% data challenge milestone in Dec. 2004 for Tier 0, 1, 2, 3
  - Drives Tier 1, 2 funding

Slide 10: Software
- Further consolidation
  - Hybrid event store (ROOT + relational DB) – see the conceptual sketch after this slide
  - Second framework effectively dead
- Athena developments
  - Architecture model including triggers
- Infrastructure at intl. ATLAS remains a problem
  - One solution is "tithing" countries
  - Tara Shears – web documentation, CVS support
  - Steve O'Neal – librarian
- Data challenges and incorporation of OO reconstruction
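The hybrid event store referred to above pairs ROOT files (bulk event data) with a relational database (navigational and collection metadata). The sketch below is only a toy illustration of that division of labor, not the ATLAS/LCG implementation; the SQLite schema, table and function names, and file names are all hypothetical.

```python
# Toy illustration of a "hybrid" event store: event payloads are assumed to
# live in ROOT files (represented here only by file paths), while a relational
# database holds the navigation metadata needed to find them.
import sqlite3

def create_catalog(db_path="event_catalog.db"):
    # One table of navigational metadata: which ROOT file and which tree
    # entry hold a given (run, event). Schema is hypothetical.
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS events (
                       run INTEGER, event INTEGER,
                       file_path TEXT, tree_entry INTEGER,
                       PRIMARY KEY (run, event))""")
    return con

def register_event(con, run, event, file_path, tree_entry):
    # The ROOT file itself would be written by the event-store layer;
    # only its location metadata goes into the relational catalog.
    con.execute("INSERT OR REPLACE INTO events VALUES (?, ?, ?, ?)",
                (run, event, file_path, tree_entry))

def locate_event(con, run, event):
    # Return (file_path, tree_entry) telling a reader where the event
    # payload lives, or None if it is not catalogued.
    return con.execute("SELECT file_path, tree_entry FROM events "
                       "WHERE run = ? AND event = ?", (run, event)).fetchone()

if __name__ == "__main__":
    con = create_catalog(":memory:")            # in-memory DB for the demo
    register_event(con, run=1, event=42,
                   file_path="dc1_events_0001.root", tree_entry=7)
    print(locate_event(con, 1, 42))             # ('dc1_events_0001.root', 7)
```

The point of the split is that the bulky, schema-rich event data stays in ROOT's optimized I/O format, while queries such as "where is event 42 of run 1?" go to a conventional database.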

Slide 11: Physics, Facilities
- New hire at LBNL for physics support
- Tier 1 ramp slowed by funding profile
  - At a smaller level for participation in data challenges (barely making the cut); participation smaller than % of authors
  - 2004 milestone for LCG
  - Continuing to explore all-disk solution
- Tier 2s – US ATLAS testbed plans for SC2002 demos
  - Grid-enabled production
  - Monitoring, virtual data, interoperability with CMS, replica and metadata catalogs (GriPhyN, iVDGL, PPDG) – see the replica-catalog sketch after this slide
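One of the grid components named above, the replica catalog, maps a logical file name to the physical copies held at different sites, so production and analysis jobs can find the nearest replica. The sketch below shows only that core idea; it is not the GriPhyN/iVDGL/PPDG software, and the class name, methods, and site URLs are invented for illustration.

```python
# Minimal sketch of a replica catalog: one logical file name (LFN) is mapped
# to any number of physical file names (PFNs) at different sites.
from collections import defaultdict

class ReplicaCatalog:
    def __init__(self):
        self._replicas = defaultdict(list)   # LFN -> list of PFNs

    def register(self, lfn, pfn):
        """Record one physical replica of a logical file."""
        if pfn not in self._replicas[lfn]:
            self._replicas[lfn].append(pfn)

    def lookup(self, lfn):
        """Return all known physical locations for a logical file."""
        return list(self._replicas.get(lfn, []))

if __name__ == "__main__":
    rc = ReplicaCatalog()
    # Hypothetical site URLs, purely for illustration.
    rc.register("dc1.simul.0001.root",
                "gsiftp://tier1.example.org/atlas/dc1.simul.0001.root")
    rc.register("dc1.simul.0001.root",
                "gsiftp://tier2.example.edu/atlas/dc1.simul.0001.root")
    print(rc.lookup("dc1.simul.0001.root"))
```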

Slide 12: US ATLAS Persistent Grid Testbed (site map)
Sites: Brookhaven National Laboratory, Argonne National Laboratory, LBNL-NERSC, UC Berkeley, Boston University, Indiana University, U Michigan, Oklahoma University, University of Texas at Arlington (map legend marks prototype Tier 2s and HPSS sites).
Networks: ESnet, Abilene, CalREN, NTON, MREN, NPACI.

Slide 13: Funding Issues
- Changing budget numbers; the uncertainties are very difficult to live with
- For FY02, living on goodwill; counting on $520k from NSF to prevent further reductions in force
- ANL, BNL, LBNL base programs are severely squeezed, placing additional burden on project costs
- Attempting to put together a large ITR proposal with US ATLAS universities
  - Monitoring, quality of service, networking

Slide 14: Positive News on Funding
- UT Arlington – MRI for D0 (ATLAS), K. De
  - High Energy Physics Analysis Center
- J. Huth / J. Schopf (ANL/NW/Harvard) ITR grant
  - Resource Estimation for High Energy Physics
- Large ITR
  - Resource estimation (continuation of the above)
  - Networking
  - Quality of service
  - Grid monitoring
  - Security
  - Integration of all of the above into the ATLAS fabric

Slide 15: Budgeting
- Attempt to meet obligations for software and facilities
- Make "bare bones" assumptions – stretch-out of the LHC schedule
- A big unknown is the result of the split between M+O and computing funds
- Have to guess at profiles for the time being

Slide 16: Assumed Budget Profile (AY$) (chart)

Slide 17: Bare Bones Budget (chart)

Slide 18: Responses to Recommendations from the Nov. Review
- Received this past week
- Chief architect should have resources/authority
  - Infrastructure remains an issue; Torsten Akesson is trying to solve this via the NCB – it has been recognized as the major issue in intl. ATLAS. A software infrastructure team is being identified.
- Software decisions
  - Resolution of second framework; some improvements overall, far from perfect
- Be less willing to take on additional workloads
  - Trying, but infrastructure support is a major issue – in the process of resolution
- Staffing for DC 1 – progress, funding issues

Slide 19: Summary
- We are making progress on many fronts, despite funding uncertainties
  - US ATLAS testbed, control/framework, data management, incorporation of grid tools
- Tier 1 is the most severely impacted by funding uncertainties
- Intl. ATLAS situation is improving, but still has some distance to go
- We are barely rolling on the present budget
  - Severe impact of base program cuts at US ATLAS institutions

