Slide 1: U.S. ATLAS Overview
- Project Management: J. Huth
- Software: T. Wenaus
- Architecture: D. Quarrie
- Physics: I. Hinchliffe
- Data Management: D. Malon
- Facilities Overview: B. Gibbard / R. Baker
- Grid Computing, ATLAS Testbed: R. Gardner / K. De
- Project Management Perspective: H. Gordon

Slide 2 (title): U.S. ATLAS Project Overview
John Huth, Harvard University
LHC Computing Mid-Term Review, NSF Headquarters, June 2002

Slide 3: Major Changes Since Nov. Review
- Research Program launched: M+O and Computing are now considered as one "program"
  - Funding shortfall relative to guidance
  - Extreme tension on the M+O/Computing split
- LCG Project launched, with major U.S. ATLAS participation
- LHC turn-on now 2007
- Data Challenge delays
- Hybrid ROOT/relational-database persistency solution adopted
- Supercomputing demonstrations (iVDGL/GriPhyN/PPDG)
- Work toward a university-based ITR proposal

Slide 4: International ATLAS
- The issue of competing frameworks has been settled
- New simulation coordinator: Andrea Dell'Acqua
- Major new releases of Athena (the control/framework)
- Lack of infrastructure remains a problem; international ATLAS management is determined to solve it
- Ramp-up of grid activities in the context of the Data Challenges
  - P. Nevski (BNL) in a support role
- Data Challenge 1, Phase 2 delayed by approximately 6 months
- J. Shank named muon software coordinator
- The CDF geometry model is the leading candidate (J. Boudreau)
- Hybrid database decision (LCG RTAG; see talks)

Slide 5: FTE Fraction of Core SW (chart)

Slide 6: Project Core SW FTE (chart)

Slide 7: ATLAS Detector/Task Matrix

Overall coordinators: Offline Coordinator: N. McCubbin; Reconstruction: D. Rousseau; Simulation: A. Dell'Acqua; Database: D. Malon

| System                     | Chair       | Reconstruction | Simulation   | Database      |
|----------------------------|-------------|----------------|--------------|---------------|
| Inner Detector             | D. Barberis | D. Rousseau    | F. Luehring  | S. Bentvelsen |
| Liquid Argon               | J. Collot   | S. Rajagopalan | M. Leltchouk | H. Ma         |
| Tile Calorimeter           | A. Solodkov | F. Merritt     | A. Solodkov  | T. LeCompte   |
| Muon                       | J. Shank    | J.F. Laporte   | A. Rimoldi   | S. Goldfarb   |
| LVL2 Trigger / Trigger DAQ | S. George   | S. Tapprogge   | M. Weilers   | A. Amorim     |
| Event Filter               | F. Touchard | M. Bosman      |              |               |

Physics Coordinator: F. Gianotti. Chief Architect: D. Quarrie.

Slide 8: LCG Project Structure (organization chart). Elements shown: LHCC; Project Overview Board; Software and Computing Committee (requirements, monitoring); Project Execution Board with Project Manager and project execution teams; Resources Board (resource issues); Technical Study Groups (reports, reviews); interfaces to Computing Grid Projects, HEP Grid Projects, and other labs.

Slide 9: LHC Computing Grid Project
- Three main bodies:
  - Project Oversight Board: yet to meet (J. Huth, U.S. representative)
  - SC2: sets requirements (L. Bauerdick, U.S. representative)
  - Project Execution Board: T. Wenaus is Applications area leader
- Database RTAG has given its recommendation (see talks)
- U.S. ATLAS contributing, consonant with ATLAS goals
- Phase I (R&D) runs until 2004; Phase II afterward
- 20% data challenge milestone in December for Tiers 0, 1, 2, and 3
- Drives Tier 1 and Tier 2 funding

Slide 10: Software
- Further consolidation
- Hybrid event store (ROOT plus a relational database); see the sketch below
- The second framework is effectively dead
- Athena developments: architecture model now includes triggers
- Infrastructure at international ATLAS remains a problem
  - One proposed solution is "tithing" countries for infrastructure effort
  - Tara Shears: web documentation, CVS support
  - Steve O'Neal: librarian
- Data Challenges and incorporation of OO reconstruction
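To make the hybrid event-store idea named above concrete (bulk event data in ROOT files, bookkeeping in a relational database), here is a minimal, hypothetical Python sketch. SQLite stands in for the relational catalog; the table layout, function names, and file-naming scheme are illustrative assumptions, not the actual ATLAS/LCG design.

```python
# Hypothetical sketch of a hybrid event store: event payloads live in
# ROOT files, while a relational catalog maps (run, event) to the file
# and entry that hold the payload. SQLite stands in for the RDBMS.
import sqlite3

def create_catalog(path):
    """Create the event catalog; the schema is an illustrative assumption."""
    db = sqlite3.connect(path)
    db.execute("""
        CREATE TABLE IF NOT EXISTS event_index (
            run INTEGER,
            event INTEGER,
            root_file TEXT,      -- ROOT file holding the payload
            tree_entry INTEGER,  -- entry number within that file's tree
            PRIMARY KEY (run, event)
        )""")
    return db

def register_event(db, run, event, root_file, tree_entry):
    """Record where an event's payload was written."""
    db.execute("INSERT OR REPLACE INTO event_index VALUES (?, ?, ?, ?)",
               (run, event, root_file, tree_entry))

def locate_event(db, run, event):
    """Return (root_file, tree_entry) for an event, or None if unknown."""
    return db.execute(
        "SELECT root_file, tree_entry FROM event_index WHERE run=? AND event=?",
        (run, event)).fetchone()

if __name__ == "__main__":
    db = create_catalog(":memory:")
    register_event(db, run=1, event=42, root_file="dc1.0001.root", tree_entry=41)
    print(locate_event(db, 1, 42))  # -> ('dc1.0001.root', 41)
```

The design point is the split of responsibilities: the relational side handles queries and bookkeeping, while the object payloads stay in ROOT's efficient columnar storage.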

Slide 11: Physics, Facilities
- New hire at LBNL for physics support
- Tier 1 ramp-up slowed by the funding profile
  - Participation in the Data Challenges at a reduced level (barely making the cut); participation smaller than our percentage of authors
  - 2004 milestone for LCG
  - Continuing to explore an all-disk solution
- Tier 2s: U.S. ATLAS testbed plans for SC2002 demos
  - Grid-enabled production
  - Monitoring, virtual data, interoperability with CMS, replica and metadata catalogs (GriPhyN, iVDGL, PPDG); a replica-catalog sketch follows this list
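As an illustration of the replica catalog mentioned above (one logical file name mapped to physical copies at several grid sites), here is a minimal, hypothetical Python sketch. The class, method names, and URLs are assumptions for illustration and do not reproduce any specific GriPhyN/iVDGL/PPDG tool.

```python
# Hypothetical replica catalog: a logical file name (LFN) maps to the
# physical file names (PFNs) of its copies at different grid sites.
from collections import defaultdict

class ReplicaCatalog:
    def __init__(self):
        self._replicas = defaultdict(set)  # LFN -> set of PFNs

    def add_replica(self, lfn, pfn):
        """Register a physical copy of a logical file."""
        self._replicas[lfn].add(pfn)

    def lookup(self, lfn):
        """Return all known physical copies of a logical file."""
        return sorted(self._replicas.get(lfn, ()))

    def best_replica(self, lfn, preferred_site):
        """Naive site affinity: prefer a copy whose PFN mentions the site."""
        copies = self.lookup(lfn)
        for pfn in copies:
            if preferred_site in pfn:
                return pfn
        return copies[0] if copies else None

if __name__ == "__main__":
    rc = ReplicaCatalog()
    rc.add_replica("lfn:dc1.0001.root", "gsiftp://tier1.bnl.gov/data/dc1.0001.root")
    rc.add_replica("lfn:dc1.0001.root", "gsiftp://tier2.iu.edu/data/dc1.0001.root")
    print(rc.best_replica("lfn:dc1.0001.root", "bnl.gov"))
```

A production catalog would add a metadata layer (dataset attributes queried to select LFNs) on top of this replica mapping, which is the pairing the slide refers to.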

Slide 12: U.S. ATLAS Persistent Grid Testbed (site map). Sites: UC Berkeley, LBNL-NERSC, Brookhaven National Laboratory, Argonne National Laboratory, Boston University, Indiana University, U Michigan, Oklahoma University, and University of Texas at Arlington; prototype Tier 2 and HPSS sites are indicated. Network connectivity via CalREN, ESnet, Abilene, NTON, MREN, and NPACI.

Slide 13: Funding Issues
- Changing budget numbers; the uncertainties are very difficult to live with
- For FY02, living on goodwill and counting on $520k from NSF to prevent further reductions in force
- The ANL, BNL, and LBNL base programs are severely squeezed, placing an additional burden on project costs
- Attempting to put together a large ITR proposal with U.S. ATLAS universities
  - Monitoring, quality of service, networking

Slide 14: Positive News on Funding
- UT Arlington: MRI for D0 (ATLAS), K. De
  - High Energy Physics Analysis Center
- J. Huth / J. Schopf (ANL/Northwestern/Harvard) ITR grant
  - Resource estimation for high energy physics
- Large ITR proposal
  - Resource estimation (continuation of the above)
  - Networking
  - Quality of service
  - Grid monitoring
  - Security
  - Integration of all of the above into the ATLAS fabric

Slide 15: Budgeting
- Attempt to meet obligations for software and facilities
- Make "bare bones" assumptions: a stretch-out of the LHC schedule
- A big unknown is the outcome of the split between M+O and computing funds
- Have to guess at funding profiles for the time being

Slide 16: Assumed Budget Profile (AY$) (chart)

Slide 17: Bare Bones Budget (chart)

Slide 18: Responses to Recommendations from November
- The recommendations were received just last week
- The chief architect should have resources/authority
  - Infrastructure remains an issue; Torsten Akesson is trying to solve this via the NCB, and it has been recognized as the major issue in international ATLAS. A software infrastructure team is being identified.
- Software decisions
  - Resolution of the second framework; some improvements overall, but far from perfect
- Be less willing to take on additional workloads
  - Trying, but infrastructure support is a major issue, now in the process of resolution
- Staffing for DC1: progress made, but funding issues remain

Slide 19: Summary
- We are making progress on many fronts despite funding uncertainties
  - U.S. ATLAS testbed, control/framework, data management, incorporation of grid tools
- Tier 1 is the most severely impacted by funding uncertainties
- The international ATLAS situation is improving, but still has some distance to go
- We are barely rolling on the present budget
- Severe impact of base-program cuts on U.S. ATLAS institutions