Introduction S. Rajagopalan August 28, 2003 US ATLAS Computing Meeting.

Slide 2: Outline

- Organizational issues
  - ATLAS & U.S. ATLAS software
- Main areas of U.S. software participation
  - Current priorities
- Analysis & support
- Conclusion
  - U.S. participation

Slide 3: New Computing Organization

(Organization chart; details not recoverable from the transcript.)

Slide 4: Computing Management Board

- Coordinates and manages computing activities; sets priorities and takes executive decisions
- Membership:
  - Computing Coordinator (chair)
  - Software Project Leader (D. Quarrie, LBNL)
  - TDAQ Liaison
  - Physics Coordinator
  - International Computing Board Chair
  - Grid, Data Challenge and Operations Coordinator
  - Planning & Resources Coordinator (T. LeCompte, ANL)
  - Data Management Coordinator (D. Malon, ANL)
- Meets bi-weekly

Slide 5: Software Project Management Board

- Coordinates the coherent development of software (core, applications, and software support)
- Membership:
  - Software Project Leader (chair): D. Quarrie
  - Simulation Coordinator
  - Event Selection, Reconstruction & Analysis Tools Coordinator
  - Core Services Coordinator (D. Quarrie)
  - Software Infrastructure Team Coordinator
  - LCG Applications Liaison (T. Wenaus, BNL)
  - Physics Liaison
  - TDAQ Liaison
  - Sub-system coordinators: Inner Detector, Liquid Argon, Tile, Muon
    - Liquid Argon: S. Rajagopalan (BNL); Muon: S. Goldfarb (U. Michigan)
- Meets bi-weekly

Slide 6: US ATLAS Software Organization

- Software Project (WBS 2.2): S. Rajagopalan
  - Core Services (WBS 2.2.2): D. Quarrie
  - Data Management (WBS 2.2.3): D. Malon
  - Application Software (WBS 2.2.4): F. Luehring
  - Software Support (WBS 2.2.5): A. Undrus
- U.S. ATLAS software WBS scrubbed, consistent with ATLAS
- Resource loading and reporting established at Level 4
- Major change compared to the previous WBS: Production and Grid Tools & Services moved under Facilities Coordination (WBS 2.2.1)

Slide 7: Framework

- People involved:
  - P. Calafiura, W. Lavrijsen, C. Leggett, M. Marino, D. Quarrie, C. Tull
  - S. Rajagopalan, H. Ma, J. Boudreau
- Scope:
  - Framework support; event merging (pile-up)
  - SEAL plug-in and component support
  - Event Data Model infrastructure
  - User interfaces: Python scripting, binding to the dictionary, integration with GANGA
  - Detector Description & Conditions infrastructure
  - Athena Grid integration

Slide 8: Database

- People involved:
  - D. Malon, K. Karr, S. Vanyachine
  - D. Adams, W. Deng, V. Fine, V. Perevoztchikov
- ATLAS-specific contributions:
  - Athena/POOL integration to support data persistency
  - Support for the NOVA database (primary source of detector description for simulation)
- LCG contributions:
  - POOL collections/metadata work package interface
  - Support for foreign object persistence
  - Detailed review of the LCG POOL design in the context of ATLAS requirements

Slide 9: Application Software

- Geant3 simulation support: BNL
- Calorimeter (LAr & Tile) software, incl. calibration: ANL, BNL, Nevis Labs, U. Arizona, U. Chicago, U. Pittsburgh, SMU
- Pixel, TRT detector simulation & digitization: Indiana U., LBNL
- Muon reconstruction and database: BNL, Boston U., LBNL, U. Michigan
- Hadronic calibration, tau and jet reconstruction: U. Arizona, U. Chicago, ANL, BNL, LBNL
- Electron-gamma reconstruction: BNL, Nevis Labs, SMU
- High Level Trigger software: U. Wisconsin
- Physics analysis with new software: U.S. ATLAS

Slide 10: Software Support

- Full software support and maintenance at BNL (A. Undrus)
- Release and maintenance of ATLAS and all associated external software (including LCG software and LHCb Gaudi builds) at the Tier 1 facility
- Typical ATLAS releases once every 3 weeks
- Nightly build system deployed at BNL and CERN, and now used by LCG as well
- ATLAS Software Infrastructure Team: forum for discussion of issues related to support of ATLAS software and associated tools; A. Undrus is a member of this body

Slide 11: LCG Application Component

- T. Wenaus (BNL) serves as the LCG Applications Area coordinator
- U.S. effort in SEAL: 1.0 FTE (FY03)
  - Plug-in manager (M. Marino, 0.75 FTE, LBNL): in internal use by POOL now; full integration into Athena Q
  - Scripting services (W. Lavrijsen, 0.25 FTE, LBNL): Python support and integration
- U.S. effort in POOL: 1.2 FTE (FY03)
  - Principal responsibility in the POOL collections and metadata WP: D. Malon, K. Karr, S. Vanyachine (0.5 FTE, ANL); D. Adams (0.2 FTE, BNL)
  - POOL datasets (D. Adams, 0.2 FTE, BNL)
  - Common data management software:
    - ROOT I/O foreign object persistence (V. Perevoztchikov, 0.3 FTE, BNL)
    - POOL MySQL package and server configurations (ANL, 0.2 FTE)
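The point of a plug-in manager like the one above is to let a framework create components by name, without compile-time dependencies on their implementations. A minimal sketch of that idea follows; the class and component names are hypothetical illustrations, not the actual SEAL API.

```python
# Illustrative sketch only: a minimal registry in the spirit of a plug-in
# manager. All names here are hypothetical, not the SEAL interfaces.

class PluginRegistry:
    """Maps component names to factory callables, letting a framework
    instantiate components without compile-time knowledge of their types."""

    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        # A plug-in library would call this when it is loaded.
        self._factories[name] = factory

    def create(self, name, *args, **kwargs):
        # Clients request components by name only.
        try:
            factory = self._factories[name]
        except KeyError:
            raise LookupError(f"no plug-in registered under {name!r}")
        return factory(*args, **kwargs)


registry = PluginRegistry()
registry.register("EventCounter", lambda: {"events": 0})
counter = registry.create("EventCounter")
```

The design choice being sketched is indirection through a name-to-factory table: clients and plug-ins only share the registry, so either side can be replaced independently.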

Slide 12: US ATLAS Contribution in LCG

- Contribution is to the Applications Area only
- Limited contributions from ATLAS overall
- Snapshot of contributions as of June 2003

Slide 13: Major Near-Term Milestones

- July to Dec 2003: SEAL/POOL/PI deployment by LCG
- Sept 2003: Geant4-based simulation release
- Dec 2003: Validate the Geant4 release for DC2 and test-beam
- Dec 2003: First release of the full ATLAS software chain using LCG components and Geant4, for use in DC2 and the combined test-beam
- Spring 2004: Combined test-beam runs
- Spring 2004: Data Challenge 2, the principal means by which ATLAS will test and validate its proposed Computing Model
- Dec 2004: Computing Model document released

Slide 14: Detector Description

- ATLAS lacked a detector description model: numbers were hardwired in reconstruction, with no commonality with simulation
- J. Boudreau (U. Pittsburgh), drawing on CDF experience, successfully designed, developed and deployed a prototype model for both material and readout geometry; we encouraged this
- The model automatically handles alignments, is optimized for memory (5 MB to describe the ATLAS geometry), and is not coupled to visualization software
- Critical remaining items: material integration service, configuration utility, identifiers, and the transient model for readout geometry
- Recognizing the importance of supporting such university-based contributions and the critical needs of detector description, we have decided to allocate 1 FTE to Pittsburgh in 2004
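A memory figure as small as the one quoted above is typically achieved by sharing: one description of a repeated detector element serves many placements, each of which stores only its own transform. The sketch below illustrates that idea under stated assumptions; the class names are illustrative, not the actual ATLAS detector-description API.

```python
# Illustrative sketch (not the real detector-description classes): many
# placements share one logical-volume description, so the memory cost of a
# repeated element is paid once rather than per placement.

class LogicalVolume:
    """Shape and material of an element, described once."""
    def __init__(self, name, material):
        self.name = name
        self.material = material


class PhysicalVolume:
    """A placement: a shared logical volume plus its own transform."""
    def __init__(self, logvol, transform):
        self.logvol = logvol          # shared reference, not a copy
        self.transform = transform    # per-placement data only


cell = LogicalVolume("LArCell", "liquid argon")
placements = [PhysicalVolume(cell, (0.0, 0.0, float(i))) for i in range(1000)]
# Every placement references the same description object.
shared = all(p.logvol is cell for p in placements)
```

The same sharing pattern is why alignment can be handled centrally: updating the shared pieces, or the per-placement transforms, does not require rebuilding each copy of the geometry.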

Slide 15: Analysis and Support

- Updated U.S. ATLAS web site
- Documentation for use of the Tier 1 facility, software and physics has been updated
- In particular, we have established a guide to physics analysis in U.S. ATLAS:
  - User guides to generators, simulation, trigger, reconstruction and the use of analysis tools
  - Concrete, realistic examples for each component
  - Hands-on tutorials

Slide 16: Conclusion

- 34 institutes in U.S. ATLAS; 20 are participating in some aspect of software
  - Core software, reconstruction & simulation, trigger, analysis, production, Grid
- We have not heard from the other 14 institutes:
  - SUNY Albany, Brandeis, Duke, Hampton, Iowa State, UC Irvine, MIT, Ohio State, U. Pennsylvania, U. Rochester, UC Santa Cruz, Tufts, U. Illinois, U. Washington

Slide 17: Conclusion (2)

- Now is the best time to get involved in software development and physics analysis
  - The full chain of simulation & reconstruction exists, but much work remains to perfect it
  - Subsequent talks try to highlight the areas where the U.S. is a primary participant and where you could contribute
- These U.S. ATLAS workshops are primarily meant to encourage your participation
- What can we do to help you get involved?
  - Documentation & tutorials by themselves are not sufficient; one-on-one help would work better
  - But we need you to take the first step