The Athena Control Framework in Production, New Developments and Lessons Learned
C. Leggett, P. Calafiura, W. Lavrijsen, M. Marino, D. Quarrie
National Energy Research Scientific Computing Center
September 2004

Slide 2: Athena and Gaudi
- The Gaudi framework is shared by LHCb, ATLAS, GLAST, HARP, and OPERA.
- It is based on a modular component architecture, structured around dynamically loadable libraries.
- It maintains a separation between transient and persistent layers, allowing new technologies to replace aging ones with minimal impact on the end user.
- Athena comprises the ATLAS-specific extensions to Gaudi, most notably:
  - StoreGate: the event data store (see the sketch below)
  - Interval of Validity Service (IOVSvc): manages time-dependent data
  - Pileup: combines multiple events
  - HistorySvc: maintains a multi-level provenance record
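To make the transient-store idea concrete, here is a minimal Python sketch of the record/retrieve pattern StoreGate provides. All names here are hypothetical stand-ins; the real StoreGate is a C++ service that keys objects by type and string key.

```python
# Toy model of a StoreGate-style transient store (hypothetical names).
class TransientStore:
    def __init__(self):
        self._objects = {}

    def record(self, obj, key):
        # One object per (type, key) pair; producer algorithms publish here.
        self._objects[(type(obj).__name__, key)] = obj

    def retrieve(self, type_name, key):
        # Consumers look data up by type and key, never caring whether it
        # came from a file (via a converter) or from an upstream algorithm.
        return self._objects[(type_name, key)]

    def clear(self):
        # Wiped by the event loop manager at the end of each event.
        self._objects.clear()

store = TransientStore()
store.record([12.5, 3.1, 44.0], "CaloCellEnergies")
print(store.retrieve("list", "CaloCellEnergies"))
```

Because algorithms see only the transient interface, the persistent technology behind it can be replaced without touching user code.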

Slide 3: Athena and Gaudi (component diagram)
[Diagram: the Application Manager coordinates the Event Loop Manager, Sequencer, Auditors, and Algorithms. Algorithms exchange DataObjects through StoreGateSvc (event store) and the Detector Store, which use Converters and Persistency Services to read and write Data Files. Supporting components include the Message Service, JobOptions Service, Particle Properties Service, Histogram Service, Scripting Service, and other services.]

Slide 4: Athena in Production
- Data Challenge II:
  - Phase 1:
    - ~30 physics channels, tens of millions of events
    - several million calibration events
    - currently producing raw data in a distributed worldwide environment using the Grid
  - Phase 2: reconstruction and real-time distribution of data to Tier-1 institutes
  - Phase 3: worldwide distributed analysis on the Grid
- Combined Test Beam:
  - taking data since July in various configurations
  - ~5000 runs, > 1 TB of data written
  - G4 simulation and reconstruction of the CTB setup running in parallel
  - conditions databases in production, both read and write
  - preparing for phase II: massive reconstruction of all real data and production of MC data

Slide 5: Combined Test Beam Setup

Slide 6: Interval of Validity Service
- Makes associations between user data and time-dependent data that reside in specialized conditions databases.
- Transparent to the user.
- Data are read from the persistent layer only when they are used; validity-interval information is kept separate from the data itself.
- Hierarchical callback functions can be associated with time-dependent data so that they are triggered when the data enters a new interval of validity (see the sketch below).
- Validity-interval information and time-dependent data can be preloaded on a job or run basis for trigger or testbeam situations where database access is unwanted.
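A schematic Python model of the mechanism (hypothetical names; the real IOVSvc is a C++ service whose callbacks are registered member functions): data is fetched lazily, cached together with its validity interval, and registered callbacks fire only when an event timestamp crosses into a new interval.

```python
# Toy model of interval-of-validity caching (hypothetical names).
# An interval is (start, stop); data is fetched lazily from the
# "database" and callbacks fire only when a new interval is entered.
class IOVCache:
    def __init__(self, fetch):
        self._fetch = fetch          # fetch(t) -> (data, (start, stop))
        self._data, self._iov = None, None
        self._callbacks = []

    def reg_fcn(self, fcn):
        # Register a callback to run whenever a new interval is entered.
        self._callbacks.append(fcn)

    def get(self, t):
        if self._iov is None or not (self._iov[0] <= t < self._iov[1]):
            self._data, self._iov = self._fetch(t)   # lazy database read
            for fcn in self._callbacks:
                fcn(self._data)       # e.g. recompute derived constants
        return self._data

# Usage: calibrations valid in 100-unit blocks; the callback fires twice,
# at t=5 and t=150, but not at t=50 (same interval, no database access).
cache = IOVCache(lambda t: ({"pedestal": t // 100},
                            (t // 100 * 100, (t // 100 + 1) * 100)))
cache.reg_fcn(lambda d: print("new IOV, pedestal =", d["pedestal"]))
for t in (5, 50, 150):
    cache.get(t)
```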

Slide 7: Access to Time Varying Data
- Maintains the separation between transient and persistent layers.
- The testbeam environment is making good use of the IOVSvc for:
  - slow control
  - calibration

Slide 8: Detector Pileup in DC2
- Overlay ~1000 minimum-bias events onto the original physics stream.
  - Requirement: digitization algorithms should run unchanged.
- Tuple event iterator: manages multiple input streams (see the sketch below).
  - Selects random permutations from a circular buffer of minimum-bias events.
  - Memory optimization: requirement of total job size < 1 GB, met with two-dimensional (detector- and time-dependent) event caching.
- A stress test of the architecture's flexibility.
- An excellent tool for exposing memory leaks (they become ~1000 times bigger).
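A toy sketch of the circular-buffer idea in plain Python (hypothetical names, not the DC2 code): minimum-bias events are held in a fixed-size ring, and each physics event is overlaid with a fresh random selection drawn from it, so memory stays bounded no matter how many events are overlaid.

```python
import random

# Toy sketch of pileup overlay from a circular buffer (hypothetical names).
class MinBiasRing:
    def __init__(self, source, size=1000):
        self._source = source                       # iterator over min-bias events
        self._ring = [next(source) for _ in range(size)]
        self._pos = 0

    def sample(self, n):
        # Refresh one slot per call so the ring slowly cycles through
        # the input stream, then draw a random subset to overlay.
        self._ring[self._pos] = next(self._source)
        self._pos = (self._pos + 1) % len(self._ring)
        return random.sample(self._ring, n)

def overlay(physics_event, minbias_events):
    # Digitization sees one merged event, so downstream code runs unchanged.
    merged = physics_event["hits"] + [h for e in minbias_events for h in e["hits"]]
    return {"hits": merged}

minbias = iter({"hits": [i]} for i in range(10**6))
ring = MinBiasRing(minbias, size=1000)
merged = overlay({"hits": [-1]}, ring.sample(5))
print(len(merged["hits"]))  # 6 hits: 1 physics + 5 overlaid min-bias
```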

Slide 9: History
- The provenance of data must be assured.
- Users can select data based on its history:
  - the full history of generation and processing is recorded and associated with all data.
- In analysis it is important to know the complete source of the data and all cuts applied.
- The History Service keeps track of:
  - environment
  - job configuration
  - services
  - algorithms, AlgTools, sub-algorithms
  - DataObjects
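As a rough illustration (a hypothetical structure, not the actual HistorySvc schema), this is the kind of record such a service might attach to each data object:

```python
import os, sys

# Rough illustration of a provenance record (hypothetical structure):
# everything needed to reproduce a data object is captured alongside it.
def make_history(job_options, algorithms, parent_keys):
    return {
        "environment": {"python": sys.version.split()[0],
                        "cwd": os.getcwd()},
        "job_configuration": job_options,   # e.g. the parsed jobOptions
        "algorithms": algorithms,           # name -> property settings
        "inputs": parent_keys,              # upstream DataObject keys
    }

record = make_history(
    job_options={"EvtMax": 100},
    algorithms={"CaloCellMaker": {"ThresholdMeV": 50.0}},
    parent_keys=["RawChannels"],
)
print(record["algorithms"]["CaloCellMaker"])
```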

Slide 10: Python-Based Scripting Interface
- Python is woven into the framework, replacing flat text configuration files (a schematic example follows below).
- Dynamic job configuration:
  - conditional branching
  - DetFlags
- Interactive analysis.
- Data object access and manipulation.
- Connection to ROOT histogramming facilities.
- Object type information for dictionaries and persistency.
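A schematic jobOptions fragment in the style of contemporary Athena tutorials; names like theApp and Algorithm are assumptions (a real fragment runs inside the athena bootstrap, which provides them), so minimal stubs are included here to keep the sketch self-contained.

```python
import os

# Minimal stand-ins so this sketch runs standalone; in a real job the
# athena bootstrap supplies theApp and Algorithm (names assumed here).
class _Stub:
    def __init__(self):
        self.__dict__["props"] = {}
    def __setattr__(self, key, value):
        self.props[key] = value

theApp = _Stub()
Algorithm = lambda name: _Stub()

# --- the configuration itself: ordinary Python, not a flat text file ---
theApp.EvtMax = 100
if os.environ.get("TESTBEAM"):      # conditional branching at configure time
    theApp.TopAlg = ["TBCaloRec"]
else:
    theApp.TopAlg = ["CaloCellMaker", "CaloClusterMaker"]

CaloCellMaker = Algorithm("CaloCellMaker")
CaloCellMaker.ThresholdMeV = 50.0   # per-algorithm properties set from Python
```

Because the configuration is a live script, loops and conditionals replace the duplicated flat-text option files they superseded.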

Slide 11: Detector Configuration Matrix
Example DetFlags.Print() output; columns are subdetectors, rows are processing tasks, each flagged ON or off (--) per detector:

                 pixel SCT TRT em HEC FCal Tile MDT CSC TGC RPC Truth LVL1
  detdescr     : ON ON ON ON ON --
  digitize     : ON ON ON ON ON --
  geometry     : ON ON ON ON ON --
  haveRIO      :
  makeRIO      :
  pileup       : ON ON ON ON --
  readRDOBS    :
  readRDOPool  : ON ON ON ON ON --
  readRIOBS    :
  readRIOPool  :
  simulate     :
  writeBS      :
  writeRDOPool : ON ON ON ON --
  writeRIOPool :
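A matrix like the one above can be generated by a few lines of Python; the following is a toy re-implementation of the idea (hypothetical code, not the ATLAS DetFlags module) showing how a task-vs-detector boolean matrix can drive configuration.

```python
# Toy re-implementation of the DetFlags idea (hypothetical; only
# DetFlags.Print() appears on the slide): a task-vs-detector matrix
# of booleans that the job configuration consults.
DETECTORS = ["pixel", "SCT", "TRT", "em", "HEC", "FCal", "Tile",
             "MDT", "CSC", "TGC", "RPC", "Truth", "LVL1"]
TASKS = ["detdescr", "digitize", "geometry", "pileup",
         "readRDOPool", "simulate"]

class DetFlags:
    def __init__(self):
        self.flags = {t: {d: False for d in DETECTORS} for t in TASKS}

    def set_on(self, task, *dets):
        for d in dets:
            self.flags[task][d] = True

    def print_matrix(self):
        print(" " * 15 + " ".join(DETECTORS))
        for t in TASKS:
            row = ["ON" if self.flags[t][d] else "--" for d in DETECTORS]
            print(f"  {t:<12} : " + " ".join(row))

flags = DetFlags()
flags.set_on("detdescr", "pixel", "SCT", "TRT", "em", "Tile")
flags.print_matrix()
```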

Slide 12: Lessons Learned
- Design for performance:
  - pileup is an excellent testbed
  - database access can be problematic
- Design for persistency:
  - various container classes were rewritten with persistency in mind
  - a ClassID Service globally monitors objects
- Better support for real-time monitoring is needed:
  - essential for proper testbeam studies