Trigger ESD/AOD
Simon George (RHUL), Ricardo Goncalo (RHUL), Monika Wielers (RAL)
Contributions from many people.

ATLAS Software Week, 26-30 Sep 05 - S.George, R.Goncalo, M.Wielers

Contents
- General interest:
  - Event data from the trigger
  - TDAQ data flow
  - Online constraints
  - Use cases
  - Inventory of ESD & AOD classes
    - LVL1, LVL2 & EF
    - Current and planned
  - How to access trigger information in Rome data
- For PAT:
  - Size estimates
  - Persistency technologies
  - Generic serializer
  - Trigger menu configuration as conditions data

Sources of event data from the trigger / TDAQ data flow

[Diagram of the TDAQ data flow: detector data arrives as raw event fragments; the successive trigger levels are labelled with latencies of <2.5 ms, ~10 ms and ~1 s, and output rates of ~2 kHz (LVL2) and ~200 Hz (EF). The event passed to the EF uses LVL2 data to seed; Level 2 and EF data are appended to the raw event.]

HLT chain: data sources in detail - e/γ example

[Diagram: algorithm sequence for the e25i chain. The EMTauRoI seeds the L2 sequence (T2Calo, L2Tracking), producing TrigCluster, TrigInDetTrack and TrigElectron; the L2 e25i hypothesis feeds the L2 decision, recorded by the steering in the L2Result. The EF sequence (TrigCaloRec, EFTracking) produces CaloCluster, TrackParticle and Electron; the EF e25i hypothesis feeds the EF decision, recorded in the EFResult.]

HLT persistency: use cases

- Online
  - L2Result appended to raw event for the purpose of seeding the EF
  - Extended L2Result appended to raw event for monitoring or debugging
  - Extended EFResult with output for calibration appended to raw event
  - L2 & EF Results appended to raw event for use offline
- Offline
  - Store L2 & EF Results produced by HLT simulation in ESD/AOD
  - Physics analysis: get the trigger decision for an event from the AOD
  - Trigger performance tuning or more detailed physics analysis: re-run HLT hypotheses & decision on AOD
  - Detailed trigger performance studies: re-run algorithms on AOD
- The same code is run online & offline, so we would like to write the same objects in both cases, but with different "persistency technology": byte stream and POOL respectively.

Online constraints

- Size
  - Average L2Result size within 2 kB/event
  - Maximum L2Result size 64 kB/event
  - Classes must be designed to convey the minimum data necessary for the use case
    - E.g. use float rather than double if sufficient; bit-pack bools
- Format
  - Objects are written out in raw event format (byte stream), not ROOT
  - LVL2 and EF may append a small amount of data to the raw event
    - This is used to pass RoI seeding information from LVL2 to EF
    - It could also be used (within constraints) to extract information for debugging, monitoring and calibration
  - A simple, generic serializer turns objects into a vector of integers
- Class design
  - The serializer limits class design
    - E.g. only support float, double, int, pointer
    - Do not intend to support the full offline EDM, e.g. ElementLinks
    - Complex inheritance structures would cause problems
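The space savings behind these constraints can be illustrated with a small sketch (the `pack_flags` helper is hypothetical, not part of the trigger EDM):

```python
import struct

def pack_flags(flags):
    """Pack a list of booleans into one 32-bit word (bit i = i-th flag)."""
    word = 0
    for i, flag in enumerate(flags[:32]):
        if flag:
            word |= 1 << i
    return word

# Using float instead of double halves the storage for each quantity:
assert len(struct.pack("<d", 2.47)) == 8   # double: 8 bytes
assert len(struct.pack("<f", 2.47)) == 4   # float: 4 bytes

# 32 bools bit-packed fit in the space of a single int:
assert pack_flags([True, False, True, True]) == 0b1101
```

With a 2 kB average budget per L2Result, such per-member choices are what keep the payload within the limit.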

LVL1 classes in AOD/ESD (already in Rome)

- RoI classes (in ESD and AOD)
  - EMTau_ROI, JetET_ROI, EnergySum_ROI, MUON_ROI
  - 'Hardware like'
  - Contain a bit pattern of which thresholds passed, eta/phi
  - Small objects: a few uint32, double
- Reconstruction objects (only in ESD; also in AOD from rel. )
  - L1EMTauObject, L1EMJetObject, L1ETmissObject
  - 'Software like'
  - Contain energies, isolation, eta/phi quantities which are used for optimisations
  - Small objects: 4-8 doubles per class
- Future plans
  - MUON_ROI fine, no further plans
  - Need a single consolidated Calo class, containing cluster/isolation sums + thresholds passed
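The 'hardware like' threshold bit pattern can be decoded as in this sketch (the helper name and bit layout are illustrative, not the actual RoI class API):

```python
def thresholds_passed(roi_word, n_thresholds=16):
    """Return the indices of the thresholds whose bit is set in an RoI word
    (hypothetical layout: threshold i <-> bit i)."""
    return [i for i in range(n_thresholds) if roi_word & (1 << i)]

# An RoI word with bits 1, 2 and 4 set means thresholds 1, 2 and 4 were passed:
assert thresholds_passed(0b10110) == [1, 2, 4]
```

Keeping only the bit pattern plus eta/phi is what makes these objects a few words each.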

LVL1 classes only in AOD/ESD

- CTPDecision (already in Rome ESD & AOD)
  - 'Hardware like'
  - Contains a word with the trigger decisions
  - Contains just a few words
  - Final hardware not yet decided upon (should be very soon), thus might need revision

LVL1 classes only in ESD (New from rel. )

- TriggerTower (available in release 11)
  - 'Hardware like'
  - Contains, for towers above threshold (~7200 towers in total, typically only a subset after zero-suppression):
    - EM and hadronic energies (final calibrated 8-bit ET values) per tower
    - Raw energy, digits, filter output per tower
    - eta/phi
  - Currently contains more data than is actually read out
    - Hardware will only give digits and final energies + eta/phi
  - Future plans: further re-writing/re-design
    - Separate raw tower for internal use + stripped-down calibrated tower for persistency
- JetElement (same as TriggerTower - rel. )
  - 'Hardware like'
  - Similar to TriggerTower but with coarser granularity for the jet trigger (~1k towers in total, again zero-suppressed)
  - Contains EM/hadronic energy, eta/phi
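The zero-suppression referred to above amounts to dropping empty towers before persistency, as in this schematic sketch (not the L1Calo code; names are illustrative):

```python
def zero_suppress(tower_et, threshold=0):
    """Keep only the towers whose ET is above threshold."""
    return {pos: et for pos, et in tower_et.items() if et > threshold}

# (eta, phi) -> ET; in a typical event most of the ~7200 towers carry no energy
towers = {(0.1, 0.2): 0, (0.3, 1.5): 12, (1.2, 2.8): 3, (2.0, 0.7): 0}
assert zero_suppress(towers) == {(0.3, 1.5): 12, (1.2, 2.8): 3}
```

This is why the persistent size scales with occupancy rather than with the full tower count.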

What was available for Rome from LVL2/EF

- EMShowerMinimal (ESD only)
  - Output from T2Calo
  - Contains relevant shower shapes and a pointer to the CaloCluster (also stored in ESD)
- TrackParticle (ESD and AOD… by accident)
  - Output from several LVL2 tracking algorithms
  - Obtained by conversion from TrigInDetTracks
  - Contains a few doubles, an ElementLink to Trk::Track, and pointers to RecVertex, MeasuredPerigee, TrackSummary, FitQuality…
- No L2Result!
- CaloCluster (only in ESD)
  - Produced by TrigCaloRec
- No other Event Filter reconstruction, so objects from offline reconstruction were used instead (in ESD, AOD)
- No EFResult!

Planned LVL2 classes in AOD/ESD - I

- TrigInDetTrack
  - Inner detector track quantities
  - 21 doubles and 5 ints per track
  - Plan to optionally include space points for special trigger studies
- MuonFeature
  - Muon track quantities
  - 1 int, 7 floats
- Calorimeter classes
  - Discussed last LArg week
  - Classes to be implemented very soon after rel. 11; change algorithms to use the new classes in a 2nd step

Planned LVL2 classes in AOD/ESD - II

- TrigParticle
  - TrigElectron, TrigMuon, TrigTau, TrigJet…
  - Summary data to use for seeding and analysis
  - Example: TrigElectron data members:
    - Roi_Id
    - eta, phi
    - z vertex
    - pT, ET
    - pointer to track
    - pointer to cluster
  - Variables filled by the hypothesis algorithm with "best values"
  - Pointers can be 0 when track & cluster are not needed
  - Small object
  - Aim to have a prototype in release 11
- Other classes will be added as required
  - Reflect hypothesis algorithms in the trigger
  - E.g. J/Psi, Z, di-muon
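The listed TrigElectron members can be mocked up as follows (a schematic Python sketch; the real class is C++ and the member names here are illustrative):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrigElectronSketch:
    """Mock-up of the TrigElectron summary data listed above."""
    roi_id: int
    eta: float
    phi: float
    z_vertex: float
    pt: float
    et: float
    track: Optional[object] = None    # null when no track is needed
    cluster: Optional[object] = None  # null when no cluster is needed

e = TrigElectronSketch(roi_id=3, eta=1.2, phi=0.5, z_vertex=0.1, pt=26.0, et=27.5)
assert e.track is None and e.cluster is None  # "pointers can be 0"
```

A handful of scalars plus two optional pointers keeps the object small enough for the online size budget.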

Planned EF classes

- EF reconstruction is independent from offline reconstruction
  - Same output classes, but produced under different conditions
  - Algorithms run from seeds, and typically in a 'simpler' mode
  - Therefore need to save both EF and offline reconstruction objects
- Store information per RoI → less space than full offline reconstruction
- Space requirements have to be investigated

Trigger Decision

- Planned for inclusion in ESD/AOD
  - LVL2 and EF Result
    - Decision: one bit per signature
    - Navigation & steering 'state', e.g. needed to re-run a hypothesis
  - Bit pattern interpreted through the MenuTable (conditions data)
- Plan to provide a Tool to give L1, L2, EF results and signature results by interpreting the bit patterns in the AOD
  - See PAT discussion on accessing the MenuTable
- Short-term implementation, while there are only one or two signatures:
  - First tool implementation to use this instead of the MenuTable and bit pattern
  - Store a map (signature label → decision) in the AOD
  - Derive trigger decisions from ESD
  - Only a few signatures, so the AOD space wasted by repeating labels each event is negligible
  - Aim to have a prototype in release 11
- To use the tool, one could add the map to personal AODs made from Rome data
- Map and bit patterns need to be included automatically in the AOD in future productions
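Interpreting the decision bit pattern through a menu table can be sketched like this (the menu content and function name are hypothetical):

```python
def passed_signatures(decision_word, menu_table):
    """Translate a decision bit pattern into signature names, using a menu
    table that maps bit position -> signature label."""
    return [name for bit, name in sorted(menu_table.items())
            if decision_word & (1 << bit)]

menu = {0: "e25i", 1: "mu20", 2: "2e15i"}  # hypothetical menu
# Bits 0 and 2 set: the event passed e25i and 2e15i but not mu20.
assert passed_signatures(0b101, menu) == ["e25i", "2e15i"]
```

The short-term map-in-AOD option stores the right-hand side (label → decision) directly, trading a little space for not needing the MenuTable at analysis time.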

How to use the trigger decision today on Rome data

- The software described on the previous slides is not yet available. What is possible today?
- We currently offer 3 options to access the trigger decision when analysing Rome data:
  - Analyse ESDs
  - Create customised AODs
  - Use CBNTs and AODs
- For more information on each of the methods, look at the wiki

Analyse physics data on ESDs

- Run the HLT steering
  - Feature extraction algorithms (those that reconstruct objects, like tracks or clusters) are substituted by other algorithms that just read those objects from the AOD
  - Run the real hypothesis algorithms, so you can change cuts
  - Since there is no actual reconstruction, running is very fast and the job options are rather simple
- Note: only the recolum01 Rome data contains the trigger information
- Instructions can be found in ChainOnESDs:
  - How to run the e/γ slice
  - How to derive trigger efficiency
  - Solutions to known problems
- The recipe will be updated once release 11 is out

Analyse data using AODs

- 2nd method: create customised AODs
  - Make your own AOD from ESD
  - Add trigger classes, in addition to the 'default' classes, into your AOD
  - Then one can run the trigger chain on the AODs in the same way just described for the ESDs
- 3rd method: bricolage
  - Not recommended - but we thought we should admit what we had to resort to, to get some of our results
  - Run the ROOT e/γ analysis program on CBNTs
  - Write out the event numbers that pass the trigger to a text file
  - Read back the text file during AOD analysis to get the trigger decision
  - See the wiki page for details and limitations

Current size of trigger objects in ESD/AOD

- Average size per event, using Z→ee files from Rome data at low luminosity (250 events)
- Of course this is physics/sample dependent

  L1 RoI + reco objects:                           ~65 bytes
  L1 TriggerTowers and JetElements:                ~5 kB
  EMShowerMinimal + CaloCluster:                   ~800 bytes
  TrigInDetTracks from all 4 L2 track algorithms:  ~2.5 kB

Persistency technologies 1/2

- Offline
  - POOL/ROOT is the obvious choice when running trigger software offline, so objects can be included in ESD and AOD
- Online
  - Only raw event format ("byte stream")
  - LVL2 and EF may append a small amount of data to the raw event
    - This is used to pass RoI seeding information from LVL2 to EF
    - It could also be used (within constraints) to extract information for debugging, monitoring and calibration
- Technically
  - The steering must pass all data to be written out to the DataFlow layer in a single vector; this is then packed into the raw event with the correct formatting
  - So any classes to be written out online must be "serialized" as a vector of integers

Persistency technologies 2/2

- Where does this leave EF output?
  - Simple data from the steering (decision) is the same as for LVL2 and therefore easily serialized
  - No byte stream converters or serializers exist for the offline reconstruction EDM, so those objects will not be written out
  - Simple objects for monitoring and calibration could be serialized if suitably designed
  - Note that the only use case for EF output is debugging
    - Can re-run the EF offline to emulate it (O(1 s)/event)
- Open question: wrap serialised event data in POOL?
  - This is the easiest solution for the steering state and navigation data (Trigger Elements, navigation)
  - Extending this approach to all the HLT data is the easiest way to get all we need to satisfy the use case "Trigger performance tuning or more detailed physics analysis: re-run HLT hypotheses & decision on AOD"
  - Storing reconstructed objects directly in POOL poses the challenge of how to restore pointers from the steering state to this data when reading back
    - E.g. Trigger Element to TrigInDetTrack, TrigElectron, etc.

Generic serializer

- In previous releases, any instances of classes inheriting from ISerializable were written out
  - Classes deriving from ISerializable must implement serialize & deserialize methods
  - Works (CTB'04), but concerned about having to write a lot of similar code and the practicality of requiring inheritance for all EDM
- In release 11 we are trying a new generic serializer
  - Uses SEAL Reflection
  - Same dictionaries as offline persistency
  - We are not reinventing POOL & ROOT I/O; keep it simple
  - Constraints on classes that can be serialized
    - E.g. only support float, double, int, pointers
    - Do not intend to support the full offline EDM, e.g. ElementLinks
    - Unlikely that complex inheritance will work
  - Select objects by following pointers
  - Select classes by class id - others will be ignored
- Short term
  - Timing study underway; initial indications are promising
  - Test with the new LVL2 EDM
  - Try to support all that is needed, e.g. STL containers
- Longer term
  - Reflex migration; schema evolution
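The core idea of serializing constrained classes into 32-bit words can be sketched as below. This is a much-simplified illustration: the real serializer is C++ and uses Reflection dictionaries, class ids and pointer-following, none of which is modelled here, and the mock class is hypothetical.

```python
import struct

class TrigTrackMock:
    """Stand-in object with int/float members only (the serializer constraint)."""
    def __init__(self, algo_id, pt, eta):
        self.algo_id = algo_id
        self.pt = pt
        self.eta = eta

def serialize(obj):
    """Flatten an object's int/float attributes into 32-bit words,
    visiting attributes in sorted-name order."""
    words = []
    for name in sorted(vars(obj)):
        value = vars(obj)[name]
        if isinstance(value, bool) or not isinstance(value, (int, float)):
            raise TypeError(f"unsupported member type: {name}")
        if isinstance(value, int):
            words.append(value & 0xFFFFFFFF)
        else:
            # Reinterpret the float's 4 bytes as an unsigned 32-bit word
            (word,) = struct.unpack("<I", struct.pack("<f", value))
            words.append(word)
    return words

track = TrigTrackMock(algo_id=2, pt=26.5, eta=1.25)
words = serialize(track)  # sorted order: algo_id, eta, pt
assert len(words) == 3 and words[0] == 2
assert struct.unpack("<f", struct.pack("<I", words[1]))[0] == 1.25  # eta round-trips
```

Restricting members to a few primitive types is what makes such a generic, dictionary-driven flattening feasible without reinventing POOL/ROOT I/O.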

HLT decision and associated configuration

- The decision is a bit pattern indicating which signatures an event fulfils
  - Needs configuration data to understand which signature each bit corresponds to
- The decision is created during simulation (LVL1) or reconstruction (LVL2, EF)
  - At this time, the configuration is read from XML files and turned into a few objects in the detector store
  - The data could potentially change from run to run, by editing the XML file
- Need a mechanism to access the exact same configuration when using the Decision from ESD/AOD
  - Configuration becomes conditions data

[Diagram: the TriggerConfig algorithm reads HLTsignatures.xml and HLTsequences.xml via XercesC and writes the configuration data (MenuTable, SequenceTable) to the detector store (TDS).]
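The XML-to-detector-store step can be sketched as parsing the file into an in-memory map. The XML snippet below is a hypothetical miniature, not the real HLTsignatures.xml schema, and ElementTree stands in for the XercesC parser used in the C++ code:

```python
import xml.etree.ElementTree as ET

SIGNATURES_XML = """<SIGNATURE_LIST>
  <SIGNATURE id="e25i" bit="0"/>
  <SIGNATURE id="mu20" bit="1"/>
</SIGNATURE_LIST>"""

def read_menu_table(xml_text):
    """Turn the XML configuration into a map: bit position -> signature id."""
    root = ET.fromstring(xml_text)
    return {int(sig.get("bit")): sig.get("id") for sig in root.findall("SIGNATURE")}

assert read_menu_table(SIGNATURES_XML) == {0: "e25i", 1: "mu20"}
```

Whatever holds this map (XML file, POOL file or conditions DB string) must be versioned alongside the data, since the bit-to-signature assignment can change from run to run.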

Trigger configuration persistency options

- Long term: the XML files will be replaced by the trigger configuration DB
  - Any used configuration automatically referenced in the conditions DB
  - Accessible through the IoV mechanism
  - Assume the trigger config database is replicated on the Grid
- Short term: an interim solution is needed. Two options have been discussed:
  - 1) Save the config data in a POOL file, then manage it by hand or with the IoV Svc
    - Either keep this POOL file with the corresponding AOD file by hand, and read it back in using CondProxyProviderSvc
      - Disadvantage: no IoV Svc; only works when running over AOD file(s) which use the same config data file
    - Or reference the POOL file for the corresponding run(s) in the conditions DB
    - Either way, there is the problem of replicating or accessing POOL files on the Grid
    - Advantage: the config data is pre-computed, so no time penalty when running over AOD
  - 2) Save the XML as a string in the conditions DB
    - Modify the code to take XML as a string instead of a file
    - Re-produce the config data in the TDS
    - Disadvantage: the time to reproduce the config data from XML at every run change could significantly slow down running on AOD
- Thanks to Richard H, RD, David M, and A-Team for input. I hope I summarised it correctly.