The ATLAS Trigger Configuration System: Design and Commissioning
A. dos Anjos, P. Bell, D. Berge, J. Haller, S. Head, T. Kohno, S. Li, T. McMahon, M. Nozicka, H. v. d. Schmitt, J. Stelzer, T. Wengler, W. Wiedenmann

Slide 2: Outline
- Trigger design
- Configurable components
- Configuration system requirements: data taking, MC production, offline analysis
- Design and implementation
- Commissioning status
- Conclusions

Slide 3: Trigger Design
- Fast, highly selective and efficient: the 40 MHz bunch-crossing rate is reduced in stages to the final storage rate
- Level 2 takes its decision in about 40 ms, reading roughly 3% of the detector data per event; the Event Filter takes about 4 s
- An introduction to the ATLAS trigger is given in the talk on ATLAS HLT steering by S. George (Mon 16:50, Online Computing)

Slide 4: Level 1 Trigger
Diagram: calorimeter trigger pre-processor, Cluster Processor, Jet/Energy Processor and Common Merger Modules; end-cap muon trigger (TGC) and barrel muon trigger (RPC) feeding the Muon-CTP-Interface (MuCTPI); the Central Trigger Processor (CTP) connected via LTP/Busy/TTC modules to the detector front-ends and read-out.
Trigger objects: muons, EM and hadronic clusters, jets, total and missing ET.
Configurable information on the CTP:
- Mapping of 480 hardware signals onto 160 CTP-internal signals; these signals encode the object multiplicities
- Thresholds of trigger objects; multiple thresholds for different object types
- Item definitions: logic, prescale and veto rates; a maximum of 256 trigger items
- Random trigger rates, triggering on bunches or bunch groups
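To make the trigger-item concept concrete, the following is a minimal illustrative sketch (not the real ATLAS Level 1 schema) of an item defined as a logical combination of threshold multiplicities with a prescale; the threshold names EM3 and MU6 and the helper classes are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class L1Threshold:
    name: str              # e.g. "EM3": an EM cluster above 3 GeV
    multiplicity: int = 1  # required number of objects above this threshold

@dataclass
class L1Item:
    name: str
    thresholds: list       # all conditions must be satisfied (logical AND in this sketch)
    prescale: int = 1      # accept roughly 1 out of N otherwise-passing events

    def passes(self, counts):
        """counts maps a threshold name to the multiplicity seen in this bunch crossing."""
        return all(counts.get(t.name, 0) >= t.multiplicity for t in self.thresholds)

# Hypothetical item: at least one EM cluster above 3 GeV AND at least one muon above 6 GeV,
# with a prescale factor of 10.
item = L1Item("L1_EM3_MU6", [L1Threshold("EM3"), L1Threshold("MU6")], prescale=10)
print(item.passes({"EM3": 2, "MU6": 1}))   # True
```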

Slide 5: High Level Trigger (HLT)
- Concept of trigger chains ("trigger lines"):
  - A chain is an ordered list of trigger conditions (multiplicities of HLT trigger elements) to be evaluated in sequence
  - A sequence describes how algorithms produce trigger elements (example: L1EM3 -> ClusterFinder & Hypo -> L2_e5cl)
  - A collection of chains (with prescale and forced-accept rates) forms an HLT menu (see Teresa's talk); steering, see Simon's talk
- HLT algorithms are configured through parameters, set via Python (used in ATLAS as the high-level scripting language); a configuration sketch follows below
- Diagram: chain EJ-L2, seeded by the Level 1 item EMJET, with sequences [EM -> "e-FEX, e-Hypo" -> e] and [JET -> "j-FEX, j-Hypo" -> j] feeding the signature (e j); chain EJ-EF, with input EJ-L2, feeding the signature (e' j'); the full HLT chain spans L2 and EF
- Configurable information:
  - Chain definitions: logic (trigger conditions, algorithms), prescale and forced-accept rates; a maximum of 8192 chains per menu
  - Algorithm parameters
  - Data streams, monitoring groups
- More details in the talks on ATLAS HLT steering by S. George (Mon 16:50, Online Computing) and on Trigger Reconstruction Algorithms by T. Fonseca Martin (Mon 17:55, Online Computing)
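Since the slide states that HLT algorithms are configured through Python, here is a hedged sketch of how the chain/sequence structure drawn above could be declared; the class names and attributes are invented for illustration and do not reproduce the actual ATLAS configuration API.

```python
# Hypothetical, simplified declarations mirroring the chain/sequence concept above.
class Sequence:
    def __init__(self, input_te, algorithms, output_te):
        self.input_te = input_te         # a Level 1 item or an earlier trigger element
        self.algorithms = algorithms     # feature-extraction (FEX) and hypothesis (Hypo) algorithms
        self.output_te = output_te       # trigger element produced on success

class Chain:
    def __init__(self, name, lower_chain, signatures, prescale=1, pass_through=0):
        self.name = name                 # e.g. "EJ-L2"
        self.lower_chain = lower_chain   # seeding item or chain, e.g. "EMJET" or "EJ-L2"
        self.signatures = signatures     # ordered steps, each a list of required trigger elements
        self.prescale = prescale
        self.pass_through = pass_through # forced-accept rate

sequences = [
    Sequence("EM",  ["e-FEX", "e-Hypo"], "e"),
    Sequence("JET", ["j-FEX", "j-Hypo"], "j"),
]
menu = [
    Chain("EJ-L2", lower_chain="EMJET", signatures=[["e", "j"]]),
    Chain("EJ-EF", lower_chain="EJ-L2", signatures=[["e'", "j'"]]),
]
```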

Slide 6: Design Requirements for the Configuration System
- Complete and consistent configuration of the ATLAS trigger:
  - Online software and hardware for data taking
  - Trigger simulation software in Monte Carlo production jobs
- Configuration information provided to the user to perform trigger-aware analyses and trigger studies
- Flexible and fast configuration changes during data taking, to react to different beam and detector conditions
- History of configurations, for the purpose of understanding and reproducing the trigger behaviour

Slide 7: Components of the Configuration System
- A relational database stores the trigger configuration (TriggerDB):
  - The trigger is configured via a single key
  - Offline reproducibility and trigger history
  - The schema reflects the trigger design: Level 1 menu + prescales, HLT menu + prescales, algorithm parameters, release version
- A tool for database browsing and manipulation (TriggerTool):
  - Flexible and fast changes of the trigger during data taking
  - Used by trigger experts, the shift crew and offline analysts
- Software clients access the TriggerDB directly (through a relational access layer) for data taking, simulation, and distribution of configuration data to the conditions database
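As an illustration of the single-key idea, the sketch below shows how a client could resolve one configuration key into its Level 1 and HLT parts; the table and column names are invented for the example and do not reproduce the real TriggerDB schema.

```python
import sqlite3  # stand-in for the production relational database


def load_configuration(conn, config_key):
    """Resolve one configuration key into Level 1 and HLT components (illustrative schema only)."""
    row = conn.execute(
        "SELECT l1_menu_id, hlt_menu_id, release_version "
        "FROM super_master WHERE id = ?", (config_key,)).fetchone()
    if row is None:
        raise KeyError(f"unknown configuration key {config_key}")
    l1_menu_id, hlt_menu_id, release = row

    l1_items = conn.execute(
        "SELECT name, logic, prescale FROM l1_item WHERE menu_id = ?",
        (l1_menu_id,)).fetchall()
    hlt_chains = conn.execute(
        "SELECT name, lower_chain, prescale FROM hlt_chain WHERE menu_id = ?",
        (hlt_menu_id,)).fetchall()

    return {"release": release, "l1_items": l1_items, "hlt_chains": hlt_chains}
```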

Slide 8: Operation of the Configuration System – Preparation
- The ATLAS trigger community is responsible for developing and testing the trigger algorithms to achieve the ATLAS physics goals
- Level 1 trigger menu: stable in time; small changes to thresholds and trigger items are made by hand using the TriggerTool
  - The Trigger-Menu-Compiler creates the image for the Level 1 hardware
  - Prescales are adjusted by the shifter to match the luminosity and optimize bandwidth usage
- High Level Trigger:
  - Prepare and validate the trigger menu for data taking
  - Populate the TriggerDB with the HLT configuration information: menu, algorithm parameters, prescale rates
  - Check consistency with the Level 1 configuration
- Configuration aliases for the shifter: logical names ('PHYSICS', 'COSMICS', 'CALIBRATION') pointing to the currently valid configurations

Slide 9: TriggerDB during Data Taking
- The shifter chooses a trigger configuration alias before the CONFIG transition (Trigger Panel in the ATLAS Run Control interface)
- The configuration key is written to the online configuration database, to be picked up by the Level 1 CTP controller and the HLT processes
- At CONFIG:
  - The CTP controller loads the image from the TriggerDB into the CTP hardware
  - The HLT processes load the configuration from the TriggerDB into memory and initialize themselves
- At START of run: partial configuration information is written into the ATLAS conditions database (COOL) as run-wise trigger configuration data
- While RUNNING: Level 1 prescales can change and are written to COOL
- See the talk on the ATLAS Online Configuration Database by I. Soloviev (Wed 15:05, Online Computing)
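The run-control flow above can be summarized in a hedged pseudo-code sketch; the helpers (resolve_alias, compile_l1_image, write_run_configuration, etc.) are hypothetical stand-ins for the real run-control, TriggerDB and COOL interfaces.

```python
def configure_run(alias, triggerdb, online_db, ctp_controller, hlt_nodes):
    """Illustrative CONFIG-transition sequence (not the real ATLAS API)."""
    # Shifter choice: an alias such as 'PHYSICS' maps to the currently valid configuration key.
    key = triggerdb.resolve_alias(alias)
    online_db.publish_configuration_key(key)        # picked up by the CTP controller and HLT nodes

    # Both trigger levels configure themselves from the same key.
    ctp_controller.load(triggerdb.compile_l1_image(key))   # Level 1 hardware image
    for node in hlt_nodes:
        node.load_configuration(triggerdb, key)             # HLT processes initialize in memory
    return key


def start_run(run_number, key, triggerdb, cool):
    # START transition: record run-wise configuration data in the conditions database (COOL).
    cool.write_run_configuration(run_number, key, triggerdb.menu_summary(key))


def update_l1_prescales(run_number, new_prescales, ctp_controller, cool):
    # Allowed while RUNNING: only Level 1 prescales change, and the change is recorded in COOL.
    ctp_controller.apply_prescales(new_prescales)
    cool.write_prescale_update(run_number, new_prescales)
```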

Slide 10: Trigger Result
- Run-wise configuration data (conditions database) are needed to interpret the trigger decision:
  - Maps of trigger names to bit positions and chain counters, allowing access to the trigger decision by trigger name
  - Information about the trigger definition at each step of chain processing, to rebuild the HLT menu and access the trigger objects by name
  - Prescale, LVL1 veto and HLT forced-accept rates, and the trigger chains
- The event-wise trigger decision is encoded in the bytestream:
  - Level 1 trigger: acceptance flags for the up to 256 active trigger items before and after the application of prescale and veto (3 x 256 bits)
  - High level trigger: acceptance flags for each chain before and after the application of prescale and forced-accept; chains are identified by a short integer (the chain counter); the index of the last successfully processed step is stored for each chain
  - Trigger objects (e.g. hadronic clusters, muon tracks) and information linking these HLT trigger objects to the Level 1 trigger objects (for trigger studies)
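To illustrate how the run-wise name-to-bit map and the event-wise bit pattern fit together, here is a minimal sketch; the three Level 1 bit sets (before prescale, after prescale, after veto) follow the slide, but the function and variable names are invented and are not the actual ATLAS decoding code.

```python
def l1_item_decision(item_name, name_to_bit, tbp_bits, tap_bits, tav_bits):
    """Look up one Level 1 item in the three 256-bit acceptance words of an event.

    name_to_bit       -- run-wise map from item name to bit position (from the conditions database)
    tbp/tap/tav_bits  -- event-wise flags: before prescale, after prescale, after veto
    """
    bit = name_to_bit[item_name]
    return {
        "before_prescale": bool((tbp_bits >> bit) & 1),
        "after_prescale":  bool((tap_bits >> bit) & 1),
        "after_veto":      bool((tav_bits >> bit) & 1),
    }

# Hypothetical example: item "L1_EM3" sits on bit 5 and fires, but is removed by the prescale.
name_to_bit = {"L1_EM3": 5}
print(l1_item_decision("L1_EM3", name_to_bit, tbp_bits=1 << 5, tap_bits=0, tav_bits=0))
```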

Slide 11: Flow of Configuration Data
1. The TriggerDB configures the trigger (LVL1/CTP, LVL2, Event Filter) for data taking
2. Configuration data are written to COOL; trigger menus go into the conditions database, and the TriggerDB is replicated
3. The trigger result (LVL1 result, LVL2 result, EF result, trigger objects) is written into each event
4. Events are shipped to the reconstruction sites
5. ESD, AOD and TAG data are produced for trigger-aware analysis
Diagram: data flow from the front-end RODs through LVL1/CTP, the LVL2 sub-farm input, the event builder and the Event Filter (which reach the TriggerDB via the DbProxy) to the sub-farm output; express and calibration streams and the main stream go to Tier 0 for prompt and express reconstruction and calibration (ESD ~100 MB/s, AOD ~20 MB/s), with transfers to Tier 1 (reprocessing) and Tier 2 (MC production), and ESD/AOD/TAG files and databases for analysis.
More details in the talks on ATLAS Databases by A. Vaniachine (Wed 14:40, Distributed Data Analysis) and on the ATLAS Tag DB by F. Viegas (Wed 14:40, Software Components).

Slide 12: Configuration Data in Offline Analysis
Trigger information for trigger studies and physics analysis:
- Configuration: LVL1 items and HLT chains (name, version); prescale, LVL1 veto and HLT pass-through rates
- Trigger result: did the event pass or fail? For what reason (prescaled, vetoed, pass-through)? What was the last successful step in each chain?
- Navigation: which trigger object caused the trigger decision?
- Trigger event data: rerun the trigger selection offline with tightened criteria
Diagram: the TriggerDB (holding all configuration data) configures the trigger for data taking; the encoded trigger decision and the decoded trigger menu (via the online conditions database, COOL) are made persistent in the ESD (Event Summary Data) and AOD (Analysis Object Data), while the decoded trigger information itself is transient.

Slide 13: Analysis and Trigger Studies
- Most common use case: check whether an event passed a desired trigger
  - Trigger efficiencies, luminosity calculations (on the TAG DB)
- Navigate to the trigger object that caused the trigger accept
  - Example: Z -> ee events combined with a single-electron trigger to study the electron trigger efficiency (a sketch follows below)
- Rerun the trigger without reconstruction of trigger objects
  - Run the trigger as during data taking, but switch off the trigger feature-extraction (FEX) algorithms (see Simon's talk)
  - Perform the selection with tighter requirements (HYPO algorithms): turn-on curves, ...
- Most trigger analyses are possible on AOD data
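A hedged sketch of the first two use cases, checking the decision by trigger name and navigating to matched trigger objects in a simplified tag-and-probe style efficiency; the event accessors (passed, offline_electrons, trigger_objects) and the trigger name are invented for illustration and are not the real ATLAS analysis interface.

```python
def electron_trigger_efficiency(events, trigger_name="e_single"):
    """Simplified tag-and-probe style estimate of a single-electron trigger efficiency
    in Z->ee candidate events; `events` must provide the hypothetical accessors used below."""
    n_probe, n_matched = 0, 0
    for event in events:
        if not event.passed(trigger_name):           # event accepted by the trigger
            continue
        for electron in event.offline_electrons():    # offline-reconstructed probe electrons
            n_probe += 1
            # Navigation: was this offline electron matched to a trigger object of the chain?
            if any(obj.matches(electron) for obj in event.trigger_objects(trigger_name)):
                n_matched += 1
    return n_matched / n_probe if n_probe else float("nan")
```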

Slide 14: Interaction with the TriggerDB – the TriggerTool
An intuitive Java-based GUI to browse and manipulate trigger configurations (screenshot: search results and an edit pane, e.g. for chains):
- Integrated consistency checking
- Tree view for trigger menus (and subsets), table view for plain data
- Simple and advanced search capabilities
- Shifter mode: change prescale factors or the trigger menu to react to changing detector or beam conditions
- User mode: browse trigger menus and detailed information such as algorithm parameters
- Expert mode: upload new and manipulate existing configurations

Slide 15: Commissioning the LVL1 Configuration
- The ATLAS Central Trigger Processor (CTP) has been tested for over a year on cosmic-ray data using input from various detector sub-systems (see also 'Commissioning the ATLAS trigger' by J. Boyd, Wed 17:30, Online Computing)
- Complete muon cosmic-ray slice (LVL1 + LVL2) in February 2007:
  - The LVL1 muon trigger provides the trigger and the seed for LVL2
  - The LVL2 algorithm requests detector data to reconstruct muon candidates
  - Figure: cosmic-ray RPC impact points, extrapolated to ground level, showing the ATLAS access shafts
- Configuration of the CTP from the TriggerDB is the default
- August commissioning week: writing of the LVL1 trigger configuration to the conditions database for each run

Slide 16: Commissioning the HLT Configuration
- Large Scale Tests in December 2006:
  - Simple trigger setup on a large computing farm of 600 dual-core processors
  - TriggerDB access via the ATLAS DbProxy, a service that caches database requests and replies to reduce the load on the primary database
- Two subsequent technical runs in March and May 2007 on simulated and cosmic data:
  - Small computing farm (part of the final ATLAS trigger farm)
  - Test of complex trigger menus
  - Exercising the TriggerTool in a shift-like environment
- August commissioning week: tested writing of the HLT configuration to the conditions database for each run
- Screenshot: the ATLAS Run Control interface during the Large Scale Tests
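As a rough illustration of the caching idea behind the DbProxy (the class and method names below are hypothetical, not the real service), a proxy that memoizes identical requests lets many HLT nodes asking for the same configuration generate a single query to the primary database:

```python
class CachingDbProxy:
    """Illustrative request/reply cache between many HLT clients and one primary database."""

    def __init__(self, primary_db):
        self.primary_db = primary_db
        self._cache = {}

    def query(self, sql, params=()):
        key = (sql, tuple(params))
        if key not in self._cache:                       # first client triggers the real query
            self._cache[key] = self.primary_db.execute(sql, params)
        return self._cache[key]                          # identical later requests served locally
```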

Slide 17: Trigger Configuration for Simulation Production
- The TriggerDB holds the complete trigger configuration, so it can be used to simulate the trigger in exactly the same way as it is configured for data taking
- Advantages:
  - Consistency between the online trigger and the simulation
  - Configurations created and used during data taking can easily be reused in MC production
  - More flexible propagation of configuration changes to the MC production sites: currently a new MC production cycle requires a new software release; this can be replaced by a software release plus a configuration key, giving a faster turn-around when only trigger menus or algorithm parameters need adjustment
- The system is under construction

Slide 18: Summary
- A system has been designed and implemented to provide a consistent configuration of all three trigger levels
- The trigger configuration is accessed consistently for data taking, Monte Carlo production, and trigger studies / trigger-aware analysis
- Fast and flexible response to changing detector or beam conditions during data taking
- The system provides a trigger history over the ATLAS lifetime, easily accessible to the analyst using the TriggerTool
- The system has been commissioned for Level 1 and the HLT, separately and combined, during cosmic test runs and large-scale farm tests