TRIGGER STATUS AND MENU OPTIMIZATION
LHCC Referee Meeting with ATLAS – 7th July 2009
Ricardo Gonçalo (RHUL) on behalf of ATLAS Trigger/DAQ

Outline
- Recent activity
- Highlights from recent results
- Planning for the 2009/10 run
- Online and offline monitoring
- Trigger menus for the coming run
- Conclusions

Recent activity
- Much has happened since the last Trigger report at this meeting (see the talk by Nick Ellis at the September 22nd LHCC review):
  - The short single-beam run and the cosmics runs in 2008/09 provided a good stress test of the Trigger/DAQ system
  - The trigger successfully worked with LHC beams for the first time!
  - Excellent progress was made on timing-in the various detectors in late 2008 and 2009
  - The trigger successfully selected events for detector and trigger commissioning
- Since then, the collected data have been thoroughly analyzed:
  - Residual problems were identified and fixed
  - Lessons from the operational experience have led to new tools and improved procedures
[Figures: a cosmic event triggered by the L1 tau and jet triggers; oscilloscope traces from the beam pickup (yellow) and minimum-bias scintillators for a single injected bunch]

Single-beam and cosmic runs
- Single-beam events were selected with various triggers: beam pickups, minimum-bias trigger scintillators, calorimeters and forward muon detectors
- Data streaming was done by the High-Level Trigger (HLT), based on the Level 1 trigger type and on Level 2 tracking (see figure, and the sketch below)
- HLT algorithms were exercised online to reconstruct trigger data:
  - Running in parasitic mode, without rejecting events
  - Contributed much to weeding out problems in a real environment
  - Monitoring benefited greatly from the exercise
- Heavy use of the CAF for testing bug fixes and new menus before deployment
- Menus were updated almost daily, responding to feedback from data analysis and to the needs of detector-system commissioning
  - The menu configuration machinery responded quickly and well to new demands
[Figure: number of events recorded into each stream in 2008]
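A minimal sketch of such an HLT streaming decision, with hypothetical stream names and a toy track-quality cut; the real HLT steering and stream definitions are considerably more elaborate:

```python
# Minimal sketch of HLT-based data streaming (hypothetical names and
# cuts; the real ATLAS HLT steering is far more elaborate).

L1_TYPE_STREAMS = {
    "muon": "L1Muon",
    "calo": "L1Calo",
    "mbts": "MinBias",
}

def assign_streams(l1_trigger_type, l2_tracks):
    """Return the (inclusive) list of streams for one event."""
    streams = []
    # Stream on the Level 1 trigger type word
    stream = L1_TYPE_STREAMS.get(l1_trigger_type)
    if stream:
        streams.append(stream)
    # Additionally stream events with good Level 2 track candidates,
    # without rejecting anything (parasitic / pass-through running)
    if any(t["n_hits"] >= 7 for t in l2_tracks):
        streams.append("IDTracks")
    return streams or ["debug"]  # events with no decision go to debug

print(assign_streams("muon", [{"n_hits": 9}]))  # ['L1Muon', 'IDTracks']
```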

Highlights from cosmic runs
- The complete HLT infrastructure was tested
  - Including the algorithm steering and configuration
  - Also the online and offline monitoring
  - Weak points were identified and were/are being addressed – e.g. a new trigger rate display
- Level 2 inner detector tracking algorithms were useful for selecting events with track candidates (top figure)
  - Algorithms were modified to accept tracks not coming from the nominal interaction point
  - Used to create a stream enriched in good track candidates for inner detector and offline tracking commissioning
- Many HLT algorithms were exercised with cosmics by relaxing selection thresholds and requirements (bottom figure); many other examples would be possible
[Figures: E3x7/E7x7 cells in the LAr calorimeter for trigger clusters matching a cluster reconstructed offline; Level 2 tracking efficiency for events with a good track reconstructed offline]

Technical runs
- Playback of simulated collision data through the DAQ system
  - Performed regularly to integrate new software releases in the online environment
  - Allows testing new DAQ and HLT software in the real system
  - Allows testing the system with collision-type data: enhanced bias, physics channels
  - Preparation before running detectors on cosmics
- Used to estimate system performance:
  - Estimated processing time is compatible with design goals (top figure); the mean HLT processing time should be ~40 ms at Level 2 and ~4 s at the Event Filter, or less
  - Throughput rate: tested up to ~78-90 kHz input to Level 2 (bottom figure; 50-75 kHz expected)
  - An event building rate of ~5 kHz starts to compete with Level 2 data requests (~2 kHz event building rate expected; the event building design rate is 3.5 kHz)
  - Note: processing time is strongly dependent on event topology and on the trigger menu
[Figures: Event Filter processing time for accepted events; HLT input rate test, input rate (Hz) versus time (hours)]
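As a sanity check, the quoted rates and mean processing times fix how much farm capacity must be busy at any moment (Little's law). The sketch below just reproduces that arithmetic with the numbers from this slide; the resulting core counts are illustrative, not the real farm accounting:

```python
# Back-of-envelope check of the HLT time budgets quoted above.
# Rates and mean times are taken from the slide; the derived core
# counts are illustrative assumptions, not the real farm size.

l2_input_rate_hz = 75e3   # expected Level 2 input rate (upper end)
l2_mean_time_s = 0.040    # ~40 ms mean Level 2 processing time
ef_input_rate_hz = 3.5e3  # event building design rate feeding the EF
ef_mean_time_s = 4.0      # ~4 s mean Event Filter processing time

# Little's law: concurrently busy cores = arrival rate x mean service time
l2_cores_busy = l2_input_rate_hz * l2_mean_time_s   # -> ~3000
ef_cores_busy = ef_input_rate_hz * ef_mean_time_s   # -> ~14000

print(f"Level 2 needs ~{l2_cores_busy:.0f} cores busy on average")
print(f"Event Filter needs ~{ef_cores_busy:.0f} cores busy on average")
```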

Monitoring and diagnosing problems
- Monitoring is essential to quickly spot and diagnose problems
- Both online – in the control room – and offline, trigger monitoring was exercised in the cosmics and technical runs
- Online monitoring is based on histograms of:
  - Trigger rates
  - Overlap between trigger streams
  - Characteristic event quantities for each selection
- Offline monitoring is based on:
  - Histograms produced during Tier-0 event reconstruction
  - Re-processing and error analysis of events from the debug stream
- Improved trigger monitoring is currently being tested with cosmic events
[Figure: Level 2 calorimeter transverse energy (MeV), measured versus reference histogram; the mismatch is due to calibration laser pulses in this run, and a noisy module is visible]
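The reference-histogram comparison behind plots like the one above can be illustrated with a minimal sketch. This is a generic chi-square-per-bin check, not the actual ATLAS DQM algorithm; the bin contents and alert threshold are invented:

```python
# Minimal sketch of a reference-histogram comparison used in shift
# monitoring: flag a measured distribution that deviates from its
# reference. (Generic check; ATLAS uses its own DQM framework.)

def chi2_per_bin(measured, reference):
    """Symmetric chi-square per bin between two binned distributions."""
    chi2, nbins = 0.0, 0
    for m, r in zip(measured, reference):
        if m + r > 0:
            chi2 += (m - r) ** 2 / (m + r)
            nbins += 1
    return chi2 / max(nbins, 1)

reference = [5, 40, 120, 80, 20, 5]
measured  = [6, 42, 115, 85, 60, 7]   # excess in one bin (e.g. laser pulses)

score = chi2_per_bin(measured, reference)
print("ALERT" if score > 2.0 else "OK", f"(chi2/bin = {score:.2f})")
```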

Offline testing and monitoring
- As learned from the 2008 run, it is essential to thoroughly test new menus and HLT algorithms with real data before online deployment
- It is also important to be able to react quickly to new problems – or risk wasting bandwidth collecting bad data
- The CAF is an essential part of the trigger strategy for this:
  - Used for automatic re-processing of events from the debug stream, where an error condition occurred online – e.g. a time-out during a Level 2 data request
  - Used to test new menus once they are loaded into the trigger configuration database and before they are deployed online
  - Needs to provide access to RAW data from a small number of runs where a problem was identified, until the debugging is completed. This allows us to study problematic events offline, correct weaknesses in the software and test the fixes – it minimizes lost time and disruption to online running
- Other debugging tools are provided by:
  - The monitoring histograms produced during event reconstruction at Tier 0
  - The production of commissioning ntuples at Tier 1 for fast analysis of problems (and at Tier 0 for time-critical needs)
  - The “preseries” subfarm: a set of HLT nodes serving as a test platform and not used for normal trigger processing

Planning for the 2009/10 run
- A set of reviews was held after last year’s run to examine critically what had been done
  - These touched the following subjects: offline monitoring infrastructure, tools for online monitoring and debugging, shift crew operation and tools, information flow, timing measurements, configuration and streaming
- The trigger workshop in February 2009 was an important milestone:
  - Reviewed the trigger activity in the 2008 single-beam and cosmics runs and established plans to prepare for the 2009 run
  - Led by a panel from the broader ATLAS community, with experience from other experiments
  - Raised interest and involved people from detector, physics, combined performance, data preparation, etc.
  - Resulted in a report and ~80 identified action items with responsible people’s names attached:
    - Touching on all areas, from trigger menus to monitoring, documentation, configuration, etc.
    - We have been following up on these for the last five months in the various trigger domains
    - Many translated into agreed procedures, software deliverables or tools

Commissioning plans
- The combined ATLAS cosmic run will start at T0 – 4 weeks, with all systems and 24h coverage
- The trigger will start with the already familiar Level 1 selection for cosmic events; the menu will be ready at T0 – 10 weeks, to be deployed at T0 – 8 weeks for runs with some systems
  - Run the HLT in passthrough mode – exercise the algorithms, event data, monitoring, etc. without rejecting events
  - Data streaming by the HLT, based on the Level 1 trigger type and on tracking/cosmic event algorithms
  - Exercise HLT algorithms with a loose event selection to accept and process cosmic events
- Single-beam events will be selected with a dedicated menu
  - Based on the beam pickup and minimum-bias scintillators
  - Refine the timing of signals from the various detector systems
  - Continue to exercise HLT algorithms in passthrough mode using beam-gas events and halo muons
- Initial collisions will be triggered with Level 1 only
  - A significant amount of work, e.g. on Level 1 calibration, needs to be done with initial collisions
  - These data will be essential for commissioning the detectors, the Level 1 trigger and the HLT selections
- The HLT will be deployed to reject events only when needed to keep the event rate within budget
- Both Level 1 and HLT trigger prescales can now be updated during the run to increase operational flexibility – prescale factors are constant within luminosity blocks (see the sketch below)
- Conditions are now being created to get fast feedback from the trigger on the collected samples
  - Using Tier 1 and/or the CAF to process events and create monitoring histograms and dedicated ntuples (ROOT) with fast turnaround
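The luminosity-block prescale mechanism mentioned above can be sketched as follows. The item names and prescale values are hypothetical; in reality, prescale sets are loaded from the trigger configuration database:

```python
# Sketch of in-run prescale updates: prescale factors may change
# between luminosity blocks but are constant within one.
# (Hypothetical item names and values.)

import random

prescale_sets = {
    # luminosity block at which each prescale set takes effect
    0:   {"L1_MBTS_1": 1,  "L1_EM3": 1},
    120: {"L1_MBTS_1": 50, "L1_EM3": 1},   # rates grew: prescale min bias
}

def prescale_for(lumi_block, item):
    """Prescale valid in this luminosity block (latest set <= LB wins)."""
    start = max(lb for lb in prescale_sets if lb <= lumi_block)
    return prescale_sets[start][item]

def passes_prescale(lumi_block, item):
    """Accept on average 1 out of N events for prescale N."""
    return random.random() < 1.0 / prescale_for(lumi_block, item)

print(prescale_for(250, "L1_MBTS_1"))  # -> 50
```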

Menus for initial data
- The cosmics menus have been thoroughly exercised in recent runs in May and June
  - The Level 1 calorimeter and muon triggers have been reliably providing triggers for cosmics runs this year
  - The evolution of the cosmics menu will contain some muon triggers in their physics configuration
- The initial-beam menu will be used for single-beam running and first collisions
  - It will need to handle different LHC scenarios and be resilient to badly timed-in detectors
  - Relies on the beam pickup to identify filled bunches
  - Experience from single-beam running in 2008 was used in the design of this menu
- The bunch-group mechanism will be commissioned carefully to replace the beam pickups
- The High-Level Trigger will be used to reject events only when necessary
[Tables: Cosmics menu 1; Cosmics menu 2; Initial beam menu]

Menu evolution
- The evolution of the trigger menu is closely tied to the evolution of the LHC luminosity (and to the beam energy)
- Several commissioning menus are being put in place for the initial beam period, with detector and trigger commissioning as the highest priority
  - Procedures for menu evolution have been agreed but still need to be tested in real life
- Menus exist, or are being developed, in Monte Carlo simulation for average luminosities of 10^31 cm^-2 s^-1 and 10^32 cm^-2 s^-1
  - These are possible scenarios for the coming run
  - Depending on the detailed bunch-spacing scenario, this could mean up to 5 overlapping events per bunch crossing on average – this might require changes to the menu in order to keep the rate manageable (see the estimate below)
  - These menus provide a reference for planning and evolving the online trigger menu as the LHC luminosity grows
- Some high-pT physics triggers, needed for analysis in channels with low cross section, are “unprescalable”
- Some practical questions remain on what menus should be used in Monte Carlo production
  - These can have an impact on the analysis of initial data
  - It must be possible to simulate the response of the menus/algorithms used in online event selection
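The figure of up to five overlapping events per crossing follows from the average luminosity, the inelastic cross section and the number of colliding bunches. A rough check, where the cross section and bunch count are assumptions chosen for illustration:

```python
# Rough check of the "up to ~5 overlapping events per crossing" figure.
# Mean pile-up: mu = L * sigma_inel / (n_bunches * f_rev).
# The cross section and bunch count below are illustrative assumptions.

lumi = 1e32            # cm^-2 s^-1, the upper menu scenario on this slide
sigma_inel = 60e-27    # cm^2 (~60 mb inelastic pp cross section, assumed)
f_rev = 11245.0        # LHC revolution frequency, Hz
n_bunches = 100        # assumed number of colliding bunch pairs

mu = lumi * sigma_inel / (n_bunches * f_rev)
print(f"mean interactions per bunch crossing: {mu:.1f}")  # ~5.3
```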

Conclusions
- The trigger was ready for beam in 2008, and much was achieved in the single-beam and 2008/09 cosmics runs
  - The HLT was successfully used to stream single-beam events and to select and stream cosmic events for detector commissioning
  - The cosmics runs provided vital experience of prolonged stable running (>200 million cosmic events recorded)
  - The Level 1 (muon, calorimeter) triggers were selecting events from the start and have been reliably providing events for detector commissioning since then
- The lessons learned from this initial running period were extremely important in planning this year’s activities
  - Addressing weak areas, improving robustness, preparing for the unexpected
- As a result, we are even better prepared for running in 2009

Backup slides

Trigger stream overlap
- Data streams are determined by the High-Level Trigger
- The ATLAS inclusive streaming model relies on a small overlap between streams (as sketched below)
- Exclusive debug streams are used for events with online error conditions
[Figure: stream overlap for a cosmics run with debug, “physics” and calibration streams defined]
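A minimal sketch of the overlap bookkeeping behind such a plot, using toy per-event stream assignments (all names and numbers hypothetical):

```python
# Sketch of stream-overlap bookkeeping: with inclusive streaming one
# event may enter several streams, and the pairwise overlap fraction
# should stay small. (Toy event sample, hypothetical stream names.)

from itertools import combinations

events = [
    {"L1Calo"}, {"L1Muon"}, {"L1Calo", "L1Muon"},
    {"IDTracks"}, {"L1Calo"}, {"L1Muon", "IDTracks"},
]

streams = sorted(set().union(*events))
for a, b in combinations(streams, 2):
    both = sum(1 for e in events if a in e and b in e)
    either = sum(1 for e in events if a in e or b in e)
    print(f"{a} & {b}: overlap = {both}/{either} = {both/either:.0%}")
```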

Trigger information for physics analysis
[Diagram: the trigger menu configures the online selection; the decoded trigger menu and decision are stored in ESD, AOD, DPD and TAG, with decreasing amount of detail]

Trigger menu configuration
- TriggerTool: a Java-based front end to the TriggerDB, launched from the web (Java Web Start):
  - Overview of all trigger configurations
  - Detailed and convenient investigation of trigger menus
    - Trigger definitions L1 -> L2 -> EF: prescales, threshold algorithms, selection criteria, streaming information, etc.
  - Possibility to compare different trigger configurations

Web-based access to trigger configuration
- Web interface
  - Runs the TriggerTool on the server; results are presented as dynamic HTML pages
  - 1. Search a run range
  - 2. Run list
  - 3. Trigger configuration (browsable): definition, algorithms, selection cuts
  - Also offers simple comparison functionality

Types of bunch crossings
[Diagram: five A-side/C-side bunch patterns – both bunches filled; both bunches empty; empty bunch crossing after a filled one; only the A-side bunch filled; only the C-side bunch filled]

Bunch groups
- All bunch crossings are numbered with Bunch-Crossing IDentifiers (BCIDs)
- A set of BCIDs falling into one category is called a bunch group
- Bunch groups are realised as 7 lists of numbers that set internal thresholds in the Central Trigger Processor (CTP):
  - BGRP0: not in BCR veto
  - BGRP1: filled
  - BGRP2: empty, reserved for calibration
  - BGRP3: empty
  - BGRP4: unpaired, beam 1
  - BGRP5: unpaired, beam 2
  - BGRP6: empty after filled
- Relying on the bunch-group mechanism means relying on the clocks; this requires well timed-in detectors and is not feasible with initial beams
- Example items – beam: L1_EM3 = EM3 & BGRP0 & BGRP1; cosmic: L1_EM3_EMPTY = EM3 & BGRP0 & BGRP3
- This makes a well-defined cosmic slice possible in a physics menu! (See the sketch below)
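A toy sketch of how the bunch-group requirement enters a Level 1 item, following the L1_EM3 / L1_EM3_EMPTY example above; the BCID lists are invented for illustration:

```python
# Toy sketch of the bunch-group requirement in a Level 1 item.
# The CTP holds bunch groups as lists of BCIDs; an item fires only if
# its threshold passed AND the event's BCID is in every bunch group
# the item requires. (BCID lists below are invented.)

BUNCH_GROUPS = {
    "BGRP0": set(range(1, 3564)),  # not in bunch-counter-reset veto
    "BGRP1": {1, 101, 201},        # filled bunches (toy values)
    "BGRP3": {500, 600, 700},      # empty bunches (toy values)
}

def l1_item_fires(threshold_passed, required_groups, bcid):
    return threshold_passed and all(
        bcid in BUNCH_GROUPS[g] for g in required_groups)

# L1_EM3 = EM3 & BGRP0 & BGRP1  (colliding bunches)
print(l1_item_fires(True, ["BGRP0", "BGRP1"], bcid=101))  # True
# L1_EM3_EMPTY = EM3 & BGRP0 & BGRP3  (cosmic slice in empty crossings)
print(l1_item_fires(True, ["BGRP0", "BGRP3"], bcid=101))  # False
```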
