TRIGGER STATUS AND MENU OPTIMIZATION
LHCC Referee Meeting with ATLAS – 7th July 2009
Ricardo Gonçalo (RHUL), on behalf of the ATLAS TDAQ Collaboration

Outline
- Recent activity
- Strategic planning
- Highlights from recent results
- Monitoring and finding problems
- Commissioning plans
- Menu evolution
- Testing and solving problems

Recent activity
- Much has happened since the last TDAQ report at this meeting (see the talk by Nick Ellis at the September 22nd LHCC review)
- The short single-beam run and the cosmics run in 2008 provided a good stress test of the TDAQ system
  - Detector systems, DAQ and trigger algorithms were exercised on real data
  - The trigger successfully selected events for detector and trigger commissioning
- Since then, the data have been analyzed and lessons drawn
  - The collected data were scrutinized; problems were found and fixed
  - The organization of the activity benefited much from these running periods
[Figure: cosmic event triggered by the L1 tau and jet triggers]

Strategic Planning
- A set of reviews was held after last year's run to examine critically what had been done
  - Subjects covered: offline monitoring infrastructure, tools for online monitoring and debugging, shift crew operation and tools, information flow, timing measurements, configuration and streaming
- The trigger workshop in February 2009 was an important milestone:
  - Reviewed the trigger activity in the 2008 single-beam and cosmics runs and established plans to prepare for the 2009 run
  - Led by a panel from the broader ATLAS community, bringing in experience from other experiments
  - Raised interest and involved people from detector, physics, combined performance, data preparation, etc.
  - Resulted in a report and ~80 identified action items, each with a responsible person's name attached
  - Touching on all areas, from trigger menus to monitoring, documentation, configuration, etc.
- We have been following up on these items for the last five months in the various trigger domains
  - Many have been translated into agreed procedures, software deliverables or tools

Single-beam and cosmic runs
- "Splash" events triggered with the beam pickup (BPTX) and minimum-bias trigger scintillators (MBTS) allowed a rough timing-in of the detectors
- A more complete test of the HLT was possible during the cosmics run, which continued until the end of October 2008
  - The High-Level Trigger (HLT) ran in passthrough mode
  - Data streaming was done by the HLT, based on the Level 1 trigger type and on Level 2 tracking (see figure)
  - HLT algorithms were exercised online to reconstruct trigger data
  - This contributed much to weeding out problems in a real environment
  - Monitoring benefited greatly from the exercise
- Heavy use was made of the CAF for testing bug fixes and new menus before deployment
  - New menus were deployed practically every day, in response to new problems and needs from the detector systems
  - The menu configuration machinery responded quickly and well to new demands
[Figure: number of events recorded into each stream in 2008]
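
As a rough illustration of the streaming logic above, the sketch below routes events to streams from the Level 1 trigger-type word and a Level 2 tracking flag. It is plain Python, not ATLAS code; the stream names and bit masks are assumptions for illustration.

    # Hypothetical bit masks for the Level 1 trigger-type word.
    L1_TYPE_MUON = 0x01
    L1_TYPE_CALO = 0x02
    L1_TYPE_MBTS = 0x04

    def assign_stream(l1_trigger_type: int, l2_has_track: bool) -> str:
        """Pick an output stream for one event (illustrative logic only)."""
        if l2_has_track:
            return "IDCosmic"   # Level 2 tracking found a cosmic track candidate
        if l1_trigger_type & L1_TYPE_MUON:
            return "L1Muon"
        if l1_trigger_type & L1_TYPE_CALO:
            return "L1Calo"
        if l1_trigger_type & L1_TYPE_MBTS:
            return "MinBias"
        return "debug"          # nothing matched: keep the event for inspection

    # A few mock events: (L1 trigger-type word, L2 track flag).
    for l1, trk in [(0x01, False), (0x02, True), (0x00, False)]:
        print(hex(l1), trk, "->", assign_stream(l1, trk))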

Highlights from cosmic runs
- Only the 2008 cosmic run has been analysed so far
  - The June 2009 combined cosmic run finished yesterday
- The complete HLT infrastructure was tested
  - Including the algorithm steering and configuration
  - Also the online and offline monitoring
  - Weak points were identified and were/are being addressed, e.g. a new trigger rate display
- The LHC clock was not available
  - Event timing was provided by the muon trigger chambers
  - This leads to lower resolution in some detectors
- Level 2 inner detector tracking algorithms were useful for selecting events with track candidates (top figure)
  - The algorithms were modified to accept tracks not coming from the nominal IP
- Many HLT algorithms were exercised with cosmics by relaxing selection thresholds and requirements (bottom figure)
- Many other examples would be possible…
[Figures: Level 2 tracking efficiency for events with a good track reconstructed offline; E3x7/E7x7 cell energy ratio in the LAr calorimeter for trigger clusters matching a cluster reconstructed offline]
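
The cut relaxation can be pictured with a minimal sketch: a collision-style track selection demands small impact parameters with respect to the nominal IP, while a cosmic ray does not point back to the beam line, so those cuts are loosened. The Track type and all cut values below are illustrative assumptions, not ATLAS HLT code.

    from dataclasses import dataclass

    @dataclass
    class Track:
        pt: float   # transverse momentum, GeV
        d0: float   # transverse impact parameter w.r.t. the nominal IP, mm
        z0: float   # longitudinal impact parameter, mm

    def select(track: Track, cosmics: bool) -> bool:
        """Apply a collision-style or relaxed cosmic-style track selection."""
        if track.pt < 1.0:                # common pT threshold
            return False
        if cosmics:
            return abs(track.d0) < 500.0  # loose: accept non-pointing tracks
        return abs(track.d0) < 2.0 and abs(track.z0) < 200.0

    t = Track(pt=5.0, d0=120.0, z0=800.0)  # a typical non-pointing cosmic
    print(select(t, cosmics=True))    # True: kept by the relaxed selection
    print(select(t, cosmics=False))   # False: fails the collision-style cuts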

Technical runs
- Playback of simulated collision data through the DAQ system
  - Performed regularly to integrate new releases into the online environment
  - Allows testing new DAQ and HLT software in the real system
  - Allows testing the system with collision-type data: enhanced bias, physics channels
  - Preparation before running the detectors on cosmics
- Estimate system performance:
  - Estimated processing time is within the limits imposed by the processing capability (top figure; the average should be < 40 ms at Level 2 and < ~1 s at the Event Filter)
  - Throughput rate: tested up to ~78 kHz input to Level 2 (bottom figure; 50-75 kHz expected)
  - An event building rate of ~5 kHz starts to compete with Level 2 data requests (~2 kHz event building rate expected)
[Figures: Event Filter processing time for accepted events; HLT input rate test, input rate (Hz) versus time (hours)]
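
The quoted limits follow from simple queueing arithmetic: the number of concurrently busy processing slots a farm needs is roughly the input rate times the mean processing time (Little's law), ignoring overheads and rate fluctuations. A back-of-envelope check using the numbers above; only the rates and latency limits come from the slide.

    l2_input_rate_hz = 75e3   # upper end of the expected Level 2 input rate
    l2_mean_time_s   = 40e-3  # required mean Level 2 processing time
    ef_input_rate_hz = 5e3    # event-building rate feeding the Event Filter
    ef_mean_time_s   = 1.0    # required mean Event Filter processing time

    # Busy slots = rate x mean latency.
    print("L2 slots needed:", l2_input_rate_hz * l2_mean_time_s)  # 3000.0
    print("EF slots needed:", ef_input_rate_hz * ef_mean_time_s)  # 5000.0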

Monitoring and diagnosing problems
- Monitoring is essential to quickly spot and diagnose problems
- Both online, in the control room, and offline, trigger monitoring was exercised in the cosmics and technical runs
- Online monitoring is based on histograms of:
  - Trigger rates
  - Overlap between trigger streams
  - Characteristic event quantities for each selection
- Offline monitoring is based on:
  - Histograms produced during Tier0 event reconstruction
  - Re-processing and error analysis of events from the debug stream
- Trigger monitoring is currently being reviewed
[Figure: Level 2 calorimeter transverse energy (MeV), measured versus reference histogram; the mismatch, including a noisy module, is due to calibration laser pulses in this run]
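
A minimal sketch of the reference-histogram comparison idea, assuming a simple chi-square-per-bin figure of merit; the real monitoring framework is not shown here, and the bin contents and alert threshold are made up.

    def chi2_per_bin(measured, reference):
        """Average chi-square per non-empty bin between two histograms."""
        chi2, nbins = 0.0, 0
        for m, r in zip(measured, reference):
            if m + r > 0:
                chi2 += (m - r) ** 2 / (m + r)
                nbins += 1
        return chi2 / max(nbins, 1)

    reference = [5, 40, 120, 40, 5, 0, 0, 0]
    measured  = [4, 38, 115, 45, 6, 30, 25, 10]  # high-side excess, e.g. laser pulses

    score = chi2_per_bin(measured, reference)
    print("chi2/bin = %.1f -> %s" % (score, "ALERT" if score > 5.0 else "OK"))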

Commissioning plans
- The combined ATLAS cosmic run will start at T0 minus 4 weeks
  - The trigger will start with the already familiar Level 1 selection for cosmic events
  - The HLT will run in passthrough mode, exercising the algorithms, event data, monitoring, etc. without rejecting events
  - Data streaming by the HLT, based on the Level 1 trigger type and on tracking/cosmic event algorithms
  - HLT algorithms will be exercised with a loose event selection to accept and process cosmic events
- Single-beam events will be selected with a dedicated menu
  - Based on the beam pickup and minimum-bias scintillators
  - Continue to exercise HLT algorithms in passthrough mode using beam-gas events and halo muons
- Initial collisions will be triggered with Level 1 only
  - The HLT will be deployed to reject events only when needed to keep the event rate within budget
  - Both Level 1 and HLT triggers can be prescaled during the run, which avoids a lengthy stop and restart of the run
- Now creating the conditions to get fast feedback from the trigger on the collected samples
  - Using Tier 1 and/or the CAF to process events and create monitoring histograms and dedicated ROOT ntuples with fast turnaround
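
The in-run prescaling can be sketched as a small rate-balancing step: if the summed output rate exceeds the budget, scale up the prescales of the prescalable chains so the total fits, leaving fixed chains untouched. Chain names, rates and the budget below are illustrative assumptions.

    def adjust_prescales(chains, budget_hz):
        """chains: {name: (raw_rate_hz, prescale, prescalable)} -> new prescales."""
        rates = {n: raw / ps for n, (raw, ps, _) in chains.items()}
        total = sum(rates.values())
        if total <= budget_hz:
            return {n: ps for n, (_, ps, _) in chains.items()}
        fixed = sum(r for n, r in rates.items() if not chains[n][2])
        scale = (total - fixed) / max(budget_hz - fixed, 1e-9)
        return {n: ps * scale if flex else ps
                for n, (raw, ps, flex) in chains.items()}

    chains = {"mu4":  (900.0, 1.0, True),   # low-pT, prescalable
              "e5":   (600.0, 1.0, True),
              "xe80": (30.0,  1.0, False)}  # kept unprescaled
    print(adjust_prescales(chains, budget_hz=500.0))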

Menus for initial data
- Menus dedicated to cosmic-ray events and single beam exist, in different degrees of readiness
- Model menus exist, or are being developed in simulations, for average luminosities of 10^31 cm^-2 s^-1 and 10^32 cm^-2 s^-1
  - These are possible scenarios for the coming run
  - Depending on the detailed bunch spacing scenario, this could mean up to 5 overlapping events per bunch crossing, on average
  - This might require changes to the menu, in order to keep the rate manageable
- These model menus provide a reference for planning and evolving the online trigger menu as the LHC luminosity grows
- Some practical questions remain on which menus should be used in Monte Carlo production
  - These can have an impact on the analysis of initial data
  - It must be possible to simulate the response of the menus/algorithms used in online event selection
  - The trigger decision can be re-run on analysis-level files (AOD) with changed cuts, but this is an expert task and has limitations
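
The last point can be illustrated with a toy emulation, assuming the per-event trigger objects (here, cluster ET values) were stored in the AOD. This is not the TrigDecisionTool interface, and it shares the limitation noted above: cuts can only be re-applied to quantities that were saved.

    def redecide_et_chain(trigger_clusters_et, new_threshold_gev):
        """Re-evaluate a single-object ET chain on stored trigger clusters."""
        return any(et > new_threshold_gev for et in trigger_clusters_et)

    stored_et = [8.2, 14.5, 21.0]              # GeV, one event's trigger clusters
    print(redecide_et_chain(stored_et, 15.0))  # True: passes a tightened 15 GeV cut
    print(redecide_et_chain(stored_et, 25.0))  # False: no stored cluster above 25 GeV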

Menu evolution
- The evolution of the trigger menu is very much tied to the evolution of the LHC luminosity (and to the beam energy)
- Procedures for menu evolution are agreed, but still need to be tested in real life
- Expect to increase low-pT trigger prescales to keep the rate under budget
  - Quality cuts can perhaps be used instead of pT cuts, but this may need prior studies with collision data
- Some physics triggers, needed for analyses in channels with low cross section, are "un-prescalable"
[Figure: formal scheme for the approval of new triggers]
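
Since a low-pT chain's rate grows roughly linearly with the instantaneous luminosity, the prescale needed to hold its output rate fixed grows with it. A minimal sketch with illustrative numbers only:

    def prescale_for(lumi, ref_lumi, rate_at_ref_hz, target_hz):
        """Prescale that holds a chain's output rate at target_hz as lumi grows."""
        rate = rate_at_ref_hz * lumi / ref_lumi  # rate scales ~linearly with lumi
        return max(1.0, rate / target_hz)        # a prescale of 1 keeps every event

    for lumi in (1e30, 1e31, 1e32):              # cm^-2 s^-1
        print("%.0e -> prescale %.0f" %
              (lumi, prescale_for(lumi, 1e30, rate_at_ref_hz=20.0, target_hz=10.0)))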

Testing and problem solving
- As learned from the 2008 run, it is essential to thoroughly test new menus and HLT algorithms with real data before online deployment
- It is also important to be able to react quickly to new problems, or risk wasting bandwidth collecting bad data
- The CERN Analysis Facility (CAF) is an essential part of the trigger strategy for this:
  - Used for automatic re-processing of events from the debug stream, where an error condition occurred online, e.g. a time-out during a Level 2 data request
  - Used to test new menus once they are loaded into the trigger configuration database and before they are deployed online
  - Needs to provide access to RAW data from a small number of runs where a problem was identified, until the debugging is completed; this is essential as a last-resort possibility to diagnose urgent problems
- Other debugging tools are provided by:
  - The monitoring histograms produced during event reconstruction at Tier 0
  - The production of commissioning ntuples at Tier 1 for fast analysis of problems (and at Tier 0 for time-critical needs)
  - The "Preseries" subfarm: a set of older HLT nodes serving as a test platform and not used for normal trigger processing
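
The automatic debug-stream re-processing can be pictured as a loop that re-runs failed events offline and buckets them by error condition for expert follow-up. The error labels and the rerun stand-in are assumptions; the real re-processing runs the full HLT software.

    from collections import Counter

    def rerun(event):
        """Stand-in for re-running the HLT offline on one debug-stream event."""
        return event.get("error", "ok")   # e.g. "timeout", "crash", "ok"

    def classify_debug_stream(events):
        """Count events per error condition for expert follow-up."""
        return Counter(rerun(ev) for ev in events)

    events = [{"error": "timeout"}, {"error": "timeout"}, {"error": "crash"}, {}]
    print(classify_debug_stream(events))
    # Counter({'timeout': 2, 'crash': 1, 'ok': 1})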

Conclusions
- Much was achieved in the 2008/09 single-beam and cosmics runs
  - Not least in learning how to deal with a running ATLAS
- The lessons learned from this initial running period were used in planning this year's activities
- Since the trigger is a large and complex system, there is (always!) much more that could be done…
- …But we are ready to take collision data, and need it in order to make progress!
- And, as always, we need to plan for the unexpected!

Backup slides

Stream overlap
[Figure: overlap between trigger streams]

Trigger data in ESD, AOD, DPD and TAG
[Diagram: the trigger configuration and decoded trigger menu are stored alongside the event data; ESD, AOD, DPD and TAG hold the trigger information with decreasing amount of detail]

TriggerTool
- A Java-based front end to the TriggerDB, launched from the web (Java Web Start)
- Overview of all trigger configurations
- Detailed and convenient investigation of trigger menus: the trigger definition L1 -> L2 -> EF, with prescales, thresholds, algorithms, selection criteria, streaming information, etc.
- Possibility to compare different trigger configurations

Web Interface to COOL and the TriggerDB
- Runs the TriggerTool on the server; results are presented as dynamic HTML pages
- Workflow: 1. search a run range -> 2. run list -> 3. browsable trigger configuration (definition, algorithms, selection cuts)
- Also provides simple comparison functionality

Scripts to Check Pool File Content
- checkTrigger.py AOD.pool.root
  - Runs over ESD/AOD/DPD and presents detailed (chain-wise) counts of the trigger decision
- checkTriggerConfig.py -d AOD.pool.root
  - Runs over ESD/AOD/DPD and presents the detailed trigger configuration(s) in the file
  - Shows multiple configurations, which are possible (even likely) in DPDs