
Performance of the NOνA Data Acquisition and Data Driven Trigger Systems for the full 14 kt Far Detector
A. Norman, Fermilab Scientific Computing Division, for the NOνA collaboration

NOνA Detectors

Far Detector (surface detector):
–14 kt "totally active", low-Z range stack/calorimeter
–Liquid scintillator filled PVC, 896 alternating X-Y planes
–Optimized for EM shower reconstruction & muon tracking: X₀ ≈ 40 cm, R_M ≈ 11 cm
–Dims: 53x53x180 ft, the "largest plastic structure built by man"
–Started operations May 2013; first beam Aug 2013; completed April 2014

Near Detector (underground detector):
–Identical to the far detector
–Optimized for NuMI cavern rates -- 4x sampling rate electronics
–Completed April 2014

Introduction

NOνA is a unique challenge for trigger & DAQ. At one level it is very simple: there is only one detector technology. But that element is repeated 344,064 times, and all of these are precisely synchronized to each other and to their counterparts 810 km away. Every element is in free-running, continuous waveform readout, and 180 kHz of cosmic rays are constantly lighting up the detector.

The job of the trigger is to examine a 5 ms data window at the NOνA Far Detector. (Figure: 60 m x 15.6 m event display; each pixel is one hit cell, and color shows the charge digitized from the light. Roughly a thousand cosmic rays cross the detector during a readout frame, visible as the many peaks in the accompanying timing distribution.)

Trigger

To find ultra-rare topologies like this: NOνA #1 [Nov. 12, 2013].

Or this…

DAQ/Trigger Configuration

Fully instrumented and operational 14 kt far detector:
–344,064 detection cells
–10,752 front end boards
–168 data concentrators
–45 data buffer / L3 trigger nodes
–5 data logging arrays
–30 timing distribution units
–4 GPS clock systems

(Figure: top view of the completed NOνA far detector; the front end boards & data concentrators (stage 1 event builders) are mounted directly on the detector.)

NOνA DAQ & Trigger Topology

A continuous flood of data flows into the trigger & buffer farm at a peak of ~4.2 GB/s, corresponding to 100% total live readout.
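As a quick cross-check of what the peak rate means per readout frame (simple arithmetic on the numbers above, not a figure from the talk):

```latex
\text{frame size} \approx 4.2\,\text{GB/s} \times 5\,\text{ms} \approx 21\,\text{MB}
```

So at peak, each 5 ms full-detector frame handed to a buffer node is roughly a 21 MB block.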

Readout Data Structure

–Data readout is a continuous series of 5 ms readout frames: a "movie of the detector"
–Minimal data suppression (~6 MeV single-hit threshold)
–There are no gaps between frames
–Each frame is routed to a separate buffer/trigger node for processing (see the sketch below)
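Because frame destinations repeat in a fixed cycle, the routing can be a pure function of the frame number. A minimal C++ sketch of that idea follows; the struct and class names are hypothetical, not the NOvA code.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical sketch: route each 5 ms readout frame to a buffer node in
// round-robin order, so consecutive frames land on different nodes and
// each node sees every Nth frame.
struct Frame {
    uint64_t frameNumber;          // monotonically increasing, no gaps
    std::vector<uint8_t> payload;  // full-detector hits for this 5 ms window
};

class RoundRobinRouter {
public:
    explicit RoundRobinRouter(std::size_t nNodes) : nNodes_(nNodes) {}

    // Destination is a pure function of the frame number, so routing stays
    // consistent without any shared routing state.
    std::size_t destination(const Frame& f) const {
        return f.frameNumber % nNodes_;
    }

private:
    std::size_t nNodes_;
};

int main() {
    RoundRobinRouter router(45);          // 45 buffer nodes, as in the talk
    Frame f{12345, {}};
    return router.destination(f) == 15 ? 0 : 1;  // 12345 % 45 == 15
}
```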

EventBuilder/Buffer to Data Driven Trigger Interface

Primary event builders assemble 5 ms snapshots of the full detector:
–5 ms data blocks need to be passed to the software trigger chain
–Isolation between DAQ and trigger is desired, e.g. a bad trigger algorithm should not affect event building
Sys V IPC shared memory is used as the isolation layer, in a single writer, multiple reader design:
–Implemented as a circular buffer for the writer
–Presented as a fixed length FIFO with destructive reads for the readers (sketched below)
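A minimal sketch of such an isolation layer, assuming a fixed slot count, slot size, and shared memory key (none of which are given on the slide); this is an illustration of the single-writer circular buffer with destructive reads, not the NOvA implementation.

```cpp
#include <sys/ipc.h>
#include <sys/shm.h>
#include <atomic>
#include <cstdint>
#include <cstdio>
#include <cstring>

constexpr std::size_t kSlots    = 16;        // FIFO depth (assumed)
constexpr std::size_t kSlotSize = 1 << 20;   // bytes per 5 ms block (assumed)

// Lives in the shared segment; assumes lock-free std::atomic<uint64_t>,
// which holds on x86-64. A fresh Sys V segment is zero-initialized.
struct EventFifo {
    std::atomic<uint64_t> writeSeq;  // next slot the event builder fills
    std::atomic<uint64_t> readSeq;   // next slot handed to a filter
    char slots[kSlots][kSlotSize];
};

int main() {
    // Writer and readers agree on the key out of band (value is assumed).
    int shmId = shmget(0x4E4F5641, sizeof(EventFifo), IPC_CREAT | 0666);
    if (shmId < 0) { perror("shmget"); return 1; }
    void* mem = shmat(shmId, nullptr, 0);
    if (mem == reinterpret_cast<void*>(-1)) { perror("shmat"); return 1; }
    auto* fifo = static_cast<EventFifo*>(mem);

    // Writer side: circular buffer, overwrite the oldest slot, never block.
    uint64_t w = fifo->writeSeq.load();
    std::memset(fifo->slots[w % kSlots], 0, kSlotSize);  // copy 5 ms block here
    fifo->writeSeq.store(w + 1);

    // Reader side: atomically claim the oldest unread slot, so each block
    // goes to exactly one trigger filter (a destructive read).
    uint64_t r = fifo->readSeq.fetch_add(1);
    if (r < fifo->writeSeq.load()) {
        const char* block = fifo->slots[r % kSlots];
        (void)block;  // run the trigger chain on this block
    }
    shmdt(fifo);
    return 0;
}
```

Because readers only advance an atomic counter in the shared segment, a crashing or misbehaving trigger filter cannot stall or corrupt the event builder, which is exactly the isolation property the design calls for.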

Buffer Node to Data Driven Trigger Interface

(Diagram: the buffer node event builder application fills a ring buffer of 5 ms blocks from the DCM inputs and writes them, oldest to newest, into the shared memory event FIFO; trigger filter applications #1 through #13 perform destructive reads, with an average wait queue of ~1 milliblock; trigger decisions go out to the Global Trigger and accepted data to the datalogger.)

CPU Resource Mapping and Affinity

The 16 CPU cores on each buffer node are task-assigned based on a shield/set model with three affinity sets: System, DAQ, and Trigger. This enforces resource allocations, imposes subsystem separation, and balances the I/O, processing, and communication loads. Operational loads for the three sets are [idle, 6-8%, 99%] respectively.

(Figure, NOνA DAQ buffer node CPU map: CPU 0 on NUMA unit 0 hosts the System set, i.e. the DDS message backbone, DAQ monitoring, and all general system processes; CPUs 1-2 host input and event building, i.e. the buffer node event builder with input I/O requests, search/extract, output I/O, and mostly idle DDS communication; CPUs 3-15 on NUMA units 1-3 host the 13 data driven trigger filter instances, each with blocking input I/O buffering, an analysis thread, and DDS communication on decision.)
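On Linux this shield/set model can be expressed with the standard sched_setaffinity call. The sketch below pins the calling process onto the trigger set's core range; it illustrates the mechanism only and is not the NOvA tooling.

```cpp
// Compile with g++ on Linux (glibc exposes the CPU_* macros and
// sched_setaffinity when _GNU_SOURCE is defined, as g++ does by default).
#include <sched.h>
#include <cstdio>

// Pin the calling process to the inclusive core range [first, last].
static int pinToCores(int first, int last) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int cpu = first; cpu <= last; ++cpu) CPU_SET(cpu, &set);
    return sched_setaffinity(0 /* this process */, sizeof(set), &set);
}

int main() {
    // System set:  CPU 0     (DDS backbone, monitoring, general processes)
    // DAQ set:     CPUs 1-2  (event builder input and building)
    // Trigger set: CPUs 3-15 (13 data driven trigger filter instances)
    if (pinToCores(3, 15) != 0) { perror("sched_setaffinity"); return 1; }
    // ... run or exec a trigger filter instance here ...
    return 0;
}
```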

Cascading Trigger Algorithms

The trigger is designed to take advantage of modularity:
–Each trigger is a series of "paths" through common algorithm modules, building complex triggers from smaller units of work
–Paths are conditionally cascaded so that each module runs at most once, and entire branches can be aborted
–Each final decision is independent, allowing out-of-order completion and a minimized time to decision for each trigger path (a sketch of this memoized cascade follows below)

(Figure: cascade of trigger/reconstruction modules for the upward going muon, magnetic monopole, calibration, and high energy deposition triggers; trigger decision points shown in blue.)
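The run-at-most-once and branch-abort behavior amounts to memoizing module results per data block. A minimal sketch under that reading (the module and type names are hypothetical):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

struct DataBlock { /* 5 ms of detector hits */ };

// Each trigger path is a list of module names; module results are cached
// per block, so a module shared by several paths runs at most once, and a
// failing module aborts the rest of its branch.
class Cascade {
public:
    using Module = std::function<bool(const DataBlock&)>;

    void addModule(const std::string& name, Module m) {
        modules_[name] = std::move(m);
    }

    // Run one trigger path; returns that path's trigger decision.
    bool runPath(const std::vector<std::string>& path, const DataBlock& blk) {
        for (const auto& name : path) {
            auto cached = cache_.find(name);
            bool pass = (cached != cache_.end())
                      ? cached->second                       // reuse result
                      : (cache_[name] = modules_.at(name)(blk));
            if (!pass) return false;                         // abort branch
        }
        return true;                                         // all passed
    }

    void newBlock() { cache_.clear(); }  // reset the memo per 5 ms block

private:
    std::map<std::string, Module> modules_;
    std::map<std::string, bool> cache_;
};
```

For example, two paths sharing a hit-slicing module would pay its cost once per 5 ms block, and a path whose first module returns false never runs its downstream reconstruction.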

Trigger Timing Resolution

–NOνA front end boards now run in multi-point "waveform readout": sparse sampling of the APD input
–A fit for t₀ is performed at data unpack for the trigger, over the full dynamic range, improving timing resolution by 7x over the raw readout
–Uses a 12 bit (int) ➙ 9 bit (FP) encoding and lookup table scheme: a shared 44 MB lookup table covers > 68 B fit solutions at < 1.2% induced error on t₀, for a 15% impact on data unpack
–Single channel time resolution of ~10 ns makes possible high resolution timing on track fits ➙ upward going muon detection

(Figure: the NOνA multi-point readout scheme, showing the fit for t₀ and the fit for pulse height.)
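The slide gives only the 12 bit to 9 bit compression and the table size, so the bit layout below is an assumption (a 3-bit exponent with a 6-bit mantissa); it illustrates why such a scheme works. Note that four 9-bit samples form a 36-bit table key, and 2^36 ≈ 6.9 x 10^10, consistent with the quoted > 68 B fit solutions.

```cpp
#include <cstdint>

// Assumed layout (not confirmed by the slide): compress a 12-bit ADC
// sample to a 9-bit pseudo-float with a 3-bit exponent and 6-bit
// mantissa, shrinking the per-sample key space of the shared t0-fit
// lookup table from 2^12 to 2^9 while keeping the relative quantization
// error at the percent level.
uint16_t encode9(uint16_t adc12) {
    adc12 &= 0x0FFF;                             // 12-bit ADC sample
    uint16_t exp = 0;
    while (adc12 >= 64) { adc12 >>= 1; ++exp; }  // normalize to 6 bits
    return static_cast<uint16_t>((exp << 6) | adc12);
}

// Approximate inverse, used when filling the lookup table.
uint16_t decode9(uint16_t code9) {
    return static_cast<uint16_t>((code9 & 0x3F) << (code9 >> 6));
}

int main() {
    // 4095 encodes to exponent 6, mantissa 63, and decodes back to 4032:
    // ~1.5% quantization error on the sample, the same order as the
    // quoted < 1.2% induced error on the fitted t0.
    return decode9(encode9(4095)) == 4032 ? 0 : 1;
}
```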

Trigger Time to Decision

The time to decision for each 5 ms data block is driven by the total number of buffer nodes N_Buff used in the round robin pattern and the number of filters per node M_filt that are analyzing the data. The current configuration runs 45 buffer nodes with 13 filters per node.

(Figure: NOνA trigger execution time for the leading trigger processes; the total average time to decision is < 1 s.)
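A back-of-envelope reading of those numbers (arithmetic only, not a figure from the talk): frames arrive every 5 ms, so each filter instance receives a new block once per full cycle of the farm, giving a per-block processing budget of

```latex
T_{\text{budget}} = N_{\text{Buff}} \times M_{\text{filt}} \times 5\,\text{ms}
                  = 45 \times 13 \times 5\,\text{ms} \approx 2.9\,\text{s},
```

so the measured average time to decision of < 1 s sits comfortably inside the budget.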

Trigger Time to Decision: Scaling

The trigger is fully integrated with the offline analysis framework, allowing complete characterization of a new trigger's performance. A combination of zero bias readout data and Monte Carlo simulation is used to determine scaling prior to deploying a new trigger.

(Figure: scaling of trigger decision time with detector readout mass [kt]; triggers are characterized based on zero bias readout of the current detector configuration.)

Maximum Time to Decision

The maximum time to decision is set by the total data volume buffered across the DAQ farm. Each buffer node is configured with a 5 GB memory segment dedicated to live data storage and holds ≈ 5.5 s of full detector data; aggregated across the pool, this gives ≈ 4 min of look back.

(Figure: the look back buffer is aggregated across the buffer node pool, giving a maximum of 240 s of latency for trigger decisions.)
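The aggregate follows directly from the per-node figure:

```latex
T_{\text{lookback}} = N_{\text{Buff}} \times T_{\text{node}}
                    = 45 \times 5.5\,\text{s} \approx 250\,\text{s},
```

in line with the quoted maximum of 240 s of trigger decision latency.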

Trigger Live Time

The DAQ readout is 100% live in streaming mode. The trigger operates on the stream, with live time determined from the number of dropped 5 ms data frames that are NOT examined by the trigger. The total trigger live time is balanced against physics sensitivity for the current trigger suite; the average live time is > 0.9.

(Figure: NOνA trigger live time, i.e. the fraction of absolute wall time that the trigger was able to process to decision across all paths; the visible dips are annotated as a computing center cooling failure, a retuning of the monopole triggers, and a retuning of the upward going triggers.)
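In terms of frame counts, the live time quoted here is simply

```latex
\text{live time} = 1 - \frac{N_{\text{dropped}}}{N_{\text{total}}},
```

where both counts are of 5 ms data frames over the accounting period.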

NOνA Global Trigger Design

The Global Trigger aggregates multiple trigger sources through a prioritized queue structure. It permits, independently per trigger and per trigger group:
–Prioritization
–Prescaling
–Throttling
This allows for balancing between:
–High level software DDTs
–External beam triggers
–Pulsers (hardware/software)
–Readout windows from 50 μs to 60 s

(Figure: NOνA trigger rate balancing and stability for select trigger groups.)

NOνA Global Trigger Design: Architecture

(Diagram: trigger threads for "Beam", "Data Driven", and "Internal/Pulser" sources, plus external sources, feed a trigger reception/generation pool in the main application thread; requests land in prioritized FIFOs (High, Medium, Low, Lower) that a queue servicing agent drains with prioritized/balanced servicing; a DDS sender distributes the resulting trigger messages to the data logger and to the buffer farm nodes.)
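A compact sketch of how such prioritized queues with prescaling and throttling could look (the class, field names, and prescale scheme are assumptions for illustration, not the NOvA code):

```cpp
#include <array>
#include <cstdint>
#include <deque>
#include <map>

struct TriggerRequest {
    uint64_t t0;      // requested readout start time
    uint64_t window;  // readout length (50 us to 60 s in the NOvA suite)
    int      source;  // which trigger produced the request
};

// Requests from the beam, data driven, and pulser threads land in
// per-priority FIFOs, with a per-source prescale applied on submission;
// the servicing agent always drains the highest non-empty priority.
class GlobalTriggerQueues {
public:
    // Keep 1 of every `prescale` requests from this source; prescale == 0
    // throttles the source entirely.
    void submit(const TriggerRequest& r, int priority, uint32_t prescale) {
        if (prescale == 0 || (counters_[r.source]++ % prescale) != 0) return;
        queues_[priority].push_back(r);
    }

    // Servicing agent: pop from the highest non-empty priority FIFO.
    bool next(TriggerRequest& out) {
        for (auto& q : queues_) {
            if (!q.empty()) { out = q.front(); q.pop_front(); return true; }
        }
        return false;
    }

private:
    std::array<std::deque<TriggerRequest>, 4> queues_;  // High..Lower
    std::map<int, uint64_t> counters_;                  // per-source counts
};

int main() {
    GlobalTriggerQueues gt;
    gt.submit({0, 550, /*source=*/1}, /*priority=*/0, /*prescale=*/1);
    TriggerRequest r{};
    return gt.next(r) ? 0 : 1;
}
```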

DAQ Uptime

DAQ Uptime

(Figure: acquired far detector neutrino exposure for the first 12 months of physics running.)

Beam Timing and Sync

(Figure: NOνA Near Detector beam profile as a function of the time t between the trigger time t₀ and the time of the observed hits in the detector. The beam crossing was computed to occur at a fixed offset after the trigger time t₀. The 6-batch structure of the extracted NuMI beam is evident.)

Far Detector Observation

(Figure, NOνA Preliminary: observed time profile of candidate neutrino events at the NOνA Far Detector from March 2014 to March 2015. The excess seen in the spectrum corresponds to the time interval, corrected for flight times and for electronic and signal propagation delays, of the predicted far detector beam crossing.)

Summary

The NOνA experiment has successfully created, commissioned, and is operating a continuous, free-running, dead-timeless DAQ system that scales to read out the full 14 kt NOνA far detector. The experiment has also successfully created a high level "data driven trigger" system capable of analyzing the full, unfiltered NOνA data stream and reading out trigger windows from 50 μs to 1 minute in length. These combined systems have so far collected over 5.7 billion events, which are currently under analysis.