
The NOνA Far Detector Data Acquisition System
Jaroslav Zalesak, Institute of Physics of the Czech Academy of Sciences / Fermilab
with Peter Shanahan, Andrew Norman, Kurt Biery, Ronald Rechenmacher, Jonathan Paley, Alec Habig, Susan Kasahara, Denis Perevalov, Mathew Muether, for the NOνA Collaboration
CHEP 2013, Amsterdam, The Netherlands, 14 Oct 2013

The NOνA Experiment
A long-baseline neutrino oscillation experiment: Fermilab (Near Detector) to Ash River (Far Detector), an 810 km baseline.
Goals:
 Measure ν_µ → ν_e oscillations:
  Measure θ₁₃
  Determine the mass hierarchy
  Constrain δ_CP
  Determine the θ₂₃ octant
 Measure ν_µ disappearance:
  Precision measurement of |Δm²₃₂| and sin²2θ₂₃
 Other physics:
  Near Detector neutrino cross-sections
  Sterile neutrinos
  Supernova search
  Monopole search
  Dark matter searches

The NOνA Detectors
Far Detector:
 Surface detector
 14 kton, 60 × 15.6 × 15.6 m³
 896 alternating X-Y planes (344,064 cells)
 "Totally active" calorimeter with 3D tracking
 Liquid-scintillator-filled PVC
 Optimized for EM shower reconstruction and muon tracking
[Figure: FarDet assembly status, 7 Oct 2013]
Near Detector:
 Identical in design to the Far Detector
 Underground detector, 1:4 scale
 Optimized for NuMI cavern rates: 4× sampling-rate electronics
 Construction just began

NOνA Sensor and Front End
Readout chain: active detector → sensor → front-end readout → event building → event output.
Avalanche Photodiodes (APDs):
 32 channels
 85% quantum efficiency
 Gain ~100
 Cooled to −15 °C for 2 PE dark noise
Front End Board (FEB):
 ASIC: amplification, shaping, multiplexing to the ADC
 FPGA: digital signal processing

FEB Signal Processing
ASIC output pulse shaping (FarDet):
 380 ns rise time, 7 µs fall time
 Each channel is sampled every 500 ns
 The actual clock runs at 62.5 ns, with four 8:1 multiplexers
DCS mode (Dual Correlated Sampling):
 Output data when V_i − V_(i−2) exceeds a configured threshold
 Data: pulse height = V_i − V_(i−2), timestamp = T_i
DSO mode (Digital Sampling Oscilloscope):
 Takes (1000) contiguous samples
 Used to measure the noise level and set an appropriate DCS threshold
Near Detector:
 60 ns rise time, 500 ns fall time
 Each channel is sampled every 125 ns
 Same base clock, but 2:1 multiplexing
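The DCS rule is just a two-samples-back difference with a threshold cut. A minimal sketch of that arithmetic in C++, with illustrative names and plain integer samples (this is not the FEB firmware, which applies the rule per channel in the FPGA):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct DcsHit {
        uint64_t timestamp;    // sample index T_i (500 ns ticks at FarDet)
        int16_t  pulseHeight;  // V_i - V_(i-2), in ADC counts
    };

    // Emit a hit whenever the dual-correlated-sampling difference
    // V_i - V_(i-2) exceeds the configured threshold.
    std::vector<DcsHit> dcsSuppress(const std::vector<int16_t>& adc,
                                    int16_t threshold)
    {
        std::vector<DcsHit> hits;
        for (std::size_t i = 2; i < adc.size(); ++i) {
            int16_t diff = adc[i] - adc[i - 2];
            if (diff > threshold)
                hits.push_back({i, diff});  // pulse height and timestamp
        }
        return hits;
    }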

Fast Timing
[Figure: two pulses with identical DCS output (pulse height and time) but different true arrival times]
Multi-point sampling:
 Apply the DCS threshold, but read out multiple contiguous samples
 With compression, up to 12 samples can be read out with only 2 additional data words
 Apply matched filtering to improve the timing resolution substantially
 A factor of 3 improvement was demonstrated in data with 4 points
Benefits:
 Reduces Near Detector pile-up
 FarDet: track direction from timing
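As an illustration of the idea (not the NOvA algorithm itself), a matched-filter timing estimate slides an assumed pulse-shape template across the few readout samples and keeps the offset that fits best. The template time constants below reuse the FarDet shaping numbers above; the 10 ns scan step and all names are assumptions of this sketch:

    #include <array>
    #include <cmath>
    #include <cstddef>

    // Idealized FarDet pulse shape: ~380 ns rise, ~7 us fall.
    double pulseTemplate(double tNs)
    {
        if (tNs <= 0.0) return 0.0;
        return (1.0 - std::exp(-tNs / 380.0)) * std::exp(-tNs / 7000.0);
    }

    // Slide the template over one 500 ns sample period in 10 ns steps and
    // return the pulse start time that best fits the samples (least
    // squares with a free amplitude).
    double fitPulseTimeNs(const std::array<double, 4>& samples)
    {
        const double dtNs = 500.0;  // FarDet sample spacing
        double bestT = 0.0, bestErr = 1e300;
        for (double t0 = 0.0; t0 < dtNs; t0 += 10.0) {
            double sf = 0.0, ff = 0.0, ss = 0.0;
            for (std::size_t k = 0; k < samples.size(); ++k) {
                double f = pulseTemplate(k * dtNs - t0);  // template at sample k
                sf += f * samples[k];
                ff += f * f;
                ss += samples[k] * samples[k];
            }
            double a = (ff > 0.0) ? sf / ff : 0.0;        // best-fit amplitude
            double err = ss - 2.0 * a * sf + a * a * ff;  // residual sum of squares
            if (err < bestErr) { bestErr = err; bestT = t0; }
        }
        return bestT;  // pulse start, in ns after the first sample
    }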

DAQ Requirements
 Record beam spills in a window of at least 30 µs, with no dead time, every 1.33 seconds
 Record ~100 times as many periodic cosmic "spills", also with no dead time
 Cosmic rays (200 kHz!) and noise add up to a hit rate of about 52 MHz, or 620 MB/s
 Process the 620 MB/s readout with no dead time
 Write out saved events at 1.4 MB/s
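The 620 MB/s figure follows from the hit rate under an assumed average of about 12 bytes per hit (an assumption for this back-of-the-envelope check, not a number from the slides):

    52 × 10⁶ hits/s × 12 bytes/hit ≈ 6.2 × 10⁸ bytes/s ≈ 620 MB/s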

NOνA DAQ Layout
One diblock:
 12 DCMs mounted on the detector, 6 per view
 Each DCM reads out a 2-module-wide slice
Data flow (diagram labels): 32 detector-cell readout fibers per APD; one 32-channel APD per FEB; 64 FEBs per DCM; hit data at max 3.2 MB/s; 24 MB/s data stream; timing in/out daisy-chained to the next DCM.
DCM (Data Concentrator Module):
 Designed to aggregate the continuous data streams of 64 front-end boards (2048 detector cells) into a partially time-sorted data block
 Combines a large FPGA with a single-board computer to build 5 ms data blocks
 Runs custom embedded Linux with support for the NOvA firmware
 168 DCMs for the Far Detector
 620 MB/s readout, ~4 MB/s per DCM
DAQ Cluster (buffer farm and control nodes):
 Ash River: computing center in the detector building
 Fermilab: Lattice Computing Center

NOνA DAQ Architecture
(Components connected by a 10-gigabit Ethernet switch.)
DCM readout:
 PowerPC-based custom computers
 Each reads out 64 Front End Boards
 Sends all hits over threshold to the buffer nodes
Buffer farm (event buffering and building):
 O(100) commodity Linux nodes: 16 cores at 2.66 GHz, 2 GB RAM per core, under $1000 per node
 Stores every hit above the zero-suppression threshold for more than 20 seconds
 Collates data from the DCMs
 Sends triggered data to the Data Logger
Data Logger (event output):
 Like the buffer nodes, but with a 22-terabyte RAID array
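A rough buffer-depth check under the stated node count and memory (assuming the readout is spread evenly over 100 nodes):

    620 MB/s × 20 s ≈ 12.4 GB buffered detector-wide
    ≈ 124 MB resident per node across 100 nodes

That is a small fraction of each node's 32 GB (16 cores × 2 GB) of RAM, which is what lets the farm hold hits well beyond the minimum 20 seconds.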

NOνA Trigger Architecture
(Components connected by a 10-gigabit Ethernet switch.)
Global Trigger inputs:
 Neutrino beam spill times
 Random/periodic gates
 Data-driven triggers
Trigger outputs:
 A (T₀, T₀ + ΔT) trigger window
 Sent as a software message to the buffer farm and Data Logger
 No hardware trigger in NOνA!
Buffer farm:
 All data touching the trigger window is sent to the Data Logger
 Data-driven triggers select desirable cosmic events, neutrinos, supernovae, and exotic signatures
Data Logger:
 Receives the trigger as a cross-check on the data from the buffer farm
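The trigger window travels as data, not as an electrical signal. A plausible minimal shape for such a message, with all field names and widths being assumptions of this sketch rather than the actual NOvA message format:

    #include <cstdint>

    // Hypothetical software trigger message: a time window plus bookkeeping.
    struct TriggerMessage {
        uint64_t t0;         // window start, in 64 MHz NOvA clock ticks
        uint32_t deltaT;     // window length in ticks
        uint32_t triggerId;  // source: beam spill, periodic, data-driven, ...
        uint32_t sequence;   // monotonically increasing trigger number
    };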

NOνA Timing System
NOνA clock:
 A 64 MHz clock represented in 56 bits (24+32); covers a 36+ year span in 15.6 ns ticks
 All FEBs/DCMs/TDUs are synchronized via a GPS clock
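The quoted span follows directly from the counter width and the clock rate:

    2⁵⁶ ticks ÷ (64 × 10⁶ ticks/s) ≈ 1.13 × 10⁹ s ≈ 36 years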

DDT Design Overview
Data path (diagram labels, FEBs to output):
 FEBs (368,640 detector channels): 11,160 Front End Boards, zero-suppressed at the FEB (6-8 MeV/cell)
 180 Data Concentrator Modules, sending a 0.75 GB/s minimum-bias data stream over COTS 1 Gb/s Ethernet
 200 buffer nodes (3200+ compute cores): a data buffer of 5 ms data blocks, exposed to the Data Driven Triggers system through a shared-memory DDT event stack and its trigger processors
 Buffer-node pipeline: event builder, data slice pointer table, data time window search, trigger reception, grand trigger OR, triggered data output
 Global Trigger processor: beam-spill indicator (asynchronous), calibration pulser (50-91 Hz), and data-driven trigger decisions, distributed via trigger broadcast
 Data Logger receives the triggered data output
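A conceptual sketch of the data-driven trigger path (illustrative stand-ins only; none of these types or functions are the NOvA DDT API):

    #include <cstdint>
    #include <cstdio>
    #include <optional>
    #include <vector>

    struct DataBlock5ms { uint64_t t0; std::vector<uint32_t> hits; };
    struct TriggerDecision { uint64_t t0; uint32_t deltaT; uint32_t algoId; };

    // Stand-ins for the shared-memory event stack and the link to the
    // Global Trigger; real code would wrap shm and the message system.
    std::optional<DataBlock5ms> popSharedMemoryBlock() { return std::nullopt; }
    void sendToGlobalTrigger(const TriggerDecision& d)
    {
        std::printf("trigger: t0=%llu dT=%u algo=%u\n",
                    static_cast<unsigned long long>(d.t0), d.deltaT, d.algoId);
    }

    // Each trigger process drains 5 ms blocks and emits (T0, dT) decisions.
    void ddtProcessLoop()
    {
        while (auto block = popSharedMemoryBlock()) {
            // Toy "activity" algorithm: fire when a block is unusually busy.
            if (block->hits.size() > 1000)
                sendToGlobalTrigger({block->t0, 320, 1});  // 320 ticks = 5 us window
        }
    }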

Data Concentrator Module
 64 FEBs per DCM; FPGA plus CPU
 Kernel module: serves FPGA data to the CPU as a device driver
DCM application:
 DAQ system interface for control of DCM and FEB settings
 Monitors data quality
 Produces "Microslices": the data between consecutive time markers
 Builds a Millislice for transfer to a buffer node: the collection of all Microslices within 5 ms
Time markers:
 56-bit, 64 MHz time markers; roll over every 36+ years
 Timing markers are sent nominally every 50 µs
 The DCM collates data from all channels with timestamps between consecutive time markers
 Serial FEB-DCM link limit: ~15 hits/FEB per 50 µs
Microslice layout (diagram): header; time marker low; time marker high; then nanoslices (e.g. Nanoslice 0 FEB 0, Nanoslice 1 FEB 0, Nanoslice 0 FEB 2, Nanoslice 0 FEB 3). Hit data arrives as Nanoslices (500 ns).
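Read back against the slide's layout, the framing might be sketched as plain structs (field widths beyond the ones named on the slide are assumptions):

    #include <cstdint>
    #include <vector>

    struct Nanoslice {            // one hit, 500 ns sampling granularity
        uint8_t  feb;             // source FEB (0-63 on this DCM)
        uint8_t  channel;         // channel 0-31 within the FEB
        uint16_t adc;             // DCS pulse height
        uint32_t tdc;             // fine time within the microslice
    };

    struct Microslice {           // all hits between two time markers
        uint32_t header;          // length/flags (contents assumed)
        uint32_t timeMarkerLow;   // lower 32 bits of the 56-bit marker
        uint32_t timeMarkerHigh;  // upper 24 bits of the 56-bit marker
        std::vector<Nanoslice> hits;
    };

    // A Millislice is the collection of all Microslices within 5 ms.
    using Millislice = std::vector<Microslice>;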

Buffer Nodes / Data Logger
Round-robin input:
 All DCMs send to the same buffer node for a given 5 ms period (sequence number)
 With 100 buffer nodes, each buffer node receives data twice each second
MilliBlock:
 The first Millislice seen in a sequence causes creation of a new MilliBlock
 Complete when the last DCM with that sequence number is seen
 A pointer is written to the MilliBlock pool for buffering
 A copy goes to shared memory for analysis by the data-driven trigger
Trigger data search:
 Create a DataBlock for output to the Data Logger
 Check all MilliBlocks in the buffer: does the MilliBlock time overlap the trigger time?
 Copy all MicroBlocks touching the trigger window into the DataBlock
Event (a "Data Atom" until complete):
 Started on receipt of a trigger or a DataBlock with that event number
 Header, trigger info, DataBlocks, tail
 Complete when data has been received from the trigger and all buffer nodes
 Note: the trigger, not the hit data, defines the event; the same data can appear in multiple events.
Output:
 Data is sent to a configurable set of streams (files on disk) based on trigger ID
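The window test at the heart of the trigger data search is a simple interval intersection; a minimal sketch in tick arithmetic (names assumed):

    #include <cstdint>

    struct Window { uint64_t t0; uint64_t dt; };  // 64 MHz clock ticks

    // True if the block's [start, end) time range touches the trigger window.
    bool overlapsTrigger(uint64_t blockStart, uint64_t blockEnd, Window w)
    {
        return blockStart < w.t0 + w.dt && w.t0 < blockEnd;
    }

Because every buffered MilliBlock is tested against every trigger this way, the same hits can satisfy more than one trigger, which is exactly why the same data can appear in multiple events.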

Data Monitoring / Applications
Data Logger node:
 Event Dispatcher: serves data from a shared-memory segment on the Data Logger to the monitoring applications
Monitoring applications:
 Memory Viewer: displays raw data with highlighting to indicate the data block type
 Online Monitoring: histogram producer/viewer
 Event Display

DAQ Control & Monitor Systems
Components (diagram): Run Control, Configuration, Resource Management, Application Manager, DAQ Health Monitor, Message Logger, and Database, connected to the DAQ subsystems through a message passing system built on DDS (OpenSplice).

Summary
 Far Detector commissioning is in progress and Near Detector construction has begun.
 NOνA is achieving its DAQ design goals with a commodity online buffer farm and commodity networking.
 We have built the foundation for a unique free-running DAQ system:
  We understand how to perform continuous readout and buffering
  We have solved the problem of correlation and synchronization → see the talk by Evan Niner
  We are exploring the power and opportunities of Data Driven Triggering → see the poster by Zukai Wang

Backup

Resource Management
Resource Manager:
 Keeps track of which resources are in use (a resource is a DCM or other node)
Partition:
 A totally independent DAQ system
 One partition can be used for taking physics data on an established part of the detector, while another is used for checkout and commissioning

Run Control
 The user selects resources for a partition via Run Control
 Once applications are started, Run Control is used to execute configuration steps and to start/stop runs

Application Management
Application Manager:
 Starts, stops, and monitors applications and Message Service daemons

Message Facility
Message Facility (log messages):
 Used in the offline ART framework
 For online use, it has a server destination using the DDS backbone
 Message rate throttling is run-time configurable
Message Analyzer:
 Configured to recognize patterns of errors, e.g., many DCMs complaining about the same buffer node
 Integrates with automatic error-recovery actions

NOνA DAQ Monitoring
Ganglia:
 Widely used open-source cluster-monitoring package
 A daemon on each host sends metrics to a central server
 Web-based display of plots of metrics vs. time for "Grid", "Cluster", and individual nodes
 Default metrics: CPU use, free memory, etc.
 Extensible to custom metrics
NOvA DAQ Monitor:
 A system for generating warnings and alarms when metrics vary outside configured limits