CLAS12 DAQ, Trigger and Online Computing Requirements


CLAS12 DAQ, Trigger and Online Computing Requirements
Sergey Boyarinov, Sep 25, 2017

Notation

ECAL – old EC (electromagnetic calorimeter)
PCAL – preshower calorimeter
DC – drift chamber
HTCC – high threshold Cherenkov counter
FT – forward tagger
TOF – time-of-flight counters
MM – Micromegas tracker
SVT – silicon vertex tracker
VTP – VXS trigger processor (used on trigger stages 1 and 3)
SSP – subsystem processor (used on trigger stage 2)
FADC – flash analog-to-digital converter
FPGA – field-programmable gate array
VHDL – VHSIC Hardware Description Language (used to program FPGAs)
Xilinx – FPGA manufacturer
Vivado_HLS – Xilinx High Level Synthesis (translates C++ to VHDL)
Vivado – Xilinx Design Suite (translates VHDL to a binary image)
GEMC – CLAS12 GEANT4-based Simulation Package
CLARA – CLAS12 Reconstruction and Analysis Framework
DAQ – Data Acquisition System

CLAS12 DAQ Status

Online computer cluster is 100% complete and operational
Networking is 100% complete and operational
DAQ software is operational; reliability is acceptable
Performance observed (KPP): 5 kHz, 200 MByte/s, 93% livetime (livetime is defined by the hold-off timer of 15 microseconds)
Performance expected: 10 kHz, 400 MByte/s, >90% livetime
Performance limit expected: 50 kHz, 800 MByte/s, >90% livetime (based on Hall D DAQ performance)
Electronics installed (and used for KPP): ECAL, PCAL, FTOF, LTCC, DC, HTCC, CTOF, SVT
TO DO: electronics installed recently or still to be installed: FT, FTMM, CND, MM, scalers, helicity
TO DO: CAEN v1290/v1190 board (re)calibration
TO DO: full DAQ performance test

DAQ/Trigger Hardware Layout

INSTALLED: 66 crates, 88 readout controllers

Runcontrol: setting DAQ/Trigger parameters

CLAS12 DAQ Commissioning

Before-beam performance test: run from a random pulser trigger with low channel thresholds; measure event rate, data rate and livetime
Cosmic data: run from the electron trigger (ECAL+PCAL and ECAL+PCAL+DC); obtain data for offline analysis to check data integrity and monitoring tools
Performance test with beam: run from the electron trigger; measure event rate, data rate and livetime with different beam conditions and DAQ/Trigger settings
Timeline: before-beam tests – one month (November); beam test – one shift; we assume that most problems will be discovered and fixed before the beam test

CLAS12 Trigger Status

All available trigger electronics is installed: 25 VTP boards (stages 1 and 3 of the trigger system) in the ECAL, PCAL, HTCC and FT crates, the Drift Chamber crates (R3 for all sectors, R1/R2 for sector 5) and the main trigger crates; 9 SSP boards in stage 2
20 more VTP boards have arrived and are being installed in the remaining DC and TOF crates; should be ready by October 1
All C++ trigger algorithms are completed, including ECAL, PCAL, DC and HTCC; FT modeling is still to be done
FPGA implementation is completed for ECAL, PCAL and DC; HTCC and FT are in progress
A 'pixel' trigger is implemented in ECAL and PCAL for cosmic calibration purposes
Validation procedures are under development
Experienced delays in trigger development; now back on track

CLAS12 Trigger System Layout

VTP for FT (3); HTCC is not a "central" detector

PCAL+DC trigger event example

CLAS12 Trigger Components Summary

Stage | Name   | Clock (ns) | Algorithm                                       | Input   | Output to trigger | Output to datastream | Control registers
1     | ECAL   | 32         | Cluster finding with energy correction          | FADC    | 4 clusters        | List of clusters     | Thresholds, windows
1     | PCAL   | 32         | Cluster finding with energy correction          | FADC    | 4 clusters        | List of clusters     | Thresholds, windows
1     | DC     |            | Segment finding                                 | DCRB    | Segment mask      |                      | Thresholds, windows
1     | HTCC   | 4          | Cluster finding                                 |         | Cluster mask      |                      |
1     | FT     |            |                                                 |         |                   |                      |
2     | sector |            | Stage 1 components coincidence, DC road finding | stage 1 | Component mask    |                      | Coincidence logic, windows
3     | global |            | Stage 2 components coincidence                  | stage 2 | 16-bit trigger    |                      |

NOTE: FTOF, CTOF and CND can be added to trigger stage 1

CLAS12 Triggers

Tr. # | Notation | Reaction      | Final State   | Trigger                              | Comment
1     | R1       |               | Pulser        | Random / regular                     | DAQ
2     | R2       |               | Random Tr.    | Faraday cup                          | Background
3     | E1       | ep->e'X       | e'(CLAS)      | PCAL*ECtot*(DCtrack)                 | Test HTCC efficiency
4     | E2       |               |               | HTCC*E1                              | Inclusive electron trigger
5     | FT       |               | e'(FT)        | FTcluster(Emin,Emax)                 | FT only, prescaled
6     | FTHODO   |               |               | FT*HODO1*HODO2                       | FT and HODO1,2, prescaled
7     | H1       | ep->e'h+X     | 1 hadron      | FTOF1a*FTOF2*PCAL(DCtrack)           | One hadron, prescaled
8     | H2       | ep->e'2h+X    | 2 hadrons     |                                      | Two hadrons, prescaled
9     | H3       | ep->e'3h+X    | 3 hadrons     |                                      | Three hadrons, prescaled
10    | mu2      | ep->e'p+m+m-  | p+m+m-        | FTOF1a*FTOF2*[PCAL+ECtot]*(DCtrack)  | m+m- + hadron
11    | FT1      | ep->e'+h      | e'(FT)+h+X    | FThodo*H1                            | FT + 1 hadron, prescaled
12    | FT2      | ep->e'+2h     | e'(FT)+2h+X   | FThodo*H2                            | FT + 2 hadrons, prescaled
13    | FT3      | ep->e'+3h     | e'(FT)+3h+X   | FThodo*H3                            | FT + 3 hadrons

Trigger Configuration File

VTP_CRATE adcecal1vtp                    # trigger stage 1
VTP_ECINNER_HIT_EMIN 100                 # strip energy threshold for 1-dim peak search
VTP_ECINNER_HIT_DT 8                     # +-8 clk = +-32 ns: strip coincidence window to form 1-dim peaks,
                                         # and peak-coincidence window to form 2-dim clusters
VTP_ECINNER_HIT_DALITZ 568 584           # x8: 71<dalitz<73
.....
SSP_CRATE trig2                          # trigger stage 2
SSP_SLOT all
SSP_GT_STRG_ECALIN_CLUSTER_EMIN_EN 0 1   # enable ECAL_INNER clusters in trigger
SSP_GT_STRG_ECALIN_CLUSTER_EMIN 0 2000   # ECAL_INNER cluster threshold
SSP_GT_STRG_ECALIN_CLUSTER_WIDTH 0 100   # coincidence window to coincide with PCAL etc.
.....
VTP_CRATE trig2vtp                       # trigger stage 3
# slot:    10 13 09 14 08 15 07 16 06 17 05 18 04 19 03 20
# payload: 01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16
VTP_PAYLOAD_EN 0 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0
VTP_GT_LATENCY 6780                      # 6780 corresponds to 7900 FADC latency
# sector bits: trig number, ssp trig mask, ssp sector mask, multiplicity,
# coincidence = number of extended clock cycles
VTP_GT_TRGBIT 8 1 1 1 1
.....

There is a plan to provide a GUI for handling configuration files.

CLAS12 Trigger Validation

All stage 1 trigger algorithms are either implemented in C++ (ECAL, PCAL, HTCC) or have a C++ emulation of the hardware implementation (FT – to do)
Higher stages will be emulated – under discussion, likely the same way as stage 1
The validation process has two steps:
1. Comparison between the C++ implementation and the hardware response
2. Comparison between the C++ implementation and GEANT and reconstruction results

FPGA Image Building Chain and Validation Step 1: Comparison between C++ implementation and hardware response (cosmic and beam)

Trigger C++ -> Xilinx Vivado_HLS -> .VHDL -> Xilinx Vivado -> .bin (FPGA image) -> DAQ -> .evio -> Validation Step 1

Xilinx Vivado_HLS (C++ to VHDL)

Xilinx Vivado (VHDL to FPGA image)

Validation Step 1: Comparison between C++ implementation and hardware response

/work/boiarino/data/vtp1_001212.evio.0, event 3:

FPGA implementation data bank:
TRIG PEAK [0][0]: coord=298 energy=1610 time=7
TRIG PEAK [1][0]: coord=174 energy=1012 time=7
TRIG PEAK [1][1]: coord=174 energy=294 time=8
TRIG PEAK [2][0]: coord=447 energy=1226 time=7
TRIG HIT [0]: coord=298 174 447 energy=4936 time=7

C++ implementation using FADC data:
U: IT=7 peakU[0]: coord=298 energy=1610
V: IT=7 peakV[0]: coord=174 energy=1012
W: IT=7 peakW[0]: coord=447 energy=1226
SOFT HIT: coord=298 174 447 energy=4936

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA

GEMC or DAQ (.evio) -> Trigger C++ -> CLARA Offline Reconstruction

NOTE: this step allows the Offline Reconstruction team to get ready for processing trigger results during beam time

C++ trigger on GEANT data – ECAL event example

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA (cont.)

ECAL cluster finding: difference between trigger and offline

Validation Step 2: Comparison between C++ implementation and GEANT/CLARA (cont.)

ECAL cluster finding: 5 GeV electron, no PCAL, energy deposition, offline and C++ trigger. Sampling fraction: 27%

CLAS12 Trigger Commissioning

Validation step 1 (cosmic): run from the electron trigger (ECAL+PCAL and ECAL+PCAL+DC) and compare hardware trigger results with C++ trigger results
Validation step 2 (GEANT): process GEANT data (currently contains ECAL and PCAL; HTCC and others will be added) through the C++ trigger and produce trigger banks for subsequent offline reconstruction analysis
Validation step 1 (beam): run from the electron trigger (ECAL+PCAL, ECAL+PCAL+DC, ECAL+PCAL+HTCC) and compare hardware trigger results with C++ trigger results
Take beam data with the random pulser – for trigger efficiency studies in offline reconstruction
Take beam data with the electron trigger (ECAL+PCAL [+HTCC] [+DC]) at different trigger settings – for trigger efficiency studies in offline reconstruction
Take beam data with the FT trigger – for trigger efficiency studies in offline reconstruction
Timeline: cosmic and GEANT tests – underway until beam time; beam validation – one shift, after which data taking for detector and physics groups begins

Online Computing Status

Computing hardware is available for most online tasks (runtime databases, messaging system, communication with EPICS, etc.)
There is no designated 'online farm' for data processing in real time; two hot-swap DAQ servers can be used as a temporary solution, and part of the JLab farm is being considered as a designated online farm for CLAS12
Available software (some work still needed): process monitoring and control, CLAS event display, data collection from different sources (DAQ, EPICS, scalers, etc.) with recording into the data stream, hardware monitoring tools, data monitoring tools
The online system will be commissioned together with the DAQ and Trigger systems; no special time is planned

Online Computing Layout: about 110 nodes

CLAS12 Data Flow (online farm proposed)

Counting House: ROCs (~110) -> EB -> ET -> ER -> disk; data monitoring (2 servers)
40Gb links to the Computer Center: online farm, disk/tape storage, data monitoring results

RCDB (Runtime Condition Database) - operational

Messaging – operational

The old SmartSockets system was replaced with ActiveMQ
ActiveMQ is installed on the clon machines and tested on all platforms where it is supposed to be used (including DAQ and trigger boards)
Critical components were converted to ActiveMQ; more work is needed to complete the conversion, mostly on the EPICS side
A messaging protocol was developed that allows switching to a different messaging system if necessary (in particular 'xMsg' from the CODA group)

Conclusion

CLAS12 DAQ, computing and networking were tested during the KPP run and worked as expected; reliability meets CLAS12 requirements, and performance looks good, but final tests have to be done with the full CLAS12 configuration (waiting for the remaining detector electronics and the complete trigger system)
The trigger system (including ECAL and PCAL) worked as expected during KPP. Other trigger components (DC, HTCC and FT) were added after KPP and are being tested with cosmics; the remaining hardware has been received and is being installed
Online software development is still in progress, but the available tools are sufficient to run
Timeline: a few weeks before beam, the full system will be commissioned by taking cosmic data and running from the random pulser; two shifts are planned for beam commissioning