Discussion slides, Dual-Phase ProtoDUNE, 25/2/2016

General requirements and architecture

Double-phase liquid argon TPC, 6x6x6 m3 active volume:
- Drift field E = 0.5 kV/cm between the cathode and the segmented anode; charge amplification in the gas phase above the LAr volume (double-phase amplification).
- X and Y charge collection strips, 3.125 mm pitch, 3 m long → 7680 readout channels.
- Drift coordinate: 6 m drift = 4 ms; sampling at 2.5 MHz (400 ns), 12 bits → 10000 samples per drift window.
- Prompt UV scintillation light read out by photomultipliers; dE/dx from the ionization charge.
- Event size: one drift window of 7680 channels x 10000 samples ≈ 146.8 MB.
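
As a cross-check of the figures above, a few lines of Python reproduce the samples-per-drift-window and raw event-size numbers. The only assumption is that each 12-bit sample is stored on 2 bytes, which is what reproduces the 146.8 MB event size quoted in the data-size slide later on:

    # Raw event size for one drift window of the 6x6x6 dual-phase TPC.
    n_channels   = 7680        # X and Y collection strips
    f_sampling   = 2.5e6       # Hz, i.e. one sample every 400 ns
    t_drift      = 4e-3        # s, full 6 m drift
    bytes_sample = 2           # assumption: 12-bit sample stored on 16 bits

    n_samples  = int(f_sampling * t_drift)               # 10000 samples
    event_size = n_channels * n_samples * bytes_sample    # bytes

    print(n_samples)                 # 10000
    print(event_size / 2**20)        # ~146 MiB, i.e. the ~146.8 MB quoted later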

Accessible cold front-end electronics and uTCA DAQ system:
- Cryogenic ASIC amplifiers (CMOS 0.35 um), 16 channels each, externally accessible: they operate at 110 K at the bottom of the signal chimneys, on cards fixed to a plug accessible from the outside → short cable capacitance, low noise at low temperature.
- Digital electronics at warm on the tank deck: architecture based on the uTCA standard, 1 crate per signal chimney, 640 channels per crate → 12 uTCA crates, 10 AMC cards/crate, 64 channels/card (4 ASICs of 16 channels, CMOS 0.35 um).
- Full accessibility is made possible by the double-phase charge readout located at the top of the detector. (Figure: signal chimney with the uTCA crate at warm and the CRP at cold.)

Dual-phase ProtoDUNE detector characteristics:
- Two views with 3.125 mm pitch → 7680 channels
- Long drift, 4 ms → 10000 samples at 2.5 MHz
- High S/N ~ 100 → all electronics at warm, accessible
- Cost minimization: massive use of commercial large-bandwidth standards from the telecommunication industry (uTCA, Ethernet networks) and of massive computing
- Easy to follow the technological evolution and to benefit from cost reductions and performance increases in the long term
- The non-zero-suppressed data flow is handled up to the computing-farm back-end, which takes care of the final part of the event building, data filtering, online processing for data quality, purity and gain analysis, local data buffering, and data formatting for transfer to EOS storage in files of a few GB
- Lossless signal compression benefits from the high S/N ratio; an optimized version of the Huffman code has been developed, reducing the data volume by at least a factor 10 (illustrated in the sketch below)
- Timing and trigger distribution scheme based on White Rabbit (now also available as commercial hardware), designed from the beginning for a beam application (it handles beam window signals, beam trigger counters, and external trigger counters for cosmics)
- The components of the timing chain have been purchased, and a uTCA slave card for signal distribution on the crate backplanes has been developed
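
A minimal, purely illustrative sketch of the delta-plus-Huffman idea behind that factor 10 (this is not the WA105 firmware; the toy waveform, the noise level and the helper names are invented). Consecutive samples of a quiet channel differ by only a few ADC counts, so their differences can be Huffman-coded in far fewer than 12 bits:

    # Illustrative delta + Huffman compression of non-zero-suppressed ADC data.
    # Not the WA105 implementation: a sketch of the general technique only.
    import heapq
    import random
    from collections import Counter
    from itertools import count

    def huffman_code_lengths(symbol_counts):
        """Return {symbol: Huffman code length in bits} for the given counts."""
        tiebreak = count()
        heap = [(n, next(tiebreak), [s]) for s, n in symbol_counts.items()]
        heapq.heapify(heap)
        lengths = dict.fromkeys(symbol_counts, 0)
        if len(heap) == 1:                       # degenerate single-symbol case
            return dict.fromkeys(symbol_counts, 1)
        while len(heap) > 1:
            n1, _, syms1 = heapq.heappop(heap)
            n2, _, syms2 = heapq.heappop(heap)
            for s in syms1 + syms2:              # each merge adds one bit of depth
                lengths[s] += 1
            heapq.heappush(heap, (n1 + n2, next(tiebreak), syms1 + syms2))
        return lengths

    def compressed_size_bits(samples):
        """Bits needed to Huffman-encode the sample-to-sample differences."""
        deltas = [b - a for a, b in zip(samples, samples[1:])]
        counts = Counter(deltas)
        lengths = huffman_code_lengths(counts)
        return sum(counts[d] * lengths[d] for d in counts)   # code table neglected

    # Toy waveform: flat baseline with small noise, as on a quiet channel.
    random.seed(0)
    waveform = [2048 + random.randint(-2, 2) for _ in range(10000)]
    raw_bits = 12 * len(waveform)
    print(raw_bits / compressed_size_bits(waveform))   # compression factor ~4 here

On this toy white-noise waveform the gain is only a factor of a few; on real double-phase data, with smaller noise fluctuations in absolute ADC counts, the optimized code reaches the factor 10 quoted above.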

- FE based on the microTCA standard and 10 Gbit/s Ethernet
- The 120 uTCA digitization boards for the 6x6x6 have been in production since the end of last year and will be available this summer
- Light readout fully integrated in the charge-readout uTCA scheme, with different operation modes in-spill and out-of-spill
- Back-end currently based on two commercial Bittware cards (already purchased) with 8 x 10 Gbit links each, high computing power and event-building capabilities; each card performs the event building for half of the detector charge readout, and one of the two cards also deals with the light signals
- The online storage and computing facility is an important part of the system; a possible implementation has been designed and costed with DELL, and it will be implemented on a smaller scale for the 3x1x1 by September 2016
- A 20 Gbit/s data link is foreseen for data transfer to the computing division

uTCA charge readout architecture: readout in groups of 640 channels per signal chimney
- 640 channels/chimney → 1 uTCA crate/chimney with a 10 Gb/s link (see the check below)
- 10 AMC digitization boards per uTCA crate, 64 readout channels per AMC board
- 12 uTCA crates for charge readout + 1 uTCA crate for light readout
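
A quick order-of-magnitude check that one 10 Gb/s link per 640-channel crate is adequate before compression. The assumptions here (100 Hz of beam triggers, one full 4 ms drift window read out per trigger, and the two sample-packing options shown) are ours:

    # Per-chimney throughput at the 100 Hz design trigger rate, no compression.
    channels_per_crate = 640          # 10 AMC boards x 64 channels
    samples_per_event  = 10000        # 4 ms drift window at 2.5 MHz
    trigger_rate       = 100          # Hz, design maximum for beam triggers

    packed_gbps = channels_per_crate * samples_per_event * 12 * trigger_rate / 1e9
    padded_gbps = channels_per_crate * samples_per_event * 16 * trigger_rate / 1e9

    print(packed_gbps)   # ~7.7 Gb/s with 12-bit packed samples
    print(padded_gbps)   # ~10.2 Gb/s if each sample is padded to 16 bits

Either way the uncompressed flow sits close to the 10 Gb/s capacity of the crate link, consistent with the emphasis on lossless compression in the previous section.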

DAQ timing-trigger integration (block diagram): the 12 charge-readout crates CC 1 to CC 12 and the PMT light-readout crate (PMC) send their digitized data over 10 GbE links to the two back-end Bittware cards, hosted in data processing PC 1 and data processing PC 2; Bittware card 1 serves CC 1-CC 6 (6 links + 2 spares) and Bittware card 2 serves CC 7-CC 12 plus the PMC (7 links + 1 spare). A Meinberg GPS disciplines the White Rabbit GrandMaster switch, which distributes clock, absolute time and triggers to the WR slave MCH mezzanines of the crates and to a WR slave trigger board in the trigger PC. The trigger board time-stamps the beam window NIM signal (every ~20 s), the beam trigger counter NIM signal (~a few 100 Hz) and the cosmics counters.


Front-end electronics

uTCA charge readout architecture (same scheme as in the general architecture section): readout in groups of 640 channels per signal chimney
- 640 channels/chimney → 1 uTCA crate/chimney with a 10 Gb/s link
- 10 AMC digitization boards per uTCA crate, 64 readout channels per AMC board, with a per-channel memory buffer for the drift-window samples
- 12 uTCA crates for charge readout + 1 uTCA crate for light readout

Prototype of the 64-channel AMC digitization card (12 bits, 2 V dynamic range, AD5297), hosted on a Bittware S4AM, with 10 GbE output on the uTCA backplane.

Typical event signature for surface operation of a liquid argon TPC: the "belt conveyor" effect, +-4 ms around the beam trigger time. (Figure: reconstructed event and drift direction shown at t = beam trigger - 2 ms, t = beam trigger, and t = beam trigger + 2 ms, illustrating how tracks slide along the drift coordinate.)

- During spills a continuous digitization of the light is needed in the +-4 ms around the trigger time (the light signal is prompt and keeps memory of the real arrival time of the cosmics)
- The sampling can be as coarse as 400 ns, just to correlate with the charge readout: sum 16 samples taken at 40 MHz to obtain an effective 2.5 MHz sampling, like for the charge readout (sketched below)
- The LRO card has to know whether it is in spill or out of spill
- Out of spill it can generate self-triggered light triggers when "n" PMTs are over a given threshold, and transmit their time-stamps over the WR network
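
A minimal sketch of that 16-sample summation (the function name and the NumPy implementation are illustrative, not the LRO firmware):

    import numpy as np

    def rebin_light_waveform(samples_40mhz: np.ndarray) -> np.ndarray:
        """Sum groups of 16 consecutive 40 MHz samples to obtain an
        effective 2.5 MHz (400 ns) sampling, matching the charge readout."""
        n = (len(samples_40mhz) // 16) * 16          # drop any incomplete group
        return samples_40mhz[:n].reshape(-1, 16).sum(axis=1)

    # Example: +-4 ms of light data around the trigger, digitized at 40 MHz.
    waveform = np.random.poisson(5, size=int(8e-3 * 40e6))
    print(rebin_light_waveform(waveform).shape)      # (20000,) samples at 2.5 MHz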

DAQ timing-trigger integration (same block diagram as in the general architecture section).

Back-end

DAQ timing-trigger integration (same block diagram as in the general architecture section).

Back-end Bittware card: 8 x 10 Gbit links, high computing power in the FPGA, OpenCL programming, event building.
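
To make the event-building role concrete, here is a hedged Python sketch of the basic bookkeeping; the fragment format, the class names and the completeness criterion are invented for illustration, while the real implementation runs in the card's FPGA under OpenCL:

    # Toy event builder: group data fragments from several input links by
    # trigger number and emit a complete event once all links have reported.
    from collections import defaultdict
    from dataclasses import dataclass, field

    @dataclass
    class Fragment:
        link_id: int        # which uTCA crate the fragment came from
        trigger_id: int     # trigger number assigned by the timing system
        payload: bytes      # digitized samples of that crate's 640 channels

    @dataclass
    class EventBuilder:
        n_links: int                                  # links handled by this card
        pending: dict = field(default_factory=lambda: defaultdict(dict))

        def add_fragment(self, frag: Fragment):
            """Store a fragment; return the built event when all links reported."""
            self.pending[frag.trigger_id][frag.link_id] = frag.payload
            if len(self.pending[frag.trigger_id]) == self.n_links:
                parts = self.pending.pop(frag.trigger_id)
                return b"".join(parts[link] for link in sorted(parts))
            return None

    # One card handles half of the charge readout (6 crates) in this toy setup.
    builder = EventBuilder(n_links=6)
    for link in range(6):
        event = builder.add_fragment(Fragment(link, trigger_id=1, payload=b"\x00" * 4))
    print(len(event))    # 24 bytes: the 6 fragments of trigger 1 concatenated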


Trigger and timing

DAQ timing-trigger integration (diagram): beam triggers from the SPS tertiary beam line (~100 Hz) pass through a trigger counter and, together with the start-of-spill signal (cable), are shaped in a crate with NIM logic and sent to a PC with a WR time-stamping card. That PC produces time stamps for the beam trigger, the start of spill, the external cosmic trigger (from an external trigger plane for cosmic muons) and calibration pulses. The WR switch (optical links, 1 Gbps) then distributes time and trigger information over 1 WR link to the light readout (the uTCA crate of the LRO reading the PMTs) and over 12 WR links to the charge readout.

- White Rabbit, developed for the synchronization of the CERN accelerator chain, offers sub-ns synchronization over ~10 km distances; it is based on the PTP + synchronous Ethernet scheme developed from 2008 onwards
- White Rabbit chains can now be set up with commercial components: a network based on a Grand Master switch, time-tagging cards for external triggers, and slave nodes in piggy-back configuration to interface to uTCA
- Synchronization and trigger data are transmitted over the WR network together with the clock
- The slave uTCA nodes propagate clock + sync + trigger information on the uTCA backplane, so that the FE digitization cards are aligned in sampling, know the absolute time and can compare it with the time of the transmitted triggers (see the sketch below)
- The FE knows the spill and off-spill times and can set up different operation modes
- Trigger timestamps may be generated by the beam counters, the cosmic counters, or the light readout system in uTCA
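
A minimal sketch of that comparison between the absolute WR time and a transmitted trigger timestamp; the fixed-period assumption and the function name are ours, not a WA105 specification:

    # Locate a WR trigger timestamp inside the continuously sampled data of
    # one digitization card, assuming the card has latched the WR time of its
    # first buffered sample and samples at exactly 2.5 MHz thereafter.
    SAMPLING_PERIOD_NS = 400                     # 2.5 MHz charge-readout sampling

    def trigger_to_sample_index(trigger_time_ns: int, first_sample_time_ns: int) -> int:
        """Index of the sample acquired at (or just before) the trigger time."""
        dt = trigger_time_ns - first_sample_time_ns
        if dt < 0:
            raise ValueError("trigger predates the buffered data")
        return dt // SAMPLING_PERIOD_NS

    # Example: a beam trigger arriving 1.2345 ms after the start of the buffer.
    print(trigger_to_sample_index(1_234_500, 0))     # -> sample 3086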

DAQ timing-trigger integration (same block diagram as in the general architecture section).

Commercial components for the WR network:
- WR Grand Master switch WRS-3/18: 18 SFP fiber cages, 1 GbE, supports connections over up to 10 km; connected to a Meinberg GPS as time source
- PC time-tagging trigger board: SPEC FMC PCIe carrier (v4) + FMC Fine Delay (1 ns, 4 channels); it can time-stamp with 1 ns accuracy the beam trigger signals or external large-area scintillator counters, and the trigger time-stamp data are transmitted over the WR network to the microTCA MCH
- WR-LEN (White Rabbit Lite Embedded Node): WR slave node cards, the basis of the WA105 development of a WR uTCA mezzanine for the MCH
- Test bench with Meinberg GPS + switch + slave node

Online storage and processing

Data size
- Data are expected to be taken without zero skipping, exploiting lossless compression; the system has been designed to support up to 100 Hz of beam triggers with neither zero skipping nor compression
- 7680 channels, 10k samples in a drift window of 4 ms → 146.8 MB/event, no zero skipping
- Beam rate 100 Hz → data flow = 14.3 GB/s
- The light readout does not change this picture significantly (<0.5 GB/s)
- Integrated local DAQ bandwidth on the "20 GB/s" scale
- 1 day of data ~ 1000 TB with no zero skipping and no compression (assuming a 50% beam duty cycle + contingency)
- Huffman lossless compression can reduce the non-zero-skipped charge readout data volume by at least a factor 10 (S/N for double phase ~ 100:1, small noise fluctuations in absolute ADC counts)
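
These figures can be reproduced with a few lines of arithmetic, under the same 2-bytes-per-sample assumption as before, a 50% duty cycle, and binary prefixes for the MB and GB values:

    # Data-size arithmetic for the numbers quoted above.
    event_size_bytes = 7680 * 10000 * 2          # ~146.8 MB per drift window
    beam_rate_hz     = 100
    duty_cycle       = 0.5                       # fraction of time with beam

    inst_flow_bps = event_size_bytes * beam_rate_hz            # bytes/s in spill
    per_day_bytes = inst_flow_bps * duty_cycle * 86400

    print(inst_flow_bps / 2**30)   # ~14.3 GB/s instantaneous data flow
    print(per_day_bytes / 1e12)    # ~660 TB/day raw; ~1000 TB/day with contingency
    print(per_day_bytes / 1e13)    # ~66 TB/day after factor-10 Huffman compression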

Online storage
- The B/E boards are hosted in the event-building workstations, which transfer the data at 40 Gbps to the local storage/processing system (two links per workstation)
- A standard 10/40 Gbps switch interconnects all these elements and sends the data to be permanently stored to the CERN EOS/CASTOR system
- The online storage/processing system fulfills the CERN requirement of a local buffer equivalent to a few days of data for the experiment DAQ system
- The online processing/storage system includes 15 disk servers (Object Storage Servers, OSS; DELL R730xd in the reference design) for a total capacity of the order of 1 PB, 2 redundant MetaData Servers (MDS, DELL R630), 1 configuration server (DELL R430) and 1 processing farm (DELL M1000e with 16 M630 blades, i.e. 384 cores)
- The final system software configuration is being evaluated with data writing/access tests at CC-IN2P3 (the Lyon computing centre); a first application on a reduced-scale system is foreseen for the 3x1x1
- The software for data transfer to CERN is based on the IT-standard EOS/CASTOR scripts
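
A rough check that ~1 PB of local disk indeed corresponds to "a few days" of buffer, using the daily volumes computed above; the assumption that the buffered data are kept Huffman-compressed is ours:

    # Local buffer depth, from the data-size slide.
    buffer_capacity_tb = 1000                       # ~1 PB of OSS disk space
    raw_tb_per_day     = 660                        # no compression, 50% duty cycle
    compressed_tb_day  = raw_tb_per_day / 10        # factor-10 lossless compression

    print(buffer_capacity_tb / raw_tb_per_day)      # ~1.5 days without compression
    print(buffer_capacity_tb / compressed_tb_day)   # ~15 days with compression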

Typical event signature for surface operation of a liquid argon TPC: for each beam trigger there are on average ~70 cosmics overlapped on the drift window after the trigger (these cosmics may have interacted with the detector in the 4 ms before the trigger or in the 4 ms after it → chopped tracks, the "belt conveyor" effect). (Figure: example of a cosmics-only event in one of the views.)

DAQ timing-trigger integration (same block diagram as in the general architecture section).


Offline

Offline and computing model:
- MC production and code management are currently centralized at CC-IN2P3 in Lyon
- Massive MC production is expected at CC-IN2P3 and possibly at other centres
- 6x6x6 data production at CERN and Fermilab
- The software is based on QSCAN (simulation/reconstruction): fast execution, easily adaptable for detector performance analysis, online monitoring and analysis. The framework deals with all aspects of the detector simulation and reconstruction: geometry description, interface to MC transport codes such as Geant3 and Geant4 via VMC, event generators such as GENIE, detector response and electronics simulation for both single and dual phase, reconstruction of simulated and real events, and event display. It interfaces to the raw and simulated data formats of various detector setups, from several R&D prototypes to the WA105 3x1x1 and 6x6x6 and the 20 and 50 kton detectors considered in the LAGUNA-LBNO design study, and it allows the performance of the detectors under construction (3x1x1 and 6x6x6) to be understood, also by comparison with the results obtained in the last 10 years with different prototypes.
- 3x1x1 simulation, analysis and monitoring with QSCAN (September 2016)
- QSCAN is being developed for the 6x6x6, as needed for the detector optimization analysis and for online analysis/monitoring

Total data volume
- WA105 TDR: 175 M triggers, to be refined once the beam simulation is available
- If totally stored in non-zero-skipped, losslessly compressed format (assuming Huffman compression by a factor 10, i.e. 15 MB/event) → 2.4 PB (+ cosmic runs and technical tests)
- Following discussions with the IT people at CERN, this total data volume of 2.4 PB does not present any technical problem on the scale of the CERN and Fermilab storage facilities
- Requested link from the online storage to the CERN computing division at 20 Gbps, compatible with a 100 Hz non-zero-skipped, Huffman-compressed (factor 10) data flow
- A fast link between CERN and FNAL already exists
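
The same back-of-the-envelope style cross-checks the total volume and the link request, using only the 175 M triggers, the 15 MB/event and the 14.3 GB/s flow quoted in these slides:

    # Total-volume and link-bandwidth cross-check.
    n_triggers    = 175e6                    # WA105 TDR beam triggers
    compressed_mb = 15                       # MB/event after factor-10 compression

    print(n_triggers * compressed_mb / 1e9)  # ~2.6 PB, same order as the 2.4 PB quoted

    compressed_gbps = 14.3 / 10 * 8          # 14.3 GB/s flow, compressed, in Gb/s
    print(compressed_gbps)                   # ~11.4 Gb/s, within the 20 Gbps link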