For the trigger subsystem working group


For the trigger subsystem working group: A. Nappi, May 14, 2004
Working group: P. Amaudruz, N. Cartiglia, E. Imbergamo, S. Kettell, A. Nappi, T. Numao, G. Redlinger, R. D. Schamberger

Boundary conditions
- 80% of the bunchlets (25 MHz) have at least 1 hit with E > 5 MeV in the preradiator (Ermanno's study): "singles" rate → 20 MHz
- Aim: tolerate a maximum 5% inefficiency on top of the inefficiency due to the background separation cuts
- Rate limitations are not yet well known:
  - The preradiator chamber readout can take an L1 trigger rate in excess of 100 kHz with 10 µs latency
  - Uncertainties related to the wire hit rate, the readout formats, and the bandwidth of the various interconnection links
  - Unified scintillator readout by WFD: rate limits still to be assessed
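As a quick sanity check, these numbers already pin down the rejection the first trigger level has to deliver. A minimal sketch (the input values are the ones on this slide; the required rejection factor is derived, not quoted):

```python
# Back-of-the-envelope trigger budget from the boundary conditions above.
bunchlet_rate_hz = 25e6    # bunchlet rate
occupancy = 0.80           # fraction of bunchlets with >=1 preradiator hit, E > 5 MeV
l1_limit_hz = 100e3        # L1 trigger rate the preradiator readout can sustain

singles_rate_hz = occupancy * bunchlet_rate_hz       # ~20 MHz, as on the slide
rejection_needed = singles_rate_hz / l1_limit_hz     # rejection L1 must deliver

print(f"singles rate: {singles_rate_hz / 1e6:.0f} MHz")
print(f"required L1 rejection: ~{rejection_needed:.0f}x")
```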

Prerad data rates (P. Amaudruz)
[Block diagram of the preradiator readout chain: 24-channel anode TDC FPGAs and cathode ADC FPGAs, 96-channel readout cards, chamber collectors (8 x 96 ch.) and module collectors (8 x 8 x 96 ch.); marked link speeds of 90 Mbit/s, 200 Mbit/s and >200 Mbit/s; measurement points A-D along the chain; LV1 rate 100 kHz; multiplicity output for LV2]
- Wire hit rate 30 kHz, strip hit rate 150 kHz
- 12-bit "charge" output from the cathode ADCs (full digitization)
- Data rates (Mb/s) for a 100 kHz trigger and a 100 kHz wire hit rate: A: 13; B: 23; C: 15 (42); D: 152 (260)
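The point-D rate sits closest to its link limit. Assuming the rate scales linearly with the L1 trigger rate (an assumption, not stated on the slide), one can estimate where the output link saturates:

```python
# Where does the collector output link saturate, if the point-D rate scales
# linearly with the L1 rate? 152 Mb/s at 100 kHz (260 Mb/s with full 12-bit
# charge digitization); link capacity quoted as ">200 Mbit/s".
link_capacity_mbps = 200.0   # conservative reading of ">200 Mbit/s"

for label, rate_at_100khz_mbps in [("TDC data only", 152.0),
                                   ("full digitization", 260.0)]:
    max_l1_khz = 100.0 * link_capacity_mbps / rate_at_100khz_mbps
    print(f"{label}: link saturates near {max_l1_khz:.0f} kHz L1 rate")
```

Under this scaling, full digitization already exceeds a 200 Mbit/s link at the nominal 100 kHz, which is consistent with the parenthesized rates being flagged separately on the slide.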

Design guidelines
- Assume, as a working hypothesis, a "traditional" 3-level structure à la ATLAS
- Level 1:
  - Input: subdetector data through data paths parallel to the standard readout (trigger primitives)
  - Output: time stamps and regions of interest of the selected events
- Level 2:
  - Based on (a subset of?) the readout data, before event building
  - Output: decision to pass the event to event building
- Level 3: runs after event building
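To make the division of labour concrete, a minimal sketch of the staged selection follows. It is purely illustrative: the function names, event fields, and cuts are placeholders, not a proposed design.

```python
# Toy model of the 3-level working hypothesis. Each level sees more
# complete data and has more time per event than the previous one.

def level1(primitives):
    # Trigger primitives from the parallel data path only; must decide
    # within the front-end pipeline latency. Emits a timestamp and RoIs.
    accept = primitives["n_clusters"] >= 2              # placeholder cut
    return accept, primitives["timestamp"], primitives["rois"]

def level2(fragments, rois):
    # (A subset of?) readout data, before event building, seeded by L1 RoIs.
    return all(fragments[roi]["energy_mev"] > 5.0 for roi in rois)

def level3(event):
    # Full event after event building; final decision before storage.
    return event["passes_kinematics"]

def select(stream):
    # stream yields (primitives, fragments, built_event) per bunchlet.
    for primitives, fragments, event in stream:
        ok, timestamp, rois = level1(primitives)
        if ok and level2(fragments, rois) and level3(event):
            yield timestamp, event
```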

Traditional (3-level) scheme
[Block diagram: a special data path feeds the first level; the readout systems feed buffer cards, the buffer cards feed second-level processors, and a switch connects them to the event builders]

Schematics of the data flow
[Diagram: front-end pipelines (Level 1) → output buffers (Level 2) → event building (Level 3) → permanent storage]
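One number this structure fixes directly is the depth of the front-end pipelines, which must hold every sample for the full L1 latency. Using the ~10 µs latency and the 25 MHz bunchlet rate quoted on the earlier slides:

```python
# Required front-end pipeline depth = L1 latency x sampling clock.
l1_latency_s = 10e-6   # ~10 us L1 latency (from the boundary conditions slide)
clock_hz = 25e6        # 25 MHz bunchlet clock

depth_cells = int(round(l1_latency_s * clock_hz))
print(f"required pipeline depth: {depth_cells} cells")   # 250 samples in flight
```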

Simulation results
Rate/efficiency estimates (George)

More about George's results
[Plots of the rate/efficiency estimates, broken down by channel]

Some data from Ermanno
[Plots: energy distributions for 2-cluster events, E in MeV, per bunchlet]

Handles for further rejection
Work in progress by Ermanno and George:
- Reject tracks from inside the PR and/or the upstream region
- Energy cuts
- Selective use of veto counters with individually optimized windows
- Kinematic correlations?

Open issues
- We do not yet have a fully satisfactory set of conditions that achieves the target efficiencies and rates
- It is not yet clear whether level 1 can achieve a rejection large enough to be useful in the front-end modules for controlling the transfer from the input pipelines to the readout buffers
- This motivates an option in which level 1 is abolished altogether
- There is an intermediate option in which level 1 is not used by the front-end modules, but is still of interest as a first trigger level using pre-elaborated information from the front end

Two-level scheme
[Diagram: George's proposal]

Triggerless readout?
CKM vs. KOPIO comparison by S. Kettell
- CKM triggerless scheme: send 250 µs "macroslices" from each front end to each processor (400 sources x 4000 slices)
- L1 runs in software and identifies events based on the time of each K+
- Reduction factors:
  - L1: 30 MHz, 50 GB/s → 4 GB/s
  - Higher levels: → 100 MB/s
- Estimated cost: ~$2M
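A quick consistency check on the macroslice arithmetic (a sketch; treating the quoted 50 GB and 4 GB as per-second rates is an assumption):

```python
# CKM-style triggerless readout, numbers as quoted on the slide.
slice_len_s = 250e-6     # macroslice length
n_sources = 400          # front-end sources
aggregate_gb_s = 50.0    # into software L1 (assumed to be GB/s)
storage_mb_s = 100.0     # after the higher levels

slices_per_s = 1.0 / slice_len_s                    # = 4000, matching the slide
per_source_mb_s = aggregate_gb_s * 1e3 / n_sources  # ~125 MB/s per front end
reduction = aggregate_gb_s * 1e3 / storage_mb_s     # overall rate reduction

print(f"{slices_per_s:.0f} macroslices/s, ~{per_source_mb_s:.0f} MB/s per source, "
      f"overall reduction ~{reduction:.0f}x")
```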

Status of the design
- Only a provisional working hypothesis on the general architecture exists, and even that is subject to the uncertainty on the number of levels
- There are indications for basing the LV1 design on the assumption of a ~10 µs latency
- No work has yet gone into developing a concrete implementation proposal
- Guidelines that have some consensus:
  - Try to build all trigger conditions from digitized time and amplitude information
  - A parallel data-processing path for the trigger may be useful not only for L1, but also to provide redundancy for the scintillator readout
  - The trigger design may be simplified if the signals are timed to have the same phase w.r.t. the clock

Plan of attack
- MC: list of setup cuts; set of realistic conditions to achieve 95% efficiency with minimum rate
- LV1: high-level design of the PR+CAL level-1 logic: scintillator logic, cluster conditions, Boolean logic
- LV2: develop a realistic proposal and understand the possible performance
- Point of decision: how many levels, and what is done in each level
- Parallel processing path for the scintillator signals
- Design at the level of logical boxes, data formats, and definition of the communication paths
- Preliminary cost estimates

Manpower
- For this to become a schedule with the baseline review as a deadline, a crash effort is needed
- People presently involved: E. Imbergamo, G. Redlinger, S. Kettell, A. Nappi, at variable fractions of their time
- Two more physicists plus one electronics engineer, working at a large fraction of their time on this project, would be needed

Conclusions
- Trigger work is still in its infancy
- The problem appears more arduous than anticipated:
  - Contribution of events not from the fiducial region
  - Veto losses
- Two main lines of attack are being pursued in parallel:
  - Improve the level-1 rejection to a level the front-end systems can cope with
  - Investigate the possibility of a triggerless front end
- Need to develop concrete implementation schemes
- Lots of room for new ideas and contributions

Reserve slides

Performance of the TDC chip (P. A. Amaudruz)
[Plot: trigger rate that saturates the TDC-FPGA link (left scale) and average TDC buffer level (right scale)]