System-wide triggering and DAQ issues
Paul Dauncey, Imperial College London, UK
26 September 2003



Slide 2: System requirements
- Need high statistics for accurate simulation comparison
  - Multiple set-ups (energy, particle type, HCAL, angle, etc.): ~10^2
  - Need high statistics per set-up; accuracy to 3σ needs ~10^6
- Need clean data for accurate simulation comparison
  - Remove double particle events and cosmics in time with beam
  - Minimum trigger bias
- Need to take data in a reasonable time
  - For 10^8 events total, need around ~100 Hz average
  - 10^6 seconds is around two weeks continuous running time
  - Several months realistic beam time
- Look here at non-calorimeter elements of the system
  - Timing, trigger handling and distribution
  - Beam monitoring and slow controls
  - DAQ software
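The run-time arithmetic on this slide can be written out explicitly. A minimal sketch using only the round numbers quoted above (the function names are illustrative, not from the CALICE software):

```cpp
#include <cassert>

// Round numbers from the slide: ~10^2 set-ups, ~10^6 events per set-up,
// ~100 Hz average trigger rate.
double totalEvents(double nSetups, double eventsPerSetup) {
    return nSetups * eventsPerSetup;   // ~10^8 events in total
}

double runTimeSeconds(double nEvents, double avgRateHz) {
    return nEvents / avgRateHz;        // ~10^6 s, i.e. around two weeks
}
```

At 100 Hz average, 10^8 events take 10^6 s of continuous running, which is why several months of realistic (non-continuous) beam time are assumed.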

Slide 3: Trigger: requirements
- Latency of overall trigger path < 180 ns
  - This is from peak of shaping time in VFE chip for sample-and-hold
  - HCALs may set stricter requirement; not yet defined
  - Implies all electronics must be within radiation area
- Jitter on trigger < 10 ns
  - This is again from VFE shaping; gives peak within 1%
  - What is spread of shaping times between VFE chips? (If > 10 ns within a VFE PCB, then gives a direct contribution to resolution.)
  - Again, HCAL may have stricter jitter requirement; as yet unknown (but see later)
- Several trigger types must be selectable
  - Beam, beam veto, cosmic, software, external clock; others?
  - Allow different trigger types inside and outside a spill; needed?

Slide 4: Trigger: overview
- Use trigger as system-wide synchronisation marker
  - All timing done relative to trigger arrival time; each system can clock itself independently
  - Removes need for Fast Control and Timing System (no one signed up for this anyway)
- Trigger sent after event, not before
  - No assumptions about beam line signals available
  - Cosmics can be done in an identical way
  - Requires removal of double particle events (with second particle after trigger) offline
  - Double particle events with second particle before trigger can be vetoed in trigger logic
- Trigger on external detector (e.g. scintillators), not calorimeter
  - Removes need for trigger electronics at VFEs
  - Minimum bias; easy to simulate

Slide 5: Trigger: control requirements
- Need to hold off further triggers until previous event is finished
  - Sample-and-hold at VFE cannot be released until data digitised
  - DHCAL has no such requirement; could handle multiple events
  - Does not necessarily require data read out through VME; ECAL boards can buffer up to 2k events
- Need to enable configuration of different trigger types
  - PC controlled, not requiring recabling
- Need to read out some event data for trigger itself
  - Record type of trigger which caused event
  - Offline second particle detection and removal
- Must be only one control interface to PC in whole system
  - No way to synchronise different PCs to accuracy required
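The hold-off requirement above amounts to a busy latch in front of the trigger. A minimal sketch of that logic (hypothetical class, not the actual firmware): a trigger is accepted only while the sample-and-hold is free, and the hold is released once digitisation completes.

```cpp
#include <cassert>

// Sketch of trigger hold-off: accept a trigger only when the previous
// event has been fully digitised; veto (and count) triggers meanwhile.
class TriggerGate {
public:
    TriggerGate() : busy_(false), accepted_(0), vetoed_(0) {}

    bool trigger() {                    // returns true if accepted
        if (busy_) { ++vetoed_; return false; }
        busy_ = true;                   // sample-and-hold now occupied
        ++accepted_;
        return true;
    }
    void digitisationDone() { busy_ = false; }  // release the hold

    int accepted() const { return accepted_; }
    int vetoed() const { return vetoed_; }

private:
    bool busy_;
    int accepted_, vetoed_;
};
```

The DHCAL, which can handle multiple events, would simply never assert the busy state in such a scheme.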

Slide 6: Trigger: distribution path
- Assume someone provides trigger scintillators with logic in NIM
  - E.g. DESY provides two crossed scintillators, 1 × 1 cm^2, with PMTs, HV, discriminators and NIM logic in beam line
- Trigger routed through one of ECAL readout boards
  - Provides trigger replication, distribution and timing adjust for HCAL and beam monitoring
  - Provides trigger VME interface for control
[Diagram: Trg Logic, NIM-to-LVDS, ECAL trg board, LVDS-to-NIM, other ECAL boards, HCAL, beam monitor; LVDS? NIM? links]

Slide 7: Trigger: control implementation
- VME control logic embedded in FPGA on ECAL readout board
- Identical firmware in all FPGAs; only activated on one board

Slide 8: Trigger: HCAL SiPMs?
- AHCAL extremely likely to use silicon PMs for readout
  - Possibly in parallel with APD readout
- Major complication for trigger
  - Intrinsic signal shape around 30 ns; too short for "after event" trigger distribution
  - Noise rate ~2 MHz at lowest energies
  - Shaping signal to peak at ~150 ns may introduce too much noise to distinguish individual channel peaks; needs study
- May have to have "pre-trigger" signal
  - Hope real trigger arrives within ~10 ns of signal peak
  - Straightforward if beam line signals available to make pre-trigger
  - Otherwise, potentially, efficiency of only ~5%
  - Cosmics are still a problem
- One other possibility is "self-trigger"
  - Issues of bias and trigger distribution from HCAL VFEs

Slide 9: Beam monitoring and slow controls
- Need some tracking within the beam line
  - Resolution should be much better than pad size of ~1 cm
  - May be provided in beam line, e.g. DESY has an old ZEUS silicon strip detector telescope (but no one has succeeded in reading it out recently)
  - If not, we need to supply one (but no one signed up to do this)
  - Tracking must be read per event; technology/format unknown
- If mixed particle beam, need particle ID also
  - Cherenkov, TOF; also provided?
- Slow controls
  - "Control" less important; monitoring is what is needed
  - Read out supply voltages and translation stages; also temperature, pressure?
  - Low rate of read out needed, ~1 Hz?, independently of event readout
  - Again, readout technology/format unknown (to us); is anything defined for this yet?

Slide 10: DAQ: requirements
- Event rate of ~1 kHz during spill, ~100 Hz average
  - DHCAL may require rate limited to ~300 Hz during spill
- Event sizes of up to 40 kBytes
  - Read all data without zero suppression (except DHCAL)
  - Implies 40 MBytes/s peak readout without buffering; this exceeds maximum rate within a single VME crate
- Read out ECAL, (A/D)HCAL, trigger, beam line monitoring
  - (Potentially) separate crates, (potentially) different technologies
- Flexible configuration to work in several beam lines
  - Minimise dependence on external networking, etc.
  - Also must be able to run ECAL and HCAL separately during initial tests
- Need to take many different types of runs
  - Beam data, beam interspersed with pedestals, calibration, cosmics, etc.
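The bandwidth figures above follow directly from the rate and event size. A one-line sketch (illustrative function name) makes the peak-versus-average distinction explicit:

```cpp
#include <cassert>

// Numbers from the slide: ~1 kHz in-spill, ~100 Hz average, 40 kByte events.
double bytesPerSecond(double rateHz, double eventBytes) {
    return rateHz * eventBytes;
}
```

The in-spill rate gives 40 MBytes/s (beyond a single VME crate without buffering), but the sustained average is only 4 MBytes/s, which is why on-board event buffering across the spill structure matters.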

Slide 11: DAQ: concept
- Many unknowns; keep flexible
  - Plug-and-play components to be bolted together later as required
- Simple and robust data structure
- Keep all information in one place; run file is self-contained
  - All configuration data used stored within file
  - All slow controls readout stored within file
  - Eases merge with simulation and analysis formats
- Allow arbitrarily complex run structure
  - Number and type of configurations completely flexible within a run
  - Triggers within and outside of spills can be different and can be identified offline
- Implementation
  - POSIX-compliant (mostly!) C++ running on Linux PCs
  - VME using SBS 620 VME-PCI interface; VME software based on HAL
  - ROOT for graphics and (possibly) eventual persistent data storage

Slide 12: DAQ: overview
- Multi-PC system driven by common run control PC
- Each PC is independent; can have separate technology (VME, PCI, CAMAC, etc.)
- PC configuration can be changed easily; single VME crate readout for separate system tests possible
- Multiple tasks could be run on one PC; e.g. run control, ECAL and event build
- Prefer PCs outside radiation area if possible
- Have own hub and network (cost?) or rely on network infrastructure at beam line?

Slide 13: DAQ: data structure
- Need to store C++ objects in type-safe but flexible way
  - "Record" (generalised event; includes StartRun, EndRun, Event, SlowControls, etc.) and "subrecords" (for ECAL, HCAL, etc.)
- Simple data array with identity for run-time type-checking
  - Type-checking through simple id-to-class list
  - Prevents misinterpretation of subrecord
- Record and subrecord handling completely blind to contents
  - Arbitrary payload class (but cannot have virtual methods)
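The id-to-class scheme described above can be sketched in a few lines. This is a hypothetical illustration (the class, enum and struct names are invented, not the real CALICE DAQ classes): each subrecord carries a numeric type id, the payload is stored as opaque bytes, and unpacking checks the id before casting, so a subrecord cannot be misinterpreted. The payload must have no virtual methods so its byte layout is flat and copyable.

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Hypothetical type ids; one entry per payload class ("id-to-class list").
enum SubRecordId { kEcalData = 1, kHcalData = 2 };

// Payloads: plain structs, no virtual methods, so memcpy is safe.
struct EcalData { unsigned short adc[4]; };
struct HcalData { unsigned short adc[8]; };

// Compile-time mapping from payload class to its id.
template <typename T> struct IdOf;
template <> struct IdOf<EcalData> { static const unsigned id = kEcalData; };
template <> struct IdOf<HcalData> { static const unsigned id = kHcalData; };

struct SubRecord {
    unsigned id;                       // run-time type tag
    std::vector<unsigned char> bytes;  // opaque payload; handling is blind to contents

    template <typename T> void pack(const T& p) {
        id = IdOf<T>::id;
        bytes.resize(sizeof(T));
        std::memcpy(&bytes[0], &p, sizeof(T));
    }

    // Returns 0 if the stored id does not match the requested class,
    // preventing misinterpretation of the subrecord.
    template <typename T> const T* unpack() const {
        if (id != IdOf<T>::id || bytes.size() != sizeof(T)) return 0;
        return reinterpret_cast<const T*>(&bytes[0]);
    }
};
```

A record would then be little more than a list of such subrecords plus a record type (StartRun, Event, SlowControls, ...), which is what keeps the container code completely blind to the payload contents.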

Slide 14: DAQ: state machine
- All parts of DAQ driven round finite state machine
- Nested layers within run allow arbitrary numbers of configurations
  - E.g. allows beam data, pedestals, beam data, pedestals...
  - E.g. allows calibration at DAC setting 0, setting 1, setting 2...
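A minimal sketch of such a nested state machine (hypothetical states and class name, not the actual DAQ code): a run contains an arbitrary number of configurations, each containing an arbitrary number of acquisitions, so beam data, pedestals and calibration settings can alternate freely within one run.

```cpp
#include <cassert>

enum State { kIdle, kInRun, kConfigured, kAcquiring };

class RunStateMachine {
public:
    RunStateMachine() : s_(kIdle) {}
    State state() const { return s_; }

    bool startRun()    { return move(kIdle,       kInRun); }
    bool configure()   { return move(kInRun,      kConfigured); } // e.g. next DAC setting
    bool startAcq()    { return move(kConfigured, kAcquiring); }
    bool stopAcq()     { return move(kAcquiring,  kConfigured); }
    bool unconfigure() { return move(kConfigured, kInRun); }      // loop for a new configuration
    bool endRun()      { return move(kInRun,      kIdle); }

private:
    bool move(State from, State to) {
        if (s_ != from) return false;  // illegal transition refused
        s_ = to;
        return true;
    }
    State s_;
};
```

Because every component is driven round the same machine, an illegal request (e.g. ending the run mid-acquisition) is simply refused rather than leaving subsystems out of step.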

Slide 15: DAQ: data transfer
- Record movement via standardised interface (DIO)
  - Within PC: each interface driven by separate thread, copy pointer only
  - PC-to-PC: via socket (with same interface), copy actual data
- Standardised interface allows configuration of data handlers to be easily changed
  - Flexibility to optimise to whatever configuration is needed; e.g. ECAL only
- Several building blocks needed (and exist)
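The DIO idea can be sketched as a single abstract interface with interchangeable implementations (hypothetical names; the real interface, threading and socket code are not shown). Within one PC a handler only passes the record pointer; a socket implementation of the same interface would serialise and copy the actual data.

```cpp
#include <cassert>
#include <deque>

struct Record { int type; };  // stand-in for the real record class

// Every data handler moves records through the same read/write API,
// so buffers, copiers, mergers and socket links can be chained freely.
class Dio {
public:
    virtual ~Dio() {}
    virtual bool write(Record* r) = 0;
    virtual Record* read() = 0;       // 0 when nothing is available
};

// In-process building block: a FIFO passing pointers only (no data copy).
class BufferDio : public Dio {
public:
    bool write(Record* r) { q_.push_back(r); return true; }
    Record* read() {
        if (q_.empty()) return 0;
        Record* r = q_.front();
        q_.pop_front();
        return r;
    }
private:
    std::deque<Record*> q_;
};
```

Swapping a BufferDio for a socket-backed implementation changes where the data flow crosses a PC boundary without touching the surrounding handlers, which is what makes the topology reconfigurable (e.g. ECAL-only running).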

Slide 16: DAQ: topology
- For tests, assume worst case: each subsystem (ECAL, HCAL, beam monitoring and slow controls) read out with separate PC
  - Require one socket-socket branch for each
- Each branch can read out separate technology (VME, PCI, etc.)
- Monitor does not necessarily sample all events; its buffer allows through events only when spare CPU available
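The monitor-sampling behaviour above can be sketched as a bounded buffer that drops rather than blocks (illustrative class, not the real code): the monitoring path accepts an event only when it has space, so it can never back-pressure the main readout branches.

```cpp
#include <cassert>
#include <deque>

// Monitor buffer: lets events through only when there is spare capacity;
// a full buffer means the monitor is busy, and the event is simply skipped.
class MonitorBuffer {
public:
    explicit MonitorBuffer(unsigned capacity) : cap_(capacity) {}

    bool offer(int evt) {              // called from the readout path
        if (q_.size() >= cap_) return false;  // dropped, not queued
        q_.push_back(evt);
        return true;
    }
    bool take(int& evt) {              // called when the monitor has CPU
        if (q_.empty()) return false;
        evt = q_.front();
        q_.pop_front();
        return true;
    }

private:
    unsigned cap_;
    std::deque<int> q_;
};
```

This is the standard trade-off for online monitoring: histograms are filled from a sample of events, while the event-building path sees every event.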

Slide 17: DAQ: status
- First version of data structure software exists
  - Records and subrecords; loading/unloading, etc.
  - Arbitrary payload (templated) for subrecords
- First version of data transport software exists
  - Buffers, copiers, mergers, demergers, etc.
  - Arbitrary payload (templated) with specialisation for records
- First version of run control software exists
  - Both automatic (pre-defined) and manual run structures
- VME hardware access working
  - SBS 620 VME-PCI interface board installed in borrowed VME crate
  - Using Hardware Access Library (CERN/CMS)
- These work together
  - Sustained rates achieved depend critically on PCs, network between the PCs on the different branches, compiler optimisation, inlining, etc.; a lot of tuning needed

Slide 18: DAQ: major items still to be done
- Write data and configuration classes
  - Until VME board interfaces defined, cannot finalise data format for event data or for board configuration data
- Output data format
  - Currently have ASCII and binary (endian-specific) output formats
  - Obvious choice would be ROOT: actual objects stored, can be used interactively, easy graphics, machine-independent, etc.
  - However, HCAL people want to use LCIO instead; feasibility/limitations under investigation
  - Will need to convert whatever raw data format is used to zero-suppressed analysis data format offline in "reconstruction" step
- Online monitoring
  - Done via ROOT memory map facility (TMapFile); allows interactive real-time histogramming
  - Need to write all code to actually define and fill histograms

Slide 19: DAQ alternatives: MIDAS? XDAQ?
- MIDAS (PSI)
  - No experience of using this in UK
  - Written for ~MByte data rates, ~100 Hz event rates, single-PC systems
  - Limited state diagram; no ability to take different types of events in run
  - A lot of baggage (databases, slow controls); more complex than required
  - C, not C++, so less natural interface downstream (and not type-safe)
- XDAQ (CERN/CMS)
  - Significant experience of this at Imperial; useful to have experts on hand
  - Optimised for CMS (no beam spill structure, asynchronous trigger and readout) but easily deals with CALICE event rates and data sizes
  - Includes HAL automatically so (should be) simple to retrofit later
  - Deserves further investigation
- If moving to an existing system, XDAQ seems more suitable (?)
- Beware of "3am crash" issue; it is hard to debug code written by other people in a hurry...

Slide 20: Summary
- Trigger
  - Several uncertainties still, particularly with HCAL SiPMs
  - Central control and distribution within ECAL
- Beam monitoring and slow controls
  - Concept of how to include these exists
  - What they physically are is still very uncertain
- DAQ
  - Prototype DAQ system exists
  - Allows multiple PCs, so partitionable
  - Other existing DAQ systems could/should be studied further