L3 DAQ
Doug Chapin, for the L3DAQ group
DAQ Shifters Meeting, 10 Sep 2002
Overview of L3 DAQ, uMon, l3xqt, l3xmon

Slide 2: The L3DAQ System
[Block diagram] 63 readout crates (ROC), each read out by an SBC, connect through a Cisco 6509 Ethernet switch to 48 farm-node CPUs. The Routing Master CPU receives TFW info and drives the trigger disable; the Supervisor CPU talks to COOR. All communication is Ethernet, except for the TFW communication.

Slide 3: Communication Flow
[Diagram] COOR sends run information and trigger programming to the Supervisor. The Supervisor passes the crate and node list by trigger bit to the Routing Master (RM), and the trigger programming to the EVB/Filtershell (ScriptRunner) processes on the farm nodes. The TFW reports the bits fired by event to the RM and receives the trigger disable in return; the SCL carries the accept to the readout crates (ROC/SBC). The RM sends routing info (destination farm node by event) to the SBCs and the crate list by event to the farm nodes; the farm nodes answer "node ready".
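
These flows map naturally onto a handful of message types. Below is a minimal C++ sketch of what those messages might carry; all struct and field names are hypothetical illustrations, not the real L3DAQ protocol.

    // Hypothetical message layouts for the flows on this slide.
    #include <cstdint>
    #include <vector>

    // TFW -> RM: which trigger bits fired for an event.
    struct BitsFired {
        uint32_t eventNumber;
        uint64_t l1BitMask;
    };

    // RM -> SBC: routing info, i.e. the destination farm node.
    struct EventTag {
        uint32_t eventNumber;
        uint16_t farmNodeId;
    };

    // RM -> farm node: the crates expected for this event.
    struct CrateList {
        uint32_t eventNumber;
        std::vector<uint16_t> crateIds;
    };

    // Farm node -> RM: "node ready", i.e. free event buffers.
    struct NodeReady {
        uint16_t farmNodeId;
        uint16_t freeBuffers;
    };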

Slide 4: ReadOut Crate Example
[Diagram] Each readout crate holds a CPU card, a controller card (which receives the SCL), data cards, and the SBC. The SBC reads the data cards via VME block transfer(s) and performs SBC-initiated control-register programming; Slave Ready and Done are signalled to the controller card over J3; the event data leave the crate on the SBC's Ethernet.

Slide 5: SBCs
- Intel ~1 GHz, 128 MB RAM, dual 100 Mb Ethernet, 128 MB flash, VME Universe2
- VME slave interface
  - VBD control-register emulation
  - Data buffer access
  - Component front-end debugging
- J3 control (SlaveReady, Done)
  - DIO PMC add-on card; also drives status LEDs
- Software connections (pattern sketched below)
  - Actively connect to the RM
  - Receive up to 2 connections from each farm node
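
The connection pattern, one active outgoing link to the RM plus accepted links from the farm nodes, can be illustrated with plain POSIX sockets. This is only a shape sketch: the addresses, ports, and the two-connections-per-node bookkeeping are invented for the example.

    // Sketch of the SBC connection pattern: connect out to the RM,
    // accept at most two connections from each farm node.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdint>
    #include <map>
    #include <string>

    int connect_to_rm(const char* ip, uint16_t port) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in a{};
        a.sin_family = AF_INET;
        a.sin_port = htons(port);
        inet_pton(AF_INET, ip, &a.sin_addr);
        if (connect(fd, reinterpret_cast<sockaddr*>(&a), sizeof a) < 0) {
            close(fd);
            return -1;  // caller retries: the SBC connects actively
        }
        return fd;
    }

    int main() {
        int rm = connect_to_rm("192.0.2.1", 5000);   // hypothetical RM endpoint
        (void)rm;

        int lfd = socket(AF_INET, SOCK_STREAM, 0);   // accept farm nodes here
        sockaddr_in any{};
        any.sin_family = AF_INET;
        any.sin_addr.s_addr = htonl(INADDR_ANY);
        any.sin_port = htons(5001);                  // hypothetical data port
        bind(lfd, reinterpret_cast<sockaddr*>(&any), sizeof any);
        listen(lfd, 64);

        std::map<std::string, int> perNode;          // peer address -> count
        for (;;) {
            sockaddr_in peer{};
            socklen_t len = sizeof peer;
            int cfd = accept(lfd, reinterpret_cast<sockaddr*>(&peer), &len);
            if (cfd < 0) continue;
            char ip[INET_ADDRSTRLEN];
            inet_ntop(AF_INET, &peer.sin_addr, ip, sizeof ip);
            if (++perNode[ip] > 2) {                 // enforce the 2-connection limit
                --perNode[ip];
                close(cfd);
                continue;
            }
            // ... hand cfd to a sender loop for event fragments ...
        }
    }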

Slide 6: SBC Operation
[Diagram] Event fragments from VME feed the Event Fragment Queue (depth 23); event tags from the RM feed the Route Queue (depth 100); matched fragments go out to the farm node.
- Match the head event tag to the head fragment via the event number
- Fragments have a one-second timeout: if no tags arrive, throw away the head event after 1 s
- The route queue is circular and non-blocking: tags can be overwritten if no fragments are available
- Event number mismatch (logic sketched below):
  - If fragment ev# > tag ev#, throw away the tag
  - If fragment ev# < tag ev#, drop the fragment
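
The matching rules reduce to a short loop over the two queue heads. A minimal sketch of that logic follows; the queue depths and the 1 s timeout come from the slide, while the types, names, and use of std::deque are assumptions.

    // Sketch of the SBC tag/fragment matching rules.
    #include <chrono>
    #include <cstdint>
    #include <deque>

    using Clock = std::chrono::steady_clock;

    struct Fragment { uint32_t ev; Clock::time_point arrived; };
    struct Tag      { uint32_t ev; uint16_t farmNodeId; };

    std::deque<Fragment> fragQ;  // depth 23 in the real SBC
    std::deque<Tag>      tagQ;   // circular, depth 100

    void sendFragment(const Fragment&, uint16_t /*node*/) { /* ship via Ethernet */ }

    void match() {
        const auto now = Clock::now();
        while (!fragQ.empty()) {
            if (tagQ.empty()) {
                // No routing tag: drop the head fragment once it is 1 s old.
                if (now - fragQ.front().arrived > std::chrono::seconds(1))
                    fragQ.pop_front();
                return;
            }
            const Fragment& f = fragQ.front();
            const Tag&      t = tagQ.front();
            if (f.ev == t.ev) {            // match: route fragment to its node
                sendFragment(f, t.farmNodeId);
                fragQ.pop_front();
                tagQ.pop_front();
            } else if (f.ev > t.ev) {      // tag refers to a lost event: discard tag
                tagQ.pop_front();
            } else {                       // fragment has no tag anymore: discard it
                fragQ.pop_front();
            }
        }
    }

    int main() {
        fragQ.push_back({42, Clock::now()});
        tagQ.push_back({42, 7});
        match();                           // event 42 is routed to node 7
    }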

Slide 7: Supervisor Process
- Connections:
  - To the RM: it receives the crate and node list per trigger bit (shape sketched below); the RM complains if needed nodes/SBCs are not connected
  - To the ScriptRunner processes: they receive the complete trigger programming, passed on from COOR when the trigger is downloaded
    - Node processes may get stuck; big trigger lists take several minutes to download
  - *No* connection to SBC or EVB processes
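
The per-bit run information is essentially a table keyed by trigger bit. A hypothetical C++ shape for it, together with the "complain if not connected" check the RM performs, is sketched here; none of these names are from the real code.

    // Hypothetical shape of the run information: for each trigger bit,
    // the crates to read out and the farm nodes that may filter it.
    #include <algorithm>
    #include <cstdint>
    #include <map>
    #include <vector>

    struct BitConfig {
        std::vector<uint16_t> crates;
        std::vector<uint16_t> farmNodes;
    };

    using RunInfo = std::map<uint8_t, BitConfig>;  // trigger bit -> config

    static bool contains(const std::vector<uint16_t>& v, uint16_t x) {
        return std::find(v.begin(), v.end(), x) != v.end();
    }

    // The RM's sanity check at download time: every crate and node the
    // run needs must already be connected.
    bool allConnected(const RunInfo& run,
                      const std::vector<uint16_t>& crates,
                      const std::vector<uint16_t>& nodes) {
        for (const auto& kv : run) {
            for (uint16_t c : kv.second.crates)    if (!contains(crates, c)) return false;
            for (uint16_t n : kv.second.farmNodes) if (!contains(nodes, n)) return false;
        }
        return true;
    }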

Slide 8: Farm Nodes
- Dual 1 GHz Intel, 1 GB RAM; 48 nodes so far
- Each runs an Event Builder (EVB) process and Filtershell processes
- Filtershell processes (usually 2):
  - This is ScriptRunner (the L3 filters)
  - Receives trigger programming info from the supervisor
  - Receives the full event from the EVB process
- EVB process operation (bookkeeping sketched below):
  - Connects to all SBCs (incl. the RM), all the time
  - Receives the crate list by event# from the RM
  - Sends its number of free buffers to the RM ("node ready")
  - Builds the event from the received fragments (…-second timeout)
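
The event-building step implied here is simple bookkeeping: collect fragments per event number until the RM's crate list is satisfied, and time out stale partial events. A sketch under assumed names follows; the timeout value is left as a parameter because the slide elides it.

    // Sketch of EVB event building on a farm node.
    #include <chrono>
    #include <cstdint>
    #include <map>
    #include <set>
    #include <vector>

    using Clock = std::chrono::steady_clock;

    struct Fragment { uint16_t crateId; std::vector<uint8_t> data; };

    struct PartialEvent {
        std::set<uint16_t> expected;            // crate list from the RM
        std::map<uint16_t, Fragment> received;  // fragments so far
        Clock::time_point started = Clock::now();
        bool complete() const {
            return !expected.empty() && received.size() == expected.size();
        }
    };

    std::map<uint32_t, PartialEvent> building;  // event number -> state

    void onCrateList(uint32_t ev, std::set<uint16_t> crates) {
        building[ev].expected = std::move(crates);
    }

    void onFragment(uint32_t ev, Fragment f) {
        auto& pe = building[ev];
        pe.received[f.crateId] = std::move(f);
        if (pe.complete()) {
            // ... hand the full event to a Filtershell process ...
            building.erase(ev);
        }
    }

    // Drop partial events older than the timeout; their absent crates
    // are what shows up as "missing crates" in the monitoring.
    void reap(std::chrono::seconds timeout) {
        const auto now = Clock::now();
        for (auto it = building.begin(); it != building.end();) {
            if (now - it->second.started > timeout) it = building.erase(it);
            else ++it;
        }
    }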

Slide 9: Routing Master (RM)
- Receives "run" information from the supervisor: the farm node list and crate list per trigger bit
- Gets the bits fired per event# from the TFW
- Receives the number of free buffers from each farm node
- Decides which nodes receive which events (decision sketched below)
- Sends routing info by event# to the SBCs
- Sends the crate list by event# to the farm nodes
- Disables triggers when necessary
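
The routing decision itself is small: among the nodes configured for the fired bits, pick one that still has a free buffer and reserve it; if none does, triggers must be disabled. A hedged sketch, with invented names:

    // Sketch of the RM routing decision. Free-buffer counts come from
    // the farm nodes' "node ready" messages.
    #include <cstdint>
    #include <map>
    #include <vector>

    std::map<uint16_t, int> freeBuffers;  // farm node -> free event buffers

    // Returns the chosen node, or -1 if no candidate has room, in which
    // case the RM must disable triggers (see the L3 Disable slide).
    int chooseNode(const std::vector<uint16_t>& candidates) {
        for (uint16_t n : candidates) {
            auto it = freeBuffers.find(n);
            if (it != freeBuffers.end() && it->second > 0) {
                --it->second;             // reserve a buffer on that node
                return n;
            }
        }
        return -1;
    }

(A first-fit scan is shown for brevity; how the real RM balances nodes is not specified on the slide.)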

Slide 10: RM Operation
- The process generates the event tags and crate list, based on the bits fired and the run info
- One sender thread and one event tag queue per SBC
  - Wait for 10 tags or 250 ms, then send, to minimize Ethernet overhead (batching sketched below)
- The crate list is sent to the target node on every event
[Diagram] The main RM process takes the TFW bit mask and event#, fills one event tag queue per SBC (sender threads to d0sbc001b … d0sbcXXXb), and sends the crate list to the target node.
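
The "10 tags or 250 ms" rule is classic batching: accumulate small messages and flush on a size or time trigger. A minimal sketch of one sender thread follows; the class and its members are invented for illustration.

    // Sketch of a per-SBC sender thread: flush the tag batch at 10
    // entries or after 250 ms, whichever comes first.
    #include <chrono>
    #include <condition_variable>
    #include <cstdint>
    #include <mutex>
    #include <vector>

    struct Tag { uint32_t ev; uint16_t farmNodeId; };

    class TagSender {
        std::mutex m;
        std::condition_variable cv;
        std::vector<Tag> batch;

        void send(const std::vector<Tag>&) { /* one write to this SBC */ }

    public:
        void push(Tag t) {                 // called by the main RM process
            std::lock_guard<std::mutex> lk(m);
            batch.push_back(t);
            if (batch.size() >= 10) cv.notify_one();   // size trigger
        }

        void run() {                       // body of the sender thread
            for (;;) {
                std::unique_lock<std::mutex> lk(m);
                cv.wait_for(lk, std::chrono::milliseconds(250),
                            [this] { return batch.size() >= 10; });
                if (batch.empty()) continue;           // timed out, nothing queued
                std::vector<Tag> out;
                out.swap(batch);           // take the batch, then release the lock
                lk.unlock();
                send(out);
            }
        }
    };

The flush-on-size path wakes the thread early, so a busy run is not rate-limited by the 250 ms timer; the timer only bounds latency when the tag rate is low.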

Slide 11: L3 Disable
- The RM process runs on the TFW crate's SBC (d0sbc001b)
- It spies on the TFW data block, extracting the L1 fired bits and the L3 transfer number
- When an L3 disable is needed (path sketched below):
  - The RM monitoring information is updated
  - The RM tells the SBC to stop reading out the TFW crate; the TFW crate then goes 100% front-end-busy and stops triggers, acting as a global disable
- Cons:
  - The actual L3 Disable line is not used, so L3DAQ backup cannot easily be separated out (an accounting issue)
  - The shifter must check the L3DAQ monitoring every time the TFW crate is 100% front-end-busy
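
In outline, the disable path is just a flag that gates the TFW-crate readout; a hedged sketch of its shape (the real mechanism is the VME readout itself, which this only imitates):

    // Sketch of the L3-disable path: halting the TFW crate readout
    // drives it 100% front-end-busy, which stops triggers globally.
    #include <atomic>
    #include <cstdio>

    std::atomic<bool> tfwReadoutEnabled{true};

    void updateMonitoring(bool disabled) {
        // Stand-in for the RM monitoring update the slide mentions.
        std::printf("L3 disable %s\n", disabled ? "asserted" : "released");
    }

    void onNoNodesReady() {          // no farm node can take events
        updateMonitoring(true);      // make it visible to shifters first
        tfwReadoutEnabled = false;   // SBC stops reading the TFW crate
    }

    void onNodeReady() {             // a node freed a buffer
        if (!tfwReadoutEnabled) {
            tfwReadoutEnabled = true;
            updateMonitoring(false);
        }
    }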

Slide 12: Monitoring
- All processes (SBC, RM, supervisor, EVB) produce monitoring info
- Monitor clients:
  - l3xmon: ScriptRunner status (processing, flattening, sending, …)
  - uMon is your friend: SBC, RM, and some supervisor info; EVB info (missing crates, input rate); send comments to …
  - l3xqt: mostly expert info, but the Daqmon info (L1/L2 busy fractions) is useful to shifters
(The quantities these clients display are sketched below.)
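
The displayed quantities suggest a simple per-process snapshot. Purely as a hypothetical shape (this is not the uMon or l3xqt wire format):

    // Hypothetical monitoring snapshots for the quantities named above.
    #include <cstdint>
    #include <vector>

    struct SbcMon {                       // from each SBC process
        uint32_t fragQueueDepth;          // Event Fragment Queue occupancy
        uint32_t routeQueueDepth;         // Route Queue occupancy
        uint32_t fragmentsDropped;        // timed-out or mismatched fragments
    };

    struct EvbMon {                       // from each farm node's EVB
        std::vector<uint16_t> missingCrates;
        double inputRateHz;
    };

    struct DaqMon {                       // Daqmon info shown by l3xqt
        double l1BusyFraction;
        double l2BusyFraction;
    };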

Slides 13-17: [screenshots only, presumably of the monitoring displays; no transcript text]

Slide 18: Scripts
- Online scripts:
  - l3xreset: can restart farm-node processes (EVB and ScriptRunner simultaneously); can also restart the Supervisor process (on node d0lxmast)
  - l3xdaq_reset: restarts the SBC processes (incl. the RM)
- SBC scripts:
  - is_crate_requesting_readout.sh (is SlaveReady asserted?)
  - getInfo.sh (SBC driver stats and status)
  - reset_all.sh (restarts processes)

Slide 19: L3DAQ Shifter Webpage
- "What To Do When": extremely useful; frequently updated???
- On-call schedule
- uMon documentation!!!
- Logfile access

Slide 20: Typical Problems
- Prescales set for < 10 Hz: unpredictable results (all crates missing)
- Not enough nodes in the run: L3 disables (TFW 100% FEB); are SMT/CFT/CAL in full-readout mode?
- A farm node loses its connection to an SBC: the crate goes missing (reset the farm node to fix)
- A crate 100% missing: almost always a component problem; check the Route and Event queue state in uMon

Slide 21: Improvements to Come…
- Phase out l3xmon: it currently uses a separate monitor server; move its features into uMon
- Trigger Crate: set the L3 disables correctly