Kraków4FutureDaQ, Institute of Physics & Nowoczesna Elektronika. P. Salabura, A. Misiak, S. Kistryn, R. Trębacz, K. Korcyl & M. Kajetanowicz. Discrete event simulations.


Kraków4FutureDaQ, Institute of Physics & Nowoczesna Elektronika. P. Salabura, A. Misiak, S. Kistryn, R. Trębacz, K. Korcyl & M. Kajetanowicz. Discrete event simulations. Prototype board construction (Drift Chambers, TOF wall in the Forward detector of PANDA).

Modeling DAQ (Krzysztof Korcyl & Radosław Trębacz)
The PTOLEMY environment (classic version):
– DE (Discrete Event) domain: the simulation program maintains a time-ordered list of the moments at which the modelled system (or part of it) is allowed to change state.
– C++ to build models of components.
– Tcl-like scripts to connect components and form architectures.
– ROOT to process the proprietary ASCII files with results.
– Easy and quick to start: unified and simple interface between components.
– Substantial expertise (K. Korcyl): modelling a TDAQ system (Linux PCs + a large GE network) for the LHC experiment ATLAS.
Other environments: SystemC (?) – need evaluation of overheads.
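The DE-domain mechanism described above can be sketched in a few lines of C++. This is a minimal illustration of the idea (not Ptolemy's actual API): the kernel keeps a time-ordered list of pending events and advances simulated time by popping the earliest one.

```cpp
#include <functional>
#include <queue>
#include <vector>

// Minimal discrete-event kernel: a time-ordered queue of the moments
// at which the modelled system is allowed to change state.
struct Event {
    double time;                   // simulated time of the event
    std::function<void()> action;  // state change to apply at that moment
    bool operator>(const Event& o) const { return time > o.time; }
};

class DiscreteEventSim {
    std::priority_queue<Event, std::vector<Event>, std::greater<Event>> queue_;
    double now_ = 0.0;
public:
    double now() const { return now_; }
    void schedule(double delay, std::function<void()> action) {
        queue_.push({now_ + delay, std::move(action)});
    }
    // Run until the event list is exhausted, always advancing to the
    // earliest scheduled moment first.
    void run() {
        while (!queue_.empty()) {
            Event ev = queue_.top();
            queue_.pop();
            now_ = ev.time;
            ev.action();
        }
    }
};
```

In this picture, a component model is just an object whose actions schedule further events (possibly on other components); connecting components into an architecture amounts to wiring those callbacks together.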

Modeling DAQ - II
Modeling steps:
– Questions: specify the questions/issues of interest. "How does it work?" is not a workable question; rather: what is the maximum throughput, how long do the queues get, etc.
– Parameterization: simplify the components as much as possible, but with sufficient detail to reproduce the behavioural aspects relevant to the issues studied. Each model has a list of measurable parameters.
– Calibration: collect values for the model parameters. For software processes: instrument the software with time stamps and run dedicated test measurements. For hardware components: use oscilloscopes and logic analysers in dedicated setups.
– Validation: model the test setups using the parameterized models, cross-check against measurements, and refine the parameterization and/or calibration if necessary.
– Prediction: predict the performance of the full-size system at nominal rates; evaluate various architectures and the impact of possible policies on the overall system performance.
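The calibration step for software components can be as simple as bracketing the code of interest with high-resolution time stamps. A sketch using std::chrono (illustrative, not the actual instrumentation used in the study):

```cpp
#include <chrono>

// Time-stamp instrumentation for calibrating a software model parameter:
// measure how long one processing step takes, in microseconds.
template <typename F>
double time_step_us(F&& step) {
    auto t0 = std::chrono::steady_clock::now();
    step();                                    // the code being calibrated
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::micro>(t1 - t0).count();
}
```

Running such a probe many times under a dedicated test load gives the distribution from which the model's "compute time" parameter is taken.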

Ring Architecture (diagram): Source → Node 1 → Node 2 → … → Node N-1 → Node N → Sink.

Processing Node (diagram): GBitEth In → Intelligent Switch → GBitEth Out; FIFOs couple the switch to the Compute Engine, which raises a Busy signal back to the switch.

Operation
– Data is transported between nodes via UDP (no check for packet loss or transmission errors).
– If the local compute resource is idle, the raw input data is sent to it for processing and converted into processed data.
– If the local compute resource is busy, the data packet is forwarded to the next node in the chain.
– Forwarding of processed data from the compute engine has priority over transporting raw data.
– The switch can transport in parallel: raw data from the input to the compute resource, and processed data to the output.
– At some point, packets will be lost, either because they were forwarded to the sink without being processed, or because of the network's limited throughput.
Questions:
– What is the packet-loss rate as a function of the system parameters?
– What is the FIFO occupancy in each of the processing nodes?
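The per-node policy above can be condensed into one decision function. The sketch below uses illustrative names and an explicit FIFO in front of the compute engine; it is a model of the slide's rules, not the firmware:

```cpp
#include <cstddef>
#include <deque>

// Per-node policy: raw data is accepted for local processing when the
// compute engine is idle and the FIFO has room, and forwarded to the
// next node in the chain otherwise; finished packets leave as
// processed data, which has priority on the output link.
struct Packet { bool processed; std::size_t bytes; };

class ProcessingNode {
    std::deque<Packet> fifo_;     // input FIFO in front of the compute engine
    std::size_t fifo_capacity_;
    bool busy_ = false;
public:
    explicit ProcessingNode(std::size_t fifo_capacity)
        : fifo_capacity_(fifo_capacity) {}
    bool busy() const { return busy_; }
    // Returns true if the packet was accepted for local processing,
    // false if it must be forwarded to the next node.
    bool on_raw_packet(const Packet& p) {
        if (busy_ || fifo_.size() >= fifo_capacity_)
            return false;         // engine busy or FIFO full: forward
        fifo_.push_back(p);
        busy_ = true;             // engine starts working on this packet
        return true;
    }
    // Called when the compute engine finishes: emit the processed packet
    // and become idle again.
    Packet on_compute_done() {
        Packet raw = fifo_.front();
        fifo_.pop_front();
        busy_ = false;
        return Packet{true, raw.bytes};  // same size assumed here
    }
};
```

A packet that traverses the whole ring without any node accepting it reaches the sink unprocessed, which is exactly the loss mode the simulation is meant to quantify.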

Initial Set of Parameters
– Raw data packet size (< 1500 bytes)
– Processed data packet size (< 1500 bytes)
– FIFO size (< 64 KBytes)
– Compute time (> 1 µs)
– Load (generated by the source: < 100% of GE)
– Number of nodes
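For a simulation sweep it is convenient to gather this parameter set in one structure with a sanity check against the quoted limits. The example values below are purely illustrative, not results from the study:

```cpp
#include <cstddef>

// The initial parameter set of the ring model, in one place.
struct SimParameters {
    std::size_t raw_packet_bytes;        // < 1500 (fits one Ethernet frame)
    std::size_t processed_packet_bytes;  // < 1500
    std::size_t fifo_bytes;              // < 64 KBytes
    double compute_time_us;              // > 1 us
    double source_load_fraction;         // fraction of GE line rate, < 1.0
    int num_nodes;
};

// Rough sanity check against the limits quoted on the slide.
inline bool valid(const SimParameters& p) {
    return p.raw_packet_bytes < 1500 && p.processed_packet_bytes < 1500 &&
           p.fifo_bytes < 64 * 1024 && p.compute_time_us > 1.0 &&
           p.source_load_fraction < 1.0 && p.num_nodes > 0;
}
```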

TDC for Future DAQ?
– Multichannel (32), multi-hit devices (internal memories): HPTDC (CERN), TDC-F1 (acam), ...
– Variable resolution, e.g. HPTDC: 785 ps, 195 ps, 98 ps, 25 ps (LSB); measurement w.r.t. a free-running clock, self-calibration.
– Filtering of hits according to a trigger-matching mechanism: trigger latencies up to 50 µs, handling of overlapping triggers.
– Input hit rates up to a few MHz/channel.
– High output data rates: MHz clock, 8-32 bit parallel output.
 Does it fit our requirements?
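The trigger-matching mechanism mentioned above can be illustrated schematically: for each trigger, the TDC keeps only the hits whose time stamps fall inside a match window placed one trigger latency before the trigger time. The sketch below uses illustrative names and units and is not the HPTDC register model:

```cpp
#include <vector>

// Schematic trigger matching: keep only the hits whose time stamps fall
// inside [trigger_time - latency, trigger_time - latency + window).
std::vector<double> match_trigger(const std::vector<double>& hit_times_us,
                                  double trigger_time_us,
                                  double latency_us,
                                  double window_us) {
    const double t0 = trigger_time_us - latency_us;  // window start
    std::vector<double> matched;
    for (double t : hit_times_us)
        if (t >= t0 && t < t0 + window_us)
            matched.push_back(t);
    return matched;
}
```

With a buffered, multi-hit TDC this filtering is what allows long trigger latencies (up to tens of µs) while emitting only the hits belonging to each accepted trigger.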

Many different timing detectors, e.g. PANDA: scintillators, drift chambers; CBM: RPCs (0.03 ns resolution needed).
Forward detector (drift chambers to be built in UJ Kraków): hit rates up to 0.3 MHz, 1 ns resolution, ns time range, 6k wires. Trigger rates?
PANDA: many different trigger types, trigger latencies?, reaction rates of up to 10^7 reactions/s.
Intermediate step (?): TOF system for the HADES RPC (M. Kajetanowicz):
– 4 TDCs/board: time, Time over Threshold
– Fast Ethernet interface: ETRAX, 100 Mbit/s
– Nodes connected through a 1 Gigabit Ethernet switch

Data-driven TDC architecture (diagram): 8 × groups of 4 channels, each group with a trigger FIFO (depth 16).