Pixel-Tilecal-MDT Combined Run
E. Pasqualucci, INFN Roma
September
A test with many expectations
- Organization of a common trigger and busy infrastructure across sub-detectors with different readout
- Integration of three different DAQ systems, all based on the current test-beam DAQ prototype (aka DAQ-1 and its evolution)
- Full DAQ/EF architecture implementation
  - Evaluation of online calibration performance
  - Storage of a data set to be used for later EF studies
- Exploitation of upstream and downstream tracking for calorimeter reconstruction?
Many contributions from Muons
- Ludovico Pontecorvo for infrastructure and trigger
- Paolo Branchini and Andrea Negri for the Event Filter
- Toni Baroncelli, Antonio Passeri & Mauro Iodice working on Calib
- Giuseppe Avolio for the DAQ
- Enrico Pasqualucci for the DAQ
- Manuela Cirilli for the CDR
- Matteo Beretta for the CSM
Detector setup
[Beam-line layout: Pixel beam telescope, Tilecal modules, muon chambers, scintillator dump]
- Pixel beam telescope (fast readout, ~125 μs busy)
- Tilecal: 2 barrel modules + 2 extended barrel modules
- MDT: 6 barrel chambers
Combined Trigger and Busy
- Need to cable trigger and busy lines across sub-detectors
  - Trigger on the Pixel scintillator
  - Trigger on the scintillator after Tile
  - Busy times: Muon ~200 μs, Pixel ~125 μs, Tile ~800 μs
- Master trigger issued by Pixel
  - The setup can be run selecting only muons, or muons + prescaled pions
- Busy lines from MDT and Tile sent to Pixel (logic sketched below)
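The veto logic can be pictured as follows: a minimal sketch, assuming a master trigger vetoed by the OR of the three busy lines and a simple pion prescaler. The names, the toy beam timing, and the prescale placement are illustrative assumptions, not the actual trigger hardware or DAQ-1 code.

```cpp
// Toy model of the combined trigger/busy logic described on the slide.
#include <algorithm>
#include <cstdio>

// Busy windows quoted on the slide, in microseconds.
constexpr double kMuonBusyUs   = 200.0;
constexpr double kPixelBusyUs  = 125.0;
constexpr double kTileBusyUs   = 800.0;
constexpr int    kPionPrescale = 5;   // accept 1 pion candidate in 5 (assumed)

struct Trigger {
    double busyUntilUs = 0.0;  // global busy = OR of all busy lines
    int    pionCount   = 0;
    long   accepted = 0, vetoed = 0;

    // A candidate at time t fires only if no sub-detector is still busy;
    // acceptance raises all three busy lines at once, so the longest
    // window (Tile) dominates the global dead time.
    bool fire(double tUs, bool isPion) {
        if (isPion && (++pionCount % kPionPrescale != 0)) return false;
        if (tUs < busyUntilUs) { ++vetoed; return false; }
        busyUntilUs = tUs + std::max({kMuonBusyUs, kPixelBusyUs, kTileBusyUs});
        ++accepted;
        return true;
    }
};

int main() {
    Trigger trg;
    // Toy spill: one candidate every 300 us, four pions per muon.
    for (long i = 0; i * 300.0 < 1e6; ++i)
        trg.fire(i * 300.0, /*isPion=*/i % 5 != 0);
    std::printf("accepted %ld, vetoed %ld\n", trg.accepted, trg.vetoed);
}
```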
DAQ
- Common configuration DB
- Network structure based on the muon model
  - MDT gigabit and fast ethernet switches
  - Private gigabit network for data
  - Fast ethernet network for control
- 6 detector-specific ROD emulators
- 3 detector ROSes
- Unique SFI
  - Muon SFI used for event building and as interface to the Event Filter
- Dedicated gigabit link to Meyrin
  - 9x2 processors available in building 513 for EF studies
[Network layout: private fast and gigabit ethernet networks plus CERN network uplink. Nodes: pctbmu01 (ONLWS), pctbmu02 (ROS1), pctiledaq (ROS3), pixdaq01 (ROS4), pctbmu04 (SFI/O), pctbmu05 (DFM), pctbmu06 (EF Farm 1), MAGNI (EF Farm 2), pctbmugw (gateway); 6 RIO2, 8 PCs, 1 SBC PC. A topology sketch follows.]
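The same fan-in can be written out explicitly. In this sketch the hostnames come from the diagram above, while the Node structure and the exact data paths are assumptions made for illustration only.

```cpp
// Sketch of the dataflow topology: three detector ROSes feed the single
// muon SFI, which feeds the two EF farms. Roles follow the diagram; the
// struct layout is an illustrative assumption.
#include <cstdio>
#include <string>
#include <vector>

struct Node { std::string host, role; std::vector<std::string> sendsTo; };

int main() {
    const std::vector<Node> topology = {
        {"pctbmu02",  "ROS1 (MDT)",                     {"pctbmu04"}},
        {"pctiledaq", "ROS3 (Tile)",                    {"pctbmu04"}},
        {"pixdaq01",  "ROS4 (Pixel)",                   {"pctbmu04"}},
        {"pctbmu04",  "SFI/O: event building + EF I/F", {"pctbmu06", "MAGNI"}},
        {"pctbmu05",  "DFM",                            {}},
        {"pctbmu06",  "EF Farm 1",                      {}},
        {"MAGNI",     "EF Farm 2",                      {}},
        {"pctbmu01",  "Online workstation (ONLWS)",     {}},
        {"pctbmugw",  "Gateway to CERN network",        {}},
    };
    for (const auto& n : topology) {
        std::printf("%-10s %-35s", n.host.c_str(), n.role.c_str());
        for (const auto& d : n.sendsTo) std::printf(" -> %s", d.c_str());
        std::printf("\n");
    }
}
```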
Integrated IGUI panels
[Screenshots: Tilecal configuration panel, Muon trigger configuration panel, Pixel configuration panel, Pixel textual monitor]
Event Filter
- Muon Processing Task working in the muon test beam
  - Pixel and Tile data could go through the muon Processing Task untouched
  - The event format is needed in order to modify the decoding routine (sketched below)
- Tile and Pixel processing?
  - To be unified in the Muon Calib program. Too ambitious?
- More emphasis on offline analysis and playback
  - Next year we can repeat the exercise, if possible, with a well-developed Processing Task
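The pass-through idea can be sketched as below: the muon Processing Task decodes only MDT fragments and forwards Pixel and Tile data unchanged. The fragment layout, detector IDs, and function names here are assumptions for illustration; they are not the actual event format or EF API.

```cpp
// Sketch: dispatch fragments on their source detector; non-muon
// fragments are copied to the output event untouched.
#include <cstdint>
#include <vector>

enum class SubDet : std::uint8_t { MDT, Tile, Pixel };

struct Fragment {
    SubDet source;
    std::vector<std::uint32_t> payload;
};

// Placeholder standing in for the existing muon decoding routine.
void decodeMuon(const Fragment&) {}

void process(const std::vector<Fragment>& event,
             std::vector<Fragment>& out) {
    for (const auto& f : event) {
        if (f.source == SubDet::MDT) decodeMuon(f);
        out.push_back(f);  // every fragment is kept, untouched
    }
}

int main() {
    std::vector<Fragment> event = {
        {SubDet::MDT,   {0xCAFE}},
        {SubDet::Pixel, {0xBEEF}},
    };
    std::vector<Fragment> out;
    process(event, out);  // the Pixel fragment passes through undecoded
}
```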
Sample online calib plots
What we learned
- Integration of detectors using:
  - Different hardware: RIOs (muons, Tile) and Concurrent Technologies boards (Pixel)
  - Different detector-specific software: ROD emulators, monitors at different levels
  - Different ROD-ROB links and protocols: S-link, TCP/IP over ethernet (see the sketch below)
  - Different data-taking strategies: ATLAS-like (muons, Tile) and asynchronous (Pixel)
  - Extended muon network structure: private fast and gigabit ethernet networks
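One way to picture how the heterogeneous ROD-ROB links coexist is a common read interface with per-transport implementations: S-link for the muon and Tile readout, TCP/IP over ethernet for Pixel. The class and method names below are assumptions for illustration; the real ROS code is not shown here.

```cpp
// Sketch: one abstract fragment source, two transports behind it.
#include <cstdint>
#include <vector>

class RodRobLink {
public:
    virtual ~RodRobLink() = default;
    // Blocks until one ROD fragment is available and returns its words.
    virtual std::vector<std::uint32_t> readFragment() = 0;
};

class SlinkInput : public RodRobLink {   // muons and Tile
public:
    std::vector<std::uint32_t> readFragment() override {
        return {};  // would read from the S-link driver here
    }
};

class TcpInput : public RodRobLink {     // Pixel, asynchronous readout
public:
    std::vector<std::uint32_t> readFragment() override {
        return {};  // would read from a TCP socket here
    }
};

// The ROS event loop sees only the common interface:
std::vector<std::uint32_t> nextFragment(RodRobLink& link) {
    return link.readFragment();
}
```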
What we learned (2)
- Online data-quality checks using:
  - Online monitoring tasks
  - Calib running on different detectors' data
- Online and dataflow:
  - Performance limit not reached
  - Several issues on system usability
  - Much feedback for the DAQ group
We have taken...
- Many runs for:
  - Configuration tests
  - Trigger setup
  - Event Filter studies (both old and new software)
- During the last night:
  - Stable runs
  - More than 2×10^6 events: muons only, and muons + pions (ratio 1/5)
Web page ready (to be updated soon…)
Conclusions
- The first integrated multi-detector run succeeded
- Interesting tests
  - Looking for problems… and solutions
  - A data set useful for Event Filter studies, DAQ studies, and preparation of future tools
- Studying solutions for the 2004 ATLAS barrel wedge run
  - See the BDG talk at the ATLAS Week (AW) in Clermont