

DAQ Status and Plans

DAQ
 – Timing tuning
 – Which tools to learn (SC, Event Monitor, GPIO, MIDAS, DIP, laser, storage, analysis)
Plan with beam

LHCf group meeting in Catania, 4-6 July 2009, Sako

Timing Tuning

[Diagram: signal path from the BPTX ( m from IP) through the ATLAS rack (BPTX logic, FANOUT, CTP, LTP, BC 40 MHz clock), with measurement points labelled A, B, C, D]
 – T_A-B = 752 ns was measured on 10-Sep-2008
 – T_A-C and T_B-C must be measured, but should be similar to T_A-B and to 0, respectively
 – T_A-D must be measured
 – T_A1-A2 is measured with actual beams; the cable length is adjusted accordingly
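For orientation only (not from the slides): a short Python sketch converting a timing offset into an approximate coaxial cable length, assuming a typical velocity factor of about 0.66; the real cable delays would of course be measured.

```python
# Rough sketch (not from the slides): convert a required timing offset into
# an equivalent coaxial cable length. The propagation velocity is an assumed
# typical value (~66% of c, i.e. about 0.2 m/ns); real cables must be measured.

C_M_PER_NS = 0.299792458          # speed of light in m/ns
VELOCITY_FACTOR = 0.66            # assumed, typical for coaxial cable

def delay_to_cable_length(delay_ns: float) -> float:
    """Cable length (m) that introduces the given delay (ns)."""
    return delay_ns * C_M_PER_NS * VELOCITY_FACTOR

if __name__ == "__main__":
    # Example: the measured T_A-B of 752 ns corresponds to roughly 150 m of cable
    print(f"752 ns ~ {delay_to_cable_length(752):.0f} m of cable")
    # A 5 ns trim between Arm1 and Arm2 timing would be about 1 m
    print(f"  5 ns ~ {delay_to_cable_length(5):.1f} m of cable")
```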

Best estimate timing table (in LHCf Wiki; Cablings)

[Table: best-estimate timings for BPTX, Arm1 calo, Arm2 calo, Arm1 FC, Arm2 FC and GPIO; entries in red are tuned with real BPTX signals]

[Diagram: measurement points A and B in the LHCf rack]
Inter-module delays in the LHCf rack were measured.
Note: for the FC events taken in 2008, the timing was not yet optimized.

Which tools must operators learn?
 – DIP: to receive accelerator info and to send LHCf info
 – Slow Control: for setup (HV, LV, manipulator, temperature)
 – Trigger setting and timing tuning with GPIO (FPGA module)
 – MIDAS: for data acquisition
 – (Threshold setup)
 – Event Monitor
 – Alert System
 – Quick analysis (event rate, position, pizero)
 – Data storage
 – Laser calibration
The usage manual for 2008 is available in Subversion.

Actual Operation

[Diagram: actual operation setup in the LHCf CR, showing lhcfds1 (MIDAS server), lhcfdaq5 (frontend1), lhcfdaq2 (frontend2), lhcfdaq3 (slow control), lhcfdaq4 (analyzer1), lhcfdaq6 (analyzer2), lhcfmon1 and lhcfmon2 (event monitors), lhcfdaq7, GPIO, VME1 and VME2 crates, HV, LV and manipulator controls, and the DIP link]

[Diagram: DIP data flow, components numbered ①-⑤]
① The DIP server publishes the contents of the text files.
② The simple Event Monitor updates the text file; using the manipulator position, the beam position is calculated.
③ SlowControl-DIP reads the manipulator position from the slow-control server and updates the text file.
④ SC-DIP reads DIP info as a DIP client and records it in the slow-control data.
⑤ An independent DIP client reads DIP info, displays it on a monitor and records it in a text file.

Handshake
 – SC-DIP detects the injection alert
 – SC-DIP moves the detector to the garage position
 – SC-DIP writes 'ready' in the text file
 – The DIP server answers 'ready'

LHCf Status
 – The status (TUNING, PHYSICS_RUN, etc.) is defined by the operator and can be published using a CUI tool.
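A minimal sketch of the file-based handshake listed above, written in Python. The file names, polling interval and move_to_garage() helper are hypothetical placeholders; the actual SC-DIP implementation is not shown in these slides.

```python
# Minimal sketch of the SC-DIP injection handshake described above.
# File names, polling interval and move_to_garage() are assumptions for
# illustration; they are not taken from the LHCf software.
import time
from pathlib import Path

ALERT_FILE = Path("dip_injection_alert.txt")   # hypothetical: written on the DIP side
STATUS_FILE = Path("lhcf_handshake.txt")       # hypothetical: read back by the DIP server

def move_to_garage() -> None:
    """Placeholder for the slow-control command that parks the detector."""
    print("moving detectors to garage position...")

def handshake_loop(poll_s: float = 1.0) -> None:
    while True:
        # 1. SC-DIP detects the injection alert
        if ALERT_FILE.exists() and ALERT_FILE.read_text().strip() == "INJECTION":
            # 2. SC-DIP moves the detector to the garage position
            move_to_garage()
            # 3. SC-DIP writes 'ready' in the text file;
            #    the DIP server then answers 'ready' on its side
            STATUS_FILE.write_text("ready\n")
        time.sleep(poll_s)
```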

LHCf DIP
What do we publish on DIP? See the EDMS document of LHCf DIP.

SC-DIP Client Contents
What we subscribe to from DIP (see DIP-machine.pdf), plus all LHCf DIP info.
Handshake
 – LHC_ADJUST
 – LHC_BEAMDUMP
 – LHC_INJECTION
Beam
 – Beam/Energy
 – Beam/BPM/verticalPos, horizontalPos, bpmNames
 – Beam/Intensity/Beam1(2)/A(B)/arbiterFlag, totalIntensity
 – BRAN (not defined in 2008)
 – Beam/IntensityPerBunch/Beam1(2)/A(B)
 – Average 2D beam size
Run Control
 – RunControl/BeamMode
 – RunControl/MachineMode
 – RunControl/SafeBeam/Beam1(2)
Postmortem
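As an illustration, the publication names above could be collected into a single subscription list for the SC-DIP client. This is a sketch only: the subscribe() call is a placeholder rather than the real CERN DIP API, and only the Beam1/A variants of the Beam1(2)/A(B) items are spelled out.

```python
# Sketch of the DIP publications the SC-DIP client could subscribe to,
# grouped as on this slide. subscribe() is a hypothetical placeholder; the
# real CERN DIP client API is not reproduced here.
SUBSCRIPTIONS = {
    "Handshake":  ["LHC_ADJUST", "LHC_BEAMDUMP", "LHC_INJECTION"],
    "Beam":       ["Beam/Energy",
                   "Beam/BPM/verticalPos", "Beam/BPM/horizontalPos", "Beam/BPM/bpmNames",
                   "Beam/Intensity/Beam1/A/arbiterFlag", "Beam/Intensity/Beam1/A/totalIntensity",
                   "Beam/IntensityPerBunch/Beam1/A"],
    "RunControl": ["RunControl/BeamMode", "RunControl/MachineMode",
                   "RunControl/SafeBeam/Beam1", "RunControl/SafeBeam/Beam2"],
}

def subscribe(topic: str) -> None:
    """Placeholder: register a DIP subscription and record updates in the slow data."""
    print(f"subscribing to {topic}")

for group, topics in SUBSCRIPTIONS.items():
    for topic in topics:
        subscribe(topic)
```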

Slow Control (details given by Lorenzo)
Control through text/graphical clients:
 – CAEN crate (HV, LV)
 – Agilent power supply
 – PCI ADC board (manipulator position, temperature)
 – SC-DIP
 – Arm2 temperature
Default actions available:
 – Manipulator (garage, beam center, etc.)
 – PMT gain (high/middle/low/laser)
 – Sequence for power ON/OFF
Data is constantly recorded as part of the main data stream; SC is regarded as the 3rd front-end.

Trigger logic in GPIO

[Diagram: trigger logic, an OR of the laser signal and any combination of FC scintillators]
 – The latency from BPTX to L3T is fixed
 – Various default settings are defined
 – Operators upload a predefined file to GPIO via the command line
 – New settings can be generated via a CUI interface tool
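A sketch of the upload workflow described above, assuming a hypothetical preset directory and a hypothetical gpio_upload command-line tool; the real LHCf tool and file names are not given in these slides.

```python
# Sketch of the operator workflow described above: pick one of the predefined
# trigger-setting files and upload it to the GPIO module from the command line.
# PRESET_DIR and the 'gpio_upload' command are hypothetical placeholders.
import subprocess
from pathlib import Path

PRESET_DIR = Path("/lhcf/gpio/presets")        # assumed location of predefined files

def upload_trigger_setting(preset: str) -> None:
    settings_file = PRESET_DIR / f"{preset}.dat"
    if not settings_file.exists():
        raise FileNotFoundError(f"no predefined setting called '{preset}'")
    # hypothetical command-line upload step
    subprocess.run(["gpio_upload", str(settings_file)], check=True)

# e.g. switch between beam and laser trigger configurations
# upload_trigger_setting("beam")
# upload_trigger_setting("laser")
```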

MIDAS
lhcfds1
 – MIDAS server (BG)
 – Data logger (BG)
 – ODBEdit (CUI run controller: start, stop, run mode)
 – GPIO setup is necessary when switching between beam and laser
lhcfdaq5, lhcfdaq2, lhcfdaq3 (frontends)
 – Frontend1, 2, 3 (BG)
lhcfdaq4, lhcfdaq6 (analyzers)
 – Analyzer1, 2 (BG)
lhcfmon1, lhcfmon2 (event monitors)
 – Eventmonitor1, 2 (GUI)
 – Smidas-DIP1, 2 (BG)
Most of the processes run in the background (BG). Operators must make sure all BG processes are launched. A checklist of the BG processes (not only MIDAS) is in the operation manual (in Subversion).
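A possible pre-run check for the background processes listed above, as a Python sketch. The process names (mserver, mlogger, frontend1, ...) are assumptions based on this slide; the authoritative checklist is the one in the operation manual in Subversion.

```python
# Sketch of a pre-run check for the background processes listed above.
# The process names are illustrative guesses based on this slide; the real
# checklist lives in the operation manual in Subversion.
import subprocess

EXPECTED_BG = {
    "lhcfds1":  ["mserver", "mlogger"],          # MIDAS server and data logger (assumed names)
    "lhcfdaq5": ["frontend1"],
    "lhcfdaq2": ["frontend2"],
    "lhcfdaq3": ["frontend3"],
    "lhcfdaq4": ["analyzer1"],
    "lhcfdaq6": ["analyzer2"],
    "lhcfmon1": ["smidas-dip1"],
    "lhcfmon2": ["smidas-dip2"],
}

def check_host(host: str, processes: list[str]) -> None:
    for proc in processes:
        # 'pgrep -f' returns non-zero if no matching process is found
        result = subprocess.run(["ssh", host, "pgrep", "-f", proc],
                                capture_output=True)
        status = "OK" if result.returncode == 0 else "MISSING"
        print(f"{host:10s} {proc:12s} {status}")

if __name__ == "__main__":
    for host, procs in EXPECTED_BG.items():
        check_host(host, procs)
```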

MIDAS configuration

Data storage
 – ~20 kB/event, i.e. 10 MB/s = 36 GB/h = 864 GB/day
 – 1 TB x 6 RAID1 (1 TB x 3, mirrored) storage at the 1st stage
   – HD hot swap ~1/day
 – 6 TB RAID5 in the CR
   – Full after 24 h x 6 days of operation
   – Fast transfer for the offline analysis so as not to disturb the DAQ (Massimo?)
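A quick sanity check of the numbers above, as a Python sketch, assuming the quoted figures are bytes and the 10 MB/s rate is sustained.

```python
# Quick check of the storage arithmetic on this slide, assuming the figures
# are bytes (kB/MB/GB) and a sustained rate of 10 MB/s as quoted.
EVENT_SIZE_KB = 20
RATE_MB_S = 10                       # sustained data rate quoted on the slide

implied_rate_hz = RATE_MB_S * 1000 / EVENT_SIZE_KB
gb_per_hour = RATE_MB_S * 3600 / 1000
gb_per_day = gb_per_hour * 24
print(f"implied event rate ~ {implied_rate_hz:.0f} Hz")       # 500 Hz
print(f"{gb_per_hour:.0f} GB/h, {gb_per_day:.0f} GB/day")     # 36 GB/h, 864 GB/day

# Time until the 6 TB RAID5 in the CR is full (roughly consistent with the
# '24 h x 6 days' quoted on the slide)
raid5_gb = 6000
print(f"RAID5 full after {raid5_gb / gb_per_day:.1f} days of continuous running")
```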

Menu after beam
 – Timing tuning at the garage position (hopefully not only FC, but also the main detectors)
 – DAQ test and background study with a single beam (or two beams without collisions)
 – 450 GeV collisions (time? events?)
 – 3 TeV collisions (time? events?)
 – Laser run (frequency? time? events?)
 – Removal (if no further energy increase for 1 year)
 – Update and comeback

Run table (example)
The situation is very different from the 7 TeV run.
450 GeV
 – pizero is not detected
 – less science with gammas (very small aperture)
 – some science with neutrons (even elastic protons?)
 – n/gamma ratio?
 – HV (gain) scan necessary? (need <50 GeV?)
 – position scan necessary?
3 TeV
 – pizero is visible (statistics = dose); a position scan is not important
 – science with gammas/neutrons (statistics = dose)

Summary
 – Tools for data acquisition are ready
   – Finalization, tuning, documentation and training are necessary during August and September
 – Tools for offline analysis and the alert system are still to be prepared
   – Work has already started
 – We need requirements for the non-7 TeV runs from the science, calibration, redundancy and dose points of view
   – Model discrimination
   – Other criteria?