RPC Readiness for Data-taking (RPC Collaboration)

RPC Hardware

Forward:
- 18/432 chambers are in single-gap mode due to HV problems
- 1 chamber is disconnected due to an HV problem
- 1 chamber is disconnected due to an LV problem
- 7 chambers have no I2C communication (no threshold control)

Barrel:
- 5/1020 double-gap modules are in single-gap mode due to HV problems
- 15/480 chambers have no I2C communication (no threshold control). Of these, 11 can be controlled through the DT redundancy line; the remaining ones are set to the default threshold but are currently masked.

RPC Firmware

CAEN: the firmware for the HV and LB boards is stable.

Control Board: new firmware recently installed to avoid loss of communication after CCU errors.

Trigger Board: LHC patterns uploaded; they also accommodate a cosmic pattern with minimal Pt code and quality. A bug related to a wrong forward endcap geometry is still present in the patterns: the geometry fix will first be inserted in CMSSW, then new patterns will be computed and uploaded. The new firmware is expected to be ready in January. Some loss of efficiency (a few %, under investigation) could arise in RE1/2.

RPC Firmware (continued)

PAC:
- Bug in the logic cone definition (one strip of each chamber not connected to any logic strip): corrected.
- PAC latency increased by 1 BX.
- New feature: the PACs can look for hit coincidences in more than one BX (up to 3), the "BX OR" extended coincidence.
- Known bugs (wrong sign-bit definition; incorrect shape of some patterns): new firmware loaded.

New firmware for Final Sort: the latency of endcap muons was 2 BX larger than for the barrel: corrected.

New firmware for Half Sort: phi ghost-busting between Trigger Crates was not working (almost) at all: new firmware applied and preliminarily tested.
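The "BX OR" extended coincidence described above can be illustrated with a minimal sketch: a pattern is accepted if the required chamber planes fire within a window of up to 3 consecutive bunch crossings rather than in a single BX. All names here are illustrative, not the actual PAC firmware logic.

```python
# Illustrative sketch of the "BX OR" extended coincidence: accept a
# candidate if enough of the required planes have hits within a sliding
# window of consecutive bunch crossings. Hypothetical, not PAC firmware.

def pattern_fires(hits_by_bx, required_planes, window=3, min_planes=3):
    """hits_by_bx: dict mapping BX number -> set of planes with a hit."""
    for start in sorted(hits_by_bx):
        planes = set()
        for bx in range(start, start + window):
            planes |= hits_by_bx.get(bx, set())   # OR the hits over the window
        if len(planes & required_planes) >= min_planes:
            return True
    return False

# A muon whose hits are spread over two neighbouring BXs still triggers
# with window=3, but not with a strict single-BX coincidence (window=1):
hits = {100: {1, 2}, 101: {3, 4}}
```

This is why the trigger stays close to fully efficient even before the chambers are well synchronized, at the price of a wider coincidence window.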

RPC Online Software

- Using XDAQ version 7. We would prefer not to rush an upgrade to version 9 unless this is required by CMS.
- Automatic procedures to recover the CCU rings are being implemented in the online software.
- Threshold, RBC and TTU configuration from the database is ongoing.
- TTU and RBC control through the Trigger Supervisor is still to be tested.
- Review of the LB setup procedure to speed it up: ready next week.
- Implement a warm setup procedure: do not re-configure hardware that is already in the "ready" state. A few weeks needed.
- Extensive work to have the Trigger Supervisor control the full configuration phase is in progress: ready next year.
- Improvement of the LB monitoring efficiency and start/stop time, necessary for the TS-controlled configuration regime. A serious change in the software: ready next year.

Task force at work on the configuration: Karol + Filip + Nicolay + Andres + Mikolaj + Krzysztof
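The "warm setup" idea above can be sketched in a few lines: boards already in the READY state are skipped, so only the hardware that actually needs it pays the full configuration cost. The board objects and method names are hypothetical placeholders, not the real RPC online software API.

```python
# Minimal sketch of the warm setup procedure described above: reconfigure
# only boards that are not already READY. Names are illustrative only.

def warm_setup(boards):
    """boards: objects with .name, .state, and a .configure() method."""
    reconfigured = []
    for board in boards:
        if board.state != "READY":
            board.configure()            # full (cold) configuration
            board.state = "READY"
            reconfigured.append(board.name)
        # boards already READY are left untouched, saving setup time
    return reconfigured
```

The design point is idempotence: running the setup twice in a row does nothing the second time, which is what makes frequent reconfiguration cycles cheap.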

RPC Operation Crew

- Shifters in the P5 control room: operation and monitoring.
- Shifter at the CMS Centre: offline DQM and prompt analysis.
- 3 shifters/day at P5, 1 shifter/day at the CMS Centre.
- 1 shift leader, 24h/24h.
- 5 experts on call, 24h/24h.

All shifts are covered until the end of the year.

RPC Operation at P5

Procedure for configuration:
1. LV ON, HV ON
2. LB configuration
3. Threshold setting
4. LB synchronization
5. HV STAND-BY

For the November run we plan a partially manual configuration that will take some time (about 40 minutes). Once the configuration through the Trigger Supervisor is ready, this will go down to a few minutes.
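The five-step procedure above is strictly ordered, so it can be sketched as a fixed command sequence that aborts on the first failure. The step names mirror the slide; the executor function is a hypothetical placeholder, not a real online-software call.

```python
# Sketch of the ordered configuration procedure listed above. The
# executor is a placeholder for whatever actually drives the hardware.

CONFIG_SEQUENCE = [
    "LV_ON",
    "HV_ON",
    "LB_CONFIGURATION",
    "TH_SETTING",
    "LB_SYNCHRONIZATION",
    "HV_STANDBY",
]

def run_configuration(execute):
    """Run each step in order; stop immediately if a step fails."""
    done = []
    for step in CONFIG_SEQUENCE:
        if not execute(step):
            raise RuntimeError(f"configuration failed at step {step}")
        done.append(step)
    return done
```

Encoding the order in one list makes the manual and the future Trigger-Supervisor-driven procedures trivially consistent: both walk the same sequence.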

RPC Operation at P5: Shifter Tasks

1. DCS monitoring via the PVSS panel (HV, I, T, ...)
2. XDAQ monitoring (noise, trigger rates)
3. Monitoring of the chamber and trigger rates
4. Online DQM
5. Filling in a run report

Documentation is available at:

RPC DCS: General Detector View

- Global performance, barrel + endcap.
- Global FSM states for barrel, endcap, hardware, and the gas system.
- Percentage of hardware OK, with a trend to spot changes over the shift.
- In case of an error condition, the information about the problem, the time stamp, the details, and the number to call are displayed.
- Panic buttons.
- List of known problems to be cross-checked in case of alarm.
- 8 error conditions flagged by red LEDs on critical subsystems.

RPC DCS

- All the information is summarized in one interface for the central shifter: not too technical, so that all possible problems are easily spotted by non-RPC experts.
- The DCS system was reviewed at the beginning of October.
- The DSS action matrix is implemented and its validation is ongoing.
- Documentation...?
- Working conditions can be monitored via histograms for all the sensitive parameters; global trends can also be monitored with history plots.

RPC Problems and Actions

LV channels:
- If some channels are off, try the ON command to restore them (only once).
- If one or more channels are in error, call the RPC shift leader.

HV channels in trip:
- If fewer than 10 channels (1%) in total, try to restore them (only once). If the trip persists, leave the channels OFF and disable them in the DCS.
- If more than 10 channels in total, call the RPC shift leader.

Gas alarm:
1. Check the gas flow.
2. Kill/acknowledge (if allowed).
3. Put HV to STAND-BY.
4. Put HV ON.

Under investigation: the possibility that the gas alarm puts the detector in STAND-BY (OFF after 1 hour). Discussion ongoing on how (and by whom) this is handled.
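The HV-trip rule above is a small decision procedure, sketched below under the slide's numbers: fewer than 10 tripped channels get one restore attempt each, persistent trips are left OFF and disabled in the DCS, and anything larger is escalated. The function and its arguments are illustrative, not a real DCS interface.

```python
# Hedged sketch of the HV-trip recovery rule from this slide. The
# restore callable stands in for the (hypothetical) DCS restore command.

def handle_hv_trips(tripped, restore):
    """tripped: channel ids in trip; restore: callable, True on success.

    Returns the escalation message, or the list of channels to be
    left OFF and disabled in the DCS after one failed restore attempt.
    """
    if len(tripped) >= 10:                # more than ~1% of channels
        return "call RPC shift leader"
    to_disable = []
    for channel in tripped:
        if not restore(channel):          # try to restore only once
            to_disable.append(channel)    # leave OFF, disable in DCS
    return to_disable
```

Keeping the "only once" rule in code rather than shifter memory is the point: a second restore attempt on a persistently tripping channel is exactly what the procedure forbids.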

RPC DQM: Reference Histogram Updates

- The decision is taken by the RPC shift leader, who evaluates whether the appearance of a problem requires an update of a reference histogram, also considering how soon the problem is expected to be solved.
- The RPC shift leader should contact the RPC DQM expert, who will promptly load the new reference into the DQM database.
- The update is then discussed and validated at the weekly RPC Run meeting.
- In case of DQM warnings, the central shifter should report the issue to the RPC shift leader.

Online DQM documentation is available at:
Offline DQM documentation is available at:

RPC DQM

RPC Operation at the CMS Centre: Prompt Feedback on Performance

- Interface to analysis submission.
- The AlCaRPC Express Stream is used.
- The RPCMon Primary Dataset is still to be fully validated.
- A dedicated TTU dataset needs to be defined.

Several established analyses are ready to run on demand:
- Offline noise rates
- Chamber performance
- Trigger performance

Condition-data analysis (dark currents, temperature, gas flow, etc.) is under major review.

RPC Operation at the CMS Centre: Performance Monitoring vs Time

- Graphical interface to easily spot problematic chambers.
- All DQM histograms available from the synoptic view.
- Historical DQM.
- DQM/CAF, allowing an even more flexible use of the DQM.
- The Bari Tier-2 will store all relevant data.
- Skim definition under construction.
- Move to CRAB bulk submission, subscribing the RPC skim (beta functionality).

RPC Data Analysis Model

Our data-analysis model does not rely on the concept of a run; technically it can be geared to work on a lumi-section basis at high luminosity. The issue is statistics.

Early analysis: despite the low 2009 statistics, we aim to give a green light on results "just the day after".

Dedicated analysis team:
1. Standard DQM/prompt-analysis results with different granularity.
2. Summary plots a la the CRAFT08 paper.

A procedure to endorse results within the RPC community, and the early results themselves, are under construction.

RPC Beam Splash

Barrel: OFF. The splash could be useful for the endcaps: time synchronization chamber by chamber (internal) and a preliminary LHC synchronization.

At the beginning, only two endcap towers (downstream) in readout and trigger, at 8.8 kV (low efficiency).

Monitor:
- Currents from the DCS
- LV currents from the DCS
- Occupancy from the DQM
- Noise (from the LB histograms)
- Trigger rate

If there are no problems, turn on the full endcap system for the next splash.

RPC Operation Cycle

Definition of the RPC STAND-BY state:
- HV ON at 7 kV
- LV FEB ON
- LV LB ON

During the interfill, with HV at STAND-BY, check:
- the CCU rings
- the optical-link synchronization
- the LB monitoring (to see whether the readout is effective)
- load new masks, if needed (a new LB configuration is then necessary)
- load new thresholds, if needed

Then set HV to ON. Going from STAND-BY to ON takes about 3-4 minutes.

RPC Synchronization at Start-up

Two options are possible:
- Start with the current cosmic synchronization (Link Board settings).
- Calculate the settings from the muon time of flight from the vertex; we will try to do this next week.

In both cases we can configure the PACs to look for hit coincidences in a few consecutive BXs (2 or 3), the "BX OR" extended coincidence feature. The PAC trigger is then ~100% efficient even when not well synchronized. By analyzing the recorded (DAQ) data we can then calculate the corrections; this procedure has been tested with cosmic muons.

We can compute the corrections from about 10 events per Link Board. With 1500 LBs and 5 (3) hits per muon, and assuming a flat distribution in eta (which is not the case, due to the different penetration lengths), we need about 10^4 muons.
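The 10^4-muon estimate above can be checked with back-of-envelope arithmetic, under the slide's own simplifying assumptions: roughly 10 events are needed per Link Board, there are 1500 Link Boards, and each muon leaves hits in about 3-5 of them. The interpretation of "events per Link Board" as LB-level hit samples is my reading of the slide, not a statement from it.

```python
# Back-of-envelope check of the statistics estimate, using only the
# numbers quoted on the slide. Taking the conservative end of the
# "5 (3) hits/mu" range; the flat-eta assumption is acknowledged on
# the slide itself to be optimistic, which pushes the total upward.

events_per_lb = 10       # corrections computable from ~10 events per LB
n_link_boards = 1500     # total Link Boards
hits_per_muon = 3        # lower end of 5 (3) hits per muon

lb_events_needed = events_per_lb * n_link_boards   # total LB-level samples
muons_needed = lb_events_needed / hits_per_muon    # each muon feeds ~3 LBs
print(muons_needed)
```

With the optimistic flat-eta assumption this lands at a few thousand muons; accounting for the uneven eta illumination brings the requirement to the quoted order of 10^4.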

RPC Operation Model

A model with two full shifts dedicated to "playtime" every week, and possibly 2-3 days every month for a longer maintenance period, is fine.

The two weekly "playtime" shifts could be dedicated to:
- noise and trigger studies (also with cosmics and the TTU trigger)
- revising the noisy channels
- software/firmware bug fixes and tests
- establishing the detector performance at different HV/threshold settings

The 2-3 day maintenance period could be dedicated to:
- software bug fixing
- major configuration-software updates, if needed
- off-detector electronics maintenance

RPC Conclusions

- The RPC hardware is fairly ready.
- For this year some configuration is still manual; an important effort is ongoing to have it fully controlled by the Trigger Supervisor.
- The DCS/DQM are in good shape, but some procedures still need to be refined, and expert presence is really needed throughout the entire shift.
- Documentation and procedures need to be validated.
- The interface between P5 and the CMS Centre is still to be improved.
- The prompt-analysis tools are ready and were validated during CRAFT09.
- Experts and shifters are ensured for 2009.