
Design and Performance of the ATLAS Muon Detector Control System
A. Polini (on behalf of the ATLAS Muon Collaboration)
CHEP 2010, October 18-22, Taipei, Taiwan
Outline: Detector Description, System Requirements, Architecture and Special Features, Status and Performance, Future and Outlook

The ATLAS Detector
- 44 m length, 25 m diameter
- Barrel and endcap regions

ATLAS Muon Spectrometer
- Monitored Drift Tube (MDT): |η| < 2.7, 6 multilayers; precise spatial measurement
- Thin Gap Chamber (TGC): 1.1 < |η| < 2.4, 7 layers; trigger and readout
- Cathode Strip Chamber (CSC): 2.0 < |η| < 2.7, 4 layers; precise spatial measurement
- Resistive Plate Chamber (RPC): |η| < 1.05, 6 layers; trigger and readout
- Toroidal magnets (2~6 T·m bending power)

Detector Control System
- Four different detector technologies (CSC, MDT, RPC, TGC) with different characteristics, requirements and operating procedures
- System operated in a radiation area and in a strong B field
The Muon DCS is in charge of:
- Controlling the detector power system (chamber HV, frontend electronics LV)
- Reading and archiving all non-event-based environmental and conditions data
- Adjusting working-point parameters (HV, LV thresholds, etc.) to ensure efficient detector data taking
- Controlling and archiving data from the alignment system
- Configuring the frontend electronics (MDT)
- Providing coherent shifter and expert tools for detector monitoring, commissioning and maintenance
- Controlling which actions are allowed under which conditions, to prevent configurations potentially harmful to the detector
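The last task above, gating actions on external conditions, can be sketched as a default-deny whitelist. This is an illustrative toy (the command and LHC-state names are hypothetical, not the real DCS vocabulary):

```python
# Toy sketch of condition-based action gating: a command is executed only if
# it is explicitly allowed for the current LHC machine state.
# The (command, state) pairs below are invented for illustration.
ALLOWED = {
    ("GOTO_NOMINAL", "STABLE_BEAMS"): True,   # ramp HV up only with stable beams
    ("GOTO_NOMINAL", "INJECTION"): False,     # never at nominal HV during injection
    ("GOTO_STANDBY", "INJECTION"): True,      # standby is always a safe target
}

def action_allowed(command, lhc_state):
    # Default-deny: anything not explicitly whitelisted is refused.
    return ALLOWED.get((command, lhc_state), False)
```

The default-deny choice matters: an unforeseen combination of command and machine state is refused rather than executed, which is the safe failure mode for detector hardware.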

ATLAS Detector Control System
Hierarchical approach:
- Separation of frontend (process) and supervisory layers
- Commercial SCADA system (PVSS manager concept, ETM) + CERN JCOP Framework (LHC standard) + detector-specific developments
- Scalable and distributed; communication with hardware via OPC and TCP
- Interfaces to central databases (history archiving, conditions DB, configuration DB)
- DCS-to-DAQ communication

Finite State Machine (FSM) Concept
- Bringing/keeping the detector in the 'Ready-for-Physics' state involves many tens of thousands of hardware channels to control and monitor
- Abstract finite-state model adopted: summary information decouples hardware details and complex setting procedures from the shifter operation
- Tree structure, modeling the geometrical or functional granularity of each sub-detector
- Device Units and Control Units
- Command execution: from the top FSM nodes (ATLAS runs), or on individual devices/groups (debugging)
- Typically 100-1000 nodes per sub-detector
- States: READY, STANDBY, TRANSITION, SHUTDOWN, NOT_READY, UNKNOWN
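The tree of Control Units and Device Units can be sketched as follows. This is a minimal illustration of the summarizing idea, not the JCOP SMI++/PVSS implementation; the command names and the "worst child state wins" rule are assumptions:

```python
# Minimal FSM-tree sketch: commands propagate down to devices, states are
# summarized upward so the shifter sees one state per node.
STATES = ["READY", "STANDBY", "TRANSITION", "SHUTDOWN", "NOT_READY", "UNKNOWN"]

class DeviceUnit:
    """Leaf node wrapping one hardware channel."""
    def __init__(self, name, state="NOT_READY"):
        self.name, self.state = name, state
    def command(self, cmd):
        # Hypothetical command-to-state mapping for illustration.
        self.state = {"GOTO_READY": "READY", "GOTO_STANDBY": "STANDBY"}.get(cmd, self.state)

class ControlUnit:
    """Inner node summarizing its children (assumed rule: worst state wins)."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def command(self, cmd):
        for child in self.children:
            child.command(cmd)
    @property
    def state(self):
        order = {s: i for i, s in enumerate(STATES)}  # READY best ... UNKNOWN worst
        return max((c.state for c in self.children), key=lambda s: order[s])
```

With this rule a single misbehaving channel anywhere in the tree surfaces at the top node, which is exactly the decoupling the slide describes: the shifter acts on summary states without knowing the hardware details below.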

Power System Hardware Choice
Commercial solution: CAEN EASY system (MDT, RPC, TGC HV+LV; CSC HV)
- Scalable system with a huge number of HV/LV channels to control
- Mainframe (SY1527) + branch-controller boards in the counting room
- Boards can operate in the radiation area and in magnetic field (up to 2 kG)
- Dedicated modules: main power, LV, HV, DAC, ADC
- Communication via standard OPC server and TCP
ATLAS Muon setup: 8 mainframes
- ~9000 HV/LV channels
- ~6000 ADC channels (RPC current monitoring)
- ~3000 DAC channels (RPC thresholds)
(Diagram: OPC, mainframe and branch controllers in the counting room; 48 V AC/DC converters and HV/LV boards in EASY crates in the hostile area)

Power & Monitoring System
Satisfactory performance:
- after distributing the load over an appropriate number of systems
- with the latest hardware/firmware versions and tuning of the OPC parameters (RPC running in event mode; MDT, CSC and TGC in polling mode)
One example, the RPC:
- 4 symmetric mainframes: 128-channel DAC modules (A3802) for threshold tuning (~3100 channels) and 128-channel ADC modules (A3801) with average and peak current measurement (~6400 channels): gap currents, environmental variables, etc.
- 100 CAEN EASY crates controlling a large number of parameters overall
- High-granularity detector monitoring
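The average-and-peak current measurement that the A3801 ADC modules perform in hardware can be illustrated with a toy software equivalent (the logic below is an assumption for illustration, not the board firmware):

```python
# Toy sketch of average/peak current monitoring per channel, mimicking what
# the slide attributes to the 128-channel ADC modules.
def monitor_channel(samples_uA, trip_uA):
    """Return (average, peak, over_trip) for one channel's current samples in uA."""
    avg = sum(samples_uA) / len(samples_uA)
    peak = max(samples_uA)
    return avg, peak, peak > trip_uA  # flag channels whose peak exceeds a limit
```

Reporting both average and peak matters for gap-current monitoring: the average tracks slow background, while the peak catches short excursions that an average alone would wash out.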

Environmental Monitoring & FE Initialization
- Based on the ATLAS custom Embedded Local Monitor Board (ELMB)
- Communication to the host computer via CAN field bus
- Each ELMB (~1500 ELMBs for TGC, ~1200 for MDT) provides:
  - 64-channel, 16-bit ADC
  - 18 bidirectional digital I/O lines
  - 8 digital input lines
  - 8 digital output lines
- ADC used for readout of:
  - temperatures (~14000 probes)
  - chamber displacements (~3700 for TGC)
  - magnetic field (1650 3D Hall probes)
  - frontend electronics voltages and temperatures
- Frontend electronics initialization via JTAG, programming the frontend shift registers from the ELMB digital outputs (interplay of DCS and DAQ for stop-less chamber removal/recovery)
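An ELMB ADC reading arrives at the host as a few bytes in a CAN message. As a sketch of the decoding step, here is a toy unpacking of such a payload; the byte layout (channel byte followed by a little-endian 16-bit value) and the 5 V full-scale range are assumptions for illustration, not the documented ELMB/CANopen object layout:

```python
import struct

# Sketch: decode one ADC reading from a CAN message payload.
# Assumed layout: byte 0 = ADC channel, bytes 1-2 = 16-bit raw value (LE).
def decode_adc_payload(payload: bytes):
    channel, raw = struct.unpack_from("<BH", payload)
    volts = raw / 65535.0 * 5.0  # assuming a 5 V full-scale range
    return channel, volts
```

The key practical point the sketch captures is that the wire format is fixed-width binary, so the host-side decoding is a single unpack per message, cheap enough for thousands of channels per CAN branch.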

B-Field Measurement
- 1773 B-sensors, each combining a 3D Hall sensor (x, y, z) and a temperature probe
- Maximum field 1.4 T
- High precision: stability < 0.5 gauss
- Calibrated, each with a unique ID
- Readout via ELMB and CAN
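The three Hall-probe components combine into a field magnitude, and the quoted 0.5 gauss stability gives a natural drift check. A minimal sketch (the helper names are mine):

```python
import math

# Sketch: field magnitude from the (x, y, z) Hall-probe components, and a
# drift check against the quoted 0.5 gauss stability (1 T = 1e4 gauss).
def field_magnitude_tesla(bx, by, bz):
    return math.sqrt(bx * bx + by * by + bz * bz)

def within_stability(b_now_T, b_ref_T, limit_gauss=0.5):
    return abs(b_now_T - b_ref_T) * 1e4 <= limit_gauss
```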

TGC CCMC
- Dedicated DCS-PS board (1500 on the whole detector)
- ELMB with custom firmware
- Custom Chamber Charge Measuring Circuit (CCMC) to estimate performance and noise level
  - A problematic status is sent to the conditions DB along with the other DCS information (HV, LV, etc.)
  - HV or thresholds may then be adjusted
(Figure: typical CCMC histogram, in ADC channel units; the histograms collected with 50 mV and 180 mV thresholds are shown as gray and black lines)
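The CCMC histograms shown on the slide are counts of charge readings per ADC bin, collected above a discriminator threshold. A toy sketch of that accumulation (the function and its interface are illustrative, not the CCMC firmware):

```python
from collections import Counter

# Toy sketch: histogram ADC charge readings that pass a discriminator
# threshold, one Counter per threshold setting (cf. the 50 mV / 180 mV
# curves on the slide).
def accumulate_histogram(readings_adc, threshold_adc):
    return Counter(r for r in readings_adc if r >= threshold_adc)
```

Comparing histograms taken at two thresholds separates genuine chamber pulses from low-amplitude noise, which is how the slide's noise-level estimate works conceptually.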

MDT Alignment System
Goal: measure the deformation and position of the barrel chambers
- Performed with 5812 optical lines (channels)
- Readout chain: frame grabber, 8 PCs, PVSS
- Data stored in a DB for offline analysis
- Accuracy on the track sagitta: < 50 μm
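The sagitta the alignment system constrains is the deviation of the middle-station measurement from the straight line through the inner and outer stations. A worked sketch (coordinates are hypothetical, in mm):

```python
# Sketch: sagitta of three chamber measurements, defined as the middle
# point's deviation from the inner-outer chord.
def sagitta(inner, middle, outer):
    (x1, y1), (x2, y2), (x3, y3) = inner, middle, outer
    # y of the inner-outer chord evaluated at the middle station's x
    y_chord = y1 + (y3 - y1) * (x2 - x1) / (x3 - x1)
    return y2 - y_chord
```

This makes the < 50 μm figure concrete: chamber positions must be known well enough that the residual error on this quantity stays below 50 μm, since it feeds directly into the momentum measurement.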

UI: Some Snapshots
(Muon user panels)

Expert & Analysis Tools

ATLAS Running: Stable Running
Combined operation:
- Handle 4 sub-detectors with over 40 systems as one
- Unified procedures, user interfaces and shift personnel
- Automatic HV operation (switching between HV nominal and HV standby following the LHC stable-beams handshakes)
Very good experience so far

DCS and Beyond: ATLAS Beam Splash Events
- Beam splash effects are visible not only in the data but also directly via DCS
- The peak current of the RPC gaps is read via ADC (DCS standard)
- The instantaneous gap current is sampled at 1 kHz; if a programmed threshold is passed, the charge peak is recorded by the RPC DCS
- In the beam splash run the threshold was roughly equivalent to about 100 hits/m²
(Figure: number of gaps over threshold vs. time, from the RPC DCS; splash from protons hitting a collimator 140 m from ATLAS)
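The threshold-triggered peak capture described above can be sketched as a scan over the 1 kHz sample stream: each excursion above the programmed threshold yields one recorded peak. A minimal illustration (values and names are mine):

```python
# Sketch: record the peak of each excursion of the sampled gap current
# above a programmed threshold, as the RPC DCS does for splash events.
def capture_peaks(samples_uA, threshold_uA):
    peaks, current_peak, above = [], None, False
    for s in samples_uA:
        if s >= threshold_uA:
            above = True
            current_peak = s if current_peak is None else max(current_peak, s)
        elif above:
            peaks.append(current_peak)       # excursion ended: store its peak
            current_peak, above = None, False
    if above:                                # stream ended while still above
        peaks.append(current_peak)
    return peaks
```

Storing only the per-excursion peak keeps the archived data volume tiny compared with the raw 1 kHz stream, while preserving exactly the quantity of interest.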

DCS and Beyond: Background Maps
- Probe background levels from gas-gap current monitoring via DCS:
  - typical current 100 nA (no beam); single-gap readout sensitivity 2 nA
  - average charge per count ~30 pC at READY (for MIPs)
- Recently introduced an additional delay before going to STANDBY after a beam dump from stable beams, to study "after-glow" effects, activation, ...
- Extrapolate to higher luminosity to validate MC simulations and assumptions on RPC high-luminosity operation (resistivity, rate capability, ...)
(Figures: RPC average current readings over 1 min [μA/m²] for the outer, middle and inner layers; total RPC normalized gas-gap current vs. luminosity [x10^30] for beam 1 and beam 2)
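The conversion implied above, from measured gap current to a counting rate, follows from the numbers on the slide: subtract the no-beam baseline and divide by the ~30 pC average charge per count. A worked sketch (the unit gap area is an assumption):

```python
# Worked example: background counting rate from gap current, using the
# slide's 100 nA no-beam baseline and ~30 pC average charge per count.
def rate_hz_per_m2(current_nA, baseline_nA=100.0,
                   charge_per_count_pC=30.0, area_m2=1.0):
    excess_A = (current_nA - baseline_nA) * 1e-9        # nA -> A
    counts_per_s = excess_A / (charge_per_count_pC * 1e-12)  # pC -> C
    return counts_per_s / area_m2
```

So a gap drawing 130 nA, i.e. 30 nA above baseline, corresponds to roughly 1 kHz/m² of counts under these assumptions, showing why the 2 nA single-gap sensitivity makes the current readout a usable background probe.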

Conclusions
- The ATLAS Muon DCS offers a complete solution for operating and controlling large LHC detectors
- The use of a few commercial solutions and a distributed, scalable design has proven its benefits in terms of stability, development and maintenance
- The system is fully operational and has shown itself to be extremely flexible and powerful, allowing shifter (FSM) as well as expert operation and analysis
- The very large number of detector elements monitored through the DCS allows statistical studies of the behavior of the different detectors and represents a powerful tool for a deeper understanding of present and future detector physics