
Slide 1: Design and Performance of the ATLAS Muon Detector Control System
A. Polini (on behalf of the ATLAS Muon Collaboration), CHEP 2010, October 18-22, Taipei, Taiwan
Outline: Detector Description; System Requirements; Architecture and Special Features; Status and Performance; Future and Outlook

Slide 2: The ATLAS Detector
- 44 m length, 25 m diameter
- Barrel region and endcap region
- [Figure: the ATLAS detector at the LHC]

Slide 3: ATLAS Muon Spectrometer
- Monitored Drift Tubes (MDT): |η| < 2.7, 370k channels, 6 multilayers (precise spatial measurement)
- Cathode Strip Chambers (CSC): 2.0 < |η| < 2.7, 31k channels, 4 layers (precise spatial measurement)
- Resistive Plate Chambers (RPC): |η| < 1.1, 360k channels, 6 layers (trigger and readout)
- Thin Gap Chambers (TGC): 1.1 < |η| < 2.4, 440k channels, 7 layers (trigger and readout)
- Toroidal magnets (2-6 T·m)

Slide 4: Detector Control System
Four different detector technologies (CSC, MDT, RPC, TGC), each with its own characteristics, requirements and operating procedures; the system operates in a radiation area and in a strong B field.
The Muon DCS is in charge of:
- Controlling the detector power system (chamber HV, front-end electronics LV)
- Reading and archiving all non-event-based environmental and conditions data
- Adjusting working-point parameters (HV, LV thresholds, etc.) to ensure efficient detector data taking
- Controlling and archiving data from the alignment system
- Configuring the front-end electronics (MDT)
- Providing coherent shifter and expert tools for detector monitoring, commissioning and maintenance
- Controlling which actions are allowed under which conditions, to prevent configurations potentially harmful to the detector

Slide 5: ATLAS Detector Control System
Hierarchical approach:
- Separation of the front-end (process) and supervisory layers
- Commercial SCADA system (PVSS, with ETM's manager concept) + CERN JCOP Framework (LHC standard) + detector-specific developments
- Scalable and distributed; communication with hardware via OPC and TCP
- Interfaces to the central databases (history archiving, conditions DB, configuration DB)
- DCS-to-DAQ communication

Slide 6: Finite State Machine (FSM) Concept
- Bringing the detector into, and keeping it in, the 'Ready-for-Physics' state involves many tens of thousands of hardware channels to control and monitor
- An abstract finite-state model is adopted: summary information decouples hardware details and complex setting procedures from shifter operation
- Tree structure, modeling the geometrical or functional granularity of each sub-detector
- Device Units (leaves) and Control Units (inner nodes)
- Command execution: from the top FSM nodes (ATLAS runs) or for individual devices/groups (debugging)
- Typically 100-1000 nodes per sub-detector
- States: READY, STANDBY, TRANSITION, SHUTDOWN, NOT_READY, UNKNOWN
A minimal sketch of this tree structure follows below.
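To make the Device Unit / Control Unit structure concrete, here is a minimal Python sketch of a two-level FSM tree. It is an illustration only: the real system is built with the PVSS/JCOP FSM tools, and the "worst child state wins" summary rule, the names and the callbacks below are assumptions.

```python
# Minimal sketch of the FSM tree idea (illustration only; the real system
# is built on PVSS/JCOP FSM tools, not Python).

STATES = ["READY", "STANDBY", "TRANSITION", "SHUTDOWN", "NOT_READY", "UNKNOWN"]
SEVERITY = {s: i for i, s in enumerate(STATES)}  # later in the list = "worse"

class DeviceUnit:
    """Leaf node: wraps one hardware channel via a read-state callback."""
    def __init__(self, name, read_state):
        self.name, self.read_state = name, read_state
    def state(self):
        return self.read_state()
    def command(self, cmd):
        print(f"{self.name}: executing {cmd}")  # would drive hardware here

class ControlUnit:
    """Inner node: summarizes children and propagates commands down the tree."""
    def __init__(self, name, children):
        self.name, self.children = name, children
    def state(self):
        # Assumed summary rule: report the worst state among the children,
        # so hardware detail stays hidden unless something is off.
        return max((c.state() for c in self.children), key=SEVERITY.get)
    def command(self, cmd):
        for c in self.children:
            c.command(cmd)  # e.g. a GOTO_READY issued once at the top node

# Usage: a tiny two-level tree
hv = DeviceUnit("HV_ch0", lambda: "READY")
lv = DeviceUnit("LV_ch0", lambda: "STANDBY")
sector = ControlUnit("RPC_Sector01", [hv, lv])
print(sector.state())        # -> STANDBY (worst child state)
sector.command("GOTO_READY")
```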

Slide 7: Power System Hardware Choice
Commercial solution: CAEN EASY system (MDT, RPC, TGC HV+LV; CSC HV).
- Scalable system with a huge number of HV/LV channels to control
- Mainframes (SY1527) plus branch-controller boards in the counting room
- Boards can operate in the radiation area and in magnetic fields up to 2 kG
- Dedicated modules: main power, LV, HV, DAC, ADC
- Communication via a standard OPC server and TCP
ATLAS Muon setup, 8 mainframes in total:
- ~9000 HV/LV channels
- ~6000 ADC channels (RPC current monitoring)
- ~3000 DAC channels (RPC thresholds)
[Diagram: mainframe and OPC server in the counting room; branch controllers, 48 V AC/DC converters and HV/LV boards in EASY crates in the hostile area]
A hedged polling sketch follows below.
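The sketch below mimics the supervisory polling pattern in Python. All names and the read function are hypothetical stand-ins; in the real system PVSS reads the SY1527 mainframes through CAEN's OPC server.

```python
# Illustrative polling loop over HV channel parameters. Everything here is a
# placeholder: the actual readout goes through CAEN's OPC server from PVSS.
import time

CHANNELS = [f"BA01.HV.ch{i:02d}" for i in range(4)]  # stand-in for ~9000 channels

def read_channel(item):
    """Placeholder for an OPC read of one channel's monitored values."""
    return {"vMon": 9600.0, "iMon_uA": 1.2, "status": "ON"}

def poll_once(channels):
    for ch in channels:
        v = read_channel(ch)
        if v["status"] != "ON":
            print(f"ALARM {ch}: status={v['status']}")
        # here the values would be archived to the conditions DB

for _ in range(3):          # the real system polls continuously
    poll_once(CHANNELS)
    time.sleep(5)           # MDT/CSC/TGC ran in polling mode (next slide)
```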

Slide 8: Power & Monitoring System
Satisfactory performance:
- after distributing the load over an appropriate number of systems
- with the latest hardware/firmware versions and tuned OPC parameters (RPC running in event mode; MDT, CSC and TGC in polling mode)
One example, the RPC:
- 4 symmetric mainframes: 128-channel DAC boards (A3802) for threshold tuning (~3100 channels) and 128-channel ADC boards (A3801) with average and peak current measurement (~6400 channels), covering gap currents, environmental variables, etc.
- 100 CAEN EASY crates controlling about 50,000 parameters overall
- High-granularity detector monitoring
A sketch of a threshold scan in this spirit follows below.
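As a rough illustration of what threshold tuning with the DAC and ADC boards could look like, here is a hedged Python sketch. set_dac() and read_rate() are invented stand-ins for an A3802 write and an A3801-based rate estimate, and all numbers are placeholders.

```python
# Hedged sketch of a DAC threshold scan for one RPC front-end channel.
import random

def set_dac(channel, mv):
    """Hypothetical stand-in for writing an A3802 DAC threshold (mV)."""
    set_dac.current = mv

def read_rate(channel):
    """Hypothetical stand-in for a noise-rate estimate via the A3801 ADC."""
    thr = getattr(set_dac, "current", 0)
    return max(0.0, 1000.0 - 8.0 * thr) + random.uniform(0.0, 5.0)

def find_threshold(channel, start_mv=20, stop_mv=200, step_mv=5, max_rate_hz=100.0):
    """Raise the threshold until the counting rate falls below max_rate_hz."""
    for thr in range(start_mv, stop_mv + 1, step_mv):
        set_dac(channel, thr)
        if read_rate(channel) < max_rate_hz:
            return thr          # first quiet working point
    return None                 # no quiet point found: flag for an expert

print(find_threshold("RPC_BA01_eta_ch00"))
```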

Slide 9: Environmental Monitoring & Front-End Initialization
- Based on the ATLAS custom Embedded Local Monitoring Board (ELMB), communicating with the host computer via CAN fieldbus
- Each ELMB (~1500 for TGC, ~1200 for MDT) provides: a 64-channel, 16-bit ADC; 18 bidirectional digital I/O lines; 8 digital inputs; 8 digital outputs
- The ADC is used to read out: temperatures (~14000 probes), chamber displacements (~3700 for TGC), magnetic field (1650 3D Hall probes), front-end electronics voltages and temperatures
- Front-end electronics initialization via JTAG, programming the front-end shift registers from the ELMB digital outputs (interplay of DCS and DAQ for stop-less chamber removal/recovery)
[Diagram: JTAG, CSM-ADC and SPI-AUX connections]
A hedged CAN readout sketch follows below.
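A hedged sketch of reading one ELMB ADC channel over CAN, phrased as a CANopen expedited SDO upload with the python-can package. The object-dictionary index, node id and bus configuration are assumptions for illustration, not the documented ELMB map.

```python
# Hedged sketch: one CANopen expedited SDO upload to read an ADC channel.
# The OD index (0x6404), node id and socketcan channel are assumptions.
import can

NODE_ID = 0x10                 # hypothetical ELMB node id on the branch
INDEX, SUBINDEX = 0x6404, 1    # assumed OD entry for ADC channel 1

bus = can.interface.Bus(channel="can0", bustype="socketcan")

# SDO upload request: command byte 0x40, index little-endian, subindex
req = can.Message(
    arbitration_id=0x600 + NODE_ID,
    data=[0x40, INDEX & 0xFF, INDEX >> 8, SUBINDEX, 0, 0, 0, 0],
    is_extended_id=False,
)
bus.send(req)

resp = bus.recv(timeout=1.0)   # expect a reply on COB-ID 0x580 + NODE_ID
if resp is not None and resp.arbitration_id == 0x580 + NODE_ID:
    raw = int.from_bytes(resp.data[4:8], "little")
    print(f"ADC ch {SUBINDEX}: raw value {raw}")
```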

Slide 10: B-Field Measurement
- 1773 B-sensors, each a 3-axis Hall sensor (x, y, z) plus a temperature probe
- Precision 10^-4; maximum field 1.4 T; stability < 0.5 G
- Individually calibrated, each with a unique ID
- Read out via ELMB and CAN
A calibration-application sketch follows below.
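Applying a per-sensor calibration to a raw 3-axis Hall reading might look like the following sketch; the gains, offsets and the linear model are invented placeholders, not real calibration data.

```python
# Illustrative conversion of one 3-axis Hall-probe reading to a field vector,
# with per-sensor constants looked up by the probe's unique ID. The numbers
# are invented placeholders, not the real per-sensor calibration.

CALIB = {  # unique sensor ID -> (gain [T/count], offset [counts]) per axis
    "BS1773": [(2.1e-5, 12), (2.0e-5, -3), (2.2e-5, 7)],
}

def field_vector(sensor_id, raw_xyz):
    """Assumed linear model: B_i = gain_i * (raw_i - offset_i)."""
    return [g * (r - o) for r, (g, o) in zip(raw_xyz, CALIB[sensor_id])]

print(field_vector("BS1773", [40000, 39990, 40010]))  # -> [Bx, By, Bz] in tesla
```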

Slide 11: TGC CCMC
- Dedicated DCS-PS board (1500 over the whole detector): an ELMB with custom firmware and a custom Chamber Charge Measuring Circuit (CCMC), used to estimate chamber performance and noise level
- Problematic status is sent to the conditions DB along with the other DCS information (HV, LV, etc.)
- HV or thresholds may be changed in response
[Figure: typical CCMC histogram in ADC channel units; the histogram collected with a 50 mV (180 mV) threshold is shown as a gray (black) line]
A small histogram-summary sketch follows below.
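A tiny sketch of how a CCMC-style charge histogram could be summarized to flag a noisy chamber; the binning and the noise criterion are invented for illustration (the real circuit and firmware are custom hardware).

```python
# Invented example: histogram raw charge samples and apply a crude
# "too much spectrum in the upper bins" noise flag.

def charge_histogram(samples, n_bins=64, adc_max=1024):
    """Histogram raw ADC charge samples into n_bins equal-width bins."""
    hist = [0] * n_bins
    for s in samples:
        hist[min(s * n_bins // adc_max, n_bins - 1)] += 1
    return hist

def looks_noisy(hist, tail_fraction=0.05):
    """Flag a chamber if too much of the spectrum sits in the upper half."""
    total = sum(hist)
    tail = sum(hist[len(hist) // 2:])
    return total > 0 and tail / total > tail_fraction

h = charge_histogram([10, 12, 900, 15, 11, 950, 13])
print(looks_noisy(h))   # -> True: two of seven samples are in the upper half
```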

Slide 12: MDT Alignment System
Goal: measure the deformation and position of the barrel chambers.
- Performed by 5812 optical lines (channels)
- Readout chain: frame grabbers → 8 PCs → PVSS
- Data stored in the DB for offline analysis
- Accuracy on the track sagitta: < 50 μm
A short sagitta computation follows below.
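The sagitta quoted above is the quantity the optical lines constrain: the offset of the middle chamber measurement from the chord joining the inner and outer ones. A short self-contained computation (the station positions are illustrative):

```python
# Sagitta of a middle measurement with respect to the inner-outer chord.

def sagitta(p_inner, p_middle, p_outer):
    """Offset of p_middle from the inner-outer chord, same units as input (mm)."""
    (x1, y1), (x2, y2), (x3, y3) = p_inner, p_middle, p_outer
    # Interpolate the chord at the middle point's x and take the residual.
    y_on_chord = y1 + (y3 - y1) * (x2 - x1) / (x3 - x1)
    return y2 - y_on_chord

# e.g. stations at r = 5, 7.5, 10 m, middle chamber displaced by 40 microns
print(sagitta((5000.0, 0.0), (7500.0, 0.040), (10000.0, 0.0)))  # -> 0.040 mm
```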

Slide 13: User Interface Snapshots
[Screenshots: Muon user panels]

Slide 14: Expert & Analysis Tools
[Screenshots: expert and analysis panels]

Slide 15: ATLAS Running
2010: stable running in combined operation:
- The four sub-detectors, with over 40 systems, are handled as one
- Unified procedures, user interfaces and shift personnel
- Automatic HV operation: switching between HV Nominal and HV Standby depending on the LHC stable-beam and adjust handshakes
Very good experience so far. A sketch of the automatic HV logic follows below.
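A minimal sketch of the automatic HV logic described above: the target FSM state follows the LHC handshake, so chambers sit at reduced HV until stable beams are declared. The state names mirror the FSM slide; the beam-mode strings and the exact mapping are assumptions.

```python
# Assumed mapping from LHC machine mode to the target HV state.

def target_hv_state(lhc_mode):
    """Return the FSM state the HV should be driven to for a given LHC mode."""
    if lhc_mode == "STABLE_BEAMS":
        return "READY"      # ramp to nominal HV
    if lhc_mode in ("RAMP", "ADJUST", "BEAM_DUMP"):
        return "STANDBY"    # reduced HV, safe during beam manipulations
    return "STANDBY"        # default to the safe setting

for mode in ("RAMP", "STABLE_BEAMS", "BEAM_DUMP"):
    print(mode, "->", target_hv_state(mode))
```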

Slide 16: DCS and Beyond: ATLAS Beam Splash Events
- Beam-splash effects are visible not only in the data but also directly via DCS
- The peak current of the RPC gaps is read via ADC (DCS standard)
- The instantaneous gap current is sampled at 1 kHz; if a programmed threshold is crossed, the charge peak is recorded by the RPC DCS
- In the beam-splash run the threshold was roughly equivalent to about 100 hits/m^2
[Plot: number of gaps over threshold vs. time, from the RPC DCS, for a proton beam splash ~140 m from the detector]
A sketch of this peak capture follows below.
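The peak-capture logic lends itself to a short sketch: scan the 1 kHz current samples and, for each excursion above the programmed threshold, record the peak of that excursion. The sample stream here is invented.

```python
# Sketch of threshold-triggered peak capture over a stream of current samples.

def capture_peaks(samples, threshold):
    """Return the peak value of each above-threshold excursion."""
    peaks, current_peak, above = [], None, False
    for s in samples:
        if s > threshold:
            current_peak = s if not above else max(current_peak, s)
            above = True
        elif above:
            peaks.append(current_peak)   # excursion ended: store its peak
            above = False
    return peaks

# Invented 1 kHz samples (arbitrary units) with two excursions above 1.0:
print(capture_peaks([0.1, 0.2, 3.5, 7.2, 4.0, 0.1, 0.2, 5.0, 0.1], 1.0))
# -> [7.2, 5.0]
```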

Slide 17: DCS and Beyond: Background Maps
- Probe background levels from gas-gap current monitoring via DCS: typical current 100 nA with no beam, single-gap readout sensitivity 2 nA, average charge per count ~30 pC at READY (for MIPs)
- An additional delay before going to STANDBY after a beam dump from stable beams was recently introduced, to study "after-glow" effects, activation, etc.
- Extrapolate to a luminosity of 10^34 cm^-2 s^-1 to validate MC simulations and the assumptions on RPC high-luminosity operation (resistivity, rate capability, etc.)
[Plots: RPC average current readings over 1 min [μA/m^2] for the outer, middle and inner layers; total RPC normalized gas-gap current vs. luminosity (x10^30) for Beam 1 and Beam 2]
A worked current-to-rate example follows below.
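A worked example of the current-to-rate conversion implied by these numbers: with ~30 pC per count, the excess of the gas-gap current over the no-beam baseline translates directly into a particle rate. The gap area used below is an assumed illustrative value.

```python
# Current-to-rate conversion using the constants quoted on this slide.

Q_PER_COUNT = 30e-12      # C per count at READY (from the slide)
I_BASELINE  = 100e-9      # typical no-beam current (A, from the slide)
SENSITIVITY = 2e-9        # single-gap readout sensitivity (A, from the slide)
AREA_M2     = 4.0         # hypothetical gap area for illustration

def rate_per_m2(i_measured):
    """Counting-rate density implied by the excess gap current."""
    excess = max(0.0, i_measured - I_BASELINE)
    return excess / Q_PER_COUNT / AREA_M2      # Hz / m^2

# The 2 nA sensitivity corresponds to ~2e-9 / 30e-12 ≈ 67 Hz per gap,
# i.e. ~17 Hz/m^2 for this assumed area:
print(rate_per_m2(I_BASELINE + SENSITIVITY))
```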

Slide 18: Conclusions
- The ATLAS Muon DCS offers a complete solution for operating and controlling a large LHC detector
- The use of a few commercial solutions and a distributed, scalable design has proven its benefits in terms of stability, development effort and maintenance
- The system is fully operational and has proven extremely flexible and powerful, supporting shifter (FSM) operation as well as expert operation and analysis
- The very large number of detector elements monitored through the DCS allows statistical studies of detector behavior and provides a powerful tool for a deeper understanding of present and future detector physics

