Outline
– Availability for BI (figures from Evian 2012)
– Availability seen from BI
– Sketch of control system with dependencies
– Tools used for BI availability checks
– Critical BI systems
– Conclusions
L. Jensen, on behalf of CERN BE/BI
Availability figures for BI (Evian 2012)
https://indico.cern.ch/conferenceOtherViews.py?view=standard&confId=211614
– Total fault time: 1524 hours (64 days) (courtesy: B. Todd)
– BI equipment fault time (= unavailability): BCT + BPM + OFB + BLM + BSRT = 90 hours (6%)
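The 6% quoted above is simply the BI share of the total fault time:

$$\frac{90\ \mathrm{h}}{1524\ \mathrm{h}} \approx 0.059 \approx 6\%$$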
Availability definition
The characteristic of a resource that is operable when required to perform its designated function:
– Provides the beam permit when expected to (BLM, BPM, "SIS-BI")
– Provides updated and realistic acquisition values
A function of the resource's accessibility, reliability and maintainability:
– Issues with radiation (R2E) and beam (RF) heating
– Emphasis on remote diagnostics and reset
– Access to equipment (the LHC covers a large area, much of it underground)
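A standard way to quantify this definition (not spelled out on the slide, added here only for reference) expresses availability in terms of the mean time between failures and the mean time to repair:

$$A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}}$$

The emphasis on remote diagnostics and reset acts on the MTTR term: every fault that can be diagnosed and cleared without underground access shortens the repair time and so raises A.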
BI system integration with external dependencies
[Block diagram; the recoverable content is summarised below]
– Beam dependencies: particle type, bunch/total intensity, filling pattern
– Detectors in alcoves or the tunnel: BI home-grown electronics; accessibility issues; radiation hardness (UJ76)
– FEC (VME): electronics BI home-grown and CO-supported (+ commercial); software BI home-grown on the FESA framework (port to Linux, FESA 2.10 until end of LHC)
– Middle tier: CMW proxy or LSA concentrator (+ FB-SU); settings via RBAC/MCS, acquisitions via RBAC/PM
– Clients (1–4) over CMW(2/3): sequencer tasks, logging service, BI Exp't GUI, SIS
Tools used by BI for availability checks
– BI sequencer tasks, run before each fill ("AGANT")
  – OK / Not OK for the next fill => most useful after long stops or interventions
  – Email to system experts => improve/spread the information?
  – Extend to other systems: a "Get" on the "Status" property for systems declared operational? (see the sketch after this slide)
– BI "Expert" applications
  – On- and off-line data analysis
  – Hardware and software status overviews
– BI Python scripts extracting data from the measurement/logging DB
  – LS1 development of a framework (now based on "PyRoot")
– Timber tool for correlations (MDB/LDB)
  – Main difficulty is with arrays (bunches..); new tool to display BxB data (intensity, size etc.)
– eLogbook search facility
  – Difficult to find what you're looking for..
– JIRA (BIOP entries from the eLogbook)
  – BI in favour of increased use (now mainly for the injectors)
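A minimal sketch of what such a pre-fill check could look like, assuming a generic middleware "Get" on each device's "Status" property. The device names, the get_status() accessor and the "ready" field are illustrative placeholders, not the real AGANT tasks:

```python
# Hedged sketch of a pre-fill BI availability check in the spirit of the
# sequencer tasks described above. Everything device-specific is assumed.

from typing import Dict

# Hypothetical list of BI devices declared operational for the next fill.
BI_DEVICES = ["LHC.BLM.SURVEY", "LHC.BPM.INTERLOCK.SR6", "LHC.BCTDC.B1"]

def get_status(device: str) -> Dict[str, object]:
    """Stand-in for a CMW/FESA 'Get' on the device's 'Status' property."""
    raise NotImplementedError("replace with the actual middleware call")

def check_devices() -> Dict[str, str]:
    """Return OK / NOT OK per device, as a sequencer task would before a fill."""
    results = {}
    for device in BI_DEVICES:
        try:
            status = get_status(device)
            results[device] = "OK" if status.get("ready", False) else "NOT OK"
        except Exception as exc:  # unreachable FEC, CMW error, ...
            results[device] = f"NOT OK ({exc})"
    return results

if __name__ == "__main__":
    for device, verdict in check_devices().items():
        print(f"{device:30s} {verdict}")
    # A real task would e-mail this summary to the system experts.
```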
Beam-interlocked systems
– Beam Loss Monitors
– Beam Position Interlock (point 6)
– Abort-gap population (SIS proposed after LS1)
Beam Loss Monitors overview
Issues seen 2012 -> 2013:
– No beam permit
  – Optical links & power supplies
  – CMW errors
– BLM sequencer tests (no beam)
  – Sanity-check errors
BLM Issue #1: VME power-supply faults
– Refurbishment of all VME power supplies (~30 systems for the LHC BLMs)
  – Replacement of all fans during LS1 (6 per system)
    – The majority of the fans were no longer operational -> over-heating and failure
    – Fan lifetime = 30'000 – 60'000 h (systems installed in 2006 => ~70'000 hours of operation)
  – Checks on a test bench before (surface) installation

BLM Issue #2: Tunnel electronics
– Modify all BLECF modules (~700 installed)
– Change the limit for the HV level-detection flag, currently too restrictive for proper use by the SIS
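A rough check of the operating hours quoted above, assuming near-continuous operation from the 2006 installation until LS1 (roughly eight years):

$$8\ \mathrm{years} \times 8760\ \mathrm{h/year} \approx 70\,000\ \mathrm{h} \;>\; 30\,000\text{–}60\,000\ \mathrm{h\ (rated\ fan\ lifetime)}$$

so the fans had well exceeded their rated lifetime by the time of the LS1 refurbishment.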
BLM Issue #3: Combiner and Survey firmware modifications
– Improve the regular automatic system checks
– Reduce connectivity-check errors
– Improve energy-value reception and logging
– Add compatibility with the new VME CPUs
– Preparation for the "Injection Inhibit" feature
BLM Issue #4: Acquisition electronics
– Maintenance of all processing modules (~400)
  – Repair or replace ~20% of the mezzanines
  – Clean-up of the optical adaptors and connectors (accumulation of dust on the fibre connectors)
– Shuffle the optical links (the Optical Link 1 input is less reliable)
  – Expect to improve availability by removing a common-mode failure
  – Reduce optical-link errors and failures
BLM 'cron' tasks (data from the MDB)
– Threshold changes per monitor (24 hours): email with summary plots to the experts
– Card temperatures (24 hours): access to the history per acquisition card
– High voltage: reports unexpected measurements
(A sketch of such a daily check follows.)
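A minimal sketch of a daily cron-style check along these lines, here for the card temperatures. The MDB accessor, card names, sample values and the 60 °C limit are illustrative assumptions, not the production BLM scripts:

```python
# Hedged sketch of a daily BLM 'cron' check: pull the last 24 h of
# acquisition-card temperatures and flag suspicious cards.

import datetime
from typing import Dict, List

TEMPERATURE_LIMIT_DEGC = 60.0  # assumed alarm limit, for illustration only

def fetch_card_temperatures(since: datetime.datetime) -> Dict[str, List[float]]:
    """Stand-in for the measurement-DB (MDB) extraction, one series per card.
    Returns synthetic data here; a real script would query the MDB."""
    return {"BLM.CARD.A": [41.0, 42.5, 43.0], "BLM.CARD.B": [55.0, 61.5, 58.0]}

def daily_temperature_report() -> List[str]:
    since = datetime.datetime.now() - datetime.timedelta(hours=24)
    report = []
    for card, samples in fetch_card_temperatures(since).items():
        if samples and max(samples) > TEMPERATURE_LIMIT_DEGC:
            report.append(f"{card}: max {max(samples):.1f} degC in the last 24 h")
    return report

if __name__ == "__main__":
    for line in daily_temperature_report():
        print(line)
    # The production task e-mails the summary (with plots) to the experts.
```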
BPM interlock (SR6)
– Sequencer task in place for functionality testing
  – The interlock logic is exercised as the beam calibrator simulates a position outside the dump window (illustrated in the sketch below)
– Dependency on the correct sensitivity setting
  – Troublesome bunch-intensity overlap before LS1
  – Remote-controlled attenuators being introduced (MPP discussion pending)
– Issues seen with bunch intensities at the upper limit of the high-sensitivity range
  – Modifications being made to the strip-line detectors
– Tools:
  – Dedicated BI diagnostics tools started before LS1 => logic to ABT
  – New firmware and software being prepared for after LS1: BxB position data for XPOC analysis
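A toy illustration of the kind of check the interlock performs and how the functionality test exercises it. The ±4 mm window is an invented number for the sketch; the real logic runs in the interlock hardware/firmware, not in Python:

```python
# Hedged illustration of a dump-window position check and its sequencer-style
# functionality test. All numbers are made up for the example.

from dataclasses import dataclass

@dataclass
class DumpWindow:
    centre_mm: float = 0.0
    half_width_mm: float = 4.0  # illustrative, not the operational setting

    def contains(self, position_mm: float) -> bool:
        return abs(position_mm - self.centre_mm) <= self.half_width_mm

def beam_permit(positions_mm, window: DumpWindow) -> bool:
    """True only if every monitored position sits inside the dump window."""
    return all(window.contains(p) for p in positions_mm)

# Normal positions keep the permit; a calibrator-simulated position outside
# the window must remove it, which is what the functionality test verifies.
assert beam_permit([0.5, -1.2], DumpWindow()) is True
assert beam_permit([0.5, 6.0], DumpWindow()) is False
```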
BSRA (abort-gap monitor)
– Reduced system availability due to:
  – Dependency on the light-extraction mirrors (BSRT): beam-related RF heating => breakdown (BSRT + BSRA + BLDM)
  – Need to calibrate the system after technical stops: slow drifts and interventions (sequencer task with safe beam?)
– Tools:
  – MDB/LDB data monitored off-line
– LS1 developments:
  – Light-extraction system being re-designed to reduce failure risks
– Run after LS1:
  – Sequencer task to automatically re-calibrate the system? (to be done with beam)
  – Performance at 6.5 TeV to be tested
Other BI systems
– Closed-orbit BPMs
– Orbit feedback
– BCT (DC and Fast)
– Tune with feedback
– BWS
– BSRT
– BTVDD
BPM (distributed beam position)
– Sequencer task in place (97% pass)
  – Repair faulty channels during LS1
– Ideas for regular/automatic performance checks (cron) being formalised (as for the BLMs)
  – Helps detect modules starting to fail (repair during technical stops)
– Dependency on ambient temperature was the main source of errors before LS1
  – Temperature-controlled racks being commissioned, with remote monitoring
– Orbit-data real-time issues observed during the last run
  – New Linux CPUs for improved RT performance
Orbit and tune feedback
– Tools used to assess availability:
  – BI expert application + OP (YASP)
– LS1 developments:
  – Team with OP and BI members put in place, with a new staff resource expected during 2014
  – OFB/QFB and service units being consolidated: new hardware, updated software and documentation
  – Ideas for a test system maturing
– Restart after LS1:
  – Re-commissioning (dry runs ++) will be required
Tune systems
– Tools: BI expert applications and raw FFT spectra stored in the MDB (see the sketch below)
– Main measurement problem is the lack of a coherent signal with high damper gain
  – Bunch-gated tune systems put in place
  – Some controls integration still to be done
– QPS current limits versus the tune feedback
  – To be followed carefully after LS1
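A minimal sketch of how a fractional tune is read off an FFT spectrum of turn-by-turn position data; the synthetic signal (tune 0.31 plus noise) is invented for the example, and the operational systems do this in the front-ends and expert applications:

```python
# Hedged sketch: extract a fractional tune as the strongest line in the FFT
# spectrum of turn-by-turn positions. Synthetic data, illustration only.

import numpy as np

def fractional_tune(positions: np.ndarray) -> float:
    data = positions - np.mean(positions)              # remove the orbit offset
    spectrum = np.abs(np.fft.rfft(data))
    freqs = np.fft.rfftfreq(len(data), d=1.0)          # one sample per turn
    return float(freqs[np.argmax(spectrum[1:]) + 1])   # skip the DC bin

if __name__ == "__main__":
    turns = np.arange(2048)
    signal = np.cos(2 * np.pi * 0.31 * turns) + 0.5 * np.random.randn(turns.size)
    print(f"measured tune ~ {fractional_tune(signal):.3f}")  # ~0.31
```

With a high damper gain the coherent line is suppressed towards the noise floor, which is the measurement problem noted above and the motivation for the bunch-gated systems.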
DC BCT (total beam intensity)
– BI sequencer task checks the measurement chain (calibration pulses) against thresholds (DC offsets stored in the MDB); a sketch of such a check follows
– Parallel acquisition system (since 2011) used for the intensity and lifetime calculation
  – To be fully integrated during LS1
  – [Figure: correlation plot for the parallel systems, Beam 1 and Beam 2]
– Other developments:
  – MEN A20 CPU with VD80 (improved ADC resolution)
  – Systems prepared for higher bunch intensity at 25 ns spacing
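A toy version of the sequencer check described above: compare the calibration-pulse response and the DC offset against thresholds. The reference values and limits are invented for the sketch; the real ones live in the MDB:

```python
# Hedged sketch of a DC-BCT measurement-chain check. All limits are assumed.

EXPECTED_CAL_RESPONSE = 1.00  # normalised calibration-pulse response (assumed)
CAL_TOLERANCE = 0.05          # assumed +/- 5% tolerance
MAX_DC_OFFSET = 0.02          # assumed limit on the DC offset (normalised)

def dcbct_chain_ok(cal_response: float, dc_offset: float) -> bool:
    """True if the measurement chain passes both checks for the next fill."""
    cal_ok = abs(cal_response - EXPECTED_CAL_RESPONSE) <= CAL_TOLERANCE
    offset_ok = abs(dc_offset) <= MAX_DC_OFFSET
    return cal_ok and offset_ok

# A drifting DC offset would flag the chain as NOT OK before the next fill.
print(dcbct_chain_ok(cal_response=1.02, dc_offset=0.005))  # True
print(dcbct_chain_ok(cal_response=1.02, dc_offset=0.080))  # False
```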
Fast BCT (bunch intensity)
– Performance verified off-line with the MDB/LDB data-extraction tools
  – On-line expert GUI started
– Main issues:
  – Timing glitches -> reboot
  – Gain selection is a complex OP setting (sequencer?)
– Other developments:
  – New (ICT) detectors under test: reduced dependency on bunch spacing and beam position
  – MEN A20 CPU for improved processing
  – Prepare for higher bunch intensities and 25 ns spacing
– [Figure: intensity as a function of beam position (Timber tool)]
BSRT (average/bunch beam size)
– Availability figures affected by:
  – Light-extraction system performance: RF heating on the mirrors (see BSRA)
  – Complex software algorithms (steering the mirrors as a function of energy and intensity)
– Tools used up until LS1:
  – MDB/LDB data extraction for correlations (off-line)
  – BI expert and OP tools (average and bunch beam sizes)
  – Real-time video-signal streaming
– Availability after LS1:
  – Improved light-extraction system being installed
  – Ideas for regular performance checks to be specified and put in place
Wire scanners
– Tools used up until LS1:
  – LDB/MDB data for performance analysis (re-fitting) and status information (number of scans / error details)
– LS1 developments:
  – Bellows being exchanged (preventively)
  – Front-end software moving to Linux (fitting algorithms)
  – Design of new, more precise scanners (20 m/s) and electronics ongoing, in view of LHC installation (LS2?)
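A minimal sketch of the "re-fitting" step mentioned above: fitting a Gaussian to a wire-scanner profile to extract the beam size. The synthetic profile and the use of scipy.optimize.curve_fit are assumptions for illustration; the operational fits run in the front-end software:

```python
# Hedged sketch: Gaussian fit of a wire-scanner profile (synthetic data).

import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, centre, sigma, offset):
    return amplitude * np.exp(-0.5 * ((x - centre) / sigma) ** 2) + offset

# Synthetic profile with a 1 mm r.m.s. beam size plus noise (made-up numbers).
position_mm = np.linspace(-10, 10, 200)
profile = gaussian(position_mm, 1.0, 0.3, 1.0, 0.02)
profile += 0.02 * np.random.randn(position_mm.size)

popt, _ = curve_fit(gaussian, position_mm, profile, p0=(1.0, 0.0, 2.0, 0.0))
print(f"fitted beam size ~ {abs(popt[2]):.2f} mm")
```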
BTVDD (dump screens)
– Transverse image of the dumped-beam profile used for XPOC
– Dependency on the radiation decay of the alumina screens (avoid saturation), as a function of total beam intensity and particle type
– Tools:
  – Images published to the post-mortem system
  – BI expert + OP + XPOC/PM applications
– LS1 developments:
  – Improve the handling of the filter and gain settings
Conclusions
– Unavailability of the critical BI systems (6%) is in the shadow of bigger culprits, but not zero
– Several developments are on-going for the LHC BLMs, BSRT and the feedbacks to improve availability
– We propose to extend the use of sequencer tasks before each fill and with safe beams
  – Workload to be estimated (specification and implementation)
– A framework for regular/daily checks of BI performance is being formalised