icBLM verification plan


icBLM verification plan
Clement Derrez, European Spallation Source ERIC

Reminders: naming convention, data management tools
- Naming convention: every Field Replaceable Unit (FRU) is named. A FRU can be a simple part or an assembly.
- Data management system: migrating from Insight to EAM, with data export/import to and from other software tools. This ensures traceability between test and production data, system components and laboratory devices. System and subsystem test results come from different locations (ESS, IK partners, commercial partners…). SciCat is a good candidate to complement EAM.
- Data file format: HDF5. Minimum mandatory attributes are defined for files, groups and datasets.
- Insight -> EAM: component requirements, system FRU lists, test reports, test plans, laboratory devices, test design documents.
- JIRA: timeline.
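The "minimum mandatory attributes" rule above can be enforced automatically before a result file enters the data management system. The sketch below checks a file/group/dataset tree against a mandatory-attribute schema; the attribute names (`fru_name`, `test_id`, etc.) are illustrative assumptions, not the actual ESS schema.

```python
# Sketch: validate the minimum mandatory HDF5 attributes on each node
# before a test-result file is accepted. Attribute names are hypothetical.

MANDATORY_ATTRS = {
    "file": {"fru_name", "test_id", "operator", "timestamp"},
    "group": {"instrument", "location"},
    "dataset": {"units"},
}

def missing_attrs(node_kind, attrs):
    """Return the mandatory attributes absent from one file/group/dataset."""
    return sorted(MANDATORY_ATTRS[node_kind] - set(attrs))

def validate_tree(tree):
    """Walk a nested dict {kind, attrs, children} and collect violations."""
    problems = []
    def walk(node, path):
        gap = missing_attrs(node["kind"], node.get("attrs", {}))
        if gap:
            problems.append((path, gap))
        for name, child in node.get("children", {}).items():
            walk(child, f"{path}/{name}")
    walk(tree, "")
    return problems

# Example: a file whose dataset is missing its "units" attribute.
example = {
    "kind": "file",
    "attrs": {"fru_name": "icBLM-001", "test_id": "icBLM_7",
              "operator": "cd", "timestamp": "2020-01-01T00:00:00"},
    "children": {
        "raw": {"kind": "group",
                "attrs": {"instrument": "picoammeter", "location": "BI lab"},
                "children": {"current": {"kind": "dataset", "attrs": {}}}},
    },
}
print(validate_tree(example))  # flags /raw/current for missing "units"
```

The same walk could run directly on an `h5py` file object; a plain dict is used here to keep the sketch self-contained.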

Reminders: acceptance workflow
1. System requirements: requirements are analyzed for each system and its components.
2. System verification plan: describes the subsystems' test setups and characterization procedures. When not COTS, the test plan is prepared, discussed and agreed with the IK partner.
3. Validation of the verification plan: the verification document is validated by the system lead, making sure test requirements are correctly translated. Test coverage is reviewed, and test IDs are defined for each measurement.
4. Run tests: acceptance tests are performed in the BI lab, possibly compared with IK results. System verification is performed after installation, without beam and with beam, from the LCR.
5. System hand-over.

System FRUs

Component              | Acronym | Description
AMC                    | AMC     | 1 per up to 2 FMCs, IOxOS 1411 type
FMC digitizer          | FMC     | CAEN FMC-PICO-1M4-C3
Timing receiver        | EVR     | 1 per MTCA chassis; event receiver for trigger, clock, calibration announcement and beam/machine mode information
Chassis                | MTCA    | 1 per system
MicroTCA Carrier Hub   | MCH     | 1 per MTCA chassis
MicroTCA power supply  | PS      | 1 per MTCA chassis, 600/1000 W
MicroTCA CPU           | CPU     | 1 per MTCA chassis, Intel i7
ECAT crate, populated  | ECAT    | 1 per HV supply
HV supply              | HV      | 1 per n systems, ISEG HV supply
Rack patch panel       | PP      | 1 per n systems, top of rack
Rack                   | U       | 1 per n ACCTs, rack for electronics
Detector unit          | icBLM   | 1 ionization chamber
Beam line patch panel  | BLPP    | 1 per N ionization chambers
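For the traceability goals above, each row of this table maps naturally onto one record in the data management system. A minimal sketch, with illustrative field names rather than the EAM schema:

```python
# Sketch: one FRU table row as a record, so the list can be exported to
# the data management system. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class FRU:
    component: str
    acronym: str
    description: str

FRUS = [
    FRU("AMC", "AMC", "1 per up to 2 FMCs, IOxOS 1411 type"),
    FRU("FMC digitizer", "FMC", "CAEN FMC-PICO-1M4-C3"),
    FRU("Timing receiver", "EVR", "1 per MTCA chassis"),
    FRU("Detector unit", "icBLM", "1 ionization chamber"),
]

def by_acronym(frus, acronym):
    """Look up a FRU by its acronym."""
    return next(f for f in frus if f.acronym == acronym)

print(by_acronym(FRUS, "EVR").component)  # Timing receiver
```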

Verification identification: HV unit and ionization chamber detectors

HV unit
Test ID  | Verification procedure
icBLM_1  | Visual inspection: check the delivered HV card visually.
icBLM_2  | Install the HV unit with ECAT. Connect to the IC with a test cable. Apply a test voltage. Verify communication between ECAT and the HV supply.
icBLM_3  | Verify current readback.
icBLM_4  | Verify voltage readback.
icBLM_5  | Verify voltage readback RMS.

Detectors
Test ID  | Verification procedure
icBLM_6  | Visual inspection: check the detector visually: SHV connectors, BNC connector, serial number labels.
icBLM_7  | Connect the signal cable to the pico-ammeter in the leakage test bench. Run the automated measurement.
icBLM_8  | Connect the signal cable to the pico-ammeter in the radiation source test bench. Run the automated measurement.
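The readback checks (icBLM_3 to icBLM_5) boil down to comparing readbacks against the setpoint within a tolerance, and bounding the readback RMS. A minimal sketch; the tolerance values and sample readbacks are made up for illustration, not ISEG specifications:

```python
# Sketch of icBLM_3..icBLM_5: setpoint-vs-readback agreement and readback
# RMS noise. Numbers below are illustrative, not acceptance criteria.
import math

def within(readback, setpoint, rel_tol):
    """Pass if the readback agrees with the setpoint within rel_tol."""
    return abs(readback - setpoint) <= rel_tol * abs(setpoint)

def rms(samples):
    """RMS deviation of repeated readbacks around their mean."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

# Example with made-up numbers: 1500 V setpoint, 1% relative tolerance.
voltage_readbacks = [1498.7, 1499.2, 1500.4, 1499.8]
setpoint_v = 1500.0
print(all(within(v, setpoint_v, 0.01) for v in voltage_readbacks))  # True
print(rms(voltage_readbacks) < 1.0)  # readback RMS below a 1 V bound: True
```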

Verification identification: digitizer

Run the auto-test software to measure:
Test ID  | Verification procedure                                    | Threshold
icBLM_9  | Channel RMS noise, open circuit, RNG0                     | < 250 nA
icBLM_10 | Channel RMS noise, open circuit, RNG1                     | < 15 nA
icBLM_11 | Channel RMS noise, with long cable and detector, RNG0     | < 150 nA
icBLM_12 | Channel RMS noise, with long cable and detector, RNG1     | < 180 nA
icBLM_13 | Bandwidth                                                 | 300 kHz
icBLM_14 | ADC ENOB                                                  | 14.5 bits
icBLM_15 | Crosstalk                                                 | < 0.1% FS
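Two of these checks can be sketched numerically: channel RMS noise from a capture, and ENOB derived from a measured SINAD using the standard (SINAD - 1.76)/6.02 formula. The capture values and the SINAD figure below are illustrative, not measurement data:

```python
# Sketch of icBLM_9..icBLM_15 style checks: RMS noise of a capture and
# ENOB from SINAD, compared against the slide's thresholds.
import math

def rms_noise(samples):
    """RMS of the samples around their mean (baseline removed)."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

def enob(sinad_db):
    """Effective number of bits from SINAD: (SINAD - 1.76 dB) / 6.02."""
    return (sinad_db - 1.76) / 6.02

# Open-circuit capture on RNG0, in nA (illustrative values):
capture_na = [12.0, -8.0, 5.0, -9.0, 3.0, -3.0]
print(rms_noise(capture_na) < 250.0)   # icBLM_9 threshold: True
print(enob(89.1) >= 14.5)              # icBLM_14 threshold, assumed SINAD
```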

Reminders: after-installation test sequence
Sequence: cold check-out, commissioning with beam, quick self-check.
Sequential testing is important (test timestamps are checked automatically); otherwise the statement "the instrument is installed and working properly" carries less confidence.
During debugging: relevant tests in each architectural layer are repeated until satisfactory results are obtained. If the problem is identified and can be isolated within its layer, there is normally no need to repeat all of the tests that follow sequentially. Depending on the situation, some tests may nevertheless become mandatory (for instance, a software recompilation might entail a standard interface check, and a repaired connector a signal transmission check).
Cold check-out: testing of the whole instrument on all architectural layers: monitor, front-end electronics, cables, mTCA electronics, timing, data treatment, publishing, network transmission and machine protection interface.
Commissioning with beam: aims at verifying the correctness of the integration for machine operations. Includes initial comparisons and cross-calibrations in order to gain confidence in the instrument. Performance-limiting factors are identified.
Quick self-check: the self-test procedure includes testing of calibration, machine protection and data transmission.
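The automatic timestamp check described above can be sketched as: the sequence is valid only if each stage's tests ran after the previous stage completed. Stage names follow the slide; the timestamps are illustrative:

```python
# Sketch of the automatic timestamp ordering check: each stage's latest
# test timestamp must be later than the previous stage's.
from datetime import datetime

SEQUENCE = ["cold check-out", "commissioning with beam", "quick self-check"]

def sequence_ok(results):
    """results: {stage: latest test timestamp}. True if run in order."""
    times = [results[stage] for stage in SEQUENCE]
    return all(a < b for a, b in zip(times, times[1:]))

results = {
    "cold check-out": datetime(2020, 3, 1, 9, 0),
    "commissioning with beam": datetime(2020, 3, 5, 14, 0),
    "quick self-check": datetime(2020, 3, 6, 8, 30),
}
print(sequence_ok(results))  # True: stages ran in the required order
```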

Verification identification: cold check-out
Assembled system: detector, uTCA electronics, ECAT electronics. Part ID: NA. Detailed test results document: links on CHESS, if applicable. The control room OPI is used for all of these steps from now on.

Test ID  | Verification procedure                                            | Threshold
icBLM_16 | Check signal RMS and peak-to-peak noise in RNG1                   | < 180 nA
icBLM_17 | Self-test verification: check signal continuity through the HV modulation signal
icBLM_18 | Detector mapping and complete acquisition chain verification: use a plastic hammer to induce a shock on the detector and verify the corresponding signal peak on the OPI

Review pass/fail results: assess possible risks when a test is not passed, and explain why a particular step is skipped (if applicable).

Verification identification: commissioning with beam
Part ID: NA. Detailed test results document: links on CHESS, if applicable. These tests require RF present (with stable, though not necessarily nominal, settings).

icBLM_20 | Trigger delays / verify background subtraction using the offline analysis procedure:
For each trigger setting, raw data without baseline subtraction is acquired (one of the Data on Demand (DoD) buffers). The average baseline is subtracted, calculated from another DoD buffer where the waveforms used for baseline subtraction are stored: for each pulse period (14 Hz), the average baseline waveform is subtracted from the raw data to obtain a "processed" waveform. From at least 100 processed waveforms an average is computed. Report the average processed waveform with the best trigger setting.

icBLM_21 | Trigger delays / monitor accumulated loss (or neutron counts) over the beam pulse (i.e. loss over the BEAM_ON period inside the pulse period):
For each trigger delay setting, an average value for this loss is computed. Report the trigger setting with the highest average value.

icBLM_22 | Monte Carlo simulation verification and equivalent-loss scaling factor:
Define a set of controlled losses for simulation verification. Compare simulated and measured results; in case of larger discrepancies, find the source and modify the simulation model accordingly. Once the simulation geometry model is verified, use the results to define scaling factors: for each detector, relate the number of lost protons to the measured current for a particular loss scenario. This gives a factor that can be used to calculate "equivalent lost protons" from the measurement during operation. Each group of detectors has a loss scenario assigned to it.

icBLM_23 | Protection function commissioning:
Identify controlled loss scenarios with different loss time evolutions (different time constants) and tune the protection function algorithm. Produce these controlled losses to tune the protection algorithms.

icBLM_24 | MP thresholds:
Identify a few likely accident scenarios that are most damaging. Use Monte Carlo simulations (Geant4) coupled with thermo-mechanical simulations (ANSYS) to understand the damage potential. Produce a risk matrix that serves as a baseline to select MP thresholds.

Review pass/fail results: assess possible risks when a test is not passed, and explain why a particular step is skipped (if applicable).
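The icBLM_20/icBLM_21 offline analysis can be sketched as plain waveform arithmetic: subtract the average baseline waveform from each raw waveform, average the processed waveforms, and pick the trigger delay whose average accumulated loss is highest. The waveforms and delay settings below are illustrative stand-ins for the DoD buffer contents:

```python
# Sketch of the icBLM_20/icBLM_21 offline analysis on illustrative data.

def average_waveform(waveforms):
    """Sample-by-sample average of equal-length waveforms."""
    n = len(waveforms)
    return [sum(samples) / n for samples in zip(*waveforms)]

def processed_average(raw_waveforms, baseline_waveforms):
    """Subtract the average baseline from each raw waveform, then average
    the processed waveforms (icBLM_20)."""
    baseline = average_waveform(baseline_waveforms)
    processed = [[s - b for s, b in zip(w, baseline)] for w in raw_waveforms]
    return average_waveform(processed)

def best_delay(loss_by_delay):
    """icBLM_21: the delay whose average accumulated loss is highest."""
    return max(loss_by_delay,
               key=lambda d: sum(loss_by_delay[d]) / len(loss_by_delay[d]))

raw = [[5.0, 9.0, 6.0], [7.0, 11.0, 8.0]]   # raw DoD waveforms (made up)
base = [[1.0, 2.0, 1.0], [1.0, 2.0, 1.0]]   # baseline DoD waveforms
print(processed_average(raw, base))          # [5.0, 8.0, 6.0]
print(best_delay({0: [3.0, 4.0], 100: [6.0, 5.0]}))  # 100
```

In practice the average would run over at least 100 processed waveforms per trigger setting, as the procedure above requires; two are used here to keep the example short.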

Coming priorities
- Continue tests in FREIA to better assess the cables' noise performance and background levels
- Further integration tests with the ISEG HV PSU in the ICS integration lab
- Deeper evaluation of the picoammeter FMC on the IOxOS carrier
- Detector tests in the BD lab

Thank you! Questions?