Clement Derrez, Test Engineer


BCM verification plan
Clement Derrez, Test Engineer
European Spallation Source ERIC
www.europeanspallationsource.se
9 July 2019

Outline
- Naming convention
- Data management system: Insight
- Acceptance tests workflow
- BCM tests description
- After-installation tests
- Walk-through: workflow from acceptance tests to commissioning

Naming convention
Every Field Replaceable Unit (FRU) is named. A FRU can be a simple part or an assembly.
Example: FEB-050ROW:PBI-PPC-002 for a cabinet cable.
Naming convention: Sec-Sub:Dis-Dev-Idx, i.e. Section - Subsection : Discipline - Device type - Index.
Two major areas of FRU installation slots are identified:
- tunnel: including stub, tunnel wall and gallery wall,
- support: front end building (FEB), klystron gallery, gallery support area (GSA) in A2T.
The system name is not misused!
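A minimal sketch of a validator for the Sec-Sub:Dis-Dev-Idx convention. The character classes below are assumptions for illustration; the official ESS naming rules are more detailed than this pattern.

```python
import re

# Assumed pattern for Sec-Sub:Dis-Dev-Idx; the real ESS naming standard
# constrains each field more precisely than these character classes.
FRU_NAME = re.compile(
    r"^(?P<sec>[A-Z0-9]+)-(?P<sub>[A-Z0-9]+):"
    r"(?P<dis>[A-Z]+)-(?P<dev>[A-Z]+)-(?P<idx>\d+)$"
)

def parse_fru_name(name: str) -> dict:
    """Split a FRU name into its convention fields, or raise ValueError."""
    m = FRU_NAME.match(name)
    if m is None:
        raise ValueError(f"not a valid FRU name: {name}")
    return m.groupdict()
```

For the cabinet cable above, `parse_fru_name("FEB-050ROW:PBI-PPC-002")` yields section `FEB`, subsection `050ROW`, discipline `PBI`, device type `PPC` and index `002`.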

Data management system
- Data export / import with other software tools.
- System and subsystem test results come from different locations (ESS, IK partners, industry partners...).
- We must be able to trace acceptance test results back to laboratory measuring devices.
- We need to be able to prepare an installation batch when an installation slot is ready: this calls for a dynamic tool.
Having a reliable data management system is critical!
Illustration: mention a possible issue if we want to find out which test results are affected when a deviation is detected in a lab device calibration. How can we make sure we will be able to store all our systems' documentation reliably? The answer is Insight.

Data management system: Insight
- Ensures traceability between test and production data, system components and laboratory devices.
- Objects' attributes store all FRU information: responsible person, current status (procured, received, RFI...), etc.
- Added value, current installation or production progress... can be extracted automatically for each system or FRU.
- Timeline is managed in Jira; tasks are linked to Insight. We do not want to replace CHESS!
Insight holds: PBI shopping list, component requirements, system FRU lists, test reports, test plans, laboratory devices, test design documents. JIRA holds the timeline.
Add a demo here using Insight? We can get a quick overview of any project extracted from this system: the percentage of all BPMs in "RFI" state, for instance. Stress that we do not replace CHESS when using Insight: this is a data management tool.

Data management system: Insight
Status changes are triggered when all required documents and conditions are met.
ICS database requirements are being defined. The database will be populated automatically from Insight when available.

Data management system: Insight
No lost effort, as everything can be scripted to populate external tools and extract any needed information!
- Installation batch: installation status easily verified and prepared.
- Data is uploaded by the BD team. Training IK and industrial partners on this is possible.
Main additional points to mention:
- When we prepare a system for installation, we can make sure all components are stored and tested by checking the corresponding installation batch status.
- Each object shares a minimum set of mandatory attributes.
Screenshot: example of a BPM beam line element. A BPM button and its beam line cables are visible here; it is part of this system, etc. (We can go up to the mTCA objects easily.)
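As a sketch of the kind of scripting mentioned above, the status overview (e.g. the percentage of BPMs in "RFI" state) can be computed from an exported object list. The dict schema with a "status" key is an assumption for illustration; the actual Insight export format and query step are not shown here.

```python
from collections import Counter
from typing import Dict, List

def status_summary(objects: List[dict]) -> Dict[str, float]:
    """Percentage of objects in each status, from an assumed Insight export
    where every object dict carries a "status" attribute."""
    counts = Counter(o["status"] for o in objects)
    total = len(objects)
    return {status: 100.0 * n / total for status, n in counts.items()}
```

A batch of FRU records can then be summarized in one call before preparing an installation batch.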

Data management system: Insight
The hierarchy goes down to system cables, with their status and properties.
Example of a beam line cable connecting an ACCT to a patch panel:

Measurement files format
- Large number of measurement parameters: data is logged in HDF5 and uploaded to Insight.
- Data files produced at ESS follow a fixed format: mandatory attributes are defined for files, groups and datasets.
- Existing data received from IK and industry partners is stored as is for now, adding the needed metadata. Re-formatting to HDF5 is planned for each test result.
A few words on the measurement files format: we already have data from DESY and ESS Bilbao on a shared drive. We will upload it as it is now, adding metadata to it.
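A minimal sketch of such a fixed HDF5 layout using h5py, with mandatory attributes at file, group and dataset level. The attribute names (`test_id`, `device_id`, `units`, ...) and values are placeholders: the actual ESS metadata schema is defined internally.

```python
import h5py

# Hypothetical file layout: attribute names and values are placeholders,
# not the real ESS schema.
with h5py.File("acct_test.h5", "w") as f:
    f.attrs["test_id"] = "BCM-ADC-SNR-001"           # hypothetical test ID
    f.attrs["timestamp"] = "2019-07-09T10:00:00Z"
    grp = f.create_group("adc_channel_0")
    grp.attrs["device_id"] = "FEB-050ROW:PBI-PPC-002"  # FRU under test
    dset = grp.create_dataset("waveform", data=[0.0, 0.1, 0.2, 0.1])
    dset.attrs["units"] = "V"
    dset.attrs["sample_rate_hz"] = 1.0e6             # hypothetical value
```

Because attributes travel with the file, a reader can validate that every group and dataset carries its mandatory metadata before upload to Insight.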

Acceptance tests workflow
1. System test requirements: test requirements are defined for each subsystem.
2. Test plan: describes the subsystem's test setup and characterization procedures. When not COTS, the test plan is prepared, discussed and agreed with the IK partner.
3. Test design document: the design document is validated by the system lead; we make sure test requirements are correctly translated. Test coverage is reviewed, with a possible risk assessment if we do not reach 100% coverage. Test IDs are defined for each measurement.
4. Test bench implementation: the test system is implemented in Python.
5. Run tests: acceptance tests are performed in RATS; data is logged in Insight.
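The steps above can be sketched as a minimal Python test bench: each acceptance test carries the test ID defined in the test design document, a measurement routine and a pass/fail threshold. All names here are illustrative; the real bench and its Insight upload step are not shown.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class AcceptanceTest:
    test_id: str                  # ID from the test design document
    measure: Callable[[], float]  # returns the measured figure of merit
    minimum: float                # pass if measured value >= minimum

def run_tests(tests: List[AcceptanceTest]) -> Dict[str, Tuple[float, bool]]:
    """Run each test and map its ID to (measured value, pass/fail)."""
    results = {}
    for t in tests:
        value = t.measure()
        results[t.test_id] = (value, value >= t.minimum)
    return results
```

The resulting dictionary is the kind of record that would be serialized and logged against the FRU in Insight.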

Where do we test our systems?
RATS preliminary layout (drawing showing the BI area; 36 m and 87 m dimensions indicated on the plan).

Acceptance tests workflow
1. RATS: component received.
2. RATS: component tested.
3. RATS: component stored.
4. RATS: system assembly and test (RFI).
5. RATS: system stored.
6. Installation slot: system goes to the ESS AT site.
Guideline: install and test as much as possible, as early as possible. The learning curve will help us move faster with installation work after the first systems are processed. Last step: commissioning.
We focus on having as much automation as possible in our test and tuning stations! This flow might change depending on whether we secure some area on site.

Tests description
When Struck digitizer boards are received, acceptance tests are performed to characterize the ADC on all channels:

Parameter        | Test description                                   | Pass/Fail threshold
ADC SNR          | Using a low-noise & low-distortion generator       | 65 dBc
ADC SFDR         |                                                    |
ADC SINAD        |                                                    | 60 dBc
ADC ENOB         |                                                    | 11.5 bits
RTM clock source | Test the ADC clock using the RTM input as source   |

Specifications are based on the "BCM electronics and cables" presentation.
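The ENOB threshold above relates to SINAD through the standard ideal-ADC formula SINAD = 6.02 * N + 1.76 dB. This is textbook ADC arithmetic, not ESS-specific code:

```python
def enob_from_sinad(sinad_db: float) -> float:
    # Effective number of bits from measured SINAD (dB).
    return (sinad_db - 1.76) / 6.02

def sinad_from_enob(bits: float) -> float:
    # SINAD (dB) an ideal ADC of the given bit count would achieve.
    return 6.02 * bits + 1.76
```

By this relation, the 11.5-bit ENOB threshold corresponds to a SINAD of about 71 dB.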

Tests description
The next tests cover the complete system, from toroid + front end to digitizer card:

Parameter                 | Test description                                                             | Pass/Fail threshold
Current measurement range | Measure non-linearity on a full-scale input waveform                         | non-linearity < 0.1 % FS
Pulse rise time           | Measure pulse rise time, sweeping pulse amplitude up to 80 mA                | < 300 ns (cable length < 20 m)
Latency                   | Measure response time from current through toroid to AMC/RTM digital output  | < 2 us
Droop                     | Acquire a 6 ms pulse, measure droop value                                    | < 2 % per ms
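The droop figure in the table can be computed as the fractional decay of the pulse flat top, normalized per millisecond. The indexing and flat-top limits below are simplified assumptions, not the bench's actual analysis code:

```python
from typing import Sequence

def droop_percent_per_ms(samples: Sequence[float], sample_rate_hz: float,
                         t_start_s: float, t_end_s: float) -> float:
    """Compare flat-top amplitude at the start and end of the pulse and
    normalize to % per millisecond."""
    v0 = samples[int(t_start_s * sample_rate_hz)]
    v1 = samples[int(t_end_s * sample_rate_hz)]
    duration_ms = (t_end_s - t_start_s) * 1e3
    return 100.0 * (v0 - v1) / v0 / duration_ms
```

For the 6 ms test pulse, a result above 2 would fail the threshold in the table.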

Tests description

Parameter                 | Test description                                                                         | Pass/Fail threshold
Noise                     | Measure noise in lab conditions; adjust threshold after first tests                      | < 1 % (RMS) FS
Temperature-induced drift | Measure signal drift vs temperature over a range of xx degC                              | < +/- 1 % FS
Bandwidth                 | Measure the data acquisition system's bandwidth: sweep sine input frequency up to 3 MHz  | > 1 MHz (cable length < 20 m)
Overshoot                 | Sweep pulse amplitude up to 80 mA, measure overshoot                                     | time and amplitude thresholds to define

Specifications are based on the "BCM electronics and cables" presentation. We will adjust the noise pass/fail threshold after the first systems' tests.
Uploaded to the database:
- ACCT-E calibration data
- Automated electronics test results: output of the tests described in the table above
- Production test results from the manufacturer (Bergoz)
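The noise figure of merit in the table can be sketched as the RMS of the AC component of a quiet-input acquisition, expressed as a percentage of full scale (a simplified illustration, not the bench's actual routine):

```python
import math
from typing import Sequence

def rms_noise_percent_fs(samples: Sequence[float], full_scale: float) -> float:
    """RMS of the mean-subtracted samples, as a percentage of full scale."""
    mean = sum(samples) / len(samples)
    rms = math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
    return 100.0 * rms / full_scale
```

A value below 1.0 passes the provisional < 1 % (RMS) FS threshold.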

After-installation tests sequence
- Initial first-time commissioning: test_ID
- Cold check-out: test_ID
- Commissioning with beam: test_ID
- Quick self-check: test_ID
Sequential testing is important (test timestamps are checked automatically); otherwise the statement "the instrument is installed and working properly" carries less confidence.
During debugging: relevant tests in each architectural layer are repeated until satisfactory results are obtained. If the problem is identified and can be isolated within its layer, there is normally no need to repeat all of the tests that follow sequentially. Depending on the situation, some tests might nevertheless become mandatory (for instance, a software recompilation might entail a standard interface check, and a repaired connector might entail a signal transmission check).
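The automatic timestamp check mentioned above can be sketched as follows; the stage names are placeholders for the real test IDs, and ISO 8601 timestamps are an assumed record format:

```python
from datetime import datetime
from typing import Dict, List

# Placeholder stage names standing in for the real test IDs.
SEQUENCE: List[str] = [
    "initial_commissioning",
    "cold_checkout",
    "beam_commissioning",
    "quick_self_check",
]

def sequence_ok(timestamps: Dict[str, str]) -> bool:
    """True if every stage ran no earlier than the preceding one,
    given ISO 8601 timestamp strings per stage."""
    times = [datetime.fromisoformat(timestamps[s]) for s in SEQUENCE]
    return all(a <= b for a, b in zip(times, times[1:]))
```

A failed check flags that an earlier stage was re-run after a later one, so the later results may need repeating.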

After-installation tests sequence
Cold check-out: testing of the whole instrument on all architectural layers: monitor, front-end electronics, cables, mTCA electronics, timing, data treatment, publishing, network transmission and machine protection interface.
Commissioning with beam: aims at verifying the correctness of the integration for machine operations. Includes initial comparisons and cross-calibrations in order to gain confidence in the instrument. Performance-limiting factors are identified.
Quick self-test: the self-test procedure includes testing of calibration, machine protection and data transmission.

Thank you! Questions?