Test Systems Software / FEE Controls
Peter Chochula

PTS Status
- PTS v2.0:
  - Analysis and DBMS decoupled from the system (easy to upgrade now)
  - System configuration via ASCII files
  - Possibility to dump settings to new configuration files
  - Loadable mask-bit and test-bit matrices
  - Fully integrated bus
  - Updated panels
  - … and bugs fixed

PTS Version 2.0 – Main CP
- Help available (to be extended…)
- DBMS integration
- Status overview
- Simplified configuration

PTS 2.0 – JTAG Integration
- Supported controllers:
  - Corelis MVME (with or without external multiplexer)
  - Corelis 100f (ISA)
  - JTAG Technologies 3710 PCI (test beams)
  - KEJTAG v2.0
- Automatic controller test

PTS 2.0 – Supported Testbeam Setup
(Diagram: reference planes, the tested object and scintillators)

PTS 2.0 – DAQ Software
- 3 planes, 1–10 chips per plane
- Automatic data integrity checks

PTS New Debugging Tool – Data Analyser
- Plugins: run conditions, buffered beam profile, data frame decoder, event display
- Single-event processing

PTS 2.0 – A1 and Bus Manual Controls
- Manual controls integrated with the Pilot MCM (beta)
- Status of the MCM JTAG configuration
- MCM manual control – JTAG configuration
- …Analog Pilot not yet fully integrated

PTS 2.0 – Threshold Scans – New Data Format
- New (flexible) data format
- The ROOT interface recognizes the data format

PTS 2.0 – DAC Sweep
- Ready for any bus configuration
- Uses MB DACs or an external device
- Integration of an external device is easy (one VI only)

LabVIEW upgrade to v6?
- If yes, then all institutes should upgrade at the same time
- CERN can upgrade only as the last one

SPD Front-end and Readout Electronics Setup & Configuration
- Based on a talk given at the ALICE TB, January 2003
- Please see also the related document on the ALICE DCS web (Documents -> FERO)

ALICE Online Software Hierarchy
(Diagram: the ECS sits above the DCS, DAQ/RC, TRG and HLT; each has detector branches such as the TPC and SPD, with FERO, gas, LV and HV subsystems under the DCS. Source: S. Vascotto, TB presentation, October 2002)

Partitioning of ALICE Online Systems
(Diagram: within "Partition A", a PCA coordinates the DAQ/RC, DCS and TRG; the ECA sits above the partitions. Source: S. Vascotto, TB presentation, October 2002)

Example: The Design of the SPD
(Diagram: Pilot MCM, sensor, readout chip and bus)

Summary: ALICE FERO Architectures
(Diagram: FERO classes A–D with their configuration and monitoring paths)
- Class A: the DDL is used to configure the FERO; monitoring is based on a different technology
- Class B: there are 2 options to configure the FERO – DDL-based (same as Class A) or non-DDL (Ethernet, etc.)
- Classes C and D: the DDL is not involved in configuration; configuration and monitoring share the access path to the FERO

Controls Technologies
- The DCS interacts with devices via well-defined interfaces
- Hardware details are usually transparent to the upper layers (examples: CAEN, ISEG)
- The preferred communication technologies are OPC and DIM
(Diagram: layered stack – device hardware; process management (PLC, …); communications (OPC, DIM); supervision (SCADA); customization, FSM)

Concept of the Front-end Device (FED)
(Diagram: the FED consists of a FED CPU running a DIM server and the FERO hardware, accessed over Profibus, JTAG, etc.; the DCS connects as DIM client under the PCA, with a PLC/LVPS providing an additional monitoring path; the DAQ workstation (LDC) reaches the front-end through the DDL and its software)

SPD – FED Interface to DCS
(Diagram: the SPD FED comprises the router, which handles half-stave control and data over JTAG and ships SPD data over the DDL, and a dedicated CPU (workstation) running the private, time-critical software; a DIM-based standard interface exposes VR control, VR status, currents, voltages and temperatures to the DCS (PVSS))

DIM Protocol
- Service-based protocol
- A client can subscribe to a service and define the update policy
- Easy to implement on different platforms
(Diagram: servers register their services with the DIM name server; a client requests service information from the name server, subscribes to the service and receives service data; commands flow from the client to the server. Source: C. Gaspar)
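To make the service model concrete, here is a minimal sketch of a DIM publisher in C++, assuming the standard DIM C++ headers (dis.hxx for the server side); the service name, server name and the readTemperature() helper are hypothetical placeholders, and error handling is omitted.

```cpp
// Minimal DIM server sketch (assumes the standard DIM C++ API from dis.hxx).
// Service/server names and readTemperature() are hypothetical placeholders.
#include <dis.hxx>
#include <unistd.h>

static float readTemperature()   // stand-in for a real FED readout call
{
    return 25.0f;
}

int main()
{
    float temperature = readTemperature();

    // Publish the value under a name; clients locate it through the DIM name server.
    DimService tempService("SPD/FED/TEMPERATURE", temperature);

    // Register this process and its services with the DIM name server.
    DimServer::start("SPD_FED_SERVER");

    for (;;) {
        temperature = readTemperature();
        tempService.updateService();   // push the new value to all subscribed clients
        sleep(5);
    }
}
```

A client (the PVSS side, or a standalone test program) would subscribe by name, typically by creating a DimInfo for "SPD/FED/TEMPERATURE" from dic.hxx and overriding infoHandler() to receive updates according to the chosen update policy.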

Controls Hierarchy Is Based on Functionality
(Diagram: under the PCA, the DCS, DAQ/RC and Trigger branches contain Control Units – configuration CU, monitoring CU, trigger status CU – which steer the corresponding Device Units talking to the FED and FERO hardware; commands flow down, status flows up)
- CU – Control Unit, DU – Device Unit
- The definition and implementation of the Device Units is the detector's responsibility
- See C. Gaspar, "Hierarchical Controls Configuration & Operation", published as a CERN JCOP Framework document
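In ALICE this hierarchy is realised with the JCOP framework FSM toolkit (SMI++); the short C++ sketch below is only an illustration of the idea, with hypothetical names throughout, showing commands propagating down from a Control Unit to its Device Units and a summary state propagating back up.

```cpp
// Illustrative sketch (not the JCOP/SMI++ implementation) of the CU/DU pattern:
// commands travel down the tree, summarized states travel up.
#include <algorithm>
#include <string>
#include <vector>

enum class State { Off, Configuring, Ready, Error };

struct Unit {
    virtual ~Unit() = default;
    virtual void  command(const std::string& cmd) = 0;   // e.g. "CONFIGURE"
    virtual State state() const = 0;
};

// Device Unit: wraps one piece of hardware, e.g. the FERO configuration of a half-stave.
struct FeroConfigDU : Unit {
    State s = State::Off;
    void command(const std::string& cmd) override {
        if (cmd == "CONFIGURE") {
            s = State::Configuring;
            // ... load the FERO configuration data here ...
            s = State::Ready;
        }
    }
    State state() const override { return s; }
};

// Control Unit: forwards commands to its children and derives a summary state.
struct ControlUnit : Unit {
    std::vector<Unit*> children;
    void command(const std::string& cmd) override {
        for (Unit* c : children) c->command(cmd);
    }
    State state() const override {
        if (std::any_of(children.begin(), children.end(),
                        [](Unit* c) { return c->state() == State::Error; }))
            return State::Error;
        if (std::all_of(children.begin(), children.end(),
                        [](Unit* c) { return c->state() == State::Ready; }))
            return State::Ready;
        return State::Configuring;
    }
};
```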

Time Flow of FERO Configuration
(Sequence diagram: interactions among the PCA, DAQ/RC, DCS, the FERO CPU and the FERO hardware during configuration)
- The definition and implementation of the FSM is the detector's responsibility

SPD Readout Layout
(Diagram: the routers connect via MXI-2, with PCI-MXI and MXI-VME bridges linking them to the DCS and DAQ)
- One router services 6 half-staves
- The SPD contains 20 routers

Controlling the VME Crates – MXI Daisy Chain
- Only one PCI controller needed
- Programming is easy – the chain is transparent to the software
- Performance-related questions remain

Controlling the VME Crates – 2 PCI-MXI Bridges in One PC
- Two PCI controllers needed
- Programming still easy (lookup table?)
- Performance: we could gain by using parallel processes

Controlling the VME Crates – 2 PCI-MXI Bridges in Two PCs
- Two PCI controllers and two computers needed
- Programming more complicated on the upper level
- Performance: probably the best

Tasks Running on the Control Workstation
(Diagram: PVSS ("slow") alongside the DIM servers and local monitoring, which handle the fast, time-critical tasks)
- Can a single machine handle this load?
- Do we need to separate PVSS from the local control?
- Do we need to separate the two sides of the SPD?
- Do we even need 3 computers…?
- The answer will be obtained from prototypes

SPD Needs Additional Processing of Configuration Data
(Diagram: a bus on which some chip positions are marked as excluded)
- We need to develop a procedure for fast detection of the bus status
- Configuration data must be correctly formatted

Internal Chip Problems Can Affect the Configuration Strategy
(Diagram: a bus with several chips marked as problematic)

Internal Chip Problems Can Affect the Configuration Strategy (cont.)
- We need to develop a mechanism for problem recovery
- This should not be implemented as a patch in the configuration routine!
- Problems should be described in a "recipe" which is loaded from the configuration database together with the configuration data (see the sketch below)
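As an illustration of what such a recipe could look like as data (all field names here are hypothetical, not an agreed format), the loading procedure would read something like this from the configuration database next to the configuration data itself:

```cpp
// Sketch only: a "recipe" describing known chip problems and their work-arounds,
// stored as data in the configuration DB rather than hard-coded in the loading routine.
#include <string>
#include <vector>

struct ChipProblem {
    int         routerId;      // which router / half-stave / chip is affected
    int         halfStaveId;
    int         chipId;
    std::string symptom;       // e.g. "stuck LSB in a DAC register"
    std::string workaround;    // e.g. "skip read-back verification of bit 0"
};

struct ConfigurationRecipe {
    std::vector<ChipProblem> knownProblems;   // consulted by the configuration procedure
};
```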

Detector Calibration – Standard Approach
(Diagram: online – under the PCA, the DCS loads thresholds and test patterns while the DAQ/RC runs the DAQ and logs the data; offline – analyze the data and prepare new configuration data)

Detector Calibration – Standard Approach (cont.)
- Synchronization between the DAQ and DCS via the PCA will add some overhead
  - A conservative estimate of ~7680 synchronization cycles will add about 2 (or even more) hours of dead time…
- We need a local calibration procedure
  - The SPD will be put into an ignored state during the calibration
  - We need to define the FSM and a DCS recipe
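For orientation, the 2-hour figure is consistent with the quoted number of cycles if each PCA-mediated synchronization costs of the order of one second (the per-cycle cost is an assumption here, not a measured value): 7680 cycles × ~1 s/cycle ≈ 7700 s ≈ 2.1 hours.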

…But…
- This was still not the bad news

Software/Hardware Overhead
- Loading a single chip takes ~300 ms
  - More than 99% of this time is communication overhead
  - This time seems negligible… but:
- The ALICE1 chip is really complicated and big
- Remember, when we started we needed some 2 hours to scan a single chip; this has been reduced to some 5 minutes using several tricks
- The time needed to scan a bus is still ~45 minutes (or 15 with less statistics) and cannot be reduced (the amount of data is bigger by an order of magnitude)

Detector Calibration
- We cannot simply implement the present procedures
  - The estimated time for a full scan is ~30 hours, with 8 hours of JTAG activity (see the estimate below)
- Ways to reduce the needed time:
  - Run scans in parallel… but only one router can be addressed at a time
  - Use the built-in macro option of the KE-JTAG controller
  - Implement a part of the scanning procedures in the router's hardware
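Rough estimate behind the 30-hour figure, assuming the buses are scanned strictly one after another: the SPD has 20 routers × 6 half-staves = 120 half-stave buses, and 120 × ~15 min (reduced-statistics scan) = 1800 min = 30 hours.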

SEU Monitoring
- Standard approach:
  - Write the configuration data into the ALICE1 chips
  - Compare the read-back output with the previously written configuration
- …But…
  - The analysis routines must understand how the configuration is written (bus configuration)
  - Part of the data will be lost:
    - due to the nature of the ALICE1 chips (stuck LSB)
    - due to the tricks used to load chips with internal problems
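A minimal sketch of the read-back comparison, assuming the written and read-back configuration images are available as byte arrays; the mask that excludes unreliable bits (the stuck LSB, bits touched by the loading work-arounds) and all names are hypothetical.

```cpp
// Illustrative SEU check: compare written vs. read-back configuration, counting only
// bit flips in positions that the analysis is allowed to trust (per the compare mask).
#include <bitset>
#include <cstddef>
#include <cstdint>
#include <vector>

std::size_t countSeuCandidates(const std::vector<std::uint8_t>& written,
                               const std::vector<std::uint8_t>& readBack,
                               const std::vector<std::uint8_t>& compareMask)
{
    std::size_t flippedBits = 0;
    for (std::size_t i = 0; i < written.size(); ++i) {
        // XOR marks differing bits; the mask keeps only bits we can compare reliably.
        const std::uint8_t diff = (written[i] ^ readBack[i]) & compareMask[i];
        flippedBits += std::bitset<8>(diff).count();
    }
    return flippedBits;
}
```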

DCS Architecture: Data Flow (Configuration & Logging)
(Diagram: the configuration database supplies DCS recipes, FERO configuration and device configuration to PVSS and the subsystems; monitored data from the hardware flows through PVSS into the archive and the conditions database)

Required Tasks
- Definition of the configuration data
- Definition of the monitoring limits (recipes)
- Definition of the data subset written to the conditions DB
- Development of offline analysis tools

A Few Recommendations
- Base the development on reverse engineering of the PTS
- Use Windows XP and, if possible, Visual Studio .NET as the development platform (at least for final product testing)
- Use MySQL for database prototyping
- Restrict database programming to standard SQL – we will probably change the underlying database for the final system (Oracle?)

Conclusions
- PTS 2.0 is available
- FERO configuration & monitoring still needs a lot of work