Hardware Commissioning Review, 11th July 2007
CO deliverables for HWC – lessons learnt and the proposed way ahead
Markus Zerlauth, AB-CO-MI, for the CO group


CO deliverables for HWC – lessons learnt and the proposed way ahead
Markus Zerlauth, AB-CO-MI, for the CO group

Outline
 Post Mortem analysis and test automation
 Circuit Synoptic
 Logging
 CCC environment for HWC
 E-logbook
 Conclusion

Post Mortem Analysis

PNO.2 auto analysis – current PMA packages
[screenshot: D2 / Q5 / Q4 current traces over ~1 s] (Courtesy of the PM team)

Current Architecture of the PMA system v5
[diagram, courtesy of A. Rijllart]
 Raw data (SDDS) feeds the individual system analysis packages: PC, PIC, QPS, Cryo, Vac, BLM
 Logical system analyses cover the electrical circuits (13 kA circuits, 600 A circuits, 60/80/120 A circuits), the cryo circuits, the magnet system, the standard cell and the equipment (DFB, SC Link)
 An analysis sequencer orchestrates the packages; result data feeds a General System Analysis (GSA), with additional results sent to logging

Post Mortem Analysis – further automation
 No fully automatic analysis is feasible yet, but a further automated hand-shake between the sequencer and the PMA is possible
 Complete the logical system analysis with further dedicated analysis packages of type PNO.2
 Priority must be on the circuit types < 6 kA (the largest quantity)
 The PO, PMA and sequencer teams are investigating possibilities to further automate PO-related tests for the 60 A and 600 A circuits (PCC, crow-bar, etc.)
 Issues:
– The inhomogeneous test plan & interests of the involved parties require global coordination of priorities for the final sequences (need to agree on last changes & freeze as much as possible) and the related automated analysis
– Development priorities need to be urgently agreed between all involved parties in the next weeks, to provide operational & tested packages already for sector 45
– Automated test plan versus manual steps (MTF reporting, parameter update, ...)
 A series of SACEC meetings will address these issues
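A minimal sketch of what such a sequencer–PMA hand-shake could look like, assuming a polling interface: the sequencer fires a test step and then blocks until the analysis publishes a verdict. The `Verdict` states, the `AnalysisHandshake` class and the `poll_verdict` callback are illustrative names, not the actual HWC sequencer API.

```python
import enum
import time

class Verdict(enum.Enum):
    PASS = "pass"          # analysis succeeded, sequencer may proceed
    FAIL = "fail"          # analysis flagged a problem
    PENDING = "pending"    # analysis not finished yet

class AnalysisHandshake:
    """Sequencer-side hand-shake: after a test step, block until the PM
    analysis publishes a verdict for the resulting event (or time out)."""

    def __init__(self, poll_verdict, timeout_s=60.0, poll_interval_s=1.0):
        self.poll_verdict = poll_verdict      # callable: event_id -> Verdict
        self.timeout_s = timeout_s
        self.poll_interval_s = poll_interval_s

    def await_verdict(self, event_id, clock=time.monotonic, sleep=time.sleep):
        deadline = clock() + self.timeout_s
        while True:
            verdict = self.poll_verdict(event_id)
            if verdict is not Verdict.PENDING:
                return verdict
            if clock() >= deadline:
                return Verdict.PENDING  # timed out: fall back to manual analysis
            sleep(self.poll_interval_s)
```

On a `PENDING` timeout the sequencer would stop and hand the event over to manual analysis rather than proceed with the next test step.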

Sequences versus ‘automated’ PMA

Circuit type | Count | Test sequences
60A          | 1036  | PIC1.x, PCC, PIC2.x, PNO.1, PNO.2, PNO.3, PNO.4, PAC
600A         | 410   | PIC1.x, PCC, PIC2.x, PCS.1-4, PLI1.x, PLI2.x, PLI3.x, PNO.1, PNO.6, PNO.10
IPD          | 16    | PIC1.x, PCC, PIC2.x, PLI1.x, PLI2.x, PLI3.x, PNO.5, PNO.6, PNO.10, PAC
IPQ          | 78    | PIC1.x, PCC, PIC2.x, PLI1.x, PLI2.x, PLI3.x, PNO.5, PNO.6, PNO.7, PNO.10, PSQ, PAC
13kA         | 32    | PIC1.x, PCC, PIC2.x, PCS.1-4, PLI1.x, PLI2.x, PLI3.x, PNO.1, PNO.6, PNO.10, PAC

Legend: sequences ready / PM analysis automated

Post Mortem Analysis – ongoing improvements
 PM Data Browser
– Automated loading of correlated PM data from QPS/PIC/EE, etc., based on the FGC event
– Event search based on circuit name and/or time window
– Possibility to manage the list of signals to display in the HWC view (via a text editor)
– The user can save a JPEG of the chart or an Excel sheet of the table to a local folder
 Link with cryogenic signals
– The link between electrical circuits & cryogenic equipment is foreseen to be established in the Layout DB
– To be used by the PM Data Browser, Cryo SCADA, CIET and the Circuit Synoptics
– Scripts will generate a set of signals per circuit
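The automated loading of correlated QPS/PIC/EE data around an FGC event amounts to a query by circuit name plus time window. A simplified sketch under that assumption; `PMEvent`, `correlated_events` and the circuit names are hypothetical, and the real browser queries the PM server rather than an in-memory list:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PMEvent:
    system: str       # originating system, e.g. "FGC", "QPS", "PIC", "EE"
    circuit: str      # circuit name the event belongs to
    stamp: datetime   # event timestamp

def correlated_events(events, fgc_event, window=timedelta(seconds=30)):
    """Return all events on the same circuit within +/- window of the
    triggering FGC event, mimicking the browser's automated loading."""
    return [e for e in events
            if e is not fgc_event
            and e.circuit == fgc_event.circuit
            and abs(e.stamp - fgc_event.stamp) <= window]
```

The same predicate, relaxed to drop the circuit-name match, would implement the pure time-window search mentioned above.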

Post Mortem System – scalability
 Dedicated scalability tests + HWC experience have not shown any bottlenecks in the available PM infrastructure (data collection on the PM server and data conversion)
– Should a bottleneck appear in the future, further servers will be deployed
 Current limitations mostly lie with the gathering of data on the client side
 PM example of the AUG test on 9th July at 9:50:28
– 49 external FGC events, 77 self-triggered FGC events, 17 DQAMS events
– The PM data was received on the server in two batches, each ~1 min long, at 9:51 and 9:56. No errors on the server side, no client disconnections detected during the test.
– Conversion time to pmd format: ~50 ms for a 600 A converter file, ~500 ms for a 60 A converter file, ~40 ms for a DQAMS file

Circuit Synoptic

Circuit Synoptic
 The Circuit Synoptic was an important tool during HWC, gathering the necessary information related to circuit powering on a single screen
 2 major improvements are proposed for the next sectors:
– Inclusion of the 60 A converters (60 A SW power permit and FGC status)
– Link of the circuit synoptics with the related cryogenic signals, e.g. current-lead temperatures, liquid levels in the related DFBs, CRYO_START, CRYO_MAINTAIN, etc.
 This requires considerable work on the configuration DB for the mapping of signals (together with ACR) + data publishing from the CRYO system (ACR & CO), but could be exploited by the circuit synoptic, CIET, CRYO SCADA, etc.
 It would allow for ‘real-time’ trends of e.g. circuit current vs. current-lead temperature within PVSS
(Courtesy of F. Bernard)

Circuit Synoptic v2
[screenshot] (Courtesy of F. Bernard)

Logging

The LHC Logging Service – Architecture
[diagram: clients reach the Measurement DB, Logging DB and Technical Services DB via HTTP (10g AS), JDBC and PL/SQL; the databases run on an Oracle RAC back-end]

Measurement Database
 Contains raw time-series data, at up to a 2 Hz rate
 Stores data for 7 days
 Derives new values based on measured values
 Sends filtered data of interest to the Logging DB
 Generates statistics on accelerator performance

Logging Database
 Contains relatively slow time-series data
 Stores data online for at least 20 years

Database Host
 2 x SUN Fire V240: dual 1 GHz CPU, 4 GB RAM, dual internal disks, dual power supply
 SUN StorEdge 3510FC disk array: 2 Gb fibre-channel bus, dual RAID controllers with 1 GB cache, 12 x 146 GB 10k rpm FC disks, 24 x 300 GB 10k rpm FC disks (2007)
 SUN Solaris 9, Veritas Volume Manager 3.5, Oracle9i Enterprise Edition

Today's data rates
 ,000,000 records / day
 129,000,000 records / day
 147,000 records / day
 6,800,000 records / day

(Courtesy of C. Roderick)
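The Measurement DB's "sends filtered data of interest to the Logging DB" step can be pictured as a deadband filter: a 2 Hz sample is forwarded to long-term logging only when it has moved noticeably away from the last logged value. A minimal sketch under that assumption; the function name and the relative-change criterion are illustrative, and the real service is implemented inside Oracle rather than in Python:

```python
def filter_for_logging(samples, rel_threshold=0.005):
    """Deadband filter: forward a (time, value) sample to long-term logging
    only when it deviates from the last logged value by more than
    rel_threshold (relative). The first sample is always logged."""
    logged = []
    last = None
    for t, v in samples:
        if last is None or abs(v - last) > rel_threshold * abs(last):
            logged.append((t, v))
            last = v
    return logged
```

This is how a 2 Hz raw stream in the Measurement DB collapses into the "relatively slow time-series data" that the Logging DB keeps for 20 years.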

Logging – Infrastructure + Interface
 HW infrastructure
– The current Logging service has to cope with rapidly increasing demands
– Initially intended to deal purely with LHC machine/beam-related data; yet today, data comes from the PS Complex, SPS, SPS-EA, CNGS, ATLAS, LHCb, CMS, CIET, ...
– Worry about the interdependency with LSA (currently on the same database server) + scalability for 8 parallel sectors
– DM has launched a study and will present a proposal for the back-end upgrades at a TC on August 16. HW upgrades will be difficult to have in place for October
 Data extraction interface
– The technology behind the current data extraction tool allows easy web access from inside and outside CERN, but is for this reason also specific and by now outdated
– It is already planned to be replaced by a new implementation, addressing the open issues such as compatibility, performance and flexibility
– The implementation has partially started, but the increasing workload for projects such as LSA and hardware commissioning has made it impossible to progress further up to now (estimated at 2 man-months; due to the specific knowledge required, no easy ‘outsourcing’ is possible)
– Currently on hold...

Logging & data providers – the example of CRYO data
 The present Logging system is used by most players in HWC, who store their system data at rates and accuracies primarily defined by the system experts and the inherent limitations of their systems (CO will discuss with every client to minimise data volumes for long-term storage where possible)
 For the global event analysis, additional data, in this case from the cryogenic system, was retrieved, but first attempts showed limited time resolution and accuracy
 A first meeting with the experts revealed several possibilities for improvement in the controls infrastructure that deserve follow-up, but also inherent limitations of the acquisition electronics -> see also the summary paper by Mike (to be released)
 The data-storage requirements of system designers (doing process control & system setup) and of HWC & MPP experts (doing event analysis) can be very different and should be discussed between the latter two in case the provided accuracy and data rates are not sufficient for HWC activities

Logging – CRYO data
[diagram of the two acquisition paths, courtesy of M. Koratzinos]
 The CIET system: LT data at 0.1 Hz -> FEC at 5 Hz -> WorldFIP -> signal processing (median calculation), ~30 s delay -> SCADA threshold (0.5%) -> db
 The CRYO SCADA system: LT data at 0.1 Hz -> FEC at 5 Hz -> WorldFIP -> PLC (1 Hz, time stamp) -> signal processing (median calculation) -> SCADA thresholds (0.08% / 0.1%), dead time (10 s) -> SCADA archive db
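The chain above (median filtering of the raw FEC samples, followed by a change threshold and a dead time before a point is archived) can be sketched as follows. Combining both criteria in one function is an illustrative simplification, not the actual SCADA configuration:

```python
from statistics import median

def cryo_archive(samples, window=5, threshold=0.005, dead_time=10.0):
    """Sketch of the CRYO archiving path: median-filter the raw (time, value)
    samples over a sliding window, then archive a point only if it changes by
    more than `threshold` (relative) AND at least `dead_time` seconds have
    passed since the last archived point."""
    archived = []
    last_v, last_t = None, None
    buf = []
    for t, v in samples:
        buf.append(v)
        if len(buf) < window:
            continue  # not enough samples for the median yet
        m = median(buf[-window:])
        changed = last_v is None or abs(m - last_v) > threshold * abs(last_v)
        quiet = last_t is None or (t - last_t) >= dead_time
        if changed and quiet:
            archived.append((t, m))
            last_v, last_t = m, t
    return archived
```

The median suppresses single-sample glitches, while the threshold and dead time keep the archive volume low; this is also why the time resolution seen during event analysis is coarser than the raw 5 Hz acquisition.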

CCC Environment + E-logbook

Comments on the working environment...
 Generally, the CCC environment is very much appreciated by the teams
 Passwords, account timeouts, CMF updates and limited access to the GPN were perceived as disturbing during HWC, but need to follow the new CERN security policy
 A uniform way to start programs from Linux and Windows consoles is not easily possible, as many clients only run on Windows (PVSS) or on Linux; to be followed up
 Printing from the different consoles was perceived as difficult
 Installation of LabView 8.2 on the Windows consoles to run the PM application directly
 E-logbook
– A vital tool for tracking and reconstructing HWC activities (the EICs can confirm, especially when multiple tests are ongoing and time is limited)
– Operators have the good habit of accurately filling in these logbooks
– Especially an asset for parallel commissioning, changing shifts, and tests carried out manually (no LSA tracking)
 Do we need further μCCC during HWC?

Conclusions and Outlook
 Many CO services were involved during the sector 78 commissioning, all of which gathered valuable input for further evolution and improvements that will be digested and implemented during the next weeks (timing, infrastructure, PIC, WIC, CRYO SCADA, CIET, QPS SCADA, PMA, applications, WorldFIP, databases, Logging, etc.)
 Further input is needed from HWC and MPP to reach, as soon as possible, a common view on the priorities for further improvements, test automation & global analysis -> SACEC
 Issues such as databases, timing, etc. are not treated here but will be addressed with the concerned teams -> SACEC
 Feedback from the equipment groups & this HWC review will be addressed and followed up within the CO group during a dedicated technical committee next week