Selected Subjects on Controls System - Quality Assurance
P. Charrue, on behalf of the AB Controls Group
LHC Machine Advisory Committee, 16 June 2006


Preamble
AB/CO was asked to talk about Quality Assurance, which is a wide subject. Today we present a selection of subjects that are representative of the QA domain and that deserve attention from the management.

Outline
- Overview of the Controls Infrastructure
- Network Security (CNIC project)
- LHC Application production
- Post-Mortem project
- Conclusions

[Drawing: overview of the controls infrastructure (not transcribed)]

Outline
- Overview of the Controls Infrastructure
- Network Security (CNIC project)
- LHC Application production
- Post-Mortem project
- Conclusions

The CNIC Working Group
The Computing and Network Infrastructure for Controls (CNIC) working group was created by the CERN Executive Board:
- from the recommendations made by the Technical Network Management Working Group (Jul 2004);
- delegated by the CERN Controls Board (Sep 2004);
- "…with a mandate to propose and enforce that the computing and network support provided for controls applications is appropriate" to cope with security issues;
- the mandate covers only control systems, not office computing.
Members come from all CERN controls domains and activities:
- service providers (Network, NICE, Linux, Computer Security);
- service users (AB, AT, LHC Experiments, SC, TS).

Networking at CERN
General Purpose Network (GPN)
- for office work, mail, www, development, …
- no formal connection restrictions imposed by CNIC
Technical Network (TN) and Experiment Network (EN)
- for operational equipment
- formal connection and access restrictions
- limited services available (e.g. no mail server, no external web browsing)
- authorization based on MAC addresses
- networks monitored by IT/CS

[Diagram: network access paths. An office development PC reaches the Technical Network via trusted Application Gateways; a home or remote PC connects through the Internet, the CERN firewall and the CERN public gateways (LXPLUS, CERNTS).]

Possible Threats
Malicious access
- a hacker accessing our devices from outside
- a deliberate attack
- 'sniffing' the data that transits on the TN
Erroneous access
- unintentional errors
- errors committed by CERN personnel in ignorance
Other concerns
- control or granting of access from outside the CCC (CERN Control Centre)
- 'anonymous' traceability
- generic accounts with weak passwords

Malicious access
Not much protection is possible from the CO side against an intentional and motivated security attack from outside or within CERN. However, the TN is relatively difficult to get into from outside without a CERN account. IT security covers protection against these types of threats. CNIC is currently studying intrusion detection on the TN.

What can be done
Security enhancement and traceability are possible at four different levels:
- communication layer
- accounts
- CNIC
- applications

Communication
Implement 'role-based' access to the equipment in the communication infrastructure. Depending on WHICH action is requested, on WHO is making the call, and on WHERE the call is issued from, access will be granted or denied. This allows filtering, control and traceability of access to the equipment.
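As an illustration only, the sketch below shows how such a decision could combine the three criteria (action, caller role, origin). The class, role and location names are hypothetical assumptions and do not represent the RBAC implementation that was eventually deployed at CERN.

```java
// Minimal sketch of a role-based access decision, combining
// WHICH action, WHO calls, and from WHERE the call is issued.
// All names are hypothetical; this is not the CERN RBAC API.
import java.util.Set;

enum Action { READ, SET, RESET }
enum Location { CCC, CERN_OFFICE, REMOTE }

record AccessRequest(String user, Set<String> roles, Location origin,
                     Action action, String device) {}

class AccessRule {
    boolean decide(AccessRequest req) {
        // Reading is allowed for any authenticated user, from anywhere.
        if (req.action() == Action.READ) {
            return true;
        }
        // Settings may only be changed by operators or equipment experts...
        boolean privileged = req.roles().contains("LHC-Operator")
                          || req.roles().contains("Equipment-Expert");
        // ...and only from inside the CCC (see the 'Accounts' slide).
        boolean trustedOrigin = req.origin() == Location.CCC;
        boolean granted = privileged && trustedOrigin;
        // Every decision is logged for traceability (who, where from, what).
        System.out.printf("%s %s on %s from %s -> %s%n",
                req.user(), req.action(), req.device(), req.origin(),
                granted ? "GRANTED" : "DENIED");
        return granted;
    }
}
```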

Accounts
- Forbid 'anonymous' generic accounts
- Enforce the usage of accelerator-oriented accounts
- Enforce regular password changes
- Limit operational accounts to the CCC
All these measures cost nothing, but they may be seen as constraints on the operators' working habits.

Main outcomes of CNIC
- 9 January 2006: closure of the GPN-TN connection
  - no communication is allowed to cross the bridge except from TRUSTED hosts on the GPN to EXPOSED hosts on the TN
  - this reduced the TRUSTED hosts from 10'000+ to 2'000
- NICEFC and LINUXFC deployed operationally on more than 200 hosts
- Access to the Network Description Database (NETOPS) restricted via identification
- More than 40 Application Gateways deployed
- Connection to the TN requires authorization and MAC address authentication
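A minimal sketch of the bridge-filtering rule described above, assuming hypothetical host lists and MAC addresses; in reality the enforcement is done in the network infrastructure and the NETOPS database, not in application code.

```java
// Sketch of the GPN->TN bridge rule: traffic may only cross from a
// TRUSTED host on the GPN to an EXPOSED host on the TN. Host lists,
// MAC addresses and the check itself are illustrative assumptions.
import java.util.Set;

class BridgeFilter {
    private final Set<String> trustedGpnMacs;   // registered TRUSTED hosts (GPN side)
    private final Set<String> exposedTnMacs;    // registered EXPOSED hosts (TN side)

    BridgeFilter(Set<String> trustedGpnMacs, Set<String> exposedTnMacs) {
        this.trustedGpnMacs = trustedGpnMacs;
        this.exposedTnMacs = exposedTnMacs;
    }

    /** True only for connections from a TRUSTED GPN host to an EXPOSED TN host. */
    boolean allowCrossing(String sourceMac, String destinationMac) {
        return trustedGpnMacs.contains(sourceMac) && exposedTnMacs.contains(destinationMac);
    }

    public static void main(String[] args) {
        BridgeFilter filter = new BridgeFilter(
                Set.of("00:16:3e:aa:bb:01"),    // hypothetical trusted office PC
                Set.of("00:16:3e:cc:dd:02"));   // hypothetical exposed TN server
        System.out.println(filter.allowCrossing("00:16:3e:aa:bb:01", "00:16:3e:cc:dd:02")); // true
        System.out.println(filter.allowCrossing("00:16:3e:ff:ff:99", "00:16:3e:cc:dd:02")); // false
    }
}
```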

[Diagram: 3 typical use cases for access to the controls network. (1) An operator in the CCC; (2) a specialist accessing from home or a remote PC through the Internet, the CERN firewall, the CERN public gateways (LXPLUS, CERNTS) and the trusted Application Gateways; (3) access from an office development PC inside CERN via the trusted Application Gateways.]

Pending Study Areas
- Critical Settings encryption (discussions still ongoing)
- Authentication means (e.g. card readers in the consoles, bank-like authentication, …)
- Reduction of the TRUSTED list

Outline
- Overview of the Controls Infrastructure
- Network Security (CNIC project)
- LHC Application production
- Post-Mortem project
- Conclusions

Mandate
The Controls group:
- provides core control functionality and applications (HWC sequencer, equipment state, equipment monitoring, SDDS, …) in collaboration with AB/OP;
- produces and maintains standard facilities (Logging, FDs, LASER, JAPC, SIS, BIC, OASIS, CCM, PM, …);
- develops, maintains and supports UNICOS-based applications (Cryo, QPS, PIC, WIC, …) for industrial control systems;
- provides support for modelling of the Controls database (SPS, HWC, LEIR, LHC) and for the logging and measurement services (Timber, Meter).
AB/CO also provides development environments, tools and graphical components to be used by application developers, equipment and MD specialists: FESA editor, Java dataviewer, general-purpose graphical beans, Java GUI frame, LabVIEW development environment, UNICOS frame, Working Sets & Knobs, Jython, build and release tools, and software support to developers.

Frameworks for LHC Applications
Three approaches are in place to build applications:
- Beam-based control applications: the majority of applications, built on the Java infrastructure
- Industrial control PLC/SCADA-based applications: UNICOS frame based on PVSS
- Post Mortem data analysis: based on LabVIEW

LHC Java Applications and Core
[Diagram: core applications (LSA Trim, Beam Steering, Settings Generation) and equipment/instrumentation applications (BDI applications, Fixed Displays, Controls Settings) sit on top of the high-level services (LSA Core, LSA API), which use the Standard Equipment Access layer (JAPC) and the Controls Middleware (monitoring and concentration) to reach the FESA equipment.]
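To make the layering concrete, here is a deliberately simplified sketch, not the real LSA, JAPC or CMW API; all interfaces and names are invented for illustration. It only shows how an application goes through a high-level service and a standard equipment-access layer rather than talking to the middleware directly.

```java
// Illustrative layering only: application -> high-level service -> standard
// equipment access -> middleware. None of these interfaces correspond to the
// real LSA, JAPC or CMW APIs; they are invented to show the call chain.

/** Lowest layer: the controls middleware transporting device calls. */
interface Middleware {
    Object get(String device, String property);
    void set(String device, String property, Object value);
}

/** Standard equipment access: a uniform parameter abstraction over the middleware. */
class EquipmentAccess {
    private final Middleware middleware;
    EquipmentAccess(Middleware middleware) { this.middleware = middleware; }
    Object readParameter(String device, String property) {
        return middleware.get(device, property);
    }
    void writeParameter(String device, String property, Object value) {
        middleware.set(device, property, value);
    }
}

/** High-level service: turns a physics-level request into parameter writes. */
class SettingsService {
    private final EquipmentAccess access;
    SettingsService(EquipmentAccess access) { this.access = access; }
    /** Trim a setting: applications never touch the middleware directly. */
    void trim(String device, String property, double delta) {
        double current = (Double) access.readParameter(device, property);
        access.writeParameter(device, property, current + delta);
    }
}
```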

LHC Java Applications - Organization
- The work is done in close collaboration with the OP group; we work as one team
- A single project (LSA) is in place, providing the common architecture
- The aim is to use it for LEIR, the SPS, their transfer lines TT40, TI8 and TI2, LHC hardware commissioning and operation
- Test and validate using every possible controls or operational milestone and several dry runs

Issues - Remote Access and Security
- Experts and on-call teams require access to LHC controls from outside the CCC
- Who has the right to modify LHC parameters? Control of certain devices (Schottky) from other institutes has already been requested (US-LARP collaboration)
- We need a remote-access and role-based access policy, and the manpower to implement it
- The identity of the originating account and host has to be registered and propagated through the whole chain (who, and where from), as sketched below
- The business logic between the GUI and the equipment has to react differently according to the origin of the request
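A minimal sketch, under the assumption of invented class and host names, of how a request's origin (account and host) could be attached to every call and passed down the chain so that each layer can log it and react to it:

```java
// Sketch only: carrying 'who' and 'where from' through the call chain.
// Class names, the host-naming convention and the policy are hypothetical,
// not the solution deployed at CERN.
record Origin(String account, String host) {}

class EquipmentFacade {
    /** Each layer receives the origin, so the decision and the log carry it. */
    void applySetting(Origin origin, String device, String property, double value) {
        boolean fromCcc = origin.host().endsWith(".ccc.cern.ch"); // hypothetical naming
        if (!fromCcc) {
            // Business logic reacts differently to a remote origin:
            // e.g. refuse the write, or require an additional confirmation.
            System.out.printf("DENIED set %s/%s by %s from %s%n",
                    device, property, origin.account(), origin.host());
            return;
        }
        System.out.printf("SET %s/%s=%.3f by %s from %s%n",
                device, property, value, origin.account(), origin.host());
        // ... forward to the equipment access layer, still passing 'origin' ...
    }
}
```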

Issues - Time allocated for Testing
- TT40/TI8, HWC, LEIR, CNGS and the SPS ring will now be used to validate the LSA core and applications extensively
- Due to the probable cancellation of the LHC Sector Test at the end of 2006, AB/CO will:
  - organize scalability tests for the complete controls infrastructure
  - need well coordinated dry runs
- We request that time be allocated during LHC commissioning for the final tests of the deployed software

Issues - Resources
- Major core activities are staffed by temporary or departing staff
- The same application developers are working on the HWC, LEIR, CNGS, PS and SPS startups
- The LHC applications list is documented but not fully staffed, clearly showing the lack of resources
- Today 4 FTE from AB/CO/AP, 3 from OP and 1 associate are working on the LHC software production
- We need experienced Java software developers
- Since April '05 we have been actively seeking 6 more associates (3 for HWC and 3 for LSA):
  - hired 1 for HWC in April '06, 1 for LSA in July '06 and 1 for LSA in September
  - we are still missing 2 for HWC and 1 for LSA

Outline
- Overview of the Controls Infrastructure
- Network Security (CNIC project)
- LHC Application production
- Post-Mortem project
- Conclusions

Post Mortem project mandate
After a failure during the operation of the LHC leading to a beam abort or a power abort, a coherent set of so-called "Post Mortem" information will be collected from the various sub-systems to analyse the causes of the failure. To be able to understand the failure before resuming LHC operation, the collected information needs to be analysed within a few minutes, which requires a highly automated data collection and analysis process. The Post Mortem system aims at providing the operators and system experts with data visualisation tools which can combine raw data and automatically analysed data.

Summary of systems with PM requirements
[Table of systems with PM requirements not transcribed]

PMA: Data flow
[Diagram: LHC systems (QPS, PIC, PC, other systems, …) send raw data files to the PM server; the PM analyser produces result data, which feeds the PM viewer, the databases, the logging and the alarms.]
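As a rough illustration of that flow, and under invented class names (this is not the actual PM server implementation), the sketch below routes a raw event from a system through an analyser and stores the result for the viewer:

```java
// Rough sketch of the PM data flow: system -> PM server -> analyser -> results.
// All classes are invented for illustration; the real PM system is not shown here.
import java.util.ArrayList;
import java.util.List;

record RawPmData(String system, long timestamp, double[] samples) {}
record PmResult(String system, boolean flagged, String summary) {}

class PmAnalyser {
    PmResult analyse(RawPmData data) {
        // Placeholder analysis: flag the event if any sample exceeds a limit.
        boolean anyAboveLimit = false;
        for (double s : data.samples()) {
            if (s > 0.1) { anyAboveLimit = true; break; }
        }
        return new PmResult(data.system(), anyAboveLimit,
                anyAboveLimit ? "threshold crossed" : "no threshold crossing");
    }
}

class PmServer {
    private final PmAnalyser analyser = new PmAnalyser();
    private final List<PmResult> resultStore = new ArrayList<>(); // stands in for the databases

    /** Systems (QPS, PIC, PC, ...) push raw data; results go to storage and the viewer. */
    void collect(RawPmData data) {
        PmResult result = analyser.analyse(data);
        resultStore.add(result);
        System.out.println("PM viewer: " + result.system() + " -> " + result.summary());
    }
}
```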

PM used for LHC Hardware Commissioning
Example: automatic analysis of the QPS tests for quality assurance.
1. The quench detection signal is driven over the 100 mV threshold.
2. View of the QPS signals to see that the system triggered and the quench heaters fired.
3. Automatic analysis of the quench heater discharge (log scale) showing the results.
4. Automatic analysis of the event with a "passed / not passed" indication.
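A toy sketch of such an automatic check producing a passed/not-passed verdict; only the 100 mV quench-detection threshold comes from the slide, while the heater-discharge acceptance value is an invented assumption. It illustrates the idea, not the real QPS analysis.

```java
// Toy version of an automatic QPS test check producing a passed/not-passed verdict.
// Only the 100 mV quench-detection threshold comes from the slide; the decay
// check and its acceptance value are invented for illustration.
class QpsTestCheck {
    private static final double QUENCH_THRESHOLD_V = 0.100;  // 100 mV, from the slide
    private static final double MAX_FINAL_HEATER_V = 5.0;    // hypothetical acceptance value

    /**
     * @param detectionSignal quench detection voltage samples [V]
     * @param heaterDischarge quench heater voltage samples [V], after firing
     * @return true if the test is "passed"
     */
    boolean passed(double[] detectionSignal, double[] heaterDischarge) {
        // 1. The detection signal must have crossed the 100 mV threshold.
        boolean triggered = false;
        for (double v : detectionSignal) {
            if (v > QUENCH_THRESHOLD_V) { triggered = true; break; }
        }
        // 2. The heater discharge must have decayed below the acceptance value.
        boolean discharged = heaterDischarge.length > 0
                && heaterDischarge[heaterDischarge.length - 1] < MAX_FINAL_HEATER_V;
        return triggered && discharged;
    }
}
```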

PM: Milestones
1. June '06: Data Viewer for QPS, PIC and PC data
2. Sept. '06: Extended PM data storage model for new clients
3. Sept. '06: Dry run, correlation of QPS, PIC and PC data
4. Oct. '06: PM system scaling test, including BI, BT and RF
5. Nov. '06: HW commissioning analysis, as defined in LHC-D-HCP
During '07: Analysis for Beam Commissioning

Issues
- A successful PM system developed for SM18 magnet quench analysis served as the basis of the LHC PM system
- Recently a new project leader has been assigned due to succession planning, and the scope has grown to cover data collection, storage, browsing and analysis
- Many technological choices and user interfaces are still to be defined and settled
- We are rather late with the work due to the late arrival of the user specifications

Outline
- Overview of the Controls Infrastructure
- Network Security (CNIC project)
- LHC Application production
- Post-Mortem project
- Conclusions

Conclusions
Network and Security:
- activities are well defined
- reduction of the TRUSTED list is not trivial
- encryption, authentication and role-based access need global coordination
LHC applications:
- framework well defined
- there are issues with resources and with time for testing
- hiring Java experts is very difficult
Post Mortem:
- first operational version used in HWC for QPS, PIC and PC
- the project changed leadership and its mandate has been extended
- the work is late